WorldWideScience

Sample records for largest prediction information

  1. Predicting Traffic Flow in Local Area Networks by the Largest Lyapunov Exponent

    Directory of Open Access Journals (Sweden)

    Yan Liu

    2016-01-01

    Full Text Available The dynamics of network traffic are complex and nonlinear, and chaotic behaviors and their prediction, which play an important role in local area networks (LANs), are studied in detail using the largest Lyapunov exponent. With the introduction of phase space reconstruction based on the time sequence, the high-dimensional traffic is projected onto the low-dimensional reconstructed phase space, and a reduced dynamic system is obtained from the dynamic-system viewpoint. Then, a numerical method for computing the largest Lyapunov exponent of the low-dimensional dynamic system is presented. Further, the longest predictable time, which is related to chaotic behaviors in the system, is studied using the largest Lyapunov exponent, and the Wolf method is used to predict the evolution of the traffic in a local area network by both Dot and Interval predictions, obtaining a reliable result. In conclusion, the results show that the largest Lyapunov exponent can be used to describe the sensitivity of the trajectory in the reconstructed phase space to the initial values. Moreover, Dot Prediction can effectively predict flow bursts. The numerical simulation also shows that the presented method is feasible and efficient for predicting the complex dynamic behaviors in LAN traffic, especially congestion and attacks in networks, which are the two main complex phenomena behaving as chaos in networks.
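    As a rough illustration of the technique described above (not the paper's exact Wolf-method implementation), a Rosenstein-style estimate of the largest Lyapunov exponent from a delay-embedded series can be sketched in pure Python; the chaotic logistic map stands in for traffic data, since real LAN traces are not reproduced here:

```python
import math

def delay_embed(series, dim, tau):
    """Takens delay embedding: reconstruct a phase space from a scalar series."""
    n = len(series) - (dim - 1) * tau
    return [tuple(series[i + j * tau] for j in range(dim)) for i in range(n)]

def largest_lyapunov(series, dim=2, tau=1, horizon=3, exclude=10):
    """Rosenstein-style estimate of the largest Lyapunov exponent: the average
    log divergence rate between each point and its nearest phase-space
    neighbor. (The paper uses the Wolf method; this simpler variant shows
    the same idea of tracking neighbor separation over time.)"""
    pts = delay_embed(series, dim, tau)
    usable = len(pts) - horizon
    rates = []
    for i in range(usable):
        # nearest neighbor, excluding temporally close points
        j = min((k for k in range(usable) if abs(k - i) > exclude),
                key=lambda k: math.dist(pts[i], pts[k]))
        d0 = math.dist(pts[i], pts[j])
        d1 = math.dist(pts[i + horizon], pts[j + horizon])
        if d0 > 0 and d1 > 0:
            rates.append(math.log(d1 / d0) / horizon)
    return sum(rates) / len(rates)

# The logistic map at r = 4 is chaotic; its true exponent is ln 2 ≈ 0.693.
x, xs = 0.4, []
for _ in range(600):
    x = 4.0 * x * (1.0 - x)
    xs.append(x)
lyap = largest_lyapunov(xs[100:])  # discard the transient
print(lyap)  # positive, indicating chaos and a finite prediction horizon
```

    A positive estimate bounds the longest predictable time: forecast error grows roughly as exp(lyap * t), which is how the abstract links the exponent to predictability.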

  2. Exploiting Information Diffusion Feature for Link Prediction in Sina Weibo.

    Science.gov (United States)

    Li, Dong; Zhang, Yongchao; Xu, Zhiming; Chu, Dianhui; Li, Sheng

    2016-01-28

    The rapid development of online social networks (e.g., Twitter and Facebook) has promoted research related to social networks in which link prediction is a key problem. Although numerous attempts have been made for link prediction based on network structure, node attribute and so on, few of the current studies have considered the impact of information diffusion on link creation and prediction. This paper mainly addresses Sina Weibo, which is the largest microblog platform with Chinese characteristics, and proposes the hypothesis that information diffusion influences link creation and verifies the hypothesis based on real data analysis. We also detect an important feature from the information diffusion process, which is used to promote link prediction performance. Finally, the experimental results on Sina Weibo dataset have demonstrated the effectiveness of our methods.
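    The abstract does not specify the diffusion feature itself; for reference, the structure-only baseline it contrasts with can be sketched as a common-neighbors scorer (pure Python, hypothetical toy graph):

```python
from collections import defaultdict

def common_neighbors_scores(edges):
    """Score non-adjacent node pairs by shared-neighbor count: the simplest
    structure-only link-prediction baseline that diffusion features extend."""
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    nodes = sorted(adj)
    scores = {}
    for i, u in enumerate(nodes):
        for v in nodes[i + 1:]:
            if v not in adj[u]:
                scores[(u, v)] = len(adj[u] & adj[v])
    return scores

# Hypothetical toy follower graph: "a" and "d" are unlinked but share b and c.
edges = [("a", "b"), ("a", "c"), ("b", "c"), ("b", "d"), ("c", "d")]
print(common_neighbors_scores(edges))  # {('a', 'd'): 2}
```

    A diffusion-aware method would reweight such scores using, e.g., how often information actually flowed between the candidate pair's neighborhoods.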

  4. Approaching the largest ‘API’: extracting information from the Internet with Python

    Directory of Open Access Journals (Sweden)

    Jonathan E. Germann

    2018-02-01

    Full Text Available This article explores the need for libraries to algorithmically access and manipulate the world’s largest API: the Internet. The billions of pages on the ‘Internet API’ (HTTP, HTML, CSS, XPath, DOM, etc.) are easily accessible and manipulable. Libraries can assist in creating meaning through the datafication of information on the world wide web. Because most information is created for human consumption, some programming is required for automated extraction. Python is an easy-to-learn programming language with extensive packages and community support for web page automation. Four Python packages (Urllib, Selenium, BeautifulSoup, Scrapy) can automate almost any web page for projects of all sizes. An example warrant data project is explained to illustrate how well Python packages can manipulate web pages to create meaning through assembling custom datasets.
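    A minimal, standard-library-only sketch in the spirit of the packages named above (urllib would fetch a live page; html.parser extracts structure; the URLs and markup here are illustrative):

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href targets and anchor text from an HTML document."""
    def __init__(self):
        super().__init__()
        self.links = []
        self._current_href = None
        self._text_parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._current_href = dict(attrs).get("href")
            self._text_parts = []

    def handle_data(self, data):
        if self._current_href is not None:
            self._text_parts.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href is not None:
            self.links.append(
                (self._current_href, "".join(self._text_parts).strip()))
            self._current_href = None

# In a real project the HTML would come from urllib.request.urlopen(url).read();
# a small inline document keeps the sketch self-contained.
html_doc = """
<html><body>
  <a href="https://example.org/a">First record</a>
  <a href="https://example.org/b">Second record</a>
</body></html>
"""

parser = LinkExtractor()
parser.feed(html_doc)
print(parser.links)
```

    BeautifulSoup or Scrapy would replace the hand-written parser for anything beyond a toy page, but the datafication step is the same: markup in, structured records out.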

  5. Testing earthquake prediction algorithms: Statistically significant advance prediction of the largest earthquakes in the Circum-Pacific, 1992-1997

    Science.gov (United States)

    Kossobokov, V.G.; Romashkova, L.L.; Keilis-Borok, V. I.; Healy, J.H.

    1999-01-01

    Algorithms M8 and MSc (i.e., the Mendocino Scenario) were used in a real-time intermediate-term research prediction of the strongest earthquakes in the Circum-Pacific seismic belt. Predictions are made by M8 first. Then, the areas of alarm are reduced by MSc, at the cost of missing some earthquakes in the second approximation of prediction. In 1992-1997, five earthquakes of magnitude 8 and above occurred in the test area: all of them were predicted by M8, and MSc identified correctly the locations of four of them. The space-time volume of the alarms is 36% and 18%, respectively, when estimated with a normalized product measure of the empirical distribution of epicenters and uniform time. The statistical significance of the achieved results is beyond 99% both for M8 and MSc. For magnitude 7.5+, 10 out of 19 earthquakes were predicted by M8 in 40% and five were predicted by M8-MSc in 13% of the total volume considered. This implies a significance level of 81% for M8 and 92% for M8-MSc. The lower significance levels might result from a global change in seismic regime in 1993-1996, when the rate of the largest events doubled and all of them became exclusively normal or reversed faults. The predictions are fully reproducible; the algorithms M8 and MSc in complete formal definitions were published before we started our experiment [Keilis-Borok, V.I., Kossobokov, V.G., 1990. Premonitory activation of seismic flow: Algorithm M8, Phys. Earth and Planet. Inter. 61, 73-83; Kossobokov, V.G., Keilis-Borok, V.I., Smith, S.W., 1990. Localization of intermediate-term earthquake prediction, J. Geophys. Res., 95, 19763-19772; Healy, J.H., Kossobokov, V.G., Dewey, J.W., 1992. A test to evaluate the earthquake prediction algorithm, M8. U.S. Geol. Surv. OFR 92-401]. M8 is available from the IASPEI Software Library [Healy, J.H., Keilis-Borok, V.I., Lee, W.H.K. (Eds.), 1997. Algorithms for Earthquake Statistics and Prediction, Vol. 6. IASPEI Software Library]. © 1999 Elsevier

  6. THE CHALLENGE OF THE LARGEST STRUCTURES IN THE UNIVERSE TO COSMOLOGY

    International Nuclear Information System (INIS)

    Park, Changbom; Choi, Yun-Young; Kim, Sungsoo S.; Kim, Kap-Sung; Kim, Juhan; Gott III, J. Richard

    2012-01-01

    Large galaxy redshift surveys have long been used to constrain cosmological models and structure formation scenarios. In particular, the largest structures discovered observationally are thought to carry critical information on the amplitude of large-scale density fluctuations or the homogeneity of the universe, and have often challenged the standard cosmological framework. The Sloan Great Wall (SGW) recently found in the Sloan Digital Sky Survey (SDSS) region casts doubt on the concordance cosmological model with a cosmological constant (i.e., the flat ΛCDM model). Here we show that the existence of the SGW is perfectly consistent with the ΛCDM model, a result that only our very large cosmological N-body simulation (the Horizon Run 2, HR2) could supply. In addition, we report on the discovery of a void complex in the SDSS much larger than the SGW, and show that a void of this size is also predicted in the ΛCDM paradigm. Our results demonstrate that an initially homogeneous isotropic universe with primordial Gaussian random phase density fluctuations growing in accordance with general relativity can explain the richness and size of the observed large-scale structures in the SDSS. Using the HR2 simulation we predict that a future galaxy redshift survey about four times deeper, or with a 3 mag fainter limit, than the SDSS should reveal a largest structure of bright galaxies about twice the size of the SGW.

  7. Synchrotron Emission on the Largest Scales: Radio Detection of the ...

    Indian Academy of Sciences (India)

    Abstract. Shocks and turbulence generated during large-scale structure formation are predicted to produce large-scale, low surface-brightness synchrotron emission. On the largest scales, this emission is globally correlated with the thermal baryon distribution, and constitutes the 'synchrotron cosmic-web'. I present the ...

  8. Information assessment on predicting protein-protein interactions

    Directory of Open Access Journals (Sweden)

    Gerstein Mark

    2004-10-01

    Full Text Available Abstract Background Identifying protein-protein interactions is fundamental for understanding the molecular machinery of the cell. Proteome-wide studies of protein-protein interactions are of significant value, but the high-throughput experimental technologies suffer from high rates of both false positive and false negative predictions. In addition to high-throughput experimental data, many diverse types of genomic data can help predict protein-protein interactions, such as mRNA expression, localization, essentiality, and functional annotation. Evaluations of the information contributions from different evidence sources help to establish more parsimonious models with comparable or better prediction accuracy, and to obtain biological insights into the relationships between protein-protein interactions and other genomic information. Results Our assessment is based on the genomic features used in a Bayesian network approach to predict protein-protein interactions genome-wide in yeast. In the special case when one does not have any missing information about any of the features, our analysis shows that there is a larger information contribution from functional classification than from expression correlations or essentiality. We also show that in this case alternative models, such as logistic regression and random forest, may be more effective than Bayesian networks for predicting interactions. Conclusions In the restricted problem posed by the complete-information subset, we identified the MIPS and Gene Ontology (GO) functional-similarity datasets as the dominating information contributors for predicting protein-protein interactions under the framework proposed by Jansen et al. Random forests based on the MIPS and GO information alone can give highly accurate classifications. In this particular subset of complete information, adding other genomic data does little to improve predictions.
We also found that the data discretizations used in the
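    The Jansen et al. framework combines genomic evidence via a naive Bayes network of likelihood ratios. A minimal sketch of that combination rule, with illustrative likelihood-ratio values (not the published estimates):

```python
# Illustrative likelihood ratios per evidence source -- hypothetical numbers,
# not the values estimated by Jansen et al.
likelihood_ratios = {
    "mips_same_complex": 50.0,
    "go_similarity_high": 20.0,
    "coexpressed": 5.0,
    "both_essential": 3.0,
}

def naive_bayes_odds(evidence, prior_odds=1.0 / 600.0):
    """Combine conditionally independent evidence into posterior odds that a
    protein pair interacts: posterior = prior * product of likelihood ratios."""
    odds = prior_odds
    for source in evidence:
        odds *= likelihood_ratios[source]
    return odds

pair_evidence = ["mips_same_complex", "go_similarity_high"]
posterior_odds = naive_bayes_odds(pair_evidence)
print(posterior_odds)  # 50 * 20 / 600 ≈ 1.67: evidence overcomes the low prior
```

    The assessment above asks which sources contribute most to such products; the finding is that MIPS and GO similarity dominate, and that discriminative models (logistic regression, random forests) on the same features can do better still.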

  9. Predictive Analytics in Information Systems Research

    NARCIS (Netherlands)

    G. Shmueli (Galit); O.R. Koppius (Otto)

    2011-01-01

    This research essay highlights the need to integrate predictive analytics into information systems research and shows several concrete ways in which this goal can be accomplished. Predictive analytics include empirical methods (statistical and other) that generate data predictions as well as methods for assessing predictive power. Predictive analytics not only assist in creating practically useful models, they also play an important role alongside explanatory modeling in theory building.

  10. Predicting incident size from limited information

    International Nuclear Information System (INIS)

    Englehardt, J.D.

    1995-01-01

    Predicting the size of low-probability, high-consequence natural disasters, industrial accidents, and pollutant releases is often difficult due to limitations in the availability of data on rare events and future circumstances. When incident data are available, they may be difficult to fit with a lognormal distribution. Two Bayesian probability distributions for inferring future incident-size probabilities from limited, indirect, and subjective information are proposed in this paper. The distributions are derived from Pareto distributions that are shown to fit data on different incident types and are justified theoretically. The derived distributions incorporate both inherent variability and uncertainty due to information limitations. Results were analyzed to determine the amount of data needed to predict incident-size probabilities in various situations. Information requirements for incident-size prediction using the methods were low, particularly when the population distribution had a thick tail. Use of the distributions to predict accumulated oil-spill consequences was demonstrated
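    A minimal sketch of the non-Bayesian core of this approach, fitting a Pareto tail by maximum likelihood and reading off exceedance probabilities, with hypothetical spill data (the paper's Bayesian distributions additionally propagate parameter uncertainty):

```python
import math

def fit_pareto_alpha(sizes, x_min):
    """Maximum-likelihood shape parameter for a Pareto tail:
    alpha = n / sum(ln(x_i / x_min))."""
    return len(sizes) / sum(math.log(s / x_min) for s in sizes)

def exceedance_prob(x, x_min, alpha):
    """P(incident size > x) under a Pareto(x_min, alpha) model."""
    return (x_min / x) ** alpha

# Hypothetical spill volumes (thousands of gallons), truncated at x_min = 1.
spills = [1.2, 1.5, 2.0, 3.1, 4.8, 7.5, 12.0, 40.0]
alpha = fit_pareto_alpha(spills, x_min=1.0)
print(alpha, exceedance_prob(100.0, 1.0, alpha))
```

    A small alpha means a thick tail: rare, huge incidents retain non-negligible probability, which is why the paper finds thick-tailed populations need little data for useful predictions.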

  12. Changes in Pilot Behavior with Predictive System Status Information

    Science.gov (United States)

    Trujillo, Anna C.

    1998-01-01

    Research has shown a strong pilot preference for predictive information of aircraft system status in the flight deck. However, changes in pilot behavior associated with using this predictive information have not been ascertained. The study described here quantified these changes using three types of predictive information (none, whether a parameter was changing abnormally, and the time for a parameter to reach an alert range) and three initial time intervals until a parameter alert range was reached (ITIs: 1 minute, 5 minutes, and 15 minutes). With predictive information, subjects accomplished most of their tasks before an alert occurred. Subjects organized the time they did their tasks by locus-of-control with no predictive information and for the 1-minute ITI, and by aviate-navigate-communicate for the time-to-alert-range condition and the 15-minute ITI. Overall, predictive information and the longer ITIs moved subjects toward performing tasks before the alert actually occurred and made them more mission oriented, as indicated by their aviate-navigate-communicate task grouping.

  13. Suboptimal choice, reward-predictive signals, and temporal information.

    Science.gov (United States)

    Cunningham, Paul J; Shahan, Timothy A

    2018-01-01

    Suboptimal choice refers to preference for an alternative offering a low probability of food (suboptimal alternative) over an alternative offering a higher probability of food (optimal alternative). Numerous studies have found that stimuli signaling probabilistic food play a critical role in the development and maintenance of suboptimal choice. However, there is still much debate about how to characterize how these stimuli influence suboptimal choice. There is substantial evidence that the temporal information conveyed by a food-predictive signal governs its function as both a Pavlovian conditioned stimulus and as an instrumental conditioned reinforcer. Thus, we explore the possibility that food-predictive signals influence suboptimal choice via the temporal information they convey. Application of this temporal information-theoretic approach to suboptimal choice provides a formal, quantitative framework that describes how food-predictive signals influence suboptimal choice in a manner consistent with related phenomena in Pavlovian conditioning and conditioned reinforcement. Our reanalysis of previous data on suboptimal choice suggests that, generally speaking, preference in the suboptimal choice procedure tracks relative temporal information conveyed by food-predictive signals for the suboptimal and optimal alternatives. The model suggests that suboptimal choice develops when the food-predictive signal for the suboptimal alternative conveys more temporal information than that for the optimal alternative. Finally, incorporating a role for competition between temporal information provided by food-predictive signals and relative primary reinforcement rate provides a reasonable account of existing data on suboptimal choice. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
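    In this temporal-information framework, the information conveyed by a food-predictive signal is commonly quantified as the log ratio of the average food-to-food cycle time to the signal-to-food interval; a one-line sketch with hypothetical intervals:

```python
import math

def temporal_information(cycle_time, signal_time):
    """Bits conveyed by a food-predictive signal: log2 of the ratio between
    the average food-to-food interval C and the signal-to-food interval T,
    the informativeness measure used in temporal-information accounts."""
    return math.log2(cycle_time / signal_time)

# Hypothetical intervals (seconds): the signal predicts food 12x sooner than
# the background rate alone would.
bits = temporal_information(cycle_time=120.0, signal_time=10.0)
print(bits)  # log2(12) ≈ 3.58 bits
```

    On this account, suboptimal choice develops when the suboptimal alternative's signal carries more bits than the optimal alternative's, even though it delivers food less often.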

  14. Predicting Information Flows in Network Traffic.

    Science.gov (United States)

    Hinich, Melvin J.; Molyneux, Robert E.

    2003-01-01

    Discusses information flow in networks and predicting network traffic and describes a study that uses time series analysis on a day's worth of Internet log data. Examines nonlinearity and traffic invariants, and suggests that prediction of network traffic may not be possible with current techniques. (Author/LRW)

  15. Curiosity and reward: Valence predicts choice and information prediction errors enhance learning.

    Science.gov (United States)

    Marvin, Caroline B; Shohamy, Daphna

    2016-03-01

    Curiosity drives many of our daily pursuits and interactions; yet, we know surprisingly little about how it works. Here, we harness an idea implied in many conceptualizations of curiosity: that information has value in and of itself. Reframing curiosity as the motivation to obtain reward-where the reward is information-allows one to leverage major advances in theoretical and computational mechanisms of reward-motivated learning. We provide new evidence supporting 2 predictions that emerge from this framework. First, we find an asymmetric effect of positive versus negative information, with positive information enhancing both curiosity and long-term memory for information. Second, we find that it is not the absolute value of information that drives learning but, rather, the gap between the reward expected and reward received, an "information prediction error." These results support the idea that information functions as a reward, much like money or food, guiding choices and driving learning in systematic ways. (c) 2016 APA, all rights reserved.

  16. Predictive medical information and underwriting.

    Science.gov (United States)

    Dodge, John H

    2007-01-01

    Medical underwriting involves the application of actuarial science by analyzing medical information to predict the future risk of a claim. The objective is that individuals with like risk are treated in a like manner so that the premium paid is proportional to the risk of future claim.

  17. Empirical Information Metrics for Prediction Power and Experiment Planning

    Directory of Open Access Journals (Sweden)

    Christopher Lee

    2011-01-01

    Full Text Available In principle, information theory could provide useful metrics for statistical inference. In practice this is impeded by divergent assumptions: Information theory assumes the joint distribution of variables of interest is known, whereas in statistical inference it is hidden and is the goal of inference. To integrate these approaches we note a common theme they share, namely the measurement of prediction power. We generalize this concept as an information metric, subject to several requirements: Calculation of the metric must be objective or model-free; unbiased; convergent; probabilistically bounded; and low in computational complexity. Unfortunately, widely used model selection metrics such as Maximum Likelihood, the Akaike Information Criterion and Bayesian Information Criterion do not necessarily meet all these requirements. We define four distinct empirical information metrics measured via sampling, with explicit Law of Large Numbers convergence guarantees, which meet these requirements: Ie, the empirical information, a measure of average prediction power; Ib, the overfitting bias information, which measures selection bias in the modeling procedure; Ip, the potential information, which measures the total remaining information in the observations not yet discovered by the model; and Im, the model information, which measures the model’s extrapolation prediction power. Finally, we show that Ip + Ie, Ip + Im, and Ie − Im are fixed constants for a given observed dataset (i.e. prediction target), independent of the model, and thus represent a fundamental subdivision of the total information contained in the observations. We discuss the application of these metrics to modeling and experiment planning.
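    A toy sampling estimate in the spirit of Ie, average prediction power in bits relative to a uniform baseline (the function, variable names, and biased-coin example are illustrative, not taken from the paper):

```python
import math

def empirical_information(model, observed, n_categories):
    """Sampling estimate of average prediction power in bits: the mean log2
    probability the model assigns to each observation, minus the uniform
    baseline log2(1 / n_categories). Converges by the Law of Large Numbers
    as the observed sample grows."""
    avg_log_p = sum(math.log2(model[x]) for x in observed) / len(observed)
    return avg_log_p - math.log2(1.0 / n_categories)

# A model that has learned a biased coin (hypothetical distribution and sample).
model = {"heads": 0.9, "tails": 0.1}
sample = ["heads"] * 9 + ["tails"]
ie = empirical_information(model, sample, n_categories=2)
print(ie)  # about 0.53 bits of prediction power over the uniform baseline
```

    A model that predicted no better than uniform would score 0 bits here; the gap to the entropy of the true source is what the paper's Ip (potential information) captures.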

  18. Learning and Prediction of Slip from Visual Information

    Science.gov (United States)

    Angelova, Anelia; Matthies, Larry; Helmick, Daniel; Perona, Pietro

    2007-01-01

    This paper presents an approach for slip prediction from a distance for wheeled ground robots using visual information as input. Large amounts of slippage which can occur on certain surfaces, such as sandy slopes, will negatively affect rover mobility. Therefore, obtaining information about slip before entering such terrain can be very useful for better planning and avoiding these areas. To address this problem, terrain appearance and geometry information about map cells are correlated to the slip measured by the rover while traversing each cell. This relationship is learned from previous experience, so slip can be predicted remotely from visual information only. The proposed method consists of terrain type recognition and nonlinear regression modeling. The method has been implemented and tested offline on several off-road terrains including: soil, sand, gravel, and woodchips. The final slip prediction error is about 20%. The system is intended for improved navigation on steep slopes and rough terrain for Mars rovers.

  19. Loy Yang A - Australia's largest privatisation

    International Nuclear Information System (INIS)

    Yenckin, C.

    1997-01-01

    The recent A$4,746 million privatisation of the 2000MW Loy Yang A power station and the Loy Yang coal mine by the Victorian Government is Australia's largest privatisation and one of 1997's largest project financing deals. (author)

  20. The Environmental Responsibility of the World’s Largest Banks

    Directory of Open Access Journals (Sweden)

    Ryszawska Bożena

    2018-03-01

    Full Text Available Sustainability transition is changing the role and function of banks, especially their products and services in relation to stakeholders. Banks are one of the main actors supporting the transition to a sustainable economy. The purpose of this study is to emphasise the role of the world’s largest banks in that process. Banks are slowly responding to the new demand for sustainability and responsibility, and they try to align with it. The paper is based on an overview of the world’s five largest banks that employ corporate social responsibility (CSR) reporting standards, together with a detailed enumeration of the pro-environmental activities included in their reports. The first section presents the most popular approaches to the problem at hand, as reported in the professional literature. Section two presents the characteristics of CSR actions in banks. The third section discusses the environmental actions of the biggest banks under the Global Reporting Initiative (GRI), the most popular standard for reporting non-financial information. The last part presents the conclusions. The research was conducted using a variety of sources, such as scientific articles, statistical data, CSR reports of the world’s largest banks, as well as reporting principles and standard disclosures. The basic method used was a critical analysis of literature and reports concerning CSR reporting standards and the environmental responsibilities of different kinds of entities, complemented by the authors’ own observations based on the banks’ reports. Analysis of financial-market data, the induction method, and the comparison method have also been used. The main conclusions of the analysis of the CSR reports disclosed by the world’s largest banks confirm all three of the theses presented in the article. The findings suggest that the banks under study can be regarded as environmentally responsible.

  1. Mean Velocity Prediction Information Feedback Strategy in Two-Route Systems under ATIS

    Directory of Open Access Journals (Sweden)

    Jianqiang Wang

    2015-02-01

    Full Text Available The feedback contents of previous information feedback strategies in advanced traveler information systems are mostly real-time traffic information. Compared with real-time information, predicted traffic information obtained by a reliable and effective prediction algorithm has many indisputable advantages: given predictions, a traveler is prone to making a more rational route choice. For these reasons, a mean velocity prediction information feedback strategy (MVPFS) is presented. The approach adopts the autoregressive integrated moving average (ARIMA) model to forecast short-term traffic flow. Prediction results for mean velocity are taken as feedback contents and displayed on a variable message sign to guide travelers' route choice. Meanwhile, a discrete choice (Logit) model is selected to model travelers' route-choice behavior more appropriately. To investigate the performance of MVPFS, a cellular automaton model with ARIMA is adopted to simulate a two-route scenario. The simulation shows that such an innovative prediction feedback strategy is feasible and efficient. Even more importantly, this study demonstrates the merit of the prediction feedback ideology.
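    The paper's forecaster is a full ARIMA model; as a self-contained stand-in, a least-squares AR(1) one-step forecast of mean velocity shows the kind of feedback content MVPFS would display (the velocity series is hypothetical):

```python
def ar1_forecast(series):
    """One-step-ahead forecast from a least-squares AR(1) fit:
    v[t+1] ≈ mu + phi * (v[t] - mu), a stand-in for the paper's ARIMA model."""
    mu = sum(series) / len(series)
    dev = [x - mu for x in series]
    phi = (sum(dev[t] * dev[t + 1] for t in range(len(dev) - 1))
           / sum(d * d for d in dev[:-1]))
    return mu + phi * (series[-1] - mu)

# Hypothetical mean velocities (km/h) measured on one route in recent intervals.
velocities = [42.0, 40.5, 38.0, 36.5, 37.0, 35.0, 33.5]
predicted = ar1_forecast(velocities)
print(predicted)  # the value MVPFS would display on the variable message sign
```

    Showing the predicted rather than the latest measured velocity is the point of the strategy: travelers react to where congestion is heading, not where it was.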

  2. SPEEDI: system for prediction of environmental emergency dose information

    International Nuclear Information System (INIS)

    Chino, Masamichi; Ishikawa, Hirohiko; Kai, Michiaki

    1984-03-01

    In this report a computer code system for prediction of environmental emergency dose information, SPEEDI for short, is presented. In case of an accidental release of radioactive materials from a nuclear plant, it is very important for emergency planning to predict the concentration and dose caused by the released materials. The SPEEDI code system has been developed for this purpose; it predicts by calculation the released nuclides, wind fields, concentrations, and doses, based on release information, actual weather, and topographical data. (author)

  3. Largest US oil and gas fields, August 1993

    Energy Technology Data Exchange (ETDEWEB)

    1993-08-06

    The Largest US Oil and Gas Fields is a technical report and part of an Energy Information Administration (EIA) series presenting distributions of US crude oil and natural gas resources, developed using field-level data collected by EIA's annual survey of oil and gas proved reserves. The series' objective is to provide useful information beyond that routinely presented in the EIA annual report on crude oil and natural gas reserves. These special reports also will provide oil and gas resource analysts with a fuller understanding of the nature of US crude oil and natural gas occurrence, both at the macro level and with respect to the specific subjects addressed. The series' approach is to integrate EIA's crude oil and natural gas survey data with related data obtained from other authoritative sources, and then to present illustrations and analyses of interest to a broad spectrum of energy information users ranging from the general public to oil and gas industry personnel.

  5. Hydrodynamic and Inundation Modeling of China’s Largest Freshwater Lake Aided by Remote Sensing Data

    Directory of Open Access Journals (Sweden)

    Peng Zhang

    2015-04-01

    Full Text Available China’s largest freshwater lake, Poyang Lake, is characterized by rapid changes in its inundation area and hydrodynamics, so in this study a hydrodynamic model of Poyang Lake was established to simulate these long-term changes. Inundation information was extracted from Moderate Resolution Imaging Spectroradiometer (MODIS) remote sensing data and used to calibrate the wetting and drying parameter by assessing the accuracy of the simulated inundation area and its boundary. The bottom friction parameter was calibrated using current velocity measurements from Acoustic Doppler Current Profilers (ADCP). The results show the model is capable of predicting the inundation area dynamics through cross-validation with remotely sensed inundation data, and can reproduce the seasonal dynamics of the water level and water discharge through comparison with hydrological data. Based on the model results, the characteristics of the lake’s current velocities in the wet and dry seasons were explored, and the potential effect of the current dynamics on water quality patterns was discussed. The model is a promising basic tool for prediction and management of the water resources and water quality of Poyang Lake.

  6. Vertical structure of predictability and information transport over the Northern Hemisphere

    International Nuclear Information System (INIS)

    Feng Ai-Xia; Wang Qi-Gang; Gong Zhi-Qiang; Feng Guo-Lin

    2014-01-01

    Based on nonlinear prediction and information theory, the vertical heterogeneity of predictability and the information loss rate in the geopotential height field are obtained over the Northern Hemisphere. On a seasonal-to-interannual time scale, the predictability is low in the lower troposphere and high in the mid-upper troposphere. However, within the mid-upper troposphere over the subtropical ocean areas, there is relatively poor predictability. These conclusions also hold on the seasonal time scale. Moving to the interannual time scale, the predictability becomes high in the lower troposphere and low in the mid-upper troposphere, contrary to the former case. On the whole, the interannual trend is more predictable than the seasonal trend. The average information loss rate is low over the mid-east Pacific, the west of North America, the Atlantic, and Eurasia, and the atmosphere over other places has a relatively high information loss rate on all time scales. Two channels are found steadily over the Pacific Ocean and Atlantic Ocean in the subtropics; there are also unstable channels. The influence of the four seasons on predictability and information communication is studied. The predictability is low no matter which season's data are removed, and each season plays an important role in the existence of the channels, except for the winter. Predictability and teleconnections are paramount issues in atmospheric science, and the teleconnections may be established by communication channels. This work thus reveals the vertical structure of the predictability distribution, the channel locations, and the contributions of different time scales to them, as well as their variations under different seasons. (geophysics, astronomy, and astrophysics)

  7. Basic disturbances of information processing in psychosis prediction.

    Science.gov (United States)

    Bodatsch, Mitja; Klosterkötter, Joachim; Müller, Ralf; Ruhrmann, Stephan

    2013-01-01

    The basic symptoms (BS) approach provides a valid instrument for predicting psychosis onset and, moreover, represents a significant heuristic framework for research. The term "basic symptoms" denotes subtle changes of cognition and perception in the earliest and prodromal stages of psychosis development. BS are thought to correspond to disturbances of neural information processing. Following the heuristic implications of the BS approach, the present paper aims at exploring disturbances of information processing, revealed by functional magnetic resonance imaging (fMRI) and electroencephalography (EEG), as characteristics of the at-risk state of psychosis. Furthermore, since high-risk studies employing ultra-high-risk criteria revealed non-conversion rates commonly exceeding 50%, warranting approaches that increase specificity, the potential contribution of neural information processing disturbances to psychosis prediction is reviewed. In summary, the at-risk state seems to be associated with information processing disturbances. Moreover, fMRI investigations suggested that disturbances of language processing domains might be a characteristic of the prodromal state. Neurophysiological studies revealed that disturbances of sensory processing may assist psychosis prediction by allowing for a quantification of risk in terms of magnitude and time. The latter finding represents a significant advancement since an estimation of the time to event has not yet been achieved by clinical approaches. Some evidence suggests a close relationship between self-experienced BS and neural information processing. With regard to future research, the relationship between neural information processing disturbances and different clinical risk concepts warrants further investigations. Thereby, a possible time sequence in the prodromal phase might be of particular interest.

  8. Speech Intelligibility Prediction Based on Mutual Information

    DEFF Research Database (Denmark)

    Jensen, Jesper; Taal, Cees H.

    2014-01-01

    This paper deals with the problem of predicting the average intelligibility of noisy and potentially processed speech signals, as observed by a group of normal-hearing listeners. We propose a model which performs this prediction based on the hypothesis that intelligibility is monotonically related to the mutual information between critical-band amplitude envelopes of the clean signal and the corresponding noisy/processed signal. The resulting intelligibility predictor turns out to be a simple function of the mean-square error (mse) that arises when estimating a clean critical-band amplitude using a minimum mean-square error (mmse) estimator based on the noisy/processed amplitude. The proposed model predicts that speech intelligibility cannot be improved by any processing of noisy critical-band amplitudes. Furthermore, the proposed intelligibility predictor performs well (ρ > 0.95) in predicting…
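A minimal numerical sketch of the core relation described above, on synthetic data (not the paper's exact model or signals): under a joint Gaussian assumption, the mutual information between clean and degraded envelopes is a simple function of their correlation, which in turn determines the normalized mmse of estimating the clean envelope from the degraded one.

```python
import numpy as np

rng = np.random.default_rng(4)
clean = np.abs(rng.normal(size=5000))          # stand-in clean envelope samples
noisy = clean + 0.5 * rng.normal(size=5000)    # degraded envelope

r = np.corrcoef(clean, noisy)[0, 1]            # envelope correlation
norm_mse = 1.0 - r ** 2                        # normalized mmse of the estimate
mi_bits = -0.5 * np.log2(norm_mse)             # per-sample mutual information
```

Any processing of the noisy envelope can only leave `r` unchanged or reduce it, which is the intuition behind the model's claim that such processing cannot raise predicted intelligibility.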

  9. Predicting the future trend of popularity by network diffusion

    Science.gov (United States)

    Zeng, An; Yeung, Chi Ho

    2016-06-01

    Conventional approaches to predict the future popularity of products are mainly based on extrapolation of their current popularity, which overlooks the hidden microscopic information under the macroscopic trend. Here, we study diffusion processes on consumer-product and citation networks to exploit the hidden microscopic information and connect consumers to their potential purchases, and publications to their potential citers, to obtain a prediction for future item popularity. By using the data obtained from the largest online retailers including Netflix and Amazon as well as the American Physical Society citation networks, we found that our method outperforms accurate short-term extrapolation and identifies the potentially popular items long before they become prominent.
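A toy sketch of one standard diffusion process on a consumer-product bipartite network (two-step mass diffusion; the adjacency matrix and the use of item degree as "current popularity" are illustrative assumptions, not the paper's data):

```python
import numpy as np

# Toy consumer-product bipartite network (rows: consumers, cols: items).
A = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 1, 0],
              [0, 0, 1, 1]], dtype=float)

k_user = A.sum(axis=1, keepdims=True)      # consumer degrees
k_item = A.sum(axis=0, keepdims=True)      # item degrees (current popularity)

# Two-step mass diffusion (items -> consumers -> items): W[i, j] is the
# fraction of item j's resource that ends up on item i.
W = (A / k_user).T @ (A / k_item)

f = k_item.ravel()                         # current popularity
predicted = W @ f                          # diffusion-based popularity score
```

Because each column of `W` sums to one, the diffusion conserves total resource; it redistributes popularity toward items reachable through many shared consumers, which is the "hidden microscopic information" a pure extrapolation ignores.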

  11. Predictable information in neural signals during resting state is reduced in autism spectrum disorder.

    Science.gov (United States)

    Brodski-Guerniero, Alla; Naumer, Marcus J; Moliadze, Vera; Chan, Jason; Althen, Heike; Ferreira-Santos, Fernando; Lizier, Joseph T; Schlitt, Sabine; Kitzerow, Janina; Schütz, Magdalena; Langer, Anne; Kaiser, Jochen; Freitag, Christine M; Wibral, Michael

    2018-04-04

    The neurophysiological underpinnings of the nonsocial symptoms of autism spectrum disorder (ASD), which include sensory and perceptual atypicalities, remain poorly understood. Well-known accounts of less dominant top-down influences and more dominant bottom-up processes compete to explain these characteristics. These accounts have been recently embedded in the popular framework of predictive coding theory. To differentiate between competing accounts, we studied altered information dynamics in ASD by quantifying predictable information in neural signals. Predictable information in neural signals measures the amount of stored information that is used for the next time step of a neural process. Thus, predictable information limits the (prior) information which might be available for other brain areas, for example, to build predictions for upcoming sensory information. We studied predictable information in neural signals based on resting-state magnetoencephalography (MEG) recordings of 19 ASD patients and 19 neurotypical controls aged between 14 and 27 years. Using whole-brain beamformer source analysis, we found reduced predictable information in ASD patients across the whole brain, but in particular in posterior regions of the default mode network. In these regions, epoch-by-epoch predictable information was positively correlated with source power in the alpha and beta frequency range as well as autocorrelation decay time. Predictable information in precuneus and cerebellum was negatively associated with nonsocial symptom severity, indicating a relevance of the analysis of predictable information for clinical research in ASD. Our findings are compatible with the assumption that use or precision of prior knowledge is reduced in ASD patients. © 2018 Wiley Periodicals, Inc.

  12. Characterization of Initial Parameter Information for Lifetime Prediction of Electronic Devices.

    Science.gov (United States)

    Li, Zhigang; Liu, Boying; Yuan, Mengxiong; Zhang, Feifei; Guo, Jiaqiang

    2016-01-01

    Newly manufactured electronic devices are subject to different levels of potential defects existing among the initial parameter information of the devices. In this study, a characterization of electromagnetic relays that were operated at their optimal performance with appropriate and steady parameter values was performed to estimate the levels of their potential defects and to develop a lifetime prediction model. First, the initial parameter information value and stability were quantified to measure the performance of the electronics. In particular, the values of the initial parameter information were estimated using the probability-weighted average method, whereas the stability of the parameter information was determined by using the difference between the extrema and end points of the fitting curves for the initial parameter information. Second, a lifetime prediction model for small-sized samples was proposed on the basis of both measures. Finally, a model for the relationship of the initial contact resistance and stability over the lifetime of the sampled electromagnetic relays was proposed and verified. A comparison of the actual and predicted lifetimes of the relays revealed a 15.4% relative error, indicating that the lifetime of electronic devices can be predicted based on their initial parameter information.

  13. Characterization of Initial Parameter Information for Lifetime Prediction of Electronic Devices.

    Directory of Open Access Journals (Sweden)

    Zhigang Li

    Full Text Available Newly manufactured electronic devices are subject to different levels of potential defects existing among the initial parameter information of the devices. In this study, a characterization of electromagnetic relays that were operated at their optimal performance with appropriate and steady parameter values was performed to estimate the levels of their potential defects and to develop a lifetime prediction model. First, the initial parameter information value and stability were quantified to measure the performance of the electronics. In particular, the values of the initial parameter information were estimated using the probability-weighted average method, whereas the stability of the parameter information was determined by using the difference between the extrema and end points of the fitting curves for the initial parameter information. Second, a lifetime prediction model for small-sized samples was proposed on the basis of both measures. Finally, a model for the relationship of the initial contact resistance and stability over the lifetime of the sampled electromagnetic relays was proposed and verified. A comparison of the actual and predicted lifetimes of the relays revealed a 15.4% relative error, indicating that the lifetime of electronic devices can be predicted based on their initial parameter information.

  14. Statistics of the largest sunspot and facular areas per solar cycle

    International Nuclear Information System (INIS)

    Willis, D.M.; Kabasakal Tulunay, Y.

    1979-01-01

    The statistics of extreme values is used to investigate the statistical properties of the largest areas of sunspots and photospheric faculae per solar cycle. The largest values of the synodic-solar-rotation mean areas of umbrae, whole spots and faculae, which have been recorded for nine solar cycles, are each shown to comply with the general form of the extreme value probability function. Empirical expressions are derived for the three extreme value populations from which the characteristic statistical parameters, namely the mode, median, mean and standard deviation, can be calculated for each population. These three extreme value populations are also used to find the expected ranges of the extreme areas in a group of solar cycles as a function of the number of cycles in the group. The extreme areas of umbrae and whole spots have a dispersion comparable to that found by Siscoe for the extreme values of sunspot number, whereas the extreme areas of faculae have a smaller dispersion which is comparable to that found by Siscoe for the largest geomagnetic storm per solar cycle. The expected range of the largest sunspot area per solar cycle for a group of one hundred cycles appears to be inconsistent with the existence of the prolonged periods of sunspot minima that have been inferred from the historical information on solar variability. This inconsistency supports the contention that there are temporal changes of solar-cycle statistics during protracted periods of sunspot minima (or maxima). Indeed, without such temporal changes, photospheric faculae should have been continually observable throughout the lifetime of the Sun. (orig.)
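The extreme value fit described above can be sketched as follows; the per-cycle maxima below are illustrative placeholders, not the paper's data, and the Gumbel form is assumed as the extreme value probability function.

```python
import numpy as np
from scipy.stats import gumbel_r

# Hypothetical largest whole-spot areas per cycle (millionths of the solar
# hemisphere); illustrative values only.
largest_per_cycle = np.array([2750.0, 3340, 2130, 4120, 3620, 1980, 2890, 3510, 2440])

loc, scale = gumbel_r.fit(largest_per_cycle)   # extreme value (Gumbel) fit
mode = loc                                     # Gumbel mode equals the location
median = gumbel_r.median(loc=loc, scale=scale)
mean = gumbel_r.mean(loc=loc, scale=scale)

# Expected maximum over a group of n cycles: the max of n iid Gumbel draws is
# again Gumbel with location loc + scale*ln(n), so its mean is:
n = 100
expected_max = loc + scale * (np.log(n) + np.euler_gamma)
```

The last line is how the "expected range of the extreme areas in a group of solar cycles" grows with the number of cycles: logarithmically in `n`, with slope set by the fitted dispersion.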

  15. Information-Theoretic Evidence for Predictive Coding in the Face-Processing System.

    Science.gov (United States)

    Brodski-Guerniero, Alla; Paasch, Georg-Friedrich; Wollstadt, Patricia; Özdemir, Ipek; Lizier, Joseph T; Wibral, Michael

    2017-08-23

    Predictive coding suggests that the brain infers the causes of its sensations by combining sensory evidence with internal predictions based on available prior knowledge. However, the neurophysiological correlates of (pre)activated prior knowledge serving these predictions are still unknown. Based on the idea that such preactivated prior knowledge must be maintained until needed, we measured the amount of maintained information in neural signals via the active information storage (AIS) measure. AIS was calculated on whole-brain beamformer-reconstructed source time courses from MEG recordings of 52 human subjects during the baseline of a Mooney face/house detection task. Preactivation of prior knowledge for faces showed as α-band-related and β-band-related AIS increases in content-specific areas; these AIS increases were behaviorally relevant in the brain's fusiform face area. Further, AIS allowed decoding of the cued category on a trial-by-trial basis. Our results support accounts indicating that activated prior knowledge and the corresponding predictions are signaled in low-frequency activity. Perception depends not only on the information our eyes/retina and other sensory organs receive from the outside world, but also, strongly, on information already present in our brains, such as prior knowledge about specific situations or objects. A currently popular theory in neuroscience, predictive coding theory, suggests that this prior knowledge is used by the brain to form internal predictions about upcoming sensory information. However, neurophysiological evidence for this hypothesis is rare, mostly because this kind of evidence requires strong a priori assumptions about the specific predictions the brain makes and the brain areas involved. Using a novel, assumption-free approach, we find that face-related prior knowledge and the derived predictions are represented in low-frequency brain activity. Copyright © 2017 the authors.

  16. Tax Evasion, Information Reporting, and the Regressive Bias Prediction

    DEFF Research Database (Denmark)

    Boserup, Simon Halphen; Pinje, Jori Veng

    2013-01-01

    Models of rational tax evasion and optimal enforcement invariably predict a regressive bias in the effective tax system, which reduces redistribution in the economy. Using Danish administrative data, we show that a calibrated structural model of this type replicates moments and correlations of tax evasion and audit probabilities once we account for information reporting in the tax compliance game. When conditioning on information reporting, we find that both reduced-form evidence and simulations exhibit the predicted regressive bias. However, in the overall economy, this bias is negated by the tax…

  17. Improving Multi-Sensor Drought Monitoring, Prediction and Recovery Assessment Using Gravimetry Information

    Science.gov (United States)

    Aghakouchak, Amir; Tourian, Mohammad J.

    2015-04-01

    Development of reliable drought monitoring, prediction and recovery assessment tools is fundamental to water resources management. This presentation focuses on how gravimetry information can improve drought assessment. First, we provide an overview of the Global Integrated Drought Monitoring and Prediction System (GIDMaPS), which offers near real-time drought information using remote sensing observations and model simulations. Then, we present a framework for the integration of satellite gravimetry information for improving drought prediction and recovery assessment. The input data include satellite-based and model-based precipitation, soil moisture estimates and equivalent water height. Previous studies show that drought assessment based on one single indicator may not be sufficient. For this reason, GIDMaPS provides drought information based on multiple drought indicators including the Standardized Precipitation Index (SPI), the Standardized Soil Moisture Index (SSI) and the Multivariate Standardized Drought Index (MSDI), which combines SPI and SSI probabilistically. MSDI incorporates the meteorological and agricultural drought conditions and provides composite multi-index drought information for overall characterization of droughts. GIDMaPS includes a seasonal prediction component based on a statistical persistence-based approach. The prediction component of GIDMaPS provides the empirical probability of drought for different severity levels. In this presentation we present a new component in which the drought prediction information based on SPI, SSI and MSDI is conditioned on equivalent water height obtained from the Gravity Recovery and Climate Experiment (GRACE). Using a Bayesian approach, GRACE information is used to evaluate the persistence of drought. Finally, the deficit equivalent water height based on GRACE is used for assessing drought recovery. In this presentation, both monitoring and prediction components of GIDMaPS will be discussed, and the results from 2014
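The probabilistic combination behind an MSDI-style index can be sketched as below; the data are synthetic and the empirical-copula construction with a Gringorten plotting position is one common choice, assumed here rather than taken from the abstract.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
precip = rng.gamma(2.0, 30.0, size=360)                 # toy monthly precipitation
soil = 0.6 * precip + rng.normal(0.0, 10.0, size=360)   # correlated soil moisture

def msdi(p, s):
    # Empirical joint probability P(P <= p_t, S <= s_t) for each month,
    # Gringorten plotting position, mapped to a standard-normal scale.
    n = len(p)
    joint = np.array([np.sum((p <= p[t]) & (s <= s[t])) for t in range(n)])
    prob = (joint - 0.44) / (n + 0.12)
    return norm.ppf(prob)

index = msdi(precip, soil)                              # MSDI-like z-scores
```

Months that are dry in *both* precipitation and soil moisture receive strongly negative values, which is how the composite index captures meteorological and agricultural drought jointly.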

  18. Why hydrological predictions should be evaluated using information theory

    Directory of Open Access Journals (Sweden)

    S. V. Weijs

    2010-12-01

    Full Text Available Probabilistic predictions are becoming increasingly popular in hydrology. Equally important are methods to test such predictions, given the topical debate on uncertainty analysis in hydrology. Also in the special case of hydrological forecasting, there is still discussion about which scores to use for their evaluation. In this paper, we propose to use information theory as the central framework to evaluate predictions. From this perspective, we hope to shed some light on what verification scores measure and should measure. We start from the ''divergence score'', a relative entropy measure that was recently found to be an appropriate measure for forecast quality. An interpretation of a decomposition of this measure provides insight into additive relations between climatological uncertainty, correct information, wrong information and remaining uncertainty. When the score is applied to deterministic forecasts, it follows that these increase uncertainty to infinity. In practice, however, deterministic forecasts tend to be judged far more mildly and are widely used. We resolve this paradoxical result by proposing that deterministic forecasts either are implicitly probabilistic or are implicitly evaluated with an underlying decision problem or utility in mind. We further propose that calibration of models representing a hydrological system should be based on information-theoretical scores, because this allows extracting all information from the observations and avoids learning from information that is not there. Calibration based on maximizing utility for society trains an implicit decision model rather than the forecasting system itself. This inevitably results in a loss or distortion of information in the data and more risk of overfitting, possibly leading to less valuable and informative forecasts. We also show this in an example. The final conclusion is that models should preferably be explicitly probabilistic and calibrated to maximize the
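For a binary event with a point observation, a relative entropy score of the kind discussed above reduces to the "surprise" (ignorance) of the forecast, i.e. minus the log of the probability assigned to what actually happened. The sketch below illustrates the paradox the paper discusses: a wrong deterministic forecast scores infinitely badly.

```python
import numpy as np

def divergence_score(p, occurred):
    # Bits of surprise: -log2 of the probability the forecast assigned
    # to the observed outcome of a binary event.
    q = p if occurred else 1.0 - p
    with np.errstate(divide="ignore"):
        return -np.log2(q)

forecasts = [0.8, 0.6, 0.9]        # probabilistic rain forecasts
outcomes = [True, False, True]     # what was observed
avg_score = np.mean([divergence_score(p, o) for p, o in zip(forecasts, outcomes)])

# A deterministic forecast ("it will certainly rain", p = 1.0) that turns out
# wrong carries infinite score:
wrong_deterministic = divergence_score(1.0, False)
```

Lower average score means more informative forecasts; the infinite penalty is why, under this framework, forecasts should be explicitly probabilistic.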

  19. Identification of informative features for predicting proinflammatory potentials of engine exhausts.

    Science.gov (United States)

    Wang, Chia-Chi; Lin, Ying-Chi; Lin, Yuan-Chung; Jhang, Syu-Ruei; Tung, Chun-Wei

    2017-08-18

    The immunotoxicity of engine exhausts is of high concern to human health due to the increasing prevalence of immune-related diseases. However, the evaluation of immunotoxicity of engine exhausts is currently based on expensive and time-consuming experiments. It is desirable to develop efficient methods for immunotoxicity assessment. To accelerate the development of safe alternative fuels, this study proposed a computational method for identifying informative features for predicting proinflammatory potentials of engine exhausts. A principal component regression (PCR) algorithm was applied to develop prediction models. The informative features were identified by a sequential backward feature elimination (SBFE) algorithm. A total of 19 informative chemical and biological features were successfully identified by SBFE algorithm. The informative features were utilized to develop a computational method named FS-CBM for predicting proinflammatory potentials of engine exhausts. FS-CBM model achieved a high performance with correlation coefficient values of 0.997 and 0.943 obtained from training and independent test sets, respectively. The FS-CBM model was developed for predicting proinflammatory potentials of engine exhausts with a large improvement on prediction performance compared with our previous CBM model. The proposed method could be further applied to construct models for bioactivities of mixtures.
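The sequential backward feature elimination (SBFE) loop described above can be sketched as follows, with a plain least-squares model standing in for the paper's PCR model and synthetic data in which only features 0 and 2 are informative (all of this is an illustrative assumption, not the paper's dataset):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 6))                       # 40 samples, 6 features
y = 2.0 * X[:, 0] - 1.5 * X[:, 2] + 0.1 * rng.normal(size=40)

def loo_error(cols):
    # Leave-one-out mean squared prediction error for a feature subset.
    errs = []
    for i in range(len(y)):
        mask = np.arange(len(y)) != i
        coef, *_ = np.linalg.lstsq(X[mask][:, cols], y[mask], rcond=None)
        errs.append((X[i, cols] @ coef - y[i]) ** 2)
    return float(np.mean(errs))

cols = list(range(X.shape[1]))
while len(cols) > 1:
    trial = {c: loo_error([k for k in cols if k != c]) for c in cols}
    best = min(trial, key=trial.get)       # feature whose removal helps most
    if trial[best] >= loo_error(cols):     # removing anything hurts: stop
        break
    cols.remove(best)
```

The informative features survive because removing them sharply increases the cross-validated error, which mirrors how SBFE isolates the informative chemical and biological features.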

  20. Largest College Endowments, 2011

    Science.gov (United States)

    Chronicle of Higher Education, 2012

    2012-01-01

    Of all endowments valued at more than $250-million, the UCLA Foundation had the highest rate of growth over the previous year, at 49 percent. This article presents a table of the largest college endowments in 2011. The table covers the "rank," "institution," "market value as of June 30, 2011," and "1-year change" of institutions participating in…

  1. Protein Sub-Nuclear Localization Prediction Using SVM and Pfam Domain Information

    Science.gov (United States)

    Kumar, Ravindra; Jain, Sohni; Kumari, Bandana; Kumar, Manish

    2014-01-01

    The nucleus is the largest and most highly organized organelle of eukaryotic cells. Within the nucleus exist a number of pseudo-compartments, which are not separated by any membrane, yet each of them contains only a specific set of proteins. Understanding protein sub-nuclear localization can hence be an important step towards understanding the biological functions of the nucleus. Here we have described a method, SubNucPred, developed by us for predicting the sub-nuclear localization of proteins. This method predicts protein localization for 10 different sub-nuclear locations sequentially, by combining the presence or absence of a unique Pfam domain with an amino acid composition based SVM model. The prediction accuracy during leave-one-out cross-validation for centromeric proteins was 85.05%, for chromosomal proteins 76.85%, for nuclear speckle proteins 81.27%, for nucleolar proteins 81.79%, for nuclear envelope proteins 79.37%, for nuclear matrix proteins 77.78%, for nucleoplasm proteins 76.98%, for nuclear pore complex proteins 88.89%, for PML body proteins 75.40% and for telomeric proteins it was 83.33%. Comparison with other reported methods showed that SubNucPred performs better than existing methods. A web-server for predicting protein sub-nuclear localization named SubNucPred has been established at http://14.139.227.92/mkumar/subnucpred/. A standalone version of SubNucPred can also be downloaded from the web-server. PMID:24897370

  2. Adaptive plasticity in speech perception: Effects of external information and internal predictions.

    Science.gov (United States)

    Guediche, Sara; Fiez, Julie A; Holt, Lori L

    2016-07-01

    When listeners encounter speech under adverse listening conditions, adaptive adjustments in perception can improve comprehension over time. In some cases, these adaptive changes require the presence of external information that disambiguates the distorted speech signals, whereas in other cases mere exposure is sufficient. Both external (e.g., written feedback) and internal (e.g., prior word knowledge) sources of information can be used to generate predictions about the correct mapping of a distorted speech signal. We hypothesize that these predictions provide a basis for determining the discrepancy between the expected and actual speech signal that can be used to guide adaptive changes in perception. This study provides the first empirical investigation that manipulates external and internal factors through (a) the availability of explicit external disambiguating information via the presence or absence of postresponse orthographic information paired with a repetition of the degraded stimulus, and (b) the accuracy of internally generated predictions; an acoustic distortion is introduced either abruptly or incrementally. The results demonstrate that the impact of external information on adaptive plasticity is contingent upon whether the intelligibility of the stimuli permits accurate internally generated predictions during exposure. External information sources enhance adaptive plasticity only when input signals are severely degraded and cannot reliably access internal predictions. This is consistent with a computational framework for adaptive plasticity in which error-driven supervised learning relies on the ability to compute sensory prediction error signals from both internal and external sources of information. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  3. Information system of rice planting calendar based on ten-day (Dasarian) rainfall prediction

    International Nuclear Information System (INIS)

    Susandi, Armi; Tamamadin, Mamad; Djamal, Erizal; Las, Irsal

    2015-01-01

    This paper describes an information system of the rice planting calendar that helps farmers determine the time for rice planting. The information includes rainfall prediction on a ten-day (dasarian) scale, overlaid on a map of rice fields to produce a map of rice planting at the village level. The rainfall prediction was produced by stochastic modeling, using Fast Fourier Transform (FFT) and Non-Linear Least Squares methods to fit the curve of a function to the rainfall data. In this research, the Fourier series has been modified to become a non-linear function, to follow the recent characteristics of rainfall, which are non-stationary. The results have also been validated in 4 steps, including R-Square, RMSE, R-Skill, and comparison with field data. The information system (cyber extension) provides information such as the rainfall prediction, the prediction of planting time, and an interactive space for farmers to respond to the information submitted. Interfaces for interactive responses will be critical to improving the prediction accuracy of the information, for both rainfall and planting time. The method used to build this information system includes mapping the rice planting prediction, converting the file format, developing the database system, developing the website, and posting the website. Because the map was overlaid with the Google map, the map files must be converted to the .kml file format

  4. Information system of rice planting calendar based on ten-day (Dasarian) rainfall prediction

    Energy Technology Data Exchange (ETDEWEB)

    Susandi, Armi, E-mail: armi@meteo.itb.ac.id [Department of Meteorology, Institut Teknologi Bandung, Labtek XI Building floor 1, Jalan Ganesa 10 Bandung 40132 (Indonesia); Tamamadin, Mamad, E-mail: mamadtama@meteo.itb.ac.id [Laboratory of Applied Meteorology, Institut Teknologi Bandung Ged. Labtek XI lt. 1, Jalan Ganesa 10 Bandung 40132 (Indonesia); Djamal, Erizal, E-mail: erizal-jamal@yahoo.com [Center for Agricultural Technology Transfer Management, Ministry of Agriculture Jl. Salak No. 22 Bogor (Indonesia); Las, Irsal, E-mail: irsallas@yahoo.com [Indonesian Agroclimate and Hydrology Research Institute, Ministry of Agriculture Jl. Tentara Pelajar 1a Bogor 16111 (Indonesia)

    2015-09-30

    This paper describes an information system of the rice planting calendar that helps farmers determine the time for rice planting. The information includes rainfall prediction on a ten-day (dasarian) scale, overlaid on a map of rice fields to produce a map of rice planting at the village level. The rainfall prediction was produced by stochastic modeling, using Fast Fourier Transform (FFT) and Non-Linear Least Squares methods to fit the curve of a function to the rainfall data. In this research, the Fourier series has been modified to become a non-linear function, to follow the recent characteristics of rainfall, which are non-stationary. The results have also been validated in 4 steps, including R-Square, RMSE, R-Skill, and comparison with field data. The information system (cyber extension) provides information such as the rainfall prediction, the prediction of planting time, and an interactive space for farmers to respond to the information submitted. Interfaces for interactive responses will be critical to improving the prediction accuracy of the information, for both rainfall and planting time. The method used to build this information system includes mapping the rice planting prediction, converting the file format, developing the database system, developing the website, and posting the website. Because the map was overlaid with the Google map, the map files must be converted to the .kml file format.
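The FFT-plus-nonlinear-least-squares fit described in these records can be sketched as a two-stage procedure on synthetic dasarian rainfall (all numbers below are illustrative assumptions, and a single sinusoid stands in for the modified Fourier series):

```python
import numpy as np
from scipy.optimize import curve_fit

t = np.arange(180.0)                        # 5 years of ten-day steps (36/year)
rng = np.random.default_rng(1)
rain = 120 + 80 * np.sin(2 * np.pi * t / 36 + 0.4) + rng.normal(0, 5, t.size)

# Stage 1: FFT locates the dominant frequency of the series.
spec = np.abs(np.fft.rfft(rain - rain.mean()))
freqs = np.fft.rfftfreq(t.size, d=1.0)
f0 = freqs[np.argmax(spec[1:]) + 1]         # dominant frequency (skip DC bin)

# Stage 2: non-linear least squares refines amplitude, frequency, phase, offset.
def model(t, a, f, phi, c):
    return c + a * np.sin(2 * np.pi * f * t + phi)

p0 = [rain.std() * np.sqrt(2), f0, 0.0, rain.mean()]
params, _ = curve_fit(model, t, rain, p0=p0)
period = 1.0 / params[1]                    # recovered period in dasarian units
```

Seeding the nonlinear fit with the FFT frequency is what keeps the least-squares step from converging to a wrong local optimum; the fitted curve can then be extrapolated to produce the dasarian rainfall prediction.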

  5. Estimating the decomposition of predictive information in multivariate systems

    Science.gov (United States)

    Faes, Luca; Kugiumtzis, Dimitris; Nollo, Giandomenico; Jurysta, Fabrice; Marinazzo, Daniele

    2015-03-01

    In the study of complex systems from observed multivariate time series, the evolution of one target system is often under investigation; this evolution can be explained by the information storage of the system and the information transfer from other interacting systems. We present a framework for the model-free estimation of information storage and information transfer, computed as the terms composing the predictive information about the target of a multivariate dynamical process. The approach tackles the curse of dimensionality by employing a nonuniform embedding scheme that selects progressively, among the past components of the multivariate process, only those that contribute most, in terms of conditional mutual information, to the present target process. Moreover, it computes all information-theoretic quantities using a nearest-neighbor technique designed to compensate for the bias due to the different dimensionality of individual entropy terms. The resulting estimators of prediction entropy, storage entropy, transfer entropy, and partial transfer entropy are tested on simulations of coupled linear stochastic and nonlinear deterministic dynamic processes, demonstrating the superiority of the proposed approach over traditional estimators based on uniform embedding. The framework is then applied to multivariate physiologic time series, resulting in physiologically well-interpretable information decompositions of cardiovascular and cardiorespiratory interactions during head-up tilt and of joint brain-heart dynamics during sleep.
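The decomposition itself can be illustrated in the linear-Gaussian case, where entropies reduce to log variance ratios and the model-free nearest-neighbor machinery is not needed (a simplifying assumption relative to the paper): predictive information splits exactly into information storage plus transfer entropy.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 20000
x = np.zeros(n)
y = np.zeros(n)
for t in range(1, n):
    x[t] = 0.8 * x[t - 1] + rng.normal()                    # driver process
    y[t] = 0.5 * y[t - 1] + 0.6 * x[t - 1] + rng.normal()   # driven target

def resid_var(target, regressors):
    # Residual variance of a least-squares prediction of the target.
    A = np.column_stack(regressors)
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    return np.var(target - A @ coef)

Y, Yp, Xp = y[1:], y[:-1], x[:-1]
pred_info = 0.5 * np.log(np.var(Y) / resid_var(Y, [Yp, Xp]))  # predictive info
storage = 0.5 * np.log(np.var(Y) / resid_var(Y, [Yp]))        # info storage
transfer = pred_info - storage                                # TE, x -> y
```

With the coupling `0.6` active, the transfer term is strictly positive; setting it to zero would drive `transfer` to (sample) zero while leaving `storage` intact, which is the separation the framework exploits.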

  6. Constraining the magnitude of the largest event in a foreshock-main shock-aftershock sequence

    Science.gov (United States)

    Shcherbakov, Robert; Zhuang, Jiancang; Ogata, Yosihiko

    2018-01-01

    Extreme value statistics and Bayesian methods are used to constrain the magnitudes of the largest expected earthquakes in a sequence governed by the parametric time-dependent occurrence rate and frequency-magnitude statistics. The Bayesian predictive distribution for the magnitude of the largest event in a sequence is derived. Two types of sequences are considered, that is, the classical aftershock sequences generated by large main shocks and the aftershocks generated by large foreshocks preceding a main shock. For the former sequences, the early aftershocks during a training time interval are used to constrain the magnitude of the future extreme event during the forecasting time interval. For the latter sequences, the earthquakes preceding the main shock are used to constrain the magnitudes of the subsequent extreme events including the main shock. The analysis is applied retrospectively to past prominent earthquake sequences.
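A plug-in sketch of the extreme value step described above (the paper additionally places Bayesian posteriors on the occurrence-rate and frequency-magnitude parameters; here the Gutenberg-Richter parameters and event count are fixed, assumed values):

```python
import numpy as np

b, m_c = 1.0, 3.0          # GR b-value and magnitude of completeness (assumed)
n_events = 400             # expected number of events in the forecast window

def prob_max_exceeds(m):
    # P(largest magnitude > m) for n_events with GR-distributed magnitudes.
    p_single = 10.0 ** (-b * (m - m_c))     # P(a single magnitude exceeds m)
    return 1.0 - (1.0 - p_single) ** n_events

grid = np.linspace(m_c, 9.0, 2001)
probs = np.array([prob_max_exceeds(m) for m in grid])
m_median = grid[np.argmin(np.abs(probs - 0.5))]   # median largest magnitude
```

Integrating `prob_max_exceeds` over a posterior for `b`, `m_c`, and the time-dependent rate (instead of fixing them) yields the Bayesian predictive distribution for the magnitude of the largest expected event.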

  6. Prediction information - GRIPDB | LSDB Archive [Life Science Database Archive metadata]

    Lifescience Database Archive (English)

    Full Text Available. Description of data contents: predicted GPCR interaction regions. Data file name: gripdb_predicted_info.zip. File URL: ftp://ftp.biosciencedbc.jp/archive/gripdb/LATEST/gripdb_predicted_info.zip. File size: 219 KB. Simple search URL: http://togodb.biosciencedbc.jp/togodb/view/gripdb. Data items: prediction information ID; GRIP ID associated with the prediction.

  8. Drug-Target Interaction Prediction through Label Propagation with Linear Neighborhood Information.

    Science.gov (United States)

    Zhang, Wen; Chen, Yanlin; Li, Dingfang

    2017-11-25

    Interactions between drugs and target proteins provide important information for drug discovery. Currently, experiments have identified only a small number of drug-target interactions, so the development of computational methods for drug-target interaction prediction is an urgent task of both theoretical interest and practical significance. In this paper, we propose a label propagation method with linear neighborhood information (LPLNI) for predicting unobserved drug-target interactions. First, we calculate drug-drug linear neighborhood similarity in the feature spaces by considering how to reconstruct data points from their neighbors. Then, we take these similarities as the manifold of drugs and assume the manifold is unchanged in the interaction space. Finally, we predict unobserved interactions between known drugs and targets by using drug-drug linear neighborhood similarity and known drug-target interactions. The experiments show that LPLNI can use only known drug-target interactions to make high-accuracy predictions on four benchmark datasets. Furthermore, we consider incorporating chemical structures into LPLNI models. Experimental results demonstrate that the model with integrated information (LPLNI-II) produces improved performance, better than other state-of-the-art methods. The known drug-target interactions are an important information source for computational predictions. The usefulness of the proposed method is demonstrated by cross validation and a case study.
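
    As a minimal sketch of the label-propagation idea (not the exact LPLNI formulation: the linear-neighborhood similarity computation is replaced here by an arbitrary nonnegative similarity matrix, and the names and parameters are illustrative), scores diffuse from known interactions over the similarity graph:

```python
def label_propagation(W, y, alpha=0.8, iters=200):
    """Generic label propagation: f <- alpha * W_norm f + (1 - alpha) y.
    W: symmetric nonnegative similarity matrix (list of lists),
    y: initial labels (1.0 = known interaction, 0.0 = unknown)."""
    n = len(W)
    # Row-normalise W so each row sums to 1; keeps the iteration a contraction.
    Wn = []
    for row in W:
        s = sum(row) or 1.0
        Wn.append([v / s for v in row])
    f = y[:]
    for _ in range(iters):
        f = [alpha * sum(Wn[i][j] * f[j] for j in range(n)) + (1 - alpha) * y[i]
             for i in range(n)]
    return f
```

    On a three-node chain where only the first node carries a known label, the converged scores decay with distance from the labeled node, which is the ranking behavior such methods exploit.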

  9. Largest particle detector nearing completion

    CERN Multimedia

    2006-01-01

    "Construction of another part of the Large Hadron Collider (LHC), the world's largest particle accelerator at CERN in Switzerland, is nearing completion. The Compact Muon Solenoid (CMS) is one of the LHC project's four large particle detectors." (1/2 page)

  10. Aggregation of Information and Beliefs in Prediction Markets

    DEFF Research Database (Denmark)

    Ottaviani, Marco; Sørensen, Peter Norman

    We analyze a binary prediction market in which traders have heterogeneous prior beliefs and private information. Realistically, we assume that traders are allowed to invest a limited amount of money (or have decreasing absolute risk aversion). We show that the rational expectations equilibrium price underreacts to information. When information favorable to an event is available and is revealed by the market, the price increases, and this forces optimists to reduce the number of assets they can (or want to) buy. For the market to equilibrate, the price must increase less than a posterior belief…

  11. Prediction of Missing Streamflow Data using Principle of Information Entropy

    Directory of Open Access Journals (Sweden)

    Santosa, B.

    2014-01-01

    Full Text Available Incomplete (missing) streamflow data often occur, caused by discontinuous data recording or poor storage. In this study, missing consecutive streamflow data are predicted using the principle of information entropy. Predictions are performed using the complete monthly streamflow information from a nearby river. Data on average monthly streamflow used as a simulation sample are taken from the observation stations Katulampa, Batubeulah, and Genteng, in the upstream Ciliwung and Cisadane river areas. The simulated predictions of missing streamflow data in 2002 and 2003 at Katulampa Station are based on information from Genteng Station and Batubeulah Station. The average mean absolute error (MAE) obtained was 0.20 and 0.21 in 2002, and 0.12 and 0.16 in 2003. Based on the error values and the pattern of the filled gaps, this method has the potential to be developed further.

  12. Application of the largest Lyapunov exponent and non-linear fractal extrapolation algorithm to short-term load forecasting

    International Nuclear Information System (INIS)

    Wang Jianzhou; Jia Ruiling; Zhao Weigang; Wu Jie; Dong Yao

    2012-01-01

    Highlights: ► The maximal predictive step size is determined by the largest Lyapunov exponent. ► A proper forecasting step size is applied to load demand forecasting. ► The improved approach is validated by actual load demand data. ► The non-linear fractal extrapolation method is compared with three forecasting models. ► Performance of the models is evaluated by three different error measures. - Abstract: Precise short-term load forecasting (STLF) plays a key role in unit commitment, maintenance and economic dispatch problems. Employing a subjective and arbitrary predictive step size is one of the most important factors causing low forecasting accuracy. To solve this problem, the largest Lyapunov exponent is adopted to estimate the maximal predictive step size, so that the step size used in forecasting is no more than this maximal one. In addition, a seldom-used forecasting model based on the non-linear fractal extrapolation (NLFE) algorithm is considered to improve prediction accuracy. The suitability and superiority of the two solutions are illustrated through an application to real load forecasting using New South Wales electricity load data from the Australian National Electricity Market. Meanwhile, three forecasting models that have received high approval in STLF (the gray model, the seasonal autoregressive integrated moving average approach, and the support vector machine method) are selected for comparison with the NLFE algorithm. Comparison results also show that the NLFE model is outstanding, effective, practical and feasible.
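
    The logic of bounding the forecasting step size can be illustrated on a toy system. The sketch below estimates the largest Lyapunov exponent of the logistic map (whose exact value at r = 4 is ln 2) from the average log of the local derivative, then converts it into a maximal predictive horizon. This illustrates the principle only, not the paper's load-forecasting pipeline; the error and tolerance values are assumptions.

```python
import math

def lyapunov_logistic(r=4.0, x0=0.3, n=20000, burn=1000):
    """Largest Lyapunov exponent of the logistic map x' = r x (1 - x),
    estimated as the trajectory average of ln|f'(x)| = ln|r - 2 r x|."""
    x = x0
    for _ in range(burn):          # discard the transient
        x = r * x * (1.0 - x)
    acc = 0.0
    for _ in range(n):
        acc += math.log(abs(r - 2.0 * r * x))
        x = r * x * (1.0 - x)
    return acc / n

def max_prediction_horizon(lyap, initial_err=1e-6, tolerance=1e-2):
    """Steps until an initial error grows to the tolerance:
    T = ln(tolerance / initial_err) / lyap."""
    return math.log(tolerance / initial_err) / lyap
```

    With lambda close to ln 2, an initial error of 1e-6 reaches 1e-2 after roughly 13 steps, so forecasting further than that horizon is not meaningful for this system.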

  13. Health insurance premium increases for the 5 largest school districts in the United States, 2004-2008.

    Science.gov (United States)

    Cantillo, John R

    2010-03-01

    Local school districts are often one of the largest, if not the largest, employers in their respective communities. Like many large employers, school districts offer health insurance to their employees. There is a lack of information about the rate of health insurance premiums in US school districts relative to other employers. To assess the change in the costs of healthcare insurance in the 5 largest public school districts in the United States, between 2004 and 2008, as representative of large public employers in the country. Data for this study were drawn exclusively from a survey sent to the 5 largest public school districts in the United States. The survey requested responses on 3 data elements for each benefit plan offered from 2004 through 2008; these included enrollment, employee costs, and employer costs. The premium growth for the 5 largest school districts has slowed down and is consistent with other purchasers: Kaiser/Health Research & Educational Trust and the Federal Employee Health Benefit Program. The average increase in health insurance premium for the schools was 5.9% in 2008, and the average annual growth rate over the study period was 7.5%. For family coverage, these schools provide the most generous employer contribution (80.8%) compared with the employer contribution reported by other employers (73.5%) for 2008. Often the largest employers in their communities, school districts demonstrate a commitment to provide choice of benefits and affordability for employees and their families. Despite constraints typical of public employers, the 5 largest school districts in the United States have decelerated in premium growth consistent with other purchasers, albeit at a slower pace.

  14. Social networks predict selective observation and information spread in ravens

    Science.gov (United States)

    Rubenstein, Daniel I.; Bugnyar, Thomas; Hoppitt, William; Mikus, Nace; Schwab, Christine

    2016-01-01

    Animals are predicted to selectively observe and learn from the conspecifics with whom they share social connections. Yet, hardly anything is known about the role of different connections in observation and learning. To address the relationships between social connections, observation and learning, we investigated transmission of information in two raven (Corvus corax) groups. First, we quantified social connections in each group by constructing networks on affiliative interactions, aggressive interactions and proximity. We then seeded novel information by training one group member on a novel task and allowing others to observe. In each group, an observation network based on who observed whose task-solving behaviour was strongly correlated with networks based on affiliative interactions and proximity. Ravens with high social centrality (strength, eigenvector, information centrality) in the affiliative interaction network were also central in the observation network, possibly as a result of solving the task sooner. Network-based diffusion analysis revealed that the order that ravens first solved the task was best predicted by connections in the affiliative interaction network in a group of subadult ravens, and by social rank and kinship (which influenced affiliative interactions) in a group of juvenile ravens. Our results demonstrate that not all social connections are equally effective at predicting the patterns of selective observation and information transmission. PMID:27493780

  15. Adult age differences in predicting memory performance: the effects of normative information and task experience.

    Science.gov (United States)

    McDonald-Miszczak, L; Hunter, M A; Hultsch, D F

    1994-03-01

    Two experiments addressed the effects of task information and experience on younger and older adults' ability to predict their memory for words. The first study examined the effects of normative task information on subjects' predictions for 30-word lists across three trials. The second study looked at the effects of making predictions and recalling either an easy (15) or a difficult (45) word list prior to making predictions and recalling a moderately difficult (30) word list. The results from both studies showed that task information and experience affected subjects' predictions and that elderly adults predicted their performance more accurately than younger adults.

  16. Improving protein-protein interaction prediction using evolutionary information from low-quality MSAs.

    Science.gov (United States)

    Várnai, Csilla; Burkoff, Nikolas S; Wild, David L

    2017-01-01

    Evolutionary information stored in multiple sequence alignments (MSAs) has been used to identify the interaction interface of protein complexes, by measuring either co-conservation or co-mutation of amino acid residues across the interface. Recently, maximum entropy related correlated mutation measures (CMMs) such as direct information, decoupling direct from indirect interactions, have been developed to identify residue pairs interacting across the protein complex interface. These studies have focussed on carefully selected protein complexes with large, good-quality MSAs. In this work, we study protein complexes with a more typical MSA consisting of fewer than 400 sequences, using a set of 79 intramolecular protein complexes. Using a maximum entropy based CMM at the residue level, we develop an interface level CMM score to be used in re-ranking docking decoys. We demonstrate that our interface level CMM score compares favourably to the complementarity trace score, an evolutionary information-based score measuring co-conservation, when combined with the number of interface residues, a knowledge-based potential and the variability score of individual amino acid sites. We also demonstrate that, since co-mutation and co-complementarity in the MSA contain orthogonal information, the best prediction performance using evolutionary information can be achieved by combining the co-mutation information of the CMM with the co-conservation information of a complementarity trace score, predicting a near-native structure as the top prediction for 41% of the dataset. The method presented is not restricted to small MSAs, and will likely improve interface prediction also for complexes with large and good-quality MSAs.

  17. Alternative Fuels Data Center: America's Largest Home Runs on Biodiesel in North Carolina

    Science.gov (United States)


  18. Quantifying predictability through information theory: small sample estimation in a non-Gaussian framework

    International Nuclear Information System (INIS)

    Haven, Kyle; Majda, Andrew; Abramov, Rafail

    2005-01-01

    Many situations in complex systems require quantitative estimates of the lack of information in one probability distribution relative to another. In short-term climate and weather prediction, examples of these issues might involve the lack of information in the historical climate record compared with an ensemble prediction, or the lack of information in a particular Gaussian ensemble prediction strategy involving the first and second moments compared with the non-Gaussian ensemble itself. The relative entropy is a natural way to quantify the predictive utility in this information, and recently a systematic computationally feasible hierarchical framework has been developed. In practical systems with many degrees of freedom, computational overhead limits ensemble predictions to relatively small sample sizes. Here the notion of predictive utility, in a relative entropy framework, is extended to small random samples by the definition of a sample utility, a measure of the unlikeliness that a random sample was produced by a given prediction strategy. The sample utility is the minimum predictability, with a statistical level of confidence, which is implied by the data. Two practical algorithms for measuring such a sample utility are developed here. The first technique is based on the statistical method of null-hypothesis testing, while the second is based upon a central limit theorem for the relative entropy of moment-based probability densities. These techniques are tested on known probability densities with parameterized bimodality and skewness, and then applied to the Lorenz '96 model, a recently developed 'toy' climate model with chaotic dynamics mimicking the atmosphere. The results show a detection of non-Gaussian tendencies of prediction densities at small ensemble sizes with between 50 and 100 members, with a 95% confidence level.
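
    As a minimal illustration of the relative-entropy measure underlying this framework, the Kullback-Leibler divergence between two one-dimensional Gaussians has a standard closed form (this is textbook material, not the paper's sample-utility algorithm; the function name is illustrative):

```python
import math

def kl_gaussian(mu1, s1, mu2, s2):
    """Relative entropy D(p||q) in nats between 1-D Gaussians
    p = N(mu1, s1^2) and q = N(mu2, s2^2):
    D = ln(s2/s1) + (s1^2 + (mu1 - mu2)^2) / (2 s2^2) - 1/2."""
    return math.log(s2 / s1) + (s1 ** 2 + (mu1 - mu2) ** 2) / (2.0 * s2 ** 2) - 0.5
```

    D(p||q) vanishes only when the densities coincide and is asymmetric in its arguments, which is why the direction of comparison (for instance, ensemble relative to climatology) matters in the predictability setting above.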

  20. Predicting Genes Involved in Human Cancer Using Network Contextual Information

    Directory of Open Access Journals (Sweden)

    Rahmani Hossein

    2012-03-01

    Full Text Available Protein-Protein Interaction (PPI) networks have been widely used for the task of predicting proteins involved in cancer. Previous research has shown that functional information about the protein for which a prediction is made, proximity to specific other proteins in the PPI network, as well as local network structure are informative features in this respect. In this work, we introduce two new types of input features, reflecting additional information: (1) Functional Context: the functions of proteins interacting with the target protein (rather than the protein itself); and (2) Structural Context: the relative position of the target protein with respect to specific other proteins selected according to a novel ANOVA (analysis of variance) based measure. We also introduce a selection strategy to pinpoint the most informative features. Results show that the proposed feature types and feature selection strategy yield informative features. A standard machine learning method (Naive Bayes) that uses the features proposed here outperforms the current state-of-the-art methods by more than 5% with respect to F-measure. In addition, manual inspection confirms the biological relevance of the top-ranked features.

  1. Identifying the node spreading influence with largest k-core values

    International Nuclear Information System (INIS)

    Lin, Jian-Hong; Guo, Qiang; Dong, Wen-Zhao; Tang, Li-Ying; Liu, Jian-Guo

    2014-01-01

    Identifying the nodes with the largest spreading influence in complex networks is one of the most promising research domains. By taking into account the neighbors' k-core values, we present an improved neighbors' k-core (INK) method, which sums the neighbors' k-core values with a tunable parameter α to evaluate the node spreading influence. Comparing with the Susceptible–Infected–Recovered (SIR) results for four real networks, the INK method identifies the node spreading influence more accurately than the degree k, closeness C, betweenness B and coreness centrality methods. - Highlights: • We present an improved neighbors' k-core (INK) method to evaluate the node spreading influence. • The INK method identifies the node spreading influence more accurately. • Node-influence rankings produced by the INK method with α=1 are highly consistent with the SIR results (Kendall's tau τ).
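
    The coreness computation behind such methods is a standard minimum-degree peeling. The sketch below computes core numbers and then an INK-style score; the exact role of the tunable parameter α in the paper is not specified here, so raising each neighbor's core value to the power α before summing is an assumption for illustration.

```python
def core_numbers(adj):
    """k-core (coreness) of each node via iterative minimum-degree pruning.
    adj: dict mapping node -> set of neighbours (undirected graph)."""
    deg = {v: len(ns) for v, ns in adj.items()}
    core = {}
    remaining = set(adj)
    while remaining:
        v = min(remaining, key=lambda u: deg[u])  # peel the min-degree node
        core[v] = deg[v]
        remaining.discard(v)
        for u in adj[v]:
            if u in remaining and deg[u] > deg[v]:
                deg[u] -= 1  # degrees never drop below the current core level
    return core

def ink_score(adj, alpha=1.0):
    """INK-style score: sum of neighbours' core numbers, each raised to alpha."""
    core = core_numbers(adj)
    return {v: sum(core[u] ** alpha for u in adj[v]) for v in adj}
```

    On a triangle a-b-c with a pendant node d attached to a, the triangle nodes have coreness 2 and the pendant coreness 1, so a's INK score (2 + 2 + 1 = 5) correctly singles it out as the most influential spreader.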

  2. Endogenous Information, Risk Characterization, and the Predictability of Average Stock Returns

    Directory of Open Access Journals (Sweden)

    Pradosh Simlai

    2012-09-01

    Full Text Available In this paper we provide a new type of risk characterization of the predictability of two widely known abnormal patterns in average stock returns: momentum and reversal. The purpose is to illustrate the relative importance of common risk factors and endogenous information. Our results demonstrate that, in the presence of zero-investment factors, spreads in average momentum and reversal returns correspond to spreads in the slopes of the endogenous information. The empirical findings support the view that various classes of firms react differently to volatility risk, and that endogenous information harbors important sources of potential risk loadings. Taken together, our results suggest that returns are influenced by random endogenous information flow, which is asymmetric in nature and can be used as a performance attribution factor. If one fails to incorporate the asymmetric endogenous information hidden in historical behavior, any attempt to explore average stock return predictability will be subject to an unquantified specification bias.

  3. CERN tests largest superconducting solenoid magnet

    CERN Multimedia

    2006-01-01

    "CERN's Compact Muon Solenoid (CMS) - the world's largest superconducting solenoid magnet - has reached full field in testing. The instrument is part of the proton-proton Large Hadron Collider (LHC) project, located in a giant subterranean chamber at Cessy on the Franco-Swiss border." (1 page)

  4. Information trimming: Sufficient statistics, mutual information, and predictability from effective channel states

    Science.gov (United States)

    James, Ryan G.; Mahoney, John R.; Crutchfield, James P.

    2017-06-01

    One of the most basic characterizations of the relationship between two random variables, X and Y, is the value of their mutual information. Unfortunately, calculating it analytically and estimating it empirically are often stymied by the extremely large dimension of the variables. One might hope to replace such a high-dimensional variable by a smaller one that preserves its relationship with the other. It is well known that either X (or Y) can be replaced by its minimal sufficient statistic about Y (or X) while preserving the mutual information. While intuitively reasonable, it is not obvious or straightforward that both variables can be replaced simultaneously. We demonstrate that this is in fact possible: the information X's minimal sufficient statistic preserves about Y is exactly the information that Y's minimal sufficient statistic preserves about X. We call this procedure information trimming. As an important corollary, we consider the case where one variable is a stochastic process's past and the other its future. In this case, the mutual information is the channel transmission rate between the channel's effective states. That is, the past-future mutual information (the excess entropy) is the amount of information about the future that can be predicted using the past. Translating our result about minimal sufficient statistics, this is equivalent to the mutual information between the forward- and reverse-time causal states of computational mechanics. We close by discussing multivariate extensions to this use of minimal sufficient statistics.
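
    The base quantity here, mutual information, can be estimated for discrete samples with a simple plug-in estimator (names are illustrative; this naive approach is precisely what breaks down for the high-dimensional variables discussed above, motivating sufficient-statistic reductions):

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Plug-in estimate of I(X;Y) in bits from a list of (x, y) samples:
    I = sum_{x,y} p(x,y) log2( p(x,y) / (p(x) p(y)) )."""
    n = len(pairs)
    pxy = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    mi = 0.0
    for (x, y), c in pxy.items():
        p = c / n
        # p(x,y) / (p(x) p(y)) = c * n / (count_x * count_y)
        mi += p * math.log2(p * n * n / (px[x] * py[y]))
    return mi
```

    For a perfectly copied uniform binary variable the estimate is 1 bit, and for independent uniform binary variables it is 0, matching the exact values.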

  5. Ownership, financing, and management strategies of the ten largest for-profit nursing home chains in the United States.

    Science.gov (United States)

    Harrington, Charlene; Hauser, Clarilee; Olney, Brian; Rosenau, Pauline Vaillancourt

    2011-01-01

    This study examined the ownership, financing, and management strategies of the 10 largest for-profit nursing home chains in the United States, including the four largest chains purchased by private equity corporations. Descriptive data were collected from Internet searches, company reports, and other sources for the decade 1998-2008. Since 1998, the largest chains have made many changes in their ownership and structure, and some have converted from publicly traded companies to private ownership. This study shows the increasing complexity of corporate nursing home ownership and the lack of public information about ownership and financial status. The chains have used strategies to maximize shareholder and investor value that include increasing Medicare revenues, occupancy rates, and company diversification, establishing multiple layers of corporate ownership, developing real estate investment trusts, and creating limited liability companies. These strategies enhance shareholder and investor profits, reduce corporate taxes, and reduce liability risk. There is a need for greater transparency in ownership and financial reporting and for more government oversight of the largest for-profit chains, including those owned by private equity companies.

  6. Predicting masking release of lateralized speech

    DEFF Research Database (Denmark)

    Chabot-Leclerc, Alexandre; MacDonald, Ewen; Dau, Torsten

    2016-01-01

    …The largest masking release (MR) was observed when all maskers were on the opposite side of the target. The data in the conditions containing only energetic masking and modulation masking could be accounted for using a binaural extension of the speech-based envelope power spectrum model [sEPSM; Jørgensen et al., 2013, J. Acoust. Soc. Am. 130], which uses a short-term equalization-cancellation process to model binaural unmasking. In the conditions where informational masking (IM) was involved, the predicted SRTs were lower than the measured values because the model is blind to the confusions experienced…

  7. The global diversion of pharmaceutical drugs. India: the third largest illicit opium producer?

    Science.gov (United States)

    Paoli, Letizia; Greenfield, Victoria A; Charles, Molly; Reuter, Peter

    2009-03-01

    This paper explores India's role in the world illicit opiate market, particularly its role as a producer. India, a major illicit opiate consumer, is also the sole licensed exporter of raw opium: this unique status may be enabling substantial diversion to the illicit market. Participant observation and interviews were carried out at eight different sites. Information was also drawn from all standard secondary sources and the analysis of about 180 drug-related criminal proceedings reviewed by Indian High Courts and the Supreme Court from 1985 to 2001. Diversion from licit opium production takes place on such a large scale that India may be the third largest illicit opium producer after Afghanistan and Burma. With the possible exceptions of 2005 and 2006, 200-300 tons of India's opium may be diverted yearly. After estimating India's opiate consumption on the basis of UN-reported prevalence estimates, we find that diversion from licit production might have satisfied a quarter to more than a third of India's illicit opiate demand up to 2004. India is not only among the world's largest consumers of illicit opiates but also one of the largest illicit opium producers. In contrast to all other illicit producers, India owes the latter distinction not to blatantly illicit cultivation but to diversion from licit cultivation. India's experience suggests the difficulty of preventing substantial leakage, even in a relatively well-governed nation.

  8. A rule-based backchannel prediction model using pitch and pause information

    NARCIS (Netherlands)

    Truong, Khiet Phuong; Poppe, Ronald Walter; Heylen, Dirk K.J.

    We manually designed rules for a backchannel (BC) prediction model based on pitch and pause information. In short, the model predicts a BC when there is a pause of a certain length that is preceded by a falling or rising pitch. This model was validated against the Dutch IFADV Corpus in a
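
    A toy version of such a pitch-and-pause rule might look as follows; the frame representation, thresholds, and function name are all assumptions for illustration, not the model validated in the paper.

```python
def predict_backchannels(frames, min_pause=0.5, slope_eps=2.0):
    """Toy rule: predict a backchannel opportunity at a pause of at least
    `min_pause` seconds whose preceding voiced region ends with a clearly
    rising or falling pitch (|slope| >= slope_eps Hz between frames).
    frames: list of (duration_s, pitch_hz); pitch_hz=None marks silence."""
    events = []
    for i in range(2, len(frames)):
        dur, pitch = frames[i]
        if pitch is not None or dur < min_pause:
            continue  # not a sufficiently long pause
        p1, p2 = frames[i - 2][1], frames[i - 1][1]
        if p1 is None or p2 is None:
            continue  # need two voiced frames before the pause
        if abs(p2 - p1) >= slope_eps:  # rising or falling pitch before pause
            events.append(i)
    return events
```

    A pause preceded by flat pitch yields no prediction, while the same pause preceded by a rising (or falling) contour does, which is the essence of the hand-designed rule.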

  9. Analysis of Human Standing Balance by Largest Lyapunov Exponent

    Directory of Open Access Journals (Sweden)

    Kun Liu

    2015-01-01

    Full Text Available The purpose of this research is to analyse the relationship between nonlinear dynamic character and individuals’ standing balance by the largest Lyapunov exponent, which is regarded as a metric for assessing standing balance. According to previous study, the largest Lyapunov exponent from centre of pressure time series could not well quantify the human balance ability. In this research, two improvements were made. Firstly, an external stimulus was applied to feet in the form of continuous horizontal sinusoidal motion by a moving platform. Secondly, a multiaccelerometer subsystem was adopted. Twenty healthy volunteers participated in this experiment. A new metric, coordinated largest Lyapunov exponent was proposed, which reflected the relationship of body segments by integrating multidimensional largest Lyapunov exponent values. By using this metric in actual standing performance under sinusoidal stimulus, an obvious relationship between the new metric and the actual balance ability was found in the majority of the subjects. These results show that the sinusoidal stimulus can make human balance characteristics more obvious, which is beneficial to assess balance, and balance is determined by the ability of coordinating all body segments.

  10. Efficient network disintegration under incomplete information: the comic effect of link prediction

    Science.gov (United States)

    Tan, Suo-Yi; Wu, Jun; Lü, Linyuan; Li, Meng-Jun; Lu, Xin

    2016-01-01

    The study of network disintegration has attracted much attention due to its wide applications, including suppressing epidemic spreading, destabilizing terrorist networks, preventing financial contagion, controlling rumor diffusion and perturbing cancer networks. The crux of this matter is to find the critical nodes whose removal will lead to network collapse. This paper studies the disintegration of networks with incomplete link information. An effective method is proposed to find the critical nodes with the assistance of link prediction techniques. Extensive experiments in both synthetic and real networks suggest that, by using a link prediction method to recover partial missing links in advance, the method can largely improve the network disintegration performance. Besides, to our surprise, we find that when the size of the missing information is relatively small, our method even outperforms the results based on complete information. We refer to this phenomenon as the “comic effect” of link prediction, which means that the network is reshaped through the addition of some links identified by link prediction algorithms, and the reshaped network is like an exaggerated but characteristic comic of the original one, where the important parts are emphasized. PMID:26960247
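
    A toy sketch of the idea: recover likely missing links with a simple common-neighbours predictor, then choose a removal target on the augmented network (here by degree, standing in for more sophisticated critical-node criteria; all names and the choice of predictor are illustrative assumptions).

```python
def common_neighbour_scores(adj):
    """Score each absent node pair by its number of shared neighbours.
    adj: dict mapping node -> set of neighbours (undirected graph)."""
    nodes = sorted(adj)
    scores = {}
    for i, u in enumerate(nodes):
        for v in nodes[i + 1:]:
            if v not in adj[u]:  # only pairs without an observed link
                scores[(u, v)] = len(adj[u] & adj[v])
    return scores

def disintegration_target(adj, top_k=1):
    """Add the top_k predicted links to the observed graph, then pick the
    highest-degree node of the augmented graph as the removal candidate."""
    aug = {v: set(ns) for v, ns in adj.items()}
    ranked = sorted(common_neighbour_scores(adj).items(), key=lambda kv: -kv[1])
    for (u, v), _ in ranked[:top_k]:
        aug[u].add(v)
        aug[v].add(u)
    return max(aug, key=lambda v: len(aug[v]))
```

    On a hub-and-spoke graph the hub remains the critical node after augmentation; in the incomplete-information setting studied above, the augmentation step is what lets the attacker find such hubs despite missing links.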

  12. GIS learning tool for world's largest earthquakes and their causes

    Science.gov (United States)

    Chatterjee, Moumita

    The objective of this thesis is to increase awareness about earthquakes among people, especially young students, by showing the five largest and two most predictable earthquake locations in the world and their plate tectonic settings. This is a geography-based interactive tool which can be used for learning about the causes of great earthquakes in the past and the safest places on earth for avoiding the direct effects of earthquakes. This approach provides an effective way of learning for students, as it is very user friendly and more aligned to the interests of the younger generation. In this tool the user can click on various points located on the world map, each of which opens a picture and a link to the webpage for that point, showing detailed information on the earthquake history of that place, including the magnitude and year of past quakes and the plate tectonic settings that made the place earthquake prone. Apart from this earthquake-related information, students will also be able to customize the tool to suit their needs or interests. Students can add or remove layers, measure the distance between any two points on the map, select any place on the map to see more information about it, create a layer for detailed analysis, run a query, change display settings, etc. At the end, the user is taken through earthquake safety guidelines in order to stay safe during an earthquake. The tool uses Java as its programming language and Map Objects Java Edition (MOJO) provided by ESRI. It was developed for educational purposes, so its interface has been kept simple and easy to use so that students can gain maximum knowledge from it instead of having a hard time installing it. There are many details to explore which can show what a GIS-based tool is capable of. The only thing needed to run this tool is the latest Java edition installed on the machine.
This approach makes study more fun and

  13. Evaluating and Predicting Patient Safety for Medical Devices With Integral Information Technology

    Science.gov (United States)

    2005-01-01

    Evaluating and Predicting Patient Safety for Medical Devices with Integral Information Technology. Jiajie Zhang, Vimla L. Patel, Todd R... errors are due to inappropriate designs for user interactions, rather than mechanical failures. Evaluating and predicting patient safety in medical ... the users on the identified trouble spots in the devices. We developed two methods for evaluating and predicting patient safety in medical devices

  14. Network information improves cancer outcome prediction.

    Science.gov (United States)

    Roy, Janine; Winter, Christof; Isik, Zerrin; Schroeder, Michael

    2014-07-01

    Disease progression in cancer can vary substantially between patients. Yet, patients often receive the same treatment. Recently, there has been much work on predicting disease progression and patient outcome variables from gene expression in order to personalize treatment options. Despite the first diagnostic kits on the market, there are open problems such as the choice of random gene signatures or noisy expression data. One approach to dealing with these two problems employs protein-protein interaction networks and ranks genes using the random surfer model of Google's PageRank algorithm. In this work, we created a benchmark dataset collection comprising 25 cancer outcome prediction datasets from the literature and systematically evaluated the use of networks and a PageRank derivative, NetRank, for signature identification. We show that NetRank performs significantly better than classical methods such as fold change or the t-test. Despite an order of magnitude difference in network size, a regulatory and a protein-protein interaction network perform equally well. Experimental evaluation on cancer outcome prediction in all 25 underlying datasets suggests that the network-based methodology identifies highly overlapping signatures over all cancer types, in contrast to classical methods that fail to identify highly common gene sets across the same cancer types. Integration of network information into gene expression analysis allows the identification of more reliable and accurate biomarkers and provides a deeper understanding of processes occurring in cancer development and progression. © The Author 2012. Published by Oxford University Press. For Permissions, please email: journals.permissions@oup.com.
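The random surfer model mentioned above can be illustrated with a plain PageRank power iteration; NetRank additionally biases this ranking with per-gene scores. The damping value and the toy four-protein graph below are illustrative assumptions, not the paper's network or parameters.

```python
def pagerank(adj, damping=0.85, iters=100):
    """Plain power-iteration PageRank over an adjacency dict of sets."""
    nodes = list(adj)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        new = {}
        for n in nodes:
            # rank mass flowing into n from every node m that links to it
            inflow = sum(rank[m] / len(adj[m]) for m in nodes if n in adj[m])
            new[n] = (1 - damping) / len(nodes) + damping * inflow
        rank = new
    return rank

# Toy directed "interaction" graph over hypothetical proteins A-D
graph = {"A": {"B", "C"}, "B": {"C"}, "C": {"A"}, "D": {"C"}}
ranks = pagerank(graph)
hub = max(ranks, key=ranks.get)   # "C" receives the most links
```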

  15. Early Prediction of Student Dropout and Performance in MOOCs Using Higher Granularity Temporal Information

    Science.gov (United States)

    Ye, Cheng; Biswas, Gautam

    2014-01-01

    Our project is motivated by the early dropout and low completion rate problem in MOOCs. We have extended traditional features for MOOC analysis with richer and higher granularity information to make more accurate predictions of dropout and performance. The results show that finer-grained temporal information increases the predictive power in the…

  16. Ensemble Architecture for Prediction of Enzyme-ligand Binding Residues Using Evolutionary Information.

    Science.gov (United States)

    Pai, Priyadarshini P; Dattatreya, Rohit Kadam; Mondal, Sukanta

    2017-11-01

    Enzyme interactions with ligands are crucial for various biochemical reactions governing life. Over many years, attempts to identify these residues for biotechnological manipulation have been made using experimental and computational techniques. The computational approaches have gathered impetus with the accruing availability of sequence and structure information, and are broadly classified into template-based and de novo methods. One of the predominant de novo methods using sequence information involves the application of biological properties for supervised machine learning. Here, we propose a support vector machine-based ensemble for the prediction of protein-ligand interacting residues using one of the most important discriminative properties in the interacting residue neighbourhood, i.e., evolutionary information in the form of a position-specific scoring matrix (PSSM). The study has been performed on a non-redundant dataset comprising 9269 interacting and 91773 non-interacting residues for prediction model generation and further evaluation. Of the various PSSM-based models explored, the proposed method, named ROBBY (pRediction Of Biologically relevant small molecule Binding residues on enzYmes), shows an accuracy of 84.0 %, a Matthews correlation coefficient of 0.343 and an F-measure of 39.0 % on 78 test enzymes. Further, the scope of adding domain knowledge such as pocket information has also been investigated; results showed significant enhancement in method precision. These findings are hoped to boost the reliability of small-molecule ligand interaction prediction for enzyme applications and drug design. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Comparison of Predictive Contract Mechanisms from an Information Theory Perspective

    OpenAIRE

    Zhang, Xin; Ward, Tomas; McLoone, Seamus

    2012-01-01

    Inconsistency arises across a Distributed Virtual Environment due to network latency induced by state-change communications. Predictive Contract Mechanisms (PCMs) combat this problem by reducing the number of messages transmitted in return for perceptually tolerable inconsistency. To date there are no methods to quantify the efficiency of PCMs in communicating this reduced state information. This article presents an approach derived from concepts in information theory for a dee...

  18. The information value of early career productivity in mathematics: a ROC analysis of prediction errors in bibliometrically informed decision making.

    Science.gov (United States)

    Lindahl, Jonas; Danell, Rickard

    The aim of this study was to provide a framework for evaluating bibliometric indicators as decision support tools from a decision-making perspective and to examine the information value of early career publication rate as a predictor of future productivity. We used ROC analysis to evaluate a bibliometric indicator as a tool for binary decision making. The dataset consisted of 451 early career researchers in the mathematical sub-field of number theory. We investigated the effect of three different definitions of the top performance group-top 10, top 25, and top 50 %; the consequences of using different thresholds in the prediction models; and the added prediction value of information on early career research collaboration and publications in prestige journals. We conclude that early career productivity has information value in all tested decision scenarios, but future performance is more predictable if the definition of the high performance group is more exclusive. Estimated optimal decision thresholds using the Youden index indicated that the top 10 % decision scenario should use 7 articles, the top 25 % scenario should use 7 articles, and the top 50 % scenario should use 5 articles to minimize prediction errors. A comparative analysis between the decision thresholds provided by the Youden index, which takes consequences into consideration, and a method commonly used in evaluative bibliometrics, which does not take consequences into consideration when determining decision thresholds, indicated that the differences are trivial for the top 25 and top 50 % groups. However, a statistically significant difference between the methods was found for the top 10 % group. Information on early career collaboration and publication strategies did not add any prediction value to the bibliometric indicator publication rate in any of the models. The key contribution of this research is the focus on consequences in terms of prediction errors and the notion of transforming uncertainty
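The Youden index used above to pick decision thresholds is simple to compute: for each candidate threshold, J = sensitivity + specificity - 1, and the optimal threshold maximizes J. The publication counts and group labels below are made-up toy data, not the study's dataset.

```python
def youden_threshold(scores, labels):
    """Return (threshold, J) maximizing J = sensitivity + specificity - 1."""
    pos = sum(labels)
    neg = len(labels) - pos
    best_t, best_j = None, -1.0
    for t in sorted(set(scores)):
        tp = sum(1 for s, y in zip(scores, labels) if s >= t and y)
        fp = sum(1 for s, y in zip(scores, labels) if s >= t and not y)
        j = tp / pos + (neg - fp) / neg - 1
        if j > best_j:
            best_t, best_j = t, j
    return best_t, best_j

# Toy data: early-career publication counts and top-group membership
pubs = [2, 3, 5, 7, 8, 9, 4, 6, 10, 1]
top  = [0, 0, 0, 1, 1, 1, 0, 0, 1, 0]
threshold, j = youden_threshold(pubs, top)   # perfectly separable toy data
```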

  19. Broken magnet highlights largest collider's engineering challenges

    CERN Multimedia

    Inman, Mason

    2007-01-01

    "Even at the world's soon-to-be largest particle accelerator - a device that promises to push the boundaries of physics - scientists need to be mindful of one of the most fundamental laws in the universe: Murphy's Law. (2 pages)

  20. Drug-target interaction prediction from PSSM based evolutionary information.

    Science.gov (United States)

    Mousavian, Zaynab; Khakabimamaghani, Sahand; Kavousi, Kaveh; Masoudi-Nejad, Ali

    2016-01-01

    The labor-intensive and expensive experimental process of drug-target interaction determination has motivated many researchers to focus on in silico prediction, which provides helpful information to support experimental interaction data. Therefore, several computational approaches for discovering new drug-target interactions have been proposed. Learning-based methods have been increasingly developed and can be categorized into two main groups: similarity-based and feature-based. In this paper, we first use the bi-gram features extracted from the position-specific scoring matrix (PSSM) of proteins for predicting drug-target interactions. Our results demonstrate the high-confidence prediction ability of the Bigram-PSSM model in terms of several performance indicators, specifically for enzymes and ion channels. Moreover, we investigate the impact of the negative selection strategy on the performance of the prediction, which is not widely taken into account in other relevant studies. This is important, as the number of non-interacting drug-target pairs is usually extremely large in comparison with the number of interacting ones in existing drug-target interaction data. An interesting observation is that different levels of performance reduction are attained for the four datasets when we change the sampling method from random sampling to balanced sampling. Copyright © 2015 Elsevier Inc. All rights reserved.
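Bi-gram PSSM features of the kind mentioned above are commonly computed as sums of products of consecutive PSSM rows, giving a 20 x 20 = 400-dimensional vector per protein; whether this exact formulation matches the paper's is an assumption, and the tiny two-column "PSSM" below is purely illustrative.

```python
def bigram_pssm(pssm):
    """Sum of outer products of consecutive PSSM rows, flattened."""
    n_cols = len(pssm[0])
    feat = [[0.0] * n_cols for _ in range(n_cols)]
    for k in range(len(pssm) - 1):
        for i in range(n_cols):
            for j in range(n_cols):
                feat[i][j] += pssm[k][i] * pssm[k + 1][j]
    return [x for row in feat for x in row]   # n_cols**2 features

# 3 residues x 2 "amino acid" columns (a real PSSM has 20 columns)
toy_pssm = [[0.1, 0.9],
            [0.8, 0.2],
            [0.5, 0.5]]
features = bigram_pssm(toy_pssm)   # 4 features for this toy matrix
```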

  1. Distinguishing prognostic and predictive biomarkers: An information theoretic approach.

    Science.gov (United States)

    Sechidis, Konstantinos; Papangelou, Konstantinos; Metcalfe, Paul D; Svensson, David; Weatherall, James; Brown, Gavin

    2018-05-02

    The identification of biomarkers to support decision-making is central to personalised medicine, in both clinical and research scenarios. The challenge can be seen in two halves: identifying predictive markers, which guide the development and use of tailored therapies; and identifying prognostic markers, which guide other aspects of care and clinical trial planning, i.e. prognostic markers can be considered as covariates for stratification. Mistakenly assuming a biomarker to be predictive when it is in fact largely prognostic (and vice versa) is highly undesirable, and can have financial, ethical and personal consequences. We present a framework for data-driven ranking of biomarkers by their prognostic/predictive strength, using a novel information-theoretic method. This approach provides a natural algebra to discuss and quantify individual predictive and prognostic strength in a self-consistent mathematical framework. Our contribution is a novel procedure, INFO+, which naturally distinguishes the prognostic vs predictive role of each biomarker and handles higher order interactions. In a comprehensive empirical evaluation, INFO+ outperforms more complex methods, most notably when noise factors dominate and biomarkers are likely to be falsely identified as predictive when in fact they are just strongly prognostic. Furthermore, we show that our methods can be 1-3 orders of magnitude faster than competitors, making them useful for biomarker discovery in 'big data' scenarios. Finally, we apply our methods to identify predictive biomarkers in two real clinical trials, and introduce a new graphical representation that provides greater insight into the prognostic and predictive strength of each biomarker. R implementations of the suggested methods are available at https://github.com/sechidis. konstantinos.sechidis@manchester.ac.uk. Supplementary data are available at Bioinformatics online.

  2. Performance of local information-based link prediction: a sampling perspective

    Science.gov (United States)

    Zhao, Jichang; Feng, Xu; Dong, Li; Liang, Xiao; Xu, Ke

    2012-08-01

    Link prediction is pervasively employed to uncover missing links in snapshots of real-world networks, which are usually obtained through different kinds of sampling methods. In the previous literature, in order to evaluate prediction performance, known edges in the sampled snapshot are divided into a training set and a probe set randomly, without considering the underlying sampling approach. However, different sampling methods might lead to different missing links, especially for biased ones. For this reason, random partition-based evaluation of performance is no longer convincing if we take the sampling method into account. In this paper, we re-evaluate the performance of local information-based link predictions through a division of the training set and the probe set governed by the sampling method. Interestingly, we find that each prediction approach performs unevenly across different sampling methods. Moreover, most of these predictions perform weakly when the sampling method is biased, which indicates that the performance of these methods might have been overestimated in prior works.
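A local information-based index of the kind evaluated above can be sketched with the common neighbours score; note that the probe split below is of the simple random kind whose realism the paper questions. The toy network and probe set are illustrative assumptions.

```python
from itertools import combinations

def common_neighbours(adj, u, v):
    """Local similarity score: number of shared neighbours of u and v."""
    return len(adj[u] & adj[v])

# Training snapshot (observed edges) and a held-out probe link
train = {1: {2, 3}, 2: {1, 3, 4}, 3: {1, 2, 4}, 4: {2, 3, 5}, 5: {4}}
probe = {(1, 4)}

candidates = [(u, v) for u, v in combinations(train, 2) if v not in train[u]]
ranked = sorted(candidates,
                key=lambda p: common_neighbours(train, *p), reverse=True)
hits = sum(1 for p in ranked[:len(probe)] if p in probe)
precision = hits / len(probe)   # (1, 4) shares two neighbours, so it ranks first
```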

  3. Reduced Predictable Information in Brain Signals in Autism Spectrum Disorder

    Directory of Open Access Journals (Sweden)

    Carlos eGomez

    2014-02-01

    Full Text Available Autism spectrum disorder (ASD) is a common developmental disorder characterized by communication difficulties and impaired social interaction. Recent results suggest altered brain dynamics as a potential cause of symptoms in ASD. Here, we aim to describe potential information-processing consequences of these alterations by measuring active information storage (AIS), a key quantity in the theory of distributed computation in biological networks. AIS is defined as the mutual information between the semi-infinite past of a process and its next state. It measures the amount of stored information that is used for computation of the next time step of a process. AIS is high for rich but predictable dynamics. We recorded magnetoencephalography (MEG) signals in 13 ASD patients and 14 matched control subjects in a visual task. After a beamformer source analysis, twelve task-relevant sources were obtained. For these sources, stationary baseline activity was analyzed using AIS. Our results showed a decrease of AIS values in the hippocampus of ASD patients in comparison with controls, meaning that brain signals in ASD were either less predictable, reduced in their dynamic richness, or both. Our study suggests the usefulness of AIS for detecting an abnormal type of dynamics in ASD. The observed changes in AIS are compatible with Bayesian theories of reduced use or precision of priors in ASD.

  4. System for prediction of environmental emergency dose information

    International Nuclear Information System (INIS)

    Moriuchi, Shigeru

    1989-01-01

    According to the national research program revised by the Japan Nuclear Safety Commission after the TMI-2 reactor accident, JAERI started the development of a computer code system for the real-time prediction of environmental consequences following a nuclear reactor accident, and in 1985 the basic development of the System for Prediction of Environmental Emergency Dose Information (SPEEDI) was completed. The system consists of three-dimensional models for wind field calculation (WIND04), dispersion calculation (PRWDA), and internal and external dose calculation (CIDE), and is designed to rapidly predict, by simulation, the radioactive concentration in the air, the ground deposition, and radiation doses within a range of up to 100 km when radioactive materials are accidentally released from a reactor. At the time of the Chernobyl accident, the calculational domain of SPEEDI was tentatively extended up to 2000 km, simulation calculations of the movement of the radioactive cloud were executed, and the amount of released radioactivity was estimated using the calculated results and observed data. The calculated distribution and movement of the plume agreed well with the distribution patterns evaluated from observation data, and the estimated source term agreed approximately with data reported from the USSR and other countries. (author)

  5. THE POTENTIAL OF THE EQUITY WORKING CAPITAL IN THE PREDICTION OF BANKRUPTCY

    OpenAIRE

    Daniel BRÎNDESCU – OLARIU

    2014-01-01

    The current study evaluates the potential of the equity working capital in predicting corporate bankruptcy. The population subjected to the analysis included all companies from Timis County (the largest Romanian county) with yearly sales of over 10000 lei. The interest in the equity working capital was based on the recommendations of the literature, as well as on the availability of information concerning its values to all stakeholders. The event on which the research was focused was repr...

  6. Getting the Most out of Macroeconomic Information for Predicting Stock Returns and Volatility

    NARCIS (Netherlands)

    C. Cakmakli (Cem); D.J.C. van Dijk (Dick)

    2010-01-01

    textabstractThis paper documents that factors extracted from a large set of macroeconomic variables bear useful information for predicting monthly US excess stock returns and volatility over the period 1980-2005. Factor-augmented predictive regression models improve upon both benchmark models that

  7. Getting the most out of macroeconomic information for predicting stock returns and volatility

    NARCIS (Netherlands)

    Cakmakli, C.; van Dijk, D.

    2011-01-01

    This paper documents that factors extracted from a large set of macroeconomic variables bear useful information for predicting monthly US excess stock returns and volatility over the period 1980-2005. Factor-augmented predictive regression models improve upon both benchmark models that only include

  8. Joint Asymptotic Distributions of Smallest and Largest Insurance Claims

    Directory of Open Access Journals (Sweden)

    Hansjörg Albrecher

    2014-07-01

    Full Text Available Assume that claims in a portfolio of insurance contracts are described by independent and identically distributed random variables with regularly varying tails and occur according to a near mixed Poisson process. We provide a collection of results pertaining to the joint asymptotic Laplace transforms of the normalised sums of the smallest and largest claims, when the length of the considered time interval tends to infinity. The results crucially depend on the value of the tail index of the claim distribution, as well as on the number of largest claims under consideration.

  9. On predicting student performance using low-rank matrix factorization techniques

    DEFF Research Database (Denmark)

    Lorenzen, Stephan Sloth; Pham, Dang Ninh; Alstrup, Stephen

    2017-01-01

    Predicting the score of a student is one of the important problems in educational data mining. The scores given by an individual student reflect how a student understands and applies the knowledge conveyed in class. A reliable performance prediction enables teachers to identify weak students… that require remedial support, generate adaptive hints, and improve the learning of students. This work focuses on predicting the score of students in the quiz system of the Clio Online learning platform, the largest Danish supplier of online learning materials, covering 90% of Danish elementary schools… and the current version of the data set is very sparse, the very low-rank approximation can capture enough information. This means that the simple baseline approach achieves similar performance compared to other advanced methods. In future work, we will restrict the quiz data set, e.g. only including quizzes…

  10. Cognitive Factors in Predicting Continued Use of Information Systems with Technology Adoption Models

    Science.gov (United States)

    Huang, Chi-Cheng

    2017-01-01

    Introduction: The ultimate viability of an information system is dependent on individuals' continued use of the information system. In this study, we use the technology acceptance model and the theory of interpersonal behaviour to predict continued use of information systems. Method: We established a Web questionnaire on the mySurvey Website and…

  11. Guaranteeing uptime at world's largest particle physics lab

    CERN Multimedia

    Brodkin, Jon

    2007-01-01

    "As the European agency CERN was gearing up to build the world's largest particle accelerator, officials there knew they could not afford to have problems in their technical infrastructure cause any downtime." (1 page)

  12. Information-theoretic model selection for optimal prediction of stochastic dynamical systems from data

    Science.gov (United States)

    Darmon, David

    2018-03-01

    In the absence of mechanistic or phenomenological models of real-world systems, data-driven models become necessary. The discovery of various embedding theorems in the 1980s and 1990s motivated a powerful set of tools for analyzing deterministic dynamical systems via delay-coordinate embeddings of observations of their component states. However, in many branches of science, the condition of operational determinism is not satisfied, and stochastic models must be brought to bear. For such stochastic models, the tool set developed for delay-coordinate embedding is no longer appropriate, and a new toolkit must be developed. We present an information-theoretic criterion, the negative log-predictive likelihood, for selecting the embedding dimension for a predictively optimal data-driven model of a stochastic dynamical system. We develop a nonparametric estimator for the negative log-predictive likelihood and compare its performance to a recently proposed criterion based on active information storage. Finally, we show how the output of the model selection procedure can be used to compare candidate predictors for a stochastic system to an information-theoretic lower bound.

  13. Prediction of microsleeps using pairwise joint entropy and mutual information between EEG channels.

    Science.gov (United States)

    Baseer, Abdul; Weddell, Stephen J; Jones, Richard D

    2017-07-01

    Microsleeps are involuntary and brief instances of complete loss of responsiveness, typically of 0.5-15 s duration. They adversely affect performance in extended attention-driven jobs and can be fatal. Our aim was to predict microsleeps from 16-channel EEG signals. Two information-theoretic concepts - pairwise joint entropy and mutual information - were independently used to continuously extract features from the EEG signals. A k-nearest-neighbour (kNN) estimator with k = 3 was used to calculate both joint entropy and mutual information. Highly correlated features were discarded and the rest were ranked using the Fisher score, followed by an average of the 3-fold cross-validation area under the receiver operating characteristic curve (AUC-ROC). The leave-one-out method (LOOM) was used to test the performance of the microsleep prediction system on independent data. The best prediction for 0.25 s ahead was an AUC-ROC, sensitivity, precision, geometric mean (GM), and φ of 0.93, 0.68, 0.33, 0.75, and 0.38, respectively, with joint entropy using a single linear discriminant analysis (LDA) classifier.
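As a simpler stand-in for the kNN (k = 3) estimator used in the study above, mutual information between two discretised channels can be computed with a plug-in histogram estimator; the toy binary "channels" below are illustrative, not EEG data.

```python
from collections import Counter
from math import log2

def mutual_information(x, y):
    """Plug-in histogram estimate of I(X; Y) in bits for discrete sequences."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    return sum(c / n * log2((c / n) / (px[a] / n * py[b] / n))
               for (a, b), c in pxy.items())

chan1 = [0, 0, 1, 1, 0, 1, 0, 1]
chan2 = [0, 0, 1, 1, 0, 1, 0, 1]       # identical channel: I = H(X) = 1 bit
mi = mutual_information(chan1, chan2)  # → 1.0
```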

  14. Prediction of N2O emission from local information with Random Forest

    International Nuclear Information System (INIS)

    Philibert, Aurore; Loyce, Chantal; Makowski, David

    2013-01-01

    Nitrous oxide is a potent greenhouse gas, with a global warming potential 298 times greater than that of CO2. In agricultural soils, N2O emissions are influenced by a large number of environmental characteristics and crop management techniques that are not systematically reported in experiments. Random Forest (RF) is a machine learning method that can handle missing data and ranks input variables on the basis of their importance. We aimed to predict N2O emission on the basis of local information, to rank environmental and crop management variables according to their influence on N2O emission, and to compare the performance of RF with several regression models. RF outperformed the regression models for predictive purposes, and this approach led to the identification of three important input variables: N fertilization, type of crop, and experiment duration. This method could be used in the future for prediction of N2O emissions from local information. -- Highlights: ► Random Forest gave more accurate N2O predictions than regression. ► Missing data were well handled by Random Forest. ► The most important factors were nitrogen rate, type of crop and experiment duration. -- Random Forest, a machine learning method, outperformed the regression models for predicting N2O emissions and led to the identification of three important input variables

  15. Making smart social judgments takes time: infants' recruitment of goal information when generating action predictions.

    Science.gov (United States)

    Krogh-Jespersen, Sheila; Woodward, Amanda L

    2014-01-01

    Previous research has shown that young infants perceive others' actions as structured by goals. One open question is whether the recruitment of this understanding when predicting others' actions imposes a cognitive challenge for young infants. The current study explored infants' ability to utilize their knowledge of others' goals to rapidly predict future behavior in complex social environments and distinguish goal-directed actions from other kinds of movements. Fifteen-month-olds (N = 40) viewed videos of an actor engaged in either a goal-directed (grasping) or an ambiguous (brushing the back of her hand) action on a Tobii eye-tracker. At test, critical elements of the scene were changed and infants' predictive fixations were examined to determine whether they relied on goal information to anticipate the actor's future behavior. Results revealed that infants reliably generated goal-based visual predictions for the grasping action, but not for the back-of-hand behavior. Moreover, response latencies were longer for goal-based predictions than for location-based predictions, suggesting that goal-based predictions are cognitively taxing. Analyses of areas of interest indicated that heightened attention to the overall scene, as opposed to specific patterns of attention, was the critical indicator of successful judgments regarding an actor's future goal-directed behavior. These findings shed light on the processes that support "smart" social behavior in infants, as it may be a challenge for young infants to use information about others' intentions to inform rapid predictions.

  16. Predicting Key Events in the Popularity Evolution of Online Information.

    Science.gov (United States)

    Hu, Ying; Hu, Changjun; Fu, Shushen; Fang, Mingzhe; Xu, Wenwen

    2017-01-01

    The popularity of online information generally experiences a rising and falling evolution. This paper considers the "burst", "peak", and "fade" key events together as a representative summary of popularity evolution. We propose a novel prediction task-predicting when popularity undergoes these key events. It is of great importance to know when these three key events occur, because doing so helps recommendation systems, online marketing, and containment of rumors. However, it is very challenging to solve this new prediction task due to two issues. First, popularity evolution has high variation and can follow various patterns, so how can we identify "burst", "peak", and "fade" in different patterns of popularity evolution? Second, these events usually occur in a very short time, so how can we accurately yet promptly predict them? In this paper we address these two issues. To handle the first one, we use a simple moving average to smooth variation, and then a universal method is presented for different patterns to identify the key events in popularity evolution. To deal with the second one, we extract different types of features that may have an impact on the key events, and then a correlation analysis is conducted in the feature selection step to remove irrelevant and redundant features. The remaining features are used to train a machine learning model. The feature selection step improves prediction accuracy, and in order to emphasize prediction promptness, we design a new evaluation metric which considers both accuracy and promptness to evaluate our prediction task. Experimental and comparative results show the superiority of our prediction solution.
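The smoothing step described above can be sketched with a simple moving average, with the peak taken as the maximum of the smoothed curve; the window length, the toy popularity series, and the reduction of burst/peak/fade detection to a single argmax are illustrative simplifications of the paper's method.

```python
def moving_average(series, window=3):
    """Simple moving average; output is shorter by window - 1 samples."""
    return [sum(series[i:i + window]) / window
            for i in range(len(series) - window + 1)]

views = [1, 2, 10, 40, 80, 75, 30, 10, 5, 2]   # toy popularity curve
smooth = moving_average(views)
peak_index = max(range(len(smooth)), key=smooth.__getitem__)
```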

  17. Predicting Key Events in the Popularity Evolution of Online Information.

    Directory of Open Access Journals (Sweden)

    Ying Hu

    Full Text Available The popularity of online information generally experiences a rising and falling evolution. This paper considers the "burst", "peak", and "fade" key events together as a representative summary of popularity evolution. We propose a novel prediction task-predicting when popularity undergoes these key events. It is of great importance to know when these three key events occur, because doing so helps recommendation systems, online marketing, and containment of rumors. However, it is very challenging to solve this new prediction task due to two issues. First, popularity evolution has high variation and can follow various patterns, so how can we identify "burst", "peak", and "fade" in different patterns of popularity evolution? Second, these events usually occur in a very short time, so how can we accurately yet promptly predict them? In this paper we address these two issues. To handle the first one, we use a simple moving average to smooth variation, and then a universal method is presented for different patterns to identify the key events in popularity evolution. To deal with the second one, we extract different types of features that may have an impact on the key events, and then a correlation analysis is conducted in the feature selection step to remove irrelevant and redundant features. The remaining features are used to train a machine learning model. The feature selection step improves prediction accuracy, and in order to emphasize prediction promptness, we design a new evaluation metric which considers both accuracy and promptness to evaluate our prediction task. Experimental and comparative results show the superiority of our prediction solution.

  18. A general scaling law reveals why the largest animals are not the fastest.

    Science.gov (United States)

    Hirt, Myriam R; Jetz, Walter; Rall, Björn C; Brose, Ulrich

    2017-08-01

    Speed is the fundamental constraint on animal movement, yet there is no general consensus on the determinants of maximum speed itself. Here, we provide a general scaling model of maximum speed with body mass, which holds across locomotion modes, ecosystem types and taxonomic groups. In contrast to traditional power-law scaling, we predict a hump-shaped relationship resulting from a finite acceleration time for animals, which explains why the largest animals are not the fastest. This model is strongly supported by extensive empirical data (474 species, with body masses ranging from 30 μg to 100 tonnes) from terrestrial as well as aquatic ecosystems. Our approach unravels a fundamental constraint on the upper limit of animal movement, thus enabling a better understanding of realized movement patterns in nature and their multifold ecological consequences.
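
    The qualitative shape of such a hump-shaped relationship can be illustrated with a generic curve of the form v(M) = a * M**b * (1 - exp(-c * M**(-d))): a power-law rise with body mass M, damped at large mass by a term standing in for finite acceleration time. All parameter values below are invented for illustration, not the authors' fitted values.

```python
import math

# Generic hump-shaped speed-mass curve (illustrative parameters only).
def v(M, a=1.0, b=0.25, c=2.0, d=0.5):
    # a*M**b: power-law gain with mass; (1 - exp(...)): suppression at
    # large M, a stand-in for the finite-acceleration-time argument.
    return a * M**b * (1.0 - math.exp(-c * M**(-d)))

masses = [10.0**k for k in range(-5, 6)]   # toy range of body masses
speeds = [v(M) for M in masses]
peak = max(range(len(masses)), key=lambda i: speeds[i])
print(masses[peak])   # the fastest mass sits in the interior of the range
```

    In contrast to a pure power law, the maximum of this curve falls at an intermediate mass, which is the paper's explanation for why the largest animals are not the fastest.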

  19. The world's largest LNG producer's next market

    International Nuclear Information System (INIS)

    Fuller, R.; Isworo Suharno; Simandjuntak, W.M.P.

    1996-01-01

    The development of the domestic gas market in Indonesia, the world's largest liquefied natural gas producing country, is described as part of the overall impact of the country's oil and gas production. The first large scale use of natural gas in Indonesia was established in 1968 when a fertiliser plant using gas as the feedstock was built. Ultimately, through increased yields, this has enabled Indonesia to be self-sufficient in rice and an exporter of fertiliser. Problems which stand in the way of further developments include: capital, though Pertamina and PGN are perceived as attractive for foreign investment; the lack of a regulatory framework for gas; geographical constraints, among them the fact that the gas deposits are remote from the largest population concentrations; lack of infrastructure. There are nevertheless plans for expansion and the provision of an integrated gas pipeline system. Pertamina, which has responsibility for all oil and gas developments, and PGN, whose primary role has been as a manufacturer and distributor of gas, are now working together in the coordination of all gas activities. (10 figures). (UK)

  20. Incorporating information on predicted solvent accessibility to the co-evolution-based study of protein interactions.

    Science.gov (United States)

    Ochoa, David; García-Gutiérrez, Ponciano; Juan, David; Valencia, Alfonso; Pazos, Florencio

    2013-01-27

    A widespread family of methods for studying and predicting protein interactions using sequence information is based on co-evolution, quantified as similarity of phylogenetic trees. Part of the co-evolution observed between interacting proteins could be due to co-adaptation caused by inter-protein contacts. In this case, the co-evolution is expected to be more evident when evaluated on the surface of the proteins or the internal layers close to it. In this work we study the effect of incorporating information on predicted solvent accessibility to three methods for predicting protein interactions based on similarity of phylogenetic trees. We evaluate the performance of these methods in predicting different types of protein associations when trees based on positions with different characteristics of predicted accessibility are used as input. We found that predicted accessibility improves the results of two recent versions of the mirrortree methodology in predicting direct binary physical interactions, while it neither improves these methods, nor the original mirrortree method, in predicting other types of interactions. That improvement comes at no cost in terms of applicability since accessibility can be predicted for any sequence. We also found that predictions of protein-protein interactions are improved when multiple sequence alignments with a richer representation of sequences (including paralogs) are incorporated in the accessibility prediction.

  1. Enhancing the prediction of protein pairings between interacting families using orthology information

    Directory of Open Access Journals (Sweden)

    Pazos Florencio

    2008-01-01

    Full Text Available Abstract Background It has repeatedly been shown that interacting protein families tend to have similar phylogenetic trees. These similarities can be used to predict the mapping between two families of interacting proteins (i.e. which proteins from one family interact with which members of the other). The correct mapping will be that which maximizes the similarity between the trees. The two families may comprise orthologs and paralogs if members of the two families are present in more than one organism. This fact can be exploited to restrict the possible mappings, simply by impeding links between proteins of different organisms. We present here an algorithm to predict the mapping between families of interacting proteins which is able to incorporate information regarding orthologues, or any other assignment of proteins to "classes" that may restrict possible mappings. Results For the first time in methods for predicting mappings, we have tested this new approach on a large number of interacting protein domains in order to statistically assess its performance. The method accurately predicts around 80% in the most favourable cases. We also analysed in detail the results of the method for a well defined case of interacting families, the sensor and kinase components of the Ntr-type two-component system, for which up to 98% of the pairings predicted by the method were correct. Conclusion Based on the well established relationship between tree similarity and interactions we developed a method for predicting the mapping between two interacting families using genomic information alone. The program is available through a web interface.
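
    A toy version of the organism-restricted mapping search might look as follows (the pair scores are made up; with an additive score the search decomposes into one small assignment problem per organism, which is exactly the restriction the orthology information provides):

```python
from itertools import permutations

def best_mapping(members_a, members_b, organism, score):
    """Highest-scoring one-to-one mapping that never links proteins from
    different organisms. Exhaustive search; toy scale only."""
    mapping = {}
    for org in sorted({organism[m] for m in members_a}):
        group_a = [m for m in members_a if organism[m] == org]
        group_b = [m for m in members_b if organism[m] == org]
        best, best_perm = float("-inf"), None
        for perm in permutations(group_b, len(group_a)):
            total = sum(score[a, b] for a, b in zip(group_a, perm))
            if total > best:
                best, best_perm = total, perm
        mapping.update(dict(zip(group_a, best_perm)))
    return mapping

# Hypothetical example: one organism with two paralogs, one with a single
# member; scores stand in for tree-similarity contributions.
organism = {"A1": "ecoli", "A2": "ecoli", "B1": "ecoli", "B2": "ecoli",
            "A3": "bsub", "B3": "bsub"}
score = {("A1", "B1"): 0.9, ("A1", "B2"): 0.2,
         ("A2", "B1"): 0.3, ("A2", "B2"): 0.8,
         ("A3", "B3"): 0.7}
print(best_mapping(["A1", "A2", "A3"], ["B1", "B2", "B3"], organism, score))
```

    Forbidding cross-organism links shrinks the search from 3! global pairings to 2! + 1! within-organism ones, which is the computational benefit the abstract describes.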

  2. Worlds Largest Wave Energy Project 2007 in Wales

    DEFF Research Database (Denmark)

    Christensen, Lars; Friis-Madsen, Erik; Kofoed, Jens Peter

    2006-01-01

    This paper introduces the world's largest wave energy project, being developed in Wales and based on one of the leading wave energy technologies. The background for the development of wave energy, and the total resource and its distribution around the world, are described. In contrast to wind energy turbines, a large number of fundamentally different technologies are utilised to harvest wave energy. The Wave Dragon belongs to the wave overtopping class of converters, and the paper describes the fundamentals and the technical solutions used in this wave energy converter. An offshore floating WEC like the Wave Dragon has to be scaled in accordance with the wave climate at the deployment site, which makes the Welsh demonstrator device the world's largest WEC so far, with a total width of 300 meters. The project budget, the construction methods and the deployment site are also given.

  3. First Experience from the World Largest fully commercial Solar Heating Plant

    DEFF Research Database (Denmark)

    Heller, Alfred; Furbo, Simon

    1997-01-01

    The first experience from the largest solar heating plant in the world is given. The plant is situated in Marstal and has a total area of 8000 square m.

  4. Challenges with the largest commercial hydrogen station in the world

    Energy Technology Data Exchange (ETDEWEB)

    Charbonneau, Thomas; Gauthier, Pierre [Air Liquide Canada (Canada)

    2010-07-01

    The objective of this abstract is to share with the participants the story of the largest hydrogen fueling station made to date. To kick-start the story, we will cover the challenges: first the technical ones, then the operational, distribution, and financial ones. We will then move on to review the logistic (geographic) issues raised by the project and conclude our presentation by sharing the output values of the largest fueling station built so far in the world. (orig.)

  5. Maladaptive social information processing in childhood predicts young men's atypical amygdala reactivity to threat.

    Science.gov (United States)

    Choe, Daniel Ewon; Shaw, Daniel S; Forbes, Erika E

    2015-05-01

    Maladaptive social information processing, such as hostile attributional bias and aggressive response generation, is associated with childhood maladjustment. Although social information processing problems are correlated with heightened physiological responses to social threat, few studies have examined their associations with neural threat circuitry, specifically amygdala activation to social threat. A cohort of 310 boys participated in an ongoing longitudinal study and completed questionnaires and laboratory tasks assessing their social and cognitive characteristics when the boys were between 10 and 12 years of age. At age 20, 178 of these young men underwent functional magnetic resonance imaging and a social threat task. At age 22, adult criminal arrest records and self-reports of impulsiveness were obtained. Path models indicated that maladaptive social information processing at ages 10 and 11 predicted increased left amygdala reactivity to fear faces, an ambiguous threat, at age 20 while accounting for childhood antisocial behavior, empathy, IQ, and socioeconomic status. Exploratory analyses indicated that aggressive response generation, the tendency to respond to threat with reactive aggression, predicted left amygdala reactivity to fear faces and was concurrently associated with empathy, antisocial behavior, and hostile attributional bias, whereas hostile attributional bias correlated with IQ. Although unrelated to social information processing problems, bilateral amygdala reactivity to anger faces at age 20 was unexpectedly predicted by low IQ at age 11. Amygdala activation did not mediate associations between social information processing and number of criminal arrests, but both impulsiveness at age 22 and arrests were correlated with right amygdala reactivity to anger facial expressions at age 20. Childhood social information processing and IQ predicted young men's amygdala response to threat a decade later, which suggests that childhood social

  6. [Prediction of regional soil quality based on mutual information theory integrated with decision tree algorithm].

    Science.gov (United States)

    Lin, Fen-Fang; Wang, Ke; Yang, Ning; Yan, Shi-Guang; Zheng, Xin-Yu

    2012-02-01

    In this paper, to precisely obtain the spatial distribution characteristics of regional soil quality, some main factors that affect soil quality, such as soil type, land use pattern, lithology type, topography, road, and industry type, were considered: mutual information theory was adopted to select the main environmental factors, and the decision tree algorithm See5.0 was applied to predict the grade of regional soil quality. The main factors affecting regional soil quality were soil type, land use, lithology type, distance to town, distance to water area, altitude, distance to road, and distance to industrial land. The prediction accuracy of the decision tree model with the variables selected by mutual information was obviously higher than that of the model with all variables, and, for the former model, the prediction accuracy of both the decision tree and the decision rules was higher than 80%. Based on the continuous and categorical data, the method of mutual information theory integrated with a decision tree could not only reduce the number of input parameters for the decision tree algorithm, but also predict and assess regional soil quality effectively.
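
    The mutual-information step for screening categorical environmental factors can be sketched like this (the soil data below are invented for illustration):

```python
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    """I(X;Y) in bits for two paired categorical variables."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

# Toy records: soil quality grade alongside two candidate factors.
quality  = ["good", "good", "poor", "poor", "good", "poor"]
land_use = ["farm", "farm", "road", "road", "farm", "road"]   # informative
noise    = ["a", "b", "a", "b", "a", "b"]                     # uninformative

print(mutual_information(land_use, quality))   # higher
print(mutual_information(noise, quality))      # lower
```

    Factors whose mutual information with the quality grade is high are kept as decision-tree inputs; low-scoring factors are dropped, which is how the input-parameter reduction in the abstract is achieved.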

  7. Differential contribution of visual and auditory information to accurately predict the direction and rotational motion of a visual stimulus.

    Science.gov (United States)

    Park, Seoung Hoon; Kim, Seonjin; Kwon, MinHyuk; Christou, Evangelos A

    2016-03-01

    Visual and auditory information are critical for perception and enhance the ability of an individual to respond accurately to a stimulus. However, it is unknown whether visual and auditory information contribute differentially to identifying the direction and rotational motion of the stimulus. The purpose of this study was to determine the ability of an individual to accurately predict the direction and rotational motion of the stimulus based on visual and auditory information. In this study, we recruited 9 expert table-tennis players and used table-tennis service as our experimental model. Participants watched recorded services with different levels of visual and auditory information. The goal was to anticipate the direction of the service (left or right) and the rotational motion of service (topspin, sidespin, or cut). We recorded their responses and quantified the following outcomes: (i) directional accuracy and (ii) rotational motion accuracy. The response accuracy was the number of accurate predictions relative to the total number of trials. The ability of the participants to predict the direction of the service accurately increased with additional visual information but not with auditory information. In contrast, the ability of the participants to predict the rotational motion of the service accurately increased with the addition of auditory information to visual information but not with additional visual information alone. In conclusion, this finding demonstrates that visual information enhances the ability of an individual to accurately predict the direction of the stimulus, whereas additional auditory information enhances the ability of an individual to accurately predict the rotational motion of stimulus.

  8. Prediction of future nitrogen loading to Lake Rotorua

    International Nuclear Information System (INIS)

    Morgenstern, U.; Gordon, D.

    2006-01-01

    loading estimate for the direct groundwater has the largest uncertainty because very limited age and chemistry data is available. Lake side springs and minor streams together contribute only about 5% of the total nitrogen load to Lake Rotorua. Hamurana, Awahou and Waingaehe streams are expected to show the largest increases in N loading in the future because they contain the oldest water, and Hamurana and Awahou streams will have the largest increase in nitrogen mass loading because they have the largest flow. Utuhina, Waiteti, and Puarenga streams are expected to have medium increases in nitrogen loading because of younger water age and lower flow. Ngongotaha, Waiohewa, and Waiowhiro streams are expected to have little further increase in N loading because of low flow or steady-state already reached. Landuse intensification that has occurred within the last 20 years is not yet reflected in the current nitrogen prediction model because information on the timing and amount of intensification was not yet available. The present nitrogen prediction model assumes that the nitrogen input in the catchment from landuse development has remained relatively constant since the 1950's. The current predictions would therefore represent a lower limit to which the more recent nitrogen loads would have to be added. If more information on timing and amount of landuse changes becomes available, the N load predictions can be refined to incorporate landuse change in several stages by calculating the predicted N load for each stage and adding these. Positive (intensification) or negative (retirement) changes can be considered. The large groundwater system of the Lake Rotorua catchment responds delayed by decades to landuse changes. These timeframes will need to be considered carefully for any possible mitigation options in the catchment. (author). 8 refs., 14 figs., 2 tabs

  9. Prediction of natural disasters basing of chrono-and-information field characters

    Science.gov (United States)

    Sapunov, Valentin

    2013-04-01

    Living organisms are able to predict some future events, in particular catastrophic incidents. These are adaptive characters produced by evolution. The more energy an incident releases, the greater the possibility of predicting it. Wild animals escaped natural hazards including tsunami (e.g. the extreme tsunami in Asia in December 2004). Living animals are able to predict strong phenomena of obscure nature. For example, the majority of animals escaped the Tunguska catastrophe that took place in Siberia in 1908. Wild animals are able to predict nuclear weapon tests. Such obscure characters are not typical for humans, but they are observed with a probability of about 15%; such cases were summarized by L. Vasiliev (1961). An effective theory describing such characters is absent till now. N. Kozyrev (1991) suggested the existence of an unknown physical field (other than the gravitational and electromagnetic ones). The field was named the "time" or "chrono" field. Some characters of the field appeared to be accessible to physical experiment. Kozyrev suggested a specific role of the field in the functioning of living organisms. Transmission of biological information through space (telepathy) and time (proscopy) may be based on characters of such a field. Hence a physical chrono-and-information field is under consideration. Animals are more familiar with such a field than humans. The evolutionary process experimented with the possibility of extreme development of contact with such a field in the highest primates. This mode of evolution appears to have stayed obscure, producing the probable species "wildman" (Bigfoot). Specific adaptive features suggest the impossibility of studying such a species by usual ecological approaches. A perspective way to study these mysterious phenomena of physics is research into the characters of this field.

  10. Integration of relational and hierarchical network information for protein function prediction

    Directory of Open Access Journals (Sweden)

    Jiang Xiaoyu

    2008-08-01

    Full Text Available Abstract Background In the current climate of high-throughput computational biology, the inference of a protein's function from related measurements, such as protein-protein interaction relations, has become a canonical task. Most existing technologies pursue this task as a classification problem, on a term-by-term basis, for each term in a database, such as the Gene Ontology (GO) database, a popular rigorous vocabulary for biological functions. However, ontology structures are essentially hierarchies, with certain top to bottom annotation rules which protein function predictions should in principle follow. Currently, the most common approach to imposing these hierarchical constraints on network-based classifiers is through the use of transitive closure to predictions. Results We propose a probabilistic framework to integrate information in relational data, in the form of a protein-protein interaction network, and a hierarchically structured database of terms, in the form of the GO database, for the purpose of protein function prediction. At the heart of our framework is a factorization of local neighborhood information in the protein-protein interaction network across successive ancestral terms in the GO hierarchy. We introduce a classifier within this framework, with computationally efficient implementation, that produces GO-term predictions that naturally obey a hierarchical 'true-path' consistency from root to leaves, without the need for further post-processing. Conclusion A cross-validation study, using data from the yeast Saccharomyces cerevisiae, shows our method offers substantial improvements over both standard 'guilt-by-association' (i.e., Nearest-Neighbor) and more refined Markov random field methods, whether in their original form or when post-processed to artificially impose 'true-path' consistency.
Further analysis of the results indicates that these improvements are associated with increased predictive capabilities (i.e., increased
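
    The 'true-path' rule referred to above, namely that a term's prediction score should never exceed its ancestors', can be illustrated with the post-processing variant the authors compare against (the mini-ontology and scores below are invented):

```python
def enforce_true_path(scores, parent):
    """Cap every term's score by its ancestors' scores so predictions
    never exceed a parent. This is the post-hoc fix; the paper's
    classifier satisfies the constraint by construction instead."""
    def capped(term):
        if parent[term] is None:          # root term
            return scores[term]
        return min(scores[term], capped(parent[term]))
    return {t: capped(t) for t in scores}

# Hypothetical mini-ontology: root -> a -> b; "b" violates consistency.
parent = {"root": None, "a": "root", "b": "a"}
scores = {"root": 0.9, "a": 0.4, "b": 0.7}
print(enforce_true_path(scores, parent))
```

    After capping, annotating a protein with "b" implies at least as strong support for "a" and "root", which is the root-to-leaf consistency the abstract describes.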

  11. Predictive ability of machine learning methods for massive crop yield prediction

    Directory of Open Access Journals (Sweden)

    Alberto Gonzalez-Sanchez

    2014-04-01

    Full Text Available An important issue for agricultural planning purposes is the accurate yield estimation for the numerous crops involved in the planning. Machine learning (ML) is an essential approach for achieving practical and effective solutions for this problem. Many comparisons of ML methods for yield prediction have been made, seeking for the most accurate technique. Generally, the number of evaluated crops and techniques is too low and does not provide enough information for agricultural planning purposes. This paper compares the predictive accuracy of ML and linear regression techniques for crop yield prediction in ten crop datasets. Multiple linear regression, M5-Prime regression trees, perceptron multilayer neural networks, support vector regression and k-nearest neighbor methods were ranked. Four accuracy metrics were used to validate the models: the root mean square error (RMSE), root relative square error (RRSE), normalized mean absolute error (MAE), and correlation factor (R). Real data of an irrigation zone of Mexico were used for building the models. Models were tested with samples of two consecutive years. The results show that M5-Prime and k-nearest neighbor techniques obtain the lowest average RMSE errors (5.14 and 4.91), the lowest RRSE errors (79.46% and 79.78%), the lowest average MAE errors (18.12% and 19.42%), and the highest average correlation factors (0.41 and 0.42). Since M5-Prime achieves the largest number of crop yield models with the lowest errors, it is a very suitable tool for massive crop yield prediction in agricultural planning.
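
    The four validation metrics can be computed as follows (a generic sketch with toy values, not the authors' code):

```python
from math import sqrt

def rmse(y, p):
    """Root mean square error."""
    return sqrt(sum((a - b) ** 2 for a, b in zip(y, p)) / len(y))

def rrse(y, p):
    """Root relative squared error: error relative to predicting the mean."""
    mean = sum(y) / len(y)
    num = sum((a - b) ** 2 for a, b in zip(y, p))
    den = sum((a - mean) ** 2 for a in y)
    return sqrt(num / den)

def mae(y, p):
    """Mean absolute error."""
    return sum(abs(a - b) for a, b in zip(y, p)) / len(y)

def corr(y, p):
    """Pearson correlation factor R."""
    my, mp = sum(y) / len(y), sum(p) / len(p)
    num = sum((a - my) * (b - mp) for a, b in zip(y, p))
    den = sqrt(sum((a - my) ** 2 for a in y) *
               sum((b - mp) ** 2 for b in p))
    return num / den

actual    = [2.0, 3.0, 4.0, 5.0]   # invented yields
predicted = [2.2, 2.9, 4.3, 4.8]
print(rmse(actual, predicted), rrse(actual, predicted),
      mae(actual, predicted), corr(actual, predicted))
```

    An RRSE below 100% means the model beats the trivial predict-the-mean baseline, which puts the reported 79-80% RRSE figures in context.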

  12. Damage and protection cost curves for coastal floods within the 600 largest European cities

    Science.gov (United States)

    Prahl, Boris F.; Boettle, Markus; Costa, Luís; Kropp, Jürgen P.; Rybski, Diego

    2018-01-01

    The economic assessment of the impacts of storm surges and sea-level rise in coastal cities requires high-level information on the damage and protection costs associated with varying flood heights. We provide a systematically and consistently calculated dataset of macroscale damage and protection cost curves for the 600 largest European coastal cities opening the perspective for a wide range of applications. Offering the first comprehensive dataset to include the costs of dike protection, we provide the underpinning information to run comparative assessments of costs and benefits of coastal adaptation. Aggregate cost curves for coastal flooding at the city-level are commonly regarded as by-products of impact assessments and are generally not published as a standalone dataset. Hence, our work also aims at initiating a more critical discussion on the availability and derivation of cost curves. PMID:29557944

  13. Damage and protection cost curves for coastal floods within the 600 largest European cities

    Science.gov (United States)

    Prahl, Boris F.; Boettle, Markus; Costa, Luís; Kropp, Jürgen P.; Rybski, Diego

    2018-03-01

    The economic assessment of the impacts of storm surges and sea-level rise in coastal cities requires high-level information on the damage and protection costs associated with varying flood heights. We provide a systematically and consistently calculated dataset of macroscale damage and protection cost curves for the 600 largest European coastal cities opening the perspective for a wide range of applications. Offering the first comprehensive dataset to include the costs of dike protection, we provide the underpinning information to run comparative assessments of costs and benefits of coastal adaptation. Aggregate cost curves for coastal flooding at the city-level are commonly regarded as by-products of impact assessments and are generally not published as a standalone dataset. Hence, our work also aims at initiating a more critical discussion on the availability and derivation of cost curves.

  14. Building Earth's Largest Library: Driving into the Future.

    Science.gov (United States)

    Coffman, Steve

    1999-01-01

    Examines the Amazon.com online bookstore as a blueprint for designing the world's largest library. Topics include selection; accessibility and convenience; quality of Web sites and search tools; personalized service; library collection development, including interlibrary loan; library catalogs and catalog records; a circulation system; costs;…

  15. Toward sustainable harvesting of Africa's largest medicinal plant ...

    African Journals Online (AJOL)

    Global demand for treating prostate disorders with Prunus africana bark extract has made P. africana Africa's largest medicinal plant export. Unsustainable harvesting practices can lead to local extirpations of this multipurpose tree. Survey research targeting P. africana harvesters in a Tanzania forest reserve revealed that ...

  16. Scaling relationships among drivers of aquatic respiration from the smallest to the largest freshwater ecosystems

    Science.gov (United States)

    Hall, Ed K; Schoolmaster, Donald; Amado, A.M; Stets, Edward G.; Lennon, J.T.; Domaine, L.; Cotner, J.B.

    2016-01-01

    To address how various environmental parameters control or constrain planktonic respiration (PR), we used geometric scaling relationships and established biological scaling laws to derive quantitative predictions for the relationships among key drivers of PR. We then used empirical measurements of PR and environmental (soluble reactive phosphate [SRP], carbon [DOC], chlorophyll a [Chl-a], and temperature) and landscape parameters (lake area [LA] and watershed area [WA]) from a set of 44 lakes that varied in size and trophic status to test our hypotheses. We found that landscape-level processes affected PR through direct effects on DOC and temperature and indirectly via SRP. In accordance with predictions made from known relationships and scaling laws, scale coefficients (the parameter that describes the shape of a relationship between 2 variables) were found to be negative and to have an absolute value <1 in some cases and >1 in others. Because the measurements span respiration from small pond catchments to the largest body of freshwater on the planet, Lake Superior, these findings should be applicable to controls of PR for the great majority of temperate aquatic ecosystems.

  17. Predictive control strategies for energy saving of hybrid electric vehicles based on traffic light information

    Directory of Open Access Journals (Sweden)

    Kaijiang YU

    2015-10-01

    Full Text Available As the conventional control method for hybrid electric vehicles does not consider the effect of known traffic light information on the vehicle energy management, this paper proposes model predictive control intelligent optimization strategies based on traffic light information for hybrid electric vehicles. By building a simplified model of the hybrid electric vehicle and adopting the continuation/generalized minimum residual method, the model prediction problem is solved. The simulation is conducted using the MATLAB/Simulink platform. The simulation results show the effectiveness of the proposed model of the traffic light information, and that the proposed model predictive control method can improve fuel economy and real-time control performance significantly. The research conclusions show that the proposed control strategy can achieve optimal control of the vehicle trajectory, significantly improve fuel economy of the vehicle, and meet the system requirements for real-time optimal control.

  18. Prediction on sunspot activity based on fuzzy information granulation and support vector machine

    Science.gov (United States)

    Peng, Lingling; Yan, Haisheng; Yang, Zhigang

    2018-04-01

    In order to analyze the range of sunspots, a combined prediction method for forecasting the fluctuation range of sunspots based on fuzzy information granulation (FIG) and support vector machine (SVM) was put forward. Firstly, FIG is employed to granulate sample data and extract valid information from each window, namely the minimum value, the general average value and the maximum value of each window. Secondly, forecasting models are built with SVM, and cross validation is used to optimize their parameters. Finally, the fluctuation range of sunspots is forecasted with the optimized SVM model. A case study demonstrates that the model has high accuracy and can effectively predict the fluctuation of sunspots.
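
    The granulation step can be sketched as follows (simplified to crisp min/mean/max granules per window rather than true fuzzy sets; the sunspot values are invented):

```python
def granulate(series, window):
    """Split a series into consecutive non-overlapping windows and keep
    each window's minimum, average, and maximum, mirroring the FIG step
    described in the abstract (crisp simplification of fuzzy granules)."""
    granules = []
    for i in range(0, len(series) - window + 1, window):
        w = series[i:i + window]
        granules.append((min(w), sum(w) / len(w), max(w)))
    return granules

# Made-up monthly sunspot numbers, granulated in windows of 3.
sunspots = [50, 62, 41, 80, 95, 77, 30, 22, 18]
print(granulate(sunspots, 3))
```

    Each (min, average, max) triple then becomes one training sample for the downstream regressor, so the forecast is a fluctuation range rather than a single point.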

  19. Kabob report. Pt. 3. Chevron plant largest in Canada

    Energy Technology Data Exchange (ETDEWEB)

    1971-01-18

    Canada's largest fully integrated primary natural-gas processing and sulfur recovery plant is heading for physical completion by mid-summer of 1971. The Ralph M. Parsons Construction Co. of Canada Ltd., contractor for the S. Kaybob Beaverhill Lake Unit No. 3 gas-processing plant, to be operated by Chevron Standard Ltd., estimates completion by June 30. After that the $80 million complex will have its tests and running-in time. With any reasonable luck, it should be fully on stream by late summer. Preliminary construction on the 200-acre site started in Jan. 1969 with clearing and contouring of the main plant and sulfur storage sites. Initial rough grading started in the early summer, after spring breakup was over. Delivery of most of the big items was made by rail because the local secondary roads were inadequate for them. Concrete has been a large item. The contractor has its own batch plant on the site for the estimated 28,000 cu yd which will be needed for the whole job. Dominating the construction site from the start has been the high sulfur plant stack, first of the major items to be finished. It will serve to dispose of effluent from the largest sulfur recovery unit in Canada. It is 465 ft high, one of the largest in Alberta, and a significant contribution to pollution control and environmental protection.

  20. Distribution of the largest aftershocks in branching models of triggered seismicity: Theory of the universal Baath law

    International Nuclear Information System (INIS)

    Saichev, A.; Sornette, D.

    2005-01-01

    Using the epidemic-type aftershock sequence (ETAS) branching model of triggered seismicity, we apply the formalism of generating probability functions to calculate exactly the average difference between the magnitude of a mainshock and the magnitude of its largest aftershock over all generations. This average magnitude difference is found empirically to be independent of the mainshock magnitude and equal to 1.2, a universal behavior known as Baath's law. Our theory shows that Baath's law holds only sufficiently close to the critical regime of the ETAS branching process. Allowing for error bars ±0.1 for Baath's constant value around 1.2, our exact analytical treatment of Baath's law provides new constraints on the productivity exponent α and the branching ratio n: 0.9 ≲ α ≤ 1 and 0.8 ≲ n ≤ 1. We propose a method for measuring α based on the predicted renormalization of the Gutenberg-Richter distribution of the magnitudes of the largest aftershock. We also introduce the 'second Baath law for foreshocks': the probability that a main earthquake turns out to be the foreshock does not depend on its magnitude.

  1. Scoring function to predict solubility mutagenesis

    Directory of Open Access Journals (Sweden)

    Deutsch Christopher

    2010-10-01

    Full Text Available Abstract Background Mutagenesis is commonly used to engineer proteins with desirable properties not present in the wild type (WT) protein, such as increased or decreased stability, reactivity, or solubility. Experimentalists often have to choose a small subset of mutations from a large number of candidates to obtain the desired change, and computational techniques are invaluable to make the choices. While several such methods have been proposed to predict stability and reactivity mutagenesis, solubility has not received much attention. Results We use concepts from computational geometry to define a three body scoring function that predicts the change in protein solubility due to mutations. The scoring function captures both sequence and structure information. By exploring the literature, we have assembled a substantial database of 137 single- and multiple-point solubility mutations. Our database is the largest such collection with structural information known so far. We optimize the scoring function using linear programming (LP) methods to derive its weights based on training. Starting with default values of 1, we find weights in the range [0,2] so that predictions of increase or decrease in solubility are optimized. We compare the LP method to the standard machine learning techniques of support vector machines (SVM) and the Lasso. Using statistics for leave-one-out (LOO), 10-fold, and 3-fold cross validations (CV) for training and prediction, we demonstrate that the LP method performs the best overall. For the LOOCV, the LP method has an overall accuracy of 81%. Availability Executables of programs, tables of weights, and datasets of mutants are available from the following web page: http://www.wsu.edu/~kbala/OptSolMut.html.
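
    The leave-one-out cross-validation protocol used for evaluation can be sketched generically (the threshold "model" below is a made-up stand-in for the three-body scoring function, not the paper's method):

```python
def loocv_accuracy(data, train, predict):
    """Leave-one-out CV: train on all but one mutant, predict the
    held-out one, report the fraction predicted correctly."""
    hits = 0
    for i in range(len(data)):
        x, label = data[i]
        model = train(data[:i] + data[i + 1:])
        hits += predict(model, x) == label
    return hits / len(data)

# Hypothetical stand-in model: threshold midway between class means.
def train(rows):
    inc = [x for x, y in rows if y == "+"]
    dec = [x for x, y in rows if y == "-"]
    return (sum(inc) / len(inc) + sum(dec) / len(dec)) / 2

def predict(threshold, x):
    return "+" if x > threshold else "-"

# Invented (score, solubility-change) pairs.
data = [(0.9, "+"), (0.8, "+"), (0.7, "+"), (0.2, "-"), (0.1, "-"), (0.3, "-")]
print(loocv_accuracy(data, train, predict))
```

    Because each held-out mutant is never seen during training, LOOCV accuracy (81% for the LP method in the abstract) estimates performance on genuinely new mutations.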

  2. System for prediction of environmental emergency dose information network system

    International Nuclear Information System (INIS)

    Misawa, Makoto; Nagamori, Fumio

    2009-01-01

    When an accident at a nuclear facility carries a risk of releasing a large amount of radioactivity, the resulting environmental emergency doses must be predicted rapidly and reported immediately. The SPEEDI network system was built for this purpose and is now operated by the Nuclear Safety Technology Center (NUSTEC), commissioned by the Ministry of Education, Culture, Sports, Science and Technology, Japan. Fujitsu has contributed to this project by developing the principal parts of the network, introducing the necessary servers, and keeping the network in good condition through construction of the system followed by its continuous operation and maintenance. Real-time prediction of the atmospheric diffusion of radionuclides from nuclear accidents anywhere in the world is now available, with experimental verification, for the real-time emergency response system. The improved worldwide version of the SPEEDI network system can simulate an accidental discharge of radionuclides with simultaneous prediction for multiple domains and evaluate the results. (S. Ohno)

  3. Predicting the fidelity of JPEG2000 compressed CT images using DICOM header information

    International Nuclear Information System (INIS)

    Kim, Kil Joong; Kim, Bohyoung; Lee, Hyunna; Choi, Hosik; Jeon, Jong-June; Ahn, Jeong-Hwan; Lee, Kyoung Ho

    2011-01-01

    Purpose: To propose multiple logistic regression (MLR) and artificial neural network (ANN) models constructed using digital imaging and communications in medicine (DICOM) header information in predicting the fidelity of Joint Photographic Experts Group (JPEG) 2000 compressed abdomen computed tomography (CT) images. Methods: Our institutional review board approved this study and waived informed patient consent. Using a JPEG2000 algorithm, 360 abdomen CT images were compressed reversibly (n = 48, as negative control) or irreversibly (n = 312) to one of several compression ratios (CRs) ranging from 4:1 to 10:1. Five radiologists independently determined whether the original and compressed images were distinguishable or indistinguishable. The 312 irreversibly compressed images were divided randomly into training (n = 156) and testing (n = 156) sets. The MLR and ANN models were constructed regarding the DICOM header information as independent variables and the pooled radiologists' responses as the dependent variable. As independent variables, we selected the CR (DICOM tag number: 0028, 2112), effective tube current-time product (0018, 9332), section thickness (0018, 0050), and field of view (0018, 0090) among the DICOM tags. Using the training set, an optimal subset of independent variables was determined by backward stepwise selection in a four-fold cross-validation scheme. The MLR and ANN models were constructed with the determined independent variables using the training set. The models were then evaluated on the testing set by using receiver-operating-characteristic (ROC) analysis regarding the radiologists' pooled responses as the reference standard and by measuring Spearman rank correlation between the model prediction and the number of radiologists who rated the two images as distinguishable. Results: The CR and section thickness were determined as the optimal independent variables. The areas under the ROC curve for the MLR and ANN predictions were 0.91 (95% CI; 0
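The MLR part of the study can be illustrated with a minimal logistic-regression sketch on the two selected predictors (compression ratio and section thickness). The data points and labels below are fabricated for illustration and are not the study's data; the gradient-descent fitter is a generic stand-in for a statistical package.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=20000):
    """Plain gradient descent on the logistic log-loss."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted probability
        g = p - y                                 # log-loss gradient factor
        w -= lr * (X.T @ g) / len(y)
        b -= lr * g.mean()
    return w, b

# columns: [compression ratio, section thickness (mm)]; label 1 = radiologists
# could distinguish the compressed image from the original (fabricated data)
X = np.array([[4, 5], [5, 5], [6, 1], [8, 1], [9, 5], [10, 1]], dtype=float)
y = np.array([0, 0, 1, 1, 1, 1])
Xn = (X - X.mean(axis=0)) / X.std(axis=0)         # standardize features
w, b = fit_logistic(Xn, y)
prob = 1.0 / (1.0 + np.exp(-(Xn @ w + b)))
```

With a higher compression ratio driving distinguishability, the fitted model separates this toy training set perfectly; on real data one would of course evaluate on a held-out set with ROC analysis, as the study does.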

  4. Information-Based Maintenance Optimization with Focus on Predictive Maintenance (Informatiegebaseerde onderhoudsoptimalisatie met focus op predictief onderhoud)

    OpenAIRE

    Van Horenbeek, Adriaan

    2013-01-01

    This dissertation presents an information-based maintenance optimization methodology for physical assets; with focus on, but not limited to, predictive maintenance (PdM). The overall concept of information-based maintenance is that of updating maintenance decisions based on evolving knowledge of operation history and anticipated usage of the machinery, as well as the physics and dynamics of material degradation in critical machinery components. Within this concept, predictive maintenance is a...

  5. Upgrade and modernization of the six largest HPPs in Macedonia

    International Nuclear Information System (INIS)

    Hadzievska, M.

    2002-01-01

    In 1998, the Electric Power Company of Macedonia and the International Bank for Reconstruction and Development started the Power System Improvement Project, a part of which is the project for rehabilitation of the six largest Hydro Power Plants (HPPs) in the Republic of Macedonia. The six largest Hydro Power Plants (HPP Vrutok, HPP Raven, HPP Globocica, HPP Tikves, HPP Spilje and HPP Vrben) represent 91% of the country's hydropower capacity. The rehabilitation program is divided into five parts (contracts) and covers the refurbishment of: turbine runners, turbine and generator bearings, governors, inlet valves; butterfly valves, including accessories and control systems; generators, excitation system and voltage regulation; control system, protection and LV auxiliaries; switch gears and control gears in 220 kV, 110 kV and 35 kV substations. At the moment, only the implementation of switch gears has started; the first phase is already finished, and 50% of the rehabilitation works for HPP Vrutok, the largest HPP, have been completed. With the realization of this project, greater hydropower production is expected. It is also expected that the HPPs will become a more vital part of the Macedonian power system

  6. Protein Function Prediction Based on Sequence and Structure Information

    KAUST Repository

    Smaili, Fatima Z.

    2016-05-25

    The number of available protein sequences in public databases is increasing exponentially. However, a significant fraction of these sequences lack functional annotation, which is essential to our understanding of how biological systems and processes operate. In this master's thesis project, we worked on inferring protein functions based on the primary protein sequence. In the approach we follow, 3D models are first constructed using I-TASSER. Functions are then deduced by structurally matching these predicted models, using global and local similarities, through three independent enzyme commission (EC) and gene ontology (GO) function libraries. The method was tested on 250 “hard” proteins, which lack homologous templates in both structure and function libraries. The results show that this method outperforms conventional prediction methods based on sequence similarity or threading. Additionally, our method could be improved even further by incorporating protein-protein interaction information. Overall, the method we use provides an efficient approach for automated functional annotation of non-homologous proteins, starting from their sequence.

  7. PINGU: PredIction of eNzyme catalytic residues usinG seqUence information.

    Directory of Open Access Journals (Sweden)

    Priyadarshini P Pai

    Full Text Available Identification of catalytic residues can help unveil interesting attributes of enzyme function for various therapeutic and industrial applications. Based on their biochemical roles, the number of catalytic residues and the sequence lengths of enzymes vary. This article describes a prediction approach (PINGU) for such a scenario. It uses models trained on the physicochemical properties and evolutionary information of 650 non-redundant enzymes (2136 catalytic residues) in a support vector machine architecture. Independent testing on 200 non-redundant enzymes (683 catalytic residues) in predefined prediction settings, i.e., with the ratio of non-catalytic to catalytic residues ranging from 1 to 30, suggested that the prediction approach was highly sensitive and specific, i.e., 80% or above, across the incremental challenges. To learn more about the discriminatory power of PINGU in real scenarios, where the prediction challenge is variable and susceptible to high false positives, the best model from independent testing was used on 60 diverse enzymes. Results suggested that PINGU was able to identify most catalytic and non-catalytic residues properly, with 80% or above accuracy, sensitivity and specificity. The effect of false positives on precision was addressed in this study by applying predicted ligand-binding residue information as a post-processing filter. An overall improvement of 20% in F-measure and 0.138 in correlation coefficient, with 16% enhanced precision, could be achieved. On account of its encouraging performance, PINGU is expected to find applications in enzyme engineering and novel drug discovery.
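The evaluation metrics quoted here (precision, F-measure, correlation coefficient) are all derived from the confusion-matrix counts. As a concrete reference, the sketch below computes them from hypothetical counts; the numbers are invented for illustration (loosely echoing the 683 catalytic residues in the test set) and are not PINGU's actual results.

```python
import math

def confusion_metrics(tp, fp, tn, fn):
    """Precision, recall (sensitivity), F-measure, and Matthews correlation
    coefficient from confusion-matrix counts."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f_measure = 2 * precision * recall / (precision + recall)
    mcc = (tp * tn - fp * fn) / math.sqrt(
        (tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return precision, recall, f_measure, mcc

# hypothetical counts: 683 catalytic residues, many more non-catalytic ones
precision, recall, f_measure, mcc = confusion_metrics(
    tp=550, fp=140, tn=19000, fn=133)
```

Note how MCC stays meaningful under the heavy class imbalance typical of catalytic-residue prediction, which is why it is reported alongside accuracy.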

  8. NetTurnP – Neural Network Prediction of Beta-turns by Use of Evolutionary Information and Predicted Protein Sequence Features

    DEFF Research Database (Denmark)

    Petersen, Bent; Lundegaard, Claus; Petersen, Thomas Nordahl

    2010-01-01

    β-turns are the most common type of non-repetitive structure and constitute on average 25% of the amino acids in proteins. The formation of β-turns plays an important role in protein folding, protein stability and molecular recognition processes. In this work we present the neural network method NetTurnP, for prediction of two-class β-turns and of the individual β-turn types, by use of evolutionary information and predicted protein sequence features. Evaluated against the commonly used dataset BT426, NetTurnP achieves a Matthews correlation coefficient of 0.50, the highest reported performance on two-class prediction of β-turn versus not-β-turn, and it shows improved performance on some of the specific β-turn types.

  9. World's third-largest producer of nuclear power. Japan in need of energy

    International Nuclear Information System (INIS)

    Anon.

    2008-01-01

    Japan is the third largest oil consumer in the world behind the United States and China, and the second largest net importer of oil. Japan boasts one of the largest economies in the world. The country continues to experience a moderate economic recovery that began in 2003, following a decade of economic stagnation. Japan's real gross domestic product (GDP) grew by 2.5% in 2005 and 2.3% in 2004. The modest upturn over the last few years reflects higher business confidence in Japan, a surge in export demand led by exports to China, and robust consumer spending. Unemployment in Japan fell to 4.4% in 2005, down from an early 2003 peak of 5.5%. Japan has virtually no domestic oil or natural gas reserves, and in 2005 was the second largest net importer of crude oil in the world. Despite the country's dearth of hydrocarbon resources, Japanese companies have actively pursued upstream oil and natural gas projects overseas. Japan remains one of the major exporters of energy-sector capital equipment, and Japanese companies provide engineering, construction, and project management services for energy projects. (orig.)

  10. Informative sensor selection and learning for prediction of lower limb kinematics using generative stochastic neural networks.

    Science.gov (United States)

    Eunsuk Chong; Taejin Choi; Hyungmin Kim; Seung-Jong Kim; Yoha Hwang; Jong Min Lee

    2017-07-01

    We propose a novel approach for selecting useful input sensors and learning a mathematical model for predicting lower limb joint kinematics. We applied a feature selection method based on mutual information, variational information maximization, which has been reported as the state of the art among information-based feature selection methods. The main difficulty in applying the method is estimating a reliable probability density of the input and output data, especially when the data are high-dimensional and real-valued. We addressed this problem by applying a generative stochastic neural network, the restricted Boltzmann machine, through which we could perform sampling-based probability estimation. The mutual information between inputs and outputs is evaluated in each backward sensor elimination step, and the least informative sensor is removed along with its network connections. The entire network is fine-tuned by maximizing conditional likelihood in each step. Experimental results are shown for four healthy subjects walking at various speeds, with 64 sensor measurements including electromyogram, acceleration, and foot-pressure sensors attached to both lower limbs for predicting hip and knee joint angles. For a test set of walking at arbitrary speed, our results show that the suggested method can select informative sensors while maintaining good prediction accuracy.
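The backward-elimination loop described above can be sketched in miniature. This sketch is a simplified analogue, not the paper's method: it scores each sensor by a plain histogram estimate of mutual information with the target rather than by the variational/RBM machinery, and the "sensors" and signal names are synthetic.

```python
import numpy as np

def mutual_info(x, y, bins=8):
    """Histogram estimate of the mutual information I(X;Y) in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

def backward_eliminate(X, y, keep):
    """Repeatedly drop the sensor with the lowest MI with the target."""
    sensors = list(range(X.shape[1]))
    while len(sensors) > keep:
        scores = [mutual_info(X[:, s], y) for s in sensors]
        sensors.pop(int(np.argmin(scores)))
    return sensors

rng = np.random.default_rng(0)
n = 2000
knee_angle = rng.normal(size=n)                  # target kinematic signal
emg = knee_angle + 0.1 * rng.normal(size=n)      # informative sensor channel
loose_sensor = rng.normal(size=n)                # uninformative sensor channel
X = np.column_stack([loose_sensor, emg])
kept = backward_eliminate(X, knee_angle, keep=1)
```

The elimination correctly discards the noise channel and retains the EMG-like channel, mirroring the idea of pruning the least informative sensors with their network connections.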

  11. Making detailed predictions makes (some) predictions worse

    Science.gov (United States)

    Kelly, Theresa F.

    In this paper, we investigate whether making detailed predictions about an event makes other predictions worse. Across 19 experiments, 10,895 participants, and 415,960 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes information that is relatively useless for predicting the winning team more readily accessible in memory and therefore incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of games will and will not be susceptible to the negative effect of making detailed predictions.

  12. Informal Learning in Online Knowledge Communities: Predicting Community Response to Visitor Inquiries

    NARCIS (Netherlands)

    Nistor, Nicolae; Dascalu, Mihai; Stavarache, Lucia Larise; Serafin, Yvonne; Trausan-Matu, Stefan

    2016-01-01

    Nistor, N., Dascalu, M., Stavarache, L.L., Serafin, Y., & Trausan-Matu, S. (2015). Informal Learning in Online Knowledge Communities: Predicting Community Response to Visitor Inquiries. In G. Conole, T. Klobucar, C. Rensing, J. Konert & É. Lavoué (Eds.), 10th European Conf. on Technology Enhanced

  13. NAFTA: The World's Largest Trading Zone Turns 20

    Science.gov (United States)

    Ferrarini, Tawni Hunt; Day, Stephen

    2014-01-01

    Everyone under the age of 20 who has grown up in North America has lived in the common market created by NAFTA--the North American Free Trade Agreement. In a zone linking the United States, Canada, and Mexico, most goods and investments flow freely across borders to users, consumers, and investors. In 1994, NAFTA created the largest relatively…

  14. What kinds of fish stock predictions do we need and what kinds of information will help us to make better predictions?

    Directory of Open Access Journals (Sweden)

    Keith Brander

    2003-04-01

    Full Text Available Fish stock predictions are used to guide fisheries management, but stocks continue to be over-exploited. Traditional single-species age-structured stock assessment models, which became an operational component of fisheries management in the 1950s, ignore biological and environmental effects. As our knowledge of the marine environment improves and our concern about the state of the marine ecosystem and about global change increases, the scope of our models needs to be widened. We need different kinds of predictions as well as better predictions. Population characteristics (rates of mortality, growth, recruitment) of 61 stocks of 17 species of NE Atlantic fish are reviewed in order to consider the implications for the time-scale and quality of stock predictions. Short life expectancy limits the time horizon for predictability based on the current fishable stock, and predictions are therefore more dependent on estimates or assumptions about future rates. Evidence is presented that rates of growth and recruitment are influenced by environmental factors, and possibilities for including new information are explored in order to improve predictions.

  15. Assimilation of remote sensing observations into a sediment transport model of China's largest freshwater lake: spatial and temporal effects.

    Science.gov (United States)

    Zhang, Peng; Chen, Xiaoling; Lu, Jianzhong; Zhang, Wei

    2015-12-01

    Numerical models are important tools that are used in studies of sediment dynamics in inland and coastal waters, and these models can now benefit from the use of integrated remote sensing observations. This study explores a scheme for assimilating remotely sensed suspended sediment (from charge-coupled device (CCD) images obtained from the Huanjing (HJ) satellite) into a two-dimensional sediment transport model of Poyang Lake, the largest freshwater lake in China. Optimal interpolation is used as the assimilation method, and model predictions are obtained by combining four remote sensing images. The parameters for optimal interpolation are determined through a series of assimilation experiments evaluating the sediment predictions based on field measurements. The model with assimilation of remotely sensed sediment reduces the root-mean-square error of the predicted sediment concentrations by 39.4% relative to the model without assimilation, demonstrating the effectiveness of the assimilation scheme. The spatial effect of assimilation is explored by comparing model predictions with remotely sensed sediment, revealing that the model with assimilation generates reasonable spatial distribution patterns of suspended sediment. The temporal effect of assimilation on the model's predictive capabilities varies spatially, with an average temporal effect of approximately 10.8 days. The current velocities which dominate the rate and direction of sediment transport most likely result in spatial differences in the temporal effect of assimilation on model predictions.
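The optimal-interpolation update used for assimilation has a compact closed form: the analysis equals the model background plus a gain-weighted innovation, with the gain set by the background and observation error covariances. The sketch below is a generic textbook OI step on a fabricated three-cell "lake", not the Poyang Lake model or its tuned parameters.

```python
import numpy as np

def oi_update(xb, B, y, H, R):
    """Optimal interpolation: xa = xb + K (y - H xb),
    with gain K = B H^T (H B H^T + R)^-1."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    return xb + K @ (y - H @ xb)

# toy 3-cell transect: background suspended-sediment concentrations (mg/L)
xb = np.array([50.0, 40.0, 30.0])
idx = np.arange(3)
# spatially correlated background errors (exponential decay with distance)
B = 10.0 * np.exp(-np.abs(np.subtract.outer(idx, idx)))
H = np.array([[1.0, 0.0, 0.0]])   # the satellite observes cell 0 only
R = np.array([[2.0]])             # observation error variance
y = np.array([60.0])              # remotely sensed sediment value for cell 0
xa = oi_update(xb, B, y, H, R)
```

Because the background errors are spatially correlated, a single observation in cell 0 nudges the unobserved neighbouring cells too, which is exactly how sparse satellite retrievals can correct a whole model field.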

  16. The prediction of engineering cost for green buildings based on information entropy

    Science.gov (United States)

    Liang, Guoqiang; Huang, Jinglian

    2018-03-01

    Green building is a developing trend in the world building industry, and construction cost is an essential consideration in any building project. It is therefore necessary to investigate cost prediction for green buildings. On the basis of an analysis of green building costs, this paper proposes a method for forecasting the actual cost of a green building based on information entropy and describes the forecasting procedure. Using probability densities obtained from statistical data such as labor costs, material costs, machinery costs, administration costs, profits, risk costs, and unit project quotations, situations that lead to variations between budgeted and actual construction cost can be predicted by estimating the information entropy of the budgeted and actual costs. The results of this article have practical significance for cost control in green building, and the method proposed here can be generalized and applied to a variety of other aspects of building management.
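The entropy estimation at the core of this approach is the standard Shannon formula applied to the distribution of cost components. The sketch below shows the computation on fabricated cost shares (the component breakdown and numbers are invented for illustration, not taken from the paper).

```python
import numpy as np

def shannon_entropy(p):
    """H(p) = -sum p_i log2 p_i over nonzero entries, in bits."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()                       # normalize to a distribution
    nz = p > 0
    return float(-(p[nz] * np.log2(p[nz])).sum())

# hypothetical cost shares: labor, material, machinery, administration,
# profit, risk (budgeted vs. realized)
budgeted = [0.25, 0.45, 0.10, 0.08, 0.08, 0.04]
actual = [0.30, 0.40, 0.12, 0.08, 0.06, 0.04]
h_budgeted = shannon_entropy(budgeted)
h_actual = shannon_entropy(actual)
drift = abs(h_actual - h_budgeted)        # entropy shift between plan and outcome
```

A shift in entropy between the budgeted and realized distributions signals that the cost structure has changed, which is the kind of deviation the paper proposes to monitor.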

  17. A marine heatwave drives massive losses from the world's largest seagrass carbon stocks

    Science.gov (United States)

    Arias-Ortiz, A.; Serrano, O.; Masqué, P.; Lavery, P. S.; Mueller, U.; Kendrick, G. A.; Rozaimi, M.; Esteban, A.; Fourqurean, J. W.; Marbà, N.; Mateo, M. A.; Murray, K.; Rule, M. J.; Duarte, C. M.

    2018-04-01

    Seagrass ecosystems contain globally significant organic carbon (C) stocks. However, climate change and increasing frequency of extreme events threaten their preservation. Shark Bay, Western Australia, has the largest C stock reported for a seagrass ecosystem, containing up to 1.3% of the total C stored within the top metre of seagrass sediments worldwide. On the basis of field studies and satellite imagery, we estimate that 36% of Shark Bay's seagrass meadows were damaged following a marine heatwave in 2010/2011. Assuming that 10 to 50% of the seagrass sediment C stock was exposed to oxic conditions after disturbance, between 2 and 9 Tg CO2 could have been released to the atmosphere during the following three years, increasing emissions from land-use change in Australia by 4-21% per annum. With heatwaves predicted to increase with further climate warming, conservation of seagrass ecosystems is essential to avoid adverse feedbacks on the climate system.

  18. Best-fitting prediction equations for basal metabolic rate: informing obesity interventions in diverse populations.

    Science.gov (United States)

    Sabounchi, N S; Rahmandad, H; Ammerman, A

    2013-10-01

    Basal metabolic rate (BMR) represents the largest component of total energy expenditure and is a major contributor to energy balance. Therefore, accurately estimating BMR is critical for developing rigorous obesity prevention and control strategies. Over the past several decades, numerous BMR formulas have been developed targeted to different population groups. A comprehensive literature search revealed 248 BMR estimation equations developed using diverse ranges of age, gender, race, fat-free mass, fat mass, height, waist-to-hip ratio, body mass index and weight. A subset of 47 studies included enough detail to allow for development of meta-regression equations. Utilizing these studies, meta-equations were developed targeted to 20 specific population groups. This review provides a comprehensive summary of available BMR equations and an estimate of their accuracy. An accompanying online BMR prediction tool (available at http://www.sdl.ise.vt.edu/tutorials.html) was developed to automatically estimate BMR based on the most appropriate equation after user-entry of individual age, race, gender and weight.
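The 248 equations surveyed here share a common linear form in weight, height, age and sex. As a concrete illustration of that form, here is the widely used Mifflin-St Jeor equation; it is a well-known published formula, not necessarily one of the meta-equations derived in this review.

```python
def mifflin_st_jeor(weight_kg, height_cm, age_yr, male):
    """BMR in kcal/day: 10*W + 6.25*H - 5*A, plus +5 for men or -161 for women."""
    return (10.0 * weight_kg + 6.25 * height_cm - 5.0 * age_yr
            + (5.0 if male else -161.0))

bmr_man = mifflin_st_jeor(70, 175, 30, male=True)     # ~1649 kcal/day
bmr_woman = mifflin_st_jeor(60, 165, 30, male=False)  # ~1320 kcal/day
```

Population-specific meta-equations like those in the review keep this linear structure but refit the coefficients per demographic group.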

  19. Fishing down the largest coral reef fish species.

    Science.gov (United States)

    Fenner, Douglas

    2014-07-15

    Studies on remote, uninhabited, near-pristine reefs have revealed surprisingly large populations of large reef fish. Locations such as the northwestern Hawaiian Islands, northern Marianas Islands, Line Islands, U.S. remote Pacific Islands, Cocos-Keeling Atoll and Chagos archipelago have much higher reef fish biomass than islands and reefs near people. Much of the high biomass of most remote reef fish communities lies in the largest species, such as sharks, bumphead parrots, giant trevally, and humphead wrasse. Some, such as sharks and giant trevally, are apex predators, but others such as bumphead parrots and humphead wrasse, are not. At many locations, decreases in large reef fish species have been attributed to fishing. Fishing is well known to remove the largest fish first, and a quantitative measure of vulnerability to fishing indicates that large reef fish species are much more vulnerable to fishing than small fish. The removal of large reef fish by fishing parallels the extinction of terrestrial megafauna by early humans. However large reef fish have great value for various ecological roles and for reef tourism. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Co-seismic slip, post-seismic slip, and largest aftershock associated with the 1994 Sanriku-haruka-oki, Japan, earthquake

    Science.gov (United States)

    Yagi, Yuji; Kikuchi, Masayuki; Nishimura, Takuya

    2003-11-01

    We analyzed continuous GPS data to investigate the spatio-temporal distribution of co-seismic slip, post-seismic slip, and the largest aftershock associated with the 1994 Sanriku-haruka-oki, Japan, earthquake (Mw = 7.7). To obtain better resolution of the co-seismic and post-seismic slip distributions, we imposed a weak constraint, as a priori information, on the co-seismic slip determined by seismic wave analyses. We found that the post-seismic slip during the 100 days following the mainshock amounts to as much moment release as the mainshock itself, and that the sites of co-seismic and post-seismic slip partition the plate boundary region in complementary fashion. The major post-seismic slip was triggered by the mainshock on the western side of the co-seismic slip area, and the extent of the post-seismic slip is almost unchanged with time. It rapidly developed a shear stress concentration ahead of the slip area and triggered the largest aftershock.

  1. The Multivariate Largest Lyapunov Exponent as an Age-Related Metric of Quiet Standing Balance

    Directory of Open Access Journals (Sweden)

    Kun Liu

    2015-01-01

    Full Text Available The largest Lyapunov exponent has been studied as a metric of balance ability during human quiet standing. However, its sensitivity and accuracy are not good enough for clinical use. The present research proposes a metric of the human body's standing balance ability based on the multivariate largest Lyapunov exponent (MLLE), which can quantify standing balance. The dynamic multivariate time series of the ankle, knee, and hip were measured by multiple electrical goniometers. Thirty-six normal people of different ages participated in the test. From the acquired data, the multivariate largest Lyapunov exponent was calculated. Finally, the results of the proposed approach were analysed and compared with the traditional method, for which the largest Lyapunov exponent and power spectral density from the centre of pressure were also calculated. The following conclusions can be drawn. The multivariate largest Lyapunov exponent has a higher degree of differentiation in distinguishing balance under eyes-closed conditions. The MLLE value reflects the overall coordination between multisegment movements. Individuals of different ages can be distinguished by their MLLE values. Finally, human standing stability decreases with increasing age.
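For readers unfamiliar with the quantity itself: the largest Lyapunov exponent can be estimated from a scalar time series by embedding it, tracking how nearest-neighbour pairs separate, and fitting the growth rate of their mean log-separation (a Rosenstein-style procedure). The sketch below is a generic single-variable illustration on the logistic map, not the multivariate method of this record nor the Wolf method of the traffic study above; the embedding parameters are chosen ad hoc.

```python
import numpy as np

def largest_lyapunov(x, dim=3, tau=1, horizon=8):
    """Rosenstein-style estimate: embed, pair each point with its nearest
    neighbour, and fit the slope of the mean log-separation over time."""
    n = len(x) - (dim - 1) * tau
    emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
    np.fill_diagonal(dist, np.inf)          # a point is not its own neighbour
    nn = dist.argmin(axis=1)
    mean_log = []
    for k in range(horizon):
        seps = [np.linalg.norm(emb[i + k] - emb[nn[i] + k])
                for i in range(n - k) if nn[i] < n - k]
        seps = [s for s in seps if s > 0]
        mean_log.append(np.mean(np.log(seps)))
    # slope of mean log-divergence vs. time ~ largest Lyapunov exponent
    return float(np.polyfit(np.arange(horizon), mean_log, 1)[0])

# logistic map at r = 4, whose exact largest exponent is ln 2 ~ 0.693
x = np.empty(1200)
x[0] = 0.2
for i in range(1, len(x)):
    x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
lle = largest_lyapunov(x)
```

A positive estimate confirms sensitive dependence on initial conditions; the multivariate variant in this record applies the same idea jointly to the ankle, knee, and hip time series.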

  2. Crash testing the largest experiment on Earth

    OpenAIRE

    Cauchi, Marija

    2015-01-01

    Under Europe lies a 27 km tunnel that is both the coldest and hottest place on Earth. The Large Hadron Collider (LHC) has already found out what gives mass to all the matter in the Universe. It is now trying to go even deeper into what makes up everything we see around us. Dr Marija Cauchi writes about her research that helped protect this atom smasher from itself. Photography by Jean Claude Vancell. http://www.um.edu.mt/think/crash-testing-the-largest-experiment-on-earth/

  3. Prediction of glutathionylation sites in proteins using minimal sequence information and their experimental validation.

    Science.gov (United States)

    Pal, Debojyoti; Sharma, Deepak; Kumar, Mukesh; Sandur, Santosh K

    2016-09-01

    S-glutathionylation of proteins plays an important role in various biological processes and is known to be a protective modification during oxidative stress. Since experimental detection of S-glutathionylation is labor-intensive and time-consuming, a bioinformatics-based approach is a viable alternative. Available methods require relatively long sequence information, which may prevent prediction when sequence information is incomplete. Here, we present a model to predict glutathionylation sites from pentapeptide sequences. It is based upon the differential association of amino acids with glutathionylated and non-glutathionylated cysteines in a database of experimentally verified sequences. These data were used to calculate position-dependent F-scores, which measure how a particular amino acid at a particular position affects the likelihood of a glutathionylation event. A glutathionylation score (G-score), indicating the propensity of a sequence to undergo glutathionylation, was calculated using the position-dependent F-scores for each amino acid, and cut-off values were used for prediction. Our model returned an accuracy of 58% with a Matthews correlation coefficient (MCC) of 0.165. On an independent dataset, our model outperformed the currently available model, despite needing much less sequence information. Pentapeptide motifs with high abundance among glutathionylated proteins were identified. A list of potential glutathionylation hotspot sequences was obtained by assigning G-scores, and subsequent Protein-BLAST analysis revealed a total of 254 putative glutathionylatable proteins, a number of which were already known to be glutathionylated. Our model predicted glutathionylation sites in 93.93% of experimentally verified glutathionylated proteins. The outcome of this study may assist in discovering novel glutathionylation sites and finding candidate proteins for glutathionylation.
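The position-dependent scoring idea can be sketched as a per-position log-odds table learned from positive and negative pentapeptides, summed into a sequence-level score. This is an illustrative reconstruction: the exact F-score definition in the paper may differ, and the pentapeptides below are invented toy data.

```python
import numpy as np
from collections import Counter

def position_scores(pos_peps, neg_peps, pseudo=1.0):
    """For each of the 5 positions, a log-odds score per residue comparing its
    frequency in glutathionylated vs. non-glutathionylated pentapeptides."""
    scores = [{} for _ in range(5)]
    for i in range(5):
        pc = Counter(p[i] for p in pos_peps)
        nc = Counter(p[i] for p in neg_peps)
        residues = set(pc) | set(nc)
        for r in residues:
            fp = (pc[r] + pseudo) / (len(pos_peps) + pseudo * len(residues))
            fn = (nc[r] + pseudo) / (len(neg_peps) + pseudo * len(residues))
            scores[i][r] = float(np.log(fp / fn))
    return scores

def g_score(pep, scores):
    """Sequence-level score: higher means more glutathionylation-prone."""
    return sum(scores[i].get(pep[i], 0.0) for i in range(5))

pos = ["KACDE", "RACDE", "KACEE"]   # toy glutathionylated pentapeptides
neg = ["LVCFG", "IVCFG", "LVCWG"]   # toy negatives
s = position_scores(pos, neg)
```

Applying a cut-off to such a score is then a one-line classifier, which is what makes this approach usable even when only a five-residue window around the cysteine is known.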

  4. Prediction of membrane transport proteins and their substrate specificities using primary sequence information.

    Directory of Open Access Journals (Sweden)

    Nitish K Mishra

    Full Text Available Membrane transport proteins (transporters) move hydrophilic substrates across hydrophobic membranes and play vital roles in most cellular functions. Transporters represent a diverse group of proteins that differ in topology, energy coupling mechanism, and substrate specificity as well as sequence similarity. Among the functional annotations of transporters, information about their transported substrates is especially important. The experimental identification and characterization of transporters are currently costly and time-consuming. The development of robust bioinformatics-based methods for the prediction of membrane transport proteins and their substrate specificities is therefore an important and urgent task. Support vector machine (SVM)-based computational models, which comprehensively utilize integrative protein sequence features such as amino acid composition, dipeptide composition, physico-chemical composition, biochemical composition, and position-specific scoring matrices (PSSM), were developed to predict the substrate specificity of seven transporter classes: amino acid, anion, cation, electron, protein/mRNA, sugar, and other transporters. An additional model to differentiate transporters from non-transporters was also developed. Among the developed models, the biochemical composition and PSSM hybrid model outperformed the other models and achieved an overall average prediction accuracy of 76.69% with a Matthews correlation coefficient (MCC) of 0.49 and a receiver operating characteristic area under the curve (AUC) of 0.833 on our main dataset. This model also achieved an overall average prediction accuracy of 78.88% and an MCC of 0.41 on an independent dataset. Our analyses suggest that evolutionary information (i.e., the PSSM) and the AAIndex are key features for the substrate specificity prediction of transport proteins. In comparison, similarity-based methods such as BLAST, PSI-BLAST, and hidden Markov models do not provide accurate predictions.

  5. Detecting Weather Radar Clutter by Information Fusion With Satellite Images and Numerical Weather Prediction Model Output

    DEFF Research Database (Denmark)

    Bøvith, Thomas; Nielsen, Allan Aasbjerg; Hansen, Lars Kai

    2006-01-01

    A method for detecting clutter in weather radar images by information fusion is presented. Radar data, satellite images, and output from a numerical weather prediction model are combined, and the radar echoes are classified using supervised classification. The presented method uses indirect information on precipitation in the atmosphere from Meteosat-8 multispectral images and near-surface temperature estimates from the DMI-HIRLAM-S05 numerical weather prediction model. Alternatively, an operational nowcasting product called 'Precipitating Clouds' based on Meteosat-8 input is used. A scale…

  6. Predicting RNA Structure Using Mutual Information

    DEFF Research Database (Denmark)

    Freyhult, E.; Moulton, V.; Gardner, P. P.

    2005-01-01

    , to display and predict conserved RNA secondary structure (including pseudoknots) from an alignment. Results: We show that MIfold can be used to predict simple pseudoknots, and that the performance can be adjusted to make it either more sensitive or more selective. We also demonstrate that the overall...... package. Conclusion: MIfold provides a useful supplementary tool to programs such as RNA Structure Logo, RNAalifold and COVE, and should be useful for automatically generating structural predictions for databases such as Rfam. Availability: MIfold is freely available from http......Background: With the ever-increasing number of sequenced RNAs and the establishment of new RNA databases, such as the Comparative RNA Web Site and Rfam, there is a growing need for accurately and automatically predicting RNA structures from multiple alignments. Since RNA secondary structure...

  7. Genome-Wide Prediction of SH2 Domain Targets Using Structural Information and the FoldX Algorithm

    DEFF Research Database (Denmark)

    Sanchez, Ignacio E.; Beltrao, Pedro; Stricher, Francois

    2008-01-01

    validated the predictions using literature-derived SH2 interactions and a probabilistic score obtained from a naive Bayes integration of information on coexpression, conservation of the interaction in other species, shared interaction partners, and functions. We show how our predictions lead to a new...

  8. Cognitive factors predicting intentions to search for health information: an application of the theory of planned behaviour.

    Science.gov (United States)

    Austvoll-Dahlgren, Astrid; Falk, Ragnhild S; Helseth, Sølvi

    2012-12-01

    People's ability to obtain health information is a precondition for their effective participation in decision making about health. However, there is limited evidence describing which cognitive factors can predict the intention of people to search for health information. To test the utility of a questionnaire in predicting intentions to search for health information, and to identify important predictors associated with this intention so that these could be targeted in an intervention. A questionnaire was developed based on the Theory of Planned Behaviour and tested on both a mixed population sample (n = 30) and a sample of parents (n = 45). The questionnaire was explored by testing for internal consistency, calculating inter-correlations between theoretically related constructs, and by using multiple regression analysis. The reliability and validity of the questionnaire were found to be satisfactory and consistent across the two samples. The questionnaire's direct measures predicted intention well, accounting for 47% and 55% of the variance in behavioural intentions in the two samples. Attitudes and perceived behavioural control were identified as important predictors of the intention to search for health information. The questionnaire may be a useful tool for understanding and evaluating behavioural intentions and beliefs related to searches for health information. © 2012 The authors. Health Information and Libraries Journal © 2012 Health Libraries Group.

  9. Prediction Model of Collapse Risk Based on Information Entropy and Distance Discriminant Analysis Method

    Directory of Open Access Journals (Sweden)

    Hujun He

    2017-01-01

    Full Text Available The prediction and risk classification of collapse is an important issue in the process of highway construction in mountainous regions. Based on the principles of information entropy and Mahalanobis distance discriminant analysis, we have produced a collapse hazard prediction model. We used the entropy measure method to reduce the influence indexes of collapse activity and extracted the nine main indexes affecting collapse activity as the discriminant factors of the distance discriminant analysis model (i.e., slope shape, aspect, gradient, and height, along with exposure of the structural face, stratum lithology, relationship between weakness face and free face, vegetation cover rate, and degree of rock weathering). We employed post-earthquake collapse data from construction of the Yingxiu-Wolong highway, Hanchuan County, China, as training samples for analysis. The results were analyzed using the back substitution estimation method, showing high accuracy and no errors, and matched the prediction results of the uncertainty measure method. Results show that the classification model based on information entropy and distance discriminant analysis achieves the purpose of index optimization and has excellent performance, high prediction accuracy, and a zero false-positive rate. The model can be used as a tool for future evaluation of collapse risk.
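The entropy measure step described above (weighting and reducing the influence indexes) can be sketched with the standard entropy weight method; the sample matrix below is invented, not data from the study.

```python
import math

# Sketch of the entropy weight method assumed to underlie the index-reduction
# step above: indexes whose values vary more across samples carry more
# information and receive larger weights. Rows are samples, columns are indexes;
# the numbers are invented for illustration.
def entropy_weights(matrix):
    n, m = len(matrix), len(matrix[0])
    col_sums = [sum(row[j] for row in matrix) for j in range(m)]
    # proportion of each sample's value within its column
    p = [[row[j] / col_sums[j] for j in range(m)] for row in matrix]
    k = 1.0 / math.log(n)
    weights = []
    for j in range(m):
        e = -k * sum(p[i][j] * math.log(p[i][j]) for i in range(n) if p[i][j] > 0)
        weights.append(1.0 - e)  # 1 - entropy: larger means more discriminating
    total = sum(weights)
    return [w / total for w in weights]

w = entropy_weights([[0.2, 30], [0.4, 31], [0.4, 29]])
print([round(x, 3) for x in w])  # the more discriminating index gets the larger weight
```

The Mahalanobis discriminant stage then classifies a new sample by its weighted distance to each class centroid; that step needs a covariance estimate and is not reproduced here.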

  10. NetTurnP--neural network prediction of beta-turns by use of evolutionary information and predicted protein sequence features.

    Directory of Open Access Journals (Sweden)

    Bent Petersen

    Full Text Available β-turns are the most common type of non-repetitive structures, and constitute on average 25% of the amino acids in proteins. The formation of β-turns plays an important role in protein folding, protein stability and molecular recognition processes. In this work we present the neural network method NetTurnP, for prediction of two-class β-turns and prediction of the individual β-turn types, by use of evolutionary information and predicted protein sequence features. It has been evaluated against a commonly used dataset BT426, and achieves a Matthews correlation coefficient of 0.50, which is the highest reported performance on a two-class prediction of β-turn and not-β-turn. Furthermore NetTurnP shows improved performance on some of the specific β-turn types. In the present work, neural network methods have been trained to predict β-turn or not and individual β-turn types from the primary amino acid sequence. The individual β-turn types I, I', II, II', VIII, VIa1, VIa2, VIb and IV have been predicted based on classifications by PROMOTIF, and the two-class prediction of β-turn or not is a superset comprised of all β-turn types. The performance is evaluated using a golden set of non-homologous sequences known as BT426. Our two-class prediction method achieves a performance of: MCC=0.50, Qtotal=82.1%, sensitivity=75.6%, PPV=68.8% and AUC=0.864. We have compared our performance to eleven other prediction methods that obtain Matthews correlation coefficients in the range of 0.17-0.47. For the type specific β-turn predictions, only types I and II can be predicted with reasonable Matthews correlation coefficients, where we obtain performance values of 0.36 and 0.31, respectively. The NetTurnP method has been implemented as a webserver, which is freely available at http://www.cbs.dtu.dk/services/NetTurnP/. NetTurnP is the only available webserver that allows submission of multiple sequences.

  11. NetTurnP--neural network prediction of beta-turns by use of evolutionary information and predicted protein sequence features.

    Science.gov (United States)

    Petersen, Bent; Lundegaard, Claus; Petersen, Thomas Nordahl

    2010-11-30

    β-turns are the most common type of non-repetitive structures, and constitute on average 25% of the amino acids in proteins. The formation of β-turns plays an important role in protein folding, protein stability and molecular recognition processes. In this work we present the neural network method NetTurnP, for prediction of two-class β-turns and prediction of the individual β-turn types, by use of evolutionary information and predicted protein sequence features. It has been evaluated against a commonly used dataset BT426, and achieves a Matthews correlation coefficient of 0.50, which is the highest reported performance on a two-class prediction of β-turn and not-β-turn. Furthermore NetTurnP shows improved performance on some of the specific β-turn types. In the present work, neural network methods have been trained to predict β-turn or not and individual β-turn types from the primary amino acid sequence. The individual β-turn types I, I', II, II', VIII, VIa1, VIa2, VIb and IV have been predicted based on classifications by PROMOTIF, and the two-class prediction of β-turn or not is a superset comprised of all β-turn types. The performance is evaluated using a golden set of non-homologous sequences known as BT426. Our two-class prediction method achieves a performance of: MCC=0.50, Qtotal=82.1%, sensitivity=75.6%, PPV=68.8% and AUC=0.864. We have compared our performance to eleven other prediction methods that obtain Matthews correlation coefficients in the range of 0.17-0.47. For the type specific β-turn predictions, only types I and II can be predicted with reasonable Matthews correlation coefficients, where we obtain performance values of 0.36 and 0.31, respectively. The NetTurnP method has been implemented as a webserver, which is freely available at http://www.cbs.dtu.dk/services/NetTurnP/. NetTurnP is the only available webserver that allows submission of multiple sequences.
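As a small aside on the headline metric: the Matthews correlation coefficient quoted in both NetTurnP records is computed from the four confusion-matrix counts. A minimal sketch with invented counts:

```python
import math

# Sketch of the Matthews correlation coefficient (MCC) reported above,
# computed from confusion-matrix counts. The counts here are invented,
# not NetTurnP's actual evaluation numbers.
def mcc(tp, tn, fp, fn):
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

score = mcc(90, 80, 20, 10)
print(round(score, 3))  # → 0.704
```

Unlike plain accuracy, MCC stays near 0 for trivial predictors on imbalanced data, which is why it is the standard metric for β-turn prediction (turns are only ~25% of residues).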

  12. A Mine of Information.

    Science.gov (United States)

    Williams, Lisa B.

    1986-01-01

    Business researchers and marketers find certain databases useful for finding information on investments, competitors, products, and markets. Colleges can use these same databases to get background on corporate prospects. DIALOG Information Services, the largest data source available, and some other databases are described. (MLW)

  13. Resolution, Scales and Predictability: Is High Resolution Detrimental To Predictability At Extended Forecast Times?

    Science.gov (United States)

    Mesinger, F.

    The traditional views hold that high-resolution limited area models (LAMs) downscale large-scale lateral boundary information, and that predictability of small scales is short. Inspection of various rms fits/errors has contributed to these views. It would follow that the skill of LAMs should visibly deteriorate compared to that of their driver models at more extended forecast times. The limited area Eta Model at NCEP has an additional handicap of being driven by LBCs from the previous Avn global model run, estimated at 0000 and 1200 UTC to amount to about an 8 h loss in accuracy. This should make its relative skill compared to that of the Avn deteriorate even faster. These views are challenged by various Eta results, including rms fits to raobs out to 84 h. It is argued that it is the largest scales that contribute the most to the skill of the Eta relative to that of the Avn.

  14. A comparison of SAR ATR performance with information theoretic predictions

    Science.gov (United States)

    Blacknell, David

    2003-09-01

    Performance assessment of automatic target detection and recognition algorithms for SAR systems (or indeed any other sensors) is essential if the military utility of the system / algorithm mix is to be quantified. This is a relatively straightforward task if extensive trials data from an existing system is used. However, a crucial requirement is to assess the potential performance of novel systems as a guide to procurement decisions. This task is no longer straightforward since a hypothetical system cannot provide experimental trials data. QinetiQ has previously developed a theoretical technique for classification algorithm performance assessment based on information theory. The purpose of the study presented here has been to validate this approach. To this end, experimental SAR imagery of targets has been collected using the QinetiQ Enhanced Surveillance Radar to allow algorithm performance assessments as a number of parameters are varied. In particular, performance comparisons can be made for (i) resolutions up to 0.1m, (ii) single channel versus polarimetric (iii) targets in the open versus targets in scrubland and (iv) use versus non-use of camouflage. The change in performance as these parameters are varied has been quantified from the experimental imagery whilst the information theoretic approach has been used to predict the expected variation of performance with parameter value. A comparison of these measured and predicted assessments has revealed the strengths and weaknesses of the theoretical technique as will be discussed in the paper.

  15. A network integration approach for drug-target interaction prediction and computational drug repositioning from heterogeneous information.

    Science.gov (United States)

    Luo, Yunan; Zhao, Xinbin; Zhou, Jingtian; Yang, Jinglin; Zhang, Yanqing; Kuang, Wenhua; Peng, Jian; Chen, Ligong; Zeng, Jianyang

    2017-09-18

    The emergence of large-scale genomic, chemical and pharmacological data provides new opportunities for drug discovery and repositioning. In this work, we develop a computational pipeline, called DTINet, to predict novel drug-target interactions from a constructed heterogeneous network, which integrates diverse drug-related information. DTINet focuses on learning a low-dimensional vector representation of features, which accurately explains the topological properties of individual nodes in the heterogeneous network, and then makes predictions based on these representations via a vector space projection scheme. DTINet achieves substantial performance improvement over other state-of-the-art methods for drug-target interaction prediction. Moreover, we experimentally validate the novel interactions between three drugs and the cyclooxygenase proteins predicted by DTINet, and demonstrate the new potential applications of these identified cyclooxygenase inhibitors in preventing inflammatory diseases. These results indicate that DTINet can provide a practically useful tool for integrating heterogeneous information to predict new drug-target interactions and repurpose existing drugs. Network-based data integration for drug-target prediction is a promising avenue for drug repositioning, but performance is wanting. Here, the authors introduce DTINet, whose performance is enhanced in the face of noisy, incomplete and high-dimensional biological data by learning low-dimensional vector representations.
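The final scoring step of a DTINet-style pipeline can be caricatured as ranking targets by the similarity of learned low-dimensional vectors. The embeddings below are invented, and plain cosine similarity stands in for DTINet's learned projection matrix:

```python
import math

# Toy sketch of the scoring stage described above: once the network embedding
# step has produced a low-dimensional vector per drug and per target, candidate
# interactions are ranked by a similarity score. The vectors and target names
# are invented; DTINet itself learns a projection rather than using raw cosine.
def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

drug_vec = [0.9, 0.1, 0.3]
target_vecs = {"COX1": [0.8, 0.2, 0.4], "ABC1": [0.1, 0.9, 0.0]}
scores = {t: cosine(drug_vec, v) for t, v in target_vecs.items()}
best = max(scores, key=scores.get)
print(best)  # the target whose embedding lies closest to the drug's
```

The point of the low-dimensional representation is exactly this: after embedding, prediction reduces to cheap geometry in a shared vector space.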

  16. Advances in criticality predictions for EBR-II

    International Nuclear Information System (INIS)

    Schaefer, R.W.; Imel, G.R.

    1994-01-01

    Improvements to startup criticality predictions for the EBR-II reactor have been made. More exact calculational models, methods and data are now used, and better procedures for obtaining experimental data that enter into the prediction are in place. Accuracy improved by more than a factor of two and the largest ECP error observed since the changes is only 18 cents. An experimental method using subcritical counts is also being implemented

  17. Lagisza, world's largest CFB boiler, begins commercial operation

    Energy Technology Data Exchange (ETDEWEB)

    Nuortimo, K. [Foster Wheeler, Varkaus (Finland)

    2010-04-15

    Early operating experience with the Lagisza circulating fluidised bed (CFB) boiler in Poland - the world's largest such boiler to date, and also the first one with supercritical steam conditions - has been positive. 3 figs., 4 tabs.

  18. Temporal properties of seismicity and largest earthquakes in SE Carpathians

    Directory of Open Access Journals (Sweden)

    S. Byrdina

    2006-01-01

    Full Text Available In order to estimate the hazard rate distribution of the largest seismic events in Vrancea, South-Eastern Carpathians, we study temporal properties of historical and instrumental catalogues of seismicity. First, on the basis of Generalized Extreme Value theory we estimate the average return period of the largest events. Then, following Bak et al. (2002) and Corral (2005a), we study scaling properties of recurrence times between earthquakes in appropriate spatial volumes. We come to the conclusion that the seismicity is temporally clustered, and that the distribution of recurrence times is significantly different from a Poisson process even for times largely exceeding the corresponding periods of foreshock and aftershock activity. Modeling the recurrence times by a gamma distributed variable, we finally estimate hazard rates with respect to the time elapsed from the last large earthquake.
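A quick diagnostic in the spirit of this record: for a Poisson process the interevent (recurrence) times are exponential with coefficient of variation 1, so a CoV well above 1 signals the temporal clustering the authors report. The interevent times below are invented:

```python
import statistics

# Sketch of a simple clustering check consistent with the abstract above:
# exponential (Poisson) recurrence times have coefficient of variation (CoV) 1;
# gamma-distributed times with CoV > 1 indicate clustering. The interevent
# intervals (in years) are invented, not Vrancea catalogue data.
def coefficient_of_variation(times):
    return statistics.pstdev(times) / statistics.mean(times)

clustered = [0.1, 0.2, 0.1, 5.0, 0.3, 6.0]  # bursts separated by long quiet spells
cov = coefficient_of_variation(clustered)
print(cov > 1.0)  # True → inconsistent with a memoryless Poisson process
```

A gamma fit with shape parameter below 1 would tell the same story, and is what lets the paper compute hazard rates as a function of time since the last large event.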

  19. A marine heatwave drives massive losses from the world’s largest seagrass carbon stocks

    KAUST Repository

    Arias-Ortiz, A.

    2018-03-29

    Seagrass ecosystems contain globally significant organic carbon (C) stocks. However, climate change and increasing frequency of extreme events threaten their preservation. Shark Bay, Western Australia, has the largest C stock reported for a seagrass ecosystem, containing up to 1.3% of the total C stored within the top metre of seagrass sediments worldwide. On the basis of field studies and satellite imagery, we estimate that 36% of Shark Bay’s seagrass meadows were damaged following a marine heatwave in 2010/2011. Assuming that 10 to 50% of the seagrass sediment C stock was exposed to oxic conditions after disturbance, between 2 and 9 Tg CO2 could have been released to the atmosphere during the following three years, increasing emissions from land-use change in Australia by 4–21% per annum. With heatwaves predicted to increase with further climate warming, conservation of seagrass ecosystems is essential to avoid adverse feedbacks on the climate system.

  20. PNNL supercomputer to become largest computing resource on the Grid

    CERN Multimedia

    2002-01-01

    Hewlett Packard announced that the US DOE Pacific Northwest National Laboratory will connect a 9.3-teraflop HP supercomputer to the DOE Science Grid. This will be the largest supercomputer attached to a computer grid anywhere in the world (1 page).

  1. World's largest particle physics laboratory selects Proxim Wireless Mesh

    CERN Multimedia

    2007-01-01

    "Proxim Wireless has announced that the European Organization for Nuclear Research (CERN), the world's largest particle physics laboratory and the birthplace of the World Wide Web, is using its ORiNOCO AP-4000 mesh access points to extend the range of the laboratory's Wi-Fi network and to provide continuous monitoring of the lab's calorimeters" (1/2 page)

  2. Discovery of the largest orbweaving spider species: the evolution of gigantism in Nephila.

    Science.gov (United States)

    Kuntner, Matjaz; Coddington, Jonathan A

    2009-10-21

    More than 41,000 spider species are known with about 400-500 added each year, but for some well-known groups, such as the giant golden orbweavers, Nephila, the last valid described species dates from the 19th century. Nephila are renowned for being the largest web-spinning spiders, making the largest orb webs, and are model organisms for the study of extreme sexual size dimorphism (SSD) and sexual biology. Here, we report on the discovery of a new, giant Nephila species from Africa and Madagascar, and review size evolution and SSD in Nephilidae. We formally describe N. komaci sp. nov., the largest web spinning species known, and place the species in phylogenetic context to reconstruct the evolution of mean size (via squared change parsimony). We then test female and male mean size correlation using phylogenetically independent contrasts, and simulate nephilid body size evolution using Monte Carlo statistics. Nephila females increased in size almost monotonically to establish a mostly African clade of true giants. In contrast, Nephila male size is effectively decoupled and hovers around values roughly one fifth of female size. Although N. komaci females are the largest Nephila yet discovered, the males are also large and thus their SSD is not exceptional.

  3. Discovery of the largest orbweaving spider species: the evolution of gigantism in Nephila.

    Directory of Open Access Journals (Sweden)

    Matjaz Kuntner

    2009-10-01

    Full Text Available More than 41,000 spider species are known with about 400-500 added each year, but for some well-known groups, such as the giant golden orbweavers, Nephila, the last valid described species dates from the 19th century. Nephila are renowned for being the largest web-spinning spiders, making the largest orb webs, and are model organisms for the study of extreme sexual size dimorphism (SSD) and sexual biology. Here, we report on the discovery of a new, giant Nephila species from Africa and Madagascar, and review size evolution and SSD in Nephilidae. We formally describe N. komaci sp. nov., the largest web spinning species known, and place the species in phylogenetic context to reconstruct the evolution of mean size (via squared change parsimony). We then test female and male mean size correlation using phylogenetically independent contrasts, and simulate nephilid body size evolution using Monte Carlo statistics. Nephila females increased in size almost monotonically to establish a mostly African clade of true giants. In contrast, Nephila male size is effectively decoupled and hovers around values roughly one fifth of female size. Although N. komaci females are the largest Nephila yet discovered, the males are also large and thus their SSD is not exceptional.

  4. On the statistics of the largest geomagnetic storms per solar cycle

    International Nuclear Information System (INIS)

    Siscoe, G.L.

    1976-01-01

    The theory of extreme value statistics is applied to the first, second, and third largest geomagnetic storms in nine solar cycles measured by the average half-daily aa indices compiled by Mayaud. Analytic expressions giving the probability of the extremes per solar cycle as a contour function of storm magnitude are obtained by least squares fitting of the observations to the appropriate theoretical extreme value probability functions. The results are used to obtain the statistical characteristics (mode, median, mean, and standard deviation) of the extreme values. The results are applied to find the expected range of extreme values in a set as a function of the number of solar cycles in the set. We find that the expected range of the largest storm is quite narrow and is larger for the second and third largest storms. The observed range of the extreme half-daily aa index for the nine solar cycles is 354--546 γ. In a set of 100 cycles the range is expanded essentially to 311--680 γ, an increase of only 39% in the range. The result supports the argument for a change in solar cycle statistics in the latter part of the Seventeenth Century (the Maunder minimum)
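The extreme value fit described here can be sketched with a Gumbel model for the per-cycle maxima. The aa values below are invented, and a method-of-moments fit stands in for the paper's least-squares fit:

```python
import math
import statistics

# Sketch of an extreme-value fit in the spirit of the record above: one maximum
# half-daily aa value per solar cycle, modelled as Gumbel-distributed. The data
# are invented, and the fit uses the method of moments rather than the paper's
# least-squares procedure.
def gumbel_fit(maxima):
    beta = math.sqrt(6) * statistics.pstdev(maxima) / math.pi  # scale
    mu = statistics.mean(maxima) - 0.5772 * beta               # location (Euler–Mascheroni)
    return mu, beta

def gumbel_cdf(x, mu, beta):
    """Probability that the per-cycle maximum does not exceed x."""
    return math.exp(-math.exp(-(x - mu) / beta))

maxima = [354, 380, 402, 415, 430, 460, 480, 510, 546]  # one invented value per cycle
mu, beta = gumbel_fit(maxima)
print(round(gumbel_cdf(600, mu, beta), 3))  # chance a cycle's maximum stays below 600
```

Raising the CDF to the power N gives the distribution of the largest storm over N cycles, which is how the expected range widens only slowly from 9 to 100 cycles.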

  5. City-ecological perspectives of the development of high urbanized multifunctional centers of the largest Russian cities

    Directory of Open Access Journals (Sweden)

    Kolesnikov Sergey Anatol’evich

    2015-01-01

    Full Text Available This article presents some results of the author's dissertation research dedicated to the formation of an architectural typology of highly urbanized multifunctional units of the urban structure of the largest cities (hereafter HUMUUS) as centers of social activity, which include buildings, constructions, transportation facilities and open spaces where human flows start, end and transfer, with the purpose of concentrating in this space a maximum of goods, services and information with minimum time expenditure. The article analyzes the development of the structure-forming functions of HUMUUS and their town-planning and environmental impact on the surrounding area. A study of the planning structures of the largest Russian cities (Samara, Kazan, Nizhny Novgorod) made it possible to identify the main objects in which the structure-forming functions of HUMUUS are materialized: the railroad complex (historically formed, developed, dominant), the system-wide road junction, the transport interchange hub (providing intraurban connections), public office and business centers, leisure and entertainment centers, and shopping centers. Based on studies of Russian and foreign experience, the following trends in the environmental and urban development of HUMUUS can be predicted with confidence for the near term: strengthening of the environmental and urban framework through the network evolution of HUMUUS; inclusion of the green areas of HUMUUS in the system of citywide green areas; growing investor interest in public road junctions with a view to reorganizing them into full HUMUUS with all the characteristics of a highly urbanized environmental and urban reorganization (separation of traffic and pedestrian flows, maximum capacity, a multiple-level system, multifunctionality, an increase in landscaped green space, reconstruction of engineering systems and communications, the use of modern ecological building designs and

  6. Contribution of temporal data to predictive performance in 30-day readmission of morbidly obese patients

    Directory of Open Access Journals (Sweden)

    Petra Povalej Brzan

    2017-04-01

    Full Text Available Background Reduction of readmissions after discharge represents an important challenge for many hospitals and has attracted the interest of many researchers in the past few years. Most of the studies in this field focus on building cross-sectional predictive models that aim to predict the occurrence of readmission within 30 days based on information from the current hospitalization. The aim of this study is to demonstrate the predictive performance gain obtained by including information from historical hospitalization records of morbidly obese patients. Methods The California Statewide inpatient database was used to build regularized logistic regression models for prediction of readmission in morbidly obese patients (n = 18,881). Temporal features were extracted from historical patient hospitalization records in a one-year timeframe. Five different datasets of patients were prepared based on the number of available hospitalizations per patient. Sample sizes of the five datasets ranged from 4,787 patients with more than five hospitalizations to 20,521 patients with at least two hospitalization records in one year. A 10-fold cross-validation was repeated 100 times to assess the variability of the results. Additionally, random forest and extreme gradient boosting were used to confirm the results. Results The area under the ROC curve increased significantly when including information from up to three historical records on all datasets. The inclusion of more than three historical records was not efficient. Similar results can be observed for the Brier score and PPV. The number of selected predictors corresponded to the complexity of the dataset, ranging from an average of 29.50 selected features on the smallest dataset to 184.96 on the largest, based on 100 repetitions of 10-fold cross-validation. Discussion The results show the positive influence of adding information from historical hospitalization records on predictive performance using all
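The temporal feature extraction this study credits for the performance gain can be sketched as deriving summary features from up to three prior records; the field names and records below are invented, not the California Statewide schema.

```python
# Sketch of the temporal feature step described above: for each patient, derive
# summary features from up to three hospitalizations preceding the current one
# within the one-year window. The field names ("los_days", "readmitted") and
# records are invented; the study uses many more variables.
def temporal_features(history, max_prior=3):
    prior = history[-(max_prior + 1):-1]  # up to 3 records before the current one
    lengths = [h["los_days"] for h in prior]
    return {
        "n_prior": len(prior),
        "mean_prior_los": sum(lengths) / len(lengths) if lengths else 0.0,
        "any_prior_readmit": int(any(h["readmitted"] for h in prior)),
    }

history = [
    {"los_days": 3, "readmitted": False},
    {"los_days": 7, "readmitted": True},
    {"los_days": 2, "readmitted": False},
    {"los_days": 5, "readmitted": None},  # current admission, outcome unknown
]
print(temporal_features(history))
```

Features like these are appended to the current-admission variables before fitting the regularized logistic regression; the study's finding is that going back further than three records adds nothing.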

  7. Use of secondary structural information and Cα-Cα distance ...

    Indian Academy of Sciences (India)

    PRAKASH KUMAR

    2007-06-21

    Jun 21, 2007 ... Model evolution; protein modelling; residue contact prediction; secondary structure prediction. Abbreviations used: ... set of sequence data (NR) and calculated conservation index of each ... evaluators (Moult et al 2003) to evaluate these model ... (Siew et al 2000), is a measure aims at identifying the largest.

  8. Historical maintenance relevant information road-map for a self-learning maintenance prediction procedural approach

    Science.gov (United States)

    Morales, Francisco J.; Reyes, Antonio; Cáceres, Noelia; Romero, Luis M.; Benitez, Francisco G.; Morgado, Joao; Duarte, Emanuel; Martins, Teresa

    2017-09-01

    A large percentage of transport infrastructures are composed of linear assets, such as roads and rail tracks. The great social and economic relevance of these constructions forces the stakeholders to ensure their prolonged health and durability. Even so, malfunctions, breakdowns, and out-of-service periods inevitably arise at random during the life cycle of the infrastructure. Predictive maintenance techniques tend to diminish the occurrence of unpredicted failures and the execution of corrective interventions by envisaging the appropriate interventions to be conducted before failures show up. This communication presents: i) a procedural approach for collecting the relevant information regarding the evolving condition of the assets involved in all maintenance interventions; this reported and stored information constitutes a rich historical database for training machine learning algorithms to generate reliable predictions of the interventions to be carried out in future time scenarios; ii) a schematic flow chart of the automatic learning procedure; iii) self-learning rules derived automatically from false positives/negatives. The description, testing and automatic learning approach of a pilot case are presented, together with its outcomes; finally, some conclusions are outlined regarding the methodology proposed for improving the self-learning predictive capability.

  9. EKORISK project - an information system for prediction and expert evaluation of environmental impact

    International Nuclear Information System (INIS)

    Zaimov, V.; Antonov, A.

    1993-01-01

    The aim of this project is to create an expert system for prediction, evaluation and decision making support in case of accidents. The system consists of the following modules: 1) A database containing information about the situation - geographical and demographical data for the region of the accident as well as data about the contaminants. The data about geographic objects (boundaries, rivers, roads, towns, soils, etc.) are managed and visualized by a geographic information system (GIS), which produces multi-layer geographical maps showing different viewpoints of the region of interest. Information about the pollutants, their use and storage, as well as data about the available resources for action in case of accidents, is stored in relational databases which guarantee easy access, search, sorting and proper visualisation. 2) Prediction of the propagation of contamination, using actual meteorological information and applying mathematical models for the propagation of the spilled substances in the air, water and ground. These models calculate the concentration of the substance as a function of time and distance from the initial spill location. The choice of the proper model is made by applying expert knowledge for evaluation of the situation and comparing the model characteristics. 3) Suggesting actions for minimising the accident's impact. Expert knowledge is used for recommendations concerning decontamination of the region as well as actions for reducing the absorbed radiation doses of the population. The modern technologies for knowledge processing and the object-oriented approach ensure flexibility and integration of all subsystems. (author)

  10. Position-specific prediction of methylation sites from sequence conservation based on information theory.

    Science.gov (United States)

    Shi, Yinan; Guo, Yanzhi; Hu, Yayun; Li, Menglong

    2015-07-23

    Protein methylation plays vital roles in many biological processes and has been implicated in various human diseases. To fully understand the mechanisms underlying methylation for use in drug design and work on methylation-related diseases, an initial but crucial step is to identify methylation sites. The use of high-throughput bioinformatics methods has become imperative to predict methylation sites. In this study, we developed a novel method that is based only on sequence conservation to predict protein methylation sites. Conservation difference profiles between methylated and non-methylated peptides were constructed using the information entropy (IE) in a wide neighbor interval around the methylation sites that fully incorporates the environmental information. Then, the distinctive neighbor residues were identified by the importance scores of information gain (IG). The most representative models were constructed using support vector machines (SVM) for arginine and lysine methylation, respectively. These models yielded promising results on both the benchmark dataset and the independent test set. The models were used to screen the entire human proteome, and many unknown substrates were identified. These results indicate that our method can serve as a useful supplement to elucidate the mechanism of protein methylation and facilitate hypothesis-driven experimental design and validation.
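The information gain scoring used to pick distinctive neighbor residues can be sketched directly from its definition, the class entropy minus the residue-conditional entropy; the peptide data below are invented.

```python
import math

# Sketch of information-gain scoring for one flanking position around candidate
# methylation sites: IG = H(label) - H(label | residue at that position).
# The residues and methylation labels below are invented toy data.
def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n)
                for c in (labels.count(v) for v in set(labels)))

def information_gain(residues, labels):
    base = entropy(labels)
    n = len(labels)
    cond = 0.0
    for r in set(residues):
        idx = [i for i, x in enumerate(residues) if x == r]
        cond += len(idx) / n * entropy([labels[i] for i in idx])
    return base - cond

# residue at one flanking position across six peptides, with labels 1 = methylated
residues = ["G", "G", "G", "A", "A", "A"]
labels = [1, 1, 1, 0, 0, 0]
ig = information_gain(residues, labels)
print(ig)  # → 1.0: this position perfectly separates the two classes
```

Positions scored this way can then be ranked, and only the highest-IG neighbors fed to the SVM, which is the feature-selection role IG plays in the abstract.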

  11. Gray correlation analysis and prediction models of living refuse generation in Shanghai city.

    Science.gov (United States)

    Liu, Gousheng; Yu, Jianguo

    2007-01-01

A better understanding of the factors that affect the generation of municipal living refuse (MLF) and the accurate prediction of its generation are crucial for municipal planning projects and city management. Up to now, most design efforts have been based on a rough prediction of MLF without any actual support. In this paper, based on published data of socioeconomic variables and MLF generation from 1990 to 2003 in the city of Shanghai, the main factors that affect MLF generation have been quantitatively studied using the method of gray correlation coefficients. Several gray models, such as GM(1,1), GIM(1), GPPM(1) and GLPM(1), have been studied, and the predicted results are verified with a subsequent residual test. Results show that, among the selected seven factors, consumption of gas, water and electricity are the three largest factors affecting MLF generation, and GLPM(1) is the optimal model for predicting MLF generation. Through this model, the predicted MLF generation in 2010 in Shanghai will be 7.65 million tons. The methods and results developed in this paper can provide valuable information for MLF management and related municipal planning projects.
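For readers unfamiliar with grey models, a minimal GM(1,1) sketch follows. The series is a made-up geometric toy, and the study's optimized GLPM(1) variant differs from this textbook form:

```python
import math

def gm11(x0, horizon=1):
    """Minimal GM(1,1) grey model: fit on series x0, forecast `horizon` steps.
    A sketch of the standard algorithm, not the paper's GLPM(1) variant."""
    n = len(x0)
    x1 = [sum(x0[:k + 1]) for k in range(n)]                # accumulated series (1-AGO)
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]    # background values
    y = x0[1:]
    # Least squares for a, b in x0(k) = -a*z(k) + b (2x2 normal equations).
    szz = sum(v * v for v in z); sz = sum(z); sy = sum(y)
    szy = sum(zi * yi for zi, yi in zip(z, y)); m = n - 1
    det = szz * m - sz * sz
    a = -(m * szy - sz * sy) / det
    b = (szz * sy - sz * szy) / det
    def x1_hat(k):  # whitened response of the accumulated series, k = 0-based
        return (x0[0] - b / a) * math.exp(-a * k) + b / a
    # Forecasts of the original series are first differences of x1_hat.
    return [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + horizon)]

x0 = [1.0, 1.5, 2.25, 3.375]  # toy series growing by a factor of 1.5
print(gm11(x0, horizon=1))
```

GM(1,1) assumes roughly exponential growth, which is why it suits short socioeconomic series like annual refuse totals.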

  12. Revisiting the phylogeny of Ocellularieae, the second largest tribe within Graphidaceae (lichenized Ascomycota: Ostropales)

    Science.gov (United States)

    Ekaphan Kraichak; Sittiporn Parnmen; Robert Lücking; Eimy Rivas Plata; Andre Aptroot; Marcela E.S. Caceres; Damien Ertz; Armin Mangold; Joel A. Mercado-Diaz; Khwanruan Papong; Dries Van der Broeck; Gothamie Weerakoon; H. Thorsten Lumbsch

    2014-01-01

    We present an updated 3-locus molecular phylogeny of tribe Ocellularieae, the second largest tribe within subfamily Graphidoideae in the Graphidaceae. Adding 165 newly generated sequences from the mitochondrial small subunit rDNA (mtSSU), the nuclear large subunit rDNA (nuLSU), and the second largest subunit of the DNA-directed RNA polymerase II (RPB2), we currently...

  13. Ultrasonography-guided Fine-needle Aspiration for Solid Thyroid Nodules Less than 5 mm in the Largest Diameter: Comparison in Diagnostic Adequacy and Accuracy According to Nodule Size

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Hee; Kim, Dong Wook; Baek, Seung Hun [Busan Paik Hospital, Inje University College of Medicine, Busan (Korea, Republic of)

    2012-03-15

    This study assessed the adequacy and accuracy of ultrasonography (US)-guided fine-needle aspiration (US-FNA) of solid thyroid nodules less than 5 mm in maximum diameter. From January to December 2009, US-FNA was performed for small solid thyroid nodules in 201 patients. Each thyroid nodule was classified into group A or B according to the largest diameter (1 mm ≤ group A < 3 mm and 3 mm ≤ group B < 5 mm). The adequacy and accuracy of US-FNA in the two groups were compared using the histopathological results as a reference standard. Of the 227 thyroid nodules in 201 patients, the inadequacy of US-FNA in groups A and B was 24.3% (18/74) and 13.1% (20/153), respectively, showing a statistically significant difference between the two groups (p = 0.0333, chi-square test). Eighty nodules were removed surgically in 72 patients, from which papillary thyroid carcinoma (n = 52), follicular thyroid carcinoma (n = 1), nodular hyperplasia (n = 26), and pseudonodule related to thyroiditis (n = 1) were confirmed. Based on the histopathological results of the 80 surgical nodules, the sensitivity, specificity, positive predictive value, negative predictive value and accuracy of US-FNA in groups A and B were 55.0% and 79.4%, 81.8% and 100%, 84.6% and 100%, 50% and 68.2%, and 64.5% and 85.7%, respectively. The adequacy and accuracy of US-FNA for solid thyroid nodules ≥ 3 mm in the largest diameter were higher than those for very small nodules, < 3 mm in the largest diameter.

  14. Fine-Tuning Nonhomogeneous Regression for Probabilistic Precipitation Forecasts: Unanimous Predictions, Heavy Tails, and Link Functions

    DEFF Research Database (Denmark)

    Gebetsberger, Manuel; Messner, Jakob W.; Mayr, Georg J.

    2017-01-01

    …to obtain automatically corrected weather forecasts. This study applies the nonhomogeneous regression framework as a state-of-the-art ensemble postprocessing technique to predict a full forecast distribution and improves its forecast performance with three statistical refinements. First of all, a novel split… functions for the optimization of regression coefficients for the scale parameter. These three refinements are tested for 10 stations in a small area of the European Alps for lead times from +24 to +144 h and accumulation periods of 24 and 6 h. Together, they improve probabilistic forecasts for precipitation amounts as well as the probability of precipitation events over the default postprocessing method. The improvements are largest for the shorter accumulation periods and shorter lead times, where the information of unanimous ensemble predictions is more important.

  15. Genetic Variance Partitioning and Genome-Wide Prediction with Allele Dosage Information in Autotetraploid Potato.

    Science.gov (United States)

    Endelman, Jeffrey B; Carley, Cari A Schmitz; Bethke, Paul C; Coombs, Joseph J; Clough, Mark E; da Silva, Washington L; De Jong, Walter S; Douches, David S; Frederick, Curtis M; Haynes, Kathleen G; Holm, David G; Miller, J Creighton; Muñoz, Patricio R; Navarro, Felix M; Novy, Richard G; Palta, Jiwan P; Porter, Gregory A; Rak, Kyle T; Sathuvalli, Vidyasagar R; Thompson, Asunta L; Yencho, G Craig

    2018-05-01

    As one of the world's most important food crops, the potato (Solanum tuberosum L.) has spurred innovation in autotetraploid genetics, including in the use of SNP arrays to determine allele dosage at thousands of markers. By combining genotype and pedigree information with phenotype data for economically important traits, the objectives of this study were to (1) partition the genetic variance into additive vs. nonadditive components, and (2) determine the accuracy of genome-wide prediction. Between 2012 and 2017, a training population of 571 clones was evaluated for total yield, specific gravity, and chip fry color. Genomic covariance matrices for additive (G), digenic dominant (D), and additive × additive epistatic (G#G) effects were calculated using 3895 markers, and the numerator relationship matrix (A) was calculated from a 13-generation pedigree. Based on model fit and prediction accuracy, mixed model analysis with G was superior to A for yield and fry color but not specific gravity. The amount of additive genetic variance captured by markers was 20% of the total genetic variance for specific gravity, compared to 45% for yield and fry color. Within the training population, including nonadditive effects improved accuracy and/or bias for all three traits when predicting total genotypic value. When six F1 populations were used for validation, prediction accuracy ranged from 0.06 to 0.63 and was consistently lower (0.13 on average) without allele dosage information. We conclude that genome-wide prediction is feasible in potato and that it will improve selection for breeding value given the substantial amount of nonadditive genetic variance in elite germplasm. Copyright © 2018 by the Genetics Society of America.
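A minimal sketch of building a marker-based additive relationship matrix G from allele dosages, the general idea behind the G matrix in this record. The dosage rows below are invented, and the simple mean-diagonal scaling is an assumption; the exact autotetraploid scaling used in the study may differ:

```python
def additive_G(dosage):
    """VanRaden-style additive genomic relationship matrix from a clones x
    markers dosage table (autotetraploid entries 0..4): centre each marker
    by its mean dosage, form W W', and scale so the mean diagonal is 1."""
    n, m = len(dosage), len(dosage[0])
    means = [sum(row[j] for row in dosage) / n for j in range(m)]
    W = [[row[j] - means[j] for j in range(m)] for row in dosage]
    G = [[sum(W[i][k] * W[j][k] for k in range(m)) for j in range(n)]
         for i in range(n)]
    scale = sum(G[i][i] for i in range(n)) / n
    return [[g / scale for g in row] for row in G]

# Hypothetical dosages for three clones at three markers.
G = additive_G([[0, 2, 4], [1, 2, 3], [4, 0, 1]])
for row in G:
    print([round(g, 3) for g in row])
```

In a mixed model, G replaces the pedigree-based A matrix as the covariance structure of additive genetic effects.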

  16. CERN, World's largest particle physics lab, selects Progress SonicMQ

    CERN Document Server

    2007-01-01

    "Progress Software Corporation (NASDAQ: PRGS), a global supplier of application infrastructure software used to develop, deploy, integrate and manage business applications, today announced that CERN, the world's largest particle physics laboratory and particle accelerator, has chosen Progress® SonicMQ® for mission-critical message delivery." (1 page)

  17. Distribution of the Largest Eigenvalues of the Levi-Smirnov Ensemble

    International Nuclear Information System (INIS)

    Wieczorek, W.

    2004-01-01

    We calculate the distribution of the k-th largest eigenvalue in the random matrix Levi-Smirnov Ensemble (LSE), using the spectral dualism between the LSE and the chiral Gaussian Unitary Ensemble (GUE). Then we reconstruct the universal spectral oscillations and investigate the asymptotic behavior of the spectral distribution. (author)

  18. An improved method for predicting the evolution of the characteristic parameters of an information system

    Science.gov (United States)

    Dushkin, A. V.; Kasatkina, T. I.; Novoseltsev, V. I.; Ivanov, S. V.

    2018-03-01

    The article proposes a forecasting method that, given specified values of entropy and of the type I and type II error levels, determines the allowable horizon for forecasting the development of the characteristic parameters of a complex information system. The main feature of the method is that changes in the characteristic parameters of the information system's development are expressed as increments in its entropy ratios. When a predetermined value of the prediction error ratio, that is, of the system's entropy, is reached, the characteristic parameters of the system and the depth of the prediction in time are estimated. The resulting values of the characteristics will then be optimal, since at that moment the system possesses the best ratio of entropy as a measure of the degree of organization and orderliness of the system's structure. To construct a method for estimating the depth of prediction, it is expedient to use the principle of maximum entropy.

  19. Predicting Career Advancement with Structural Equation Modelling

    Science.gov (United States)

    Heimler, Ronald; Rosenberg, Stuart; Morote, Elsa-Sofia

    2012-01-01

    Purpose: The purpose of this paper is to use the authors' prior findings concerning basic employability skills in order to determine which skills best predict career advancement potential. Design/methodology/approach: Utilizing survey responses of human resource managers, the employability skills showing the largest relationships to career…

  20. Software engineering principles applied to large healthcare information systems--a case report.

    Science.gov (United States)

    Nardon, Fabiane Bizinella; de A Moura, Lincoln

    2007-01-01

    São Paulo is the largest city in Brazil and one of the largest cities in the world. In 2004, the São Paulo City Department of Health decided to implement a Healthcare Information System to support managing healthcare services and provide an ambulatory health record. The resulting information system is one of the largest public healthcare information systems ever built, with more than 2 million lines of code. Although statistics show that most software projects fail, and the risks for the São Paulo initiative were enormous, the information system was completed on time and on budget. In this paper, we discuss the software engineering principles adopted that allowed the project's goals to be accomplished, hoping that sharing the experience of this project will help other healthcare information system initiatives to succeed.

  1. The impact of real-time and predictive traffic information on travelers' behavior on the I-4 corridor. Final report.

    Science.gov (United States)

    2003-07-01

    Real time and predicted traffic information plays a key role in the successful implementation of advanced traveler information systems (ATIS) and advance traffic management systems (ATMS). Traffic information is essentially valuable to both transport...

  2. Improving local clustering based top-L link prediction methods via asymmetric link clustering information

    Science.gov (United States)

    Wu, Zhihao; Lin, Youfang; Zhao, Yiji; Yan, Hongyan

    2018-02-01

    Networks can represent a wide range of complex systems, such as social, biological and technological systems. Link prediction is one of the most important problems in network analysis, and has attracted much research interest recently. Many link prediction methods have been proposed to solve this problem with various techniques. We can note that clustering information plays an important role in solving the link prediction problem. In previous literatures, we find node clustering coefficient appears frequently in many link prediction methods. However, node clustering coefficient is limited to describe the role of a common-neighbor in different local networks, because it cannot distinguish different clustering abilities of a node to different node pairs. In this paper, we shift our focus from nodes to links, and propose the concept of asymmetric link clustering (ALC) coefficient. Further, we improve three node clustering based link prediction methods via the concept of ALC. The experimental results demonstrate that ALC-based methods outperform node clustering based methods, especially achieving remarkable improvements on food web, hamster friendship and Internet networks. Besides, comparing with other methods, the performance of ALC-based methods are very stable in both globalized and personalized top-L link prediction tasks.
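The node-clustering-coefficient baseline that the ALC-based methods improve on can be sketched as follows. The toy graph and the particular scoring function (summing each common neighbour's clustering coefficient) are illustrative assumptions, not the paper's code:

```python
from itertools import combinations

def clustering_coeff(adj, z):
    """Local clustering coefficient of node z: the fraction of pairs of
    z's neighbours that are themselves linked."""
    nbrs = adj[z]
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for u, v in combinations(nbrs, 2) if v in adj[u])
    return 2.0 * links / (k * (k - 1))

def cn_clustering_score(adj, x, y):
    """Link-prediction score for (x, y): common neighbours weighted by
    their clustering coefficients. Rank non-edges by this to get top-L."""
    return sum(clustering_coeff(adj, z) for z in adj[x] & adj[y])

# Toy undirected graph as an adjacency dict of sets.
adj = {
    "a": {"c", "d"},
    "b": {"c", "d"},
    "c": {"a", "b", "d"},
    "d": {"a", "b", "c"},
}
print(cn_clustering_score(adj, "a", "b"))
```

The paper's point is that a node-level coefficient gives every candidate pair the same weight for a given neighbour z; the ALC coefficient instead weights z differently per link.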

  3. Identifying and predicting subgroups of information needs among cancer patients: an initial study using latent class analysis.

    Science.gov (United States)

    Neumann, Melanie; Wirtz, Markus; Ernstmann, Nicole; Ommen, Oliver; Längler, Alfred; Edelhäuser, Friedrich; Scheffer, Christian; Tauschel, Diethard; Pfaff, Holger

    2011-08-01

    Understanding how the information needs of cancer patients (CaPts) vary is important because met information needs affect health outcomes and CaPts' satisfaction. The goals of the study were to identify subgroups of CaPts based on self-reported cancer- and treatment-related information needs and to determine whether subgroups could be predicted on the basis of selected sociodemographic, clinical and clinician-patient relationship variables. Three hundred twenty-three CaPts participated in a survey using the "Cancer Patients Information Needs" scale, which is a new tool for measuring cancer-related information needs. The number of information need subgroups and need profiles within each subgroup was identified using latent class analysis (LCA). Multinomial logistic regression was applied to predict class membership. LCA identified a model of five subgroups exhibiting differences in type and extent of CaPts' unmet information needs: a subgroup with "no unmet needs" (31.4% of the sample), two subgroups with "high level of psychosocial unmet information needs" (27.0% and 12.0%), a subgroup with "high level of purely medical unmet information needs" (16.0%) and a subgroup with "high level of medical and psychosocial unmet information needs" (13.6%). An assessment of sociodemographic and clinical characteristics revealed that younger CaPts and CaPts requiring psychological support seem to belong to subgroups with a higher level of unmet information needs. However, the most significant predictor of subgroup membership is the quality of the clinician-patient relationship, i.e., the subjective perception of a high level of trust in and caring attention from nurses, together with a high degree of physician empathy, seems to be predictive of inclusion in the subgroup with no unmet information needs. The results of our study can be used by oncology nurses and physicians to increase their awareness of the complexity and heterogeneity of information needs among CaPts and of

  4. Information Mining from Heterogeneous Data Sources: A Case Study on Drought Predictions

    Directory of Open Access Journals (Sweden)

    Getachew B. Demisse

    2017-07-01

    The objective of this study was to develop an information mining methodology for drought modeling and prediction using historical records of climate, satellite, environmental, and oceanic data. The classification and regression tree (CART) approach was used for extracting drought episodes at different time-lag prediction intervals. Using the CART approach, a number of successful model trees were constructed, which can easily be interpreted and used by decision makers in their drought management decisions. The regression rules produced by CART were found to have correlation coefficients from 0.71 to 0.95 in rules-alone modeling. The accuracies of the models were found to be higher in the instance-and-rules model (0.77–0.96) compared to the rules-alone model. From the experimental analysis, it was concluded that different combinations of the nearest neighbor and committee models significantly increase the performance of CART drought models. For more robust results from the developed methodology, it is recommended that future research focus on selecting relevant attributes for slow-onset drought episode identification and prediction.
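The core step of CART regression-tree induction, choosing the split that minimises squared error within the resulting leaves, can be sketched in a few lines. The predictor and response values below are invented toy numbers, not the study's drought attributes:

```python
def best_split(xs, ys):
    """One CART step: find the threshold on a single predictor x that
    minimises the summed squared error of the two leaf means."""
    def sse(vals):
        if not vals:
            return 0.0
        mu = sum(vals) / len(vals)
        return sum((v - mu) ** 2 for v in vals)
    pairs = sorted(zip(xs, ys))
    best_cost, best_thr = float("inf"), None
    for i in range(1, len(pairs)):
        thr = (pairs[i - 1][0] + pairs[i][0]) / 2  # midpoint candidate
        left = [y for x, y in pairs if x <= thr]
        right = [y for x, y in pairs if x > thr]
        cost = sse(left) + sse(right)
        if cost < best_cost:
            best_cost, best_thr = cost, thr
    return best_thr

# Toy drought index vs. one climate predictor: a clear break at x ≈ 6.5.
print(best_split([1, 2, 3, 10, 11, 12], [5, 6, 5, 20, 21, 22]))
```

A full CART builder applies this step recursively over all attributes, which is what yields the interpretable rule trees the abstract mentions.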

  5. Discovery of the Largest Orbweaving Spider Species: The Evolution of Gigantism in Nephila

    OpenAIRE

    Kuntner, Matjaž; Coddington, Jonathan A.

    2009-01-01

    Background More than 41,000 spider species are known with about 400–500 added each year, but for some well-known groups, such as the giant golden orbweavers, Nephila, the last valid described species dates from the 19th century. Nephila are renowned for being the largest web-spinning spiders, making the largest orb webs, and are model organisms for the study of extreme sexual size dimorphism (SSD) and sexual biology. Here, we report on the discovery of a new, giant Nephila species from Africa...

  6. A heat transport benchmark problem for predicting the impact of measurements on experimental facility design

    International Nuclear Information System (INIS)

    Cacuci, Dan Gabriel

    2016-01-01

    the test-section benchmark, it is shown that the maximum coolant and rod surface temperatures responses are the most important measurable quantities. Using all of the available information, the PM_CMPS formulas yield optimally predicted best-estimate nominal model parameter values and reduced predicted standard deviations for the predicted parameters, such that: (i) the model parameters displaying the largest relative sensitivities and largest relative standard deviations are affected the most; and (ii) the model parameters with zero sensitivities remain unaffected by applying the PM_CMPS methodology. The PM_CMPS formulas also yield optimally predicted best-estimate nominal values for the model responses, along with correspondingly reduced predicted standard deviations. Assimilating within the PM_CMPS methodology an accurate measurement of the benchmark's maximum coolant temperature has the following impacts on the quantity that has been measured (i.e., on the maximum coolant temperature): (i) the nominal value of the predicted maximum coolant temperature is closer to the more “accurate” experimental value; (ii) the originally computed standard deviation is decreased significantly. In addition, the measurement of the maximum coolant temperature impacts not only the quantity being measured, but also impacts the other responses of interest, e.g., the maximum temperature in the rod and the maximum rod surface temperature. In all cases, the predicted standard deviations were decreased substantially, indicating that the predicted values are more accurate than the originally computed ones. The PM_CMPS formulas were also used to predict the impact of the simultaneous assimilation of the maximum coolant and the maximum rod's surface temperature measurements. As expected in view of the fact that the PM_CMPS methodology uses Bayes’ theorem to combine information, the assimilation of two accurate experiments reduces the predicted standard deviations of the

  7. Multiple genetic interaction experiments provide complementary information useful for gene function prediction.

    Directory of Open Access Journals (Sweden)

    Magali Michaut

    Genetic interactions help map biological processes and their functional relationships. A genetic interaction is defined as a deviation from the expected phenotype when combining multiple genetic mutations. In Saccharomyces cerevisiae, most genetic interactions are measured under a single phenotype: growth rate in standard laboratory conditions. Recently, genetic interactions have been collected under different phenotypic readouts and experimental conditions. How different are these networks, and what can we learn from their differences? We conducted a systematic analysis of quantitative genetic interaction networks in yeast performed under different experimental conditions. We find that networks obtained using different phenotypic readouts, in different conditions and from different laboratories overlap less than expected and provide significant unique information. To exploit this information, we develop a novel method to combine individual genetic interaction data sets and show that the resulting network improves gene function prediction performance, demonstrating that individual networks provide complementary information. Our results support the notion that using diverse phenotypic readouts and experimental conditions will substantially increase the amount of gene function information produced by genetic interaction screens.

  8. Informant-reported cognitive symptoms that predict amnestic mild cognitive impairment

    Directory of Open Access Journals (Sweden)

    Malek-Ahmadi Michael

    2012-02-01

    Background: Differentiating amnestic mild cognitive impairment (aMCI) from normal cognition is difficult in clinical settings. Self-reported and informant-reported memory complaints occur often in both clinical groups, which then necessitates the use of a comprehensive neuropsychological examination to make a differential diagnosis. However, the ability to identify cognitive symptoms that are predictive of aMCI through informant-based information may provide some clinical utility in accurately identifying individuals who are at risk for developing Alzheimer's disease (AD). Methods: The current study utilized a case-control design using data from an ongoing validation study of the Alzheimer's Questionnaire (AQ), an informant-based dementia assessment. Data from 51 cognitively normal (CN) individuals participating in a brain donation program and 47 aMCI individuals seen in a neurology practice at the same institute were analyzed to determine which AQ items differentiated aMCI from CN individuals. Results: Forward stepwise multiple logistic regression analysis which controlled for age and education showed that 4 AQ items were strong indicators of aMCI: repetition of statements and/or questions [OR 13.20 (3.02, 57.66)]; trouble knowing the day, date, month, year, and time [OR 17.97 (2.63, 122.77)]; difficulty managing finances [OR 11.60 (2.10, 63.99)]; and decreased sense of direction [OR 5.84 (1.09, 31.30)]. Conclusions: Overall, these data indicate that certain informant-reported cognitive symptoms may help clinicians differentiate individuals with aMCI from those with normal cognition. Items pertaining to repetition of statements, orientation, ability to manage finances, and visuospatial disorientation had high discriminatory power.

  9. NetTurnP – Neural Network Prediction of Beta-turns by Use of Evolutionary Information and Predicted Protein Sequence Features

    Science.gov (United States)

    Petersen, Bent; Lundegaard, Claus; Petersen, Thomas Nordahl

    2010-01-01

    β-turns are the most common type of non-repetitive structures, and constitute on average 25% of the amino acids in proteins. The formation of β-turns plays an important role in protein folding, protein stability and molecular recognition processes. In this work we present the neural network method NetTurnP, for prediction of two-class β-turns and prediction of the individual β-turn types, by use of evolutionary information and predicted protein sequence features. It has been evaluated against a commonly used dataset BT426, and achieves a Matthews correlation coefficient of 0.50, which is the highest reported performance on a two-class prediction of β-turn and not-β-turn. Furthermore NetTurnP shows improved performance on some of the specific β-turn types. In the present work, neural network methods have been trained to predict β-turn or not and individual β-turn types from the primary amino acid sequence. The individual β-turn types I, I', II, II', VIII, VIa1, VIa2, VIba and IV have been predicted based on classifications by PROMOTIF, and the two-class prediction of β-turn or not is a superset comprised of all β-turn types. The performance is evaluated using a golden set of non-homologous sequences known as BT426. Our two-class prediction method achieves a performance of: MCC  = 0.50, Qtotal = 82.1%, sensitivity  = 75.6%, PPV  = 68.8% and AUC  = 0.864. We have compared our performance to eleven other prediction methods that obtain Matthews correlation coefficients in the range of 0.17 – 0.47. For the type specific β-turn predictions, only type I and II can be predicted with reasonable Matthews correlation coefficients, where we obtain performance values of 0.36 and 0.31, respectively. Conclusion The NetTurnP method has been implemented as a webserver, which is freely available at http://www.cbs.dtu.dk/services/NetTurnP/. NetTurnP is the only available webserver that allows submission of multiple sequences. PMID:21152409
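Several records in this listing report Matthews correlation coefficients (this one achieves MCC = 0.50). As a reminder of the metric, it can be computed from a 2×2 confusion table; the counts below are invented for illustration and are not NetTurnP's results:

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient from confusion-table counts."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

# Hypothetical two-class beta-turn predictions on 200 residues.
print(round(mcc(60, 80, 20, 40), 3))
```

Unlike raw accuracy, MCC stays informative on imbalanced classes such as turn vs. not-turn residues, which is why it is the headline score here.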

  10. Poor sleep quality predicts deficient emotion information processing over time in early adolescence.

    Science.gov (United States)

    Soffer-Dudek, Nirit; Sadeh, Avi; Dahl, Ronald E; Rosenblat-Stein, Shiran

    2011-11-01

    There is deepening understanding of the effects of sleep on emotional information processing. Emotion information processing is a key aspect of social competence, which undergoes important maturational and developmental changes in adolescence; however, most research in this area has focused on adults. Our aim was to test the links between sleep and emotion information processing during early adolescence. Sleep and facial information processing were assessed objectively during 3 assessment waves, separated by 1-year lags. Data were obtained in natural environments: sleep was assessed in home settings, and facial information processing was assessed at school. 94 healthy children (53 girls, 41 boys), aged 10 years at Time 1. Facial information processing was tested under neutral (gender identification) and emotional (emotional expression identification) conditions. Sleep was assessed in home settings using actigraphy for 7 nights at each assessment wave. Waking > 5 min was considered a night awakening. Using multilevel modeling, elevated night awakenings and decreased sleep efficiency significantly predicted poor performance only in the emotional information processing condition (e.g., b = -1.79, SD = 0.52, confidence interval: lower boundary = -2.82, upper boundary = -0.076, t(416.94) = -3.42, P = 0.001). Poor sleep quality is associated with compromised emotional information processing during early adolescence, a sensitive period in socio-emotional development.

  11. Information processing biases concurrently and prospectively predict depressive symptoms in adolescents: Evidence from a self-referent encoding task.

    Science.gov (United States)

    Connolly, Samantha L; Abramson, Lyn Y; Alloy, Lauren B

    2016-01-01

    Negative information processing biases have been hypothesised to serve as precursors for the development of depression. The current study examined negative self-referent information processing and depressive symptoms in a community sample of adolescents (N = 291, Mage at baseline = 12.34 ± 0.61, 53% female, 47.4% African-American, 49.5% Caucasian and 3.1% Biracial). Participants completed a computerised self-referent encoding task (SRET) and a measure of depressive symptoms at baseline and completed an additional measure of depressive symptoms nine months later. Several negative information processing biases on the SRET were associated with concurrent depressive symptoms and predicted increases in depressive symptoms at follow-up. Findings partially support the hypothesis that negative information processing biases are associated with depressive symptoms in a nonclinical sample of adolescents, and provide preliminary evidence that these biases prospectively predict increases in depressive symptoms.

  12. Predictive information speeds up visual awareness in an individuation task by modulating threshold setting, not processing efficiency.

    Science.gov (United States)

    De Loof, Esther; Van Opstal, Filip; Verguts, Tom

    2016-04-01

    Theories on visual awareness claim that predicted stimuli reach awareness faster than unpredicted ones. In the current study, we disentangle whether prior information about the upcoming stimulus affects visual awareness of stimulus location (i.e., individuation) by modulating processing efficiency or threshold setting. Analogous research on stimulus identification revealed that prior information modulates threshold setting. However, as identification and individuation are two functionally and neurally distinct processes, the mechanisms underlying identification cannot simply be extrapolated directly to individuation. The goal of this study was therefore to investigate how individuation is influenced by prior information about the upcoming stimulus. To do so, a drift diffusion model was fitted to estimate the processing efficiency and threshold setting for predicted versus unpredicted stimuli in a cued individuation paradigm. Participants were asked to locate a picture, following a cue that was congruent, incongruent or neutral with respect to the picture's identity. Pictures were individuated faster in the congruent and neutral condition compared to the incongruent condition. In the diffusion model analysis, the processing efficiency was not significantly different across conditions. However, the threshold setting was significantly higher following an incongruent cue compared to both congruent and neutral cues. Our results indicate that predictive information about the upcoming stimulus influences visual awareness by shifting the threshold for individuation rather than by enhancing processing efficiency. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. Learning Predictive Interactions Using Information Gain and Bayesian Network Scoring.

    Directory of Open Access Journals (Sweden)

    Xia Jiang

    The problems of correlation and classification are long-standing in the fields of statistics and machine learning, and techniques have been developed to address them. We are now in the era of high-dimensional data, which is data that can concern billions of variables. These data present new challenges. In particular, it is difficult to discover predictive variables when each variable has little marginal effect. An example concerns Genome-wide Association Study (GWAS) datasets, which involve millions of single nucleotide polymorphisms (SNPs), where some of the SNPs interact epistatically to affect disease status. Towards determining these interacting SNPs, researchers developed techniques that addressed this specific problem. However, the problem is more general, and so these techniques are applicable to other problems concerning interactions. A difficulty with many of these techniques is that they do not distinguish whether a learned interaction is actually an interaction or whether it involves several variables with strong marginal effects. We address this problem using information gain and Bayesian network scoring. First, we identify candidate interactions by determining whether variables together provide more information than they do separately. Then we use Bayesian network scoring to see if a candidate interaction really is a likely model. Our strategy is called MBS-IGain. Using 100 simulated datasets and a real GWAS Alzheimer's dataset, we investigated the performance of MBS-IGain. When analyzing the simulated datasets, MBS-IGain substantially out-performed nine previous methods at locating interacting predictors, and at identifying interactions exactly. When analyzing the real Alzheimer's dataset, we obtained new results and results that substantiated previous findings. We conclude that MBS-IGain is highly effective at finding interactions in high-dimensional datasets. This result is significant because we have increasingly
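The first stage of a strategy like MBS-IGain, flagging variable pairs that together carry more information about the label than they do separately, can be sketched with plain mutual information. The XOR-style SNP data below is synthetic and the function names are ours, not the paper's:

```python
import math
from collections import Counter

def mutual_info(xs, ys):
    """I(X;Y) in bits, estimated from paired samples."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    return sum(c / n * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def interaction_gain(s1, s2, label):
    """Extra information the SNP *pair* carries about the label beyond
    each SNP alone: the candidate-interaction filter step."""
    joint = list(zip(s1, s2))
    return (mutual_info(joint, label)
            - mutual_info(s1, label) - mutual_info(s2, label))

# XOR-style epistasis: neither SNP alone is informative, the pair is.
snp1 = [0, 0, 1, 1] * 25
snp2 = [0, 1, 0, 1] * 25
label = [a ^ b for a, b in zip(snp1, snp2)]
print(interaction_gain(snp1, snp2, label))
```

Pairs with high interaction gain would then be handed to the second stage, where a Bayesian network score judges whether the interaction model is actually likely.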

  14. Mass balance of Greenland's three largest outlet glaciers - 2000–2010

    NARCIS (Netherlands)

    Howat, I.M.; Ahn, Y.; Joughin, I.; van den Broeke, M.R.; Lenaerts, J.T.M.; Smith, B.

    2011-01-01

    Acceleration of Greenland's three largest outlet glaciers, Helheim, Kangerdlugssuaq and Jakobshavn Isbræ, accounted for a substantial portion of the ice sheet's mass loss over the past decade. Rapid changes in their discharge, however, make their cumulative mass-change uncertain. We derive monthly

  15. An Information Retrieval Approach for Robust Prediction of Road Surface States.

    Science.gov (United States)

    Park, Jae-Hyung; Kim, Kwanho

    2017-01-28

    Due to the increasing importance of reducing severe vehicle accidents on roads (especially on highways), the automatic identification of road surface conditions, and the provisioning of such information to drivers in advance, have recently been gaining significant momentum as a proactive solution to decrease the number of vehicle accidents. In this paper, we propose an information retrieval approach that aims to identify road surface states by combining conventional machine-learning techniques and moving average methods. Specifically, when signal information is received from a radar system, our approach attempts to estimate the current state of the road surface based on similar instances observed previously, as determined by a given similarity function. Next, the estimated state is calibrated by using the recently estimated states to yield both effective and robust prediction results. To validate the performance of the proposed approach, we established a real-world experimental setting on a section of actual highway in South Korea and conducted a comparison with conventional approaches in terms of accuracy. The experimental results show that the proposed approach successfully outperforms the previously developed methods.
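The retrieve-then-calibrate idea can be sketched roughly as follows, assuming radar signals are reduced to small feature vectors; the names, the nearest-neighbour retrieval, and the majority-vote calibration rule are all illustrative assumptions rather than the published method:

```python
import math
from collections import Counter, deque

def nearest_state(signal, history, k=3):
    """Majority road-state label among the k stored (features, state)
    records whose feature vectors are closest to `signal`."""
    nearest = sorted(history, key=lambda rec: math.dist(rec[0], signal))[:k]
    return Counter(state for _, state in nearest).most_common(1)[0][0]

def calibrated_state(signal, history, recent, window=5, k=3):
    """Calibrate the raw estimate with a majority vote over the last
    `window` estimates, damping one-off misclassifications."""
    recent.append(nearest_state(signal, history, k))
    while len(recent) > window:
        recent.popleft()
    return Counter(recent).most_common(1)[0][0]
```

The calibration step is what makes the output robust: a single noisy "dry" reading in a run of "icy" estimates is outvoted by the recent history.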

  16. Prediction of Periodontitis Occurrence: Influence of Classification and Sociodemographic and General Health Information

    DEFF Research Database (Denmark)

    Manzolli Leite, Fabio Renato; Peres, Karen Glazer; Do, Loc Giang

    2017-01-01

    BACKGROUND: Prediction of periodontitis development is challenging. Use of oral health-related data alone, especially in a young population, might underestimate disease risk. This study investigates the accuracy of oral, systemic, and socioeconomic data in estimating periodontitis development...... in a population-based prospective cohort. METHODS: General health history and sociodemographic information were collected throughout the life-course of individuals. Oral examinations were performed at ages 24 and 31 years in the Pelotas 1982 birth cohort. Periodontitis at age 31 years according to six...... classifications was used as the gold standard to compute the area under the receiver operating characteristic curve (AUC). Multivariable binomial regression models were used to evaluate the effects of oral health, general health, and socioeconomic characteristics on the accuracy of periodontitis development prediction...

  17. A New Prediction Model for Transformer Winding Hotspot Temperature Fluctuation Based on Fuzzy Information Granulation and an Optimized Wavelet Neural Network

    Directory of Open Access Journals (Sweden)

    Li Zhang

    2017-12-01

    Full Text Available Winding hotspot temperature is the key factor affecting the load capacity and service life of transformers. For the early detection of transformer winding hotspot temperature anomalies, a new prediction model for the hotspot temperature fluctuation range based on fuzzy information granulation (FIG) and the chaotic particle swarm optimized wavelet neural network (CPSO-WNN) is proposed in this paper. The raw data are first processed by FIG to extract useful information from each time window. The extracted information is then used to construct a wavelet neural network (WNN) prediction model. Furthermore, the structural parameters of the WNN are optimized by chaotic particle swarm optimization (CPSO) before it is used to predict the fluctuation range of the hotspot temperature. By analyzing the experimental data with four different prediction models, we find that the proposed method is more effective and is of guiding significance for the operation and maintenance of transformers.
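The granulation step can be illustrated with a minimal sketch, where min/mean/max stand in for the fitted membership parameters (often called Low, R and Up in FIG); this simplification is an assumption, not the paper's exact procedure:

```python
def granulate(series, window):
    """Summarize each non-overlapping window of a time series by three
    fuzzy granules: Low (floor), R (typical value) and Up (ceiling).
    min / mean / max stand in for fitted membership parameters."""
    granules = []
    for i in range(0, len(series) - window + 1, window):
        w = series[i:i + window]
        granules.append((min(w), sum(w) / len(w), max(w)))
    return granules
```

A downstream model (the WNN here) is then trained on the granule sequences rather than the raw samples, so it predicts a [Low, Up] fluctuation range instead of a point value.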

  18. Soliciting scientific information and beliefs in predictive modeling and adaptive management

    Science.gov (United States)

    Glynn, P. D.; Voinov, A. A.; Shapiro, C. D.

    2015-12-01

    Post-normal science requires public engagement and adaptive corrections in addressing issues with high complexity and uncertainty. An adaptive management framework is presented for the improved management of natural resources and environments through a public participation process. The framework solicits the gathering and transformation and/or modeling of scientific information but also explicitly solicits the expression of participant beliefs. Beliefs and information are compared, explicitly discussed for alignments or misalignments, and ultimately melded back together as a "knowledge" basis for making decisions. An effort is made to recognize the human or participant biases that may affect the information base and the potential decisions. In a separate step, an attempt is made to recognize and predict the potential "winners" and "losers" (perceived or real) of any decision or action. These "winners" and "losers" include present human communities with different spatial, demographic or socio-economic characteristics as well as more dispersed or more diffusely characterized regional or global communities. "Winners" and "losers" may also include future human communities as well as communities of other biotic species. As in any adaptive management framework, assessment of predictions, iterative follow-through and adaptation of policies or actions is essential, and commonly very difficult or impossible to achieve. Recognizing beforehand the limits of adaptive management is essential. More generally, knowledge of the behavioral and economic sciences and of ethics and sociology will be key to a successful implementation of this adaptive management framework. Knowledge of biogeophysical processes will also be essential, but by definition of the issues being addressed, will always be incomplete and highly uncertain. The human dimensions of the issues addressed and the participatory processes used carry their own complexities and uncertainties. 
Some ideas and principles are

  19. Middle Range Sea Ice Prediction System of Voyage Environmental Information System in Arctic Sea Route

    Science.gov (United States)

    Lim, H. S.

    2017-12-01

    Due to global warming, the sea ice in the Arctic Ocean is melting dramatically in summer, providing a new opportunity to exploit the Northern Sea Route (NSR), a ship route connecting Asia and Europe. Recent increases in logistics transportation through the NSR and in resource development reveal the possible threats of marine pollution and marine transportation accidents in the absence of a real-time navigation system. To develop a safe Voyage Environmental Information System (VEIS) for operating vessels, the Korea Institute of Ocean Science and Technology (KIOST), supported by the Ministry of Oceans and Fisheries, Korea, has been developing short-term and middle range prediction systems for sea ice concentration (SIC) and sea ice thickness (SIT) in the NSR since 2014. The sea ice prediction system of VEIS consists of daily AMSR2 satellite composite images, a short-term (a week) prediction system, and a middle range (a month) prediction system using a statistical method with re-analysis data (TOPAZ) and short-term predicted model data. In this study, the middle range prediction system for SIC and SIT in the NSR is calibrated with middle range predicted atmospheric and oceanic data from another system (NOAA CFSv2). The system predicts one month of SIC and SIT on a daily basis, as validated with dynamic composite SIC data extracted from AMSR2 L2 satellite images.

  20. Informal Workplace Learning among Nurses: Organisational Learning Conditions and Personal Characteristics That Predict Learning Outcomes

    Science.gov (United States)

    Kyndt, Eva; Vermeire, Eva; Cabus, Shana

    2016-01-01

    Purpose: This paper aims to examine which organisational learning conditions and individual characteristics predict the learning outcomes nurses achieve through informal learning activities. There is specific relevance for the nursing profession because of the rapidly changing healthcare systems. Design/Methodology/Approach: In total, 203 nurses…

  1. The largest glitch observed in the Crab pulsar

    Science.gov (United States)

    Shaw, B.; Lyne, A. G.; Stappers, B. W.; Weltevrede, P.; Bassa, C. G.; Lien, A. Y.; Mickaliger, M. B.; Breton, R. P.; Jordan, C. A.; Keith, M. J.; Krimm, H. A.

    2018-05-01

    We have observed a large glitch in the Crab pulsar (PSR B0531+21). The glitch occurred around MJD 58064 (2017 November 8), when the pulsar underwent an increase in the rotation rate of Δν = 1.530 × 10^-5 Hz, corresponding to a fractional increase of Δν/ν = 0.516 × 10^-6, making this event the largest glitch ever observed in this source. Due to our high-cadence and long-dwell-time observations of the Crab pulsar we are able to partially resolve a fraction of the total spin-up of the star. This delayed spin-up occurred over a timescale of ~1.7 days and is similar to the behaviour seen in the 1989 and 1996 large Crab pulsar glitches. The spin-down rate also increased at the glitch epoch by Δν̇/ν̇ = 7 × 10^-3. In addition to being the largest such event observed in the Crab, the glitch occurred after the longest period of glitch inactivity since at least 1984, and we discuss a possible relationship between glitch size and waiting time. No changes to the shape of the pulse profile were observed near the glitch epoch at 610 MHz or 1520 MHz, nor did we identify any changes in the X-ray flux from the pulsar. The long-term recovery from the glitch continues to progress as ν̇ slowly rises towards pre-glitch values. In line with other large Crab glitches, we expect there to be a persistent change to ν̇. We continue to monitor the long-term recovery with frequent, high quality observations.
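The fractional glitch size follows directly from the frequency step and the pulsar's spin frequency. The ≈29.6 Hz spin frequency used below is an assumed approximate value for the Crab at the glitch epoch, not a figure stated in the abstract:

```python
# Reported frequency step of the 2017 glitch (Hz)
delta_nu = 1.530e-5

# Approximate Crab spin frequency at the glitch epoch (Hz); this value
# is an assumption for illustration and is not stated in the abstract.
nu = 29.6

fractional = delta_nu / nu
print(f"delta_nu / nu = {fractional:.3e}")  # ~5.2e-7, i.e. ~0.52e-6
```

This is consistent with the reported fractional increase of 0.516 × 10^-6.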

  2. Predicting disease risk using bootstrap ranking and classification algorithms.

    Directory of Open Access Journals (Sweden)

    Ohad Manor

    Full Text Available Genome-wide association studies (GWAS) are widely used to search for genetic loci that underlie human disease. Another goal is to predict disease risk for different individuals given their genetic sequence. Such predictions could either be used as a "black box" in order to promote changes in life-style and screening for early diagnosis, or as a model that can be studied to better understand the mechanism of the disease. Current methods for risk prediction typically rank single nucleotide polymorphisms (SNPs) by the p-value of their association with the disease, and use the top-associated SNPs as input to a classification algorithm. However, the predictive power of such methods is relatively poor. To improve the predictive power, we devised BootRank, which uses bootstrapping in order to obtain a robust prioritization of SNPs for use in predictive models. We show that BootRank improves the ability to predict disease risk of unseen individuals in the Wellcome Trust Case Control Consortium (WTCCC) data and results in a more robust set of SNPs and a larger number of enriched pathways being associated with the different diseases. Finally, we show that combining BootRank with seven different classification algorithms improves performance compared to previous studies that used the WTCCC data. Notably, diseases for which BootRank results in the largest improvements were recently shown to have more heritability than previously thought, likely due to contributions from variants with low minor allele frequency (MAF), suggesting that BootRank can be beneficial in cases where SNPs affecting the disease are poorly tagged or have low MAF. Overall, our results show that improving disease risk prediction from genotypic information may be a tangible goal, with potential implications for personalized disease screening and treatment.
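The core bootstrap-ranking idea can be sketched as follows; the resampling scheme, the rank aggregation, and the pluggable association statistic are simplified assumptions, not the published algorithm:

```python
import random
from collections import defaultdict

def bootrank(genotypes, labels, score, n_boot=50, seed=0):
    """Rank SNPs by mean rank across bootstrap resamples.  `genotypes`
    maps snp_id -> per-individual values; `score(values, labels)` is any
    association statistic (larger = stronger).  Lowest mean rank first,
    so SNPs that rank highly in *most* resamples come out on top."""
    rng = random.Random(seed)
    n = len(labels)
    rank_sum = defaultdict(float)
    for _ in range(n_boot):
        idx = [rng.randrange(n) for _ in range(n)]      # resample with replacement
        y = [labels[i] for i in idx]
        scores = {s: score([g[i] for i in idx], y) for s, g in genotypes.items()}
        for rank, s in enumerate(sorted(scores, key=scores.get, reverse=True)):
            rank_sum[s] += rank
    return sorted(genotypes, key=lambda s: rank_sum[s])
```

The aggregated ranking is more robust than a single-pass p-value ranking because a SNP must associate consistently across resamples to stay near the top.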

  3. Longevity in Calumma parsonii, the World's largest chameleon.

    Science.gov (United States)

    Tessa, Giulia; Glaw, Frank; Andreone, Franco

    2017-03-01

    Large body size of ectothermic species can be correlated with high life expectancy. We assessed the longevity of the World's largest chameleon, the Parson's chameleon Calumma parsonii from Madagascar, by using skeletochronology of phalanges taken from preserved specimens held in European natural history museums. Due to the high bone resorption we can provide only the minimum age of each specimen. The highest minimum age detected was nine years for a male and eight years for a female, confirming that this species is considerably long-lived among chameleons. Our data also show a strong correlation between snout-vent length and estimated age. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Predictive information processing in music cognition. A critical review.

    Science.gov (United States)

    Rohrmeier, Martin A; Koelsch, Stefan

    2012-02-01

    Expectation and prediction constitute central mechanisms in the perception and cognition of music, which have been explored in theoretical and empirical accounts. We review the scope and limits of theoretical accounts of musical prediction with respect to feature-based and temporal prediction. While the concept of prediction is unproblematic for basic single-stream features such as melody, it is not straight-forward for polyphonic structures or higher-order features such as formal predictions. Behavioural results based on explicit and implicit (priming) paradigms provide evidence of priming in various domains that may reflect predictive behaviour. Computational learning models, including symbolic (fragment-based), probabilistic/graphical, or connectionist approaches, provide well-specified predictive models of specific features and feature combinations. While models match some experimental results, full-fledged music prediction cannot yet be modelled. Neuroscientific results regarding the early right-anterior negativity (ERAN) and mismatch negativity (MMN) reflect expectancy violations on different levels of processing complexity, and provide some neural evidence for different predictive mechanisms. At present, the combinations of neural and computational modelling methodologies are at early stages and require further research. Copyright © 2012 Elsevier B.V. All rights reserved.

  5. Analytic approximation to the largest eigenvalue distribution of a white Wishart matrix

    CSIR Research Space (South Africa)

    Vlok, JD

    2012-08-14

    Full Text Available offers largely simplified computation and provides statistics such as the mean value and region of support of the largest eigenvalue distribution. Numeric results from the literature are compared with the approximation and Monte Carlo simulation results...

  6. Possible User-Dependent CFD Predictions of Transitional Flow in Building Ventilation

    DEFF Research Database (Denmark)

    Peng, Lei; Nielsen, Peter Vilhelm; Wang, Xiaoxue

    2016-01-01

    A modified backward-facing step flow with a large expansion ratio of five (5) was modelled by 19 teams without benchmark solutions or experimental data for validation in an ISHVAC-COBEE July 2015 Tianjin Workshop, entitled “to predict low turbulent flow”. Different computational fluid dynamics...... (CFD) codes/software, turbulence models, boundary conditions, numerical schemes and convergence criteria were adopted based on each participating team's own CFD experience. The largest coefficient of variation is larger than 50% and the largest relative maximum difference of penetration length......, is shown to be still a very challenging task. This calls for a solid approach of validation and uncertainty assessment in CFD “experiments”. The users are recommended to follow an existing guideline of uncertainty assessment of CFD predictions to minimize the errors and uncertainties in the future....
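The coefficient of variation used to quantify the spread across teams is simply the standard deviation divided by the mean of a predicted quantity such as penetration length. A minimal sketch with hypothetical per-team predictions (not the workshop's data):

```python
from statistics import mean, pstdev

def coefficient_of_variation(values):
    """CV = population standard deviation / mean (dimensionless)."""
    return pstdev(values) / mean(values)

# Hypothetical penetration lengths (m) predicted by different teams:
predictions = [2.0, 3.5, 1.2, 4.8, 2.9]
cv = coefficient_of_variation(predictions)  # > 0.4 here: large spread
```

A CV above 50%, as reported in the workshop, means the teams' predictions scatter by more than half of their mean value.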

  7. An analysis of audit committee effectiveness at the largest listed companies in South Africa from a CFO and audit committee perspective

    Directory of Open Access Journals (Sweden)

    Ben Marx

    2009-12-01

    regarding IT-related aspects. Value of research: The study provides valuable information on audit committee practices and the effectiveness of audit committees at the largest listed companies in South Africa. These findings can therefore serve as guidelines for best practice standards for audit committees at other companies and institutions. Conclusion: Audit committees at the largest listed companies in South Africa were found to be well established and according to the views of the CFOs and audit committee chairs to be functioning effectively. Further research regarding the subject field of audit committees should focus on the status and effective functioning thereof at smaller companies, unlisted entities, higher education institutions and public sector entities.

  8. Alpha Oscillations during Incidental Encoding Predict Subsequent Memory for New "Foil" Information.

    Science.gov (United States)

    Vogelsang, David A; Gruber, Matthias; Bergström, Zara M; Ranganath, Charan; Simons, Jon S

    2018-05-01

    People can employ adaptive strategies to increase the likelihood that previously encoded information will be successfully retrieved. One such strategy is to constrain retrieval toward relevant information by reimplementing the neurocognitive processes that were engaged during encoding. Using EEG, we examined the temporal dynamics with which constraining retrieval toward semantic versus nonsemantic information affects the processing of new "foil" information encountered during a memory test. Time-frequency analysis of EEG data acquired during an initial study phase revealed that semantic compared with nonsemantic processing was associated with alpha decreases in a left frontal electrode cluster from around 600 msec after stimulus onset. Successful encoding of semantic versus nonsemantic foils during a subsequent memory test was related to decreases in alpha oscillatory activity in the same left frontal electrode cluster, which emerged relatively late in the trial at around 1000-1600 msec after stimulus onset. Across participants, left frontal alpha power elicited by semantic processing during the study phase correlated significantly with left frontal alpha power associated with semantic foil encoding during the memory test. Furthermore, larger left frontal alpha power decreases elicited by semantic foil encoding during the memory test predicted better subsequent semantic foil recognition in an additional surprise foil memory test, although this effect did not reach significance. These findings indicate that constraining retrieval toward semantic information involves reimplementing semantic encoding operations that are mediated by alpha oscillations and that such reimplementation occurs at a late stage of memory retrieval, perhaps reflecting additional monitoring processes.
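Alpha-band (roughly 8-12 Hz) power of an EEG epoch can be estimated in a simplified way from a periodogram; the study used wavelet-based time-frequency analysis, so the plain-FFT approach below is only a stand-in for illustration:

```python
import numpy as np

def alpha_power(signal, fs, band=(8.0, 12.0)):
    """Mean periodogram power of `signal` within the alpha band.
    A plain-FFT stand-in for wavelet time-frequency analysis."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()
```

A 10 Hz oscillation yields far more alpha power than a 30 Hz one; decreases in this band power over left frontal electrodes are the effect the subsequent-memory analysis tracks across conditions.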

  9. Genome of Phaeocystis globosa virus PgV-16T highlights the common ancestry of the largest known DNA viruses infecting eukaryotes

    Science.gov (United States)

    Santini, Sebastien; Jeudy, Sandra; Bartoli, Julia; Poirot, Olivier; Lescot, Magali; Abergel, Chantal; Barbe, Valérie; Wommack, K. Eric; Noordeloos, Anna A. M.; Brussaard, Corina P. D.; Claverie, Jean-Michel

    2013-01-01

    Large dsDNA viruses are involved in the population control of many globally distributed species of eukaryotic phytoplankton and have a prominent role in bloom termination. The genus Phaeocystis (Haptophyta, Prymnesiophyceae) includes several high-biomass-forming phytoplankton species, such as Phaeocystis globosa, the blooms of which occur mostly in the coastal zone of the North Atlantic and the North Sea. Here, we report the 459,984-bp-long genome sequence of P. globosa virus strain PgV-16T, encoding 434 proteins and eight tRNAs and, thus, the largest fully sequenced genome to date among viruses infecting algae. Surprisingly, PgV-16T exhibits no phylogenetic affinity with other viruses infecting microalgae (e.g., phycodnaviruses), including those infecting Emiliania huxleyi, another ubiquitous bloom-forming haptophyte. Rather, PgV-16T belongs to an emerging clade (the Megaviridae) clustering the viruses endowed with the largest known genomes, including Megavirus, Mimivirus (both infecting acanthamoeba), and a virus infecting the marine microflagellate grazer Cafeteria roenbergensis. Seventy-five percent of the best matches of PgV-16T–predicted proteins correspond to two viruses [Organic Lake phycodnavirus (OLPV)1 and OLPV2] from a hypersaline lake in Antarctica (Organic Lake), the hosts of which are unknown. As for OLPVs and other Megaviridae, the PgV-16T sequence data revealed the presence of a virophage-like genome. However, no virophage particle was detected in infected P. globosa cultures. The presence of many genes found only in Megaviridae in its genome and the presence of an associated virophage strongly suggest that PgV-16T shares a common ancestry with the largest known dsDNA viruses, the host range of which already encompasses the earliest diverging branches of domain Eukarya. PMID:23754393

  10. Digital contract approach for consistent and predictable multimedia information delivery in electronic commerce

    Science.gov (United States)

    Konana, Prabhudev; Gupta, Alok; Whinston, Andrew B.

    1997-01-01

    A pure 'technological' solution to network quality problems is incomplete, since any benefits from new technologies are offset by the demand from exponentially growing electronic commerce and data-intensive applications. Since an economic paradigm is implicit in electronic commerce, we propose a 'market-system' approach to improve quality of service. Quality of service for digital products takes on a different meaning, since users view quality of service differently and value information differently. We propose a framework for electronic commerce that is based on an economic paradigm and mass-customization, and works as a wide-area distributed management system. In our framework, surrogate servers act as intermediaries between information providers and end-users, and arrange for consistent and predictable information delivery through 'digital contracts.' These contracts are negotiated and priced based on economic principles. Surrogate servers pre-fetch, through replication, information from many different servers and consolidate it based on demand expectations. In order to recognize users' requirements and process requests accordingly, real-time databases are central to our framework. We also propose that multimedia information be separated into slowly changing and rapidly changing data streams to improve response time requirements. Surrogate servers perform the task of integrating these data streams transparently to end-users.

  11. Switzerland's largest wood-pellet factory in Balsthal

    International Nuclear Information System (INIS)

    Stohler, F.

    2004-01-01

    This article describes how a small Swiss electricity utility has broken out of its traditional role in power generation and the distribution of electricity and gone into the production of wood pellets. The pellets, which are made from waste wood (sawdust) available from wood processing companies, are produced on a large scale in one of Europe's largest pellets production facilities. The boom in the use of wood pellets for heating purposes is discussed. The article discusses this unusual approach for a Swiss power utility, which also operates a wood-fired power station and is even involved in an incineration plant for household wastes. The markets being aimed for in Switzerland and in Europe are described, including modern low-energy-consumption housing projects. A further project is described that is to use waste wood available from a large wood processing facility planned in the utility's own region

  12. Predicting Greater Prairie-Chicken Lek Site Suitability to Inform Conservation Actions.

    Directory of Open Access Journals (Sweden)

    Torre J Hovick

    Full Text Available The demands of a growing human population dictate that expansion of energy infrastructure, roads, and other development frequently takes place in native rangelands. In particular, transmission lines and roads commonly divide rural landscapes and increase fragmentation. This has direct and indirect consequences for native wildlife that can be mitigated through thoughtful planning and proactive approaches to identifying areas of high conservation priority. We used nine years (2003-2011) of Greater Prairie-Chicken (Tympanuchus cupido) lek locations, totaling 870 unique lek sites in Kansas, and seven geographic information system (GIS) layers describing land cover, topography, and anthropogenic structures to model habitat suitability across the state. The models obtained had low omission rates (0.81, indicating high model performance and reliability of predicted habitat suitability for Greater Prairie-Chickens. We found that elevation was the most influential in predicting lek locations, contributing three times more predictive power than any other variable. However, models were improved by the addition of land cover and anthropogenic features (transmission lines, roads, and oil and gas structures). Overall, our analysis provides a hierarchical understanding of Greater Prairie-Chicken habitat suitability that is broadly based on geomorphological features followed by land cover suitability. We found that when land features and vegetation cover are suitable for Greater Prairie-Chickens, fragmentation by anthropogenic sources such as roadways and transmission lines is a concern. Therefore, it is our recommendation that future human development in Kansas avoid areas that our models identified as highly suitable for Greater Prairie-Chickens and focus development on land cover types that are of lower conservation concern.

  13. HemeBIND: a novel method for heme binding residue prediction by combining structural and sequence information

    Directory of Open Access Journals (Sweden)

    Hu Jianjun

    2011-05-01

    Full Text Available Abstract Background Accurate prediction of binding residues involved in the interactions between proteins and small ligands is one of the major challenges in structural bioinformatics. Heme is an essential and commonly used ligand that plays critical roles in electron transfer, catalysis, signal transduction and gene expression. Although much effort has been devoted to the development of various generic algorithms for ligand binding site prediction over the last decade, no algorithm has been specifically designed to complement experimental techniques for identification of heme binding residues. Consequently, there is an urgent need to develop a computational method for recognizing these important residues. Results Here we introduced an efficient algorithm HemeBIND for predicting heme binding residues by integrating structural and sequence information. We systematically investigated the characteristics of binding interfaces based on a non-redundant dataset of heme-protein complexes. It was found that several sequence and structural attributes such as evolutionary conservation, solvent accessibility, depth and protrusion clearly illustrate the differences between heme binding and non-binding residues. These features can then be separately used or combined to build structure-based classifiers using a support vector machine (SVM). The results showed that the information contained in these features is largely complementary and their combination achieved the best performance. To further improve the performance, an attempt has been made to develop a post-processing procedure to reduce the number of false positives. In addition, we built a sequence-based classifier based on SVM and sequence profile as an alternative when only sequence information can be used. Finally, we employed a voting method to combine the outputs of structure-based and sequence-based classifiers, which demonstrated remarkably better performance than the individual classifier alone
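The final voting combination of the structure-based and sequence-based classifier outputs might look roughly like this; the probability threshold and the tie-breaking rule are assumptions for illustration, not the published decision rule:

```python
def combine_by_vote(struct_probs, seq_probs, threshold=0.5):
    """Per-residue vote between a structure-based and a sequence-based
    classifier: agreement wins outright, a split vote falls back to the
    mean probability.  1 = predicted heme-binding, 0 = non-binding."""
    combined = []
    for p_struct, p_seq in zip(struct_probs, seq_probs):
        votes = (p_struct >= threshold) + (p_seq >= threshold)
        if votes == 1:  # classifiers disagree: use the mean probability
            combined.append(int((p_struct + p_seq) / 2 >= threshold))
        else:
            combined.append(int(votes == 2))
    return combined
```

Combining complementary views this way tends to suppress false positives that only one of the two classifiers produces.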

  14. Predicting long-term average concentrations of traffic-related air pollutants using GIS-based information

    Science.gov (United States)

    Hochadel, Matthias; Heinrich, Joachim; Gehring, Ulrike; Morgenstern, Verena; Kuhlbusch, Thomas; Link, Elke; Wichmann, H.-Erich; Krämer, Ursula

    Global regression models were developed to estimate individual levels of long-term exposure to traffic-related air pollutants. The models are based on data of a one-year measurement programme including geographic data on traffic and population densities. This investigation is part of a cohort study on the impact of traffic-related air pollution on respiratory health, conducted at the westerly end of the Ruhr area in North-Rhine Westphalia, Germany. Concentrations of NO2, fine particle mass (PM2.5) and filter absorbance of PM2.5 as a marker for soot were measured at 40 sites spread throughout the study region. Fourteen-day samples were taken between March 2002 and March 2003 for each season and site. Annual average concentrations for the sites were determined after adjustment for temporal variation. Information on traffic counts in major roads, building densities and community population figures was collected in a geographical information system (GIS). This information was used to calculate different potential traffic-based predictors: (a) daily traffic flow and maximum traffic intensity of buffers with radii from 50 to 10 000 m and (b) distances to main roads and highways. NO2 concentration and PM2.5 absorbance were strongly correlated with the traffic-based variables. Linear regression prediction models, which involved predictors with radii of 50 to 1000 m, were developed for the Wesel region, where most of the cohort members lived. They reached a model fit (R2) of 0.81 and 0.65 for NO2 and PM2.5 absorbance, respectively. Regression models for the whole area required larger spatial scales and reached R2=0.90 and 0.82. Comparison of predicted values with NO2 measurements at independent public monitoring stations showed a satisfactory association (r=0.66). PM2.5 concentration, however, was only slightly correlated with, and thus poorly predictable by, traffic-based variables. GIS-based regression models offer a promising approach to assess individual levels of
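A land-use regression of this kind reduces to ordinary least squares of measured concentrations on GIS-derived predictors; a minimal sketch with hypothetical data, not the study's measurements:

```python
import numpy as np

def fit_lur(predictors, concentrations):
    """Ordinary least squares land-use regression:
    concentration ~ intercept + GIS-derived predictors (traffic
    flow in buffers, distances to roads, ...).
    Returns the coefficient vector and the model fit R^2."""
    X = np.column_stack([np.ones(len(concentrations)), predictors])
    beta, *_ = np.linalg.lstsq(X, concentrations, rcond=None)
    residuals = concentrations - X @ beta
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((concentrations - concentrations.mean()) ** 2)
    return beta, 1.0 - ss_res / ss_tot
```

Once fitted at the monitoring sites, the model is applied to the predictor values at each cohort member's address to estimate individual long-term exposure.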

  15. Predicting nucleic acid binding interfaces from structural models of proteins.

    Science.gov (United States)

    Dror, Iris; Shazman, Shula; Mukherjee, Srayanta; Zhang, Yang; Glaser, Fabian; Mandel-Gutfreund, Yael

    2012-02-01

    The function of DNA- and RNA-binding proteins can be inferred from the characterization and accurate prediction of their binding interfaces. However, the main pitfall of various structure-based methods for predicting nucleic acid binding function is that they are all limited to a relatively small number of proteins for which high-resolution three-dimensional structures are available. In this study, we developed a pipeline for extracting functional electrostatic patches from surfaces of protein structural models, obtained using the I-TASSER protein structure predictor. The largest positive patches are extracted from the protein surface using the patchfinder algorithm. We show that functional electrostatic patches extracted from an ensemble of structural models highly overlap the patches extracted from high-resolution structures. Furthermore, by testing our pipeline on a set of 55 known nucleic acid binding proteins for which I-TASSER produces high-quality models, we show that the method accurately identifies the nucleic acid binding interface on structural models of proteins. Employing a combined patch approach, we show that patches extracted from an ensemble of models better predict the real nucleic acid binding interfaces compared with patches extracted from independent models. Overall, these results suggest that combining information from a collection of low-resolution structural models could be a valuable approach for functional annotation. We suggest that our method will be further applicable for predicting other functional surfaces of proteins with unknown structure. Copyright © 2011 Wiley Periodicals, Inc.
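Extracting the largest positive electrostatic patch can be sketched as a connected-component search over surface vertices; the adjacency/potential dictionaries below are a deliberate simplification of what the patchfinder algorithm operates on:

```python
from collections import deque

def largest_positive_patch(adjacency, potential):
    """Largest connected set of surface vertices with positive
    electrostatic potential, found by BFS restricted to positive
    vertices.  `adjacency` maps vertex -> neighbours on the surface
    mesh; `potential` maps vertex -> electrostatic potential."""
    positive = {v for v, p in potential.items() if p > 0}
    seen, best = set(), set()
    for start in positive:
        if start in seen:
            continue
        patch, queue = set(), deque([start])
        seen.add(start)
        while queue:
            v = queue.popleft()
            patch.add(v)
            for u in adjacency[v]:
                if u in positive and u not in seen:
                    seen.add(u)
                    queue.append(u)
        if len(patch) > len(best):
            best = patch
    return best
```

The returned vertex set is the candidate nucleic acid binding interface; running it on each model in an I-TASSER ensemble and combining the patches mirrors the combined-patch approach described above.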

  16. The epidemiological trends of head injury in the largest Canadian adult trauma center from 1986 to 2007.

    Science.gov (United States)

    Cadotte, David W; Vachhrajani, Shobhan; Pirouzmand, Farhad

    2011-06-01

This study documents the epidemiology of head injury over the course of 22 years in the largest Level I adult trauma center in Canada. This information defines the current state, changing pattern, and relative distribution of demographic factors in a defined group of trauma patients. It will aid in hypothesis generation to direct etiological research, administrative resource allocation, and preventative strategies. Data on all the trauma patients treated at Sunnybrook Health Sciences Centre (SHSC) from 1986 to 2007 were collected in a consecutive, prospective fashion. The authors reviewed these data from the Sunnybrook Trauma Registry Database in a retrospective fashion. The aggregate data on head injury included demographic data, cause of injury, and Injury Severity Score (ISS). The collected data were analyzed using univariate techniques to depict the trend of variables over the years. The authors used the length of stay (LOS) and number of deaths per year (case fatality rate) as crude measures of outcome. A total of 16,678 patients were treated through the Level I trauma center at SHSC from January 1986 to December 2007. Of these, 9315 patients met the inclusion criteria (ISS > 12, head Abbreviated Injury Scale score > 0). The median age of all trauma patients was 36 years, and 69.6% were male. The median ISS of the head-injury patients was 27. The median age of this group of patients increased by 12 years over the study period. Motorized vehicle accidents accounted for the greatest number of head injuries (60.3%), although the relative percentage decreased over the study period. The median transfer time of patients sustaining a head injury was 2.58 hours, and there was an approximately 45-minute improvement over the 22-year study period. The median LOS in our center decreased from 19 to 10 days over the study period. The average case fatality rate was 17.4% over the study period. In multivariate analysis, more severe injuries were associated with increased LOS as

  17. BALU: Largest autoclave research facility in the world

    Directory of Open Access Journals (Sweden)

    Hakan Ucan

    2016-03-01

Full Text Available Among the large-scale facilities operated at the Center for Lightweight-Production-Technology of the German Aerospace Center in Stade, BALU is the world's largest research autoclave. With a loading length of 20 m and a loading diameter of 5.8 m, the main objective of the facility is the optimization of the curing process for components made of carbon fiber on an industrial scale. For this reason, a novel dynamic autoclave control has been developed that is characterized by peripheral devices that extend the performance of the facility for different applications, by sensing systems that detect the component state throughout the curing process, and by a feedback system capable of intervening in the running autoclave process.

  18. An Information Theory Account of Preference Prediction Accuracy

    NARCIS (Netherlands)

    Pollmann, Monique; Scheibehenne, Benjamin

    2015-01-01

    Knowledge about other people's preferences is essential for successful social interactions, but what exactly are the driving factors that determine how well we can predict the likes and dislikes of people around us? To investigate the accuracy of couples’ preference predictions we outline and

  19. The Prediction of Teacher Turnover Employing Time Series Analysis.

    Science.gov (United States)

    Costa, Crist H.

    The purpose of this study was to combine knowledge of teacher demographic data with time-series forecasting methods to predict teacher turnover. Moving averages and exponential smoothing were used to forecast discrete time series. The study used data collected from the 22 largest school districts in Iowa, designated as FACT schools. Predictions…
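The two forecasting techniques named above can be sketched in a few lines; the turnover counts below are hypothetical:

```python
def moving_average_forecast(series, window):
    """One-step-ahead forecast: mean of the last `window` observations."""
    return sum(series[-window:]) / window

def exponential_smoothing_forecast(series, alpha):
    """Simple exponential smoothing; the final smoothed level is the forecast."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

turnover = [110, 103, 97, 104, 99, 101, 96, 100]  # hypothetical annual counts
print(moving_average_forecast(turnover, window=3))
print(exponential_smoothing_forecast(turnover, alpha=0.3))
```

The smoothing constant `alpha` trades responsiveness to recent turnover against stability, which is the usual tuning decision in such studies.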

  20. Transforming Atmospheric and Remotely-Sensed Information to Hydrologic Predictability in South Asia

    Science.gov (United States)

    Hopson, T. M.; Riddle, E. E.; Broman, D.; Brakenridge, G. R.; Birkett, C. M.; Kettner, A.; Sampson, K. M.; Boehnert, J.; Priya, S.; Collins, D. C.; Rostkier-Edelstein, D.; Young, W.; Singh, D.; Islam, A. S.

    2017-12-01

South Asia is a flashpoint for natural disasters with profound societal impacts for the region and globally. Close to 40% of the world's population depends on the Greater Himalaya's great rivers, and $20 billion of GDP is affected by river floods each year. The frequent occurrence of floods, combined with large and rapidly growing populations with high levels of poverty, makes South Asia highly susceptible to humanitarian disasters. The challenges of mitigating such devastating disasters are exacerbated by the limited availability of real-time rain and stream gauge measuring stations and transboundary data sharing, and by constrained institutional commitments to overcome these challenges. To overcome such limitations, India and the World Bank have committed resources to the National Hydrology Project III, with the development objective to improve the extent, quality, and accessibility of water resources information and to strengthen the capacity of targeted water resources management institutions in India. The availability and application of remote sensing products and weather forecasts from ensemble prediction systems (EPS) have transformed river forecasting capability over the last decade, and are of interest to India. In this talk, we review the potential predictability of river flow contributed by some of the freely available remotely sensed and weather forecasting products within the framework of the physics of water migration through a watershed. Our specific geographical context is the Ganges, Brahmaputra, and Meghna river basin and a newly available set of stream gauge measurements located over the region. We focus on satellite rainfall estimation, river height and width estimation, and EPS weather forecasts. For the latter, we utilize the THORPEX-TIGGE dataset of global forecasts, and discuss how atmospheric predictability, as measured by an EPS, is transformed into hydrometeorological predictability. We provide an overview of the strengths and

  1. Computational methods using weighed-extreme learning machine to predict protein self-interactions with protein evolutionary information.

    Science.gov (United States)

    An, Ji-Yong; Zhang, Lei; Zhou, Yong; Zhao, Yu-Jun; Wang, Da-Fu

    2017-08-18

Self-interacting proteins (SIPs) are important for their biological activity owing to the inherent interactions among their secondary structures or domains. However, due to the limitations of experimental self-interaction detection, one major challenge in the study of SIP prediction is how to exploit computational approaches for SIP detection based on the evolutionary information contained in protein sequences. In this work, we present a novel computational approach named WELM-LAG, which combines the Weighted-Extreme Learning Machine (WELM) classifier with Local Average Group (LAG) to predict SIPs based on protein sequence. The major improvement of our method lies in presenting an effective feature extraction method used to represent candidate self-interacting proteins by exploring the evolutionary information embedded in the PSI-BLAST-constructed position-specific scoring matrix (PSSM), and then employing a reliable and robust WELM classifier to carry out classification. In addition, the Principal Component Analysis (PCA) approach is used to reduce the impact of noise. The WELM-LAG method gave very high average accuracies of 92.94 and 96.74% on yeast and human datasets, respectively. Meanwhile, we compared it with the state-of-the-art support vector machine (SVM) classifier and other existing methods on human and yeast datasets, respectively. Comparative results indicated that our approach is very promising and may provide a cost-effective alternative for predicting SIPs. In addition, we developed a freely available web server called WELM-LAG-SIPs to predict SIPs. The web server is available at http://219.219.62.123:8888/WELMLAG/.

  2. Predicting forest insect flight activity: A Bayesian network approach.

    Directory of Open Access Journals (Sweden)

    Stephen M Pawson

Full Text Available Daily flight activity patterns of forest insects are influenced by temporal and meteorological conditions. Temperature and time of day are frequently cited as key drivers of activity; however, complex interactions between multiple contributing factors have also been proposed. Here, we report individual Bayesian network models to assess the probability of flight activity of three exotic insects, Hylurgus ligniperda, Hylastes ater, and Arhopalus ferus in a managed plantation forest context. Models were built from 7,144 individual hours of insect sampling, temperature, wind speed, relative humidity, photon flux density, and temporal data. Discretized meteorological and temporal variables were used to build tree-augmented naïve Bayes networks. Calibration results suggested that the H. ater and A. ferus Bayesian network models had the best fit for low Type I and overall errors, and H. ligniperda had the best fit for low Type II errors. Maximum hourly temperature and time since sunrise had the largest influence on H. ligniperda flight activity predictions, whereas time of day and year had the greatest influence on H. ater and A. ferus activity. Type II model errors for the prediction of no flight activity are reduced by increasing the model's predictive threshold. Improvements in model performance can be made by further sampling, increasing the sensitivity of the flight intercept traps, and replicating sampling in other regions. Predicting insect flight informs an assessment of the potential phytosanitary risks of wood exports. Quantifying this risk allows mitigation treatments to be targeted to prevent the spread of invasive species via international trade pathways.
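As an illustration of the general idea, a discretized classifier over meteorological and temporal bands might look like the following. This is the plain naïve Bayes baseline, not the tree-augmented networks the study actually built, and the hourly observations are invented:

```python
from collections import Counter, defaultdict

def train_naive_bayes(rows, labels):
    """Count discretized feature values per class label."""
    label_counts = Counter(labels)
    feature_counts = defaultdict(lambda: defaultdict(Counter))
    for row, y in zip(rows, labels):
        for f, v in row.items():
            feature_counts[y][f][v] += 1
    return label_counts, feature_counts

def predict(row, label_counts, feature_counts, alpha=1.0):
    """Most probable label under naive Bayes with Laplace smoothing."""
    total = sum(label_counts.values())
    # number of observed values per feature, for the smoothing denominator
    vocab = {f: {v for y in feature_counts for v in feature_counts[y][f]}
             for f in row}
    best, best_p = None, -1.0
    for y, n in label_counts.items():
        p = n / total
        for f, v in row.items():
            p *= (feature_counts[y][f][v] + alpha) / (n + alpha * len(vocab[f]))
        if p > best_p:
            best, best_p = y, p
    return best

# Hypothetical hourly observations: discretized temperature and time of day.
rows = [{"temp": "warm", "time": "dusk"}, {"temp": "warm", "time": "dusk"},
        {"temp": "cold", "time": "day"}, {"temp": "cold", "time": "dusk"},
        {"temp": "warm", "time": "day"}, {"temp": "cold", "time": "day"}]
labels = ["flight", "flight", "none", "none", "flight", "none"]
lc, fc = train_naive_bayes(rows, labels)
print(predict({"temp": "warm", "time": "dusk"}, lc, fc))
```

Raising the probability threshold at which "flight" is declared is the mechanism the abstract describes for trading Type II errors against sensitivity.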

  3. Algorithm for predicting the evolution of series of dynamics of complex systems in solving information problems

    Science.gov (United States)

    Kasatkina, T. I.; Dushkin, A. V.; Pavlov, V. A.; Shatovkin, R. R.

    2018-03-01

In the development of information systems and software to predict series dynamics, neural network methods have recently been applied. They are more flexible than existing analogues and are capable of taking the nonlinearities of a series into account. In this paper, we propose a modified algorithm for predicting series dynamics, which includes a method for training neural networks and an approach to describing and presenting input data, based on prediction by the multilayer perceptron method. To construct the neural network, the values of the series at its extremum points and the corresponding time values, formed using the sliding-window method, are used as input data. The proposed algorithm can act as an independent approach to predicting series dynamics, or serve as one part of a forecasting system. The efficiency of predicting the evolution of the series for a short-term one-step and a long-term multi-step forecast by the classical multilayer perceptron method and the modified algorithm is compared using synthetic and real data. The result of this modification is a reduction of the iterative error that accumulates when previously predicted values are fed back as inputs to the neural network, as well as an increase in the accuracy of the network's iterative prediction.
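The sliding-window input construction and the iterative multi-step forecast described above can be sketched as follows; the window-mean model is a stand-in for the trained perceptron, not the paper's network:

```python
def sliding_windows(series, window):
    """Build (inputs, target) pairs: each window of past values predicts the next."""
    return [(series[i:i + window], series[i + window])
            for i in range(len(series) - window)]

def iterative_forecast(model, history, window, steps):
    """Multi-step forecast: feed each prediction back into the input window.
    This feedback is exactly where iterative error accumulates."""
    buf = list(history)
    for _ in range(steps):
        buf.append(model(buf[-window:]))
    return buf[len(history):]

series = [3, 5, 9, 6, 2, 4, 8, 7]
pairs = sliding_windows(series, window=3)
mean_model = lambda w: sum(w) / len(w)  # stand-in for the trained perceptron
print(pairs[0])
print(iterative_forecast(mean_model, series, window=3, steps=2))
```

With a real network, `mean_model` would be replaced by the trained forward pass; the feedback loop is unchanged.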

  4. Numerical approximation abilities correlate with and predict informal but not formal mathematics abilities.

    Science.gov (United States)

    Libertus, Melissa E; Feigenson, Lisa; Halberda, Justin

    2013-12-01

Previous research has found a relationship between individual differences in children's precision when nonverbally approximating quantities and their school mathematics performance. School mathematics performance emerges from both informal (e.g., counting) and formal (e.g., knowledge of mathematics facts) abilities. It remains unknown whether approximation precision relates to both of these types of mathematics abilities. In the current study, we assessed the precision of numerical approximation in 85 3- to 7-year-old children four times over a span of 2 years. In addition, at the final time point, we tested children's informal and formal mathematics abilities using the Test of Early Mathematics Ability (TEMA-3). We found that children's numerical approximation precision correlated with and predicted their informal, but not formal, mathematics abilities when controlling for age and IQ. These results add to our growing understanding of the relationship between an unlearned nonsymbolic system of quantity representation and the system of mathematics reasoning that children come to master through instruction. Copyright © 2013 Elsevier Inc. All rights reserved.

  5. Unrequested information from routine diagnostic chest CT predicts future cardiovascular events

    International Nuclear Information System (INIS)

    Jacobs, Peter C.; Gondrie, Martijn J.; Grobbee, Diederick E.; Graaf, Yolanda van der; Mali, Willem P.; Oen, Ayke L.; Prokop, Mathias

    2011-01-01

An increase in the number of CT investigations will likely result in an increase in unrequested information. The clinical relevance of these findings is unknown. This is the first follow-up study to investigate the prognostic relevance of subclinical coronary (CAC) and aortic calcification (TAC) as contained in routine diagnostic chest CT in a clinical care population. The follow-up of 10,410 subjects (>40 years) from a multicentre, clinical care-based cohort of patients included 240 fatal and 275 non-fatal cardiovascular disease (CVD) events (mean follow-up 17.8 months). Patients with a history of CVD were excluded. Coronary (0-12) and aortic calcification (0-8) were semi-quantitatively scored. We used Cox proportional-hazard models to compute hazard ratios to predict CVD events. CAC and TAC were significantly and independently predictive of CVD events. Compared with subjects with no calcium, the adjusted risk of a CVD event was 3.7 times higher (95% CI, 2.7-5.2) among patients with severe coronary calcification (CAC score ≥6) and 2.7 times higher (95% CI, 2.0-3.7) among patients with severe aortic calcification (TAC score ≥5). Subclinical vascular calcification on CT is a strong predictor of incident CVD events in a routine clinical care population. (orig.)
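The reported effect sizes have the usual Cox proportional-hazards form, where a fitted coefficient β maps to a hazard ratio exp(β) with a confidence interval built on the log scale. A minimal sketch; β and its standard error below are illustrative values, not the study's fitted estimates:

```python
import math

def hazard_ratio(beta, se, z=1.96):
    """Turn a Cox model log-hazard coefficient and its standard error into a
    hazard ratio with an approximate 95% confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

# Hypothetical coefficient for severe coronary calcification (CAC score >= 6).
hr, lo, hi = hazard_ratio(beta=1.308, se=0.167)
print(f"adjusted HR = {hr:.1f} (95% CI {lo:.1f}-{hi:.1f})")
```

Because the interval is symmetric in β but exponentiated, the resulting CI around the hazard ratio is asymmetric, as in the figures quoted in the abstract.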

  6. Information dynamics of brain–heart physiological networks during sleep

    International Nuclear Information System (INIS)

    Faes, L; Nollo, G; Jurysta, F; Marinazzo, D

    2014-01-01

    This study proposes an integrated approach, framed in the emerging fields of network physiology and information dynamics, for the quantitative analysis of brain–heart interaction networks during sleep. With this approach, the time series of cardiac vagal autonomic activity and brain wave activities measured respectively as the normalized high frequency component of heart rate variability and the EEG power in the δ, θ, α, σ, and β bands, are considered as realizations of the stochastic processes describing the dynamics of the heart system and of different brain sub-systems. Entropy-based measures are exploited to quantify the predictive information carried by each (sub)system, and to dissect this information into a part actively stored in the system and a part transferred to it from the other connected systems. The application of this approach to polysomnographic recordings of ten healthy subjects led us to identify a structured network of sleep brain–brain and brain–heart interactions, with the node described by the β EEG power acting as a hub which conveys the largest amount of information flowing between the heart and brain nodes. This network was found to be sustained mostly by the transitions across different sleep stages, as the information transfer was weaker during specific stages than during the whole night, and vanished progressively when moving from light sleep to deep sleep and to REM sleep. (paper)
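The decomposition described above, predictive information split into actively stored and transferred parts, can be illustrated on discrete toy processes. The plug-in entropy estimators below are a simplified sketch of the general approach, not the paper's exact estimators:

```python
from collections import Counter
from math import log2

def H(*seqs):
    """Plug-in joint Shannon entropy (bits) of aligned discrete sequences."""
    joint = Counter(zip(*seqs))
    n = sum(joint.values())
    return -sum(c / n * log2(c / n) for c in joint.values())

def decompose(y, x):
    """Split the predictive information about y_t into information stored in
    y's own past and information transferred from x's past."""
    yt, yp, xp = y[1:], y[:-1], x[:-1]
    prediction = H(yt) - (H(yt, yp, xp) - H(yp, xp))  # I(y_t ; y_past, x_past)
    storage = H(yt) - (H(yt, yp) - H(yp))             # I(y_t ; y_past)
    transfer = prediction - storage                   # transfer entropy x -> y
    return prediction, storage, transfer

# Toy binary processes: y copies x with a one-step lag, so nearly all
# predictive information about y is transferred from x, not stored in y.
x = [0, 0, 1, 1] * 3
y = x[-1:] + x[:-1]
p, s, t = decompose(y, x)
print(round(p, 3), round(s, 3), round(t, 3))
```

In the study's network, a node like the β EEG power would be one with large transfer terms to and from the other nodes.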

  7. Information dynamics of brain-heart physiological networks during sleep

    Science.gov (United States)

    Faes, L.; Nollo, G.; Jurysta, F.; Marinazzo, D.

    2014-10-01

    This study proposes an integrated approach, framed in the emerging fields of network physiology and information dynamics, for the quantitative analysis of brain-heart interaction networks during sleep. With this approach, the time series of cardiac vagal autonomic activity and brain wave activities measured respectively as the normalized high frequency component of heart rate variability and the EEG power in the δ, θ, α, σ, and β bands, are considered as realizations of the stochastic processes describing the dynamics of the heart system and of different brain sub-systems. Entropy-based measures are exploited to quantify the predictive information carried by each (sub)system, and to dissect this information into a part actively stored in the system and a part transferred to it from the other connected systems. The application of this approach to polysomnographic recordings of ten healthy subjects led us to identify a structured network of sleep brain-brain and brain-heart interactions, with the node described by the β EEG power acting as a hub which conveys the largest amount of information flowing between the heart and brain nodes. This network was found to be sustained mostly by the transitions across different sleep stages, as the information transfer was weaker during specific stages than during the whole night, and vanished progressively when moving from light sleep to deep sleep and to REM sleep.

  8. Canada's largest co-gen project

    International Nuclear Information System (INIS)

    Salaff, S.

    2000-01-01

    In November 2000, the TransAlta Energy Corp. began construction on its $400 million natural gas fuelled cogeneration project in Sarnia Ontario. The Sarnia Regional Cogeneration Project (SRCP) is designed to integrate a new 440 MW cogeneration facility to be built at the Sarnia Division of Dow Chemicals Canada Inc. with nearby existing generators totaling 210 MW at Dow and Bayer Inc. At 650 MW, the new facility will rank as Canada's largest cogeneration installation. Commercial operation is scheduled for October 2002. TransAlta owns three natural gas fuelled cogeneration facilities in Ontario (in Ottawa, Mississauga and Windsor) totaling 250 MW. The cost of electric power in Ontario is currently controlled by rising natural gas prices and the supply demand imbalance. This balance will be significantly affected by the possible return to service of 2000 MW of nuclear generating capacity. The SRCP project was announced just prior to the Ontario Energy Competition Act of October 1998 which committed the province to introduce competition to the electricity sector and which created major uncertainties in the electricity market. Some of the small, 25 MW projects which survived the market uncertainty included the Toronto-based Toromont Energy Ltd. project involving gas fuelled cogeneration and methane gas generation from landfill projects in Sudbury and Waterloo. It was emphasized that cogeneration and combined heat and power projects have significant environmental advantages over large combined cycle facilities. The Ontario Energy Board is currently considering an application from TransAlta to link the SRCP facility to Ontario's Hydro One Network Inc.'s transmission grid. 1 fig

  9. Stock return predictability and market integration: The role of global and local information

    Directory of Open Access Journals (Sweden)

    David G. McMillan

    2016-12-01

Full Text Available This paper examines the predictability of a range of international stock markets where we allow the presence of both local and global predictive factors. Recent research has argued that US returns have predictive power for international stock returns. We expand this line of research, following work on market integration, to include a more general definition of the global factor, based on principal components analysis. Results identify three global expected returns factors, one related to the major stock markets of the US, UK and Asia and one related to the other markets analysed. The third component is related to dividend growth. A single dominant realised returns factor is also noted. A forecasting exercise comparing the principal components based factors to a US return factor and local market only factors, as well as the historical mean benchmark, finds supportive evidence for the former approach. It is hoped that the results from this paper will be informative on three counts. First, to academics interested in understanding the dynamics of asset price movements. Second, to market participants who aim to time the market and engage in portfolio and risk management. Third, to those (policy makers and others) who are interested in linkages across international markets and the nature and degree of integration.
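A leading principal component of a cross-market return matrix, the kind of global factor the paper extracts, can be computed by power iteration on the covariance matrix. The return data below are invented for illustration:

```python
def first_principal_component(data, iters=200):
    """Leading PCA factor of a (time x markets) return matrix, via power
    iteration on the sample covariance matrix (pure-Python sketch)."""
    n, m = len(data), len(data[0])
    means = [sum(col) / n for col in zip(*data)]
    centered = [[row[j] - means[j] for j in range(m)] for row in data]
    cov = [[sum(centered[t][i] * centered[t][j] for t in range(n)) / (n - 1)
            for j in range(m)] for i in range(m)]
    v = [1.0] * m
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(m)) for i in range(m)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    # factor scores: projection of each period's returns on the loading vector
    scores = [sum(centered[t][j] * v[j] for j in range(m)) for t in range(n)]
    return v, scores

# Hypothetical monthly returns for three co-moving markets.
data = [[0.010, 0.012, 0.008],
        [-0.020, -0.018, -0.022],
        [0.005, 0.004, 0.006],
        [0.015, 0.016, 0.014],
        [-0.010, -0.009, -0.011]]
loadings, factor = first_principal_component(data)
print([round(c, 3) for c in loadings])
```

When the markets move together, the loadings share one sign and the factor scores act as a single "global" expected-return series of the kind used in the forecasting exercise.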

  10. Prediction of Groundwater Arsenic Contamination using Geographic Information System and Artificial Neural Network

    Directory of Open Access Journals (Sweden)

    Md. Moqbul Hossain

    2013-01-01

Full Text Available Groundwater arsenic contamination is a well-known health and environmental problem in Bangladesh. Sources of this heavy metal are known to be geogenic; however, the processes of its release into groundwater are poorly understood phenomena. To mitigate the problem, it is necessary to predict probable contamination before it causes any damage to human health. Hence our research has been carried out to find the factor relations of arsenic contamination and develop an arsenic contamination prediction model. Researchers have generally agreed that the elevated concentration of arsenic is affected by several factors, such as soil reaction (pH), organic matter content, geology, iron content, etc. However, the variability of concentration within short lateral and vertical intervals, and the inter-relationships of variables among themselves, make the statistical analyses highly non-linear and difficult to converge to a meaningful relationship. Artificial Neural Networks (ANN) come in handy for such a black-box type problem. This research uses Back-propagation Neural Networks (BPNN) to train and validate the data derived from Geographic Information System (GIS) spatial distribution grids. The neural network architecture with a (6-20-1) pattern was able to predict the arsenic concentration with reasonable accuracy.

  11. Foreign exchange risk management : how are the largest non-financial companies in Norway managing their foreign exchange rate exposure?

    OpenAIRE

    Eriksen, Krister; Wedøe, Ola

    2010-01-01

The purpose of this thesis is to investigate how the largest non-financial companies in Norway manage their foreign exchange rate exposure. This is investigated through the use of a survey distributed to a sample of the largest non-financial firms in Norway. According to our results, the largest non-financial companies in Norway have a predefined strategy for managing foreign exchange risk, which is defined by the board of directors or by the management in the organisation. The companies’ mai...

  12. Predicting protein folding rate change upon point mutation using residue-level coevolutionary information.

    Science.gov (United States)

    Mallik, Saurav; Das, Smita; Kundu, Sudip

    2016-01-01

Change in the folding kinetics of globular proteins upon point mutation is crucial to a wide spectrum of biological research, such as protein misfolding, toxicity, and aggregation. Here we seek to address whether residue-level coevolutionary information of globular proteins can be informative about folding rate changes upon point mutations. Generating residue-level coevolutionary networks of globular proteins, we analyze three parameters: relative coevolution order (rCEO), network density (ND), and characteristic path length (CPL). A point mutation is considered to be equivalent to a node deletion in this network, and the respective percentage changes in rCEO, ND, and CPL are found to be linearly correlated (0.84, 0.73, and -0.61, respectively) with experimental folding rate changes. The three parameters predict the folding rate change upon a point mutation with 0.031, 0.045, and 0.059 standard errors, respectively. © 2015 Wiley Periodicals, Inc.
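Two of the graph quantities used above, network density and characteristic path length, together with the node-deletion model of a point mutation, can be sketched on a toy undirected network; the adjacency data are hypothetical, not an actual coevolutionary network:

```python
from collections import deque

def density(adj):
    """Network density of an undirected graph {node: set(neighbours)}."""
    n = len(adj)
    edges = sum(len(nb) for nb in adj.values()) / 2
    return 2 * edges / (n * (n - 1)) if n > 1 else 0.0

def characteristic_path_length(adj):
    """Mean shortest-path length over node pairs via BFS from every node
    (assumes the graph is connected)."""
    dists = []
    for src in adj:
        seen = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in seen:
                    seen[v] = seen[u] + 1
                    queue.append(v)
        dists.extend(d for node, d in seen.items() if node != src)
    return sum(dists) / len(dists)

def delete_node(adj, node):
    """Model a point mutation as deletion of the mutated residue's node."""
    return {u: nb - {node} for u, nb in adj.items() if u != node}

# Hypothetical residue-level coevolutionary network.
adj = {"A": {"B", "C", "D"}, "B": {"A", "C"}, "C": {"A", "B", "D"},
       "D": {"A", "C", "E"}, "E": {"D"}}
mutant = delete_node(adj, "A")
print(density(adj), density(mutant))
print(characteristic_path_length(adj), characteristic_path_length(mutant))
```

Percentage changes in these quantities before and after deletion are the kind of predictors the study correlates with folding rate changes.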

  13. New technical functions for WSPEEDI: Worldwide version of System for Prediction of Environmental Emergency Dose Information

    International Nuclear Information System (INIS)

    Chino, Masamichi; Nagai, Haruyasu; Furuno, Akiko; Kitabata, Hideyuki; Yamazawa, Hiromi

    2000-01-01

The Worldwide version of the System for Prediction of Environmental Emergency Dose Information (WSPEEDI) at the Japan Atomic Energy Research Institute (JAERI) is a computer-based system for providing real-time, worldwide assessment of radiological impact due to nuclear emergencies. Since JAERI started the development of the system in 1980, various components of the system, e.g., three-dimensional atmospheric models, databases, a data acquisition network, graphics, etc., have been integrated. The objective area has also been extended from the local area for domestic nuclear incidents to the hemispheric area for foreign ones. Furthermore, through validation, exercises, and responses to real events during the last decade, the following three state-of-the-art functions are under construction. (1) Construction of a prototype international data communications network: For quick exchange of atmospheric modeling products and environmental data during emergencies among worldwide emergency response systems, JAERI and Lawrence Livermore National Laboratory started a prototype information exchange protocol between WSPEEDI and the Atmospheric Release Advisory Capability (ARAC). The network consists of the Web site/browser portion and a video-teleconferencing tool. The network has been utilized for a fire accident at the bituminization plant for radioactive wastes of the former Power Reactor and Nuclear Fuel Development Corporation in March 1997 and the Algeciras incident in Spain in May 1998. (2) Development of a synoptic hydrodynamic model: At present, WSPEEDI simply parameterizes turbulence diffusion and precipitation scavenging, because information on the boundary layer, cloud, and precipitation is insufficient in available global forecasts. Thus, to provide WSPEEDI with such information, this study aims to introduce a hydrodynamic model into WSPEEDI, which can predict boundary layer processes and moist processes, e.g., cloud formation and precipitation processes.
(3) Development of

  14. New Chicago-Indiana computer network will handle dataflow from world's largest scientific experiment

    CERN Multimedia

    2006-01-01

"Massive quantities of data will soon begin flowing from the largest scientific instrument ever built into an international network of computer centers, including one operated jointly by the University of Chicago and Indiana University." (1,5 page)

  15. Thermal analyses. Information on the expected baking process; Thermische analyses. Informatie over een te verwachten bakgedrag

    Energy Technology Data Exchange (ETDEWEB)

    Van Wijck, H. [Stichting Technisch Centrum voor de Keramische Industrie TCKI, Velp (Netherlands)

    2009-09-01

The design process and the drying process for architectural ceramics and pottery partly determine the characteristics of the final product, but the largest changes occur during the baking process. An overview is provided of the different thermal analyses and how the information from these analyses can predict the baking behaviour to be expected in practice. (mk)

16. Companies on Facebook: How many of the 100 largest Swedish Companies have a Facebook page, and how do they use it?

    OpenAIRE

    Björkqvist, Johanna; Johannesson, Erik; Jorikson, Linn

    2011-01-01

Purpose: The purpose of this thesis is to see if the 100 largest Swedish companies are present on Facebook, and if they are, how they use their business pages. Further, the customers’ perception of companies’ use of Facebook will be included. To investigate this, three research questions were created. Background: As Web 2.0 and its applications have changed the use of the Internet, both for companies and customers, there has been a change in how information is delivered and how people take in ...

17. Prediction of Land Use Change Based on Markov and GM(1,1) Models

    Directory of Open Access Journals (Sweden)

    SUN Yi-yang

    2016-05-01

Full Text Available In order to explore the law of land use change in Laiwu City, Markov and GM(1,1) models were respectively employed to predict land use change in Laiwu from 2015 to 2050, after which the results were analyzed and discussed. The results showed that: (1) The trends of all kinds of land use change predicted by the two models were consistent, and the goodness of fit of the predicted values in corresponding years in the near future was high, illustrating that the near-term predictions were credible and the mid- to long-term trends could be used as reference. (2) The cultivated land would remain almost unchanged from 2015 to 2020, and then gradually decrease within a small range from 2020 to 2050. The garden land, the woodland, and the grassland would keep decreasing, with the grassland showing the largest decrease. The urban village and industrial and mining land and the transportation land would continuously increase, with the urban village and industrial and mining land increasing the most. The water and water conservancy facilities land and the other land would decrease within a very small range. It could be concluded that the results predicted by the two models in the near future were credible and could provide a scientific basis for land use planning of Laiwu, while the method could provide reference for the prediction of land use change.
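For reference, the GM(1,1) grey model named above can be implemented in a few lines (the Markov component is omitted here); the input series is synthetic, not Laiwu land-use data:

```python
from math import exp

def gm11(x0, steps):
    """GM(1,1) grey model: fit on series x0, forecast `steps` further values."""
    n = len(x0)
    x1 = [sum(x0[:i + 1]) for i in range(n)]               # accumulated series (AGO)
    z1 = [0.5 * (x1[i] + x1[i - 1]) for i in range(1, n)]  # background values
    # least squares for the grey parameters a, b in x0[k] + a * z1[k] = b
    m = n - 1
    szz = sum(z * z for z in z1)
    sz = sum(z1)
    sy = sum(x0[1:])
    szy = sum(z * y for z, y in zip(z1, x0[1:]))
    det = m * szz - sz * sz
    a = (sz * sy - m * szy) / det
    b = (szz * sy - sz * szy) / det

    def x1_hat(k):  # 0-based index into the accumulated series
        return (x0[0] - b / a) * exp(-a * k) + b / a

    # difference consecutive accumulated predictions to recover the raw scale
    return [x1_hat(k + 1) - x1_hat(k) for k in range(n - 1, n - 1 + steps)]

# Synthetic series growing ~10% per step; GM(1,1) should nearly recover it.
print(gm11([2, 2.2, 2.42, 2.662], steps=1))
```

GM(1,1) assumes a near-exponential trend in the accumulated series, which is why it suits the smooth, monotone land-use categories described in the abstract.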

  18. Evaluation of mathematical methods for predicting optimum dose of gamma radiation in sugarcane (Saccharum sp.)

    International Nuclear Information System (INIS)

    Wu, K.K.; Siddiqui, S.H.; Heinz, D.J.; Ladd, S.L.

    1978-01-01

Two mathematical methods - the reversed logarithmic method and the regression method - were used to compare the predicted and the observed optimum gamma radiation dose (OD50) in vegetative propagules of sugarcane. The reversed logarithmic method, usually used in sexually propagated crops, showed the largest difference between the predicted and observed optimum dose. The regression method resulted in a better prediction of the observed values and is suggested as a better method for the prediction of optimum dose for vegetatively propagated crops. (author)

  19. Opportunities for biodiversity gains under the world’s largest reforestation programme

    OpenAIRE

    Hua, Fangyuan; Wang, Xiaoyang; Zheng, Xinlei; Fisher, Brendan; Wang, Lin; Zhu, Jianguo; Tang, Ya; Yu, Douglas W.; Wilcove, David S.

    2016-01-01

    Reforestation is a critical means of addressing the environmental and social problems of deforestation. China’s Grain-for-Green Program (GFGP) is the world’s largest reforestation scheme. Here we provide the first nationwide assessment of the tree composition of GFGP forests and the first combined ecological and economic study aimed at understanding GFGP’s biodiversity implications. Across China, GFGP forests are overwhelmingly monocultures or compositionally simple mixed forests. Focusing on...

  20. Rehabilitation of the 6 largest hydropower plants in the Republic of Macedonia

    International Nuclear Information System (INIS)

    Chingoski, Vlatko; Savevski, Vasil

    2004-01-01

    In 1998, ESM (Electric Power Co. of Macedonia) received a loan from the International Bank for Reconstruction and Development (IBRD - The World bank) for the cost of the Power System Improvement Project, major part of which is the partial rehabilitation of the six largest HPPs in the Republic of Macedonia. Rehabilitation and life extension of these six largest hydro power plants is given the highest priority in the whole Power System Improvement Project mainly because these HPPs are, in general, fairly old, older than most of the thermal generation capacity and because a significant part of their equipment is wearing out, or is now obsolete with spare parts difficult to obtain. Furthermore, these plants play a vital role in the Macedonian Power System, providing peaking capacity, reserve capacity and frequency control. With the realization of this project, greater hydropower production is expected. It is also expected that HPPs will become a more vital part of the Macedonian Power System, which is also beneficial from an environmental aspect, due to greater usage of renewable energy resources in the country. (Original)

  1. Perceived Threat and Corroboration: Key Factors That Improve a Predictive Model of Trust in Internet-based Health Information and Advice

    Science.gov (United States)

    Harris, Peter R; Briggs, Pam

    2011-01-01

    Background How do people decide which sites to use when seeking health advice online? We can assume, from related work in e-commerce, that general design factors known to affect trust in the site are important, but in this paper we also address the impact of factors specific to the health domain. Objective The current study aimed to (1) assess the factorial structure of a general measure of Web trust, (2) model how the resultant factors predicted trust in, and readiness to act on, the advice found on health-related websites, and (3) test whether adding variables from social cognition models to capture elements of the response to threatening, online health-risk information enhanced the prediction of these outcomes. Methods Participants were asked to recall a site they had used to search for health-related information and to think of that site when answering an online questionnaire. The questionnaire consisted of a general Web trust questionnaire plus items assessing appraisals of the site, including threat appraisals, information checking, and corroboration. It was promoted on the hungersite.com website. The URL was distributed via Yahoo and local print media. We assessed the factorial structure of the measures using principal components analysis and modeled how well they predicted the outcome measures using structural equation modeling (SEM) with EQS software. Results We report an analysis of the responses of participants who searched for health advice for themselves (N = 561). Analysis of the general Web trust questionnaire revealed 4 factors: information quality, personalization, impartiality, and credible design. In the final SEM model, information quality and impartiality were direct predictors of trust. However, variables specific to eHealth (perceived threat, coping, and corroboration) added substantially to the ability of the model to predict variance in trust and readiness to act on advice on the site. The final model achieved a satisfactory fit: χ²₅ = 10

  2. Dispersal syndromes in the largest protection area of the Atlantic Forest in the state of Paraiba, Brazil

    Directory of Open Access Journals (Sweden)

    Camila Ângelo Jerônimo Domingues

    2013-09-01

    Full Text Available The diaspore dispersal process is crucial for plant reproduction, since diaspores must reach a suitable site to germinate. This paper aimed to study morphological aspects of diaspores and determine the dispersal syndromes of species occurring in the largest protection area of the Atlantic Forest in the state of Paraiba, Brazil, the Guaribas Biological Reserve. Fruits/seeds were collected monthly from September 2007 to February 2009, and all diaspores of the fruiting species were collected. After analyzing characteristics such as fruit and seed consistency, odor, color, size, and weight, the dispersal syndrome of each species was determined. In total, 3,080 diaspores belonging to 136 different species distributed into 27 families were collected. Zoochory was the most abundant dispersal syndrome (58%, with 79 fruits adapted to it), followed by autochory (29%) and anemochory (13%). Fruiting species were found throughout the study period, with a predominance of zoochoric fruits, a predictable fact in the Atlantic Forest, which provides fleshy fruits all year round.

  3. SSC RIAR is the largest centre of research reactors

    International Nuclear Information System (INIS)

    Kalygin, V.V.

    1997-01-01

    The State Scientific Centre (SSC) ''Research Institute of Atomic Reactors'' (RIAR) is situated 100 km to the south-east of Moscow, in Dimitrovgrad, the Volga Region of the Russian Federation. SSC RIAR is the largest centre of research reactors in Russia. At present there are 5 types of reactor facilities in operation, including two NPPs. One of the main tasks of the Centre is research aimed at improving the safety of power reactors. The Institute maintains broad international connections. Over the past 3 years, work has been carried out on the basis of SSC RIAR to develop a branch training centre (TC) for training the operating personnel of research and pilot reactors in Russia. (author). 3 tabs

  4. SSC RIAR is the largest centre of research reactors

    Energy Technology Data Exchange (ETDEWEB)

    Kalygin, V V [State Scientific Centre, Research Inst. of Atomic Reactors (Russian Federation)

    1997-10-01

    The State Scientific Centre (SSC) "Research Institute of Atomic Reactors" (RIAR) is situated 100 km to the south-east of Moscow, in Dimitrovgrad, the Volga Region of the Russian Federation. SSC RIAR is the largest centre of research reactors in Russia. At present there are 5 types of reactor facilities in operation, including two NPPs. One of the main tasks of the Centre is research aimed at improving the safety of power reactors. The Institute maintains broad international connections. Over the past 3 years, work has been carried out on the basis of SSC RIAR to develop a branch training centre (TC) for training the operating personnel of research and pilot reactors in Russia. (author). 3 tabs.

  5. Hospital Prices Increase in California, Especially Among Hospitals in the Largest Multi-hospital Systems

    Directory of Open Access Journals (Sweden)

    Glenn A. Melnick PhD

    2016-06-01

    Full Text Available A surge in hospital consolidation is fueling formation of ever larger multi-hospital systems throughout the United States. This article examines hospital prices in California over time with a focus on hospitals in the largest multi-hospital systems. Our data show that hospital prices in California grew substantially (+76% per hospital admission) across all hospitals and all services between 2004 and 2013, and that prices at hospitals that are members of the largest multi-hospital systems grew substantially more (113%) than prices paid to all other California hospitals (70%). Prices were similar in both groups at the start of the period (approximately $9200 per admission). By the end of the period, prices at hospitals in the largest systems exceeded prices at other California hospitals by almost $4000 per patient admission. Our study findings are potentially useful to policy makers across the country for several reasons. Our data measure actual prices for a large sample of hospitals over a long period of time in California. California experienced its wave of consolidation much earlier than the rest of the country, and as such our findings may provide some insights into what may happen across the United States from hospital consolidation, including growth of the large multi-hospital systems now forming in the rest of the country.

  6. "When does making detailed predictions make predictions worse?": Correction to Kelly and Simmons (2016).

    Science.gov (United States)

    2016-10-01

    Reports an error in "When Does Making Detailed Predictions Make Predictions Worse" by Theresa F. Kelly and Joseph P. Simmons (Journal of Experimental Psychology: General, Advance Online Publication, Aug 8, 2016, np). In the article, the symbols in Figure 2 were inadvertently altered in production. All versions of this article have been corrected. (The following abstract of the original article appeared in record 2016-37952-001.) In this article, we investigate whether making detailed predictions about an event worsens other predictions of the event. Across 19 experiments, 10,896 participants, and 407,045 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes useless or redundant information more accessible and thus more likely to be incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of events will and will not be susceptible to the negative effect of making detailed predictions. PsycINFO Database Record (c) 2016 APA, all rights reserved

  7. What predicts performance during clinical psychology training?

    OpenAIRE

    Scior, Katrina; Bradley, Caroline E; Potts, Henry W W; Woolf, Katherine; de C Williams, Amanda C

    2013-01-01

    Objectives While the question of who is likely to be selected for clinical psychology training has been studied, evidence on performance during training is scant. This study explored data from seven consecutive intakes of the UK's largest clinical psychology training course, aiming to identify what factors predict better or poorer outcomes. Design Longitudinal cross-sectional study using prospective and retrospective data. Method Characteristics at application were analysed in relation to a r...

  8. Towards prediction of soil erodibility using hyperspectral information: a case study in a semi-arid region of Iran

    DEFF Research Database (Denmark)

    Ostovari, Yaser; Ghorbani-Dashtaki, Shoja; Bahrami, Hossein-Ali

    2018-01-01

    Soil Visible–Near-Infrared (Vis-NIR) spectroscopy has become an applicable and interesting technique to evaluate a number of soil properties because it is a fast, cost-effective, and non-invasive measurement technique. The main objective of the study was to predict soil erodibility (K-factor), soil organic matter (SOM), and calcium carbonate equivalent (CaCO3) in calcareous soils of semi-arid regions located in the south of Iran using spectral reflectance information in the Vis-NIR range, and to develop a Spectrotransfer Function (STF) using spectral reflectance information and a Pedotransfer Function (PTF) to predict the K-factor, respectively. The derived STF was compared with the PTF developed from measurable soil properties by Ostovari et al. (2016) and the Universal Soil Loss Equation (USLE). The K-factor was measured in 40 erosion plots under natural rainfall and the spectral reflectance of soil samples...

  9. A marine heatwave drives massive losses from the world’s largest seagrass carbon stocks

    KAUST Repository

    Arias-Ortiz, Ariane; Serrano, Oscar; Masqué , Pere; Lavery, P. S.; Mueller, U.; Kendrick, G. A.; Rozaimi, M.; Esteban, A.; Fourqurean, J. W.; Marbà , N.; Mateo, M. A.; Murray, K.; Rule, M. J.; Duarte, Carlos M.

    2018-01-01

    Seagrass ecosystems contain globally significant organic carbon (C) stocks. However, climate change and increasing frequency of extreme events threaten their preservation. Shark Bay, Western Australia, has the largest C stock reported for a seagrass

  10. CorVue algorithm efficacy to predict heart failure in real life: Unnecessary and potentially misleading information?

    Science.gov (United States)

    Palfy, Julia Anna; Benezet-Mazuecos, Juan; Milla, Juan Martinez; Iglesias, Jose Antonio; de la Vieja, Juan Jose; Sanchez-Borque, Pepa; Miracle, Angel; Rubio, Jose Manuel

    2018-06-01

    Heart failure (HF) hospitalizations have a negative impact on quality of life and imply important costs. Intrathoracic impedance (ITI) variations detected by cardiac devices have been hypothesized to predict HF hospitalizations. Although the Optivol™ algorithm (Medtronic) has been widely studied, the long-term efficacy of the CorVue™ algorithm (St. Jude Medical) has not been systematically evaluated in a "real life" cohort. CorVue™ was activated in ICD/CRT-D patients to store information about ITI measures. Clinical events (new episodes of HF requiring treatment and hospitalizations) and CorVue™ data were recorded every three months. CorVue™ detection was considered appropriate for HF if it occurred in the four weeks prior to the clinical event. 53 ICD/CRT-D patients (26 ICD and 27 CRT-D; 67±1 years old, 79% male) were included. Device position was subcutaneous in 28 patients. At inclusion, mean LVEF was 25±7%, and 27 patients (51%) were in NYHA class I, 18 (34%) in class II and 8 (15%) in class III. After a mean follow-up of 17±9 months, 105 ITI drop alarms were detected in 32 patients (60%). Only six alarms were appropriate (true positives) and required hospitalization. Eighteen patients (34%) presented 25 clinical episodes (12 hospitalizations and 13 ER/ambulatory treatment modifications). Nineteen of these clinical episodes (76%) remained undetected by CorVue™ (false negatives). The sensitivity of CorVue™ was 24%, specificity 70%, positive predictive value 6% and negative predictive value 93%. CorVue™ showed a low sensitivity to predict HF events. Therefore, routine activation of this algorithm could generate misleading information. This article is protected by copyright. All rights reserved.
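The reported sensitivity and positive predictive value follow directly from the counts in the abstract (6 appropriate alarms out of 105, and 25 clinical episodes of which 19 were missed). A quick check of that arithmetic; specificity and NPV are omitted because the true-negative monitoring-window count is not given in the abstract:

```python
def sensitivity(tp, fn):
    """Fraction of real events preceded by an alarm."""
    return tp / (tp + fn)

def positive_predictive_value(tp, fp):
    """Fraction of alarms followed by a real event."""
    return tp / (tp + fp)

tp = 6           # appropriate ITI alarms (HF event within four weeks)
fn = 25 - 6      # clinical episodes with no preceding alarm
fp = 105 - 6     # ITI alarms not followed by an HF event

print(round(sensitivity(tp, fn) * 100))                # 24, as reported
print(round(positive_predictive_value(tp, fp) * 100))  # 6, as reported
```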

  11. Automated information system for analysis and prediction of production situations in blast furnace plant

    Science.gov (United States)

    Lavrov, V. V.; Spirin, N. A.

    2016-09-01

    Advances in modern science and technology are inherently connected with the development, implementation, and widespread use of computer systems based on mathematical modeling. Algorithms and computer systems are gaining practical significance solving a range of process tasks in metallurgy of MES-level (Manufacturing Execution Systems - systems controlling industrial process) of modern automated information systems at the largest iron and steel enterprises in Russia. This fact determines the necessity to develop information-modeling systems based on mathematical models that will take into account the physics of the process, the basics of heat and mass exchange, the laws of energy conservation, and also the peculiarities of the impact of technological and standard characteristics of raw materials on the manufacturing process data. Special attention in this set of operations for metallurgic production is devoted to blast-furnace production, as it consumes the greatest amount of energy, up to 50% of the fuel used in ferrous metallurgy. The paper deals with the requirements, structure and architecture of BF Process Engineer's Automated Workstation (AWS), a computer decision support system of MES Level implemented in the ICS of the Blast Furnace Plant at Magnitogorsk Iron and Steel Works. It presents a brief description of main model subsystems as well as assumptions made in the process of mathematical modelling. Application of the developed system allows the engineering and process staff to analyze online production situations in the blast furnace plant, to solve a number of process tasks related to control of heat, gas dynamics and slag conditions of blast-furnace smelting as well as to calculate the optimal composition of blast-furnace slag, which eventually results in increasing technical and economic performance of blast-furnace production.

  12. Prediction of vitamin interacting residues in a vitamin binding protein using evolutionary information

    Directory of Open Access Journals (Sweden)

    Panwar Bharat

    2013-02-01

    Full Text Available Abstract Background The vitamins are important cofactors in various enzymatic-reactions. In past, many inhibitors have been designed against vitamin binding pockets in order to inhibit vitamin-protein interactions. Thus, it is important to identify vitamin interacting residues in a protein. It is possible to detect vitamin-binding pockets on a protein, if its tertiary structure is known. Unfortunately tertiary structures of limited proteins are available. Therefore, it is important to develop in-silico models for predicting vitamin interacting residues in protein from its primary structure. Results In this study, first we compared protein-interacting residues of vitamins with other ligands using Two Sample Logo (TSL). It was observed that ATP, GTP, NAD, FAD and mannose preferred {G,R,K,S,H}, {G,K,T,S,D,N}, {T,G,Y}, {G,Y,W} and {Y,D,W,N,E} residues respectively, whereas vitamins preferred {Y,F,S,W,T,G,H} residues for the interaction with proteins. Furthermore, compositional information of preferred and non-preferred residues along with patterns-specificity was also observed within different vitamin-classes. Vitamins A, B and B6 preferred {F,I,W,Y,L,V}, {S,Y,G,T,H,W,N,E} and {S,T,G,H,Y,N} interacting residues respectively. It suggested that protein-binding patterns of vitamins are different from other ligands, and motivated us to develop separate predictor for vitamins and their sub-classes. The four different prediction modules, (i) vitamin interacting residues (VIRs), (ii) vitamin-A interacting residues (VAIRs), (iii) vitamin-B interacting residues (VBIRs) and (iv) pyridoxal-5-phosphate (vitamin B6) interacting residues (PLPIRs) have been developed. We applied various classifiers of SVM, BayesNet, NaiveBayes, ComplementNaiveBayes, NaiveBayesMultinomial, RandomForest and IBk etc., as machine learning techniques, using binary and Position-Specific Scoring Matrix (PSSM) features of protein sequences. Finally, we selected best performing SVM modules and

  13. Prediction of vitamin interacting residues in a vitamin binding protein using evolutionary information.

    Science.gov (United States)

    Panwar, Bharat; Gupta, Sudheer; Raghava, Gajendra P S

    2013-02-07

    The vitamins are important cofactors in various enzymatic-reactions. In past, many inhibitors have been designed against vitamin binding pockets in order to inhibit vitamin-protein interactions. Thus, it is important to identify vitamin interacting residues in a protein. It is possible to detect vitamin-binding pockets on a protein, if its tertiary structure is known. Unfortunately tertiary structures of limited proteins are available. Therefore, it is important to develop in-silico models for predicting vitamin interacting residues in protein from its primary structure. In this study, first we compared protein-interacting residues of vitamins with other ligands using Two Sample Logo (TSL). It was observed that ATP, GTP, NAD, FAD and mannose preferred {G,R,K,S,H}, {G,K,T,S,D,N}, {T,G,Y}, {G,Y,W} and {Y,D,W,N,E} residues respectively, whereas vitamins preferred {Y,F,S,W,T,G,H} residues for the interaction with proteins. Furthermore, compositional information of preferred and non-preferred residues along with patterns-specificity was also observed within different vitamin-classes. Vitamins A, B and B6 preferred {F,I,W,Y,L,V}, {S,Y,G,T,H,W,N,E} and {S,T,G,H,Y,N} interacting residues respectively. It suggested that protein-binding patterns of vitamins are different from other ligands, and motivated us to develop separate predictor for vitamins and their sub-classes. The four different prediction modules, (i) vitamin interacting residues (VIRs), (ii) vitamin-A interacting residues (VAIRs), (iii) vitamin-B interacting residues (VBIRs) and (iv) pyridoxal-5-phosphate (vitamin B6) interacting residues (PLPIRs) have been developed. We applied various classifiers of SVM, BayesNet, NaiveBayes, ComplementNaiveBayes, NaiveBayesMultinomial, RandomForest and IBk etc., as machine learning techniques, using binary and Position-Specific Scoring Matrix (PSSM) features of protein sequences. Finally, we selected best performing SVM modules and obtained highest MCC of 0.53, 0.48, 0.61, 0
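The Matthews correlation coefficient (MCC) used above to rank the SVM modules is computed from the four cells of a binary confusion matrix. A generic sketch of the standard formula, not the authors' evaluation code:

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient for a binary classifier:
    (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN)), in [-1, 1]."""
    num = tp * tn - fp * fn
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return num / den if den else 0.0

# A toy classifier that gets 4 of 5 positives and 4 of 5 negatives right:
print(round(mcc(tp=4, tn=4, fp=1, fn=1), 2))  # 0.6
```

Unlike accuracy, MCC stays informative on the heavily imbalanced residue data typical of binding-site prediction, which is why it is the headline metric here.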

  14. Predicting summer residential electricity demand across the U.S.A using climate information

    Science.gov (United States)

    Sun, X.; Wang, S.; Lall, U.

    2017-12-01

    We developed a Bayesian Hierarchical model to predict monthly residential per capita electricity consumption at the state level across the USA using climate information. The summer period was selected since cooling requirements may be directly associated with electricity use, while for winter a mix of energy sources may be used to meet heating needs. Historical monthly electricity consumption data from 1990 to 2013 were used to build a predictive model with a set of corresponding climate and non-climate covariates. A clustering analysis was performed first to identify groups of states that had similar temporal patterns for the cooling degree days of each state. Then, a partial pooling model was applied to each cluster to assess the sensitivity of monthly per capita residential electricity demand to each predictor (including cooling-degree-days, gross domestic product (GDP) per capita, per capita electricity demand of previous month and previous year, and the residential electricity price). The sensitivity of residential electricity to cooling-degree-days has an identifiable geographic distribution with higher values in northeastern United States.
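The key climate covariate above, cooling degree days, is simple to compute from daily mean temperatures. A sketch assuming the common 65°F base temperature, which the abstract does not specify:

```python
def cooling_degree_days(daily_mean_temps_f, base_f=65.0):
    """Cooling degree days for a period: the sum of positive excesses of
    each day's mean temperature over the base temperature."""
    return sum(max(0.0, t - base_f) for t in daily_mean_temps_f)

# Three example days at 70°F, 80°F and 60°F contribute 5 + 15 + 0 degree days:
print(cooling_degree_days([70.0, 80.0, 60.0]))  # 20.0
```

Summing this over a month gives the monthly CDD predictor whose coefficient the hierarchical model estimates per state cluster.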

  15. The largest Silurian vertebrate and its palaeoecological implications

    Science.gov (United States)

    Choo, Brian; Zhu, Min; Zhao, Wenjin; Jia, Liaotao; Zhu, You'an

    2014-01-01

    An apparent absence of Silurian fishes more than half-a-metre in length has been viewed as evidence that gnathostomes were restricted in size and diversity prior to the Devonian. Here we describe the largest pre-Devonian vertebrate (Megamastax amblyodus gen. et sp. nov.), a predatory marine osteichthyan from the Silurian Kuanti Formation (late Ludlow, ~423 million years ago) of Yunnan, China, with an estimated length of about 1 meter. The unusual dentition of the new form suggests a durophagous diet which, combined with its large size, indicates a considerable degree of trophic specialisation among early osteichthyans. The lack of large Silurian vertebrates has recently been used as constraint in palaeoatmospheric modelling, with purported lower oxygen levels imposing a physiological size limit. Regardless of the exact causal relationship between oxygen availability and evolutionary success, this finding refutes the assumption that pre-Emsian vertebrates were restricted to small body sizes. PMID:24921626

  16. THE MASS OF (4) VESTA DERIVED FROM ITS LARGEST GRAVITATIONAL EFFECTS

    International Nuclear Information System (INIS)

    Kuzmanoski, Mike; Novakovic, Bojan; Apostolovska, Gordana

    2010-01-01

    In this paper, we present a recalculated value of the mass of (4) Vesta, derived from its largest gravitational perturbations on selected asteroids during their mutual close encounters. This was done by using a new method for mass determination, which is based on the linking of pre-encounter observations to the orbit determined from post-encounter ones. The estimated weighted mean of the mass of (4) Vesta is (1.300 ± 0.001) × 10⁻¹⁰ M_Sun.

  17. Accounting for information: Information and knowledge in the annual reports of FTSE 100 companies

    OpenAIRE

    Cummins, J.; Bawden, D.

    2010-01-01

    The purpose of this study was to assess the ways in which a sample group of companies discuss information and knowledge. Quantitative and qualitative content analyses were used to survey the way that companies present and value information and knowledge, based on the annual reports of the FTSE 100, the United Kingdom's largest publicly-listed companies. A novel content analysis approach is used, based on a set of categories proposed by Oppenheim, Stenson and Wilson. Many of the ...

  18. On predicting student performance using low-rank matrix factorization techniques

    DEFF Research Database (Denmark)

    Lorenzen, Stephan Sloth; Pham, Dang Ninh; Alstrup, Stephen

    2017-01-01

    ...that require remedial support, generate adaptive hints, and improve the learning of students. This work focuses on predicting the score of students in the quiz system of the Clio Online learning platform, the largest Danish supplier of online learning materials, covering 90% of Danish elementary schools. ... Experimental results on the Clio Online data set confirm that the proposed initialization methods lead to very fast convergence. Regarding the prediction accuracy, surprisingly, the advanced EM method is just slightly better than the baseline approach based on the global mean score and student/quiz bias...
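The global-mean-plus-bias baseline mentioned above can be sketched directly: predict each score as the global mean plus a per-student and a per-quiz offset. This is a minimal version on toy scores; the paper's actual data and EM-based factorization are not reproduced here.

```python
from collections import defaultdict

def fit_bias_baseline(scores):
    """scores: dict mapping (student, quiz) -> observed score.
    Returns a predictor mu + b_student + b_quiz, where each bias is the mean
    residual after removing the global mean (and, for quizzes, the student bias)."""
    mu = sum(scores.values()) / len(scores)
    by_student, by_quiz = defaultdict(list), defaultdict(list)
    for (s, _), r in scores.items():
        by_student[s].append(r - mu)
    b_s = {s: sum(v) / len(v) for s, v in by_student.items()}
    for (s, q), r in scores.items():
        by_quiz[q].append(r - mu - b_s[s])
    b_q = {q: sum(v) / len(v) for q, v in by_quiz.items()}
    # Unseen students/quizzes fall back to a zero bias (cold start).
    return lambda s, q: mu + b_s.get(s, 0.0) + b_q.get(q, 0.0)

toy = {("s1", "q1"): 3, ("s1", "q2"): 1, ("s2", "q1"): 5, ("s2", "q2"): 3}
predict = fit_bias_baseline(toy)
print(predict("s1", "q1"))  # 3.0
```

A low-rank factorization adds a dot product of latent student and quiz vectors on top of this baseline; the abstract's point is that on this data the extra machinery buys surprisingly little.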

  19. Key U.S.-built part fails during testing for world's largest particle collider

    CERN Multimedia

    2007-01-01

    "Scientists are scrambling to redesign a key U.S.-built part that broke "with a loud bang and a cloud of dust" during a high-pressure test for the world's largest particle physics collider that is supposed to start up in November, officials sais Tuesday." (1,5 page)

  20. The value of blood oxygenation level-dependent (BOLD) MR imaging in differentiation of renal solid mass and grading of renal cell carcinoma (RCC): analysis based on the largest cross-sectional area versus the entire whole tumour.

    Directory of Open Access Journals (Sweden)

    Guang-Yu Wu

    Full Text Available To study the value of assessing renal masses using different methods in parameter approaches and to determine whether BOLD MRI is helpful in differentiating RCC from benign renal masses, differentiating clear-cell RCC from renal masses other than clear-cell RCC, and determining the tumour grade. Ninety-five patients with 139 renal masses (93 malignant and 46 benign) who underwent abdominal BOLD MRI were enrolled. R2* values were derived from the largest cross-section (R2*largest) and from the whole tumour (R2*whole). Intra-observer and inter-observer agreements were analysed based on two measurements by the same observer and the first measurement from each observer, respectively, and these agreements are reported with intra-class correlation coefficients and 95% confidence intervals. The diagnostic value of the R2* value in the evaluation was assessed with receiver-operating characteristic analysis. The intra-observer agreement was very good for R2*largest and R2*whole (all > 0.8). The inter-observer agreement of R2*whole (0.75, 95% confidence interval: 0.69~0.79) was good and was significantly improved compared with that of R2*largest (0.61, 95% confidence interval: 0.52~0.68), as there was no overlap in the 95% confidence intervals of the intra-class correlation coefficients. The diagnostic value in differentiating renal cell carcinoma from benign lesions with R2*whole (AUC = 0.79/0.78 [observer1/observer2]) and R2*largest (AUC = 0.75 [observer1]) was good, and the two approaches differed significantly for observer 2 (p = 0.01 for R2*largest [observer2] vs R2*whole [observer2]); the AUCs were > 0.7 and were not significantly different in the remaining comparisons (p = 0.89/0.93 for R2*largest vs R2*whole [observer1/observer2], 0.96 for R2*whole [observer1] vs R2*largest [observer2] and 0.96 for R2*whole [observer2] vs R2*largest [observer1]). BOLD MRI could provide a feasible parameter for differentiating renal cell carcinoma from benign renal masses and for predicting clear-cell renal cell carcinoma grading. Compared with the largest cross

  1. Do pseudo-absence selection strategies influence species distribution models and their predictions? An information-theoretic approach based on simulated data

    Directory of Open Access Journals (Sweden)

    Guisan Antoine

    2009-04-01

    Full Text Available Abstract Background Multiple logistic regression is precluded from many practical applications in ecology that aim to predict the geographic distributions of species because it requires absence data, which are rarely available or are unreliable. In order to use multiple logistic regression, many studies have simulated "pseudo-absences" through a number of strategies, but it is unknown how the choice of strategy influences models and their geographic predictions of species. In this paper we evaluate the effect of several prevailing pseudo-absence strategies on the predictions of the geographic distribution of a virtual species whose "true" distribution and relationship to three environmental predictors was predefined. We evaluated the effect of using (a) real absences, (b) pseudo-absences selected randomly from the background, and (c) two-step approaches: pseudo-absences selected from low suitability areas predicted by either Ecological Niche Factor Analysis (ENFA) or BIOCLIM. We compared how the choice of pseudo-absence strategy affected model fit, predictive power, and information-theoretic model selection results. Results Models built with true absences had the best predictive power, best discriminatory power, and the "true" model (the one that contained the correct predictors) was supported by the data according to AIC, as expected. Models based on random pseudo-absences had among the lowest fit, but yielded the second highest AUC value (0.97), and the "true" model was also supported by the data. Models based on two-step approaches had intermediate fit, the lowest predictive power, and the "true" model was not supported by the data. Conclusion If ecologists wish to build parsimonious GLM models that will allow them to make robust predictions, a reasonable approach is to use a large number of randomly selected pseudo-absences, and perform model selection based on an information theoretic approach. However, the resulting models can be expected to have
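The recommended workflow, random pseudo-absences plus information-theoretic model selection, boils down to two small pieces: drawing background cells and comparing AIC values. A minimal sketch with hypothetical cell identifiers; the species-distribution model fitting itself is omitted:

```python
import random

def aic(log_likelihood, n_params):
    """Akaike's information criterion: 2k - 2*ln(L); lower is better."""
    return 2 * n_params - 2 * log_likelihood

def sample_pseudo_absences(background_cells, presence_cells, n, seed=0):
    """Randomly draw n pseudo-absence cells from the background,
    excluding known presence cells (one common strategy among several)."""
    rng = random.Random(seed)
    candidates = [c for c in background_cells if c not in set(presence_cells)]
    return rng.sample(candidates, n)

# Model selection: a "true" model with 3 predictors vs an overfit 6-predictor one.
print(aic(-120.0, 3))  # 246.0
print(aic(-119.5, 6))  # 251.0 -> the simpler model is preferred
```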

  2. Availability of point-of-purchase nutrition information at a fast-food restaurant.

    Science.gov (United States)

    Wootan, Margo G; Osborn, Melissa; Malloy, Claudia J

    2006-12-01

    Given the link between eating out, poor diets, and obesity, we assessed the availability of point-of-purchase nutrition information at the largest fast-food restaurant in the U.S., McDonald's. In August 2004, we visited 29 of 33 (88%) of the McDonald's outlets in Washington, DC and visually inspected the premises, as well as asked cashiers or restaurant managers whether they had nutrition information available in the restaurant. In Washington, DC, 59% of McDonald's outlets provided in-store nutrition information for the majority of their standard menu items. In 62% of the restaurants, it was necessary to ask two or more employees in order to obtain a copy of that information. We found that even at the largest chain restaurant in the country, nutrition information at the point of decision-making is often difficult to find or completely absent.

  3. Information management and technology strategy in healthcare: local timescales and national requirements

    Directory of Open Access Journals (Sweden)

    Les Smith

    2000-01-01

    Full Text Available The UK National Health Service’s strategic switch-back is well documented and each centrally originated change results in various attempts to record the repercussions and predict the outcomes. The most recent shift is embodied in the Department of Health’s information strategy, Information for health, published in September 1998. This document provides the context for an examination of the issue of developing an Information Management and Technology (IM&T) strategy at the local level within the changing national requirements for NHS information management. The particular pressures on an individual unit and the need to react to them alongside the requirements of the national strategy are the subjects of this article. The case detailed is that of Clatterbridge Centre for Oncology (CCO) on Merseyside, the second largest centre of its type in the UK. Its initial investigation of information needs preceded the publication of the national strategy and its implementation straddled the timescale devised by the NHS Information Authority. The inevitable incompatibility between timescales for the local and the national developments is examined within the case. The work of the new NHS Information Authority and its supporting guidance in its Circular, Information for Health: Initial Local Implementation Strategies, is evaluated as a tool in aligning local and national strategy. Information Managers in other centrally governed organisations within the public sector and large corporations are often alert to similar issues.

  4. A prediction model for assessing residential radon concentration in Switzerland

    International Nuclear Information System (INIS)

    Hauri, Dimitri D.; Huss, Anke; Zimmermann, Frank; Kuehni, Claudia E.; Röösli, Martin

    2012-01-01

    Indoor radon is regularly measured in Switzerland. However, a nationwide model to predict residential radon levels has not been developed. The aim of this study was to develop a prediction model to assess indoor radon concentrations in Switzerland. The model was based on 44,631 measurements from the nationwide Swiss radon database collected between 1994 and 2004. Of these, a randomly selected 80% of measurements were used for model development and the remaining 20% for an independent model validation. A multivariable log-linear regression model was fitted and relevant predictors selected according to evidence from the literature, the adjusted R², the Akaike information criterion (AIC), and the Bayesian information criterion (BIC). The prediction model was evaluated by calculating the Spearman rank correlation between measured and predicted values. Additionally, the predicted values were categorised into three categories (up to the 50th, 50th–90th, and above the 90th percentile) and compared with measured categories using a weighted Kappa statistic. The most relevant predictors for indoor radon levels were tectonic units and year of construction of the building, followed by soil texture, degree of urbanisation, floor of the building where the measurement was taken, and housing type (P-values <0.001 for all). Mean predicted radon values (geometric mean) were 66 Bq/m³ (interquartile range 40–111 Bq/m³) in the lowest exposure category, 126 Bq/m³ (69–215 Bq/m³) in the medium category, and 219 Bq/m³ (108–427 Bq/m³) in the highest category. Spearman correlation between predictions and measurements was 0.45 (95%-CI: 0.44; 0.46) for the development dataset and 0.44 (95%-CI: 0.42; 0.46) for the validation dataset. Kappa coefficients were 0.31 for the development and 0.30 for the validation dataset, respectively. The model explained 20% of overall variability (adjusted R²). In conclusion, this residential radon prediction model, based on a large number of measurements, was demonstrated to be
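
The abstract's workflow — a log-linear regression developed on 80% of the data and validated on the held-out 20% with a Spearman rank correlation — can be sketched on synthetic data. The predictors, coefficients, and sample size below are invented stand-ins, not the Swiss model:

```python
# Illustrative sketch (assumed setup): log-linear regression on an 80% split,
# validated by Spearman rank correlation on the remaining 20%.
import numpy as np

rng = np.random.default_rng(1)

def spearman(a, b):
    """Spearman rank correlation = Pearson correlation of the ranks."""
    ra, rb = np.argsort(np.argsort(a)), np.argsort(np.argsort(b))
    return np.corrcoef(ra, rb)[0, 1]

# Synthetic predictors standing in for tectonic unit, construction year, etc.
n = 1000
X = rng.normal(size=(n, 3))
log_radon = 4.0 + 0.5 * X[:, 0] + 0.3 * X[:, 1] + rng.normal(scale=0.8, size=n)

split = int(0.8 * n)                        # 80% development, 20% validation
Xd = np.column_stack([np.ones(split), X[:split]])
beta, *_ = np.linalg.lstsq(Xd, log_radon[:split], rcond=None)

Xv = np.column_stack([np.ones(n - split), X[split:]])
pred = Xv @ beta
rho = spearman(np.exp(log_radon[split:]), np.exp(pred))
print(round(rho, 2))                        # moderate rank correlation
```

As in the paper, a log-linear model can show a moderate rank correlation on validation data even when much of the variability remains unexplained.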

  5. Evolution of the Largest Mammalian Genome.

    Science.gov (United States)

    Evans, Ben J; Upham, Nathan S; Golding, Geoffrey B; Ojeda, Ricardo A; Ojeda, Agustina A

    2017-06-01

    The genome of the red vizcacha rat (Rodentia, Octodontidae, Tympanoctomys barrerae) is the largest of all mammals, and about double the size of that of its close relative, the mountain vizcacha rat Octomys mimax, even though the lineages that gave rise to these species diverged from each other only about 5 Ma. The mechanism for this rapid genome expansion is controversial, and hypothesized to be a consequence of whole genome duplication or accumulation of repetitive elements. To test these alternative but nonexclusive hypotheses, we gathered and evaluated evidence from whole transcriptome and whole genome sequences of T. barrerae and O. mimax. We recovered support for genome expansion due to accumulation of a diverse assemblage of repetitive elements, which represent about one half and one fifth of the genomes of T. barrerae and O. mimax, respectively, but we found no strong signal of whole genome duplication. In both species, repetitive sequences were rare in transcribed regions as compared with the rest of the genome, and mostly had no close match to annotated repetitive sequences from other rodents. These findings raise new questions about the genomic dynamics of these repetitive elements, their connection to widespread chromosomal fissions that occurred in the T. barrerae ancestor, and their fitness effects, including during the evolution of hypersaline dietary tolerance in T. barrerae. ©The Author(s) 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution.

  6. The predictive model on the user reaction time using the information similarity

    International Nuclear Information System (INIS)

    Lee, Sung Jin; Heo, Gyun Young; Chang, Soon Heung

    2005-01-01

    Human performance is frequently degraded because people forget. Memory is one of the brain processes that are important when trying to understand how people process information. Although a large number of studies have been made on human performance, little is known about the similarity effect in human performance. The purpose of this paper is to propose and validate a quantitative, predictive model of human response time in the user interface based on the concept of similarity. However, it is not easy to explain human performance with similarity or information amount alone. We are confronted by two difficulties: making a quantitative model of human response time from similarity, and validating the proposed model by experimental work. We built the quantitative model on Hick's law and the law of practice. In addition, we validated the model under various experimental conditions by measuring participants' response times in a computer-based display environment. Experimental results reveal that human performance is improved by the similarity of the user interface. We believe the proposed model is useful for the user interface design and evaluation phases.
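
The two ingredients the abstract names can be sketched roughly as follows. The coefficients, and in particular the way similarity discounts the effective information amount, are invented for illustration; the paper's actual model is not reproduced here:

```python
# Hedged sketch: response time as a Hick's-law term on a similarity-reduced
# information amount, combined with the power law of practice.
import math

def hick_time(n_choices, a=0.2, b=0.15):
    """Hick's law: T = a + b * log2(n + 1)."""
    return a + b * math.log2(n_choices + 1)

def practice_time(t_first, trial, alpha=0.4):
    """Power law of practice: T_n = T_1 * n^(-alpha)."""
    return t_first * trial ** (-alpha)

def response_time(n_choices, similarity, trial):
    """Similarity (0..1) reduces the effective information amount (assumption)."""
    effective = n_choices * (1.0 - similarity)
    return practice_time(hick_time(effective), trial)

# A more similar interface layout yields a shorter predicted response time.
print(response_time(8, similarity=0.5, trial=3)
      < response_time(8, similarity=0.0, trial=3))  # True
```

The qualitative behavior matches the experimental finding: higher interface similarity and more practice trials both shorten the predicted response time.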

  7. PREDICTING THE EFFECTIVENESS OF WEB INFORMATION SYSTEMS USING NEURAL NETWORKS MODELING: FRAMEWORK & EMPIRICAL TESTING

    Directory of Open Access Journals (Sweden)

    Dr. Kamal Mohammed Alhendawi

    2018-02-01

    Full Text Available Information systems (IS) assessment studies still rely on traditional tools such as questionnaires to evaluate dependent variables, especially system effectiveness. Artificial neural networks (ANNs) have recently been accepted as an effective alternative for modeling complicated systems and are widely used for forecasting. Very little is known about employing ANNs to predict IS effectiveness. For this reason, this study is one of the few to investigate the efficiency and capability of using ANNs to forecast user perceptions of IS effectiveness, with MATLAB utilized for building and training the neural network model. A dataset of 175 subjects collected from an international organization is utilized for ANN learning, where each subject consists of 6 features (5 quality factors as inputs and one Boolean output). 75% of the subjects are used in the training phase. The results provide evidence that ANN models have reasonable accuracy in forecasting IS effectiveness. For prediction, ANNs with PURELIN (ANNP) and TANSIG (ANNTS) transfer functions are used. Both models yield reasonable predictions; however, the accuracy of the ANNTS model is better than that of the ANNP model (88.6% and 70.4%, respectively). As the study proposes a new model for predicting IS dependent variables, it could save the considerable cost that might be spent on sample data collection in quantitative studies in the fields of science, management, education, arts, and others.
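
The kind of network the abstract describes — quality factors in, a tanh ("TANSIG") hidden layer, a Boolean effectiveness output — can be sketched in a few lines. The study's MATLAB network and 175-subject dataset are not reproduced; the data, layer sizes, and learning rate below are all assumptions:

```python
# Minimal sketch: one tanh hidden layer trained by full-batch gradient descent
# to map five synthetic "quality factors" to a Boolean effectiveness label.
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in data: 5 quality factors -> effectiveness (0/1).
X = rng.normal(size=(200, 5))
y = (X.sum(axis=1) > 0).astype(float)

W1 = rng.normal(scale=0.5, size=(5, 8)); b1 = np.zeros(8)
w2 = rng.normal(scale=0.5, size=8); b2 = 0.0

for _ in range(2000):
    h = np.tanh(X @ W1 + b1)               # TANSIG-style hidden layer
    out = 1 / (1 + np.exp(-(h @ w2 + b2))) # sigmoid output for the Boolean label
    grad = (out - y) / len(y)              # cross-entropy output gradient
    dh = np.outer(grad, w2) * (1 - h**2)   # backprop through tanh
    w2 -= 0.5 * h.T @ grad
    b2 -= 0.5 * grad.sum()
    W1 -= 0.5 * X.T @ dh
    b1 -= 0.5 * dh.sum(axis=0)

accuracy = ((out > 0.5) == (y > 0.5)).mean()
print(round(accuracy, 2))                  # training accuracy on the toy data
```

A linear ("PURELIN") output variant would replace the sigmoid with an identity output and a squared-error gradient; the abstract's accuracy figures come from its own data, not from this sketch.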

  8. How the largest electric and gas utility companies administer public relations

    Energy Technology Data Exchange (ETDEWEB)

    Bogart, J.D.

    1979-04-12

    This article describes the findings of a survey conducted by the author in the second half of 1978 to determine the sizes of the public relations staffs of the nation's largest operating electric and gas utilities, their budgets, organizational differences, and specific functions. Common public relations issues and major public relations problems of the utilities are identified, as well as recent trends or changes in budgeting and organization. Some functional variations of public relations departments among utility companies were detected and described.

  9. Environmental isotope signatures of the largest freshwater lake in Kerala

    International Nuclear Information System (INIS)

    Unnikrishnan Warrier, C.

    2007-01-01

    Sasthamkotta lake, the largest freshwater lake in Kerala, serves as a source of drinking water for more than half a million people. Environmental 137Cs analysis done on undisturbed sediment core samples reveals that the recent rate of sedimentation is not uniform in the lake. The useful life of the lake is estimated at about 800 years. The δD and δ18O values of the lake waters indicate that the lake is well mixed, with slight horizontal variation. The stable isotope studies on well waters from the catchment indicate hydraulic communication with the lake, and the lake-groundwater system is of the flow-through type. An analytical model also supports this view. (author)

  10. CERN: Digital image analysis in the world's largest research center for particle physics

    CERN Multimedia

    2005-01-01

    Those who want to study the smallest building blocks of matter need the largest instruments. CERN, near Geneva, Switzerland, is where the most powerful circular accelerator in the world is being built: the Large Hadron Collider (LHC) for proton collisions. It has a circumference of 26.7 km (4 pages)

  11. Zephyr - the next generation prediction

    DEFF Research Database (Denmark)

    Giebel, G.; Landberg, L.; Nielsen, Torben Skov

    2001-01-01

    Technical University. This paper will describe a new project funded by the Danish Ministry of Energy where the largest Danish utilities (Elkraft, Elsam, Eltra and SEAS) are participating. Two advantages can be achieved by combining the effort: The software architecture will be state-of-the-art, using...... the Java2TM platform and Enterprise Java Beans technology, and it will ensure that the best forecasts are given on all prediction horizons from the short range (0-9 hours) to the long range (36-48 hours). This is because the IMM approach uses online data and advanced statistical methods, which...

  12. Diabetes prevention information in Japanese magazines with the largest print runs. Content analysis using clinical guidelines as a standard.

    Science.gov (United States)

    Noda, Emi; Mifune, Taka; Nakayama, Takeo

    2013-01-01

    To characterize information on diabetes prevention appearing in Japanese general health magazines and to examine the agreement of the content with that in clinical practice guidelines for the treatment of diabetes in Japan. We used the Japanese magazines' databases provided by the Media Research Center and selected magazines with large print runs published in 2006. Two medical professionals independently conducted content analysis based on items in the diabetes prevention guidelines. The number of pages for each item and agreement with the information in the guidelines were determined. We found 63 issues of magazines amounting to 8,982 pages; 484 pages included diabetes prevention-related content. For 23 items included in the diabetes prevention guidelines, overall agreement of information printed in the magazines with that in the guidelines was 64.5% (471 out of 730). The number of times these items were referred to in the magazines varied widely, from 247 times for food items to 0 times for items on screening for pregnancy-induced diabetes, dyslipidemia, and hypertension. Among the 20 items that were referred to at least once, 18 items showed more than 90% agreement with the guidelines. However, there was poor agreement for information on vegetable oil (2/14, 14%) and for specific foods (5/247, 2%). For the fatty acids category, "fat" was not mentioned in the guidelines; however, the term frequently appeared in magazines. "Uncertainty" was never mentioned in magazines for specific food items. The diabetes prevention-related content in the health magazines differed from that defined in clinical practice guidelines. Most information in the magazines agreed with the guidelines; however, some items were referred to inappropriately. To disseminate correct information to the public on diabetes prevention, health professionals and the media must collaborate.

  13. A Semi-Supervised Learning Algorithm for Predicting Four Types MiRNA-Disease Associations by Mutual Information in a Heterogeneous Network.

    Science.gov (United States)

    Zhang, Xiaotian; Yin, Jian; Zhang, Xu

    2018-03-02

    Increasing evidence suggests that dysregulation of microRNAs (miRNAs) may lead to a variety of diseases. Therefore, identifying disease-related miRNAs is a crucial problem. Currently, many computational approaches have been proposed to predict binary miRNA-disease associations. In this study, in order to predict underlying miRNA-disease association types, a semi-supervised model called the network-based label propagation algorithm is proposed to infer multiple types of miRNA-disease associations (NLPMMDA) by mutual information derived from the heterogeneous network. The NLPMMDA method integrates disease semantic similarity, miRNA functional similarity, and Gaussian interaction profile kernel similarity information of miRNAs and diseases to construct a heterogeneous network. NLPMMDA is a semi-supervised model which does not require verified negative samples. Leave-one-out cross validation (LOOCV) was implemented for four known types of miRNA-disease associations and demonstrated the reliable performance of our method. Moreover, case studies of lung cancer and breast cancer confirmed effective performance of NLPMMDA to predict novel miRNA-disease associations and their association types.
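
A label propagation step of the general form F_{t+1} = α·S·F_t + (1−α)·Y is the standard formulation behind methods like the one described; the toy similarity matrix, labels, and α below are invented and do not reproduce NLPMMDA:

```python
# Toy sketch of network-based label propagation: known association-type labels
# diffuse over a similarity network to score an unlabeled node.
import numpy as np

def label_propagation(S, Y, alpha=0.4, iters=50):
    """S: row-normalized similarity matrix; Y: initial label matrix."""
    F = Y.copy()
    for _ in range(iters):
        F = alpha * S @ F + (1 - alpha) * Y
    return F

# 4 miRNAs; columns = one-hot association types; miRNA 3 is unlabeled.
S = np.array([[0.0, 0.8, 0.1, 0.1],
              [0.8, 0.0, 0.1, 0.1],
              [0.1, 0.1, 0.0, 0.8],
              [0.1, 0.1, 0.8, 0.0]])
S = S / S.sum(axis=1, keepdims=True)   # row-normalize the similarities
Y = np.array([[1.0, 0.0],              # type A
              [1.0, 0.0],              # type A
              [0.0, 1.0],              # type B
              [0.0, 0.0]])             # unknown: to be inferred
F = label_propagation(S, Y)
print(F[3].argmax())  # prints 1: the unlabeled miRNA inherits type B
```

The unlabeled node's scores are dominated by its strongest neighbor, which is the intuition behind propagating known miRNA-disease association types through a heterogeneous similarity network.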

  14. Collaborative spectrum sensing based on the ratio between largest eigenvalue and Geometric mean of eigenvalues

    KAUST Repository

    Shakir, Muhammad

    2011-12-01

    In this paper, we introduce a new detector referred to as the Geometric mean detector (GEMD), which is based on the ratio of the largest eigenvalue to the geometric mean of the eigenvalues for collaborative spectrum sensing. The decision threshold has been derived by employing a Gaussian approximation approach. In this approach, the two random variables, i.e., the largest eigenvalue and the geometric mean of the eigenvalues, are considered as independent Gaussian random variables such that their cumulative distribution functions (CDFs) are approximated by a univariate Gaussian distribution function for any number of cooperating secondary users and received samples. The approximation approach is based on the calculation of exact analytical moments of the largest eigenvalue and the geometric mean of the eigenvalues of the received covariance matrix. The decision threshold has been calculated by exploiting the CDF of the ratio of two Gaussian distributed random variables. In this context, we exchange the analytical moments of the two random variables with the moments of the Gaussian distribution function. The performance of the detector is compared with the performance of the energy detector and the eigenvalue ratio detector. Analytical and simulation results show that our newly proposed detector yields a considerable performance advantage in realistic spectrum sensing scenarios. Moreover, our results based on the proposed approximation approach are in perfect agreement with the empirical results. © 2011 IEEE.
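
The test statistic itself — the largest eigenvalue of the received sample covariance matrix divided by the geometric mean of its eigenvalues — is easy to sketch. The signal model and dimensions below are illustrative, and no Gaussian-approximation threshold is derived here:

```python
# Sketch of the GEMD statistic on synthetic data: under H1 (a common primary-
# user signal across receivers) the largest eigenvalue dominates, so the ratio
# of largest eigenvalue to geometric mean rises above its noise-only level.
import numpy as np

rng = np.random.default_rng(2)

def gemd_statistic(samples):
    """samples: (K secondary users) x (N received samples)."""
    cov = samples @ samples.conj().T / samples.shape[1]
    eig = np.linalg.eigvalsh(cov)
    return eig.max() / np.exp(np.mean(np.log(eig)))  # largest / geometric mean

K, N = 4, 500
noise = rng.normal(size=(K, N))                        # H0: noise only
signal = rng.normal(size=(1, N))
occupied = noise + 2.0 * np.repeat(signal, K, axis=0)  # H1: correlated signal

print(gemd_statistic(noise) < gemd_statistic(occupied))  # statistic rises under H1
```

In a real detector, the statistic would be compared against the threshold derived from the Gaussian approximation of the two random variables; here only the ordering of the H0 and H1 statistics is demonstrated.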

  15. Prediction of the human response time with the similarity and quantity of information

    International Nuclear Information System (INIS)

    Lee, Sungjin; Heo, Gyunyoung; Chang, Soon Heung

    2006-01-01

    Memory is one of the brain processes that are important when trying to understand how people process information. Although a large number of studies have been made on human performance, little is known about the similarity effect in human performance. The purpose of this paper is to propose and validate a quantitative, predictive model of human response time in the user interface based on the concept of similarity. However, it is not easy to explain human performance with similarity or information amount alone. We are confronted by two difficulties: making a quantitative model of human response time from similarity, and validating the proposed model by experimental work. We built the quantitative model on Hick's law and the law of practice. In addition, we validated the model under various experimental conditions by measuring participants' response times in a computer-based display environment. Experimental results reveal that human performance is improved by the similarity of the user interface. We believe the proposed model is useful for the user interface design and evaluation phases.

  16. Multimethod prediction of physical parent-child aggression risk in expectant mothers and fathers with Social Information Processing theory.

    Science.gov (United States)

    Rodriguez, Christina M; Smith, Tamika L; Silvia, Paul J

    2016-01-01

    The Social Information Processing (SIP) model postulates that parents undergo a series of stages in implementing physical discipline that can escalate into physical child abuse. The current study utilized a multimethod approach to investigate whether SIP factors can predict risk of parent-child aggression (PCA) in a diverse sample of expectant mothers and fathers. SIP factors of PCA attitudes, negative child attributions, reactivity, and empathy were considered as potential predictors of PCA risk; additionally, analyses considered whether personal history of PCA predicted participants' own PCA risk through its influence on their attitudes and attributions. Findings indicate that, for both mothers and fathers, history influenced attitudes but not attributions in predicting PCA risk, and attitudes and attributions predicted PCA risk; empathy and reactivity predicted negative child attributions for expectant mothers, but only reactivity significantly predicted attributions for expectant fathers. Path models for expectant mothers and fathers were remarkably similar. Overall, the findings provide support for major aspects of the SIP model. Continued work is needed in studying the progression of these factors across time for both mothers and fathers as well as the inclusion of other relevant ecological factors to the SIP model. Copyright © 2015 Elsevier Ltd. All rights reserved.

  17. A predictability study of Lorenz's 28-variable model as a dynamical system

    Science.gov (United States)

    Krishnamurthy, V.

    1993-01-01

    The dynamics of error growth in a two-layer nonlinear quasi-geostrophic model has been studied to gain an understanding of the mathematical theory of atmospheric predictability. The growth of random errors of varying initial magnitudes has been studied, and the relation between this classical approach and the concepts of the nonlinear dynamical systems theory has been explored. The local and global growths of random errors have been expressed partly in terms of the properties of an error ellipsoid and the Liapunov exponents determined by linear error dynamics. The local growth of small errors is initially governed by several modes of the evolving error ellipsoid but soon becomes dominated by the longest axis. The average global growth of small errors is exponential with a growth rate consistent with the largest Liapunov exponent. The duration of the exponential growth phase depends on the initial magnitude of the errors. The subsequent large errors undergo a nonlinear growth with a steadily decreasing growth rate and attain saturation that defines the limit of predictability. The degree of chaos and the largest Liapunov exponent show considerable variation with change in the forcing, which implies that the time variation in the external forcing can introduce variable character to the predictability.
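
The central quantity in this error-growth picture, the largest Lyapunov exponent, can be demonstrated on a one-dimensional stand-in. The two-layer quasi-geostrophic model is far too large to reproduce here, so the sketch uses the logistic map, whose largest Lyapunov exponent is known analytically:

```python
# Hedged illustration: the average exponential stretching rate along an orbit
# approximates the largest Lyapunov exponent. For the r=4 logistic map the
# known value is ln 2 ~ 0.693.
import math

def logistic(x, r=4.0):
    return r * x * (1.0 - x)

def largest_lyapunov(x0=0.3, steps=50000, r=4.0):
    """Average log of the local stretching |f'(x)| = |r(1-2x)| along the orbit."""
    x, total = x0, 0.0
    for _ in range(steps):
        total += math.log(abs(r * (1.0 - 2.0 * x)))
        x = logistic(x, r)
    return total / steps

lam = largest_lyapunov()
print(round(lam, 2))  # should be close to ln 2 ~ 0.69
```

A positive exponent means small errors grow exponentially at first, exactly the regime the abstract describes before nonlinear saturation sets the limit of predictability; the predictable time scales roughly like 1/λ.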

  18. Coupled information diffusion--pest dynamics models predict delayed benefits of farmer cooperation in pest management programs.

    Science.gov (United States)

    Rebaudo, François; Dangles, Olivier

    2011-10-01

    Worldwide, the theory and practice of agricultural extension system have been dominated for almost half a century by Rogers' "diffusion of innovation theory". In particular, the success of integrated pest management (IPM) extension programs depends on the effectiveness of IPM information diffusion from trained farmers to other farmers, an important assumption which underpins funding from development organizations. Here we developed an innovative approach through an agent-based model (ABM) combining social (diffusion theory) and biological (pest population dynamics) models to study the role of cooperation among small-scale farmers to share IPM information for controlling an invasive pest. The model was implemented with field data, including learning processes and control efficiency, from large scale surveys in the Ecuadorian Andes. Our results predict that although cooperation had short-term costs for individual farmers, it paid in the long run as it decreased pest infestation at the community scale. However, the slow learning process placed restrictions on the knowledge that could be generated within farmer communities over time, giving rise to natural lags in IPM diffusion and applications. We further showed that if individuals learn from others about the benefits of early prevention of new pests, then educational effort may have a sustainable long-run impact. Consistent with models of information diffusion theory, our results demonstrate how an integrated approach combining ecological and social systems would help better predict the success of IPM programs. This approach has potential beyond pest management as it could be applied to any resource management program seeking to spread innovations across populations.
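
The coupling the abstract describes — information diffusion among farmers feeding back into pest dynamics — can be caricatured in a few lines. Every parameter below (community size, learning rule, pest decay rate) is invented; this is not the Ecuadorian ABM:

```python
# Highly simplified sketch: IPM knowledge spreads between farmers each season,
# a farmer's pest load shrinks with their knowledge, and cooperation (sharing)
# lowers community-scale infestation relative to no sharing.
import random

random.seed(4)

def simulate(seasons=20, n=50, share=True):
    knowledge = [1.0 if i < 5 else 0.0 for i in range(n)]  # 5 trained farmers
    pest = [1.0] * n
    for _ in range(seasons):
        if share:
            for i in range(n):                 # imperfect learning from a peer
                j = random.randrange(n)
                knowledge[i] = max(knowledge[i], 0.5 * knowledge[j])
        pest = [p * (1.0 - 0.3 * k) for p, k in zip(pest, knowledge)]
    return sum(pest) / n                       # mean community infestation

print(simulate(share=True) < simulate(share=False))  # cooperation pays off
```

Even this caricature reproduces the paper's qualitative point: the `0.5` learning factor makes diffusion slow and lossy, so the community-scale benefit of cooperation accumulates over seasons rather than appearing immediately.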

  19. Information Technology Skills Recommended for Business Students by Fortune 500 Executives.

    Science.gov (United States)

    Zhao, Jensen J.; Alexander, Melody W.

    2002-01-01

    Responses from 51 Fortune 500 training and development executives identified 28 information technology skills strongly recommended for business graduates. A similar 1995 survey identified only 11 skills. The largest increase occurred in Internet/Web telecommunications and discipline-specific information systems. (Contains 31 references.) (SK)

  20. Effects of historical and predictive information on ability of transport pilot to predict an alert

    Science.gov (United States)

    Trujillo, Anna C.

    1994-01-01

    In the aviation community, the early detection of the development of a possible subsystem problem during a flight is potentially useful for increasing the safety of the flight. Commercial airlines are currently using twin-engine aircraft for extended transport operations over water, and the early detection of a possible problem might increase the flight crew's options for safely landing the aircraft. One method for decreasing the severity of a developing problem is to predict the behavior of the problem so that appropriate corrective actions can be taken. To investigate the pilots' ability to predict long-term events, a computer workstation experiment was conducted in which 18 airline pilots predicted the alert time (the time to an alert) using 3 different dial displays and 3 different parameter behavior complexity levels. The three dial displays were as follows: standard (resembling current aircraft round dial presentations); history (indicating the current value plus the value of the parameter 5 sec in the past); and predictive (indicating the current value plus the value of the parameter 5 sec into the future). The time profiles describing the behavior of the parameter consisted of constant rate-of-change profiles, decelerating profiles, and accelerating-then-decelerating profiles. Although the pilots indicated that they preferred the near term predictive dial, the objective data did not support its use. The objective data did show that the time profiles had the most significant effect on performance in estimating the time to an alert.

  1. Mortgage loans: an analysis of the portfolios of the largest banks in Brazil

    Directory of Open Access Journals (Sweden)

    Bruno Vinícius Ramos Fernandes

    2013-05-01

    Full Text Available Given the current macroeconomic environment in Brazil, where inflation has stabilized and the basic interest rate of the economy is at one of its historical lows, demand for mortgages is increasing. In this context, mortgage lending is central to meeting the demand for housing purchases, in addition to being a catalyst for reducing the high housing deficit. From a descriptive, empirical-analytical perspective, we analyzed the mortgage loan portfolios of the largest banks of the country between the years 2001 and 2010 through the Quarterly Financial Information (IFT) available on the Central Bank website. A comparative relationship between the data was established in order to track the development of mortgage portfolios over the years and the factors that influenced this evolution, and to evaluate the timeliness and quality of those loans. The evolution of the portfolios reflected the economic context of Brazil in the period, and because most of these operations are long term, the banks are more exposed to market risk. With regard to credit risk, we observe that, over the years, Brazilian banks have presented mortgage loan portfolios with lower risk, and that institutions whose real estate credits carry higher portfolio risk levels are subject to higher losses on such operations in the event of default.

  2. Group spike-and-slab lasso generalized linear models for disease prediction and associated genes detection by incorporating pathway information.

    Science.gov (United States)

    Tang, Zaixiang; Shen, Yueping; Li, Yan; Zhang, Xinyan; Wen, Jia; Qian, Chen'ao; Zhuang, Wenzhuo; Shi, Xinghua; Yi, Nengjun

    2018-03-15

    Large-scale molecular data have been increasingly used as an important resource for prognostic prediction of diseases and detection of associated genes. However, standard approaches for omics data analysis ignore the group structure among genes encoded in functional relationships or pathway information. We propose new Bayesian hierarchical generalized linear models, called group spike-and-slab lasso GLMs, for predicting disease outcomes and detecting associated genes by incorporating large-scale molecular data and group structures. The proposed model employs a mixture double-exponential prior for coefficients that induces a self-adaptive shrinkage amount on different coefficients. The group information is incorporated into the model by setting group-specific parameters. We have developed a fast and stable deterministic algorithm to fit the proposed hierarchical GLMs, which can perform variable selection within groups. We assess the performance of the proposed method on several simulated scenarios, by varying the overlap among groups, group size, number of non-null groups, and the correlation within groups. Compared with existing methods, the proposed method provides not only more accurate estimates of the parameters but also better prediction. We further demonstrate the application of the proposed procedure on three cancer datasets by utilizing pathway structures of genes. Our results show that the proposed method generates powerful models for predicting disease outcomes and detecting associated genes. The methods have been implemented in a freely available R package BhGLM (http://www.ssg.uab.edu/bhglm/). nyi@uab.edu. Supplementary data are available at Bioinformatics online.

  3. Fair value versus historical cost-based valuation for biological assets: predictability of financial information

    Directory of Open Access Journals (Sweden)

    Josep M. Argilés

    2011-08-01

    This paper performs an empirical study with a sample of Spanish farms valuing biological assets at historical cost (HC) and a sample applying fair value (FV), finding no significant differences between the two valuation methods in assessing future cash flows. However, most tests reveal greater predictive power for future earnings under fair valuation of biological assets, which is not explained by differences in volatility of earnings and profitability. The study also evidences the existence of flawed HC accounting practices for biological assets in agriculture, which suggests scarce information content of this valuation method in the predominantly small business units existing in the agricultural sector in advanced Western countries.

  4. Geopositioning with a quadcopter: Extracted feature locations and predicted accuracy without a priori sensor attitude information

    Science.gov (United States)

    Dolloff, John; Hottel, Bryant; Edwards, David; Theiss, Henry; Braun, Aaron

    2017-05-01

    This paper presents an overview of the Full Motion Video-Geopositioning Test Bed (FMV-GTB) developed to investigate algorithm performance and issues related to the registration of motion imagery and subsequent extraction of feature locations along with predicted accuracy. A case study is included corresponding to a video taken from a quadcopter. Registration of the corresponding video frames is performed without the benefit of a priori sensor attitude (pointing) information. In particular, tie points are automatically measured between adjacent frames using standard optical flow matching techniques from computer vision, an a priori estimate of sensor attitude is then computed based on supplied GPS sensor positions contained in the video metadata and a photogrammetric/search-based structure from motion algorithm, and then a Weighted Least Squares adjustment of all a priori metadata across the frames is performed. Extraction of absolute 3D feature locations, including their predicted accuracy based on the principles of rigorous error propagation, is then performed using a subset of the registered frames. Results are compared to known locations (check points) over a test site. Throughout this entire process, no external control information (e.g. surveyed points) is used other than for evaluation of solution errors and corresponding accuracy.
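
The two numeric steps the abstract names — a Weighted Least Squares adjustment and rigorous error propagation of the resulting estimate — can be shown generically. The matrices below are illustrative, not the FMV-GTB implementation:

```python
# Generic sketch: weighted least squares solve x = (A'WA)^-1 A'Wb, with the
# estimate's covariance (A'WA)^-1 giving the predicted accuracy.
import numpy as np

A = np.array([[1.0, 2.0],        # design matrix (observation equations)
              [1.0, 0.5],
              [1.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.1, 1.4, 2.0, 4.2])   # observations
W = np.diag([1.0, 4.0, 2.0, 1.0])    # weights = inverse observation variances

N = A.T @ W @ A                      # normal matrix
x = np.linalg.solve(N, A.T @ W @ b)  # WLS estimate
cov_x = np.linalg.inv(N)             # propagated covariance of the estimate

print(x.shape, cov_x.shape)
```

In the geopositioning context, down-weighting noisy tie-point observations changes both the solution and its predicted accuracy, which is then compared against surveyed check points.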

  5. Triangle network motifs predict complexes by complementing high-error interactomes with structural information.

    Science.gov (United States)

    Andreopoulos, Bill; Winter, Christof; Labudde, Dirk; Schroeder, Michael

    2009-06-27

    Many high-throughput studies produce protein-protein interaction networks (PPINs) with many errors and missing information. Even for genome-wide approaches, there is often a low overlap between PPINs produced by different studies. Second-level neighbors separated by two protein-protein interactions (PPIs) were previously used for predicting protein function and finding complexes in high-error PPINs. We retrieve second-level neighbors in PPINs, and complement these with structural domain-domain interactions (SDDIs) representing binding evidence on proteins, forming PPI-SDDI-PPI triangles. We find low overlap between PPINs, SDDIs and known complexes, all well below 10%. We evaluate the overlap of PPI-SDDI-PPI triangles with known complexes from the Munich Information Center for Protein Sequences (MIPS). PPI-SDDI-PPI triangles have ~20 times higher overlap with MIPS complexes than second-level neighbors in PPINs without SDDIs. The biological interpretation for triangles is that a SDDI causes two proteins to be observed with common interaction partners in high-throughput experiments. The relatively few SDDIs overlapping with PPINs are part of highly connected SDDI components, and are more likely to be detected in experimental studies. We demonstrate the utility of PPI-SDDI-PPI triangles by reconstructing myosin-actin processes in the nucleus, cytoplasm, and cytoskeleton, which were not obvious in the original PPIN. Using other complementary datatypes in place of SDDIs to form triangles, such as PubMed co-occurrences or threading information, results in a similar ability to find protein complexes. Given high-error PPINs with missing information, triangles of mixed datatypes are a promising direction for finding protein complexes. Integrating PPINs with SDDIs improves finding complexes. Structural SDDIs partially explain the high functional similarity of second-level neighbors in PPINs. We estimate that relatively little structural information would be sufficient
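    The triangle motif described here has a direct set-based formulation. A minimal sketch (hypothetical toy protein names, not the MIPS data): two proteins linked by an SDDI that share a common PPI partner form a PPI-SDDI-PPI triangle:

```python
def ppi_sddi_ppi_triangles(ppis, sddis):
    """Find triangles (a, b, c) where a-b and b-c are PPIs
    and a-c is an SDDI, i.e. the SDDI pair shares PPI partner b."""
    neighbors = {}
    for u, v in ppis:                     # build PPI adjacency
        neighbors.setdefault(u, set()).add(v)
        neighbors.setdefault(v, set()).add(u)
    triangles = set()
    for a, c in sddis:
        common = neighbors.get(a, set()) & neighbors.get(c, set())
        for b in common:
            triangles.add((min(a, c), b, max(a, c)))
    return triangles

ppis = [("A", "B"), ("B", "C"), ("C", "D")]   # hypothetical interactions
sddis = [("A", "C")]                          # structural binding evidence
tri = ppi_sddi_ppi_triangles(ppis, sddis)
print(tri)  # {('A', 'B', 'C')}
```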

  6. Analysis of Multiple Structural Changes in Financial Contagion Based on the Largest Lyapunov Exponents

    Directory of Open Access Journals (Sweden)

    Rui Wang

    2014-01-01

    A modified multiple structural changes model is built to test structural breaks of the financial system based on calculating the largest Lyapunov exponents of the financial time series. Afterwards, the Lorenz system is used as a simulation example to inspect the new model. As the Lorenz system has strong nonlinearity, the verification results show that the new model has good capability in both finding the breakpoint and revealing the changes in nonlinear characteristics of the time series. The empirical study based on the model used daily data from the S&P 500 stock index during the global financial crisis from 2005 to 2012. The results provide four breakpoints of the period, which divide the contagion into four stages: stationary, local outbreak, global outbreak, and recovery period. An additional significant result is the obvious chaos characteristic difference in the largest Lyapunov exponents and the standard deviation at various stages, particularly at the local outbreak stage.
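    As a self-contained illustration of the core quantity in this abstract, the largest Lyapunov exponent of a textbook one-dimensional system (the logistic map, not the financial series above) can be estimated by averaging log|f'(x)| along an orbit; a positive value signals chaos and therefore a limited prediction horizon:

```python
import math

def logistic_lyapunov(r, x0=0.3, n_transient=1000, n_iter=100_000):
    """Largest Lyapunov exponent of the logistic map x -> r*x*(1-x),
    estimated as the orbit average of log|f'(x)| = log|r*(1-2x)|."""
    x = x0
    for _ in range(n_transient):       # discard transient behavior
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n_iter):
        deriv = abs(r * (1.0 - 2.0 * x))
        total += math.log(max(deriv, 1e-300))   # guard against log(0)
        x = r * x * (1.0 - x)
    return total / n_iter

lam = logistic_lyapunov(4.0)
print(lam)  # ~ln(2) ~ 0.693: nearby orbits diverge exponentially
```

    The reciprocal of a positive exponent bounds the time over which trajectories remain predictable, which is how the exponent is used for the longest predictable time in the abstracts above.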

  7. What good are positive emotions for treatment? Trait positive emotionality predicts response to Cognitive Behavioral Therapy for anxiety.

    Science.gov (United States)

    Taylor, Charles T; Knapp, Sarah E; Bomyea, Jessica A; Ramsawh, Holly J; Paulus, Martin P; Stein, Murray B

    2017-06-01

    Cognitive behavioral therapy (CBT) is empirically supported for the treatment of anxiety disorders; however, not all individuals achieve recovery following CBT. Positive emotions serve a number of functions that theoretically should facilitate response to CBT - they promote flexible patterns of information processing and assimilation of new information, encourage approach-oriented behavior, and speed physiological recovery from negative emotions. We conducted a secondary analysis of an existing clinical trial dataset to test the a priori hypothesis that individual differences in trait positive emotions would predict CBT response for anxiety. Participants meeting diagnostic criteria for panic disorder (n = 28) or generalized anxiety disorder (n = 31) completed 10 weekly individual CBT sessions. Trait positive emotionality was assessed at pre-treatment, and severity of anxiety symptoms and associated impairment was assessed throughout treatment. Participants who reported a greater propensity to experience positive emotions at pre-treatment displayed the largest reduction in anxiety symptoms as well as fewer symptoms following treatment. Positive emotions remained a robust predictor of change in symptoms when controlling for baseline depression severity. Initial evidence supports the predictive value of trait positive emotions as a prognostic indicator for CBT outcome in a GAD and PD sample.

  8. Distribution and Modeled Transport of Plastic Pollution in the Great Lakes, the World's Largest Freshwater Resource

    Directory of Open Access Journals (Sweden)

    Rachel N. Cable

    2017-07-01

    Most plastic pollution originates on land. As such, freshwater bodies serve as conduits for the transport of plastic litter to the ocean. Understanding the concentrations and fluxes of plastic litter in freshwater ecosystems is critical to our understanding of the global plastic litter budget and underpins the success of future management strategies. We conducted a replicated field survey of surface plastic concentrations in four lakes in the North American Great Lakes system, the largest contiguous freshwater system on the planet. We then modeled plastic transport to resolve spatial and temporal variability of plastic distribution in one of the Great Lakes, Lake Erie. Triplicate surface samples were collected at 38 stations in mid-summer of 2014. Plastic particles >106 μm in size were quantified. Concentrations were highest near populated urban areas and their water infrastructure. In the highest concentration trawl, nearly 2 million fragments km−2 were found in the Detroit River—dwarfing previous reports of Great Lakes plastic abundances by over 4-fold. Yet, the accuracy of single trawl counts was challenged: within-station plastic abundances varied 0- to 3-fold between replicate trawls. In the smallest size class (106–1,000 μm), false positive rates of 12–24% were determined analytically for plastic vs. non-plastic, while false negative rates averaged ~18%. Though predicted to form in summer by the existing Lake Erie circulation model, our transport model did not predict a permanent surface “Lake Erie Garbage Patch” in its central basin—a trend supported by field survey data. Rather, general eastward transport with recirculation in the major basins was predicted. Further, modeled plastic residence times were drastically influenced by plastic buoyancy. Neutrally buoyant plastics—those with the same density as the ambient water—were flushed several times slower than plastics floating at the water's surface and exceeded the

  9. World's largest off-road tires to be recycled

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2005-07-01

    Suncor Energy is the first company in Canada to use a new technology designed uniquely for tire recycling at oil sand facilities. The technology is owned by CuttingEdge Tire Recycling, a partnership between Denesoline Environmental Limited Partnership and Beaver Environmental Rubber Technologies Limited. Suncor has supported the development of this Aboriginal-owned and operated business by offering land, electricity, diesel fuel and stockpiles of used truck tires from its oil sand mining activities. These tires are the largest off-road tires in the world. In this new technology, tires that are worn-out through oil sand mining are shredded in a portable shredder before being recycled for subsequent use by the Alberta Recycling Management Association. 1 fig.

  10. A novel model to combine clinical and pathway-based transcriptomic information for the prognosis prediction of breast cancer.

    Directory of Open Access Journals (Sweden)

    Sijia Huang

    2014-09-01

    Breast cancer is the most common malignancy in women worldwide. With increasing awareness of heterogeneity in breast cancers, better prediction of breast cancer prognosis is much needed for more personalized treatment and disease management. Towards this goal, we have developed a novel computational model for breast cancer prognosis by combining the Pathway Deregulation Score (PDS)-based Pathifier algorithm, Cox regression and the L1-LASSO penalization method. We trained the model on a set of 236 patients with gene expression data and clinical information, and validated its performance on three diversified testing data sets of 606 patients. To evaluate the performance of the model, we conducted survival analysis of the dichotomized groups, and compared the areas under the curve based on the binary classification. The resulting prognosis genomic model is composed of fifteen pathways (e.g., the P53 pathway) with previously reported cancer relevance, and it successfully differentiated relapse in the training set (log rank p-value = 6.25e-12) and three testing data sets (log rank p-value < 0.0005). Moreover, the pathway-based genomic models consistently performed better than gene-based models on all four data sets. We also find strong evidence that combining genomic information with clinical information improved the p-values of prognosis prediction by at least three orders of magnitude in comparison to using either genomic or clinical information alone. In summary, we propose a novel prognosis model that harnesses pathway-based dysregulation as well as valuable clinical information. The selected pathways in our prognosis model are promising targets for therapeutic intervention.
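    The survival analysis of dichotomized groups mentioned above rests on the Kaplan-Meier estimator. A minimal pure-numpy sketch on hypothetical toy data (not the study's cohort or its Cox/LASSO model):

```python
import numpy as np

def kaplan_meier(times, events):
    """Kaplan-Meier survival curve: at each distinct event time t,
    multiply S by (1 - d/n), with d events at t and n subjects at risk."""
    times = np.asarray(times, dtype=float)
    events = np.asarray(events, dtype=int)
    s, curve = 1.0, []
    for t in np.unique(times[events == 1]):
        n_at_risk = int(np.sum(times >= t))
        d = int(np.sum((times == t) & (events == 1)))
        s *= 1.0 - d / n_at_risk
        curve.append((float(t), s))
    return curve

# Toy follow-up times in months; 1 = relapse observed, 0 = censored.
high_risk = kaplan_meier([2, 3, 4, 5, 6], [1, 1, 0, 1, 0])
print(high_risk)  # S(t) steps down to ~0.8, ~0.6, then ~0.3
```

    Comparing such curves between risk groups (e.g., with a log-rank test) yields the p-values quoted in the abstract.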

  11. Assessment of tumor heterogeneity by CT texture analysis: Can the largest cross-sectional area be used as an alternative to whole tumor analysis?

    International Nuclear Information System (INIS)

    Ng, Francesca; Kozarski, Robert; Ganeshan, Balaji; Goh, Vicky

    2013-01-01

    Objective: To determine if there is a difference between contrast enhanced CT texture features from the largest cross-sectional area versus the whole tumor, and its effect on clinical outcome prediction. Methods: Entropy (E) and uniformity (U) were derived for different filter values (1.0–2.5: fine to coarse textures) for the largest primary tumor cross-sectional area and the whole tumor of the staging contrast enhanced CT in 55 patients with primary colorectal cancer. Parameters were compared using non-parametric Wilcoxon test. Kaplan–Meier analysis was performed to determine the relationship between CT texture and 5-year overall survival. Results: E was higher and U lower for the whole tumor indicating greater heterogeneity at all filter levels (1.0–2.5): median (range) for E and U for whole tumor versus largest cross-sectional area of 7.89 (7.43–8.31) versus 7.62 (6.94–8.08) and 0.005 (0.004–0.01) versus 0.006 (0.005–0.01) for filter 1.0; 7.88 (7.22–8.48) versus 7.54 (6.86–8.1) and 0.005 (0.003–0.01) versus 0.007 (0.004–0.01) for filter 1.5; 7.88 (7.17–8.54) versus 7.48 (5.84–8.25) and 0.005 (0.003–0.01) versus 0.007 (0.004–0.02) for filter 2.0; and 7.83 (7.03–8.57) versus 7.42 (5.19–8.26) and 0.005 (0.003–0.01) versus 0.006 (0.004–0.03) for filter 2.5 respectively (p ≤ 0.001). Kaplan–Meier analysis demonstrated better separation of E and U for whole tumor analysis for 5-year overall survival. Conclusion: Whole tumor analysis appears more representative of tumor heterogeneity
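    Entropy (E) and uniformity (U) as used in CT texture analysis are first-order statistics of the pixel-intensity histogram. A minimal sketch on synthetic pixel data (not the study's filtered CT images): higher E and lower U indicate greater heterogeneity, matching the whole-tumor results above:

```python
import numpy as np

def texture_entropy_uniformity(region, bins=256):
    """First-order texture features of a pixel region:
    entropy E = -sum p*log2(p), uniformity U = sum p^2,
    where p is the normalized intensity histogram."""
    hist, _ = np.histogram(region, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                 # drop empty bins to avoid log2(0)
    E = -np.sum(p * np.log2(p))
    U = np.sum(p ** 2)
    return E, U

rng = np.random.default_rng(0)
heterogeneous = rng.integers(0, 256, size=(64, 64))  # varied intensities
homogeneous = np.full((64, 64), 128)                 # uniform intensity
E_het, U_het = texture_entropy_uniformity(heterogeneous)
E_hom, U_hom = texture_entropy_uniformity(homogeneous)
print(E_het, U_het)  # high entropy, low uniformity
print(E_hom, U_hom)  # entropy 0, uniformity 1 for a homogeneous region
```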

  12. Geographical information system (GIS) application for flood prediction at Sungai Sembrong

    Science.gov (United States)

    Kamin, Masiri; Ahmad, Nor Farah Atiqah; Razali, Siti Nooraiin Mohd; Hilaham, Mashuda Mohamad; Rahman, Mohamad Abdul; Ngadiman, Norhayati; Sahat, Suhaila

    2017-10-01

    Flooding is one of the natural disasters that often beset Malaysia. The latest incident, in 2007, was the worst flood ever to beset Johor. Flood reporting has mainly focused on rising water levels, with far less attention to delineating the flooded area. This study focused on the effectiveness of using a Geographic Information System (GIS) to predict floods, taking Sg. Sembrong, Batu Pahat, Johor as the study area. It combined a hydrological model and a water balance model in the display to show the expected flood area for future reference. The minimum, maximum and average rainfall data for January 2007 at Sg. Sembrong were used. The data show that no flood occurs at the minimum and average rainfall values of 17.2 mm and 2 mm, respectively. At the maximum rainfall of 203 mm, the flooded area was 9,983 hectares, with a maximum water depth of 2 m. The results showed that the combination of hydrological and water balance models in GIS is well suited as a tool for obtaining preliminary flood information immediately. Moreover, GIS is a powerful tool in hydrological engineering, helping engineers and planners to visualize the real situation of flood events, perform flood analysis, solve problems, and make rational, accurate and efficient decisions.

  13. Online Cancer Information Seeking: Applying and Extending the Comprehensive Model of Information Seeking.

    Science.gov (United States)

    Van Stee, Stephanie K; Yang, Qinghua

    2017-10-30

    This study applied the comprehensive model of information seeking (CMIS) to online cancer information and extended the model by incorporating an exogenous variable: interest in online health information exchange with health providers. A nationally representative sample from the Health Information National Trends Survey 4 Cycle 4 was analyzed to examine the extended CMIS in predicting online cancer information seeking. Findings from a structural equation model supported most of the hypotheses derived from the CMIS, as well as the extension of the model related to interest in online health information exchange. In particular, socioeconomic status, beliefs, and interest in online health information exchange predicted utility. Utility, in turn, predicted online cancer information seeking, as did information-carrier characteristics. An unexpected but important finding from the study was the significant, direct relationship between cancer worry and online cancer information seeking. Theoretical and practical implications are discussed.

  14. Toward an Efficient Prediction of Solar Flares: Which Parameters, and How?

    Directory of Open Access Journals (Sweden)

    Manolis K. Georgoulis

    2013-11-01

    Solar flare prediction has become a forefront topic in contemporary solar physics, with numerous published methods relying on numerous predictive parameters that can even be divided into parameter classes. Attempting further insight, we focus on two popular classes of flare-predictive parameters, namely multiscale (i.e., fractal and multifractal) and proxy (i.e., morphological) parameters, and we complement our analysis with a study of the predictive capability of fundamental physical parameters (i.e., magnetic free energy and relative magnetic helicity). Rather than applying the studied parameters to a comprehensive statistical sample of flaring and non-flaring active regions, which was the subject of our previous studies, the novelty of this work is their application to an exceptionally long and high-cadence time series of the intensely eruptive National Oceanic and Atmospheric Administration (NOAA) active region (AR) 11158, observed by the Helioseismic and Magnetic Imager on board the Solar Dynamics Observatory. Aiming for a detailed study of the temporal evolution of each parameter, we seek distinctive patterns that could be associated with the four largest flares in the AR in the course of its five-day observing interval. We find that only proxy parameters tend to show preflare impulses that are practical enough to warrant subsequent investigation with sufficient statistics. Combining these findings with previous results, we conclude that: (i) carefully constructed, physically intuitive proxy parameters may be our best asset toward efficient future flare forecasting; and (ii) the time series of promising parameters may be as important as their instantaneous values. Value-based prediction is the only approach followed so far. Our results call for novel signal and/or image processing techniques to efficiently utilize combined amplitude and temporal-profile information to optimize the inferred solar-flare probabilities.
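    The multiscale (fractal) parameter class above rests on estimates such as the box-counting dimension. A generic sketch for a binary image (an illustration only, not the authors' magnetogram pipeline):

```python
import numpy as np

def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16, 32)):
    """Estimate fractal dimension D from N(s) ~ s^(-D), where N(s)
    counts boxes of side s containing at least one 'on' pixel."""
    n = mask.shape[0]
    counts = []
    for s in sizes:
        c = 0
        for i in range(0, n, s):
            for j in range(0, n, s):
                if mask[i:i + s, j:j + s].any():
                    c += 1
        counts.append(c)
    # D is minus the slope of log N(s) versus log s
    D = -np.polyfit(np.log(sizes), np.log(counts), 1)[0]
    return D

filled = np.ones((64, 64), dtype=bool)   # a filled plane region
D = box_counting_dimension(filled)
print(D)  # ~2.0, as expected for a space-filling 2-D set
```

    Fractal active-region boundaries give non-integer dimensions between 1 and 2, which is what makes the parameter a candidate flare predictor.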

  15. Predicting adolescents' disclosure of personal information in exchange for commercial incentives: an application of an extended theory of planned behavior.

    Science.gov (United States)

    Heirman, Wannes; Walrave, Michel; Ponnet, Koen

    2013-02-01

    This study adopts a global theoretical framework to predict adolescents' disclosure of personal information in exchange for incentives offered by commercial Websites. The study postulates and tests the validity of a model based on the theory of planned behavior (TPB), including antecedent factors of attitude and perceived behavioral control (PBC). A survey was conducted among 1,042 respondents. Results from SEM analyses show that the hypothesized model fits the empirical data well. The model accounts for 61.9 percent of the variance in adolescents' intention to disclose and 43.7 percent of the variance in self-reported disclosure. Perceived social pressure exerted by significant others (subjective norm) is the most important TPB factor in predicting intention to disclose personal information in exchange for incentives. This finding suggests that in discussions of adolescents' information privacy, the importance of social factors outweighs the individually oriented TPB factors of attitude and PBC. Moreover, privacy concern and trust propensity are significant predictors of respondents' attitudes toward online disclosure in exchange for commercial incentives, whereas the frequency of Internet use significantly affects their level of PBC.

  16. Beyond clay: Towards an improved set of variables for predicting soil organic matter content

    Science.gov (United States)

    Rasmussen, Craig; Heckman, Katherine; Wieder, William R.; Keiluweit, Marco; Lawrence, Corey R.; Berhe, Asmeret Asefaw; Blankinship, Joseph C.; Crow, Susan E.; Druhan, Jennifer; Hicks Pries, Caitlin E.; Marin-Spiotta, Erika; Plante, Alain F.; Schadel, Christina; Schmiel, Joshua P.; Sierra, Carlos A.; Thompson, Aaron; Wagai, Rota

    2018-01-01

    Improved quantification of the factors controlling soil organic matter (SOM) stabilization at continental to global scales is needed to inform projections of the largest actively cycling terrestrial carbon pool on Earth, and its response to environmental change. Biogeochemical models rely almost exclusively on clay content to modify rates of SOM turnover and fluxes of climate-active CO2 to the atmosphere. Emerging conceptual understanding, however, suggests other soil physicochemical properties may predict SOM stabilization better than clay content. We addressed this discrepancy by synthesizing data from over 5,500 soil profiles spanning continental scale environmental gradients. Here, we demonstrate that other physicochemical parameters are much stronger predictors of SOM content, with clay content having relatively little explanatory power. We show that exchangeable calcium strongly predicted SOM content in water-limited, alkaline soils, whereas with increasing moisture availability and acidity, iron- and aluminum-oxyhydroxides emerged as better predictors, demonstrating that the relative importance of SOM stabilization mechanisms scales with climate and acidity. These results highlight the urgent need to modify biogeochemical models to better reflect the role of soil physicochemical properties in SOM cycling.

  17. Dissolved Oxygen Dynamics in Backwaters of North America's Largest River Swamp

    Science.gov (United States)

    Bueche, S. M.; Xu, Y. J.; Reiman, J. H.

    2017-12-01

    The Atchafalaya River (AR) is the largest distributary of the Mississippi River, flowing through south-central Louisiana and creating North America's largest river swamp basin - the Atchafalaya River Basin (ARB). Prior to human settlement, the AR's main channel was highly connected to this large wetland ecosystem. However, due to constructed levee systems and other human modifications, much of the ARB is now hydrologically disconnected from the AR's main channel except during high flow events. This lack of regular inputs of fresh, oxygenated water to these wetlands, paired with high levels of organic matter decomposition in wetlands, has caused oxygen-deprived (hypoxic) conditions in the ARB's backwaters. In addition, due to the nutrient-rich and warm nature of the ARB, microbial decomposition in backwater areas with limited flow often results in potentially stressful, if not lethal, levels of dissolved oxygen (DO) for organisms during and after flood pulses. This study aims to investigate the dynamics of dissolved oxygen in backwaters of the Atchafalaya River Basin, intending to answer a crucial question about hydrological and water quality connectivity between the river's mainstem and its floodplain. Specifically, the study will 1) conduct field water quality measurements, 2) collect composite water samples for chemical analysis of nutrients and carbon, 3) investigate DO dynamics over different seasons for one year, and 4) determine the major factors that affect DO dynamics in this unique swamp ecosystem. The study is currently underway; therefore, in this presentation we will share the major findings gained in the past several months and discuss backwater effects on river chemistry.

  18. Ownership structure and economic and socio-environmental disclosure in the largest Brazilian companies

    Directory of Open Access Journals (Sweden)

    Tatiana Aquino Almeida

    2016-01-01

    The disclosure of sustainable practices has become important in the search for competitive advantage, so as to meet the expectations of the various stakeholders. Thus, this study investigates the relationship between ownership structure and economic and environmental voluntary disclosure in the largest Brazilian companies, analyzing ownership concentration and the identity of the controlling shareholder. For the analysis, we considered the economic, social and environmental perspectives, addressed both individually and jointly. The sample consists of 47 companies from the 100 largest public companies listed on BM&FBOVESPA, according to the magazine Exame's Biggest and Best ranking, 2013 edition. The research is descriptive and quantitative, using multiple linear regression for statistical analysis. The descriptive analysis of the disclosure perspectives (economic, social, environmental and sustainability) showed the lowest average disclosure for the environmental aspect. State-controlled organizations stood out with the highest average in three of the four levels of disclosure: economic, social and sustainability. As regards the statistical analysis, the regression models were not statistically significant, indicating that, for the companies in the sample, ownership structure does not influence economic and socio-environmental disclosure.

  19. A two step Bayesian approach for genomic prediction of breeding values.

    Science.gov (United States)

    Shariati, Mohammad M; Sørensen, Peter; Janss, Luc

    2012-05-21

    In genomic models that assign an individual variance to each marker, the contribution of one marker to the posterior distribution of the marker variance is only one degree of freedom (df), which introduces many variance parameters with little information per parameter. A better alternative could be to form clusters of markers with similar effects, where markers in a cluster share a common variance. The influence of each marker group of size p on the posterior distribution of the marker variances will then be p df. The simulated data from the 15th QTL-MAS workshop were analyzed such that SNP markers were ranked based on their effects and markers with similar estimated effects were grouped together. In step 1, all markers with minor allele frequency above 0.01 were included in a SNP-BLUP prediction model. In step 2, markers were ranked based on their estimated variance on the trait in step 1 and every 150 markers were assigned to one group with a common variance. In further analyses, subsets of the 1500 and 450 markers with the largest effects in step 2 were kept in the prediction model. Grouping markers outperformed the SNP-BLUP model in terms of accuracy of predicted breeding values. However, the accuracies of predicted breeding values were lower than with Bayesian methods using marker-specific variances. Grouping markers is less flexible than allowing each marker its own variance but, by grouping, the power to estimate marker variances increases. Prior knowledge of the genetic architecture of the trait is necessary for clustering markers and for appropriate prior parameterization.
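    The grouping step of this two-step approach can be sketched generically: rank markers by estimated variance and assign each consecutive block of 150 to one common-variance group. The variance values below are hypothetical, and this is only the clustering step, not the Bayesian sampler:

```python
import numpy as np

def group_markers(marker_variances, group_size=150):
    """Rank markers by estimated variance contribution and assign each
    consecutive block of `group_size` markers to one variance group
    (group 0 = largest effects)."""
    order = np.argsort(marker_variances)[::-1]   # descending variance
    groups = np.empty(len(order), dtype=int)
    for g, start in enumerate(range(0, len(order), group_size)):
        groups[order[start:start + group_size]] = g
    return groups

rng = np.random.default_rng(1)
v = rng.gamma(1.0, 1.0, size=450)        # toy per-marker variances
g = group_markers(v, group_size=150)
print(g.min(), g.max())  # three groups: 0, 1, 2
```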

  20. OBJECT TRACKING WITH ROTATION-INVARIANT LARGEST DIFFERENCE INDEXED LOCAL TERNARY PATTERN

    Directory of Open Access Journals (Sweden)

    J Shajeena

    2017-02-01

    This paper presents a method for object tracking directly in the compressed domain of video sequences. An enhanced rotation-invariant image operator called the Largest Difference Indexed Local Ternary Pattern (LDILTP) is proposed. The Local Ternary Pattern, which has worked very well in texture classification and face recognition, is here extended to rotation-invariant object tracking. Histogramming the LTP code makes the descriptor resistant to translation. Histogram intersection is used as the similarity measure. The method is robust to noise and retains contrast details. The proposed scheme has been verified on various datasets and shows commendable performance.
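    The histogram-intersection similarity used here has a one-line definition: the sum of bin-wise minima of the two normalized histograms. A minimal sketch on toy histograms (not actual LDILTP codes):

```python
import numpy as np

def histogram_intersection(h1, h2):
    """Similarity of two normalized histograms: sum of bin-wise minima.
    1.0 means identical distributions, 0.0 means disjoint support."""
    h1 = np.asarray(h1, dtype=float)
    h2 = np.asarray(h2, dtype=float)
    h1 = h1 / h1.sum()
    h2 = h2 / h2.sum()
    return float(np.minimum(h1, h2).sum())

a = [4, 0, 6, 2]   # toy descriptor histogram of the tracked template
b = [2, 2, 6, 2]   # toy histogram of a candidate window
print(histogram_intersection(a, a))  # ~1.0 for identical histograms
print(histogram_intersection(a, b))  # ~0.833 for a close match
```

    Normalizing both histograms first is what makes the measure insensitive to window size.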

  1. Information from Relationship Lending : Evidence from China

    NARCIS (Netherlands)

    Chang, C.; Liao, G.; Yu, X.; Ni, Z.

    2009-01-01

    We study the economic role of banks’ soft information, which evolved from repeated lending relationships, in the context of loan default. Using a proprietary database from one of the largest state-owned commercial banks in China, we find that the bank’s internal credit rating scores play a

  2. Multi-level learning: improving the prediction of protein, domain and residue interactions by allowing information flow between levels

    Directory of Open Access Journals (Sweden)

    McDermott Drew

    2009-08-01

    Abstract Background Proteins interact through specific binding interfaces that contain many residues in domains. Protein interactions thus occur on three different levels of a concept hierarchy: whole-proteins, domains, and residues. Each level offers a distinct and complementary set of features for computationally predicting interactions, including functional genomic features of whole proteins, evolutionary features of domain families and physical-chemical features of individual residues. The predictions at each level could benefit from using the features at all three levels. However, this is not trivial, as the features are provided at different granularity. Results To link up the predictions at the three levels, we propose a multi-level machine-learning framework that allows for explicit information flow between the levels. We demonstrate, using representative yeast interaction networks, that our algorithm is able to utilize complementary feature sets to make more accurate predictions at the three levels than when the three problems are approached independently. To facilitate application of our multi-level learning framework, we discuss three key aspects of multi-level learning and the corresponding design choices that we have made in the implementation of a concrete learning algorithm. (1) Architecture of information flow: we show the greater flexibility of bidirectional flow over independent levels and unidirectional flow; (2) Coupling mechanism of the different levels: we show how this can be accomplished by augmenting the training sets at each level, and discuss the prevention of error propagation between levels by means of soft coupling; (3) Sparseness of data: we show that the multi-level framework compounds data-sparsity issues, and discuss how this can be dealt with by building local models in information-rich parts of the data. Our proof-of-concept learning algorithm demonstrates the advantage of combining levels, and opens up

  3. Long-Range Reduced Predictive Information Transfers of Autistic Youths in EEG Sensor-Space During Face Processing.

    Science.gov (United States)

    Khadem, Ali; Hossein-Zadeh, Gholam-Ali; Khorrami, Anahita

    2016-03-01

    The majority of previous functional/effective connectivity studies conducted on autistic patients converged on the underconnectivity theory of ASD: "long-range underconnectivity and sometimes short-range overconnectivity". However, to the best of our knowledge the total (linear and nonlinear) predictive information transfers (PITs) of autistic patients have not been investigated yet. Also, EEG data have rarely been used for exploring the information processing deficits in autistic subjects. This study is aimed at comparing the total (linear and nonlinear) PITs of autistic and typically developing healthy youths during human face processing by using EEG data. The ERPs of 12 autistic youths and 19 age-matched healthy control (HC) subjects were recorded while they were watching upright and inverted human face images. The PITs among EEG channels were quantified using two measures separately: transfer entropy with self-prediction optimality (TESPO), and modified transfer entropy with self-prediction optimality (MTESPO). Afterwards, directed differential connectivity graphs (dDCGs) were constructed to characterize the significant changes in the estimated PITs of autistic subjects compared with HC ones. Using both TESPO and MTESPO, a long-range reduction of PITs in the ASD group during face processing was revealed (particularly from frontal channels to right temporal channels). Also, the orientation of face images (upright or upside down) did not appear to modulate the binary pattern of PIT-based dDCGs significantly. Moreover, compared with TESPO, the results of MTESPO were more compatible with the underconnectivity theory of ASD in the sense that MTESPO showed no long-range increase in PIT. It is also noteworthy that, to the best of our knowledge, this is the first time a version of MTE has been applied to patients (here, ASD) and its first use for EEG data analysis.
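    Transfer entropy, the quantity underlying TESPO/MTESPO above, can be illustrated on discrete synthetic series. This is a minimal history-length-1 plug-in estimator; the EEG pipeline and the self-prediction-optimality step are not reproduced:

```python
import numpy as np
from collections import Counter

def transfer_entropy(x, y):
    """Discrete transfer entropy TE(X->Y) in bits, history length 1:
    TE = sum p(y1, y0, x0) * log2[ p(y1|y0, x0) / p(y1|y0) ]."""
    triples = list(zip(y[1:], y[:-1], x[:-1]))   # (y_{t+1}, y_t, x_t)
    n = len(triples)
    c_yyx = Counter(triples)
    c_yx = Counter((y0, x0) for _, y0, x0 in triples)
    c_yy = Counter((y1, y0) for y1, y0, _ in triples)
    c_y = Counter(y0 for _, y0, _ in triples)
    te = 0.0
    for (y1, y0, x0), c in c_yyx.items():
        p = c / n
        p_full = c / c_yx[(y0, x0)]              # p(y1 | y0, x0)
        p_hist = c_yy[(y1, y0)] / c_y[y0]        # p(y1 | y0)
        te += p * np.log2(p_full / p_hist)
    return te

rng = np.random.default_rng(0)
x = rng.integers(0, 2, 10_000)   # binary driver signal
y = np.roll(x, 1)                # y copies x with a one-step lag
y[0] = 0
te_xy = transfer_entropy(x, y)   # information flow X -> Y
te_yx = transfer_entropy(y, x)   # reverse direction
print(te_xy, te_yx)  # ~1 bit versus ~0 bits
```

    The asymmetry (large X to Y, near-zero Y to X) is what makes transfer entropy a directed connectivity measure.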

  4. Ambient air quality predictions in the Athabasca oil sands region

    International Nuclear Information System (INIS)

    1996-01-01

    This report presents dispersion modelling predictions for SO2, NOx, CO, HC and particulate matter (PM), which complement regional monitoring observations. The air quality simulation models provide a scientific means of relating industrial emissions to changes in ambient air quality. The four models applied to the emission sources in the region were: (1) SCREEN3, (2) ISC3BE, (3) ADEPT2, and (4) the box model. Model predictions were compared to air quality guidelines. It was concluded that the largest SO2 concentrations were associated with intermittent flaring, and with the Suncor Powerhouse, whose emissions are continuous. 45 refs., 36 tabs., 40 figs

  5. Differential genome-wide gene expression profiling of bovine largest and second-largest follicles: identification of genes associated with growth of dominant follicles

    Directory of Open Access Journals (Sweden)

    Takahashi Toru

    2010-02-01

    Full Text Available Abstract Background Bovine follicular development is regulated by numerous molecular mechanisms and biological pathways. In this study, we tried to identify differentially expressed genes between the largest (F1) and second-largest (F2) follicles, and classify them by global gene expression profiling using a combination of microarray and quantitative real-time PCR (QPCR) analysis. The follicular status of F1 and F2 was further evaluated in terms of healthy and atretic conditions by investigating the mRNA localization of identified genes. Methods Global gene expression profiles of F1 (10.7 +/- 0.7 mm) and F2 (7.8 +/- 0.2 mm) were analyzed by hierarchical cluster analysis, and expression profiles of 16 representative genes were confirmed by QPCR analysis. In addition, localization of six identified transcripts was investigated in healthy and atretic follicles using in situ hybridization. The healthy or atretic condition of examined follicles was classified by progesterone and estradiol concentrations in follicular fluid. Results Hierarchical cluster analysis of microarray data classified the follicles into two clusters. Cluster A was composed of only F2 and was characterized by high expression of 31 genes including IGFBP5, whereas cluster B contained only F1 and predominantly expressed 45 genes including CYP19 and FSHR. QPCR analysis confirmed that AMH, CYP19, FSHR, GPX3, PlGF, PLA2G1B, SCD and TRB2 were greater in F1 than in F2, while CCL2, GADD45A, IGFBP5, PLAUR, SELP, SPP1, TIMP1 and TSP2 were greater in F2 than in F1. In situ hybridization showed that AMH and CYP19 were detected in granulosa cells (GC) of healthy as well as atretic follicles. PlGF was localized in GC and in the theca layer (TL) of healthy follicles. IGFBP5 was detected in both GC and TL of atretic follicles. GADD45A and TSP2 were localized in both GC and TL of atretic follicles, whereas healthy follicles expressed them only in GC. 
Conclusion We demonstrated that global gene expression profiling of F

  6. Dallas reloaded. Pt. 1. Resurrection of the U.S.A. to the largest energy power. Fracking ensures a new gold rush mood; Dallas reloaded. T. 1. Auferstehung der USA zur groessten Energiemacht. Fracking sorgt fuer neue Goldrauschstimmung

    Energy Technology Data Exchange (ETDEWEB)

    Boettger, Gunnar

    2013-05-15

    The contribution under consideration reports on the risks and impacts of fracking on the power supply. The International Energy Agency (Paris, France) predicts that the U.S.A. will be the world's largest producer of petroleum and natural gas within five years. The international energy landscape will change dramatically over the next twenty years, which may involve policy changes. The U.S.A. could then possibly become energy-independent.

  7. Solar energy potential of the largest buildings in the United States

    Science.gov (United States)

    Wence, E. R.; Grodsky, S.; Hernandez, R. R.

    2017-12-01

    Sustainable pathways of land use for energy are necessary to mitigate climate change and limit conversion of finite land resources needed for conservation and food production. Large, commercial buildings (LCBs) are increasing in size and number throughout the United States (US) and may serve as suitable recipient environments for photovoltaic (PV) solar energy infrastructure that may support a low carbon, low land footprint energy transition. In this study, we identified, characterized, and evaluated the technical potential of the largest commercial building rooftops (i.e., exceeding 110,000 m2) and their associated parking lots in the US for PV solar energy systems using Aurora, a cloud-based solar optimization platform. We also performed a case study of the building-specific electricity generation-to-consumption balance. Further, we quantified the environmental co-benefit of land sparing and associated avoided emissions (t-CO2-eq) conferred under the counterfactual scenario that solar development would otherwise proceed as a ground-mounted, utility-scale PV installation of equal nominal capacity. We identified and mapped 37 LCBs (by rooftop area) across 18 states in the US, spanning from as far north as the state of Minnesota to as far south as Florida. Rooftop footprints range from 113,689 to 427,297 m2 and have a cumulative surface area of 99.8 million ft2. We characterize the LCBs as distribution/warehouses, factories, shopping centers, or administrative offices/facilities. Three of the 37 LCBs currently support rooftop PV, and individual LCBs have up to 38 associated, detached buildings. This study elucidates the extent to which LCBs and their respective parking lots can serve as suitable sites for PV solar energy generation. 
Lastly, this study demonstrates research-based applications of the Aurora energy modeling platform and informs decision-making focused on redirecting energy development towards human-modified landscapes to prioritize land use for
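
    As a back-of-envelope check of how such a rooftop translates into nominal PV capacity, one can multiply area by a packing factor and module efficiency. The packing factor and efficiency below are generic assumptions, not values reported by the study.

    ```python
    # Illustrative nominal-capacity estimate for the largest rooftop above.
    # Packing factor and module efficiency are assumptions, not study values.
    roof_area_m2 = 427_297        # largest rooftop footprint reported
    packing_factor = 0.7          # usable fraction of the roof (assumption)
    module_efficiency = 0.18      # PV module efficiency (assumption)
    stc_irradiance_w_m2 = 1000.0  # standard-test-condition irradiance

    capacity_mw = (roof_area_m2 * packing_factor * module_efficiency
                   * stc_irradiance_w_m2) / 1e6
    print(round(capacity_mw, 1))  # nominal DC capacity in MW
    ```

    Under these assumptions a single rooftop of this size supports a system on the order of tens of megawatts, which is why such buildings are attractive recipient environments.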

  8. Tide Predictions, California, 2014, NOAA

    Data.gov (United States)

    U.S. Environmental Protection Agency — The predictions from the web based NOAA Tide Predictions are based upon the latest information available as of the date of the user's request. Tide predictions...

  9. Predictive Manufacturing: A Classification Strategy to Predict Product Failures

    DEFF Research Database (Denmark)

    Khan, Abdul Rauf; Schiøler, Henrik; Kulahci, Murat

    2018-01-01

    manufacturing analytics model that employs a big data approach to predicting product failures; third, we illustrate the issue of high dimensionality, along with statistically redundant information; and, finally, our proposed method will be compared against the well-known classification methods (SVM, K-nearest neighbor, artificial neural networks). The results from real data show that our predictive manufacturing analytics approach, using genetic algorithms and Voronoi tessellations, is capable of predicting product failure with reasonable accuracy. The potential application of this method contributes to accurately predicting product failures, which would enable manufacturers to reduce production costs without compromising product quality.

  10. New welding information system on the internet (Prediction of the properties of weld heat-affected zones

    Directory of Open Access Journals (Sweden)

    M Fujita

    2003-08-01

    Full Text Available To promote the continuous transfer and development of welding technology, a new system for predicting the microstructures and mechanical properties of welded joints has been built on the Internet. It combines a database system containing continuous cooling transformation (CCT) diagrams for welding and an expert system for computing weld thermal histories. In addition, this system employs a technique which was invented during the development of another distributed database system called "Data-Free-Way", which was designed to contain information on advanced nuclear materials and on materials obtained from other welding research programs at NIMS in the past. This paper describes the current state of our new system for computing weld thermal histories to predict the properties of welded joints using the CCT diagram database, which is now available on the Internet. Some problems encountered with the database used in such a system are also discussed.

  11. The 10 largest public and philanthropic funders of health research in the world: what they fund and how they distribute their funds.

    Science.gov (United States)

    Viergever, Roderik F; Hendriks, Thom C C

    2016-02-18

    Little is known about who the main public and philanthropic funders of health research are globally, what they fund and how they decide what gets funded. This study aims to identify the 10 largest public and philanthropic health research funding organizations in the world, to report on what they fund, and on how they distribute their funds. The world's key health research funding organizations were identified through a search strategy aimed at identifying different types of funding organizations. Organizations were ranked by their reported total annual health research expenditures. For the 10 largest funding organizations, data were collected on (1) funding amounts allocated towards 20 health areas, and (2) schemes employed for distributing funding (intramural/extramural, project/'people'/organizational and targeted/untargeted funding). Data collection consisted of a review of reports and websites and interviews with representatives of funding organizations. Data collection was challenging; data were often not reported or reported using different classification systems. Overall, 55 key health research funding organizations were identified. The 10 largest funding organizations together funded research for $37.1 billion, constituting 40% of all public and philanthropic health research spending globally. The largest funder was the United States National Institutes of Health ($26.1 billion), followed by the European Commission ($3.7 billion), and the United Kingdom Medical Research Council ($1.3 billion). The largest philanthropic funder was the Wellcome Trust ($909.1 million), the largest funder of health research through official development assistance was USAID ($186.4 million), and the largest multilateral funder was the World Health Organization ($135.0 million). Funding distribution mechanisms and funding patterns varied substantially between the 10 largest funders. There is a need for increased transparency about who the main funders of health research are

  12. Phase space reconstruction and estimation of the largest Lyapunov exponent for gait kinematic data

    Energy Technology Data Exchange (ETDEWEB)

    Josiński, Henryk [Silesian University of Technology, Akademicka 16, 44-100 Gliwice (Poland); Świtoński, Adam [Polish-Japanese Institute of Information Technology, Aleja Legionów 2, 41-902 Bytom (Poland); Silesian University of Technology, Akademicka 16, 44-100 Gliwice (Poland); Michalczuk, Agnieszka; Wojciechowski, Konrad [Polish-Japanese Institute of Information Technology, Aleja Legionów 2, 41-902 Bytom (Poland)

    2015-03-10

    The authors describe an example of application of nonlinear time series analysis directed at identifying the presence of deterministic chaos in human motion data by means of the largest Lyapunov exponent. The method was previously verified on the basis of a time series constructed from the numerical solutions of both the Lorenz and the Rössler nonlinear dynamical systems.
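
    For intuition, the largest Lyapunov exponent of a known chaotic map can be computed directly as the average log-derivative along an orbit. The toy logistic-map example below is a stand-in for, not a reproduction of, the Wolf/phase-space-reconstruction procedure used on measured gait data; for r = 4 the exponent is known to be ln 2.

    ```python
    from math import log

    def logistic_lyapunov(r=4.0, x0=0.3, n=100_000, discard=1_000):
        """Average ln|f'(x)| along the orbit of the logistic map f(x) = r*x*(1-x)."""
        x, total, count = x0, 0.0, 0
        for i in range(n):
            x = r * x * (1.0 - x)
            d = abs(r * (1.0 - 2.0 * x))
            if i >= discard and d > 0.0:  # guard against a degenerate step at x = 0.5
                total += log(d)
                count += 1
        return total / count

    lam = logistic_lyapunov()
    print(lam)  # theory: ln 2 ~ 0.6931 for r = 4
    ```

    A positive value signals exponential divergence of nearby trajectories, which is exactly the sensitivity to initial conditions that the gait study tests for.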

  13. Carbon and energy fluxes from China's largest freshwater lake

    Science.gov (United States)

    Gan, G.; LIU, Y.

    2017-12-01

    Carbon and energy fluxes between lakes and the atmosphere are important aspects of hydrology, limnology, and ecology studies. China's largest freshwater lake, the Poyang lake experiences tremendous water-land transitions periodically throughout the year, which provides natural experimental settings for the study of carbon and energy fluxes. In this study, we use the eddy covariance technique to explore the seasonal and diurnal variation patterns of sensible and latent heat fluxes of Poyang lake during its high-water and low-water periods, when the lake is covered by water and mudflat, respectively. We also determine the annual NEE of Poyang lake and the variations of NEE's components: Gross Primary Productivity (GPP) and Ecosystem Respiration (Re). Controlling factors of seasonal and diurnal variations of carbon and energy fluxes are analyzed, and land cover impacts on the variation patterns are also studied. Finally, the coupling between the carbon and energy fluxes are analyzed under different atmospheric, boundary stability and land cover conditions.

  14. Weather and Prey Predict Mammals' Visitation to Water.

    Directory of Open Access Journals (Sweden)

    Grant Harris

    Full Text Available Throughout many arid lands of Africa, Australia and the United States, wildlife agencies provide water year-round for increasing game populations and enhancing biodiversity, despite concerns that water provisioning may favor species more dependent on water, increase predation, and reduce biodiversity. In part, understanding the effects of water provisioning requires identifying why and when animals visit water. Employing this information, by matching water provisioning with use by target species, could assist wildlife management objectives while mitigating unintended consequences of year-round watering regimes. Therefore, we examined if weather variables (maximum temperature, relative humidity [RH], vapor pressure deficit [VPD], long and short-term precipitation) and predator-prey relationships (i.e., prey presence) predicted water visitation by 9 mammals. We modeled visitation as recorded by trail cameras at Sevilleta National Wildlife Refuge, New Mexico, USA (June 2009 to September 2014) using generalized linear modeling. For 3 native ungulates, elk (Cervus Canadensis), mule deer (Odocoileus hemionus), and pronghorn (Antilocapra americana), less long-term precipitation and higher maximum temperatures increased visitation, including RH for mule deer. Less long-term precipitation and higher VPD increased oryx (Oryx gazella) and desert cottontail rabbit (Sylvilagus audubonii) visitation. Long-term precipitation, with RH or VPD, predicted visitation for black-tailed jackrabbits (Lepus californicus). Standardized model coefficients demonstrated that the amount of long-term precipitation influenced herbivore visitation most. Weather (especially maximum temperature) and prey (cottontails and jackrabbits) predicted bobcat (Lynx rufus) visitation. Mule deer visitation had the largest influence on coyote (Canis latrans) visitation. Puma (Puma concolor) visitation was solely predicted by prey visitation (elk, mule deer, oryx). Most ungulate visitation peaked during

  15. Weather and Prey Predict Mammals’ Visitation to Water

    Science.gov (United States)

    Harris, Grant; Sanderson, James G.; Erz, Jon; Lehnen, Sarah E.; Butler, Matthew J.

    2015-01-01

    Throughout many arid lands of Africa, Australia and the United States, wildlife agencies provide water year-round for increasing game populations and enhancing biodiversity, despite concerns that water provisioning may favor species more dependent on water, increase predation, and reduce biodiversity. In part, understanding the effects of water provisioning requires identifying why and when animals visit water. Employing this information, by matching water provisioning with use by target species, could assist wildlife management objectives while mitigating unintended consequences of year-round watering regimes. Therefore, we examined if weather variables (maximum temperature, relative humidity [RH], vapor pressure deficit [VPD], long and short-term precipitation) and predator-prey relationships (i.e., prey presence) predicted water visitation by 9 mammals. We modeled visitation as recorded by trail cameras at Sevilleta National Wildlife Refuge, New Mexico, USA (June 2009 to September 2014) using generalized linear modeling. For 3 native ungulates, elk (Cervus Canadensis), mule deer (Odocoileus hemionus), and pronghorn (Antilocapra americana), less long-term precipitation and higher maximum temperatures increased visitation, including RH for mule deer. Less long-term precipitation and higher VPD increased oryx (Oryx gazella) and desert cottontail rabbits (Sylvilagus audubonii) visitation. Long-term precipitation, with RH or VPD, predicted visitation for black-tailed jackrabbits (Lepus californicus). Standardized model coefficients demonstrated that the amount of long-term precipitation influenced herbivore visitation most. Weather (especially maximum temperature) and prey (cottontails and jackrabbits) predicted bobcat (Lynx rufus) visitation. Mule deer visitation had the largest influence on coyote (Canis latrans) visitation. Puma (Puma concolor) visitation was solely predicted by prey visitation (elk, mule deer, oryx). 
Most ungulate visitation peaked during May and

  16. The Algorithm of Link Prediction on Social Network

    Directory of Open Access Journals (Sweden)

    Liyan Dong

    2013-01-01

    Full Text Available At present, most link prediction algorithms are based on the similarity between two entities. Social network topology information is one of the main sources used to design the similarity function between entities, but existing link prediction algorithms do not exploit network topology information sufficiently. To address the shortcomings of traditional link prediction algorithms, we propose two improved algorithms: the CNGF algorithm, based on local information, and the KatzGF algorithm, based on global network information. Because social networks are not stationary, we also provide a link prediction algorithm based on nodes' multiple attribute information. Finally, we verified these algorithms on the DBLP data set, and the experimental results show that the performance of the improved algorithms is superior to that of traditional link prediction algorithms.
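
    The abstract does not give CNGF's exact formula; as a minimal sketch of a local-information similarity score in the same family, here is an Adamic-Adar-style ranking of missing links on a toy graph. The graph and the degree weighting are illustrative, not the paper's method.

    ```python
    from math import log

    # Toy undirected network, adjacency as sets. Edges: a-b, a-c, a-d, a-e, b-c, c-d.
    graph = {
        "a": {"b", "c", "d", "e"},
        "b": {"a", "c"},
        "c": {"a", "b", "d"},
        "d": {"a", "c"},
        "e": {"a"},
    }

    def similarity(u, v):
        """Sum over common neighbours w of 1/log(deg(w)) (Adamic-Adar style)."""
        return sum(1.0 / log(len(graph[w])) for w in graph[u] & graph[v])

    # Score every non-adjacent pair; the top-scoring pair is the predicted link.
    nodes = sorted(graph)
    candidates = sorted(
        ((similarity(u, v), u, v)
         for i, u in enumerate(nodes) for v in nodes[i + 1:]
         if v not in graph[u]),
        reverse=True,
    )
    print(candidates[0])  # b-d ranks first: two well-connected common neighbours
    ```

    Scores like this only rank candidate node pairs; an evaluation on a real data set (as with DBLP in the paper) would hide a fraction of edges and check how many reappear at the top of the ranking.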

  17. Triangle network motifs predict complexes by complementing high-error interactomes with structural information

    Directory of Open Access Journals (Sweden)

    Labudde Dirk

    2009-06-01

    Full Text Available Abstract Background A lot of high-throughput studies produce protein-protein interaction networks (PPINs with many errors and missing information. Even for genome-wide approaches, there is often a low overlap between PPINs produced by different studies. Second-level neighbors separated by two protein-protein interactions (PPIs were previously used for predicting protein function and finding complexes in high-error PPINs. We retrieve second level neighbors in PPINs, and complement these with structural domain-domain interactions (SDDIs representing binding evidence on proteins, forming PPI-SDDI-PPI triangles. Results We find low overlap between PPINs, SDDIs and known complexes, all well below 10%. We evaluate the overlap of PPI-SDDI-PPI triangles with known complexes from Munich Information center for Protein Sequences (MIPS. PPI-SDDI-PPI triangles have ~20 times higher overlap with MIPS complexes than using second-level neighbors in PPINs without SDDIs. The biological interpretation for triangles is that a SDDI causes two proteins to be observed with common interaction partners in high-throughput experiments. The relatively few SDDIs overlapping with PPINs are part of highly connected SDDI components, and are more likely to be detected in experimental studies. We demonstrate the utility of PPI-SDDI-PPI triangles by reconstructing myosin-actin processes in the nucleus, cytoplasm, and cytoskeleton, which were not obvious in the original PPIN. Using other complementary datatypes in place of SDDIs to form triangles, such as PubMed co-occurrences or threading information, results in a similar ability to find protein complexes. Conclusion Given high-error PPINs with missing information, triangles of mixed datatypes are a promising direction for finding protein complexes. Integrating PPINs with SDDIs improves finding complexes. Structural SDDIs partially explain the high functional similarity of second-level neighbors in PPINs. We estimate that

  18. The Prediction Value

    NARCIS (Netherlands)

    Koster, M.; Kurz, S.; Lindner, I.; Napel, S.

    2013-01-01

    We introduce the prediction value (PV) as a measure of players’ informational importance in probabilistic TU games. The latter combine a standard TU game and a probability distribution over the set of coalitions. Player i’s prediction value equals the difference between the conditional expectations

  19. MULTIVARIATE MODEL FOR CORPORATE BANKRUPTCY PREDICTION IN ROMANIA

    Directory of Open Access Journals (Sweden)

    Daniel BRÎNDESCU – OLARIU

    2016-06-01

    Full Text Available The current paper proposes a methodology for bankruptcy prediction applicable for Romanian companies. Low bankruptcy frequencies registered in the past have limited the importance of bankruptcy prediction in Romania. The changes in the economic environment brought by the economic crisis, as well as by the entrance in the European Union, make the availability of performing bankruptcy assessment tools more important than ever before. The proposed methodology is centred on a multivariate model, developed through discriminant analysis. Financial ratios are employed as explanatory variables within the model. The study has included 53,252 yearly financial statements from the period 2007 – 2010, with the state of the companies being monitored until the end of 2012. It thus employs the largest sample ever used in Romanian research in the field of bankruptcy prediction, not targeting high levels of accuracy over isolated samples, but reliability and ease of use over the entire population.
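
    The study's actual ratios and coefficients are not reproduced in this abstract; the mechanics of a two-group linear discriminant model can be sketched on synthetic data. The two features below are invented stand-ins for financial ratios, and the class means are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic "ratios" for solvent vs bankrupt firms (illustrative values).
    healthy = rng.normal([0.6, 0.2], 0.1, size=(200, 2))
    bankrupt = rng.normal([0.3, -0.1], 0.1, size=(200, 2))

    X = np.vstack([healthy, bankrupt])
    y = np.array([0] * 200 + [1] * 200)  # 0 = healthy, 1 = bankrupt

    # Fisher discriminant: w = Sw^-1 (mu0 - mu1), cut at the projected midpoint.
    mu0, mu1 = healthy.mean(axis=0), bankrupt.mean(axis=0)
    Sw = np.cov(healthy, rowvar=False) + np.cov(bankrupt, rowvar=False)
    w = np.linalg.solve(Sw, mu0 - mu1)
    threshold = w @ (mu0 + mu1) / 2

    pred = (X @ w < threshold).astype(int)  # below the midpoint -> bankrupt
    accuracy = (pred == y).mean()
    print(accuracy)
    ```

    In practice the discriminant would be estimated on one sample of statements and validated on later years, which is the reliability-over-the-population goal the paper emphasizes.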

  20. Prediction of internet addiction based on information literacy among students of Iran University of Medical Sciences.

    Science.gov (United States)

    Langarizadeh, Mostafa; Naghipour, Majid; Tabatabaei, Seyed Mohsen; Mirzaei, Abbas; Vaghar, Mohammad Eslami

    2018-02-01

    A considerable group of internet users consists of university students; however, despite the benefits and capabilities of the internet, internet overuse is a threat to societies, especially to young people and students. The objective of this study was to determine the predictive role of information literacy in internet addiction among students of Iran University of Medical Sciences during 2016. This analytical cross-sectional study was conducted at Iran University of Medical Sciences in 2016. Using a stratified random sampling method, 365 students from different disciplines were selected. Measuring tools included the Information Literacy Questionnaire, the Yang Online Drug Addiction Scale and the General Health Questionnaire. The collected data were analyzed by Pearson product-moment correlation, independent samples t-test and multiple linear regression using SPSS version 22. According to this study, 31.2% of students had internet addiction (29.9% were mildly addicted and 1.3% had severe addiction). There was a significant and inverse relationship between information literacy and internet addiction (R = -0.45), and information literacy explained 20% of the variation in the outcome variable, internet addiction. Students play a substantial role in promoting the cultural and scientific level of knowledge in society; the higher their information literacy, the lower their level of internet addiction, and consequently the better the general health of society. It seems that wise planning by the authorities of Iran's universities is needed to prevent internet addiction and to increase information literacy among students.

  1. BEHAVIOR OF THE TEN LARGEST BRAZILIAN BANKS DURING THE SUBPRIME CRISIS: AN ANALYSIS BASED ON FINANCIAL INDICATORS

    Directory of Open Access Journals (Sweden)

    Rosane Maria Pio da Silva

    2012-06-01

    Full Text Available The aim of this paper is to demonstrate the behavior of the ten largest Brazilian banks between June 2008 and September 2009, based on the analysis of financial indicators. To that end, 16 quarterly indices were calculated from financial statement information, which characterizes documentary research. The indices were separated into five categories: liquidity, capital, profitability, income and market. The results indicated that most financial institutions in the sample were able to manage their resources so as to maintain credit initially. Then, from the first quarter of 2009, driven by public banks, they increased their credit operations. In addition, most banks revealed an anti-cyclical trend to encourage productive activities, preferring activities with higher liquidity levels to the detriment of profitability, which reveals a more conservative attitude. Finally, it was verified that government initiatives, the Brazilian economic balance and the resources the banks offered helped to produce an environment to reactivate business activities during the most acute period of the subprime crisis.

  2. Trend of CO2 emissions of the 30 largest power plants in Germany

    International Nuclear Information System (INIS)

    Hermann, Hauke

    2014-01-01

    The brochure on the trend of CO2 emissions of the 30 largest power plants in Germany includes tables of the emissions of these power plants. The CO2 emissions of these power plants in 2013 (25% of the total German greenhouse gas emissions) increased by 5% compared to 2012, while total CO2 emissions in Germany increased by 1.5%. The differences between brown coal and black coal fired power plants are discussed.

  3. Entropy and the Predictability of Online Life

    Directory of Open Access Journals (Sweden)

    Roberta Sinatra

    2014-01-01

    Full Text Available Using mobile phone records and information theory measures, our daily lives have been recently shown to follow strict statistical regularities, and our movement patterns are, to a large extent, predictable. Here, we apply entropy and predictability measures to two datasets of the behavioral actions and the mobility of a large number of players in the virtual universe of a massive multiplayer online game. We find that movements in virtual human lives follow the same high levels of predictability as offline mobility, where future movements can, to some extent, be predicted well if the temporal correlations of visited places are accounted for. Time series of behavioral actions show similar high levels of predictability, even when temporal correlations are neglected. Entropy conditional on specific behavioral actions reveals that in terms of predictability, negative behavior has a wider variety than positive actions. The actions that contain the information to best predict an individual’s subsequent action are negative, such as attacks or enemy markings, while the positive actions of friendship marking, trade and communication contain the least amount of predictive information. These observations show that predicting behavioral actions requires less information than predicting the mobility patterns of humans for which the additional knowledge of past visited locations is crucial and that the type and sign of a social relation has an essential impact on the ability to determine future behavior.
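
    In this line of work, an entropy estimate is conventionally converted into a maximum predictability via Fano's inequality (as in the offline-mobility literature this record builds on). A minimal numerical solver is sketched below; the entropy rate and alphabet size are illustrative inputs, not values from the paper.

    ```python
    from math import log2

    def fano_predictability(H, N, tol=1e-9):
        """Solve H = H2(p) + (1 - p) * log2(N - 1) for p in [1/N, 1),
        where H2 is the binary entropy; g(p) is decreasing on that interval."""
        def g(p):
            h2 = 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)
            return h2 + (1 - p) * log2(N - 1)
        lo, hi = 1.0 / N, 1.0 - 1e-12
        while hi - lo > tol:
            mid = (lo + hi) / 2
            if g(mid) > H:   # g is decreasing, so the root lies to the right
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    # Example: entropy rate of 0.8 bits over an alphabet of N = 10 actions.
    pi_max = fano_predictability(0.8, 10)
    print(round(pi_max, 3))
    ```

    The resulting bound says that no algorithm, however clever, can predict the next action correctly more often than pi_max of the time given that entropy rate.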

  4. Nuclear Information and Knowledge, News from the Nuclear Information Section, No. 13, September 2012

    International Nuclear Information System (INIS)

    2012-09-01

    This issue of the Nuclear Information and Knowledge Newsletter conveys some general information about the Nuclear Information Section and its activities. Special emphasis was placed on the INIS Collection search application. This search application is based on Google technology and represents the main access point not only to the INIS collection of 3.4 million records, but also to the IAEA Library holdings. Combined together, the INIS and IAEA Library collections comprise one of the world's largest resources of published and unpublished information on the peaceful uses of nuclear science and technology. Articles on the International Nuclear Library Network (INLN) and digital preservation efforts are just some of the many INIS and IAEA Library activities. Recent mobile customization of the INIS website proves our commitment to bringing our products and services closer to scientists, researchers and students.

  5. Filtering and prediction

    CERN Document Server

    Fristedt, B; Krylov, N

    2007-01-01

    Filtering and prediction is about observing moving objects when the observations are corrupted by random errors. The main focus is then on filtering out the errors and extracting from the observations the most precise information about the object, which itself may or may not be moving in a somewhat random fashion. Next comes the prediction step where, using information about the past behavior of the object, one tries to predict its future path. The first three chapters of the book deal with discrete probability spaces, random variables, conditioning, Markov chains, and filtering of discrete Markov chains. The next three chapters deal with the more sophisticated notions of conditioning in nondiscrete situations, filtering of continuous-space Markov chains, and of Wiener process. Filtering and prediction of stationary sequences is discussed in the last two chapters. The authors believe that they have succeeded in presenting necessary ideas in an elementary manner without sacrificing the rigor too much. Such rig...
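
    The simplest concrete instance of the filtering problem described here is estimating a constant signal from noisy observations with a scalar Kalman filter. The model and numbers below are an illustrative sketch, not an example from the book.

    ```python
    import random

    def kalman_constant(observations, meas_var):
        """Scalar Kalman filter for a constant state with no process noise
        (this special case reduces to a recursively computed sample mean)."""
        est, var = observations[0], meas_var  # initialise from the first sample
        for z in observations[1:]:
            k = var / (var + meas_var)        # Kalman gain
            est = est + k * (z - est)         # measurement update
            var = (1 - k) * var               # posterior variance shrinks
        return est

    random.seed(1)
    truth = 5.0
    zs = [truth + random.gauss(0, 0.5) for _ in range(500)]
    est = kalman_constant(zs, meas_var=0.25)
    print(est)  # filtered estimate, close to the true constant 5.0
    ```

    Adding a process-noise term to the variance update turns this into a tracker for a randomly moving object, which is the general setting of the book.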

  6. Predicting Hotspots of Human-Elephant Conflict to Inform Mitigation Strategies in Xishuangbanna, Southwest China.

    Directory of Open Access Journals (Sweden)

    Ying Chen

    Full Text Available Research on the spatial patterns of human-wildlife conflict is fundamental to understanding the mechanisms underlying it and to identifying opportunities for mitigation. In the state of Xishuangbanna, containing China's largest tropical forest, an imbalance between nature conservation and economic development has led to increasing conflicts between humans and Asian elephants (Elephas maximus), as both elephant numbers and conversion of habitable land to rubber plantations have increased over the last several decades. We analyzed government data on the compensation costs of elephant-caused damage in Xishuangbanna between 2008 and 2012 to understand the spatial and temporal patterns of conflict, in terms of their occurrence, frequency and distribution. More than 18,261 incidents were reported, including episodes involving damage to rubber trees (n = 10,999), damage to crops such as paddy, upland rice, corn, bananas and sugarcane (n = 11,020), property loss (n = 689) and attacks on humans (n = 19). The conflict data reconfirmed the presence of elephants in areas which have lacked records since the late 1990s. Zero Altered Negative Binomial models revealed that the risk of damage to crops and plantations increased with proximity to protected areas, increasing distance from roads, and lower settlement density. The patterns were constant across seasons and types of crop damaged. Damage to rubber trees was essentially incidental as elephants searched for crops to eat. A predictive map of risks revealed hotspots of conflict within and around protected areas, the last refuges for elephants in the region, and along habitat corridors connecting them. Additionally, we analyzed how mitigation efforts can best diminish the risk of conflict while minimizing financial costs and adverse biological impacts. Our analytical approach can be adopted, adjusted and expanded to other areas with historical records of human-wildlife conflict.

  7. Predicting Hotspots of Human-Elephant Conflict to Inform Mitigation Strategies in Xishuangbanna, Southwest China.

    Science.gov (United States)

    Chen, Ying; Marino, Jorgelina; Chen, Yong; Tao, Qing; Sullivan, Casey D; Shi, Kun; Macdonald, David W

    2016-01-01

    Research on the spatial patterns of human-wildlife conflict is fundamental to understanding the mechanisms underlying it and to identifying opportunities for mitigation. In the state of Xishuangbanna, containing China's largest tropical forest, an imbalance between nature conservation and economic development has led to increasing conflicts between humans and Asian elephants (Elephas maximus), as both elephant numbers and conversion of habitable land to rubber plantations have increased over the last several decades. We analyzed government data on the compensation costs of elephant-caused damage in Xishuangbanna between 2008 and 2012 to understand the spatial and temporal patterns of conflict, in terms of their occurrence, frequency and distribution. More than 18,261 incidents were reported, including episodes involving damage to rubber trees (n = 10,999), damage to crops such as paddy, upland rice, corn, bananas and sugarcane (n = 11,020), property loss (n = 689) and attacks on humans (n = 19). The conflict data reconfirmed the presence of elephants in areas which have lacked records since the late 1990s. Zero Altered Negative Binomial models revealed that the risk of damage to crops and plantations increased with proximity to protected areas, increasing distance from roads, and lower settlement density. The patterns were constant across seasons and types of crop damaged. Damage to rubber trees was essentially incidental as elephants searched for crops to eat. A predictive map of risks revealed hotspots of conflict within and around protected areas, the last refuges for elephants in the region, and along habitat corridors connecting them. Additionally, we analyzed how mitigation efforts can best diminish the risk of conflict while minimizing financial costs and adverse biological impacts. Our analytical approach can be adopted, adjusted and expanded to other areas with historical records of human-wildlife conflict.
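
    A zero-altered (hurdle) negative binomial model separates the probability of any conflict at a site from the intensity of conflict given that some occurs. The minimal generative sketch below shows that two-part structure; all parameters are illustrative, not fitted values from the Xishuangbanna data.

    ```python
    import math
    import random

    random.seed(7)

    def neg_binomial(r, p):
        """Negative binomial draw via the Poisson-gamma mixture."""
        lam = random.gammavariate(r, (1 - p) / p)
        # Knuth's Poisson sampler
        limit, k, prod = math.exp(-lam), 0, 1.0
        while True:
            prod *= random.random()
            if prod <= limit:
                return k
            k += 1

    def hurdle_sample(p_zero=0.6, r=2.0, p=0.5):
        """Zero with probability p_zero, else a zero-truncated NB count."""
        if random.random() < p_zero:
            return 0
        while True:  # reject zeros from the count part (truncation)
            x = neg_binomial(r, p)
            if x > 0:
                return x

    data = [hurdle_sample() for _ in range(10_000)]
    zero_share = data.count(0) / len(data)
    print(zero_share)  # sits near the structural-zero probability 0.6
    ```

    Fitting such a model regresses both parts on covariates (distance to protected areas, roads, settlement density), which is how the paper separates where conflict happens from how severe it is.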

  8. Towards the prediction of essential genes by integration of network topology, cellular localization and biological process information

    Directory of Open Access Journals (Sweden)

    Lemke Ney

    2009-09-01

    Full Text Available Abstract Background The identification of essential genes is important for understanding the minimal requirements for cellular life and for practical purposes, such as drug design. However, the experimental techniques for essential gene discovery are labor-intensive and time-consuming. Given these experimental constraints, a computational approach capable of accurately predicting essential genes would be of great value. We therefore present a machine learning-based computational approach relying on network topological features, cellular localization and biological process information for prediction of essential genes. Results We constructed a decision tree-based meta-classifier and trained it on datasets with individual and grouped attributes (network topological features, cellular compartments and biological processes) to generate various predictors of essential genes. We showed that the predictors with better performance are those generated from datasets with integrated attributes. Using the predictor with all attributes, i.e., network topological features, cellular compartments and biological processes, we obtained the best predictor of essential genes, which was then used to classify yeast genes with unknown essentiality status. Finally, we generated decision trees by training the J48 algorithm on datasets with all network topological features, cellular localization and biological process information to discover cellular rules for essentiality. We found that the number of protein physical interactions, the nuclear localization of proteins and the number of regulating transcription factors are the most important factors determining gene essentiality. Conclusion We were able to demonstrate that network topological features, cellular localization and biological process information are reliable predictors of essential genes. Moreover, by constructing decision trees based on these data, we could discover cellular rules governing

  9. Quicksort, largest bucket, and min-wise hashing with limited independence

    DEFF Research Database (Denmark)

    Knudsen, Mathias Bæk Tejs; Stöckel, Morten

    2015-01-01

    Randomized algorithms and data structures are often analyzed under the assumption of access to a perfect source of randomness. The most fundamental metric used to measure how “random” a hash function or a random number generator is, is its independence: a sequence of random variables is said...... to be k-independent if every variable is uniform and every size k subset is independent. In this paper we consider three classic algorithms under limited independence. Besides the theoretical interest in removing the unrealistic assumption of full independence, the work is motivated by lower independence...... being more practical. We provide new bounds for randomized quicksort, min-wise hashing and largest bucket size under limited independence. Our results can be summarized as follows. Randomized Quicksort. When pivot elements are computed using a 5-independent hash function, Karloff and Raghavan, J.ACM’93...
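
The pivoting setting analyzed above can be illustrated with a short sketch. This is an illustration only, not the paper's exact construction: each element receives a priority from a 5-independent polynomial hash over a prime field, and each subarray pivots on its minimum-hash element.

```python
import random

def make_k_independent_hash(k, p=(1 << 61) - 1, rng=random):
    """A k-independent family: a uniformly random polynomial of degree < k
    over the field Z_p, evaluated by Horner's rule."""
    coeffs = [rng.randrange(p) for _ in range(k)]
    def h(x):
        acc = 0
        for c in coeffs:
            acc = (acc * x + c) % p
        return acc
    return h

def quicksort_hashed_pivots(items, k=5):
    """Quicksort whose pivot in each subarray is the element with the smallest
    k-independent hash value, replacing the usual assumption of a perfectly
    random pivot choice (a sketch of the limited-independence setting)."""
    h = make_k_independent_hash(k)
    def qs(a):
        if len(a) <= 1:
            return a
        pivot = min(a, key=h)
        left = [x for x in a if x < pivot]
        mid = [x for x in a if x == pivot]
        right = [x for x in a if x > pivot]
        return qs(left) + mid + qs(right)
    return qs(list(items))
```

The output is correct for any hash function; only the expected running time depends on the independence of the family.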

  10. Governance Methods Used in Externalizing Information Technology

    Science.gov (United States)

    Chan, Steven King-Lun

    2012-01-01

    Information technology (IT) is the largest capital expenditure in many firms and is an integral part of many organizations' strategies. However, the benefits that each company receives from its IT investments vary. One study by Weill (2004) found that the top performer in the sample was estimated to have as high as a 40% greater return on its…

  11. Simulated biologic intelligence used to predict length of stay and survival of burns.

    Science.gov (United States)

    Frye, K E; Izenberg, S D; Williams, M D; Luterman, A

    1996-01-01

    From July 13, 1988, to May 14, 1995, 1585 patients with burns and no other injuries besides inhalation were treated; 4.5% did not survive. Artificial neural networks were trained on patient presentation data with known outcomes for 90% of the randomized cases. The remaining cases were then used to test predictions of survival and length of stay on cases not trained on. Survival was predicted with more than 98% accuracy, and length of stay to within a week with 72% accuracy, in these cases. Among anatomic areas involved by burn, burns involving the feet, scalp, or both had the largest negative effect on the survival prediction. In survivors, burns involving the buttocks, transport to this burn center by the military or by helicopter, electrical burns, hot tar burns, and inhalation were associated with an increased length-of-stay prediction. Neural networks can be used to accurately predict the clinical outcome of a burn, and the factors affecting that prediction can be investigated.

  12. An implementation of an aeroacoustic prediction model for broadband noise from a vertical axis wind turbine using a CFD informed methodology

    Science.gov (United States)

    Botha, J. D. M.; Shahroki, A.; Rice, H.

    2017-12-01

    This paper presents an enhanced method for predicting aerodynamically generated broadband noise produced by a Vertical Axis Wind Turbine (VAWT). The method improves on existing work for VAWT noise prediction and incorporates recently developed airfoil noise prediction models. Inflow-turbulence and airfoil self-noise mechanisms are both considered. Airfoil noise predictions depend on aerodynamic input data, and time-dependent Computational Fluid Dynamics (CFD) calculations are carried out to solve for the aerodynamic solution. Analytical flow methods are also benchmarked against the CFD-informed noise prediction results to quantify errors in the former approach. Comparisons to experimental noise measurements for an existing turbine are encouraging. A parameter study shows the sensitivity of overall noise levels to changes in inflow velocity and inflow turbulence. Noise sources are characterised, and the location and mechanism of the primary sources are determined; inflow-turbulence noise is seen to be the dominant source. The use of CFD calculations improves the accuracy of noise predictions compared with the analytic flow solution, and shows that, for inflow-turbulence noise sources, blade-generated turbulence dominates the atmospheric inflow turbulence.

  13. Predicting Pollicipes pollicipes (Crustacea: Cirripedia) abundance on intertidal rocky shores of SW Portugal: a multi-scale approach based on a simple fetch-based wave exposure index

    Directory of Open Access Journals (Sweden)

    David Jacinto

    2016-06-01

    Full Text Available Understanding and predicting patterns of distribution and abundance of marine resources is important for conservation and management purposes in small-scale artisanal fisheries and industrial fisheries worldwide. The goose barnacle (Pollicipes pollicipes) is an important shellfish resource, and its distribution is closely related to wave exposure at different spatial scales. We modelled the abundance (percent coverage) of P. pollicipes as a function of a simple wave exposure index based on fetch estimates from digitized coastlines at different spatial scales. The model accounted for 47.5% of the explained deviance and indicated that barnacle abundance increases non-linearly with wave exposure at both the smallest (metres) and largest (kilometres) spatial scales considered in this study. Distribution maps were predicted for the study region in SW Portugal. Our study suggests that the relationship between fetch-based exposure indices and P. pollicipes percent cover may be used as a simple tool for providing stakeholders with information on barnacle distribution patterns. This information may improve assessment of harvesting grounds and the dimension of exploitable areas, aiding management plans and supporting decision making on conservation, harvesting pressure and surveillance strategies for this highly appreciated and socio-economically important marine resource.
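
The kind of fetch-based index used here can be sketched as follows. This is a hedged illustration: the sector count, weights and fetch cap below are placeholder choices, not the authors' parameterization. Fetch, the open-water distance to the nearest coast, is measured along a set of angular sectors and summed, with each sector truncated at a cap so open ocean does not dominate.

```python
def fetch_exposure_index(fetches_km, weights=None, cap_km=50.0):
    """Simple fetch-based wave exposure index: weighted sum over angular
    sectors of the open-water distance to the nearest coast, each sector
    truncated at cap_km (all parameter values are illustrative)."""
    if weights is None:
        weights = [1.0] * len(fetches_km)   # e.g. wind-frequency weights
    return sum(w * min(f, cap_km) for f, w in zip(fetches_km, weights))
```

A sheltered site (short fetches in every sector) scores far lower than a fully exposed headland, which is the gradient the barnacle-abundance model exploits.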

  14. Prediction strategies in a TV recommender system - Method and experiments

    NARCIS (Netherlands)

    van Setten, M.J.; Veenstra, M.; van Dijk, Elisabeth M.A.G.; Nijholt, Antinus; Isaísas, P.; Karmakar, N.

    2003-01-01

    Predicting the interests of a user in information is an important process in personalized information systems. In this paper, we present a way to create prediction engines that allow prediction techniques to be easily combined into prediction strategies. Prediction strategies choose one or a

  15. Computational prediction of protein-protein interactions in Leishmania predicted proteomes.

    Directory of Open Access Journals (Sweden)

    Antonio M Rezende

    Full Text Available The trypanosomatid parasites Leishmania braziliensis, Leishmania major and Leishmania infantum are important human pathogens. Despite years of study and genome availability, an effective vaccine has not yet been developed, and the available chemotherapy is highly toxic. It is therefore clear that only integrated, interdisciplinary studies will succeed in the search for new targets for vaccine and drug development. An essential part of this rationale relates to the study of protein-protein interaction (PPI) networks, which can provide a better understanding of complex protein interactions in biological systems. Thus, we modeled PPIs for trypanosomatids through computational methods, using sequence comparison against public databases of protein or domain interactions for interaction prediction (Interolog Mapping), and developed a dedicated combined system score to address the robustness of the predictions. The confidence of the network prediction approach was evaluated using gold-standard positive and negative datasets, and the AUC value obtained was 0.94. As a result, 39,420, 43,531 and 45,235 interactions were predicted for L. braziliensis, L. major and L. infantum, respectively. For each predicted network, the top 20 proteins were ranked by the MCC topological index. In addition, information related to immunological potential, degree of protein sequence conservation among orthologs, and degree of identity compared to proteins of potential parasite hosts was integrated. This integration provides a better understanding and usefulness of the predicted networks, which can be valuable for selecting new potential biological targets for drug and vaccine development. Network modularity analysis, which is key when one is interested in destabilizing PPIs for drug or vaccine purposes, was performed along with multiple alignments of the predicted PPIs, revealing patterns associated with protein turnover. In addition, around 50% of hypothetical protein present in the networks

  16. Predictive modeling of complications.

    Science.gov (United States)

    Osorio, Joseph A; Scheer, Justin K; Ames, Christopher P

    2016-09-01

    Predictive analytic algorithms are designed to identify patterns in the data that allow for accurate predictions without the need for a hypothesis. Therefore, predictive modeling can provide detailed and patient-specific information that can be readily applied when discussing the risks of surgery with a patient. There are few studies using predictive modeling techniques in the adult spine surgery literature. These types of studies represent the beginning of the use of predictive analytics in spine surgery outcomes. We will discuss the advancements in the field of spine surgery with respect to predictive analytics, the controversies surrounding the technique, and the future directions.

  17. Short communication: Use of genomic and metabolic information as well as milk performance records for prediction of subclinical ketosis risk via artificial neural networks.

    Science.gov (United States)

    Ehret, A; Hochstuhl, D; Krattenmacher, N; Tetens, J; Klein, M S; Gronwald, W; Thaller, G

    2015-01-01

    Subclinical ketosis is one of the most prevalent metabolic disorders in high-producing dairy cows during early lactation. This renders its early detection and prevention important for both economic and animal-welfare reasons. Construction of reliable predictive models is challenging, because traits like ketosis are commonly affected by multiple factors. In this context, machine learning methods offer great advantages because of their universal learning ability and flexibility in integrating various sorts of data. Here, an artificial-neural-network approach was applied to investigate the utility of metabolic, genetic, and milk performance data for the prediction of milk levels of β-hydroxybutyrate within and across consecutive weeks postpartum. Data were collected from 218 dairy cows during their first 5 wk in milk. All animals were genotyped with a 50,000-SNP panel, and weekly information on the concentrations of the milk metabolites glycerophosphocholine and phosphocholine, as well as milk composition data (milk yield, fat and protein percentage), was available. The concentration of β-hydroxybutyric acid in milk was used as the target variable in all prediction models. Average correlations between observed and predicted target values of up to 0.643 were obtained when milk metabolite and routine milk recording data were combined for prediction at the same day within weeks. Predictive performance of metabolic as well as milk performance-based models was higher than that of models based on genetic information. Copyright © 2015 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  18. Using time-to-contact information to assess potential collision modulates both visual and temporal prediction networks

    Directory of Open Access Journals (Sweden)

    Jennifer T Coull

    2008-09-01

    Full Text Available Accurate estimates of the time-to-contact (TTC) of approaching objects are crucial for survival. We used an ecologically valid driving simulation to compare and contrast the neural substrates of egocentric (head-on approach) and allocentric (lateral approach) TTC tasks in a fully factorial, event-related fMRI design. Compared to colour control tasks, both egocentric and allocentric TTC tasks activated left ventral premotor cortex/frontal operculum and inferior parietal cortex, the same areas that have previously been implicated in temporal attentional orienting. Despite differences in visual and cognitive demands, both TTC and temporal orienting paradigms encourage the use of temporally predictive information to guide behaviour, suggesting these areas may form a core network for temporal prediction. We also demonstrated that the temporal derivative of the perceptual index tau (tau-dot) held predictive value for making collision judgements and varied inversely with activity in primary visual cortex (V1). Specifically, V1 activity increased with the increasing likelihood of reporting a collision, suggesting top-down attentional modulation of early visual processing areas as a function of subjective collision. Finally, egocentric viewpoints provoked a response bias for reporting collisions, rather than no-collisions, reflecting increased caution for head-on approaches. Associated increases in SMA activity suggest motor preparation mechanisms were engaged, despite the perceptual nature of the task.
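
The perceptual quantities involved are simple to compute. As a sketch (assuming the first-order, constant-closing-speed case): tau is distance over closing speed, and tau-dot is exactly -1 for a constant-velocity approach, so deviations from -1 carry information about acceleration and hence about whether a collision will occur.

```python
def time_to_contact(distance, closing_speed):
    """First-order TTC: tau = d / v, assuming constant closing speed."""
    return distance / closing_speed

def tau_dot_estimate(d0, v, dt=0.01):
    """Numerically estimate tau-dot across one time step of a
    constant-velocity approach; theory predicts exactly -1 here."""
    tau0 = time_to_contact(d0, v)
    tau1 = time_to_contact(d0 - v * dt, v)
    return (tau1 - tau0) / dt
```
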

  19. Physical and Cognitive Functioning After 3 Years Can Be Predicted Using Information From the Diagnostic Process in Recently Diagnosed Multiple Sclerosis

    NARCIS (Netherlands)

    de Groot, V.; Beckerman, H.; Uitdehaag, B.M.J.; Hintzen, R.Q.; Minneboo, A.; Heymans, M.W.; Lankhorst, G.J.; Polman, C.H.; Bouter, L.M.

    2009-01-01

    de Groot V, Beckerman H, Uitdehaag BM, Hintzen RQ, Minneboo A, Heymans MW, Lankhorst GJ, Polman CH, Bouter LM, on behalf of the Functional Prognostication and Disability (FuPro) Study Group. Physical and cognitive functioning after 3 years can be predicted using information from the diagnostic

  20. Decisions on foot-and-mouth disease control informed by model prediction

    DEFF Research Database (Denmark)

    Hisham Beshara Halasa, Tariq; Willeberg, Preben; Christiansen, Lasse Engbo

    2013-01-01

    The predictive capability of the first fortnight incidence (FFI), which is the number of detected herds within the first 14 days following detection of the disease, for the course of a foot-and-mouth disease (FMD) epidemic and its outcomes was investigated. Epidemic outcomes included the number of affected herds, epidemic duration, geographical size, and costs. The first fourteen days spatial spread (FFS) was also included to support the prediction. The epidemic data were obtained from a Danish version (DTU-DADS) of the Davis Animal Disease Spread simulation model. The FFI and FFS showed good correlations with the epidemic outcomes. The predictive capability of the FFI was high. This indicates that the FFI may play a part in the decision of whether or not to boost FMD control, which might prevent the occurrence of a large epidemic in the face of an FMD incursion. The prediction power was improved...

  1. Haven't We Been Here Before? Some Comments on Steve Coffman's Proposal for "Earth's Largest Library".

    Science.gov (United States)

    McGervey, Teresa

    2000-01-01

    Discusses the concept of Earth's Largest Library (ELL), a mega-virtual library based on the Amazon.com model. Topics include who will be included; privacy; censorship; scope of the collection; costs; legal aspects; collection development; personnel management; access; the concept of community; public service; lending policies; technical…

  2. Customer Churn Prediction for Broadband Internet Services

    Science.gov (United States)

    Huang, B. Q.; Kechadi, M.-T.; Buckley, B.

    Although churn prediction has been an area of research in the voice branch of telecommunications services, more focused studies on the huge growth area of broadband Internet services are limited. Therefore, this paper presents a new set of features for broadband Internet customer churn prediction, based on Henley segments, broadband usage, dial types, dial-up spend, line information, bill and payment information, and account information. Four prediction techniques (Logistic Regression, Decision Trees, Multilayer Perceptron Neural Networks and Support Vector Machines) are then applied to customer churn, based on the new features. Finally, an evaluation of the new features and a comparative analysis of the predictors are made for broadband customer churn prediction. The experimental results show that the new features with these four modelling techniques are efficient for customer churn prediction in the broadband service field.
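
Of the four techniques compared, logistic regression is the simplest. A minimal gradient-descent version can be sketched on synthetic stand-in features (the Henley-segment and usage features themselves are not public, so the feature matrix here is invented for illustration):

```python
import numpy as np

def train_logistic(X, y, lr=0.1, epochs=500):
    """Plain gradient-descent logistic regression: churn probability is
    sigmoid(w . x), weights fitted by minimizing mean log-loss."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])   # prepend a bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))           # predicted churn probability
        w -= lr * Xb.T @ (p - y) / len(y)           # mean log-loss gradient
    return w

def predict_churn(w, X):
    """Hard 0/1 churn labels at the 0.5 probability threshold."""
    Xb = np.hstack([np.ones((X.shape[0], 1)), X])
    return (1.0 / (1.0 + np.exp(-Xb @ w)) >= 0.5).astype(int)
```

The other three predictors (decision trees, MLPs, SVMs) would be trained on the same feature matrix and compared on held-out accuracy.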

  3. Prediction methods and databases within chemoinformatics

    DEFF Research Database (Denmark)

    Jónsdóttir, Svava Osk; Jørgensen, Flemming Steen; Brunak, Søren

    2005-01-01

    MOTIVATION: To gather information about available databases and chemoinformatics methods for prediction of properties relevant to the drug discovery and optimization process. RESULTS: We present an overview of the most important databases with 2-dimensional and 3-dimensional structural information...... about drugs and drug candidates, and of databases with relevant properties. Access to experimental data and numerical methods for selecting and utilizing these data is crucial for developing accurate predictive in silico models. Many interesting predictive methods for classifying the suitability...

  4. Is Kasei Valles (Mars) the largest volcanic channel in the solar system?

    Science.gov (United States)

    Leverington, David W.

    2018-02-01

    With a length of more than 2000 km and widths of up to several hundred kilometers, Kasei Valles is the largest outflow system on Mars. Superficially, the scabland-like character of Kasei Valles is evocative of terrestrial systems carved by catastrophic aqueous floods, and the system is widely interpreted as a product of outbursts from aquifers. However, as at other Martian outflow channels, clear examples of fluvial sedimentary deposits have proven difficult to identify here. Though Kasei Valles lacks several key properties expected of aqueous systems, its basic morphological and contextual properties are aligned with those of ancient volcanic channels on Venus, the Moon, Mercury, and Earth. There is abundant evidence that voluminous effusions of low-viscosity magmas occurred at the head of Kasei Valles, the channel system acted as a conduit for associated flows, and mare-style volcanic plains developed within its terminal basin. Combined mechanical and thermal incision rates of at least several meters per day are estimated to have been readily achieved at Kasei Valles by 20-m-deep magmas flowing with viscosities of 1 Pa s across low topographic slopes underlain by bedrock. If Kasei Valles formed through incision by magma, it would be the largest known volcanic channel in the solar system. The total volume of magma erupted at Kasei Valles is estimated here to have possibly reached or exceeded ∼5 × 106 km3, a volume comparable in magnitude to those that characterize individual Large Igneous Provinces on Earth. Development of other large outflow systems on Mars is expected to have similarly involved eruption of up to millions of cubic kilometers of magma.

  5. Predictability and Prediction for an Experimental Cultural Market

    Science.gov (United States)

    Colbaugh, Richard; Glass, Kristin; Ormerod, Paul

    Individuals are often influenced by the behavior of others, for instance because they wish to obtain the benefits of coordinated actions or infer otherwise inaccessible information. In such situations this social influence decreases the ex ante predictability of the ensuing social dynamics. We claim that, interestingly, these same social forces can increase the extent to which the outcome of a social process can be predicted very early in the process. This paper explores this claim through a theoretical and empirical analysis of the experimental music market described and analyzed in [1]. We propose a very simple model for this music market, assess the predictability of market outcomes through formal analysis of the model, and use insights derived through this analysis to develop algorithms for predicting market share winners, and their ultimate market shares, in the very early stages of the market. The utility of these predictive algorithms is illustrated through analysis of the experimental music market data sets [2].

  6. About Skin: Your Body's Largest Organ

    Science.gov (United States)


  7. Time compression of soil erosion by the effect of largest daily event. A regional analysis of USLE database.

    Science.gov (United States)

    Gonzalez-Hidalgo, J. C.; Batalla, R.; Cerda, A.; de Luis, M.

    2009-04-01

    When Thornes and Brunsden wrote in 1977 "How often one hears the researcher (and no less the undergraduate) complain that after weeks of observation "nothing happened" only to learn that, the day after his departure, a flood caused unprecedent erosion and channel changes!" (Thornes and Brunsden, 1977, p. 57), they focussed on two different problems in geomorphological research: the effects of extreme events and the temporal compression of geomorphological processes. Time compression is one of the main characteristics of erosion processes. It means that an important share of the total soil eroded is produced in very short temporal intervals, i.e. a few events, mostly related to extreme events. From magnitude-frequency analysis we know that a few events, not necessarily extreme in magnitude, produce a large amount of geomorphological work. Last but not least, extreme isolated events are a classical issue in geomorphology because of their specific effects, and they receive continuing attention, heightened at present by scenarios of global change. Notwithstanding, the time compression of geomorphological processes could be studied not only through the analysis of extreme events and the traditional magnitude-frequency approach, but also through a complementary approach based on the effects of the largest events. The classical approach defines an extreme event as a rare event (identified by its magnitude and quantified by some deviation from a central value), while we define the largest events by rank, whatever their magnitude. In a previous study of the time compression of soil erosion, using the USLE soil erosion database (Gonzalez-Hidalgo et al., EGU 2007), we described a relationship between the total number of daily erosive events recorded by plot and the percentage contribution to total soil erosion of the n-largest aggregated daily events. Here we offer a further refined analysis comparing different agricultural regions in the USA. To do so we have analyzed data from 594 erosion plots from USLE
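
The rank-based definition of largest events leads directly to the statistic analyzed: the fraction of total soil loss contributed by the n highest-ranked daily events, whatever their magnitude. A minimal sketch:

```python
def largest_event_contribution(daily_erosion, n):
    """Fraction of total soil loss contributed by the n largest daily events,
    selected by rank rather than by deviation from a central value."""
    total = sum(daily_erosion)
    top = sorted(daily_erosion, reverse=True)[:n]
    return sum(top) / total
```

For a plot record of [50, 20, 10, 5, 5, 5, 3, 2] (arbitrary units), the two largest events alone account for 70% of the total, which is the time-compression effect the abstract describes.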

  8. 78 FR 49781 - Notice of Intent To Seek Approval To Establish an Information Collection

    Science.gov (United States)

    2013-08-15

    ... advance fundamental computing, communications, and information science and engineering; educate a globally... goals. Projects funded through these four programs represent some of the largest single investments made...

  9. Predicting beta-turns and their types using predicted backbone dihedral angles and secondary structures.

    Science.gov (United States)

    Kountouris, Petros; Hirst, Jonathan D

    2010-07-31

    Beta-turns are secondary structure elements usually classified as coil. Their prediction is important because of their role in protein folding and their frequent occurrence in protein chains. We have developed a novel method that predicts beta-turns and their types using information from multiple sequence alignments, predicted secondary structures and, for the first time, predicted dihedral angles. Our method uses support vector machines, a supervised classification technique, and is trained and tested on three established datasets of 426, 547 and 823 protein chains. We achieve a Matthews correlation coefficient of up to 0.49 when predicting the location of beta-turns, the highest reported value to date. Moreover, the additional dihedral information improves the prediction of beta-turn types I, II, IV, VIII and "non-specific", achieving correlation coefficients up to 0.39, 0.33, 0.27, 0.14 and 0.38, respectively. Our results are more accurate than those of other methods. We have created an accurate predictor of beta-turns and their types. Our method, called DEBT, is available online at http://comp.chem.nottingham.ac.uk/debt/.
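
The Matthews correlation coefficient reported above is computed from the four confusion counts of a binary predictor; a minimal implementation:

```python
import math

def matthews_corrcoef(tp, fp, tn, fn):
    """Matthews correlation coefficient from confusion counts:
    (tp*tn - fp*fn) / sqrt((tp+fp)(tp+fn)(tn+fp)(tn+fn)),
    ranging from -1 (total disagreement) through 0 (chance) to +1 (perfect)."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    if denom == 0:
        return 0.0          # degenerate case: a constant predictor
    return (tp * tn - fp * fn) / denom
```

Unlike raw accuracy, MCC stays informative on imbalanced data, which matters here because non-turn residues vastly outnumber turn residues.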

  10. Predicting and analyzing DNA-binding domains using a systematic approach to identifying a set of informative physicochemical and biochemical properties

    Science.gov (United States)

    2011-01-01

    Background Existing methods of predicting DNA-binding proteins used valuable features of physicochemical properties to design support vector machine (SVM) based classifiers. Generally, selection of physicochemical properties and determination of their corresponding feature vectors rely mainly on known properties of binding mechanism and experience of designers. However, there exists a troublesome problem for designers that some different physicochemical properties have similar vectors of representing 20 amino acids and some closely related physicochemical properties have dissimilar vectors. Results This study proposes a systematic approach (named Auto-IDPCPs) to automatically identify a set of physicochemical and biochemical properties in the AAindex database to design SVM-based classifiers for predicting and analyzing DNA-binding domains/proteins. Auto-IDPCPs consists of 1) clustering 531 amino acid indices in AAindex into 20 clusters using a fuzzy c-means algorithm, 2) utilizing an efficient genetic algorithm based optimization method IBCGA to select an informative feature set of size m to represent sequences, and 3) analyzing the selected features to identify related physicochemical properties which may affect the binding mechanism of DNA-binding domains/proteins. The proposed Auto-IDPCPs identified m=22 features of properties belonging to five clusters for predicting DNA-binding domains with a five-fold cross-validation accuracy of 87.12%, which is promising compared with the accuracy of 86.62% of the existing method PSSM-400. For predicting DNA-binding sequences, the accuracy of 75.50% was obtained using m=28 features, where PSSM-400 has an accuracy of 74.22%. Auto-IDPCPs and PSSM-400 have accuracies of 80.73% and 82.81%, respectively, applied to an independent test data set of DNA-binding domains. 
Some typical physicochemical properties discovered are hydrophobicity, secondary structure, charge, solvent accessibility, polarity, flexibility, normalized Van Der
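
Step 1 of Auto-IDPCPs clusters the 531 AAindex property vectors with fuzzy c-means. A compact numpy sketch of the standard algorithm follows; parameters such as the fuzzifier m = 2 and the small cluster count in the test are illustrative, not necessarily those used in the paper (which uses 20 clusters):

```python
import numpy as np

def fuzzy_c_means(X, c, m=2.0, iters=50, seed=0):
    """Standard fuzzy c-means: soft memberships U (each row sums to 1) and
    cluster centers are updated alternately until iters is exhausted."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)           # normalize initial memberships
    for _ in range(iters):
        W = U ** m                              # fuzzified weights
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        # distance of every point to every center, floored to avoid divide-by-zero
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-9
        inv = d ** (-2.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U
```

Each amino-acid-index vector then belongs fractionally to every cluster, and a representative index can be drawn from each cluster for feature selection.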

  11. Predictability of Conversation Partners

    Science.gov (United States)

    Takaguchi, Taro; Nakamura, Mitsuhiro; Sato, Nobuo; Yano, Kazuo; Masuda, Naoki

    2011-08-01

    Recent developments in sensing technologies have enabled us to examine the nature of human social behavior in greater detail. By applying an information-theoretic method to the spatiotemporal data of cell-phone locations, [C. Song et al., Science 327, 1018 (2010)] found that human mobility patterns are remarkably predictable. Inspired by their work, we address a similar predictability question in a different kind of human social activity: conversation events. The predictability in the sequence of one’s conversation partners is defined as the degree to which one’s next conversation partner can be predicted given the current partner. We quantify this predictability by using the mutual information. We examine the predictability of conversation events for each individual using the longitudinal data of face-to-face interactions collected from two company offices in Japan. Each subject wears a name tag equipped with an infrared sensor node, and conversation events are marked when signals are exchanged between sensor nodes in close proximity. We find that the conversation events are predictable to a certain extent; knowing the current partner decreases the uncertainty about the next partner by 28.4% on average. Much of the predictability is explained by long-tailed distributions of interevent intervals. However, a predictability also exists in the data, apart from the contribution of their long-tailed nature. In addition, an individual’s predictability is correlated with the position of the individual in the static social network derived from the data. Individuals confined in a community—in the sense of an abundance of surrounding triangles—tend to have low predictability, and those bridging different communities tend to have high predictability.

  12. Predictability of Conversation Partners

    Directory of Open Access Journals (Sweden)

    Taro Takaguchi

    2011-09-01

    Full Text Available Recent developments in sensing technologies have enabled us to examine the nature of human social behavior in greater detail. By applying an information-theoretic method to the spatiotemporal data of cell-phone locations, [C. Song et al., Science 327, 1018 (2010)] found that human mobility patterns are remarkably predictable. Inspired by their work, we address a similar predictability question in a different kind of human social activity: conversation events. The predictability in the sequence of one’s conversation partners is defined as the degree to which one’s next conversation partner can be predicted given the current partner. We quantify this predictability by using the mutual information. We examine the predictability of conversation events for each individual using the longitudinal data of face-to-face interactions collected from two company offices in Japan. Each subject wears a name tag equipped with an infrared sensor node, and conversation events are marked when signals are exchanged between sensor nodes in close proximity. We find that the conversation events are predictable to a certain extent; knowing the current partner decreases the uncertainty about the next partner by 28.4% on average. Much of the predictability is explained by long-tailed distributions of interevent intervals. However, a predictability also exists in the data, apart from the contribution of their long-tailed nature. In addition, an individual’s predictability is correlated with the position of the individual in the static social network derived from the data. Individuals confined in a community—in the sense of an abundance of surrounding triangles—tend to have low predictability, and those bridging different communities tend to have high predictability.
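
The predictability measure described above is the mutual information between the current and next partner, here expressed as the fractional reduction in uncertainty about the next partner. A minimal sketch on a toy partner sequence (the 28.4% figure comes from the real sensor data, not from this illustration):

```python
import math
from collections import Counter

def entropy(counts):
    """Shannon entropy (bits) of a Counter of outcomes."""
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def partner_predictability(sequence):
    """Fractional reduction in uncertainty about the next partner given the
    current one: (H(next) - H(next | current)) / H(next)."""
    pairs = list(zip(sequence, sequence[1:]))
    h_next = entropy(Counter(sequence[1:]))
    by_current = {}
    for cur, nxt in pairs:
        by_current.setdefault(cur, Counter())[nxt] += 1
    n = len(pairs)
    h_cond = sum(sum(cnt.values()) / n * entropy(cnt)
                 for cnt in by_current.values())
    return (h_next - h_cond) / h_next if h_next > 0 else 0.0
```

A perfectly alternating sequence of two partners is fully predictable (value 1), while a single constant partner carries no uncertainty to reduce (value 0).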

  13. Link prediction with node clustering coefficient

    Science.gov (United States)

    Wu, Zhihao; Lin, Youfang; Wang, Jing; Gregory, Steve

    2016-06-01

    Predicting missing links in incomplete complex networks efficiently and accurately is still a challenging problem. The recently proposed Cannistraci-Alanis-Ravasi (CAR) index shows the power of local link/triangle information in improving link-prediction accuracy. Inspired by the idea of employing local link/triangle information, we propose a new similarity index with more local structure information. In our method, local link/triangle structure information is conveyed directly by the clustering coefficient of common neighbors. The clustering coefficient is effective in estimating the contribution of a common neighbor because it employs the links existing between the neighbors of that common neighbor, and these links have the same structural position, relative to the common neighbor, as the candidate link. In our experiments, three estimators (precision, AUP and AUC) are used to evaluate the accuracy of link prediction algorithms. Experimental results on ten test networks drawn from various fields show that our new index is more effective in predicting missing links than the CAR index, especially for networks with a low correlation between the number of common neighbors and the number of links between common neighbors.
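A minimal sketch of the idea (a simplified variant of our own, not necessarily the authors' exact index): score a candidate link by summing the local clustering coefficients of its common neighbors, so that common neighbors embedded in many triangles contribute more:

```python
def clustering_coefficient(adj, v):
    """Local clustering coefficient of node v in an undirected graph
    given as a dict mapping each node to its set of neighbors."""
    nbrs = adj[v]
    k = len(nbrs)
    if k < 2:
        return 0.0
    # count edges among the neighbors of v (each pair once)
    links = sum(1 for a in nbrs for b in nbrs if a < b and b in adj[a])
    return 2.0 * links / (k * (k - 1))

def cclp_score(adj, x, y):
    """Similarity of a candidate link (x, y): sum of the clustering
    coefficients of the common neighbors of x and y."""
    return sum(clustering_coefficient(adj, z) for z in adj[x] & adj[y])

# toy graph: two triangles sharing node 2
adj = {
    0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3, 4},
    3: {2, 4}, 4: {2, 3},
}
score = cclp_score(adj, 0, 3)  # the only common neighbor is node 2
```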

  14. Building predictive models of soil particle-size distribution

    Directory of Open Access Journals (Sweden)

    Alessandro Samuel-Rosa

    2013-04-01

    Full Text Available Is it possible to build predictive models (PMs) of soil particle-size distribution (psd) in a region with complex geology and a young and unstable land-surface? The main objective of this study was to answer this question. A set of 339 soil samples from a small slope catchment in Southern Brazil was used to build PMs of psd in the surface soil layer. Multiple linear regression models were constructed using terrain attributes (elevation, slope, catchment area, convergence index, and topographic wetness index). The PMs explained more than half of the data variance. This performance is similar to (or even better than) that of the conventional soil mapping approach. For some size fractions, the PM performance can reach 70 %. The largest uncertainties were observed in geologically more complex areas. Therefore, significant improvements in the predictions can only be achieved if accurate geological data is made available. Meanwhile, PMs built on terrain attributes are efficient in predicting the particle-size distribution (psd) of soils in regions of complex geology.
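The modelling step can be sketched on synthetic data (the covariates, coefficients and noise level below are invented for illustration): fit a multiple linear regression of a particle-size fraction on terrain attributes and check the variance explained:

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical standardized terrain attributes at 50 sampling points:
# elevation, slope, topographic wetness index
X = rng.normal(size=(50, 3))
true_coef = np.array([12.0, -8.0, 5.0])
# synthetic clay content (%) = linear terrain signal + noise
y = 30.0 + X @ true_coef + rng.normal(scale=2.0, size=50)

# ordinary least squares via a design matrix with an intercept column
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# coefficient of determination (R^2) of the fitted model
pred = A @ coef
r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
```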

  15. Natural radionuclides in soil profiles surrounding the largest coal-fired power plant in Serbia

    OpenAIRE

    Tanić Milan N.; Janković-Mandić Ljiljana J.; Gajić Boško A.; Daković Marko Z.; Dragović Snežana D.; Bačić Goran G.

    2016-01-01

    This study evaluates the influence of the largest Serbian coal-fired power plant on radionuclide concentrations in soil profiles up to 50 cm in depth. Thirty soil profiles were sampled from the plant surroundings (up to 10 km distance) and analyzed using standard methods for soil physicochemical properties and gamma ray spectrometry for specific activities of natural radionuclides (40K, 226Ra and 232Th). Spatial and vertical distribution of radionuclides wa...

  16. Auditor detected misstatements and the effect of information technology

    OpenAIRE

    Austen, Lizabeth A.; Eilifsen, Aasmund; Messier, William F.

    2003-01-01

    This paper presents information on the causes and detection of misstatements by auditors and the relationship of those misstatements with information technology (IT). The last major study of misstatements and IT used data that was gathered in 1988. In the intervening period, there have been significant changes in IT, possibly altering the error generation and detection process. Two research questions related to detected misstatements and the effect of IT are examined. The six largest public a...

  17. Opportunities for biodiversity gains under the world's largest reforestation programme

    Science.gov (United States)

    Hua, Fangyuan; Wang, Xiaoyang; Zheng, Xinlei; Fisher, Brendan; Wang, Lin; Zhu, Jianguo; Tang, Ya; Yu, Douglas W.; Wilcove, David S.

    2016-01-01

    Reforestation is a critical means of addressing the environmental and social problems of deforestation. China's Grain-for-Green Program (GFGP) is the world's largest reforestation scheme. Here we provide the first nationwide assessment of the tree composition of GFGP forests and the first combined ecological and economic study aimed at understanding GFGP's biodiversity implications. Across China, GFGP forests are overwhelmingly monocultures or compositionally simple mixed forests. Focusing on birds and bees in Sichuan Province, we find that GFGP reforestation results in modest gains (via mixed forest) and losses (via monocultures) of bird diversity, along with major losses of bee diversity. Moreover, all current modes of GFGP reforestation fall short of restoring biodiversity to levels approximating native forests. However, even within existing modes of reforestation, GFGP can achieve greater biodiversity gains by promoting mixed forests over monocultures; doing so is unlikely to entail major opportunity costs or pose unforeseen economic risks to households. PMID:27598524

  18. El Paso natural gas nearing completion of system's largest expansion

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    El Paso Natural Gas Co.'s largest expansion program in its 64-year history will be completed along its northern system this spring or early summer. According to the company, the three-tiered, $241.5 million expansion program will increase El Paso's gas-transport capacity by 835 MMcfd to 2.5 bcfd of conventional and coal-seam gas from the San Juan basin in northwestern New Mexico. That's enough natural gas, says the company, to supply the needs of a city of more than 800,000 residents. This paper reports that the expansion involves the San Juan Triangle system, the company's northern main line, and the Permian-San Juan crossover line. The company also filed with the Federal Energy Regulatory Commission (FERC) in October 1991 to construct a new $15.2 million compressor station, Rio Vista, south of Bloomfield, N.M. The station would be used to move additional gas to the main line

  19. Contamination Control Assessment of the World's Largest Space Environment Simulation Chamber

    Science.gov (United States)

    Snyder, Aaron; Henry, Michael W.; Grisnik, Stanley P.; Sinclair, Stephen M.

    2012-01-01

    The Space Power Facility's thermal vacuum test chamber is the largest chamber in the world capable of providing an environment for space simulation. To improve performance and meet stringent requirements of a wide customer base, significant modifications were made to the vacuum chamber. These include major changes to the vacuum system and numerous enhancements to the chamber's unique polar crane, with a goal of providing high cleanliness levels. The significance of these changes and modifications is discussed in this paper. In addition, the composition and arrangement of the pumping system and its impact on molecular back-streaming are discussed in detail. Molecular contamination measurements obtained with a TQCM and witness wafers during two recent integrated system tests of the chamber are presented and discussed. Finally, a concluding remarks section is presented.

  20. Protein docking prediction using predicted protein-protein interface

    Directory of Open Access Journals (Sweden)

    Li Bin

    2012-01-01

    Full Text Available Abstract Background Many important cellular processes are carried out by protein complexes. To provide physical pictures of interacting proteins, many computational protein-protein prediction methods have been developed in the past. However, it is still difficult to identify the correct docking complex structure within top ranks among alternative conformations. Results We present a novel protein docking algorithm that utilizes imperfect protein-protein binding interface prediction for guiding protein docking. Since the accuracy of protein binding site prediction varies depending on cases, the challenge is to develop a method which does not deteriorate but improves docking results by using a binding site prediction which may not be 100% accurate. The algorithm, named PI-LZerD (using Predicted Interface with Local 3D Zernike descriptor-based Docking algorithm), is based on a pairwise protein docking prediction algorithm, LZerD, which we have developed earlier. PI-LZerD starts by performing docking prediction using the provided protein-protein binding interface prediction as constraints, which is followed by a second round of docking with updated docking interface information to further improve the docking conformation. Benchmark results on bound and unbound cases show that PI-LZerD consistently improves docking prediction accuracy as compared with docking without using binding site prediction or using the binding site prediction as post-filtering. Conclusion We have developed PI-LZerD, a pairwise docking algorithm, which uses imperfect protein-protein binding interface prediction to improve docking accuracy. PI-LZerD consistently showed better prediction accuracy than alternative methods in a series of benchmark experiments, including docking using actual docking interface site predictions as well as unbound docking cases.

  1. Protein docking prediction using predicted protein-protein interface.

    Science.gov (United States)

    Li, Bin; Kihara, Daisuke

    2012-01-10

    Many important cellular processes are carried out by protein complexes. To provide physical pictures of interacting proteins, many computational protein-protein prediction methods have been developed in the past. However, it is still difficult to identify the correct docking complex structure within top ranks among alternative conformations. We present a novel protein docking algorithm that utilizes imperfect protein-protein binding interface prediction for guiding protein docking. Since the accuracy of protein binding site prediction varies depending on cases, the challenge is to develop a method which does not deteriorate but improves docking results by using a binding site prediction which may not be 100% accurate. The algorithm, named PI-LZerD (using Predicted Interface with Local 3D Zernike descriptor-based Docking algorithm), is based on a pairwise protein docking prediction algorithm, LZerD, which we have developed earlier. PI-LZerD starts by performing docking prediction using the provided protein-protein binding interface prediction as constraints, which is followed by a second round of docking with updated docking interface information to further improve the docking conformation. Benchmark results on bound and unbound cases show that PI-LZerD consistently improves docking prediction accuracy as compared with docking without using binding site prediction or using the binding site prediction as post-filtering. We have developed PI-LZerD, a pairwise docking algorithm, which uses imperfect protein-protein binding interface prediction to improve docking accuracy. PI-LZerD consistently showed better prediction accuracy than alternative methods in a series of benchmark experiments, including docking using actual docking interface site predictions as well as unbound docking cases.
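PI-LZerD uses the predicted interface as a constraint during docking rather than only as a filter, but the simpler post-filtering baseline it is compared against can be sketched in a few lines (the residue numbers, pose representation and function names below are hypothetical):

```python
def interface_overlap(pose_contacts, predicted_interface):
    """Fraction of a docking pose's contact residues that fall inside
    the (possibly imperfect) predicted binding interface."""
    return len(pose_contacts & predicted_interface) / len(pose_contacts)

def rerank_poses(poses, predicted_interface):
    """Re-rank candidate docking poses by interface overlap, best first."""
    return sorted(
        poses,
        key=lambda p: interface_overlap(p["contacts"], predicted_interface),
        reverse=True,
    )

# hypothetical predicted interface residues and candidate poses
predicted = {10, 11, 12, 45, 46}
poses = [
    {"id": "pose_a", "contacts": {10, 11, 90}},
    {"id": "pose_b", "contacts": {70, 71, 72}},
    {"id": "pose_c", "contacts": {10, 11, 12, 45}},
]
ranked = rerank_poses(poses, predicted)
```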

  2. [The system of informal caregiving as inequality].

    Science.gov (United States)

    García-Calvente, María del Mar; Mateo-Rodríguez, Inmaculada; Eguiguren, Ana P

    2004-05-01

    In our setting, it is families, not the health and social services, who play the greatest role in providing continuous care to persons in need of such services. Informal health care poses two key questions with regard to the issue of equity: differences in the burdens borne by men and women, which contribute to gender inequality and, depending on their educational and socio-economic level, inequities in their ability to choose and gain access to needed resources and support services, thus contributing to social class inequalities. Distributing the burden of caregiving between men and women, and between the family and the state, constitutes a crucial debate in public health. This study analyzes the concept and characteristics of informal care, provides data on its dimensions in our setting, and analyzes the profile of caregivers, as well as the work they do and the impact it has on their lives. Finally, it presents currently existing models and support strategies for informal caregivers. It is largely women who assume the principal role of providing informal care, undertaking the most difficult and demanding tasks and dedicating the largest share of their time to them. As a result, women bear an elevated cost in their lives in terms of health, quality of life, access to employment and professional development, social relations, availability of time for themselves, and economic repercussions. Unemployed, under-educated women from the least privileged social classes constitute the largest group of informal caregivers in our country. Any policies aimed at supporting those who provide such care should keep in mind the unequal point from which they start and be evaluated in terms of their impact on gender and social class inequality.

  3. Seeing Central African forests through their largest trees

    NARCIS (Netherlands)

    Bastin, J.F.; Barbier, N.; Réjou-Méchain, M.; Fayolle, A.; Gourlet-Fleury, S.; Maniatis, D.; Haulleville, De T.; Baya, F.; Beeckman, H.; Beina, D.; Couteron, P.; Chuyong, G.; Dauby, G.; Doucet, J.L.; Droissart, V.; Dufrêne, M.; Ewango, C.E.N.; Gillet, F.; Gonmadje, C.H.; Hart, T.; Kavali, T.; Kenfack, D.; Libalah, M.; Malhi, Y.; Makana, J.R.; Pélissier, R.; Ploton, P.; Serckx, S.; Sonké, B.; Stevart, T.; Thomas, D.W.; Cannière, De C.; Bogaert, J.

    2015-01-01

    Large tropical trees and a few dominant species were recently identified as the main structuring elements of tropical forests. However, this result has not yet been translated into quantitative approaches, which are essential to understand, predict and monitor forest functions and composition over large,

  4. Mental models accurately predict emotion transitions.

    Science.gov (United States)

    Thornton, Mark A; Tamir, Diana I

    2017-06-06

    Successful social interactions depend on people's ability to predict others' future actions and emotions. People possess many mechanisms for perceiving others' current emotional states, but how might they use this information to predict others' future states? We hypothesized that people might capitalize on an overlooked aspect of affective experience: current emotions predict future emotions. By attending to regularities in emotion transitions, perceivers might develop accurate mental models of others' emotional dynamics. People could then use these mental models of emotion transitions to predict others' future emotions from currently observable emotions. To test this hypothesis, studies 1-3 used data from three extant experience-sampling datasets to establish the actual rates of emotional transitions. We then collected three parallel datasets in which participants rated the transition likelihoods between the same set of emotions. Participants' ratings of emotion transitions predicted others' experienced transitional likelihoods with high accuracy. Study 4 demonstrated that four conceptual dimensions of mental state representation-valence, social impact, rationality, and human mind-inform participants' mental models. Study 5 used 2 million emotion reports on the Experience Project to replicate both of these findings: again people reported accurate models of emotion transitions, and these models were informed by the same four conceptual dimensions. Importantly, neither these conceptual dimensions nor holistic similarity could fully explain participants' accuracy, suggesting that their mental models contain accurate information about emotion dynamics above and beyond what might be predicted by static emotion knowledge alone.

  5. Mental models accurately predict emotion transitions

    Science.gov (United States)

    Thornton, Mark A.; Tamir, Diana I.

    2017-01-01

    Successful social interactions depend on people’s ability to predict others’ future actions and emotions. People possess many mechanisms for perceiving others’ current emotional states, but how might they use this information to predict others’ future states? We hypothesized that people might capitalize on an overlooked aspect of affective experience: current emotions predict future emotions. By attending to regularities in emotion transitions, perceivers might develop accurate mental models of others’ emotional dynamics. People could then use these mental models of emotion transitions to predict others’ future emotions from currently observable emotions. To test this hypothesis, studies 1–3 used data from three extant experience-sampling datasets to establish the actual rates of emotional transitions. We then collected three parallel datasets in which participants rated the transition likelihoods between the same set of emotions. Participants’ ratings of emotion transitions predicted others’ experienced transitional likelihoods with high accuracy. Study 4 demonstrated that four conceptual dimensions of mental state representation—valence, social impact, rationality, and human mind—inform participants’ mental models. Study 5 used 2 million emotion reports on the Experience Project to replicate both of these findings: again people reported accurate models of emotion transitions, and these models were informed by the same four conceptual dimensions. Importantly, neither these conceptual dimensions nor holistic similarity could fully explain participants’ accuracy, suggesting that their mental models contain accurate information about emotion dynamics above and beyond what might be predicted by static emotion knowledge alone. PMID:28533373
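A toy sketch of such a mental model (the emotion labels and the sequence are invented): estimate a first-order transition matrix from an experience-sampling sequence and use it to predict the most likely next emotion:

```python
from collections import Counter, defaultdict

def transition_model(sequence):
    """Estimate P(next emotion | current emotion) from an
    experience-sampling sequence of emotion labels."""
    counts = defaultdict(Counter)
    for cur, nxt in zip(sequence, sequence[1:]):
        counts[cur][nxt] += 1
    return {
        cur: {nxt: c / sum(nxts.values()) for nxt, c in nxts.items()}
        for cur, nxts in counts.items()
    }

def predict_next(model, current):
    """Most likely next emotion under the estimated transition model."""
    return max(model[current], key=model[current].get)

# toy sequence: calm tends to follow joy, anxiety tends to follow anger
reports = ["joy", "calm", "joy", "calm", "anger", "anxiety",
           "joy", "calm", "anger", "anxiety", "joy", "calm"]
model = transition_model(reports)
```

Comparing such empirically estimated transition rates with participants' judged transition likelihoods is, in essence, what studies 1-3 above do at scale.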

  6. Geographical information system and predictive risk maps of urinary schistosomiasis in Ogun State, Nigeria

    Directory of Open Access Journals (Sweden)

    Solarin Adewale RT

    2008-05-01

    Full Text Available Abstract Background The control of urinary schistosomiasis in Ogun State, Nigeria remains inert due to a lack of reliable data on the geographical distribution of the disease and the population at risk. To help in developing a control programme by delineating areas of risk, a geographical information system and remotely sensed environmental images were used to develop predictive risk maps of the probability of occurrence of the disease and to quantify the risk for infection in Ogun State, Nigeria. Methods Infection data used were derived from carefully validated morbidity questionnaires among primary school children in 2001–2002, in which school children were asked, among other questions, if they had experienced "blood in urine" or urinary schistosomiasis. The infection data from 1,092 schools, together with remotely sensed environmental data such as rainfall, vegetation, temperature, soil types, altitude and land cover, were analyzed using binary logistic regression models to identify environmental features that influence the spatial distribution of the disease. The final regression equations were then used in ArcView 3.2a GIS software to generate predictive risk maps of the distribution of the disease and the population at risk in the state. Results Logistic regression analysis shows that the only significant environmental variable in predicting the presence or absence of urinary schistosomiasis in any area of the State was Land Surface Temperature (LST) (B = 0.308, p = 0.013). LST (B = -0.478, p = 0.035), rainfall (B = -0.006, p = 0.0005), ferric luvisols (B = 0.539, p = 0.274), dystric nitosols (B = 0.133, p = 0.769) and pellic vertisols (B = 1.386, p = 0.008) soil types were the final variables in the model for predicting the probability of an area having an infection prevalence equal to or greater than 50%. The two predictive risk maps suggest that urinary schistosomiasis is widely distributed and occurring in all the Local Government Areas (LGAs
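The core modelling step can be sketched on synthetic data (the covariates, true coefficients and sample size below are invented; the study fitted binary logistic regressions on remotely sensed covariates such as LST and rainfall):

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, epochs=2000):
    """Plain gradient-descent logistic regression with an intercept."""
    A = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(A.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-A @ w))
        w -= lr * A.T @ (p - y) / len(y)
    return w

def predict_prob(w, X):
    """Predicted probability of infection for each location."""
    A = np.column_stack([np.ones(len(X)), X])
    return 1.0 / (1.0 + np.exp(-A @ w))

rng = np.random.default_rng(1)
# hypothetical standardized covariates per school: LST, rainfall
X = rng.normal(size=(200, 2))
# synthetic presence/absence: warmer, drier areas more likely infected
logit = 0.3 + 1.2 * X[:, 0] - 0.8 * X[:, 1]
y = (rng.random(200) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

w = fit_logistic(X, y)
probs = predict_prob(w, X)
```

Mapping `probs` back onto coordinates is then what turns the fitted model into a predictive risk map.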

  7. The population genetics of Quechuas, the largest native South American group: autosomal sequences, SNPs, and microsatellites evidence high level of diversity.

    Science.gov (United States)

    Scliar, Marilia O; Soares-Souza, Giordano B; Chevitarese, Juliana; Lemos, Livia; Magalhães, Wagner C S; Fagundes, Nelson J; Bonatto, Sandro L; Yeager, Meredith; Chanock, Stephen J; Tarazona-Santos, Eduardo

    2012-03-01

    Elucidating the pattern of genetic diversity for non-European populations is necessary to make the benefits of human genetics research available to individuals from these groups. In the era of large human genomic initiatives, Native American populations have been neglected, in particular, the Quechua, the largest South Amerindian group settled along the Andes. We characterized the genetic diversity of a Quechua population in a global setting, using autosomal noncoding sequences (nine unlinked loci for a total of 16 kb), 351 unlinked SNPs and 678 microsatellites and tested predictions of the model of the evolution of Native Americans proposed by (Tarazona-Santos et al.: Am J Hum Genet 68 (2001) 1485-1496). European admixture is Quechua or Melanesian populations, which is concordant with the African origin of modern humans and the fact that South America was the last part of the world to be peopled. The diversity in the Quechua population is comparable with that of Eurasian populations, and the allele frequency spectrum based on resequencing data does not reflect a reduction in the proportion of rare alleles. Thus, the Quechua population is a large reservoir of common and rare genetic variants of South Amerindians. These results are consistent with and complement our evolutionary model of South Amerindians (Tarazona-Santos et al.: Am J Hum Genet 68 (2001) 1485-1496), proposed based on Y-chromosome data, which predicts high genomic diversity due to the high level of gene flow between Andean populations and their long-term effective population size. Copyright © 2012 Wiley Periodicals, Inc.

  8. Predicting Atomic Decay Rates Using an Informational-Entropic Approach

    Science.gov (United States)

    Gleiser, Marcelo; Jiang, Nan

    2018-06-01

    We show that a newly proposed Shannon-like entropic measure of shape complexity applicable to spatially-localized or periodic mathematical functions known as configurational entropy (CE) can be used as a predictor of spontaneous decay rates for one-electron atoms. The CE is constructed from the Fourier transform of the atomic probability density. For the hydrogen atom with degenerate states labeled with the principal quantum number n, we obtain a scaling law relating the n-averaged decay rates to the respective CE. The scaling law allows us to predict the n-averaged decay rate without relying on the traditional computation of dipole matrix elements. We tested the predictive power of our approach up to n = 20, obtaining an accuracy better than 3.7% within our numerical precision, as compared to spontaneous decay tables listed in the literature.

  9. Predicting Atomic Decay Rates Using an Informational-Entropic Approach

    Science.gov (United States)

    Gleiser, Marcelo; Jiang, Nan

    2018-02-01

    We show that a newly proposed Shannon-like entropic measure of shape complexity applicable to spatially-localized or periodic mathematical functions known as configurational entropy (CE) can be used as a predictor of spontaneous decay rates for one-electron atoms. The CE is constructed from the Fourier transform of the atomic probability density. For the hydrogen atom with degenerate states labeled with the principal quantum number n, we obtain a scaling law relating the n-averaged decay rates to the respective CE. The scaling law allows us to predict the n-averaged decay rate without relying on the traditional computation of dipole matrix elements. We tested the predictive power of our approach up to n = 20, obtaining an accuracy better than 3.7% within our numerical precision, as compared to spontaneous decay tables listed in the literature.
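A simplified numerical sketch of the quantity involved (here a sum-normalized Shannon form of the modal fraction on a 1-D density; Gleiser and Jiang normalize by the maximal mode and work with the full atomic probability density):

```python
import numpy as np

def configurational_entropy(psi):
    """Configurational-entropy-style measure of a localized 1-D density:
    CE = -sum_k f(k) ln f(k), where f(k) = |F(k)|^2 / sum_k |F(k)|^2
    is the (sum-normalized) modal fraction of the Fourier transform F."""
    power = np.abs(np.fft.fft(psi)) ** 2
    modal = power / power.sum()
    modal = modal[modal > 0]          # 0 * ln 0 -> 0 by convention
    return -np.sum(modal * np.log(modal))

# hydrogen-like 1s radial probability density (arbitrary units)
x = np.arange(0.0, 40.0, 0.05)
rho = (x ** 2) * np.exp(-2.0 * x)
ce = configurational_entropy(rho)
```

A density concentrated in a single Fourier mode would give CE near zero; a spatially sharper density spreads its power over more modes and yields a larger CE.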

  10. Chebyshev polynomial functions based locally recurrent neuro-fuzzy information system for prediction of financial and energy market data

    Directory of Open Access Journals (Sweden)

    A.K. Parida

    2016-09-01

    Full Text Available In this paper Chebyshev polynomial functions based locally recurrent neuro-fuzzy information system is presented for the prediction and analysis of financial and electrical energy market data. The normally used TSK-type feedforward fuzzy neural network is unable to take the full advantage of the use of the linear fuzzy rule base in accurate input–output mapping and hence the consequent part of the rule base is made nonlinear using polynomial or arithmetic basis functions. Further the Chebyshev polynomial functions provide an expanded nonlinear transformation to the input space thereby increasing its dimension for capturing the nonlinearities and chaotic variations in financial or energy market data streams. Also the locally recurrent neuro-fuzzy information system (LRNFIS includes feedback loops both at the firing strength layer and the output layer to allow signal flow both in forward and backward directions, thereby making the LRNFIS mimic a dynamic system that provides fast convergence and accuracy in predicting time series fluctuations. Instead of using forward and backward least mean square (FBLMS learning algorithm, an improved Firefly-Harmony search (IFFHS learning algorithm is used to estimate the parameters of the consequent part and feedback loop parameters for better stability and convergence. Several real world financial and energy market time series databases are used for performance validation of the proposed LRNFIS model.
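The functional-expansion step can be sketched generically (a Chebyshev feature map via the standard recurrence T_n(x) = 2x T_{n-1}(x) - T_{n-2}(x); the full LRNFIS adds the fuzzy rule base, recurrent feedback loops and the IFFHS learner on top of such expanded inputs):

```python
import numpy as np

def chebyshev_expand(x, order=4):
    """Expand inputs in [-1, 1] into Chebyshev polynomial features
    T_0..T_order using the recurrence T_n = 2x T_{n-1} - T_{n-2}."""
    feats = [np.ones_like(x), np.asarray(x, dtype=float)]
    for _ in range(2, order + 1):
        feats.append(2.0 * feats[1] * feats[-1] - feats[-2])
    return np.stack(feats, axis=-1)

# expanded features would feed the consequent part of each fuzzy rule
x = np.linspace(-1.0, 1.0, 5)
phi = chebyshev_expand(x, order=3)   # shape (5, 4): T0, T1, T2, T3
```

The expansion raises the input dimension nonlinearly, which is what lets a linear consequent capture curvature in the market series.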

  11. Using Telepresence to Connect and Engage Classes and the Public in the Exploration of Tamu Massif, the World's Largest Single Volcano

    Science.gov (United States)

    Nanez-James, S. E.; Sager, W.

    2016-02-01

    Research published in 2013 showed that TAMU Massif, the largest mountain in the Shatsky Rise oceanic plateau, located approximately 1500 kilometers east of Japan, is the "World's Largest Single Volcano." This claim garnered widespread public interest and wonder concerning how something so big could remain so mysterious in the 21st century. This disconnect highlights the fact that oceans are still widely unexplored, especially the middle of the deep ocean. Because there is so much interest in TAMU Massif, a diverse outreach team led by chief scientist Dr. William Sager from the University of Houston in partnership with the Texas State Aquarium and the Schmidt Ocean Institute (SOI) conducted a multifaceted ship-to-shore outreach project that included secondary school students, formal and informal educators, university students and professors, the aquarium and museum audience, and the general public. The objective was to work in conjunction with SOI and various other partners, including the Texas Regional Collaborative, the Aquarium of the Pacific, and the Houston Museum of Natural Science, to promote science and ocean literacy while inspiring future scientists - especially those from underserved and underrepresented groups - through ocean connections. Participants were connected through live ship-to-shore distance learning broadcasts of ongoing marine research and discovery of TAMU Massif aboard the R/V Falkor, allowing audiences to participate in real-time research and apply real world science to curricula in the classrooms. These ship-to-shore presentations connected to existing curricula and standards, lessons, and career interests of the students and educators, with special teacher events and professional development workshops conducted from aboard the R/V Falkor.

  12. Deep Visual Attention Prediction

    Science.gov (United States)

    Wang, Wenguan; Shen, Jianbing

    2018-05-01

    In this work, we aim to predict human eye fixation with view-free scenes based on an end-to-end deep learning architecture. Although Convolutional Neural Networks (CNNs) have brought substantial improvements to human attention prediction, CNN-based attention models still need to leverage multi-scale features more efficiently. Our visual attention network is proposed to capture hierarchical saliency information, from deep, coarse layers with global saliency information to shallow, fine layers with local saliency response. Our model is based on a skip-layer network structure, which predicts human attention from multiple convolutional layers with various receptive fields. Final saliency prediction is achieved via the cooperation of those global and local predictions. Our model is learned in a deep supervision manner, where supervision is directly fed into multi-level layers, instead of previous approaches that provide supervision only at the output layer and propagate it back to earlier layers. Our model thus incorporates multi-level saliency predictions within a single network, which significantly decreases the redundancy of previous approaches that learn multiple network streams with different input scales. Extensive experimental analysis on various challenging benchmark datasets demonstrates that our method yields state-of-the-art performance with competitive inference time.
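The skip-layer idea of cooperating coarse global and fine local predictions can be caricatured in a few lines (nearest-neighbor upsampling and plain averaging stand in for the learned fusion; the shapes and values are invented):

```python
import numpy as np

def fuse_saliency(coarse, fine):
    """Fuse a coarse (global) and a fine (local) saliency map by
    nearest-neighbor upsampling the coarse map and averaging, mimicking
    skip-layer cooperation of multi-scale predictions."""
    fy = fine.shape[0] // coarse.shape[0]
    fx = fine.shape[1] // coarse.shape[1]
    up = np.repeat(np.repeat(coarse, fy, axis=0), fx, axis=1)
    fused = (up + fine) / 2.0
    return fused / fused.max()        # normalize to [0, 1]

coarse = np.array([[0.1, 0.9], [0.2, 0.4]])   # e.g. from a deep layer
fine = np.random.default_rng(3).random((4, 4))  # e.g. from a shallow layer
sal = fuse_saliency(coarse, fine)
```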

  13. German Migrant Teachers in Australia: Insights into the Largest Cohort of Non-English Speaking Background Teachers

    Science.gov (United States)

    Bense, Katharina

    2015-01-01

    The research reported in this paper investigates the situation of German migrant teachers in Australia. Although German born teachers represent the largest group of non-English speaking background teachers in Australia, there is no study of the circumstances and experiences of these teachers in Australia. This study aims to fill this gap. It…

  14. State-space prediction of spring discharge in a karst catchment in southwest China

    Science.gov (United States)

    Li, Zhenwei; Xu, Xianli; Liu, Meixian; Li, Xuezhang; Zhang, Rongfei; Wang, Kelin; Xu, Chaohao

    2017-06-01

    Southwest China represents one of the largest continuous karst regions in the world. It is estimated that around 1.7 million people are heavily dependent on water derived from karst springs in southwest China. However, there is a limited amount of water supply in this region. Moreover, there is not enough information on temporal patterns of spring discharge in the area. In this context, it is essential to accurately predict spring discharge, as well as understand karst hydrological processes in a thorough manner, so that water shortages in this area could be predicted and managed efficiently. The objectives of this study were to determine the primary factors that govern spring discharge patterns and to develop a state-space model to predict spring discharge. Spring discharge, precipitation (PT), relative humidity (RD), water temperature (WD), and electrical conductivity (EC) were the variables analyzed in the present work, and they were monitored at two different locations (referred to as karst springs A and B, respectively, in this paper) in a karst catchment area in southwest China from May to November 2015. Results showed that a state-space model using any combinations of variables outperformed a classical linear regression, a back-propagation artificial neural network model, and a least square support vector machine in modeling spring discharge time series for karst spring A. The best state-space model was obtained by using PT and RD, which accounted for 99.9% of the total variation in spring discharge. This model was then applied to an independent data set obtained from karst spring B, and it provided accurate spring discharge estimates. Therefore, state-space modeling was a useful tool for predicting spring discharge in karst regions in southwest China, and this modeling procedure may help researchers to obtain accurate results in other karst regions.
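A minimal state-space sketch (a local-level model filtered with a scalar Kalman filter; the study's model additionally used precipitation and relative humidity as inputs, which are omitted here, and all variances below are invented):

```python
import numpy as np

def kalman_filter_1d(obs, q=0.01, r=0.1):
    """One-dimensional local-level state-space model:
    state_t = state_{t-1} + w_t (var q); obs_t = state_t + v_t (var r).
    Returns one-step-ahead predictions of the observation series."""
    x, p = obs[0], 1.0
    preds = [x]
    for z in obs[1:]:
        p = p + q                  # predict: state variance grows
        preds.append(x)            # one-step-ahead forecast
        k = p / (p + r)            # Kalman gain
        x = x + k * (z - x)        # update with the new observation
        p = (1.0 - k) * p
    return np.array(preds)

# synthetic spring-discharge series: slow seasonal trend + noise
t = np.arange(100)
discharge = (5.0 + 0.5 * np.sin(2 * np.pi * t / 50)
             + np.random.default_rng(2).normal(scale=0.05, size=100))
pred = kalman_filter_1d(discharge)
rmse = float(np.sqrt(np.mean((pred - discharge) ** 2)))
```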

  15. Improvement of prediction ability for genomic selection of dairy cattle by including dominance effects.

    Directory of Open Access Journals (Sweden)

    Chuanyu Sun

    Full Text Available Dominance may be an important source of non-additive genetic variance for many traits of dairy cattle. However, nearly all prediction models for dairy cattle have included only additive effects because of the limited number of cows with both genotypes and phenotypes. The role of dominance in the Holstein and Jersey breeds was investigated for eight traits: milk, fat, and protein yields; productive life; daughter pregnancy rate; somatic cell score; fat percent and protein percent. Additive and dominance variance components were estimated and then used to estimate additive and dominance effects of single nucleotide polymorphisms (SNPs. The predictive abilities of three models with both additive and dominance effects and a model with additive effects only were assessed using ten-fold cross-validation. One procedure estimated dominance values, and another estimated dominance deviations; calculation of the dominance relationship matrix was different for the two methods. The third approach enlarged the dataset by including cows with genotype probabilities derived using genotyped ancestors. For yield traits, dominance variance accounted for 5 and 7% of total variance for Holsteins and Jerseys, respectively; using dominance deviations resulted in smaller dominance and larger additive variance estimates. For non-yield traits, dominance variances were very small for both breeds. For yield traits, including additive and dominance effects fit the data better than including only additive effects; average correlations between estimated genetic effects and phenotypes showed that prediction accuracy increased when both effects rather than just additive effects were included. No corresponding gains in prediction ability were found for non-yield traits. Including cows with derived genotype probabilities from genotyped ancestors did not improve prediction accuracy. The largest additive effects were located on chromosome 14 near DGAT1 for yield traits for both
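Additive and dominance variance components of this kind rest on genomic relationship matrices built from SNP genotypes. Below is a minimal sketch of the standard constructions (VanRaden-style additive matrix plus a genotypic dominance matrix); the study's two procedures (dominance values vs. dominance deviations) use different dominance codings, so this is one common variant, not the exact formulation from the paper.

```python
import numpy as np

def genomic_relationship_matrices(M):
    """Additive and dominance relationship matrices from a genotype
    matrix M (individuals x SNPs, coded 0/1/2 copies of one allele).
    A sketch of the matrices underlying additive+dominance GBLUP."""
    p = M.mean(axis=0) / 2.0                       # allele frequencies
    # Additive (VanRaden): center genotypes by 2p, scale by sum of 2p(1-p)
    Z = M - 2.0 * p
    A = Z @ Z.T / np.sum(2.0 * p * (1.0 - p))
    # Dominance: indicator of heterozygosity, centered by 2p(1-p)
    H = (M == 1).astype(float) - 2.0 * p * (1.0 - p)
    D = H @ H.T / np.sum((2.0 * p * (1.0 - p)) ** 2)
    return A, D
```

These matrices then enter a mixed model with two random effects, one per matrix, whose variance components correspond to the additive and dominance variances reported above.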

  16. On the Use of Time-Limited Information for Maintenance Decision Support: A Predictive Approach under Maintenance Constraints

    Directory of Open Access Journals (Sweden)

    E. Khoury

    2013-01-01

    Full Text Available This paper deals with a gradually deteriorating system operating under an uncertain environment whose state is only known on a finite rolling horizon. As such, the system is subject to constraints. Maintenance actions can only be planned at imposed times called maintenance opportunities, which are available on a limited visibility horizon. This system can, for example, be a commercial vehicle with a monitored critical component that can be maintained only in specific workshops. For the considered system, we aim to use the monitoring data and the time-limited information for maintenance decision support in order to reduce maintenance costs. We propose two predictive maintenance policies based, respectively, on cost and reliability criteria. Classical age-based and condition-based policies are considered as benchmarks. The performance assessment shows the value of the different types of information and the best way to use them in maintenance decision making.

  17. Correlating Structural Order with Structural Rearrangement in Dusty Plasma Liquids: Can Structural Rearrangement be Predicted by Static Structural Information?

    Science.gov (United States)

    Su, Yen-Shuo; Liu, Yu-Hsuan; I, Lin

    2012-11-01

    Whether static microstructural order information is strongly correlated with subsequent structural rearrangement (SR), and its power for predicting SR, is investigated experimentally in a quenched dusty plasma liquid with microheterogeneities. Poor local structural order is found to be a good alarm for identifying soft spots and predicting short-term SR. For sites with good structural order, the persistence time for sustaining structural memory until SR has a large mean value but a broad distribution. The deviation of the local structural order from that averaged over nearest neighbors serves as a good second alarm for further sorting out the short-time SR sites. It has similar sorting power to the temporal fluctuation of the local structural order over a small time interval.

  18. In silico prediction of Tetrahymena pyriformis toxicity for diverse industrial chemicals with substructure pattern recognition and machine learning methods.

    Science.gov (United States)

    Cheng, Feixiong; Shen, Jie; Yu, Yue; Li, Weihua; Liu, Guixia; Lee, Philip W; Tang, Yun

    2011-03-01

    There is an increasing need for the rapid safety assessment of chemicals by both industries and regulatory agencies throughout the world. In silico techniques are practical alternatives in environmental hazard assessment, especially for addressing the persistence, bioaccumulation and toxicity potentials of organic chemicals. Tetrahymena pyriformis toxicity is often used as a toxic endpoint. In this study, 1571 diverse unique chemicals were collected from the literature, composing the largest diverse data set for T. pyriformis toxicity. Classification predictive models of T. pyriformis toxicity were developed with substructure pattern recognition and different machine learning methods, including support vector machine (SVM), C4.5 decision tree, k-nearest neighbors and random forest. The results of a 5-fold cross-validation showed that the SVM method performed better than the other algorithms. The overall predictive accuracies of the SVM classification model with a radial basis function kernel were 92.2% for the 5-fold cross-validation and 92.6% for the external validation set, respectively. Furthermore, several representative substructure patterns characterizing T. pyriformis toxicity were also identified via information gain analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.
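The workflow described (an RBF-kernel SVM evaluated with 5-fold cross-validation on binary substructure fingerprints) can be sketched with scikit-learn. The fingerprints and toxicity labels below are synthetic stand-ins, not the 1571-chemical data set, and the hyperparameters are illustrative.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Stand-in data: binary substructure-pattern fingerprints (rows = chemicals),
# labels 1 = toxic, 0 = non-toxic. Real work would derive the fingerprints
# (e.g. substructure keys) from the chemical structures.
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(200, 64)).astype(float)
y = (X[:, :8].sum(axis=1) > 4).astype(int)   # synthetic toxicity rule

clf = SVC(kernel="rbf", C=10.0, gamma="scale")
acc = cross_val_score(clf, X, y, cv=5, scoring="accuracy")
print(acc.mean())
```

The same `cross_val_score` call, swapped over a decision tree, k-NN, or random forest estimator, reproduces the kind of algorithm comparison reported in the abstract.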

  19. Online Prediction of Health Care Utilization in the Next Six Months Based on Electronic Health Record Information: A Cohort and Validation Study.

    Science.gov (United States)

    Hu, Zhongkai; Hao, Shiying; Jin, Bo; Shin, Andrew Young; Zhu, Chunqing; Huang, Min; Wang, Yue; Zheng, Le; Dai, Dorothy; Culver, Devore S; Alfreds, Shaun T; Rogow, Todd; Stearns, Frank; Sylvester, Karl G; Widen, Eric; Ling, Xuefeng

    2015-09-22

    The increasing rate of health care expenditures in the United States has placed a significant burden on the nation's economy. Predicting the future health care utilization of patients can provide useful information to better understand and manage overall health care delivery and clinical resource allocation. This study developed an electronic medical record (EMR)-based online risk model predictive of resource utilization for patients in Maine in the next 6 months across all payers, all diseases, and all demographic groups. In HealthInfoNet, Maine's health information exchange (HIE), a retrospective cohort of 1,273,114 patients was constructed with the preceding 12 months of EMR data. Each patient's next 6-month (between January 1, 2013 and June 30, 2013) health care resource utilization was retrospectively scored, ranging from 0 to 100, and a decision tree-based predictive model was developed. Our model was later integrated into the Maine HIE population exploration system to allow a prospective validation analysis of 1,358,153 patients by forecasting their next 6-month risk of resource utilization between July 1, 2013 and December 31, 2013. Prospectively predicted risks, on either an individual level or a population (per 1000 patients) level, were consistent with the next 6-month resource utilization distributions and the clinical patterns at the population level. The results demonstrated a strong correlation between care resource utilization and our risk scores, supporting the effectiveness of our model. With the online population risk monitoring enterprise dashboards, the effectiveness of the predictive algorithm has been validated by clinicians and caregivers in the State of Maine. The model and associated online applications were designed for tracking the evolving nature of total population risk, in a longitudinal manner, for health care resource utilization. It will enable more effective care management strategies driving improved patient outcomes.

  20. Semen analysis and prediction of natural conception

    NARCIS (Netherlands)

    Leushuis, Esther; van der Steeg, Jan Willem; Steures, Pieternel; Repping, Sjoerd; Bossuyt, Patrick M. M.; Mol, Ben Willem J.; Hompes, Peter G. A.; van der Veen, Fulco

    2014-01-01

    Do two semen analyses predict natural conception better than a single semen analysis and will adding the results of repeated semen analyses to a prediction model for natural pregnancy improve predictions? A second semen analysis does not add helpful information for predicting natural conception

  1. Integrating milk metabolite profile information for the prediction of traditional milk traits based on SNP information for Holstein cows.

    Directory of Open Access Journals (Sweden)

    Nina Melzer

    Full Text Available In this study the benefit of metabolome-level analysis for the prediction of genetic values for three traditional milk traits was investigated. Our proposed approach consists of three steps: First, milk metabolite profiles are used to predict three traditional milk traits of 1,305 Holstein cows. Two regression methods, both enabling variable selection, are applied to identify important milk metabolites in this step. Second, the prediction of these important milk metabolites from single nucleotide polymorphisms (SNPs) enables the detection of SNPs with significant genetic effects. Finally, these SNPs are used to predict the milk traits. The observed precision of the predicted genetic values was compared to the results observed for the classical genotype-phenotype prediction using all SNPs or a reduced SNP subset (the reduced classical approach). To enable a comparison between SNP subsets, a special invariable evaluation design was implemented. SNPs close to or within known quantitative trait loci (QTL) were determined, which enabled us to assess whether the detected important SNP subsets were enriched in these regions. The results show that our approach can predict genetic values while requiring less than 1% of the total number of SNPs (40,317). Significantly more important SNPs in known QTL regions were detected using our approach than with the reduced classical approach. In conclusion, our approach allows a deeper insight into the associations between the different levels of the genotype-phenotype map (genotype-metabolome, metabolome-phenotype, genotype-phenotype).
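The first, variable-selecting step can be illustrated with an L1-penalized regression: metabolites with nonzero Lasso coefficients are flagged as important. The data shapes and signal structure below are hypothetical stand-ins, not the study's metabolite profiles or its specific regression methods.

```python
import numpy as np
from sklearn.linear_model import LassoCV

# Hypothetical stand-in: rows = cows, columns = milk metabolites,
# y = one milk trait. Only the first 5 metabolites carry signal.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 50))             # metabolite profiles
beta = np.zeros(50)
beta[:5] = 1.0                             # 5 truly informative metabolites
y = X @ beta + rng.normal(scale=0.5, size=300)

model = LassoCV(cv=5).fit(X, y)
important = np.flatnonzero(model.coef_)    # indices of selected metabolites
print(len(important))
```

In the pipeline above, the selected metabolites would then themselves be regressed on SNP genotypes to pick out SNPs with significant genetic effects.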

  2. Revisiting the impacts of oil price increases on monetary policy implementation in the largest oil importers

    Directory of Open Access Journals (Sweden)

    Nurtac Yildirim

    2015-06-01

    Full Text Available The aim of this paper is to test the impacts of oil price increases on monetary policy implementation in the largest oil importers. For that purpose, we estimate structural vector error correction (SVEC) models to show the impacts of oil price increases on industrial production, consumer prices and immediate interest rates, which are the elements of the Taylor rule, for the four largest oil importers (the USA, the EU, China and Japan). Our results indicate that oil price increases transmit to output and inflation and lead to fluctuations in industrial production, consumer prices and immediate interest rates, which in turn influence the monetary policy stance in the following periods. The basic conclusion of the research is that the channels through which oil prices affect output, inflation and interest rates should be identified by the monetary policy authorities of the USA, the EU, China and Japan. We also emphasize the importance of determining the optimal monetary policy framework to eliminate the negative consequences of oil price increases.

  3. Impact of modellers' decisions on hydrological a priori predictions

    Science.gov (United States)

    Holländer, H. M.; Bormann, H.; Blume, T.; Buytaert, W.; Chirico, G. B.; Exbrayat, J.-F.; Gustafsson, D.; Hölzel, H.; Krauße, T.; Kraft, P.; Stoll, S.; Blöschl, G.; Flühler, H.

    2014-06-01

    In practice, the catchment hydrologist is often confronted with the task of predicting discharge without having the records needed for calibration. Here, we report the discharge predictions of 10 modellers - using the model of their choice - for the man-made Chicken Creek catchment (6 ha, northeast Germany, Gerwin et al., 2009b) and we analyse how they improved their predictions over three steps, with additional information provided before each step. The modellers predicted the catchment's hydrological response in its initial phase without having access to the observed records. They used conceptually different physically based models and their modelling experience differed greatly. Hence, they encountered two problems: (i) simulating discharge for an ungauged catchment and (ii) using models that were developed for catchments which are not in a state of landscape transformation. The prediction exercise was organized in three steps: (1) for the first prediction the modellers received a basic data set describing the catchment to a degree somewhat more complete than usually available for a priori predictions of ungauged catchments; they did not obtain information on stream flow, soil moisture, or groundwater response and therefore had to guess the initial conditions; (2) before the second prediction they inspected the catchment on-site and discussed their first prediction attempt; (3) for their third prediction they were offered additional data by charging them pro forma with the costs for obtaining this additional information. Holländer et al. (2009) discussed the range of predictions obtained in step (1). Here, we detail the modellers' assumptions and decisions in accounting for the various processes. We document the prediction progress as well as the learning process resulting from the availability of added information. For the second and third steps, the progress in prediction quality is evaluated in relation to individual modelling experience and costs of

  4. INNOVATIVE METHODS TO EVALUATE THE RELIABILITY OF INFORMATION CONSOLIDATED FINANCIAL STATEMENTS

    Directory of Open Access Journals (Sweden)

    Irina P. Kurochkina

    2014-01-01

    Full Text Available The article explores the possibility of using foreign innovative methods to assess the reliability of the information in consolidated financial statements of Russian companies. Recommendations are made on their adaptation and application in commercial organizations. Beneish model indicators are implemented in one of the world's largest vertically integrated steel and mining companies. Audit firms are encouraged to use these methods of assessing the reliability of information in the practical application of ISA.

  5. Applications of contact predictions to structural biology

    Directory of Open Access Journals (Sweden)

    Felix Simkovic

    2017-05-01

    Full Text Available Evolutionary pressure on residue interactions, intramolecular or intermolecular, that are important for protein structure or function can lead to covariance between the two positions. Recent methodological advances allow much more accurate contact predictions to be derived from this evolutionary covariance signal. The practical application of contact predictions has largely been confined to structural bioinformatics, yet, as this work seeks to demonstrate, the data can be of enormous value to the structural biologist working in X-ray crystallography, cryo-EM or NMR. Integrative structural bioinformatics packages such as Rosetta can already exploit contact predictions in a variety of ways. The contribution of contact predictions begins at construct design, where structural domains may need to be expressed separately and contact predictions can help to predict domain limits. Structure solution by molecular replacement (MR benefits from contact predictions in diverse ways: in difficult cases, more accurate search models can be constructed using ab initio modelling when predictions are available, while intermolecular contact predictions can allow the construction of larger, oligomeric search models. Furthermore, MR using supersecondary motifs or large-scale screens against the PDB can exploit information, such as the parallel or antiparallel nature of any β-strand pairing in the target, that can be inferred from contact predictions. Contact information will be particularly valuable in the determination of lower resolution structures by helping to assign sequence register. In large complexes, contact information may allow the identity of a protein responsible for a certain region of density to be determined and then assist in the orientation of an available model within that density. In NMR, predicted contacts can provide long-range information to extend the upper size limit of the technique in a manner analogous but complementary to experimental

  6. The largest deep-ocean silicic volcanic eruption of the past century.

    Science.gov (United States)

    Carey, Rebecca; Soule, S Adam; Manga, Michael; White, James; McPhie, Jocelyn; Wysoczanski, Richard; Jutzeler, Martin; Tani, Kenichiro; Yoerger, Dana; Fornari, Daniel; Caratori-Tontini, Fabio; Houghton, Bruce; Mitchell, Samuel; Ikegami, Fumihiko; Conway, Chris; Murch, Arran; Fauria, Kristen; Jones, Meghan; Cahalan, Ryan; McKenzie, Warren

    2018-01-01

    The 2012 submarine eruption of Havre volcano in the Kermadec arc, New Zealand, is the largest deep-ocean eruption in history and one of very few recorded submarine eruptions involving rhyolite magma. It was recognized from a gigantic 400 km² pumice raft seen in satellite imagery, but the complexity of this event was concealed beneath the sea surface. Mapping, observations, and sampling by submersibles have provided an exceptionally high-fidelity record of the seafloor products, which included lava sourced from 14 vents at water depths of 900 to 1220 m, and fragmental deposits including giant pumice clasts up to 9 m in diameter. Most (>75%) of the total erupted volume was partitioned into the pumice raft and transported far from the volcano. The geological record on submarine volcanic edifices in volcanic arcs does not faithfully archive eruption size or magma production.

  7. What Predicts Online Health Information-Seeking Behavior Among Egyptian Adults? A Cross-Sectional Study.

    Science.gov (United States)

    Ghweeba, Mayada; Lindenmeyer, Antje; Shishi, Sobhi; Abbas, Mostafa; Waheed, Amani; Amer, Shaymaa

    2017-06-22

    Over the last decade, the Internet has become an important source of health-related information for a wide range of users worldwide. Yet, little is known about the personal characteristics of Egyptian Internet users who search for online health information (OHI). The aim of the study was to identify the personal characteristics of Egyptian OHI seekers and to determine any associations between their personal characteristics and their health information-seeking behavior.  This cross-sectional questionnaire study was conducted from June to October 2015. A Web-based questionnaire was sent to Egyptian users aged 18 years and older (N=1400) of a popular Arabic-language health information website. The questionnaire included (1) demographic characteristics; (2) self-reported general health status; and (3) OHI-seeking behavior that included frequency of use, different topics sought, and self-reported impact of obtained OHI on health behaviors. Data were analyzed using descriptive statistics and multiple regression analysis. A total of 490 participants completed the electronic questionnaire with a response rate equivalent to 35.0% (490/1400). Regarding personal characteristics, 57.1% (280/490) of participants were females, 63.4% (311/490) had a university level qualification, and 37.1% (182/490) had a chronic health problem. The most commonly sought OHI by the participants was nutrition-related. Results of the multiple regression analysis showed that 31.0% of the variance in frequency of seeking OHI among Egyptian adults can be predicted by personal characteristics. Participants who sought OHI more frequently were likely to be female, of younger age, had higher education levels, and good self-reported general health. Our results provide insights into personal characteristics and OHI-seeking behaviors of Egyptian OHI users. This will contribute to better recognize their needs, highlight ways to increase the availability of appropriate OHI, and may lead to the

  8. An information maximization model of eye movements

    Science.gov (United States)

    Renninger, Laura Walker; Coughlan, James; Verghese, Preeti; Malik, Jitendra

    2005-01-01

    We propose a sequential information maximization model as a general strategy for programming eye movements. The model reconstructs high-resolution visual information from a sequence of fixations, taking into account the fall-off in resolution from the fovea to the periphery. From this framework we get a simple rule for predicting fixation sequences: after each fixation, fixate next at the location that minimizes uncertainty (maximizes information) about the stimulus. By comparing our model performance to human eye movement data and to predictions from a saliency and random model, we demonstrate that our model is best at predicting fixation locations. Modeling additional biological constraints will improve the prediction of fixation sequences. Our results suggest that information maximization is a useful principle for programming eye movements.
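The model's greedy selection rule (fixate where the expected reduction in stimulus uncertainty is largest, with acuity falling off from the fovea) can be sketched as follows. This is a schematic of the selection principle under simple assumptions (a grid stimulus, exponential acuity falloff), not the published implementation.

```python
import numpy as np

def next_fixation(uncertainty, falloff=2.0):
    """Greedy information-maximization rule: choose the fixation point
    that maximizes the expected information gained, where the information
    collected at each location decays with eccentricity (distance from
    the fixation point)."""
    h, w = uncertainty.shape
    ys, xs = np.mgrid[0:h, 0:w]
    best, best_gain = None, -np.inf
    for fy in range(h):
        for fx in range(w):
            dist = np.hypot(ys - fy, xs - fx)
            acuity = np.exp(-dist / falloff)     # foveal falloff
            gain = np.sum(uncertainty * acuity)  # expected info gained
            if gain > best_gain:
                best, best_gain = (fy, fx), gain
    return best

U = np.zeros((9, 9))
U[2, 6] = 5.0                    # one highly uncertain region
print(next_fixation(U))          # → (2, 6): fixate the uncertain spot
```

Iterating this rule, reducing the uncertainty map after each fixation, yields a predicted fixation sequence that can be compared against human eye movement data.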

  9. The role of atmospheric diagnosis and Big Data science in improving hydroclimatic extreme prediction and the merits of climate informed prediction for future water resources management

    Science.gov (United States)

    Lu, Mengqian; Lall, Upmanu

    2017-04-01

    The threats that hydroclimatic extremes pose to sustainable development, safety and operation of infrastructure are both severe and growing. Recent heavy precipitation triggered flood events in many regions and increasing frequency and intensity of extreme precipitation suggested by various climate projections highlight the importance of understanding the associated hydrometeorological patterns and space-time variability of such extreme events, and developing a new approach to improve predictability with a better estimation of uncertainty. This clear objective requires the optimal utility of Big Data analytics on multi-source datasets to extract informative predictors from the complex ocean-atmosphere coupled system and develop a statistical and physical based framework. The proposed presentation includes the essence of our selected works in the past two years, as part of our Global Floods Initiatives. Our approach for an improved extreme prediction begins with a better understanding of the associated atmospheric circulation patterns, under the influence and regulation of slowly changing oceanic boundary conditions [Lu et al., 2013, 2016a; Lu and Lall, 2016]. The study of the associated atmospheric circulation pattern and the regulation of teleconnected climate signals adopted data science techniques and statistical modeling recognizing the nonstationarity and nonlinearity of the system, as the underlying statistical assumptions of the classical extreme value frequency analysis are challenged in hydroclimatic studies. There are two main factors that are considered important for understanding how future flood risk will change. One is the consideration of moisture holding capacity as a function of temperature, as suggested by Clausius-Clapeyron equation. The other is the strength of the convergence or convection associated with extreme precipitation. As convergence or convection gets stronger, rain rates can be expected to increase if the moisture is available. 
For
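The Clausius-Clapeyron argument above implies roughly 7% more saturation moisture per kelvin of warming, so a first-order estimate scales extreme rain rates accordingly. A minimal sketch of that scaling, using the commonly cited ~7%/K figure (convective "super-CC" scaling can exceed it):

```python
def cc_scaling(rate_mm_per_hr, delta_T, alpha=0.07):
    """Clausius-Clapeyron scaling of an extreme rain rate: moisture
    availability rises by a factor (1 + alpha) per kelvin of warming,
    so the rain rate is scaled by (1 + alpha) ** delta_T."""
    return rate_mm_per_hr * (1.0 + alpha) ** delta_T

print(round(cc_scaling(20.0, 2.0), 2))  # 20 mm/h extreme under +2 K → 22.9
```

This moisture-scaling term is one of the two factors the abstract names; the other, changes in convergence/convection strength, is not captured by this simple relation.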

  10. Development of a regional ensemble prediction method for probabilistic weather prediction

    International Nuclear Information System (INIS)

    Nohara, Daisuke; Tamura, Hidetoshi; Hirakuchi, Hiromaru

    2015-01-01

    A regional ensemble prediction method has been developed to provide probabilistic weather prediction using a numerical weather prediction model. To obtain perturbations consistent with the synoptic weather pattern, both initial and lateral boundary perturbations were given by differences between the control and ensemble members of the Japan Meteorological Agency (JMA)'s operational one-week ensemble forecast. The method provides multiple ensemble members with a horizontal resolution of 15 km for 48 hours, based on a downscaling of the JMA's operational global forecast combined with the perturbations. The ensemble prediction was examined for the heavy snowfall event in the Kanto area on January 14, 2013. The results showed that the predictions represent different features of the high-resolution spatiotemporal distribution of precipitation, affected by the intensity and location of the extra-tropical cyclone in each ensemble member. Although the ensemble prediction has model bias in the means and variances of some variables such as wind speed and solar radiation, it has the potential to add probabilistic information to a deterministic prediction. (author)

  11. Prediction during natural language comprehension

    NARCIS (Netherlands)

    Willems, R.M.; Frank, S.L.; Nijhof, A.D.; Hagoort, P.; Bosch, A.P.J. van den

    2016-01-01

    The notion of prediction is studied in cognitive neuroscience with increasing intensity. We investigated the neural basis of 2 distinct aspects of word prediction, derived from information theory, during story comprehension. We assessed the effect of entropy of next-word probability distributions as

  12. Benchmark Testing of the Largest Titanium Aluminide Sheet Subelement Conducted

    Science.gov (United States)

    Bartolotta, Paul A.; Krause, David L.

    2000-01-01

    To evaluate wrought titanium aluminide (gamma TiAl) as a viable candidate material for the High-Speed Civil Transport (HSCT) exhaust nozzle, an international team led by the NASA Glenn Research Center at Lewis Field successfully fabricated and tested the largest gamma TiAl sheet structure ever manufactured. The gamma TiAl sheet structure, a 56-percent subscale divergent flap subelement, was fabricated for benchmark testing in three-point bending. Overall, the subelement was 84-cm (33-in.) long by 13-cm (5-in.) wide by 8-cm (3-in.) deep. Incorporated into the subelement were features that might be used in the fabrication of a full-scale divergent flap. These features include the use of: (1) gamma TiAl shear clips to join together sections of corrugations, (2) multiple gamma TiAl face sheets, (3) double hot-formed gamma TiAl corrugations, and (4) brazed joints. The structural integrity of the gamma TiAl sheet subelement was evaluated by conducting a room-temperature three-point static bend test.

  13. LHC : The World's Largest Vacuum Systems being commissioned at CERN

    CERN Document Server

    Jiménez, J M

    2008-01-01

    When it switches on in 2008, the 26.7 km Large Hadron Collider (LHC) at CERN will have the world's largest vacuum system, operating over a wide range of pressures and employing an impressive array of vacuum technologies. This system is composed of 54 km of UHV vacuum for the circulating beams and 50 km of insulation vacuum around the cryogenic magnets and the liquid helium transfer lines. Of the 54 km of UHV beam vacuum, 48 km are at cryogenic temperature (1.9 K). The remaining 6 km of beam vacuum, containing the insertions for "cleaning" the proton beams, radiofrequency cavities for accelerating the protons, and beam-monitoring equipment, is at ambient temperature and uses non-evaporable getter (NEG) coatings - a vacuum technology that was born and industrialized at CERN. The pumping scheme is completed by 780 ion pumps to remove noble gases and to provide pressure interlocks to the 303 vacuum safety valves. Pressure readings are provided by 170 Bayard-Alpert gauges and 1084 gauges (Pirani a...

  14. The largest forest fires in Portugal: the constraints of burned area size on the comprehension of fire severity.

    Science.gov (United States)

    Tedim, Fantina; Remelgado, Ruben; Martins, João; Carvalho, Salete

    2015-01-01

    Portugal is the European country with the highest forest fire density and burned area. Since the beginning of the official forest fire database in 1980, an increase in the number of fires and in burned area, as well as the appearance of large and catastrophic fires, has characterized fire activity in Portugal. In the 1980s, the largest fires were just over 10,000 ha; in the beginning of the 21st century, however, several fires occurred with burned areas over 20,000 ha. Some of these events can be classified as mega-fires due to their ecological and socioeconomic severity. The present study aimed to characterize the trend in large forest fires, in order to understand whether the largest fires that occurred in Portugal were exceptional events or evidence of a new trend, and to discuss the limits of fire size for characterizing fire effects, because it is usually assumed that the larger the fire, the higher the damage. Using the Portuguese forest fire database and satellite imagery, the present study showed that the largest fires can be seen both as exceptional events and as evidence of a new fire regime. It highlighted the importance of the size and pattern of unburned patches within the fire perimeter, as well as the heterogeneity of ecological fire severity, usually not included in fire regime descriptions, which are critical to fire management and research. The findings of this research can be used in forest risk reduction and suppression planning.

  15. A Modified Spatiotemporal Fusion Algorithm Using Phenological Information for Predicting Reflectance of Paddy Rice in Southern China

    Directory of Open Access Journals (Sweden)

    Mengxue Liu

    2018-05-01

    Full Text Available Satellite data for studying surface dynamics in heterogeneous landscapes are missing due to frequent cloud contamination, low temporal resolution, and technological difficulties in developing satellites. A modified spatiotemporal fusion algorithm for predicting the reflectance of paddy rice is presented in this paper. The algorithm uses phenological information extracted from a moderate-resolution imaging spectroradiometer enhanced vegetation index time series to improve the enhanced spatial and temporal adaptive reflectance fusion model (ESTARFM. The algorithm is tested with satellite data on Yueyang City, China. The main contribution of the modified algorithm is the selection of similar neighborhood pixels by using phenological information to improve accuracy. Results show that the modified algorithm performs better than ESTARFM in visual inspection and quantitative metrics, especially for paddy rice. This modified algorithm provides not only new ideas for the improvement of spatiotemporal data fusion method, but also technical support for the generation of remote sensing data with high spatial and temporal resolution.

  16. Added value from 576 years of tree-ring records in the prediction of the Great Salt Lake level

    Science.gov (United States)

    Robert R. Gillies; Oi-Yu Chung; S.-Y. Simon Wang; R. Justin DeRose; Yan Sun

    2015-01-01

    Predicting lake level fluctuations of the Great Salt Lake (GSL) in Utah - the largest terminal salt-water lake in the Western Hemisphere - is critical from many perspectives. The GSL integrates both climate and hydrological variations within the region and is particularly sensitive to low-frequency climate cycles. Since most hydroclimate variable records cover...

  17. Point-and-Click Pedagogy: Is It Effective for Teaching Information Technology?

    Science.gov (United States)

    Angolia, Mark G.; Pagliari, Leslie R.

    2016-01-01

    This paper assesses the effectiveness of the adoption of curriculum content developed and supported by a global academic university-industry alliance sponsored by one of the world's largest information technology software providers. Academic alliances promote practical and future-oriented education while providing access to proprietary software…

  18. PREDICTIVE VALUE OF THE DEFERRED TAXES GENERATED BY THE SUBVENTIONS FOR INVESTMENTS – ESSENTIAL ELEMENT FOR PRESENTING THE INFORMATION IN THE FINANCIAL STATEMENTS

    Directory of Open Access Journals (Sweden)

    PALIU – POPA LUCIA

    2015-12-01

    Full Text Available Most of the information underlying investment decisions at the company level is provided by accounting, which has today become a common language of international business; accounting normalization has been extrapolated from the national to the international level because of the need for comparability and transparency of entities' financial statements, regardless of the geopolitical area in which they were prepared. These issues justify efforts to improve both accounting treatments and the procedures for preparing and presenting data in the financial statements, so that users benefit from credible and transparent information. One of the major issues arising with respect to the performance of an entity concerns the preparation of a single statement of company performance, the statement of comprehensive income, whose primary objective is to facilitate the forecasting of performance and within which the deferred taxes generated by subventions for investments are an essential element with important predictive value. In this context, starting from the main differences between the national and Anglo-Saxon accounting regulations and the international reference system with respect to the predictive value of deferred taxes, and continuing with the occurrence and evolution of deferred taxes generated by subventions for investments, this study aims to highlight the predictive value of the deferred taxes generated by subventions for investments, as provided to users by the information in the annual financial statements.

  19. Predicting recoveries and the importance of using enough information

    NARCIS (Netherlands)

    Cai, X.; den Haan, W.

    2009-01-01

    Several papers that make forecasts about the long-term impact of the current financial crisis rely on models in which there is only one type of financial crisis. These models tend to predict that the current crisis will have long lasting negative effects on economic growth. This paper points out the

  20. Model predictive control using fuzzy decision functions

    NARCIS (Netherlands)

    Kaymak, U.; Costa Sousa, da J.M.

    2001-01-01

    Fuzzy predictive control integrates conventional model predictive control with techniques from fuzzy multicriteria decision making, translating the goals and the constraints to predictive control in a transparent way. The information regarding the (fuzzy) goals and the (fuzzy) constraints of the

  1. Accurate and dynamic predictive model for better prediction in medicine and healthcare.

    Science.gov (United States)

    Alanazi, H O; Abdullah, A H; Qureshi, K N; Ismail, A S

    2018-05-01

    Information and communication technologies (ICTs) have introduced new integrated operations and methods in all fields of life. The health sector has also adopted new technologies to improve its systems and provide better services to customers. Predictive models in health care have likewise been influenced by new technologies for predicting disease outcomes. However, existing predictive models still suffer from limitations in predictive performance. To improve predictive model performance, this paper proposes a predictive model that classifies disease predictions into different categories. To evaluate the model, this paper uses traumatic brain injury (TBI) datasets. TBI is one of the most serious diseases worldwide and needs more attention due to its severe impact on human life. The proposed model improves the predictive performance for TBI. The TBI data set was developed, and its features approved, by neurologists. The experimental results show that the proposed model achieves significant results in terms of accuracy, sensitivity, and specificity.

  2. WISMUT AG: Past, present and future of the largest uranium producer in Europe

    International Nuclear Information System (INIS)

    Madel, J.

    1990-01-01

    The author gives a brief summary of WISMUT AG, the largest uranium producer operating in Europe. The jointly owned German-Soviet company operates its production facilities in the southern part of the former German Democratic Republic. Given the new political and economic framework in Germany and the Soviet Union, WISMUT AG will receive due recognition. Uranium exploration, mining, and milling activities are summarized for 1946-1989, together with a summary of present activities and projections of future activities in the area of decontamination, restoration, and recultivation of present and abandoned mining and milling sites. A statement of WISMUT AG's projected role in the international nuclear fuels market is made.

  3. THE EVOLUTION OF THE WORLD’S LARGEST AUTOMAKERS IN THE PERIOD 2013-2014

    Directory of Open Access Journals (Sweden)

    Sorin-George TOMA

    2015-04-01

    Full Text Available The automotive industry has always been an economic engine for many countries. It deals with the design, development, manufacture, marketing, and sale of motor vehicles. Nowadays, the industry is marked by intense competition between big auto groups fighting for higher profits and larger market shares. The key players in the automotive market operate at a global scale in a highly competitive environment. In recent years, Toyota Motor and Volkswagen Group have proved to be the main competitors. The aim of our paper is to analyze the evolution of the world's largest automakers in the period 2013-2014. The research type is literature review.

  4. Affective Value in the Predictive Mind

    OpenAIRE

    Van de Cruys, Sander

    2017-01-01

    Although affective value is fundamental in explanations of behavior, it is still a somewhat alien concept in cognitive science. It implies a normativity or directionality that mere information processing models cannot seem to provide. In this paper we trace how affective value can emerge from information processing in the brain, as described by predictive processing. We explain the grounding of predictive processing in homeostasis, and articulate the implications this has for the concept of r...

  5. Improving runoff prediction using agronomical information in a cropped, loess covered catchment

    NARCIS (Netherlands)

    Lefrancq, Marie; Van Dijk, Paul; Jetten, Victor; Schwob, Matthieu; Payraudeau, Sylvain

    2017-01-01

    Predicting runoff hot spots and hot-moments within a headwater crop-catchment is of the utmost importance to reduce adverse effects on aquatic ecosystems by adapting land use management to control runoff. Reliable predictions of runoff patterns during a crop growing season remain challenging. This

  6. Nuclear Information and Knowledge. News from the Nuclear Information Section, No. 12, March 2012

    International Nuclear Information System (INIS)

    2012-03-01

    This issue of the Nuclear Information and Knowledge newsletter is devoted to the topic of constant change. We start with a summary of INIS and IAEA Library activities in 2011, write about the introduction of the new INIS search system based on Google technology, and continue with the restructuring of the Department of Nuclear Energy (NE). This restructuring included the establishment of a new section to deal with synergetic aspects of information management in the form of modern library services combined with a powerful, and one of the world's largest, collections of published information on the peaceful uses of nuclear science and technology. Articles on the International Nuclear Library Network (INLN) and INIS in the World show just some of the ways to bring NIS products and services closer to the world of scientists, researchers and students around the world, while an article on eBooks in Libraries talks about a future beyond circulating collections.

  7. Nuclear Information and Knowledge. News from the Nuclear Information Section, No. 12, March 2012

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-03-15

    This issue of the Nuclear Information and Knowledge newsletter is devoted to the topic of constant change. We start with a summary of INIS and IAEA Library activities in 2011, write about the introduction of the new INIS search system based on Google technology, and continue with the restructuring of the Department of Nuclear Energy (NE). This restructuring included the establishment of a new section to deal with synergetic aspects of information management in the form of modern library services combined with a powerful, and one of the world's largest, collections of published information on the peaceful uses of nuclear science and technology. Articles on the International Nuclear Library Network (INLN) and INIS in the World show just some of the ways to bring NIS products and services closer to the world of scientists, researchers and students around the world, while an article on eBooks in Libraries talks about a future beyond circulating collections.

  8. Five-factor model personality disorder prototypes in a community sample: self- and informant-reports predicting interview-based DSM diagnoses.

    Science.gov (United States)

    Lawton, Erin M; Shields, Andrew J; Oltmanns, Thomas F

    2011-10-01

    The need for an empirically validated, dimensional system of personality disorders is becoming increasingly apparent. While a number of systems have been investigated in this regard, the five-factor model of personality has demonstrated the ability to adequately capture personality pathology. In particular, the personality disorder prototypes developed by Lynam and Widiger (2001) have been tested in a number of samples. The goal of the present study is to extend this literature by validating the prototypes in a large, representative community sample of later middle-aged adults using both self and informant reports. We found that the prototypes largely work well in this age group. Schizoid, Borderline, Histrionic, Narcissistic, and Avoidant personality disorders demonstrate good convergent validity, with a particularly strong pattern of discriminant validity for the latter four. Informant-reported prototypes show similar patterns to self reports for all analyses. This demonstrates that informants are not succumbing to halo representations of the participants, but are rather describing participants in nuanced ways. Importantly, informant reports add significant predictive validity for Schizoid, Antisocial, Borderline, Histrionic, and Narcissistic personality disorders. Implications of our results and directions for future research are discussed.

  9. Five-Factor Model personality disorder prototypes in a community sample: Self- and informant-reports predicting interview-based DSM diagnoses

    Science.gov (United States)

    Lawton, Erin M.; Shields, Andrew J.; Oltmanns, Thomas F.

    2011-01-01

    The need for an empirically-validated, dimensional system of personality disorders is becoming increasingly apparent. While a number of systems have been investigated in this regard, the five-factor model of personality has demonstrated the ability to adequately capture personality pathology. In particular, the personality disorder prototypes developed by Lynam and Widiger (2001) have been tested in a number of samples. The goal of the present study is to extend this literature by validating the prototypes in a large, representative community sample of later middle-aged adults using both self and informant reports. We found that the prototypes largely work well in this age group. Schizoid, Borderline, Histrionic, Narcissistic, and Avoidant personality disorders demonstrate good convergent validity, with a particularly strong pattern of discriminant validity for the latter four. Informant-reported prototypes show similar patterns to self reports for all analyses. This demonstrates that informants are not succumbing to halo representations of the participants, but are rather describing participants in nuanced ways. Importantly, informant reports add significant predictive validity for Schizoid, Antisocial, Borderline, Histrionic, and Narcissistic personality disorders. Implications of our results and directions for future research are discussed. PMID:22200006

  10. Predictive maintenance primer

    International Nuclear Information System (INIS)

    Flude, J.W.; Nicholas, J.R.

    1991-04-01

    This Predictive Maintenance Primer provides utility plant personnel with a single-source reference to predictive maintenance analysis methods and technologies used successfully by utilities and other industries. It is intended to be a ready reference to personnel considering starting, expanding or improving a predictive maintenance program. This Primer includes a discussion of various analysis methods and how they overlap and interrelate. Additionally, eighteen predictive maintenance technologies are discussed in sufficient detail for the user to evaluate the potential of each technology for specific applications. This document is designed to allow inclusion of additional technologies in the future. To gather the information necessary to create this initial Primer the Nuclear Maintenance Applications Center (NMAC) collected experience data from eighteen utilities plus other industry and government sources. NMAC also contacted equipment manufacturers for information pertaining to equipment utilization, maintenance, and technical specifications. The Primer includes a discussion of six methods used by analysts to study predictive maintenance data. These are: trend analysis; pattern recognition; correlation; test against limits or ranges; relative comparison data; and statistical process analysis. Following the analysis methods discussions are detailed descriptions for eighteen technologies analysts have found useful for predictive maintenance programs at power plants and other industrial facilities. Each technology subchapter has a description of the operating principles involved in the technology, a listing of plant equipment where the technology can be applied, and a general description of the monitoring equipment. Additionally, these descriptions include a discussion of results obtained from actual equipment users and preferred analysis techniques to be used on data obtained from the technology. 5 refs., 30 figs

  11. Quantitative CT analysis of pulmonary pure ground-glass nodule predicts histological invasiveness

    Energy Technology Data Exchange (ETDEWEB)

    Li, Qiong, E-mail: liqiongsmmu2008@qq.com [Department of Radiology, Changzheng Hospital, Second Military Medical University, NO. 415, Fengyang Road, Shanghai 200003 (China); Fan, Li, E-mail: fanli0930@163.com [Department of Radiology, Changzheng Hospital, Second Military Medical University, NO. 415, Fengyang Road, Shanghai 200003 (China); Cao, En-Tao, E-mail: cet123cs@126.com [Department of Radiology, Suzhou Municipal Hospital (East District), No.16 West Baita Road, Suzhu, Jiangsu Province 215001 (China); Li, Qing-Chu, E-mail: Wudi327@hotmail.com [Department of Radiology, Changzheng Hospital, Second Military Medical University, NO. 415, Fengyang Road, Shanghai 200003 (China); Gu, Ya-Feng, E-mail: 2528473557@qq.com [Department of Radiology, Changzheng Hospital, Second Military Medical University, NO. 415, Fengyang Road, Shanghai 200003 (China); Liu, Shi−Yuan, E-mail: liusy1186@163.com [Department of Radiology, Changzheng Hospital, Second Military Medical University, NO. 415, Fengyang Road, Shanghai 200003 (China)

    2017-04-15

    Objective: To assess whether quantitative computed tomography (CT) can help predict histological invasiveness of pulmonary adenocarcinoma appearing as pure ground-glass nodules (pGGNs). Methods: A total of 110 pulmonary pGGNs were retrospectively evaluated and pathologically classified as pre-invasive lesions, minimally invasive adenocarcinoma (MIA), or invasive pulmonary adenocarcinoma (IPA). Maximum nodule diameters, largest cross-sectional areas, volumes, mean CT values, weights, and CT attenuation values at the 0th, 2nd, 5th, 25th, 50th, 75th, 95th, 98th, and 100th percentiles on the histogram, as well as the 2nd-98th, 5th-95th, 25th-75th, and 0th-100th percentile slopes, were compared among the three groups. Results: Of the 110 pGGNs, 50, 28, and 32 were pre-invasive lesions, MIA, and IPA, respectively. Maximum nodule diameters, largest cross-sectional areas, and mass weights were significantly larger in the IPA group than in pre-invasive lesions. The 95th, 98th, and 100th percentiles, and the 2nd-98th, 25th-75th, and 0th-100th slopes, differed significantly between pre-invasive lesions and MIA or IPA. Logistic regression analysis showed that the maximum nodule diameter (OR = 1.21, 95%CI: 1.071-1.366, p < 0.01) and the 100th percentile on the histogram (OR = 1.02, 95%CI: 1.009-1.032, p < 0.001) independently predicted histological invasiveness. Conclusions: Quantitative analysis of CT imaging can predict the histological invasiveness of pGGNs, especially via the maximum nodule diameter and the 100th percentile on the CT number histogram; this can inform long-term follow-up and selective surgical management.
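
    The histogram features described in the Methods can be sketched as follows; the exact definition of the percentile "slopes" is an assumption here (HU difference divided by percentile span), and the function name is illustrative:

```python
import numpy as np

def histogram_features(hu_values,
                       percentiles=(0, 2, 5, 25, 50, 75, 95, 98, 100)):
    """Compute CT-number histogram percentiles and slope features for a nodule.

    hu_values : 1-D array of CT attenuation values (HU) inside the segmented
                pGGN; the segmentation itself is assumed to be done elsewhere.
    Returns (percentile dict, slope dict), where a slope is taken to be the
    HU difference between two percentiles divided by the percentile span.
    """
    pct = {p: float(np.percentile(hu_values, p)) for p in percentiles}
    slopes = {}
    for lo, hi in [(2, 98), (5, 95), (25, 75), (0, 100)]:
        slopes[f"{lo}-{hi}"] = (pct[hi] - pct[lo]) / (hi - lo)
    return pct, slopes
```

    Features such as these, together with the maximum diameter, would then feed the logistic regression model.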

  12. Integrating predictive information into an agro-economic model to guide agricultural management

    Science.gov (United States)

    Zhang, Y.; Block, P.

    2016-12-01

    Skillful season-ahead climate predictions linked with responsive agricultural planning and management have the potential to reduce losses, if adopted by farmers, particularly for rainfed-dominated agriculture such as in Ethiopia. Precipitation predictions during the growing season in major agricultural regions of Ethiopia are used to generate predicted climate yield factors, which reflect the influence of precipitation amounts on crop yields and serve as inputs into an agro-economic model. The adapted model, originally developed by the International Food Policy Research Institute, produces outputs of economic indices (GDP, poverty rates, etc.) at zonal and national levels. Forecast-based approaches, in which farmers' actions are in response to forecasted conditions, are compared with no-forecast approaches in which farmers follow business as usual practices, expecting "average" climate conditions. The effects of farmer adoption rates, including the potential for reduced uptake due to poor predictions, and increasing forecast lead-time on economic outputs are also explored. Preliminary results indicate superior gains under forecast-based approaches.

  13. Innovative Information Systems in the Intensive Care Unit, King Saud Medical City in Saudi Arabia.

    Science.gov (United States)

    Al Saleem, Nouf; Al Harthy, Abdulrahman

    2015-01-01

    The purpose of this paper is to discuss the experience of implementing innovative information technology to improve the quality of services in one of the largest Intensive Care Units in Saudi Arabia. The Intensive Care Unit at King Saud Medical City (ICU-KSMC) is the main ICU in the kingdom representing the Ministry of Health. KSMC's ICU is also considered one of the largest ICUs in the world, as it consists of six units with 129 beds. Leaders in KSMC's ICU have introduced and integrated three information technologies to produce powerful, accurate, and timely information systems to overcome the challenges of the ICU environment and improve the quality of service to ensure patient safety. By 2015, the ICU at KSMC had noticed remarkable improvements in bed occupation and utilization, staff communication, reduction of medical errors, and departmental work flow, which created a healthy professional work environment. The ICU at KSMC also has ongoing improvement projects, including future plans to implement more innovative information technologies in the department.

  14. Evaluating Predictive Uncertainty of Hyporheic Exchange Modelling

    Science.gov (United States)

    Chow, R.; Bennett, J.; Dugge, J.; Wöhling, T.; Nowak, W.

    2017-12-01

    Hyporheic exchange is the interaction of water between rivers and groundwater, and is difficult to predict. One of the largest contributions to predictive uncertainty for hyporheic fluxes has been attributed to the representation of heterogeneous subsurface properties. This research aims to evaluate which aspect of the subsurface representation, the spatial distribution of hydrofacies or the model for local-scale (within-facies) heterogeneity, most influences the predictive uncertainty. We also seek to identify the data types that best help reduce this uncertainty. For this investigation, we conduct a modelling study of the Steinlach River meander in Southwest Germany, an experimental site established in 2010 to monitor hyporheic exchange at the meander scale. We use HydroGeoSphere, a fully integrated surface water-groundwater model, to model hyporheic exchange and to assess the predictive uncertainty of hyporheic exchange transit times (HETT). A highly parameterized complex model is built and treated as `virtual reality', which is in turn modelled with simpler subsurface parameterization schemes (Figure). Monte-Carlo simulations with these models are then used to estimate the predictive uncertainty. Results indicate that uncertainty in HETT is relatively small for early times and increases with transit time; uncertainty from local-scale heterogeneity is negligible compared to uncertainty in the hydrofacies distribution; introducing more data to a poor model structure may reduce predictive variance, but does not reduce predictive bias; and hydraulic head observations alone cannot constrain the uncertainty of HETT, whereas an estimate of hyporheic exchange flux proves more effective at reducing this uncertainty. Figure: Approach for evaluating predictive model uncertainty. A conceptual model is first developed from the field investigations. A complex model (`virtual reality') is then developed based on that conceptual model
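
    The Monte-Carlo estimation of predictive uncertainty described above can be sketched generically; the toy `model` callable stands in for a full HydroGeoSphere run, and all names are illustrative assumptions:

```python
import numpy as np

def monte_carlo_uncertainty(model, param_sampler, n_runs=200,
                            truth=None, seed=1):
    """Estimate predictive uncertainty of a model output by Monte Carlo.

    model         : callable mapping a parameter vector to a scalar prediction
                    (stand-in for a full hyporheic-exchange simulator)
    param_sampler : callable(rng) returning one random parameter vector
    truth         : optional 'virtual reality' value, used to compute bias
    Returns (ensemble mean, ensemble variance, bias) of the predictions.
    """
    rng = np.random.default_rng(seed)
    preds = np.array([model(param_sampler(rng)) for _ in range(n_runs)])
    mean, var = preds.mean(), preds.var(ddof=1)
    bias = None if truth is None else mean - truth
    return mean, var, bias
```

    Comparing the ensemble variance and bias across parameterization schemes is one way to separate the effect of hydrofacies structure from local-scale heterogeneity.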

  15. The largest Fresco in Europe on cooling tower of nuclear power station of Cruas Meysse in Ardeche, France

    International Nuclear Information System (INIS)

    Di Mayo, J.L.

    1993-01-01

    The Cruas Meysse nuclear power station lies on the most important communication route of France, in the Rhone Valley, between the Rhine and the Mediterranean Sea. In the south of the Rhone Valley, the plant is situated near the very important site of 'Tricastin', the largest nuclear area in France. Cruas Meysse is well integrated into the economic, social, and cultural life of the region; that is why EDF and the Ardeche Department entered into a partnership to associate the art and technology of our time and offer a work for everybody. 'Le Verseau' is the largest fresco in Europe. It gives a gigantic landmark to the Ardeche Department, because the nuclear power station occupies a very visible position, close to the A7 motorway, the National 7 road, and the high-speed train (TGV) line, another symbol of high French technology.

  16. Predicting Climate Change Impacts to the Canadian Boreal Forest

    Directory of Open Access Journals (Sweden)

    Trisalyn A. Nelson

    2014-03-01

    Full Text Available Climate change is expected to alter temperature, precipitation, and seasonality with potentially acute impacts on Canada’s boreal. In this research we predicted future spatial distributions of biodiversity in Canada’s boreal for 2020, 2050, and 2080 using indirect indicators derived from remote sensing and based on vegetation productivity. Vegetation productivity indices, representing annual amounts and variability of greenness, have been shown to relate to tree and wildlife richness in Canada’s boreal. Relationships between historical satellite-derived productivity and climate data were applied to modelled scenarios of future climate to predict and map potential future vegetation productivity for 592 regions across Canada. Results indicated that the pattern of vegetation productivity will become more homogenous, particularly west of Hudson Bay. We expect climate change to impact biodiversity along north/south gradients and by 2080 vegetation distributions will be dominated by processes of seasonality in the north and a combination of cumulative greenness and minimum cover in the south. The Hudson Plains, which host the world’s largest and most contiguous wetland, are predicted to experience less seasonality and more greenness. The spatial distribution of predicted trends in vegetation productivity was emphasized over absolute values, in order to support regional biodiversity assessments and conservation planning.

  17. Collision prediction software for radiotherapy treatments

    Energy Technology Data Exchange (ETDEWEB)

    Padilla, Laura [Virginia Commonwealth University Medical Center, Richmond, Virginia 23298 (United States); Pearson, Erik A. [Techna Institute and the Princess Margaret Cancer Center, University Health Network, Toronto, Ontario M5G 2M9 (Canada); Pelizzari, Charles A., E-mail: c-pelizzari@uchicago.edu [Department of Radiation and Cellular Oncology, The University of Chicago, Chicago, Illinois 60637 (United States)

    2015-11-15

    Purpose: This work presents a method of collision prediction for external beam radiotherapy using surface imaging. The methodology focuses on collision prediction during treatment simulation, to evaluate the clearance of a patient's treatment position and allow for its modification if necessary. Methods: A Kinect camera (Microsoft, Redmond, WA) is used to scan the patient and immobilization devices in the treatment position at the simulator. The surface is reconstructed using the SKANECT software (Occipital, Inc., San Francisco, CA). The treatment isocenter is marked using simulated orthogonal lasers projected on the surface scan. The point cloud of this surface is then shifted to isocenter and converted from Cartesian to cylindrical coordinates. A slab models the treatment couch. A cylinder with a radius equal to the normal distance from isocenter to the collimator plate, and a height defined by the collimator diameter, is used to estimate collisions. Points within the cylinder clear through a full gantry rotation with the treatment couch at 0°, while points outside of it collide. The angles of collision are reported. This methodology was experimentally verified using a mannequin positioned in an alpha cradle with both arms up. A planning CT scan of the mannequin was performed, two isocenters were marked in PINNACLE, and this information was exported to AlignRT (VisionRT, London, UK), a surface imaging system for patient positioning. This was used to ensure accurate positioning of the mannequin in the treatment room, when available. Collision calculations were performed for the two treatment isocenters and the results compared to the collisions detected in the room. The accuracy of the Kinect-Skanect surface was evaluated by comparing it to the external surface of the planning CT scan. Results: Experimental verification results showed that the predicted angles of collision matched those recorded in the room within 0.5°, in most cases (largest deviation
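
    The geometric clearance test described in the Methods (shift the surface point cloud to isocenter, convert to cylindrical coordinates about the gantry rotation axis, and flag points outside a clearance cylinder) can be sketched as follows; the axis convention and function names are assumptions, not the authors' implementation:

```python
import numpy as np

def collision_angles(points_xyz, clearance_radius, collimator_width):
    """Flag surface points that would collide during a full gantry rotation.

    points_xyz       : (N, 3) point cloud in mm, already shifted so the
                       treatment isocenter is at the origin; the y-axis is
                       assumed to be the gantry rotation axis
    clearance_radius : normal distance from isocenter to the collimator plate
    collimator_width : extent of the collimator along the rotation axis
    Returns the sorted gantry angles (deg) at which colliding points sit.
    """
    x, y, z = points_xyz.T
    # Cylindrical coordinates about the rotation axis
    radius = np.hypot(x, z)
    # Only points within the collimator's axial extent can be struck
    inside_slab = np.abs(y) <= collimator_width / 2.0
    colliding = inside_slab & (radius >= clearance_radius)
    angles = np.degrees(np.arctan2(x[colliding], z[colliding])) % 360.0
    return np.sort(angles)
```

    Points inside the cylinder clear a full rotation; the reported angles correspond to the gantry positions at which the colliding points lie.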

  18. Information Processing in Nursing Information Systems: An Evaluation Study from a Developing Country.

    Science.gov (United States)

    Samadbeik, Mahnaz; Shahrokhi, Nafiseh; Saremian, Marzieh; Garavand, Ali; Birjandi, Mahdi

    2017-01-01

    In recent years, information technology has been introduced in the nursing departments of many hospitals to support their daily tasks. Nurses are the largest end-user group in Hospital Information Systems (HISs). This study was designed to evaluate data processing in the Nursing Information Systems (NISs) utilized in many university hospitals in Iran. This was a cross-sectional study. The population comprised all nurse managers and NIS users of the five training hospitals in Khorramabad city (N = 71). The nursing subset of the HIS-Monitor questionnaire was used to collect the data. Data were analyzed by the descriptive-analytical method and inductive content analysis. The results indicated that the nurses participating in the study did not take desirable advantage of paper-based (2.02) or computerized (2.34) information processing tools to perform nursing tasks. Moreover, the less work experience nurses had, the more they used computer tools for processing patient discharge information. The "readability of patient information" and "repetitive and time-consuming documentation" were stated by the participating nurses as the most important expectation and problem regarding the HIS, respectively. The nurses participating in the present study used paper and computerized information processing tools together to perform nursing practices. Therefore, it is recommended that nursing process redesign coincide with NIS implementation in health care centers.

  19. Astronomer's new guide to the galaxy: largest map of cold dust revealed

    Science.gov (United States)

    2009-07-01

    visible from the APEX site on Chajnantor, as well as combining it with infrared observations to be made by the ESA Herschel Space Observatory. We look forward to new discoveries made with these maps, which will also serve as a guide for future observations with ALMA", said Leonardo Testi from ESO, who is a member of the ATLASGAL team and the European Project Scientist for the ALMA project. Note [1] The map was constructed from individual APEX observations in radiation at 870 µm (0.87 mm) wavelength. More information: The ATLASGAL observations are presented in a paper by Frederic Schuller et al., ATLASGAL -- The APEX Telescope Large Area Survey of the Galaxy at 870 µm, published in Astronomy & Astrophysics. ATLASGAL is a collaboration between the Max Planck Institute for Radio Astronomy, the Max Planck Institute for Astronomy, ESO, and the University of Chile. LABOCA (Large APEX Bolometer Camera), one of APEX's major instruments, is the world's largest bolometer camera (a "thermometer camera", or thermal camera that measures and maps the tiny changes in temperature that occur when sub-millimetre wavelength light falls on its absorbing surface; see ESO 35/07). LABOCA's large field of view and high sensitivity make it an invaluable tool for imaging the "cold Universe". LABOCA was built by the Max Planck Institute for Radio Astronomy. The Atacama Pathfinder Experiment (APEX) telescope is a 12-metre telescope, located at 5100 m altitude on the arid plateau of Chajnantor in the Chilean Andes. APEX operates at millimetre and submillimetre wavelengths. This wavelength range is a relatively unexplored frontier in astronomy, requiring advanced detectors and an extremely high and dry observatory site, such as Chajnantor. APEX, the largest submillimetre-wave telescope operating in the southern hemisphere, is a collaboration between the Max Planck Institute for Radio Astronomy, the Onsala Space Observatory and ESO. Operation of APEX at Chajnantor is entrusted to ESO. 
APEX is a

  20. Information density converges in dialogue: Towards an information-theoretic model.

    Science.gov (United States)

    Xu, Yang; Reitter, David

    2018-01-01

    The principle of entropy rate constancy (ERC) states that language users distribute information such that words tend to be equally predictable given previous contexts. We examine the applicability of this principle to spoken dialogue, as previous findings primarily rest on written text. The study takes into account the joint-activity nature of dialogue and the topic-shift mechanisms that differ from monologue. It examines how the information contributions from the two dialogue partners interactively evolve as the discourse develops. The increase of local sentence-level information density (predicted by ERC) is shown to apply to dialogue overall. However, when the different roles of interlocutors in introducing new topics are identified, their contribution in information content displays a new converging pattern. We explain this pattern from multiple perspectives. First, casting dialogue as an information exchange system suggests that the pattern results from the two interlocutors maintaining their own context rather than sharing one. Second, we present some empirical evidence that a model of Interactive Alignment may include information density to explain the effect. Third, we argue that building common ground is a process analogous to information convergence. Thus, we put forward an information-theoretic view of dialogue, under which some existing theories of human dialogue may eventually be unified. Copyright © 2017 Elsevier B.V. All rights reserved.
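
The sentence-level information density that ERC concerns is usually estimated as mean per-word surprisal under a language model. A minimal sketch under a toy unigram model (the paper's actual estimators are far stronger language models; function and variable names here are illustrative):

```python
import math
from collections import Counter

def sentence_information(sentences):
    """Mean per-word information content (bits/word) of each sentence
    under a unigram model estimated from the whole corpus. A toy
    stand-in for the sentence-level density measures used in ERC work."""
    words = [w for s in sentences for w in s.split()]
    counts = Counter(words)
    total = len(words)

    def surprisal(w):  # -log2 P(w)
        return -math.log2(counts[w] / total)

    return [sum(surprisal(w) for w in s.split()) / len(s.split())
            for s in sentences]
```

Under ERC, these per-sentence values should stay roughly flat as a text or dialogue unfolds; the paper studies how they evolve per interlocutor.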

  1. Comparison of statistical and clinical predictions of functional outcome after ischemic stroke.

    Directory of Open Access Journals (Sweden)

    Douglas D Thompson

    To determine whether predictions of functional outcome after ischemic stroke made at the bedside using a doctor's clinical experience were more or less accurate than the predictions made by clinical prediction models (CPMs). A prospective cohort study of 931 ischemic stroke patients recruited consecutively at the outpatient, inpatient and emergency departments of the Western General Hospital, Edinburgh between 2002 and 2005. Doctors made informal predictions of six-month functional outcome on the Oxford Handicap Scale (OHS). Patients were followed up at six months with a validated postal questionnaire. For each patient we calculated the absolute predicted risk of death or dependence (OHS≥3) using five previously described CPMs. The specificity of a doctor's informal predictions of OHS≥3 at six months was good (0.96; 95% CI: 0.94 to 0.97) and similar to that of the CPMs (range 0.94 to 0.96); however, the sensitivity of both informal clinical predictions (0.44; 95% CI: 0.39 to 0.49) and clinical prediction models (range 0.38 to 0.45) was poor. The prediction of the level of disability after stroke was similar for informal clinical predictions (ordinal c-statistic 0.74; 95% CI: 0.72 to 0.76) and CPMs (range 0.69 to 0.75). No patient or clinician characteristic affected the accuracy of informal predictions, though predictions were more accurate in outpatients. CPMs are at least as good as informal clinical predictions in discriminating between good and bad functional outcome after ischemic stroke. The place of these models in clinical practice has yet to be determined.
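
The sensitivity and specificity figures compared above come from a standard 2x2 confusion matrix over predicted versus observed binary outcomes. A minimal sketch of that computation (coding and names are illustrative, not from the study):

```python
def sensitivity_specificity(predicted, observed):
    """Sensitivity and specificity of binary predictions of a poor
    outcome (e.g. OHS>=3), coded 1 = poor outcome, 0 = good outcome."""
    tp = sum(1 for p, o in zip(predicted, observed) if p == 1 and o == 1)
    tn = sum(1 for p, o in zip(predicted, observed) if p == 0 and o == 0)
    fp = sum(1 for p, o in zip(predicted, observed) if p == 1 and o == 0)
    fn = sum(1 for p, o in zip(predicted, observed) if p == 0 and o == 1)
    return tp / (tp + fn), tn / (tn + fp)
```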

  2. Energy Information Data Base: corporate author entries

    International Nuclear Information System (INIS)

    1978-06-01

    The DOE Energy Information Data Base has been created and is maintained by the DOE Technical Information Center. One of the controls for information entered into the base is the standardized name of the corporate entity or the corporate author. The purpose of this list of authorized or standardized corporate entries is to provide a means for the consistent citing of the names of organizations in bibliographic records. It also serves as a guide for users who retrieve information from a bibliographic data base and who want to locate information originating in particular organizations. This authority is a combination of entries established by the Technical Information Center and the International Atomic Energy Agency's International Nuclear Information System (INIS). The format calls, in general, for the name of the organization represented by the literature being cataloged to be cited as follows: the largest element, the place, the smallest element, e.g., Brigham Young Univ., Provo, Utah (USA), Dept. of Chemical Engineering. Code numbers are assigned to each entry to provide manipulation by computer. Cross references are used to reflect name changes and invalid entries

  3. Satisfaction with information provided to Danish cancer patients

    DEFF Research Database (Denmark)

    Ross, Lone; Petersen, Morten Aagaard; Johnsen, Anna Thit

    2013-01-01

    To validate five items (CPWQ-inf) regarding satisfaction with information provided to cancer patients from health care staff, assess the prevalence of dissatisfaction with this information, and identify factors predicting dissatisfaction....

  4. Predicting the Location and Time of Mobile Phone Users by Using Sequential Pattern Mining Techniques

    DEFF Research Database (Denmark)

    Ozer, Mert; Keles, Ilkcan; Toroslu, Hakki

    2016-01-01

    In recent years, using cell phone log data to model human mobility patterns became an active research area. This problem is a challenging data mining problem due to huge size and non-uniformity of the log data, which introduces several granularity levels for the specification of temporal...... and spatial dimensions. This paper focuses on the prediction of the location of the next activity of the mobile phone users. There are several versions of this problem. In this work, we have concentrated on the following three problems: predicting the location and the time of the next user activity...... the success of these methods with real data obtained from one of the largest mobile phone operators in Turkey. Our results are very encouraging, since we were able to obtain quite high accuracy results under small prediction sets....
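
A common baseline for this kind of next-location prediction is a first-order Markov model over the user's location history, returning a small prediction set of the most likely next cells. A hedged sketch (not the paper's sequential-pattern-mining method; location names are invented):

```python
from collections import Counter, defaultdict

def train_transitions(history):
    """First-order Markov model of movement: for each location, count
    which location the user visited next. `history` is a time-ordered
    list of cell/location IDs."""
    transitions = defaultdict(Counter)
    for here, there in zip(history, history[1:]):
        transitions[here][there] += 1
    return transitions

def predict_next(transitions, current, k=1):
    """Return a prediction set: the k most frequent successors."""
    return [loc for loc, _ in transitions[current].most_common(k)]
```

Larger k trades a bigger prediction set for higher hit rates, which is the accuracy-versus-set-size tension the abstract mentions.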

  5. Mauna Loa--history, hazards and risk of living with the world's largest volcano

    Science.gov (United States)

    Trusdell, Frank A.

    2012-01-01

    Mauna Loa on the Island Hawaiʻi is the world’s largest volcano. People residing on its flanks face many hazards that come with living on or near an active volcano, including lava flows, explosive eruptions, volcanic smog, damaging earthquakes, and local tsunami (giant seawaves). The County of Hawaiʻi (Island of Hawaiʻi) is the fastest growing County in the State of Hawaii. Its expanding population and increasing development mean that risk from volcano hazards will continue to grow. U.S. Geological Survey (USGS) scientists at the Hawaiian Volcano Observatory (HVO) closely monitor and study Mauna Loa Volcano to enable timely warning of hazardous activity and help protect lives and property.

  6. Organisational Information Security Strategy: Review, Discussion and Future Research

    Directory of Open Access Journals (Sweden)

    Craig A. Horne

    2017-05-01

    Dependence on information, including for some of the world’s largest organisations such as governments and multi-national corporations, has grown rapidly in recent years. However, reports of information security breaches and their associated consequences indicate that attacks are escalating on organisations conducting these information-based activities. Organisations need to formulate strategy to secure their information; however, gaps exist in knowledge. Through a thematic review of academic security literature, we analyse (1) the antecedent conditions that motivate the adoption of a comprehensive information security strategy, (2) the conceptual elements of strategy, and (3) the benefits that are enjoyed post-adoption. Our contributions include a definition of information security strategy that moves from an internally-focussed protection of information towards a strategic view that considers the organisation, its resources and capabilities, and its external environment. Our findings are then used to suggest future research directions.

  7. Semi-supervised prediction of SH2-peptide interactions from imbalanced high-throughput data.

    Science.gov (United States)

    Kundu, Kousik; Costa, Fabrizio; Huber, Michael; Reth, Michael; Backofen, Rolf

    2013-01-01

    Src homology 2 (SH2) domains are the largest family of the peptide-recognition modules (PRMs) that bind to phosphotyrosine-containing peptides. Knowledge about binding partners of SH2 domains is key for a deeper understanding of different cellular processes. Given the high binding specificity of SH2, in-silico ligand peptide prediction is of great interest. Currently, however, only a few approaches have been published for the prediction of SH2-peptide interactions. Their main shortcomings range from limited coverage, to restrictive modeling assumptions (they are mainly based on position-specific scoring matrices and do not take into consideration complex amino acid inter-dependencies), to high computational complexity. We propose a simple yet effective machine learning approach for a large set of known human SH2 domains. We used comprehensive data from micro-array and peptide-array experiments on 51 human SH2 domains. In order to deal with the high data imbalance and the high signal-to-noise ratio, we cast the problem in a semi-supervised setting. We report competitive predictive performance w.r.t. the state-of-the-art. Specifically, we obtain 0.83 AUC ROC and 0.93 AUC PR, in comparison to 0.71 AUC ROC and 0.87 AUC PR previously achieved by the position-specific scoring matrix (PSSM) based SMALI approach. Our work provides three main contributions. First, we showed that better models can be obtained when the information on the non-interacting peptides (negative examples) is also used. Second, we improve performance when considering high-order correlations between the ligand positions, employing regularization techniques to effectively avoid overfitting issues. Third, we developed an approach to tackle the data imbalance problem using a semi-supervised strategy. Finally, we performed a genome-wide prediction of human SH2-peptide binding, uncovering several findings of biological relevance. We make our models and genome-wide predictions, for all the 51 SH2
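
The AUC ROC figures cited can be computed directly with the rank-based (Mann-Whitney) identity, which is well suited to imbalanced data because it only compares positives against negatives. A small sketch, independent of the paper's models:

```python
def auc_roc(scores, labels):
    """AUC ROC via the Mann-Whitney identity: the probability that a
    randomly chosen positive example outscores a randomly chosen
    negative one (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```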

  8. Estimating prognosis at the time of repeat whole brain radiation therapy for multiple brain metastases: The reirradiation score

    Directory of Open Access Journals (Sweden)

    Natalie Logie, MD

    2017-07-01

    Conclusions: In the largest reported cohort to receive repeat WBRT, application of the RPA score was not predictive of MS. The new ReRT score is a simple tool based on readily available clinical information.

  9. Perceived Physician-informed Weight Status Predicts Accurate Weight Self-Perception and Weight Self-Regulation in Low-income, African American Women.

    Science.gov (United States)

    Harris, Charlie L; Strayhorn, Gregory; Moore, Sandra; Goldman, Brian; Martin, Michelle Y

    2016-01-01

    Obese African American women under-appraise their body mass index (BMI) classification and report fewer weight loss attempts than women who accurately appraise their weight status. This cross-sectional study examined whether physician-informed weight status could predict weight self-perception and weight self-regulation strategies in obese women. A convenience sample of 118 low-income women completed a survey assessing demographic characteristics, comorbidities, weight self-perception, and weight self-regulation strategies. BMI was calculated during nurse triage. Binary logistic regression models were performed to test hypotheses. The odds of obese accurate appraisers having been informed about their weight status were six times greater than those of under-appraisers. The odds of those using an "approach" self-regulation strategy having been physician-informed were four times greater compared with those using an "avoidance" strategy. Physicians are uniquely positioned to influence accurate weight self-perception and adaptive weight self-regulation strategies in underserved women, reducing their risk for obesity-related morbidity.

  10. Selecting Suitable Candidates for Predictive Maintenance

    NARCIS (Netherlands)

    Tiddens, Wieger Willem; Braaksma, Anne Johannes Jan; Tinga, Tiedo

    2018-01-01

    Predictive maintenance (PdM) or Prognostics and Health Management (PHM) assists in better predicting the future state of physical assets and making timely and better-informed maintenance decisions. Many companies nowadays aspire to implement such an advanced maintenance policy. However,

  11. The relative value of operon predictions

    NARCIS (Netherlands)

    Brouwer, Rutger W. W.; Kuipers, Oscar P.; van Hijum, Sacha A. F. T.

    For most organisms, computational operon predictions are the only source of genome-wide operon information. Operon prediction methods described in literature are based on (a combination of) the following five criteria: (i) intergenic distance, (ii) conserved gene clusters, (iii) functional relation,

  12. Investigation of Science Faculty with Education Specialties within the Largest University System in the United States

    OpenAIRE

    Bush, Seth D; Pelaez, Nancy; Rudd, James A, II; Stevens, Michael T; Tanner, Kimberly D; Williams, Kathy, PhD

    2011-01-01

    Efforts to improve science education include university science departments hiring Science Faculty with Education Specialties (SFES), scientists who take on specialized roles in science education within their discipline. Although these positions have existed for decades and may be growing more common, few reports have investigated the SFES approach to improving science education. We present comprehensive data on the SFES in the California State University (CSU) system, the largest university ...

  13. Mapping Dynamics of Inundation Patterns of Two Largest River-Connected Lakes in China: A Comparative Study

    OpenAIRE

    Guiping Wu; Yuanbo Liu

    2016-01-01

    Poyang Lake and Dongting Lake are the two largest freshwater lakes in China. The lakes are located approximately 300 km apart on the middle reaches of the Yangtze River and are differently connected through their respective tributary systems, which will lead to different river–lake water exchanges and discharges. Thus, differences in their morphological and hydrological conditions should induce individual lake spatio-temporal inundation patterns. Quantitative comparative analyses of the dynam...

  14. Predicting fractional bed load transport rates: Application of the Wilcock‐Crowe equations to a regulated gravel bed river

    Science.gov (United States)

    Gaeuman, David; Andrews, E.D.; Krause, Andreas; Smith, Wes

    2009-01-01

    Bed load samples from four locations in the Trinity River of northern California are analyzed to evaluate the performance of the Wilcock‐Crowe bed load transport equations for predicting fractional bed load transport rates. Bed surface particles become smaller and the fraction of sand on the bed increases with distance downstream from Lewiston Dam. The dimensionless reference shear stress for the mean bed particle size (τ*rm) is largest near the dam, but varies relatively little between the more downstream locations. The relation between τ*rm and the reference shear stresses for other size fractions is constant across all locations. Total bed load transport rates predicted with the Wilcock‐Crowe equations are within a factor of 2 of sampled transport rates for 68% of all samples. The Wilcock‐Crowe equations nonetheless consistently under‐predict the transport of particles larger than 128 mm, frequently by more than an order of magnitude. Accurate prediction of the transport rates of the largest particles is important for models in which the evolution of the surface grain size distribution determines subsequent bed load transport rates. Values of τ*rm estimated from bed load samples are up to 50% larger than those predicted with the Wilcock‐Crowe equations, and sampled bed load transport approximates equal mobility across a wider range of grain sizes than is implied by the equations. Modifications to the Wilcock‐Crowe equation for determining τ*rm and the hiding function used to scale τ*rm to other grain size fractions are proposed to achieve the best fit to observed bed load transport in the Trinity River.
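
The hiding function mentioned above scales τ*rm to other size fractions through a ratio-dependent exponent. A sketch assuming the constants 0.67 and 1.5 that I attribute to the published Wilcock-Crowe (2003) hiding function; verify against the paper before serious use:

```python
import math

def reference_shear_stress(tau_rm, d_i, d_sm):
    """Scale tau*rm (dimensionless reference shear stress for the mean
    surface grain size d_sm) to a size fraction d_i using the
    Wilcock-Crowe hiding function. Constants are assumptions taken
    from my reading of Wilcock & Crowe (2003)."""
    b = 0.67 / (1.0 + math.exp(1.5 - d_i / d_sm))
    return tau_rm * (d_i / d_sm) ** b
```

Because the exponent b stays below 1, coarse fractions need less extra stress than their size alone would suggest, which is the near-equal-mobility behaviour the abstract describes in the sampled bed load.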

  15. Prostate Cancer Screening in Jamaica: Results of the Largest National Screening Clinic

    International Nuclear Information System (INIS)

    Morrison, B. F.; Aiken, W.; Mayhew, R.; Gordon, Y.; Reid, M.

    2016-01-01

    Prostate cancer is highly prevalent in Jamaica and is the leading cause of cancer-related deaths. Our aim was to evaluate the patterns of screening in the largest organized screening clinic in Jamaica at the Jamaica Cancer Society. A retrospective analysis of all men presenting for screening at the Jamaica Cancer Society from 1995 to 2005 was done. All patients had digital rectal examinations (DRE) and prostate specific antigen (PSA) tests done. Results of prostate biopsies were noted. 1117 men of mean age 59.9 ± 8.2 years presented for screening. The median documented PSA was 1.6 ng/mL (maximum of 5170 ng/mL). Most patients presented for only 1 screen. There was a gradual reduction in the mean age of presentation for screening over the period. Prostate biopsies were requested on 11% of screening visits; however, only 59% of these were done. 5.6% of all persons screened were found to have cancer. Of the cancers diagnosed, Gleason 6 adenocarcinoma was the commonest grade and median PSA was 8.9 ng/mL (range 1.5-1059 ng/mL). Older men tend to screen for prostate cancer in Jamaica. However, compliance with regular maintenance visits and requests for confirmatory biopsies are poor. Screening needs intervention in the Jamaican population.

  16. Evaluating the predictive performance of empirical estimators of natural mortality rate using information on over 200 fish species

    Science.gov (United States)

    Then, Amy Y.; Hoenig, John M; Hall, Norman G.; Hewitt, David A.

    2015-01-01

    Many methods have been developed in the last 70 years to predict the natural mortality rate, M, of a stock based on empirical evidence from comparative life history studies. These indirect or empirical methods are used in most stock assessments to (i) obtain estimates of M in the absence of direct information, (ii) check on the reasonableness of a direct estimate of M, (iii) examine the range of plausible M estimates for the stock under consideration, and (iv) define prior distributions for Bayesian analyses. The two most cited empirical methods have appeared in the literature over 2500 times to date. Despite the importance of these methods, there is no consensus in the literature on how well these methods work in terms of prediction error or how their performance may be ranked. We evaluate estimators based on various combinations of maximum age (tmax), growth parameters, and water temperature by seeing how well they reproduce >200 independent, direct estimates of M. We use tenfold cross-validation to estimate the prediction error of the estimators and to rank their performance. With updated and carefully reviewed data, we conclude that a tmax-based estimator performs the best among all estimators evaluated. The tmax-based estimators in turn perform better than the Alverson–Carney method based on tmax and the von Bertalanffy K coefficient, Pauly’s method based on growth parameters and water temperature and methods based just on K. It is possible to combine two independent methods by computing a weighted mean but the improvement over the tmax-based methods is slight. Based on cross-validation prediction error, model residual patterns, model parsimony, and biological considerations, we recommend the use of a tmax-based estimator (M = 4.899 tmax^−0.916, prediction error = 0.32) when possible and a growth-based method (M = 4.118 K^0.73 L∞^−0.33, prediction error
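
The two recommended estimators, M = 4.899 tmax^−0.916 (longevity-based) and M = 4.118 K^0.73 L∞^−0.33 (growth-based), are simple power laws and can be applied directly. A sketch using the constants reported in the abstract:

```python
def m_from_tmax(tmax):
    """Recommended longevity-based natural mortality estimator:
    M = 4.899 * tmax**-0.916, with tmax the maximum age in years."""
    return 4.899 * tmax ** -0.916

def m_from_growth(k, l_inf):
    """Growth-based alternative: M = 4.118 * K**0.73 * Linf**-0.33,
    with K and Linf the von Bertalanffy growth parameters."""
    return 4.118 * k ** 0.73 * l_inf ** -0.33
```

As expected biologically, longer-lived stocks (larger tmax) and slower-growing stocks (smaller K) yield lower estimates of M.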

  17. Meta-path based heterogeneous combat network link prediction

    Science.gov (United States)

    Li, Jichao; Ge, Bingfeng; Yang, Kewei; Chen, Yingwu; Tan, Yuejin

    2017-09-01

    The combat system-of-systems in high-tech informative warfare, composed of many interconnected combat systems of different types, can be regarded as a type of complex heterogeneous network. Link prediction for heterogeneous combat networks (HCNs) is of significant military value, as it facilitates reconfiguring combat networks to represent the complex real-world network topology as appropriate with observed information. This paper proposes a novel integrated methodology framework called HCNMP (HCN link prediction based on meta-path) to predict multiple types of links simultaneously for an HCN. More specifically, the concept of HCN meta-paths is introduced, through which the HCNMP can accumulate information by extracting different features of HCN links for all the six defined types. Next, an HCN link prediction model, based on meta-path features, is built to predict all types of links of the HCN simultaneously. Then, the solution algorithm for the HCN link prediction model is proposed, in which the prediction results are obtained by iteratively updating with the newly predicted results until the results in the HCN converge or reach a certain maximum iteration number. Finally, numerical experiments on the dataset of a real HCN are conducted to demonstrate the feasibility and effectiveness of the proposed HCNMP, in comparison with 30 baseline methods. The results show that the performance of the HCNMP is superior to those of the baseline methods.
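
Meta-path features of the kind HCNMP accumulates can be counted by multiplying the adjacency matrices of the relation types along the path: entry (i, j) of the product is the number of meta-path instances from node i to node j. A toy sketch on a two-layer network (the relation names "SD" and "DI" are invented for illustration, not taken from the paper):

```python
def matmul(a, b):
    """Multiply two adjacency matrices given as lists of lists."""
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def meta_path_count(adjacency, path):
    """Instances of a meta-path between every node pair: the product
    of the adjacency matrices of the relation types along the path."""
    result = adjacency[path[0]]
    for relation in path[1:]:
        result = matmul(result, adjacency[relation])
    return result
```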

  18. A predictive model to inform adaptive management of double-crested cormorants and fisheries in Michigan

    Science.gov (United States)

    Tsehaye, Iyob; Jones, Michael L.; Irwin, Brian J.; Fielder, David G.; Breck, James E.; Luukkonen, David R.

    2015-01-01

    The proliferation of double-crested cormorants (DCCOs; Phalacrocorax auritus) in North America has raised concerns over their potential negative impacts on game, cultured and forage fishes, island and terrestrial resources, and other colonial water birds, leading to increased public demands to reduce their abundance. By combining fish surplus production and bird functional feeding response models, we developed a deterministic predictive model representing bird–fish interactions to inform an adaptive management process for the control of DCCOs in multiple colonies in Michigan. Comparisons of model predictions with observations of changes in DCCO numbers under management measures implemented from 2004 to 2012 suggested that our relatively simple model was able to accurately reconstruct past DCCO population dynamics. These comparisons helped discriminate among alternative parameterizations of demographic processes that were poorly known, especially site fidelity. Using sensitivity analysis, we also identified remaining critical uncertainties (mainly in the spatial distributions of fish vs. DCCO feeding areas) that can be used to prioritize future research and monitoring needs. Model forecasts suggested that continuation of existing control efforts would be sufficient to achieve long-term DCCO control targets in Michigan and that DCCO control may be necessary to achieve management goals for some DCCO-impacted fisheries in the state. Finally, our model can be extended by accounting for parametric or ecological uncertainty and including more complex assumptions on DCCO–fish interactions as part of the adaptive management process.
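
A deterministic bird-fish interaction model of the general type described, logistic surplus production coupled to a predator functional feeding response, can be sketched in a few lines. This assumes a Holling type II response and illustrative parameter values; it is not the Michigan model itself:

```python
def annual_step(biomass, birds, r=0.5, capacity=1000.0,
                attack=0.01, handling=0.1):
    """One annual step of a toy bird-fish model: logistic surplus
    production minus Holling type II consumption by `birds` predators.
    All parameter values are illustrative, not fitted to any fishery."""
    surplus = r * biomass * (1.0 - biomass / capacity)
    per_bird = attack * biomass / (1.0 + attack * handling * biomass)
    return max(biomass + surplus - birds * per_bird, 0.0)
```

Iterating this step under different bird-control scenarios is the basic forecasting exercise the adaptive management process relies on.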

  19. Investigation on Cardiovascular Risk Prediction Using Physiological Parameters

    Directory of Open Access Journals (Sweden)

    Wan-Hua Lin

    2013-01-01

    Full Text Available Cardiovascular disease (CVD is the leading cause of death worldwide. Early prediction of CVD is urgently important for timely prevention and treatment. Incorporation or modification of new risk factors that have an additional independent prognostic value of existing prediction models is widely used for improving the performance of the prediction models. This paper is to investigate the physiological parameters that are used as risk factors for the prediction of cardiovascular events, as well as summarizing the current status on the medical devices for physiological tests and discuss the potential implications for promoting CVD prevention and treatment in the future. The results show that measures extracted from blood pressure, electrocardiogram, arterial stiffness, ankle-brachial blood pressure index (ABI, and blood glucose carry valuable information for the prediction of both long-term and near-term cardiovascular risk. However, the predictive values should be further validated by more comprehensive measures. Meanwhile, advancing unobtrusive technologies and wireless communication technologies allow on-site detection of the physiological information remotely in an out-of-hospital setting in real-time. In addition with computer modeling technologies and information fusion. It may allow for personalized, quantitative, and real-time assessment of sudden CVD events.

  20. Time evolution of predictability of epidemics on networks

    Science.gov (United States)

    Holme, Petter; Takaguchi, Taro

    2015-04-01

    Epidemic outbreaks of new pathogens, or known pathogens in new populations, cause a great deal of fear because they are hard to predict. For theoretical models of disease spreading, on the other hand, quantities characterizing the outbreak converge to deterministic functions of time. Our goal in this paper is to shed some light on this apparent discrepancy. We measure the diversity of (and, thus, the predictability of) outbreak sizes and extinction times as functions of time given different scenarios of the amount of information available. Under the assumption of perfect information—i.e., knowing the state of each individual with respect to the disease—the predictability decreases exponentially, or faster, with time. The decay is slowest for intermediate values of the per-contact transmission probability. With a weaker assumption on the information available, assuming that we know only the fraction of currently infectious, recovered, or susceptible individuals, the predictability also decreases exponentially most of the time. There are, however, some peculiar regions in this scenario where the predictability increases. In other words, to predict its final size with a given accuracy, we would need increasingly more information about the outbreak.
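
The diversity of outbreak sizes over time can be illustrated with a toy chain-binomial (Reed-Frost) simulation: across many stochastic runs, the spread of cumulative outbreak sizes grows with time, mirroring the loss of predictability. A sketch under that simplified model, not the authors' network-based measure:

```python
import random
import statistics

def outbreak_size_spread(n=200, i0=1, p=0.01, steps=30, runs=200, seed=42):
    """Simulate a Reed-Frost epidemic `runs` times and return, per time
    step, the standard deviation of the cumulative outbreak size across
    runs, a toy proxy for the diversity discussed in the abstract."""
    rng = random.Random(seed)
    per_step = [[] for _ in range(steps)]
    for _ in range(runs):
        s, i, cum = n - i0, i0, i0
        for t in range(steps):
            # each susceptible escapes every infectious contact w.p. (1 - p)
            new_i = sum(rng.random() < 1 - (1 - p) ** i for _ in range(s))
            s, i, cum = s - new_i, new_i, cum + new_i
            per_step[t].append(cum)
    return [statistics.pstdev(col) for col in per_step]
```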

  1. Reliability of windstorm predictions in the ECMWF ensemble prediction system

    Science.gov (United States)

    Becker, Nico; Ulbrich, Uwe

    2016-04-01

    Windstorms caused by extratropical cyclones are one of the most dangerous natural hazards in the European region. Therefore, reliable predictions of such storm events are needed. Case studies have shown that ensemble prediction systems (EPS) are able to provide useful information about windstorms between two and five days prior to the event. In this work, ensemble predictions with the European Centre for Medium-Range Weather Forecasts (ECMWF) EPS are evaluated in a four year period. Within the 50 ensemble members, which are initialized every 12 hours and are run for 10 days, windstorms are identified and tracked in time and space. By using a clustering approach, different predictions of the same storm are identified in the different ensemble members and compared to reanalysis data. The occurrence probability of the predicted storms is estimated by fitting a bivariate normal distribution to the storm track positions. Our results show, for example, that predicted storm clusters with occurrence probabilities of more than 50% have a matching observed storm in 80% of all cases at a lead time of two days. The predicted occurrence probabilities are reliable up to 3 days lead time. At longer lead times the occurrence probabilities are overestimated by the EPS.

  2. SURVEY ON CRIME ANALYSIS AND PREDICTION USING DATA MINING TECHNIQUES

    Directory of Open Access Journals (Sweden)

    H Benjamin Fredrick David

    2017-04-01

    Data Mining is the procedure of evaluating and examining large pre-existing databases in order to generate new information which may be essential to an organization. The extraction of new information is predicted using the existing datasets. Many approaches for analysis and prediction in data mining have been developed, but few efforts have been made in the criminology field, and fewer still have compared the information all these approaches produce. Police stations and other criminal justice agencies hold many large databases of information which can be used to predict or analyze criminal movements and criminal activity involvement in the society. Criminals can also be predicted based on the crime data. The main aim of this work is to perform a survey on the supervised learning and unsupervised learning techniques that have been applied towards criminal identification. This paper presents a survey on crime analysis and crime prediction using several Data Mining techniques.

  3. Predicting chemically-induced skin reactions. Part I: QSAR models of skin sensitization and their application to identify potentially hazardous compounds

    Energy Technology Data Exchange (ETDEWEB)

    Alves, Vinicius M. [Laboratory of Molecular Modeling and Design, Faculty of Pharmacy, Federal University of Goiás, Goiânia, GO 74605-220 (Brazil); Laboratory for Molecular Modeling, Division of Chemical Biology and Medicinal Chemistry, Eshelman School of Pharmacy, University of North Carolina, Chapel Hill, NC 27599 (United States); Muratov, Eugene [Laboratory for Molecular Modeling, Division of Chemical Biology and Medicinal Chemistry, Eshelman School of Pharmacy, University of North Carolina, Chapel Hill, NC 27599 (United States); Laboratory of Theoretical Chemistry, A.V. Bogatsky Physical-Chemical Institute NAS of Ukraine, Odessa 65080 (Ukraine); Fourches, Denis [Laboratory for Molecular Modeling, Division of Chemical Biology and Medicinal Chemistry, Eshelman School of Pharmacy, University of North Carolina, Chapel Hill, NC 27599 (United States); Strickland, Judy; Kleinstreuer, Nicole [ILS/Contractor Supporting the NTP Interagency Center for the Evaluation of Alternative Toxicological Methods (NICEATM), P.O. Box 13501, Research Triangle Park, NC 27709 (United States); Andrade, Carolina H. [Laboratory of Molecular Modeling and Design, Faculty of Pharmacy, Federal University of Goiás, Goiânia, GO 74605-220 (Brazil); Tropsha, Alexander, E-mail: alex_tropsha@unc.edu [Laboratory for Molecular Modeling, Division of Chemical Biology and Medicinal Chemistry, Eshelman School of Pharmacy, University of North Carolina, Chapel Hill, NC 27599 (United States)

    2015-04-15

    Repetitive exposure to a chemical agent can induce an immune reaction in inherently susceptible individuals that leads to skin sensitization. Although many chemicals have been reported as skin sensitizers, there have been very few rigorously validated QSAR models with defined applicability domains (AD) that were developed using a large group of chemically diverse compounds. In this study, we have aimed to compile, curate, and integrate the largest publicly available dataset related to chemically-induced skin sensitization, use this data to generate rigorously validated QSAR models for skin sensitization, and employ these models as a virtual screening tool for identifying putative sensitizers among environmental chemicals. We followed best practices for model building and validation implemented with our predictive QSAR workflow, using the Random Forest modeling technique in combination with SiRMS and Dragon descriptors. The Correct Classification Rate (CCR) for QSAR models discriminating sensitizers from non-sensitizers was 71–88% when evaluated on several external validation sets, within a broad AD, with positive (for sensitizers) and negative (for non-sensitizers) predicted rates of 85% and 79% respectively. When compared to the skin sensitization module included in the OECD QSAR Toolbox as well as to the skin sensitization model in publicly available VEGA software, our models showed a significantly higher prediction accuracy for the same sets of external compounds as evaluated by Positive Predicted Rate, Negative Predicted Rate, and CCR. These models were applied to identify putative chemical hazards in the Scorecard database of possible skin or sense organ toxicants as primary candidates for experimental validation. - Highlights: • The largest publicly available skin sensitization dataset was compiled. • Predictive QSAR models were developed for skin sensitization. • Developed models have higher prediction accuracy than OECD QSAR Toolbox. • Putative
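
The Correct Classification Rate used as the figure of merit above is, in this modeling tradition, the mean of sensitivity and specificity (balanced accuracy), which is robust to class imbalance between sensitizers and non-sensitizers. A minimal sketch assuming that definition:

```python
def correct_classification_rate(predicted, actual):
    """CCR as balanced accuracy, i.e. the mean of sensitivity and
    specificity. The definition is an assumption from common QSAR
    practice; labels are 1 = sensitizer, 0 = non-sensitizer."""
    pos = [p for p, a in zip(predicted, actual) if a == 1]
    neg = [p for p, a in zip(predicted, actual) if a == 0]
    sensitivity = sum(1 for p in pos if p == 1) / len(pos)
    specificity = sum(1 for p in neg if p == 0) / len(neg)
    return (sensitivity + specificity) / 2
```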

  4. Predicting chemically-induced skin reactions. Part I: QSAR models of skin sensitization and their application to identify potentially hazardous compounds

    International Nuclear Information System (INIS)

    Alves, Vinicius M.; Muratov, Eugene; Fourches, Denis; Strickland, Judy; Kleinstreuer, Nicole; Andrade, Carolina H.; Tropsha, Alexander

    2015-01-01

    Repetitive exposure to a chemical agent can induce an immune reaction in inherently susceptible individuals that leads to skin sensitization. Although many chemicals have been reported as skin sensitizers, there have been very few rigorously validated QSAR models with defined applicability domains (AD) that were developed using a large group of chemically diverse compounds. In this study, we have aimed to compile, curate, and integrate the largest publicly available dataset related to chemically-induced skin sensitization, use this data to generate rigorously validated QSAR models for skin sensitization, and employ these models as a virtual screening tool for identifying putative sensitizers among environmental chemicals. We followed best practices for model building and validation implemented with our predictive QSAR workflow using the Random Forest modeling technique in combination with SiRMS and Dragon descriptors. The Correct Classification Rate (CCR) for QSAR models discriminating sensitizers from non-sensitizers was 71–88% when evaluated on several external validation sets, within a broad AD, with positive (for sensitizers) and negative (for non-sensitizers) predicted rates of 85% and 79%, respectively. When compared to the skin sensitization module included in the OECD QSAR Toolbox as well as to the skin sensitization model in publicly available VEGA software, our models showed a significantly higher prediction accuracy for the same sets of external compounds as evaluated by Positive Predicted Rate, Negative Predicted Rate, and CCR. These models were applied to identify putative chemical hazards in the Scorecard database of possible skin or sense organ toxicants as primary candidates for experimental validation. - Highlights: • The largest publicly available skin sensitization dataset was compiled. • Predictive QSAR models were developed for skin sensitization. • The developed models have higher prediction accuracy than the OECD QSAR Toolbox. • Putative
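    The Correct Classification Rate (CCR) reported above is simply the balanced accuracy: the mean of the correctly-predicted rates for sensitizers and non-sensitizers. A minimal, dependency-free sketch of the metric, with made-up labels rather than the paper's data:

    ```python
    # CCR = mean of sensitivity (rate of correctly predicted sensitizers)
    # and specificity (rate of correctly predicted non-sensitizers).
    # Labels: 1 = sensitizer, 0 = non-sensitizer.
    def ccr(y_true, y_pred):
        pos = [p for t, p in zip(y_true, y_pred) if t == 1]
        neg = [p for t, p in zip(y_true, y_pred) if t == 0]
        sensitivity = sum(p == 1 for p in pos) / len(pos)
        specificity = sum(p == 0 for p in neg) / len(neg)
        return 0.5 * (sensitivity + specificity)

    # Hypothetical external-set predictions:
    y_true = [1, 1, 1, 0, 0, 0, 0, 1]
    y_pred = [1, 0, 1, 0, 0, 1, 0, 1]
    print(ccr(y_true, y_pred))  # → 0.75
    ```

    Unlike plain accuracy, CCR is unaffected by class imbalance, which matters when sensitizers are a minority of the screened compounds.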

  5. Distributed video coding with multiple side information

    DEFF Research Database (Denmark)

    Huang, Xin; Brites, C.; Ascenso, J.

    2009-01-01

    Distributed Video Coding (DVC) is a new video coding paradigm which mainly exploits the source statistics at the decoder based on the availability of some decoder side information. The quality of the side information has a major impact on the DVC rate-distortion (RD) performance in the same way...... the quality of the predictions had a major impact in predictive video coding. In this paper, a DVC solution exploiting multiple side information is proposed; the multiple side information is generated by frame interpolation and frame extrapolation targeting to improve the side information of a single...

  6. The Determinants of Bank Internationalisation in Times of Financial Globalisation: Evidence from the World's Largest Banks (1980-2007)

    NARCIS (Netherlands)

    Westerhuis, Gerarda; Mulder, Arjen

    2015-01-01

    This article analyses the determinants of bank internationalisation for the world's largest banks over the period 1980–2007. The purpose of the article is twofold. First, we show how a mixed-methods research design, in which we combine a variables-based research with three case studies, can

  7. Creativity, information, and consciousness: The information dynamics of thinking.

    Science.gov (United States)

    Wiggins, Geraint A

    2018-05-07

    This paper presents a theory of the basic operation of mind, Information Dynamics of Thinking, which is intended for computational implementation and thence empirical testing. It is based on the information theory of Shannon, and treats the mind/brain as an information processing organ that aims to be information-efficient, in that it predicts its world, so as to use information efficiently, and regularly re-represents it, so as to store information efficiently. The theory is presented in context of a background review of various research areas that impinge upon its development. Consequences of the theory and testable hypotheses arising from it are discussed. Copyright © 2018. Published by Elsevier B.V.

  8. Looking for a Location: Dissociated Effects of Event-Related Plausibility and Verb-Argument Information on Predictive Processing in Aphasia.

    Science.gov (United States)

    Hayes, Rebecca A; Dickey, Michael Walsh; Warren, Tessa

    2016-12-01

    This study examined the influence of verb-argument information and event-related plausibility on prediction of upcoming event locations in people with aphasia, as well as older and younger, neurotypical adults. It investigated how these types of information interact during anticipatory processing and how the ability to take advantage of the different types of information is affected by aphasia. This study used a modified visual-world task to examine eye movements and offline photo selection. Twelve adults with aphasia (aged 54-82 years) as well as 44 young adults (aged 18-31 years) and 18 older adults (aged 50-71 years) participated. Neurotypical adults used verb argument status and plausibility information to guide both eye gaze (a measure of anticipatory processing) and image selection (a measure of ultimate interpretation). Argument status did not affect the behavior of people with aphasia in either measure. There was only limited evidence of interaction between these 2 factors in eye gaze data. Both event-related plausibility and verb-based argument status contributed to anticipatory processing of upcoming event locations among younger and older neurotypical adults. However, event-related likelihood had a much larger role in the performance of people with aphasia than did verb-based knowledge regarding argument structure.

  9. Looking for a Location: Dissociated Effects of Event-Related Plausibility and Verb–Argument Information on Predictive Processing in Aphasia

    Science.gov (United States)

    Dickey, Michael Walsh; Warren, Tessa

    2016-01-01

    Purpose This study examined the influence of verb–argument information and event-related plausibility on prediction of upcoming event locations in people with aphasia, as well as older and younger, neurotypical adults. It investigated how these types of information interact during anticipatory processing and how the ability to take advantage of the different types of information is affected by aphasia. Method This study used a modified visual-world task to examine eye movements and offline photo selection. Twelve adults with aphasia (aged 54–82 years) as well as 44 young adults (aged 18–31 years) and 18 older adults (aged 50–71 years) participated. Results Neurotypical adults used verb argument status and plausibility information to guide both eye gaze (a measure of anticipatory processing) and image selection (a measure of ultimate interpretation). Argument status did not affect the behavior of people with aphasia in either measure. There was only limited evidence of interaction between these 2 factors in eye gaze data. Conclusions Both event-related plausibility and verb-based argument status contributed to anticipatory processing of upcoming event locations among younger and older neurotypical adults. However, event-related likelihood had a much larger role in the performance of people with aphasia than did verb-based knowledge regarding argument structure. PMID:27997951

  10. Gas Concentration Prediction Based on the Measured Data of a Coal Mine Rescue Robot

    Directory of Open Access Journals (Sweden)

    Xiliang Ma

    2016-01-01

    Full Text Available The coal mine environment is complex and dangerous after a gas accident, so timely and effective rescue and relief work is necessary. Predicting the gas concentration in front of a coal mine rescue robot is therefore of great significance for ensuring that the robot can carry out exploration and search-and-rescue missions. In this paper, a gray neural network is proposed to predict the gas concentration 10 meters in front of the coal mine rescue robot, based on the gas concentration, temperature, and wind speed at the current position and 1 meter ahead. A quantum genetic algorithm is then used to optimize the parameters of the gray neural network, yielding a more accurate prediction of the gas concentration in the roadway. Experimental results show that the gray neural network optimized by the quantum genetic algorithm predicts the gas concentration more accurately: the overall prediction error is 9.12%, and the largest forecasting error is 11.36%; compared with the unoptimized gray neural network, the prediction error is reduced by 55.23%. This means that the proposed method better allows the coal mine rescue robot to accurately predict the gas concentration in the coal mine roadway.
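    The "gray" half of a gray neural network is conventionally built on the GM(1,1) grey forecasting model, which fits an exponential trend through an accumulated series. Below is a self-contained sketch of GM(1,1) alone, under the assumption that the hybrid model uses it in the standard form; the neural-network and quantum-genetic-optimization stages are omitted, and the gas readings are hypothetical:

    ```python
    import math

    def gm11_forecast(x0, steps=1):
        """Grey model GM(1,1): fit the series x0 and forecast `steps` values ahead."""
        n = len(x0)
        x1 = [sum(x0[:i + 1]) for i in range(n)]               # accumulated (AGO) series
        z1 = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]  # background values
        # Least squares for x0[k] = -a*z1[k] + b via the 2x2 normal equations
        m = n - 1
        s_z, s_y = sum(z1), sum(x0[1:])
        s_zz = sum(z * z for z in z1)
        s_zy = sum(z * y for z, y in zip(z1, x0[1:]))
        det = s_zz * m - s_z * s_z
        a = (s_z * s_y - m * s_zy) / det                       # development coefficient
        b = (s_zz * s_y - s_z * s_zy) / det                    # grey input
        # Time-response function, then inverse accumulation to recover forecasts
        x1_hat = lambda k: (x0[0] - b / a) * math.exp(-a * k) + b / a
        return [x1_hat(n + s) - x1_hat(n + s - 1) for s in range(steps)]

    # Hypothetical gas-concentration readings (% CH4) along the roadway:
    series = [0.50, 0.60, 0.72, 0.864, 1.0368]
    print(gm11_forecast(series, steps=1))
    ```

    GM(1,1) needs only a handful of samples, which is why grey models are popular for short sensor series like these; the neural network then corrects its residual errors.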

  11. SU-D-204-01: A Methodology Based On Machine Learning and Quantum Clustering to Predict Lung SBRT Dosimetric Endpoints From Patient Specific Anatomic Features

    Energy Technology Data Exchange (ETDEWEB)

    Lafata, K; Ren, L; Wu, Q; Kelsey, C; Hong, J; Cai, J; Yin, F [Duke University Medical Center, Durham, NC (United States)

    2016-06-15

    Purpose: To develop a data-mining methodology based on quantum clustering and machine learning to predict expected dosimetric endpoints for lung SBRT applications based on patient-specific anatomic features. Methods: Ninety-three patients who received lung SBRT at our clinic from 2011–2013 were retrospectively identified. Planning information was acquired for each patient, from which various features were extracted using in-house semi-automatic software. Anatomic features included tumor-to-OAR distances, tumor location, total-lung-volume, GTV and ITV. Dosimetric endpoints were adopted from RTOG-0195 recommendations, and consisted of various OAR-specific partial-volume doses and maximum point-doses. First, PCA analysis and unsupervised quantum-clustering were used to explore the feature-space to identify potentially strong classifiers. Second, a multi-class logistic regression algorithm was developed and trained to predict dose-volume endpoints based on patient-specific anatomic features. Classes were defined by discretizing the dose-volume data, and the feature-space was zero-mean normalized. Fitting parameters were determined by minimizing a regularized cost function, and optimization was performed via gradient descent. As a pilot study, the model was tested on two esophageal dosimetric planning endpoints (maximum point-dose, dose-to-5cc), and its generalizability was evaluated with leave-one-out cross-validation. Results: Quantum-Clustering demonstrated a strong separation of feature-space at 15 Gy across the first and second Principal Components of the data when the dosimetric endpoints were retrospectively identified. Maximum point dose prediction to the esophagus demonstrated a cross-validation accuracy of 87%, and the maximum dose to 5cc demonstrated a respective value of 79%. The largest optimized weighting factor was placed on GTV-to-esophagus distance (a factor of 10 greater than the second largest weighting factor), indicating an intuitively strong
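    The regression stage described above (discretized dose classes, zero-mean normalized features, a regularized cost minimized by gradient descent) corresponds to standard softmax regression. A sketch under that reading, with random stand-ins for the anatomic features and without the quantum-clustering step:

    ```python
    import numpy as np

    def softmax(Z):
        Z = Z - Z.max(axis=1, keepdims=True)   # numerical stability
        E = np.exp(Z)
        return E / E.sum(axis=1, keepdims=True)

    def fit_softmax(Xb, y, n_classes, lr=0.5, lam=1e-3, iters=500):
        """Multi-class logistic regression: gradient descent on an L2-regularized cost."""
        W = np.zeros((Xb.shape[1], n_classes))
        Y = np.eye(n_classes)[y]               # one-hot targets
        for _ in range(iters):
            P = softmax(Xb @ W)
            W -= lr * (Xb.T @ (P - Y) / len(Xb) + lam * W)
        return W

    rng = np.random.default_rng(1)
    # Random stand-ins for anatomic features (e.g. tumor-OAR distance, GTV):
    X = np.vstack([rng.normal(c, 0.3, size=(30, 2)) for c in (-2.0, 0.0, 2.0)])
    y = np.repeat([0, 1, 2], 30)               # discretized dose-endpoint classes
    Xb = np.hstack([X - X.mean(0), np.ones((len(X), 1))])  # zero-mean + bias column
    W = fit_softmax(Xb, y, n_classes=3)
    print("train accuracy:", np.mean((Xb @ W).argmax(1) == y))
    ```

    In the paper the model is instead evaluated with leave-one-out cross-validation; here the well-separated synthetic clusters make even the training fit nearly perfect.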

  12. Predicting Great Lakes fish yields: tools and constraints

    Science.gov (United States)

    Lewis, C.A.; Schupp, D.H.; Taylor, W.W.; Collins, J.J.; Hatch, Richard W.

    1987-01-01

    Prediction of yield is a critical component of fisheries management. The development of sound yield prediction methodology and the application of the results of yield prediction are central to the evolution of strategies to achieve stated goals for Great Lakes fisheries and to the measurement of progress toward those goals. Despite general availability of species yield models, yield prediction for many Great Lakes fisheries has been poor due to the instability of the fish communities and the inadequacy of available data. A host of biological, institutional, and societal factors constrain both the development of sound predictions and their application to management. Improved predictive capability requires increased stability of Great Lakes fisheries through rehabilitation of well-integrated communities, improvement of data collection, data standardization and information-sharing mechanisms, and further development of the methodology for yield prediction. Most important is the creation of a better-informed public that will in turn establish the political will to do what is required.

  13. Data Prediction for Public Events in Professional Domains Based on Improved RNN- LSTM

    Science.gov (United States)

    Song, Bonan; Fan, Chunxiao; Wu, Yuexin; Sun, Juanjuan

    2018-02-01

    The traditional data services for predicting emergency or non-periodic events usually cannot generate satisfying results or fulfill the intended prediction purpose. However, these events are influenced by external causes, which means that certain a priori information about them can generally be collected through the Internet. This paper studied the above problems and proposed an improved model: an LSTM (Long Short-Term Memory) dynamic prediction and a priori information sequence generation model that combines RNN-LSTM with a priori information about public events. In prediction tasks, the model is capable of determining trends, and its accuracy is validated. The model achieves better performance and prediction results than the previous one. Using a priori information increases the accuracy of prediction; LSTM adapts better to changes in the time sequence; and LSTM can be widely applied to the same type of prediction task as well as to other prediction tasks related to time sequences.
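    For reference, the recurrent core that RNN-LSTM models share can be written out as a single LSTM cell step. This is the textbook formulation, not the authors' specific architecture, and all sizes and weights below are arbitrary:

    ```python
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def lstm_step(x, h, c, W, U, b):
        """One step of a standard LSTM cell (gate order: input, forget, output, candidate)."""
        H = h.shape[0]
        z = W @ x + U @ h + b              # all four gate pre-activations at once
        i = sigmoid(z[:H])                 # input gate
        f = sigmoid(z[H:2 * H])            # forget gate
        o = sigmoid(z[2 * H:3 * H])        # output gate
        g = np.tanh(z[3 * H:])             # candidate cell state
        c_new = f * c + i * g              # blend old memory with new candidate
        h_new = o * np.tanh(c_new)         # exposed hidden state
        return h_new, c_new

    rng = np.random.default_rng(0)
    D, H = 3, 4                            # input and hidden sizes (arbitrary)
    W = rng.normal(size=(4 * H, D))
    U = rng.normal(size=(4 * H, H))
    b = np.zeros(4 * H)
    h = c = np.zeros(H)
    for x in rng.normal(size=(5, D)):      # run a short hypothetical sequence
        h, c = lstm_step(x, h, c, W, U, b)
    print(h.shape)
    ```

    The forget gate is what lets the cell carry information across long gaps in the time sequence, which is the property the abstract relies on.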

  14. Oceans of Opportunity. Harnessing Europe's largest domestic energy resource

    International Nuclear Information System (INIS)

    Fichaux, N.; Wilkes, J.

    2009-09-01

    Europe's offshore wind potential is enormous and able to power Europe seven times over. Over 100 GW of offshore wind projects are already in various stages of planning. If realised, these projects would produce 10% of the EU's electricity whilst avoiding 200 million tonnes of CO2 emissions each year. EWEA has a target of 40 GW of offshore wind in the EU by 2020, implying an average annual market growth of 28% over the coming 12 years. The EU market for onshore wind grew by an average 32% per year in the 12-year period from 1992-2004 - what the wind energy industry has achieved on land can be repeated at sea. EWEA's proposed offshore grid builds on the 11 offshore grids currently operating and 21 offshore grids currently being considered by the grid operators in the Baltic and North Seas to give Europe a truly pan-European electricity super highway. Strong political support and action from Europe's policy-makers will allow a new, multi-billion euro industry to be built. This new industry will deliver thousands of green collar jobs and a new renewable energy economy and establish Europe as world leader in offshore wind power technology. A single European electricity market with large amounts of wind power will bring affordable electricity to consumers, reduce import dependence, cut CO2 emissions and allow Europe to access its largest domestic energy source.

  15. CleavPredict: A Platform for Reasoning about Matrix Metalloproteinases Proteolytic Events.

    Directory of Open Access Journals (Sweden)

    Sonu Kumar

    Full Text Available CleavPredict (http://cleavpredict.sanfordburnham.org) is a Web server for substrate cleavage prediction for matrix metalloproteinases (MMPs). It is intended as a computational platform aiding the scientific community in reasoning about proteolytic events. CleavPredict offers in silico prediction of cleavage sites specific for 11 human MMPs. The prediction method employs MMP-specific position weight matrices (PWMs) derived from statistical analysis of high-throughput phage display experimental results. To augment the substrate cleavage prediction process, CleavPredict provides information about the structural features of potential cleavage sites that influence proteolysis. These include: secondary structure, disordered regions, transmembrane domains, and solvent accessibility. The server also provides information about subcellular location, co-localization, and co-expression of proteinase and potential substrates, along with experimentally determined positions of single nucleotide polymorphism (SNP) and posttranslational modification (PTM) sites in substrates. All this information will provide the user with perspectives in reasoning about proteolytic events. CleavPredict is freely accessible, and no login is required.
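    Scoring with a position weight matrix amounts to summing a per-position residue weight over each candidate window of the substrate sequence. A toy sketch of that scan, with hypothetical weights; real MMP PWMs are derived from phage-display data and cover more positions:

    ```python
    # One dict per window position; residues not listed score 0.0.
    # These weights are illustrative only, not CleavPredict's matrices.
    PWM = [
        {"P": 1.2, "A": 0.4},
        {"L": 1.5, "V": 0.8},
        {"G": 1.0, "A": 0.6},
        {"L": 1.1, "I": 0.7},
    ]

    def scan(seq, pwm=PWM, threshold=3.0):
        """Return (start index, score) for every window scoring >= threshold."""
        w = len(pwm)
        hits = []
        for i in range(len(seq) - w + 1):
            score = sum(pwm[j].get(seq[i + j], 0.0) for j in range(w))
            if score >= threshold:
                hits.append((i, round(score, 2)))
        return hits

    print(scan("AAPLGLAA"))  # window "PLGL" scores 1.2+1.5+1.0+1.1 = 4.8
    ```

    A server like CleavPredict then filters such raw hits by structural context (accessibility, disorder, transmembrane location), since a high-scoring motif buried in the protein core is unlikely to be cleaved.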

  16. Feeling conflicted and seeking information: when ambivalence enhances and diminishes selective exposure to attitude-consistent information.

    Science.gov (United States)

    Sawicki, Vanessa; Wegener, Duane T; Clark, Jason K; Fabrigar, Leandre R; Smith, Steven M; Durso, Geoffrey R O

    2013-06-01

    To date, little research has examined the impact of attitudinal ambivalence on attitude-congruent selective exposure. Past research would suggest that strong/univalent rather than weak/ambivalent attitudes should be more predictive of proattitudinal information seeking. Although ambivalent attitude structure might weaken the attitude's effect on seeking proattitudinal information, we believe that conflicted attitudes might also motivate attitude-congruent selective exposure because proattitudinal information should be effective in reducing ambivalence. Two studies provide evidence that the effects of ambivalence on information choices depend on amount of issue knowledge. That is, ambivalence motivates attitude-consistent exposure when issue knowledge is relatively low because less familiar information is perceived to be effective at reducing ambivalence. Conversely, when knowledge is relatively high, more unambivalent (univalent) attitudes predicted attitude-consistent information seeking.

  17. Functional region prediction with a set of appropriate homologous sequences-an index for sequence selection by integrating structure and sequence information with spatial statistics

    Science.gov (United States)

    2012-01-01

    Background The detection of conserved residue clusters on a protein structure is one of the effective strategies for the prediction of functional protein regions. Various methods, such as Evolutionary Trace, have been developed based on this strategy. In such approaches, the conserved residues are identified through comparisons of homologous amino acid sequences. Therefore, the selection of homologous sequences is a critical step. It is empirically known that a certain degree of sequence divergence in the set of homologous sequences is required for the identification of conserved residues. However, the development of a method to select homologous sequences appropriate for the identification of conserved residues has not been sufficiently addressed. An objective and general method to select appropriate homologous sequences is desired for the efficient prediction of functional regions. Results We have developed a novel index to select the sequences appropriate for the identification of conserved residues, and implemented the index within our method to predict the functional regions of a protein. The implementation of the index improved the performance of the functional region prediction. The index represents the degree of conserved residue clustering on the tertiary structure of the protein. For this purpose, the structure and sequence information were integrated within the index by the application of spatial statistics. Spatial statistics is a field of statistics in which not only the attributes but also the geometrical coordinates of the data are considered simultaneously. Higher degrees of clustering generate larger index scores. We adopted the set of homologous sequences with the highest index score, under the assumption that the best prediction accuracy is obtained when the degree of clustering is the maximum. The set of sequences selected by the index led to higher functional region prediction performance than the sets of sequences selected by other sequence

  18. Estimating the recreational value of Pakistan's largest freshwater lake to support sustainable tourism management using a travel cost model

    NARCIS (Netherlands)

    Mangan, T.; Brouwer, R.; Lohano, H.; Nagraj, G.M.

    2013-01-01

    Keenjhar Lake, Pakistan's largest freshwater lake and an important Ramsar site, provides habitat for internationally important water birds. Annually, 385,000 people visit the lake. The lake is threatened by a variety of causes, including industrial and agricultural pollution. To support its

  19. The spatiotemporal dynamic analysis of the implied market information and characteristics of the correlation coefficient matrix of the international crude oil price returns

    International Nuclear Information System (INIS)

    Tian, Lixin; Ding, Zhenqi; Zhen, Zaili; Wang, Minggang

    2016-01-01

    The international crude oil market plays a crucial role in economies, and the studies of the correlation, risk and synchronization of the international crude oil market have important implications for the security and stability of the country, avoidance of business risk and people's daily lives. We investigate the information and characteristics of the international crude oil market (1999-2015) based on the random matrix theory (RMT). Firstly, we identify richer information in the largest eigenvalues deviating from RMT predictions for the international crude oil market; the international crude oil market can be roughly divided into ten different periods by the methods of eigenvectors and characteristic combination, and the implied market information of the correlation coefficient matrix is advanced. Secondly, we study the characteristics of the international crude oil market by the methods of system risk entropy, dynamic synchronous ratio, dynamic non-synchronous ratio and dynamic clustering algorithm. The results show that the international crude oil market is full of risk. The synchronization of the international crude oil market is very strong, and WTI and Brent occupy a very important position in the international crude oil market. (orig.)
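    The RMT comparison described above can be sketched by checking the largest eigenvalue of an empirical correlation matrix against the Marchenko-Pastur upper edge lambda_+ = (1 + sqrt(N/T))^2 expected for uncorrelated returns. The returns below are simulated with one common "market" factor; they are stand-ins, not actual WTI/Brent data:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    T, N = 2000, 10                        # observations x price series
    common = rng.normal(size=(T, 1))       # a shared "market mode"
    R = 0.6 * common + rng.normal(size=(T, N))
    R = (R - R.mean(0)) / R.std(0)         # standardize each return series
    C = (R.T @ R) / T                      # empirical correlation matrix
    eig = np.sort(np.linalg.eigvalsh(C))[::-1]
    lam_plus = (1 + np.sqrt(N / T)) ** 2   # Marchenko-Pastur upper edge
    # An eigenvalue above lam_plus deviates from the pure-noise prediction,
    # i.e. it carries genuine market information:
    print(eig[0] > lam_plus)  # → True
    ```

    The eigenvector of the largest deviating eigenvalue typically has same-sign entries on all series, which is how a market-wide mode (here dominated by benchmarks like WTI and Brent) is identified.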

  20. The spatiotemporal dynamic analysis of the implied market information and characteristics of the correlation coefficient matrix of the international crude oil price returns

    Energy Technology Data Exchange (ETDEWEB)

    Tian, Lixin [Jiangsu University, Energy Development and Environmental Protection Strategy Research Center, Zhenjiang, Jiangsu (China); Nanjing Normal University, School of Mathematical Sciences, Nanjing, Jiangsu (China); Ding, Zhenqi; Zhen, Zaili [Jiangsu University, Energy Development and Environmental Protection Strategy Research Center, Zhenjiang, Jiangsu (China); Wang, Minggang [Nanjing Normal University, School of Mathematical Sciences, Nanjing, Jiangsu (China)

    2016-08-15

    The international crude oil market plays a crucial role in economies, and the studies of the correlation, risk and synchronization of the international crude oil market have important implications for the security and stability of the country, avoidance of business risk and people's daily lives. We investigate the information and characteristics of the international crude oil market (1999-2015) based on the random matrix theory (RMT). Firstly, we identify richer information in the largest eigenvalues deviating from RMT predictions for the international crude oil market; the international crude oil market can be roughly divided into ten different periods by the methods of eigenvectors and characteristic combination, and the implied market information of the correlation coefficient matrix is advanced. Secondly, we study the characteristics of the international crude oil market by the methods of system risk entropy, dynamic synchronous ratio, dynamic non-synchronous ratio and dynamic clustering algorithm. The results show that the international crude oil market is full of risk. The synchronization of the international crude oil market is very strong, and WTI and Brent occupy a very important position in the international crude oil market. (orig.)

  1. M&A information technology best practices

    CERN Document Server

    Roehl-Anderson, Janice M

    2013-01-01

    Add value to your organization via the mergers & acquisitions IT function. As part of Deloitte Consulting, one of the largest mergers and acquisitions (M&A) consulting practices in the world, author Janice Roehl-Anderson reveals in M&A Information Technology Best Practices how companies can effectively and efficiently address the IT aspects of mergers, acquisitions, and divestitures. Filled with best practices for implementing and maintaining systems, this book helps financial and technology executives in every field to add value to their mergers, acquisitions, and/or divestitures via the IT

  2. Social Trust Prediction Using Heterogeneous Networks

    Science.gov (United States)

    HUANG, JIN; NIE, FEIPING; HUANG, HENG; TU, YI-CHENG; LEI, YU

    2014-01-01

    Along with increasing popularity of social websites, online users rely more on the trustworthiness information to make decisions, extract and filter information, and tag and build connections with other users. However, such social network data often suffer from severe data sparsity and are not able to provide users with enough information. Therefore, trust prediction has emerged as an important topic in social network research. Traditional approaches are primarily based on exploring trust graph topology itself. However, research in sociology and our life experience suggest that people who are in the same social circle often exhibit similar behaviors and tastes. To take advantage of the ancillary information for trust prediction, the challenge then becomes what to transfer and how to transfer. In this article, we address this problem by aggregating heterogeneous social networks and propose a novel joint social networks mining (JSNM) method. Our new joint learning model explores the user-group-level similarity between correlated graphs and simultaneously learns the individual graph structure; therefore, the shared structures and patterns from multiple social networks can be utilized to enhance the prediction tasks. As a result, we not only improve the trust prediction in the target graph but also facilitate other information retrieval tasks in the auxiliary graphs. To optimize the proposed objective function, we use the alternative technique to break down the objective function into several manageable subproblems. We further introduce the auxiliary function to solve the optimization problems with rigorously proved convergence. The extensive experiments have been conducted on both synthetic and real-world data. All empirical results demonstrate the effectiveness of our method. PMID:24729776

  3. State-owned companies dominate list of largest non-U.S. producers

    International Nuclear Information System (INIS)

    Beck, R.J.; Williamson, M.

    1994-01-01

    Because state-owned oil and gas companies dominate Oil and Gas Journal's list of largest non-US producers, data aren't fully comparable with those of the OGJ300. Many state companies report only production and reserves, with little or no financial data. Companies on the OGJ100, therefore, cannot be ranked by assets or revenues. Instead, they are listed by regions, based on location of corporate headquarters. There was no change in makeup of the top 20 holders of crude oil reserves. These companies' reserves totaled 872.3 billion bbl in 1993. The top 20 non-US companies now control 87.3 % of total world crude oil reserves, according to OGJ estimates. This is up marginally from 87.2 % of total world oil reserves in 1992. The top 20 had 87.7 % of total world reserves in 1991 and 85.5 % in 1990. The table lists company name, total assets, revenues, net income, capital and exploratory expenditures, worldwide oil production, gas production, oil and gas reserves worldwide

  4. Banking Umbilical Cord Blood (UCB) Stem Cells: Awareness, Attitude and Expectations of Potential Donors from One of the Largest Potential Repository (India).

    Science.gov (United States)

    Pandey, Deeksha; Kaur, Simar; Kamath, Asha

    2016-01-01

    The concept of Umbilical Cord blood (UCB) stem cells is emerging as a non-invasive, efficacious alternative source of hematopoietic stem cells to treat a variety of blood and bone marrow diseases, blood cancers, metabolic disorders and immune deficiencies. The aim of the present study was to determine the level of awareness about banking UCB among pregnant women in India. We also assessed patient perception for banking of UCB and explored the patient expectations of banking UCB in future. This is the first study to assess current attitudes in a sample population of potential donors from one of the largest potential UCB repositories (India). Obtaining this information may help optimize recruitment efforts and improve patient education. The present explorative questionnaire-based survey included 254 pregnant women in the final analysis. We established that only 26.5% of pregnant women in our study population knew what exactly is meant by UCB. A large proportion (55.1%) was undecided on whether they want to bank UCB or not. Women were more aware of the more advertised private cord blood banking compared to public banking. More than half of the pregnant women expected their obstetrician to inform them regarding UCB. One-third of the women in our population had undue expectations from banking of the UCB. Obstetricians should play a more active role in explaining to patients the pros and cons of UCB banking.

  5. Banking Umbilical Cord Blood (UCB Stem Cells: Awareness, Attitude and Expectations of Potential Donors from One of the Largest Potential Repository (India.

    Directory of Open Access Journals (Sweden)

    Deeksha Pandey

    Full Text Available The concept of Umbilical Cord blood (UCB) stem cells is emerging as a non-invasive, efficacious alternative source of hematopoietic stem cells to treat a variety of blood and bone marrow diseases, blood cancers, metabolic disorders and immune deficiencies. The aim of the present study was to determine the level of awareness about banking UCB among pregnant women in India. We also assessed patient perception for banking of UCB and explored the patient expectations of banking UCB in future. This is the first study to assess current attitudes in a sample population of potential donors from one of the largest potential UCB repositories (India). Obtaining this information may help optimize recruitment efforts and improve patient education. The present explorative questionnaire-based survey included 254 pregnant women in the final analysis. We established that only 26.5% of pregnant women in our study population knew what exactly is meant by UCB. A large proportion (55.1%) was undecided on whether they want to bank UCB or not. Women were more aware of the more advertised private cord blood banking compared to public banking. More than half of the pregnant women expected their obstetrician to inform them regarding UCB. One-third of the women in our population had undue expectations from banking of the UCB. Obstetricians should play a more active role in explaining to patients the pros and cons of UCB banking.

  6. Quantification of informed opinion

    International Nuclear Information System (INIS)

    Rasmuson, D.M.

    1985-01-01

    The objective of this session, Quantification of Informed Opinion, is to provide the statistician with a better understanding of this important area. The NRC uses informed opinion, sometimes called engineering judgment or subjective judgment, in many areas. Sometimes informed opinion is the only source of information that exists, especially in phenomenological areas, such as steam explosions, where experiments are costly and phenomena are very difficult to measure. There are many degrees of informed opinion. These vary from the weatherman, who makes predictions concerning relatively high-probability events with a large data base, to the phenomenological expert, who must use his intuition tempered with basic knowledge and little or no measured data to predict the behavior of events with a low probability of occurrence. The first paper in this session provides the reader with an overview of the subject area. The second paper provides some aspects that must be considered in the collection of informed opinion to improve the quality of the information. The final paper contains an example of the use of informed opinion in the area of seismic hazard characterization. These papers should be useful to researchers and statisticians who need to collect and use informed opinion in their work.

  7. Line Balancing Using Largest Candidate Rule Algorithm In A Garment Industry: A Case Study

    Directory of Open Access Journals (Sweden)

    V. P.Jaganathan

    2014-12-01

    Full Text Available The emergence of fast changes in fashion has given rise to the need to shorten production cycle times in the garment industry. As effective usage of resources has a significant effect on the productivity and efficiency of production operations, garment manufacturers are urged to utilize their resources effectively in order to meet dynamic customer demand. This paper focuses specifically on line balancing and layout modification. The aim of assembly line balance in sewing lines is to assign tasks to the workstations, so that the machines of the workstation can perform the assigned tasks with a balanced loading. Largest Candidate Rule Algorithm (LCR) has been deployed in this paper.
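
    The greedy step at the heart of the Largest Candidate Rule is straightforward to sketch. The Python below is an illustrative implementation with hypothetical task names, times and cycle time; precedence constraints, which production LCR implementations also respect, are omitted for brevity, and nothing here reproduces the paper's actual sewing-line data.

```python
def largest_candidate_rule(task_times, cycle_time):
    """Assign tasks to workstations by descending task time (basic LCR;
    precedence constraints omitted for brevity)."""
    if any(t > cycle_time for t in task_times.values()):
        raise ValueError("a task exceeds the cycle time")
    # Rank candidates from largest to smallest processing time.
    candidates = sorted(task_times.items(), key=lambda kv: -kv[1])
    stations = []
    while candidates:
        load, station = 0.0, []
        # Greedily pack tasks that still fit within the cycle time.
        for task, t in list(candidates):
            if load + t <= cycle_time + 1e-9:
                station.append(task)
                load += t
                candidates.remove((task, t))
        stations.append((station, round(load, 2)))
    return stations

# Hypothetical sewing tasks (minutes) and a 1.0-minute cycle time.
tasks = {"join shoulders": 0.40, "attach collar": 0.55,
         "set sleeves": 0.45, "hem bottom": 0.30, "buttonhole": 0.25}
for i, (ops, load) in enumerate(largest_candidate_rule(tasks, 1.0), 1):
    print(f"Station {i}: {ops} (load {load:.2f} min)")
```

    Balancing quality is then judged by line efficiency, i.e. total task time divided by (number of stations x cycle time).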

  8. The prediction of swimming performance in competition from behavioral information.

    Science.gov (United States)

    Rushall, B S; Leet, D

    1979-06-01

    The swimming performances of the Canadian Team at the 1976 Olympic Games were categorized as improved or worse than previous best times in the events contested. The two groups had been previously assessed on the Psychological Inventories for Competitive Swimmers. A stepwise multiple-discriminant analysis of the inventory responses revealed that 13 test questions produced a perfect discrimination of group membership. The resultant discriminant functions for predicting performance classification were applied to the test responses of 157 swimmers at the 1977 Canadian Winter National Swimming Championships. Using the same performance classification criteria, the accuracy of prediction was no better than chance in three of the four sex-by-performance classifications. The study thus failed to locate a set of behavioral factors that determine swimming performance improvements in elite competitive circumstances, raising the possibility of sets of factors that do not discriminate between performances in similar environments or between similar groups of swimmers.

  9. Predicting Defects Using Information Intelligence Process Models in the Software Technology Project.

    Science.gov (United States)

    Selvaraj, Manjula Gandhi; Jayabal, Devi Shree; Srinivasan, Thenmozhi; Balasubramanie, Palanisamy

    2015-01-01

    A key differentiator in a competitive marketplace is customer satisfaction. As per a 2012 Gartner report, only 75%-80% of IT projects are successful. Customer satisfaction should be considered part of business strategy. The associated project parameters should be proactively managed and the project outcome needs to be predicted by a technical manager. There is a lot of focus on the end state and on minimizing defect leakage as much as possible. The focus should instead be on proactive management and shifting left in the software life cycle engineering model: identify problems up front in the project cycle rather than waiting for lessons to be learned and taking reactive steps. This paper shows the practical applicability of predictive models and illustrates their use in a project to predict system testing defects, thus helping to reduce residual defects.

  10. The importance of information on relatives for the prediction of genomic breeding values and the implications for the makeup of reference data sets in livestock breeding schemes.

    Science.gov (United States)

    Clark, Samuel A; Hickey, John M; Daetwyler, Hans D; van der Werf, Julius H J

    2012-02-09

    The theory of genomic selection is based on the prediction of the effects of genetic markers in linkage disequilibrium with quantitative trait loci. However, genomic selection also relies on relationships between individuals to accurately predict genetic value. This study aimed to examine the importance of information on relatives versus that of unrelated or more distantly related individuals on the estimation of genomic breeding values. Simulated and real data were used to examine the effects of various degrees of relationship on the accuracy of genomic selection. Genomic Best Linear Unbiased Prediction (gBLUP) was compared to two pedigree-based BLUP methods, one with a shallow one-generation pedigree and the other with a deep ten-generation pedigree. The accuracy of estimated breeding values for different groups of selection candidates that had varying degrees of relationship to a reference data set of 1750 animals was investigated. The gBLUP method predicted breeding values more accurately than BLUP. The most accurate breeding values were estimated using gBLUP for closely related animals. Similarly, the pedigree-based BLUP methods were also accurate for closely related animals; however, when the pedigree-based BLUP methods were used to predict unrelated animals, the accuracy was close to zero. In contrast, gBLUP breeding values for animals that had no pedigree relationship with animals in the reference data set retained substantial accuracy. An animal's relationship to the reference data set is an important factor for the accuracy of genomic predictions. Animals that share a close relationship to the reference data set had the highest accuracy from genomic predictions. However, a baseline accuracy, driven by the reference data set size and the overall effective population size, enables gBLUP to estimate a breeding value for unrelated animals within a population (breed), using information previously ignored by pedigree-based BLUP methods.
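
    The contrast between pedigree-based and genomic prediction can be illustrated with a minimal gBLUP sketch: build a VanRaden-style genomic relationship matrix and solve one mixed-model system. This numpy toy uses simulated genotypes and an assumed heritability of 0.5; it is not the software or data used in the study.

```python
import numpy as np

def vanraden_grm(M):
    """Genomic relationship matrix from an (animals x markers) 0/1/2 matrix."""
    p = M.mean(axis=0) / 2.0            # allele frequencies
    Z = M - 2.0 * p                     # centre genotypes at 2p
    return Z @ Z.T / (2.0 * np.sum(p * (1.0 - p)))

def gblup(y, G, h2):
    """Breeding values for every animal in G from the first len(y) phenotypes."""
    n = len(y)
    lam = (1.0 - h2) / h2               # residual-to-genetic variance ratio
    V = G[:n, :n] + lam * np.eye(n)     # covariance of the phenotyped block
    alpha = np.linalg.solve(V, y - y.mean())
    return G[:, :n] @ alpha             # includes animals with no phenotype

# Toy data: 120 animals, 400 markers; the last 20 animals are "selection
# candidates" with genotypes but no phenotypic records.
rng = np.random.default_rng(0)
M = rng.integers(0, 3, size=(120, 400)).astype(float)
u_true = (M - M.mean(axis=0)) @ (rng.normal(size=400) * 0.1)
y = u_true[:100] + rng.normal(scale=u_true.std(), size=100)   # h2 ~ 0.5
ebv = gblup(y, vanraden_grm(M), h2=0.5)
```

    Because G is computed for all genotyped animals, the last line returns estimated breeding values even for the 20 animals without phenotypes — the property that pedigree BLUP lacks for candidates unrelated to the reference set.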

  11. Health-related ad information and health motivation effects on product evaluations

    DEFF Research Database (Denmark)

    Chrysochou, Polymeros; Grunert, Klaus G

    2014-01-01

    This study tests the effect of health-related ad information on perceived product healthfulness and purchase intention. Also, the study investigates whether consumers' health motivation moderates the effects, because of the way health motivation affects processing of health-related information...... in ads. Three types of healthrelated ad elements are distinguished: functional claims, process claims and health imagery. These elements were combined in mock ads and an online experiment was run to test the study hypotheses. Results show that health imagery has the largest impact on consumers' product...

  12. Literature-based condition-specific miRNA-mRNA target prediction.

    Directory of Open Access Journals (Sweden)

    Minsik Oh

    Full Text Available miRNAs are small non-coding RNAs that regulate gene expression by binding to the 3'-UTR of genes. Many recent studies have reported that miRNAs play important biological roles by regulating specific mRNAs or genes. Many sequence-based target prediction algorithms have been developed to predict miRNA targets. However, these methods are not designed for condition-specific target prediction and produce many false positives; thus, expression-based target prediction algorithms have been developed for condition-specific target prediction. A typical strategy to utilize expression data is to leverage the negative control roles of miRNAs on genes. To control false positives, a stringent cutoff value is typically set, but in this case, these methods tend to reject many true target relationships, i.e., false negatives. To overcome these limitations, additional information should be utilized. The literature is probably the best resource that we can utilize. Recent literature mining systems compile millions of articles with experiments designed for specific biological questions, and the systems provide a function to search for specific information. To utilize the literature information, we used a literature mining system, BEST, that automatically extracts information from the literature in PubMed and that allows the user to perform searches of the literature with any English words. By integrating omics data analysis methods and BEST, we developed Context-MMIA, a miRNA-mRNA target prediction method that combines expression data analysis results and the literature information extracted based on the user-specified context. In the pathway enrichment analysis using genes included in the top 200 miRNA-targets, Context-MMIA outperformed the four existing target prediction methods that we tested. In another test of whether prediction methods can reproduce experimentally validated target relationships, Context-MMIA outperformed the four existing target prediction

  13. Towards a predictive theory for genetic regulatory networks

    Science.gov (United States)

    Tkacik, Gasper

    When cells respond to changes in the environment by regulating the expression levels of their genes, we often draw parallels between these biological processes and engineered information processing systems. One can go beyond this qualitative analogy, however, by analyzing information transmission in biochemical "hardware" using Shannon's information theory. Here, gene regulation is viewed as a transmission channel operating under restrictive constraints set by the resource costs and intracellular noise. We present a series of results demonstrating that a theory of information transmission in genetic regulatory circuits feasibly yields non-trivial, testable predictions. These predictions concern strategies by which individual gene regulatory elements, e.g., promoters or enhancers, read out their signals; as well as strategies by which small networks of genes, independently or in spatially coupled settings, respond to their inputs. These predictions can be quantitatively compared to the known regulatory networks and their function, and can elucidate how reproducible biological processes, such as embryonic development, can be orchestrated by networks built out of noisy components. Preliminary successes in the gap gene network of the fruit fly Drosophila indicate that a full ab initio theoretical prediction of a regulatory network is possible, a feat that has not yet been achieved for any real regulatory network. We end by describing open challenges on the path towards such a prediction.

  14. The Information Needs of Virtual Users: A Study of Second Life Libraries

    Science.gov (United States)

    Chow, Anthony S.; Baity, C. Chase; Zamarripa, Marilyn; Chappell, Pam; Rachlin, David; Vinson, Curtis

    2012-01-01

    As virtual worlds continue to proliferate globally, libraries are faced with the question of whether to provide information services to virtual patrons. This study, utilizing a mixed-method approach of interviews, focus groups, and surveys, represents one of the largest studies of virtual libraries attempted to date. Taking a holistic perspective,…

  15. CCTOP: a Consensus Constrained TOPology prediction web server.

    Science.gov (United States)

    Dobson, László; Reményi, István; Tusnády, Gábor E

    2015-07-01

    The Consensus Constrained TOPology prediction (CCTOP; http://cctop.enzim.ttk.mta.hu) server is a web-based application providing transmembrane topology prediction. In addition to utilizing 10 different state-of-the-art topology prediction methods, the CCTOP server incorporates topology information from existing experimental and computational sources available in the PDBTM, TOPDB and TOPDOM databases using the probabilistic framework of hidden Markov models. The server provides the option to precede the topology prediction with signal peptide prediction and transmembrane-globular protein discrimination. The initial result can be recalculated by (de)selecting any of the prediction methods or mapped experiments or by adding user-specified constraints. CCTOP showed superior performance to existing approaches. The reliability of each prediction is also calculated, which correlates with the accuracy of the per-protein topology prediction. The prediction results and the collected experimental information are visualized on the CCTOP home page and can be downloaded in XML format. Programmable access of the CCTOP server is also available, and an example of client-side script is provided. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  16. fRMSDPred: Predicting Local RMSD Between Structural Fragments Using Sequence Information

    National Research Council Canada - National Science Library

    Rangwala, Huzefa; Karypis, George

    2007-01-01

    .... We present algorithms to solve this fragment-level RMSD prediction problem using a supervised learning framework based on support vector regression and classification that incorporates protein...

  17. Water-saturated systems of the largest gas and gas-condensate deposits of the USSR. Vodonapornye sistemy krupneishikh gazovykh i gazokondensatnykh mestorozhdenii sssr

    Energy Technology Data Exchange (ETDEWEB)

    Kortsenshtein, V.N.

    1977-01-01

    A description is given of water-pressure systems in a number of the largest gas and gas-condensate fields of the Soviet Union, whose industrial reserves exceed 500 billion cubic meters. These include fields located in the concluding stage of development with sharply reduced recovery (Shebelinsk), fields that have just begun to operate and are characterized by increasing production (Vuktyl, Medved, Orenburg, Shatlyk, Urengoisk), and fields that are not yet developed (Yamburg and Zapolyar). Problems in the theory and practice of studying water-pressure systems of the largest gas and gas-condensate fields are analyzed primarily in connection with conditions required for their rational development which would provide for a maximum extraction of hydrocarbons from the interior. Importance is also given to the hydrogeological aspects of the formation of large hydrocarbon deposits and their distribution in the earth's crust. The most reliable factual materials on hydrogeology are utilized. The book is designed for personnel in the gas and oil industries, hydrogeologists, and scientists interested in problems of the formation, survey, and development of the largest hydrocarbon deposits. 92 references, 65 figures, 71 tables.

  18. Using field data to assess model predictions of surface and ground fuel consumption by wildfire in coniferous forests of California

    Science.gov (United States)

    Lydersen, Jamie M.; Collins, Brandon M.; Ewell, Carol M.; Reiner, Alicia L.; Fites, Jo Ann; Dow, Christopher B.; Gonzalez, Patrick; Saah, David S.; Battles, John J.

    2014-03-01

    Inventories of greenhouse gas (GHG) emissions from wildfire provide essential information to the state of California, USA, and other governments that have enacted emission reductions. Wildfires can release a substantial amount of GHGs and other compounds to the atmosphere, so recent increases in fire activity may be increasing GHG emissions. Quantifying wildfire emissions, however, can be difficult due to inherent variability in fuel loads and consumption and a lack of field data on fuel consumption by wildfire. We compare a unique set of fuel data collected immediately before and after six wildfires in coniferous forests of California to fuel consumption predictions of the first-order fire effects model (FOFEM), based on two different available fuel characterizations. We found strong regional differences in the performance of different fuel characterizations, with FOFEM overestimating fuel consumption to a greater extent in the Klamath Mountains than in the Sierra Nevada. Inaccurate fuel load inputs caused the largest differences between predicted and observed fuel consumption. Fuel classifications tended to overestimate duff load and underestimate litter load, leading to differences in predicted emissions for some pollutants. When considering total ground and surface fuels, modeled consumption was fairly accurate on average, although the range of error in estimates of plot-level consumption was very large. These results highlight the importance of fuel load input to the accuracy of modeled fuel consumption and GHG emissions from wildfires in coniferous forests.

  19. Using information from historical high-throughput screens to predict active compounds.

    Science.gov (United States)

    Riniker, Sereina; Wang, Yuan; Jenkins, Jeremy L; Landrum, Gregory A

    2014-07-28

    Modern high-throughput screening (HTS) is a well-established approach for hit finding in drug discovery that is routinely employed in the pharmaceutical industry to screen more than a million compounds within a few weeks. However, as the industry shifts to more disease-relevant but more complex phenotypic screens, the focus has moved to piloting smaller but smarter chemically/biologically diverse subsets followed by an expansion around hit compounds. One standard method for doing this is to train a machine-learning (ML) model with the chemical fingerprints of the tested subset of molecules and then select the next compounds based on the predictions of this model. An alternative approach would be to take advantage of the wealth of bioactivity information contained in older (full-deck) screens using so-called HTS fingerprints, where each element of the fingerprint corresponds to the outcome of a particular assay, as input to machine-learning algorithms. We constructed HTS fingerprints using two collections of data: 93 in-house assays and 95 publicly available assays from PubChem. For each source, an additional set of 51 and 46 assays, respectively, was collected for testing. Three different ML methods, random forest (RF), logistic regression (LR), and naïve Bayes (NB), were investigated for both the HTS fingerprint and a chemical fingerprint, Morgan2. RF was found to be best suited for learning from HTS fingerprints, yielding area under the receiver operating characteristic curve (AUC) values >0.8 for 78% of the internal assays and enrichment factors at 5% (EF(5%)) >10 for 55% of the assays. The RF(HTS-fp) generally outperformed the LR trained with Morgan2, which was the best ML method for the chemical fingerprint, for the majority of assays. In addition, HTS fingerprints were found to retrieve more diverse chemotypes. Combining the two models through heterogeneous classifier fusion led to a similar or better performance than the best individual model for all assays.
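
    The fingerprint-learning setup can be sketched with scikit-learn as follows. Everything here is a synthetic stand-in: each row is a compound's binary outcome vector over 93 historical assays (the count mirrors the in-house collection), and the label is activity in a held-out assay whose signal is planted in the first five columns; none of this reproduces the paper's assays or results.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n_compounds, n_assays = 500, 93
# Synthetic "HTS fingerprints": one bit per historical assay outcome.
X = rng.integers(0, 2, size=(n_compounds, n_assays)).astype(float)
# Planted ground truth: activity correlates with five "related" assays.
logits = X[:, :5].sum(axis=1) - 2.5
y = (logits + rng.normal(scale=1.0, size=n_compounds) > 0).astype(int)

train, test = np.arange(400), np.arange(400, 500)
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(X[train], y[train])
auc = roc_auc_score(y[test], rf.predict_proba(X[test])[:, 1])
print(f"test AUC: {auc:.2f}")
```

    In the paper's setting, the compounds ranked highest by `predict_proba` would form the next screening subset.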

  20. Improving acute kidney injury diagnostics using predictive analytics.

    Science.gov (United States)

    Basu, Rajit K; Gist, Katja; Wheeler, Derek S

    2015-12-01

    Acute kidney injury (AKI) is a multifactorial syndrome affecting an alarming proportion of hospitalized patients. Although early recognition may expedite management, the ability to identify patients at risk and those suffering real-time injury is inconsistent. The review will summarize recent reports describing advancements in the area of AKI epidemiology, specifically focusing on risk scoring and predictive analytics. In the critical care population, the primary underlying factors limiting prediction models include an inability to properly account for patient heterogeneity and underperforming metrics used to assess kidney function. Severity-of-illness scores demonstrate limited AKI predictive performance. Recent evidence suggests traditional methods for detecting AKI may be leveraged and ultimately replaced by newer, more sophisticated analytical tools capable of prediction and identification: risk stratification, novel AKI biomarkers, and clinical information systems. Additionally, the utility of novel biomarkers may be optimized through targeting using patient context, and may provide more granular information about the injury phenotype. Finally, manipulation of the electronic health record allows for real-time recognition of injury. Integrating a high-functioning clinical information system with risk stratification methodology and novel biomarkers yields a predictive analytic model for AKI diagnostics.

  1. Computational data sciences for assessment and prediction of climate extremes

    Science.gov (United States)

    Ganguly, A. R.

    2011-12-01

    Climate extremes may be defined inclusively as severe weather events or large shifts in global or regional weather patterns which may be caused or exacerbated by natural climate variability or climate change. This area of research arguably represents one of the largest knowledge gaps in climate science relevant for informing resource managers and policy makers. While physics-based climate models are essential in view of non-stationary and nonlinear dynamical processes, their current pace of uncertainty reduction may not be adequate for urgent stakeholder needs. The structure of the models may in some cases preclude reduction of uncertainty for critical processes at scales or for the extremes of interest. On the other hand, methods based on complex networks, extreme value statistics, machine learning, and space-time data mining have demonstrated significant promise to improve scientific understanding and generate enhanced predictions. When combined with conceptual process understanding at multiple spatiotemporal scales and designed to handle massive data, interdisciplinary data science methods and algorithms may complement or supplement physics-based models. Specific examples from the prior literature and our ongoing work suggest how data-guided improvements may be possible, for example, in the context of ocean meteorology, climate oscillators, teleconnections, and atmospheric process understanding, which in turn can improve projections of regional climate, precipitation extremes and tropical cyclones in a useful and interpretable fashion. A community-wide effort is motivated to develop and adapt computational data science tools for translating climate model simulations to information relevant for adaptation and policy, as well as for improving our scientific understanding of climate extremes from both observed and model-simulated data.

  2. Burst of virus infection and a possibly largest epidemic threshold of non-Markovian susceptible-infected-susceptible processes on networks

    Science.gov (United States)

    Liu, Qiang; Van Mieghem, Piet

    2018-02-01

    Since a real epidemic process is not necessarily Markovian, the epidemic threshold obtained under the Markovian assumption may not be realistic. To understand general non-Markovian epidemic processes on networks, we study the Weibullian susceptible-infected-susceptible (SIS) process in which the infection process is a renewal process with a Weibull time distribution. We find that, if the infection rate exceeds 1/ln(λ1 + 1), where λ1 is the largest eigenvalue of the network's adjacency matrix, then the infection will persist on the network under the mean-field approximation. Thus, 1/ln(λ1 + 1) is possibly the largest epidemic threshold for a general non-Markovian SIS process with a Poisson curing process under the mean-field approximation. Furthermore, non-Markovian SIS processes may result in a multimodal prevalence. As a byproduct, we show that a limiting Weibullian SIS process has the potential to model bursts of a synchronized infection.
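
    Both thresholds follow directly from the spectral radius of the adjacency matrix, so they are simple to compute. A small numpy sketch, checked on the complete graph K5, where λ1 = N - 1 = 4:

```python
import numpy as np

def sis_thresholds(A):
    """Spectral radius and the two mean-field epidemic thresholds."""
    lam1 = np.max(np.linalg.eigvalsh(A))   # A symmetric (undirected graph)
    return lam1, 1.0 / lam1, 1.0 / np.log(lam1 + 1.0)

# Complete graph on 5 nodes: adjacency is all-ones minus the identity.
A = np.ones((5, 5)) - np.eye(5)
lam1, tau_markov, tau_weibull = sis_thresholds(A)
```

    Since ln(λ1 + 1) ≤ λ1, the non-Markovian threshold 1/ln(λ1 + 1) is never below the classical mean-field threshold 1/λ1.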

  3. Pay less, take more: the myth of quality of life at the Brazil’s largest drugstore chain counter

    Directory of Open Access Journals (Sweden)

    André Luiz Maranhão de Souza Leão

    2013-06-01

    Full Text Available Considering the role of cultural transformer that the brand occupies in contemporary society, and seeking to delineate the role played by myth in this construction, we look at Pague Menos, the largest pharmacy retail network in Brazil, through a qualitative study based on Barthesian semiology. Our results revealed six myths whose relationships form a metanarrative: quality of life.

  4. Selecting Optimal Random Forest Predictive Models: A Case Study on Predicting the Spatial Distribution of Seabed Hardness

    Science.gov (United States)

    Li, Jin; Tran, Maggie; Siwabessy, Justy

    2016-01-01

    Spatially continuous predictions of seabed hardness are important baseline environmental information for sustainable management of Australia's marine jurisdiction. Seabed hardness is often inferred from multibeam backscatter data with unknown accuracy and can be inferred from underwater video footage at limited locations. In this study, we classified the seabed into four classes based on two new seabed hardness classification schemes (i.e., hard90 and hard70). We developed optimal predictive models to predict seabed hardness using random forest (RF) based on the point data of hardness classes and spatially continuous multibeam data. Five feature selection (FS) methods, namely variable importance (VI), averaged variable importance (AVI), knowledge-informed AVI (KIAVI), Boruta and regularized RF (RRF), were tested based on predictive accuracy. Effects of highly correlated, important and unimportant predictors on the accuracy of RF predictive models were examined. Finally, spatial predictions generated using the most accurate models were visually examined and analysed. This study confirmed that: 1) hard90 and hard70 are effective seabed hardness classification schemes; 2) seabed hardness of four classes can be predicted with a high degree of accuracy; 3) the typical approach used to pre-select predictive variables by excluding highly correlated variables needs to be re-examined; 4) the identification of the important and unimportant predictors provides useful guidelines for further improving predictive models; 5) FS methods select the most accurate predictive model(s) instead of the most parsimonious ones, and AVI and Boruta are recommended for future studies; and 6) RF is an effective modelling method with high predictive accuracy for multi-level categorical data and can be applied to 'small p and large n' problems in environmental sciences. Additionally, automated computational programs for AVI need to be developed to increase its computational efficiency and
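
    An averaged-importance selection step of the kind compared in the study (VI/AVI) can be sketched with scikit-learn. The data, number of repeats and cutoff below are synthetic and illustrative, not the authors' pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)     # only features 0 and 1 carry signal

# Average impurity-based importances over repeated fits to stabilise the
# ranking (AVI-style averaging), then keep the top-ranked predictors.
imps = np.mean(
    [RandomForestClassifier(n_estimators=100, random_state=s)
     .fit(X, y).feature_importances_ for s in range(5)],
    axis=0)
selected = np.argsort(imps)[::-1][:2]
```

    A full AVI procedure would also drop the lowest-ranked predictor iteratively and refit, keeping the model with the best predictive accuracy rather than a fixed top-k.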

  5. BNP predicts chemotherapy-related cardiotoxicity and death

    DEFF Research Database (Denmark)

    Skovgaard, Dorthe; Hasbak, Philip; Kjaer, Andreas

    2014-01-01

    ventriculography (MUGA) or echocardiography. However, the plasma cardiac biomarker B-type natriuretic peptide (BNP) has been suggested for early identification of cardiac dysfunction. The aim of the study was to compare LVEF obtained by MUGA and plasma BNP as predictors of developing congestive heart failure (CHF...... death. In multivariate Cox analysis both BNP and LVEF were independent predictors of CHF while age remained the only independent predictor of overall death. CONCLUSION: In cancer patients treated with cardiotoxic chemotherapy both BNP and LVEF can significantly predict subsequent hospitalization...... with CHF. In addition, BNP and not LVEF has a prognostic value in detecting overall death. This prospective study based on the hitherto largest study population supports BNP as a clinical relevant method for monitoring chemotherapy-related cardiac failure and death....

  6. Wireless digital information transfer : Modelling, prediction and assessment

    NARCIS (Netherlands)

    Lager, I.E.; De Hoop, A.T.; Kikkawa, T.

    2013-01-01

    The loop-to-loop pulsed electromagnetic field wireless signal transfer is investigated with a view on its application in wireless digital information transfer. Closed-form expressions are derived for the emitted magnetic field and for the open-circuit voltage of the receiving loop in dependence on

  7. Largest solar installation on a hotel in Switzerland; Groesste Hotel-Solaranlage der Schweiz

    Energy Technology Data Exchange (ETDEWEB)

    Stadelmann, M.

    2008-07-01

    This article describes the solar thermal installation on the Hotel Europa in St. Moritz-Champfer, Switzerland. The installation provides heat energy for domestic hot water preparation and for the heating of the hotel's indoor swimming pool. A thirty-percent reduction of heating oil consumption has been obtained. The system, which is based on the 'low-flow' principle, provides the highest possible temperature difference while using low pumping energy. The hotel's hot-water circulation system, which ensures fast availability of hot water at the taps, is also discussed. This largest hotel solar installation is designed to meet heating and hot-water requirements during the summer season. The high requirements placed on the materials used are discussed. Schematics are provided and first operational experience is briefly discussed.

  8. SHMF: Interest Prediction Model with Social Hub Matrix Factorization

    Directory of Open Access Journals (Sweden)

    Chaoyuan Cui

    2017-01-01

    Full Text Available With the development of social networks, microblogging has become a major social communication tool. Microblogs contain a great deal of valuable information, such as personal preferences, public opinion, and marketing signals. Consequently, research on user interest prediction in microblogs has clear practical significance. However, extracting information associated with user interests from constantly updated blog posts is not easy. Existing prediction approaches based on probabilistic factor analysis use blog posts published by a user to predict that user's interests. However, these methods are not very effective for users who post little but browse a lot. In this paper, we propose a new prediction model, called SHMF, using social hub matrix factorization. SHMF constructs the interest prediction model by combining information from blog posts published by both the user and direct neighbors in the user's social hub. Our proposed model predicts user interest by integrating the user's historical behavior and temporal factors as well as the user's friendships, thus achieving accurate forecasts of the user's future interests. The experimental results on Sina Weibo show the efficiency and effectiveness of our proposed model.
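
    A plain matrix-factorization baseline shows the mechanics SHMF builds on: factor a partially observed user-interest matrix R into low-rank factors P and Q by stochastic gradient descent over the observed entries. This is a generic sketch, not SHMF itself, which additionally incorporates the social-hub and temporal information described above; the toy matrix is invented for illustration.

```python
import numpy as np

def factorize(R, mask, k=2, lr=0.02, reg=0.02, epochs=1000, seed=0):
    """SGD matrix factorization fitted only on entries where mask == 1."""
    rng = np.random.default_rng(seed)
    P = rng.normal(scale=0.1, size=(R.shape[0], k))
    Q = rng.normal(scale=0.1, size=(R.shape[1], k))
    users, items = np.nonzero(mask)
    for _ in range(epochs):
        for u, i in zip(users, items):
            err = R[u, i] - P[u] @ Q[i]
            P[u] += lr * (err * Q[i] - reg * P[u])
            Q[i] += lr * (err * P[u] - reg * Q[i])
    return P, Q

# Toy interest matrix (rows: users, columns: items); one entry is hidden
# and then predicted from the learned factors.
R = np.array([[5, 4, 0, 1],
              [4, 5, 1, 0],
              [1, 0, 5, 4],
              [0, 1, 4, 5.0]])
mask = np.ones_like(R)
mask[0, 2] = 0                  # hide user 0's interest in item 2
P, Q = factorize(R, mask)
pred = P[0] @ Q[2]              # predicted interest for the hidden entry
```

    In SHMF, the observed entries would be derived from the user's own posts plus those of social-hub neighbors, with temporal weighting; the factorization machinery itself is the same.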

  9. Orthology prediction methods: a quality assessment using curated protein families.

    Science.gov (United States)

    Trachana, Kalliopi; Larsson, Tomas A; Powell, Sean; Chen, Wei-Hua; Doerks, Tobias; Muller, Jean; Bork, Peer

    2011-10-01

    The increasing number of sequenced genomes has prompted the development of several automated orthology prediction methods. Tests to evaluate the accuracy of predictions and to explore biases caused by biological and technical factors are therefore required. We used 70 manually curated families to analyze the performance of five public methods in Metazoa. We analyzed the strengths and weaknesses of the methods and quantified the impact of biological and technical challenges. From the latter part of the analysis, genome annotation emerged as the largest single influencer, affecting up to 30% of the performance. Generally, most methods did well in assigning orthologous group but they failed to assign the exact number of genes for half of the groups. The publicly available benchmark set (http://eggnog.embl.de/orthobench/) should facilitate the improvement of current orthology assignment protocols, which is of utmost importance for many fields of biology and should be tackled by a broad scientific community. Copyright © 2011 WILEY Periodicals, Inc.

  10. Trust-based collective view prediction

    CERN Document Server

    Luo, Tiejian; Xu, Guandong; Zhou, Jia

    2013-01-01

    Collective view prediction is to judge the opinions of an active web user based on unknown elements by referring to the collective mind of the whole community. Content-based recommendation and collaborative filtering are two mainstream collective view prediction techniques. They generate predictions by analyzing the text features of the target object or the similarity of users' past behaviors. Still, these techniques are vulnerable to artificially injected noise, because they cannot judge the reliability and credibility of the information sources. Trust-based Collective View
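
    Of the two mainstream techniques named above, collaborative filtering is easy to sketch: predict a missing rating as a similarity-weighted average of ratings by other users. A minimal user-based variant on toy data (no trust weighting, which is exactly the gap the book addresses; all names and values are illustrative):

```python
import numpy as np

def predict_user_based(R, user, item, eps=1e-9):
    """Predict R[user, item] as a cosine-similarity-weighted average
    of other users' ratings for that item. Zeros in R mean 'unrated'."""
    raters = np.nonzero(R[:, item])[0]      # users who rated the item
    raters = raters[raters != user]
    if raters.size == 0:
        return 0.0
    target = R[user]
    sims = np.array([
        R[v] @ target / (np.linalg.norm(R[v]) * np.linalg.norm(target) + eps)
        for v in raters
    ])
    return float(sims @ R[raters, item] / (np.abs(sims).sum() + eps))

# Toy ratings (users x items); user 0 has not rated item 2.
R = np.array([[4.0, 3.0, 0.0],
              [4.0, 3.0, 5.0],
              [1.0, 1.0, 2.0]])
score = predict_user_based(R, user=0, item=2)
```

    Because each rater's vote is weighted only by behavioral similarity, an attacker who injects profiles mimicking the target user can steer the prediction; trust-based approaches add a credibility weight on top of the similarity scores.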

  11. Information Storage and Management Storing, Managing, and Protecting Digital Information in Classic, Virtualized, and Cloud Environments

    CERN Document Server

    Services, EMC Education

    2012-01-01

    The new edition of a bestseller, now revised and updated throughout! This new edition of the unparalleled bestseller serves as a full training course all in one; as the world's largest data storage company, EMC is the ideal author for such a critical resource. They cover the components of a storage system and the different storage system models, while also offering essential new material that explores the advances in existing technologies and the emergence of the "Cloud," as well as updates and vital information on new technologies. Features a separate section on the emerging area of cloud computing

  12. Translational Modeling in Schizophrenia: Predicting Human Dopamine D2 Receptor Occupancy.

    Science.gov (United States)

    Johnson, Martin; Kozielska, Magdalena; Pilla Reddy, Venkatesh; Vermeulen, An; Barton, Hugh A; Grimwood, Sarah; de Greef, Rik; Groothuis, Geny M M; Danhof, Meindert; Proost, Johannes H

    2016-04-01

    To assess the ability of a previously developed hybrid physiology-based pharmacokinetic-pharmacodynamic (PBPKPD) model in rats to predict the dopamine D2 receptor occupancy (D2RO) in human striatum following administration of antipsychotic drugs. A hybrid PBPKPD model, previously developed using information on plasma concentrations, brain exposure and D2RO in rats, was used as the basis for the prediction of D2RO in humans. The rat pharmacokinetic and brain physiology parameters were substituted with human population pharmacokinetic parameters and human physiological information. To predict the passive transport across the human blood-brain barrier, apparent permeability values were scaled based on rat and human brain endothelial surface area. Active efflux clearance in brain was scaled from rat to human using both human brain endothelial surface area and MDR1 expression. Binding constants at the D2 receptor were scaled based on the differences between in vitro and in vivo systems of the same species. The predictive power of this physiology-based approach was determined by comparing the D2RO predictions with the observed human D2RO of six antipsychotics at clinically relevant doses. Predicted human D2RO was in good agreement with clinically observed D2RO for five antipsychotics. Models using in vitro information predicted human D2RO well for most of the compounds evaluated in this analysis. However, human D2RO was under-predicted for haloperidol. The rat hybrid PBPKPD model structure, integrated with in vitro information and human pharmacokinetic and physiological information, constitutes a scientific basis to predict the time course of D2RO in humans.

  13. Seal carrion is a predictable resource for coastal ecosystems

    Science.gov (United States)

    Quaggiotto, Maria-Martina; Barton, Philip S.; Morris, Christopher D.; Moss, Simon E. W.; Pomeroy, Patrick P.; McCafferty, Dominic J.; Bailey, David M.

    2018-04-01

    The timing, magnitude, and spatial distribution of resource inputs can have large effects on dependent organisms. Few studies have examined the predictability of such resources, and no standard ecological measure of predictability exists. We examined the potential predictability of carrion resources provided by one of the UK's largest grey seal (Halichoerus grypus) colonies, on the Isle of May, Scotland. We used aerial (11 years) and ground surveys (3 years) to quantify the variability in time, space, quantity (kg), and quality (MJ) of seal carrion during the seal pupping season. We then compared the potential predictability of seal carrion to other periodic changes in food availability in nature. An average of 6893 kg of carrion yr⁻¹, corresponding to 110.5 × 10³ MJ yr⁻¹, was released for potential scavengers as placentae and dead animals. A fifth of the total biomass from dead seals was consumed by the end of the pupping season, mostly by avian scavengers. The spatial distribution of carcasses was similar across years, and 28% of the area containing >10 carcasses ha⁻¹ was shared among all years. Relative standard errors (RSE) in space, time, quantity, and quality of carrion were all below 34%. This is similar to other allochthonous-dependent ecosystems, such as those affected by migratory salmon, and indicates high predictability of seal carrion as a resource. Our study illustrates how to quantify predictability in carrion, which is of general relevance to ecosystems that are dependent on this resource. We also highlight the importance of carrion to marine coastal ecosystems, where it sustains avian scavengers, thus affecting ecosystem structure and function.
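
    The predictability measure used in the study, relative standard error (the standard error of the mean divided by the mean), takes only a few lines to compute. The yearly biomass figures below are hypothetical, not the study's data:

```python
import math

def relative_standard_error(values):
    """RSE = standard error of the mean / mean, as a percentage."""
    n = len(values)
    mean = sum(values) / n
    var = sum((x - mean) ** 2 for x in values) / (n - 1)  # sample variance
    se = math.sqrt(var / n)                               # standard error
    return 100.0 * se / mean

# Hypothetical yearly carrion biomass totals (kg) across survey years.
biomass = [6100, 7300, 6900, 7400, 6800]
rse = relative_standard_error(biomass)
```

    Lower RSE across years means a more predictable resource; the study reports RSEs below 34% for the space, time, quantity, and quality of carrion.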

  14. SU-D-BRA-04: Fractal Dimension Analysis of Edge-Detected Rectal Cancer CTs for Outcome Prediction

    International Nuclear Information System (INIS)

    Zhong, H; Wang, J; Hu, W; Shen, L; Wan, J; Zhou, Z; Zhang, Z

    2015-01-01

    Purpose: To extract fractal dimension features from edge-detected rectal cancer CTs, and to examine the ability of fractal dimensions to predict outcomes of primary rectal cancer patients. Methods: Ninety-seven rectal cancer patients treated with neo-adjuvant chemoradiation were enrolled in this study. CT images were obtained before chemoradiotherapy. The primary lesions of the rectal cancer were delineated by experienced radiation oncologists. These images were extracted and filtered by six different Laplacian of Gaussian (LoG) filters with different filter values (0.5–3.0, from fine to coarse) to capture the primary lesions at different anatomical scales. Edges of the original images were found at zero-crossings of the filtered images. Three different fractal dimensions (box-counting dimension, Minkowski dimension, mass dimension) were calculated on the image slice with the largest cross-section of the primary lesion. The significance of these fractal dimensions for survival, recurrence, and metastasis was examined by Student's t-test. Results: For a follow-up time of two years, 18 of 97 patients had experienced recurrence, 24 had metastasis, and 18 were dead. Minkowski dimensions under large filter values (2.0, 2.5, 3.0) were significantly larger (p=0.014, 0.006, 0.015) in patients with recurrence than in those without. For metastasis, only box-counting dimensions under a single filter value (2.5) showed differences (p=0.016) between patients with and without metastasis. For overall survival, box-counting dimensions (filter values = 0.5, 1.0, 1.5), Minkowski dimensions (filter values = 0.5, 1.5, 2.0, 2.5) and mass dimensions (filter values = 1.5, 2.0) were all significant (p<0.05). Conclusion: It is feasible to extract shape information by edge detection and fractal dimension analysis in neo-adjuvant rectal cancer patients. This information can be used for prognosis prediction
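
    The box-counting dimension named in the Methods is simple to estimate from a binary edge image: count the grid boxes of side s that contain at least one edge pixel, then fit log N(s) against log s; the dimension is the negative slope. A minimal sketch on a synthetic square boundary (not the study's pipeline; a smooth closed curve should give a dimension near 1):

```python
import numpy as np

def box_counting_dimension(edges, sizes=(1, 2, 4, 8, 16)):
    """Estimate the box-counting dimension of a square binary edge image.
    N(s) = number of s x s grid boxes containing an edge pixel,
    with log N(s) ~ -D log s."""
    n = edges.shape[0]
    counts = []
    for s in sizes:
        c = sum(
            1
            for i in range(0, n, s)
            for j in range(0, n, s)
            if edges[i:i + s, j:j + s].any()
        )
        counts.append(c)
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# Synthetic edge image: the boundary of a square (a one-dimensional curve).
img = np.zeros((64, 64), dtype=bool)
img[8, 8:56] = img[55, 8:56] = True
img[8:56, 8] = img[8:56, 55] = True
dim = box_counting_dimension(img)
```

    Rougher, more irregular lesion edges fill space faster as the boxes shrink, pushing the fitted dimension above 1, which is what makes the measure a candidate outcome predictor.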

  15. Using the information on cosmic rays to predict influenza epidemics

    International Nuclear Information System (INIS)

    Yu, Z.D.

    1985-01-01

    A correlation is drawn between the incidence of influenza pandemics and increased cosmic ray activity. A correlation is also drawn between the occurrence of these pandemics and the appearance of bright novae, e.g., Nova Eta Car. Four indices based on increased cosmic ray activity and novae are proposed to predict future influenza pandemics and viral antigenic shifts

  16. On the role of crossmodal prediction in audiovisual emotion perception.

    Science.gov (United States)

    Jessen, Sarah; Kotz, Sonja A

    2013-01-01

    Humans rely on multiple sensory modalities to determine the emotional state of others. In fact, such multisensory perception may be one of the mechanisms explaining the ease and efficiency by which others' emotions are recognized. But how and when exactly do the different modalities interact? One aspect of multisensory perception that has received increasing interest in recent years is the concept of cross-modal prediction. In emotion perception, as in most other settings, visual information precedes auditory information, and this visual lead can facilitate subsequent auditory processing. While this mechanism has often been described in audiovisual speech perception, so far it has not been addressed in audiovisual emotion perception. Based on the current state of the art in (a) cross-modal prediction and (b) multisensory emotion perception research, we propose that it is essential to consider the former in order to fully understand the latter. Focusing on electroencephalographic (EEG) and magnetoencephalographic (MEG) studies, we provide a brief overview of the current research in both fields. In discussing these findings, we suggest that emotional visual information may allow more reliable prediction of auditory information than non-emotional visual information. In support of this hypothesis, we present a re-analysis of a previous data set that shows an inverse correlation between the N1 EEG response and the duration of visual emotional, but not non-emotional, information. If the assumption that emotional content allows more reliable prediction can be corroborated in future studies, cross-modal prediction is a crucial factor in our understanding of multisensory emotion perception.

  17. Energy Information Data Base: corporate author entries

    International Nuclear Information System (INIS)

    1980-03-01

    One of the controls for information entered into the data bases created and maintained by the DOE Technical Information Center is the standardized name for the corporate entity or the corporate author. The purpose of Energy Information Data Base: Corporate Author Entries (TID-4585-R1) and this supplemental list of authorized or standardized corporate entries is to provide a means for the consistent citing of the names of organizations in bibliographic records. In general, an entry in Corporate Author Entries consists of the seven-digit code number assigned to the particular corporate entity, the two-letter country code, the largest element of the corporate name, the location of the corporate entity, and the smallest element of the corporate name (if provided). This supplement [DOE/TIC-4585-R1(Suppl.5)] contains additions to the base document (TID-4585-R1) and is intended to be used with that publication

  18. Science Information Centre and Nuclear Library of 'Jozef Stefan' Institute, Ljubljana, Slovenia

    International Nuclear Information System (INIS)

    Stante, A.; Smuc, S.

    2006-01-01

    The 'Jozef Stefan' Institute Science Information Centre is the central Slovene physics library and one of the largest special libraries in Slovenia. Our collection covers the fields of physics, chemistry, biochemistry, electronics, information science, artificial intelligence, energy management, environmental science, material science, robotics etc. The Nuclear Library at the Reactor Centre Podgorica is a part of the Science Information Centre. It collects and keeps literature from the field of reactor and nuclear energy and provides information to scientists employed at the Reactor Centre and users from the Nuclear Power Plant Krsko as well as other experts dealing with nuclear science and similar fields. The orders subscribed are sent by the Science Information Centre to other libraries included in inter-library lending in Slovenia and abroad. (author)

  19. Estimate of the largest Lyapunov characteristic exponent of a high dimensional atmospheric global circulation model: a sensitivity analysis

    International Nuclear Information System (INIS)

    Guerrieri, A.

    2009-01-01

    In this report the largest Lyapunov characteristic exponent of a high dimensional atmospheric global circulation model of intermediate complexity has been estimated numerically. A sensitivity analysis has been carried out by varying the equator-to-pole temperature difference, the space resolution and the value of some parameters employed by the model. Chaotic and non-chaotic regimes of circulation have been found.
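
    For a one-dimensional map, the largest Lyapunov exponent reduces to the trajectory average of log|f'(x)|, a toy analogue of the estimate carried out for the circulation model (which is far higher-dimensional and requires integrating tangent-space perturbations). Sketch for the logistic map, where a positive exponent marks the chaotic regime:

```python
import math

def largest_lyapunov_logistic(r, x0=0.3, n_transient=1000, n_iter=100_000):
    """Estimate the Lyapunov exponent of x -> r*x*(1-x)
    by averaging log|f'(x)| = log|r*(1-2x)| along the orbit."""
    x = x0
    for _ in range(n_transient):          # discard the transient
        x = r * x * (1 - x)
    total = 0.0
    for _ in range(n_iter):
        total += math.log(abs(r * (1 - 2 * x)))
        x = r * x * (1 - x)
    return total / n_iter

lam_chaotic = largest_lyapunov_logistic(4.0)   # chaotic regime: analytically ln 2
lam_periodic = largest_lyapunov_logistic(3.2)  # stable 2-cycle: negative
```

    A positive estimate signals chaos and bounds the predictability horizon, while a negative one marks a regular regime, mirroring the chaotic and non-chaotic circulation regimes reported in the abstract.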

  20. When clusters collide: constraints on antimatter on the largest scales

    International Nuclear Information System (INIS)

    Steigman, Gary

    2008-01-01

    Observations have ruled out the presence of significant amounts of antimatter in the Universe on scales ranging from the solar system, to the Galaxy, to groups and clusters of galaxies, and even to distances comparable to the scale of the present horizon. Except for the model-dependent constraints on the largest scales, the most significant upper limits to diffuse antimatter in the Universe are those on the ∼Mpc scale of clusters of galaxies provided by the EGRET upper bounds to annihilation gamma rays from galaxy clusters whose intracluster gas is revealed through its x-ray emission. On the scale of individual clusters of galaxies, the upper bounds to the fraction of mixed matter and antimatter for the 55 clusters from a flux-limited x-ray survey range from 5 × 10⁻⁹ to 10⁻⁶, strongly suggesting that individual clusters of galaxies are made entirely of matter or of antimatter. X-ray and gamma-ray observations of colliding clusters of galaxies, such as the Bullet Cluster, permit these constraints to be extended to even larger scales. If the observations of the Bullet Cluster, where the upper bound to the antimatter fraction is found to be ≲10⁻⁶, can be generalized to other colliding clusters of galaxies, cosmologically significant amounts of antimatter will be excluded on scales of order ∼20 Mpc (M ∼ 5 × 10¹⁵ M_sun)