WorldWideScience

Sample records for large-scale knowledge information

  1. Integration, Provenance, and Temporal Queries for Large-Scale Knowledge Bases

    OpenAIRE

    Gao, Shi

    2016-01-01

    Knowledge bases that summarize web information in RDF triples deliver many benefits, including support for natural language question answering and powerful structured queries that extract encyclopedic knowledge via SPARQL. Large-scale knowledge bases grow rapidly in scale and significance, and undergo frequent changes in both schema and content. Two critical problems have thus emerged: (i) how to support temporal queries that explore the history of knowledge bases or flash-back to th...

  2. Presenting an Approach for Conducting Knowledge Architecture within Large-Scale Organizations

    Science.gov (United States)

    Varaee, Touraj; Habibi, Jafar; Mohaghar, Ali

    2015-01-01

    Knowledge architecture (KA) establishes the basic groundwork for the successful implementation of a short-term or long-term knowledge management (KM) program. An example of KA is the design of a prototype before a new vehicle is manufactured. Due to a transformation to large-scale organizations, the traditional architecture of organizations is undergoing fundamental changes. This paper explores the main strengths and weaknesses in the field of KA within large-scale organizations and provides a suitable methodology and supervising framework to overcome specific limitations. This objective was achieved by applying and updating the concepts from the Zachman information architectural framework and the information architectural methodology of enterprise architecture planning (EAP). The proposed solution may be beneficial for architects in knowledge-related areas to successfully accomplish KM within large-scale organizations. The research method is descriptive; its validity is confirmed by performing a case study and polling the opinions of KA experts. PMID:25993414

  3. Analysis Methods for Extracting Knowledge from Large-Scale WiFi Monitoring to Inform Building Facility Planning

    DEFF Research Database (Denmark)

    Ruiz-Ruiz, Antonio; Blunck, Henrik; Prentow, Thor Siiger

    2014-01-01

    The optimization of logistics in large building complexes with many resources, such as hospitals, requires realistic facility management and planning. Current planning practices rely foremost on manual observations or coarse unverified assumptions and therefore do not properly scale or provide realistic data to inform facility planning. In this paper, we propose analysis methods to extract knowledge from large sets of network collected WiFi traces to better inform facility management and planning in large building complexes. The analysis methods, which build on a rich set of temporal and spatial... Spatio-temporal visualization tools built on top of these methods enable planners to inspect and explore extracted information to inform facility-planning activities. To evaluate the methods, we present results for a large hospital complex covering more than 10 hectares. The evaluation is based on Wi...
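
    As an illustration of the kind of analysis such WiFi traces support (a minimal Python sketch under an assumed trace schema, not the authors' pipeline), counting distinct devices per access point per hour gives a crude occupancy proxy from which temporal and spatial usage patterns can be derived:

      import pandas as pd

      # Hypothetical trace schema: one row per observed (timestamp, device, access point).
      traces = pd.DataFrame({
          "timestamp": pd.to_datetime(["2014-05-01 09:05", "2014-05-01 09:40",
                                       "2014-05-01 09:55", "2014-05-01 10:10"]),
          "device": ["a", "b", "a", "a"],
          "ap": ["ward-3", "ward-3", "ward-3", "radiology"],
      })

      # Distinct devices per access point per hour: a simple occupancy proxy.
      occupancy = (traces.assign(hour=traces["timestamp"].dt.floor("h"))
                         .groupby(["ap", "hour"])["device"].nunique())
      print(occupancy)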

  4. Foundations of Large-Scale Multimedia Information Management and Retrieval

    CERN Document Server

    Chang, Edward Y

    2011-01-01

    "Foundations of Large-Scale Multimedia Information Management and Retrieval - Mathematics of Perception" covers knowledge representation and semantic analysis of multimedia data and scalability in signal extraction, data mining, and indexing. The book is divided into two parts: Part I - Knowledge Representation and Semantic Analysis focuses on the key components of mathematics of perception as it applies to data management and retrieval. These include feature selection/reduction, knowledge representation, semantic analysis, distance function formulation for measuring similarity, and

  5. A Ranking Approach on Large-Scale Graph With Multidimensional Heterogeneous Information.

    Science.gov (United States)

    Wei, Wei; Gao, Bin; Liu, Tie-Yan; Wang, Taifeng; Li, Guohui; Li, Hang

    2016-04-01

    Graph-based ranking has been extensively studied and frequently applied in many applications, such as webpage ranking. It aims at mining potentially valuable information from the raw graph-structured data. Recently, with the proliferation of rich heterogeneous information (e.g., node/edge features and prior knowledge) available in many real-world graphs, how to effectively and efficiently leverage all information to improve the ranking performance becomes a new challenging problem. Previous methods only utilize part of such information and attempt to rank graph nodes according to link-based methods, whose ranking performance is severely affected by several well-known issues, e.g., over-fitting or high computational complexity, especially when the scale of the graph is very large. In this paper, we address the large-scale graph-based ranking problem and focus on how to effectively exploit rich heterogeneous information of the graph to improve the ranking performance. Specifically, we propose an innovative and effective semi-supervised PageRank (SSP) approach to parameterize the derived information within a unified semi-supervised learning framework (SSLF-GR), and then simultaneously optimize the parameters and the ranking scores of graph nodes. Experiments on real-world large-scale graphs demonstrate that our method significantly outperforms algorithms that consider such graph information only partially.
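
    The abstract does not spell out the SSP algorithm; as a rough sketch of the general idea (a PageRank-style walk whose parameters are driven by node features and could then be fit to the supervision), consider the following, where feature_pagerank, X, and w are illustrative names, not the paper's API:

      import numpy as np

      def feature_pagerank(A, X, w, alpha=0.85, iters=100):
          """PageRank-style ranking whose teleport distribution is parameterized
          by node features X (n x d) and weights w (d,); in a semi-supervised
          setting, w would be optimized so labeled nodes rank correctly."""
          n = A.shape[0]
          out = A.sum(axis=1, keepdims=True)
          # Column-stochastic transition matrix; dangling nodes get uniform rows.
          P = np.where(out > 0, A / np.maximum(out, 1), 1.0 / n).T
          score = X @ w
          t = np.exp(score - score.max())
          t /= t.sum()                      # feature-driven teleport vector
          r = np.full(n, 1.0 / n)
          for _ in range(iters):
              r = alpha * (P @ r) + (1 - alpha) * t
          return r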

  6. Knowledge Guided Disambiguation for Large-Scale Scene Classification With Multi-Resolution CNNs

    Science.gov (United States)

    Wang, Limin; Guo, Sheng; Huang, Weilin; Xiong, Yuanjun; Qiao, Yu

    2017-04-01

    Convolutional Neural Networks (CNNs) have made remarkable progress on scene recognition, partially due to recent large-scale scene datasets such as Places and Places2. Scene categories are often defined by multi-level information, including local objects, global layout, and background environment, thus leading to large intra-class variations. In addition, with the increasing number of scene categories, label ambiguity has become another crucial issue in large-scale classification. This paper focuses on large-scale scene recognition and makes two major contributions to tackle these issues. First, we propose a multi-resolution CNN architecture that captures visual content and structure at multiple levels. The multi-resolution CNNs are composed of coarse-resolution CNNs and fine-resolution CNNs, which are complementary to each other. Second, we design two knowledge guided disambiguation techniques to deal with the problem of label ambiguity. (i) We exploit the knowledge from the confusion matrix computed on validation data to merge ambiguous classes into a super category. (ii) We utilize the knowledge of extra networks to produce a soft label for each image. Then the super categories or soft labels are employed to guide CNN training on Places2. We conduct extensive experiments on three large-scale image datasets (ImageNet, Places, and Places2), demonstrating the effectiveness of our approach. Furthermore, our method took part in two major scene recognition challenges, achieving second place in the Places2 challenge at ILSVRC 2015 and first place in the LSUN challenge at CVPR 2016. Finally, we directly test the learned representations on other scene benchmarks and obtain new state-of-the-art results on MIT Indoor67 (86.7%) and SUN397 (72.0%). We release the code and models at https://github.com/wanglimin/MRCNN-Scene-Recognition.
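
    The exact soft-label formulation is not given in the abstract; a common way to implement this kind of knowledge-guided training is a cross-entropy loss against a blend of the one-hot ground truth and the extra network's output distribution, sketched below (mix is an assumed blending weight):

      import numpy as np

      def soft_label_loss(logits, hard_label, teacher_probs, mix=0.5):
          """Cross-entropy against a blended target: (1 - mix) * one-hot label
          + mix * soft label produced by the extra 'knowledge' network."""
          n_classes = logits.shape[0]
          target = (1 - mix) * np.eye(n_classes)[hard_label] + mix * teacher_probs
          m = logits.max()
          log_probs = logits - (m + np.log(np.exp(logits - m).sum()))  # log-softmax
          return -np.sum(target * log_probs)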

  7. Large-scale computer networks and the future of legal knowledge-based systems

    NARCIS (Netherlands)

    Leenes, R.E.; Svensson, Jorgen S.; Hage, J.C.; Bench-Capon, T.J.M.; Cohen, M.J.; van den Herik, H.J.

    1995-01-01

    In this paper we investigate the relation between legal knowledge-based systems and large-scale computer networks such as the Internet. On the one hand, researchers of legal knowledge-based systems have claimed huge possibilities, but despite the efforts over the last twenty years, the number of...

  8. Large-scale structural and textual similarity-based mining of knowledge graph to predict drug-drug interactions

    KAUST Repository

    Abdelaziz, Ibrahim; Fokoue, Achille; Hassanzadeh, Oktie; Zhang, Ping; Sadoghi, Mohammad

    2017-01-01

    Drug-Drug Interactions (DDIs) are a major cause of preventable Adverse Drug Reactions (ADRs), causing a significant burden on the patients’ health and the healthcare system. It is widely known that clinical studies cannot sufficiently and accurately identify DDIs for new drugs before they are made available on the market. In addition, existing public and proprietary sources of DDI information are known to be incomplete and/or inaccurate and so not reliable. As a result, there is an emerging body of research on in-silico prediction of drug-drug interactions. In this paper, we present Tiresias, a large-scale similarity-based framework that predicts DDIs through link prediction. Tiresias takes in various sources of drug-related data and knowledge as inputs, and provides DDI predictions as outputs. The process starts with semantic integration of the input data that results in a knowledge graph describing drug attributes and relationships with various related entities such as enzymes, chemical structures, and pathways. The knowledge graph is then used to compute several similarity measures between all the drugs in a scalable and distributed framework. In particular, Tiresias utilizes two classes of features in a knowledge graph: local and global features. Local features are derived from the information directly associated to each drug (i.e., one hop away) while global features are learnt by minimizing a global loss function that considers the complete structure of the knowledge graph. The resulting similarity metrics are used to build features for a large-scale logistic regression model to predict potential DDIs. We highlight the novelty of our proposed Tiresias and perform thorough evaluation of the quality of the predictions. The results show the effectiveness of Tiresias in both predicting new interactions among existing drugs as well as newly developed drugs.
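
    As a minimal sketch of the final stage described here (similarity measures turned into features for a logistic regression over drug pairs), the code below scores a candidate pair by how similar one drug is to the other's known interaction partners; the data layout and helper names are assumptions, not the Tiresias implementation:

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      def pair_features(a, b, sim_matrices, known_ddi):
          """One feature per similarity measure (e.g. chemical, target, pathway):
          the maximum similarity between drug b and the drugs already known to
          interact with drug a. sim_matrices are (n_drugs x n_drugs) arrays."""
          return [max((S[p, b] for p in known_ddi[a]), default=0.0)
                  for S in sim_matrices]

      # X = np.array([pair_features(a, b, sims, ddi) for (a, b) in candidate_pairs])
      # y = np.array(labels)                      # 1 = known DDI, 0 = assumed negative
      # model = LogisticRegression().fit(X, y)    # rank unseen pairs by predict_proba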

  9. Do large-scale assessments measure students' ability to integrate scientific knowledge?

    Science.gov (United States)

    Lee, Hee-Sun

    2010-03-01

    Large-scale assessments are used as a means to diagnose the current status of student achievement in science and to compare students across schools, states, and countries. For efficiency, multiple-choice items and dichotomously-scored open-ended items are pervasively used in large-scale assessments such as the Trends in International Mathematics and Science Study (TIMSS). This study investigated how well these items measure secondary school students' ability to integrate scientific knowledge. The study collected responses of 8400 students to 116 multiple-choice and 84 open-ended items and applied an Item Response Theory analysis based on the Rasch Partial Credit Model. Results indicate that most multiple-choice items and dichotomously-scored open-ended items can be used to determine whether students have normative ideas about science topics, but cannot measure whether students integrate multiple pieces of relevant science ideas. Only when the scoring rubric is redesigned to capture subtle nuances of students' open-ended responses do open-ended items become a valid and reliable tool for assessing students' knowledge integration ability.
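
    For reference, the Rasch Partial Credit Model used here assigns a polytomous item's score-category probabilities from a person ability theta and per-item step difficulties; a minimal sketch:

      import numpy as np

      def pcm_probs(theta, deltas):
          """Partial Credit Model: P(score = k) is proportional to
          exp(sum_{j<=k} (theta - delta_j)), with an empty sum for k = 0."""
          psi = np.concatenate(([0.0], np.cumsum(theta - np.asarray(deltas))))
          e = np.exp(psi - psi.max())       # stabilized softmax over categories
          return e / e.sum()

      # A 3-category item (scores 0, 1, 2) with step difficulties -0.5 and 1.0:
      # pcm_probs(0.2, [-0.5, 1.0]) -> array of three category probabilities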

  10. Network Partitioning Domain Knowledge Multiobjective Application Mapping for Large-Scale Network-on-Chip

    Directory of Open Access Journals (Sweden)

    Yin Zhen Tei

    2014-01-01

    This paper proposes a multiobjective application mapping technique targeted at large-scale network-on-chip (NoC). As the number of intellectual property (IP) cores in multiprocessor system-on-chip (MPSoC) increases, NoC application mapping to find the optimum core-to-topology mapping becomes more challenging. Besides, the conflicting cost and performance trade-off makes multiobjective application mapping techniques even more complex. This paper proposes an application mapping technique that incorporates domain knowledge into a genetic algorithm (GA). The initial population of the GA is initialized with network partitioning (NP), while the crossover operator is guided by knowledge of communication demands. NP reduces the complexity of large-scale application mapping and provides the GA with a promising mapping search space. The proposed genetic operator is compared with state-of-the-art genetic operators in terms of solution quality. In this work, multiobjective optimization of energy and thermal balance is considered. Through simulation, knowledge-based initial mapping shows significant improvement in the Pareto front compared to the random initial mapping that is widely used. The proposed knowledge-based crossover also shows a better Pareto front compared to state-of-the-art knowledge-based crossover.
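
    To make the setup concrete, the sketch below shows the fitness term such a mapping GA typically minimizes (traffic volume times hop distance on a mesh) and a toy stand-in for a communication-demand-guided crossover; the paper's NP seeding and operators are more elaborate:

      def comm_cost(mapping, traffic, pos):
          """Energy proxy: communication volume times Manhattan hop distance.
          mapping[core] = tile id; pos[tile] = (x, y) mesh coordinates."""
          cost = 0
          for (a, b), vol in traffic.items():
              (x1, y1), (x2, y2) = pos[mapping[a]], pos[mapping[b]]
              cost += vol * (abs(x1 - x2) + abs(y1 - y2))
          return cost

      def demand_guided_crossover(p1, p2, traffic, n_hot=3):
          """Toy knowledge-based crossover: keep the tile assignments of the
          cores on the heaviest traffic flows from parent p1, fill the rest
          with the unused tiles in parent p2's order."""
          heaviest = sorted(traffic, key=traffic.get, reverse=True)[:n_hot]
          hot_cores = {c for pair in heaviest for c in pair}
          child = {c: p1[c] for c in hot_cores}
          free = [t for t in p2.values() if t not in child.values()]
          for c in p2:
              if c not in child:
                  child[c] = free.pop(0)
          return child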

  11. Large-scale functional MRI analysis to accumulate knowledge on brain functions

    International Nuclear Information System (INIS)

    Schwartz, Yannick

    2015-01-01

    How can we accumulate knowledge on brain functions? How can we leverage years of research in functional MRI to analyse finer-grained psychological constructs and build a comprehensive model of the brain? Researchers usually rely on single studies to delineate brain regions recruited by mental processes. They relate their findings to previous works in an informal way by defining regions of interest from the literature. Meta-analysis approaches provide a more principled way to build upon the literature. This thesis investigates three ways to assemble knowledge using activation maps from a large number of studies. First, we present an approach that jointly uses two similar fMRI experiments to better condition an analysis from a statistical standpoint. We show that it is a valuable data-driven alternative to traditional regions-of-interest analyses, but that it fails to provide a systematic way to relate studies and thus does not permit integrating knowledge on a large scale. Because of the difficulty of associating multiple studies, we resort to using a single dataset sampling a large number of stimuli for our second contribution. This method estimates functional networks associated with functional profiles, where the functional networks are interacting brain regions and the functional profiles are weighted sets of cognitive descriptors. This work successfully yields known brain networks and automatically associates meaningful descriptions. Its limitations lie in the unsupervised nature of the method, which is harder to validate, and in the use of a single dataset. It does, however, introduce the notion of cognitive labels, which is central to our last contribution. Our last contribution presents a method that learns functional atlases by combining several datasets. [Henson 2006] shows that forward inference, i.e. the probability of an activation given a cognitive process, is often not sufficient to conclude on the engagement of brain regions for a cognitive process...

  12. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with a large number of high-capacity nodes and transmission links, shared by a large number of users...

  13. Monitoring and Information Fusion for Search and Rescue Operations in Large-Scale Disasters

    National Research Council Canada - National Science Library

    Nardi, Daniele

    2002-01-01

    ... for information fusion with application to search-and-rescue and large scale disaster relief. The objective is to develop and to deploy tools to support the monitoring activities in an intervention caused by a large-scale disaster...

  14. Reconstructing Information in Large-Scale Structure via Logarithmic Mapping

    Science.gov (United States)

    Szapudi, Istvan

    We propose to develop a new method to extract information from large-scale structure data combining two-point statistics and non-linear transformations; before, this information was available only with substantially more complex higher-order statistical methods. Initially, most of the cosmological information in large-scale structure lies in two-point statistics. With non-linear evolution, some of that useful information leaks into higher-order statistics. The PI and group have shown in a series of theoretical investigations how that leakage occurs, and explained the Fisher information plateau at smaller scales. This plateau means that even as more modes are added to the measurement of the power spectrum, the total cumulative information (loosely speaking, the inverse error bar) is not increasing. Recently we have shown in Neyrinck et al. (2009, 2010) that a logarithmic (and a related Gaussianization or Box-Cox) transformation on the non-linear Dark Matter or galaxy field reconstructs a surprisingly large fraction of this missing Fisher information of the initial conditions. This was predicted by the earlier wave-mechanical formulation of gravitational dynamics by Szapudi & Kaiser (2003). The present proposal is focused on working out the theoretical underpinning of the method to a point that it can be used in practice to analyze data. In particular, one needs to deal with the usual real-life issues of galaxy surveys, such as complex geometry, discrete sampling (Poisson or sub-Poisson noise), bias (linear or non-linear, deterministic or stochastic), redshift distortions, projection effects for 2D samples, and the effects of photometric redshift errors. We will develop methods for weak lensing and Sunyaev-Zeldovich power spectra as well, the latter specifically targeting Planck. In addition, we plan to investigate the question of residual higher-order information after the non-linear mapping, and possible applications for cosmology. Our aim will be to work out...
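
    The core of the proposed mapping is simple enough to state in code: Gaussianize the density contrast with log(1 + delta) and compare two-point statistics before and after. A rough sketch (unnormalized power, cubic box assumed):

      import numpy as np

      def shell_power(field, n_shells=20):
          """Unnormalized, shell-averaged power spectrum of a cubic field;
          sufficient to compare the original and log-mapped statistics."""
          n = field.shape[0]
          fk = np.abs(np.fft.fftn(field)) ** 2
          freqs = np.meshgrid(*3 * [np.fft.fftfreq(n)], indexing="ij")
          k = np.sqrt(sum(g ** 2 for g in freqs))
          edges = np.linspace(0, k.max(), n_shells)
          shell = np.digitize(k.ravel(), edges)
          return np.array([fk.ravel()[shell == i].mean() if np.any(shell == i)
                           else 0.0 for i in range(1, n_shells)])

      # delta = ... density contrast field (delta > -1)
      # Log-mapping of Neyrinck et al. (2009): re-Gaussianize before measuring power.
      # p_lin = shell_power(delta)
      # p_log = shell_power(np.log1p(delta) - np.log1p(delta).mean())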

  15. Are large-scale flow experiments informing the science and management of freshwater ecosystems?

    Science.gov (United States)

    Olden, Julian D.; Konrad, Christopher P.; Melis, Theodore S.; Kennard, Mark J.; Freeman, Mary C.; Mims, Meryl C.; Bray, Erin N.; Gido, Keith B.; Hemphill, Nina P.; Lytle, David A.; McMullen, Laura E.; Pyron, Mark; Robinson, Christopher T.; Schmidt, John C.; Williams, John G.

    2013-01-01

    Greater scientific knowledge, changing societal values, and legislative mandates have emphasized the importance of implementing large-scale flow experiments (FEs) downstream of dams. We provide the first global assessment of FEs to evaluate their success in advancing science and informing management decisions. Systematic review of 113 FEs across 20 countries revealed that clear articulation of experimental objectives, while not universally practiced, was crucial for achieving management outcomes and changing dam-operating policies. Furthermore, changes to dam operations were three times less likely when FEs were conducted primarily for scientific purposes. Despite the recognized importance of riverine flow regimes, four-fifths of FEs involved only discrete flow events. Over three-quarters of FEs documented both abiotic and biotic outcomes, but only one-third examined multiple taxonomic responses, thus limiting how FE results can inform holistic dam management. Future FEs will present new opportunities to advance scientifically credible water policies.

  16. Cosmological parameters from large scale structure - geometric versus shape information

    CERN Document Server

    Hamann, Jan; Lesgourgues, Julien; Rampf, Cornelius; Wong, Yvonne Y Y

    2010-01-01

    The matter power spectrum as derived from large scale structure (LSS) surveys contains two important and distinct pieces of information: an overall smooth shape and the imprint of baryon acoustic oscillations (BAO). We investigate the separate impact of these two types of information on cosmological parameter estimation, and show that for the simplest cosmological models, the broad-band shape information currently contained in the SDSS DR7 halo power spectrum (HPS) is by far superseded by geometric information derived from the baryonic features. An immediate corollary is that contrary to popular beliefs, the upper limit on the neutrino mass m_ν...

  17. Engineering youth service system infrastructure: Hawaii's continued efforts at large-scale implementation through knowledge management strategies.

    Science.gov (United States)

    Nakamura, Brad J; Mueller, Charles W; Higa-McMillan, Charmaine; Okamura, Kelsie H; Chang, Jaime P; Slavin, Lesley; Shimabukuro, Scott

    2014-01-01

    Hawaii's Child and Adolescent Mental Health Division provides a unique illustration of a youth public mental health system with a long and successful history of large-scale quality improvement initiatives. Many advances are linked to flexibly organizing and applying knowledge gained from the scientific literature and move beyond installing a limited number of brand-named treatment approaches that might be directly relevant only to a small handful of system youth. This article takes a knowledge-to-action perspective and outlines five knowledge management strategies currently under way in Hawaii. Each strategy represents one component of a larger coordinated effort at engineering a service system focused on delivering both brand-named treatment approaches and complementary strategies informed by the evidence base. The five knowledge management examples are (a) a set of modular-based professional training activities for currently practicing therapists, (b) an outreach initiative for supporting youth evidence-based practices training at Hawaii's mental health-related professional programs, (c) an effort to increase consumer knowledge of and demand for youth evidence-based practices, (d) a practice and progress agency performance feedback system, and (e) a sampling of system-level research studies focused on understanding treatment as usual. We end by outlining a small set of lessons learned and a longer term vision for embedding these efforts into the system's infrastructure.

  18. Risk Management Challenges in Large-scale Energy PSS

    DEFF Research Database (Denmark)

    Tegeltija, Miroslava; Oehmen, Josef; Kozin, Igor

    2017-01-01

    Probabilistic risk management approaches have a long tradition in engineering. A large variety of tools and techniques based on the probabilistic view of risk is available and applied in PSS practice. However, uncertainties that arise due to lack of knowledge and information still lack adequate representations. We focus on a large-scale energy company in Denmark as one case of current product/service-system risk management best practices. We analyze their risk management process and investigate the tools they use to support decision making processes within the company. First, we identify the following challenges in the current risk management practices that are in line with the literature: (1) current methods are not appropriate for situations dominated by weak knowledge and information; (2) the quality of traditional models in such situations is open to debate; (3) the quality of input...

  1. The Development and Validation of a Knowledge Activities Scale for the Information Professionals in University Libraries

    Directory of Open Access Journals (Sweden)

    Yuan-Ho Huang

    2013-12-01

    This research aims to develop a scale for measuring the knowledge activities of information professionals, covering positive and negative attributes at both the individual and group levels. The research process included interviews with several experts, exploratory analysis of a pre-test, and confirmatory factor analysis of the formal questionnaire collected from academic librarians. The results indicate four factors at the individual level (knowledge absorption, knowledge sharing, knowledge hampering, and knowledge transfer) and three factors at the group level (knowledge enlarging, knowledge clustering, and knowledge initiating). The scale demonstrated robust psychometric properties at both levels, with acceptable reliability and validity. Library managers could adopt the scale to examine the extent of knowledge activities and design future knowledge management plans according to the status of their library. Furthermore, the results of t-tests and ANOVA revealed considerations for business strategies to improve the management of human resources. [Article content in Chinese]

  2. Study on large scale knowledge base with real time operation for autonomous nuclear power plant. 1. Basic concept and expecting performance

    International Nuclear Information System (INIS)

    Ozaki, Yoshihiko; Suda, Kazunori; Yoshikawa, Shinji; Ozawa, Kenji

    1996-04-01

    Since it is desired to enhance the availability and safety of nuclear power plant operation and maintenance by reducing human factors, there have been many research and development efforts toward intelligent operation and diagnosis using artificial intelligence (AI) techniques. We have been developing an autonomous operation and maintenance system for nuclear power plants by substituting AIs and intelligent robots for human operators. It is indispensable to use varied and large-scale knowledge related to plant design, operation, and maintenance, that is, whole-life-cycle data of the plant, for an autonomous nuclear power plant. This knowledge must be given to the AI systems or intelligent robots adequately and opportunely. Moreover, it is necessary to ensure real-time operation when using the large-scale knowledge base for plant control and diagnosis. We have been studying a large-scale, real-time knowledge base system for autonomous plants. In this report, we present the basic concept and expected performance of the knowledge base for an autonomous plant, especially its autonomous control and diagnosis systems. (author)

  3. Information technology to support informal knowledge sharing

    NARCIS (Netherlands)

    Davison, R.M.; Ou, C.X.J.; Martinsons, M.G.

    2013-01-01

    The knowledge management (KM) literature largely focuses on the explicit and formal representation of knowledge in computer-based KM systems. Informal KM practices are widespread, but less is known about them. This paper aims to redress this imbalance by exploring the use of interactive information...

  4. Participatory Design of Large-Scale Information Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

    In this article we discuss how to engage in large-scale information systems development by applying a participatory design (PD) approach that acknowledges the unique situated work practices conducted by the domain experts of modern organizations. We reconstruct the iterative prototyping approach into a PD process model that (1) emphasizes PD experiments as transcending traditional prototyping by evaluating fully integrated systems exposed to real work practices; (2) incorporates improvisational change management including anticipated, emergent, and opportunity-based change; and (3) extends initial design and development into a sustained and ongoing stepwise implementation that constitutes an overall technology-driven organizational change. The process model is presented through a large-scale PD experiment in the Danish healthcare sector. We reflect on our experiences from this experiment...

  5. Visual attention mitigates information loss in small- and large-scale neural codes

    Science.gov (United States)

    Sprague, Thomas C; Saproo, Sameer; Serences, John T

    2015-01-01

    The visual system transforms complex inputs into robust and parsimonious neural codes that efficiently guide behavior. Because neural communication is stochastic, the amount of encoded visual information necessarily decreases with each synapse. This constraint requires processing sensory signals in a manner that protects information about relevant stimuli from degradation. Such selective processing – or selective attention – is implemented via several mechanisms, including neural gain and changes in tuning properties. However, examining each of these effects in isolation obscures their joint impact on the fidelity of stimulus feature representations by large-scale population codes. Instead, large-scale activity patterns can be used to reconstruct representations of relevant and irrelevant stimuli, providing a holistic understanding about how neuron-level modulations collectively impact stimulus encoding. PMID:25769502
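
    The reconstruction approach alluded to here is, in much of this literature, an inverted encoding model; a compact sketch (the basis function, exponent, and names are generic choices, not the exact published pipeline):

      import numpy as np

      def inverted_encoding(train_B, train_feat, test_B, n_chan=8):
          """Model each measurement channel (voxel/sensor) as a weighted sum of
          feature-tuned channels, estimate the weights on training trials, then
          invert them to reconstruct channel responses on test trials."""
          centers = np.linspace(0, 2 * np.pi, n_chan, endpoint=False)

          def basis(feat):          # half-rectified, raised cosine tuning curves
              return np.maximum(0, np.cos(feat[:, None] - centers[None, :])) ** 5

          C = basis(np.asarray(train_feat))                       # trials x channels
          W = np.linalg.lstsq(C, train_B, rcond=None)[0]          # channels x voxels
          return np.linalg.lstsq(W.T, test_B.T, rcond=None)[0].T  # trials x channels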

  6. Third generation participatory design in health informatics--making user participation applicable to large-scale information system projects.

    Science.gov (United States)

    Pilemalm, Sofie; Timpka, Toomas

    2008-04-01

    Participatory Design (PD) methods in the field of health informatics have mainly been applied to the development of small-scale systems with homogeneous user groups in local settings. Meanwhile, health service organizations are becoming increasingly large and complex in character, making it necessary to extend the scope of the systems that are used for managing data, information and knowledge. This study reports participatory action research on the development of a PD framework for large-scale system design. The research was conducted in a public health informatics project aimed at developing a system for 175,000 users. A renewed PD framework was developed in response to six major limitations experienced to be associated with the existing methods. The resulting framework preserves the theoretical grounding, but extends the toolbox to suit applications in networked health service organizations. Future research should involve evaluations of the framework in other health service settings where comprehensive HISs are developed.

  7. HSTDEK: Developing a methodology for construction of large-scale, multi-use knowledge bases

    Science.gov (United States)

    Freeman, Michael S.

    1987-01-01

    The primary research objective of the Hubble Space Telescope Design/Engineering Knowledgebase (HSTDEK) is to develop a methodology for constructing and maintaining large-scale knowledge bases which can be used to support multiple applications. To ensure the validity of its results, this research is being pursued in the context of a real-world system, the Hubble Space Telescope. The HSTDEK objectives are described in detail. The history and motivation of the project are briefly described. The technical challenges faced by the project are outlined.

  8. Knowledge discovery: Extracting usable information from large amounts of data

    International Nuclear Information System (INIS)

    Whiteson, R.

    1998-01-01

    The threat of nuclear weapons proliferation is a problem of worldwide concern. Safeguards are the key to nuclear nonproliferation, and data is the key to safeguards. The safeguards community has access to a huge and steadily growing volume of data. The advantages of this data-rich environment are obvious: there is a great deal of information which can be utilized. The challenge is to effectively apply proven and developing technologies to find and extract usable information from that data. That information must then be assessed and evaluated to produce the knowledge needed for crucial decision making. Efficient and effective analysis of safeguards data will depend on utilizing technologies to interpret the large, heterogeneous data sets that are available from diverse sources. With an order-of-magnitude increase in the amount of data from a wide variety of technical, textual, and historical sources, there is a vital need to apply advanced computer technologies to support all-source analysis. There are techniques of data warehousing, data mining, and data analysis that can provide analysts with tools that will expedite the extraction of usable information from the huge amounts of data to which they have access. Computerized tools can aid analysts by integrating heterogeneous data, evaluating diverse data streams, automating retrieval of database information, prioritizing inputs, reconciling conflicting data, doing preliminary interpretations, discovering patterns or trends in data, and automating some of the simpler prescreening tasks that are time consuming and tedious. Thus knowledge discovery technologies can provide a foundation of support for the analyst. Rather than spending time sifting through often irrelevant information, analysts could use their specialized skills in a focused, productive fashion. This would allow them to make their analytical judgments with more confidence and spend more of their time doing what they do best.

  9. Implementation of a large-scale hospital information infrastructure for multi-unit health-care services.

    Science.gov (United States)

    Yoo, Sun K; Kim, Dong Keun; Kim, Jung C; Park, Youn Jung; Chang, Byung Chul

    2008-01-01

    With the increase in demand for high-quality medical services, an innovative hospital information system has become essential. An improved system has been implemented in all hospital units of the Yonsei University Health System. Interoperability between the multiple units required appropriate hardware infrastructure and software architecture. This large-scale hospital information system encompassed PACS (Picture Archiving and Communications Systems), EMR (Electronic Medical Records) and ERP (Enterprise Resource Planning). It involved two tertiary hospitals and 50 community hospitals. The monthly data production rate of the integrated hospital information system is about 1.8 TByte and the total quantity of data produced so far is about 60 TByte. Large-scale information exchange and sharing will be particularly useful for telemedicine applications.

  10. Collaborative ethnography for information systems research: Studying knowledge work practices and designing supportive information systems

    Directory of Open Access Journals (Sweden)

    Ronald Maier

    2012-04-01

    Understanding knowledge work and supporting it with information systems (ISs) are challenging tasks. Knowledge work has changed substantially recently, and studies on how knowledge work is currently performed are scarce. Ethnography is the most suitable qualitative research method for studying knowledge work, yet it is too time-consuming, costly and unfocused for the fast-changing IS domain. Moreover, results from qualitative studies need to be transformed into artefacts useful for IS requirements engineering and design. This paper proposes a procedure for collaborative ethnography to study knowledge work practices and inform IS requirements gathering and design, illustrated with the case of a collaborative ethnographic study of seven organisations in four European countries performed in a large-scale international IS research and development project. The paper also critically discusses the procedure’s applicability and limitations.

  11. Large-scale Health Information Database and Privacy Protection.

    Science.gov (United States)

    Yamamoto, Ryuichi

    2016-09-01

    Japan was once progressive in the digitalization of healthcare fields but unfortunately has fallen behind in terms of the secondary use of data for public interest. There has recently been a trend to establish large-scale health databases in the nation, and a conflict between data use for public interest and privacy protection has surfaced as this trend has progressed. Databases for health insurance claims or for specific health checkups and guidance services were created according to the law that aims to ensure healthcare for the elderly; however, there is no mention in the act about using these databases for public interest in general. Thus, an initiative for such use must proceed carefully and attentively. The PMDA projects that collect a large amount of medical record information from large hospitals and the health database development project that the Ministry of Health, Labour and Welfare (MHLW) is working on will soon begin to operate according to a general consensus; however, the validity of this consensus can be questioned if issues of anonymity arise. The likelihood that researchers conducting a study for public interest would intentionally invade the privacy of their subjects is slim. However, patients could develop a sense of distrust about their data being used since legal requirements are ambiguous. Nevertheless, without using patients' medical records for public interest, progress in medicine will grind to a halt. Proper legislation that is clear for both researchers and patients will therefore be highly desirable. A revision of the Act on the Protection of Personal Information is currently in progress. In reality, however, privacy is not something that laws alone can protect; it will also require guidelines and self-discipline. We now live in an information capitalization age. I will introduce the trends in legal reform regarding healthcare information and discuss some basics to help people properly face the issue of health big data and privacy.

  12. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  13. Large-scale modeling of condition-specific gene regulatory networks by information integration and inference.

    Science.gov (United States)

    Ellwanger, Daniel Christian; Leonhardt, Jörn Florian; Mewes, Hans-Werner

    2014-12-01

    Understanding how regulatory networks globally coordinate the response of a cell to changing conditions, such as perturbations by shifting environments, is an elementary challenge in systems biology which has yet to be met. Genome-wide gene expression measurements are high dimensional as these are reflecting the condition-specific interplay of thousands of cellular components. The integration of prior biological knowledge into the modeling process of systems-wide gene regulation enables the large-scale interpretation of gene expression signals in the context of known regulatory relations. We developed COGERE (http://mips.helmholtz-muenchen.de/cogere), a method for the inference of condition-specific gene regulatory networks in human and mouse. We integrated existing knowledge of regulatory interactions from multiple sources to a comprehensive model of prior information. COGERE infers condition-specific regulation by evaluating the mutual dependency between regulator (transcription factor or miRNA) and target gene expression using prior information. This dependency is scored by the non-parametric, nonlinear correlation coefficient η² (eta squared) that is derived by a two-way analysis of variance. We show that COGERE significantly outperforms alternative methods in predicting condition-specific gene regulatory networks on simulated data sets. Furthermore, by inferring the cancer-specific gene regulatory network from the NCI-60 expression study, we demonstrate the utility of COGERE to promote hypothesis-driven clinical research. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
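
    As a reference point for the scoring statistic, the correlation ratio η² is the fraction of target-expression variance explained by group membership; the paper derives it from a two-way ANOVA, while the one-way sketch below (binning the regulator's expression) shows the core computation:

      import numpy as np

      def eta_squared(regulator, target, n_bins=5):
          """eta^2 = SS_between / SS_total for target expression grouped by
          binned regulator expression (one-way illustration)."""
          edges = np.quantile(regulator, np.linspace(0, 1, n_bins + 1))
          groups = np.clip(np.digitize(regulator, edges[1:-1]), 0, n_bins - 1)
          grand = target.mean()
          ss_total = ((target - grand) ** 2).sum()
          ss_between = 0.0
          for i in range(n_bins):
              g = target[groups == i]
              if g.size:
                  ss_between += g.size * (g.mean() - grand) ** 2
          return ss_between / ss_total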

  14. NASA's Information Power Grid: Large Scale Distributed Computing and Data Management

    Science.gov (United States)

    Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)

    2001-01-01

    Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g. whole-system aircraft simulation and whole-system living cell simulation - require integrating applications and data that are developed by different teams of researchers, frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.

  15. Direction of information flow in large-scale resting-state networks is frequency-dependent.

    Science.gov (United States)

    Hillebrand, Arjan; Tewarie, Prejaas; van Dellen, Edwin; Yu, Meichen; Carbo, Ellen W S; Douw, Linda; Gouw, Alida A; van Straaten, Elisabeth C W; Stam, Cornelis J

    2016-04-05

    Normal brain function requires interactions between spatially separated, and functionally specialized, macroscopic regions, yet the directionality of these interactions in large-scale functional networks is unknown. Magnetoencephalography was used to determine the directionality of these interactions, where directionality was inferred from time series of beamformer-reconstructed estimates of neuronal activation, using a recently proposed measure of phase transfer entropy. We observed well-organized posterior-to-anterior patterns of information flow in the higher-frequency bands (alpha1, alpha2, and beta band), dominated by regions in the visual cortex and posterior default mode network. Opposite patterns of anterior-to-posterior flow were found in the theta band, involving mainly regions in the frontal lobe that were sending information to a more distributed network. Many strong information senders in the theta band were also frequent receivers in the alpha2 band, and vice versa. Our results provide evidence that large-scale resting-state patterns of information flow in the human brain form frequency-dependent reentry loops that are dominated by flow from parieto-occipital cortex to integrative frontal areas in the higher-frequency bands, which is mirrored by a theta band anterior-to-posterior flow.
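
    A simplified version of the phase transfer entropy measure referred to here (after Lobier et al., 2014) can be written directly: extract instantaneous phases from the analytic signal, discretize them, and compute transfer entropy from joint entropies. The bin count and lag below are arbitrary choices:

      import numpy as np
      from scipy.signal import hilbert

      def phase_transfer_entropy(x, y, lag=1, n_bins=8):
          """Directed phase coupling X -> Y: transfer entropy on binned
          instantaneous phases, TE = H(Yf,Yp) - H(Yf,Yp,Xp) + H(Yp,Xp) - H(Yp)."""
          edges = np.linspace(-np.pi, np.pi, n_bins + 1)[1:-1]
          px = np.digitize(np.angle(hilbert(x)), edges)
          py = np.digitize(np.angle(hilbert(y)), edges)
          yf, yp, xp = py[lag:], py[:-lag], px[:-lag]   # future/past of Y, past of X

          def H(*series):                               # joint entropy (nats)
              _, counts = np.unique(np.stack(series, axis=1), axis=0,
                                    return_counts=True)
              p = counts / counts.sum()
              return -(p * np.log(p)).sum()

          return H(yf, yp) - H(yf, yp, xp) + H(yp, xp) - H(yp)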

  16. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    Science.gov (United States)

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

    Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large-scale and 35 on a small-scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high work-load and a high risk of detection, and thus demand a higher level of organizational skills than for small growing operations. Second, financial assets are needed to start a large 'grow-site'. Housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skills and knowledge than does small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production will imply having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task: ideologically, technically, economically and personally. The many obstacles that small-scale growers face and the lack of interest and motivation for going large-scale suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed.

  17. Inference of functional properties from large-scale analysis of enzyme superfamilies.

    Science.gov (United States)

    Brown, Shoshana D; Babbitt, Patricia C

    2012-01-02

    As increasingly large amounts of data from genome and other sequencing projects become available, new approaches are needed to determine the functions of the proteins these genes encode. We show how large-scale computational analysis can help to address this challenge by linking functional information to sequence and structural similarities using protein similarity networks. Network analyses using three functionally diverse enzyme superfamilies illustrate the use of these approaches for facile updating and comparison of available structures for a large superfamily, for creation of functional hypotheses for metagenomic sequences, and to summarize the limits of our functional knowledge about even well studied superfamilies. PMID:22069325

  18. Differential Relationships Between Diabetes Knowledge Scales and Diabetes Outcomes.

    Science.gov (United States)

    Dawson, Aprill Z; Walker, Rebekah J; Egede, Leonard E

    2017-08-01

    Background: Diabetes affects more than 29 million people in the US and requires daily self-management in addition to knowledge of the disease. Three knowledge assessments in use are the Michigan Brief Diabetes Knowledge Test (DKT), the Starr County Diabetes Knowledge Questionnaire (DKQ), and the Kaiser DISTANCE Survey (DISTANCE). Purpose: The purpose of the study was to test the discriminant validity of the 3 diabetes knowledge scales and determine which is best associated with diabetes self-care and glycemic control. Methods: Three hundred sixty-one adults with type 2 diabetes were recruited from primary care clinics. Four analyses were conducted to investigate the validity and relationships of the scales: alpha statistics to test internal consistency, factor analysis to determine how much of the variance was explained, Pearson's correlations between the 3 scales, and Pearson's correlations between each scale, self-care, and outcomes. Results: The DKQ had an alpha of 0.75, the DKT had an alpha of 0.49, and DISTANCE had an alpha of 0.36. The DKQ was significantly correlated with glycemic control. The DKT scale was significantly associated with general diet, the DISTANCE survey was significantly associated with exercise, and both DKT and DISTANCE were significantly associated with foot care. Conclusion: Correlations among the 3 scales were modest, suggesting the scales are not measuring the same underlying construct. These findings indicate that researchers should carefully select scales appropriate for study goals or to appropriately capture the information being sought to inform practice.
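
    The alpha statistics reported here are Cronbach's alpha, which is short enough to state exactly; a minimal implementation for a respondents-by-items score matrix:

      import numpy as np

      def cronbach_alpha(items):
          """Cronbach's alpha for a (respondents x items) score matrix:
          alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_var = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_var / total_var)

      # e.g. the reported DKQ alpha of 0.75 reflects acceptable internal consistency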

  1. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has evolved rapidly, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed at large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base drawing on numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends on large-scale 3D printing, particularly pertaining to (1) technological solutions for additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  2. Large-scale Agricultural Land Acquisitions in West Africa

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project will examine large-scale agricultural land acquisitions in nine West African countries - Burkina Faso, Guinea-Bissau, Guinea, Benin, Mali, Togo, Senegal, Niger, and Côte d'Ivoire. ... They will use the results to increase public awareness and knowledge about the consequences of large-scale land acquisitions.

  3. A large scale analysis of information-theoretic network complexity measures using chemical structures.

    Directory of Open Access Journals (Sweden)

    Matthias Dehmer

    This paper aims to investigate information-theoretic network complexity measures which have already been intensely used in mathematical and medicinal chemistry, including drug design. Numerous such measures have been developed so far but many of them lack a meaningful interpretation, e.g., we want to examine which kind of structural information they detect. Therefore, our main contribution is to shed light on the relatedness between some selected information measures for graphs by performing a large scale analysis using chemical networks. Starting from several sets containing real and synthetic chemical structures represented by graphs, we study the relatedness between a classical (partition-based) complexity measure called the topological information content of a graph and some others inferred by a different paradigm leading to partition-independent measures. Moreover, we evaluate the uniqueness of network complexity measures numerically. Generally, a high uniqueness is an important and desirable property when designing novel topological descriptors having the potential to be applied to large chemical databases.
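
    For orientation, the classical partition-based measure mentioned (topological information content) is the Shannon entropy of a vertex partition; the sketch below uses the degree partition as a cheap stand-in for the automorphism-orbit partition (orbits refine degree classes, so this gives a lower bound):

      import math
      from collections import Counter

      def topological_information_content(adjacency):
          """Entropy of the vertex partition by degree:
          I = -sum_i (|V_i|/n) * log2(|V_i|/n)."""
          degrees = [sum(row) for row in adjacency]
          n = len(degrees)
          return -sum(s / n * math.log2(s / n)
                      for s in Counter(degrees).values())

      # Path on 3 vertices: two endpoints in one class, the center in another.
      # topological_information_content([[0,1,0],[1,0,1],[0,1,0]])  # ~0.918 bits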

  4. Information systems for knowledge management

    CERN Document Server

    Saad, Inès; Gargouri, Faiez

    2014-01-01

    More and more organizations are becoming aware of the importance of the tacit and explicit knowledge owned by their members, which corresponds to their experience and accumulated knowledge about the firm's activities. However, considering the large amount of knowledge created and used in the organization, especially with the evolution of information and communications technologies, the firm must first determine the specific knowledge on which it is necessary to focus. Creating activities to enhance the identification, preservation, and use of this knowledge is a powerful means to improve the level of econ

  5. Inference of Functional Properties from Large-scale Analysis of Enzyme Superfamilies*

    Science.gov (United States)

    Brown, Shoshana D.; Babbitt, Patricia C.

    2012-01-01

    As increasingly large amounts of data from genome and other sequencing projects become available, new approaches are needed to determine the functions of the proteins these genes encode. We show how large-scale computational analysis can help to address this challenge by linking functional information to sequence and structural similarities using protein similarity networks. Network analyses using three functionally diverse enzyme superfamilies illustrate the use of these approaches for facile updating and comparison of available structures for a large superfamily, for creation of functional hypotheses for metagenomic sequences, and to summarize the limits of our functional knowledge about even well studied superfamilies. PMID:22069325

  6. Evaluating the use of local ecological knowledge to monitor hunted tropical-forest wildlife over large spatial scales

    Directory of Open Access Journals (Sweden)

    Luke Parry

    2015-09-01

    Full Text Available Monitoring the distribution and abundance of hunted wildlife is critical to achieving sustainable resource use, yet adequate data are sparse for most tropical regions. Conventional methods for monitoring hunted forest-vertebrate species require intensive in situ survey effort, which severely constrains spatial and temporal replication. Integrating local ecological knowledge (LEK) into monitoring and management is appealing because it can be cost-effective, enhance community participation, and provide novel insights into sustainable resource use. We develop a technique to monitor population depletion of hunted forest wildlife in the Brazilian Amazon, based on the local ecological knowledge of rural hunters. We performed rapid interview surveys to estimate the landscape-scale depletion of ten large-bodied vertebrate species around 161 Amazonian riverine settlements. We assessed the explanatory and predictive power of settlement and landscape characteristics and were able to develop robust estimates of local faunal depletion. By identifying species-specific drivers of depletion and using secondary data on human population density, land form, and physical accessibility, we then estimated landscape- and regional-scale depletion. White-lipped peccary (Tayassu pecari), for example, were estimated to be absent from 17% of their putative range in Brazil's largest state (Amazonas), despite 98% of the original forest cover remaining intact. We found evidence that bushmeat consumption in small urban centers has far-reaching impacts on some forest species, including severe depletion well over 100 km from urban centers. We conclude that LEK-based approaches require further field validation, but have significant potential for community-based participatory monitoring as well as cost-effective, large-scale monitoring of threatened forest species.

  7. Cloud-enabled large-scale land surface model simulations with the NASA Land Information System

    Science.gov (United States)

    Duffy, D.; Vaughan, G.; Clark, M. P.; Peters-Lidard, C. D.; Nijssen, B.; Nearing, G. S.; Rheingrover, S.; Kumar, S.; Geiger, J. V.

    2017-12-01

    Developed by the Hydrological Sciences Laboratory at NASA Goddard Space Flight Center (GSFC), the Land Information System (LIS) is a high-performance software framework for terrestrial hydrology modeling and data assimilation. LIS provides the ability to integrate satellite and ground-based observational products and advanced modeling algorithms to extract land surface states and fluxes. Through a partnership with the National Center for Atmospheric Research (NCAR) and the University of Washington, the LIS model is currently being extended to include the Structure for Unifying Multiple Modeling Alternatives (SUMMA). With the addition of SUMMA, LIS will enable meaningful simulations containing a large multi-model ensemble and can provide advanced probabilistic continental-domain modeling capabilities at spatial scales relevant for water managers. The resulting LIS/SUMMA application framework is difficult for non-experts to install due to the large number of dependencies on specific versions of operating systems, libraries, and compilers. This has created a significant barrier to entry for domain scientists interested in using the software on their own systems or in the cloud. In addition, the requirement to support multiple runtime environments across the LIS community has created a significant burden on the NASA team. To overcome these challenges, LIS/SUMMA has been deployed using Linux containers, which allow an entire software package along with all dependencies to be installed within a working runtime environment, and Kubernetes, which orchestrates the deployment of a cluster of containers. Within a cloud environment, users can now easily create a cluster of virtual machines and run large-scale LIS/SUMMA simulations. Installations that previously took weeks or months can now be performed in minutes. This presentation will discuss the steps required to create a cloud-enabled large-scale simulation, present examples of its use, and

  8. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4 % of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so large scale hydrogen production plants will need to be installed. In this context, development of low cost large scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and center of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared, a state of the art of the electrolysis modules currently available was made, and a review of the large scale electrolysis plants installed around the world was also carried out. The main projects related to large scale electrolysis were listed, and the economics of large scale electrolysers were discussed. The influence of energy prices on the hydrogen production cost by large scale electrolysis was evaluated. (authors)
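
    The sensitivity to energy prices follows directly from the electrolyser's specific electricity consumption; a back-of-the-envelope sketch with assumed round figures (illustrative only, not numbers from the ALPHEA study):

        # Assumed figures: ~50 kWh of electricity per kg of hydrogen is a
        # typical order of magnitude for alkaline electrolysers.
        kwh_per_kg_h2 = 50.0
        electricity_price_eur_per_mwh = 40.0

        cost_per_kg = kwh_per_kg_h2 * electricity_price_eur_per_mwh / 1000.0
        print(f"Electricity cost: {cost_per_kg:.2f} EUR/kg H2")  # -> 2.00 EUR/kg

    At this consumption, every 10 EUR/MWh change in the power price moves the hydrogen cost by about 0.5 EUR/kg, which is why electricity prices dominate the economics of large scale electrolysis.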

  9. The Large-Scale Structure of Scientific Method

    Science.gov (United States)

    Kosso, Peter

    2009-01-01

    The standard textbook description of the nature of science describes the proposal, testing, and acceptance of a theoretical idea almost entirely in isolation from other theories. The resulting model of science is a kind of piecemeal empiricism that misses the important network structure of scientific knowledge. Only the large-scale description of…

  10. Organisation of biotechnological information into knowledge.

    Science.gov (United States)

    Boh, B

    1996-09-01

    The success of biotechnological research, development and marketing depends to a large extent on the international transfer of information and on the ability to organise biotechnology information into knowledge. To increase the efficiency of information-based approaches, an information strategy has been developed and consists of the following stages: definition of the problem, its structure and sub-problems; acquisition of data by targeted processing of computer-supported bibliographic, numeric, textual and graphic databases; analysis of data and building of specialized in-house information systems; information processing for structuring data into systems, recognition of trends and patterns of knowledge, particularly by information synthesis using the concept of information density; design of research hypotheses; testing hypotheses in the laboratory and/or pilot plant; repeated evaluation and optimization of hypotheses by information methods and testing them by further laboratory work. The information approaches are illustrated by examples from the university-industry joint projects in biotechnology, biochemistry and agriculture.

  11. LASSIE: the large analogue signal and scaling information environment for FAIR

    International Nuclear Information System (INIS)

    Hoffmann, T.; Braeuning, H.; Haseitl, R.

    2012-01-01

    At FAIR, the Facility for Antiproton and Ion Research, several new accelerators and storage rings such as the SIS-100, HESR, CR, the inter-connecting HEBT beam lines, the S-FRS and experiments will be built. All of these installations are equipped with beam diagnostic devices and other components that deliver time-resolved analogue signals to show the status, quality and performance of the accelerators. These signals can originate from particle detectors such as ionization chambers and plastic scintillators, but also from adapted output signals of transformers, collimators, magnet functions, RF cavities and others. To visualize and precisely correlate the time axes of all input signals, a dedicated FESA based data acquisition and analysis system named LASSIE, the Large Analogue Signal and Scaling Information Environment, is currently being developed. The main operation mode of LASSIE is currently pulse counting with latching VME scaler boards; enhancements for ADC, QDC, or TDC digitization are foreseen. The concept, features and challenges of this large distributed data acquisition system are presented. (authors)

  12. Tacit knowledge in academia: a proposed model and measurement scale.

    Science.gov (United States)

    Leonard, Nancy; Insch, Gary S

    2005-11-01

    The authors propose a multidimensional model of tacit knowledge and develop a measure of tacit knowledge in academia. They discuss the theory and extant literature on tacit knowledge and propose a 6-factor model. Experiment 1 is a replication of a recent study of academic tacit knowledge using the scale developed and administered at an Israeli university (A. Somech & R. Bogler, 1999). The results of the replication differed from those found in the original study. For Experiment 2, the authors developed a domain-specific measure of academic tacit knowledge, the Academic Tacit Knowledge Scale (ATKS), and used this measure to explore the multidimensionality of tacit knowledge proposed in the model. The results of an exploratory factor analysis (n=142) followed by a confirmatory factor analysis (n=286) are reported. The sample for both experiments was 428 undergraduate students enrolled at a large public university in the eastern United States. Results indicated that a 5-factor model of academic tacit knowledge provided a strong fit for the data.

  13. Ensuring Adequate Health and Safety Information for Decision Makers during Large-Scale Chemical Releases

    Science.gov (United States)

    Petropoulos, Z.; Clavin, C.; Zuckerman, B.

    2015-12-01

    The 2014 4-Methylcyclohexanemethanol (MCHM) spill in the Elk River of West Virginia highlighted existing gaps in emergency planning for, and response to, large-scale chemical releases in the United States. The Emergency Planning and Community Right-to-Know Act requires that facilities with hazardous substances provide Material Safety Data Sheets (MSDSs), which contain health and safety information on the hazardous substances. The MSDS produced by Eastman Chemical Company, the manufacturer of MCHM, listed "no data available" for various human toxicity subcategories, such as reproductive toxicity and carcinogenicity. As a result of incomplete toxicity data, the public and media received conflicting messages on the safety of the contaminated water from government officials, industry, and the public health community. Two days after the governor lifted the ban on water use, the health department partially retracted the lifting by warning pregnant women to continue avoiding the contaminated water, which the Centers for Disease Control and Prevention deemed safe three weeks later. The response in West Virginia represents a failure in risk communication and calls into question whether government officials have sufficient information to support evidence-based decisions during future incidents. Research capabilities, like National Science Foundation RAPID funding, can fill some of the data gaps, such as information on environmental fate in the case of the MCHM spill. In order to inform policy discussions on this issue, a methodology for assessing the outcomes of RAPID and similar National Institutes of Health grants in the context of emergency response is employed to examine the efficacy of research-based capabilities in enhancing public health decision-making capacity. The results of this assessment highlight potential roles rapid scientific research can fill in ensuring adequate health and safety data is readily available for decision makers during large-scale

  14. Geospatial Optimization of Siting Large-Scale Solar Projects

    Energy Technology Data Exchange (ETDEWEB)

    Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Quinby, Ted [National Renewable Energy Lab. (NREL), Golden, CO (United States); Caulfield, Emmet [Stanford Univ., CA (United States); Gerritsen, Margot [Stanford Univ., CA (United States); Diffendorfer, Jay [U.S. Geological Survey, Boulder, CO (United States); Haines, Seth [U.S. Geological Survey, Boulder, CO (United States)

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.

  15. Large-scale weakly supervised object localization via latent category learning.

    Science.gov (United States)

    Chong Wang; Kaiqi Huang; Weiqiang Ren; Junge Zhang; Maybank, Steve

    2015-04-01

    Localizing objects in cluttered backgrounds is challenging under large-scale weakly supervised conditions. Due to the cluttered image conditions, objects usually have large ambiguity with backgrounds, and there is also a lack of effective algorithms for large-scale weakly supervised localization in cluttered backgrounds. However, backgrounds contain useful latent information, e.g., the sky in the aeroplane class. If this latent information can be learned, object-background ambiguity can be largely reduced and background can be suppressed effectively. In this paper, we propose latent category learning (LCL) for large-scale cluttered conditions. LCL is an unsupervised learning method which requires only image-level class labels. First, we use latent semantic analysis with a semantic object representation to learn the latent categories, which represent objects, object parts or backgrounds. Second, to determine which category contains the target object, we propose a category selection strategy that evaluates each category's discrimination. Finally, we propose online LCL for use in large-scale conditions. Evaluation on the challenging PASCAL Visual Object Classes (VOC) 2007 and the ImageNet Large Scale Visual Recognition Challenge (ILSVRC) 2013 detection data sets shows that the method can improve annotation precision by 10% over previous methods. More importantly, we achieve detection precision which outperforms previous results by a large margin and is competitive with the supervised deformable part model 5.0 baseline on both data sets.
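
    A minimal sketch of the latent semantic analysis step at the core of LCL, assuming region descriptors are available as bag-of-visual-words histograms (random stand-in data here; the paper's category-selection step scores discrimination against image-level labels rather than explained variance):

        import numpy as np
        from sklearn.decomposition import TruncatedSVD

        rng = np.random.default_rng(0)
        X = rng.random((500, 1000))   # 500 region proposals x 1000 visual words

        # LSA: project regions onto latent categories that may correspond to
        # objects, object parts, or background context such as sky.
        svd = TruncatedSVD(n_components=20, random_state=0)
        Z = svd.fit_transform(X)      # latent-category activations per region

        # Crude stand-in for category selection: rank by explained variance.
        order = np.argsort(svd.explained_variance_ratio_)[::-1]
        print("strongest latent categories:", order[:5])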

  16. Influence of Extrinsic Information Scaling Coefficient on Double-Iterative Decoding Algorithm for Space-Time Turbo Codes with Large Number of Antennas

    Directory of Open Access Journals (Sweden)

    TRIFINA, L.

    2011-02-01

    Full Text Available This paper analyzes the influence of the extrinsic information scaling coefficient on a double-iterative decoding algorithm for space-time turbo codes with a large number of antennas. The max-log-APP algorithm is used, scaling both the extrinsic information in the turbo decoder and that used at the input of the interference-canceling block. Scaling coefficients of 0.7 or 0.75 lead to a 0.5 dB coding gain compared to the no-scaling case, for one or more iterations of spatial interference cancellation.
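
    The scaling itself is a one-line damping of the exchanged log-likelihood ratios; a schematic sketch (function and variable names are illustrative, not from the paper):

        import numpy as np

        def scale_extrinsic(llr_ext: np.ndarray, s: float = 0.75) -> np.ndarray:
            # Max-log-APP overestimates extrinsic reliability; damping by
            # s in [0.7, 0.75] recovered about 0.5 dB in the reported setup.
            return s * llr_ext

        # Schematic use inside one turbo iteration (siso_decode is hypothetical):
        # llr_e1 = siso_decode(metrics, a_priori=scale_extrinsic(llr_e2))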

  17. Information Impact: Journal of Information and Knowledge ...

    African Journals Online (AJOL)

  18. Informational and emotional elements in online support groups: a Bayesian approach to large-scale content analysis.

    Science.gov (United States)

    Deetjen, Ulrike; Powell, John A

    2016-05-01

    This research examines the extent to which informational and emotional elements are employed in online support forums for 14 purposively sampled chronic medical conditions, and the factors that influence whether posts are of a more informational or emotional nature. Large-scale qualitative data were obtained from Dailystrength.org. Based on a hand-coded training dataset, all posts were classified as informational or emotional using a Bayesian classification algorithm to generalize the findings; posts that could not be classified with a probability of at least 75% were excluded. The overall tendency toward emotional posts differs by condition: mental health (depression, schizophrenia) and Alzheimer's disease forums consist of more emotional posts, while informational posts relate more to nonterminal physical conditions (irritable bowel syndrome, diabetes, asthma). There is no gender difference across conditions, although prostate cancer forums are oriented toward informational support, whereas breast cancer forums feature more emotional support. Across diseases, the best predictors of emotional content are lower age and a higher number of overall posts by the support group member. The results are in line with previous empirical research and unify empirical findings from studies of one or two conditions. Limitations include the analytical restriction to predefined categories (informational, emotional) imposed by the chosen machine-learning approach. Our findings provide an empirical foundation for building theory on informational versus emotional support across conditions, give insights for practitioners to better understand the role of online support groups for different patients, and show the usefulness of machine-learning approaches for analyzing large-scale qualitative health data from online settings.
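
    A minimal sketch of the classification step, assuming a hand-coded training set and the 75% posterior-probability cutoff described above (a generic naive Bayes text pipeline, not the authors' exact implementation; the example posts are hypothetical):

        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.naive_bayes import MultinomialNB
        from sklearn.pipeline import make_pipeline

        posts = ["What dose did your doctor prescribe?",      # hypothetical
                 "I feel so alone with this diagnosis."]      # training posts
        labels = ["informational", "emotional"]

        clf = make_pipeline(TfidfVectorizer(), MultinomialNB())
        clf.fit(posts, labels)

        proba = clf.predict_proba(["Has anyone tried the new inhaler?"])[0]
        label = clf.classes_[proba.argmax()] if proba.max() >= 0.75 else None
        print(label or "excluded: below 75% threshold")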

  19. A Novel Architecture of Large-scale Communication in the IoT

    Science.gov (United States)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have done a great deal of research on the development of the Internet of Things and networked physical systems. However, few have described in detail a large-scale communication architecture for the IoT. In fact, non-uniform technology between IPv6 and access points has led to a lack of broad principles for large-scale communication architectures. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communication in the IoT.

  20. Informed consent: information or knowledge?

    Science.gov (United States)

    Berger, Ken

    2003-01-01

    A fiduciary relationship should be nurtured between patient and physician. This requires effective communication throughout all aspects of care - especially pertaining to treatment decisions. Illness as experienced by the patient presents a unique set of circumstances, and communication in an illness context is fraught with problems: the patient is vulnerable and the situation may be overwhelming. Voluminous amounts of information are available to patients from a host of health care providers, family members, support groups, advocacy centers, books, journals, and the internet. Often conflicting and confusing, frequently complex, this information may be of greater burden than benefit. Some information is of high validity and reliability, while other information is of dubious reliability. The emotional freight of bad news may further inhibit understanding, and an overload of information may pose an obstacle to decision-making. To facilitate the transformation of information into knowledge, the health care provider must act on some occasions as a filter, on other occasions as a conduit, and on still other occasions simply as a reservoir. The evolution of patients' rights to receive or refuse treatment, and to know or not to know, calls for a change in the processing of overwhelming information in our modern era. In this paper we discuss the difference between information and knowledge. How can health care providers ensure they have given their patients all the information necessary and sufficient to make an autonomous decision? How can they facilitate the transformation of information into knowledge? Turning information into knowledge allows a more focused, relevant and modern approach to choice in health care.

  1. Information Impact: Journal of Information and Knowledge ...

    African Journals Online (AJOL)

  2. Progress in Root Cause and Fault Propagation Analysis of Large-Scale Industrial Processes

    Directory of Open Access Journals (Sweden)

    Fan Yang

    2012-01-01

    Full Text Available In large-scale industrial processes, a fault can easily propagate between process units due to the interconnections of material and information flows. The problem of fault detection and isolation for these processes is therefore concerned first with the root cause and fault propagation paths, before quantitative methods are applied to local models. Process topology and causality, as the key features of the process description, need to be captured from process knowledge and process data. The modelling methods from these two aspects are reviewed in this paper. From process knowledge: structural equation modelling, various causal graphs, rule-based models, and ontological models are summarized. From process data: cross-correlation analysis, Granger causality and its extensions, frequency domain methods, information-theoretical methods, and Bayesian nets are introduced. Based on these models, inference methods for finding root causes and fault propagation paths under abnormal situations are discussed, and directions for future work are proposed at the end.
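
    Of the data-driven methods listed, Granger causality is perhaps the most accessible; a minimal sketch testing whether one process variable helps predict another, suggesting a fault-propagation path (synthetic stand-in signals):

        import numpy as np
        from statsmodels.tsa.stattools import grangercausalitytests

        rng = np.random.default_rng(1)
        x = rng.standard_normal(500)                        # upstream variable
        y = np.roll(x, 3) + 0.1 * rng.standard_normal(500)  # lags x by 3 samples

        # Column order matters: this tests whether the 2nd column
        # Granger-causes the 1st, i.e. a candidate path x -> y.
        res = grangercausalitytests(np.column_stack([y, x]), maxlag=5, verbose=False)
        for lag, (tests, _) in res.items():
            print(f"lag {lag}: p = {tests['ssr_ftest'][1]:.4f}")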

  3. Knowledge for the future - Time eats information

    International Nuclear Information System (INIS)

    Kornwachs, Klaus

    2015-01-01

    The need to pass knowledge on to future generations is not unique to radioactive waste management. Think, for instance, of chemical waste, space debris, the location of land mines, or the genetic code of manipulated organisms. In all these cases we have to handle the impacts and effects of technologies over the long term; the time frame of these effects exceeds the lifetime of one generation or more. In order to enable future generations to handle this precarious legacy we need to hand on suitable information. However, this is not enough; we have to facilitate the understanding of the very meaning of this information, too. This can be referred to as a 'wicked problem', since the legacy of the nuclear age is distributed all over the world and huge amounts of waste have been accumulated. There is not yet any solution available which could reduce the half-life of nuclear waste on a large industrial scale. Information is constantly decaying, e.g. due to copy processes and the limited lifetime of information carriers such as paper and chemical, electronic and nano-storage technologies. For time frames greater than 1 000 years, none of the present technologies seems long-lasting or effective enough by itself. It can be shown that no presently known information and communication technology (ICT) can preserve written or electronically stored information over, say, 4 000 years. The preservation effort would have to include reception, deciphering, and semantically correct understanding. The decay of information entails the decay of knowledge, which in turn diminishes the possibilities for action. However, we and future generations need this knowledge (including the basics of physics and relevant technology) in order to be able to take action in the future. This task is still unresolved, both for nuclear waste management and for other issues. One can only try to pass knowledge on to future generations via institutions. However, an organisational

  4. Emergent Semantics Interoperability in Large-Scale Decentralized Information Systems

    CERN Document Server

    Cudré-Mauroux, Philippe

    2008-01-01

    Peer-to-peer systems are evolving with new information-system architectures, leading to the idea that the principles of decentralization and self-organization will offer new approaches in informatics, especially for systems that scale with the number of users or for which central authorities do not prevail. This book describes a new way of building global agreements (semantic interoperability) based only on decentralized, self-organizing interactions.

  5. Information contained within the large scale gas injection test (Lasgit) dataset exposed using a bespoke data analysis tool-kit

    International Nuclear Information System (INIS)

    Bennett, D.P.; Thomas, H.R.; Cuss, R.J.; Harrington, J.F.; Vardon, P.J.

    2012-01-01

    Document available in extended abstract form only. The Large Scale Gas Injection Test (Lasgit) is a field scale experiment run by the British Geological Survey (BGS) and located approximately 420 m underground at SKB's Aespoe Hard Rock Laboratory (HRL) in Sweden. It has been designed to study the impact on safety of gas build-up within a KBS-3V concept high level radioactive waste repository. Lasgit has been in almost continuous operation for approximately seven years and is still underway. An analysis of the dataset arising from the Lasgit experiment, with particular attention to the smaller scale features and phenomena recorded, has been undertaken in parallel to the macro scale analysis performed by the BGS. Lasgit is a highly instrumented, frequently sampled and long-lived experiment, leading to a substantial dataset containing in excess of 14.7 million data points. The data are anticipated to include a wealth of information, regarding overall processes as well as smaller scale or 'second order' features. Due to the size of the dataset, coupled with the detail of analysis required and the reduction in subjectivity associated with measurement compared to observation, computational analysis is essential. Moreover, due to the length of operation and complexity of experimental activity, the Lasgit dataset is not typically suited to 'out of the box' time series analysis algorithms. In particular, the features that are not suited to standard algorithms include non-uniformities due to (deliberate) changes in sample rate at various points in the experimental history and missing data due to hardware malfunction/failure causing interruption of logging cycles. To address these features a computational tool-kit capable of performing an Exploratory Data Analysis (EDA) on long-term, large-scale datasets with non-uniformities has been developed. Particular tool-kit abilities include: the parameterization of signal variation in the dataset

  6. Success Factors of Large Scale ERP Implementation in Thailand

    OpenAIRE

    Rotchanakitumnuai, Siriluck

    2010-01-01

    The objective of the study is to examine the determinants of success in large scale ERP implementation. The results indicate that large scale ERP implementation success consists of eight factors: project management competence, knowledge sharing, ERP system quality, understanding, user involvement, business process re-engineering, top management support, and organization readiness.

  7. Large-scale Health Information Database and Privacy Protection*1

    OpenAIRE

    YAMAMOTO, Ryuichi

    2016-01-01

    Japan was once progressive in the digitalization of healthcare fields but unfortunately has fallen behind in terms of the secondary use of data for public interest. There has recently been a trend to establish large-scale health databases in the nation, and a conflict between data use for public interest and privacy protection has surfaced as this trend has progressed. Databases for health insurance claims or for specific health checkups and guidance services were created according to the law...

  8. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement ("the Task") and to assess whether involvement in the task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large scale solar purchasing amongst potential large scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large scale active solar heating purchasing activity within the UK. (author)

  9. Photorealistic large-scale urban city model reconstruction.

    Science.gov (United States)

    Poullis, Charalambos; You, Suya

    2009-01-01

    The rapid and efficient creation of virtual environments has become a crucial part of virtual reality applications. In particular, civil and defense applications often require and employ detailed models of operations areas for training, simulations of different scenarios, planning for natural or man-made events, monitoring, surveillance, games, and films. A realistic representation of large-scale environments is therefore imperative for the success of such applications, since it increases the immersive experience of its users and helps reduce the difference between physical and virtual reality. However, the task of creating such large-scale virtual environments remains time-consuming, manual work. In this work, we propose a novel method for the rapid reconstruction of photorealistic large-scale virtual environments. First, a novel, extendible, parameterized geometric primitive is presented for automatic building identification and reconstruction of building structures. In addition, buildings with complex roofs containing complex linear and nonlinear surfaces are reconstructed interactively using a linear polygonal and a nonlinear primitive, respectively. Second, we present a rendering pipeline for the composition of photorealistic textures which, unlike existing techniques, can recover missing or occluded texture information by integrating information captured from multiple optical sensors (ground, aerial, and satellite).

  10. Coordinated SLNR based Precoding in Large-Scale Heterogeneous Networks

    KAUST Repository

    Boukhedimi, Ikram

    2017-03-06

    This work focuses on the downlink of large-scale two-tier heterogeneous networks composed of a macro-cell overlaid by micro-cell networks. Our interest is in the design of coordinated beamforming techniques that mitigate inter-cell interference. In particular, we consider the case in which the coordinating base stations (BSs) have imperfect knowledge of the channel state information. Under this setting, we propose a regularized SLNR based precoding design in which the regularization factor is used to allow better resilience to channel estimation errors. Using tools from random matrix theory, we provide an analytical characterization of the SINR and SLNR performance. These results are then exploited to propose a proper setting of the regularization factor. Simulation results are finally provided to validate our findings and confirm the performance of the proposed precoding scheme.
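
    A textbook form of such a precoder, with the regularization factor alpha standing in for the robustification described above (a sketch under assumed notation, not the paper's exact design):

        import numpy as np

        def slnr_precoder(H: np.ndarray, alpha: float) -> np.ndarray:
            # H: K x M estimated channels (row k = user k); returns M x K
            # unit-norm beamformers w_k ~ (alpha*I + sum_{j!=k} h_j^H h_j)^{-1} h_k^H.
            K, M = H.shape
            W = np.zeros((M, K), dtype=complex)
            for k in range(K):
                others = np.delete(H, k, axis=0)        # leakage channels
                A = alpha * np.eye(M) + others.conj().T @ others
                wk = np.linalg.solve(A, H[k].conj())
                W[:, k] = wk / np.linalg.norm(wk)
            return W

        rng = np.random.default_rng(0)
        H = (rng.standard_normal((4, 32)) + 1j * rng.standard_normal((4, 32))) / np.sqrt(2)
        W = slnr_precoder(H, alpha=0.1)   # larger alpha -> more robust to CSI error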

  11. Integrating Remote Sensing Information Into A Distributed Hydrological Model for Improving Water Budget Predictions in Large-scale Basins through Data Assimilation

    Science.gov (United States)

    Qin, Changbo; Jia, Yangwen; Su, Z.(Bob); Zhou, Zuhao; Qiu, Yaqin; Suhui, Shen

    2008-01-01

    This paper investigates whether remote sensing evapotranspiration estimates can be integrated, by means of data assimilation, into a distributed hydrological model to improve predictions of spatial water distribution over a large river basin with an area of 317,800 km². A series of available MODIS satellite images over the Haihe River basin in China are used for the year 2005. Evapotranspiration is retrieved from these 1×1 km resolution images using the SEBS (Surface Energy Balance System) algorithm. The physically-based distributed model WEP-L (Water and Energy transfer Process in Large river basins) is used to compute the water balance of the Haihe River basin in the same year. Comparison between model-derived and remote-sensing-retrieved basin-averaged evapotranspiration estimates shows a good piecewise linear relationship, but their spatial distributions within the Haihe basin differ; the remote sensing derived evapotranspiration shows variability at finer scales. An extended Kalman filter (EKF) data assimilation algorithm, suitable for non-linear problems, is used. Assimilation results indicate that remote sensing observations have a potentially important role in providing spatial information to the assimilation system for the spatially optimal hydrological parameterization of the model. This is especially important for large basins, such as the Haihe River basin in this study. Combining and integrating the capabilities of, and information from, model simulation and remote sensing techniques may provide the best spatial and temporal characterization of hydrological states/fluxes, and would be both appealing and necessary for improving our knowledge of fundamental hydrological processes and for addressing important water resource management problems. PMID:27879946
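
    The assimilation step itself can be sketched generically; a single EKF analysis update under assumed notation (a minimal sketch, not the WEP-L specifics):

        import numpy as np

        def ekf_update(x, P, z, h, H_jac, R):
            # x, P : forecast state and error covariance from the hydrological model
            # z    : remote-sensing observation (e.g., SEBS evapotranspiration)
            # h    : observation operator; H_jac its Jacobian at x; R : obs. error cov.
            y = z - h(x)                              # innovation
            S = H_jac @ P @ H_jac.T + R               # innovation covariance
            K = P @ H_jac.T @ np.linalg.inv(S)        # Kalman gain
            x_a = x + K @ y                           # analysis state
            P_a = (np.eye(len(x)) - K @ H_jac) @ P    # analysis covariance
            return x_a, P_a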

  12. D and D Knowledge Management Information Tool - 2012 - 12106

    Energy Technology Data Exchange (ETDEWEB)

    Upadhyay, H.; Lagos, L.; Quintero, W.; Shoffner, P. [Applied Research Center, Florida International University, Miami, FL 33174 (United States); DeGregory, J. [Office of D and D and Facility Engineering, Environmental Management, Department of Energy (United States)

    2012-07-01

    Deactivation and decommissioning (D and D) work is a high priority activity across the Department of Energy (DOE) complex. Subject matter specialists (SMS) associated with the different ALARA (As-Low-As-Reasonably-Achievable) Centers, DOE sites, the Energy Facility Contractors Group (EFCOG) and the D and D community have gained extensive knowledge and experience over the years in the cleanup of the legacy waste from the Manhattan Project. To prevent this D and D knowledge and expertise from being lost over time to an evolving and aging workforce, DOE and the Applied Research Center (ARC) at Florida International University (FIU) proposed to capture and maintain this valuable information in a universally available and easily usable system. D and D KM-IT provides single-point access to all D and D related activities through its knowledge base. It is a community-driven system. D and D KM-IT makes D and D knowledge available to the people who need it, at the time they need it, and in a readily usable format. It uses the World Wide Web as the primary source of content, in addition to information collected from subject matter specialists and the D and D community. It delivers information in real time through web based custom search processes and its dynamic knowledge repository. Future developments include a document library, D and D information access on mobile devices for the Technology module and Hotline, and coordination of multiple subject matter specialists to support the Hotline. The goal is to deploy a sophisticated, secure system that serves as a single large knowledge base for all D and D activities. The system consolidates a large amount of information available on the web and presents it to users in the simplest way possible. (authors)

  13. Theme II Joint Work Plan - 2017 Collaboration and Knowledge Sharing on Large-scale Demonstration Projects

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xiaoliang [World Resources Inst. (WRI), Washington, DC (United States); Stauffer, Philip H. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-25

    This effort is designed to expedite learning from existing and planned large-scale demonstration projects and their associated research through effective knowledge sharing among participants in the US and China.

  14. A Knowledge Based Recommender System with Multigranular Linguistic Information

    Directory of Open Access Journals (Sweden)

    Luis Martinez

    2008-08-01

    Full Text Available Recommender systems are applications that have emerged in the e-commerce area to assist users in their searches in electronic shops. These shops usually offer a wide range of items covering the necessities of a great variety of users, but searching such a wide range of items can be a difficult and time-consuming task. Recommender systems assist users in finding suitable items by means of recommendations based on information provided by different sources, such as other users, experts, and item features. Most recommender systems force users to provide their preferences or necessities on a single numerical scale fixed in advance. Since this information usually relates to opinions, tastes and perceptions, it is often better expressed qualitatively, with linguistic terms, than quantitatively, with precise numbers. We propose a Knowledge Based Recommender System that uses the fuzzy linguistic approach to define a flexible framework capturing the uncertainty of users' preferences. This framework allows users to express their necessities on scales closer to their own knowledge, different from the scale used to describe the items.

  15. [A large-scale accident in Alpine terrain].

    Science.gov (United States)

    Wildner, M; Paal, P

    2015-02-01

    Due to the geographical conditions, large-scale accidents amounting to mass casualty incidents (MCI) in Alpine terrain regularly present rescue teams with huge challenges. Using an example incident, specific conditions and typical problems associated with such a situation are presented. The first rescue team members to arrive have the elementary tasks of qualified triage and communication to the control room, which is required to dispatch the necessary additional support. Only with a clear "concept", to which all have to adhere, can the subsequent chaos phase be limited. In this respect, the time factor, compounded by adverse weather conditions or darkness, creates enormous pressure; additional hazards are frostbite and hypothermia. If priorities can be established in terms of urgency, treatment and procedure algorithms have proven successful. For evacuation of casualties, helicopter transport should be sought. Due to the low density of hospitals in Alpine regions, it is often necessary to distribute patients over a wide area. Rescue operations in Alpine terrain have to be performed according to the particular conditions and require rescue teams to have specific knowledge and expertise. The possibility of a large-scale accident should be considered when planning events. With respect to optimization of rescue measures, regular training and exercises are rational, as is the analysis of previous large-scale Alpine accidents.

  16. Knowledge Management for Large Scale Condition Based Maintenance

    Data.gov (United States)

    National Aeronautics and Space Administration — This presentation will review the use of knowledge management in the development and support of Condition Based Maintenance (CBM) systems for complex systems with...

  17. Homogenization of Large-Scale Movement Models in Ecology

    Science.gov (United States)

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
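
    The flavor of the result can be stated compactly; a sketch of the standard ecological-diffusion scaling (notation assumed, not taken verbatim from the paper):

        % Ecological diffusion with spatially varying motility \mu(x):
        \[ u_t = \Delta\bigl(\mu(x)\,u\bigr) \]
        % At leading order, a two-scale expansion forces \mu u to be constant
        % on the fast scale, giving
        \[ u(x,t) \approx \frac{c(x,t)}{\mu(x)}, \qquad
           c_t = \bar{\mu}\,\Delta c, \qquad
           \bar{\mu} = \bigl\langle \mu^{-1} \bigr\rangle^{-1}, \]
        % so the effective large-scale coefficient is the harmonic mean of the
        % local motilities, and animals accumulate where \mu(x) is small.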

  18. Mechanism of information resources knowledge

    Directory of Open Access Journals (Sweden)

    A. M. Izmailov

    2016-01-01

    Full Text Available The modern vector of economic development is quite different from previous periods, which in turn is reflected in strategic state documents, including those produced by the Government of Russia. In particular, the state has defined its development vector as innovative and socially oriented. Given the level of technical and technological progress and its impact on present-day realities, attention must be paid to the growing relevance and significance of information in the development of modern society. The article analyzes the degree of influence of information and knowledge on modern economic processes taking place in society. The analysis is based on the scientific work of modern scholars specializing in the study of information problems and the knowledge economy. The main trends in the development of the information society relate primarily to the expansion of data, as well as to the strengthening role of progressive information technologies in reducing geographical resistance to sharing knowledge. The authors define the essence of the information knowledge environment and characterize the spectrum of features inherent in the modern information environment. The concept of the 'information knowledge resource' (IKR), a synthesis of three terms - information, knowledge, and resources - is analyzed and its interpretation refined. The role of IKR in shaping the knowledge economy is outlined. Specific features of information knowledge resources are presented, among which stand out intangibility, immeasurability, and non-exclusive use. The paper presents a model of the IKR based on the transformation of information into knowledge under the influence of certain factors that can affect the quality of the resulting knowledge. Also proposed is an IKR management mechanism as a central element of organizational change

  19. Large-scale pool fires

    Directory of Open Access Journals (Sweden)

    Steinhaus Thomas

    2007-01-01

    Full Text Available A review of research into the burning behavior of large pool fires and fuel spill fires is presented. The features which distinguish such fires from smaller pool fires are mainly associated with the fire dynamics at low source Froude numbers and the radiative interaction with the fire source. In hydrocarbon fires, higher soot levels at increased diameters result in radiation blockage effects around the perimeter of large fire plumes; this yields lower emissive powers and a drastic reduction in the radiative loss fraction. Whilst there are simplifying factors with these phenomena, arising from the fact that soot yield can saturate, there are other complications deriving from the intermittency of the behavior, with luminous regions of efficient combustion appearing randomly on the outer surface of the fire according to the turbulent fluctuations in the fire plume. Knowledge of the fluid flow instabilities, which lead to the formation of large eddies, is also key to understanding the behavior of large-scale fires. Here, modeling tools can be effectively exploited to investigate the fluid flow phenomena, including RANS- and LES-based computational fluid dynamics codes. The latter are well-suited to representation of the turbulent motions, but a number of challenges remain in their practical application. Massively-parallel computational resources are likely to be necessary in order to adequately address the complex coupled phenomena to the level of detail that is necessary.

  1. Supporting Knowledge Transfer in IS Deployment Projects

    Science.gov (United States)

    Schönström, Mikael

    To deploy new information systems is an expensive and complex task, and seldom results in successful usage where the system adds strategic value to the firm (e.g. Sharma et al. 2003). It has been argued that innovation diffusion is a knowledge integration problem (Newell et al. 2000). Knowledge about business processes, deployment processes, information systems and technology is needed in a large-scale deployment of a corporate IS. These deployments can therefore, to a large extent, be argued to be a knowledge management (KM) problem. An effective deployment requires that knowledge about the system is effectively transferred to the target organization (Ko et al. 2005).

  2. Knowledge Sharing Strategies for Large Complex Building Projects.

    Directory of Open Access Journals (Sweden)

    Esra Bektas

    2013-06-01

    knowledge sharing. In the literature, two main approaches to knowledge sharing can be distinguished: an object- or content-oriented perspective and a community-oriented perspective. In the object perspective, technology is seen as a medium to store and share knowledge. The limitations of this perspective are that social processes and tacit knowledge are not adequately supported, and there has generally been a slow adoption of such technology in design practices. The community perspective avoids these problems by allowing for the natural and informal formation of communities; however, communities can become largely independent and unconnected, which makes it more difficult to entice them towards a certain strategic direction. Since both approaches have their limitations, this thesis proposes a holistic framework for knowledge sharing in LCBPs drawing on concepts by Mintzberg (1973) and Activity Theory. Mintzberg's concepts are used to discuss the type of implementation (top-down, bottom-up) and the origin of the strategies (deliberate, emergent, unrealized). For analysing the content and effect of each strategy, concepts from Activity Theory (tools, subject, object, rules, community, and division of labour) are used. The proposed model, the Knowledge Diamond, consists of four dimensions for analysing and designing knowledge sharing strategies for LCBPs. Three of these were inspired by Activity Theory, namely tools, procedures, and social practices, while a fourth emerged as a crucial dimension in this thesis: physical settings. This framework was used to examine knowledge sharing strategies in a comparative analysis of two large complex building projects. Based on rich data from observations, documents and interviews, the origins, development and effect of both forms of knowledge sharing environments are investigated. The first part of the analysis sheds new light on the possibilities of knowledge sharing in large complex building projects. The unique, temporary, and

  3. Energy transfers in large-scale and small-scale dynamos

    Science.gov (United States)

    Samtaney, Ravi; Kumar, Rohit; Verma, Mahendra

    2015-11-01

    We present the energy transfers, mainly energy fluxes and shell-to-shell energy transfers, in small-scale dynamo (SSD) and large-scale dynamo (LSD) regimes, using numerical simulations of MHD turbulence for Pm = 20 (SSD) and Pm = 0.2 (LSD) on a 1024³ grid. For SSD, we demonstrate that the magnetic energy growth is caused by nonlocal energy transfers from the large-scale or forcing-scale velocity field to the small-scale magnetic field. The peak of these energy transfers moves towards lower wavenumbers as the dynamo evolves, which is the reason for the growth of the magnetic fields at large scales. The energy transfers U2U (velocity to velocity) and B2B (magnetic to magnetic) are forward and local. For LSD, we show that the magnetic energy growth takes place via energy transfers from the large-scale velocity field to the large-scale magnetic field. We observe forward U2U and B2B energy fluxes, similar to SSD.

  4. Information Impact | Journal of Information and Knowledge ...

    African Journals Online (AJOL)

    Information Impact | Journal of Information and Knowledge Management ... Abstract. This study was designed to determine the impact of collaboration on research and teaching .... delivery of quality instruction. ... for doing Collaborative projects, and there ... reveals that co-construction of knowledge ...

  5. Requirements and principles for the implementation and construction of large-scale geographic information systems

    Science.gov (United States)

    Smith, Terence R.; Menon, Sudhakar; Star, Jeffrey L.; Estes, John E.

    1987-01-01

    This paper provides a brief survey of the history, structure and functions of 'traditional' geographic information systems (GIS), and then suggests a set of requirements that large-scale GIS should satisfy, together with a set of principles for their satisfaction. These principles, which include the systematic application of techniques from several subfields of computer science to the design and implementation of GIS and the integration of techniques from computer vision and image processing into standard GIS technology, are discussed in some detail. In particular, the paper provides a detailed discussion of questions relating to appropriate data models, data structures and computational procedures for the efficient storage, retrieval and analysis of spatially-indexed data.
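
    To make the storage-and-retrieval point concrete, here is a minimal uniform-grid spatial index of the kind such systems build on (a simplified sketch; production GIS typically use R-trees, quadtrees, and richer data models):

        from collections import defaultdict

        class GridIndex:
            """Minimal uniform-grid index over point records."""

            def __init__(self, cell):
                self.cell = cell
                self.buckets = defaultdict(list)

            def _key(self, x, y):
                return (int(x // self.cell), int(y // self.cell))

            def insert(self, x, y, record):
                self.buckets[self._key(x, y)].append((x, y, record))

            def query(self, xmin, ymin, xmax, ymax):
                # Visit only the grid cells overlapping the search window.
                i0, j0 = self._key(xmin, ymin)
                i1, j1 = self._key(xmax, ymax)
                for i in range(i0, i1 + 1):
                    for j in range(j0, j1 + 1):
                        for x, y, rec in self.buckets[(i, j)]:
                            if xmin <= x <= xmax and ymin <= y <= ymax:
                                yield rec

        idx = GridIndex(cell=100.0)
        idx.insert(34.4, 119.8, "station-1")
        print(list(idx.query(0, 0, 200, 200)))   # -> ['station-1']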

  6. Challenges for Large Scale Structure Theory

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    I will describe some of the outstanding questions in Cosmology where answers could be provided by observations of the Large Scale Structure of the Universe at late times. I will discuss some of the theoretical challenges which will have to be overcome to extract this information from the observations. I will describe some of the theoretical tools that might be useful to achieve this goal.

  7. Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Willcox, Karen [MIT; Marzouk, Youssef [MIT

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to

  8. Portraiture in the Large Lecture: Storying One Chemistry Professor's Practical Knowledge

    Science.gov (United States)

    Eddleton, Jeannine E.

    Practical knowledge, as defined by Freema Elbaz (1983), is a complex, practically oriented set of understandings which teachers use to actively shape and direct their work. The goal of this study is the construction of a social science portrait that illuminates the practical knowledge of a large lecture professor of general chemistry at a public research university in the southeast. This study continues Elbaz's (1981) work on practical knowledge with the incorporation of a qualitative and intentionally interventionist methodology which "blurs the boundaries of aesthetics and empiricism in an effort to capture the complexity, dynamics, and subtlety of human experience and organizational life," (Lawrence-Lightfoot & Davis, 1997). This collection of interviews, observations, writings, and reflections is designed for an eclectic audience with the intent of initiating conversation on the topic of the large lecture and is a purposeful attempt to link research and practice. Social science portraiture is uniquely suited to this intersection of researcher and researched, the perfect combination of methodology and analysis for a project that is both product and praxis. The following research questions guide the study. • Are aspects of Elbaz's practical knowledge identifiable in the research conversations conducted with a large lecture college professor? • Is practical knowledge identifiable during observations of Patricia's large lecture? Freema Elbaz conducted research conversations with Sarah, a high school classroom and writing resource teacher who conducted much of her teaching work one on one with students. Patricia's practice differs significantly from Sarah's with respect to subject matter and to scale.

  9. Knowledge of Knowledge: Problematic of Epistemology of Library and Information Science

    Directory of Open Access Journals (Sweden)

    Hasan Keseroğlu

    2010-12-01

    Philosophy, detached from all implementation, is based only on concepts and language; it is supra-disciplinary. The focus of this study is to discuss the theoretical problematic of Library and Information Science in Turkey and to attempt to describe the knowledge of this field. The theory of knowledge of any discipline can only be established and enhanced on the unique knowledge of that discipline. Speaking of a theory of Library and Information Science knowledge is possible only because of distinctive knowledge detached from other disciplines. This distinctive knowledge is the knowledge of the library institution, which has remained unchanged since its first models and which, when removed from the field (LIS), becomes ordinary and falls outside the originality of library and information science. "The theory of knowledge of the field of Library and Information Science" needs to be examined from three perspectives: knowledge of the Library and Information Science field; knowledge of the organization of recorded information as the object of the library (all processes from selection to use); and knowledge of the user.

  10. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas. Surveys varied subject areas and reports on individual results of research in the field. Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field.

  11. Tacit knowledge: A refinement and empirical test of the Academic Tacit Knowledge Scale.

    Science.gov (United States)

    Insch, Gary S; McIntyre, Nancy; Dawley, David

    2008-11-01

    Researchers have linked tacit knowledge to improved organizational performance, but research on how to measure tacit knowledge is scarce. In the present study, the authors proposed and empirically tested a model of tacit knowledge and an accompanying measurement scale of academic tacit knowledge. They present 6 hypotheses that support the proposed tacit knowledge model regarding the role of cognitive (self-motivation, self-organization); technical (individual task, institutional task); and social (task-related, general) skills. The authors tested these hypotheses with 542 responses to the Academic Tacit Knowledge Scale, which included the respondents' grade point average, the performance variable. All 6 hypotheses were supported.

  12. Multi-level discriminative dictionary learning with application to large scale image classification.

    Science.gov (United States)

    Shen, Li; Sun, Gang; Huang, Qingming; Wang, Shuhui; Lin, Zhouchen; Wu, Enhua

    2015-10-01

    The sparse coding technique has shown flexibility and capability in image representation and analysis. It is a powerful tool in many visual applications. Some recent work has shown that incorporating the properties of the task (such as discrimination for a classification task) into dictionary learning is effective for improving accuracy. However, traditional supervised dictionary learning methods suffer from high computational complexity when dealing with a large number of categories, making them less satisfactory in large scale applications. In this paper, we propose a novel multi-level discriminative dictionary learning method and apply it to large scale image classification. Our method takes advantage of hierarchical category correlation to encode multi-level discriminative information. Each internal node of the category hierarchy is associated with a discriminative dictionary and a classification model. The dictionaries at different layers are learnt to capture the information of different scales. Moreover, each node at lower layers also inherits the dictionary of its parent, so that the categories at lower layers can be described with multi-scale information. The learning of dictionaries and associated classification models is jointly conducted by minimizing an overall tree loss. The experimental results on challenging data sets demonstrate that our approach achieves excellent accuracy and competitive computation cost compared with other sparse coding methods for large scale image classification.
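
    To make the hierarchical idea concrete, the following is a minimal sketch (not the authors' implementation) of per-node dictionaries in which a child codes signals against its own atoms stacked with its inherited parent atoms. The `Node`/`fit_node` names, the ridge-regularized codes standing in for sparse codes, and the alternating least-squares dictionary update are all illustrative assumptions; the paper instead minimizes a joint discriminative tree loss.

    ```python
    # Hedged sketch of multi-level dictionary learning with inheritance.
    import numpy as np

    class Node:
        def __init__(self, n_atoms, parent=None):
            self.parent = parent
            self.n_atoms = n_atoms
            self.D = None  # this node's own dictionary (d x n_atoms)

        def dictionary(self):
            """Own atoms stacked with all inherited ancestor atoms."""
            Ds, node = [], self
            while node is not None:
                if node.D is not None:
                    Ds.append(node.D)
                node = node.parent
            return np.hstack(Ds)

    def fit_node(node, X, n_iter=20):
        d = X.shape[0]
        rng = np.random.default_rng(0)
        node.D = rng.normal(size=(d, node.n_atoms))
        node.D /= np.linalg.norm(node.D, axis=0)
        for _ in range(n_iter):
            D = node.dictionary()
            # Ridge-regularized codes stand in for sparse codes (assumption).
            A = np.linalg.solve(D.T @ D + 0.1 * np.eye(D.shape[1]), D.T @ X)
            own = A[: node.n_atoms]              # coefficients on own atoms
            resid = X - (D @ A - node.D @ own)   # part the own atoms should explain
            node.D = resid @ own.T @ np.linalg.pinv(
                own @ own.T + 1e-6 * np.eye(node.n_atoms))
            node.D /= np.linalg.norm(node.D, axis=0) + 1e-12

    root = Node(n_atoms=8)
    child = Node(n_atoms=4, parent=root)
    X = np.random.default_rng(1).normal(size=(16, 200))
    fit_node(root, X)
    fit_node(child, X)  # child atoms refine what the inherited atoms miss
    ```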

  13. Rank Order Coding: a Retinal Information Decoding Strategy Revealed by Large-Scale Multielectrode Array Retinal Recordings.

    Science.gov (United States)

    Portelli, Geoffrey; Barrett, John M; Hilgen, Gerrit; Masquelier, Timothée; Maccione, Alessandro; Di Marco, Stefano; Berdondini, Luca; Kornprobst, Pierre; Sernagor, Evelyne

    2016-01-01

    How a population of retinal ganglion cells (RGCs) encodes the visual scene remains an open question. Going beyond individual RGC coding strategies, results in salamander suggest that the relative latencies of a RGC pair encode spatial information. Thus, a population code based on this concerted spiking could be a powerful mechanism to transmit visual information rapidly and efficiently. Here, we tested this hypothesis in mouse by recording simultaneous light-evoked responses from hundreds of RGCs, at pan-retinal level, using a new generation of large-scale, high-density multielectrode array consisting of 4096 electrodes. Interestingly, we did not find any RGCs exhibiting a clear latency tuning to the stimuli, suggesting that in mouse, individual RGC pairs may not provide sufficient information. We show that a significant amount of information is encoded synergistically in the concerted spiking of large RGC populations. Thus, the RGC population response described with relative activities, or ranks, provides more relevant information than classical independent spike count- or latency-based codes. In particular, we report for the first time that when considering the relative activities across the whole population, the wave of first stimulus-evoked spikes is an accurate indicator of stimulus content. We show that this coding strategy coexists with classical neural codes, and that it is more efficient and faster. Overall, these novel observations suggest that already at the level of the retina, concerted spiking provides a reliable and fast strategy to rapidly transmit new visual scenes.
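
    A minimal sketch of a rank-order read-out in the spirit described here (synthetic data, not the authors' recording pipeline): stimuli are identified by the rank order of first-spike latencies across the population, which is invariant to a global latency shift that would defeat an absolute-latency code.

    ```python
    # Hedged sketch: decode a stimulus from latency *ranks*, not latencies.
    import numpy as np
    from scipy.stats import rankdata, spearmanr

    rng = np.random.default_rng(0)
    n_cells, n_stimuli = 200, 5

    # Template first-spike latency (ms) per cell for each stimulus.
    templates = rng.uniform(20, 120, size=(n_stimuli, n_cells))

    def decode(latencies, templates):
        """Pick the stimulus whose latency ranks best match the response."""
        r = rankdata(latencies)
        scores = [spearmanr(r, rankdata(t))[0] for t in templates]
        return int(np.argmax(scores))

    # A noisy, globally delayed replay of stimulus 3: the rank order is
    # preserved even though absolute latencies shift by 15 ms.
    response = templates[3] + 15.0 + rng.normal(0, 2.0, n_cells)
    assert decode(response, templates) == 3
    ```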

  14. Knowledge base mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Suwa, M; Furukawa, K; Makinouchi, A; Mizoguchi, T; Mizoguchi, F; Yamasaki, H

    1982-01-01

    One of the principal goals of the Fifth Generation Computer System Project for the coming decade is to develop a methodology for building knowledge information processing systems which will provide people with intelligent agents. The key notion of the fifth generation computer system is knowledge used for problem solving. In this paper the authors describe the plan of R&D on knowledge base mechanisms. A knowledge representation system is to be designed to support knowledge acquisition for the knowledge information processing systems. The system will include a knowledge representation language, a knowledge base editor and a debugger. It is also expected to perform as a kind of meta-inference system. In order to develop large-scale knowledge base systems, a knowledge base mechanism based on the relational model is to be studied in the earlier stage of the project. Distributed problem solving is also one of the main issues of the project. 19 references.

  15. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we propose the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
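
    A minimal sketch of the prototype idea (not the released PVM code): a small set of prototypes yields a Nystrom-style low-rank approximation of the kernel matrix, so graph-based computations touch an n x m matrix instead of the full n x n one. Choosing prototypes by k-means is an assumption here; the paper grounds the choice in the two approximation criteria it states.

    ```python
    # Hedged sketch: prototype-based low-rank kernel approximation.
    import numpy as np
    from scipy.cluster.vq import kmeans2

    def rbf(A, B, gamma=0.5):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 2))        # many points, mostly unlabeled
    P, _ = kmeans2(X, 20, seed=0)         # m = 20 prototype vectors

    Knm = rbf(X, P)                        # n x m cross-kernel
    Kmm = rbf(P, P)                        # m x m prototype kernel
    K_approx = Knm @ np.linalg.solve(Kmm + 1e-8 * np.eye(len(P)), Knm.T)

    K_full = rbf(X, X)
    err = np.linalg.norm(K_approx - K_full) / np.linalg.norm(K_full)
    print(f"relative low-rank approximation error: {err:.3f}")
    ```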

  16. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (broadband, multi-utility, ...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series

  17. Collaborative filtering to improve navigation of large radiology knowledge resources.

    Science.gov (United States)

    Kahn, Charles E

    2005-06-01

    Collaborative filtering is a knowledge-discovery technique that can help guide readers to items of potential interest based on the experience of prior users. This study sought to determine the impact of collaborative filtering on navigation of a large, Web-based radiology knowledge resource. Collaborative filtering was applied to a collection of 1,168 radiology hypertext documents available via the Internet. An item-based collaborative filtering algorithm identified each document's six most closely related documents based on 248,304 page views in an 18-day period. Documents were amended to include links to their related documents, and use was analyzed over the next 5 days. The mean number of documents viewed per visit increased from 1.57 to 1.74 (P ...). Collaborative filtering can increase a radiology information resource's utilization and can improve its usefulness and ease of navigation. The technique holds promise for improving navigation of large Internet-based radiology knowledge resources.
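
    A minimal sketch of item-based collaborative filtering as the study describes it (six related documents derived from page-view logs); the cosine weighting of co-occurrence counts below is a common choice and an assumption, not necessarily the exact algorithm used in the paper.

    ```python
    # Hedged sketch: "related documents" from co-viewing logs.
    import numpy as np

    sessions = [                      # documents viewed together in one visit
        ["chest-xray", "pneumonia", "effusion"],
        ["pneumonia", "effusion", "ct-thorax"],
        ["chest-xray", "ct-thorax"],
    ]
    docs = sorted({d for s in sessions for d in s})
    idx = {d: i for i, d in enumerate(docs)}

    # Item-item co-occurrence counts from the view log.
    C = np.zeros((len(docs), len(docs)))
    for s in sessions:
        for a in s:
            for b in s:
                if a != b:
                    C[idx[a], idx[b]] += 1

    views = C.sum(axis=1) + 1e-12
    sim = C / np.sqrt(np.outer(views, views))   # normalize away popularity

    def related(doc, k=6):
        order = np.argsort(-sim[idx[doc]])
        return [docs[j] for j in order if docs[j] != doc][:k]

    print(related("pneumonia"))
    ```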

  18. Impact of preoperative information on anxiety and disease-related knowledge in women undergoing mastectomy for breast cancer: a randomized clinical trial.

    Science.gov (United States)

    Wysocki, W M; Mituś, J; Komorowski, A L; Karolewski, K

    2012-01-01

    Despite the large number of clinical trials on breast cancer, patient-related factors such as perioperative anxiety and the level of knowledge about the disease and treatment have not been included in mainstream research efforts. This randomized trial was performed to evaluate the impact of information, provided preoperatively, on the anxiety and knowledge of women undergoing mastectomy for breast cancer. Sixty consecutive patients with breast cancer, admitted for mastectomy as primary treatment and with no previous cancer history, were randomized to receive structured information (a short video about practical aspects of the hospital stay and of surgical and adjuvant treatment) in addition to the routine informed consent procedure for surgery, or the routine informed consent only. Anxiety and subjective knowledge levels were measured with visual analogue scales; in addition, knowledge was assessed with a questionnaire. There was no significant effect of the additional information on perioperative anxiety or subjective knowledge. Significantly more patients in the additional information group correctly listed all major available treatment options compared to the patients that received routine information (preoperatively 54% vs. 19%, p = 0.0101; 7 days postoperatively 50% vs. 19%, p = 0.0367). Use of an informational video preoperatively did not significantly affect perioperative anxiety or subjective knowledge. Additional research is needed on effective delivery of disease- and treatment-specific information perioperatively.

  19. Adaptive Texture Synthesis for Large Scale City Modeling

    Science.gov (United States)

    Despine, G.; Colleu, T.

    2015-02-01

    Large scale city models textured with aerial images are well suited for bird-eye navigation but generally the image resolution does not allow pedestrian navigation. One solution to face this problem is to use high resolution terrestrial photos but it requires a huge amount of manual work to remove occlusions. Another solution is to synthesize generic textures with a set of procedural rules and elementary patterns like bricks, roof tiles, doors and windows. This solution may give realistic textures but with no correlation to the ground truth. Instead of using pure procedural modelling we present a method to extract information from aerial images and adapt the texture synthesis to each building. We describe a workflow allowing the user to drive the information extraction and to select the appropriate texture patterns. We also emphasize the importance of organizing the knowledge about elementary patterns in a texture catalogue, which allows attaching physical information and semantic attributes and executing selection requests. Roofs are processed according to the detected building material. Façades are first described in terms of principal colours, then opening positions are detected and some window features are computed. These features allow selecting the most appropriate patterns from the texture catalogue. We experimented with this workflow on two samples with 20 cm and 5 cm resolution images. The roof texture synthesis and opening detection were successfully conducted on hundreds of buildings. The window characterization is still sensitive to the distortions inherent to the projection of aerial images onto the facades.
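
    A minimal sketch of one step of this workflow (not the authors' code): selecting a facade pattern from a texture catalogue by matching a building's principal colours, estimated with k-means over facade pixels, against a colour attribute stored per pattern. The catalogue contents and the matching rule are illustrative assumptions.

    ```python
    # Hedged sketch: catalogue pattern selection by principal colours.
    import numpy as np
    from scipy.cluster.vq import kmeans2

    catalogue = {                      # pattern name -> representative RGB
        "red_brick":   np.array([150,  60,  50], float),
        "sandstone":   np.array([200, 180, 140], float),
        "grey_render": np.array([160, 160, 160], float),
    }

    def principal_colours(facade_pixels, k=3):
        centers, _ = kmeans2(facade_pixels.astype(float), k, seed=0)
        return centers

    def select_pattern(facade_pixels):
        cols = principal_colours(facade_pixels)
        best, best_d = None, np.inf
        for name, ref in catalogue.items():
            d = min(np.linalg.norm(c - ref) for c in cols)  # closest colour
            if d < best_d:
                best, best_d = name, d
        return best

    rng = np.random.default_rng(0)
    pixels = rng.normal([152, 62, 48], 10, size=(5000, 3))  # reddish facade
    print(select_pattern(pixels))  # -> "red_brick"
    ```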

  20. The Problem of Scale in Indigenous Knowledge: a Perspective from Northern Australia

    Directory of Open Access Journals (Sweden)

    Marc Wohling

    2009-06-01

    Over the last decade, indigenous knowledge has been widely touted by researchers and natural resource managers as a valuable contributor to natural resource management and biodiversity conservation. In Australia, the concept of indigenous knowledge has gained such rapid currency that it has tended toward an essentialized and universal truth rather than remaining a diverse range of highly localized and contested knowledge. In this paper, I undertake a critical analysis of some of the current issues around the interpretation and application of indigenous knowledge and its relationship with natural resource management in northern Australia. Through a focus on how indigenous knowledge operates at a range of scales, I argue that indigenous knowledge is not adapted to the scales and kinds of disturbances that contemporary society is exerting on natural systems. Rather than being realistic about the limitations of indigenous knowledge, I argue that nonindigenous interpretations of indigenous knowledge have propelled us toward reified meanings, abstracted concepts, and an information-based taxonomy of place. The result can be the diminishing and ossifying of a dynamic living practice and the failure to recognize expressions of indigeneity in contemporary forms.

  1. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

      The subject of this paper is long-term large-scale changes in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed, which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, th...

  2. Knowledge diffusion within a large conservation organization and beyond

    Science.gov (United States)

    Montambault, Jensen; Burford, Kyle P.; Gopalakrishna, Trisha; Masuda, Yuta J.; Reddy, Sheila M. W.; Torphy, Kaitlin; Salcedo, Andrea I.

    2018-01-01

    The spread and uptake of new ideas (diffusion of innovations) is critical for organizations to adapt over time, but there is little evidence of how this happens within organizations and to their broader community. To address this, we analyzed how individuals accessed information about a recent science innovation at a large, international, biodiversity conservation non-profit–The Nature Conservancy–and then traced the flow of how this information was shared within the organization and externally, drawing on an exceptionally data-rich environment. We used surveys and tracking of individual internet activity to understand mechanisms for early-stage diffusion (knowledge seeking and sharing) following the integration of social science and evidence principles into the institutional planning framework: Conservation by Design (CbD 2.0). Communications sent to all employees effectively catalyzed 56.4% to exhibit knowledge seeking behavior, measured by individual downloads from and visits to a restricted-access site. Individuals who self-reported through a survey that they shared information about CbD 2.0 internally were more likely to have both received and sought out information about the framework. Such individuals tended to hold positions within a higher job grade, were more likely to train others on CbD as part of their job, and to enroll in other online professional development offerings. Communication strategies targeting external audiences did not appear to influence information seeking behavior. Staff who engaged in internal knowledge sharing and adopting “evidence” practices from CbD 2.0 were more likely to have shared the document externally. We found a negative correlation with external sharing behavior and in-person trainings. Our findings suggest repeated, direct email communications aimed at wide audiences can effectively promote diffusion of new ideas. We also found a wide range of employee characteristics and circumstances to be associated with

  3. Knowledge diffusion within a large conservation organization and beyond.

    Science.gov (United States)

    Fisher, Jonathan R B; Montambault, Jensen; Burford, Kyle P; Gopalakrishna, Trisha; Masuda, Yuta J; Reddy, Sheila M W; Torphy, Kaitlin; Salcedo, Andrea I

    2018-01-01

    The spread and uptake of new ideas (diffusion of innovations) is critical for organizations to adapt over time, but there is little evidence of how this happens within organizations and to their broader community. To address this, we analyzed how individuals accessed information about a recent science innovation at a large, international, biodiversity conservation non-profit, The Nature Conservancy, and then traced the flow of how this information was shared within the organization and externally, drawing on an exceptionally data-rich environment. We used surveys and tracking of individual internet activity to understand mechanisms for early-stage diffusion (knowledge seeking and sharing) following the integration of social science and evidence principles into the institutional planning framework: Conservation by Design (CbD 2.0). Communications sent to all employees effectively catalyzed 56.4% to exhibit knowledge seeking behavior, measured by individual downloads from and visits to a restricted-access site. Individuals who self-reported through a survey that they shared information about CbD 2.0 internally were more likely to have both received and sought out information about the framework. Such individuals tended to hold positions within a higher job grade, were more likely to train others on CbD as part of their job, and to enroll in other online professional development offerings. Communication strategies targeting external audiences did not appear to influence information seeking behavior. Staff who engaged in internal knowledge sharing and adopting "evidence" practices from CbD 2.0 were more likely to have shared the document externally. We found a negative correlation with external sharing behavior and in-person trainings. Our findings suggest repeated, direct email communications aimed at wide audiences can effectively promote diffusion of new ideas. We also found a wide range of employee characteristics and circumstances to be associated with knowledge

  4. Information Impact: Journal of Information and Knowledge ...

    African Journals Online (AJOL)

    Information Impact: Journal of Information and Knowledge management

    Five (5) research questions guided the study and three hypotheses were tested at 0.05 level of ... key factor of knowledge management is knowledge sharing. ... difference has been recorded in terms of skills acquisition, information literacy, classroom ... However, little is known about knowledge-sharing strategies and their ...

  5. A Combined Eulerian-Lagrangian Data Representation for Large-Scale Applications.

    Science.gov (United States)

    Sauer, Franz; Xie, Jinrong; Ma, Kwan-Liu

    2017-10-01

    The Eulerian and Lagrangian reference frames each provide a unique perspective when studying and visualizing results from scientific systems. As a result, many large-scale simulations produce data in both formats, and analysis tasks that simultaneously utilize information from both representations are becoming increasingly popular. However, due to their fundamentally different nature, drawing correlations between these data formats is a computationally difficult task, especially in a large-scale setting. In this work, we present a new data representation which combines both reference frames into a joint Eulerian-Lagrangian format. By reorganizing Lagrangian information according to the Eulerian simulation grid into a "unit cell" based approach, we can provide an efficient out-of-core means of sampling, querying, and operating with both representations simultaneously. We also extend this design to generate multi-resolution subsets of the full data to suit the viewer's needs and provide a fast flow-aware trajectory construction scheme. We demonstrate the effectiveness of our method using three large-scale real world scientific datasets and provide insight into the types of performance gains that can be achieved.
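
    A minimal sketch of the combined representation as described (not the paper's file format): Lagrangian particles are bucketed by the Eulerian cell containing them, so a query on one "unit cell" returns both the field value and the resident particles without scanning the whole trajectory set.

    ```python
    # Hedged sketch: joint Eulerian-Lagrangian "unit cell" buckets.
    import numpy as np
    from collections import defaultdict

    rng = np.random.default_rng(0)
    grid_shape, extent = (8, 8, 8), 1.0
    field = rng.normal(size=grid_shape)                  # Eulerian scalar field
    particles = rng.uniform(0, extent, size=(10000, 3))  # Lagrangian positions

    def cell_of(p):
        ijk = (p / extent * np.array(grid_shape)).astype(int)
        return tuple(np.minimum(ijk, np.array(grid_shape) - 1))

    cells = defaultdict(list)                            # unit-cell buckets
    for pid, p in enumerate(particles):
        cells[cell_of(p)].append(pid)

    def query(cell):
        """Return the Eulerian value and Lagrangian members of one cell."""
        return field[cell], cells.get(cell, [])

    value, members = query((3, 3, 3))
    print(value, len(members))
    ```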

  6. Information Impact | Journal of Information and Knowledge ...

    African Journals Online (AJOL)

    Information Impact | Journal of Information and Knowledge Management

    Information Impact | Journal of Information and Knowledge Management. 77 ... Libraries all over the world are applying information and communication ..... Bank Branch Performance, The International Journal on Advances in ICT for Emerging.

  7. Large-scale retrieval for medical image analytics: A comprehensive review.

    Science.gov (United States)

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics was greatly facilitated by the explosion of digital imaging techniques, where huge amounts of medical images were produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at a large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
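
    A minimal sketch of the pipeline stages named in the review (feature representation, indexing, searching), using random projections as stand-in features and a brute-force cosine index; production systems would use learned features and approximate nearest-neighbor indexes instead.

    ```python
    # Hedged sketch: feature extraction -> index -> similarity search.
    import numpy as np

    rng = np.random.default_rng(0)
    images = rng.normal(size=(5000, 64 * 64))        # flattened "images"
    W = rng.normal(size=(64 * 64, 128)) / 64         # random-projection features

    feats = images @ W
    feats /= np.linalg.norm(feats, axis=1, keepdims=True)  # cosine-ready index

    def search(query_img, k=5):
        q = query_img @ W
        q /= np.linalg.norm(q)
        scores = feats @ q                            # brute-force cosine scan
        return np.argsort(-scores)[:k]

    print(search(images[42]))                         # top hit should be 42
    ```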

  8. EvArnoldi: A New Algorithm for Large-Scale Eigenvalue Problems.

    Science.gov (United States)

    Tal-Ezer, Hillel

    2016-05-19

    Eigenvalues and eigenvectors are an essential theme in numerical linear algebra. Their study is mainly motivated by their high importance in a wide range of applications. Knowledge of eigenvalues is essential in quantum molecular science. Solutions of the Schrödinger equation for the electrons composing the molecule are the basis of electronic structure theory. Electronic eigenvalues compose the potential energy surfaces for nuclear motion. The eigenvectors allow calculation of dipole transition matrix elements, the core of spectroscopy. The vibrational dynamics of the molecule also requires knowledge of the eigenvalues of the vibrational Hamiltonian. Typically in these problems, the dimension of Hilbert space is huge. Practically, only a small subset of eigenvalues is required. In this paper, we present a highly efficient algorithm, named EvArnoldi, for solving the large-scale eigenvalue problem. The algorithm, in its basic formulation, is mathematically equivalent to ARPACK (Sorensen, D. C. Implicitly Restarted Arnoldi/Lanczos Methods for Large Scale Eigenvalue Calculations; Springer, 1997; Lehoucq, R. B.; Sorensen, D. C. SIAM Journal on Matrix Analysis and Applications 1996, 17, 789; Calvetti, D.; Reichel, L.; Sorensen, D. C. Electronic Transactions on Numerical Analysis 1994, 2, 21) (or Eigs of Matlab) but significantly simpler.
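
    For orientation, a minimal sketch of a plain (unrestarted) Arnoldi iteration, the common building block behind ARPACK-style solvers; the EvArnoldi scheme itself differs in its restarting details, which are not reproduced here.

    ```python
    # Hedged sketch: unrestarted Arnoldi; Ritz values approximate extreme
    # eigenvalues of A from a small Hessenberg matrix H.
    import numpy as np

    def arnoldi(A, v0, m):
        """Return a Krylov basis V and the square Hessenberg matrix H."""
        n = len(v0)
        V = np.zeros((n, m + 1))
        H = np.zeros((m + 1, m))
        V[:, 0] = v0 / np.linalg.norm(v0)
        for j in range(m):
            w = A @ V[:, j]
            for i in range(j + 1):              # modified Gram-Schmidt
                H[i, j] = V[:, i] @ w
                w = w - H[i, j] * V[:, i]
            H[j + 1, j] = np.linalg.norm(w)
            if H[j + 1, j] < 1e-12:             # invariant subspace found early
                return V[:, : j + 1], H[: j + 1, : j + 1]
            V[:, j + 1] = w / H[j + 1, j]
        return V[:, :m], H[:m, :]

    rng = np.random.default_rng(0)
    A = rng.normal(size=(500, 500))
    A = (A + A.T) / 2                            # symmetric test matrix
    V, H = arnoldi(A, rng.normal(size=500), m=60)
    ritz = np.sort(np.linalg.eigvals(H).real)
    print("largest Ritz values: ", ritz[-3:])
    print("largest exact values:", np.sort(np.linalg.eigvalsh(A))[-3:])
    ```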

  9. New Visions for Large Scale Networks: Research and Applications

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — This paper documents the findings of the March 12-14, 2001 Workshop on New Visions for Large-Scale Networks: Research and Applications. The workshops objectives were...

  10. Challenges in Managing Trustworthy Large-scale Digital Science

    Science.gov (United States)

    Evans, B. J. K.

    2017-12-01

    The increased use of large-scale international digital science has opened a number of challenges for managing, handling, using and preserving scientific information. The large volumes of information are driven by three main categories: model outputs, including coupled models and ensembles; data products that have been processed to a level of usability; and increasingly heuristically driven data analysis. These data products are increasingly the ones usable by the broad communities, far in excess of the raw instrument data outputs. The data, software and workflows are then shared and replicated to allow broad use at an international scale, which places further demands on infrastructure to support reliable information management across distributed resources. Users necessarily rely on these underlying "black boxes" in order to productively generate new scientific outcomes. The software for these systems depends on computational infrastructure, interconnected software systems, and information capture systems, ranging from the fundamental reliability of the compute hardware, through system software stacks and libraries, to the model software itself. Due to these complexities and the capacity of the infrastructure, there is an increased emphasis on transparency of the approach and robustness of the methods over full reproducibility. Furthermore, with large-volume data management it is increasingly difficult to store the historical versions of all model and derived data. Instead, the emphasis is on the ability to access the updated products, and on the reliability with which previous outcomes remain relevant and can be updated for new information. We will discuss these challenges and some of the approaches underway to address these issues.

  11. Environment and host as large-scale controls of ectomycorrhizal fungi.

    Science.gov (United States)

    van der Linde, Sietse; Suz, Laura M; Orme, C David L; Cox, Filipa; Andreae, Henning; Asi, Endla; Atkinson, Bonnie; Benham, Sue; Carroll, Christopher; Cools, Nathalie; De Vos, Bruno; Dietrich, Hans-Peter; Eichhorn, Johannes; Gehrmann, Joachim; Grebenc, Tine; Gweon, Hyun S; Hansen, Karin; Jacob, Frank; Kristöfel, Ferdinand; Lech, Paweł; Manninger, Miklós; Martin, Jan; Meesenburg, Henning; Merilä, Päivi; Nicolas, Manuel; Pavlenda, Pavel; Rautio, Pasi; Schaub, Marcus; Schröck, Hans-Werner; Seidling, Walter; Šrámek, Vít; Thimonier, Anne; Thomsen, Iben Margrete; Titeux, Hugues; Vanguelova, Elena; Verstraeten, Arne; Vesterdal, Lars; Waldner, Peter; Wijk, Sture; Zhang, Yuxin; Žlindra, Daniel; Bidartondo, Martin I

    2018-06-06

    Explaining the large-scale diversity of soil organisms that drive biogeochemical processes, and their responses to environmental change, is critical. However, identifying consistent drivers of belowground diversity and abundance for some soil organisms at large spatial scales remains problematic. Here we investigate a major guild, the ectomycorrhizal fungi, across European forests at a spatial scale and resolution that is, to our knowledge, unprecedented, to explore key biotic and abiotic predictors of ectomycorrhizal diversity and to identify dominant responses and thresholds for change across complex environmental gradients. We show the effect of 38 host, environment, climate and geographical variables on ectomycorrhizal diversity, and define thresholds of community change for key variables. We quantify host specificity and reveal plasticity in functional traits involved in soil foraging across gradients. We conclude that environmental and host factors explain most of the variation in ectomycorrhizal diversity, that the environmental thresholds used as major ecosystem assessment tools need adjustment and that the importance of belowground specificity and plasticity has previously been underappreciated.

  12. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting have an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy, and production, policy consulting coordinates science, policy and the political sphere. In this position, large-scale research and policy consulting lack the institutional guarantees and rational background that are characteristic of their sociological environment. Large-scale research can neither deal with the production of innovative goods under consideration of profitability, nor can it hope for full recognition by the basis-oriented scientific community. Policy consulting neither holds the political system's competence to make decisions, nor can it judge successfully by the critical standards of established social science, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, in three respects, the thesis that this is a new form of institutionalization of science: (1) external control, (2) the form of organization, and (3) the theoretical conception of large-scale research and policy consulting. (orig.) [de

  13. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  14. ADAPTIVE TEXTURE SYNTHESIS FOR LARGE SCALE CITY MODELING

    Directory of Open Access Journals (Sweden)

    G. Despine

    2015-02-01

    Large scale city models textured with aerial images are well suited for bird-eye navigation but generally the image resolution does not allow pedestrian navigation. One solution to face this problem is to use high resolution terrestrial photos but it requires a huge amount of manual work to remove occlusions. Another solution is to synthesize generic textures with a set of procedural rules and elementary patterns like bricks, roof tiles, doors and windows. This solution may give realistic textures but with no correlation to the ground truth. Instead of using pure procedural modelling we present a method to extract information from aerial images and adapt the texture synthesis to each building. We describe a workflow allowing the user to drive the information extraction and to select the appropriate texture patterns. We also emphasize the importance of organizing the knowledge about elementary patterns in a texture catalogue, which allows attaching physical information and semantic attributes and executing selection requests. Roofs are processed according to the detected building material. Façades are first described in terms of principal colours, then opening positions are detected and some window features are computed. These features allow selecting the most appropriate patterns from the texture catalogue. We experimented with this workflow on two samples with 20 cm and 5 cm resolution images. The roof texture synthesis and opening detection were successfully conducted on hundreds of buildings. The window characterization is still sensitive to the distortions inherent to the projection of aerial images onto the facades.

  15. Expectation propagation for large scale Bayesian inference of non-linear molecular networks from perturbation data.

    Science.gov (United States)

    Narimani, Zahra; Beigy, Hamid; Ahmad, Ashar; Masoudi-Nejad, Ali; Fröhlich, Holger

    2017-01-01

    Inferring the structure of molecular networks from time series protein or gene expression data provides valuable information about the complex biological processes of the cell. Causal network structure inference has been approached using different methods in the past. Most causal network inference techniques, such as Dynamic Bayesian Networks and ordinary differential equations, are limited by their computational complexity and thus make large scale inference infeasible. This is specifically true if a Bayesian framework is applied in order to deal with the unavoidable uncertainty about the correct model. We devise a novel Bayesian network reverse engineering approach using ordinary differential equations with the ability to include non-linearity. Besides modeling arbitrary, possibly combinatorial and time dependent perturbations with unknown targets, one of our main contributions is the use of Expectation Propagation, an algorithm for approximate Bayesian inference over large scale network structures in short computation time. We further explore the possibility of integrating prior knowledge into network inference. We evaluate the proposed model on DREAM4 and DREAM8 data and find it competitive against several state-of-the-art existing network inference methods.

  16. Large scale tracking of stem cells using sparse coding and coupled graphs

    DEFF Research Database (Denmark)

    Vestergaard, Jacob Schack; Dahl, Anders Lindbjerg; Holm, Peter

    Stem cell tracking is an inherently large scale problem. The challenge is to identify and track hundreds or thousands of cells over a time period of several weeks. This requires robust methods that can leverage the knowledge of specialists on the field. The tracking pipeline presented here consists...

  17. Large-Scale medical image analytics: Recent methodologies, applications and Future directions.

    Science.gov (United States)

    Zhang, Shaoting; Metaxas, Dimitris

    2016-10-01

    Despite the ever-increasing amount and complexity of annotated medical image data, the development of large-scale medical image analysis algorithms has not kept pace with the need for methods that bridge the semantic gap between images and diagnoses. The goal of this position paper is to discuss and explore innovative and large-scale data science techniques in medical image analytics, which will benefit clinical decision-making and facilitate efficient medical data management. Particularly, we advocate that the scale of image retrieval systems should be significantly increased, to a point at which interactive systems can be effective for knowledge discovery in potentially large databases of medical images. For clinical relevance, such systems should return results in real-time, incorporate expert feedback, and be able to cope with the size, quality, and variety of the medical images and their associated metadata for a particular domain. The design, development, and testing of such a framework can significantly impact interactive mining in medical image databases that are growing rapidly in size and complexity, and enable novel methods of analysis at much larger scales in an efficient, integrated fashion. Copyright © 2016. Published by Elsevier B.V.

  18. Nuclear knowledge and information management in Croatia

    International Nuclear Information System (INIS)

    Pleslic, S.; Novosel, N.

    2004-01-01

    Since the IAEA was authorized to exchange technical and scientific information on peaceful uses of atomic energy, it established INIS in 1970 as an international bibliographic database in the nuclear field and in nuclear-related areas. All Member States, which are at different levels of technological development, can derive benefits from INIS output products and get support from the IAEA in systematic knowledge preservation and information exchange. The intention is to transfer practical experience to the younger generation and to archive important information. Croatia has been successfully involved in knowledge and information management activities since 1994, when it joined INIS. The accumulation of knowledge, including technical information in databases and documents and the knowledge of scientists, engineers, researchers and technicians, is the basis for the use of nuclear technology. Nuclear knowledge and information exchange are important for the process of decision-making. Thanks to the development and application of new information technologies within the INIS information management framework, Members improve the collection, production and dissemination of nuclear knowledge and information. (author)

  19. Report of the Workshop on Petascale Systems Integration for LargeScale Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, William T.C.; Walter, Howard; New, Gary; Engle, Tom; Pennington, Rob; Comes, Brad; Bland, Buddy; Tomlison, Bob; Kasdorf, Jim; Skinner, David; Regimbal, Kevin

    2007-10-01

    There are significant issues regarding large-scale system integration that are not being addressed in other forums, such as current research portfolios or vendor user groups. Unfortunately, the issues in the area of large-scale system integration often fall into a netherworld: not research, not facilities, not procurement, not operations, not user services. Taken together, these issues, along with the impact of sub-optimal integration technology, mean the time required to deploy, integrate and stabilize a large-scale system may consume up to 20 percent of the useful life of such systems. Improving the state of the art for large-scale systems integration has the potential to increase the scientific productivity of these systems. Sites have significant expertise, but there are no easy ways to leverage this expertise among them. Many issues inhibit the sharing of information, including available time and effort, as well as issues with sharing proprietary information. Vendors also benefit in the long run from the solutions to issues detected during site testing and integration. There is a great deal of enthusiasm for making large-scale system integration a full-fledged partner along with the other major thrusts supported by funding agencies in the definition, design, and use of petascale systems. Integration technology and issues should have a full 'seat at the table' as petascale and exascale initiatives and programs are planned. The workshop attendees identified a wide range of issues and suggested paths forward. Pursuing these with funding opportunities and innovation offers the opportunity to dramatically improve the state of large-scale system integration.

  20. Large-scale chromosome folding versus genomic DNA sequences: A discrete double Fourier transform technique.

    Science.gov (United States)

    Chechetkin, V R; Lobzin, V V

    2017-08-07

    The use of state-of-the-art techniques combining imaging methods and high-throughput genomic mapping tools has led to significant progress in detailing the chromosome architecture of various organisms. However, a gap still remains between the rapidly growing structural data on chromosome folding and the large-scale genome organization. Could part of the information on chromosome folding be obtained directly from the underlying genomic DNA sequences abundantly stored in the databanks? To answer this question, we developed an original discrete double Fourier transform (DDFT). DDFT serves for the detection of large-scale genome regularities associated with domains/units at the different levels of hierarchical chromosome folding. The method is versatile and can be applied to both genomic DNA sequences and corresponding physico-chemical parameters such as base-pairing free energy. The latter characteristic is closely related to replication and transcription and can also be used for the assessment of temperature or supercoiling effects on chromosome folding. We tested the method on the genome of E. coli K-12 and found good correspondence with the annotated domains/units established experimentally. As a brief illustration of the further abilities of DDFT, a study of the large-scale genome organization of bacteriophage PHIX174 and the bacterium Caulobacter crescentus was also added. The combined experimental, modeling, and bioinformatic DDFT analysis should yield more complete knowledge of chromosome architecture and genome organization. Copyright © 2017 Elsevier Ltd. All rights reserved.
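
    A heavily hedged sketch of one plausible reading of a "double" Fourier transform (the paper's exact DDFT definition is not reproduced here): a second DFT applied to the magnitude spectrum of a numerically encoded sequence, which turns regularly spaced spectral peaks, as a hierarchy of domains/units would produce, into a single strong second-order component. The purine/pyrimidine +/-1 encoding is also an assumption.

    ```python
    # Hedged sketch: second-order spectrum of an encoded DNA sequence.
    import numpy as np

    rng = np.random.default_rng(0)
    seq = "".join(rng.choice(list("ACGT"), size=4096))
    x = np.array([1.0 if b in "AG" else -1.0 for b in seq])  # purine = +1

    spectrum = np.abs(np.fft.rfft(x - x.mean()))
    double = np.abs(np.fft.rfft(spectrum - spectrum.mean()))

    # Peaks in `double` flag periodic spacings *between* spectral harmonics,
    # i.e. candidate large-scale domain/unit sizes.
    k = np.argmax(double[1:]) + 1
    print("dominant second-order component index:", k)
    ```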

  1. Direction of information flow in large-scale resting-state networks is frequency-dependent

    NARCIS (Netherlands)

    Hillebrand, Arjan; Tewarie, Prejaas; Van Dellen, Edwin; Yu, Meichen; Carbo, Ellen W S; Douw, Linda; Gouw, Alida A.; Van Straaten, Elisabeth C W; Stam, Cornelis J.

    2016-01-01

    Normal brain function requires interactions between spatially separated, and functionally specialized, macroscopic regions, yet the directionality of these interactions in large-scale functional networks is unknown. Magnetoencephalography was used to determine the directionality of these

  2. An industrial perspective on bioreactor scale-down: what we can learn from combined large-scale bioprocess and model fluid studies.

    Science.gov (United States)

    Noorman, Henk

    2011-08-01

    For industrial bioreactor design, operation, control and optimization, the scale-down approach is often advocated to efficiently generate data on a small scale, and effectively apply suggested improvements to the industrial scale. In all cases it is important to ensure that the scale-down conditions are representative of the real large-scale bioprocess. Progress is hampered by limited detailed and local information from large-scale bioprocesses. Complementary to real fermentation studies, physical aspects of model fluids such as air-water in large bioreactors provide useful information with limited effort and cost. Still, in industrial practice, investments of time, capital and resources often prohibit systematic work, although, in the end, savings obtained in this way are trivial compared to the expenses that result from real process disturbances, batch failures, and non-flyers with loss of business opportunity. Here we try to highlight what can be learned from real large-scale bioprocesses in combination with model fluid studies, and to provide suitable computation tools to overcome data restrictions. Focus is on a specific well-documented case for a 30 m³ bioreactor. Areas for further research from an industrial perspective are also indicated. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Report of the LASCAR forum: Large scale reprocessing plant safeguards

    International Nuclear Information System (INIS)

    1992-01-01

    This report has been prepared to provide information on the studies which were carried out from 1988 to 1992 under the auspices of the multinational forum known as Large Scale Reprocessing Plant Safeguards (LASCAR) on safeguards for four large scale reprocessing plants operated or planned to be operated in the 1990s. The report summarizes all of the essential results of these studies. The participants in LASCAR were from France, Germany, Japan, the United Kingdom, the United States of America, the Commission of the European Communities - Euratom, and the International Atomic Energy Agency

  4. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    The problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary
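
    A minimal sketch of the decomposition idea (not the paper's controller): dual/price coordination in which an aggregator adjusts a price signal and each unit solves a small local problem, so that the total power converges to a target. Quadratic local costs and the step size are assumptions.

    ```python
    # Hedged sketch: aggregator-coordinated dual decomposition.
    import numpy as np

    n_units, target = 5, 10.0
    a = np.array([1.0, 2.0, 0.5, 1.5, 3.0])   # local cost: a_i/2 * p_i^2

    def local_step(price):
        # Each unit minimizes a_i/2 * p_i^2 - price * p_i  =>  p_i = price / a_i
        return price / a

    price = 0.0
    for _ in range(200):                       # aggregator's dual ascent
        p = local_step(price)
        price += 0.1 * (target - p.sum())      # raise price if under-producing

    print(p.round(3), "sum =", round(p.sum(), 3))  # sums to ~target
    ```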

  5. Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan

    Science.gov (United States)

    Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun

    2017-04-01

    Typhoon Morakot's severe strike on southern Taiwan awakened public awareness of large-scale landslide disasters. Large-scale landslide disasters produce large quantities of sediment that have negative effects on the operating functions of reservoirs. In order to reduce the risk of these disasters within the study area, the establishment of a database for hazard mitigation and disaster prevention is necessary. Real-time data and numerous archives of engineering data, environmental information, photos, and videos will not only help people make appropriate decisions, but also raise the major challenge of how to process them and add value. The study tried to define some basic data formats/standards for the various types of data collected about these reservoirs and then provide a management platform based on these formats/standards. Meanwhile, in order to ensure practicality and convenience, the large-scale landslide disaster database system was built with the ability both to provide and to receive information, so that users can work with it on different types of devices. IT progresses extremely quickly, and the most modern system might be out of date at any time. In order to provide long-term service, the system reserves the possibility of user-defined data formats/standards and user-defined system structures. The system established by this study is based on the HTML5 standard and uses responsive web design, so that users can easily operate and extend this large-scale landslide disaster database system.

  6. Information Impact: Journal of Information and Knowledge ...

    African Journals Online (AJOL)

    Information Impact: Journal of Information and Knowledge Management

    Information Impact: Journal of Information and Knowledge Management. 55. Adeola Esther Olutoki ... The main instruments used for data collection were research ... collections in the library and enhance academic performance. ... libraries; they contain the latest information on research findings which are of great importance.

  7. Evaluation of the stigma towards people with a diagnosis of Schizophrenia using a Knowledge Scale.

    Science.gov (United States)

    Grandón, Pamela; Aguilera, Alexis Vielma; Bustos, Claudio; Alzate, Elvis Castro; Saldivia, Sandra

    Social stigma is the assigning of negative stereotypes to people with schizophrenia. Different measurement tools have been used to evaluate it, including knowledge scales. The aim of this study was to evaluate public stigma by measuring this knowledge and to relate the degree of information to variables that have been shown to influence the stigma experienced by the affected population. The sample was composed of 399 people, and the inclusion criterion was being between 18 and 65 years of age. The "Questionnaire of knowledge on schizophrenia" was applied, as well as a questionnaire to collect sociodemographic information. Participants were recruited in places with large crowds. The following analyses were performed: multiple correlations, non-parametric bivariate tests and hierarchical clusters. The questionnaire had two dimensions: "Beliefs on the knowledge of schizophrenia" and "Attitudes towards schizophrenia". There are significant differences between these dimensions and in relation to contact with people with severe mental illness. The cluster analysis distinguished two groups according to the combination of the two dimensions of the tool. It is noteworthy that neither of the dimensions measures true knowledge, and that the questionnaire has an attitudinal dimension. More than contact itself, it is the type of interaction that is a relevant variable for the level of stigma, which questions the traditional contact hypothesis. Further research is required on the characteristics of the tool and on the aspects of contact associated with a lower level of stigma in the population. Copyright © 2017 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.

  8. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 19: Computer and information technology and aerospace knowledge diffusion

    Science.gov (United States)

    Pinelli, Thomas E.; Kennedy, John M.; Barclay, Rebecca O.; Bishop, Ann P.

    1992-01-01

    To remain a world leader in aerospace, the US must improve and maintain the professional competency of its engineers and scientists, increase the research and development (R&D) knowledge base, improve productivity, and maximize the integration of recent technological developments into the R&D process. How well these objectives are met, and at what cost, depends on a variety of factors, but largely on the ability of US aerospace engineers and scientists to acquire and process the results of federally funded R&D. The Federal Government's commitment to high speed computing and networking systems presupposes that computer and information technology will play a major role in the aerospace knowledge diffusion process. However, we know little about information technology needs, uses, and problems within the aerospace knowledge diffusion process. The use of computer and information technology by US aerospace engineers and scientists in academia, government, and industry is reported.

  9. Risk Management Challenges in Large-scale Energy PSS

    DEFF Research Database (Denmark)

    Tegeltija, Miroslava; Oehmen, Josef; Kozin, Igor

    2017-01-01

    Probabilistic risk management approaches have a long tradition in engineering. A large variety of tools and techniques based on the probabilistic view of risk is available and applied in PSS practice. However, uncertainties that arise due to lack of knowledge and information are still missing ... data and representation of the results to the decision makers play an important role. Second, we introduce a selection of alternative, so-called "post-probabilistic", risk management methods developed across different scientific fields to cope with uncertainty due to lack of knowledge. Possibilities for overcoming industrial PSS risk management challenges are suggested through the application of post-probabilistic methods. We conclude with a discussion of the importance for the field to consider their application.

  10. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  11. Large-scale event extraction from literature with multi-level gene normalization.

    Directory of Open Access Journals (Sweden)

    Sofie Van Landeghem

    Text mining for the life sciences aims to aid database curation, knowledge summarization and information retrieval through the automated processing of biomedical texts. To provide comprehensive coverage and enable full integration with existing biomolecular database records, it is crucial that text mining tools scale up to millions of articles and that their analyses can be unambiguously linked to information recorded in resources such as UniProt, KEGG, BioGRID and NCBI databases. In this study, we investigate how fully automated text mining of complex biomolecular events can be augmented with a normalization strategy that identifies biological concepts in text, mapping them to identifiers at varying levels of granularity, ranging from canonicalized symbols to unique genes and proteins and broad gene families. To this end, we have combined two state-of-the-art text mining components, previously evaluated on two community-wide challenges, and have extended and improved upon these methods by exploiting their complementary nature. Using these systems, we perform normalization and event extraction to create a large-scale resource that is publicly available, unique in semantic scope, and covers all 21.9 million PubMed abstracts and 460 thousand PubMed Central open access full-text articles. This dataset contains 40 million biomolecular events involving 76 million gene/protein mentions, linked to 122 thousand distinct genes from 5032 species across the full taxonomic tree. Detailed evaluations and analyses reveal promising results for application of this data in database and pathway curation efforts. The main software components used in this study are released under an open-source license. Further, the resulting dataset is freely accessible through a novel API, providing programmatic and customized access (http://www.evexdb.org/api/v001/). Finally, to allow for large-scale bioinformatic analyses, the entire resource is available for bulk download from

  12. Engineering large-scale agent-based systems with consensus

    Science.gov (United States)

    Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.

    1994-01-01

    The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge based agents (KBA) which engage in a collaborative problem solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.

  13. Patterns of Knowledge Sharing and Knowledge Creation in New Information Environments

    DEFF Research Database (Denmark)

    Nielsen, Jørgen Lerche; Meyer, Kirsten

    2006-01-01

    Do the knowledge sharing and creation processes in collaborating groups benefit from the use of new information environments, or are the environments rather inhibitive to the development of these processes? A number of different studies have shown quite varied results when it comes to appraising the importance and value of using new information technology in knowledge sharing and creation processes. In this paper we will try to unveil the patterns appearing in the use of new information environments and the users' understanding of the significance of using information technology in knowledge sharing and creation processes. The aim is to obtain a deeper comprehension of which factors determine whether the use of information technology becomes a success or a failure in relation to knowledge sharing and creation. The paper is based on three previous studies investigating the use of information technology …

  14. Information literacy and personal knowledge management

    DEFF Research Database (Denmark)

    Schreiber, Trine; Harbo, Karen

    2004-01-01

    The aim of the paper is to discuss a new subject called personal knowledge management and to compare it with the better-known concept of information literacy. Firstly, the paper describes and discusses the course called personal knowledge management. People from three institutions, the Library … The course teaches the participants partly how to manage information in such a way that it supports a learning process, and partly how to negotiate with the colleagues about the information needs, locate the information, and mediate it in such a way that the colleagues will use it. At the end of the course the participants construct a 'knowledge map', which constitutes the mediation of the information to the workplace. The course has been received very positively. Secondly, the paper compares the course of personal knowledge management with the concept of information literacy. There exist a number of different definitions of the last …

  15. Knowledge Management and Global Information Dissemination

    Science.gov (United States)

    Umunadi, Ejiwoke Kennedy

    2014-01-01

    The paper looked at knowledge management and global information dissemination. Knowledge is a very powerful tool for survival, growth and development. It can be seen as the information, understanding and skills that you gain through education or experience. The paper was addressed under the following sub-headings: Knowledge management knowledge…

  16. Integrating Traditional Ecological Knowledge and Ecological Science: a Question of Scale

    Directory of Open Access Journals (Sweden)

    Catherine A. Gagnon

    2009-12-01

    Full Text Available The benefits and challenges of integrating traditional ecological knowledge and scientific knowledge have led to extensive discussions over the past decades, but much work is still needed to facilitate the articulation and co-application of these two types of knowledge. Through two case studies, we examined the integration of traditional ecological knowledge and scientific knowledge by emphasizing their complementarity across spatial and temporal scales. We expected that combining Inuit traditional ecological knowledge and scientific knowledge would expand the spatial and temporal scales of currently documented knowledge on the arctic fox (Vulpes lagopus) and the greater snow goose (Chen caerulescens atlantica), two important tundra species. Using participatory approaches in Mittimatalik (also known as Pond Inlet), Nunavut, Canada, we documented traditional ecological knowledge about these species and found that, in fact, it did expand the spatial and temporal scales of current scientific knowledge for local arctic fox ecology. However, the benefits were not as apparent for snow goose ecology, probably because of the similar spatial and temporal observational scales of the two types of knowledge for this species. Comparing sources of knowledge at similar scales allowed us to gain confidence in our conclusions and to identify areas of disagreement that should be studied further. Emphasizing complementarities across scales was more powerful for generating new insights and hypotheses. We conclude that determining the scales of the observations that form the basis for traditional ecological knowledge and scientific knowledge represents a critical step when evaluating the benefits of integrating these two types of knowledge. This is also critical when examining the congruence or contrast between the two types of knowledge for a given subject.

  17. GAIA: A WINDOW TO LARGE-SCALE MOTIONS

    Energy Technology Data Exchange (ETDEWEB)

    Nusser, Adi [Physics Department and the Asher Space Science Institute-Technion, Haifa 32000 (Israel); Branchini, Enzo [Department of Physics, Universita Roma Tre, Via della Vasca Navale 84, 00146 Rome (Italy); Davis, Marc, E-mail: adi@physics.technion.ac.il, E-mail: branchin@fis.uniroma3.it, E-mail: mdavis@berkeley.edu [Departments of Astronomy and Physics, University of California, Berkeley, CA 94720 (United States)

    2012-08-10

    Using redshifts as a proxy for galaxy distances, estimates of the two-dimensional (2D) transverse peculiar velocities of distant galaxies could be obtained from future measurements of proper motions. We provide the mathematical framework for analyzing 2D transverse motions and show that they offer several advantages over traditional probes of large-scale motions. They are completely independent of any intrinsic relations between galaxy properties; hence, they are essentially free of selection biases. They are free from homogeneous and inhomogeneous Malmquist biases that typically plague distance indicator catalogs. They provide additional information to traditional probes that yield line-of-sight peculiar velocities only. Further, because of their 2D nature, fundamental questions regarding vorticity of large-scale flows can be addressed. Gaia, for example, is expected to provide proper motions of at least bright galaxies with high central surface brightness, making proper motions a likely contender for traditional probes based on current and future distance indicator measurements.
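
    The transverse velocities discussed above follow from the standard kinematic conversion between proper motion and physical speed. A minimal sketch of that conversion (the function name and example numbers are illustrative, not from the paper):

```python
# Transverse velocity from proper motion and distance, using the standard
# conversion: 1 arcsec/yr at a distance of 1 pc corresponds to 4.74 km/s.
def transverse_velocity_km_s(mu_arcsec_per_yr: float, distance_pc: float) -> float:
    return 4.74 * mu_arcsec_per_yr * distance_pc

# A ~300 km/s peculiar velocity at 20 Mpc corresponds to only ~3 micro-arcsec/yr:
print(transverse_velocity_km_s(3e-6, 20e6))  # ~284 km/s
```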

  18. Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [The University of Texas at Austin

    2013-10-15

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.
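
    A minimal sketch of the "reduce then sample" idea described above: a cheap surrogate stands in for the expensive forward model inside a random-walk Metropolis sampler. All names and the quadratic surrogate are illustrative assumptions, not SAGUARO code:

```python
import numpy as np

def forward_surrogate(theta):
    # stand-in reduced-order model: a cheap quadratic response
    return theta ** 2

def log_posterior(theta, y_obs, noise_sd=0.1, prior_sd=1.0):
    misfit = (y_obs - forward_surrogate(theta)) / noise_sd
    return -0.5 * misfit ** 2 - 0.5 * (theta / prior_sd) ** 2

def metropolis(y_obs, n_steps=5000, step=0.2, theta0=0.0, seed=0):
    rng = np.random.default_rng(seed)
    theta, lp = theta0, log_posterior(theta0, y_obs)
    chain = []
    for _ in range(n_steps):
        prop = theta + step * rng.standard_normal()
        lp_prop = log_posterior(prop, y_obs)
        if np.log(rng.random()) < lp_prop - lp:  # Metropolis accept/reject
            theta, lp = prop, lp_prop
        chain.append(theta)
    return np.array(chain)

chain = metropolis(y_obs=0.49)
print(chain.mean(), chain.std())
```

    Every posterior evaluation here costs only a surrogate call rather than a full forward solve, which is the source of the "very large computational savings" the report mentions.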

  19. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes a method for evaluating the faultless functioning of large scale integration circuits (LSI) and very large scale integration circuits (VLSI). The article presents a comparative analysis of the factors that determine the faultlessness of integrated circuits, an analysis of existing methods, and a model for evaluating the faultless functioning of LSI and VLSI. The main part describes a proposed algorithm and program for analysis of the fault rate in LSI and VLSI circuits.
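
    The abstract does not reproduce the proposed algorithm, but a common first-order model behind such fault-rate analyses is a series system with constant failure rates. A hedged illustration (the rates and time horizon below are made up):

```python
import math

# Series-system reliability under constant failure rates: R(t) = exp(-λt),
# where the block failure rates λ_i simply add up for a series system.
def reliability(failure_rates_per_hour, hours):
    lam = sum(failure_rates_per_hour)
    return math.exp(-lam * hours)

# e.g., three circuit blocks over 10,000 hours of operation
print(reliability([2e-7, 5e-8, 1e-7], 10_000))
```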

  20. Children's informal learning in the context of school of knowledge society

    DEFF Research Database (Denmark)

    Sørensen, Birgitte Holm; Danielsen, Oluf; Nielsen, Janni

    2007-01-01

    … interactive media. The project shows that in children's spare-time use of ICT they employ informal forms of learning based to a large extent on their social interaction in both physical and virtual spaces. These informal learning forms can be identified as learning hierarchies, learning communities and learning networks; they are important contributions to the school of the knowledge society. The ICT in New Learning Environments project, based on anthropologically inspired methods and social learning theories, shows that students bring their informal forms of learning into the school context. This happens … working with ICT and project-based learning is shown to simultaneously constitute a mixed mode between the school of the industrial and the knowledge society. The research shows that it is possible to tip the balance in the direction of the school of the knowledge society, and thus of the future …

  1. KBGIS-2: A knowledge-based geographic information system

    Science.gov (United States)

    Smith, T.; Peuquet, D.; Menon, S.; Agarwal, P.

    1986-01-01

    The architecture and working of a recently implemented knowledge-based geographic information system (KBGIS-2) that was designed to satisfy several general criteria for the geographic information system are described. The system has four major functions that include query-answering, learning, and editing. The main query finds constrained locations for spatial objects that are describable in a predicate-calculus based spatial objects language. The main search procedures include a family of constraint-satisfaction procedures that use a spatial object knowledge base to search efficiently for complex spatial objects in large, multilayered spatial data bases. These data bases are represented in quadtree form. The search strategy is designed to reduce the computational cost of search in the average case. The learning capabilities of the system include the addition of new locations of complex spatial objects to the knowledge base as queries are answered, and the ability to learn inductively definitions of new spatial objects from examples. The new definitions are added to the knowledge base by the system. The system is currently performing all its designated tasks successfully, although currently implemented on inadequate hardware. Future reports will detail the performance characteristics of the system, and various new extensions are planned in order to enhance the power of KBGIS-2.
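
    The quadtree representation mentioned above supports exactly the kind of pruned spatial search KBGIS-2 relies on. A minimal, self-contained point quadtree with rectangular range queries (the class design is illustrative, not the KBGIS-2 implementation):

```python
# Minimal point quadtree with rectangular range search, sketching the kind
# of spatial indexing that quadtree-based spatial data bases rely on.
class QuadTree:
    def __init__(self, x0, y0, x1, y1, cap=4):
        self.box = (x0, y0, x1, y1)
        self.cap, self.pts, self.kids = cap, [], None

    def insert(self, p):
        x0, y0, x1, y1 = self.box
        if not (x0 <= p[0] < x1 and y0 <= p[1] < y1):
            return False                          # point outside this node
        if self.kids is None and len(self.pts) < self.cap:
            self.pts.append(p)
            return True
        if self.kids is None:                     # split into four quadrants
            mx, my = (x0 + x1) / 2, (y0 + y1) / 2
            self.kids = [QuadTree(x0, y0, mx, my), QuadTree(mx, y0, x1, my),
                         QuadTree(x0, my, mx, y1), QuadTree(mx, my, x1, y1)]
            for q in self.pts:
                self._insert_into_kids(q)
            self.pts = []
        return self._insert_into_kids(p)

    def _insert_into_kids(self, p):
        return any(k.insert(p) for k in self.kids)

    def query(self, qx0, qy0, qx1, qy1, out=None):
        out = [] if out is None else out
        x0, y0, x1, y1 = self.box
        if qx1 < x0 or qx0 >= x1 or qy1 < y0 or qy0 >= y1:
            return out                            # no overlap: prune subtree
        out += [p for p in self.pts if qx0 <= p[0] <= qx1 and qy0 <= p[1] <= qy1]
        for k in (self.kids or []):
            k.query(qx0, qy0, qx1, qy1, out)
        return out

t = QuadTree(0, 0, 100, 100)
for p in [(10, 10), (40, 60), (70, 20), (90, 90), (12, 11)]:
    t.insert(p)
print(t.query(0, 0, 50, 50))   # points in the lower-left quarter
```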

  2. Using information technology to support knowledge conversion processes

    Directory of Open Access Journals (Sweden)

    2001-01-01

    Full Text Available One of the main roles of Information Technology in Knowledge Management programs is to accelerate the speed of knowledge transfer and creation. Knowledge Management tools intend to help the processes of collecting and organizing the knowledge of groups of individuals in order to make this knowledge available in a shared base. Due to the breadth of the concept of knowledge, the software market for Knowledge Management seems to be quite confusing. Technology vendors are developing different implementations of the Knowledge Management concepts in their software products. Because of the variety and quantity of Knowledge Management tools available on the market, a typology may be a valuable aid to organizations that are looking for answers to specific needs. The objective of this article is to present guidelines that help to design such a typology. Knowledge Management solutions such as intranet systems, Electronic Document Management (EDM), groupware, workflow, artificial intelligence-based systems, Business Intelligence (BI), knowledge map systems, innovation support, competitive intelligence tools and knowledge portals are discussed in terms of their potential contributions to the processes of creating, registering and sharing knowledge. A number of Knowledge Management tools (Lotus Notes, Microsoft Exchange, Business Objects, Aris Toolset, File Net, Gingo, Vigipro, Sopheon) have been checked. The potential of each category of solutions to support the transfer of tacit and/or explicit knowledge and to facilitate the knowledge conversion spiral in the sense of Nonaka and Takeuchi (1995) is discussed.

  3. How the Internet Will Help Large-Scale Assessment Reinvent Itself

    Directory of Open Access Journals (Sweden)

    Randy Elliot Bennett

    2001-02-01

    Full Text Available Large-scale assessment in the United States is undergoing enormous pressure to change. That pressure stems from many causes. Depending upon the type of test, the issues precipitating change include an outmoded cognitive-scientific basis for test design; a mismatch with curriculum; the differential performance of population groups; a lack of information to help individuals improve; and inefficiency. These issues provide a strong motivation to reconceptualize both the substance and the business of large-scale assessment. At the same time, advances in technology, measurement, and cognitive science are providing the means to make that reconceptualization a reality. The thesis of this paper is that the largest facilitating factor will be technological, in particular the Internet. In the same way that it is already helping to revolutionize commerce, education, and even social interaction, the Internet will help revolutionize the business and substance of large-scale assessment.

  4. Phylogenetic distribution of large-scale genome patchiness

    Directory of Open Access Journals (Sweden)

    Hackenberg Michael

    2008-04-01

    Full Text Available Abstract Background The phylogenetic distribution of large-scale genome structure (i.e. mosaic compositional patchiness) has been explored mainly by analytical ultracentrifugation of bulk DNA. However, with the availability of large, good-quality chromosome sequences, and the recently developed computational methods to directly analyze patchiness on the genome sequence, an evolutionary comparative analysis can be carried out at the sequence level. Results The local variations in the scaling exponent of the Detrended Fluctuation Analysis are used here to analyze large-scale genome structure and directly uncover the characteristic scales present in genome sequences. Furthermore, through shuffling experiments of selected genome regions, computationally-identified, isochore-like regions were identified as the biological source for the uncovered large-scale genome structure. The phylogenetic distribution of short- and large-scale patchiness was determined in the best-sequenced genome assemblies from eleven eukaryotic genomes: mammals (Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, and Canis familiaris), birds (Gallus gallus), fishes (Danio rerio), invertebrates (Drosophila melanogaster and Caenorhabditis elegans), plants (Arabidopsis thaliana) and yeasts (Saccharomyces cerevisiae). We found large-scale patchiness of genome structure, associated with in silico determined, isochore-like regions, throughout this wide phylogenetic range. Conclusion Large-scale genome structure is detected by directly analyzing DNA sequences in a wide range of eukaryotic chromosome sequences, from human to yeast. In all these genomes, large-scale patchiness can be associated with the isochore-like regions, as directly detected in silico at the sequence level.
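
    A minimal sketch of standard Detrended Fluctuation Analysis, the method named above; note that the paper tracks local variations of the scaling exponent along chromosomes, which this toy global estimate does not implement:

```python
import numpy as np

def dfa_exponent(x, scales):
    """Estimate the DFA scaling exponent of a 1-D signal.

    The signal is integrated, divided into windows of each size, detrended
    with a linear fit per window, and the RMS fluctuation F(n) is regressed
    against window size n on log-log axes; the slope is the exponent alpha.
    """
    y = np.cumsum(x - np.mean(x))                 # integrated (profile) series
    fluct = []
    for n in scales:
        rms = []
        for i in range(len(y) // n):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear trend
            rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
        fluct.append(np.mean(rms))
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]

# Toy usage on a random 0/1 sequence, where alpha ~ 0.5 is expected
seq = np.random.randint(0, 2, 20000)
print(dfa_exponent(seq, scales=[16, 32, 64, 128, 256]))
```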

  5. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large-scale model and data base system is presented. Experience in operating and developing a large-scale computerized system shows that the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified, then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large-scale models and data bases.

  6. Side effects of problem-solving strategies in large-scale nutrition science: towards a diversification of health.

    Science.gov (United States)

    Penders, Bart; Vos, Rein; Horstman, Klasien

    2009-11-01

    Solving complex problems in large-scale research programmes requires cooperation and division of labour. Simultaneously, large-scale problem solving also gives rise to unintended side effects. Based upon 5 years of researching two large-scale nutrigenomic research programmes, we argue that problems are fragmented in order to be solved. These sub-problems are given priority for practical reasons and in the process of solving them, various changes are introduced in each sub-problem. Combined with additional diversity as a result of interdisciplinarity, this makes reassembling the original and overall goal of the research programme less likely. In the case of nutrigenomics and health, this produces a diversification of health. As a result, the public health goal of contemporary nutrition science is not reached in the large-scale research programmes we studied. Large-scale research programmes are very successful in producing scientific publications and new knowledge; however, in reaching their political goals they often are less successful.

  7. Participatory Design and the Challenges of Large-Scale Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

    With its 10th biannual anniversary conference, Participatory Design (PD) is leaving its teens and must now be considered ready to join the adult world. In this article we encourage the PD community to think big: PD should engage in large-scale information-systems development and opt for a PD...

  8. Lessons from a large-scale assessment: Results from conceptual inventories

    Directory of Open Access Journals (Sweden)

    Beth Thacker

    2014-07-01

    Full Text Available We report conceptual inventory results of a large-scale assessment project at a large university. We studied the introduction of materials and instructional methods informed by physics education research (PER-informed materials) into a department where most instruction has previously been traditional and a significant number of faculty are hesitant, ambivalent, or even resistant to the introduction of such reforms. Data were collected in all of the sections of both the large algebra- and calculus-based introductory courses for a number of years, employing commonly used conceptual inventories. Results from a small PER-informed, inquiry-based, laboratory-based class are also reported. Results suggest that when PER-informed materials are introduced in the labs and recitations, independent of the lecture style, there is an increase in students' conceptual inventory gains. There is also an increase in the results on conceptual inventories if PER-informed instruction is used in the lecture. The highest conceptual inventory gains were achieved by the combination of PER-informed lectures and laboratories in large class settings and by the hands-on, laboratory-based, inquiry-based course taught in a small class setting.
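
    The abstract reports "conceptual inventory gains" without giving a formula; the usual statistic in PER is Hake's normalized gain, shown here as a hedged illustration (the article's exact gain measure is not specified):

```python
# Normalized gain g = (post - pre) / (100 - pre), the common way
# conceptual inventory gains are compared across classes (Hake, 1998).
def normalized_gain(pre_pct: float, post_pct: float) -> float:
    return (post_pct - pre_pct) / (100.0 - pre_pct)

# e.g., class averages of 35% pre-test and 55% post-test give g ~ 0.31
print(round(normalized_gain(35.0, 55.0), 2))
```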

  9. Dementia knowledge assessment scale (DKAS): confirmatory factor analysis and comparative subscale scores among an international cohort.

    Science.gov (United States)

    Annear, Michael J; Toye, Chris; Elliott, Kate-Ellen J; McInerney, Frances; Eccleston, Claire; Robinson, Andrew

    2017-07-31

    Dementia is a life-limiting condition that is increasing in global prevalence in line with population ageing. In this context, it is necessary to accurately measure dementia knowledge across a spectrum of health professional and lay populations, with the aim of informing targeted educational interventions and improving literacy, care, and support. Building on prior exploratory analysis, which informed the development of the preliminarily valid and reliable version of the Dementia Knowledge Assessment Scale (DKAS), a Confirmatory Factor Analysis (CFA) was performed to affirm construct validity and proposed subscales, to further increase the measure's utility for academics and educators. A large, de novo sample of 3649 volunteer respondents to a dementia-related online course was recruited to evaluate the performance of the DKAS and its proposed subscales. Respondents represented diverse cohorts, including health professionals, students, and members of the general public. Analyses included CFA (using structural equation modelling), measures of internal consistency (α), and non-parametric tests of subscale correlation (Spearman correlation) and score differences between cohorts (Kruskal-Wallis one-way analysis of variance). Findings of the CFA supported a 25-item, four-factor model for the DKAS, with two items removed due to poor performance and one item moved between factors. The resultant model exhibited good reliability (α = .85; ω_h = .87; overall scale), with acceptable subscale internal consistency (α ≥ .65; subscales). Subscales showed acceptable correlation without any indication of redundancy. Finally, total and DKAS subscale scores showed good discrimination between cohorts of respondents who would be anticipated to hold different levels of knowledge on the basis of education or experience related to dementia. The DKAS has been confirmed as a reliable and valid measure of dementia knowledge for diverse populations that is capable of elucidating …
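
    The internal-consistency figures quoted above (α) can be reproduced from a respondents-by-items score matrix with the standard Cronbach's alpha formula; a small sketch on synthetic data (the data are made up, not the DKAS sample):

```python
import numpy as np

# Cronbach's alpha for a respondents-by-items score matrix: the
# internal-consistency statistic reported for the DKAS and its subscales.
def cronbach_alpha(scores):
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                           # number of items
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
base = rng.normal(size=(200, 1))                  # shared trait per respondent
items = base + 0.8 * rng.normal(size=(200, 10))   # 10 correlated items
print(round(cronbach_alpha(items), 2))
```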

  10. Information Impact: Journal of Information and Knowledge ...

    African Journals Online (AJOL)

    Information Impact: Journal of Information and Knowledge Management, Vol 7, No 2 (2016).

  11. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between large-scale structure formation and baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120h⁻¹ Mpc in the distribution of the visible matter of the universe is provided. The possibility to generate a periodic distribution with the characteristic scale 120h⁻¹ Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low-temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  12. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: administrators carry a heavy workload, and much time must be spent on the management and maintenance of the system. The nodes in a large-scale cluster system easily fall into disorder, and with thousands of nodes housed in large machine rooms it is easy for managers to confuse machines. How can accurate management be carried out effectively on a large-scale cluster system? The article introduces ELFms for the large-scale cluster system and, furthermore, proposes to realize automatic management of the large-scale cluster system. (authors)

  13. Utilization of Large Scale Surface Models for Detailed Visibility Analyses

    Science.gov (United States)

    Caha, J.; Kačmařík, M.

    2017-11-01

    This article demonstrates the utilization of large-scale surface models with small spatial resolution and high accuracy, acquired from Unmanned Aerial Vehicle scanning, for visibility analyses. The importance of large-scale data for visibility analyses on the local scale, where the detail of the surface model is the most defining factor, is described. The focus is not only the classic Boolean visibility that is usually determined within GIS, but also so-called extended viewsheds that aim to provide more information about visibility. The case study with examples of visibility analyses was performed on the river Opava, near the city of Ostrava (Czech Republic). The multiple Boolean viewshed analysis and the global horizon viewshed were calculated to determine the most prominent features and visibility barriers of the surface. Besides that, the extended viewshed showing the angle difference above the local horizon, which describes the angular height of the target area above the barrier, is shown. The case study proved that large-scale models are an appropriate data source for visibility analyses on the local level. The discussion summarizes possible future applications and further development directions of visibility analyses.
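
    At the core of any Boolean viewshed is a line-of-sight test against the surface model. A minimal raster sketch (the grid conventions, observer height, and sampling scheme are assumptions, not the paper's algorithm):

```python
import numpy as np

# Boolean line-of-sight on a raster DSM: a target cell is visible if no
# cell along the sight line rises above the observer-target sight line.
def visible(dsm, obs, tgt, obs_height=1.7):
    (r0, c0), (r1, c1) = obs, tgt
    n = max(abs(r1 - r0), abs(c1 - c0))
    if n == 0:
        return True
    z0 = dsm[r0, c0] + obs_height
    z1 = dsm[r1, c1]
    for i in range(1, n):                      # sample along the sight line
        t = i / n
        r = int(round(r0 + t * (r1 - r0)))
        c = int(round(c0 + t * (c1 - c0)))
        if dsm[r, c] > z0 + t * (z1 - z0):     # terrain blocks the ray
            return False
    return True

dsm = np.zeros((50, 50))
dsm[25, 10:40] = 5.0                            # a 5 m wall across the grid
print(visible(dsm, (25, 0), (25, 45)))          # False: the wall blocks the view
```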

  14. Information needs and information seeking behaviour of small-scale ...

    African Journals Online (AJOL)

    It is thus important for the government to improve access to extension services, and equip them with necessary skills and adequate information resources. ... information needs, map communities' knowledge and information sources, create awareness of information sources, and knowledge culture, and use multiple sources ...

  15. Validating Bayesian truth serum in large-scale online human experiments.

    Science.gov (United States)

    Frank, Morgan R; Cebrian, Manuel; Pickard, Galen; Rahwan, Iyad

    2017-01-01

    Bayesian truth serum (BTS) is an exciting new method for improving honesty and information quality in multiple-choice surveys, but, despite the method's mathematical reliance on large sample sizes, the existing literature about BTS focuses only on small experiments. Given the prevalence of online survey platforms, such as Amazon's Mechanical Turk, which facilitate surveys with hundreds or thousands of participants, BTS must be shown to be effective in large-scale experiments if it is to become a readily accepted tool in real-world applications. We demonstrate that BTS quantifiably improves honesty in large-scale online surveys where the "honest" distribution of answers is known in expectation on aggregate. Furthermore, we explore a marketing application where "honest" answers cannot be known, but find that BTS treatment impacts the resulting distributions of answers.
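
    For reference, a sketch of the BTS score (following Prelec's formulation: an information score plus an α-weighted prediction score) for a single multiple-choice question; the variable names and toy data are illustrative:

```python
import numpy as np

# BTS score per respondent: log(xbar_k / pbar_k) for the chosen answer k,
# plus alpha times a prediction score (negative KL divergence of the
# empirical frequencies xbar from the respondent's predictions p).
def bts_scores(answers, predictions, alpha=1.0, eps=1e-9):
    answers = np.asarray(answers)                     # chosen option per respondent
    preds = np.clip(np.asarray(predictions), eps, 1)  # n x K predicted frequencies
    n, K = preds.shape
    xbar = np.bincount(answers, minlength=K) / n + eps   # endorsement frequencies
    pbar = np.exp(np.log(preds).mean(axis=0))            # geometric mean predictions
    info = np.log(xbar[answers] / pbar[answers])
    pred = (xbar * np.log(preds / xbar)).sum(axis=1)
    return info + alpha * pred

answers = [0, 0, 1, 0, 1]
predictions = [[.6, .4], [.5, .5], [.3, .7], [.7, .3], [.4, .6]]
print(bts_scores(answers, predictions).round(3))
```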

  16. Optimization of large-scale heterogeneous system-of-systems models.

    Energy Technology Data Exchange (ETDEWEB)

    Parekh, Ojas; Watson, Jean-Paul; Phillips, Cynthia Ann; Siirola, John; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Lee, Herbert K. H. (University of California, Santa Cruz, Santa Cruz, CA); Hart, William Eugene; Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Woodruff, David L. (University of California, Davis, Davis, CA)

    2012-01-01

    Decision makers increasingly rely on large-scale computational models to simulate and analyze complex man-made systems. For example, computational models of national infrastructures are being used to inform government policy, assess economic and national security risks, evaluate infrastructure interdependencies, and plan for the growth and evolution of infrastructure capabilities. A major challenge for decision makers is the analysis of national-scale models that are composed of interacting systems: effective integration of system models is difficult, there are many parameters to analyze in these systems, and fundamental modeling uncertainties complicate analysis. This project is developing optimization methods to effectively represent and analyze large-scale heterogeneous system of systems (HSoS) models, which have emerged as a promising approach for describing such complex man-made systems. These optimization methods enable decision makers to predict future system behavior, manage system risk, assess tradeoffs between system criteria, and identify critical modeling uncertainties.

  17. The Alzheimer's Disease Knowledge Scale: Development and Psychometric Properties

    Science.gov (United States)

    Carpenter, Brian D.; Balsis, Steve; Otilingam, Poorni G.; Hanson, Priya K.; Gatz, Margaret

    2009-01-01

    Purpose: This study provides preliminary evidence for the acceptability, reliability, and validity of the new Alzheimer's Disease Knowledge Scale (ADKS), a content and psychometric update to the Alzheimer's Disease Knowledge Test. Design and Methods: Traditional scale development methods were used to generate items and evaluate their psychometric…

  18. Prehospital Acute Stroke Severity Scale to Predict Large Artery Occlusion: Design and Comparison With Other Scales.

    Science.gov (United States)

    Hastrup, Sidsel; Damgaard, Dorte; Johnsen, Søren Paaske; Andersen, Grethe

    2016-07-01

    We designed and validated a simple prehospital stroke scale to identify emergent large vessel occlusion (ELVO) in patients with acute ischemic stroke and compared the scale to other published scales for prediction of ELVO. A national historical test cohort of 3127 patients with information on intracranial vessel status (angiography) before reperfusion therapy was identified. National Institutes of Health Stroke Scale (NIHSS) items with the highest predictive value of occlusion of a large intracranial artery were identified, and the optimal combination meeting predefined criteria to ensure usefulness in the prehospital phase was determined. The predictive performance of the Prehospital Acute Stroke Severity (PASS) scale was compared with other published scales for ELVO. The PASS scale was composed of 3 NIHSS scores: level of consciousness (month/age), gaze palsy/deviation, and arm weakness. The PASS scale was derived on 2/3 of the test cohort, showing an accuracy (area under the curve) of 0.76 for detecting large arterial occlusion. The optimal cut point of ≥2 abnormal scores showed: sensitivity=0.66 (95% CI, 0.62-0.69), specificity=0.83 (0.81-0.85), and area under the curve=0.74 (0.72-0.76). Validation on the remaining 1/3 of the test cohort showed similar performance. Patients with a large artery occlusion on angiography with PASS ≥2 had a median NIHSS score of 17 (interquartile range=6), as opposed to PASS <2 with a median NIHSS score of 6 (interquartile range=5). The PASS scale performed on par with other scales predicting ELVO while being simpler. The PASS scale is simple and has promising accuracy for prediction of ELVO in the field. © 2016 American Heart Association, Inc.
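
    Since PASS is a count of abnormal scores on the three NIHSS items named above with a cut point of ≥2, it reduces to a few lines; dichotomizing each item as "abnormal if its NIHSS score > 0" is our assumption for illustration:

```python
# PASS from three NIHSS items: level of consciousness questions (month/age),
# gaze palsy/deviation, and arm weakness. Each item counts as 1 if abnormal;
# PASS >= 2 flags suspected emergent large vessel occlusion (ELVO).
def pass_score(loc_questions: int, gaze: int, arm: int) -> int:
    return sum(1 for item in (loc_questions, gaze, arm) if item > 0)

def suspect_elvo(loc_questions: int, gaze: int, arm: int, cutoff: int = 2) -> bool:
    return pass_score(loc_questions, gaze, arm) >= cutoff

print(pass_score(1, 2, 0), suspect_elvo(1, 2, 0))  # 2 True
```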

  19. INTEGRATED DESIGN AND ENGINEERING USING BUILDING INFORMATION MODELLING: A PILOT PROJECT OF SMALL-SCALE HOUSING DEVELOPMENT IN THE NETHERLANDS

    Directory of Open Access Journals (Sweden)

    Rizal Sebastian

    2010-11-01

    Full Text Available During the design phase, decisions are made that affect, on average, 70% of the life-cycle cost of a building. Therefore, collaborative design relying on multidisciplinary knowledge of the building life cycle is essential. Building information modelling (BIM) makes it possible to integrate knowledge from various project participants that traditionally work in different phases of the building process. BIM has been applied in a number of large-scale projects in the industrial real estate and infrastructure sectors in different countries, including The Netherlands. The projects in the housing sector, however, are predominantly small scale and carried out by small and medium enterprises (SMEs). These SMEs are looking for practical and affordable BIM solutions for housing projects. This article reports a pilot project of small-scale housing development using BIM in the province of Zeeland, The Netherlands. The conceptual knowledge derived from European and national research projects is disseminated to the SMEs through a series of experimental working sessions. Action learning protocols within a pilot project are developed to ensure direct impacts in terms of cost reduction and quality improvement. The project shows that BIM can be applied without radical changes to the SMEs' information and communication technology systems or to their business organizations. DOI: 10.3763/aedm.2010.0116 Source: Architectural Engineering and Design Management, Volume 6, Number 2, 2010, pp. 103-110

  20. The influence of socioeconomic factors on traditional knowledge: a cross scale comparison of palm use in northwestern South America

    Directory of Open Access Journals (Sweden)

    Narel Y. Paniagua-Zambrana

    2014-12-01

    Full Text Available We explored the power of 14 socioeconomic factors for predicting differences in traditional knowledge about palms (Arecaceae) at the personal, household, and regional levels in 25 locations in the Amazon, Andes, and Chocó of northwestern South America. Using semistructured interviews, we gathered data on palm uses from 2050 informants in 53 communities and four countries (Colombia, Ecuador, Peru, and Bolivia). We performed multilevel statistical analyses, which showed that the influence of each socioeconomic factor differed depending on whether the analysis was performed on the overall palm knowledge or on individual use categories. At the general palm knowledge level, gender was the only factor that had a significant association in all five subregions, showing that men had more knowledge than women, and age had a positive significant association only in the lowlands. Most of the analyzed socioeconomic factors had a greater influence on the lowland ecoregions of the Amazon and Chocó, although there were mixed trends in these ecoregions. Our results show that there are no regional patterns in the predictive power of socioeconomic factors and that their influence on palm-use knowledge is highly localized. We can conclude that (1) conservation strategies for traditional knowledge of palm use in the region should be developed mainly at the local level, and (2) large-scale comparable ethnoecological studies are necessary to understand indigenous communities' livelihoods at different scales.

  1. Information modelling and knowledge bases XXV

    CERN Document Server

    Tokuda, T; Jaakkola, H; Yoshida, N

    2014-01-01

    Because of our ever increasing use of and reliance on technology and information systems, information modelling and knowledge bases continue to be important topics in those academic communities concerned with data handling and computer science. As the information itself becomes more complex, so do the levels of abstraction and the databases themselves. This book is part of the series Information Modelling and Knowledge Bases, which concentrates on a variety of themes in the important domains of conceptual modeling, design and specification of information systems, multimedia information modelin

  2. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continues to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issu

  3. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (≳1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in ≳1/4 of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  4. INFORMATION AND KNOWLEDGE IN A GLOBAL CONTEXT

    Directory of Open Access Journals (Sweden)

    Florina BRAN

    2015-12-01

    Full Text Available Information and knowledge are two important entities that shape the present stage of globalization, based mostly on their dynamics. This paper provides an overview of information and knowledge in a global context, highlighting the importance of the information society that turned into the knowledge society at the beginning of the 21st century, driven by the Internet as part of the globalization process. Modern economic theories recognise the importance of information in the economic process: its impact on the globalization of the economy has been essential, changing the way markets and companies work and representing the key factor of a new era of economic development. This paper presents the main results from the available literature on the relationship between information, knowledge and economic theory in a global context, and finally explains the benefits of the knowledge economy to all countries.

  5. PERSEUS-HUB: Interactive and Collective Exploration of Large-Scale Graphs

    Directory of Open Access Journals (Sweden)

    Di Jin

    2017-07-01

    Full Text Available Graphs emerge naturally in many domains, such as social science, neuroscience, transportation engineering, and more. In many cases, such graphs have millions or billions of nodes and edges, and their sizes increase daily at a fast pace. How can researchers from various domains explore large graphs interactively and efficiently to find out what is ‘important’? How can multiple researchers explore a new graph dataset collectively and “help” each other with their findings? In this article, we present Perseus-Hub, a large-scale graph mining tool that computes a set of graph properties in a distributed manner, performs ensemble, multi-view anomaly detection to highlight regions that are worth investigating, and provides users with uncluttered visualization and easy interaction with complex graph statistics. Perseus-Hub uses a Spark cluster to calculate various statistics of large-scale graphs efficiently, and aggregates the results in a summary on the master node to support interactive user exploration. In Perseus-Hub, the visualized distributions of graph statistics provide preliminary analysis to understand a graph. To perform a deeper analysis, users with little prior knowledge can leverage patterns (e.g., spikes in the power-law degree distribution) marked by other users or experts. Moreover, Perseus-Hub guides users to regions of interest by highlighting anomalous nodes and helps users establish a more comprehensive understanding about the graph at hand. We demonstrate our system through the case study on real, large-scale networks.
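
    As a hedged illustration of the distributed-statistics step, here is how one such summary (a degree distribution over an edge list) could be computed with Spark and collected for plotting; the file name and schema are assumptions, not Perseus-Hub code:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Degree distribution of an edge list: the kind of graph statistic a
# cluster computes in parallel and then aggregates for interactive plots.
spark = SparkSession.builder.appName("degree-dist").getOrCreate()
edges = spark.read.csv("edges.csv", schema="src BIGINT, dst BIGINT")

degrees = (edges.select(F.col("src").alias("node"))
                .union(edges.select(F.col("dst").alias("node")))
                .groupBy("node").count()
                .withColumnRenamed("count", "degree"))

# Histogram of degrees: small enough to collect on the master and visualize
dist = degrees.groupBy("degree").count().orderBy("degree")
dist.show()
```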

  6. The Eighth Stage of Information Management: Information Resources Management (IRM) vs. Knowledge Management (KM), and the Chief Information Officer (CIO) vs. the Chief Knowledge Officer (CKO).

    Science.gov (United States)

    Chen, Rui

    1998-01-01

    Describes the characteristics of the transfer point of information management to knowledge management (KM), what information resources management (IRM) does, and compares information and knowledge management and the roles of chief information officer (CIO) and chief knowledge officer (CKO). (PEN)

  7. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele

    2015-08-23

    The interaction between scales is investigated in a turbulent mixing layer. The large-scale amplitude modulation of the small scales, already observed in other works, depends on the crosswise location: large-scale positive fluctuations correlate with a stronger activity of the small scales on the low-speed side of the mixing layer, and with a reduced activity on the high-speed side. However, from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, in addition to the large-scale fluctuations, the modulation of the small scales by the large-scale gradients has been investigated.
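
    A standard way to quantify such amplitude modulation (not necessarily this paper's exact pipeline) is to correlate the large-scale signal with the low-passed Hilbert envelope of the small-scale signal; a sketch on a synthetic modulated signal:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

# Split a velocity signal into large and small scales, take the Hilbert
# envelope of the small scales, and correlate its large-scale part with
# the large-scale signal to get an amplitude-modulation coefficient.
def modulation_coefficient(u, fs, f_cut):
    b, a = butter(4, f_cut / (fs / 2))           # low-pass at the cut-off
    large = filtfilt(b, a, u - u.mean())
    small = (u - u.mean()) - large
    env = np.abs(hilbert(small))                 # small-scale amplitude
    env_l = filtfilt(b, a, env - env.mean())     # keep envelope's large scales
    return np.corrcoef(large, env_l)[0, 1]

fs = 10_000.0
t = np.arange(0, 1, 1 / fs)
carrier = np.sin(2 * np.pi * 800 * t)            # small-scale "turbulence"
slow = np.sin(2 * np.pi * 5 * t)                 # large-scale fluctuation
u = slow + 0.3 * (1 + 0.5 * slow) * carrier      # small scales modulated by slow
print(round(modulation_coefficient(u, fs, f_cut=50.0), 2))
```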

  8. Large-scale silviculture experiments of western Oregon and Washington.

    Science.gov (United States)

    Nathan J. Poage; Paul D. Anderson

    2007-01-01

    We review 12 large-scale silviculture experiments (LSSEs) in western Washington and Oregon with which the Pacific Northwest Research Station of the USDA Forest Service is substantially involved. We compiled and arrayed information about the LSSEs as a series of matrices in a relational database, which is included on the compact disc published with this report and...

  9. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is the observed phenomenon that galaxies located in the same region have similar properties, such as star formation rate, color, gas fraction, and so on. Conformity was first observed among galaxies within the same halos (“one-halo conformity”), which can be readily explained by mutual interactions among galaxies within a halo. Recent observations, however, have further witnessed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities even though they do not share common halos with each other ("two-halo conformity" or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity: they have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and now happen to reside within a ~5 Mpc sphere. Second, galaxies in the strong tidal fields induced by large-scale structure also seem to give rise to the large-scale conformity, as the strong tides suppress star formation in those galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  10. The implementation of nuclear knowledge information infra database

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Shin Bok; Hwang, In Ah [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2001-02-01

    The purpose of this research is to develop a knowledge management database (DB) that can be effectively utilized in carrying out nuclear R and D programs. This report describes a detailed methodology for the retrieval, classification, acquisition, accumulation, common ownership and regeneration of professional knowledge. An example of knowledge DB construction in the field of nuclear energy is illustrated in the report. This project is supported as part of a digital information program related to science and technology. A total of 894 man-years (3 persons per day) is invested in this project. Major works carried out in this project are the design of the DB fields for the nuclear knowledge base, the classification and selection of knowledge information, the review of the classified and selected information, the preparation of the information DB, and so on. As a result of this project, the developed DB for the nuclear knowledge information infrastructure will serve as a valuable source of information for nuclear researchers and information users. (Author)

  11. Information and Knowledge Management in the Scope of the Information Security practices: the human factor within Organizations

    Directory of Open Access Journals (Sweden)

    Luciana Emirena Santos Carneiro

    2013-08-01

    Full Text Available The security of informational assets has always been a corporate requirement. These assets can be grouped into three main spheres: people, organizational processes, and technologies. The internet, the web, the spread of networks, and the growing presence of technology both in people's lives and in organizational contexts have caused profound transformations in the intrinsic processes that constitute personal and organizational routines. On the one hand, these changes brought by technological progress have fostered competitiveness and decentralization; on the other hand, they require better management, control, security and protection for information and knowledge. This article presents the results of an investigation in the information security realm, focusing on the human aspects of knowledge and information management related to security practices. Using a qualitative-quantitative approach, we identify behavioral actions and profiles of employees of a company in the field of healthcare, which reveal some connections with information security failures. We conclude that the human element is a relevant variable, even a critical one, for the management of information security in organizations.

  12. 3D fully convolutional networks for subcortical segmentation in MRI: A large-scale study.

    Science.gov (United States)

    Dolz, Jose; Desrosiers, Christian; Ben Ayed, Ismail

    2018-04-15

    This study investigates a 3D and fully convolutional neural network (CNN) for subcortical brain structure segmentation in MRI. 3D CNN architectures have been generally avoided due to their computational and memory requirements during inference. We address the problem via small kernels, allowing deeper architectures. We further model both local and global context by embedding intermediate-layer outputs in the final prediction, which encourages consistency between features extracted at different scales and embeds fine-grained information directly in the segmentation process. Our model is efficiently trained end-to-end on a graphics processing unit (GPU), in a single stage, exploiting the dense inference capabilities of fully CNNs. We performed comprehensive experiments over two publicly available datasets. First, we demonstrate a state-of-the-art performance on the ISBR dataset. Then, we report a large-scale multi-site evaluation over 1112 unregistered subject datasets acquired from 17 different sites (ABIDE dataset), with ages ranging from 7 to 64 years, showing that our method is robust to various acquisition protocols, demographics and clinical factors. Our method yielded segmentations that are highly consistent with a standard atlas-based approach, while running in a fraction of the time needed by atlas-based methods and avoiding registration/normalization steps. This makes it convenient for massive multi-site neuroanatomical imaging studies. To the best of our knowledge, our work is the first to study subcortical structure segmentation on such large-scale and heterogeneous data. Copyright © 2017 Elsevier Inc. All rights reserved.
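
    A toy 3D fully convolutional network in the spirit described above, with small 3x3x3 kernels and intermediate feature maps concatenated into the final per-voxel prediction; the channel sizes and depth are placeholders, not the paper's architecture:

```python
import torch
import torch.nn as nn

# Small-kernel 3D FCN whose intermediate-layer outputs are embedded in the
# final prediction via channel-wise concatenation before a 1x1x1 classifier.
class SmallKernel3DFCN(nn.Module):
    def __init__(self, in_ch=1, n_classes=15):
        super().__init__()
        self.b1 = nn.Sequential(nn.Conv3d(in_ch, 16, 3, padding=1), nn.PReLU())
        self.b2 = nn.Sequential(nn.Conv3d(16, 32, 3, padding=1), nn.PReLU())
        self.b3 = nn.Sequential(nn.Conv3d(32, 64, 3, padding=1), nn.PReLU())
        self.head = nn.Conv3d(16 + 32 + 64, n_classes, kernel_size=1)

    def forward(self, x):
        f1 = self.b1(x)
        f2 = self.b2(f1)
        f3 = self.b3(f2)
        multi = torch.cat([f1, f2, f3], dim=1)   # multi-scale embedding
        return self.head(multi)                  # per-voxel class scores

net = SmallKernel3DFCN()
vol = torch.randn(1, 1, 32, 32, 32)              # batch, channel, D, H, W
print(net(vol).shape)                            # -> [1, 15, 32, 32, 32]
```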

  13. The BioLexicon: a large-scale terminological resource for biomedical text mining

    Directory of Open Access Journals (Sweden)

    Thompson Paul

    2011-10-01

    Full Text Available Abstract Background Due to the rapidly expanding body of biomedical literature, biologists require increasingly sophisticated and efficient systems to help them to search for relevant information. Such systems should account for the multiple written variants used to represent biomedical concepts, and allow the user to search for specific pieces of knowledge (or events involving these concepts, e.g., protein-protein interactions. Such functionality requires access to detailed information about words used in the biomedical literature. Existing databases and ontologies often have a specific focus and are oriented towards human use. Consequently, biological knowledge is dispersed amongst many resources, which often do not attempt to account for the large and frequently changing set of variants that appear in the literature. Additionally, such resources typically do not provide information about how terms relate to each other in texts to describe events. Results This article provides an overview of the design, construction and evaluation of a large-scale lexical and conceptual resource for the biomedical domain, the BioLexicon. The resource can be exploited by text mining tools at several levels, e.g., part-of-speech tagging, recognition of biomedical entities, and the extraction of events in which they are involved. As such, the BioLexicon must account for real usage of words in biomedical texts. In particular, the BioLexicon gathers together different types of terms from several existing data resources into a single, unified repository, and augments them with new term variants automatically extracted from biomedical literature. Extraction of events is facilitated through the inclusion of biologically pertinent verbs (around which events are typically organized together with information about typical patterns of grammatical and semantic behaviour, which are acquired from domain-specific texts. In order to foster interoperability, the BioLexicon is

  14. The BioLexicon: a large-scale terminological resource for biomedical text mining

    Science.gov (United States)

    2011-01-01

    Background Due to the rapidly expanding body of biomedical literature, biologists require increasingly sophisticated and efficient systems to help them to search for relevant information. Such systems should account for the multiple written variants used to represent biomedical concepts, and allow the user to search for specific pieces of knowledge (or events) involving these concepts, e.g., protein-protein interactions. Such functionality requires access to detailed information about words used in the biomedical literature. Existing databases and ontologies often have a specific focus and are oriented towards human use. Consequently, biological knowledge is dispersed amongst many resources, which often do not attempt to account for the large and frequently changing set of variants that appear in the literature. Additionally, such resources typically do not provide information about how terms relate to each other in texts to describe events. Results This article provides an overview of the design, construction and evaluation of a large-scale lexical and conceptual resource for the biomedical domain, the BioLexicon. The resource can be exploited by text mining tools at several levels, e.g., part-of-speech tagging, recognition of biomedical entities, and the extraction of events in which they are involved. As such, the BioLexicon must account for real usage of words in biomedical texts. In particular, the BioLexicon gathers together different types of terms from several existing data resources into a single, unified repository, and augments them with new term variants automatically extracted from biomedical literature. Extraction of events is facilitated through the inclusion of biologically pertinent verbs (around which events are typically organized) together with information about typical patterns of grammatical and semantic behaviour, which are acquired from domain-specific texts. In order to foster interoperability, the BioLexicon is modelled using the Lexical

  15. Large scale anisotropy studies with the Auger Observatory

    International Nuclear Information System (INIS)

    Santos, E.M.; Letessier-Selvon, A.

    2006-01-01

    With the increasing Auger surface array data sample of the highest energy cosmic rays, large-scale anisotropy studies in this part of the spectrum become a promising path towards understanding the origin of ultra-high energy cosmic particles. We describe the methods underlying the search for distortions in the cosmic-ray arrival directions over large angular scales, that is, larger than those commonly employed in the search for correlations with point-like sources. The widely used tools, known as coverage maps, are described, and some of the issues involved in their calculation are presented through Monte Carlo based studies. Coverage computation requires deep knowledge of the local detection efficiency, including the influence of weather parameters like temperature and pressure. Particular attention is devoted to a newly proposed method to extract the coverage, based upon the assumption of time factorization of an extensive air shower detector's acceptance. We use Auger monitoring data to test the goodness of this hypothesis. We finally show the necessity of using more than one coverage map to extract any possible anisotropic pattern in the sky, by pointing to some of the biases present in commonly used methods based, for example, on the scrambling of the UTC arrival times of each event. (author)
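
    A toy version of the time-scrambling technique discussed above: permute event times while keeping local angles, and histogram the re-derived right ascensions to build a coverage estimate. Everything here (the 1-D geometry and the input distributions) is a deliberately simplified assumption:

```python
import numpy as np

# Shuffle arrival times among events while keeping local angles, then
# histogram the recomputed right ascensions; averaging many scrambles
# washes out any true anisotropy, leaving an estimate of the exposure.
rng = np.random.default_rng(0)
n = 10_000
lst = rng.uniform(0, 360, n)          # local sidereal times of events (deg)
hour_angle = rng.normal(0, 30, n)     # local-angle proxy per event (deg)

coverage = np.zeros(36)
for _ in range(100):                  # many scrambles average out anisotropy
    ra = (rng.permutation(lst) - hour_angle) % 360
    hist, _ = np.histogram(ra, bins=36, range=(0, 360))
    coverage += hist
coverage /= coverage.sum()            # normalized exposure in RA bins
print(coverage.round(4)[:6])
```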

  16. Internet use, online information seeking and knowledge among third molar patients attending public dental services.

    Science.gov (United States)

    Hanna, K; Sambrook, P; Armfield, J M; Brennan, D S

    2017-09-01

    While Australians are searching the internet for third molar (TM) information, the usefulness of online sources may be questioned due to variation in quality. This study explored: (i) internet use and online information-seeking behaviour among TM patients attending public dental services; and (ii) whether patients' TM knowledge scores are associated with the level of internet use and eHealth Literacy Scale (eHEALS) scores. Baseline survey data from the 'Engaging Patients in Decision-Making' study were used. Variables included: sociodemographics, internet access status, online information-seeking behaviour, eHEALS, the Control Preferences Scale (CPS) and TM knowledge. Participants (N = 165) were mainly female (73.8%), aged 19-25 years (42.4%) and had 'secondary school or less' education (58.4%). A majority (N = 79, 52.7%) had sought online dental information, which was associated with an active decisional control preference (odds ratio = 3.1, P = 0.034) and higher educational attainment (odds ratio = 2.7, P = 0.040). TM knowledge scores were not associated with either the level of internet use (F(2,152) = 2.1, P = 0.094, χ² = 0.0310) or the eHEALS scores (r = 0.147, P = 0.335). 'The internet-prepared patient' phenomenon exists among public TM patients and was explained by a preference for involvement in decision-making. However, internet use was not associated with better TM knowledge. Providing TM patients with internet guidance may be an opportunity to improve TM knowledge. © 2017 Australian Dental Association.

  17. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that

  18. Detection of large-scale concentric gravity waves from a Chinese airglow imager network

    Science.gov (United States)

    Lai, Chang; Yue, Jia; Xu, Jiyao; Yuan, Wei; Li, Qinzeng; Liu, Xiao

    2018-06-01

    Concentric gravity waves (CGWs) contain a broad spectrum of horizontal wavelengths and periods due to their instantaneous localized sources (e.g., deep convection, volcanic eruptions, or earthquakes). However, it is difficult to observe large-scale gravity waves of >100 km wavelength from the ground, owing to the limited field of view of a single camera and local bad weather. Previously, complete large-scale CGW imagery could only be captured by satellite observations. In the present study, we developed a novel method that assembles separate images and applies low-pass filtering to obtain temporal and spatial information about complete large-scale CGWs from a network of all-sky airglow imagers. Coordinated observations from five all-sky airglow imagers in Northern China were assembled and processed to study large-scale CGWs over a wide area (1800 km × 1400 km), focusing on the same two CGW events as Xu et al. (2015). Our algorithms yielded images of large-scale CGWs by filtering out the small-scale CGWs. The wavelengths, wave speeds, and periods of the CGWs were measured from a sequence of consecutive assembled images. Overall, the assembling and low-pass filtering algorithms can expand the airglow imager network to its full capacity regarding the detection of large-scale gravity waves.
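
    The assemble-then-filter step can be sketched in a few lines: project each imager's frame onto a common geographic grid, average the overlaps, then low-pass filter to suppress the small-scale waves. The grid size, frame placements and the Gaussian kernel below are illustrative stand-ins for the paper's actual projection and filter.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        grid = np.full((700, 900), np.nan)        # 1400 km x 1800 km at 2 km/px
        weight = np.zeros(grid.shape)

        def paste(mosaic, w, frame, r0, c0):
            """Accumulate one imager's projected frame into the mosaic."""
            r1, c1 = r0 + frame.shape[0], c0 + frame.shape[1]
            block = mosaic[r0:r1, c0:c1]
            block[np.isnan(block)] = 0.0
            mosaic[r0:r1, c0:c1] = block + frame
            w[r0:r1, c0:c1] += 1.0

        rng = np.random.default_rng(0)
        for r0, c0 in [(0, 0), (100, 300), (250, 500)]:   # overlapping imagers
            paste(grid, weight, rng.random((400, 400)), r0, c0)

        mosaic = np.where(weight > 0, grid / np.maximum(weight, 1.0), np.nan)
        large_scale = gaussian_filter(np.nan_to_num(mosaic), sigma=25)  # ~50 km cut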

  19. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009

  20. Scale interactions in a mixing layer – the role of the large-scale gradients

    KAUST Repository

    Fiscaletti, D.

    2016-02-15

    © 2016 Cambridge University Press. The interaction between the large and the small scales of turbulence is investigated in a mixing layer, at a Reynolds number based on the Taylor microscale of , via direct numerical simulations. The analysis is performed in physical space, and the local vorticity root-mean-square (r.m.s.) is taken as a measure of the small-scale activity. It is found that positive large-scale velocity fluctuations correspond to large vorticity r.m.s. on the low-speed side of the mixing layer, whereas they correspond to low vorticity r.m.s. on the high-speed side. The relationship between large and small scales thus depends on position if the vorticity r.m.s. is correlated with the large-scale velocity fluctuations. On the contrary, the correlation coefficient is nearly constant throughout the mixing layer and close to unity if the vorticity r.m.s. is correlated with the large-scale velocity gradients. Therefore, the small-scale activity appears closely related to large-scale gradients, while the correlation between the small-scale activity and the large-scale velocity fluctuations is shown to reflect a property of the large scales. Furthermore, the vorticities from unfiltered (small scales) and from low-pass filtered (large scales) velocity fields tend to be aligned when examined within vortical tubes. These results provide evidence for the so-called 'scale invariance' (Meneveau & Katz, Annu. Rev. Fluid Mech., vol. 32, 2000, pp. 1-32), and suggest that some of the large-scale characteristics are not lost at the small scales, at least at the Reynolds number achieved in the present simulation.
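
    The filtering-and-correlating procedure behind such an analysis is easy to sketch in one dimension: low-pass filter a signal to isolate the large scales, treat the residual as the small scales, and correlate a local measure of small-scale activity with the large-scale signal and its gradient. The synthetic signal and filter width below are arbitrary; this is a sketch of the mechanics, not the DNS analysis itself.

        import numpy as np
        from scipy.ndimage import gaussian_filter1d

        rng = np.random.default_rng(1)
        x = np.linspace(0.0, 100.0, 10_000)
        u = np.sin(0.3 * x) + 0.2 * rng.standard_normal(x.size)

        u_large = gaussian_filter1d(u, sigma=200)            # large-scale part
        u_small = u - u_large                                # small-scale residual
        activity = gaussian_filter1d(u_small**2, sigma=200)  # local small-scale energy
        dudx = np.gradient(u_large, x)                       # large-scale gradient

        print(np.corrcoef(activity, u_large)[0, 1])          # fluctuation correlation
        print(np.corrcoef(activity, dudx)[0, 1])             # gradient correlation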

  1. Large Scale Data Mining to Improve Usability of Data: An Intelligent Archive Testbed

    Science.gov (United States)

    Ramapriyan, Hampapuram; Isaac, David; Yang, Wenli; Morse, Steve

    2005-01-01

    Research in certain scientific disciplines - including Earth science, particle physics, and astrophysics - continually faces the challenge that the volume of data needed to perform valid scientific research can at times overwhelm even a sizable research community. The desire to improve utilization of this data gave rise to the Intelligent Archives project, which seeks to make data archives active participants in a knowledge-building system capable of discovering events or patterns that represent new information or knowledge. Data mining can automatically discover patterns and events, but it is generally viewed as unsuited for large-scale use in disciplines like Earth science that routinely involve very high data volumes. Dozens of research projects have shown promising uses of data mining in Earth science, but all of these are based on experiments with data subsets of a few gigabytes or less, rather than the terabytes or petabytes typically encountered in operational systems. To bridge this gap, the Intelligent Archives project is establishing a testbed with the goal of demonstrating the use of data mining techniques in an operationally relevant environment. This paper discusses the goals of the testbed and the design choices surrounding critical issues that arose during testbed implementation.

  2. Large-scale matrix-handling subroutines 'ATLAS'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Fujita, Keiichi; Matsuura, Toshihiko; Tahara, Nobuo

    1978-03-01

    Subroutine package 'ATLAS' has been developed for handling large-scale matrices. The package is composed of four kinds of subroutines: basic arithmetic routines, routines for solving linear simultaneous equations, routines for solving general eigenvalue problems, and utility routines. The subroutines are useful in large-scale plasma-fluid simulations. (auth.)
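
    The package itself predates modern libraries, but the core operations named in the abstract map directly onto today's sparse linear algebra tools. The sketch below, using SciPy on a small tridiagonal system, is purely illustrative of those operation types, not of the ATLAS interfaces.

        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        n = 2_000
        A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
        b = np.ones(n)

        x = spla.spsolve(A, b)                  # linear simultaneous equations

        # A few smallest eigenpairs of the generalized problem A v = lambda M v.
        M = sp.identity(n, format="csr")
        vals, vecs = spla.eigsh(A, k=4, M=M, which="SM")
        print(x[:3], vals)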

  3. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project, a large domestic solar heating system was built and a solar district heating system was modelled and simulated. The objectives were to improve the performance and reduce the costs of a large-scale solar heating system. As a result of the project, the benefit/cost ratio can be increased by 40% through dimensioning and optimising the system at the design stage. (orig.)

  4. Decisional role preferences, risk knowledge and information interests in patients with multiple sclerosis.

    Science.gov (United States)

    Heesen, Christoph; Kasper, Jürgen; Segal, Julia; Köpke, Sascha; Mühlhauser, Ingrid

    2004-12-01

    Shared decision making is increasingly recognized as the ideal model of patient-physician communication, especially in chronic diseases with partially effective treatments such as multiple sclerosis (MS). To evaluate prerequisite factors for this kind of decision making, we studied patients' decisional role preferences in medical decision making, knowledge of risks, information interests and the relations between these factors in MS. After conducting focus groups to generate hypotheses, 219 randomly selected patients from the MS Outpatient Clinic register (n = 1374) of the University Hospital Hamburg received mailed questionnaires on their knowledge of risks in MS, their perception of their own level of knowledge, information interests and role preferences. Most patients (79%) indicated that they preferred an active role in treatment decisions, giving the shared decision and the informed choice model the highest priority. MS risk knowledge was low, but questionnaire results depended on disease course, disease duration and ongoing immune therapy. Measured knowledge as well as perceived knowledge was only weakly correlated with preferences for active roles. Major information interests were related to symptom alleviation, diagnostic procedures and prognosis. Patients with MS claimed autonomous roles in their health care decisions. The weak correlation between knowledge and preferences for active roles implies that other factors largely influence role preferences.

  5. Zebrafish whole-adult-organism chemogenomics for large-scale predictive and discovery chemical biology.

    Directory of Open Access Journals (Sweden)

    Siew Hong Lam

    2008-07-01

    The ability to perform large-scale, expression-based chemogenomics on whole adult organisms, as in invertebrate models (worm and fly), is highly desirable for a vertebrate model, but its feasibility and potential has not been demonstrated. We performed expression-based chemogenomics on the whole adult organism of a vertebrate model, the zebrafish, and demonstrated its potential for large-scale predictive and discovery chemical biology. Focusing on two classes of compounds with wide implications for human health, polycyclic (halogenated) aromatic hydrocarbons [P(H)AHs] and estrogenic compounds (ECs), we generated robust prediction models that can discriminate compounds of the same class from those of different classes in two large independent experiments. The robust expression signatures led to the identification of biomarkers for potent aryl hydrocarbon receptor (AHR) and estrogen receptor (ER) agonists, respectively, and were validated in multiple targeted tissues. Knowledge-based data mining of human homologs of zebrafish genes revealed highly conserved chemical-induced biological responses/effects, health risks, and novel biological insights associated with AHR and ER that could be inferred to humans. Thus, our study presents an effective, high-throughput strategy of capturing molecular snapshots of chemical-induced biological states of a whole adult vertebrate that provides information on biomarkers of effects, deregulated signaling pathways, and possibly affected biological functions, perturbed physiological systems, and increased health risks. These findings place zebrafish in a strategic position to bridge the wide gap between cell-based and rodent models in chemogenomics research and applications, especially in preclinical drug discovery and toxicology.

  6. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    Birds are highly mobile organisms and there is increasing evidence that studies at large spatial scales are needed if we are to properly understand their population dynamics. While classical metapopulation models have rarely proved useful for birds, more general metapopulation ideas involving collections of populations interacting within spatially structured landscapes are highly relevant (Harrison, 1994). There is increasing interest in understanding patterns of synchrony, or lack of synchrony, between populations and the environmental and dispersal mechanisms that bring about these patterns (Paradis et al., 2000). To investigate these processes we need to measure abundance, demographic rates and dispersal at large spatial scales, in addition to gathering data on relevant environmental variables. There is an increasing realisation that conservation needs to address rapid declines of common and widespread species (they will not remain so if such trends continue) as well as the management of small populations that are at risk of extinction. While the knowledge needed to support the management of small populations can often be obtained from intensive studies in a few restricted areas, conservation of widespread species often requires information on population trends and processes measured at regional, national and continental scales (Baillie, 2001). While management prescriptions for widespread populations may initially be developed from a small number of local studies or experiments, there is an increasing need to understand how such results will scale up when applied across wider areas. There is also a vital role for monitoring at large spatial scales both in identifying such population declines and in assessing population recovery. Gathering data on avian abundance and demography at large spatial scales usually relies on the efforts of large numbers of skilled volunteers. Volunteer studies based on ringing (for example Constant Effort Sites [CES

  7. INFORMATION SOCIETY: the instrumental logic of access to information and knowledge

    Directory of Open Access Journals (Sweden)

    Vinicius Aleixo Gerbasi

    2017-06-01

    The idea of an information society cannot be dissociated from the socio-productive structure. In contemporary capitalism, the immaterial is a preponderant factor in the productive model, and the informational field has become essential to productive-economic, scientific and cultural processes. Information and knowledge are intrinsic factors in the productive reconfiguration of capitalism, in which rationalization and appropriation take shape through science and technology, innovation, and the appropriation of cooperation and social relations. For this reason, their operation and instrumentality are central to the creation of surplus value. This article presents a brief critical analysis of the term information society and highlights the historical-ideological plan from which it was elaborated. We introduce the concepts of "Knowledge Economy" and "Information Regime". The methodology used is bibliographic and exploratory. In conclusion, the article reflects on the importance of the dissemination of information and knowledge as a process of democratization. Alongside the operationalization of scientific information and the control methods that seek to capture information and knowledge to generate surplus value, information technologies and citizenship have placed on the horizon possible actions and capacities for transformation, however partial and unstable.

  8. Oral health knowledge and sources of information among male Saudi school children.

    Science.gov (United States)

    Wyne, Amjad H; Chohan, Arham N; Al-Dosari, Khalid; Al-Dokheil, Majed

    2004-06-01

    The purpose of the present study was to determine the oral health knowledge and sources of information of male Saudi school children. The required information was collected through a specially designed questionnaire. A total of 130 children completed the questionnaire, with a mean age of 13.3 (SD 1.9) years. There was no significant difference in oral health knowledge or sources of information in relation to age and educational level. Less than half (44.6%) of the children had actually heard about fluoride, and one-third (34.6%) correctly identified the action of fluoride as preventing tooth decay. Almost all (97.2%) of the children thought that sweets (chocolates/candies) could cause tooth decay. However, a large number of children (31.5%) were not aware of the cariogenic potential of soft drinks. More than half (53.1%) of the children reported that their dentist taught them how to brush properly. However, 11.5% of the children were not taught by anyone about proper tooth-brushing. A large number (40.0%) of children thought that one must visit the dentist only in case of pain in the teeth. Dentists were the most popular (61.5%) source of oral health information. It can be concluded that the children need further oral health education in the area of caries prevention, and that parents, schoolteachers and the media should be utilised to enhance their oral health knowledge.

  9. Practices and Strategies of Distributed Knowledge Collaboration

    Science.gov (United States)

    Kudaravalli, Srinivas

    2010-01-01

    Information technology is enabling large-scale, distributed collaboration across many different kinds of boundaries. Researchers have used the label "new organizational forms" to describe such collaborations and have suggested that they are better able to meet the demands of flexibility, speed and adaptability that characterize the knowledge economy.…

  10. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  11. Contribution of large scale coherence to wind turbine power: A large eddy simulation study in periodic wind farms

    Science.gov (United States)

    Chatterjee, Tanmoy; Peet, Yulia T.

    2018-03-01

    Length scales of eddies involved in the power generation of infinite wind farms are studied by analyzing the spectra of the turbulent flux of mean kinetic energy (MKE) from large eddy simulations (LES). Large-scale structures an order of magnitude bigger than the turbine rotor diameter (D) are shown to have a substantial contribution to wind power. Varying dynamics in the intermediate scales (D-10D) are also observed in a parametric study involving interturbine distances and hub heights of the turbines. Further insight into the eddies responsible for the power generation is provided by a scaling analysis of the two-dimensional premultiplied spectra of the MKE flux. The LES code is developed in a high Reynolds number near-wall modeling framework, using the open-source spectral element code Nek5000, and the wind turbines have been modelled using a state-of-the-art actuator line model. The LES of infinite wind farms has been validated against statistical results from the previous literature. The study is expected to improve our understanding of the complex multiscale dynamics in the domain of large wind farms and identify the length scales that contribute to the power. This information can be useful for the design of wind farm layouts and turbine placements that take advantage of the large-scale structures contributing to wind turbine power.
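
    A premultiplied spectrum, the diagnostic used above, weights the spectral density E(k) by the wavenumber k so that the area under the curve on a logarithmic k axis reflects each scale's contribution. The one-dimensional, synthetic-signal sketch below illustrates the computation; the domain size and signal are invented.

        import numpy as np

        L, n = 10_000.0, 2**14                          # domain length (m), samples
        x = np.linspace(0.0, L, n, endpoint=False)
        rng = np.random.default_rng(2)
        u = np.sin(2.0 * np.pi * x / 2_000.0) + 0.3 * rng.standard_normal(n)

        uhat = np.fft.rfft(u - u.mean())
        k = 2.0 * np.pi * np.fft.rfftfreq(n, d=L / n)   # wavenumber (rad/m)
        E = np.abs(uhat) ** 2 / n                       # (un-normalized) spectrum
        premultiplied = k * E                           # k E(k)

        k_peak = k[1:][np.argmax(premultiplied[1:])]
        print(f"dominant wavelength ~ {2.0 * np.pi / k_peak:.0f} m")  # ~2000 m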

  12. Qualitative approaches to large scale studies and students' achievements in Science and Mathematics - An Australian and Nordic Perspective

    DEFF Research Database (Denmark)

    Davidsson, Eva; Sørensen, Helene

    Large scale studies play an increasing role in educational politics, and results from surveys such as TIMSS and PISA are extensively used in media debates about students' knowledge in science and mathematics. Although this debate does not usually shed light on the more extensive quantitative analyses, there is a lack of investigations which aim at exploring what it is possible to conclude or not to conclude from these analyses. There is also a need for more detailed discussions about what trends can be discerned concerning students' knowledge in science and mathematics. The aim of this symposium is therefore to highlight and discuss different approaches to how data from large scale studies could be used for additional analyses in order to increase our understanding of students' knowledge in science and mathematics, but also to explore possible longitudinal trends hidden in the data material...

  13. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  14. Information Measures of Roughness of Knowledge and Rough Sets for Incomplete Information Systems

    Institute of Scientific and Technical Information of China (English)

    LIANG Ji-ye; QU Kai-she

    2001-01-01

    In this paper we address information measures of the roughness of knowledge and of rough sets for incomplete information systems. The definition of the rough entropy of knowledge and its important properties are given. In particular, the relationship between the rough entropy of knowledge and the Hartley measure of uncertainty is established. We show that the rough entropy of knowledge decreases monotonously as the granularity of information becomes smaller. This gives an information interpretation for the roughness of knowledge. Based on the rough entropy of knowledge and the roughness of a rough set, a definition of the rough entropy of a rough set is proposed, and we show that the rough entropy of a rough set decreases monotonously as the granularity of information becomes smaller. This gives a more accurate measure for the roughness of a rough set.
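
    The monotonicity claim is easy to see from a common formulation of rough entropy; the exact definition used in the paper may differ, so the following is a sketch of the idea rather than a quotation of it.

        % One common form of rough entropy for a partition U/R = {X_1, ..., X_n}
        % of a universe U (the paper's own definition may differ in detail).
        \[
          E(R) \;=\; -\sum_{i=1}^{n} \frac{|X_i|}{|U|}\,\log_2 \frac{1}{|X_i|}
                 \;=\; \sum_{i=1}^{n} \frac{|X_i|}{|U|}\,\log_2 |X_i|
        \]
        % Refining the partition shrinks every block X_i, so each term
        % log_2 |X_i| decreases; at the finest partition (all singletons)
        % E(R) = 0, its minimum, matching the monotonicity stated above.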

  15. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D.

  16. Large-scale exact diagonalizations reveal low-momentum scales of nuclei

    Science.gov (United States)

    Forssén, C.; Carlsson, B. D.; Johansson, H. T.; Sääf, D.; Bansal, A.; Hagen, G.; Papenbrock, T.

    2018-03-01

    Ab initio methods aim to solve the nuclear many-body problem with controlled approximations. Virtually exact numerical solutions for realistic interactions can only be obtained for certain special cases such as few-nucleon systems. Here we extend the reach of exact diagonalization methods to handle model spaces with dimension exceeding 10^10 on a single compute node. This allows us to perform no-core shell model (NCSM) calculations for 6Li in model spaces up to Nmax = 22 and to reveal the 4He+d halo structure of this nucleus. Still, the use of a finite harmonic-oscillator basis implies truncations in both infrared (IR) and ultraviolet (UV) length scales. These truncations impose finite-size corrections on observables computed in this basis. We perform IR extrapolations of energies and radii computed in the NCSM and with the coupled-cluster method at several fixed UV cutoffs. It is shown that this strategy enables information gain also from data that is not fully UV converged. IR extrapolations improve the accuracy of relevant bound-state observables for a range of UV cutoffs, thus making them profitable tools. We relate the momentum scale that governs the exponential IR convergence to the threshold energy for the first open decay channel. Using large-scale NCSM calculations we numerically verify this small-momentum scale of finite nuclei.
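
    For orientation, the exponential infrared-extrapolation ansatz standard in this line of work takes the schematic form below; the precise correction terms and the definition of the effective box size L vary between studies, so treat this as a sketch rather than the paper's exact expressions.

        % Schematic IR extrapolation of an energy computed in a finite
        % oscillator basis with effective box size L; E_infty, a_0 and
        % k_infty are fit parameters.
        \[
          E(L) \;=\; E_\infty + a_0\, e^{-2 k_\infty L},
          \qquad
          k_\infty \;\sim\; \frac{\sqrt{2\,\mu\, E_{\mathrm{thr}}}}{\hbar}
        \]
        % Here mu is the reduced mass and E_thr the threshold energy of the
        % first open decay channel, the relation the abstract refers to.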

  17. The assessment of the readiness of five countries to implement child maltreatment prevention programs on a large scale.

    Science.gov (United States)

    Mikton, Christopher; Power, Mick; Raleva, Marija; Makoae, Mokhantso; Al Eissa, Majid; Cheah, Irene; Cardia, Nancy; Choo, Claire; Almuneef, Maha

    2013-12-01

    This study aimed to systematically assess the readiness of five countries - Brazil, the Former Yugoslav Republic of Macedonia, Malaysia, Saudi Arabia, and South Africa - to implement evidence-based child maltreatment prevention programs on a large scale. To this end, it applied a recently developed method called Readiness Assessment for the Prevention of Child Maltreatment based on two parallel 100-item instruments. The first measures the knowledge, attitudes, and beliefs concerning child maltreatment prevention of key informants; the second, completed by child maltreatment prevention experts using all available data in the country, produces a more objective assessment of readiness. The instruments cover all of the main aspects of readiness including, for instance, availability of scientific data on the problem, legislation and policies, will to address the problem, and material resources. Key informant scores ranged from 31.2 (Brazil) to 45.8/100 (the Former Yugoslav Republic of Macedonia) and expert scores, from 35.2 (Brazil) to 56/100 (Malaysia). Major gaps identified in almost all countries included a lack of professionals with the skills, knowledge, and expertise to implement evidence-based child maltreatment programs and of institutions to train them; inadequate funding, infrastructure, and equipment; extreme rarity of outcome evaluations of prevention programs; and lack of national prevalence surveys of child maltreatment. In sum, the five countries are in a low to moderate state of readiness to implement evidence-based child maltreatment prevention programs on a large scale. Such an assessment of readiness - the first of its kind - allows gaps to be identified and then addressed to increase the likelihood of program success. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Local, distributed topology control for large-scale wireless ad-hoc networks

    NARCIS (Netherlands)

    Nieberg, T.; Hurink, Johann L.

    This document presents topology control of a large-scale wireless network by a distributed algorithm that uses only locally available information. Topology control algorithms adjust the transmission power of wireless nodes to create a desired topology. The algorithm, named local power
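
    To illustrate the flavour of such local rules (the record's description of the actual algorithm is cut off above, and it may differ), the sketch below has each node raise its transmission range just far enough to reach k nearest neighbours, using only positions it can observe locally, and keeps a link only when both endpoints can reach each other.

        import numpy as np

        rng = np.random.default_rng(1)
        pos = rng.uniform(0.0, 1_000.0, size=(200, 2))   # node positions (m)
        k = 6                                            # target neighbour count

        def local_range(i):
            """Smallest transmission range reaching k neighbours of node i."""
            d = np.linalg.norm(pos - pos[i], axis=1)
            return np.sort(d)[k]                         # d[0] == 0 is node i itself

        ranges = np.array([local_range(i) for i in range(len(pos))])

        # Symmetric topology: keep an edge only if both ends reach each other.
        links = [(i, j) for i in range(len(pos)) for j in range(i + 1, len(pos))
                 if np.linalg.norm(pos[i] - pos[j]) <= min(ranges[i], ranges[j])]
        print(len(links), "links, mean range", round(float(ranges.mean()), 1), "m")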

  19. Large-scale simulations with distributed computing: Asymptotic scaling of ballistic deposition

    International Nuclear Information System (INIS)

    Farnudi, Bahman; Vvedensky, Dimitri D

    2011-01-01

    Extensive kinetic Monte Carlo simulations are reported for ballistic deposition (BD) in (1 + 1) dimensions. The large system size L observed for the onset of asymptotic scaling (L ≅ 2^12) explains the widespread discrepancies in previous reports for exponents of BD in one and likely in higher dimensions. The exponents obtained directly from our simulations, α = 0.499 ± 0.004 and β = 0.336 ± 0.004, capture the exact values α = 1/2 and β = 1/3 for the one-dimensional Kardar-Parisi-Zhang equation. An analysis of our simulations suggests a criterion for identifying the onset of true asymptotic scaling, which enables a more informed evaluation of exponents for BD in higher dimensions. These simulations were made possible by the Simulation through Social Networking project at the Institute for Advanced Studies in Basic Sciences in 2007, which was re-launched in November 2010.
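
    A minimal (1+1)-dimensional ballistic-deposition simulation is only a few lines; the sketch below uses the standard sticking rule and estimates the growth exponent beta from the interface width. System size, sample counts and the fit window are arbitrary illustrative choices (the abstract's point is precisely that reliable exponents need L around 2^12 and careful analysis).

        import numpy as np

        rng = np.random.default_rng(42)
        L = 1024                                   # lattice size (illustrative)
        h = np.zeros(L, dtype=np.int64)            # interface heights

        def deposit(n):
            """Drop n particles with the BD sticking rule (periodic bounds)."""
            for c in rng.integers(0, L, size=n):
                h[c] = max(h[(c - 1) % L], h[c] + 1, h[(c + 1) % L])

        widths, layers = [], []
        for s in range(1, 201):
            deposit(10 * L)                        # 10 monolayers per sample
            widths.append(h.std())                 # interface width W(t)
            layers.append(10 * s)

        # Slope of log W vs log t in the growth regime estimates beta (~1/3).
        beta = np.polyfit(np.log(layers[:30]), np.log(widths[:30]), 1)[0]
        print(f"beta ~ {beta:.2f}")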

  20. Large-scale climatic anomalies affect marine predator foraging behaviour and demography

    Science.gov (United States)

    Bost, Charles A.; Cotté, Cedric; Terray, Pascal; Barbraud, Christophe; Bon, Cécile; Delord, Karine; Gimenez, Olivier; Handrich, Yves; Naito, Yasuhiko; Guinet, Christophe; Weimerskirch, Henri

    2015-10-01

    Determining the links between the behavioural and population responses of wild species to environmental variations is critical for understanding the impact of climate variability on ecosystems. Using long-term data sets, we show how large-scale climatic anomalies in the Southern Hemisphere affect the foraging behaviour and population dynamics of a key marine predator, the king penguin. When large-scale subtropical dipole events occur simultaneously in both subtropical Southern Indian and Atlantic Oceans, they generate tropical anomalies that shift the foraging zone southward. Consequently the distances that penguins foraged from the colony and their feeding depths increased and the population size decreased. This represents an example of a robust and fast impact of large-scale climatic anomalies affecting a marine predator through changes in its at-sea behaviour and demography, despite lack of information on prey availability. Our results highlight a possible behavioural mechanism through which climate variability may affect population processes.

  1. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large-scale model testing performed using the large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. Results are described from tests of material resistance to (non-ductile) fracture, covering both base materials and welded joints. The rated specimen thickness was 150 mm, with defects of a depth between 15 and 100 mm. Results are also presented for nozzles of 850 mm inner diameter at a scale of 1:3; static, cyclic, and dynamic tests were performed without and with surface defects (15, 30 and 45 mm deep). During the cyclic tests, the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  2. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large scale dimensional metrology Enables practitioners to study distributed large scale dimensional metrology independently Includes specific examples of the development of new system prototypes

  3. Design of a Large-scale Three-dimensional Flexible Arrayed Tactile Sensor

    Directory of Open Access Journals (Sweden)

    Junxiang Ding

    2011-01-01

    This paper proposes a new type of large-scale three-dimensional flexible arrayed tactile sensor based on conductive rubber. It can detect three-dimensional force information on the continuous surface of the sensor, which realizes a true skin-type tactile sensor. The widely used method of liquid rubber injection molding (LIMS) is used for "overall injection molding" sample preparation. Structural details of the staggered nodes and a new decoupling algorithm for force analysis are given. Simulation results show that a sensor based on this structure can achieve flexible measurement for large-scale 3-D tactile sensor arrays.

  4. A Very Large Area Network (VLAN) knowledge-base applied to space communication problems

    Science.gov (United States)

    Zander, Carol S.

    1988-01-01

    This paper first describes a hierarchical model for very large area networks (VLAN). Space communication problems whose solution could profit from the model are discussed, and then an enhanced version of this model incorporating the knowledge needed for the missile detection-destruction problem is presented. A satellite network or VLAN is a network which includes at least one satellite. Due to the complexity, a compromise between fully centralized and fully distributed network management has been adopted. Network nodes are assigned to a physically localized group, called a partition. Partitions consist of groups of cell nodes, with one cell node acting as the organizer or master, called the Group Master (GM). Coordinating the group masters is a Partition Master (PM). Knowledge is also distributed hierarchically, existing in at least two nodes. Each satellite node has a back-up earth node. Knowledge must be distributed in such a way as to minimize information loss when a node fails. Thus the model is hierarchical both physically and informationally.

  5. Integration of an OWL-DL knowledge base with an EHR prototype and providing customized information.

    Science.gov (United States)

    Jing, Xia; Kay, Stephen; Marley, Tom; Hardiker, Nicholas R

    2014-09-01

    When clinicians use electronic health record (EHR) systems, their ability to obtain general knowledge is often an important contribution to their ability to make more informed decisions. In this paper we describe a method by which an external, formal representation of clinical and molecular genetic knowledge can be integrated into an EHR such that customized knowledge can be delivered to clinicians in a context-appropriate manner. Web Ontology Language-Description Logic (OWL-DL) is a formal knowledge representation language that is widely used for creating, organizing and managing biomedical knowledge through the use of explicit definitions, consistent structure and a computer-processable format, particularly in biomedical fields. In this paper we describe: 1) integration of an OWL-DL knowledge base with a standards-based EHR prototype, 2) presentation of customized information from the knowledge base via the EHR interface, and 3) lessons learned via the process. The integration was achieved through a combination of manual and automatic methods. Our method has advantages for scaling up to and maintaining knowledge bases of any size, with the goal of assisting clinicians and other EHR users in making better informed health care decisions.
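
    The kind of lookup involved can be sketched with a small RDF graph and a SPARQL query: given a concept from the patient's record, fetch the knowledge attached to it so the EHR layer can render it in context. The tiny vocabulary and triples below are invented, and rdflib's query engine stands in for a full OWL-DL reasoner.

        import rdflib

        g = rdflib.Graph()
        g.parse(data="""
            @prefix ex: <http://example.org/kb#> .
            ex:LongQT a ex:Condition ;
                ex:associatedGene ex:KCNQ1 ;
                ex:note "Avoid QT-prolonging drugs." .
        """, format="turtle")

        query = """
            PREFIX ex: <http://example.org/kb#>
            SELECT ?gene ?note WHERE {
                ex:LongQT ex:associatedGene ?gene ;
                          ex:note ?note .
            }"""
        for gene, note in g.query(query):
            print(gene, "-", note)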

  6. Knowledge of attention deficit hyperactivity disorder and its associated factors among teachers in 3 large primary schools in Phra Nakorn Sri Ayutthaya Province, Thailand.

    Science.gov (United States)

    Muanprasart, Pongchanok; Traivaree, Chanchai; Arunyanart, Wirongrong; Teeranate, Chakriya

    2014-02-01

    Though attention deficit hyperactivity disorder (ADHD) is a common problem in childhood, Thai teachers' knowledge regarding the disorder has never been assessed. To identify the knowledge of Thai teachers regarding ADHD and its influencing factors, a cross-sectional study was conducted in three primary schools in Ayutthaya, Thailand. Standardized questionnaires comprising demographic data, ADHD experiences and the Knowledge of Attention Deficit Disorder Scale (KADDS) were distributed to participating teachers. Results were reported using frequency, percent, mean, and standard deviation. Associations between demographic factors, ADHD experiences and KADDS scores were identified by logistic regression analysis. A lack of knowledge of ADHD among teachers was apparent: only 19.4% of them passed the total scale of the KADDS. Teachers under 31 years old were more likely to pass the general information and the signs, symptoms & diagnosis subscales, as well as the total scale. In addition, familiarity with ADHD patients was associated with passing scores on the general information subscale and the total scale. Despite public awareness of ADHD, Thai teachers lacked knowledge concerning the disorder. Young teachers were more acquainted with ADHD. Direct experience with ADHD patients might help teachers develop their knowledge of ADHD.

  7. KNOWLEDGE MANAGEMENT AND INFORMATION SYSTEMS IN ORGANIZATIONS

    Directory of Open Access Journals (Sweden)

    Karla Torres

    2015-11-01

    At present, knowledge and information are considered vital resources for organizations, and some organizations have realized that the creation, transfer and management of knowledge are essential for success. This paper aims to demonstrate knowledge management as a transformative power for organizations using information systems, addressing the study from an interpretive perspective with the use of the hermeneutical method in a theoretical, documentary context. It is concluded that people, living in the changing environment characteristic of globalized companies and motivated by those same changes, have accelerated the generation and acquisition of new knowledge and innovative capabilities, helping organizations achieve competitive positions with the help of information systems.

  8. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele; Attili, Antonio; Bisetti, Fabrizio; Elsinga, Gerrit E.

    2015-01-01

    From physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the modulation of the small scales by the large-scale gradients has been additionally investigated.

  9. Knowledge information management toolkit and method

    Science.gov (United States)

    Hempstead, Antoinette R.; Brown, Kenneth L.

    2006-08-15

    A system is provided for managing user entry and/or modification of knowledge information into a knowledge base file having an integrator support component and a data source access support component. The system includes processing circuitry, memory, a user interface, and a knowledge base toolkit. The memory communicates with the processing circuitry and is configured to store at least one knowledge base. The user interface communicates with the processing circuitry and is configured for user entry and/or modification of knowledge pieces within a knowledge base. The knowledge base toolkit is configured for converting knowledge in at least one knowledge base from a first knowledge base form into a second knowledge base form. A method is also provided.

  10. How Did the Information Flow in the #AlphaGo Hashtag Network? A Social Network Analysis of the Large-Scale Information Network on Twitter.

    Science.gov (United States)

    Kim, Jinyoung

    2017-12-01

    As it becomes common for Internet users to use hashtags when posting and searching for information on social media, it is important to understand who builds a hashtag network and how information is circulated within the network. This article focused on unlocking the potential of the #AlphaGo hashtag network by addressing the following questions. First, the current study examined whether traditional opinion leadership (i.e., the influentials hypothesis) or grassroots participation by the public (i.e., the interpersonal hypothesis) drove the dissemination of information in the hashtag network. Second, several unique patterns of information distribution by key users were identified. Finally, the association between attributes of key users who exerted great influence on information distribution (i.e., the number of followers and follows) and their central status in the network was tested. To answer these research questions, a social network analysis was conducted using a large-scale hashtag network data set from Twitter (n = 21,870). The results showed that the leading actors in the network were actively receiving information from their followers rather than serving as intermediaries between the original information sources and the public. Moreover, the leading actors played several roles (i.e., conversation starters, influencers, and active engagers) in the network. Furthermore, the numbers of their follows and followers were significantly associated with their central status in the hashtag network. Based on the results, the current research explained how information was exchanged in the hashtag network by proposing the reciprocal model of information flow.
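
    A sketch of this kind of analysis: build a directed graph from who-mentions-whom (or who-retweets-whom) pairs and rank users with standard centrality measures to separate roles such as conversation starters, influencers, and brokers. The edge list and the role mapping in the comments are invented; only the NetworkX calls are standard.

        import networkx as nx

        edges = [                                  # (source, target): who -> whom
            ("user_a", "news_bot"), ("user_b", "news_bot"),
            ("user_c", "user_a"), ("news_bot", "user_a"),
            ("user_d", "user_a"), ("user_a", "user_c"),
        ]
        G = nx.DiGraph(edges)

        in_deg = nx.in_degree_centrality(G)        # receives attention
        out_deg = nx.out_degree_centrality(G)      # actively engages
        between = nx.betweenness_centrality(G)     # brokers information flow

        for u in G:
            print(f"{u:10s} in={in_deg[u]:.2f} out={out_deg[u]:.2f} "
                  f"btw={between[u]:.2f}")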

  11. Knowledge acquisition process as an issue in information sciences

    Directory of Open Access Journals (Sweden)

    Boris Bosančić

    2016-07-01

    The paper presents an overview of some problems of information science which are explicitly portrayed in the literature. It covers the following issues: the information explosion, information flood and data deluge; information retrieval and the relevance of information; and, finally, the problem of scientific communication. The purpose of this paper is to explain why knowledge acquisition can be considered an issue in the information sciences. The existing theoretical foundation within the information sciences, i.e. the DIKW hierarchy and its key concepts - data, information, knowledge and wisdom - is recognized as a symbolic representation as well as the theoretical foundation of the knowledge acquisition process. Moreover, it seems that the relationship between the DIKW hierarchy and the knowledge acquisition process is essential for a stronger foundation of the information sciences in the 'body' of overall human knowledge. In addition, the history of both human and machine knowledge acquisition is considered, as well as a proposal that the DIKW hierarchy serve as a symbol of the general knowledge acquisition process, which could relate equally to both human and machine knowledge acquisition. To achieve this goal, it is necessary to modify the existing concept of the DIKW hierarchy. An appropriate modification of the DIKW hierarchy (one of which is presented in this paper) could result in a much more solid theoretical foundation of the knowledge acquisition process and of the information sciences as a whole. The theoretical assumptions on which the knowledge acquisition process may be established as a problem of information science are presented at the end of the paper. The knowledge acquisition process does not necessarily have to be the subject of epistemology. It may establish a stronger link between the concepts of data and knowledge; furthermore, it can be used in the context of scientific research, but on a more primitive level than conducting

  12. Results of Large-Scale Spacecraft Flammability Tests

    Science.gov (United States)

    Ferkul, Paul; Olson, Sandra; Urban, David L.; Ruff, Gary A.; Easton, John; T'ien, James S.; Liao, Ta-Ting T.; Fernandez-Pello, A. Carlos; Torero, Jose L.; Eigenbrand, Christian

    2017-01-01

    For the first time, a large-scale fire was intentionally set inside a spacecraft while in orbit. Testing in low gravity aboard spacecraft had been limited to samples of modest size: for thin fuels the longest samples burned were around 15 cm in length, and thick fuel samples have been even smaller. This is despite the fact that fire is a catastrophic hazard for spaceflight and the spread and growth of a fire, combined with its interactions with the vehicle, cannot be expected to scale linearly. While every type of occupied structure on earth has been the subject of full-scale fire testing, this had never been attempted in space owing to the complexity, cost, risk and absence of a safe location. Thus, there is a gap in knowledge of fire behavior in spacecraft. The recent utilization of large, unmanned resupply craft has provided the needed capability: a habitable but unoccupied spacecraft in low earth orbit. One such vehicle was used to study the flame spread over a 94 x 40.6 cm thin charring solid (fiberglass-cotton fabric). The sample was an order of magnitude larger than anything studied to date in microgravity and was of sufficient scale that it consumed 1.5% of the available oxygen. The experiment, called Saffire, consisted of two tests: forward or concurrent flame spread (with the direction of flow) and opposed flame spread (against the direction of flow). The average forced air speed was 20 cm/s. For the concurrent flame spread test, the flame size remained constrained after the ignition transient, which is not the case in 1-g. These results were qualitatively different from those on earth, where an upward-spreading flame on a sample of this size accelerates and grows. In addition, a curious effect of the chamber size is noted. Compared to previous microgravity work in smaller tunnels, the flame in the larger tunnel spread more slowly, even for a wider sample. This is attributed to the effect of flow acceleration in the smaller tunnels as a result of hot

  13. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost in particular limits its use. As computer models have grown in size (e.g., in the number of degrees of freedom), the advent of computer graphics has made possible very realistic representations of results - results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and the underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  14. Information Seek and Retrieval in Knowledge Management

    International Nuclear Information System (INIS)

    Maximov, N.; Pryakhin, A.; Golitsyna, O.; Kupriyanov, V.

    2016-01-01

    Information search is considered as a complex, self-consistent process of constructing new knowledge, where knowledge is understood as information related to context (specific circumstances). The operational space of such an environment includes documentary components (implicit knowledge) and conceptual and terminological systems (glossaries, thesauri, and ontologies) as tools of the cognitive process and semantic context. In the process of information search, context is injected by using pre-coordinated linguistic structures (taxonomies, application-domain dictionaries), which provide an adequate image for well-defined information, and by a cognitive tree taxonomy for new information needs, dynamically formed for each project or point of view in the search task. A node of this structure can have as properties both information (documents, queries, references to associated resources) and meta-information (application-domain local dictionaries, corresponding parts of classifications, subject headings, thesauri, ontologies), and, in addition, the results of analytical processing. (author)

  15. Using Large Scale Test Results for Pedagogical Purposes

    DEFF Research Database (Denmark)

    Dolin, Jens

    2012-01-01

    The use and influence of large scale tests (LST), both national and international, has increased dramatically within the last decade. This process has revealed a tension between the legitimate need for information about the performance of the educational system and its teachers to inform policy, and the teachers' and students' use of this information for pedagogical purposes in the classroom. We know well how policy makers interpret and use the outcomes of such tests, but we know less about how teachers make use of LSTs to inform their pedagogical practice. An important question is whether there is a contradiction between the political system's use of LST and teachers' (possible) pedagogical use of LST. And if so: what is this contradiction based on? This presentation will give some results from a systematic review of how tests have influenced pedagogical practice. The research revealed many of the fatal...

  16. Hydrogen combustion modelling in large-scale geometries

    International Nuclear Information System (INIS)

    Studer, E.; Beccantini, A.; Kudriakov, S.; Velikorodny, A.

    2014-01-01

    Hydrogen risk mitigation based on catalytic recombiners cannot exclude the formation of flammable clouds during the course of a severe accident in a Nuclear Power Plant. The consequences of combustion processes have to be assessed based on existing knowledge and the state of the art in CFD combustion modelling. The Fukushima accidents have also revealed the need to take hydrogen explosion phenomena into account in risk management. Thus combustion modelling in large-scale geometries is one of the remaining severe accident safety issues. At present, no combustion model exists that can accurately describe a combustion process inside a geometrical configuration typical of the Nuclear Power Plant (NPP) environment. The major attention in model development therefore has to be paid to the adoption of existing approaches, or the creation of new ones, capable of reliably predicting the possibility of flame acceleration in geometries of that type. A set of experiments performed previously in the RUT facility and the Heiss Dampf Reactor (HDR) facility is used as a validation database for the development of a three-dimensional gas dynamic model for the simulation of hydrogen-air-steam combustion in large-scale geometries. The combustion regimes include slow deflagration, fast deflagration, and detonation. Modelling is based on the Reactive Discrete Equation Method (RDEM), where the flame is represented as an interface separating reactants and combustion products. The transport of the progress variable is governed by different flame surface wrinkling factors. The results of numerical simulation are presented together with comparisons, critical discussions and conclusions. (authors)
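
    For orientation, flame-interface methods of this family transport a progress variable c (0 in reactants, 1 in products); a generic form is sketched below. The flame-speed closure with a wrinkling factor Ξ is a common stand-in and is not claimed to be the exact RDEM formulation.

        % Schematic progress-variable transport for a flame treated as an
        % interface; rho_u is the unburnt density, S_L the laminar flame
        % speed and Xi a flame-surface wrinkling factor.
        \[
          \frac{\partial (\rho c)}{\partial t}
          + \nabla \cdot (\rho \mathbf{u}\, c)
          \;=\; \rho_u\, \Xi\, S_L\, \lvert \nabla c \rvert
        \]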

  17. Complex Formation Control of Large-Scale Intelligent Autonomous Vehicles

    Directory of Open Access Journals (Sweden)

    Ming Lei

    2012-01-01

    A new formation framework for large-scale intelligent autonomous vehicles is developed, which can realize complex formations while reducing data exchange. Using the proposed hierarchical formation method and an automatic dividing algorithm, vehicles are automatically divided into leaders and followers by exchanging information via a wireless network at the initial time. Then, leaders form the formation's geometric shape using global formation information, and followers track their own virtual leaders to form line formations using local information. The formation control laws of leaders and followers are designed based on consensus algorithms. Moreover, collision-avoidance problems are considered and solved using artificial potential functions. Finally, a simulation example consisting of 25 vehicles shows the effectiveness of the theory.
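
    The consensus idea underlying such control laws can be sketched with single-integrator dynamics: each vehicle nudges its formation error toward its neighbours' errors, so all errors agree and the fixed offsets produce the shape. The ring topology, gains and circular formation below are invented for illustration and omit the paper's leader/follower split and collision-avoidance terms.

        import numpy as np

        rng = np.random.default_rng(7)
        n, dt = 25, 0.05
        x = rng.uniform(-50.0, 50.0, size=(n, 2))     # vehicle positions

        # Desired offsets on a circle (the formation's geometric shape).
        theta = 2.0 * np.pi * np.arange(n) / n
        offsets = 30.0 * np.column_stack([np.cos(theta), np.sin(theta)])

        # Ring communication topology: each vehicle hears two neighbours.
        nbrs = [((i - 1) % n, (i + 1) % n) for i in range(n)]

        for _ in range(2_000):
            e = x - offsets                           # formation-error variables
            x = x + dt * np.array([(e[a] - e[i]) + (e[b] - e[i])
                                   for i, (a, b) in enumerate(nbrs)])

        print(np.std(x - offsets, axis=0))            # ~0: errors agree -> formation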

  18. Large scale tracking algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Ross L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Joshua Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Melgaard, David Kennett [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Karelitz, David B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pitts, Todd Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zollweg, Joshua David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Dylan Z. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nandy, Prabal [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Whitlow, Gary L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bender, Daniel A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Byrne, Raymond Harry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination but, unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low-resolution sensors, "blob" tracking is the norm. For higher-resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring, closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.

  19. Building Participation in Large-scale Conservation: Lessons from Belize and Panama

    Directory of Open Access Journals (Sweden)

    Jesse Guite Hastings

    2015-01-01

    Motivated by biogeography and a desire for alignment with the funding priorities of donors, the twenty-first century has seen big international NGOs shifting towards a large-scale conservation approach. This shift has meant that even before stakeholders at the national and local scale are involved, conservation programmes often have their objectives defined and funding allocated. This paper uses the experiences of Conservation International's Marine Management Area Science (MMAS) programme in Belize and Panama to explore how to build participation at the national and local scale while working within the bounds of the current conservation paradigm. Qualitative data about MMAS were gathered through a multi-sited ethnographic research process, utilising document review, direct observation, and semi-structured interviews with 82 informants in Belize, Panama, and the United States of America. Results indicate that while a large-scale approach to conservation disadvantages early national and local stakeholder participation, this effect can be mediated through focusing engagement efforts, paying attention to context, building horizontal and vertical partnerships, and using deliberative processes that promote learning. While explicit consideration of geopolitics and local complexity alongside biogeography in the planning phase of a large-scale conservation programme is ideal, actions taken by programme managers during implementation can still have a substantial impact on conservation outcomes.

  20. Mapping spatial patterns of denitrifiers at large scales (Invited)

    Science.gov (United States)

    Philippot, L.; Ramette, A.; Saby, N.; Bru, D.; Dequiedt, S.; Ranjard, L.; Jolivet, C.; Arrouays, D.

    2010-12-01

    Little information is available regarding the landscape-scale distribution of microbial communities and its environmental determinants. Here we combined molecular approaches and geostatistical modeling to explore spatial patterns of the denitrifying community at large scales. The distribution of the denitrifying community was investigated over 107 sites in Burgundy, a 31,500 km2 region of France, using a 16 × 16 km sampling grid. At each sampling site, the abundances of denitrifiers and 42 soil physico-chemical properties were measured. The relative contributions of land use, spatial distance, climatic conditions, time and soil physico-chemical properties to the denitrifier spatial distribution were analyzed by canonical variation partitioning. Our results indicate that 43% to 85% of the spatial variation in community abundances could be explained by the measured environmental parameters, with soil chemical properties (mostly pH) being the main driver. We found spatial autocorrelation up to 739 km and used geostatistical modelling to generate predictive maps of the distribution of denitrifiers at the landscape scale. Studying the distribution of denitrifiers at large scale can help close the artificial gap between the investigation of microbial processes and microbial community ecology, therefore facilitating our understanding of the relationships between the ecology of denitrifiers and N-fluxes by denitrification.
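
    The interpolation step of such a study can be sketched with ordinary kriging. The snippet below is a hedged illustration using the pykrige package on synthetic site data; the coordinates, abundances and variogram model are all assumptions, not the study's data or code.

    ```python
    import numpy as np
    from pykrige.ok import OrdinaryKriging

    rng = np.random.default_rng(0)
    # Hypothetical site coordinates (km) and log gene abundances at 107 sites.
    x = rng.uniform(0, 200, 107)
    y = rng.uniform(0, 160, 107)
    z = 0.01 * x + 0.02 * y + rng.normal(0, 0.5, 107)

    # Fit a spherical variogram and krige onto a regular prediction grid.
    ok = OrdinaryKriging(x, y, z, variogram_model="spherical")
    gridx = np.arange(0.0, 200.0, 10.0)
    gridy = np.arange(0.0, 160.0, 10.0)
    z_pred, z_var = ok.execute("grid", gridx, gridy)
    print(z_pred.shape, z_var.shape)  # (len(gridy), len(gridx)) prediction maps
    ```

    Kriging also returns the prediction variance alongside the prediction itself, which is what makes uncertainty-aware predictive maps of the community distribution possible.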

  1. Large-scale road safety programmes in low- and middle-income countries: an opportunity to generate evidence.

    Science.gov (United States)

    Hyder, Adnan A; Allen, Katharine A; Peters, David H; Chandran, Aruna; Bishai, David

    2013-01-01

    The growing burden of road traffic injuries, which kill over 1.2 million people yearly, falls mostly on low- and middle-income countries (LMICs). Despite this, evidence generation on the effectiveness of road safety interventions in LMIC settings remains scarce. This paper explores a scientific approach for evaluating road safety programmes in LMICs and introduces such a road safety multi-country initiative, the Road Safety in 10 Countries Project (RS-10). By building on existing evaluation frameworks, we develop a scientific approach for evaluating large-scale road safety programmes in LMIC settings. This also draws on '13 lessons' of large-scale programme evaluation: defining the evaluation scope; selecting study sites; maintaining objectivity; developing an impact model; utilising multiple data sources; using multiple analytic techniques; maximising external validity; ensuring an appropriate time frame; the importance of flexibility and a stepwise approach; continuous monitoring; providing feedback to implementers and policy-makers; promoting the uptake of evaluation results; and understanding evaluation costs. The use of relatively new approaches for the evaluation of real-world programmes allows for the production of relevant knowledge. The RS-10 project affords an important opportunity to scientifically test these approaches for a real-world, large-scale road safety evaluation and generate new knowledge for the field of road safety.

  2. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  3. Managing knowledge and information on nuclear safety

    International Nuclear Information System (INIS)

    Hahn, L.

    2005-01-01

    Described is the management of nuclear safety knowledge through education networks, knowledge pools, and the sharing, archiving and distribution of knowledge and information. The system used at the Gesellschaft für Anlagen- und Reaktorsicherheit (GRS) is demonstrated.

  4. Restoring large-scale brain networks in PTSD and related disorders: a proposal for neuroscientifically-informed treatment interventions

    Directory of Open Access Journals (Sweden)

    Ruth A. Lanius

    2015-03-01

    Background: Three intrinsic connectivity networks in the brain, namely the central executive, salience, and default mode networks, have been identified as crucial to the understanding of higher cognitive functioning, and the functioning of these networks has been suggested to be impaired in psychopathology, including posttraumatic stress disorder (PTSD). Objective: (1) To describe three main large-scale networks of the human brain; (2) to discuss the functioning of these neural networks in PTSD and related symptoms; and (3) to offer hypotheses for neuroscientifically-informed interventions based on treating the abnormalities observed in these neural networks in PTSD and related disorders. Method: Literature relevant to this commentary was reviewed. Results: Increasing evidence for altered functioning of the central executive, salience, and default mode networks in PTSD has been demonstrated. We suggest that each network is associated with specific clinical symptoms observed in PTSD, including cognitive dysfunction (central executive network), increased and decreased arousal/interoception (salience network), and an altered sense of self (default mode network). Specific testable neuroscientifically-informed treatments aimed to restore each of these neural networks and related clinical dysfunction are proposed. Conclusions: Neuroscientifically-informed treatment interventions will be essential to future research agendas aimed at targeting specific PTSD and related symptoms.

  5. The combustion behavior of large scale lithium titanate battery

    Science.gov (United States)

    Huang, Peifeng; Wang, Qingsong; Li, Ke; Ping, Ping; Sun, Jinhua

    2015-01-01

    Safety remains a major obstacle to the large-scale application of lithium batteries, yet knowledge of battery combustion behavior is limited. To investigate the combustion behavior of large scale lithium batteries, three 50 Ah Li(NixCoyMnz)O2/Li4Ti5O12 batteries at different states of charge (SOC) were heated until they ignited. The variation in flame size is described to characterize the combustion behavior directly, while the mass loss rate, temperature and heat release rate are used to analyze the underlying reactions in more depth. Based on these observations, the combustion process is divided into three basic stages, becoming more complicated at higher SOC, with sudden ejections of smoke. The reason is that a phase change occurs in the Li(NixCoyMnz)O2 material from a layered structure to a spinel structure. The critical ignition temperatures for all cells are 112-121°C on the anode tab and 139-147°C on the upper surface, but the heating time and combustion time become shorter with increasing SOC. The results indicate that the battery fire hazard increases with SOC; internal short circuits and the Li+ distribution are identified as the main causes of the difference. PMID:25586064

  6. Advances in Large-Scale Solar Heating and Long Term Storage in Denmark

    DEFF Research Database (Denmark)

    Heller, Alfred

    2000-01-01

    According to information from the European Large-Scale Solar Heating Network (see http://www.hvac.chalmers.se/cshp/), the area of installed solar collectors for large-scale applications in Europe is approximately 8 million m2, corresponding to about 4000 MW thermal power. Eleven of the total 51 plants are equipped with long-term storage. In Denmark, 7 plants are installed, comprising approx. 18,000 m2 of collector area, with new plants planned. The development of these plants and the technologies involved are presented in this paper, with a focus on the improvements for Danish plants over the last 10 years; the cost per collector area for the final installed plant has been kept constant even as the solar production has increased. Unfortunately, large-scale seasonal storage was not able to keep up with the advances in solar technology, at least for pit water and gravel storage...

  7. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The subject of large scale networks is approached from the perspective of the network planner. An analysis of the long term planning problems is presented, with the main focus on the changing requirements for large scale networks and the potential problems in meeting these requirements... the fundamental technological resources in network technologies are analysed for scalability; here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society, with the DDN project as its...

  8. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed at the first workshop on energy management for large-scale scientific infrastructures, held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need to address energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  9. Information Society and Knowledge Economy - Essence and Key Relationships

    Directory of Open Access Journals (Sweden)

    Rafał Żelazny

    2015-04-01

    This paper focuses on the essence of and relationships between the information society (IS) and knowledge economy (KE) concepts. The aim of this article is twofold. The first objective is to denominate the conceptual framework and relationships between the IS and KE conceptions. The second is to present dependencies between the indexes of IS and KE development level in selected countries. Firstly, based on the notional relations between information and knowledge, the relationships between the concepts of information society, knowledge economy and knowledge society (KS) are characterized. Secondly, using popular composite indexes evaluating the degree of IS and KE development, i.e. the Networked Readiness Index (NRI), ICT Development Index (IDI), Knowledge Economy Index (KEI) and Summary Innovation Index (SII), correlations between information society and knowledge economy were studied in 34 selected countries in 2012. The paper concludes by stating limits and implications for further research. This work contributes to the systematization and integration of knowledge about the mutually permeable conceptions of information society and knowledge economy.

  10. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test, with a view to ensuring the safety of light water reactors, was started in fiscal 1976 based on the special account act for power source development promotion measures, under entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents by joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April 1980, and the large-scale reflood test is now included in this program. It consists of two tests: one using a cylindrical core testing apparatus for examining the overall system effect, and one using a plate core testing apparatus for testing individual effects. Each apparatus is composed of mock-ups of the pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  11. Large Scale Cosmological Anomalies and Inhomogeneous Dark Energy

    Directory of Open Access Journals (Sweden)

    Leandros Perivolaropoulos

    2014-01-01

    A wide range of large scale observations hint towards possible modifications of the standard cosmological model, which is based on a homogeneous and isotropic universe with a small cosmological constant and matter. These observations, also known as “cosmic anomalies”, include unexpected Cosmic Microwave Background perturbations on large angular scales, large dipolar peculiar velocity flows of galaxies (“bulk flows”), the measurement of inhomogeneous values of the fine structure constant on cosmological scales (“alpha dipole”), and other effects. The presence of the observational anomalies could either be a large statistical fluctuation in the context of ΛCDM or it could indicate a non-trivial departure from the cosmological principle on Hubble scales. Such a departure is very much constrained by cosmological observations for matter. For dark energy, however, there are no significant observational constraints for Hubble scale inhomogeneities. In this brief review I discuss some of the theoretical models that can naturally lead to inhomogeneous dark energy, their observational constraints and their potential to explain the large scale cosmic anomalies.

  12. Large-scale patterns in Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Hardenberg, J. von; Parodi, A.; Passoni, G.; Provenzale, A.; Spiegel, E.A.

    2008-01-01

    Rayleigh-Benard convection at large Rayleigh number is characterized by the presence of intense, vertically moving plumes. Both laboratory and numerical experiments reveal that the rising and descending plumes aggregate into separate clusters so as to produce large-scale updrafts and downdrafts. The horizontal scales of the aggregates reported so far have been comparable to the horizontal extent of the containers, but it has not been clear whether that represents a limitation imposed by domain size. In this work, we present numerical simulations of convection at sufficiently large aspect ratio to ascertain whether there is an intrinsic saturation scale for the clustering process when that ratio is large enough. From a series of simulations of Rayleigh-Benard convection with Rayleigh numbers between 10^5 and 10^8 and with aspect ratios up to 12π, we conclude that the clustering process has a finite horizontal saturation scale with at most a weak dependence on Rayleigh number in the range studied.

  13. USING A KNOWLEDGE MANAGEMENT MODEL AS A FRAMEWORK FOR ADVANCEMENT OF SMALL-SCALE ECOTOURISM ENTREPRENEURSHIP IN JAMAICA

    Directory of Open Access Journals (Sweden)

    Dawn H. PEARCY

    2010-06-01

    The Caribbean island of Jamaica relies heavily upon tourism to support its economy. Despite the influx of significant tourism revenue, large numbers of Jamaica's indigenous people still face substantial economic hardships. This paper examines the potential for Jamaica to expand small-scale ecotourism entrepreneurship in order to improve the economic situation of larger numbers of its people. This analysis is conducted within a knowledge management framework, with particular emphasis placed upon the involvement of a wide array of stakeholders. The overall premise is that successful small-scale ecotourism entrepreneurship will rely on Jamaica's effective use of both its natural surroundings and its knowledge base as key assets.

  14. Incipient multiple fault diagnosis in real time with applications to large-scale systems

    International Nuclear Information System (INIS)

    Chung, H.Y.; Bien, Z.; Park, J.H.; Seon, P.H.

    1994-01-01

    By using a modified signed directed graph (SDG) together with distributed artificial neural networks and a knowledge-based system, a method of incipient multi-fault diagnosis is presented for large-scale physical systems with complex pipes and instrumentation such as valves, actuators, sensors, and controllers. The proposed method is designed to (1) make real-time incipient fault diagnosis possible for large-scale systems; (2) perform fault diagnosis not only in the steady-state case but also in transient cases, using a concept of fault propagation time newly adopted in the SDG model; (3) provide highly reliable diagnosis results and an explanation capability for the diagnosed faults, as in an expert system; and (4) diagnose pipe damage such as leaks, breaks, or throttling. The method is applied to the diagnosis of a pressurizer in the Kori Nuclear Power Plant (NPP) unit 2 in Korea under a transient condition, and the reported results show satisfactory performance for real-time incipient multi-fault diagnosis of such a large-scale system.
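
    To make the SDG idea concrete, the following toy sketch (hypothetical process variables and rules, not the authors' system) propagates a candidate root fault through a signed directed graph and scores it against the observed deviation signs.

    ```python
    # Minimal signed-directed-graph (SDG) consistency check -- illustrative only.
    # Edge (cause, effect, sign): +1 propagates a deviation with the same sign,
    # -1 inverts it. Observations map variable -> deviation sign (+1, -1, 0).
    EDGES = [("leak", "pressure", -1),
             ("pressure", "level", +1),
             ("heater_fault", "temperature", -1),
             ("temperature", "pressure", +1)]

    def score(root, root_sign, observations, edges):
        """Propagate the root deviation through the SDG, then score the
        hypothesis: +1 per explained observation, -1 per contradiction."""
        reached = {root: root_sign}
        changed = True
        while changed:
            changed = False
            for cause, effect, sign in edges:
                if cause in reached and effect not in reached:
                    reached[effect] = reached[cause] * sign
                    changed = True
        matches = sum(1 for v, s in observations.items() if reached.get(v) == s)
        contradictions = sum(1 for v, s in observations.items()
                             if v in reached and reached[v] != s)
        return matches - contradictions

    obs = {"pressure": -1, "level": -1, "temperature": 0}
    for root in ("leak", "heater_fault"):
        print(root, score(root, +1, obs, EDGES))
    # -> leak scores 2; heater_fault scores 1 (it would also depress the
    #    temperature, which actually reads normal).
    ```

    Ranking candidate root faults by such consistency scores is the essence of SDG-based diagnosis; the knowledge-based layer described in the abstract then supplies the expert-system-style explanations.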

  15. Extending SME to Handle Large-Scale Cognitive Modeling.

    Science.gov (United States)

    Forbus, Kenneth D; Ferguson, Ronald W; Lovett, Andrew; Gentner, Dedre

    2017-07-01

    Analogy and similarity are central phenomena in human cognition, involved in processes ranging from visual perception to conceptual change. To capture this centrality requires that a model of comparison be able to integrate with other processes and handle the size and complexity of the representations required by the tasks being modeled. This paper describes extensions to the Structure-Mapping Engine (SME) since its inception in 1986 that have increased its scope of operation. We first review the basic SME algorithm, describe psychological evidence for SME as a process model, and summarize its role in simulating similarity-based retrieval and generalization. Then we describe five techniques now incorporated into SME that have enabled it to tackle large-scale modeling tasks: (a) greedy merging rapidly constructs one or more best interpretations of a match in polynomial time, O(n^2 log n); (b) incremental operation enables mappings to be extended as new information is retrieved or derived about the base or target, to model situations where information in a task is updated over time; (c) ubiquitous predicates model the varying degrees to which items may suggest alignment; (d) structural evaluation of analogical inferences models aspects of plausibility judgments; (e) match filters enable large-scale task models to communicate constraints to SME to influence the mapping process. We illustrate via examples from published studies how these extensions enable SME to capture a broader range of psychological phenomena than before.
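
    The greedy-merging step (a) can be illustrated with a small toy sketch, under invented correspondences and scores rather than SME's actual data structures: local match hypotheses are sorted by score and absorbed into a global mapping only when they respect the one-to-one correspondence constraint.

    ```python
    # Toy greedy merge of local match hypotheses into one global mapping.
    # Each hypothesis maps base items to target items with a structural score.
    hypotheses = [
        ({"sun": "nucleus", "planet": "electron"}, 0.9),
        ({"sun": "electron"}, 0.4),                      # conflicts with the first
        ({"mass": "charge"}, 0.6),
    ]

    def greedy_merge(hypotheses):
        """Sort by score, then absorb each hypothesis unless it violates the
        one-to-one constraint of the mapping built so far (an O(n log n) sort
        plus a linear consistency sweep -- the spirit of SME's greedy merge)."""
        mapping, used_targets = {}, set()
        for corr, _score in sorted(hypotheses, key=lambda h: -h[1]):
            consistent = all(
                mapping.get(b, t) == t
                and (t not in used_targets or mapping.get(b) == t)
                for b, t in corr.items())
            if consistent:
                for b, t in corr.items():
                    mapping[b] = t
                    used_targets.add(t)
        return mapping

    print(greedy_merge(hypotheses))
    # -> {'sun': 'nucleus', 'planet': 'electron', 'mass': 'charge'}
    ```

    Committing greedily to the best-scoring consistent hypotheses avoids enumerating every globally consistent interpretation, which is what keeps the merge polynomial rather than exponential.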

  16. Formalizing Knowledge in Multi-Scale Agent-Based Simulations.

    Science.gov (United States)

    Somogyi, Endre; Sluka, James P; Glazier, James A

    2016-10-01

    Multi-scale, agent-based simulations of cellular and tissue biology are increasingly common. These simulations combine and integrate a range of components from different domains. Simulations continuously create, destroy and reorganize constituent elements, causing their interactions to change dynamically. For example, the multi-cellular tissue development process coordinates molecular, cellular and tissue scale objects with biochemical, biomechanical, spatial and behavioral processes to form a dynamic network. Different domain-specific languages can describe these components in isolation, but cannot describe their interactions. No current programming language is designed to represent, in a human-readable and reusable form, the domain-specific knowledge contained in these components and interactions. We present a new hybrid programming language paradigm that naturally expresses the complex multi-scale objects and dynamic interactions in a unified way and allows domain knowledge to be captured, searched, formalized, extracted and reused.

  17. Financial Information Source, Knowledge, and Practices of College Students from Diverse Backgrounds

    Science.gov (United States)

    Mimura, Yoko; Koonce, Joan; Plunkett, Scott W.; Pleskus, Lindsey

    2015-01-01

    Using cross-sectional data, we examined the financial information sources, financial knowledge, and financial practices of young adults, many of whom are first generation college students, ethnic minorities, and immigrants or children of immigrants. Participants (n = 1,249) were undergraduate students at a large regional comprehensive university.…

  18. Knowledge representation within information systems in manufacturing environments

    OpenAIRE

    Sharif, Amir M

    2004-01-01

    This thesis was submitted for the degree of Doctor of Philosophy and awarded by Brunel University. Representing knowledge as information content alone is insufficient in providing us with an understanding of the world around us. A combination of context as well as reasoning of the information content is fundamental to representing knowledge in an information system. Knowledge Representation is typically concerned with providing structures and theories that are used as a basis for intellige...

  19. Information and Knowledge Management: Dimensions and Approaches

    Science.gov (United States)

    Schlögl, Christian

    2005-01-01

    Introduction: Though literature on information and knowledge management is vast, there is much confusion concerning the meaning of these terms. Hence, this article should give some orientation and work out the main aspects of information and knowledge management. Method: An author co-citation analysis, which identified the main dimensions of…

  20. Modelling financial markets with agents competing on different time scales and with different amount of information

    Science.gov (United States)

    Wohlmuth, Johannes; Andersen, Jørgen Vitting

    2006-05-01

    We use agent-based models to study the competition among investors who use trading strategies with different amounts of information and different time scales. We find that mixing agents that trade on the same time scale but with different amounts of information has a stabilizing impact on the large and extreme fluctuations of the market. Traders with the most information are found to be more likely to arbitrage traders who use less information in their decision making. On the other hand, introducing investors who act on two different time scales has a destabilizing effect on the large and extreme price movements, increasing the volatility of the market. Closeness in the time scale used in decision making is found to facilitate the creation of local trends. The larger the overlap in commonly shared information, the more the traders in a mixed system with different time scales are found to profit from the presence of traders acting at another time scale than themselves.
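
    As a hedged illustration of this kind of model (not the authors' implementation), the sketch below simulates two groups of contrarian traders acting on different time horizons; the price moves with aggregate excess demand, and all parameters are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    T, n_fast, n_slow = 2000, 50, 50
    horizon_fast, horizon_slow = 5, 50   # decision time scales (steps)
    noise, impact = 0.01, 0.0005

    log_price = np.zeros(T)
    for t in range(1, T):
        p = log_price[t - 1]
        demand = 0.0
        # Each group buys when the price is below its own moving average.
        for n_agents, h in ((n_fast, horizon_fast), (n_slow, horizon_slow)):
            if t > h:
                ma = log_price[t - h:t].mean()
                demand += n_agents * np.sign(ma - p)
        # Price impact of aggregate excess demand plus exogenous news noise.
        log_price[t] = p + impact * demand + noise * rng.normal()

    returns = np.diff(log_price)
    print("volatility:", returns.std())
    print("kurtosis proxy:", ((returns - returns.mean()) ** 4).mean()
          / returns.var() ** 2)
    ```

    Varying the two horizons and the group sizes is then a direct way to probe the stabilizing and destabilizing effects described in the abstract, e.g. by comparing return volatility and tail statistics across mixtures.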

  1. Knowledge-based information systems in practice

    CERN Document Server

    Jain, Lakhmi; Watada, Junzo; Howlett, Robert

    2015-01-01

    This book contains innovative research from leading researchers who presented their work at the 17th International Conference on Knowledge-Based and Intelligent Information and Engineering Systems, KES 2013, held in Kitakyushu, Japan, in September 2013. The conference drew a competitive field of 236 contributors, from which 38 authors expanded their contributions and only 21 were published. A plethora of techniques and innovative applications are represented within this volume. The chapters are organized using four themes: data mining, knowledge management, advanced information processes and system modelling applications. Each topic contains multiple contributions and many offer case studies or innovative examples. Anyone who wants to work with information repositories or process knowledge should consider reading one or more chapters focused on their technique of choice. They may also benefit from reading other chapters to assess if an alternative technique represents a more suitable app...

  2. Informed consent: attitudes, knowledge and information concerning prenatal examination

    DEFF Research Database (Denmark)

    Dahl, Katja; Kesmodel, Ulrik; Hvidman, Lone

    2006-01-01

    Background: Providing women with information enabling informed consent to prenatal examinations has been widely recommended. Objective: The primary purpose of this review is to summarise current knowledge of the pregnant woman's expectations and attitudes concerning prenatal examinations, as w...

  3. Earth Science Data Analytics: Preparing for Extracting Knowledge from Information

    Science.gov (United States)

    Kempler, Steven; Barbieri, Lindsay

    2016-01-01

    Data analytics is the process of examining large amounts of data of a variety of types to uncover hidden patterns, unknown correlations and other useful information. Data analytics is a broad term that includes data analysis, as well as an understanding of the cognitive processes an analyst uses to understand problems and explore data in meaningful ways. Analytics also includes data extraction, transformation, and reduction, utilizing specific tools, techniques, and methods. Turning to data science, definitions of data science sound very similar to those of data analytics (which leads to a lot of the confusion between the two). But the skills needed for both, co-analyzing large amounts of heterogeneous data, understanding and utilizing relevant tools and techniques, and subject matter expertise, although similar, serve different purposes. Data analytics takes a practitioner's approach to applying expertise and skills to solve issues and gain subject knowledge. Data science is more theoretical (research in itself) in nature, providing strategic actionable insights and new innovative methodologies. Earth Science Data Analytics (ESDA) is the process of examining, preparing, reducing, and analyzing large amounts of spatial (multi-dimensional), temporal, or spectral data using a variety of data types to uncover patterns, correlations and other information, to better understand our Earth. The large variety of datasets (temporal spatial differences, data types, formats, etc.) invites the need for data analytics skills that understand the science domain, and data preparation, reduction, and analysis techniques, from a practitioner's point of view. The application of these skills to ESDA is the focus of this presentation. The Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster was created in recognition of the practical need to facilitate the co-analysis of large amounts of data and information for Earth science. Thus, from a to

  4. Manufacturing test of large scale hollow capsule and long length cladding in the large scale oxide dispersion strengthened (ODS) martensitic steel

    International Nuclear Information System (INIS)

    Narita, Takeshi; Ukai, Shigeharu; Kaito, Takeji; Ohtsuka, Satoshi; Fujiwara, Masayuki

    2004-04-01

    Mass production capability of oxide dispersion strengthened (ODS) martensitic steel cladding (9Cr) has been evaluated in Phase II of the Feasibility Studies on Commercialized Fast Reactor Cycle System. The cost of manufacturing the mother tube (raw material powder production, mechanical alloying (MA) by ball mill, canning, hot extrusion, and machining) is a dominant factor in the total cost of manufacturing ODS ferritic steel cladding. In this study, a large-scale 9Cr-ODS martensitic steel mother tube, made with a large-scale hollow capsule, and long length claddings were manufactured, and the applicability of these processes was evaluated. The following results were obtained. (1) Manufacturing of the large scale mother tube, with dimensions of 32 mm OD, 21 mm ID, and 2 m length, was successfully carried out using a large scale hollow capsule; this mother tube has high dimensional accuracy. (2) The chemical composition and microstructure of the manufactured mother tube are similar to those of the existing mother tube manufactured with a small scale can, and no remarkable difference between the bottom and top ends of the manufactured mother tube was observed. (3) The long length cladding was successfully manufactured from the large scale mother tube made using a large scale hollow capsule. (4) For reducing the manufacturing cost of ODS steel claddings, the manufacturing process for mother tubes using large scale hollow capsules is promising. (author)

  5. Knowledge, information and communication among cancer patients

    International Nuclear Information System (INIS)

    Parvez, T.; Saeed, N.; Pervaiz, K.

    2001-01-01

    Objective: Knowledge, information and communication within oncology are a core clinical strength for the outcome of the disease, and inadequate communication can cause distress for patients and their families. Design: A senior doctor conducted this study by filling in a proforma after interviewing the subjects of the study. Place and duration of study: This study was done in the Oncology Department of Service Hospital, Lahore and was completed in four months. Subjects and Method: One hundred cancer patients were interviewed regarding their knowledge about their disease, its causes and prognosis, and the information supplied by the health-care providers. They were also asked about their satisfaction with this information, deficiencies and pitfalls in this information, the need for more information, who should supply the information from among the hospital team or their relatives, the attitude of the family, and their communication regarding the disease. Results: The study revealed that knowledge about the disease and its causes was present in 53% and 7% of patients, respectively. Fifty-nine percent of patients wanted more information. The majority perceived that the information was not adequate, and 68% thought that more information would reduce their anxiety. The attitude of the family was found encouraging in 87% of patients, and 42% were communicating with other family members regarding their disease. Conclusion: Knowledge about the disease and its causes should be increasingly supplied by doctors, as it will reduce anxiety and have a good effect on health. Communication among family members needs to be improved. (author)

  6. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of a large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of a large-scale magnetic field. For this purpose, we perform a nonhelical magnetohydrodynamic (MHD) simulation, and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at a scale 1/10 the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to energy transfers from the velocity field at the forcing scales.

  7. Detecting differential protein expression in large-scale population proteomics

    Energy Technology Data Exchange (ETDEWEB)

    Ryu, Soyoung; Qian, Weijun; Camp, David G.; Smith, Richard D.; Tompkins, Ronald G.; Davis, Ronald W.; Xiao, Wenzhong

    2014-06-17

    Mass spectrometry-based high-throughput quantitative proteomics shows great potential in clinical biomarker studies, identifying and quantifying thousands of proteins in biological samples. However, methods are needed to appropriately handle issues/challenges unique to mass spectrometry data in order to detect as many biomarker proteins as possible. One issue is that different mass spectrometry experiments generate quite different total numbers of quantified peptides, which can result in more missing peptide abundances in an experiment with a smaller total number of quantified peptides. Another issue is that the quantification of peptides is sometimes absent, especially for less abundant peptides, and such missing values carry information about the peptide abundance. Here, we propose a Significance Analysis for Large-scale Proteomics Studies (SALPS) that handles missing peptide intensity values caused by the two mechanisms mentioned above. Our model shows robust performance on both simulated data and proteomics data from a large clinical study. Because variation in patients' sample quality and drift in instrument performance are unavoidable in clinical studies performed over the course of several years, we believe that our approach will be useful for analyzing large-scale clinical proteomics data.

  8. Large-scale information flow in conscious and unconscious states: an ECoG study in monkeys.

    Directory of Open Access Journals (Sweden)

    Toru Yanagawa

    Consciousness is an emergent property of the complex brain network. In order to understand how consciousness is constructed, neural interactions within this network must be elucidated. Previous studies have shown that specific neural interactions between the thalamus and frontoparietal cortices, frontal and parietal cortices, and parietal and temporal cortices are correlated with levels of consciousness. However, due to technical limitations, the network underlying consciousness has not been investigated in terms of large-scale interactions with high temporal and spectral resolution. In this study, we recorded neural activity with dense electrocorticogram (ECoG) arrays and used spectral Granger causality to generate a more comprehensive network that relates to consciousness in monkeys. We found that neural interactions were significantly different between conscious and unconscious states in all combinations of cortical region pairs. Furthermore, the difference in neural interactions between conscious and unconscious states could be represented in 4 frequency-specific large-scale networks with unique interaction patterns: 2 networks were related to consciousness and showed peaks in the alpha and beta bands, while the other 2 networks were related to unconsciousness and showed peaks in the theta and gamma bands. Moreover, the networks in the unconscious state were shared amongst 3 different unconscious conditions, induced either by ketamine and medetomidine, by propofol, or by sleep. Our results suggest a novel picture in which the difference between conscious and unconscious states is characterized by a switch in frequency-specific modes of large-scale communication across the entire cortex, rather than by the cessation of interactions between specific cortical regions.
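
    The study's directed-interaction measure is spectral Granger causality. As a simple time-domain stand-in, the sketch below applies statsmodels' Granger causality test to synthetic two-channel data; all signals and parameters are invented, and spectral variants additionally decompose the influence by frequency band, which is what yields the alpha/beta versus theta/gamma networks reported here.

    ```python
    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(2)
    n = 1000
    x = rng.normal(size=n)
    y = np.zeros(n)
    for t in range(2, n):      # y is driven by lagged x: x "Granger-causes" y
        y[t] = 0.6 * y[t - 1] + 0.5 * x[t - 2] + 0.1 * rng.normal()

    # Column order matters: the test asks whether the SECOND column helps
    # predict the FIRST beyond the first column's own past.
    data = np.column_stack([y, x])
    res = grangercausalitytests(data, maxlag=3, verbose=False)
    for lag, (tests, _) in res.items():
        print("lag", lag, "F-test p-value:", tests["ssr_ftest"][1])
    ```

    Small p-values at the true driving lag indicate a directed influence from x to y; repeating the test over all channel pairs and both directions gives the kind of directed interaction matrix the study builds from ECoG.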

  9. Introduction of an agent-based multi-scale modular architecture for dynamic knowledge representation of acute inflammation.

    Science.gov (United States)

    An, Gary

    2008-05-27

    One of the greatest challenges facing biomedical research is the integration and sharing of vast amounts of information, not only for individual researchers, but also for the community at large. Agent Based Modeling (ABM) can provide a means of addressing this challenge via a unifying translational architecture for dynamic knowledge representation. This paper presents a series of linked ABMs representing multiple levels of biological organization. They are intended to translate the knowledge derived from in vitro models of acute inflammation to clinically relevant phenomena such as multiple organ failure. ABM development followed a sequence starting with relatively direct translation from in-vitro-derived rules into a cell-as-agent level ABM, leading on to the concatenation of ABMs into multi-tissue models, and eventually resulting in topologically linked aggregate multi-tissue ABMs modeling organ-organ crosstalk. As an underlying design principle, organs were considered to be functionally composed of an epithelial surface, which determined organ integrity, and an endothelial/blood interface, representing the reaction surface for the initiation and propagation of inflammation. The development of the epithelial ABM, derived from an in-vitro model of gut epithelial permeability, is described. Next, the epithelial ABM was concatenated with the endothelial/inflammatory cell ABM to produce an organ model of the gut. This model was validated against in-vivo models of the inflammatory response of the gut to ischemia. Finally, the gut ABM was linked to a similarly constructed pulmonary ABM to simulate the gut-pulmonary axis in the pathogenesis of multiple organ failure. The behavior of this model was validated against in-vivo and clinical observations on the cross-talk between these two organ systems. A series of ABMs are thus presented, extending from the level of intracellular mechanism to clinically observed behavior in the intensive care setting. The ABMs all utilize cell-level agents
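
    A minimal sketch of the paper's two-layer design idea, an epithelial surface whose integrity is degraded by an inflammatory signal spreading on an endothelial layer, is shown below under purely hypothetical rules and parameters (periodic grid, invented rate constants); it is not the authors' model.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    N = 50
    integrity = np.ones((N, N))    # epithelial layer: 1 = intact barrier
    inflam = np.zeros((N, N))      # endothelial layer: inflammatory signal
    inflam[N // 2, N // 2] = 1.0   # local ischemic insult

    for step in range(100):
        # Signal diffuses to the four neighbors (periodic grid) and decays.
        spread = sum(np.roll(inflam, s, axis=a) for s in (-1, 1) for a in (0, 1))
        inflam = np.clip(0.93 * inflam + 0.02 * spread, 0.0, 1.0)
        # Epithelial agents lose integrity under inflammation, recover otherwise.
        integrity -= 0.05 * inflam
        integrity += 0.01 * (inflam < 0.05)
        integrity = np.clip(integrity, 0.0, 1.0)

    # Organ-level readout: mean barrier integrity (a permeability proxy).
    print("mean epithelial integrity:", round(float(integrity.mean()), 3))
    ```

    Feeding the integrity readout of one such grid into the insult term of a second grid is the flavor of the organ-organ linkage (gut-pulmonary axis) described in the abstract.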

  10. Models, Metaphors and Symbols for Information and Knowledge Systems

    Directory of Open Access Journals (Sweden)

    David Williams

    2014-01-01

    A literature search indicates that data, information and knowledge continue to be placed into a hierarchical construct in which it is considered that information is more valuable than data and that information can be processed into becoming precious knowledge. Wisdom continues to be added to the model, further confusing the issue. This model constrains our ability to think more logically about how and why we develop knowledge management systems to support and enhance knowledge-intensive processes, tasks or projects. This paper seeks to summarise the development of the Data-Information-Knowledge-Wisdom hierarchy, explore the extensive criticism of it, and present a more logical (and accurate) construct for the elements of intellectual capital when developing and managing Knowledge Management Systems.

  11. Hydrometeorological variability on a large french catchment and its relation to large-scale circulation across temporal scales

    Science.gov (United States)

    Massei, Nicolas; Dieppois, Bastien; Fritier, Nicolas; Laignel, Benoit; Debret, Maxime; Lavers, David; Hannah, David

    2015-04-01

    In the present context of global changes, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability or provide future scenarios of water resources. With the aim of a better understanding of hydrological changes, it is of crucial importance to determine how, and to what extent, trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach associating large-scale/local-scale correlation, empirical statistical downscaling and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed, and the North Atlantic sea level pressure (SLP), in order to gain additional insights into the atmospheric patterns associated with the regional hydrology. We hypothesized that: i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and ii) defining those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the large-scale/local-scale links were not necessarily constant according to time-scale (i.e. for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach which integrated discrete wavelet multiresolution analysis for reconstructing local hydrometeorological processes (predictand: precipitation and streamflow on the Seine river catchment) based on a large-scale predictor (SLP over the Euro-Atlantic sector) on a monthly time-step. This approach
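
    The scale-wise step of such an approach can be sketched with PyWavelets: decompose a monthly series, reconstruct the component carried by each scale, and (in a full ESD model) fit one large-scale/local-scale relationship per component. The series below is synthetic and the wavelet choice is an assumption; the study's actual predictor is the North Atlantic SLP field.

    ```python
    import numpy as np
    import pywt

    rng = np.random.default_rng(4)
    months = 600
    t = np.arange(months)
    # Synthetic "streamflow": annual cycle + slow oscillation + noise.
    flow = (np.sin(2 * np.pi * t / 12)
            + 0.5 * np.sin(2 * np.pi * t / 128)
            + 0.3 * rng.normal(size=months))

    # Discrete wavelet multiresolution decomposition.
    level = 5
    coeffs = pywt.wavedec(flow, "db4", level=level)

    # Reconstruct the component carried by each scale by zeroing the others.
    components = []
    for i in range(len(coeffs)):
        kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        components.append(pywt.waverec(kept, "db4")[:months])

    recon = np.sum(components, axis=0)
    print("max reconstruction error:", np.abs(recon - flow).max())
    ```

    Because the components sum back to the original series, a per-scale regression against a large-scale predictor can be recombined into a single downscaled reconstruction, which is the design choice behind the wavelet-integrated ESD model described above.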

  12. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  13. Large-scale influences in near-wall turbulence.

    Science.gov (United States)

    Hutchins, Nicholas; Marusic, Ivan

    2007-03-15

    Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.

  14. Rock sealing - large scale field test and accessory investigations

    International Nuclear Information System (INIS)

    Pusch, R.

    1988-03-01

    The experience from the pilot field test and the basic knowledge extracted from the lab experiments have formed the basis of the planning of a Large Scale Field Test. The intention is to find out how the 'instrument of rock sealing' can be applied to a number of practical cases, where cutting-off and redirection of groundwater flow in repositories are called for. Five field subtests, which are integrated mutually or with other Stripa projects (3D), are proposed. One of them concerns 'near-field' sealing, i.e. sealing of tunnel floors hosting deposition holes, while two involve sealing of 'disturbed' rock around tunnels. The fourth concerns sealing of a natural fracture zone in the 3D area, and this latter test has the expected spin-off effect of obtaining additional information on the general flow pattern around the northeastern wing of the 3D cross. The fifth test is an option of sealing structures in the Validation Drift. The longevity of major grout types is focussed on as the most important part of the 'Accessory Investigations', and detailed plans have been worked out for that purpose. It is foreseen that the continuation of the project, as outlined in this report, will yield suitable methods and grouts for effective and long-lasting sealing of rock for use at strategic points in repositories. (author)

  15. Creating the sustainable conditions for knowledge information sharing in virtual community.

    Science.gov (United States)

    Wang, Jiangtao; Yang, Jianmei; Chen, Quan; Tsai, Sang-Bing

    2016-01-01

    Encyclopedias are not a new platform for the distribution of knowledge, but they have recently drawn a great deal of attention in their online iteration. Peer production in particular has emerged as a new mode of providing valuable information and offering competitive advantage in information production. Large numbers of volunteers actively share their knowledge by continuously editing articles in Baidu Encyclopedia. Most articles in these online communities are the cumulative and integrated products of the contributions of many coauthors. Email-based surveys and objective data mining were used to collect data for analysis. Critical mass theory is used to analyze the characteristics of these collective actions and to explain the emergence and sustainability of such actions in the Baidu Encyclopedia communities. The results show that, within the collective action framework, the contributor group satisfied the two key characteristics that ensure the collective action of knowledge contribution will both take place and become self-sustaining. This analysis not only facilitates the identification of collective actions related to individuals sharing knowledge in virtual communities, but can also provide insight for the management and development of other similar virtual communities.

  16. Probing cosmology with the homogeneity scale of the Universe through large scale structure surveys

    International Nuclear Information System (INIS)

    Ntelis, Pierros

    2017-01-01

    It is thus possible to reconstruct the distribution of matter in 3 dimensions in gigantic volumes. We can then extract various statistical observables to measure the BAO scale and the scale of homogeneity of the universe. Using the Data Release 12 CMASS galaxy catalogues, we obtained a measurement of the homogeneity scale with an uncertainty 5 times smaller than the WiggleZ measurement. At large scales, the universe is remarkably well described to linear order by the ΛCDM model, the standard model of cosmology. In general, it is not necessary to take into account the nonlinear effects which complicate the model at small scales. On the other hand, at large scales, the measurement of our observables becomes very sensitive to systematic effects. This is particularly true for the analysis of cosmic homogeneity, which requires an observational method that does not itself bias the measurement. In order to study the homogeneity principle in a model-independent way, we explore a new way to infer distances using cosmic clocks and type Ia supernovae. This establishes the Cosmological Principle using only a small number of a priori assumptions, i.e. the theory of General Relativity and astrophysical assumptions that are independent of Friedmann universes and, by extension, of the homogeneity assumption. This manuscript is organized as follows. After a short presentation of the knowledge in cosmology necessary for the understanding of this manuscript, presented in Chapter 1, Chapter 2 deals with the challenges of the Cosmological Principle as well as how to overcome them. In Chapter 3, we discuss the technical characteristics of large scale structure surveys, focusing in particular on the BOSS and eBOSS galaxy surveys. Chapter 4 presents the detailed analysis of the measurement of cosmic homogeneity and the various systematic effects likely to impact our observables. Chapter 5 discusses how to use cosmic homogeneity as a standard ruler to constrain dark energy models from current and future surveys. In
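
    A standard observable behind such homogeneity analyses is the count-in-spheres N(<r) and its fractal correlation dimension D2(r), with homogeneity reached where D2 approaches 3. The toy estimate below runs on a synthetic uniform catalog; all numbers are invented, and a real analysis would additionally correct for the survey selection function.

    ```python
    import numpy as np
    from scipy.spatial import cKDTree

    rng = np.random.default_rng(5)
    points = rng.uniform(0, 1000, size=(20000, 3))  # synthetic catalog (Mpc/h)
    tree = cKDTree(points)

    radii = np.array([20.0, 40.0, 80.0, 160.0, 320.0])
    centers = points[rng.choice(len(points), 500, replace=False)]
    counts = np.array([
        np.mean([len(ix) for ix in tree.query_ball_point(centers, r)])
        for r in radii])

    # D2(r): local log-log slope of the mean count-in-spheres N(<r) ~ r^D2.
    d2 = np.gradient(np.log(counts), np.log(radii))
    for r, d in zip(radii, d2):
        print(f"r = {r:5.0f}   D2 ~ {d:4.2f}")  # near 3, up to box-edge effects
    ```

    For a clustered galaxy catalog, D2 sits below 3 at small radii and rises towards 3; the radius where it comes within a chosen tolerance of 3 is the homogeneity scale being measured.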

  17. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    Science.gov (United States)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) tools for constructing collaborative, application-oriented problem solving environments / frameworks (the primary user interfaces for Grids); (2) programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) a comprehensive and consistent set of location-independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location-independent user authentication and authorization, and overall system security services. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating the routine construction of information-based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. Examples of these problems include: (1) coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) use of widely distributed, federated data archives (e.g., simultaneous access to meteorological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation system); (3

  18. PKI security in large-scale healthcare networks.

    Science.gov (United States)

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years, many public key infrastructure (PKI) schemes have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKI infrastructures face a plethora of challenges, especially when deployed over large-scale healthcare networks. In this paper, we propose a PKI infrastructure to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI infrastructure addresses the trust issues that arise in a large-scale healthcare network, including multi-domain PKI infrastructures.

  19. A large-scale perspective on stress-induced alterations in resting-state networks

    Science.gov (United States)

    Maron-Katz, Adi; Vaisvaser, Sharon; Lin, Tamar; Hendler, Talma; Shamir, Ron

    2016-02-01

    Stress is known to induce large-scale neural modulations. However, its neural effect once the stressor is removed, and how it relates to subjective experience, are not fully understood. Here we used a statistically sound data-driven approach to investigate alterations in large-scale resting-state functional connectivity (rsFC) induced by acute social stress. We compared rsfMRI profiles of 57 healthy male subjects before and after stress induction. Using a parcellation-based univariate statistical analysis, we identified a large-scale rsFC change involving 490 parcel-pairs. Aiming to characterize this change, we employed statistical enrichment analysis, identifying anatomic structures that were significantly interconnected by these pairs. This analysis revealed strengthening of thalamo-cortical connectivity and weakening of cross-hemispheral parieto-temporal connectivity. These alterations were further found to be associated with changes in subjective stress reports. Integrating report-based information on stress sustainment 20 minutes post induction revealed a single significant rsFC change, between the right amygdala and the precuneus, which inversely correlated with the level of subjective recovery. Our study demonstrates the value of enrichment analysis for exploring large-scale network reorganization patterns, and provides new insight on stress-induced neural modulations and their relation to subjective experience.
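
    The enrichment step has the flavor of a hypergeometric over-representation test: given the number of changed parcel-pairs overall, does a particular pair of anatomical structures contain more changed pairs than chance would predict? A hedged sketch with invented counts:

    ```python
    from scipy.stats import hypergeom

    # Hypothetical numbers for one structure pair (e.g., thalamus-cortex):
    M = 10000   # all parcel-pairs tested
    K = 490     # parcel-pairs with a significant rsFC change
    n = 300     # parcel-pairs connecting this pair of anatomical structures
    k = 45      # of those, how many changed

    # P(X >= k) if the 490 changed pairs were assigned at random.
    p_value = hypergeom.sf(k - 1, M, K, n)
    print(f"enrichment p-value: {p_value:.3g}")  # expected count is ~14.7 here
    ```

    Repeating this test over all structure pairs (with multiple-comparison correction) is what turns a diffuse 490-pair connectivity change into interpretable statements such as "thalamo-cortical connectivity strengthened".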

  20. The Spoken Knowledge in Low Literacy in Diabetes scale: a diabetes knowledge scale for vulnerable patients.

    Science.gov (United States)

    Rothman, Russell L; Malone, Robb; Bryant, Betsy; Wolfe, Catherine; Padgett, Penelope; DeWalt, Darren A; Weinberger, Morris; Pignone, Michael

    2005-01-01

    The purpose of this study was to develop and validate a new knowledge scale for patients with type 2 diabetes and poor literacy: the Spoken Knowledge in Low Literacy patients with Diabetes (SKILLD). The authors evaluated the 10-item SKILLD among 217 patients with type 2 diabetes and poor glycemic control at an academic general medicine clinic. Internal reliability was measured using the Kuder-Richardson coefficient. Performance on the SKILLD was compared to patient socioeconomic status, literacy level, duration of diabetes, and glycated hemoglobin (A1C). Respondents' mean age was 55 years, and they had had diabetes for an average of 8.4 years; 38% had less than a sixth-grade literacy level. The average score on the SKILLD was 49%. Less than one third of patients knew the signs of hypoglycemia or the normal fasting blood glucose range. The internal reliability of the SKILLD was good (0.72). Higher performance on the SKILLD was significantly correlated with higher income (r = 0.22), education level (r = 0.36), literacy status (r = 0.33), duration of diabetes (r = 0.30), and lower A1C (r = -0.16). When dichotomized, patients with low SKILLD scores (≤ 50%) had significantly higher A1C (11.2% vs 10.3%, P < .01). This difference remained significant when adjusted for covariates. The SKILLD demonstrated good internal consistency and validity. It revealed significant knowledge deficits and was associated with glycemic control. The SKILLD represents a practical scale for patients with diabetes and low literacy.
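
    For a dichotomous 10-item scale, the Kuder-Richardson coefficient reported here is the KR-20 statistic. A small self-contained implementation on simulated 0/1 responses (all data invented) looks like this:

    ```python
    import numpy as np

    def kr20(responses):
        """Kuder-Richardson 20 for a 0/1 item-response matrix
        (rows = respondents, columns = items)."""
        k = responses.shape[1]
        p = responses.mean(axis=0)            # proportion correct per item
        q = 1.0 - p
        total_var = responses.sum(axis=1).var(ddof=1)
        return (k / (k - 1.0)) * (1.0 - (p * q).sum() / total_var)

    # Simulate 200 respondents answering 10 items of varying difficulty.
    rng = np.random.default_rng(6)
    ability = rng.normal(size=200)                 # latent knowledge
    difficulty = np.linspace(-1, 1, 10)
    prob = 1 / (1 + np.exp(-(ability[:, None] - difficulty[None, :])))
    answers = (rng.uniform(size=prob.shape) < prob).astype(int)
    print("KR-20:", round(kr20(answers), 2))       # typically ~0.6-0.75 here
    ```

    KR-20 is the dichotomous special case of Cronbach's alpha, so the reported 0.72 can be read on the usual alpha scale for internal consistency.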

  1. Emerging large-scale solar heating applications

    International Nuclear Information System (INIS)

    Wong, W.P.; McClung, J.L.

    2009-01-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  2. Emerging large-scale solar heating applications

    Energy Technology Data Exchange (ETDEWEB)

    Wong, W.P.; McClung, J.L. [Science Applications International Corporation (SAIC Canada), Ottawa, Ontario (Canada)

    2009-07-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  3. Large-scale inverse model analyses employing fast randomized data reduction

    Science.gov (United States)

    Lin, Youzuo; Le, Ellen B.; O'Malley, Daniel; Vesselinov, Velimir V.; Bui-Thanh, Tan

    2017-08-01

    When the number of observations is large, it is computationally challenging to apply classical inverse modeling techniques. We have developed a new computationally efficient technique for solving inverse problems with a large number of observations (e.g., on the order of 10^7 or greater). Our method, which we call the randomized geostatistical approach (RGA), is built upon the principal component geostatistical approach (PCGA). We employ a data reduction technique combined with the PCGA to improve the computational efficiency and reduce the memory usage. Specifically, we employ a randomized numerical linear algebra technique based on a so-called "sketching" matrix to effectively reduce the dimension of the observations without losing the information content needed for the inverse analysis. In this way, the computational and memory costs for RGA scale with the information content rather than the size of the calibration data. Our algorithm is coded in Julia and implemented in the MADS open-source high-performance computational framework (http://mads.lanl.gov). We apply our new inverse modeling method to invert for a synthetic transmissivity field. Compared to a standard geostatistical approach (GA), our method is more efficient when the number of observations is large. Most importantly, our method is capable of solving larger inverse problems than the standard GA and PCGA approaches. Therefore, our new model inversion method is a powerful tool for solving large-scale inverse problems. The method can be applied in any field and is not limited to hydrogeological applications such as the characterization of aquifer heterogeneity.
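
    The core trick, multiplying the observation system by a short random "sketching" matrix before inversion, can be shown on an ordinary least-squares toy problem. This is a sketch of the idea only, not the RGA/MADS code, and all problem sizes are invented.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_obs, n_par, n_sketch = 10000, 50, 300

    A = rng.normal(size=(n_obs, n_par))              # tall forward model
    x_true = rng.normal(size=n_par)
    d = A @ x_true + 0.01 * rng.normal(size=n_obs)   # noisy observations

    # A short Gaussian sketching matrix compresses 10,000 rows to 300.
    S = rng.normal(size=(n_sketch, n_obs)) / np.sqrt(n_sketch)
    x_sketch, *_ = np.linalg.lstsq(S @ A, S @ d, rcond=None)
    x_full, *_ = np.linalg.lstsq(A, d, rcond=None)

    print("sketched error:", np.linalg.norm(x_sketch - x_true))
    print("full error:    ", np.linalg.norm(x_full - x_true))
    ```

    The sketched solve touches only the 300-row compressed system, so its cost scales with the retained information content rather than with the raw observation count, which is the scaling behavior the abstract claims for RGA.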

  4. Large-scale regions of antimatter

    International Nuclear Information System (INIS)

    Grobov, A. V.; Rubin, S. G.

    2015-01-01

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  5. Large-scale regions of antimatter

    Energy Technology Data Exchange (ETDEWEB)

    Grobov, A. V., E-mail: alexey.grobov@gmail.com; Rubin, S. G., E-mail: sgrubin@mephi.ru [National Research Nuclear University MEPhI (Russian Federation)

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  6. Internationalization Measures in Large Scale Research Projects

    Science.gov (United States)

    Soeding, Emanuel; Smith, Nancy

    2017-04-01

    Large scale research projects (LSRP) often serve as flagships used by universities or research institutions to demonstrate their performance and capability to stakeholders and other interested parties. As the global competition among universities for the recruitment of the brightest brains has increased, effective internationalization measures have become hot topics for universities and LSRP alike. Nevertheless, most projects and universities are challenged with little experience on how to conduct these measures and make internationalization a cost-efficient and useful activity. Furthermore, those undertakings constantly have to be justified to the project PIs as important, valuable tools to improve the capacity of the project and the research location. There are a variety of measures suited to support universities in international recruitment. These include, e.g., institutional partnerships, research marketing, a welcome culture, support for science mobility and an effective alumni strategy. These activities, although often conducted by different university entities, are interlocked and can be very powerful measures if interfaced in an effective way. On this poster we display a number of internationalization measures for various target groups, and identify interfaces between project management, university administration, researchers and international partners to work together, exchange information and improve processes in order to be able to recruit, support and retain the brightest heads for a project.

  7. On uncertainty in information and ignorance in knowledge

    Science.gov (United States)

    Ayyub, Bilal M.

    2010-05-01

    This paper provides an overview of working definitions of knowledge, ignorance, information and uncertainty and summarises a formalised philosophical and mathematical framework for their analysis. It provides a comparative examination of the generalised information theory and the generalised theory of uncertainty. It summarises foundational bases for assessing the reliability of knowledge constructed as a collective set of justified true beliefs. It discusses system complexity for ancestor-simulation potentials. It offers value-driven means of communicating knowledge and contrarian knowledge using memes and memetics.

  8. Recent Advances in Understanding Large Scale Vapour Explosions

    International Nuclear Information System (INIS)

    Board, S.J.; Hall, R.W.

    1976-01-01

    For a description of efficient large-scale explosions it is necessary to consider three stages: a) the setting up of a quasi-stable initial configuration; b) the triggering of this configuration; c) the propagation of the explosion. In this paper we consider each stage in turn, reviewing the relevant experimental information and theory to see to what extent the requirements for energetic explosions, and the physical processes that can satisfy these requirements, are understood. We pay particular attention to an attractively simple criterion for explosiveness, suggested by Fauske, that the contact temperature should exceed the temperature for spontaneous nucleation of the coolant, because on this criterion sodium and UO2 in particular are not explosive.

  9. Measuring practical knowledge about balanced meals: development and validation of the brief PKB-7 scale.

    Science.gov (United States)

    Mötteli, S; Barbey, J; Keller, C; Bucher, T; Siegrist, M

    2016-04-01

    As a high-quality diet is associated with a lower risk for several diseases and all-cause mortality, current nutrition education tools provide people with information regarding how to build a healthy and balanced meal. To assess this basic nutrition knowledge, the research aim was to develop and validate a brief scale to measure the Practical Knowledge about Balanced meals (PKB-7). A pool of 25 items was pretested with experts and laypeople before being tested on a random sample in Switzerland (n=517). For item selection, a Rasch model analysis was applied. The validity and reliability of the new scale were assessed by three additional studies including laypeople (n=597; n=145) and nutrition experts (n=59). The final scale consists of seven multiple-choice items, which met the assumptions of the Rasch model. The validity of the new scale was shown by several aspects: the Rasch model was replicated in a second study, and nutrition experts achieved significantly higher scores than laypeople (t(148)=20.27, P<0.001). The PKB-7 measures practical knowledge about balanced meals based on current dietary guidelines. This brief and easy-to-use scale is intended for application in both research and practice.
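
    For context, the Rasch model used above for item selection specifies the probability of a correct response to item i by respondent j as a function of person ability θ_j and item difficulty b_i; this is the standard dichotomous formulation, not notation taken from the paper:

        \[
          P(X_{ij} = 1 \mid \theta_j, b_i) = \frac{\exp(\theta_j - b_i)}{1 + \exp(\theta_j - b_i)}
        \]

    Items whose observed response patterns deviate from these model probabilities fail the fit assumptions and become candidates for removal, which is how a 25-item pool can be reduced to the seven retained items.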

  10. Use of a large-scale rainfall simulator reveals novel insights into stemflow generation

    Science.gov (United States)

    Levia, D. F., Jr.; Iida, S. I.; Nanko, K.; Sun, X.; Shinohara, Y.; Sakai, N.

    2017-12-01

    Detailed knowledge of stemflow generation and its effects on both hydrological and biogeochemical cycling is important to achieve a holistic understanding of forest ecosystems. Field studies and a smaller set of experiments performed under laboratory conditions have increased our process-based knowledge of stemflow production. Building upon these earlier works, a large-scale rainfall simulator was employed to deepen our understanding of stemflow generation processes. The use of the large-scale rainfall simulator provides a unique opportunity to examine a range of rainfall intensities under constant conditions, which is difficult under natural conditions due to the variable nature of rainfall intensities in the field. Stemflow generation and production were examined for three species, Cryptomeria japonica D. Don (Japanese cedar), Chamaecyparis obtusa (Siebold & Zucc.) Endl. (Japanese cypress) and Zelkova serrata Thunb. (Japanese zelkova), under both leafed and leafless conditions at several different rainfall intensities (15, 20, 30, 40, 50, and 100 mm h⁻¹) using a large-scale rainfall simulator at the National Research Institute for Earth Science and Disaster Resilience (Tsukuba, Japan). Stemflow production rates and funneling ratios were examined in relation to both rainfall intensity and canopy structure. Preliminary results indicate a dynamic and complex response of the funneling ratios of individual trees to different rainfall intensities among the species examined. This is partly the result of different canopy structures, hydrophobicity of vegetative surfaces, and differential wet-up processes across species and rainfall intensities. This presentation delves into these differences and attempts to distill them into generalizable patterns, which can advance our theories of stemflow generation processes and ultimately permit better stewardship of forest resources. Funding note: This research was supported by JSPS Invitation Fellowship for Research in

  11. Innovation, knowledge and information management in supply chains

    Directory of Open Access Journals (Sweden)

    Szuster Mariusz

    2016-03-01

    In this study the question of innovation and information management in supply chains is addressed. We assume that innovation and information management are interrelated in supply chains and that this relationship is crucial for their success on the market. Considerable attention is given to the issue of outsourcing, which is now commonplace in supply chain management. In particular, we examined how approaches to managing information and knowledge in the supply chain differ according to ICT outsourcing. The deduction is based on a data set of 426 companies located in Poland, representing a variety of industry sectors. The research was carried out in two stages. The rationale behind this was to identify enterprises that utilise a well-developed system of information and knowledge management and to determine the scope of possible in-depth analyses. This helped to obtain valuable responses. We identify the main drivers of information and knowledge management. We show the similarities and differences in information and knowledge management between entities that use ICT outsourcing and those that do not. We discuss the research results and draw conclusions.

  12. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo…

  13. The Expanded Large Scale Gap Test

    Science.gov (United States)

    1987-03-01

    NSWC TR 86-32: The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department, March 1987; approved for public release. Only OCR fragments of the abstract survive; they concern reducing the spread in the LSGT 50% gap value for the worst charges, such as those with the highest or lowest densities and the largest re-pressed…

  14. Information and knowledge management for sustainable forestry

    Science.gov (United States)

    Alan J. Thomson; Michael Rauscher; Daniel L. Schmoldt; Harald Vacik

    2007-01-01

    Institutional information and knowledge management often involves a range of systems and technologies to aid decisions and produce reports. Construction of a knowledge system organizing hierarchy facilitates exploration of the interrelationships among knowledge management, inventory and monitoring, statistics and modeling, and policy. Two case studies illustrate these...

  15. Small-scale hydropower plants and rare bryophytes and lichens. Knowledge and lack of knowledge; Smaakraftverk og sjeldne moser og lav. Kunnskap og kunnskapsmangler

    Energy Technology Data Exchange (ETDEWEB)

    Evju, Marianne; Hassel, Kristian; Hagen, Dagmar; Erikstad, Lars

    2011-08-15

    There is a large and increasing interest in the development of small-scale hydropower in Norway. Small-scale hydropower plants may impact biological diversity negatively through destruction, degradation or fragmentation of habitats. Both the environmental investigations and the treatment of applications for small-scale hydropower plants put great emphasis on red-listed species, and in particular on red-listed bryophytes and lichens growing in stream ravines and in meadows and rock faces influenced by waterfalls. Bryophytes and lichens can be difficult to identify in the field, and knowledge of the species' ecology, distribution and population sizes is insufficient. A large review of environmental investigations of small-scale hydropower plants documented that red-listed lichens were rarely recorded and red-listed bryophytes never recorded. In this report, we try to make visible both the knowledge we have and the knowledge we lack about red-listed bryophytes and lichens in areas where the development of small-scale hydropower is relevant. Most focus is placed on bryophytes. The report is mainly a collation of existing knowledge. There is great variation among stream ravines in the occurrence of species. Several factors, such as stability of moisture conditions, tree species composition and bedrock, interact to affect the occurrence of species. Red-listed bryophytes and lichens occur both in the forest and in affiliation with the stream. A reduction of local moisture, through e.g. logging of forest close to the stream or reduction of the water flow, will probably affect the species negatively. River regulation will change the frequency of flooding and affect the ice drift in the stream, which may negatively affect species living on dead wood in or close to the stream. Several species are vulnerable to deteriorated habitat quality and habitat fragmentation, as their habitat requirements are narrow and their dispersal capacity is limited. However, we

  16. Role of the Technical Information Center in the knowledge management

    International Nuclear Information System (INIS)

    Morales, Alfredo; Marrero, Carmen; Aguero, Manuel

    1999-01-01

    Competitive advantage of companies is directly proportional to their capacity for creating, capturing, handling, inventorying and transferring information, and generating knowledge, as well as for implementing best practices, in order to add value to the production process. Creation of an environment that allows carrying out this process efficiently constitutes a transcendental step toward the systematic and useful application of knowledge management. This paper presents the role of Technical Information Centers, as entities which provide and integrate information and knowledge, within knowledge communities. The Technical Information Center (CIT, from the Spanish Centro de Información Técnica) of PDVSA-Intevep and its contribution to strengthening corporate technological intelligence through information analysis and the diffusion of technical-scientific knowledge is also analyzed. The petrochemical and petroleum information network RIPPET (from the Spanish Red de Información Petrolera y Petroquímica) and its database, coordinated by the CIT, and the CIT online, a virtual organization, are also presented. Both are tools which facilitate the transfer of information and knowledge to communities organized within the company to manage knowledge

  17. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments. The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-based…

  18. Managing Knowledge And Information In The Sustainable Organization

    Science.gov (United States)

    Grecu, Valentin

    2015-09-01

    Knowledge and information management are essential for the success of organizations and bring significant competitive advantages. There have been significant investments in setting up technological platforms that support business processes and increase the efficiency of operational structures in many organizations through an efficient management of knowledge and information. This research highlights the importance of using knowledge and information management in order to increase the competitiveness of organizations and to foster the transition towards the sustainable organization, as nowadays an organization that wants to be competitive needs to be sustainable.

  19. Knowledge Sharing is Knowledge Creation

    DEFF Research Database (Denmark)

    Greve, Linda

    2015-01-01

    Knowledge sharing and knowledge transfer are important to knowledge communication. However, when groups of knowledge workers engage in knowledge communication activities, it easily turns into mere mechanical information processing despite other ambitions. This article relates literature of knowledge communication and knowledge creation to an intervention study in a large Danish food production company. For some time a specific group of employees uttered a wish for knowledge sharing, but it never really happened. The group was observed and submitted to metaphor analysis as well as analysis of co…

  20. Large-scale Meteorological Patterns Associated with Extreme Precipitation Events over Portland, OR

    Science.gov (United States)

    Aragon, C.; Loikith, P. C.; Lintner, B. R.; Pike, M.

    2017-12-01

    Extreme precipitation events can have profound impacts on human life and infrastructure, with broad implications across a range of stakeholders. Changes to extreme precipitation events are a projected outcome of climate change that warrants further study, especially at regional to local scales. While global climate models are generally capable of simulating mean climate at global to regional scales with reasonable skill, resiliency and adaptation decisions are made at local scales, where most state-of-the-art climate models are limited by coarse resolution. Characterization of large-scale meteorological patterns associated with extreme precipitation events at local scales can provide climatic information without this scale limitation, thus facilitating stakeholder decision-making. This research will use synoptic climatology as a tool by which to characterize the key large-scale meteorological patterns associated with extreme precipitation events in the Portland, Oregon metro region. Composite analysis of meteorological patterns associated with extreme precipitation days, and associated watershed-specific flooding, is employed to enhance understanding of the climatic drivers behind such events. The self-organizing maps approach is then used to characterize the within-composite variability of the large-scale meteorological patterns associated with extreme precipitation events, allowing us to better understand the different types of meteorological conditions that lead to high-impact precipitation events and associated hydrologic impacts. A more comprehensive understanding of the meteorological drivers of extremes will aid in the evaluation of the ability of climate models to capture key patterns associated with extreme precipitation over Portland and to better interpret projections of future climate at impact-relevant scales.

  1. Inflationary tensor fossils in large-scale structure

    Energy Technology Data Exchange (ETDEWEB)

    Dimastrogiovanni, Emanuela [School of Physics and Astronomy, University of Minnesota, Minneapolis, MN 55455 (United States); Fasiello, Matteo [Department of Physics, Case Western Reserve University, Cleveland, OH 44106 (United States); Jeong, Donghui [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States); Kamionkowski, Marc, E-mail: ema@physics.umn.edu, E-mail: mrf65@case.edu, E-mail: duj13@psu.edu, E-mail: kamion@jhu.edu [Department of Physics and Astronomy, 3400 N. Charles St., Johns Hopkins University, Baltimore, MD 21218 (United States)

    2014-12-01

    Inflation models make specific predictions for a tensor-scalar-scalar three-point correlation, or bispectrum, between one gravitational-wave (tensor) mode and two density-perturbation (scalar) modes. This tensor-scalar-scalar correlation leads to a local power quadrupole, an apparent departure from statistical isotropy in our Universe, as well as characteristic four-point correlations in the current mass distribution in the Universe. So far, the predictions for these observables have been worked out only for single-clock models in which certain consistency conditions between the tensor-scalar-scalar correlation and tensor and scalar power spectra are satisfied. Here we review the requirements on inflation models for these consistency conditions to be satisfied. We then consider several examples of inflation models, such as non-attractor and solid-inflation models, in which these conditions are put to the test. In solid inflation the simplest consistency conditions are already violated whilst in the non-attractor model we find that, contrary to the standard scenario, the tensor-scalar-scalar correlator probes directly relevant model-dependent information. We work out the predictions for observables in these models. For non-attractor inflation we find an apparent local quadrupolar departure from statistical isotropy in large-scale structure but that this power quadrupole decreases very rapidly at smaller scales. The consistency of the CMB quadrupole with statistical isotropy then constrains the distance scale that corresponds to the transition from the non-attractor to attractor phase of inflation to be larger than the currently observable horizon. Solid inflation predicts clustering fossils signatures in the current galaxy distribution that may be large enough to be detectable with forthcoming, and possibly even current, galaxy surveys.

  2. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of 1000s of processors to be used by 100s to 1000s of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; (4) to identify and connect groups with similar interests within HENP and the larger clustering community

  3. Information and knowledge: an evolutionary framework for information science

    Directory of Open Access Journals (Sweden)

    Marcia J. Bates

    2005-01-01

    Background. Many definitions of information, knowledge, and data have been suggested throughout the history of information science. In this article, the objective is to provide definitions that are usable for the physical, biological, and social meanings of the terms, covering the various senses important to our field. Argument. Information 1 is defined as the pattern of organization of matter and energy. Information 2 is defined as some pattern of organization of matter and energy that has been given meaning by a living being. Knowledge is defined as information given meaning and integrated with other contents of understanding. Elaboration. The approach is rooted in an evolutionary framework; that is, modes of information perception, processing, transmission, and storage are seen to have developed as a part of the general evolution of members of the animal kingdom. Brains are expensive for animals to support; consequently, efficient storage, including, particularly, storage at emergent levels (for example, storing the concept of chair rather than specific memories of all chairs ever seen) is powerful and effective for animals. Conclusion. Thus, rather than being reductionist, the approach taken demonstrates the fundamentally emergent nature of most of what higher animals, and human beings in particular, experience as information.

  4. Semantic knowledge representation for information retrieval

    CERN Document Server

    Gödert, Winfried; Nagelschmidt, Matthias

    2014-01-01

    This book covers the basics of semantic web technologies and indexing languages, and describes their contribution to improve languages as a tool for subject queries and knowledge exploration. The book is relevant to information scientists, knowledge workers and indexers. It provides a suitable combination of theoretical foundations and practical applications.

  5. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    …the impact of large-scale agriculture and outgrower schemes on productivity, household welfare and wages in developing countries is highly contentious. Chapter 1 of this thesis provides an introduction to the study, while also reviewing the key debate in the contemporary land ‘grabbing’ and historical large… sugarcane outgrower scheme on household income and asset stocks. Chapter 5 examines the wages and working conditions in ‘formal’ large-scale and ‘informal’ small-scale irrigated agriculture. The results in Chapter 2 show that moisture stress, the use of untested planting materials, and conflict over land… commands a higher wage than ‘formal’ large-scale agriculture, while rather different wage determination mechanisms exist in the two sectors. Human capital characteristics (education and experience) partly explain the differences in wages within the formal sector, but play no significant role…

  6. Developing measures for information ergonomics in knowledge work.

    Science.gov (United States)

    Franssila, Heljä; Okkonen, Jussi; Savolainen, Reijo

    2016-03-01

    Information ergonomics is an evolving application domain of ergonomics focusing on the management of workload in the real-world contexts of information-intensive tasks. This study introduces a method for the evaluation of information ergonomics in knowledge work. To this end, five key dimensions of information ergonomics were identified: contextual factors of knowledge work, multitasking, interruptions at work, practices for managing information load, and perceived job control and productivity. In total, 24 measures focusing on the above dimensions were constructed. The measures include, for example, the number of fragmented work tasks per work day. The measures were preliminarily tested in two Finnish organisations, making use of empirical data gathered by interviews, electronic questionnaires and log data applications tracking work processes on personal computers. The measures are applicable to the evaluation of information ergonomics, even though individual measures vary with regard to the amount of work and time needed for data analysis. Practitioner Summary: The study introduces a method for the evaluation of information ergonomics in knowledge work. To this end, 24 measures were constructed and tested empirically. The measures focus on contextual factors of knowledge work, multitasking, interruptions at work, practices for managing information load, and perceived job control and productivity.

  7. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  8. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography

  9. Hypertension Knowledge-Level Scale (HK-LS): A Study on Development, Validity and Reliability

    Directory of Open Access Journals (Sweden)

    Cemalettin Kalyoncu

    2012-03-01

    This study was conducted to develop a scale to measure knowledge about hypertension among Turkish adults. The Hypertension Knowledge-Level Scale (HK-LS) was generated based on content, face, and construct validity, internal consistency, test re-test reliability, and discriminative validity procedures. The final scale had 22 items with six sub-dimensions. The scale was applied to 457 individuals aged ≥18 years, and 414 of them were re-evaluated for test-retest reliability. The six sub-dimensions encompassed 60.3% of the total variance. Cronbach alpha coefficients were 0.82 for the entire scale and 0.92, 0.59, 0.67, 0.77, 0.72, and 0.76 for the sub-dimensions of definition, medical treatment, drug compliance, lifestyle, diet, and complications, respectively. The scale ensured internal consistency in reliability and construct validity, as well as stability over time. Significant relationships were found between knowledge score and age, gender, educational level, and history of hypertension of the participants. No correlation was found between knowledge score and working at an income-generating job. The present scale, developed to measure the knowledge level of hypertension among Turkish adults, was found to be valid and reliable.

  10. Hypertension Knowledge-Level Scale (HK-LS): a study on development, validity and reliability.

    Science.gov (United States)

    Erkoc, Sultan Baliz; Isikli, Burhanettin; Metintas, Selma; Kalyoncu, Cemalettin

    2012-03-01

    This study was conducted to develop a scale to measure knowledge about hypertension among Turkish adults. The Hypertension Knowledge-Level Scale (HK-LS) was generated based on content, face, and construct validity, internal consistency, test re-test reliability, and discriminative validity procedures. The final scale had 22 items with six sub-dimensions. The scale was applied to 457 individuals aged ≥ 18 years, and 414 of them were re-evaluated for test-retest reliability. The six sub-dimensions encompassed 60.3% of the total variance. Cronbach alpha coefficients were 0.82 for the entire scale and 0.92, 0.59, 0.67, 0.77, 0.72, and 0.76 for the sub-dimensions of definition, medical treatment, drug compliance, lifestyle, diet, and complications, respectively. The scale ensured internal consistency in reliability and construct validity, as well as stability over time. Significant relationships were found between knowledge score and age, gender, educational level, and history of hypertension of the participants. No correlation was found between knowledge score and working at an income-generating job. The present scale, developed to measure the knowledge level of hypertension among Turkish adults, was found to be valid and reliable.
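
    For reference, the Cronbach alpha coefficients reported in this record follow the standard formula over the k items of a (sub)scale, with σ²_i the variance of item i and σ²_t the variance of the total score (the formula is standard, not reproduced from the paper):

        \[
          \alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^2}{\sigma_t^2}\right)
        \]

    With k = 22 items, the reported α = 0.82 for the full scale indicates good internal consistency; lower values for some sub-dimensions (e.g., 0.59 for medical treatment) are expected where item counts are small.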

  11. Knowledge Organization = Information Organization?

    DEFF Research Database (Denmark)

    Hjørland, Birger

    Are the terms "information organization" (IO), "organization of information" (OI) and "information architecture" (IA) synonyms for knowledge organization (KO)? This study uses bibliometric methods, among others, to determine some relations between these terms and their meanings. Apparently the data shows that these terms should not be considered synonyms, because each of the terms IO, OI, IA and KO produces a different set of high-ranked authors, journals and papers. In many cases the terms are, however, used interchangeably (thus indicating synonymity) and it is argued that the underlying…

  12. eDOC : A collaboration infrastructure to manage knowledge and information on nuclear projects and research activities

    International Nuclear Information System (INIS)

    Van Craeynest, J.M.; Jacquemet, F.; Chermette, D.; Bonneau, S.

    2004-01-01

    …common information management platform. But managing a large project is very difficult, so obtaining and deploying such a platform should not be another problem to solve but a simple and quickly useful solution. In addition, costs and charges have to be compatible with a wide range of project management contexts and variability (in terms of finances, staff, security, language, etc.). Facing those challenges, we decided to launch the eDOC project. eDOC is the name of an application that aims to provide a large catalogue of web-based tools to create and manage communication and collaboration portals for communities of practice. An eDOC workspace has a customisable web-portal look. This portal gives access to a first hierarchy of workspaces and subspaces visible only to users with adequate rights. In those spaces, various kinds of information may be uploaded for sharing or for co-authoring. Many workflows are available to send, review, annotate, and submit for validation or publication. eDOC also provides a second hierarchy of publication headings used to enlarge the publication circle of certain kinds of information or documents (project news, results, reporting charts, etc.). eDOC includes a set of plug-in tools to create and manage newsletters and mailing lists, animate discussion forums, schedule tasks, facilitate reporting to the European Commission, track issues or bugs, share an agenda, etc. One of the core functionalities is the members directory. It is very useful for large communities to know each other and it may initiate a competencies or skills map. We began this project with a large evaluation campaign. As we preferred an off-the-shelf solution, we started with market leaders and challengers. The conclusions of the study were very interesting: first, it became obvious that the market and solutions were not consolidated and that it was very risky to bet on a particular product. Second, as we needed a large-scale deployment at once to get per user…

  13. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in the last years as a new tool to improve the traditional, stationary based approach in flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies and the role of large scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space as obtained by machine learning techniques, particularly supervised kernel principal component analysis. In such reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activities. We investigate for individual sites the exceedance probability in which large scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large scale).
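
    The dimension-reduction and clustering pipeline described above can be approximated in a few lines; the sketch below uses plain (unsupervised) kernel PCA from scikit-learn as a stand-in for the supervised variant named in the abstract, and all data shapes and parameter values are hypothetical:

        import numpy as np
        from sklearn.decomposition import KernelPCA
        from sklearn.cluster import KMeans

        # Hypothetical input: one row per flood event, columns are the
        # vertically integrated moisture-flux field flattened to a vector.
        rng = np.random.default_rng(42)
        moisture_flux = rng.normal(size=(300, 5000))  # 300 events, 5000 grid cells

        # Project the high-dimensional fields into a low-dimensional space.
        kpca = KernelPCA(n_components=3, kernel="rbf", gamma=1e-4)
        embedded = kpca.fit_transform(moisture_flux)

        # Cluster events in the reduced space to expose recurring patterns,
        # e.g. regional moisture recycling vs. teleconnected moisture.
        labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(embedded)
        print(np.bincount(labels))  # events per circulation cluster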

  14. Safeguarding of large scale reprocessing and MOX plants

    International Nuclear Information System (INIS)

    Howsley, R.; Burrows, B.; Longevialle, H. de; Kuroi, H.; Izumi, A.

    1997-01-01

    In May 97, the IAEA Board of Governors approved the final measures of the ''93+2'' safeguards strengthening programme, thus improving the international non-proliferation regime by enhancing the effectiveness and efficiency of safeguards verification. These enhancements are not, however, a revolution in current practices, but rather an important step in the continuous evolution of the safeguards system. The principles embodied in 93+2, for broader access to information and increased physical access, already apply, in a pragmatic way, to large-scale reprocessing and MOX fabrication plants. In these plants, qualitative measures and process monitoring play an important role, in addition to accountancy and material balance evaluations, in attaining the safeguards goals. This paper will reflect on the safeguards approaches adopted for these large bulk-handling facilities and draw analogies, conclusions and lessons for the forthcoming implementation of the 93+2 Programme. (author)

  15. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data

  16. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large-scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61 × 0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamics and heat transport through a large-scale ventilation system consisting of a 0.61 × 0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamics and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 μm) through the large-scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large-scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably due to the fire module of the code being a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  17. Large-scale simulation of ductile fracture process of microstructured materials

    International Nuclear Information System (INIS)

    Tian Rong; Wang Chaowei

    2011-01-01

    The promise of computational science in the extreme-scale computing era is to reduce and decompose macroscopic complexities into microscopic simplicities at the expense of high spatial and temporal resolution of computing. In materials science and engineering, the direct combination of 3D microstructure data sets and 3D large-scale simulations provides a unique opportunity for the development of a comprehensive understanding of nano/microstructure-property relationships in order to systematically design materials with specific desired properties. In this paper, we present a framework simulating the ductile fracture process zone in microstructural detail. The experimentally reconstructed microstructural data set is directly embedded into a FE mesh model to improve the simulation fidelity of microstructure effects on fracture toughness. To the best of our knowledge, this is the first time that fracture toughness has been linked to multiscale microstructures in a realistic 3D numerical model in a direct manner. (author)

  18. Utilisation of information technology to support information and knowledge management by lawyers in Polokwane City

    Directory of Open Access Journals (Sweden)

    Solomon Bopape

    2010-01-01

    A revolution in information and communication technology is taking place in the world. With this technological revolution, information and knowledge are also considered crucial assets for every organization. Law firms are regarded as one of the industries which are information- and knowledge-intensive. The utilization of information technology can play an essential role in supporting information and knowledge management in law firms. An investigation into the extent to which lawyers or law firms in Polokwane city utilize information technology to support information and knowledge management was conducted through a survey questionnaire based on the Technology Acceptance Model. The findings of this research showed that lawyers utilise information technology systems or applications that are common, such as word processing, e-mail, client billing and online databases for searching legal information. Other information and knowledge management tools, such as intranets, extranets and web portals, were the least utilised or non-utilised applications by these lawyers. The main reason for non-utilization of such systems may be linked to non-exposure to information technology and unfamiliarity with information and knowledge management tools. It is, therefore, recommended that law schools should include in their curriculum modules on the application and role of information technology in legal practice. Recommendations for future research related to this subject are also provided.

  19. Developing a Scale to Measure Content Knowledge and Pedagogy Content Knowledge of In-Service Elementary Teachers on Fractions

    Science.gov (United States)

    Kazemi, Farhad; Rafiepour, Abolfazl

    2018-01-01

    The main purpose of this study was to develop a scale for measuring content knowledge (CK) and pedagogy content knowledge (PCK) of in-service elementary teachers on mathematical fractions. Another aim of this study was to consider whether CK and PCK are separate from each other, or are in a single body. Therefore, a scale containing 22 items about…

  20. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  1. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    12 pages, 2 figures. This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  2. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed; Elsawy, Hesham; Gharbieh, Mohammad; Alouini, Mohamed-Slim; Adinoyi, Abdulkareem; Alshaalan, Furaih

    2017-01-01

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end

  3. The knowledge about cervical cancer among female students of Lublin universities

    Directory of Open Access Journals (Sweden)

    Agata Żminda

    2017-08-01

    The level of knowledge of female students in Lublin about cervical cancer seems to be insufficient. There is a need to educate women in the prevention and diagnosis of this cancer. The most commonly cited source of knowledge about cervical cancer was the Internet. This makes it possible to improve young women's knowledge by conducting large-scale information campaigns on social networking sites or on health-related websites. Doctors should pay particular attention to the educational aspect of patient care.

  4. Nuclear information and knowledge. News from the INIS and Nuclear Knowledge Management Section. No. 1, April 2006

    International Nuclear Information System (INIS)

    2006-04-01

    This first newsletter, a bi-annual publication, is aimed at informing about current developments in the INIS and Nuclear Knowledge Management Section of the IAEA. The vision for the Section is that knowledge powers the future and that knowledge is the basis of all nuclear activities. The International Nuclear Information System (INIS) is the world's most authoritative and comprehensive source of reliable nuclear information, with the aim that existing nuclear information and knowledge will be available in Member States, whenever and wherever needed, for the peaceful, safe and efficient use of nuclear energy. This first issue of the newsletter constitutes a review of the year 2005 in these fields and informs about some of the planned activities for 2006 and 2007. It provides short articles about digitizing documents to preserve knowledge, INIS production statistics, the International Conference on Knowledge Management in Nuclear Facilities, supporting education and training, the School of Nuclear Knowledge Management and Coordinated Research Projects on Knowledge Preservation

  5. Unraveling The Connectome: Visualizing and Abstracting Large-Scale Connectomics Data

    KAUST Repository

    Al-Awami, Ali K.

    2017-04-30

    We explore visualization and abstraction approaches to represent neuronal data. Neuroscientists acquire electron microscopy volumes to reconstruct a complete wiring diagram of the neurons in the brain, called the connectome. This will be crucial to understanding brains and their development. However, the resulting data is complex and large, posing a big challenge to existing visualization techniques in terms of clarity and scalability. We describe solutions to tackle the problems of scalability and cluttered presentation. We first show how a query-guided interactive approach to visual exploration can reduce the clutter and help neuroscientists explore their data dynamically. We use a knowledge-based query algebra that facilitates the interactive creation of queries. This allows neuroscientists to pose domain-specific questions related to their research. Simple queries can be combined to form complex queries to answer more sophisticated questions. We then show how visual abstractions from 3D to 2D can significantly reduce the visual clutter and add clarity to the visualization so that scientists can focus more on the analysis. We abstract the topology of 3D neurons into a multi-scale, relative distance-preserving subway map visualization that allows scientists to interactively explore the morphological and connectivity features of neuronal cells. We then focus on the process of acquisition, where neuroscientists segment electron microscopy images to reconstruct neurons. The segmentation process of such data is tedious, time-intensive, and usually performed using a diverse set of tools. We present a novel web-based visualization system for tracking the state, progress, and evolution of segmentation data in neuroscience. Our multi-user system seamlessly integrates a diverse set of tools. Our system provides support for the management, provenance, accountability, and auditing of large-scale segmentations. Finally, we present a novel architecture to render very large

  6. Large-scale sequential quadratic programming algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Eldersveld, S.K.

    1992-09-01

    The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: 1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having available the full Hessian approximation is studied and alternative estimates are constructed. 2. The use of a transformation matrix Q. This allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained. 3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven. 4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.
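
    For orientation, the features listed above modify the standard SQP subproblem, written here in generic notation rather than the report's: at iterate x_k, the search direction p solves

        \[
          \min_{p} \; \nabla f(x_k)^{T} p + \tfrac{1}{2} p^{T} H_k p
          \quad \text{subject to} \quad c(x_k) + J(x_k) p \ge 0,
        \]

    where H_k approximates the Hessian of the Lagrangian and J is the constraint Jacobian. The reduced-Hessian idea of feature 1 maintains a quasi-Newton approximation of Z^T H_k Z only, with Z the null-space basis of feature 3, which is what keeps the curvature storage manageable at large scale.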

  7. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In a future power system with additional wind power capacity there will be an increased need for large-scale power management as well as reliable balancing and reserve capabilities. Different technologies for large-scale electricity storage provide solutions to the different challenges arising w...

  8. Using relational databases for improved sequence similarity searching and large-scale genomic analyses.

    Science.gov (United States)

    Mackey, Aaron J; Pearson, William R

    2004-10-01

    Relational databases are designed to integrate diverse types of information and manage large sets of search results, greatly simplifying genome-scale analyses. Relational databases are essential for management and analysis of large-scale sequence analyses, and can also be used to improve the statistical significance of similarity searches by focusing on subsets of sequence libraries most likely to contain homologs. This unit describes using relational databases to improve the efficiency of sequence similarity searching and to demonstrate various large-scale genomic analyses of homology-related data. This unit describes the installation and use of a simple protein sequence database, seqdb_demo, which is used as a basis for the other protocols. These include basic use of the database to generate a novel sequence library subset, how to extend and use seqdb_demo for the storage of sequence similarity search results and making use of various kinds of stored search results to address aspects of comparative genomic analysis.
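
    As an illustration of the library-subsetting idea, the sketch below stores protein sequences in SQLite and writes a taxon-restricted subset as FASTA for a downstream similarity search; the schema, accessions and sequences are hypothetical and do not reproduce the actual seqdb_demo layout:

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
            CREATE TABLE protein (
                acc      TEXT PRIMARY KEY,  -- accession
                taxon_id INTEGER,           -- NCBI taxonomy id
                seq      TEXT               -- amino-acid sequence
            );
            INSERT INTO protein VALUES
                ('P00001', 9606, 'MKTAYIAKQR'),
                ('P00002', 9606, 'MVLSPADKTN'),
                ('P00003', 4932, 'MSTNPKPQRK');
        """)

        # Export the subset most likely to contain homologs (here: human
        # proteins) for a search tool such as SSEARCH or BLAST.
        with open("human_subset.fasta", "w") as out:
            for acc, seq in conn.execute(
                    "SELECT acc, seq FROM protein WHERE taxon_id = ?", (9606,)):
                out.write(f">{acc}\n{seq}\n")

    Restricting the search library this way both speeds up the scan and, as the unit notes, can improve the statistical significance of hits by focusing on the sequences most likely to contain homologs.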

  9. Extraction of relations between genes and diseases from text and large-scale data analysis: implications for translational research.

    Science.gov (United States)

    Bravo, Àlex; Piñero, Janet; Queralt-Rosinach, Núria; Rautschka, Michael; Furlong, Laura I

    2015-02-21

    Current biomedical research needs to leverage and exploit the large amount of information reported in scientific publications. Automated text mining approaches, in particular those aimed at finding relationships between entities, are key for identification of actionable knowledge from free text repositories. We present the BeFree system aimed at identifying relationships between biomedical entities with a special focus on genes and their associated diseases. By exploiting morpho-syntactic information of the text, BeFree is able to identify gene-disease, drug-disease and drug-target associations with state-of-the-art performance. The application of BeFree to real-case scenarios shows its effectiveness in extracting information relevant for translational research. We show the value of the gene-disease associations extracted by BeFree through a number of analyses and integration with other data sources. BeFree succeeds in identifying genes associated to a major cause of morbidity worldwide, depression, which are not present in other public resources. Moreover, large-scale extraction and analysis of gene-disease associations, and integration with current biomedical knowledge, provided interesting insights on the kind of information that can be found in the literature, and raised challenges regarding data prioritization and curation. We found that only a small proportion of the gene-disease associations discovered by using BeFree is collected in expert-curated databases. Thus, there is a pressing need to find alternative strategies to manual curation, in order to review, prioritize and curate text-mining data and incorporate it into domain-specific databases. We present our strategy for data prioritization and discuss its implications for supporting biomedical research and applications. BeFree is a novel text mining system that performs competitively for the identification of gene-disease, drug-disease and drug-target associations. Our analyses show that mining only a
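
    To make the extraction task concrete, the toy sketch below finds sentence-level gene-disease co-occurrences; this is only the naive baseline that relation-extraction systems improve upon (BeFree itself additionally exploits morpho-syntactic information), and the dictionaries are illustrative:

        import itertools
        import re

        # Toy gazetteers; a real system would load curated gene and
        # disease vocabularies instead of these illustrative entries.
        GENES = {"BDNF", "SLC6A4", "TPH2"}
        DISEASES = {"depression", "anxiety"}

        def cooccurring_pairs(sentence):
            """Return (gene, disease) pairs mentioned in one sentence."""
            tokens = set(re.findall(r"[A-Za-z0-9]+", sentence))
            genes = GENES & tokens
            diseases = {d for d in DISEASES if d in sentence.lower()}
            return list(itertools.product(sorted(genes), sorted(diseases)))

        print(cooccurring_pairs("BDNF polymorphisms are associated with depression."))
        # [('BDNF', 'depression')]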

  10. Evolution of scaling emergence in large-scale spatial epidemic spreading.

    Science.gov (United States)

    Wang, Lin; Li, Xiang; Zhang, Yi-Qing; Zhang, Yan; Zhang, Kan

    2011-01-01

    Zipf's law and Heaps' law are two representative scaling concepts, which play a significant role in the study of complexity science. The coexistence of Zipf's law and Heaps' law motivates different understandings of the dependence between these two scalings, which has hardly been clarified. In this article, we observe an evolution process of the scalings: Zipf's law and Heaps' law are naturally shaped to coexist at the initial time, while a crossover comes with the emergence of their inconsistency at later times, before reaching a stable state in which Heaps' law still holds while strict Zipf's law has disappeared. These findings are illustrated with a scenario of large-scale spatial epidemic spreading, and the empirical results for pandemic disease support a universal analysis of the relation between the two laws regardless of the biological details of the disease. Employing United States domestic air transportation and demographic data to construct a metapopulation model for simulating the pandemic spread at the U.S. country level, we uncover that the broad heterogeneity of the infrastructure plays a key role in the evolution of scaling emergence. The analyses of large-scale spatial epidemic spreading help us understand the temporal evolution of scalings, indicating that the coexistence of Zipf's law and Heaps' law depends on the collective dynamics of epidemic processes, and the heterogeneity of epidemic spread indicates the significance of performing targeted containment strategies at the early stage of a pandemic disease.
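
    For reference, the two scaling laws at issue are commonly written in the following standard textbook forms (the notation is ours, not the paper's):

        f(r) \propto r^{-\alpha}                    (Zipf's law: frequency of the r-th ranked item)
        N(t) \propto t^{\beta}, \quad 0 < \beta \le 1   (Heaps' law: number of distinct items after t observations)

    The crossover described in the abstract corresponds to the rank-frequency curve losing its strict power-law form while the sublinear growth of distinct items persists.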

  11. Developing knowledge level scale of functional foods: Validity and ...

    African Journals Online (AJOL)

    The aim of the study was to develop a scale to determine the knowledge levels of University students on functional foods and to investigate the validity and reliability of the scale. The research was conducted on 417 (209 girls and 208 boys) undergraduate students at Selcuk University regarding functional foods.

  12. Large-Scale Structure and Hyperuniformity of Amorphous Ices

    Science.gov (United States)

    Martelli, Fausto; Torquato, Salvatore; Giovambattista, Nicolas; Car, Roberto

    2017-09-01

    We investigate the large-scale structure of amorphous ices and transitions between their different forms by quantifying their large-scale density fluctuations. Specifically, we simulate the isothermal compression of low-density amorphous ice (LDA) and hexagonal ice to produce high-density amorphous ice (HDA). Both HDA and LDA are nearly hyperuniform; i.e., they are characterized by an anomalous suppression of large-scale density fluctuations. By contrast, in correspondence with the nonequilibrium phase transitions to HDA, the presence of structural heterogeneities strongly suppresses the hyperuniformity and the system becomes hyposurficial (devoid of "surface-area fluctuations"). Our investigation challenges the largely accepted "frozen-liquid" picture, which views glasses as structurally arrested liquids. Beyond implications for water, our findings enrich our understanding of pressure-induced structural transformations in glasses.
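
    For orientation, hyperuniformity has a standard quantitative definition in the literature (a general definition, not this paper's specific measure): the structure factor vanishes in the long-wavelength limit,

        \lim_{k \to 0} S(k) = 0,

    equivalently, the number variance in a spherical observation window of radius R grows more slowly than the window volume, e.g. \sigma_N^2(R) \sim R^{d-1} (surface-like scaling) rather than the Poisson-like \sigma_N^2(R) \sim R^d.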

  13. Large-area perovskite nanowire arrays fabricated by large-scale roll-to-roll micro-gravure printing and doctor blading

    Science.gov (United States)

    Hu, Qiao; Wu, Han; Sun, Jia; Yan, Donghang; Gao, Yongli; Yang, Junliang

    2016-02-01

    Organic-inorganic hybrid halide perovskite nanowires (PNWs) show great potential for applications in electronic and optoelectronic devices such as solar cells, field-effect transistors and photodetectors. It is therefore highly desirable to fabricate ordered, large-area PNW arrays and thereby accelerate their application and commercialization in such devices. Herein, highly oriented and ultra-long methylammonium lead iodide (CH3NH3PbI3) PNW array thin films were fabricated by large-scale roll-to-roll (R2R) micro-gravure printing and doctor blading in ambient environments (humidity ~45%, temperature ~28 °C), which produced PNW lengths as long as 15 mm. Furthermore, photodetectors based on these PNWs were successfully fabricated on both silicon oxide (SiO2) and flexible polyethylene terephthalate (PET) substrates and showed moderate performance. This study provides low-cost, large-scale techniques to fabricate large-area PNW arrays with great potential applications in flexible electronic and optoelectronic devices.

  14. Atlas of knowledge anyone can map

    CERN Document Server

    Börner, Katy

    2015-01-01

    Maps of physical spaces locate us in the world and help us navigate unfamiliar routes. Maps of topical spaces help us visualize the extent and structure of our collective knowledge; they reveal bursts of activity, pathways of ideas, and borders that beg to be crossed. This book, from the author of Atlas of Science, describes the power of topical maps, providing readers with principles for visualizing knowledge and offering as examples forty large-scale and more than 100 small-scale full-color maps. Today, data literacy is becoming as important as language literacy. Well-designed visualizations can rescue us from a sea of data, helping us to make sense of information, connect ideas, and make better decisions in real time. In Atlas of Knowledge, leading visualization expert Katy Börner makes the case for a systems science approach to science and technology studies and explains different types and levels of analysis. Drawing on fifteen years of teaching and tool development, she introduces a theoretical framewor...

  15. Information Sharing and Knowledge Sharing as Communicative Activities

    Science.gov (United States)

    Savolainen, Reijo

    2017-01-01

    Introduction: This paper elaborates the picture of information sharing and knowledge sharing as forms of communicative activity. Method: A conceptual analysis was made to find out how researchers have approached information sharing and knowledge sharing from the perspectives of transmission and ritual. The findings are based on the analysis of one…

  16. Temporal scaling in information propagation

    Science.gov (United States)

    Huang, Junming; Li, Chao; Wang, Wen-Qiang; Shen, Hua-Wei; Li, Guojie; Cheng, Xue-Qi

    2014-06-01

    For the study of information propagation, one fundamental problem is uncovering the universal laws governing its dynamics. From the microscopic perspective, this problem is formulated as estimating the propagation probability that a piece of information propagates from one individual to another. Such a propagation probability generally depends on two major classes of factors: the intrinsic attractiveness of the information and the interactions between individuals. Although the temporal effect of attractiveness is widely studied, the temporal laws underlying individual interactions remain unclear, causing inaccurate prediction of information propagation on evolving social networks. In this report, we empirically study the dynamics of information propagation, using a dataset from a population-scale social media website. We discover a temporal scaling in information propagation: the probability that a message propagates between two individuals decays with the length of time elapsed since their latest interaction, obeying a power-law rule. Leveraging the scaling law, we further propose a temporal model to estimate future propagation probabilities between individuals, reducing the error rate of information propagation prediction from 6.7% to 2.6% and yielding 9.7% incremental customers in a viral marketing application.
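
    The reported decay can be summarized schematically (the exponent symbol is ours; the abstract does not give its value):

        p(\Delta t) \propto \Delta t^{-\gamma},

    where \Delta t is the time elapsed since the latest interaction between the two individuals and p is the probability that a message propagates between them.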

  17. Combined process automation for large-scale EEG analysis.

    Science.gov (United States)

    Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E

    2012-01-01

    Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
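
    Steps (2) and (5) of such a pipeline can be sketched with standard signal-processing tools (a generic SciPy-based sketch under assumed parameters, not the authors' implementation):

        import numpy as np
        from scipy.signal import butter, filtfilt, welch

        fs = 1000.0                       # sampling rate in Hz (assumed)
        t = np.arange(0, 10, 1 / fs)
        eeg = np.random.randn(t.size)     # stand-in for a recorded EEG trace

        def band_waveform(x, lo, hi, fs, order=4):
            """Step 2: user-defined band frequency waveform via zero-phase filtering."""
            b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
            return filtfilt(b, a, x)

        theta = band_waveform(eeg, 4.0, 8.0, fs)

        # Step 5: power spectral density analysis via Welch's method.
        freqs, psd = welch(eeg, fs=fs, nperseg=2048)
        print(freqs[np.argmax(psd)], "Hz peak")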

  18. The impact of mobile phones on knowledge access and transfer of small-scale horticultural farmers in Tanzania

    Directory of Open Access Journals (Sweden)

    Krone, Madlen

    2014-09-01

    Full Text Available Agriculture is the main economic activity in Tanzania and the country's largest employer, providing a livelihood for at least 80 % of the economically active population. Many studies have identified key challenges facing the sector in Africa in general, among these a lack of access to knowledge. For agricultural producers, access to knowledge is important for improved productivity and competitiveness. The fast diffusion of information and communication technologies (ICT) such as mobile phones across Africa in recent years has improved the access to and transfer of agricultural knowledge. Studies have shown that rural actors like farmers in remote areas even use mobile phones for their farming business. Based on qualitative interviews in the Mwanza Region in northwestern Tanzania, this study aims to identify and categorise the different types of knowledge which are transferred via mobile phones. Our results show that mobile phones enlarge farmers' ability to access business-relevant knowledge at an increasing spatial scale. However, the effects depend on the type of knowledge and other factors. The results add to existing studies by deepening the understanding of the benefits of ICT for knowledge access and transfer in the context of rural small-scale farmers in Tanzania.

  19. Cerebral methodology based computing to estimate real phenomena from large-scale nuclear simulation

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2011-01-01

    Our final goal is to estimate real phenomena from large-scale nuclear simulations by using computing processes. Here, 'large-scale' means that the simulations involve such a variety of scales and physical complexity that corresponding experiments and/or theories do not exist. In the nuclear field, it is indispensable to estimate real phenomena from simulations in order to improve the safety and security of nuclear power plants. The analysis of uncertainty included in simulations is needed to reveal the sensitivity of uncertainty due to randomness, to reduce the uncertainty due to lack of knowledge, and to establish a degree of certainty through verification and validation (V and V) and uncertainty quantification (UQ) processes. To realize this, we propose 'Cerebral Methodology based Computing (CMC)' as a set of computing processes with deductive and inductive approaches modelled on human reasoning. Our idea is to execute deductive and inductive simulations corresponding to the deductive and inductive approaches. We have established a prototype system and applied it to a thermal displacement analysis of a nuclear power plant. The result shows that our idea is effective in reducing the uncertainty and obtaining a degree of certainty. (author)

  20. Double inflation: A possible resolution of the large-scale structure problem

    International Nuclear Information System (INIS)

    Turner, M.S.; Villumsen, J.V.; Vittorio, N.; Silk, J.; Juszkiewicz, R.

    1986-11-01

    A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in large small-scale and small large-scale density fluctuations. This bimodal density fluctuation spectrum in an Ω = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of ∼100 Mpc, while the small-scale structure over ≤ 10 Mpc resembles that in a low density universe, as observed. Detailed analytical calculations and numerical simulations are given of the spatial and velocity correlations. 38 refs., 6 figs

  1. Large-scale fracture mechanics testing -- requirements and possibilities

    International Nuclear Information System (INIS)

    Brumovsky, M.

    1993-01-01

    Application of fracture mechanics to very important and/or complicated structures, like reactor pressure vessels, also raises questions about the reliability and precision of such calculations. These problems become more pronounced under elastic-plastic loading conditions and/or in parts with non-homogeneous materials (base metal and austenitic cladding, property gradients through the material thickness) or with non-homogeneous stress fields (nozzles, bolt threads, residual stresses etc.). For such special cases some verification by large-scale testing is necessary and valuable. This paper discusses problems connected with planning such experiments with respect to their limitations and the requirements for a good transfer of the results to an actual vessel. At the same time, the possibilities of small-scale model experiments are analysed, mostly in connection with transferring results between standard, small-scale and large-scale experiments. Experience from 30 years of large-scale testing at SKODA is used as an example to support this analysis. 1 fig

  2. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    The paper discusses which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach...

  3. Introduction of an agent-based multi-scale modular architecture for dynamic knowledge representation of acute inflammation

    Directory of Open Access Journals (Sweden)

    An Gary

    2008-05-01

    Full Text Available Abstract Background: One of the greatest challenges facing biomedical research is the integration and sharing of vast amounts of information, not only for individual researchers, but also for the community at large. Agent Based Modeling (ABM) can provide a means of addressing this challenge via a unifying translational architecture for dynamic knowledge representation. This paper presents a series of linked ABMs representing multiple levels of biological organization. They are intended to translate the knowledge derived from in vitro models of acute inflammation to clinically relevant phenomena such as multiple organ failure. Results and Discussion: ABM development followed a sequence starting with relatively direct translation from in vitro-derived rules into a cell-as-agent level ABM, leading on to ABMs concatenated into multi-tissue models, and eventually resulting in topologically linked aggregate multi-tissue ABMs modeling organ-organ crosstalk. As an underlying design principle, organs were considered to be functionally composed of an epithelial surface, which determined organ integrity, and an endothelial/blood interface, representing the reaction surface for the initiation and propagation of inflammation. The development of the epithelial ABM derived from an in vitro model of gut epithelial permeability is described. Next, the epithelial ABM was concatenated with the endothelial/inflammatory cell ABM to produce an organ model of the gut. This model was validated against in vivo models of the inflammatory response of the gut to ischemia. Finally, the gut ABM was linked to a similarly constructed pulmonary ABM to simulate the gut-pulmonary axis in the pathogenesis of multiple organ failure. The behavior of this model was validated against in vivo and clinical observations on the cross-talk between these two organ systems. Conclusion: A series of ABMs are presented extending from the level of intracellular mechanism to clinically observed behavior
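
    To make the cell-as-agent idea concrete, the following is a deliberately minimal, hypothetical sketch (not the authors' model): epithelial-cell agents on a grid lose integrity under a stochastically spreading inflammatory state, and organ integrity is read off as the grid mean.

        import random

        SIZE = 20                      # grid of epithelial-cell agents
        STEPS = 50
        DAMAGE_RATE = 0.05             # illustrative parameters, not from the paper

        # Each cell carries an integrity level; one cell starts inflamed.
        integrity = [[1.0] * SIZE for _ in range(SIZE)]
        inflamed = {(SIZE // 2, SIZE // 2)}

        def neighbours(i, j):
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < SIZE and 0 <= nj < SIZE:
                    yield ni, nj

        for _ in range(STEPS):
            new_inflamed = set(inflamed)
            for (i, j) in inflamed:
                integrity[i][j] = max(0.0, integrity[i][j] - DAMAGE_RATE)
                for n in neighbours(i, j):
                    if random.random() < 0.1:   # stochastic spread of inflammation
                        new_inflamed.add(n)
            inflamed = new_inflamed

        mean_integrity = sum(map(sum, integrity)) / SIZE ** 2
        print(f"mean epithelial integrity after {STEPS} steps: {mean_integrity:.2f}")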

  4. Access to Information About Stuttering and Societal Knowledge of Stuttering.

    Science.gov (United States)

    Gabel, Rodney; Brackenbury, Tim; Irani, Farzan

    2010-08-01

    The purpose of this study was to examine societal knowledge of stuttering, access to information sources, and the influence of information sources on knowledge of stuttering. A total of 185 participants from Northwest Ohio were surveyed. Results indicated that the general public varies in its knowledge of stuttering, that the majority of participants had not accessed information about stuttering, and that the few who had did so a long time ago. Finally, access to information sources had little influence on knowledge of stuttering. Implications for future research are discussed.

  5. The changing face of informed surgical consent.

    LENUS (Irish Health Repository)

    Oosthuizen, J C

    2012-03-01

    To determine whether procedure-specific brochures improve patients' pre-operative knowledge, to determine the amount of information expected by patients during the consenting process, and to determine whether the recently proposed 'Request for Treatment' consenting process is viable on a large scale.

  6. Modeling the impact of large-scale energy conversion systems on global climate

    International Nuclear Information System (INIS)

    Williams, J.

    There are three energy options which could satisfy a projected energy requirement of about 30 TW and these are the solar, nuclear and (to a lesser extent) coal options. Climate models can be used to assess the impact of large scale deployment of these options. The impact of waste heat has been assessed using energy balance models and general circulation models (GCMs). Results suggest that the impacts are significant when the heat input is very high, and studies of more realistic scenarios are required. Energy balance models, radiative-convective models and a GCM have been used to study the impact of doubling the atmospheric CO2 concentration. State-of-the-art models estimate a surface temperature increase of 1.5-3.0 °C with large amplification near the poles, but much uncertainty remains. Very few model studies have been made of the impact of particles on global climate; more information on the characteristics of particle input is required. The impact of large-scale deployment of solar energy conversion systems has received little attention, but model studies suggest that large scale changes in surface characteristics associated with such systems (surface heat balance, roughness, hydrological characteristics and ocean surface temperature) could have significant global climatic effects. (Auth.)

  7. An Evaluation of Applying Knowledge Base to Academic Information Service

    OpenAIRE

    Seok-Hyoung Lee; Hwan-Min Kim; Ho-Seop Choe

    2013-01-01

    Through a series of precise text handling processes, including automatic extraction of information from documents with knowledge from various fields, recognition of entity names, detection of core topics, analysis of the relations between the extracted information and topics, and automatic inference of new knowledge, the most efficient knowledge base of the relevant field is created; plans to apply this knowledge base to information and knowledge management services are the core requirements necessa...

  8. Comparison Between Overtopping Discharge in Small and Large Scale Models

    DEFF Research Database (Denmark)

    Helgason, Einar; Burcharth, Hans F.

    2006-01-01

    The present paper presents overtopping measurements from small scale model tests performed at the Hydraulic & Coastal Engineering Laboratory, Aalborg University, Denmark and large scale model tests performed at the Large Wave Channel, Hannover, Germany. Comparison between results obtained from small and large scale model tests shows no clear evidence of scale effects for overtopping above a threshold value. In the large scale model, no overtopping was measured for wave heights below Hs = 0.5 m, as the water sank into the voids between the stones on the crest. For low overtopping scale effects...

  9. Nuclear information and knowledge, No. 7, June 2009

    International Nuclear Information System (INIS)

    Lecossois, B.

    2009-06-01

    This bi-annual newsletter reports on the activities of the INIS and Nuclear Knowledge Management Section of the IAEA Department of Nuclear Energy. Issue no. 7 is centred on cooperation and partnerships in nuclear information, focusing specifically on two international networks coordinated by the IAEA's nuclear information and knowledge management services: the International Nuclear Information System (INIS) and the International Nuclear Libraries Network (INLN). Table of contents: To our Readers; INIS and International Cooperation in Nuclear Information; INLN: Facilitating Exchange and Building Partnerships; News from INIS and NKM; Recent Publications; IAEA Library Update; 2009 Meetings

  10. Dynamic classification system in large-scale supervision of energy efficiency in buildings

    International Nuclear Information System (INIS)

    Kiluk, S.

    2014-01-01

    Highlights: • Rough set approximation of classification improves energy efficiency prediction. • Dynamic features of diagnostic classification allow for its precise prediction. • Indiscernibility in large population enhances identification of process features. • Diagnostic information can be refined by dynamic references to local neighbourhood. • We introduce data exploration validation based on system dynamics and uncertainty. - Abstract: Data mining and knowledge discovery applied to billing data provide diagnostic instruments for evaluating energy use in buildings connected to a district heating network. To ensure the validity of the algorithm-based classification system, the dynamic properties of a sequence of partitions for consecutively detected events were investigated. The dynamic properties of the classification system concern the similarities between the supervised objects and the migrations that originate from changes in building energy use and from loss of similarity to the objects' neighbourhood; they thus represent a refinement of knowledge. In this study, we demonstrate that algorithm-based diagnostic knowledge has dynamic properties that can be exploited with a rough set predictor to evaluate whether a classification implemented for the supervision of energy use keeps up with the dynamics of change in the properties of district heating-supplied buildings. Moreover, we demonstrate how current knowledge refines previous findings, and we present the creation of predictive diagnostic systems based on knowledge dynamics with a satisfactory level of classification error, even for non-stationary data

  11. Asynchronous Two-Level Checkpointing Scheme for Large-Scale Adjoints in the Spectral-Element Solver Nek5000

    Energy Technology Data Exchange (ETDEWEB)

    Schanen, Michel; Marin, Oana; Zhang, Hong; Anitescu, Mihai

    2016-01-01

    Adjoints are an important computational tool for large-scale sensitivity evaluation, uncertainty quantification, and derivative-based optimization. An essential component of their performance is the storage/recomputation balance in which efficient checkpointing methods play a key role. We introduce a novel asynchronous two-level adjoint checkpointing scheme for multistep numerical time discretizations targeted at large-scale numerical simulations. The checkpointing scheme combines bandwidth-limited disk checkpointing and binomial memory checkpointing. Based on assumptions about the target petascale systems, which we later demonstrate to be realistic on the IBM Blue Gene/Q system Mira, we create a model of the expected performance of our checkpointing approach and validate it using the highly scalable Navier-Stokes spectral-element solver Nek5000 on small to moderate subsystems of the Mira supercomputer. In turn, this allows us to predict optimal algorithmic choices when using all of Mira. We also demonstrate that two-level checkpointing is significantly superior to single-level checkpointing when adjoining a large number of time integration steps. To our knowledge, this is the first time two-level checkpointing has been designed, implemented, tuned, and demonstrated on fluid dynamics codes at a large scale of 50k+ cores.
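
    The storage/recomputation balance can be illustrated with a toy, single-level checkpoint/recompute skeleton (illustrative only: the paper's scheme is two-level, combining binomial memory checkpointing with bandwidth-limited disk checkpoints, and step/adjoint_step below are placeholders for a real integrator):

        K = 10                                   # checkpoint interval (assumed)
        N = 100                                  # number of forward steps

        def step(u):              return u + 1   # stand-in forward update
        def adjoint_step(lam, u): return lam     # stand-in adjoint update

        # Forward pass: keep only every K-th state (the "checkpoints").
        u, checkpoints = 0, {0: 0}
        for n in range(1, N + 1):
            u = step(u)
            if n % K == 0:
                checkpoints[n] = u

        # Reverse pass: recompute each segment forward from its checkpoint,
        # then run the adjoint backwards through the recomputed states.
        lam = 1.0
        for seg_end in range(N, 0, -K):
            seg_start = seg_end - K
            u = checkpoints[seg_start]
            states = [u]
            for _ in range(seg_start, seg_end):
                u = step(u)
                states.append(u)
            for u in reversed(states[:-1]):
                lam = adjoint_step(lam, u)
        print("adjoint after reverse sweep:", lam)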

  12. Neighborhood Discriminant Hashing for Large-Scale Image Retrieval.

    Science.gov (United States)

    Tang, Jinhui; Li, Zechao; Wang, Meng; Zhao, Ruizhen

    2015-09-01

    With the proliferation of large-scale community-contributed images, hashing-based approximate nearest neighbor search in huge databases has aroused considerable interest in the computer vision and multimedia communities in recent years because of its computational and memory efficiency. In this paper, we propose a novel hashing method named neighborhood discriminant hashing (NDH) to implement approximate similarity search. Different from previous work, we propose to learn a discriminant hashing function by exploiting local discriminative information, i.e., the labels of a sample can be inherited from the neighbor samples it selects. The hashing function is expected to be orthogonal to avoid redundancy in the learned hashing bits as much as possible, while an information-theoretic regularization is jointly exploited using the maximum entropy principle. As a consequence, the learned hashing function is compact and nonredundant among bits, while each bit is highly informative. Extensive experiments carried out on four publicly available data sets demonstrate that the proposed NDH method outperforms state-of-the-art hashing techniques.
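
    For intuition about hashing-based approximate search in general, here is a generic random-projection baseline (an LSH-style stand-in; NDH itself learns its projections discriminatively from neighborhood labels rather than drawing them at random):

        import numpy as np

        rng = np.random.default_rng(0)
        d, bits, n = 128, 32, 10_000

        X = rng.standard_normal((n, d))         # database feature vectors
        W = rng.standard_normal((d, bits))      # random projection standing in
                                                # for NDH's learned hashing function
        codes = (X @ W > 0).astype(np.uint8)    # binary hash codes

        def search(q, k=5):
            """Approximate NN: rank database items by Hamming distance to q."""
            qcode = (q @ W > 0).astype(np.uint8)
            dist = np.count_nonzero(codes != qcode, axis=1)
            return np.argsort(dist)[:k]

        print(search(rng.standard_normal(d)))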

  13. Nuclear information and knowledge. News from the INIS and Nuclear Knowledge Management Section. No. 3, March 2007

    International Nuclear Information System (INIS)

    Dyck, E.; Gowin, P.J.

    2007-03-01

    This newsletter, a bi-annual publication, is aimed at informing about current developments in Nuclear Knowledge Management (NKM) and the International Nuclear Information System (INIS), in particular about usage of nuclear information and developing nuclear knowledge management programmes. This third issue constitutes a review of the year 2006 in these fields and informs about some planned activities for 2007. In particular summaries are given about the IAEA Conference on Knowledge Management in Nuclear Facilities, the 33rd INIS Liaison Officer Meeting and the 2007 School of Nuclear Knowledge Management

  14. Efforts in improvement of nuclear knowledge and information management in Croatia

    International Nuclear Information System (INIS)

    Pleslic, S.; Novosel, N.

    2005-01-01

    The IAEA was authorised to exchange technical and scientific information on peaceful uses of atomic energy and established INIS in 1970 as an international bibliographic database covering the nuclear field and nuclear-related areas. Countries at different levels of technological development can derive benefits from INIS output products. The use of nuclear technology relies on the accumulation of knowledge in nuclear science and technology, including both technical information in documents and databases, and knowledge in human resources. Nuclear knowledge and information exchange are important for the process of decision-making. The IAEA supports all Members who want to transfer their practical experience to the younger generation and to archive important information through systematic knowledge preservation and information exchange. Croatia has been involved in knowledge and information management activities since 1994, when it joined INIS. Thanks to the development and application of new information technologies within the INIS information management framework, Members improve the collection, production and dissemination of nuclear knowledge and information. (author)

  15. The use of production management techniques in the construction of large scale physics detectors

    CERN Document Server

    Bazan, A; Estrella, F; Kovács, Z; Le Flour, T; Le Goff, J M; Lieunard, S; McClatchey, R; Murray, S; Varga, L Z; Vialle, J P; Zsenei, M

    1999-01-01

    The construction process of detectors for the Large Hadron Collider (LHC) experiments is large scale, heavily constrained by resource availability and evolves with time. As a consequence, changes in detector component design need to be tracked and quickly reflected in the construction process. With similar problems in industry, engineers employ so-called Product Data Management (PDM) systems to control access to documented versions of designs and managers employ so-called Workflow Management software (WfMS) to coordinate production work processes. However, PDM and WfMS software are not generally integrated in industry. The scale of LHC experiments, like CMS, demands that industrial production techniques be applied in detector construction. This paper outlines the major functions and applications of the CRISTAL system (Cooperating Repositories and an information System for Tracking Assembly Lifecycles) in use in CMS which successfully integrates PDM and WfMS techniques in managing large scale physics detector ...

  16. How International Large-Scale Skills Assessments Engage with National Actors: Mobilising Networks through Policy, Media and Public Knowledge

    Science.gov (United States)

    Hamilton, Mary

    2017-01-01

    This paper examines how international, large-scale skills assessments (ILSAs) engage with the broader societies they seek to serve and improve. It looks particularly at the discursive work that is done by different interest groups and the media through which the findings become part of public conversations and are translated into usable form in…

  17. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26-27, 1984 in Pittsburgh with nine panel members and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  18. Visual Data-Analytics of Large-Scale Parallel Discrete-Event Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ross, Caitlin; Carothers, Christopher D.; Mubarak, Misbah; Carns, Philip; Ross, Robert; Li, Jianping Kelvin; Ma, Kwan-Liu

    2016-11-13

    Parallel discrete-event simulation (PDES) is an important tool in the codesign of extreme-scale systems because PDES provides a cost-effective way to evaluate designs of high-performance computing systems. Optimistic synchronization algorithms for PDES, such as Time Warp, allow events to be processed without global synchronization among the processing elements. A rollback mechanism is provided when events are processed out of timestamp order. Although optimistic synchronization protocols enable the scalability of large-scale PDES, the performance of the simulations must be tuned to reduce the number of rollbacks and provide an improved simulation runtime. To enable efficient large-scale optimistic simulations, one has to gain insight into the factors that affect the rollback behavior and simulation performance. We developed a tool for ROSS model developers that gives them detailed metrics on the performance of their large-scale optimistic simulations at varying levels of simulation granularity. Model developers can use this information for parameter tuning of optimistic simulations in order to achieve better runtime and fewer rollbacks. In this work, we instrument the ROSS optimistic PDES framework to gather detailed statistics about the simulation engine. We have also developed an interactive visualization interface that uses the data collected by the ROSS instrumentation to understand the underlying behavior of the simulation engine. The interface connects real time to virtual time in the simulation and provides the ability to view simulation data at different granularities. We demonstrate the usefulness of our framework by performing a visual analysis of the dragonfly network topology model provided by the CODES simulation framework built on top of ROSS. The instrumentation needs to minimize overhead in order to accurately collect data about the simulation performance. To ensure that the instrumentation does not introduce unnecessary overhead, we perform a

  19. Large-scale structure of the Universe

    International Nuclear Information System (INIS)

    Doroshkevich, A.G.

    1978-01-01

    The problems discussed at the ''Large-scale Structure of the Universe'' symposium are considered at a popular level. Described are the cell structure of the galaxy distribution in the Universe and principles of mathematical modelling of the galaxy distribution. Images of cell structures obtained after computer processing are given. Three hypotheses are discussed - vortical, entropic and adiabatic - suggesting various processes of galaxy and galaxy cluster origin. A considerable advantage of the adiabatic hypothesis is recognized. The relict radiation is considered as a means of directly studying the processes taking place in the Universe. The large-scale peculiarities and small-scale fluctuations of the relict radiation temperature enable one to estimate the disturbance properties at the pre-galaxy stage. The discussion of problems pertaining to the study of the hot gas contained in galaxy clusters, the interactions within galaxy clusters and with the inter-galaxy medium, is recognized to be a notable contribution to the development of theoretical and observational cosmology

  20. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling-and-blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effect of large-scale blasts increases. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out expert assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-to-digital converter, with recording on a laptop. The results of registering surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth's surface vibrations are determined. The safety evaluation of the seismic effect was carried out according to the permissible value of vibration velocity. For cases where permissible values were exceeded, recommendations were developed to reduce the level of seismic impact.

  1. The Need for Large-Scale, Longitudinal Empirical Studies in Middle Level Education Research

    Science.gov (United States)

    Mertens, Steven B.; Caskey, Micki M.; Flowers, Nancy

    2016-01-01

    This essay describes and discusses the ongoing need for large-scale, longitudinal, empirical research studies focused on middle grades education. After a statement of the problem and concerns, the essay describes and critiques several prior middle grades efforts and research studies. Recommendations for future research efforts to inform policy…

  2. Information technology, knowledge processes, and innovation success

    NARCIS (Netherlands)

    Song, X.M.; Zang, F.; Bij, van der J.D.; Weggeman, M.C.D.P.

    2001-01-01

    Despite the obvious linkage between information technologies (IT) and knowledge processes and the apparent strategic importance of both, little research has been done to explicitly examine how, if at all, IT and knowledge processes affect firm outcomes. The purpose of this study is to bridge this

  3. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight into large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI in visualizing large flow-field pathline data. The goal of our work is to provide an optimized image-based method, which scales well with the dataset size. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathline segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
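
    The per-pixel linked list mentioned above can be sketched compactly. The following is an illustrative CPU-side sketch (GPU implementations typically build the same structure with atomic counters over flat buffers; names and layout here are assumptions):

        import numpy as np

        W, H = 4, 3                               # tiny image for illustration
        head = -np.ones((H, W), dtype=np.int64)   # per-pixel head pointer, -1 = empty
        segments, next_ptr = [], []               # flat node buffers

        def insert(x, y, segment):
            """Prepend a pathline segment to pixel (x, y)'s linked list."""
            node = len(segments)
            segments.append(segment)
            next_ptr.append(head[y, x])  # new node points at the old head
            head[y, x] = node

        def pixel_segments(x, y):
            """Walk pixel (x, y)'s list, e.g. for filtering or color-coding."""
            node = head[y, x]
            while node != -1:
                yield segments[node]
                node = next_ptr[node]

        insert(1, 2, "seg-A"); insert(1, 2, "seg-B")
        print(list(pixel_segments(1, 2)))  # ['seg-B', 'seg-A']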

  4. An inertia-free filter line-search algorithm for large-scale nonlinear programming

    Energy Technology Data Exchange (ETDEWEB)

    Chiang, Nai-Yuan; Zavala, Victor M.

    2016-02-15

    We present a filter line-search algorithm that does not require inertia information of the linear system. This feature enables the use of a wide range of linear algebra strategies and libraries, which is essential to tackle large-scale problems on modern computing architectures. The proposed approach performs curvature tests along the search step to detect negative curvature and to trigger convexification. We prove that the approach is globally convergent and we implement the approach within a parallel interior-point framework to solve large-scale and highly nonlinear problems. Our numerical tests demonstrate that the inertia-free approach is as efficient as inertia detection via symmetric indefinite factorizations. We also demonstrate that the inertia-free approach can lead to reductions in solution time because it reduces the amount of convexification needed.
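
    Schematically, a curvature test of the kind described takes the following form (our rendering of the generic technique, not necessarily the paper's exact criterion): for a search step d_k and Lagrangian Hessian approximation W_k, accept the step only if

        d_k^{\top} W_k d_k \ge \kappa \|d_k\|^2 \quad \text{for some fixed } \kappa > 0,

    and otherwise convexify by replacing W_k with W_k + \delta I for an increasing regularization \delta and recompute the step.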

  5. Large scale mapping of groundwater resources using a highly integrated set of tools

    DEFF Research Database (Denmark)

    Søndergaard, Verner; Auken, Esben; Christiansen, Anders Vest

    large areas with information from an optimum number of new investigation boreholes, existing boreholes, logs and water samples to get an integrated and detailed description of the groundwater resources and their vulnerability. Development of more time-efficient and airborne geophysical data acquisition platforms (e.g. SkyTEM) has made large-scale mapping attractive and affordable for the planning and administration of groundwater resources. The handling and optimized use of huge amounts of geophysical data covering large areas has also required a comprehensive database, where data can easily be stored

  6. Mapping the distribution of the denitrifier community at large scales (Invited)

    Science.gov (United States)

    Philippot, L.; Bru, D.; Ramette, A.; Dequiedt, S.; Ranjard, L.; Jolivet, C.; Arrouays, D.

    2010-12-01

    Little information is available regarding the landscape-scale distribution of microbial communities and its environmental determinants. Here we combined molecular approaches and geostatistical modeling to explore spatial patterns of the denitrifying community at large scales. The distribution of the denitrifying community was investigated over 107 sites in Burgundy, a 31 500 km2 region of France, using a 16 × 16 km sampling grid. At each sampling site, the abundances of denitrifiers and 42 soil physico-chemical properties were measured. The relative contributions of land use, spatial distance, climatic conditions, time and soil physico-chemical properties to the denitrifier spatial distribution were analyzed by canonical variation partitioning. Our results indicate that 43% to 85% of the spatial variation in community abundances could be explained by the measured environmental parameters, with soil chemical properties (mostly pH) being the main driver. We found spatial autocorrelation up to 740 km and used geostatistical modeling to generate predictive maps of the distribution of denitrifiers at the landscape scale. Studying the distribution of denitrifiers at large scales can help close the artificial gap between the investigation of microbial processes and microbial community ecology, thereby facilitating our understanding of the relationships between the ecology of denitrifiers and N-fluxes by denitrification.

  7. Pooling knowledge and improving safety for contracted works at a large industrial park.

    Science.gov (United States)

    Agnello, Patrizia; Ansaldi, Silvia; Bragatto, Paolo

    2015-01-01

    At a large chemical park, maintenance is contracted by the major companies operating the plants to many small firms. The cultural and psychological isolation of contractor workers was recognized as a root cause of severe accidents in recent years. This problem is common in the chemical industry. Knowledge sharing was assumed to be a good key to involving contractors and subcontractors in the safety culture and contributing to injury prevention. The selection of personal protective equipment (PPE) for maintenance works has been taken as a benchmark to demonstrate the adequateness of the proposed approach. To support plant operators, contractors and subcontractors in PPE discussions, a method has been developed. Its core is a knowledge base, organized as an ontology, suitable for inferring decisions. By means of this tool, all stakeholders have merged experience and information to find the right PPE, to be provided with an adequate training and information package. PPE selection requires sound competencies about process and environmental hazards, including major accidents, preventive and protective measures, and maintenance activities. These pieces of knowledge, previously fragmented among plant operators and contractors, had to be pooled and used to find the adequate PPE for a number of maintenance works. The PPE selection is important per se, but it is also a good opportunity to break the contractors' isolation and involve them in safety objectives. Thus, by pooling experience and practical knowledge, the common understanding of safety issues has been strengthened.

  8. An improved method to characterise the modulation of small-scale turbulence by large-scale structures

    Science.gov (United States)

    Agostini, Lionel; Leschziner, Michael; Gaitonde, Datta

    2015-11-01

    A key aspect of turbulent boundary layer dynamics is ``modulation,'' which refers to the degree to which coherent large-scale structures (LS) amplify or attenuate the intensity of the small-scale structures (SS) through large-scale linkage. In order to identify the variation of the amplitude of the SS motion, the envelope of the fluctuations needs to be determined. Mathis et al. (2009) proposed defining the envelope by low-pass filtering the modulus of the analytic signal built from the Hilbert transform of the SS. The validity of this definition as a basis for quantifying the modulated SS signal is re-examined on the basis of DNS data for a channel flow. The analysis shows that the modulus of the analytic signal is very sensitive to the skewness of its PDF, which depends, in turn, on the sign of the LS fluctuation and thus on whether these fluctuations are associated with sweeps or ejections. The conclusion is that generating an envelope by means of a low-pass filtering step leads to a significant loss of information associated with the effects of the local skewness of the PDF of the SS on the modulation process. An improved Hilbert-transform-based method is proposed to characterize the modulation of SS turbulence by LS structures
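
    The envelope construction under discussion can be reproduced with standard tools. The sketch below is a generic SciPy rendering of the Mathis et al. procedure (signal, sampling rate and filter order are assumptions):

        import numpy as np
        from scipy.signal import hilbert, butter, filtfilt

        fs = 10_000.0
        t = np.arange(0, 1, 1 / fs)
        # Synthetic small-scale signal whose amplitude is modulated at 5 Hz.
        ss = (1 + 0.5 * np.sin(2 * np.pi * 5 * t)) * np.sin(2 * np.pi * 500 * t)

        # Envelope of the small scales: modulus of the analytic signal...
        envelope = np.abs(hilbert(ss))

        # ...then low-pass filtered at a large-scale cutoff (here 20 Hz).
        b, a = butter(4, 20.0 / (fs / 2), btype="low")
        ls_envelope = filtfilt(b, a, envelope)
        print(ls_envelope.mean())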

  9. RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system

    DEFF Research Database (Denmark)

    Jensen, Tue Vissing; Pinson, Pierre

    2017-01-01

    We describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model with information for generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven... to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation....

  10. The role of large-scale, extratropical dynamics in climate change

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, T.G. [ed.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database.

  11. The role of large-scale, extratropical dynamics in climate change

    International Nuclear Information System (INIS)

    Shepherd, T.G.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database

  12. Access to Information About Stuttering and Societal Knowledge of Stuttering

    OpenAIRE

    Gabel, Rodney; Brackenbury, Tim; Irani, Farzan

    2010-01-01

    The purpose of this study was to examine societal knowledge of stuttering, access to information sources, and the influence of information sources on knowledge of stuttering. A total of 185 participants from Northwest Ohio were surveyed. Results of the study indicated that the general public varies in its knowledge of stuttering, that the majority of participants had not accessed information about stuttering, and the few who had, did so a long time ago. Finally, access to information sources had little...

  13. Large-scale network dynamics of beta-band oscillations underlie auditory perceptual decision-making

    Directory of Open Access Journals (Sweden)

    Mohsen Alavash

    2017-06-01

    Full Text Available Perceptual decisions vary in the speed at which we make them. Evidence suggests that translating sensory information into perceptual decisions relies on distributed interacting neural populations, with decision speed hinging on power modulations of the neural oscillations. Yet the dependence of perceptual decisions on the large-scale network organization of coupled neural oscillations has remained elusive. We measured magnetoencephalographic signals in human listeners who judged acoustic stimuli composed of carefully titrated clouds of tone sweeps. These stimuli were used in two task contexts, in which the participants judged the overall pitch or direction of the tone sweeps. We traced the large-scale network dynamics of the source-projected neural oscillations on a trial-by-trial basis using power-envelope correlations and graph-theoretical network discovery. In both tasks, faster decisions were predicted by higher segregation and lower integration of coupled beta-band (∼16–28 Hz) oscillations. We also uncovered the brain network states that promoted faster decisions in either lower-order auditory or higher-order control brain areas. Specifically, decision speed in judging the tone sweep direction critically relied on the nodal network configurations of anterior temporal, cingulate, and middle frontal cortices. Our findings suggest that global network communication during perceptual decision-making is implemented in the human brain by large-scale couplings between beta-band neural oscillations. The speed at which we make perceptual decisions varies. This translation of sensory information into perceptual decisions hinges on dynamic changes in neural oscillatory activity. However, the large-scale neural-network embodiment supporting perceptual decision-making is unclear. We addressed this question using two auditory perceptual decision-making tasks. Using graph-theoretical network discovery, we traced the large-scale network

  14. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960's and early 1970's an interest in testing and operating RF cavities at 1.8K motivated the development and construction of four large (300 Watt) 1.8K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved Niobium-Titanium superconductors has once again created interest in large-scale 1.8K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 Watt 1.8K system which incorporates new technology, cold compressors, to obtain the low vapor pressure for low temperature cooling. CEBAF proposes to use cold compressors to obtain 5 kW at 2.0K. Magnetic refrigerators of 10 Watt capacity or higher at 1.8K are now being developed. The state of the art of large-scale refrigeration in the range under 4K will be reviewed. 28 refs., 4 figs., 7 tabs

  15. Information and Knowledge Management at South African Law Firms

    Directory of Open Access Journals (Sweden)

    T du Plessis

    2011-07-01

    Full Text Available Global and national law firms alike operate in a challenging business environment and managing the firm's information and knowledge assets is increasingly viewed as a key factor in efficient legal service delivery. In legal practice, information management technologies, for example intranets, portals, workflow management systems, document and content management systems, case and project management systems and online dispute resolution systems are becoming important means of legal service delivery. The reason for applying information management technologies and implementing knowledge management strategies in law firms is not only to satisfy clients' growing need for a trusted online platform to interact with legal service providers, but for law firms to capitalise on their intellectual assets, to continuously modernise legal practice management, to empower lawyers, to increase productivity, to use time efficiently, to transfer skills and knowledge from senior to junior professionals, to improve service delivery and to gain competitive advantage. This article firstly reviews the role of information and knowledge management in providing an effective legal service to clients and compares foreign and South African law firms' information management related contexts, challenges and benefits. Secondly, it presents the findings of a survey conducted at South African law firms based on their knowledge management practices. The aim of the article is to provide insights into law firm knowledge management and its effect on providing legal services in an online business environment.

  16. An Algebraic Approach to Knowledge Bases Informational Equivalence

    OpenAIRE

    Plotkin, B.; Plotkin, T.

    2003-01-01

    In this paper we study the notion of knowledge from the positions of universal algebra and algebraic logic. We consider first order knowledge which is based on first order logic. We define categories of knowledge and knowledge bases. These notions are defined for the fixed subject of knowledge. The key notion of informational equivalence of two knowledge bases is introduced. We use the idea of equivalence of categories in this definition. We prove that for finite models there is a clear way t...

  17. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  18. Hypertension Knowledge-Level Scale (HK-LS): A Study on Development, Validity and Reliability

    OpenAIRE

    Erkoc, Sultan Baliz; Isikli, Burhanettin; Metintas, Selma; Kalyoncu, Cemalettin

    2012-01-01

    This study was conducted to develop a scale to measure knowledge about hypertension among Turkish adults. The Hypertension Knowledge-Level Scale (HK-LS) was generated based on content, face, and construct validity, internal consistency, test re-test reliability, and discriminative validity procedures. The final scale had 22 items with six sub-dimensions. The scale was applied to 457 individuals aged ≥18 years, and 414 of them were re-evaluated for test-retest reliability. The six sub-dimensio...

  19. How do the multiple large-scale climate oscillations trigger extreme precipitation?

    Science.gov (United States)

    Shi, Pengfei; Yang, Tao; Xu, Chong-Yu; Yong, Bin; Shao, Quanxi; Li, Zhenya; Wang, Xiaoyan; Zhou, Xudong; Li, Shu

    2017-10-01

    Identifying the links between variations in large-scale climate patterns and precipitation is of tremendous assistance in characterizing surplus or deficit of precipitation, which is especially important for evaluation of local water resources and ecosystems in semi-humid and semi-arid regions. Restricted by current limited knowledge on underlying mechanisms, statistical correlation methods are often used rather than physically based models to characterize the connections. Nevertheless, available correlation methods are generally unable to reveal the interactions among a wide range of climate oscillations and associated effects on precipitation, especially on extreme precipitation. In this work, a probabilistic analysis approach by means of a state-of-the-art Copula-based joint probability distribution is developed to characterize the aggregated behaviors for large-scale climate patterns and their connections to precipitation. This method is employed to identify the complex connections between climate patterns (Atlantic Multidecadal Oscillation (AMO), El Niño-Southern Oscillation (ENSO) and Pacific Decadal Oscillation (PDO)) and seasonal precipitation over a typical semi-humid and semi-arid region, the Haihe River Basin in China. Results show that the interactions among multiple climate oscillations are non-uniform in most seasons and phases. Certain joint extreme phases can significantly trigger extreme precipitation (flood and drought) owing to the amplification effect among climate oscillations.
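    A rough illustration of the copula idea described above: transform two climate indices to uniform margins and ask how often their extreme phases co-occur. This sketch assumes a Gaussian copula and synthetic index series; the study's actual model is more elaborate:

      # Hedged sketch: Gaussian-copula view of joint extreme phases of two
      # climate indices (e.g., ENSO and PDO). Synthetic data, illustrative only.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      n = 600  # e.g., 50 years of monthly index values
      enso = rng.standard_normal(n)
      pdo = 0.5 * enso + np.sqrt(1 - 0.25) * rng.standard_normal(n)  # correlated index

      # Probability-integral transform via ranks -> approximately uniform margins.
      u = stats.rankdata(enso) / (n + 1)
      v = stats.rankdata(pdo) / (n + 1)

      # Fit the Gaussian copula: correlation of the normal scores.
      z_u, z_v = stats.norm.ppf(u), stats.norm.ppf(v)
      rho = np.corrcoef(z_u, z_v)[0, 1]

      # Joint probability that both indices exceed their 90th percentiles,
      # estimated by Monte Carlo from the fitted copula.
      cov = [[1.0, rho], [rho, 1.0]]
      sims = rng.multivariate_normal([0.0, 0.0], cov, size=200_000)
      q90 = stats.norm.ppf(0.9)
      joint_p = np.mean((sims[:, 0] > q90) & (sims[:, 1] > q90))
      print(f"fitted rho={rho:.2f}, P(both in extreme positive phase)={joint_p:.3f}")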

  20. Informal Knowledge Institutions and Market Innovation by ...

    African Journals Online (AJOL)

    A framework is built in which small firms develop new market innovations through the utilization of knowledge acquired from informal (e.g. personal contact, network of friends, families etc.) institutions. Data was collected through a survey of 510 small and medium sized enterprises (SMEs) in knowledge intensive business ...

  1. Information Management for a Large Multidisciplinary Project

    Science.gov (United States)

    Jones, Kennie H.; Randall, Donald P.; Cronin, Catherine K.

    1992-01-01

    In 1989, NASA's Langley Research Center (LaRC) initiated the High-Speed Airframe Integration Research (HiSAIR) Program to develop and demonstrate an integrated environment for high-speed aircraft design using advanced multidisciplinary analysis and optimization procedures. The major goals of this program were to evolve the interactions among disciplines and promote sharing of information, to provide a timely exchange of information among aeronautical disciplines, and to increase the awareness of the effects each discipline has upon other disciplines. LaRC historically has emphasized the advancement of analysis techniques. HiSAIR was founded to synthesize these advanced methods into a multidisciplinary design process emphasizing information feedback among disciplines and optimization. Crucial to the development of such an environment are the definition of the required data exchanges and the methodology for both recording the information and providing the exchanges in a timely manner. These requirements demand extensive use of data management techniques, graphic visualization, and interactive computing. HiSAIR represents the first attempt at LaRC to promote interdisciplinary information exchange on a large scale using advanced data management methodologies combined with state-of-the-art, scientific visualization techniques on graphics workstations in a distributed computing environment. The subject of this paper is the development of the data management system for HiSAIR.

  2. Digital Learning Characteristics and Principles of Information Resources Knowledge Structuring

    Science.gov (United States)

    Belichenko, Margarita; Davidovitch, Nitza; Kravchenko, Yuri

    2017-01-01

    Analysis of the principles of knowledge representation in information systems has led to the necessity of improving knowledge structuring, driven by the development of software components and the new possibilities of information technologies. The article combines methodological aspects of structuring knowledge and effective usage of information…

  3. Informed Consent - Attitudes, knowledge and information concerning prenatal examination

    DEFF Research Database (Denmark)

    Dahl, Katja; Kesmodel, Ulrik; Hvidman, Lone

    estimates is low, and possible consequences if the test reveals a problem are seldom considered beforehand. A woman's attitude to prenatal examinations is found decisive for up-take of prenatal tests, with no association between a woman's attitude towards prenatal examinations and her knowledge of those tests. … Most women consider their doctor an important source of information, and state that information has influenced their decision. … Conclusions: Pregnant women favor prenatal examinations, but participation does not seem to be based on informed consent.

  4. Cluster galaxy dynamics and the effects of large-scale environment

    Science.gov (United States)

    White, Martin; Cohn, J. D.; Smit, Renske

    2010-11-01

    Advances in observational capabilities have ushered in a new era of multi-wavelength, multi-physics probes of galaxy clusters and ambitious surveys are compiling large samples of cluster candidates selected in different ways. We use a high-resolution N-body simulation to study how the influence of large-scale structure in and around clusters causes correlated signals in different physical probes and discuss some implications this has for multi-physics probes of clusters (e.g. richness, lensing, Compton distortion and velocity dispersion). We pay particular attention to velocity dispersions, matching galaxies to subhaloes which are explicitly tracked in the simulation. We find that not only do haloes persist as subhaloes when they fall into a larger host, but groups of subhaloes retain their identity for long periods within larger host haloes. The highly anisotropic nature of infall into massive clusters, and their triaxiality, translates into an anisotropic velocity ellipsoid: line-of-sight galaxy velocity dispersions for any individual halo show large variance depending on viewing angle. The orientation of the velocity ellipsoid is correlated with the large-scale structure, and thus velocity outliers correlate with outliers caused by projection in other probes. We quantify this orientation uncertainty and give illustrative examples. Such a large variance suggests that velocity dispersion estimators will work better in an ensemble sense than for any individual cluster, which may inform strategies for obtaining redshifts of cluster members. We similarly find that the ability of substructure indicators to find kinematic substructures is highly viewing angle dependent. While groups of subhaloes which merge with a larger host halo can retain their identity for many Gyr, they are only sporadically picked up by substructure indicators. We discuss the effects of correlated scatter on scaling relations estimated through stacking, both analytically and in the simulations

  5. Plant phenomics and the need for physiological phenotyping across scales to narrow the genotype-to-phenotype knowledge gap

    DEFF Research Database (Denmark)

    Grosskinsky, Dominik Kilian; Svensgaard, Jesper; Christensen, Svend

    2015-01-01

    Plants are affected by complex genome×environment×management interactions which determine phenotypic plasticity as a result of the variability of genetic components. Whereas great advances have been made in the cost-efficient and high-throughput analyses of genetic information and non-invasive phenotyping, the large-scale analyses of the underlying physiological mechanisms lag behind. The external phenotype is determined by the sum of the complex interactions of metabolic pathways and intracellular regulatory networks that is reflected in an internal, physiological, and biochemical phenotype … ultimately enabling the in silico assessment of responses under defined environments with advanced crop models. This will allow generation of robust physiological predictors also for complex traits to bridge the knowledge gap between genotype and phenotype for applications in breeding, precision farming …

  6. Nuclear information and knowledge. News from the INIS and Nuclear Knowledge Management Section. No. 2, September 2006

    International Nuclear Information System (INIS)

    2006-09-01

    This newsletter, a bi-annual publication, aims to inform readers about current developments in Nuclear Knowledge Management (NKM) and the International Nuclear Information System (INIS), in particular about the usage of nuclear information and the development of nuclear knowledge management programmes. This second issue reviews the year 2006 in these fields and describes some planned activities for 2007. In particular, the strategies, vision and mission of the International Nuclear Information System are outlined and the activities of the Nuclear Knowledge Management Unit in training the next generation of nuclear experts are described

  7. Organizational Influences on Interdisciplinary Interactions during Research and Design of Large-Scale Complex Engineered Systems

    Science.gov (United States)

    McGowan, Anna-Maria R.; Seifert, Colleen M.; Papalambros, Panos Y.

    2012-01-01

    The design of large-scale complex engineered systems (LaCES) such as an aircraft is inherently interdisciplinary. Multiple engineering disciplines, drawing from a team of hundreds to thousands of engineers and scientists, are woven together throughout the research, development, and systems engineering processes to realize one system. Though research and development (R&D) is typically focused in single disciplines, the interdependencies involved in LaCES require interdisciplinary R&D efforts. This study investigates the interdisciplinary interactions that take place during the R&D and early conceptual design phases in the design of LaCES. Our theoretical framework is informed by both engineering practices and social science research on complex organizations. This paper provides a preliminary perspective on some of the organizational influences on interdisciplinary interactions based on organization theory (specifically sensemaking), data from a survey of LaCES experts, and the authors' experience in the research and design. The analysis reveals couplings between the engineered system and the organization that creates it. Survey respondents noted the importance of interdisciplinary interactions and their significant benefit to the engineered system, such as innovation and problem mitigation. Substantial obstacles to interdisciplinarity beyond engineering are uncovered, including communication and organizational challenges. Addressing these challenges may ultimately foster greater efficiencies in the design and development of LaCES and improved system performance by assisting with the collective integration of interdependent knowledge bases early in the R&D effort. This research suggests that organizational and human dynamics heavily influence and even constrain the engineering effort for large-scale complex systems.

  8. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

    A large-scale measurement channel allows the processing of the signal coming from a single neutron sensor in three different operating modes: pulse, fluctuation and current. The study described in this note includes three parts: - A theoretical study and brief description of the large-scale channel are given, together with the results obtained so far in that domain. - The fluctuation mode is studied thoroughly and the improvements to be made are defined. The study of a linear fluctuation channel with automatic switching of scales is described and the results of the tests are given. In this large-scale channel, the data processing method is analog. - To avoid the problems generated by analog processing of the fluctuation signal, a digital data processing method is tested and its validity is established. The results obtained on a test system realized according to this method are given and a preliminary plan for further research is defined [fr]

  9. Large-scale geographic variation in distribution and abundance of Australian deep-water kelp forests.

    Directory of Open Access Journals (Sweden)

    Ezequiel M Marzinelli

    Full Text Available Despite the significance of marine habitat-forming organisms, little is known about their large-scale distribution and abundance in deeper waters, where they are difficult to access. Such information is necessary to develop sound conservation and management strategies. Kelps are main habitat-formers in temperate reefs worldwide; however, these habitats are highly sensitive to environmental change. The kelp Ecklonia radiata is the major habitat-forming organism on subtidal reefs in temperate Australia. Here, we provide large-scale ecological data encompassing the latitudinal distribution along the continent of these kelp forests, which is a necessary first step towards quantitative inferences about the effects of climatic change and other stressors on these valuable habitats. We used the Autonomous Underwater Vehicle (AUV) facility of Australia's Integrated Marine Observing System (IMOS) to survey 157,000 m2 of seabed, of which ca 13,000 m2 were used to quantify kelp covers at multiple spatial scales (10-100 m to 100-1,000 km) and depths (15-60 m) across several regions ca 2-6° latitude apart along the East and West coast of Australia. We investigated the large-scale geographic variation in distribution and abundance of deep-water kelp (>15 m depth) and their relationships with physical variables. Kelp cover generally increased with latitude despite great variability at smaller spatial scales. Maximum depth of kelp occurrence was 40-50 m. Kelp latitudinal distribution along the continent was most strongly related to water temperature and substratum availability. These extensive survey data, coupled with ongoing AUV missions, will allow for the detection of long-term shifts in the distribution and abundance of habitat-forming kelp and the organisms they support on a continental scale, and provide information necessary for successful implementation and management of conservation reserves.

  10. Journal of Information and Knowledge Management: Editorial Policies

    African Journals Online (AJOL)

    Focus and Scope. Information Impact: Journal of Information and Knowledge Management (IIJIKM) is a Journal of Library and Information Science published in Nigeria. IIJIKM is a peer review journal for librarians, information scientists, information specialist, library educators and other related practitioners to report their ...

  11. Capabilities of the Large-Scale Sediment Transport Facility

    Science.gov (United States)

    2016-04-01

    This technical note (ERDC/CHL CHETN-I-88, April 2016; approved for public release, distribution unlimited) describes the Large-Scale Sediment Transport Facility (LSTF) and recent upgrades to the measurement systems, including pump flow meters, sediment trap weigh tanks, and beach profiling lidar. The purpose of these upgrades was to increase... A detailed discussion of the original LSTF features and capabilities can be...

  12. Spatiotemporal property and predictability of large-scale human mobility

    Science.gov (United States)

    Zhang, Hai-Tao; Zhu, Tao; Fu, Dongfei; Xu, Bowen; Han, Xiao-Pu; Chen, Duxin

    2018-04-01

    Spatiotemporal characteristics of human mobility emerging from complexity on the individual scale have been extensively studied due to their application potential in human behavior prediction and recommendation, and control of epidemic spreading. We collect and investigate a comprehensive data set of human activities on large geographical scales, including both website browsing and mobile tower visits. Numerical results show that the degree of activity decays as a power law, indicating that human behaviors are reminiscent of scale-free random walks known as Lévy flights. More significantly, this study suggests that human activities on large geographical scales have specific non-Markovian characteristics, such as a two-segment power-law distribution of dwelling time and high predictability. Furthermore, a scale-free mobility model is proposed with two essential ingredients, preferential return and exploration, and a Gaussian distribution assumption on the exploration tendency parameter; it outperforms existing human mobility models at large geographical scales.
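    A toy simulation of the two ingredients named above, exploration of new locations versus preferential return to familiar ones. It assumes the standard exploration-probability form P(explore) = rho * S^(-gamma) from exploration-and-preferential-return models, with the exploration tendency rho drawn from a Gaussian across individuals; all parameter values are illustrative, not the paper's fits:

      # Hedged sketch: exploration vs. preferential return in a mobility model.
      # P(explore) = rho * S**(-gamma), where S is the number of visited locations;
      # otherwise return to a past location with probability ~ visit frequency.
      import numpy as np

      rng = np.random.default_rng(2)
      gamma = 0.2
      n_steps = 10_000

      # Exploration tendency drawn from a Gaussian, clipped to a valid range.
      rho = float(np.clip(rng.normal(loc=0.6, scale=0.1), 0.05, 1.0))

      visits = {0: 1}  # location id -> visit count; start at location 0
      next_id = 1
      for _ in range(n_steps):
          s = len(visits)
          if rng.random() < rho * s ** (-gamma):
              visits[next_id] = 1          # explore a never-visited location
              next_id += 1
          else:
              locs = list(visits)          # preferential return: weight by frequency
              weights = np.array([visits[l] for l in locs], dtype=float)
              choice = rng.choice(locs, p=weights / weights.sum())
              visits[int(choice)] += 1

      print(f"rho={rho:.2f}, distinct locations visited: {len(visits)}")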

  13. Hierarchical hybrid control of manipulators: Artificial intelligence in large scale integrated circuits

    Science.gov (United States)

    Greene, P. H.

    1972-01-01

    Both in practical engineering and in control of muscular systems, low level subsystems automatically provide crude approximations to the proper response. Through low level tuning of these approximations, the proper response variant can emerge from standardized high level commands. Such systems are expressly suited to emerging large scale integrated circuit technology. A computer, using symbolic descriptions of subsystem responses, can select and shape responses of low level digital or analog microcircuits. A mathematical theory that reveals significant informational units in this style of control and software for realizing such information structures are formulated.
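    A minimal sketch of the two-level idea described above: a high-level symbolic command selects a crude low-level response template, and a local tuning step shapes it. All names and parameters here are invented for illustration:

      # Hedged sketch: a high-level command picks a crude low-level response
      # template, and a local tuning step shapes it to the situation.
      from typing import Callable, Dict

      # Low-level "subsystems": crude approximations to the proper response.
      PRIMITIVES: Dict[str, Callable[[float], float]] = {
          "reach": lambda t: min(t, 1.0),              # ramp toward target
          "grasp": lambda t: 1.0 if t > 0.5 else 0.0,  # delayed step
      }

      def tune(primitive: Callable[[float], float], gain: float, delay: float):
          """Low-level tuning: scale and shift a primitive response."""
          return lambda t: gain * primitive(max(t - delay, 0.0))

      def execute(command: str, gain: float = 1.0, delay: float = 0.0):
          """High-level symbolic command -> selected and tuned low-level response."""
          response = tune(PRIMITIVES[command], gain, delay)
          return [round(response(t / 10), 3) for t in range(11)]  # sampled output

      print(execute("reach", gain=0.8, delay=0.2))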

  14. Large scale power suppression in a multifield landscape

    International Nuclear Information System (INIS)

    Blanco-Pillado, Jose J.; Frazer, Jonathan; Sousa, Kepa; Dias, Mafalda

    2015-01-01

    Power suppression of the cosmic microwave background on the largest observable scales could provide valuable clues about the particle physics underlying inflation. Here we consider the prospect of power suppression in the context of the multifield landscape. Based on the assumption that our observable universe emerges from a tunnelling event and that the relevant features originate purely from inflationary dynamics, we find that the power spectrum not only contains information on single-field dynamics, but also places strong constraints on all scalar fields present in the theory. We find that the simplest single-field models giving rise to power suppression do not generalise to multifield models in a straightforward way, as the resulting superhorizon evolution of the curvature perturbation tends to erase any power suppression present at horizon crossing. On the other hand, multifield effects do present a means of generating power suppression which to our knowledge has so far not been considered. We propose a mechanism to illustrate this, which we dub flume inflation

  15. Large scale power suppression in a multifield landscape

    Energy Technology Data Exchange (ETDEWEB)

    Blanco-Pillado, Jose J.; Frazer, Jonathan; Sousa, Kepa [Department of Theoretical Physics, Bizkaiako Campusa/Campus de Bizkaia, Posta Kodea 48940, Leioa, Bizkaia (Spain); Dias, Mafalda, E-mail: josejuan.blanco@ehu.es, E-mail: m.dias@sussex.ac.uk, E-mail: j.frazer@ucl.ac.uk, E-mail: kepa.sousa@ehu.es [Astronomy Centre, Department of Physics and Astronomy, School of Maths and Physical Sciences, University of Sussex, Pevensey II Building, Falmer, Brighton, BN1 9QH (United Kingdom)

    2015-08-01

    Power suppression of the cosmic microwave background on the largest observable scales could provide valuable clues about the particle physics underlying inflation. Here we consider the prospect of power suppression in the context of the multifield landscape. Based on the assumption that our observable universe emerges from a tunnelling event and that the relevant features originate purely from inflationary dynamics, we find that the power spectrum not only contains information on single-field dynamics, but also places strong constraints on all scalar fields present in the theory. We find that the simplest single-field models giving rise to power suppression do not generalise to multifield models in a straightforward way, as the resulting superhorizon evolution of the curvature perturbation tends to erase any power suppression present at horizon crossing. On the other hand, multifield effects do present a means of generating power suppression which to our knowledge has so far not been considered. We propose a mechanism to illustrate this, which we dub flume inflation.

  16. Problems of large-scale vertically-integrated aquaculture

    Energy Technology Data Exchange (ETDEWEB)

    Webber, H H; Riordan, P F

    1976-01-01

    The problems of vertically-integrated aquaculture are outlined; they are concerned with: species limitations (in the market, biological and technological); site selection, feed, manpower needs, and legal, institutional and financial requirements. The gaps in understanding of, and the constraints limiting, large-scale aquaculture are listed. Future action is recommended with respect to: types and diversity of species to be cultivated, marketing, biotechnology (seed supply, disease control, water quality and concerted effort), siting, feed, manpower, legal and institutional aids (granting of water rights, grants, tax breaks, duty-free imports, etc.), and adequate financing. The lack of hard data based on experience suggests that large-scale vertically-integrated aquaculture is a high-risk enterprise, and with the high capital investment required, banks and funding institutions are wary of supporting it. Investment in pilot projects is suggested to demonstrate that large-scale aquaculture can be a fully functional and successful business. Construction and operation of such pilot farms is judged to be in the interests of both the public and private sectors.

  17. Competency development information system - Knowledge management based competency development management tool

    International Nuclear Information System (INIS)

    Aminuddin, R.; Zainuddin, Z.; Taib, Z.; Hamid, A.H.Ab.; Hamdan, S.N.

    2007-01-01

    information on business or division level mission, vision, objectives, strategies, projects and activities. From here the desired competencies are identified and broken down into knowledge and knowledge content. From this process the organization knowledge taxonomy is derived. The next process is the knowledge needs analysis, conducted at group level and then at individual level. The level of all identified knowledge necessary to carry out planned projects and activities is assessed at group and individual level on a scale of 1-10. This process is conducted in a group led by the group leader or manager. The knowledge profile that results is presented graphically, and the knowledge gap that has to be filled through learning initiatives is clearly portrayed. Having identified the gap, the next task is to identify the knowledge sources in the form of books, journal articles, websites, laboratories, experts, vendors, electronic media and organised training, and these are keyed into the system. At this stage an individual staff member would have enough information to plan his learning and knowledge acquisition. He would then plan his learning using the training plan module. He can learn through self-directed learning or go for courses, seminars, attachments, scientific visits, or Masters and PhD studies. The time, place, budget and source of funds need to be determined. The staff biodata and development plan are also captured by the system. After implementing the training, the staff must submit a report and lessons learnt to the system. The system requires that the supervisor evaluates the training effectiveness, reviews the recommendations and lessons learnt that were submitted, and supports and facilitates the application of learning and the implementation of any useful recommendations resulting from the training. All the learning initiatives should increase the knowledge and competency level. This assessment is conducted on a regular basis to evaluate the effectiveness of learning initiatives and investment in
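    A small sketch of the gap computation described above: required versus assessed knowledge levels on a 1-10 scale, with the shortfall flagged for learning plans. The topic names and levels are invented for illustration:

      # Hedged sketch: knowledge-needs analysis on a 1-10 scale.
      # gap = required level - assessed level (only positive gaps need action).
      required = {"reactor physics": 8, "radiochemistry": 6, "QA procedures": 7}
      assessed = {"reactor physics": 5, "radiochemistry": 6, "QA procedures": 3}

      gaps = {k: max(required[k] - assessed.get(k, 0), 0) for k in required}
      for topic, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
          status = "plan learning initiative" if gap > 0 else "meets requirement"
          print(f"{topic:>16}: gap={gap}  -> {status}")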

  18. Large-scale computing with Quantum Espresso

    International Nuclear Information System (INIS)

    Giannozzi, P.; Cavazzoni, C.

    2009-01-01

    This paper gives a short introduction to Quantum Espresso: a distribution of software for atomistic simulations in condensed-matter physics, chemical physics, materials science, and to its usage in large-scale parallel computing.

  19. Short views and hints on information, knowledge and scenarios

    DEFF Research Database (Denmark)

    Kristiansson, Michael

    2011-01-01

    This article discusses information and knowledge in relation to scenario planning (van der Heijden) and considers the type of information that is relied on/included as well as the nature of knowledge produced by the scenario planning sequence. In addition, the chief tools and processes drawn upon...

  20. Effects of informed consent for individual genome sequencing on relevant knowledge.

    Science.gov (United States)

    Kaphingst, K A; Facio, F M; Cheng, M-R; Brooks, S; Eidem, H; Linn, A; Biesecker, B B; Biesecker, L G

    2012-11-01

    Increasing availability of individual genomic information suggests that patients will need knowledge about genome sequencing to make informed decisions, but prior research is limited. In this study, we examined genome sequencing knowledge before and after informed consent among 311 participants enrolled in the ClinSeq™ sequencing study. An exploratory factor analysis of knowledge items yielded two factors (sequencing limitations knowledge; sequencing benefits knowledge). In multivariable analysis, high pre-consent sequencing limitations knowledge scores were significantly related to education [odds ratio (OR): 8.7, 95% confidence interval (CI): 2.45-31.10 for post-graduate education, and OR: 3.9; 95% CI: 1.05, 14.61 for college degree compared with less than college degree] and race/ethnicity (OR: 2.4, 95% CI: 1.09, 5.38 for non-Hispanic Whites compared with other racial/ethnic groups). Mean values increased significantly between pre- and post-consent for both the sequencing limitations knowledge subscale (6.9-7.7, p < 0.0001) and the sequencing benefits knowledge subscale (7.0-7.5, p < 0.0001); increase in knowledge did not differ by sociodemographic characteristics. This study highlights gaps in genome sequencing knowledge and underscores the need to target educational efforts toward participants with less education or from minority racial/ethnic groups. The informed consent process improved genome sequencing knowledge. Future studies could examine how genome sequencing knowledge influences informed decision making. © 2012 John Wiley & Sons A/S.
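    A compact sketch of the two analysis steps reported above, an exploratory factor analysis of knowledge items followed by a logistic regression yielding odds ratios, run on synthetic data with invented variable names (not the study's dataset):

      # Hedged sketch: exploratory factor analysis of knowledge items, then
      # odds ratios for high knowledge scores from a logistic regression.
      import numpy as np
      import statsmodels.api as sm
      from sklearn.decomposition import FactorAnalysis

      rng = np.random.default_rng(3)
      n = 311
      items = rng.integers(0, 2, size=(n, 10)).astype(float)  # 10 knowledge items

      # Two-factor solution (cf. limitations vs. benefits knowledge).
      fa = FactorAnalysis(n_components=2, random_state=0)
      scores = fa.fit_transform(items)

      # High "limitations knowledge" (top half) modeled on education (0/1/2).
      high_knowledge = (scores[:, 0] > np.median(scores[:, 0])).astype(int)
      education = rng.integers(0, 3, size=n)
      X = sm.add_constant(np.column_stack([education == 1, education == 2]).astype(float))
      fit = sm.Logit(high_knowledge, X).fit(disp=False)
      print("odds ratios:", np.round(np.exp(fit.params[1:]), 2))  # vs. reference group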

  1. Enhancing health policymakers' information literacy knowledge and skill for policymaking on control of infectious diseases of poverty in Nigeria.

    Science.gov (United States)

    Uneke, Chigozie Jesse; Ezeoha, Abel Ebeh; Uro-Chukwu, Henry; Ezeonu, Chinonyelum Thecla; Ogbu, Ogbonnaya; Onwe, Friday; Edoga, Chima

    2015-01-01

    In Nigeria, one of the major challenges associated with the evidence-to-policy link in the control of infectious diseases of poverty (IDP) is deficient information literacy knowledge and skill among policymakers. There is a need for policymakers to acquire the skill to discover relevant information, accurately evaluate retrieved information and apply it correctly. To use the information literacy tool of the International Network for Availability of Scientific Publications (INASP) to enhance policymakers' knowledge and skill for policymaking on control of IDP in Nigeria. A modified "before and after" intervention study design was used in which outcomes were measured on target participants both before the intervention was implemented and after. This study was conducted in Ebonyi State, south-eastern Nigeria, and participants were career health policymakers. A two-day health-policy information literacy training workshop was organized to enhance participants' information literacy capacity. Topics covered included: introduction to information literacy; defining information problems; searching for information online; evaluating information; science information; knowledge sharing interviews; and training skills. A total of 52 policymakers attended the workshop. The pre-workshop mean rating (MNR) of knowledge and capacity for information literacy ranged from 2.15-2.97, while the post-workshop MNR ranged from 3.34-3.64 on a 4-point scale. The percentage increase in the MNR of knowledge and capacity at the end of the workshop ranged from 22.6%-55.3%. The results of this study suggest that through an information literacy training workshop policymakers can acquire the knowledge and skill to identify, capture and share the right kind of information in the right contexts to influence relevant action or a policy decision.

  2. Column-oriented datalog materialization for large knowledge graphs

    NARCIS (Netherlands)

    Urbani, Jacopo; Jacobs, Ceriel; Krötzsch, Markus

    2016-01-01

    The evaluation of Datalog rules over large Knowledge Graphs (KGs) is essential for many applications. In this paper, we present a new method of materializing Datalog inferences, which combines a column-based memory layout with novel optimization methods that avoid redundant inferences at runtime.
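    To make the idea concrete, here is a toy semi-naive Datalog materialization over column-oriented relations. This is a generic illustration of the technique the paper optimizes, not the authors' system:

      # Hedged sketch: semi-naive materialization of the transitive-closure rule
      #   reach(X, Z) :- reach(X, Y), edge(Y, Z).
      # Relations are stored column-wise; only newly derived tuples ("delta")
      # are joined at each round, avoiding redundant re-derivations.
      edge = {"src": [1, 2, 3], "dst": [2, 3, 4]}  # columnar layout

      def rows(rel):
          return set(zip(rel["src"], rel["dst"]))

      reach = rows(edge)          # base facts: reach(X, Y) :- edge(X, Y).
      delta = set(reach)
      edge_index = {}             # hash-join index on edge.src
      for s, d in rows(edge):
          edge_index.setdefault(s, []).append(d)

      while delta:
          new = {(x, z)
                 for (x, y) in delta
                 for z in edge_index.get(y, [])} - reach
          reach |= new
          delta = new             # next round joins only the new tuples

      print(sorted(reach))  # full transitive closure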

  3. Governance, Scale and the Environment: The Importance of Recognizing Knowledge Claims in Transdisciplinary Arenas

    Directory of Open Access Journals (Sweden)

    Marleen Buizer

    2011-03-01

    Full Text Available Any present day approach of the world's most pressing environmental problems involves both scale and governance issues. After all, current local events might have long-term global consequences (the scale issue) and solving complex environmental problems requires policy makers to think and govern beyond generally used time-space scales (the governance issue). To an increasing extent, the various scientists in these fields have used concepts like social-ecological systems, hierarchies, scales and levels to understand and explain the "complex cross-scale dynamics" of issues like climate change. A large part of this work manifests a realist paradigm: the scales and levels, either in ecological processes or in governance systems, are considered as "real". However, various scholars question this position and claim that scales and levels are continuously (re)constructed in the interfaces of science, society, politics and nature. Some of these critics even prefer to adopt a non-scalar approach, doing away with notions such as hierarchy, scale and level. Here we take another route, however. We try to overcome the realist-constructionist dualism by advocating a dialogue between them on the basis of exchanging and reflecting on different knowledge claims in transdisciplinary arenas. We describe two important developments, one in the ecological scaling literature and the other in the governance literature, which we consider to provide a basis for such a dialogue. We will argue that scale issues, governance practices as well as their mutual interdependencies should be considered as human constructs, although dialectically related to nature's materiality, and therefore as contested processes, requiring intensive and continuous dialogue and cooperation among natural scientists, social scientists, policy makers and citizens alike. They also require critical reflection on scientists' roles and on academic practices in general. Acknowledging knowledge claims

  4. Ontology-aided annotation, visualization and generalization of geological time-scale information from online geological map services

    NARCIS (Netherlands)

    Ma, X.; Carranza, E.J.M.; Wu, C.; Meer, F.D. van der

    2012-01-01

    Geological maps are increasingly published and shared online, whereas tools and services supporting information retrieval and knowledge discovery are underdeveloped. In this study, we developed an ontology of geological time scale by using a RDF (Resource Description Framework) model to represent

  5. Ontology-aided annotation, visualization and generalization of geological time scale information from online geological map services

    NARCIS (Netherlands)

    Ma, Marshal; Ma, X.; Carranza, E.J.M; Wu, C.; van der Meer, F.D.

    2012-01-01

    Geological maps are increasingly published and shared online, whereas tools and services supporting information retrieval and knowledge discovery are underdeveloped. In this study, we developed an ontology of geological time scale by using a Resource Description Framework model to represent the
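    As a concrete illustration of the RDF representation described in the two records above, here is a minimal sketch using the rdflib library, with a made-up namespace and only a couple of triples; the study's actual geological time-scale ontology is far richer:

      # Hedged sketch: a few RDF triples describing geological time intervals.
      # The namespace and property names are invented for illustration.
      from rdflib import Graph, Literal, Namespace, RDF, RDFS

      GTS = Namespace("http://example.org/geologic-time#")
      g = Graph()
      g.bind("gts", GTS)

      # Jurassic is a period within the Mesozoic era.
      g.add((GTS.Jurassic, RDF.type, GTS.Period))
      g.add((GTS.Jurassic, RDFS.label, Literal("Jurassic")))
      g.add((GTS.Jurassic, GTS.partOf, GTS.Mesozoic))
      g.add((GTS.Mesozoic, RDF.type, GTS.Era))

      # Query: which periods belong to the Mesozoic?
      for s in g.subjects(GTS.partOf, GTS.Mesozoic):
          print(s)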

  6. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Full Text Available Background Large-scale molecular evolutionary analyses of protein coding sequences require a number of preparatory inter-related steps, from finding gene families to generating alignments and phylogenetic trees and assessing selective pressure variation. Each phase of these analyses can represent significant challenges, particularly when working with entire proteomes (all protein coding sequences in a genome) from a large number of species. Methods We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results We have benchmarked VESPA and our results show that the method is consistent, performs well on both large scale and smaller scale datasets, and produces results in line with previously published datasets. Discussion Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  7. Literature Review: Herbal Medicine Treatment after Large-Scale Disasters.

    Science.gov (United States)

    Takayama, Shin; Kaneko, Soichiro; Numata, Takehiro; Kamiya, Tetsuharu; Arita, Ryutaro; Saito, Natsumi; Kikuchi, Akiko; Ohsawa, Minoru; Kohayagawa, Yoshitaka; Ishii, Tadashi

    2017-01-01

    Large-scale natural disasters, such as earthquakes, tsunamis, volcanic eruptions, and typhoons, occur worldwide. After the Great East Japan earthquake and tsunami, our medical support operation's experiences suggested that traditional medicine might be useful for treating the various symptoms of the survivors. However, little information is available regarding herbal medicine treatment in such situations. Considering that further disasters will occur, we performed a literature review and summarized the traditional medicine approaches for treatment after large-scale disasters. We searched PubMed and Cochrane Library for articles written in English, and Ichushi for those written in Japanese. Articles published before 31 March 2016 were included. Keywords "disaster" and "herbal medicine" were used in our search. Among studies involving herbal medicine after a disaster, we found two randomized controlled trials investigating post-traumatic stress disorder (PTSD), three retrospective investigations of trauma or common diseases, and seven case series or case reports of dizziness, pain, and psychosomatic symptoms. In conclusion, herbal medicine has been used to treat trauma, PTSD, and other symptoms after disasters. However, few articles have been published, likely due to the difficulty in designing high quality studies in such situations. Further study will be needed to clarify the usefulness of herbal medicine after disasters.

  8. Analyzing Subject Disciplines of Knowledge Originality and Knowledge Generality for Library & Information Science

    Directory of Open Access Journals (Sweden)

    Mu-Hsuan Huang

    2007-12-01

    Full Text Available This study used bibliometric methods to analyze the subject disciplines of knowledge originality and knowledge generality for Library and Information Science (LIS), using citing and cited documents from 1997 to 2006. We found that the major subject discipline of both knowledge originality and generality is still LIS, and that computer science and LIS interact and influence each other closely. It is evident that the number of subject disciplines of knowledge originality is higher than that of knowledge generality. The interdisciplinary characteristics of LIS are illustrated by the variety of areas of knowledge originality and knowledge generality. Because the number of received subject disciplines is higher than that of given subject disciplines, it suggests that LIS is an application-oriented research area. [Article content in Chinese]

  9. Journal of Information and Knowledge Management Dorcas Ejemeh

    African Journals Online (AJOL)

    Dorcas Ejemeh ... The proliferation of information in a world driven by technology requires a ... Anthony Comper, the President of the Bank of Montreal, told the 1999 ...

  10. RESTRUCTURING OF THE LARGE-SCALE SPRINKLERS

    Directory of Open Access Journals (Sweden)

    Paweł Kozaczyk

    2016-09-01

    Full Text Available One of the best ways for agriculture to become independent of precipitation shortages is irrigation. In the 1970s and 1980s a number of large-scale sprinkler systems were built in Wielkopolska. At the end of the 1970s, 67 sprinkler systems with a total area of 6400 ha were installed in the Poznan province; the average system covered 95 ha. In 1989 there were 98 systems, and the area equipped with them exceeded 10,130 ha. The study was conducted on 7 large sprinkler systems with areas ranging from 230 to 520 hectares in 1986-1998. After the introduction of the market economy in the early 1990s and the ownership changes in agriculture, the large-scale sprinkler systems suffered significant or total devastation. Land of the State Farms was leased or sold by the State Agricultural Property Agency, and the new owners used the existing sprinklers to a very small extent. This involved a change in crop structure, a change in demand structure and an increase in operating costs; there was also a threefold increase in electricity prices. In practice, the operation of large-scale irrigation encountered barriers and limitations of all kinds: system design constraints, supply difficulties and high equipment failure rates, none of which encouraged rational use of the available sprinklers. A survey of the local area showed the current status of the remaining irrigation infrastructure. The scheme adopted for the restructuring of Polish agriculture was not the best solution, causing massive destruction of assets previously invested in the sprinkler systems.

  11. Large-scale synthesis of YSZ nanopowder by Pechini method

    Indian Academy of Sciences (India)

    Administrator

    structure and chemical purity of 99.1% (by inductively coupled plasma optical emission spectroscopy) on a large scale. Keywords: sol–gel; yttria-stabilized zirconia; large scale; nanopowder; Pechini method. Introduction: Zirconia has attracted the attention of many scientists because of its tremendous thermal, mechanical ...

  12. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the potential risk to the public and property from accidental, and even more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data are much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards of a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) were conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate, and therefore the physics and hazards of large LNG spills and fires.

  13. Testing Inflation with Large Scale Structure: Connecting Hopes with Reality

    International Nuclear Information System (INIS)

    Alvarez, Marcello; Baldauf, T.; Bond, J. Richard; Dalal, N.; Putter, R. D.; Dore, O.; Green, Daniel; Hirata, Chris; Huang, Zhiqi; Huterer, Dragan; Jeong, Donghui; Johnson, Matthew C.; Krause, Elisabeth; Loverde, Marilena; Meyers, Joel; Meeburg, Daniel; Senatore, Leonardo; Shandera, Sarah; Silverstein, Eva; Slosar, Anze; Smith, Kendrick; Zaldarriaga, Matias; Assassi, Valentin; Braden, Jonathan; Hajian, Amir; Kobayashi, Takeshi; Stein, George; Engelen, Alexander van

    2014-01-01

    The statistics of primordial curvature fluctuations are our window into the period of inflation, when these fluctuations were generated. To date, the cosmic microwave background has been the dominant source of information about these perturbations. Large-scale structure, however, is where drastic improvements should originate. In this paper, we explain the theoretical motivations for pursuing such measurements and the challenges that lie ahead. In particular, we discuss and identify theoretical targets regarding the measurement of primordial non-Gaussianity. We argue that when quantified in terms of the local (equilateral) template amplitude f_NL^loc
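    For reference, the local non-Gaussianity template mentioned above is conventionally defined through a quadratic correction to a Gaussian potential phi; this is a standard textbook definition, not a result specific to this record:

      \Phi(\mathbf{x}) = \phi(\mathbf{x}) + f_{\mathrm{NL}}^{\mathrm{loc}} \left[ \phi^{2}(\mathbf{x}) - \langle \phi^{2} \rangle \right]

    with larger |f_NL^loc| indicating stronger local-type non-Gaussianity.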

  14. Knowledge Management and Information Technology (Know-IT Encyclopedia)

    National Research Council Canada - National Science Library

    Pollock, Neal

    2002-01-01

    .... This encyclopedia is an attempt to create and distribute a knowledge-level tool. Much of it is tacit knowledge taken from the author's experience on-the-job at the Program Executive Office for Information Technology (PEO...

  15. How Can the Evidence from Global Large-scale Clinical Trials for Cardiovascular Diseases be Improved?

    Science.gov (United States)

    Sawata, Hiroshi; Tsutani, Kiichiro

    2011-06-29

    Clinical investigations are important for obtaining evidence to improve medical treatment. Large-scale clinical trials with thousands of participants are particularly important for this purpose in cardiovascular diseases. Conducting large-scale clinical trials entails high research costs. This study sought to investigate global trends in large-scale clinical trials in cardiovascular diseases. We searched for trials using clinicaltrials.gov (URL: http://www.clinicaltrials.gov/) using the key words 'cardio' and 'event' in all fields on 10 April, 2010. We then selected trials with 300 or more participants examining cardiovascular diseases. The search revealed 344 trials that met our criteria. Of the 344 trials, 71% were randomized controlled trials, 15% involved more than 10,000 participants, and 59% were funded by industry. In RCTs whose results were disclosed, 55% of industry-funded trials and 25% of non-industry-funded trials reported statistically significant superiority over control (p = 0.012, 2-sided Fisher's exact test). Our findings highlight concerns regarding potential bias related to funding sources and indicate that researchers should be aware of the importance of trial information disclosure and conflicts of interest. Management of, and training in, information disclosure and conflicts of interest for researchers should be continually improved. This could lead to better clinical evidence and further improvements in the development of medical treatment worldwide.
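    The 2-sided Fisher's exact test cited above compares proportions between two groups from a 2x2 table; here is a sketch with hypothetical counts chosen only to mirror the reported 55% vs 25% split, since the study's actual group sizes are not given in this record:

      # Hedged sketch: 2-sided Fisher's exact test on a 2x2 table of
      # trials reporting superiority vs. not, by funding source.
      # Counts are hypothetical, chosen only to mirror the reported proportions.
      from scipy.stats import fisher_exact

      #                  superiority  no superiority
      industry_funded = [33, 27]      # ~55% of 60 disclosed industry-funded RCTs
      non_industry = [10, 30]         # ~25% of 40 disclosed non-industry RCTs

      odds_ratio, p_value = fisher_exact([industry_funded, non_industry],
                                         alternative="two-sided")
      print(f"OR={odds_ratio:.2f}, p={p_value:.4f}")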

  16. Knowledge management, health information technology and nurses' work engagement.

    Science.gov (United States)

    Hendriks, Paul H J; Ligthart, Paul E M; Schouteten, Roel L J

    2016-01-01

    Knowledge management (KM) extends the health information technology (HIT) literature by addressing its impact on creating knowledge by sharing and using the knowledge of health care professionals in hospitals. The aim of the study was to provide insight into how HIT affects nurses' explicit and tacit knowledge of their ongoing work processes and work engagement. Data were collected from 74 nurses in four wards of a Dutch hospital via a paper-and-pencil survey using validated measurement instruments. In a quasiexperimental research design, HIT was introduced in the two experimental wards in contrast to the two control wards. At the time of the HIT introduction, a pretest was administered in all four wards and was followed by a posttest after 3 months. Data were analyzed via partial least squares modeling. Generally, nurses' tacit knowledge (i.e., their insight into and their capacity to make sense of the work processes) appears to be a significant and strong predictor of their work engagement. In contrast, nurses' explicit knowledge (i.e., information feedback about patients and tasks) only indirectly affects work engagement via its effect on tacit knowledge. Its effect on work engagement therefore depends on the mediating role of tacit knowledge. Interestingly, introducing HIT significantly affects only nurses' explicit knowledge, not their tacit knowledge or work engagement. Nurses' tacit and explicit knowledge needs to be systematically distinguished when implementing HIT/KM programs to increase work engagement in the workplace. Tacit knowledge (insight into work processes) appears to be pivotal, whereas efforts aimed only at improving available information will not lead to a higher level of work engagement in nurses' work environments.

  17. Translation and validation of the Greek version of the hypertension knowledge-level scale.

    Science.gov (United States)

    Chatziefstratiou, Anastasia A; Giakoumidakis, Konstantinos; Fotos, Nikolaos V; Baltopoulos, George; Brokalaki-Pananoudaki, Hero

    2015-12-01

    To translate and validate a Greek version of the Hypertension Knowledge-Level Scale. The major barrier in the management of hypertension is the lack of adherence to medications and lifestyle adjustments. Patients' knowledge of the nature of hypertension and cardiovascular risk factors is a significant factor affecting individuals' adherence. However, few instruments have been developed to assess patients' knowledge level and none has been translated into Greek. This study used a case-control design. Data collection for this research occurred between February 7, 2013 and March 10, 2013. The sample included both hypertensives and non-hypertensives. Participants completed the translated version of the Hypertension Knowledge-Level Scale. A total of 68 individuals completed the questionnaire. Coefficient alpha was 0.66 for hypertensives and 0.79 for non-hypertensives. The difference in mean scores on the entire scale between the two samples was statistically significant. In addition, significant differences were observed in many sub-dimensions, and no correlation was found between knowledge level and age, gender or education level. Findings provide support for the validity of the Greek version of the Hypertension Knowledge-Level Scale. The translation and validation of an instrument evaluating the level of knowledge of hypertension contribute to assessing the provided educational intervention. A low knowledge level should lead to the development of new methods of education; nurses will therefore have the opportunity to amplify their role in patients' education and develop relationships based on honesty and respect. © 2015 John Wiley & Sons Ltd.
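    The coefficient alpha reported above is Cronbach's alpha, computable directly from an item-response matrix; a small sketch on synthetic responses, illustrative only:

      # Hedged sketch: Cronbach's alpha for a k-item scale from an
      # (n respondents x k items) response matrix. Synthetic data.
      import numpy as np

      def cronbach_alpha(responses: np.ndarray) -> float:
          k = responses.shape[1]
          item_vars = responses.var(axis=0, ddof=1).sum()   # sum of item variances
          total_var = responses.sum(axis=1).var(ddof=1)     # variance of total score
          return (k / (k - 1)) * (1 - item_vars / total_var)

      rng = np.random.default_rng(4)
      latent = rng.standard_normal((68, 1))                 # shared knowledge factor
      items = (latent + 0.8 * rng.standard_normal((68, 22)) > 0).astype(float)
      print(f"alpha = {cronbach_alpha(items):.2f}")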

  18. Knowledge structure representation and automated updates in intelligent information management systems

    Science.gov (United States)

    Corey, Stephen; Carnahan, Richard S., Jr.

    1990-01-01

    A continuing effort to apply rapid prototyping and Artificial Intelligence techniques to problems associated with projected Space Station-era information management systems is examined. In particular, timely updating of the various databases and knowledge structures within the proposed intelligent information management system (IIMS) is critical to support decision making processes. Because of the significantly large amounts of data entering the IIMS on a daily basis, information updates will need to be automatically performed with some systems requiring that data be incorporated and made available to users within a few hours. Meeting these demands depends first, on the design and implementation of information structures that are easily modified and expanded, and second, on the incorporation of intelligent automated update techniques that will allow meaningful information relationships to be established. Potential techniques are studied for developing such an automated update capability and IIMS update requirements are examined in light of results obtained from the IIMS prototyping effort.

  19. An Anesthesia Preinduction Checklist to Improve Information Exchange, Knowledge of Critical Information, Perception of Safety, and Possibly Perception of Teamwork in Anesthesia Teams.

    Science.gov (United States)

    Tscholl, David W; Weiss, Mona; Kolbe, Michaela; Staender, Sven; Seifert, Burkhardt; Landert, Daniel; Grande, Bastian; Spahn, Donat R; Noethiger, Christoph B

    2015-10-01

    An anesthesia preinduction checklist (APIC) to be performed before anesthesia induction was introduced and evaluated with respect to 5 team-level outcomes, each being a surrogate end point for patient safety: information exchange (the percentage of checklist items exchanged by a team, out of 12 total items); knowledge of critical information (the percentage of critical information items, out of 5 total items such as allergies, reported as known by the members of a team); team members' perceptions of safety (the median scores given by the members of a team on a continuous rating scale); their perception of teamwork (the median scores given by the members of a team on a continuous rating scale); and clinical performance (the percentage of completed items out of 14 required tasks, e.g., suction device checked). A prospective interventional study comparing anesthesia teams using the APIC with a control group not using the APIC was performed using a multimethod design. Trained observers rated information exchange and clinical performance during on-site observations of anesthesia inductions. After the observations, each team member indicated the critical information items they knew and their perceptions of safety and teamwork. One hundred five teams using the APIC were compared with 100 teams not doing so. The medians of the team-level outcome scores in the APIC group versus the control group included information exchange (100% vs 33%) and perception of safety (91% vs 84%). The results indicate that the APIC improves information exchange, knowledge of critical information, and perception of safety in anesthesia teams, all parameters contributing to patient safety. There was a trend indicating improved perception of teamwork.

  20. Large-scale motions in the universe: a review

    International Nuclear Information System (INIS)

    Burstein, D.

    1990-01-01

    The expansion of the universe can be retarded in localised regions within the universe both by the presence of gravity and by non-gravitational motions generated in the post-recombination universe. The motions of galaxies thus generated are called 'peculiar motions', and the amplitudes, size scales and coherence of these peculiar motions are among the most direct records of the structure of the universe. As such, measurements of these properties of the present-day universe provide some of the severest tests of cosmological theories. This is a review of the current evidence for large-scale motions of galaxies out to a distance of ∼5000 km s⁻¹ (in an expanding universe, distance is proportional to radial velocity). 'Large-scale' in this context refers to motions that are correlated over size scales larger than the typical sizes of groups of galaxies, up to and including the size of the volume surveyed. To orient the reader into this relatively new field of study, a short modern history is given together with an explanation of the terminology. Careful consideration is given to the data used to measure the distances, and hence the peculiar motions, of galaxies. The evidence for large-scale motions is presented in a graphical fashion, using only the most reliable data for galaxies spanning a wide range in optical properties and over the complete range of galactic environments. The kinds of systematic errors that can affect this analysis are discussed, and the reliability of these motions is assessed. The predictions of two models of large-scale motion are compared to the observations, and special emphasis is placed on those motions in which our own Galaxy directly partakes. (author)

  1. State of the Art in Large-Scale Soil Moisture Monitoring

    Science.gov (United States)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.; hide

    2013-01-01

    Soil moisture is an essential climate variable influencing land-atmosphere interactions, an essential hydrologic variable impacting rainfall-runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years, creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  2. 78 FR 70076 - Large Scale Networking (LSN)-Middleware and Grid Interagency Coordination (MAGIC) Team

    Science.gov (United States)

    2013-11-22

    ... projects. The MAGIC Team reports to the Large Scale Networking (LSN) Coordinating Group (CG). Public... Coordination (MAGIC) Team AGENCY: The Networking and Information Technology Research and Development (NITRD... MAGIC Team meetings are held on the first Wednesday of each month, 2:00-4:00 p.m., at the National...

  3. A route to explosive large-scale magnetic reconnection in a super-ion-scale current sheet

    Directory of Open Access Journals (Sweden)

    K. G. Tanaka

    2009-01-01

    Full Text Available How to trigger magnetic reconnection is one of the most interesting and important problems in space plasma physics. Recently, electron temperature anisotropy (αeo=Te⊥/Te||) at the center of a current sheet and the non-local effect of the lower-hybrid drift instability (LHDI) that develops at the current sheet edges have attracted attention in this context. In addition to these effects, here we also study the effects of ion temperature anisotropy (αio=Ti⊥/Ti||). Electron anisotropy effects are known to be ineffective in a current sheet whose thickness is of ion scale. In this range of current sheet thickness, the LHDI effects are shown to weaken substantially with a small increase in thickness, and the obtained saturation level is too low for a large-scale reconnection to be achieved. We then investigate whether introducing electron and ion temperature anisotropies in the initial stage would couple with the LHDI effects to revive quick triggering of large-scale reconnection in a super-ion-scale current sheet. The results are as follows. (1) The initial electron temperature anisotropy is consumed very quickly when a number of minuscule magnetic islands (each with a lateral length 1.5–3 times the ion inertial length) form. These minuscule islands do not coalesce into a large-scale island that would enable large-scale reconnection. (2) The subsequent LHDI effects disturb the current sheet filled with the small islands. This substantially accelerates the triggering time scale but does not enhance the saturation level of reconnected flux. (3) When the ion temperature anisotropy is added, it survives through the small-island formation stage and brings about even quicker triggering once the LHDI effects set in. Furthermore, the saturation level is elevated by a factor of ~2, and large-scale reconnection is achieved only in this case. Comparison with two-dimensional simulations that exclude the LHDI effects confirms that the saturation level

  4. Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications

    Science.gov (United States)

    Maskey, M.; Ramachandran, R.; Miller, J.

    2017-12-01

    Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as the ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for creating large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.

  5. Information Visualization for Agile Development in Large‐Scale Organizations

    OpenAIRE

    Manzoor, Numan; Shahzad, Umar

    2012-01-01

    Context: Agile/lean development has been successful in situations where small teams collaborate over long periods of time with project stakeholders. It is unclear how such teams plan and coordinate their work when inter-dependencies with other projects exist. In large organizations, scattered teams and complex team structures make it difficult for every stakeholder to have a clear understanding of project information. These factors make it difficult for large‐scale organizations...

  6. Letter-Sound Knowledge: Exploring Gender Differences in Children When They Start School Regarding Knowledge of Large Letters, Small Letters, Sound Large Letters, and Sound Small Letters

    Directory of Open Access Journals (Sweden)

    Hermundur Sigmundsson

    2017-09-01

    Full Text Available This study explored whether there is a gender difference in letter-sound knowledge when children start school. 485 children aged 5–6 years completed an assessment of letter-sound knowledge, i.e., large letters; sound of large letters; small letters; sound of small letters. The findings indicate a significant difference between girls and boys on all four factors tested in this study, in favor of the girls. There is still no clear explanation for the basis of this presumed gender difference in letter-sound knowledge. An origin in neuro-biological factors cannot be excluded; however, the fact that girls have probably been exposed to more language experience/stimulation than boys lends support to explanations derived from environmental aspects.

  7. New approach to knowledge and information exchange

    International Nuclear Information System (INIS)

    Pleslic, S.; Novosel, N.

    2004-01-01

    The International Atomic Energy Agency (IAEA, Vienna, Austria) was founded in 1957 as an autonomous intergovernmental organization and is authorized to exchange technical and scientific information on the peaceful uses of atomic energy. Applications of isotopes and nuclear power expanded rapidly during the 1960s, and the output of the related scientific literature grew with these achievements and developments. An instrument was therefore needed for the comprehensive and systematic dissemination of all information and knowledge from these fields of science. With this goal, the International Nuclear Information System (INIS) was established in 1970 by the International Atomic Energy Agency as an international bibliographic database covering the nuclear field and nuclear-related areas. INIS has become a large technological and scientific information system with 127 Members (108 countries and 19 international organizations). Expert participation from countries spanning a wide range of technological capability and infrastructure availability has allowed INIS to assimilate useful innovations in information technology into its work. Countries at different levels of technological development can derive benefits from the output products; most members are developing countries, in which the major population and economic growth is expected. A critical problem for future development is the need for non-fossil, ecologically clean energy. Nuclear energy may not be the best or only solution, but it is clear that nuclear sources, as major energy sources, are important for future energy systems. Energy source problems should also be solved in line with the demands of sustainable development. Nuclear energy and all of its applications, including nuclear and radiation techniques, are very important for such development. Application of all techniques of interest in different areas such as medicine, agriculture, water resource

  8. New approach to knowledge and information exchange

    Energy Technology Data Exchange (ETDEWEB)

    Pleslic, S [University of Zagreb, Faculty of Electrical Engineering and Computing, Department of Applied Physics, Zagreb (Croatia); Novosel, N [Ministry of Economy of the Republic of Croatia, Zagreb (Croatia)

    2004-07-01

    The International Atomic Energy Agency (IAEA, Vienna, Austria) was founded in 1957 as an autonomous intergovernmental organization and is authorized to exchange technical and scientific information on the peaceful uses of atomic energy. Applications of isotopes and nuclear power expanded rapidly during the 1960s, and the output of the related scientific literature grew with these achievements and developments. An instrument was therefore needed for the comprehensive and systematic dissemination of all information and knowledge from these fields of science. With this goal, the International Nuclear Information System (INIS) was established in 1970 by the International Atomic Energy Agency as an international bibliographic database covering the nuclear field and nuclear-related areas. INIS has become a large technological and scientific information system with 127 Members (108 countries and 19 international organizations). Expert participation from countries spanning a wide range of technological capability and infrastructure availability has allowed INIS to assimilate useful innovations in information technology into its work. Countries at different levels of technological development can derive benefits from the output products; most members are developing countries, in which the major population and economic growth is expected. A critical problem for future development is the need for non-fossil, ecologically clean energy. Nuclear energy may not be the best or only solution, but it is clear that nuclear sources, as major energy sources, are important for future energy systems. Energy source problems should also be solved in line with the demands of sustainable development. Nuclear energy and all of its applications, including nuclear and radiation techniques, are very important for such development. Application of all techniques of interest in different areas such as medicine, agriculture, water resource

  9. Evolutionary Hierarchical Multi-Criteria Metaheuristics for Scheduling in Large-Scale Grid Systems

    CERN Document Server

    Kołodziej, Joanna

    2012-01-01

    One of the most challenging issues in modelling today's large-scale computational systems is to effectively manage highly parametrised distributed environments such as computational grids, clouds, ad hoc networks and P2P networks. Next-generation computational grids must provide a wide range of services and high performance computing infrastructures. Various types of information and data processed in the large-scale dynamic grid environment may be incomplete, imprecise, and fragmented, which complicates the specification of proper evaluation criteria and which affects both the availability of resources and the final collective decisions of users. The complexity of grid architectures and grid management may also contribute towards higher energy consumption. All of these issues necessitate the development of intelligent resource management techniques, which are capable of capturing all of this complexity and optimising meaningful metrics for a wide range of grid applications.   This book covers hot topics in t...

  10. Large-scale dynamic compaction demonstration using WIPP salt: Fielding and preliminary results

    International Nuclear Information System (INIS)

    Ahrens, E.H.; Hansen, F.D.

    1995-10-01

    Reconsolidation of crushed rock salt is a phenomenon of great interest to programs studying isolation of hazardous materials in natural salt geologic settings. Of particular interest is the potential for disaggregated salt to be restored to nearly an impermeable state. For example, reconsolidated crushed salt is proposed as a major shaft seal component for the Waste Isolation Pilot Plant (WIPP) Project. The concept for a permanent shaft seal component of the WIPP repository is to densely compact crushed salt in the four shafts; an effective seal will then be developed as the surrounding salt creeps into the shafts, further consolidating the crushed salt. Fundamental information on placement density and permeability is required to ensure attainment of the design function. The work reported here is the first large-scale compaction demonstration to provide information on initial salt properties applicable to design, construction, and performance expectations. The shaft seals must function for 10,000 years. Over this period a crushed salt mass will become less permeable as it is compressed by creep closure of salt surrounding the shaft. These facts preclude the possibility of conducting a full-scale, real-time field test. Because permanent seals taking advantage of salt reconsolidation have never been constructed, performance measurements have not been made on an appropriately large scale. An understanding of potential construction methods, achievable initial density and permeability, and performance of reconsolidated salt over time is required for seal design and performance assessment. This report discusses fielding and operations of a nearly full-scale dynamic compaction of mine-run WIPP salt, and presents preliminary density and in situ (in place) gas permeability results

  11. Large-scale structure observables in general relativity

    International Nuclear Information System (INIS)

    Jeong, Donghui; Schmidt, Fabian

    2015-01-01

    We review recent studies that rigorously define several key observables of the large-scale structure of the Universe in a general relativistic context. Specifically, we consider (i) redshift perturbation of cosmic clock events; (ii) distortion of cosmic rulers, including weak lensing shear and magnification; and (iii) observed number density of tracers of the large-scale structure. We provide covariant and gauge-invariant expressions of these observables. Our expressions are given for a linearly perturbed flat Friedmann–Robertson–Walker metric including scalar, vector, and tensor metric perturbations. While we restrict ourselves to linear order in perturbation theory, the approach can be straightforwardly generalized to higher order. (paper)
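
    As a point of reference for the abstract above, a linearly perturbed flat Friedmann-Robertson-Walker metric with scalar, vector, and tensor perturbations is commonly written as follows (one standard convention; the paper's own notation may differ):

        \[
          ds^2 = a^2(\tau)\left\{-(1+2A)\,d\tau^2 - 2B_i\,d\tau\,dx^i
                 + \left[(1+2H_L)\delta_{ij} + 2h_{ij}\right]dx^i\,dx^j\right\},
        \]

    where $a(\tau)$ is the scale factor, $A$ and $H_L$ are scalar perturbations, $B_i$ carries scalar and vector parts, and the traceless $h_{ij}$ carries scalar, vector, and tensor parts.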

  12. Fatigue Analysis of Large-scale Wind turbine

    Directory of Open Access Journals (Sweden)

    Zhu Yongli

    2017-01-01

    Full Text Available The paper investigates top-flange fatigue damage in a large-scale wind turbine generator. It establishes a finite element model of the top-flange connection system with the finite element analysis software MSC.Marc/Mentat and analyzes its fatigue strain; implements load simulation of the flange fatigue working condition with the Bladed software; acquires the flange fatigue load spectrum with the rain-flow counting method; and finally realizes fatigue analysis of the top flange with the fatigue analysis software MSC.Fatigue and the Palmgren-Miner linear cumulative damage theory. The result provides new thinking for flange fatigue analysis of large-scale wind turbine generators and possesses practical engineering value.
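
    The Palmgren-Miner rule used above sums fractional damage contributed by each stress range of the rain-flow-counted load spectrum. A minimal sketch, assuming a hypothetical power-law S-N curve (the constants are illustrative, not certified turbine values):

        # Palmgren-Miner linear cumulative damage: D = sum(n_i / N_i); failure at D >= 1.
        # S-N curve modeled as N(S) = C / S**m with hypothetical constants C and m.
        C, m = 1.0e12, 3.0

        def cycles_to_failure(stress_range_mpa):
            return C / stress_range_mpa**m

        # Rain-flow counting output: (stress range [MPa], number of counted cycles).
        spectrum = [(40.0, 2.0e6), (80.0, 3.0e5), (120.0, 2.0e4)]

        damage = sum(n / cycles_to_failure(s) for s, n in spectrum)
        print(f"Miner damage sum: {damage:.3f}")  # below 1.0: spectrum survivable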

  13. Real-time simulation of large-scale floods

    Science.gov (United States)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complexity of real-time water situations, real-time simulation of large-scale floods is very important in flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust two-dimensional shallow water model based on an unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance numerical stability, and an adaptive method is proposed to improve running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared with those of MIKE21 show the strong performance of the proposed model.
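
    To make the Godunov-type finite volume idea concrete, here is a minimal one-dimensional shallow water update with a Rusanov flux, a CFL-limited adaptive time step, and a simple wet/dry guard; the paper's two-dimensional unstructured scheme is considerably more elaborate, so treat this purely as a sketch:

        import numpy as np

        g, H_DRY = 9.81, 1e-6  # gravity; depth threshold for the wet/dry guard

        def step(h, hu, dx, cfl=0.45):
            """One explicit finite volume update of (h, hu) on a uniform 1D grid."""
            u = np.where(h > H_DRY, hu / np.maximum(h, H_DRY), 0.0)
            c = np.sqrt(g * np.maximum(h, 0.0))
            dt = cfl * dx / (np.max(np.abs(u) + c) + 1e-12)   # CFL-limited step
            q = np.stack([h, hu])
            f = np.stack([hu, hu * u + 0.5 * g * h**2])       # physical flux
            # Rusanov (local Lax-Friedrichs) flux at the interior interfaces
            a = np.maximum((np.abs(u) + c)[:-1], (np.abs(u) + c)[1:])
            fhat = 0.5 * (f[:, :-1] + f[:, 1:]) - 0.5 * a * (q[:, 1:] - q[:, :-1])
            q[:, 1:-1] -= dt / dx * (fhat[:, 1:] - fhat[:, :-1])  # boundary cells held fixed
            h_new, hu_new = q[0], q[1]
            hu_new[h_new <= H_DRY] = 0.0  # dry cells carry no momentum
            return h_new, hu_new, dt

    Advancing, say, a dam-break profile is then a loop of h, hu, dt = step(h, hu, dx) until the target time is reached.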

  14. Large-scale visualization system for grid environment

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    Center for Computational Science and E-systems of Japan Atomic Energy Agency (CCSE/JAEA) has been conducting research and development (R&D) of distributed computing (grid computing) environments: Seamless Thinking Aid (STA), Information Technology Based Laboratory (ITBL) and Atomic Energy Grid InfraStructure (AEGIS). In this R&D, we have developed visualization technology suitable for the distributed computing environment. As one of the visualization tools, we have developed the Parallel Support Toolkit (PST), which can execute the visualization process in parallel on a computer. We have now improved PST so that it can execute simultaneously on multiple heterogeneous computers using the Seamless Thinking Aid Message Passing Interface (STAMPI). STAMPI, developed in this R&D, is an MPI library executable in a heterogeneous computing environment. The improvement realizes the visualization of extremely large-scale data and enables more efficient visualization processes in a distributed computing environment. (author)

  15. Scaling of an information system in a public healthcare market--infrastructuring from the vendor's perspective.

    Science.gov (United States)

    Johannessen, Liv Karen; Obstfelder, Aud; Lotherington, Ann Therese

    2013-05-01

    The purpose of this paper is to explore the making and scaling of information infrastructures, as well as how the conditions for scaling a component may change for the vendor. The first research question is how the making and scaling of a healthcare information infrastructure can be done and by whom. The second question is what scope for manoeuvre there might be for vendors aiming to expand their market. This case study is based on an interpretive approach, whereby data is gathered through participant observation and semi-structured interviews. A case study of the making and scaling of an electronic system for general practitioners ordering laboratory services from hospitals is described as comprising two distinct phases. The first may be characterized as an evolving phase, when development, integration and implementation were achieved in small steps, and the vendor, together with end users, had considerable freedom to create the solution according to the users' needs. The second phase was characterized by a large-scale procurement process over which regional healthcare authorities exercised much more control and the needs of groups other than the end users influenced the design. The making and scaling of healthcare information infrastructures is not simply a process of evolution, in which the end users use and change the technology. It also consists of large steps, during which different actors, including vendors and healthcare authorities, may make substantial contributions. This process requires work, negotiation and strategies. The conditions for the vendor may change dramatically, from considerable freedom and close relationships with users and customers in the small-scale development, to losing control of the product and being required to engage in more formal relations with customers in the wider public healthcare market. Onerous procurement processes may be one of the reasons why large-scale implementation of information projects in healthcare is difficult

  16. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

    The recent trend of large-scale simulations of fusion plasmas and processing plasmas is briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas, and some of these techniques are now applied to analyses of processing plasmas. (author)

  17. Small-scale microwave background anisotropies implied by large-scale data

    Science.gov (United States)

    Kashlinsky, A.

    1993-01-01

    In the absence of reheating, microwave background radiation (MBR) anisotropies on arcminute scales depend uniquely on the amplitude and the coherence length of the primordial density fluctuations (PDFs). These can be determined from the recent data on galaxy correlations, xi(r), on linear scales (APM survey). We develop here expressions for the MBR angular correlation function, C(theta), on arcminute scales in terms of the power spectrum of PDFs and demonstrate their accuracy by comparing with detailed calculations of MBR anisotropies. We then show how to evaluate C(theta) directly in terms of the observed xi(r) and show that the APM data give information on the amplitude, C(0), and the coherence angle of MBR anisotropies on small scales.
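
    For reference, the MBR angular correlation function and the angular power spectrum are related by the standard multipole expansion (a textbook identity, independent of the APM-specific modelling in the paper):

        \[
          C(\theta) = \left\langle \Delta T(\hat{n}_1)\,\Delta T(\hat{n}_2) \right\rangle
                    = \frac{1}{4\pi}\sum_{\ell}(2\ell+1)\,C_\ell\,P_\ell(\cos\theta),
          \qquad \cos\theta = \hat{n}_1\cdot\hat{n}_2,
        \]

    so that C(0) is the total variance summed over multipoles.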

  18. A fast approach to generate large-scale topographic maps based on new Chinese vehicle-borne Lidar system

    International Nuclear Information System (INIS)

    Youmei, Han; Bogang, Yang

    2014-01-01

    Large-scale topographic maps are important basic information for city and regional planning and management. Traditional large-scale mapping methods are mostly based on manual mapping and photogrammetry. Manual mapping is inefficient and limited by the environment, while photogrammetric methods (such as low-altitude aerial mapping) are an economical and effective way to map wide, regular areas at large scale but do not work well in small areas because of the high cost in manpower and resources. In recent years, vehicle-borne LIDAR technology has developed rapidly, and its application in surveying and mapping is becoming a new topic. The main objective of this investigation is to explore the potential of vehicle-borne LIDAR technology for fast mapping of large-scale topographic maps based on the new Chinese vehicle-borne LIDAR system. It studied how to use this measurement technology to map large-scale topographic maps. After field data capture, maps can be produced in the office from the LIDAR data (point cloud) by software we programmed ourselves. In addition, the detailed process and an accuracy analysis are presented for an actual case. The results show that this new technology provides a fast method to generate large-scale topographic maps, with high efficiency and accuracy compared to traditional methods.

  19. Nearly incompressible fluids: Hydrodynamics and large scale inhomogeneity

    International Nuclear Information System (INIS)

    Hunana, P.; Zank, G. P.; Shaikh, D.

    2006-01-01

    A system of hydrodynamic equations in the presence of large-scale inhomogeneities for a high plasma beta solar wind is derived. The theory is derived under the assumption of low turbulent Mach number and is developed for flows where the usual incompressible description is not satisfactory and a full compressible treatment is too complex for any analytical studies. When the effects of compressibility are incorporated only weakly, a new description, referred to as 'nearly incompressible hydrodynamics', is obtained. The nearly incompressible theory was originally applied to homogeneous flows. However, large-scale gradients in density, pressure, temperature, etc., are typical in the solar wind, and it was unclear how inhomogeneities would affect the usual incompressible and nearly incompressible descriptions. In the homogeneous case, the lowest-order expansion of the fully compressible equations leads to the usual incompressible equations, followed at higher orders by the nearly incompressible equations, as introduced by Zank and Matthaeus. With this work we show that the inclusion of large-scale inhomogeneities (in this case a time-independent and radially symmetric background solar wind) modifies the leading-order incompressible description of solar wind flow. We find, for example, that the divergence of velocity fluctuations is nonsolenoidal and that density fluctuations can be described to leading order as a passive scalar. Locally (for small length scales), this system of equations converges to the usual incompressible equations, and we therefore use the term 'locally incompressible' to describe the equations. This term should be distinguished from the term 'nearly incompressible', which is reserved for higher-order corrections. Furthermore, we find that density fluctuations scale linearly with Mach number, in contrast to the original homogeneous nearly incompressible theory, in which density fluctuations scale with the square of the Mach number. Inhomogeneous nearly

  20. Performance Health Monitoring of Large-Scale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rajamony, Ram [IBM Research, Austin, TX (United States)

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main aspects. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, either due to an anomaly present in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.

  1. 77 FR 58416 - Large Scale Networking (LSN); Middleware and Grid Interagency Coordination (MAGIC) Team

    Science.gov (United States)

    2012-09-20

    ..., Grid, and cloud projects. The MAGIC Team reports to the Large Scale Networking (LSN) Coordinating Group... Coordination (MAGIC) Team AGENCY: The Networking and Information Technology Research and Development (NITRD.... Dates/Location: The MAGIC Team meetings are held on the first Wednesday of each month, 2:00-4:00pm, at...

  2. Knowledge-based system for flight information management. Thesis

    Science.gov (United States)

    Ricks, Wendell R.

    1990-01-01

    The use of knowledge-based system (KBS) architectures to manage information on the primary flight display (PFD) of commercial aircraft is described. The PFD information management strategy used tailored the information on the PFD to the tasks the pilot performed. The KBS design and implementation of the task-tailored PFD information management application is described. The knowledge acquisition and subsequent system design of a flight-phase-detection KBS is also described. The flight-phase output of this KBS was used as input to the task-tailored PFD information management KBS. The implementation and integration of this KBS with existing aircraft systems and the other KBS is described. Flight tests of both KBSs, collectively called the Task-Tailored Flight Information Manager (TTFIM), are examined; the tests verified their implementation and integration and validated the software engineering advantages of the KBS approach in an operational environment.

  3. Genephony: a knowledge management tool for genome-wide research

    Directory of Open Access Journals (Sweden)

    Riva Alberto

    2009-09-01

    Full Text Available Abstract Background One of the consequences of the rapid and widespread adoption of high-throughput experimental technologies is an exponential increase in the amount of data produced by genome-wide experiments. Researchers increasingly need to handle very large volumes of heterogeneous data, including both the data generated by their own experiments and the data retrieved from publicly available repositories of genomic knowledge. Integration, exploration, manipulation and interpretation of data and information therefore need to become as automated as possible, since their scale and breadth are, in general, beyond the limits of what individual researchers and the basic data management tools in normal use can handle. This paper describes Genephony, a tool we are developing to address these challenges. Results We describe how Genephony can be used to manage large datasets of genomic information, integrating them with existing knowledge repositories. We illustrate its functionalities with an example of a complex annotation task, in which a set of SNPs coming from a genotyping experiment is annotated with genes known to be associated with a phenotype of interest. We show how, thanks to the modular architecture of Genephony and its user-friendly interface, this task can be performed in a few simple steps. Conclusion Genephony is an online tool for the manipulation of large datasets of genomic information. It can be used as a browser for genomic data, as a high-throughput annotation tool, and as a knowledge discovery tool. It is designed to be easy to use, flexible and extensible. Its knowledge management engine provides fine-grained control over individual data elements, as well as efficient operations on large datasets.
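
    At its core, the annotation task described (tagging genotyping SNPs with phenotype-associated genes) is an interval-overlap join. A self-contained sketch of that operation, with hypothetical inputs and no claim to reflect Genephony's actual engine or API:

        # Hypothetical inputs: SNPs as (id, chromosome, position) and genes as
        # (symbol, chromosome, start, end), pre-filtered to a phenotype of interest.
        snps = [("rs001", "chr1", 1_050_000), ("rs002", "chr2", 500)]
        genes = [("GENE_A", "chr1", 1_000_000, 1_100_000)]

        def annotate(snps, genes):
            """Yield (snp_id, gene_symbol) for each SNP falling inside a gene."""
            by_chrom = {}
            for symbol, chrom, start, end in genes:
                by_chrom.setdefault(chrom, []).append((start, end, symbol))
            for snp_id, chrom, pos in snps:
                for start, end, symbol in by_chrom.get(chrom, []):
                    if start <= pos <= end:
                        yield snp_id, symbol

        print(list(annotate(snps, genes)))  # [('rs001', 'GENE_A')]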

  4. High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering

    Science.gov (United States)

    Maly, K.

    1998-01-01

    Monitoring is an essential process for observing and improving the reliability and the performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events is generated by the system components during execution or interaction with external objects (e.g. users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing status information required for debugging, tuning and managing such applications. However, correlated events are generated concurrently and can be distributed across various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding end-point management applications, such as debugging and reactive control tools, to improve application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and to minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and the performance of the Interactive Remote Instruction (IRI) system, which is a large-scale distributed system for collaborative distance learning. The filtering mechanism represents an intrinsic component integrated
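
    The heart of such an architecture is a subscription mechanism that matches events against consumer-supplied predicates and drops everything else. A minimal single-process sketch of that idea (illustrative only; the architecture described is distributed and far richer):

        from typing import Callable

        Event = dict  # e.g., {"source": "node42", "type": "timeout", "severity": 4}

        class EventFilter:
            """Routes events to subscribers whose predicates match; discards the rest."""

            def __init__(self):
                self._subs: list[tuple[Callable[[Event], bool], Callable[[Event], None]]] = []

            def subscribe(self, predicate, handler):
                self._subs.append((predicate, handler))

            def publish(self, event):
                for predicate, handler in self._subs:
                    if predicate(event):
                        handler(event)

        f = EventFilter()
        f.subscribe(lambda e: e["severity"] >= 3, lambda e: print("alert:", e))
        f.publish({"source": "node42", "type": "timeout", "severity": 4})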

  5. First Joint Workshop on Energy Management for Large-Scale Research Infrastructures

    CERN Document Server

    2011-01-01

      CERN, ERF (European Association of National Research Facilities) and ESS (European Spallation Source) announce the first Joint Workshop on Energy Management for Large-Scale Research Infrastructures. The event will take place on 13-14 October 2011 at the ESS office in Sparta - Lund, Sweden.   The workshop will bring together international experts on energy and representatives from laboratories and future projects all over the world in order to identify the challenges and best practice in respect of energy efficiency and optimization, solutions and implementation as well as to review the challenges represented by potential future technical solutions and the tools for effective collaboration. Further information at: http://ess-scandinavia.eu/general-information

  6. Learning from large scale neural simulations

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Large-scale neural simulations have the marks of a distinct methodology which can be fruitfully deployed to advance scientific understanding of the human brain. Computer simulation studies can be used to produce surrogate observational data for better conceptual models and new how...

  7. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek; Verma, Mahendra K.; Sukhatme, Jai

    2017-01-01

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small scale turbulence. The VSHF consists of internal gravity waves and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k−3 power-law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k−11/5 form at large scales, to a steeper approximate k−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k−1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  8. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek

    2017-01-11

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small scale turbulence. The VSHF consists of internal gravity waves and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k−3 power-law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k−11/5 form at large scales, to a steeper approximate k−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k−1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  9. Knowledge management, health information technology and nurses' work engagement

    NARCIS (Netherlands)

    Hendriks, P.H.J.; Ligthart, P.E.M.; Schouteten, R.L.J.

    2016-01-01

    BACKGROUND: Knowledge management (KM) extends the health information technology (HIT) literature by addressing its impact on creating knowledge by sharing and using the knowledge of health care professionals in hospitals. PURPOSE: The aim of the study was to provide insight into how HIT affects

  10. Exploring the large-scale structure of Taylor–Couette turbulence through Large-Eddy Simulations

    Science.gov (United States)

    Ostilla-Mónico, Rodolfo; Zhu, Xiaojue; Verzicco, Roberto

    2018-04-01

    Large eddy simulations (LES) of Taylor-Couette (TC) flow, the flow between two co-axial and independently rotating cylinders, are performed in an attempt to explore the large-scale axially-pinned structures seen in experiments and simulations. Both static and dynamic LES models are used. The Reynolds number is kept fixed at Re = 3.4 · 10⁴, and the radius ratio η = ri/ro is set to η = 0.909, limiting the effects of curvature and resulting in frictional Reynolds numbers of around Reτ ≈ 500. Four rotation ratios from Rot = ‑0.0909 to Rot = 0.3 are simulated. First, the LES of TC flow is benchmarked for different rotation ratios. Both the Smagorinsky model with a constant of cs = 0.1 and the dynamic model are found to produce reasonable results for no mean rotation and cyclonic rotation, but deviations increase with increasing rotation. This is attributed to the increasingly anisotropic character of the fluctuations. Second, “over-damped” LES, i.e. LES with a large Smagorinsky constant, is performed and is shown to reproduce some features of the large-scale structures, even when the near-wall region is not adequately modeled. This shows the potential for using over-damped LES for fast explorations of the parameter space where large-scale structures are found.
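
    The static Smagorinsky closure referred to above (here with $c_s = 0.1$) models the subgrid stresses through an eddy viscosity; in standard notation,

        \[
          \nu_t = (c_s\,\Delta)^2\,|\bar{S}|, \qquad
          |\bar{S}| = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}}, \qquad
          \bar{S}_{ij} = \tfrac{1}{2}\left(\partial_j\bar{u}_i + \partial_i\bar{u}_j\right),
        \]

    with $\Delta$ the filter width. The dynamic variant computes $c_s$ on the fly from a test filter instead of fixing it, and an "over-damped" LES in the sense above simply raises $c_s$ well beyond its usual value.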

  11. Mass-media information campaigns and knowledge-gap effects

    NARCIS (Netherlands)

    Weenig, M.W.H.; Midden, C.J.H.

    1997-01-01

    The knowledge-gap hypothesis of Tichenor, Donohue, and Olien (1970) states that people from the higher socioeconomic segments of society acquire information at a faster rate than people from the lower socioeconomic segments. The consequence is a growing knowledge gap between the high and low

  12. Large-scale preparation of hollow graphitic carbon nanospheres

    International Nuclear Information System (INIS)

    Feng, Jun; Li, Fu; Bai, Yu-Jun; Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning; Lu, Xi-Feng

    2013-01-01

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on a large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of about 10 nm and a wall thickness of a few graphene layers. The HGCNSs exhibit a reversible capacity of 391 mAh g⁻¹ after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on a large scale by the simple reaction between glucose and Mg at 550 °C; they exhibit electrochemical performance superior to graphite. Highlights: ► Hollow graphitic carbon nanospheres (HGCNSs) were prepared on a large scale at 550 °C. ► The preparation is simple, effective and eco-friendly. ► The in situ yielded MgO nanocrystals promote the graphitization. ► The HGCNSs exhibit electrochemical performance superior to graphite.

  13. Accelerating large-scale phase-field simulations with GPU

    Directory of Open Access Journals (Sweden)

    Xiaoming Shi

    2017-10-01

    Full Text Available A new package for accelerating large-scale phase-field simulations was developed by using GPU based on the semi-implicit Fourier method. The package can solve a variety of equilibrium equations with different inhomogeneity, including long-range elastic, magnetostatic, and electrostatic interactions. Through a specific algorithm in the Compute Unified Device Architecture (CUDA), the Fourier spectral iterative perturbation method was integrated into the GPU package. The Allen-Cahn equation, the Cahn-Hilliard equation, and a phase-field model with long-range interaction were each solved with the algorithm running on GPU to test the performance of the package. Comparison of the calculation results between the solver executed on a single CPU and the one on GPU shows that the GPU version is about 50 times faster. The present study therefore contributes to the acceleration of large-scale phase-field simulations and provides guidance for experiments to design large-scale functional devices.
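
    The semi-implicit Fourier scheme treats the stiff gradient term implicitly in spectral space so that large time steps remain stable. A minimal NumPy sketch for the Allen-Cahn equation on a periodic 2D grid (the package described runs this class of scheme on GPU via CUDA; all constants here are illustrative):

        import numpy as np

        N, dx, dt = 256, 1.0, 0.1
        M, kappa = 1.0, 1.0                        # mobility, gradient-energy coefficient
        k = 2 * np.pi * np.fft.fftfreq(N, d=dx)
        k2 = k[:, None]**2 + k[None, :]**2         # symbol of the Laplacian

        phi = 0.01 * (np.random.rand(N, N) - 0.5)  # small random initial condition

        def step(phi):
            # Allen-Cahn: dphi/dt = -M * (f'(phi) - kappa * lap(phi)), f'(phi) = phi**3 - phi
            fprime_hat = np.fft.fft2(phi**3 - phi)
            phi_hat = np.fft.fft2(phi)
            # nonlinear term explicit, Laplacian term implicit -> stable at large dt
            phi_hat = (phi_hat - dt * M * fprime_hat) / (1.0 + dt * M * kappa * k2)
            return np.real(np.fft.ifft2(phi_hat))

        for _ in range(100):
            phi = step(phi)

    Porting the same loop to GPU amounts to swapping the array backend (e.g., CuPy for NumPy), which is the kind of change behind the reported speed-ups.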

  14. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed

    2017-03-16

    The Internet of Things is large-scale by nature. This is manifested not only by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end, cellular networks are indeed a strong first mile candidate to accommodate the data tsunami to be generated by the IoT. However, IoT devices are required in the cellular paradigm to undergo random access procedures as a precursor to resource allocation. Such procedures impose a major bottleneck that hinders cellular networks' ability to support large-scale IoT. In this article, we shed light on the random access dilemma and present a case study based on experimental data as well as system-level simulations. Accordingly, a case is built for the latent need to revisit random access procedures. A call for action is motivated by listing a few potential remedies and recommendations.
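
    The scale of the random access problem can be illustrated with a back-of-the-envelope model: if N devices each pick one of K contention preambles uniformly at random in the same random access opportunity, a tagged device succeeds only when no other device picks its preamble. This slotted-ALOHA-style view is a simplification, not the full cellular procedure:

        def success_probability(n_devices: int, n_preambles: int) -> float:
            # P(no collision for a tagged device) = (1 - 1/K) ** (N - 1)
            return (1 - 1 / n_preambles) ** (n_devices - 1)

        for n in (10, 100, 1000):
            print(n, round(success_probability(n, 64), 4))
        # Success collapses as device density grows, illustrating the dilemma.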

  15. Large scale access tests and online interfaces to ATLAS conditions databases

    International Nuclear Information System (INIS)

    Amorim, A; Lopes, L; Pereira, P; Simoes, J; Soloviev, I; Burckhart, D; Schmitt, J V D; Caprini, M; Kolos, S

    2008-01-01

    The access of the ATLAS Trigger and Data Acquisition (TDAQ) system to the ATLAS Conditions Databases sets strong reliability and performance requirements on the database storage and access infrastructures. Several applications were developed to support the integration of Conditions database access with the online services in TDAQ, including the interface to the Information Services (IS) and to the TDAQ Configuration Databases. The information storage requirements were the motivation for the ONline ASynchronous Interface to COOL (ONASIC) from the Information Service (IS) to the LCG/COOL databases. ONASIC avoids possible backpressure from the Online Database servers by managing a local cache. In parallel, OKS2COOL was developed to store Configuration Databases in an Offline Database with a history record. The DBStressor application was developed to test and stress access to the Conditions database using the LCG/COOL interface while operating as an integrated TDAQ application. The performance scaling of simultaneous Conditions database read accesses was studied in the context of the ATLAS High Level Trigger large computing farms. A large set of tests was performed involving up to 1000 computing nodes that simultaneously accessed the LCG central database server infrastructure at CERN
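
    The local-cache idea behind ONASIC, decoupling online publishers from database latency so they never feel backpressure, can be sketched as a generic write-behind buffer (a toy illustration, not the ONASIC implementation):

        import queue
        import threading
        import time

        class WriteBehindCache:
            """Buffers writes locally; a background worker drains them to the
            database, so publishers return immediately instead of blocking."""

            def __init__(self, flush_to_db):
                self._q = queue.Queue()
                self._flush = flush_to_db
                threading.Thread(target=self._worker, daemon=True).start()

            def put(self, key, value):
                self._q.put((key, value))  # non-blocking for the publisher

            def _worker(self):
                while True:
                    self._flush(*self._q.get())

        cache = WriteBehindCache(lambda k, v: print("stored", k, v))
        cache.put("detector/temperature", 42.0)
        time.sleep(0.1)  # give the demo worker a moment to drain the queue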

  16. Low frequency steady-state brain responses modulate large scale functional networks in a frequency-specific means.

    Science.gov (United States)

    Wang, Yi-Feng; Long, Zhiliang; Cui, Qian; Liu, Feng; Jing, Xiu-Juan; Chen, Heng; Guo, Xiao-Nan; Yan, Jin H; Chen, Hua-Fu

    2016-01-01

    Neural oscillations are essential for brain functions. Research has suggested that the frequency of neural oscillations is lower for more integrative and remote communications. In this vein, some resting-state studies have suggested that large scale networks function in the very low frequency range. However, it is difficult to investigate the frequency characteristics of brain networks because both resting-state studies and conventional frequency tagging approaches cannot simultaneously capture multiple large scale networks in controllable cognitive activities. In this preliminary study, we aimed to examine whether large scale networks can be modulated by task-induced low frequency steady-state brain responses (lfSSBRs) in a frequency-specific pattern. In a revised attention network test, the lfSSBRs were evoked in the triple network system and the sensory-motor system, indicating that large scale networks can be modulated in a frequency tagging way. Furthermore, the inter- and intranetwork synchronizations as well as coherence were increased at the fundamental frequency and the first harmonic rather than at other frequency bands, indicating a frequency-specific modulation of information communication. However, there was no difference among attention conditions, indicating that lfSSBRs modulate the general attention state much more strongly than they distinguish attention conditions. This study provides insights into the advantage and mechanism of lfSSBRs. More importantly, it paves a new way to investigate frequency-specific large scale brain activities. © 2015 Wiley Periodicals, Inc.

  17. Studies on learning by detecting impasse and by resolving it for building large scale knowledge base for autonomous plant

    International Nuclear Information System (INIS)

    Sawaragi, Tetsuo

    1997-03-01

    The acquisition of knowledge from human experts in an exhaustive way is extremely difficult, and even if it were possible, the maintenance of such a large knowledge base for real-time operation is not an easy task. An autonomous system having only incomplete knowledge will face many problems that contradict the system's current beliefs and/or are novel or unknown to the system. Experienced humans can cope with such novelty thanks to their generalizing ability and analogical inference based on a repertoire of precedents, even when they are confronted with new problems. Moreover, through experiencing such breakdowns and impasses, they can acquire novel knowledge by proactively attempting to interpret the problem at hand as well as by updating their beliefs and the contents and organization of their prior knowledge. We call such a style of learning impasse-driven learning, meaning that learning occurs when motivated by facing contradiction and impasse. Related studies of this style of learning have been conducted in the field of machine learning within artificial intelligence as well as in cognitive science. In this paper, we first summarize an outline of machine learning methodologies and then detail impasse-driven learning. We discuss it from two different perspectives on learning: one is deductive and analogical learning, and the other is inductive conceptual learning (i.e., concept formation or generalization-based memory). The former mainly discusses how the learning system updates its prior beliefs and knowledge so that it can explain away the current contradiction using meta-cognition heuristics. The latter attempts to assimilate a contradicting problem into its prior memory structure by dynamically reorganizing a collection of precedents. We present those methodologies, and finally we introduce a case study of concept formation for plant anomalies and its usage for

  18. Thermal power generation projects "Large Scale Solar Heating"; EU-Thermie-Projekte "Large Scale Solar Heating"

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the "Large Scale Solar Heating" programme for a Europe-wide development of the subject technology. The demonstration programme derived from it was judged favourably by the experts but was not immediately (1996) accepted for financial subsidies. In November 1997 the EU Commission provided 1.5 million ECU, which allowed the realisation of an updated project proposal. By mid-1997 a smaller project had already been approved; it was requested under the lead of Chalmers Industriteknik (CIT) in Sweden and mainly serves technology transfer. (orig.)

  19. The relationship between different information sources and disease-related patient knowledge and anxiety in patients with inflammatory bowel disease.

    Science.gov (United States)

    Selinger, C P; Carbery, I; Warren, V; Rehman, A F; Williams, C J; Mumtaz, S; Bholah, H; Sood, R; Gracie, D J; Hamlin, P J; Ford, A C

    2017-01-01

    Patient education forms a cornerstone of management of inflammatory bowel disease (IBD). The Internet has opened new avenues for information gathering. The aim was to determine the relationship between different information sources and patient knowledge and anxiety in patients with IBD. The use of information sources in patients with IBD was examined via questionnaire. Anxiety was assessed with the hospital anxiety and depression scale, and disease-related patient knowledge with the Crohn's and colitis knowledge score questionnaires. Associations between these outcomes and demographics, disease-related factors, and use of different information sources were analysed using linear regression analysis. Of 307 patients (165 Crohn's disease, 142 ulcerative colitis), 60.6% were female. Participants used the hospital IBD team (82.3%), official leaflets (59.5%), and official websites (53.5%) most frequently, in contrast to alternative health websites (9%). University education, sex (P = 0.004), and clinically active disease were among the factors associated with the outcomes; particular information sources were also associated with better knowledge or worse anxiety levels. Face-to-face education and written information materials remain the first line of patient education. Patients should be guided towards official information websites and warned about the association between the use of alternative health websites or random links and anxiety. © 2016 John Wiley & Sons Ltd.

  20. How Can the Evidence from Global Large-scale Clinical Trials for Cardiovascular Diseases be Improved?

    Directory of Open Access Journals (Sweden)

    Tsutani Kiichiro

    2011-06-01

    Full Text Available Abstract Background Clinical investigations are important for obtaining evidence to improve medical treatment. Large-scale clinical trials with thousands of participants are particularly important for this purpose in cardiovascular diseases. Conducting large-scale clinical trials entails high research costs. This study sought to investigate global trends in large-scale clinical trials in cardiovascular diseases. Findings We searched for trials using clinicaltrials.gov (URL: http://www.clinicaltrials.gov/) using the key words 'cardio' and 'event' in all fields on 10 April, 2010. We then selected trials with 300 or more participants examining cardiovascular diseases. The search revealed 344 trials that met our criteria. Of the 344 trials, 71% were randomized controlled trials (RCTs), 15% involved more than 10,000 participants, and 59% were funded by industry. Among RCTs whose results were disclosed, 55% of industry-funded trials and 25% of non-industry-funded trials reported statistically significant superiority over control (p = 0.012, 2-sided Fisher's exact test). Conclusions Our findings highlight concerns regarding potential bias related to funding sources, and indicate that researchers should be aware of the importance of trial information disclosure and conflicts of interest. Management of, and training in, information disclosure and conflicts of interest for researchers should continue to be considered. This could lead to better clinical evidence and further improvements in the development of medical treatment worldwide.

  1. A large-scale study of epilepsy in Ecuador: methodological aspects.

    Science.gov (United States)

    Placencia, M; Suarez, J; Crespo, F; Sander, J W; Shorvon, S D; Ellison, R H; Cascante, S M

    1992-01-01

    The methodology is presented of a large-scale study of epilepsy carried out in a highland area in northern Ecuador, South America, covering a population of 72,121 people. The study was carried out in two phases. The first, a cross-sectional phase, consisted of a house-to-house survey of all persons in this population, screening for epileptic seizures using a specially designed questionnaire. Possible cases identified in screening were assessed in a cascade diagnostic procedure applied by general doctors and neurologists. Its objectives were: to establish a comprehensive epidemiological profile of epileptic seizures; to describe the clinical phenomenology of this condition in the community; to validate methods for diagnosis and classification of epileptic seizures by a non-specialised team; and to ascertain the community's knowledge, attitudes and practices regarding epilepsy. A sample was selected in this phase in order to study the social aspects of epilepsy in this community. The second phase, which was longitudinal, assessed the ability of non-specialist care in the treatment of epilepsy. It consisted of a prospective clinical trial of antiepileptic therapy in untreated patients using two standard antiepileptic drugs. Patients were followed for 12 months by a multidisciplinary team consisting of a primary health worker, rural doctor, neurologist, anthropologist, and psychologist. Standardised, reproducible instruments and methods were used. This study was carried out through co-operation between the medical profession, political agencies and the pharmaceutical industry, at an international level. We consider this a model for further large-scale studies of this type.

  2. Best Practices in the Evaluation of Large-scale STEM-focused Events: A Review of Recent Literature

    Science.gov (United States)

    Shebby, S.; Cobb, W. H.; Buxner, S.; Shipp, S. S.

    2015-12-01

    Each year, the National Aeronautics and Space Administration (NASA) sponsors a variety of educational events to share information with educators, students, and the general public. Intended outcomes of these events include increased interest in and awareness of the mission and goals of NASA. Events range in size from relatively small family science nights at a local school to large-scale mission and celestial event celebrations involving thousands of members of the general public. To support community members in designing event evaluations, the Science Mission Directorate (SMD) Planetary Science Forum sponsored the creation of a Best Practices Guide. The guide was generated by reviewing published large-scale event evaluation reports; however, the best practices described within are pertinent for all event organizers and evaluators regardless of event size. Each source included in the guide identified numerous challenges to conducting their event evaluation. These included difficulty in identifying extant instruments or items, collecting representative data, and disaggregating data to inform different evaluation questions. Overall, the guide demonstrates that evaluations of large-scale events are generally done at a very basic level, with the types of data collected limited to observable demographic information and participant reactions collected via online survey. In addition to these findings, this presentation will describe evaluation best practices that will help practitioners move beyond these basic indicators and examine how to make the evaluation process an integral—and valuable—element of event planning, ultimately informing event outcomes and impacts. It will provide detailed information on five recommendations presented in the guide: 1) consider evaluation methodology, including data analysis, in advance; 2) design data collection instruments well in advance of the event; 3) collect data at different times and from multiple sources; 4) use

  3. BioPlex Display: An Interactive Suite for Large-Scale AP-MS Protein-Protein Interaction Data.

    Science.gov (United States)

    Schweppe, Devin K; Huttlin, Edward L; Harper, J Wade; Gygi, Steven P

    2018-01-05

    The development of large-scale data sets requires new means to display and disseminate research studies to large audiences. Knowledge of protein-protein interaction (PPI) networks has become a principal interest of many groups within the field of proteomics. At the confluence of technologies such as cross-linking mass spectrometry, yeast two-hybrid, protein cofractionation, and affinity purification mass spectrometry (AP-MS), detection of PPIs can uncover novel biological inferences at high throughput. Thus, new platforms to provide community access to large data sets are necessary. To this end, we have developed a web application that enables exploration and dissemination of the growing BioPlex interaction network. BioPlex is a large-scale interactome data set based on AP-MS of baits from the human ORFeome. The latest BioPlex data set release (BioPlex 2.0) contains 56,553 interactions from 5,891 AP-MS experiments. To improve community access to this vast compendium of interactions, we developed BioPlex Display, which integrates individual protein querying, access to empirical data, and on-the-fly annotation of networks within an easy-to-use and mobile web application. BioPlex Display enables rapid acquisition of data from BioPlex and development of hypotheses based on protein interactions.
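    For readers who prefer to work with the underlying table rather than the web interface, a released BioPlex interaction list can be loaded into a graph library in a few lines. This is a minimal sketch, not part of BioPlex Display itself; the file name and column names are assumptions about the tab-separated release format:

```python
import pandas as pd
import networkx as nx

# Hypothetical local copy of the BioPlex 2.0 interaction table (TSV).
edges = pd.read_csv("BioPlex_2.0_interactions.tsv", sep="\t")

g = nx.Graph()
g.add_edges_from(zip(edges["SymbolA"], edges["SymbolB"]))  # assumed columns

# Per-protein lookup, mirroring the single-protein querying the display offers.
print(sorted(g.neighbors("CCND1")))  # example gene symbol
```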

  4. Analyzing the cosmic variance limit of remote dipole measurements of the cosmic microwave background using the large-scale kinetic Sunyaev Zel'dovich effect

    Energy Technology Data Exchange (ETDEWEB)

    Terrana, Alexandra; Johnson, Matthew C. [Department of Physics and Astronomy, York University, Toronto, Ontario, M3J 1P3 (Canada); Harris, Mary-Jean, E-mail: aterrana@perimeterinstitute.ca, E-mail: mharris8@perimeterinstitute.ca, E-mail: mjohnson@perimeterinstitute.ca [Perimeter Institute for Theoretical Physics, Waterloo, Ontario N2L 2Y5 (Canada)

    2017-02-01

    Due to cosmic variance we cannot learn any more about large-scale inhomogeneities from the primary cosmic microwave background (CMB) alone. More information on large scales is essential for resolving large angular scale anomalies in the CMB. Here we consider cross correlating the large-scale kinetic Sunyaev Zel'dovich (kSZ) effect and probes of large-scale structure, a technique known as kSZ tomography. The statistically anisotropic component of the cross correlation encodes the CMB dipole as seen by free electrons throughout the observable Universe, providing information about long wavelength inhomogeneities. We compute the large angular scale power asymmetry, constructing the appropriate transfer functions, and estimate the cosmic variance limited signal to noise for a variety of redshift bin configurations. The signal to noise is significant over a large range of power multipoles and numbers of bins. We present a simple mode counting argument indicating that kSZ tomography can be used to estimate more modes than the primary CMB on comparable scales. A basic forecast indicates that a first detection could be made with next-generation CMB experiments and galaxy surveys. This paper motivates a more systematic investigation of how close to the cosmic variance limit it will be possible to get with future observations.
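    The mode-counting argument can be made concrete with a back-of-the-envelope calculation (an illustration only, not the paper's computation): a field measured out to multipole l_max carries roughly sum over l of (2l+1) modes, and kSZ tomography reconstructs one remote-dipole field per redshift bin, so on comparable scales the mode count is multiplied by the number of bins.

```python
# Back-of-the-envelope mode counting; l_max and the bin count are assumed
# values for illustration, not the paper's configuration.
def n_modes(l_max: int) -> int:
    return sum(2 * l + 1 for l in range(2, l_max + 1))

l_max = 100   # "comparable scales": large angular scales
n_bins = 20   # assumed number of redshift bins

print("primary CMB modes below l_max:   ", n_modes(l_max))
print("kSZ tomography modes below l_max:", n_bins * n_modes(l_max))
```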

  5. Electronic patient records in action: Transforming information into professionally relevant knowledge.

    Science.gov (United States)

    Winman, Thomas; Rystedt, Hans

    2011-03-01

    The implementation of generic models for organizing information in complex institutions like those in healthcare creates a gap between standardization and the need for locally relevant knowledge. The present study addresses how this gap can be bridged by focusing on the practical work of healthcare staff in transforming information in EPRs into knowledge that is useful for everyday work. Video recording of shift handovers on a rehabilitation ward serves as the empirical case. The results show how extensive selections and reorganizations of information in EPRs are carried out in order to transform information into professionally relevant accounts. We argue that knowledge about the institutional obligations and professional ways of construing information are fundamental for these transitions. The findings point to the need to consider the role of professional knowledge inherent in unpacking information in efforts to develop information systems intended to bridge between institutional and professional boundaries in healthcare. © The Author(s) 2011.

  6. Management of nuclear information and knowledge in Cuban institutions

    International Nuclear Information System (INIS)

    Garcia, A.G.; Rondon, C.F.; Aldama, C.L.; Aruca, L.A.; Labrada, C.

    2004-01-01

    Full text: The peaceful use and application of nuclear energy demand a broad set of capabilities and an inherent body of knowledge among technicians and the personnel linked to the nuclear specialties, the application of information generated and accumulated in databases, and their organization into an integral culture that allows the socialization of generated and acquired knowledge, supported by a solid infrastructure based on information and communication technologies. The Nuclear Ramal Program in Cuba (NRP) recognizes as a main priority the establishment of a knowledge management system open to the participation of all institutions belonging to the Agency of Nuclear Energy and Advanced Technologies (AEN and TA). An important role in this respect belongs to the Energy Development and Information Management Centre (CUBAENERGIA) as coordinating entity, which executes projects focused on: developing the web site of the AEN and TA, connected to the web sites of other institutions of the Agency; developing the executive web site (the Intranet of the AEN and TA), which manages corporate information in support of decision-making and in which all institutions of the Agency also participate; a networked education system for the human resources of these institutions and of others belonging to the energy sector in Cuba; the application and implementation of data warehousing at the corporate level for all institutions; approaches and concepts for managing nuclear information supported by a collective catalogue of scientific and technical publications of nuclear profile; the application of a technology watch system for all scientific and technical activities linked to the peaceful use of nuclear energy, based on the information and knowledge contained in the INIS, WIPO and RRIAN databases; and the promotion and dissemination of the peaceful, efficient and safe use of nuclear energy.

  7. Development of traditional Chinese medicine clinical data warehouse for medical knowledge discovery and decision support.

    Science.gov (United States)

    Zhou, Xuezhong; Chen, Shibo; Liu, Baoyan; Zhang, Runsun; Wang, Yinghui; Li, Ping; Guo, Yufeng; Zhang, Hua; Gao, Zhuye; Yan, Xiufeng

    2010-01-01

    Traditional Chinese medicine (TCM) is a scientific discipline that develops its theories from long-term clinical practice. Large-scale clinical data are the core empirical knowledge source for TCM research. This paper introduces a clinical data warehouse (CDW) system, which incorporates structured electronic medical record (SEMR) data for medical knowledge discovery and TCM clinical decision support (CDS). We have developed a clinical reference information model (RIM) and a physical data model to manage the various information entities and their relationships in TCM clinical data. An extraction-transformation-loading (ETL) tool is implemented to integrate and normalize the clinical data from different operational data sources. The CDW includes online analytical processing (OLAP) and complex network analysis (CNA) components to explore the various clinical relationships, and data mining and CNA methods are used to discover valuable clinical knowledge from the data. The CDW has integrated 20,000 TCM inpatient records and 20,000 outpatient records, which contain manifestations (e.g. symptoms, physical examinations and laboratory test results), diagnoses and prescriptions as the main information components. We propose a practical solution to accomplish the large-scale clinical data integration and preprocessing tasks. Meanwhile, we have developed over 400 OLAP reports to enable multidimensional analysis of clinical data and case-based CDS. We have successfully conducted several interesting data mining applications. In particular, we use various classification methods, namely support vector machine, decision tree and Bayesian network, to discover the knowledge of syndrome differentiation. Furthermore, we have applied association rules and CNA to extract useful acupuncture point and herb combination patterns from the clinical prescriptions. A CDW system consisting of TCM clinical RIM, ETL, OLAP and data mining as the core
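    As a flavor of the herb-combination mining mentioned above, co-occurrence support and rule confidence can be computed directly from prescription itemsets. This is a minimal stand-in sketch with invented data, not the paper's CDW components:

```python
from collections import Counter
from itertools import combinations

prescriptions = [                       # hypothetical herb lists
    {"ginseng", "licorice", "astragalus"},
    {"ginseng", "licorice"},
    {"salvia", "licorice", "astragalus"},
    {"ginseng", "astragalus"},
]

n = len(prescriptions)
item_count = Counter(h for p in prescriptions for h in p)
pair_count = Counter(frozenset(c) for p in prescriptions
                     for c in combinations(sorted(p), 2))

for pair, cnt in pair_count.most_common():
    a, b = sorted(pair)
    print(f"{{{a}, {b}}}: support={cnt / n:.2f}, "
          f"conf({a}->{b})={cnt / item_count[a]:.2f}")
```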

  8. Collective Influence of Multiple Spreaders Evaluated by Tracing Real Information Flow in Large-Scale Social Networks.

    Science.gov (United States)

    Teng, Xian; Pei, Sen; Morone, Flaviano; Makse, Hernán A

    2016-10-26

    Identifying the most influential spreaders that maximize information flow is a central question in network theory. Recently, a scalable method called "Collective Influence (CI)" has been put forward to solve the collective influence maximization problem. In contrast to heuristic methods that evaluate nodes' significance separately, the CI method inspects the collective influence of multiple spreaders. Although CI was developed for the influence maximization problem in the percolation model, it is still important to examine its efficacy in realistic information spreading. Here, we examine real-world information flow in various social and scientific platforms including the American Physical Society, Facebook, Twitter and LiveJournal. Since empirical data cannot be directly mapped to ideal multi-source spreading, we leverage the behavioral patterns of users extracted from the data to construct "virtual" information spreading processes. Our results demonstrate that the set of spreaders selected by CI can induce larger-scale information propagation. Moreover, local measures such as the number of connections or citations are not necessarily the deterministic factors of nodes' importance in realistic information spreading. This result has significance for ranking scientists in scientific networks like the APS, where the commonly used number of citations can be a poor indicator of the collective influence of authors in the community.
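    For orientation, the CI score of a node i at radius l is usually defined as CI_l(i) = (k_i - 1) * sum of (k_j - 1) over nodes j on the frontier of the ball of radius l around i. Below is a minimal sketch of that score on an undirected toy graph; the full algorithm also removes the top-ranked node and recomputes adaptively, which is omitted here:

```python
import networkx as nx

def collective_influence(g: nx.Graph, node, radius: int = 2) -> int:
    # Nodes exactly `radius` hops away form the frontier of the ball.
    dist = nx.single_source_shortest_path_length(g, node, cutoff=radius)
    frontier = [j for j, d in dist.items() if d == radius]
    return (g.degree(node) - 1) * sum(g.degree(j) - 1 for j in frontier)

g = nx.karate_club_graph()  # stand-in network for illustration
ranked = sorted(g, key=lambda i: collective_influence(g, i), reverse=True)
print("top spreader candidates:", ranked[:5])
```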

  9. Accelerating Relevance Vector Machine for Large-Scale Data on Spark

    Directory of Open Access Journals (Sweden)

    Liu Fang

    2017-01-01

    Relevance vector machine (RVM) is a machine learning algorithm based on a sparse Bayesian framework, which performs well when running classification and regression tasks on small-scale datasets. However, RVM also has certain drawbacks that restrict its practical applications, such as (1) a slow training process and (2) poor performance when training on large-scale datasets. In order to solve these problems, we first propose Discrete AdaBoost RVM (DAB-RVM), which incorporates ensemble learning into RVM. This method performs well with large-scale low-dimensional datasets. However, as the number of features increases, the training time of DAB-RVM increases as well. To avoid this phenomenon, we utilize the abundant training samples of large-scale datasets and propose all-features boosting RVM (AFB-RVM), which modifies the way weak classifiers are obtained. In our experiments we study the differences between various boosting techniques with RVM, demonstrating the performance of the proposed approaches on Spark. As a result of this paper, two proposed approaches on Spark for different types of large-scale datasets are available.
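    The Discrete AdaBoost loop that DAB-RVM builds on is compact enough to sketch. In this illustration a decision stump stands in for the RVM weak learner (scikit-learn ships no RVM), labels are assumed to be +/-1 numpy arrays, and the Spark parallelization is omitted:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def discrete_adaboost(X, y, rounds=20):
    """y is a numpy array of +/-1 labels."""
    w = np.full(len(y), 1.0 / len(y))        # sample weights
    learners, alphas = [], []
    for _ in range(rounds):
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = w[pred != y].sum() / w.sum()
        if err >= 0.5:                       # no better than chance: stop
            break
        alpha = 0.5 * np.log((1 - err) / (err + 1e-12))
        w *= np.exp(-alpha * y * pred)       # up-weight misclassified samples
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return lambda Xq: np.sign(sum(a * m.predict(Xq) for a, m in zip(alphas, learners)))
```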

  10. A large-scale dataset of solar event reports from automated feature recognition modules

    Science.gov (United States)

    Schuh, Michael A.; Angryk, Rafal A.; Martens, Petrus C.

    2016-05-01

    The massive repository of images of the Sun captured by the Solar Dynamics Observatory (SDO) mission has ushered in the era of Big Data for Solar Physics. In this work, we investigate the entire public collection of events reported to the Heliophysics Event Knowledgebase (HEK) from automated solar feature recognition modules operated by the SDO Feature Finding Team (FFT). With the SDO mission recently surpassing five years of operations, and over 280,000 event reports for seven types of solar phenomena, we present the broadest and most comprehensive large-scale dataset of the SDO FFT modules to date. We also present numerous statistics on these modules, providing valuable contextual information for better understanding and validation of the individual event reports and the entire dataset as a whole. After extensive data cleaning through exploratory data analysis, we highlight several opportunities for knowledge discovery from data (KDD). Through these important prerequisite analyses presented here, the results of KDD from Solar Big Data will be overall more reliable and better understood. As the SDO mission remains operational over the coming years, these datasets will continue to grow in size and value. Future versions of this dataset will be analyzed in the general framework established in this work and maintained publicly online for easy access by the community.
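    HEK event reports of the kind analyzed here can be retrieved programmatically. A minimal sketch using SunPy's documented HEK client follows; the time window and event type ('FL' for flares) are arbitrary examples:

```python
from sunpy.net import hek

client = hek.HEKClient()
flares = client.search(hek.attrs.Time("2014-01-01", "2014-01-02"),
                       hek.attrs.EventType("FL"))
print(len(flares), "event reports retrieved")
```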

  11. Ever-present threats from information technology: the Cyber-Paranoia and Fear Scale

    Directory of Open Access Journals (Sweden)

    Oliver John Mason

    2014-11-01

    Delusions involving technology, and specifically the internet, are increasingly common, and fear-reality statistics suggest computer-related fears are very widespread. These fears form a continuum from the widely understandable and realistic to the unrealistic and frankly paranoid. The present study investigated the validity of this construct in a non-clinical population by constructing a novel self-report measure. The new Cyber-Paranoia and Fear Scale aims to measure the perception of information technology-related threats originating from or enabled by computers, smartphones, social networks and digital surveillance. Psychometric properties of the new Cyber-Paranoia and Fear Scale are reported alongside an established measure of suspiciousness and paranoia in 181 participants including a sub-group of fifty information technology professionals. Exploratory factor analysis suggested the presence of two related dimensions that we term Cyber-Fear and Cyber-Paranoia. Both sub-scales were internally consistent and produced a normal distribution of scores. The relationships of the sub-scales with age, gender, trait paranoia, digital literacy and digital inclusion are supportive of construct validity. The distinctiveness of 'cyber-paranoia' from general trait paranoia appears to mirror the clinical distinctiveness of 'internet' and other technology-fuelled delusions. Knowledge provision to increase technological proficiency and awareness may bring about a reduction in cyber-paranoia.

  12. Ever-present threats from information technology: the Cyber-Paranoia and Fear Scale.

    Science.gov (United States)

    Mason, Oliver J; Stevenson, Caroline; Freedman, Fleur

    2014-01-01

    Delusions involving technology, and specifically the internet, are increasingly common, and fear-reality statistics suggest computer-related fears are very widespread. These fears form a continuum from the widely understandable and realistic to the unrealistic and frankly paranoid. The present study investigated the validity of this construct in a non-clinical population by constructing a novel self-report measure. The new Cyber-Paranoia and Fear Scale aims to measure the perception of information technology-related threats originating from or enabled by computers, smartphones, social networks, and digital surveillance. Psychometric properties of the new Cyber-Paranoia and Fear Scale are reported alongside an established measure of suspiciousness and paranoia in 181 participants including a sub-group of fifty information technology professionals. Exploratory factor analysis suggested the presence of two related dimensions that we term cyber-paranoia and cyber-fear. Both sub-scales were internally consistent and produced a normal distribution of scores. The relationships of the sub-scales with age, gender, trait paranoia, digital literacy, and digital inclusion are supportive of construct validity. The distinctiveness of 'cyber-paranoia' from general trait paranoia appears to mirror the clinical distinctiveness of 'internet' and other technology-fuelled delusions. Knowledge provision to increase technological proficiency and awareness may bring about a reduction in cyber-paranoia.

  13. [Information system in nursing: interaction of tacit-explicit knowledge].

    Science.gov (United States)

    dos Santos, Sérgio Ribeiro

    2005-01-01

    This article traces theoretical and conceptual considerations on information systems in nursing, seeking to point out knowledge based on clinical practice evidence in order to construct a system model integrated with conceptual structures formed by the combination of three sciences: information, computing and nursing. This knowledge can systematically describe and explain the phenomena necessary to develop a comprehensive information system that contributes to the improvement of nursing records and consolidates a mechanism to provide basic measures of costs, quality, patient access to care, and the results of this care.

  14. Bayesian hierarchical model for large-scale covariance matrix estimation.

    Science.gov (United States)

    Zhu, Dongxiao; Hero, Alfred O

    2007-12-01

    Many bioinformatics problems implicitly depend on estimating a large-scale covariance matrix. The traditional approaches tend to give rise to high variance and low accuracy due to "overfitting." We cast the large-scale covariance matrix estimation problem into the Bayesian hierarchical model framework, and introduce dependency between covariance parameters. We demonstrate the advantages of our approaches over the traditional approaches using simulations and OMICS data analysis.
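    To illustrate the flavor of Bayesian shrinkage for covariance estimation, here is a minimal conjugate-prior sketch (an inverse-Wishart prior with a known zero mean). It is simpler than the paper's hierarchical model with dependent covariance parameters, but shows how a prior regularizes the sample scatter matrix when samples are few:

```python
import numpy as np

def iw_posterior_mean(X, psi, nu):
    """Posterior mean of the covariance under an inverse-Wishart(psi, nu)
    prior, assuming centered data: (psi + X'X) / (nu + n - p - 1)."""
    n, p = X.shape
    return (psi + X.T @ X) / (nu + n - p - 1)

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 100))   # n << p: the classic overfitting regime
psi = 100.0 * np.eye(100)            # shrink toward a scaled identity
sigma_hat = iw_posterior_mean(X, psi, nu=120.0)
print(sigma_hat.shape)
```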

  15. ORNL Pre-test Analyses of A Large-scale Experiment in STYLE

    International Nuclear Information System (INIS)

    Williams, Paul T.; Yin, Shengjun; Klasky, Hilda B.; Bass, Bennett Richard

    2011-01-01

    Oak Ridge National Laboratory (ORNL) is conducting a series of numerical analyses to simulate a large scale mock-up experiment planned within the European Network for Structural Integrity for Lifetime Management non-RPV Components (STYLE). STYLE is a European cooperative effort to assess the structural integrity of (non-reactor pressure vessel) reactor coolant pressure boundary components relevant to ageing and life-time management and to integrate the knowledge created in the project into mainstream nuclear industry assessment codes. ORNL contributes work-in-kind support to STYLE Work Package 2 (Numerical Analysis/Advanced Tools) and Work Package 3 (Engineering Assessment Methods/LBB Analyses). This paper summarizes the current status of ORNL analyses of the STYLE Mock-Up3 large-scale experiment to simulate and evaluate crack growth in a cladded ferritic pipe. The analyses are being performed in two parts. In the first part, advanced fracture mechanics models are being developed and performed to evaluate several experiment designs taking into account the capabilities of the test facility while satisfying the test objectives. Then these advanced fracture mechanics models will be utilized to simulate the crack growth in the large scale mock-up test. For the second part, the recently developed ORNL SIAM-PFM open-source, cross-platform, probabilistic computational tool will be used to generate an alternative assessment for comparison with the advanced fracture mechanics model results. The SIAM-PFM probabilistic analysis of the Mock-Up3 experiment will utilize fracture modules that are installed into a general probabilistic framework. The probabilistic results of the Mock-Up3 experiment obtained from SIAM-PFM will be compared to those results generated using the deterministic 3D nonlinear finite-element modeling approach. The objective of the probabilistic analysis is to provide uncertainty bounds that will assist in assessing the more detailed 3D finite

  16. Large-scale Organized Magnetic Fields in O, B and A Stars

    Science.gov (United States)

    Mathys, G.

    2009-06-01

    The status of our current knowledge of magnetic fields in stars of spectral types ranging from early F to O is reviewed. Fields with large-scale organised structure have now been detected and measured throughout this range. These fields are consistent with the oblique rotator model. In early F to late B stars, their occurrence is restricted to the subgroup of the Ap stars, which have the best studied fields among the early-type stars. Presence of fields with more complex topologies in other A and late B stars has been suggested, but is not firmly established. Magnetic fields have not been studied in a sufficient number of OB stars yet so as to establish whether they occur in all or only in some subset of these stars.

  17. Effects of microhabitat and large-scale land use on stream salamander occupancy in the coalfields of Central Appalachia

    Science.gov (United States)

    Sweeten, Sara E.; Ford, W. Mark

    2016-01-01

    Large-scale coal mining practices, particularly surface coal extraction and associated valley fills, as well as residential wastewater discharge, are of ecological concern for aquatic systems in central Appalachia. Identifying and quantifying alterations to ecosystems along a gradient of spatial scales is a necessary first step to aid in mitigation of negative consequences to aquatic biota. In central Appalachian headwater streams, apart from fish, salamanders are the most abundant vertebrate predators and provide a significant intermediate trophic role linking aquatic and terrestrial food webs. Stream salamander species are considered sensitive to aquatic stressors and environmental alterations, as past research has shown linkages among microhabitat parameters, large-scale land use such as urbanization and logging, and salamander abundances. However, there is little information examining these relationships between environmental conditions and salamander occupancy in the coalfields of central Appalachia. In the summer of 2013, 70 sites (sampled two to three times each) in the southwest Virginia coalfields were visited to collect salamanders and quantify stream and riparian microhabitat parameters. Using an information-theoretic framework, effects of microhabitat and large-scale land use on stream salamander occupancy were compared. The findings indicate that Desmognathus spp. occupancy rates are more correlated with microhabitat parameters such as canopy cover than with large-scale land uses. However, Eurycea spp. occupancy rates had a strong association with large-scale land uses, particularly recent mining and forest cover within the watershed. These findings suggest that protection of riparian habitats is an important consideration for maintaining aquatic systems in central Appalachia. If this is not possible, restoration of riparian areas should follow guidelines favoring quick-growing tree species that are native to Appalachian riparian areas. These types of trees

  18. Creating Large Scale Database Servers

    International Nuclear Information System (INIS)

    Becla, Jacek

    2001-01-01

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region

  19. Creating Large Scale Database Servers

    Energy Technology Data Exchange (ETDEWEB)

    Becla, Jacek

    2001-12-14

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region.

  20. Support groups as a channel for sharing information and knowledge

    Directory of Open Access Journals (Sweden)

    Armando Sergio de Aguiar Filho

    2016-12-01

    Introduction: The sharing of information and knowledge tends to lead to a new understanding of distribution channels, allowing the concept of sharing to mature along with its relationship to the process of information management. This interaction gives rise to a range of alternatives for how organizations relate internally with employees and externally with their audiences. Objectives: The goal is to survey and present studies related to channels for sharing information and knowledge, seeking to identify their correlates in the field of administration. Methodology: The work was developed from a literature search, which first sought to align the concepts and terminology of the information science area, and then to identify a differentiated approach to sharing that would help validate the interdisciplinary character of the information field and the contribution that other areas can make to studies of information and knowledge management. Results: The analysis of the survey indicated considerations relevant to understanding the various approaches used in relation to sharing channels, as well as the common and distinct characteristics of these media and the impact on their dynamics. Conclusions: 'Support group' is one of several terminologies used in the sharing of information and knowledge and, like the other approaches presented, serves to assess and promote better information services to meet specific demands.

  1. Decentralised stabilising controllers for a class of large-scale linear ...

    Indian Academy of Sciences (India)

    subsystems resulting from a new aggregation-decomposition technique. The method has been illustrated through a numerical example of a large-scale linear system consisting of three subsystems each of the fourth order. Keywords. Decentralised stabilisation; large-scale linear systems; optimal feedback control; algebraic ...

  2. Modelling large scale human activity in San Francisco

    Science.gov (United States)

    Gonzalez, Marta

    2010-03-01

    Diverse groups of people with a wide variety of schedules, activities and travel needs compose our cities nowadays. This represents a big challenge for modeling travel behavior in urban environments; such models are of crucial interest for a wide variety of applications such as traffic forecasting, the spreading of viruses, or measuring human exposure to air pollutants. The traditional means of obtaining knowledge about travel behavior is limited to surveys on travel journeys. The information obtained is based on questionnaires that are usually costly to implement, have intrinsic limitations in covering large numbers of individuals, and present some problems of reliability. Using mobile phone data, we explore the basic characteristics of a model of human travel: the distribution of agents is proportional to the population density of a given region, and each agent has a characteristic trajectory size containing information on the frequency of visits to different locations. Additionally, we use a complementary data set given by smart subway fare cards, offering information about the exact time each passenger enters or exits a subway station and the station's coordinates. This allows us to uncover the temporal aspects of mobility. Since we have the actual time and place of each individual's origin and destination, we can understand the temporal patterns in each visited location in further detail. Integrating the two data sets, we provide a dynamical model of human travel that incorporates different aspects observed empirically.
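    A toy sketch of the two empirical ingredients named above (agents placed proportionally to population density, and heavy-tailed trajectory sizes) might look as follows; all numbers are illustrative, not fitted to the phone or fare-card data:

```python
import numpy as np

rng = np.random.default_rng(1)
density = np.array([0.5, 0.3, 0.15, 0.05])   # hypothetical zone densities
n_agents = 1000

# Agents are distributed across zones proportionally to population density.
homes = rng.choice(len(density), size=n_agents, p=density)

# Heavy-tailed "trajectory size": number of distinct locations an agent visits.
trajectory_size = np.clip(rng.zipf(a=2.5, size=n_agents), 1, 50)

print("agents per zone:", np.bincount(homes, minlength=len(density)))
print("median locations visited:", int(np.median(trajectory_size)))
```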

  3. Large Scale Survey Data in Career Development Research

    Science.gov (United States)

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  4. Similitude and scaling of large structural elements: Case study

    Directory of Open Access Journals (Sweden)

    M. Shehadeh

    2015-06-01

    Scaled-down models are widely used for experimental investigations of large structures due to the limited capacities of testing facilities and the expense of experimentation. The modeling accuracy depends upon the model material properties, fabrication accuracy and loading techniques. In the present work the Buckingham π theorem is used to develop the relations (i.e., geometry, loading and properties) between the model and a large structural element such as those present in huge existing petroleum oil drilling rigs. The model is to be designed, loaded and treated according to a set of similitude requirements that relate the model to the large structural element. Three independent scale factors, representing the three fundamental dimensions of mass, length and time, need to be selected for designing the scaled-down model. Numerical prediction of the stress distribution within the model and its elastic deformation under steady loading is made, and the results are compared with those obtained from numerical computations on the full-scale structure. The effect of scaled-down model size and material on the accuracy of the modeling technique is thoroughly examined.
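    Once the three independent scale factors are fixed, every derived quantity scales as the corresponding product of powers of mass, length and time. A minimal sketch of that bookkeeping follows; the ratios are illustrative, not those of the case study:

```python
# Model-to-prototype ratios for the three fundamental dimensions (assumed).
lam_M, lam_L, lam_T = 1e-3, 0.1, 0.1 ** 0.5

lam_force  = lam_M * lam_L / lam_T**2    # F = m*a      -> M L T^-2
lam_stress = lam_force / lam_L**2        # sigma = F/A  -> M L^-1 T^-2
lam_E      = lam_stress                  # elastic modulus scales like stress

print(f"force scale:  {lam_force:.4g}")
print(f"stress scale: {lam_stress:.4g}  (also the modulus scale: {lam_E:.4g})")
```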

  5. Large scale waste combustion projects. A study of financial structures and sensitivities

    International Nuclear Information System (INIS)

    Brandler, A.

    1993-01-01

    The principal objective of the study was to determine the key contractual and financial aspects of large scale energy-from-waste projects, and to provide the necessary background information on financing to appreciate the approach lenders take when they consider financing waste combustion projects. An integral part of the study has been the preparation of a detailed financial model, incorporating all major financing parameters, to assess the economic and financial viability of typical waste combustion projects. (author)

  6. The use of production management techniques in the construction of large scale physics detectors

    International Nuclear Information System (INIS)

    Bazan, A.; Chevenier, G.; Estrella, F.

    1999-01-01

    The construction process of detectors for the Large Hadron Collider (LHC) experiments is large scale, heavily constrained by resource availability and evolves with time. As a consequence, changes in detector component design need to be tracked and quickly reflected in the construction process. Faced with similar problems, engineers in industry employ so-called Product Data Management (PDM) systems to control access to documented versions of designs, and managers employ so-called Workflow Management Software (WfMS) to coordinate production work processes. However, PDM and WfMS software are not generally integrated in industry. The scale of LHC experiments, like CMS, demands that industrial production techniques be applied in detector construction. This paper outlines the major functions and applications of the CRISTAL system (Cooperating Repositories and an Information System for Tracking Assembly Lifecycles) in use in CMS, which successfully integrates PDM and WfMS techniques in managing large-scale physics detector construction. This is the first time industrial production techniques have been deployed to this extent in detector construction

  7. 9th International Conference on Knowledge, Information and Creativity Support Systems

    CERN Document Server

    Papadopoulos, George; Skulimowski, Andrzej; Kacprzyk, Janusz

    2016-01-01

    This volume consists of a number of selected papers that were presented at the 9th International Conference on Knowledge, Information and Creativity Support Systems (KICSS 2014) in Limassol, Cyprus, after they were substantially revised and extended. The 27 regular papers and 19 short papers included in these proceedings cover all aspects of knowledge management, knowledge engineering, intelligent information systems, and creativity in an information technology context, including computational creativity and its cognitive and collaborative aspects.

  8. Large-scale preparation of hollow graphitic carbon nanospheres

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Jun; Li, Fu [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Bai, Yu-Jun, E-mail: byj97@126.com [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); State Key laboratory of Crystal Materials, Shandong University, Jinan 250100 (China); Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Lu, Xi-Feng [Lunan Institute of Coal Chemical Engineering, Jining 272000 (China)

    2013-01-15

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of 10 nm or so and a wall thickness of a few graphenes. The HGCNSs exhibit a reversible capacity of 391 mAh g⁻¹ after 60 cycles when used as anode materials for Li-ion batteries. Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on large scale by the simple reaction between glucose and Mg at 550 °C, and they exhibit superior electrochemical performance to graphite. Highlights: Hollow graphitic carbon nanospheres (HGCNSs) were prepared on large scale at 550 °C. The preparation is simple, effective and eco-friendly. The in situ yielded MgO nanocrystals promote the graphitization. The HGCNSs exhibit superior electrochemical performance to graphite.

  9. Large-scale impact cratering on the terrestrial planets

    International Nuclear Information System (INIS)

    Grieve, R.A.F.

    1982-01-01

    The crater densities on the earth and moon form the basis for a standard flux-time curve that can be used in dating unsampled planetary surfaces and constraining the temporal history of endogenic geologic processes. Abundant evidence is seen not only that impact cratering was an important surface process in planetary history but also that large impact events produced effects that were crustal in scale. By way of example, it is noted that the formation of multiring basins on the early moon was as important in defining the planetary tectonic framework as plate tectonics is on the earth. Evidence from several planets suggests that the effects of very-large-scale impacts go beyond the simple formation of an impact structure and serve to localize increased endogenic activity over an extended period of geologic time. Even though such events no longer occur with the frequency and magnitude of early solar system history, large-scale impact events continue to affect the local geology of the planets. 92 references

  10. Optical interconnect for large-scale systems

    Science.gov (United States)

    Dress, William

    2013-02-01

    This paper presents a switchless, optical interconnect module that serves as a node in a network of identical distribution modules for large-scale systems. Thousands to millions of hosts or endpoints may be interconnected by a network of such modules, avoiding the need for multi-level switches. Several common network topologies are reviewed and their scaling properties assessed. The concept of message-flow routing is discussed in conjunction with the unique properties enabled by the optical distribution module where it is shown how top-down software control (global routing tables, spanning-tree algorithms) may be avoided.

  11. Large scale treatment of total petroleum-hydrocarbon contaminated groundwater using bioaugmentation.

    Science.gov (United States)

    Poi, Gregory; Shahsavari, Esmaeil; Aburto-Medina, Arturo; Mok, Puah Chum; Ball, Andrew S

    2018-05-15

    Bioaugmentation, or the addition of microbes to contaminated sites, has been widely used to treat contaminated soil or water; however, this approach is often limited to laboratory-based studies. In the present study, large-scale bioaugmentation has been applied to total petroleum hydrocarbon (TPH)-contaminated groundwater at a petroleum facility. Initial TPH concentrations of 1564 mg L⁻¹ in the field were reduced to 89 mg L⁻¹ over 32 days. This reduction was accompanied by reduced ecotoxicity, as shown by Brassica rapa germination that increased from 52% at day 0 to 82% by the end of the treatment. Metagenomic analysis indicated that there was a shift in the microbial community when compared to the beginning of the treatment. The microbial community was dominated by Proteobacteria and Bacteroidetes from day 0 to day 32, although differences at the genus level were observed. The predominant genera at the beginning of the treatment (day 0, just after inoculation) were Cloacibacterium, Sediminibacterium and Brevundimonas, while at the end of the treatment members of Flavobacterium dominated, reaching almost half the population (41%), followed by Pseudomonas (6%) and Limnobacter (5.8%). To the authors' knowledge, this is among the first studies to report the successful large-scale biodegradation of TPH-contaminated groundwater (18,000 L per treatment session) at an offshore petrochemical facility. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. Role of Information Professionals in Knowledge Management Programs: Empirical Evidence from Canada

    Directory of Open Access Journals (Sweden)

    Isola Ajiferuke

    2003-01-01

    The implementation of a knowledge management program in an organization has the potential of improving customer services, quickly bringing new products to market, and reducing the cost of business operations. Information technologies are often used in knowledge management programs to inform clients and employees of the latest innovations and developments in the business sector, as well as to share knowledge among the employees. The key professionals involved in knowledge management programs are information technologists and human resource managers, but information professionals also have a role to play as they are traditionally known as good managers of explicit knowledge. Hence, the aim of this study is to provide empirical evidence of the role of information professionals in knowledge management programs. 386 information professionals working in Canadian organizations were selected from the Special Libraries Association's Who's Who in Special Libraries 2001/2002, and a questionnaire with a stamped self-addressed envelope for its return was sent to each one of them. 63 questionnaires were completed and returned, and 8 in-depth interviews were conducted. About 59% of the information professionals surveyed work in organizations that have knowledge management programs, with about 86% of these professionals being involved in the programs. Factors such as gender, age, and educational background (i.e. highest educational qualifications and discipline) did not seem to have any relationship with involvement in knowledge management programs. Many of those involved in the programs are playing key roles, such as the design of the information architecture, development of taxonomy, or content management of the organization's intranet. Others play lesser roles, such as providing information for the intranet, gathering competitive intelligence, or providing research services as requested by the knowledge management team.

  13. HOW ROMANIAN FINANCIAL AND INTERNAL AUDITORS ACQUIRE ACCOUNTING INFORMATION SYSTEMS KNOWLEDGE AND COMPETENCES?

    Directory of Open Access Journals (Sweden)

    Cardos Vasile - Daniel

    2011-07-01

    Research theme: In this article we investigate how Romanian financial and internal auditors acquire accounting information systems knowledge and competences, and how they use this knowledge to improve their activity in order to fulfill their mission as required by the professional standards. Objectives: Our main purpose is to establish through what types of courses Romanian financial and internal auditors acquire accounting information systems knowledge and competences, and how useful these courses are perceived to be by the auditors. Prior work: Professional audit organizations prescribe that auditors must acquire, maintain and develop their knowledge and competences, and information technology and information systems are considered a main knowledge component of professional development programs. The scientific literature indicates that auditors have to enhance their information systems knowledge in order to cope with the increasing complexity of client entities' accounting information systems. We consider that our article embraces the call by Curtis et al. (2009) for research on how auditors obtain information systems knowledge. Methodology: An electronic questionnaire was created and sent to Romanian financial and internal auditors, who were asked to indicate the number of accounting information systems courses they had attended and how the knowledge gained improved their activity. Results: We conclude that financial auditors acquire accounting information systems knowledge mainly by attending courses organized by the Chamber of Financial Auditors of Romania, while internal auditors attend courses organized by the companies they work for. Implications: The results of this study might be used by Romanian professional audit organizations in reconsidering their priorities regarding the accounting information systems knowledge and competence needs of their constituents. Originality/Contribution: Our study is the first one to

  14. Geophysical mapping of complex glaciogenic large-scale structures

    DEFF Research Database (Denmark)

    Høyer, Anne-Sophie

    2013-01-01

    This thesis presents the main results of a four year PhD study concerning the use of geophysical data in geological mapping. The study is related to the Geocenter project, “KOMPLEKS”, which focuses on the mapping of complex, large-scale geological structures. The study area is approximately 100 km2...... data types and co-interpret them in order to improve our geological understanding. However, in order to perform this successfully, methodological considerations are necessary. For instance, a structure indicated by a reflection in the seismic data is not always apparent in the resistivity data...... information) can be collected. The geophysical data are used together with geological analyses from boreholes and pits to interpret the geological history of the hill-island. The geophysical data reveal that the glaciotectonic structures truncate at the surface. The directions of the structures were mapped...

  15. Assessing Knowledge Sharing Among Academics: A Validation of the Knowledge Sharing Behavior Scale (KSBS).

    Science.gov (United States)

    Ramayah, T; Yeap, Jasmine A L; Ignatius, Joshua

    2014-04-01

    Background: There is a belief that academics tend to hold on tightly to their knowledge and intellectual resources. However, not much effort has been put into the creation of a valid and reliable instrument to measure knowledge sharing behavior among academics. Objective: To apply and validate the Knowledge Sharing Behavior Scale (KSBS) as a measure of knowledge sharing behavior within the academic community. Method: Respondents (N = 447) were academics from arts and science streams in 10 local, public universities in Malaysia. Data were collected using the 28-item KSBS that assessed four dimensions of knowledge sharing behavior, namely written contributions, organizational communications, personal interactions, and communities of practice. Results: The exploratory factor analysis showed that the items loaded on the dimension constructs that they were supposed to represent, thus proving construct validity. A within-factor analysis revealed that each set of items representing its intended dimension loaded on only one construct, therefore establishing convergent validity. All four dimensions were not perfectly correlated with each other or with organizational citizenship behavior, thereby proving discriminant validity. However, all four dimensions correlated with organizational commitment, thus confirming predictive validity. Furthermore, all four factors correlated with both tacit and explicit sharing, which confirmed their concurrent validity. All measures also possessed sufficient reliability (α > .70). Conclusion: The KSBS is a valid and reliable instrument that can be used to formally assess the types of knowledge artifacts residing among academics and the degree of knowledge sharing in relation to those artifacts. © The Author(s) 2014.

  16. Identifying influential nodes in large-scale directed networks: the role of clustering.

    Science.gov (United States)

    Chen, Duan-Bing; Gao, Hui; Lü, Linyuan; Zhou, Tao

    2013-01-01

    Identifying influential nodes in very large-scale directed networks is a big challenge relevant to disparate applications, such as accelerating information propagation, controlling rumors and diseases, designing search engines, and understanding hierarchical organization of social and biological networks. Known methods range from node centralities, such as degree, closeness and betweenness, to diffusion-based processes, like PageRank and LeaderRank. Some of these methods already take into account the influences of a node's neighbors but do not directly make use of the interactions among its neighbors. Local clustering is known to have negative impacts on the information spreading. We further show empirically that it also plays a negative role in generating local connections. Inspired by these facts, we propose a local ranking algorithm named ClusterRank, which takes into account not only the number of neighbors and the neighbors' influences, but also the clustering coefficient. Subject to the susceptible-infected-recovered (SIR) spreading model with constant infectivity, experimental results on two directed networks, a social network extracted from delicious.com and a large-scale short-message communication network, demonstrate that the ClusterRank outperforms some benchmark algorithms such as PageRank and LeaderRank. Furthermore, ClusterRank can also be applied to undirected networks where the superiority of ClusterRank is significant compared with degree centrality and k-core decomposition. In addition, ClusterRank, only making use of local information, is much more efficient than global methods: It takes only 191 seconds for a network with about [Formula: see text] nodes, more than 15 times faster than PageRank.
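    In the paper, the ClusterRank score of node i takes the form s_i = f(c_i) * sum over neighbors j of (k_j + 1), where the penalty f(c_i) = 10^(-c_i) discounts the clustering coefficient c_i. A minimal sketch on an undirected toy graph follows; the directed case substitutes out-neighbors, out-degrees and directed clustering:

```python
import networkx as nx

def cluster_rank(g: nx.Graph) -> dict:
    c = nx.clustering(g)  # local clustering coefficients
    return {i: 10 ** (-c[i]) * sum(g.degree(j) + 1 for j in g[i]) for i in g}

g = nx.karate_club_graph()   # stand-in network for illustration
scores = cluster_rank(g)
print(sorted(scores, key=scores.get, reverse=True)[:5])
```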

  17. Identifying influential nodes in large-scale directed networks: the role of clustering.

    Directory of Open Access Journals (Sweden)

    Duan-Bing Chen

    Identifying influential nodes in very large-scale directed networks is a big challenge relevant to disparate applications, such as accelerating information propagation, controlling rumors and diseases, designing search engines, and understanding hierarchical organization of social and biological networks. Known methods range from node centralities, such as degree, closeness and betweenness, to diffusion-based processes, like PageRank and LeaderRank. Some of these methods already take into account the influences of a node's neighbors but do not directly make use of the interactions among its neighbors. Local clustering is known to have negative impacts on the information spreading. We further show empirically that it also plays a negative role in generating local connections. Inspired by these facts, we propose a local ranking algorithm named ClusterRank, which takes into account not only the number of neighbors and the neighbors' influences, but also the clustering coefficient. Subject to the susceptible-infected-recovered (SIR) spreading model with constant infectivity, experimental results on two directed networks, a social network extracted from delicious.com and a large-scale short-message communication network, demonstrate that the ClusterRank outperforms some benchmark algorithms such as PageRank and LeaderRank. Furthermore, ClusterRank can also be applied to undirected networks where the superiority of ClusterRank is significant compared with degree centrality and k-core decomposition. In addition, ClusterRank, only making use of local information, is much more efficient than global methods: It takes only 191 seconds for a network with about [Formula: see text] nodes, more than 15 times faster than PageRank.

  18. Identifying Influential Nodes in Large-Scale Directed Networks: The Role of Clustering

    Science.gov (United States)

    Chen, Duan-Bing; Gao, Hui; Lü, Linyuan; Zhou, Tao

    2013-01-01

    Identifying influential nodes in very large-scale directed networks is a big challenge relevant to disparate applications, such as accelerating information propagation, controlling rumors and diseases, designing search engines, and understanding hierarchical organization of social and biological networks. Known methods range from node centralities, such as degree, closeness and betweenness, to diffusion-based processes, like PageRank and LeaderRank. Some of these methods already take into account the influences of a node’s neighbors but do not directly make use of the interactions among its neighbors. Local clustering is known to have negative impacts on the information spreading. We further show empirically that it also plays a negative role in generating local connections. Inspired by these facts, we propose a local ranking algorithm named ClusterRank, which takes into account not only the number of neighbors and the neighbors’ influences, but also the clustering coefficient. Subject to the susceptible-infected-recovered (SIR) spreading model with constant infectivity, experimental results on two directed networks, a social network extracted from delicious.com and a large-scale short-message communication network, demonstrate that the ClusterRank outperforms some benchmark algorithms such as PageRank and LeaderRank. Furthermore, ClusterRank can also be applied to undirected networks where the superiority of ClusterRank is significant compared with degree centrality and k-core decomposition. In addition, ClusterRank, only making use of local information, is much more efficient than global methods: It takes only 191 seconds for a network with about nodes, more than 15 times faster than PageRank. PMID:24204833

  19. Hierarchical Cantor set in the large scale structure with torus geometry

    Energy Technology Data Exchange (ETDEWEB)

    Murdzek, R. [Physics Department, 'Al. I. Cuza' University, Blvd. Carol I, Nr. 11, Iassy 700506 (Romania)], E-mail: rmurdzek@yahoo.com

    2008-12-15

    The formation of large-scale structures is considered within a model with a string on a toroidal space-time. Firstly, the space-time geometry is presented. In this geometry, the Universe is represented by a string describing a torus surface. Thereafter, the large-scale structure of the Universe is derived from the string oscillations. The results are in agreement with the cellular structure of the large-scale distribution and with the theory of a Cantorian space-time.

  20. Providing Knowledge Recommendations: An Approach for Informal Electronic Mentoring

    Science.gov (United States)

    Colomo-Palacios, Ricardo; Casado-Lumbreras, Cristina; Soto-Acosta, Pedro; Misra, Sanjay

    2014-01-01

    The use of Web 2.0 technologies for knowledge management is invading the corporate sphere. Web 2.0 is the most widely adopted knowledge transfer tool within knowledge-intensive firms and is starting to be used for mentoring. This paper presents IM-TAG, a Web 2.0 tool, based on semantic technologies, for informal mentoring. The tool offers…

  1. Detecting Large-Scale Landslides Using Lidar Data and Aerial Photos in the Namasha-Liuoguey Area, Taiwan

    Directory of Open Access Journals (Sweden)

    Meei-Ling Lin

    2013-12-01

    Full Text Available Large-scale landslides often cause severe damage to lives and properties; therefore, their identification is essential in order to adopt proper mitigation measures. The objective of this study was to set up a methodological approach to help identify large-scale landslides using Lidar data, aerial photos and field investigation. The selected study areas were the Namasha and Liuoguey areas in Kaohsiung City, Taiwan, both of which were severely hit by Typhoon Morakot in 2009. The identification of large-scale landslides was based on high-resolution Lidar topographic information. Linear structures were mapped from shaded-relief maps illuminated from different azimuths, which bring out the details of the structures, and the scarps of the landslides were also identified. The results were validated using both aerial photos and field investigations. In addition, stability analyses were performed on designated cases to further validate the results of the Lidar identification.

  2. Large-scale educational telecommunications systems for the US: An analysis of educational needs and technological opportunities

    Science.gov (United States)

    Morgan, R. P.; Singh, J. P.; Rothenberg, D.; Robinson, B. E.

    1975-01-01

    The needs to be served, the subsectors in which the system might be used, the technology employed, and the prospects for future utilization of an educational telecommunications delivery system are described and analyzed. Educational subsectors are analyzed with emphasis on the current status and trends within each subsector. Issues which affect future development, and prospects for future use of media, technology, and large-scale electronic delivery within each subsector are included. Information on technology utilization is presented. Educational telecommunications services are identified and grouped into categories: public television and radio, instructional television, computer aided instruction, computer resource sharing, and information resource sharing. Technology based services, their current utilization, and factors which affect future development are stressed. The role of communications satellites in providing these services is discussed. Efforts to analyze and estimate future utilization of large-scale educational telecommunications are summarized. Factors which affect future utilization are identified. Conclusions are presented.

  3. Towards an information extraction and knowledge formation framework based on Shannon entropy

    Directory of Open Access Journals (Sweden)

    Iliescu Dragoș

    2017-01-01

    Full Text Available The quantity of information is the subject of this paper, with the specific domain of nonconforming product management as the information source. The work is a case study: raw data were gathered from a heavy industrial works company and used for information extraction and knowledge formation. The method used to estimate the quantity of information is based on the Shannon entropy formula. The information and entropy spectra are decomposed and analysed to extract specific information and to form knowledge. The results of the entropy analysis point out the information that the organisation needs to acquire, presented as a specific knowledge type.
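
    For illustration, a minimal sketch of the Shannon entropy estimate the method rests on, H = -sum_i p_i * log2(p_i), where p_i is the relative frequency of each nonconformity category; the category labels below are hypothetical.

        # Shannon entropy of an observed distribution of nonconformity categories.
        import math
        from collections import Counter

        def shannon_entropy(observations):
            n = len(observations)
            return -sum((c / n) * math.log2(c / n) for c in Counter(observations).values())

        # Hypothetical nonconformity records from inspection reports.
        records = ["dimension", "surface", "dimension", "material", "surface", "dimension"]
        print(f"H = {shannon_entropy(records):.3f} bits")  # higher H -> more informative source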

  4. Large-scale Motion of Solar Filaments

    Indian Academy of Sciences (India)


    Large-scale Motion of Solar Filaments. Pavel Ambrož, Astronomical Institute of the Acad. Sci. of the Czech Republic, CZ-25165 Ondrejov, The Czech Republic. e-mail: pambroz@asu.cas.cz. Alfred Schroll, Kanzelhöhe Solar Observatory of the University of Graz, A-9521 Treffen, Austria. e-mail: schroll@solobskh.ac.at.

  5. Sensitivity analysis for large-scale problems

    Science.gov (United States)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.

  6. The future of primordial features with large-scale structure surveys

    International Nuclear Information System (INIS)

    Chen, Xingang; Namjoo, Mohammad Hossein; Dvorkin, Cora; Huang, Zhiqi; Verde, Licia

    2016-01-01

    Primordial features are one of the most important extensions of the Standard Model of cosmology, providing a wealth of information on the primordial Universe, ranging from discrimination between inflation and alternative scenarios, through new particle detection, to fine structures in the inflationary potential. We study the prospects of future large-scale structure (LSS) surveys for detecting and constraining these features. We classify primordial feature models into several classes, and for each class we present a simple power-spectrum template that encodes the essential physics. We study how well the most ambitious LSS surveys proposed to date, both spectroscopic and photometric, will be able to improve the constraints with respect to the current Planck data. We find that these LSS surveys will significantly improve the experimental sensitivity to feature signals that are oscillatory in scale, thanks to the 3D information they provide. For a broad range of models, these surveys will be able to reduce the errors on the amplitudes of the features by a factor of 5 or more, including for several interesting candidates identified in the recent Planck data. LSS surveys therefore offer an impressive opportunity for primordial feature discovery in the next decade or two. We also compare the advantages of the two types of surveys.
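
    For illustration, a sketch of one commonly used class of feature templates, oscillations linear in log k (resonance-type features); the paper's own templates may differ in form and parametrization, and all parameter values below are hypothetical.

        # One common primordial feature template: a smooth power law modulated by
        # oscillations in log k, P(k) = P0(k) * [1 + A * sin(omega * ln(k/k_star) + phi)].
        import numpy as np

        def featured_power_spectrum(k, A=0.05, omega=30.0, k_star=0.05, phi=0.0,
                                    As=2.1e-9, ns=0.965):
            p0 = As * (k / k_star) ** (ns - 1.0)  # smooth power-law baseline
            return p0 * (1.0 + A * np.sin(omega * np.log(k / k_star) + phi))

        k = np.logspace(-4, 0, 500)               # wavenumbers, illustrative range
        pk = featured_power_spectrum(k)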

  7. The future of primordial features with large-scale structure surveys

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Xingang; Namjoo, Mohammad Hossein [Institute for Theory and Computation, Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Dvorkin, Cora [Department of Physics, Harvard University, Cambridge, MA 02138 (United States); Huang, Zhiqi [School of Physics and Astronomy, Sun Yat-Sen University, 135 Xingang Xi Road, Guangzhou, 510275 (China); Verde, Licia, E-mail: xingang.chen@cfa.harvard.edu, E-mail: dvorkin@physics.harvard.edu, E-mail: huangzhq25@sysu.edu.cn, E-mail: mohammad.namjoo@cfa.harvard.edu, E-mail: liciaverde@icc.ub.edu [ICREA and ICC-UB, University of Barcelona (IEEC-UB), Marti i Franques, 1, Barcelona 08028 (Spain)

    2016-11-01

    Primordial features are one of the most important extensions of the Standard Model of cosmology, providing a wealth of information on the primordial Universe, ranging from discrimination between inflation and alternative scenarios, through new particle detection, to fine structures in the inflationary potential. We study the prospects of future large-scale structure (LSS) surveys for detecting and constraining these features. We classify primordial feature models into several classes, and for each class we present a simple power-spectrum template that encodes the essential physics. We study how well the most ambitious LSS surveys proposed to date, both spectroscopic and photometric, will be able to improve the constraints with respect to the current Planck data. We find that these LSS surveys will significantly improve the experimental sensitivity to feature signals that are oscillatory in scale, thanks to the 3D information they provide. For a broad range of models, these surveys will be able to reduce the errors on the amplitudes of the features by a factor of 5 or more, including for several interesting candidates identified in the recent Planck data. LSS surveys therefore offer an impressive opportunity for primordial feature discovery in the next decade or two. We also compare the advantages of the two types of surveys.

  8. Topology Optimization of Large Scale Stokes Flow Problems

    DEFF Research Database (Denmark)

    Aage, Niels; Poulsen, Thomas Harpsøe; Gersborg-Hansen, Allan

    2008-01-01

    This note considers topology optimization of large scale 2D and 3D Stokes flow problems using parallel computations. We solve problems with up to 1.125.000 elements in 2D and 128.000 elements in 3D on a shared memory computer consisting of Sun UltraSparc IV CPUs.

  9. Integrating weather and geotechnical monitoring data for assessing the stability of large scale surface mining operations

    Directory of Open Access Journals (Sweden)

    Steiakakis Chrysanthos

    2016-01-01

    Full Text Available The geotechnical challenges for safe slope design in large scale surface mining operations are enormous. Sometimes one degree of slope inclination can significantly reduce the overburden to ore ratio and therefore dramatically improve the economics of the operation, while large scale slope failures may have a significant impact on human lives. Furthermore, adverse weather conditions, such as high precipitation rates, may unfavorably affect the already delicate balance between operations and safety. Geotechnical, weather and production parameters should be systematically monitored and evaluated in order to safely operate such pits. Appropriate data management, processing and storage are critical to ensure timely and informed decisions.

  10. The Cosmology Large Angular Scale Surveyor

    Science.gov (United States)

    Harrington, Kathleen; Marriage, Tobias; Ali, Aamir; Appel, John; Bennett, Charles; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe; hide

    2016-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  11. Adaptation of regulatory information and knowledge through knowledge maps in the Argentine Nuclear Regulatory Authority within the framework of nuclear renaissance

    International Nuclear Information System (INIS)

    Chahab, Martin; Dawyd, Noelia

    2008-01-01

    Full text: In the new framework of the nuclear renaissance, in the world in general and in Argentina in particular, the proper and efficient management of information and knowledge produced in the past, and to be produced during the renaissance, becomes critically important. Human resources in the nuclear sector across the world are going through significant change: a massive number of experts are retiring from the workforce, leaving a general generational gap; a new generation of workers is joining the nuclear rank and file with different training, values and cultural beliefs; and the information and knowledge transfer process is slow. These facts call for carefully considering and assessing new methods to manage information and knowledge. This paper discusses knowledge maps as a method to adapt historical information and knowledge and to make it more readily available to future workers; the paper also deals with a new management approach to such information. Knowledge maps probably represent an up-to-date method to manage both historical and new information and knowledge, adapting to a number of new cultural features, including but not limited to the intensive use of information technologies and the tendency to summarize and integrate concepts. A distinguishing feature of this new method of organizing information and knowledge is the need for a closer interrelation across the organisation's sectors. As a result, knowledge maps help create and improve manuals and procedures related to the specific tasks performed in the institution, based on the analysis carried out by those creating the maps. This tool also helps to better analyze the tasks already conducted or to be conducted by workers, all of which optimizes the job description process in the area of human resources. Another benefit of knowledge maps is that they help preserve the information and knowledge that can be used to train the staff in merely technical or induction issues as well as in an

  12. Information Tailoring Enhancements for Large-Scale Social Data

    Science.gov (United States)

    2016-09-26

    Accounts: In addition to linking Twitter accounts, users can now link their Instagram accounts. This is encouraged because users can use their token (as

  13. Responding to the challenge of artisanal and small-scale mining. How can knowledge networks help?

    Energy Technology Data Exchange (ETDEWEB)

    Buxton, Abbi

    2013-02-15

    This paper reviews what is known about the problems and structural challenges facing the 20-30 million artisanal and small-scale miners and their communities worldwide. Better understanding of these structural challenges is needed to improve policies and policy implementation to further sustainable development opportunities for the sector. The paper explores the current gaps in knowledge to achieve policy change from researchers, practitioners and artisanal and small-scale miners themselves. It explores how a 'knowledge intermediary', which acts to link knowledge with policy, could address these gaps and includes case studies of IIED’s work on knowledge networks and programmes. The paper concludes by proposing a way forward for designing a knowledge programme to meet the particular needs of the artisanal and small-scale mining (ASM) sector, and by inviting ASM sector stakeholders to share their views on the options outlined.

  14. Responding to the challenge of artisanal and small-scale mining. How can knowledge networks help?

    Energy Technology Data Exchange (ETDEWEB)

    Buxton, Abbi

    2013-02-15

    This paper reviews what is known about the problems and structural challenges facing the 20-30 million artisanal and small-scale miners and their communities worldwide. Better understanding of these structural challenges is needed to improve policies and policy implementation to further sustainable development opportunities for the sector. The paper explores the current gaps in knowledge to achieve policy change from researchers, practitioners and artisanal and small-scale miners themselves. It explores how a 'knowledge intermediary', which acts to link knowledge with policy, could address these gaps and includes case studies of IIED’s work on knowledge networks and programmes. The paper concludes by proposing a way forward for designing a knowledge programme to meet the particular needs of the artisanal and small-scale mining (ASM) sector, and by inviting ASM sector stakeholders to share their views on the options outlined.

  15. Collective Influence of Multiple Spreaders Evaluated by Tracing Real Information Flow in Large-Scale Social Networks

    Science.gov (United States)

    Teng, Xian; Pei, Sen; Morone, Flaviano; Makse, Hernán A.

    2016-01-01

    Identifying the most influential spreaders that maximize information flow is a central question in network theory. Recently, a scalable method called “Collective Influence” (CI) has been put forward through collective influence maximization. In contrast to heuristic methods that evaluate nodes’ significance separately, the CI method inspects the collective influence of multiple spreaders. Although CI applies to the influence maximization problem in the percolation model, it is still important to examine its efficacy in realistic information spreading. Here, we examine real-world information flow on various social and scientific platforms, including the American Physical Society, Facebook, Twitter and LiveJournal. Since empirical data cannot be directly mapped to ideal multi-source spreading, we leverage the behavioral patterns of users extracted from the data to construct “virtual” information spreading processes. Our results demonstrate that the set of spreaders selected by CI induces a larger scale of information propagation. Moreover, local measures such as the number of connections or citations are not necessarily deterministic factors of nodes’ importance in realistic information spreading. This result has significance for ranking scientists in scientific networks like the APS, where the commonly used number of citations can be a poor indicator of the collective influence of authors in the community. PMID:27782207
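
    For illustration, a sketch of the CI score as defined in the original Collective Influence work by Morone and Makse, CI_l(i) = (k_i - 1) * sum over the frontier of the ball of radius l around i of (k_j - 1); the undirected toy graph below is hypothetical.

        # Collective Influence at radius l: (k_i - 1) times the sum of (k_j - 1)
        # over the nodes exactly l steps away from i.
        from collections import defaultdict

        def collective_influence(adj, node, radius=2):
            seen, frontier = {node}, {node}
            for _ in range(radius):
                frontier = {v for u in frontier for v in adj[u]} - seen
                seen |= frontier
            return (len(adj[node]) - 1) * sum(len(adj[j]) - 1 for j in frontier)

        edges = [(1, 2), (2, 3), (3, 4), (4, 5), (2, 5), (5, 6)]
        adj = defaultdict(set)
        for u, v in edges:
            adj[u].add(v)
            adj[v].add(u)
        print({i: collective_influence(adj, i) for i in sorted(adj)})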

  16. Risk-based optimization of pipe inspections in large underground networks with imprecise information

    International Nuclear Information System (INIS)

    Mancuso, A.; Compare, M.; Salo, A.; Zio, E.; Laakso, T.

    2016-01-01

    In this paper, we present a novel risk-based methodology for optimizing the inspections of large underground infrastructure networks in the presence of incomplete information about the network features and parameters. The methodology employs Multi Attribute Value Theory to assess the risk of each pipe in the network, whereafter the optimal inspection campaign is built with Portfolio Decision Analysis (PDA). Specifically, Robust Portfolio Modeling (RPM) is employed to identify Pareto-optimal portfolios of pipe inspections. The proposed methodology is illustrated by reporting a real case study on the large-scale maintenance optimization of the sewerage network in Espoo, Finland. - Highlights: • Risk-based approach to optimize pipe inspections on large underground networks. • Reasonable computational effort to select efficient inspection portfolios. • Possibility to accommodate imprecise expert information. • Feasibility of the approach shown by Espoo water system case study.
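
    For illustration, a minimal sketch of the two stages described above: an additive multi-attribute value score per pipe, followed by a budget-constrained selection of inspections. Attribute names, weights and costs are hypothetical, and the greedy selection merely stands in for the paper's Robust Portfolio Modeling, which additionally handles imprecise (interval-valued) information.

        # Additive MAVT risk score per pipe, then greedy risk-per-cost selection
        # under an inspection budget. All numbers are hypothetical.
        pipes = {  # attribute scores already normalised to [0, 1], plus inspection cost
            "P1": {"age": 0.9, "material": 0.6, "blockages": 0.8, "cost": 3.0},
            "P2": {"age": 0.4, "material": 0.9, "blockages": 0.2, "cost": 1.0},
            "P3": {"age": 0.7, "material": 0.3, "blockages": 0.9, "cost": 2.0},
        }
        weights = {"age": 0.5, "material": 0.2, "blockages": 0.3}

        def risk(p):
            # additive multi-attribute value of pipe p
            return sum(weights[a] * pipes[p][a] for a in weights)

        def greedy_portfolio(budget):
            chosen, spent = [], 0.0
            for p in sorted(pipes, key=lambda q: risk(q) / pipes[q]["cost"], reverse=True):
                if spent + pipes[p]["cost"] <= budget:
                    chosen.append(p)
                    spent += pipes[p]["cost"]
            return chosen

        print(greedy_portfolio(budget=4.0))  # inspect highest risk-per-cost pipes first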

  17. Upscaling of Large-Scale Transport in Spatially Heterogeneous Porous Media Using Wavelet Transformation

    Science.gov (United States)

    Moslehi, M.; de Barros, F.; Ebrahimi, F.; Sahimi, M.

    2015-12-01

    Modeling flow and solute transport in large-scale heterogeneous porous media involves substantial computational burdens. A common approach to alleviating this complexity is to use upscaling methods, which generate upscaled models of reduced complexity while attempting to preserve hydrogeological properties comparable to those of the original fine-scale model. We use Wavelet Transformations (WT) of the spatial distribution of the aquifer's properties to upscale hydrogeological models and, consequently, transport processes. In particular, we apply the technique to a porous formation with broadly distributed and correlated transmissivity to verify the performance of the WT. First, the transmissivity fields are coarsened using the WT in such a way that the high-transmissivity zones, in which the more important information is embedded, mostly remain unchanged, while the low-transmissivity zones are averaged out, since they contain less information about the hydrogeological formation. Next, flow and non-reactive transport are simulated in both the fine-scale and upscaled models to predict the concentration breakthrough curves at a control location and the large-scale spreading of the plume around its centroid. The results reveal that the WT of the fields generates non-uniform grids with, on average, 2.1% of the number of grid blocks in the original fine-scale models, which leads to a significant reduction in computational costs. We show that the upscaled model obtained through the WT accurately reconstructs the concentration breakthrough curves and the spreading of the plume at different times. Furthermore, we investigate the impacts of the Hurst coefficient, the size of the flow domain, and orders-of-magnitude differences in transmissivity values on the results. We observe that as the heterogeneity and the size of the domain increase, better agreement between the fine-scale and upscaled models can be achieved. Having this framework at hand aids
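
    For illustration, a much-simplified sketch of the idea: merge a 2x2 block of the transmissivity field into its average only when the block's wavelet detail (here a simple proxy for the Haar detail coefficients) is small, so that high-transmissivity, information-rich zones keep their fine resolution. The threshold and the synthetic field are hypothetical; the study's scheme is more elaborate.

        # Adaptive coarsening of a transmissivity field: low-detail 2x2 blocks are
        # replaced by their mean; high-detail (information-rich) blocks are kept.
        import numpy as np

        def haar_coarsen(T, threshold):
            out = T.astype(float).copy()
            for i in range(0, T.shape[0] - 1, 2):
                for j in range(0, T.shape[1] - 1, 2):
                    block = out[i:i + 2, j:j + 2]
                    detail = np.abs(block - block.mean()).max()  # proxy for Haar details
                    if detail < threshold:
                        out[i:i + 2, j:j + 2] = block.mean()     # average out the block
            return out

        rng = np.random.default_rng(0)
        T = 10 ** rng.normal(size=(8, 8))   # broadly distributed synthetic transmissivity
        coarse = haar_coarsen(T, threshold=0.5)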

  18. Scalable Analysis Methods and In Situ Infrastructure for Extreme Scale Knowledge Discovery

    Energy Technology Data Exchange (ETDEWEB)

    Bethel, Wes [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-07-24

    The primary challenge motivating this team’s work is the widening gap between the ability to compute information and the ability to store it for subsequent analysis. This gap adversely impacts science code teams, who are able to perform analysis on only a small fraction of the data they compute, resulting in the very real likelihood of lost or missed science when results are computed but not analyzed. Our approach is to perform as much analysis or visualization processing on data as possible while it is still resident in memory, an approach known as in situ processing. The idea of in situ processing was not new at the start of this effort in 2014, but efforts in that space were largely ad hoc, and there was no concerted effort within the research community aimed at fostering production-quality software tools suitable for use by DOE science projects. Our objective was, in large part, to produce and enable the use of production-quality in situ methods and infrastructure, at scale, on DOE HPC facilities, though we expected to have impact beyond DOE due to the widespread nature of the challenges, which affect virtually all large-scale computational science efforts. To achieve that objective, we assembled a unique team of researchers from DOE national laboratories, academia, and industry, engaged in software technology R&D, and formed close partnerships with DOE science code teams to produce software technologies that were shown to run effectively at scale on DOE HPC platforms.

  19. Open Geosciences Knowledge: foster Information Preparedness in a Disaster Resilience Perspective

    Science.gov (United States)

    Rapisardi, Elena; Di Franco, Sabina

    2014-05-01

    Information in science communication is the capacity to transfer scientific knowledge so that the content of the communication can be understood. In particular, as stated in many documents and programmes (e.g., UNISDR), clear and correct information on hazards and emergency matters is crucial, for practitioners and the population alike, to cope with disasters and to allow collaboration in taking the best decisions. Open Knowledge is defined as a set of criteria and conditions, related to production, use and distribution, that include principles for better access to knowledge. Knowledge, however, is a pillar for understanding the world and for guiding human actions and interactions with the environment. In a wider perspective, free and open access to knowledge also involves an ethical topic, strictly connected to acting in terms of interactions and responsibilities; in other words, to the purpose of knowledge. Focusing on "data" as a merely technical issue can displace ethics and responsibility as external issues, emphasising only the technical value of data. In this perspective, opening up to open knowledge would not only solve problems related to téchne, such as functionality and efficiency, but should also foster the sharing and collaboration expressed through ethics (praxis). The web era frees information; the internet "information deluge" thus leads back to the idea of the "encyclopedia" (and of Wikipedia) as a tool to organize, control and filter knowledge, enabling communication, knowledge transfer, education, and sense-making. Social media and crowdsourcing hold considerable promise for supporting collaborative and innovative ways of reshaping information production and distribution. However, the debate now faces an important concern related to true/false issues, focusing on validation and liability. Without any doubt, the massive use of social media during recent major and minor disasters has highlighted a huge need for clear, correct

  20. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    OpenAIRE

    Qiang Liu; Yi Qin; Guodong Li

    2018-01-01

    Computing speed is a significant issue in large-scale flood simulation for real-time response in disaster prevention and mitigation. Even today, most large-scale flood simulations are run on supercomputers due to the massive amounts of data and computation involved. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal...
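
    For illustration, a one-dimensional, structured-grid sketch of a Godunov-type finite volume update for the shallow water equations with a Rusanov flux; the model in the record is two-dimensional and unstructured, so this is only a toy version of the scheme, with hypothetical initial data.

        # One Godunov-type finite volume step for the 1D shallow water equations,
        # U = (h, hu), using the Rusanov (local Lax-Friedrichs) numerical flux.
        import numpy as np

        g = 9.81

        def flux(h, hu):
            u = hu / np.maximum(h, 1e-12)
            return np.array([hu, hu * u + 0.5 * g * h * h])

        def rusanov(UL, UR):
            # wave speed estimate: |u| + sqrt(g h) on each side
            s = max(abs(UL[1]) / max(UL[0], 1e-12) + np.sqrt(g * UL[0]),
                    abs(UR[1]) / max(UR[0], 1e-12) + np.sqrt(g * UR[0]))
            return 0.5 * (flux(*UL) + flux(*UR)) - 0.5 * s * (np.asarray(UR) - np.asarray(UL))

        def step(h, hu, dx, dt):
            U = np.vstack([h, hu])
            F = np.array([rusanov(U[:, i], U[:, i + 1]) for i in range(U.shape[1] - 1)]).T
            U[:, 1:-1] -= dt / dx * (F[:, 1:] - F[:, :-1])   # interior cell update
            return U[0], U[1]

        h = np.where(np.arange(100) < 50, 2.0, 1.0)  # dam-break initial condition
        hu = np.zeros(100)
        h, hu = step(h, hu, dx=1.0, dt=0.1)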