WorldWideScience

Sample records for minimal computer knowledge

  1. Towards minimal resources of measurement-based quantum computation

    International Nuclear Information System (INIS)

    Perdrix, Simon

    2007-01-01

    We improve the upper bound on the minimal resources required for measurement-only quantum computation (M A Nielsen 2003 Phys. Lett. A 308 96-100; D W Leung 2004 Int. J. Quantum Inform. 2 33; S Perdrix 2005 Int. J. Quantum Inform. 3 219-23). Minimizing the resources required for this model is a key issue for the experimental realization of a quantum computer based on projective measurements. This new upper bound also allows one to reply in the negative to the open question presented by Perdrix (2004 Proc. Quantum Communication Measurement and Computing) about the existence of a trade-off between observable and ancillary qubits in measurement-only QC.

  2. Knowledge Generation as Natural Computation

    Directory of Open Access Journals (Sweden)

    Gordana Dodig-Crnkovic

    2008-04-01

    Knowledge generation can be naturalized by adopting a computational model of cognition and an evolutionary approach. In this framework, knowledge is seen as a result of the structuring of input data (data → information → knowledge) by an interactive computational process going on in the agent during the adaptive interplay with the environment, which clearly presents a developmental advantage by increasing the agent's ability to cope with the dynamics of a situation. This paper addresses the mechanism of knowledge generation, a process that may be modeled as natural computation in order to be better understood and improved.

  3. emMAW: computing minimal absent words in external memory.

    Science.gov (United States)

    Héliou, Alice; Pissis, Solon P; Puglisi, Simon J

    2017-09-01

    The biological significance of minimal absent words has been investigated in genomes of organisms from all domains of life. For instance, three minimal absent words of the human genome were found in Ebola virus genomes. There exists an O(n)-time and O(n)-space algorithm for computing all minimal absent words of a sequence of length n on a fixed-size alphabet, based on suffix arrays. A standard implementation of this algorithm, when applied to a large sequence of length n, requires more than 20n bytes of RAM. Such memory requirements are a significant hurdle to the computation of minimal absent words in large datasets. We present emMAW, the first external-memory algorithm for computing minimal absent words. A free open-source implementation of our algorithm is made available. This allows for computation of minimal absent words on far bigger datasets than was previously possible. Our implementation requires less than 3 h on a standard workstation to process the full human genome when as little as 1 GB of RAM is made available. We stress that our implementation, despite making use of external memory, is fast; indeed, even on relatively smaller datasets, when enough RAM is available to hold all necessary data structures, it is less than two times slower than state-of-the-art internal-memory implementations. Availability: https://github.com/solonas13/maw (free software under the terms of the GNU GPL). Contact: alice.heliou@lix.polytechnique.fr or solon.pissis@kcl.ac.uk. Supplementary data are available at Bioinformatics online.
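
    The definition at work here is easy to state in code: a minimal absent word is a word that does not occur in the sequence although both its longest proper prefix and its longest proper suffix do. The naive, in-memory sketch below illustrates the definition only (the alphabet and the max_len cutoff are assumptions for a toy example); it has none of the suffix-array or external-memory machinery of emMAW.

      from itertools import product

      def minimal_absent_words(s, alphabet="ACGT", max_len=4):
          # All factors (substrings) of s; quadratic space, toy inputs only.
          factors = {s[i:j] for i in range(len(s)) for j in range(i + 1, len(s) + 1)}
          maws = []
          for k in range(2, max_len + 1):
              for word in map("".join, product(alphabet, repeat=k)):
                  # MAW: the word is absent, but its longest proper prefix
                  # and longest proper suffix both occur in s.
                  if word not in factors and word[:-1] in factors and word[1:] in factors:
                      maws.append(word)
          return maws

      print(minimal_absent_words("ACGTACGT"))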

  4. Computers and the Environment: Minimizing the Carbon Footprint

    Science.gov (United States)

    Kaestner, Rich

    2009-01-01

    Computers can be good and bad for the environment; one can maximize the good and minimize the bad. When dealing with environmental issues, it's difficult to ignore the computing infrastructure. With an operations carbon footprint equal to the airline industry's, computer energy use is only part of the problem; everyone is also dealing with the use…

  5. Cloud Computing-An Ultimate Technique to Minimize Computing cost for Developing Countries

    OpenAIRE

    Narendra Kumar; Shikha Jain

    2012-01-01

    The presented paper deals with how remotely managed computing and IT resources can be beneficial in developing countries such as India and other countries of the Asian subcontinent. This paper not only describes the architectures and functionalities of cloud computing but also points strongly to the current demand for cloud computing to achieve organizational and personal IT support at very minimal cost and with high flexibility. The power of the cloud can be used to reduce the cost of IT - r...

  6. Knowledge-based computer security advisor

    International Nuclear Information System (INIS)

    Hunteman, W.J.; Squire, M.B.

    1991-01-01

    The rapid expansion of computer security information and technology has included little support to help the security officer identify the safeguards needed to comply with a policy and to secure a computing system. This paper reports that Los Alamos is developing a knowledge-based computer security system to provide expert knowledge to the security officer. This system includes a model for expressing the complex requirements in computer security policy statements. The model is part of an expert system that allows a security officer to describe a computer system and then determine compliance with the policy. The model contains a generic representation that captures network relationships among the policy concepts to support inferencing based on information represented in the generic policy description.

  7. Sharing experience and knowledge with wearable computers

    OpenAIRE

    Nilsson, Marcus; Drugge, Mikael; Parnes, Peter

    2004-01-01

    Wearable computers have mostly been studied when used in isolation. But a wearable computer with an Internet connection is a good tool for communication and for sharing knowledge and experience with other people. The unobtrusiveness of this type of equipment makes it easy to communicate in most types of locations and contexts. The wearable computer makes it easy to be a mediator of other people's knowledge and to become a knowledgeable user. This paper describes the experience gained from testing...

  8. Assessing Computer Knowledge among College Students.

    Science.gov (United States)

    Parrish, Allen; And Others

    This paper reports on a study involving the administration of two examinations designed to evaluate student knowledge in several areas of computing. The tests were given both to computer science majors and to students from other majors enrolled in computer science classes. They sought to discover whether computer science majors demonstrated…

  9. Explicit knowledge programming for computer games

    NARCIS (Netherlands)

    Witzel, A.; Zvesper, J.A.; Kennerly, E.; Darken, C.; Mateas, M.

    2008-01-01

    The main aim of this paper is to raise awareness of higher-order knowledge (knowledge about someone else's knowledge) as an issue for computer game AI. We argue that a number of existing game genres, especially those involving social interaction, are natural fields of application for an approach we

  10. Free energy minimization to predict RNA secondary structures and computational RNA design.

    Science.gov (United States)

    Churkin, Alexander; Weinbrand, Lina; Barash, Danny

    2015-01-01

    Determining the RNA secondary structure from sequence data by computational predictions is a long-standing problem. Its solution has been approached in two distinctive ways. If a multiple sequence alignment of a collection of homologous sequences is available, the comparative method uses phylogeny to determine conserved base pairs that are more likely to form as a result of billions of years of evolution than by chance. In the case of single sequences, recursive algorithms that compute free energy structures by using empirically derived energy parameters have been developed. This latter approach of RNA folding prediction by energy minimization is widely used to predict RNA secondary structure from sequence. For a significant number of RNA molecules, the secondary structure of the RNA molecule is indicative of its function, and its computational prediction by minimizing its free energy is important for its functional analysis. A general method for free energy minimization to predict RNA secondary structures is dynamic programming, although other optimization methods have been developed as well, along with empirically derived energy parameters. In this chapter, we introduce and illustrate by examples the approach of free energy minimization to predict RNA secondary structures.
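
    As a concrete illustration of the dynamic-programming idea, the sketch below implements the classic Nussinov-style recursion, which maximizes the number of complementary base pairs rather than minimizing an empirically parameterized free energy; production folders of the kind discussed in the chapter use nearest-neighbor energy parameters instead.

      # Nussinov-style dynamic program: maximizes complementary base pairs
      # as a simplified stand-in for free energy minimization.
      PAIRS = {("A","U"),("U","A"),("G","C"),("C","G"),("G","U"),("U","G")}

      def nussinov(seq, min_loop=3):
          n = len(seq)
          dp = [[0] * n for _ in range(n)]
          for span in range(min_loop + 1, n):
              for i in range(n - span):
                  j = i + span
                  best = dp[i][j - 1]                # j left unpaired
                  for k in range(i, j - min_loop):   # j paired with k
                      if (seq[k], seq[j]) in PAIRS:
                          left = dp[i][k - 1] if k > i else 0
                          best = max(best, left + 1 + dp[k + 1][j - 1])
                  dp[i][j] = best
          return dp[0][n - 1]

      print(nussinov("GGGAAAUCC"))  # maximum number of base pairs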

  11. Semantic computing and language knowledge bases

    Science.gov (United States)

    Wang, Lei; Wang, Houfeng; Yu, Shiwen

    2017-09-01

    With the advent of the next-generation Web, the Semantic Web, semantic computing has been drawing more and more attention both within the research community and in industry. A lot of research has been conducted on the theory and methodology of the subject, and potential applications have also been investigated and proposed in many fields. The progress of semantic computing made so far cannot be detached from its supporting pivot - language resources, for instance, language knowledge bases. This paper proposes three perspectives on semantic computing from a macro view and describes the current state of affairs in the construction of language knowledge bases and the related research and applications that have been carried out on the basis of these resources, via a case study in the Institute of Computational Linguistics at Peking University.

  12. Trends in life science grid: from computing grid to knowledge grid

    Directory of Open Access Journals (Sweden)

    Konagaya Akihiko

    2006-12-01

    Background: Grid computing has great potential to become a standard cyberinfrastructure for the life sciences, which often require high-performance computing and large data handling that exceed the computing capacity of a single institution. Results: This survey reviews the latest grid technologies from the viewpoints of computing grid, data grid and knowledge grid. Computing grid technologies have matured enough to solve high-throughput real-world life scientific problems. Data grid technologies are strong candidates for realizing a "resourceome" for bioinformatics. Knowledge grids should be designed not only for sharing explicit knowledge on computers but also for community formation to share tacit knowledge within a community. Conclusion: Extending the concept of grid from computing grid to knowledge grid, it is possible to use a grid not only as sharable computing resources, but also as the time and place in which people work together, create knowledge, and share knowledge and experiences in a community.

  13. Rough – Granular Computing knowledge discovery models

    Directory of Open Access Journals (Sweden)

    Mohammed M. Eissa

    2016-11-01

    The medical domain has become one of the most important areas of research, owing to the richness of the huge amounts of medical information about the symptoms of diseases and how to distinguish between them to diagnose a disease correctly. Knowledge discovery models play a vital role in the refinement and mining of medical indicators to help medical experts settle on treatment decisions. This paper introduces four hybrid Rough - Granular Computing knowledge discovery models based on Rough Sets Theory, Artificial Neural Networks, Genetic Algorithm and Rough Mereology Theory. A comparative analysis of various knowledge discovery models that use different knowledge discovery techniques for data pre-processing, reduction, and data mining supports medical experts in extracting the main medical indicators, reducing misdiagnosis rates and improving decision-making for medical diagnosis and treatment. The proposed models utilized two medical datasets: a Coronary Heart Disease dataset and a Hepatitis C Virus dataset. The main purpose of this paper was to explore and evaluate the proposed models, based on the Granular Computing methodology, for knowledge extraction according to different evaluation criteria for the classification of medical datasets. Another purpose is to enhance the frame of KDD processes for supervised learning using the Granular Computing methodology.

  14. Minimizing the Free Energy: A Computer Method for Teaching Chemical Equilibrium Concepts.

    Science.gov (United States)

    Heald, Emerson F.

    1978-01-01

    Presents a computer method for teaching chemical equilibrium concepts using material balance conditions and the minimization of the free energy. Method for the calculation of chemical equilibrium, the computer program used to solve equilibrium problems and applications of the method are also included. (HM)

  15. Computational Evolutionary Methodology for Knowledge Discovery and Forecasting in Epidemiology and Medicine

    International Nuclear Information System (INIS)

    Rao, Dhananjai M.; Chernyakhovsky, Alexander; Rao, Victoria

    2008-01-01

    Humanity is facing an increasing number of highly virulent and communicable diseases such as avian influenza. Researchers believe that avian influenza has potential to evolve into one of the deadliest pandemics. Combating these diseases requires in-depth knowledge of their epidemiology. An effective methodology for discovering epidemiological knowledge is to utilize a descriptive, evolutionary, ecological model and use bio-simulations to study and analyze it. These types of bio-simulations fall under the category of computational evolutionary methods because the individual entities participating in the simulation are permitted to evolve in a natural manner by reacting to changes in the simulated ecosystem. This work describes the application of the aforementioned methodology to discover epidemiological knowledge about avian influenza using a novel eco-modeling and bio-simulation environment called SEARUMS. The mathematical principles underlying SEARUMS, its design, and the procedure for using SEARUMS are discussed. The bio-simulations and multi-faceted case studies conducted using SEARUMS elucidate its ability to pinpoint timelines, epicenters, and socio-economic impacts of avian influenza. This knowledge is invaluable for proactive deployment of countermeasures in order to minimize negative socioeconomic impacts, combat the disease, and avert a pandemic.

  16. Minimal models of multidimensional computations.

    Directory of Open Access Journals (Sweden)

    Jeffrey D Fitzgerald

    2011-03-01

    The multidimensional computations performed by many biological systems are often characterized with limited information about the correlations between inputs and outputs. Given this limitation, our approach is to construct the maximum noise entropy response function of the system, leading to a closed-form and minimally biased model consistent with a given set of constraints on the input/output moments; the result is equivalent to conditional random field models from machine learning. For systems with binary outputs, such as neurons encoding sensory stimuli, the maximum noise entropy models are logistic functions whose arguments depend on the constraints. A constraint on the average output turns the binary maximum noise entropy models into minimum mutual information models, allowing for the calculation of the information content of the constraints and an information theoretic characterization of the system's computations. We use this approach to analyze the nonlinear input/output functions in macaque retina and thalamus; although these systems have been previously shown to be responsive to two input dimensions, the functional form of the response function in this reduced space had not been unambiguously identified. A second order model based on the logistic function is found to be both necessary and sufficient to accurately describe the neural responses to naturalistic stimuli, accounting for an average of 93% of the mutual information with a small number of parameters. Thus, despite the fact that the stimulus is highly non-Gaussian, the vast majority of the information in the neural responses is related to first and second order correlations. Our results suggest a principled and unbiased way to model multidimensional computations and determine the statistics of the inputs that are being encoded in the outputs.
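
    For binary outputs with first- and second-order moment constraints, the maximum noise entropy model described above is a logistic function of a quadratic form in the stimulus. A minimal sketch follows; the coefficients are made up for illustration, not fitted to any data.

      import numpy as np

      def p_spike(x, a, h, J):
          # Second-order maximum-noise-entropy model:
          # P(spike | x) = 1 / (1 + exp(-(a + h.x + x.J.x))), with J symmetric.
          return 1.0 / (1.0 + np.exp(-(a + h @ x + x @ J @ x)))

      a, h = -1.0, np.array([1.5, -0.5])        # illustrative parameters
      J = np.array([[0.8, 0.2], [0.2, -0.4]])   # illustrative second-order terms
      print(p_spike(np.array([0.3, -1.2]), a, h, J))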

  17. Sociopathic Knowledge Bases: Correct Knowledge Can Be Harmful Even Given Unlimited Computation

    Science.gov (United States)

    1989-08-01

    Sociopathic Knowledge Bases: Correct Knowledge Can Be Harmful Even Given Unlimited Computation, by David C. Wilkins and Yong... Probabilistic rules are shown to be sociopathic, and so this problem is very widespread. Sociopathicity has important consequences for rule induction.

  18. Knowledge of the educational implications of computer games by ...

    African Journals Online (AJOL)

    This paper reports an ex post facto study carried out to find out from 153 Computer Education students (from Colleges of Education) their knowledge of computer games. The researcher specifically set out to investigate whether those Computer Education students thought pupils could actually learn from computer games.

  19. Knowledge-based computer systems for radiotherapy planning.

    Science.gov (United States)

    Kalet, I J; Paluszynski, W

    1990-08-01

    Radiation therapy is one of the first areas of clinical medicine to utilize computers in support of routine clinical decision making. The role of the computer has evolved from simple dose calculations to elaborate interactive graphic three-dimensional simulations. These simulations can combine external irradiation from megavoltage photons, electrons, and particle beams with interstitial and intracavitary sources. With the flexibility and power of modern radiotherapy equipment and the ability of computer programs to simulate anything the machinery can do, we now face the challenge of using this capability to design more effective radiation treatments. How can we manage the increased complexity of sophisticated treatment planning? A promising approach will be to use artificial intelligence techniques to systematize our present knowledge about the design of treatment plans, and to provide a framework for developing new treatment strategies. Far from replacing the physician, physicist, or dosimetrist, artificial intelligence-based software tools can assist the treatment planning team in producing more powerful and effective treatment plans. Research in progress using knowledge-based (AI) programming in treatment planning has already indicated the usefulness of such concepts as rule-based reasoning, hierarchical organization of knowledge, and reasoning from prototypes. Problems to be solved include how to handle continuously varying parameters and how to evaluate plans in order to direct improvements.

  20. Gender Factor in Computer Anxiety, Knowledge and Utilization ...

    African Journals Online (AJOL)

    The study investigated the influence of computer anxiety and knowledge on computer utilization among senior secondary school students in Ogun State, Nigeria. A sample of four hundred students randomly selected from twenty secondary schools participated in the study. An ex-post facto research design was adopted ...

  1. Knowledge of computer among healthcare professionals of India: a key toward e-health.

    Science.gov (United States)

    Gour, Neeraj; Srivastava, Dhiraj

    2010-11-01

    Information technology has radically changed the way that many people work and think. Over the years, technology has reached a new acme, and it is no longer confined to developed countries. Developing countries such as India have kept pace with the world in modern technology. Healthcare professionals can no longer ignore the application of information technology to healthcare because they are key to e-health. This study was conducted to enlighten the perspective and implications of computers among healthcare professionals, with the objective of assessing the knowledge, use, and need of computers among healthcare professionals. A cross-sectional study of 240 healthcare professionals, including doctors, nurses, lab technicians, and pharmacists, was conducted. Each participant was interviewed using a pretested, semistructured format. Of 240 healthcare professionals, 57.91% were knowledgeable about computers; of them, 22.08% had extensive knowledge and 35.83% had partial knowledge. Computer knowledge was greatest in the age group 20-25 years (extensive knowledge: 43.33%; partial knowledge: 46.66%). Of 99 males, 21.21% were found to have good knowledge and 42.42% had partial knowledge. A majority of doctors and nurses used the computer for study purposes; the remaining healthcare professionals used it mainly for entertainment, the Internet, and e-mail. A large majority of all healthcare professionals (95.41%) requested computer training, which according to them would help to brighten their professional future as well as enhance their knowledge of computers.

  2. The BioIntelligence Framework: a new computational platform for biomedical knowledge computing.

    Science.gov (United States)

    Farley, Toni; Kiefer, Jeff; Lee, Preston; Von Hoff, Daniel; Trent, Jeffrey M; Colbourn, Charles; Mousses, Spyro

    2013-01-01

    Breakthroughs in molecular profiling technologies are enabling a new data-intensive approach to biomedical research, with the potential to revolutionize how we study, manage, and treat complex diseases. The next great challenge for clinical applications of these innovations will be to create scalable computational solutions for intelligently linking complex biomedical patient data to clinically actionable knowledge. Traditional database management systems (DBMS) are not well suited to representing complex syntactic and semantic relationships in unstructured biomedical information, introducing barriers to realizing such solutions. We propose a scalable computational framework for addressing this need, which leverages a hypergraph-based data model and query language that may be better suited for representing complex multi-lateral, multi-scalar, and multi-dimensional relationships. We also discuss how this framework can be used to create rapid learning knowledge base systems to intelligently capture and relate complex patient data to biomedical knowledge in order to automate the recovery of clinically actionable information.
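
    To make the data-model argument concrete: a hyperedge can tie an arbitrary number of entities together in a single relationship, which binary graph edges and relational rows represent only awkwardly. The toy class below illustrates the idea only; the names and API are hypothetical and not taken from the BioIntelligence Framework.

      from collections import defaultdict

      class Hypergraph:
          def __init__(self):
              self.edges = {}                     # edge id -> set of member nodes
              self.incidence = defaultdict(set)   # node -> ids of edges containing it

          def add_edge(self, eid, nodes):
              self.edges[eid] = set(nodes)
              for n in nodes:
                  self.incidence[n].add(eid)

          def related(self, node):
              # Every entity co-occurring with `node` in some hyperedge.
              return {m for e in self.incidence[node] for m in self.edges[e]} - {node}

      hg = Hypergraph()
      # One hyperedge links patient, variant, drug, and publication at once
      # (all identifiers below are hypothetical).
      hg.add_edge("evidence-1", ["patient-42", "BRAF V600E", "vemurafenib", "PMID:21639808"])
      print(hg.related("patient-42"))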

  3. The HEP Software and Computing Knowledge Base

    Science.gov (United States)

    Wenaus, T.

    2017-10-01

    HEP software today is a rich and diverse domain in itself and exists within the mushrooming world of open source software. As HEP software developers and users we can be more productive and effective if our work and our choices are informed by a good knowledge of what others in our community have created or found useful. The HEP Software and Computing Knowledge Base, hepsoftware.org, was created to facilitate this by serving as a collection point and information exchange on software projects and products, services, training, computing facilities, and relating them to the projects, experiments, organizations and science domains that offer them or use them. It was created as a contribution to the HEP Software Foundation, for which a HEP S&C knowledge base was a much requested early deliverable. This contribution will motivate and describe the system, what it offers, its content and contributions both existing and needed, and its implementation (node.js based web service and javascript client app) which has emphasized ease of use for both users and contributors.

  4. Implementation of generalized measurements with minimal disturbance on a quantum computer

    International Nuclear Information System (INIS)

    Decker, T.; Grassl, M.

    2006-01-01

    We consider the problem of efficiently implementing a generalized measurement on a quantum computer. Using methods from representation theory, we exploit symmetries of the states we want to identify and, respectively, symmetries of the measurement operators. In order to allow the information to be extracted sequentially, the disturbance of the quantum state due to the measurement should be minimal.

  5. Computer Experiences, Self-Efficacy and Knowledge of Students Enrolled in Introductory University Agriculture Courses.

    Science.gov (United States)

    Johnson, Donald M.; Ferguson, James A.; Lester, Melissa L.

    1999-01-01

    Of 175 freshmen agriculture students, 74% had prior computer courses, 62% owned computers. The number of computer topics studied predicted both computer self-efficacy and computer knowledge. A substantial positive correlation was found between self-efficacy and computer knowledge. (SK)

  6. COGMIR: A computer model for knowledge integration

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Z.X.

    1988-01-01

    This dissertation explores some aspects of knowledge integration, namely, the accumulation of scientific knowledge and the performance of analogical reasoning on the acquired knowledge. Knowledge to be integrated is conveyed by paragraph-like pieces referred to as documents. By incorporating some results from cognitive science, the Deutsch-Kraft model of information retrieval is extended to a model for knowledge engineering, which integrates acquired knowledge and performs intelligent retrieval. The resulting computer model is termed COGMIR, which stands for a COGnitive Model for Intelligent Retrieval. A scheme, named query invoked memory reorganization, is used in COGMIR for knowledge integration. Unlike some other schemes, which realize knowledge integration through subjective understanding by representing new knowledge in terms of existing knowledge, the proposed scheme suggests recording, at storage time, only the possible connections of knowledge acquired from different documents. The actual binding of the knowledge acquired from different documents is deferred to query time. There is only one way to store knowledge but numerous ways to utilize it. Each document can be represented as a whole as well as by its meaning. In addition, since facts are constructed from the documents, document retrieval and fact retrieval are treated in a unified way. When the requested knowledge is not available, query invoked memory reorganization can generate suggestions based on available knowledge through analogical reasoning. This is done by revising the algorithms developed for document retrieval and fact retrieval, and by incorporating Gentner's structure mapping theory. Analogical reasoning is treated as a natural extension of intelligent retrieval, so that two previously separate research areas are combined. A case study is provided. All the components are implemented as list structures similar to relational databases.

  7. Towards Modeling False Memory With Computational Knowledge Bases.

    Science.gov (United States)

    Li, Justin; Kohanyi, Emma

    2017-01-01

    One challenge to creating realistic cognitive models of memory is the inability to account for the vast common-sense knowledge of human participants. Large computational knowledge bases such as WordNet and DBpedia may offer a solution to this problem but may pose other challenges. This paper explores some of these difficulties through a semantic network spreading activation model of the Deese-Roediger-McDermott false memory task. In three experiments, we show that these knowledge bases only capture a subset of human associations, while irrelevant information introduces noise and makes efficient modeling difficult. We conclude that the contents of these knowledge bases must be augmented and, more important, that the algorithms must be refined and optimized, before large knowledge bases can be widely used for cognitive modeling.
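
    A minimal sketch of the semantic-network spreading-activation mechanism the model relies on, assuming a hand-made toy association network and decay parameter (the paper's networks come from WordNet/DBpedia and are far larger): activation spreading from studied DRM list words converges on the unstudied critical lure.

      def spread(graph, activation, decay=0.5, steps=3):
          # Each step, every node passes decay * activation, split over its neighbors.
          for _ in range(steps):
              new = dict.fromkeys(activation, 0.0)
              for node, act in activation.items():
                  nbrs = graph.get(node, [])
                  for n in nbrs:
                      new[n] = new.get(n, 0.0) + decay * act / len(nbrs)
              activation = new
          return activation

      graph = {"bed": ["sleep"], "rest": ["sleep"], "dream": ["sleep"],
               "tired": ["sleep"], "sleep": ["bed", "dream"]}
      act = {w: 1.0 for w in ("bed", "rest", "dream", "tired")}
      act["sleep"] = 0.0                  # the critical lure is never studied
      print(spread(graph, act)["sleep"])  # yet it ends up activated -> false memory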

  8. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 19: Computer and information technology and aerospace knowledge diffusion

    Science.gov (United States)

    Pinelli, Thomas E.; Kennedy, John M.; Barclay, Rebecca O.; Bishop, Ann P.

    1992-01-01

    To remain a world leader in aerospace, the US must improve and maintain the professional competency of its engineers and scientists, increase the research and development (R&D) knowledge base, improve productivity, and maximize the integration of recent technological developments into the R&D process. How well these objectives are met, and at what cost, depends on a variety of factors, but largely on the ability of US aerospace engineers and scientists to acquire and process the results of federally funded R&D. The Federal Government's commitment to high speed computing and networking systems presupposes that computer and information technology will play a major role in the aerospace knowledge diffusion process. However, we know little about information technology needs, uses, and problems within the aerospace knowledge diffusion process. The use of computer and information technology by US aerospace engineers and scientists in academia, government, and industry is reported.

  9. 8th International Conference on Knowledge Management in Organizations : Social and Big Data Computing for Knowledge Management

    CERN Document Server

    Wang, Leon; Rodríguez, Juan; Yang, Hsin-Chang; Ting, I-Hsien

    2014-01-01

    The proceedings from the  eighth KMO conference represent the findings of this international meeting which brought together researchers and developers from industry and the academic world to report on the latest scientific and technical advances on knowledge management in organizations. This conference provided an international forum for authors to present and discuss research focused on the role of knowledge management for innovative services in industries, to shed light on recent advances in social and big data computing for KM as well as to identify future directions for researching the role of knowledge management in service innovation and how cloud computing can be used to address many of the issues currently facing KM in academia and industrial sectors.

  10. Tacit knowledge in action: basic notions of knowledge sharing in computer supported work environments

    OpenAIRE

    Mackenzie Owen, John

    2001-01-01

    An important characteristic of most computer-supported work environments is the distribution of work over individuals or teams in different locations. This leads to what we nowadays call 'virtual' environments. In these environments, communication between actors is to a large degree mediated, i.e. established through communications media (telephone, fax, computer networks) rather than in a face-to-face way. Unfortunately, mediated communication limits the effectiveness of knowledge exchange in virt...

  11. Minimal ancilla mediated quantum computation

    International Nuclear Information System (INIS)

    Proctor, Timothy J.; Kendon, Viv

    2014-01-01

    Schemes of universal quantum computation in which the interactions between the computational elements, in a computational register, are mediated by some ancillary system are of interest due to their relevance to the physical implementation of a quantum computer. Furthermore, reducing the level of control required over both the ancillary and register systems has the potential to simplify any experimental implementation. In this paper we consider how to minimise the control needed to implement universal quantum computation in an ancilla-mediated fashion. Considering computational schemes which require no measurements and hence evolve by unitary dynamics for the global system, we show that when employing an ancilla qubit there are certain fixed-time ancilla-register interactions which, along with ancilla initialisation in the computational basis, are universal for quantum computation with no additional control of either the ancilla or the register. We develop two distinct models based on locally inequivalent interactions and we then discuss the relationship between these unitary models and the measurement-based ancilla-mediated models known as ancilla-driven quantum computation. (orig.)

  12. PERKAM: Personalized Knowledge Awareness Map for Computer Supported Ubiquitous Learning

    Science.gov (United States)

    El-Bishouty, Moushir M.; Ogata, Hiroaki; Yano, Yoneo

    2007-01-01

    This paper introduces a ubiquitous computing environment to support learners while doing tasks; this environment is called PERKAM (PERsonalized Knowledge Awareness Map). PERKAM allows the learners to share knowledge, interact, collaborate, and exchange individual experiences. It utilizes ubiquitous RFID technology to detect the…

  13. Minimizing the negative effects of device mobility in cell-based ad-hoc wireless computational grids

    CSIR Research Space (South Africa)

    Mudali, P

    2006-09-01

    This paper provides an outline of research being conducted to minimize the disruptive effects of device mobility in wireless computational grid networks. The proposed wireless grid framework uses the existing GSM cellular architecture, with emphasis...

  14. Population dynamics of minimally cognitive individuals. Part 2: Dynamics of time-dependent knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Schmieder, R.W.

    1995-07-01

    The dynamical principle for a population of interacting individuals with mutual pairwise knowledge, presented by the author in a previous paper for the case of constant knowledge, is extended to include the possibility that the knowledge is time-dependent. Several mechanisms are presented by which the mutual knowledge, represented by a matrix K, can be altered, leading to dynamical equations for K(t). The author presents various examples of the transient and long-time asymptotic behavior of K(t) for populations of relatively isolated individuals interacting infrequently in local binary collisions. Among the effects observed in the numerical experiments are knowledge diffusion, learning transients, and fluctuating equilibria. This approach will be most appropriate for small populations of complex individuals such as simple animals, robots, computer networks, agent-mediated traffic, simple ecosystems, and games. Evidence of metastable states and intermittent switching leads the author to envision a spectroscopy associated with such transitions that is independent of the specific physical individuals and the population. Such spectra may serve as good lumped descriptors of the collective emergent behavior of large classes of populations in which mutual knowledge is an important part of the dynamics.

  15. Librarians and computer knowledge

    Directory of Open Access Journals (Sweden)

    Primož Južnič

    1997-01-01

    The virtual library has become a well-established term in librarianship and is often related to the fast development of the Internet and its usage. The virtual library is often discussed as an integration of different media with the Internet, which may give the feeling of (virtual) reality. The educational impact and new perceptions of knowledge and learning are also much discussed. However, the Internet is a good example of a service that is directly related to the number and critical mass of its users: when this is reached, the growth is fast and almost limitless. We do not discuss enough the training librarians should/will have for the virtual library. This text also tries to systematize the practical experience the author has gained from professional work in two domains - librarianship and computer science.

  16. Computational intelligence approach for NOx emissions minimization in a coal-fired utility boiler

    International Nuclear Information System (INIS)

    Zhou Hao; Zheng Ligang; Cen Kefa

    2010-01-01

    The current work presents a computational intelligence approach to minimizing NOx emissions in a 300 MW dual-furnace coal-fired utility boiler. The fundamental idea behind this work includes NOx emissions characteristics modeling and NOx emissions optimization. First, an objective function aiming at estimating NOx emissions characteristics from nineteen operating parameters of the studied boiler was represented by a support vector regression (SVR) model. Second, four levels of primary air velocities (PA) and six levels of secondary air velocities (SA) were regulated by using particle swarm optimization (PSO) so as to achieve low-NOx-emissions combustion. To reduce the computation time, a more flexible stopping condition was used to improve the computational efficiency without loss of quality in the optimization results. The results showed that the proposed approach provided an effective way to reduce NOx emissions from 399.7 ppm to 269.3 ppm, which was much better than a genetic algorithm (GA) based method and slightly better than an ant colony optimization (ACO) based approach reported in earlier work. The main advantage of PSO was that the computational cost, typically less than 25 s on a PC system, is much less than that required for ACO. This means the proposed approach is more applicable to online and real-time applications for NOx emissions minimization in actual power plant boilers.
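
    A bare-bones PSO loop of the kind used for the air-velocity search is sketched below; the quadratic surrogate stands in for the trained SVR emissions model, and all bounds and coefficients are illustrative assumptions, not the paper's settings.

      import numpy as np

      rng = np.random.default_rng(1)

      def nox_surrogate(v):                     # stand-in for the trained SVR model
          return float(np.sum((v - 0.3) ** 2))

      def pso(f, dim=10, n=20, iters=100, w=0.7, c1=1.5, c2=1.5, lo=0.0, hi=1.0):
          x = rng.uniform(lo, hi, (n, dim))     # particle positions (operating params)
          v = np.zeros((n, dim))
          pbest = x.copy()
          pval = np.array([f(xi) for xi in x])
          g = pbest[pval.argmin()]              # global best position
          for _ in range(iters):
              r1, r2 = rng.random((n, dim)), rng.random((n, dim))
              v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
              x = np.clip(x + v, lo, hi)
              fx = np.array([f(xi) for xi in x])
              improved = fx < pval
              pbest[improved], pval[improved] = x[improved], fx[improved]
              g = pbest[pval.argmin()]
          return g, pval.min()

      best, val = pso(nox_surrogate)
      print(val)                                # approaches 0 as the swarm converges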

  17. Aware Computing in Spatial Language Understanding Guided by Cognitively Inspired Knowledge Representation

    Directory of Open Access Journals (Sweden)

    Masao Yokota

    2012-01-01

    Mental image directed semantic theory (MIDST) has proposed an omnisensory mental image model and its description language Lmd. This language is designed to represent and compute human intuitive knowledge of space and can provide multimedia expressions with intermediate semantic descriptions in predicate logic. It is hypothesized that such knowledge and semantic descriptions are controlled by human attention toward the world and are therefore subjective to each individual. This paper describes the Lmd expression of human subjective knowledge of space and its application to aware computing in cross-media operations between linguistic and pictorial expressions, as spatial language understanding.

  18. Health workers' knowledge of and attitudes towards computer applications in rural African health facilities.

    Science.gov (United States)

    Sukums, Felix; Mensah, Nathan; Mpembeni, Rose; Kaltschmidt, Jens; Haefeli, Walter E; Blank, Antje

    2014-01-01

    The QUALMAT (Quality of Maternal and Prenatal Care: Bridging the Know-do Gap) project has introduced an electronic clinical decision support system (CDSS) for pre-natal and maternal care services in rural primary health facilities in Burkina Faso, Ghana, and Tanzania. The objective of this study was to assess health providers' computer knowledge, experience, and attitudes prior to the implementation of the QUALMAT electronic CDSS. A cross-sectional study was conducted with providers in 24 QUALMAT project sites. Information was collected using structured questionnaires. Chi-squared tests and one-way ANOVA describe the association between computer knowledge, attitudes, and other factors. Semi-structured interviews and focus groups were conducted to gain further insights. A total of 108 providers responded, 63% from Tanzania and 37% from Ghana. The mean age was 37.6 years, and 79% were female. Only 40% had ever used computers, and 29% had prior computer training. About 80% were computer illiterate or beginners. Educational level, age, and years of work experience were significantly associated with computer knowledge. Given the low levels of computer knowledge among rural health workers in Africa, it is important to provide adequate training and support to ensure the successful uptake of electronic CDSSs in these settings. The positive attitudes to computers found in this study underscore that rural care providers, too, are ready to use such technology.

  19. Distributional and Knowledge-Based Approaches for Computing Portuguese Word Similarity

    Directory of Open Access Journals (Sweden)

    Hugo Gonçalo Oliveira

    2018-02-01

    Identifying similar and related words is not only key in natural language understanding but is also a suitable task for assessing the quality of computational resources that organise the words and meanings of a language, compiled by different means. This paper, which aims to be a reference for those interested in computing word similarity in Portuguese, presents several approaches for this task and is motivated by the recent availability of state-of-the-art distributional models of Portuguese words, which add to several lexical knowledge bases (LKBs) available for this language for a longer time. These resources were exploited to answer word similarity tests, which also became available for Portuguese recently. We conclude that there are several valid approaches for this task, but no single one outperforms all the others in every test. Distributional models seem to capture relatedness better, while LKBs are better suited to computing genuine similarity, but, in general, better results are obtained when knowledge from different sources is combined.
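
    The distributional half of the comparison reduces, at query time, to scoring word pairs by the cosine of their vectors; the three-dimensional "embeddings" below are fabricated for illustration and are not from any of the Portuguese models evaluated.

      import numpy as np

      def cosine(u, v):
          return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

      vec = {  # toy vectors, not a real distributional model
          "carro": np.array([0.9, 0.1, 0.0]),
          "automóvel": np.array([0.85, 0.15, 0.05]),
          "banana": np.array([0.0, 0.2, 0.9]),
      }
      print(cosine(vec["carro"], vec["automóvel"]))   # high: near-synonyms
      print(cosine(vec["carro"], vec["banana"]))      # low: unrelated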

  20. A Knowledge Engineering Approach to Developing Educational Computer Games for Improving Students' Differentiating Knowledge

    Science.gov (United States)

    Hwang, Gwo-Jen; Sung, Han-Yu; Hung, Chun-Ming; Yang, Li-Hsueh; Huang, Iwen

    2013-01-01

    Educational computer games have been recognized as being a promising approach for motivating students to learn. Nevertheless, previous studies have shown that without proper learning strategies or supportive models, the learning achievement of students might not be as good as expected. In this study, a knowledge engineering approach is proposed…

  1. A K-6 Computational Thinking Curriculum Framework : Implications for Teacher Knowledge

    NARCIS (Netherlands)

    Angeli, C.; Voogt, J.; Fluck, A.; Webb, M.; Cox, M.; Malyn-Smith, J.; Zagami, J.

    2016-01-01

    Adding computer science as a separate school subject to the core K-6 curriculum is a complex issue with educational challenges. The authors herein address two of these challenges: (1) the design of the curriculum based on a generic computational thinking framework, and (2) the knowledge teachers

  2. Ward nurses' knowledge of computed tomography scanning.

    Science.gov (United States)

    Majeed, M A; Nayeemuddin, M; Christie, M

    Patients benefit from and are reassured by advance information on procedures that they are to undergo. Ward nurses should have adequate knowledge of radiological investigations to ensure proper patient preparation and good interdepartmental communication to avoid delays and cancellations. This study was conducted to assess ward nurses' knowledge of the process of computed tomography (CT) scanning. One hundred and twenty qualified nurses were asked to complete a questionnaire regarding CT scanning. The findings revealed a suboptimal level of awareness about the process. This is probably due to a lack of formal teaching for nurses on the wards regarding the different radiological procedures and patient preparation. There is a strong case for better educational talks on rapidly changing radiological techniques for ward staff to ensure high-quality patient care.

  3. Knowledge acquisition in ecological poduct design: the effects of computer-mediated communication and elicitation method

    OpenAIRE

    Sauer, J.; Schramme, S.; Rüttinger, B.

    2000-01-01

    This article presents a study that examines multiple effects of using different means of computer-mediated communication and knowledge elicitation methods during a product design process. The experimental task involved a typical scenario in product design, in which a knowledge engineer consults two experts to generate knowledge about a design issue. Employing a 3x2 between-subjects design, three conference types (face-to-face, computer, multimedia) and two knowledge elicitation methods (struc...

  4. A new paradigm of knowledge engineering by soft computing

    CERN Document Server

    Ding, Liya

    2001-01-01

    Soft computing (SC) consists of several computing paradigms, including neural networks, fuzzy set theory, approximate reasoning, and derivative-free optimization methods such as genetic algorithms. The integration of those constituent methodologies forms the core of SC. In addition, the synergy allows SC to incorporate human knowledge effectively, deal with imprecision and uncertainty, and learn to adapt to unknown or changing environments for better performance. Together with other modern technologies, SC and its applications exert unprecedented influence on intelligent systems that mimic hum

  5. Non-binary decomposition trees - a method of reliability computation for systems with known minimal paths/cuts

    International Nuclear Information System (INIS)

    Malinowski, Jacek

    2004-01-01

    A coherent system with independent components and known minimal paths (cuts) is considered. In order to compute its reliability, a tree structure T is constructed whose nodes contain the modified minimal paths (cuts) and numerical values. The value of a non-leaf node is a function of its child nodes' values. The values of leaf nodes are calculated from a simple formula. The value of the root node is the system's failure probability (reliability). Subsequently, an algorithm computing the system's failure probability (reliability) is constructed. The algorithm scans all nodes of T using a stack structure for this purpose. The nodes of T are alternately put on and removed from the stack, their data being modified in the process. Once the algorithm has terminated, the stack contains only the final modification of the root node of T, and its value is equal to the system's failure probability (reliability).
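
    For contrast with the decomposition-tree algorithm, the quantity being computed can be written down directly: with independent components, the system works if every component of at least one minimal path works, so reliability is an inclusion-exclusion sum over subsets of paths. The brute-force sketch below (the bridge-network paths and component reliabilities are illustrative) is exponential in the number of paths, which is exactly the blow-up the tree method is designed to manage.

      from itertools import combinations
      from math import prod

      def reliability(min_paths, p):
          # P(system works) = P(union of events "every component of path P_i works").
          total = 0.0
          for r in range(1, len(min_paths) + 1):
              for subset in combinations(min_paths, r):
                  union = set().union(*subset)        # components forced to work
                  total += (-1) ** (r + 1) * prod(p[c] for c in union)
          return total

      paths = [{1, 2}, {3, 4}, {1, 5, 4}, {3, 5, 2}]  # bridge network minimal paths
      p = {1: 0.9, 2: 0.9, 3: 0.8, 4: 0.8, 5: 0.95}   # component reliabilities
      print(reliability(paths, p))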

  6. Systematic review on physician's knowledge about radiation doses and radiation risks of computed tomography

    International Nuclear Information System (INIS)

    Krille, Lucian; Hammer, Gael P.; Merzenich, Hiltrud; Zeeb, Hajo

    2010-01-01

    Background: The frequent use of computed tomography is a major cause of the increasing medical radiation exposure of the general population. Consequently, dose reduction and radiation protection are topics of scientific and public concern. Aim: We evaluated the available literature on physicians' knowledge regarding radiation doses and risks due to computed tomography. Methods: A systematic review in accordance with the Cochrane and PRISMA statements was performed using eight databases; 3091 references were found. Only primary studies assessing physicians' knowledge about computed tomography were included. Results: 14 relevant articles were identified, all focussing on dose estimations for CT. Overall, the surveys showed moderate to low knowledge among physicians concerning radiation doses and the involved health risks. However, the surveys varied considerably in conduct and quality. For some countries, more than one survey was available. There was no general trend in knowledge in any country, except a slight improvement of knowledge on health risks and radiation doses in two consecutive local German surveys. Conclusions: Knowledge gaps concerning radiation doses and associated health risks among physicians are evident from published research. However, knowledge on radiation doses cannot be interpreted as a reliable indicator of good medical practice.

  7. Generative Computer Assisted Instruction: An Application of Artificial Intelligence to CAI.

    Science.gov (United States)

    Koffman, Elliot B.

    Frame-oriented computer-assisted instruction (CAI) systems dominate the field, but these mechanized programmed texts utilize the computational power of the computer to a minimal degree and are difficult to modify. Newer, generative CAI systems, which are supplied with a knowledge of subject matter, can generate their own problems and solutions, can…

  8. Students Enrolled in Selected Upper-Division Agriculture Courses: An Examination of Computer Experiences, Self-Efficacy and Knowledge.

    Science.gov (United States)

    Johnson, Donald M.; Ferguson, James A.; Lester, Melissa L.

    2000-01-01

    Of 169 agriculture students surveyed, 79% had computer training, 66% owned computers; they had slightly above average computer self-efficacy, especially in word processing, electronic mail, and Internet use. However, 72.7% scored 60% or less on a test of computer knowledge. There was little correlation between self-efficacy and computer knowledge.…

  9. Minimal families of curves on surfaces

    KAUST Repository

    Lubbes, Niels

    2014-01-01

    A minimal family of curves on an embedded surface is defined as a 1-dimensional family of rational curves of minimal degree, which cover the surface. We classify such minimal families using constructive methods. This allows us to compute the minimal

  10. Architecture and Initial Development of a Digital Library Platform for Computable Knowledge Objects for Health.

    Science.gov (United States)

    Flynn, Allen J; Bahulekar, Namita; Boisvert, Peter; Lagoze, Carl; Meng, George; Rampton, James; Friedman, Charles P

    2017-01-01

    Throughout the world, biomedical knowledge is routinely generated and shared through primary and secondary scientific publications. However, there is too much latency between publication of knowledge and its routine use in practice. To address this latency, what is actionable in scientific publications can be encoded to make it computable. We have created a purpose-built digital library platform to hold, manage, and share actionable, computable knowledge for health called the Knowledge Grid Library. Here we present it with its system architecture.

  11. Non-binary decomposition trees - a method of reliability computation for systems with known minimal paths/cuts

    Energy Technology Data Exchange (ETDEWEB)

    Malinowski, Jacek

    2004-05-01

    A coherent system with independent components and known minimal paths (cuts) is considered. In order to compute its reliability, a tree structure T is constructed whose nodes contain the modified minimal paths (cuts) and numerical values. The value of a non-leaf node is a function of its child nodes' values. The values of leaf nodes are calculated from a simple formula. The value of the root node is the system's failure probability (reliability). Subsequently, an algorithm computing the system's failure probability (reliability) is constructed. The algorithm scans all nodes of T using a stack structure for this purpose. The nodes of T are alternately put on and removed from the stack, their data being modified in the process. Once the algorithm has terminated, the stack contains only the final modification of the root node of T, and its value is equal to the system's failure probability (reliability).

  12. Computer simulation as representation of knowledge in education

    International Nuclear Information System (INIS)

    Krekic, Valerija Pinter; Namestovski, Zolt

    2009-01-01

    According to Aebli's operative method (1963) and Bruner's (1974) theory of representation, the development of the process of thinking in teaching has the following phases - levels of abstraction: manipulation of specific things (specific phase), iconic representation (figural phase), and symbolic representation (symbolic phase). Modern information technology has contributed to the enrichment of teaching and learning processes, especially in the fields of natural sciences and mathematics and those of production and technology. Simulation appears as a new possibility for the representation of knowledge. According to Guetzkow (1972), simulation is an operative representation of reality from a relevant aspect. It is about a model of an objective system, which is dynamic in itself. If that model is material, it is a simple simulation; if it is abstract, it is a reflective experiment, that is, a computer simulation. The present work deals with the systematization and classification of simulation methods in the teaching of natural sciences and mathematics and of production and technology, with a special retrospective view of computer simulations and an exemplary representation of the place and role of this modern method of cognition. Key words: representation of knowledge, modeling, simulation, education.

  13. Minimal mobile human computer interaction

    NARCIS (Netherlands)

    el Ali, A.

    2013-01-01

    In the last 20 years, the widespread adoption of personal, mobile computing devices in everyday life, has allowed entry into a new technological era in Human Computer Interaction (HCI). The constant change of the physical and social context in a user's situation made possible by the portability of

  14. Computer knowledge amongst clinical year medical students in a ...

    African Journals Online (AJOL)

    Objective: To study the computer knowledge and desires of clinical year medical students at one of the oldest and largest medical schools in Nigeria. Design: A survey using validated structured questionnaires. Setting: Medical school of Ahmadu Bello University, Zaria, Nigeria. Subjects: Two hundred and thirty seven clinical ...

  15. Minimally invasive computer-navigated total hip arthroplasty, following the concept of femur first and combined anteversion: design of a blinded randomized controlled trial

    Directory of Open Access Journals (Sweden)

    Woerner Michael

    2011-08-01

    Background: Impingement can be a serious complication after total hip arthroplasty (THA), and is one of the major causes of postoperative pain, dislocation, aseptic loosening, and implant breakage. Minimally invasive THA and computer-navigated surgery were introduced several years ago. We have developed a novel, computer-assisted operation method for THA following the concept of "femur first"/"combined anteversion", which incorporates various aspects of performing a functional optimization of the cup position, and comprehensively addresses range of motion (ROM) as well as cup containment and alignment parameters. Hence, the purpose of this study is to assess whether the artificial joint's ROM can be improved by this computer-assisted operation method. Second, the clinical and radiological outcome will be evaluated. Methods/Design: A registered patient- and observer-blinded randomized controlled trial will be conducted. Patients between the ages of 50 and 75 admitted for primary unilateral THA will be included. Patients will be randomly allocated to receive either minimally invasive computer-navigated "femur first" THA or the conventional minimally invasive THA procedure. Self-reported functional status and health-related quality of life (questionnaires) will be assessed both preoperatively and postoperatively. Perioperative complications will be registered. Radiographic evaluation will take place up to 6 weeks postoperatively with a computed tomography (CT) scan. Component position will be evaluated by an independent external institute on a 3D reconstruction of the femur/pelvis using image-processing software. Postoperative ROM will be calculated by an algorithm which automatically determines bony and prosthetic impingements. Discussion: In the past, computer navigation has improved the accuracy of component positioning. So far, there are only a few objective data quantifying the risks and benefits of computer-navigated THA. Therefore, this

  16. Large-scale computer networks and the future of legal knowledge-based systems

    NARCIS (Netherlands)

    Leenes, R.E.; Svensson, Jorgen S.; Hage, J.C.; Bench-Capon, T.J.M.; Cohen, M.J.; van den Herik, H.J.

    1995-01-01

    In this paper we investigate the relation between legal knowledge-based systems and large-scale computer networks such as the Internet. On the one hand, researchers of legal knowledge-based systems have claimed huge possibilities, but despite the efforts over the last twenty years, the number of

  17. Adaptive-weighted total variation minimization for sparse data toward low-dose x-ray computed tomography image reconstruction.

    Science.gov (United States)

    Liu, Yan; Ma, Jianhua; Fan, Yi; Liang, Zhengrong

    2012-12-07

    Previous studies have shown that by minimizing the total variation (TV) of the to-be-estimated image with some data and other constraints, piecewise-smooth x-ray computed tomography (CT) can be reconstructed from sparse-view projection data without introducing notable artifacts. However, due to the piecewise constant assumption for the image, a conventional TV minimization algorithm often suffers from over-smoothness on the edges of the resulting image. To mitigate this drawback, we present an adaptive-weighted TV (AwTV) minimization algorithm in this paper. The presented AwTV model is derived by considering the anisotropic edge property among neighboring image voxels, where the associated weights are expressed as an exponential function and can be adaptively adjusted by the local image-intensity gradient for the purpose of preserving the edge details. Inspired by the previously reported TV-POCS (projection onto convex sets) implementation, a similar AwTV-POCS implementation was developed to minimize the AwTV subject to data and other constraints for the purpose of sparse-view low-dose CT image reconstruction. To evaluate the presented AwTV-POCS algorithm, both qualitative and quantitative studies were performed by computer simulations and phantom experiments. The results show that the presented AwTV-POCS algorithm can yield images with several notable gains, in terms of noise-resolution tradeoff plots and full-width at half-maximum values, as compared to the corresponding conventional TV-POCS algorithm.
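
    To make the weighting scheme concrete, here is a minimal sketch (an illustration under our own assumptions, not the authors' implementation): it evaluates an adaptive-weighted TV value for a 2D image, with each finite difference weighted by an exponential of the local intensity difference; the scale parameter `delta` is an assumed tuning constant.

```python
import numpy as np

def awtv(img, delta=0.005):
    """Adaptive-weighted total variation of a 2D image (illustrative sketch).

    Each finite difference is weighted by exp(-(difference/delta)^2), so strong
    edges (large local gradients) receive small weights and are penalized less,
    which is what preserves edge detail.
    """
    dx = np.diff(img, axis=0)[:, :-1]        # vertical differences
    dy = np.diff(img, axis=1)[:-1, :]        # horizontal differences
    wx = np.exp(-(dx / delta) ** 2)          # adaptive weights
    wy = np.exp(-(dy / delta) ** 2)
    return np.sum(np.sqrt(wx * dx ** 2 + wy * dy ** 2))

img = np.zeros((64, 64)); img[:, 32:] = 1.0  # a single sharp edge
print(awtv(img), awtv(img, delta=10.0))      # large delta ~ conventional TV
```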

  18. Survey of Education, Engineering, and Information Technology Students Knowledge of Green Computing in Nigerian University

    Directory of Open Access Journals (Sweden)

    Tajudeen Ahmed Shittu

    2016-02-01

    Full Text Available The use of computer systems is growing rapidly, and there is growing concern about the environmental hazards associated with their use. Hence, every user needs to possess the knowledge to use computers in an environmentally friendly manner. This study therefore investigated the knowledge of green computing possessed by university students in Nigeria. To achieve this, a survey method was employed to carry out the study. The study involved students from three schools (Computer Science, Engineering, and Education). Purposive sampling was used to draw three hundred (300) respondents who volunteered to answer the questionnaire administered for gathering the data of the study. The instrument used was adapted, modified, and subjected to pilot testing to ascertain its validity and internal consistency. The reliability of the instrument showed a .75 Cronbach alpha level. The first research question was answered with descriptive statistics (percentages). T-tests and ANOVA were used to answer questions two and three. The findings showed that the students do not possess adequate knowledge of the conscious use of computing systems. Also, the study showed that there is no significant difference in the green computing knowledge possessed by males and females, or among students from the three schools. Based on these findings, the study suggested, among other things, an aggressive campaign on green computing among university communities.

  19. Processing Optimization of Typed Resources with Synchronized Storage and Computation Adaptation in Fog Computing

    Directory of Open Access Journals (Sweden)

    Zhengyang Song

    2018-01-01

    Full Text Available Wide application of Internet of Things (IoT) systems has been increasingly demanding more hardware facilities for processing various resources, including data, information, and knowledge. With the rapid growth in the quantity of generated resources, it is difficult to adapt to this situation using traditional cloud computing models. Fog computing enables storage and computing services to be performed at the edge of the network to extend cloud computing. However, there are problems such as restricted computation, limited storage, and expensive network bandwidth in Fog computing applications. It is a challenge to balance the distribution of network resources. We propose a processing optimization mechanism for typed resources with synchronized storage and computation adaptation in Fog computing. In this mechanism, we process typed resources in a wireless-network-based three-tier architecture consisting of a Data Graph, an Information Graph, and a Knowledge Graph. The proposed mechanism aims to minimize processing cost over network, computation, and storage while maximizing processing performance in a business-value-driven manner. Simulation results show that the proposed approach improves the ratio of performance over user investment. Meanwhile, conversions between resource types deliver support for dynamically allocating network resources.
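
    As a rough illustration of the cost trade-off described above (not the authors' simulation model), the sketch below assigns each typed resource to the tier with the lowest weighted network/computation/storage cost; all tier prices and demands are made-up numbers.

```python
# Illustrative tier-selection sketch for the Data/Information/Knowledge
# resource types described above; all costs are made-up unit prices.
TIERS = {            # (network, computation, storage) unit costs per tier
    "edge":  (1.0, 4.0, 3.0),
    "fog":   (2.0, 2.0, 2.0),
    "cloud": (4.0, 1.0, 1.0),
}
RESOURCES = {        # (network, computation, storage) demand per resource type
    "data":        (5.0, 1.0, 4.0),   # bulky, cheap to process
    "information": (2.0, 2.0, 2.0),
    "knowledge":   (1.0, 5.0, 1.0),   # small, computation-heavy
}

def place(resources, tiers):
    """Assign each typed resource to the tier with minimal total weighted cost."""
    plan = {}
    for name, demand in resources.items():
        plan[name] = min(tiers,
                         key=lambda t: sum(c * d for c, d in zip(tiers[t], demand)))
    return plan

# Bulky data stays near the edge/fog; compute-heavy knowledge goes to the cloud.
print(place(RESOURCES, TIERS))
```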

  20. Artificial intelligence and tutoring systems computational and cognitive approaches to the communication of knowledge

    CERN Document Server

    Wenger, Etienne

    2014-01-01

    Artificial Intelligence and Tutoring Systems: Computational and Cognitive Approaches to the Communication of Knowledge focuses on the cognitive approaches, methodologies, principles, and concepts involved in the communication of knowledge. The publication first elaborates on knowledge communication systems, basic issues, and tutorial dialogues. Concerns cover natural reasoning and tutorial dialogues, shift from local strategies to multiple mental models, domain knowledge, pedagogical knowledge, implicit versus explicit encoding of knowledge, knowledge communication, and practical and theoretic

  1. Increasing the speed of the computational fluid dynamics procedure for minimization of the nitrogen oxide pollution from the premixed atmospheric gas burner

    Directory of Open Access Journals (Sweden)

    Fotev Vasko G.

    2017-01-01

    Full Text Available This article presents an innovative method for increasing the speed of a procedure involving complex computational fluid dynamics calculations for finding the distance between the flame openings of an atmospheric gas burner that leads to minimal NO pollution. The method is based on standard features included in commercial computational fluid dynamics software and shortens computer working time roughly seven times in this particular case.

  2. Segmentation of Synchrotron Radiation micro-Computed Tomography Images using Energy Minimization via Graph Cuts

    International Nuclear Information System (INIS)

    Meneses, Anderson A.M.; Giusti, Alessandro; Almeida, André P. de; Nogueira, Liebert; Braz, Delson; Almeida, Carlos E. de; Barroso, Regina C.

    2012-01-01

    The research on applications of segmentation algorithms to Synchrotron Radiation X-Ray micro-Computed Tomography (SR-μCT) is an open problem, due to the interesting and well-known characteristics of SR images, such as the phase contrast effect. The Energy Minimization via Graph Cuts (EMvGC) algorithm represents a state-of-the-art segmentation algorithm, presenting enormous potential for application in SR-μCT imaging. We describe the application of the algorithm EMvGC with swap move for the segmentation of bone images acquired at the ELETTRA Laboratory (Trieste, Italy). - Highlights: ► Microstructures of Wistar rats' ribs are investigated with Synchrotron Radiation μCT imaging. ► The present work is part of a research on the effects of radiotherapy on the thoracic region. ► Application of the Energy Minimization via Graph Cuts algorithm for segmentation is described.

  3. Minimal features of a computer and its basic software to execute the NEPTUNIX 2 numerical step

    International Nuclear Information System (INIS)

    Roux, Pierre.

    1982-12-01

    NEPTUNIX 2 is a package which carries out the simulation of complex processes described by numerous non-linear algebro-differential equations. Its main features are: non-linear or time-dependent parameters, implicit form, stiff systems, and dynamic change of equations leading to discontinuities in some variables. The mathematical model is thus built from an equation set F(x,x',t,l) = 0, where t is the independent variable, x' the derivative of x, and l an "algebrized" logical variable. The NEPTUNIX 2 package is divided into two successive major steps: a non-numerical step and a numerical step. The non-numerical step must be executed on a series 370 IBM computer or a compatible computer. This step generates a FORTRAN-language model picture fitted for the computer carrying out the numerical step. The numerical step consists in building and running a mathematical model simulator. This execution step of NEPTUNIX 2 has been designed to be portable across many computers. The present manual describes the minimal features of such a host computer used for executing the NEPTUNIX 2 numerical step [fr]

  4. Intermediate-Level Knowledge in Child-Computer Interaction

    DEFF Research Database (Denmark)

    Barendregt, Wolmet; Torgersson, Olof; Eriksson, Eva

    2017-01-01

    Based on an analysis of all papers at IDC from 2003 to 2016 this paper urges the Child-Computer Interaction (CCI) field to start formulating intermediate-level knowledge, in the form of e.g. strong concepts. Our analysis showed that 40% of all papers at the Interaction Design and Children...... conference present the design of an artefact accompanied by an evaluation (to which we will refer as 'artefact-centered' papers). While exploring the design space in the form of artefacts is important and valuable, it can be argued that those artefact-centered papers generally make a smaller contribution...... to the field as a whole, which is also visible in the number of citations to such papers in comparison to the number of citations to other kinds of papers. As a first step towards more intermediate-level knowledge, we have thus attempted to formulate and ground three suggestions for strong concepts in CCI...

  5. Meaning Making Through Minimal Linguistic Forms in Computer-Mediated Communication

    Directory of Open Access Journals (Sweden)

    Muhammad Shaban Rafi

    2014-05-01

    Full Text Available The purpose of this study was to investigate the linguistic forms which commonly constitute meanings in the digital environment. The data were sampled from 200 Bachelor of Science (BS) students (who had Urdu as their primary language of communication and English as one of the academic languages or the most prestigious second language) of five universities situated in Lahore, Pakistan. The procedure for analysis was conceived within much related theoretical work on text analysis. The study reveals that cyber-language is organized through patterns of use, which can be broadly classified into minimal linguistic forms constituting a meaning-making resource. In addition, the expression of syntactic mood, and the discourse roles the participants technically assume, tend to contribute to the theory of meaning in the digital environment. It is hoped that the study will make some contribution to the growing literature on multilingual computer-mediated communication (CMC).

  6. Segmentation of Synchrotron Radiation micro-Computed Tomography Images using Energy Minimization via Graph Cuts

    Energy Technology Data Exchange (ETDEWEB)

    Meneses, Anderson A.M. [Federal University of Western Para (Brazil); Physics Institute, Rio de Janeiro State University (Brazil); Giusti, Alessandro [IDSIA (Dalle Molle Institute for Artificial Intelligence), University of Lugano (Switzerland); Almeida, Andre P. de, E-mail: apalmeid@gmail.com [Physics Institute, Rio de Janeiro State University (Brazil); Nuclear Engineering Program, Federal University of Rio de Janeiro (Brazil); Nogueira, Liebert; Braz, Delson [Nuclear Engineering Program, Federal University of Rio de Janeiro (Brazil); Almeida, Carlos E. de [Radiological Sciences Laboratory, Rio de Janeiro State University (Brazil); Barroso, Regina C. [Physics Institute, Rio de Janeiro State University (Brazil)

    2012-07-15

    The research on applications of segmentation algorithms to Synchrotron Radiation X-Ray micro-Computed Tomography (SR-μCT) is an open problem, due to the interesting and well-known characteristics of SR images, such as the phase contrast effect. The Energy Minimization via Graph Cuts (EMvGC) algorithm represents a state-of-the-art segmentation algorithm, presenting enormous potential for application in SR-μCT imaging. We describe the application of the algorithm EMvGC with swap move for the segmentation of bone images acquired at the ELETTRA Laboratory (Trieste, Italy). - Highlights: ► Microstructures of Wistar rats' ribs are investigated with Synchrotron Radiation μCT imaging. ► The present work is part of a research on the effects of radiotherapy on the thoracic region. ► Application of the Energy Minimization via Graph Cuts algorithm for segmentation is described.

  7. A true minimally invasive approach for cochlear implantation: high accuracy in cranial base navigation through flat-panel-based volume computed tomography.

    Science.gov (United States)

    Majdani, Omid; Bartling, Soenke H; Leinung, Martin; Stöver, Timo; Lenarz, Minoo; Dullin, Christian; Lenarz, Thomas

    2008-02-01

    High-precision intraoperative navigation using high-resolution flat-panel volume computed tomography makes minimally invasive cochlear implant surgery, including cochleostomy, feasible. Conventional cochlear implant surgery is typically performed via mastoidectomy with facial recess to identify and avoid damage to vital anatomic landmarks. To accomplish this procedure via a minimally invasive approach--without performing mastoidectomy--in a precise fashion, image-guided technology is necessary. With such an approach, surgical time and expertise may be reduced, and hearing preservation may be improved. Flat-panel volume computed tomography was used to scan 4 human temporal bones. A drilling channel was planned preoperatively from the mastoid surface to the round window niche, providing a margin of safety to all functionally important structures (e.g., facial nerve, chorda tympani, incus). Postoperatively, computed tomographic imaging and conventional surgical exploration of the drilled route to the cochlea were performed. All 4 specimens showed a cochleostomy located at the scala tympani anterior inferior to the round window. The chorda tympani was damaged in 1 specimen--this was anticipated preoperatively, as a narrow facial recess was encountered. Using flat-panel volume computed tomography for image-guided surgical navigation, we were able to perform minimally invasive cochlear implant surgery, defined as a narrow, single-channel mastoidotomy with cochleostomy. Although this finding is preliminary, it is technologically achievable.

  8. A knowledge-based computer system for assessing new company names

    DEFF Research Database (Denmark)

    Vidal, Rene Victor Valqui; Thorsen, M.

    1990-01-01

    This paper briefly describes a knowledge-based computer system implemented at the Registry of Companies (E and S), Ministry of Industries in Denmark. The system helps E and S, on receipt of a request for registration of a new or changed company name, to check the name for acceptability. The check...

  9. A new Mumford-Shah total variation minimization based model for sparse-view x-ray computed tomography image reconstruction.

    Science.gov (United States)

    Chen, Bo; Bian, Zhaoying; Zhou, Xiaohui; Chen, Wensheng; Ma, Jianhua; Liang, Zhengrong

    2018-04-12

    Total variation (TV) minimization for sparse-view x-ray computed tomography (CT) reconstruction has been widely explored to reduce radiation dose. However, due to the piecewise constant assumption of the TV model, the reconstructed images often suffer from over-smoothness on the image edges. To mitigate this drawback of TV minimization, we present a Mumford-Shah total variation (MSTV) minimization algorithm in this paper. The presented MSTV model is derived by integrating TV minimization and Mumford-Shah segmentation. Subsequently, a penalized weighted least-squares (PWLS) scheme with MSTV is developed for sparse-view CT reconstruction. For simplicity, the proposed algorithm is named 'PWLS-MSTV'. To evaluate the performance of the present PWLS-MSTV algorithm, both qualitative and quantitative studies were conducted by using a digital XCAT phantom and a physical phantom. Experimental results show that the present PWLS-MSTV algorithm has noticeable gains over the existing algorithms in terms of noise reduction, contrast-to-noise ratio measure and edge preservation.

  10. Computer-based medical education in Benha University, Egypt: knowledge, attitude, limitations, and suggestions.

    Science.gov (United States)

    Bayomy, Hanaa; El Awadi, Mona; El Araby, Eman; Abed, Hala A

    2016-12-01

    Computer-assisted medical education has been developed to enhance learning and enable high-quality medical care. This study aimed to assess computer knowledge and attitude toward the inclusion of computers in medical education among second-year medical students in Benha Faculty of Medicine, Egypt, to identify limitations, and to obtain suggestions for successful computer-based learning. This was a one-group pre-post-test study, which was carried out on second-year students in Benha Faculty of Medicine. A structured self-administered questionnaire was used to compare students' knowledge, attitude, limitations, and suggestions toward computer usage in medical education before and after the computer course to evaluate the change in students' responses. The majority of students were familiar with use of the mouse and keyboard, basic word processing, internet and web searching, and e-mail both before and after the computer course. The proportion of students who were familiar with software programs other than word processing and who could trouble-shoot software/hardware was significantly higher after the course, as was the positive attitude toward the computer (P=0.008), the inclusion of a computer skills course in medical education, downloading lecture handouts, and computer-based exams. The limited number of computers limited the inclusion of computers in medical education; inadequate computer labs, lack of Information Technology staff mentoring, the large number of students, unclear course outlines, and lack of internet access were more frequently reported before the course. Establishing computer labs, inviting Information Technology staff to support computer teaching, and the availability of free Wi-Fi internet access covering several areas in the university campus would all support computer-assisted medical education. Medical students in Benha University are computer literate, which allows for computer-based medical education. Staff training, provision of computer labs, and internet access are essential requirements for enhancing computer usage in medical

  11. The Difference Engine: Computing, Knowledge, and the Transformation of Learning

    Science.gov (United States)

    Provenzo, Eugene F.

    2011-01-01

    Since the 1960s, the rapid evolution of technology has created a new cultural geography--a virtual geography. "The Difference Engine: Computing, Knowledge and the Transformation of Learning" offers a conscious critique of this change and its effects on contemporary culture and education. This engaging text assumes that we are at a critical…

  12. Sequential computation of elementary modes and minimal cut sets in genome-scale metabolic networks using alternate integer linear programming

    Energy Technology Data Exchange (ETDEWEB)

    Song, Hyun-Seob; Goldberg, Noam; Mahajan, Ashutosh; Ramkrishna, Doraiswami

    2017-03-27

    Elementary (flux) modes (EMs) have served as a valuable tool for investigating structural and functional properties of metabolic networks. Identification of the full set of EMs in genome-scale networks remains challenging due to the combinatorial explosion of EMs in complex networks. Often, however, only a small subset of relevant EMs needs to be known, for which optimization-based sequential computation is a useful alternative. Most of the currently available methods along this line are based on the iterative use of mixed integer linear programming (MILP), the effectiveness of which significantly deteriorates as the number of iterations builds up. To alleviate the computational burden associated with the MILP implementation, we here present a novel optimization algorithm termed alternate integer linear programming (AILP). Results: Our algorithm was designed to iteratively solve a pair of integer programming (IP) and linear programming (LP) problems to compute EMs in a sequential manner. In each step, the IP identifies a minimal subset of reactions, the deletion of which disables all previously identified EMs. Thus, a subsequent LP solution subject to this reaction deletion constraint becomes a distinct EM. In cases where no feasible LP solution is available, IP-derived reaction deletion sets represent minimal cut sets (MCSs). Despite the additional computation of MCSs, AILP achieved significant time reduction in computing EMs by orders of magnitude. The proposed AILP algorithm not only offers a computational advantage in the EM analysis of genome-scale networks, but also improves the understanding of the linkage between EMs and MCSs.
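
    The IP/LP alternation can be illustrated with a toy network. The sketch below is a simplified reimplementation of the idea, not the authors' Matlab code: the "IP" step is replaced by a brute-force minimal hitting set over the supports of known EMs, and the LP step uses scipy's linprog with a flux-normalization row; for this fully irreversible toy network, basic LP solutions correspond to EMs, and infeasible deletion sets are recorded as MCSs.

```python
import itertools
import numpy as np
from scipy.optimize import linprog

# Toy irreversible network (metabolites x reactions): R0: ->A, R1: A->B,
# R2: A->C, R3: B->, R4: C->. Illustrative only.
S = np.array([[1, -1, -1,  0,  0],
              [0,  1,  0, -1,  0],
              [0,  0,  1,  0, -1]])
m, n = S.shape

def lp_step(blocked):
    """LP: steady-state flux with the blocked reactions fixed to zero.

    The extra row sum(v) = 1 normalizes flux so the zero vector is excluded;
    simplex then returns a vertex, which is an EM for this irreversible cone.
    """
    bounds = [(0.0, 0.0) if j in blocked else (0.0, None) for j in range(n)]
    res = linprog(np.zeros(n), A_eq=np.vstack([S, np.ones(n)]),
                  b_eq=np.append(np.zeros(m), 1.0), bounds=bounds,
                  method="highs-ds")
    return res.x if res.status == 0 else None

ems, mcss = [], []
while True:
    supports = [set(np.flatnonzero(v > 1e-9)) for v in ems]
    # "IP" step, brute-forced here (a MILP in the paper): a smallest reaction
    # set that hits every known EM support and is not a superset of a known MCS.
    hit = next((set(c) for k in range(n + 1)
                for c in itertools.combinations(range(n), k)
                if all(set(c) & s for s in supports)
                and not any(mcs <= set(c) for mcs in mcss)), None)
    if hit is None:
        break                          # search space exhausted
    v = lp_step(hit)
    if v is None:
        mcss.append(hit)               # infeasible deletion set => an MCS
    else:
        ems.append(v)                  # feasible vertex => a new, distinct EM
print(len(ems), "EMs and", len(mcss), "MCSs found")
```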

  13. Knowledge base about earthquakes as a tool to minimize strong events consequences

    Science.gov (United States)

    Frolova, Nina; Bonnin, Jean; Larionov, Valery; Ugarov, Alexander; Kijko, Andrzej

    2017-04-01

    The paper describes the structure and content of the knowledge base on the physical and socio-economic consequences of damaging earthquakes, which may be used for the calibration of near real-time loss assessment systems based on simulation models for shaking intensity, damage to buildings and casualty estimates. Such calibration compensates for some of the factors which influence the reliability of expected damage and loss assessment in "emergency" mode. The knowledge base contains the description of past earthquakes' consequences for the area under study. It also includes the current distribution of the built environment and population at the time of event occurrence. Computer simulation of the events recorded in the knowledge base allows one to determine sets of regional calibration coefficients, including rating of seismological surveys, peculiarities of shaking intensity attenuation and changes in building stock and population distribution, in order to provide minimum error of damaging earthquake loss estimations in "emergency" mode. References 1. Larionov, V., Frolova, N.: Peculiarities of seismic vulnerability estimations. In: Natural Hazards in Russia, volume 6: Natural Risks Assessment and Management, Publishing House "Kruk", Moscow, 120-131, 2003. 2. Frolova, N., Larionov, V., Bonnin, J.: Data Bases Used in Worldwide Systems for Earthquake Loss Estimation in Emergency Mode: Wenchuan Earthquake. In: Proc. TIEMS2010 Conference, Beijing, China, 2010. 3. Frolova, N.I., Larionov, V.I., Bonnin, J., Sushchev, S.P., Ugarov, A.N., Kozlov, M.A.: Loss Caused by Earthquakes: Rapid Estimates. Natural Hazards, vol. 84, ISSN 0921-030, DOI 10.1007/s11069-016-2653

  14. A Method of Extracting Ontology Module Using Concept Relations for Sharing Knowledge in Mobile Cloud Computing Environment

    Directory of Open Access Journals (Sweden)

    Keonsoo Lee

    2014-01-01

    Full Text Available In a mobile cloud computing environment, the cooperation of distributed computing objects is one of the most important requirements for providing successful cloud services. To satisfy this requirement, all members of the cooperation group need to share knowledge for mutual understanding. Although ontology can be the right tool for this goal, several issues must be addressed to build a right ontology. As the cost and complexity of managing knowledge increase with the scale of the knowledge, reducing the size of the ontology is one of the critical issues. In this paper, we propose a method of extracting an ontology module to increase the utility of knowledge. For a given signature, this method extracts an ontology module, which is semantically self-contained to fulfill the needs of the service, by considering the syntactic structure and semantic relations of concepts. By employing this module instead of the original ontology, the cooperation of computing objects can be performed with less computing load and complexity. In particular, when multiple external ontologies need to be combined for more complex services, this method can be used to optimize the size of shared knowledge.

  15. A method of extracting ontology module using concept relations for sharing knowledge in mobile cloud computing environment.

    Science.gov (United States)

    Lee, Keonsoo; Rho, Seungmin; Lee, Seok-Won

    2014-01-01

    In a mobile cloud computing environment, the cooperation of distributed computing objects is one of the most important requirements for providing successful cloud services. To satisfy this requirement, all members of the cooperation group need to share knowledge for mutual understanding. Although ontology can be the right tool for this goal, several issues must be addressed to build a right ontology. As the cost and complexity of managing knowledge increase with the scale of the knowledge, reducing the size of the ontology is one of the critical issues. In this paper, we propose a method of extracting an ontology module to increase the utility of knowledge. For a given signature, this method extracts an ontology module, which is semantically self-contained to fulfill the needs of the service, by considering the syntactic structure and semantic relations of concepts. By employing this module instead of the original ontology, the cooperation of computing objects can be performed with less computing load and complexity. In particular, when multiple external ontologies need to be combined for more complex services, this method can be used to optimize the size of shared knowledge.
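
    As a loose illustration of signature-based module extraction (the paper's method also weighs syntactic structure and semantic relations, which this sketch only approximates), the code below collects the concepts reachable from a signature in a toy concept graph.

```python
from collections import deque

# Toy ontology as adjacency lists: concept -> related concepts
# (subclass-of and object-property links flattened together); purely illustrative.
ONTOLOGY = {
    "Service":      ["CloudService"],
    "CloudService": ["Resource", "Provider"],
    "Resource":     ["Data", "Knowledge"],
    "Provider":     [],
    "Data":         [],
    "Knowledge":    [],
    "Billing":      ["Provider"],      # irrelevant to the example signature
}

def extract_module(ontology, signature, max_depth=2):
    """Collect every concept reachable from the signature within max_depth hops."""
    module, frontier = set(signature), deque((c, 0) for c in signature)
    while frontier:
        concept, depth = frontier.popleft()
        if depth == max_depth:
            continue
        for neighbor in ontology.get(concept, []):
            if neighbor not in module:
                module.add(neighbor)
                frontier.append((neighbor, depth + 1))
    return module

print(extract_module(ONTOLOGY, {"CloudService"}))
# -> a self-contained fragment, much smaller than the full ontology
```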

  16. Optimized Runge-Kutta methods with minimal dispersion and dissipation for problems arising from computational acoustics

    International Nuclear Information System (INIS)

    Tselios, Kostas; Simos, T.E.

    2007-01-01

    In this Letter a new explicit fourth-order seven-stage Runge-Kutta method, with a combination of minimal dispersion and dissipation error and maximal accuracy and stability limit along the imaginary axis, is developed. This method was produced from a general function constructed to satisfy all the above requirements, and from which all existing fourth-order six-stage RK methods can be produced. The new method is more efficient than the other optimized methods for acoustic computations.
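
    For readers unfamiliar with these error measures, the sketch below (not taken from the Letter) evaluates the dispersion (phase) and dissipation (amplitude) errors of a Runge-Kutta scheme from its stability function R(z) along the imaginary axis; the classical fourth-order RK polynomial stands in for the new method, whose coefficients are not reproduced here.

```python
import numpy as np

def stability_polynomial_rk4(z):
    # Stability function of the classical fourth-order Runge-Kutta method:
    # R(z) = 1 + z + z^2/2 + z^3/6 + z^4/24
    return 1 + z + z**2 / 2 + z**3 / 6 + z**4 / 24

def dispersion_dissipation(R, nu):
    """Phase and amplitude errors of R(i*nu) for a wavenumber-like parameter nu."""
    Rz = R(1j * nu)
    dispersion = nu - np.angle(Rz)   # phase (dispersion) error
    dissipation = 1.0 - np.abs(Rz)   # amplitude (dissipation) error
    return dispersion, dissipation

for nu in np.linspace(0.1, 2.0, 5):
    disp, diss = dispersion_dissipation(stability_polynomial_rk4, nu)
    print(f"nu={nu:.2f}  dispersion={disp:+.3e}  dissipation={diss:+.3e}")
```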

  17. Computer game-based and traditional learning method: a comparison regarding students' knowledge retention.

    Science.gov (United States)

    Rondon, Silmara; Sassi, Fernanda Chiarion; Furquim de Andrade, Claudia Regina

    2013-02-25

    Educational computer games are examples of computer-assisted learning objects, representing an educational strategy of growing interest. Given the changes in the digital world over the last decades, students of the current generation expect technology to be used in advancing their learning, requiring a need to change traditional passive learning methodologies to an active multisensory experimental learning methodology. The objective of this study was to compare a computer game-based learning method with a traditional learning method, regarding learning gains and knowledge retention, as means of teaching head and neck Anatomy and Physiology to Speech-Language and Hearing pathology undergraduate students. Students were randomized to one of the learning methods and the data analyst was blinded to which method each student had received. Students' prior knowledge (i.e. before undergoing the learning method), short-term knowledge retention and long-term knowledge retention (i.e. six months after undergoing the learning method) were assessed with a multiple choice questionnaire. Students' performance was compared across the three moments of assessment, both for the mean total score and for separate mean scores for Anatomy questions and for Physiology questions. Students who received the game-based method performed better in the post-test assessment only when considering the Anatomy questions section. Students who received the traditional lecture performed better in both the post-test and the long-term post-test when considering the Anatomy and Physiology questions. The game-based learning method is comparable to the traditional learning method in general and in short-term gains, while the traditional lecture still seems to be more effective for improving students' short- and long-term knowledge retention.

  18. Pedagogical Content Knowledge and Educational Cases in Computer Science: an Exploration

    NARCIS (Netherlands)

    Koppelman, Hermannus

    2008-01-01

    The concept of pedagogical content knowledge has been explored in the context of several disciplines, such as mathematics, medicine and chemistry. In this paper the concept is explored and applied to the subject matter of computer science, in particular to the subdomain of building UML class

  19. Formal definition of coherency and computation of minimal cut sequences for binary dynamic and repairable systems

    International Nuclear Information System (INIS)

    Chaux, Pierre-Yves

    2013-01-01

    Preventive risk assessment of a complex system relies on dynamic models which describe the link between the system failure and the scenarios of failure and repair events of its components. The qualitative analysis of a binary dynamic and repairable system aims at computing and analysing the scenarios that lead to the system failure. Since such systems describe a large set of scenarios, only the most representative ones, called Minimal Cut Sequences (MCS), are of interest to the safety engineer. The lack of a formal definition for the MCS has generated multiple definitions, either specific to a given model (and thus not generic) or informal. This work proposes i) a formal framework and definition for the MCS while staying independent of the reliability model used, ii) a methodology to compute them using properties extracted from their formal definition, iii) an extension of the formal framework to multi-state components in order to perform the qualitative analysis of Boolean logic Driven Markov Processes (BDMP) models. Under the hypothesis that the scenarios implicitly described by any reliability model can always be represented by a finite automaton, this work defines coherency for dynamic and repairable systems as the way to give a minimal representation of all scenarios that lead to the system failure. (author)

  20. Constrained Total Generalized p-Variation Minimization for Few-View X-Ray Computed Tomography Image Reconstruction.

    Science.gov (United States)

    Zhang, Hanming; Wang, Linyuan; Yan, Bin; Li, Lei; Cai, Ailong; Hu, Guoen

    2016-01-01

    Total generalized variation (TGV)-based computed tomography (CT) image reconstruction, which utilizes high-order image derivatives, is superior to total variation-based methods in terms of the preservation of edge information and the suppression of unfavorable staircase effects. However, conventional TGV regularization employs an l1-based form, which is not the most direct way of maximizing the sparsity prior. In this study, we propose a total generalized p-variation (TGpV) regularization model to improve the sparsity exploitation of TGV and offer efficient solutions to few-view CT image reconstruction problems. To solve the nonconvex optimization problem of the TGpV minimization model, we then present an efficient iterative algorithm based on alternating minimization of the augmented Lagrangian function. All of the resulting subproblems decoupled by variable splitting admit explicit solutions obtained by applying the alternating minimization method and a generalized p-shrinkage mapping. In addition, approximate solutions that can be easily performed and quickly calculated through the fast Fourier transform are derived using the proximal point method to reduce the cost of the inner subproblems. Simulated and real data are qualitatively and quantitatively evaluated to validate the efficiency and feasibility of the proposed method. Overall, the proposed method exhibits reasonable performance and outperforms the original TGV-based method when applied to few-view problems.
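
    The abstract does not spell out the p-shrinkage operator; a common choice in the sparse-recovery literature (Chartrand's generalized p-shrinkage, assumed here purely for illustration) reduces to ordinary soft thresholding at p = 1:

```python
import numpy as np

def p_shrinkage(x, tau, p):
    """Generalized p-shrinkage (Chartrand's form, assumed for illustration).

    Applies max(|x| - tau**(2-p) * |x|**(p-1), 0) * sign(x) elementwise;
    at p = 1 this is exactly soft thresholding with threshold tau.
    """
    mag = np.abs(x)
    # |x|**(p-1) diverges at x == 0 for p < 1; the shrunk value there is 0 anyway
    with np.errstate(divide="ignore", invalid="ignore"):
        shrunk = np.maximum(mag - tau ** (2 - p) * mag ** (p - 1), 0.0)
    return np.nan_to_num(shrunk) * np.sign(x)

x = np.array([-2.0, -0.5, 0.0, 0.3, 1.5])
print(p_shrinkage(x, tau=0.4, p=1.0))   # ordinary soft thresholding
print(p_shrinkage(x, tau=0.4, p=0.5))   # sparser, non-convex shrinkage
```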

  1. A procedure to compute equilibrium concentrations in multicomponent systems by Gibbs energy minimization on spreadsheets

    International Nuclear Information System (INIS)

    Lima da Silva, Aline; Heck, Nestor Cesar

    2003-01-01

    Equilibrium concentrations are traditionally calculated with the help of equilibrium constant equations from selected reactions. This procedure, however, is only useful for simpler problems. Analysis of the equilibrium state in a multicomponent and multiphase system necessarily involves the solution of several simultaneous equations, and, as the number of system components grows, the required computation becomes more complex and tedious. A more direct and general method for solving the problem is the direct minimization of the Gibbs energy function. The solution of the nonlinear problem consists in minimizing the objective function (the Gibbs energy of the system) subject to the constraints of the elemental mass balance. To solve it, usually a computer code is developed, which requires considerable testing and debugging efforts. In this work, a simple method to predict equilibrium composition in multicomponent systems is presented, which makes use of an electronic spreadsheet. The ability to carry out these calculations within a spreadsheet environment shows several advantages. First, spreadsheets are available 'universally' on nearly all personal computers. Second, the input and output capabilities of spreadsheets can be effectively used to monitor calculated results. Third, no additional systems or programs need to be learned. In this way, spreadsheets are as suitable for computing equilibrium concentrations as for use as teaching and learning aids. This work describes, therefore, the use of the Solver tool, contained in the Microsoft Excel spreadsheet package, for computing equilibrium concentrations in a multicomponent system by the method of direct Gibbs energy minimization. The four-phase Fe-Cr-O-C-Ni system is used as an example to illustrate the proposed method. The pure stoichiometric phases considered in the equilibrium calculations are Cr2O3(s) and FeO·Cr2O3(s). The atmosphere consists of O2, CO and CO2 constituents. The liquid iron
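
    The same Solver-style calculation can be reproduced outside a spreadsheet. The sketch below (our illustration, not the paper's Excel workbook) minimizes the Gibbs energy of an ideal O2/CO/CO2 gas mixture subject to elemental carbon and oxygen balances; the standard Gibbs energies of formation are rough illustrative values near 1000 K, not data from the paper.

```python
import numpy as np
from scipy.optimize import minimize

R, T = 8.314, 1000.0                      # J/(mol K), K
# Approximate standard Gibbs energies of formation at 1000 K (J/mol);
# illustrative placeholder values, not taken from the paper.
g0 = np.array([0.0, -200.3e3, -395.8e3])  # O2, CO, CO2
# Element matrix: rows = elements (C, O), columns = species (O2, CO, CO2)
A = np.array([[0, 1, 1],                  # carbon balance
              [2, 1, 2]])                 # oxygen balance
b = np.array([1.0, 2.0])                  # feed: 1 mol C, 2 mol O (1 mol of CO2)

def gibbs(n):
    """Dimensionless Gibbs energy of an ideal-gas mixture at 1 atm."""
    n = np.maximum(n, 1e-12)              # keep the logarithms finite
    return np.sum(n * (g0 / (R * T) + np.log(n / n.sum())))

cons = {"type": "eq", "fun": lambda n: A @ n - b}
res = minimize(gibbs, x0=np.array([0.3, 0.3, 0.4]),
               bounds=[(1e-10, None)] * 3, constraints=cons, method="SLSQP")
print(dict(zip(["O2", "CO", "CO2"], res.x.round(4))))
```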

  2. The Use of Computer-Based Videogames in Knowledge Acquisition and Retention.

    Science.gov (United States)

    Ricci, Katrina E.

    1994-01-01

    Research conducted at the Naval Training Systems Center in Orlando, Florida, investigated the acquisition and retention of basic knowledge with subject matter presented in the forms of text, test, and game. Results are discussed in terms of the effectiveness of computer-based games for military training. (Author/AEF)

  3. Using medical knowledge sources on handheld computers--a qualitative study among junior doctors.

    Science.gov (United States)

    Axelson, Christian; Wårdh, Inger; Strender, Lars-Erik; Nilsson, Gunnar

    2007-09-01

    The emergence of mobile computing could have an impact on how junior doctors learn. To exploit this opportunity it is essential to understand their information seeking process. To explore junior doctors' experiences of using medical knowledge sources on handheld computers. Interviews with five Swedish junior doctors. A qualitative manifest content analysis of a focus group interview followed by a qualitative latent content analysis of two individual interviews. A focus group interview showed that users were satisfied with access to handheld medical knowledge sources, but there was concern about contents, reliability and device dependency. Four categories emerged from individual interviews: (1) A feeling of uncertainty about using handheld technology in medical care; (2) A sense of security that handhelds can provide; (3) A need for contents to be personalized; (4) A degree of adaptability to make the handheld a versatile information tool. A theme was established to link the four categories together, as expressed in the Conclusion section. Junior doctors' experiences of using medical knowledge sources on handheld computers shed light on the need to decrease uncertainty about clinical decisions during medical internship, and to find ways to influence the level of self-confidence in the junior doctor's process of decision-making.

  4. Learning to Teach Computer Science: Qualitative Insights into Secondary Teachers' Pedagogical Content Knowledge

    Science.gov (United States)

    Hubbard, Aleata Kimberly

    2017-01-01

    In this dissertation, I explored the pedagogical content knowledge of in-service high school educators recently assigned to teach computer science for the first time. Teachers were participating in a professional development program where they co-taught introductory computing classes with tech industry professionals. The study was motivated by…

  5. Minimal open strings

    International Nuclear Information System (INIS)

    Hosomichi, Kazuo

    2008-01-01

    We study FZZT-branes and open string amplitudes in (p, q) minimal string theory. We focus on the simplest boundary changing operators in two-matrix models, and identify the corresponding operators in worldsheet theory through the comparison of amplitudes. Along the way, we find a novel linear relation among FZZT boundary states in minimal string theory. We also show that the boundary ground ring is realized on physical open string operators in a very simple manner, and discuss its use for perturbative computation of higher open string amplitudes.

  6. Computer game-based and traditional learning method: a comparison regarding students’ knowledge retention

    Directory of Open Access Journals (Sweden)

    Rondon Silmara

    2013-02-01

    Full Text Available Abstract Background Educational computer games are examples of computer-assisted learning objects, representing an educational strategy of growing interest. Given the changes in the digital world over the last decades, students of the current generation expect technology to be used in advancing their learning, requiring a need to change traditional passive learning methodologies to an active multisensory experimental learning methodology. The objective of this study was to compare a computer game-based learning method with a traditional learning method, regarding learning gains and knowledge retention, as means of teaching head and neck Anatomy and Physiology to Speech-Language and Hearing pathology undergraduate students. Methods Students were randomized to one of the learning methods and the data analyst was blinded to which method each student had received. Students' prior knowledge (i.e. before undergoing the learning method), short-term knowledge retention and long-term knowledge retention (i.e. six months after undergoing the learning method) were assessed with a multiple choice questionnaire. Students' performance was compared across the three moments of assessment, both for the mean total score and for separate mean scores for Anatomy questions and for Physiology questions. Results Students who received the game-based method performed better in the post-test assessment only when considering the Anatomy questions section. Students who received the traditional lecture performed better in both the post-test and the long-term post-test when considering the Anatomy and Physiology questions. Conclusions The game-based learning method is comparable to the traditional learning method in general and in short-term gains, while the traditional lecture still seems to be more effective for improving students' short- and long-term knowledge retention.

  7. Analysis and minimization of overtraining effect in rule-based classifiers for computer-aided diagnosis

    International Nuclear Information System (INIS)

    Li Qiang; Doi Kunio

    2006-01-01

    Computer-aided diagnostic (CAD) schemes have been developed to assist radiologists in detecting various lesions in medical images. In CAD schemes, classifiers play a key role in achieving a high lesion detection rate and a low false-positive rate. Although many popular classifiers such as linear discriminant analysis and artificial neural networks have been employed in CAD schemes for reduction of false positives, a rule-based classifier has probably been the simplest and most frequently used one since the early days of development of various CAD schemes. However, existing rule-based classifiers have major disadvantages that significantly reduce their practicality and credibility. The disadvantages include manual design, poor reproducibility, poor evaluation methods such as resubstitution, and a large overtraining effect. An automated rule-based classifier with a minimized overtraining effect can overcome or significantly reduce the extent of the above-mentioned disadvantages. In this study, we developed an 'optimal' method for the selection of cutoff thresholds and a fully automated rule-based classifier. Experimental results performed with Monte Carlo simulation and a real lung nodule CT data set demonstrated that the automated threshold selection method can completely eliminate the overtraining effect in the procedure of cutoff threshold selection, and thus can minimize the overall overtraining effect in the constructed rule-based classifier. We believe that this threshold selection method is very useful in the construction of automated rule-based classifiers with a minimized overtraining effect.

  8. Knowledge-based and model-based hybrid methodology for comprehensive waste minimization in electroplating plants

    Science.gov (United States)

    Luo, Keqin

    1999-11-01

    The electroplating industry, with over 10,000 plating plants nationwide, is one of the major waste generators in industry. Large quantities of wastewater, spent solvents, spent process solutions, and sludge are the major wastes generated daily in plants, which cost the industry tremendously in waste treatment and disposal and hinder the further development of the industry. There is therefore an urgent need for the industry to identify the technically most effective and economically most attractive methodologies and technologies to minimize waste, while production competitiveness is still maintained. This dissertation aims at developing a novel WM methodology using artificial intelligence, fuzzy logic, and fundamental knowledge of chemical engineering, together with an intelligent decision support tool. The WM methodology consists of two parts: a heuristic knowledge-based qualitative WM decision analysis and support methodology, and a fundamental knowledge-based quantitative process analysis methodology for waste reduction. In the former, a large number of WM strategies are represented as fuzzy rules. This becomes the main part of the knowledge base in the decision support tool, WMEP-Advisor. In the latter, various first-principles-based process dynamic models are developed. These models can characterize all three major types of operations in an electroplating plant, i.e., cleaning, rinsing, and plating. This development allows us to perform a thorough process analysis of bath efficiency, chemical consumption, wastewater generation, sludge generation, etc. Additional models are developed for quantifying drag-out and evaporation, which are critical for waste reduction. The models are validated through numerous industrial experiments in a typical plating line of an industrial partner. The unique contribution of this research is that it is the first time for the electroplating industry to (i) use systematically available WM strategies, (ii) know quantitatively and

  9. Promoting Elementary Students' Epistemology of Science through Computer-Supported Knowledge-Building Discourse and Epistemic Reflection

    Science.gov (United States)

    Lin, Feng; Chan, Carol K. K.

    2018-01-01

    This study examined the role of computer-supported knowledge-building discourse and epistemic reflection in promoting elementary-school students' scientific epistemology and science learning. The participants were 39 Grade 5 students who were collectively pursuing ideas and inquiry for knowledge advance using Knowledge Forum (KF) while studying a…

  10. PERAN SIKAP DALAM MEMEDIASI PENGARUH PENGETAHUAN TERHADAP PERILAKU MINIMISASI SAMPAH PADA MASYARAKAT TERBAN, YOGYAKARTA (The Role of Attitude to Mediate The Effect of Knowledge on People’s Waste Minimization Behaviour in Terban, Yogyakarta

    Directory of Open Access Journals (Sweden)

    Hanif Akhtar

    2015-01-01

    behaviour. Attitude toward behaviour plays a significant role in behavioural change. This research focuses on one kind of pro-environmental behaviour, namely waste minimisation behaviour. The purpose of this research is to find out the relationship between waste minimization knowledge, attitude, and behaviour. This research was conducted in Kelurahan Terban, RW 02 and RW 11, Yogyakarta, from January to February 2014. A total of 105 subjects participated. Data were collected using three scales, namely: a waste minimization behaviour scale, a waste minimization attitude scale, and a waste minimization knowledge scale. Data were analysed using regression analysis with a path analysis model. The Sobel test was used to estimate the mediation effect. Indirect effect analysis showed that the indirect effect coefficient was 0.742, with z = 3.42 and p < 0.01. This showed that there was an indirect effect of waste minimization knowledge on waste minimization behaviour through waste minimization attitude. Thus, we can conclude that waste minimization attitude mediates the relationship between waste minimization knowledge and waste minimization behaviour.

  11. Studying Computer Science in a Multidisciplinary Degree Programme: Freshman Students' Orientation, Knowledge, and Background

    Science.gov (United States)

    Kautz, Karlheinz; Kofoed, Uffe

    2004-01-01

    Teachers at universities are facing an increasing disparity in students' prior IT knowledge and, at the same time, experience a growing disengagement of the students with regard to involvement in study activities. As computer science teachers in a joint programme in computer science and business administration, we made a number of similar…

  12. Minimal families of curves on surfaces

    KAUST Repository

    Lubbes, Niels

    2014-11-01

    A minimal family of curves on an embedded surface is defined as a 1-dimensional family of rational curves of minimal degree, which cover the surface. We classify such minimal families using constructive methods. This allows us to compute the minimal families of a given surface. The classification of minimal families of curves can be reduced to the classification of minimal families which cover weak Del Pezzo surfaces. We classify the minimal families of weak Del Pezzo surfaces and present a table with the number of minimal families of each weak Del Pezzo surface up to Weyl equivalence. As an application of this classification we generalize some results of Schicho. We classify algebraic surfaces that carry a family of conics. We determine the minimal lexicographic degree for the parametrization of a surface that carries at least 2 minimal families. © 2014 Elsevier B.V.

  13. Analysis of the knowledge and opinions of students and qualified dentists regarding the use of computers.

    Science.gov (United States)

    Castelló Castañeda, Coral; Ríos Santos, Jose Vicente; Bullón, Pedro

    2008-01-01

    Dentists are currently required to make multiple diagnoses and treatment decisions every day, and the information necessary to achieve this satisfactorily doubles in volume every five years. Knowledge therefore rapidly becomes out of date, so that it is often impossible to remember established information and assimilate new concepts. This may result in a significant lack of knowledge in the future, which would jeopardize the success of treatments. To remedy and prevent this situation, we nowadays have access to modern computing systems with extensive databases, which help us retain the information necessary for daily practice and access it instantaneously. The objectives of this study are therefore to determine how widespread the use of computing is in this environment and to determine the opinion of students and qualified dentists as regards its use in Dentistry. 90 people were chosen to take part in the study, divided into the following groups: students, newly qualified dentists, and experts. It was demonstrated that a high percentage (93.30%) use a computer, but that their level of computing knowledge is predominantly moderate. The place where a computer is used most is the home, which suggests that the majority own a computer. Analysis of the results obtained for the evaluation of computers in teaching showed that the participants thought computers saved a great deal of time, had great potential for projecting an image (in terms of marketing), and were a very innovative and stimulating tool.

  14. Sequential computation of elementary modes and minimal cut sets in genome-scale metabolic networks using alternate integer linear programming.

    Science.gov (United States)

    Song, Hyun-Seob; Goldberg, Noam; Mahajan, Ashutosh; Ramkrishna, Doraiswami

    2017-08-01

    Elementary (flux) modes (EMs) have served as a valuable tool for investigating structural and functional properties of metabolic networks. Identification of the full set of EMs in genome-scale networks remains challenging due to the combinatorial explosion of EMs in complex networks. Often, however, only a small subset of relevant EMs needs to be known, for which optimization-based sequential computation is a useful alternative. Most of the currently available methods along this line are based on the iterative use of mixed integer linear programming (MILP), the effectiveness of which significantly deteriorates as the number of iterations builds up. To alleviate the computational burden associated with the MILP implementation, we here present a novel optimization algorithm termed alternate integer linear programming (AILP). Our algorithm was designed to iteratively solve a pair of integer programming (IP) and linear programming (LP) problems to compute EMs in a sequential manner. In each step, the IP identifies a minimal subset of reactions, the deletion of which disables all previously identified EMs. Thus, a subsequent LP solution subject to this reaction deletion constraint becomes a distinct EM. In cases where no feasible LP solution is available, IP-derived reaction deletion sets represent minimal cut sets (MCSs). Despite the additional computation of MCSs, AILP achieved significant time reduction in computing EMs by orders of magnitude. The proposed AILP algorithm not only offers a computational advantage in the EM analysis of genome-scale networks, but also improves the understanding of the linkage between EMs and MCSs. The software is implemented in Matlab, and is provided as supplementary information. hyunseob.song@pnnl.gov. Supplementary data are available at Bioinformatics online. Published by Oxford University Press 2017. This work is written by US Government employees and is in the public domain in the US.

  15. Perturbative computation of string one-loop corrections to Wilson loop minimal surfaces in AdS₅×S⁵

    Energy Technology Data Exchange (ETDEWEB)

    Forini, V. [Institut für Physik, Humboldt-Universität zu Berlin, IRIS Adlershof,Zum Großen Windkanal 6, 12489 Berlin (Germany); Tseytlin, A.A. [Theoretical Physics Group, Blackett Laboratory, Imperial College,London, SW7 2AZ (United Kingdom); Vescovi, E. [Institut für Physik, Humboldt-Universität zu Berlin, IRIS Adlershof,Zum Großen Windkanal 6, 12489 Berlin (Germany); Institute of Physics, University of São Paulo,Rua do Matão 1371, 05508-090 São Paulo (Brazil)

    2017-03-01

    We revisit the computation of the 1-loop string correction to the “latitude” minimal surface in AdS₅×S⁵ representing the 1/4 BPS Wilson loop in planar N=4 SYM theory, previously addressed in https://arxiv.org/abs/1512.00841 and https://arxiv.org/abs/1601.04708. We resolve the problem of matching with the subleading term in the strong coupling expansion of the exact gauge theory result (derived previously from localization) using a different method to compute determinants of 2d string fluctuation operators. We apply perturbation theory in a small parameter (the angle of the latitude) corresponding to an expansion near the AdS₂ minimal surface representing the 1/2 BPS circular Wilson loop. This allows us to compute the corrections to the heat kernels and zeta-functions of the operators in terms of the known heat kernels on AdS₂. We apply the same method also to two other examples of Wilson loop surfaces: the generalized cusp and the k-wound circle.

  16. Computed Tomography Fractional Flow Reserve Can Identify Culprit Lesions in Aortoiliac Occlusive Disease Using Minimally Invasive Techniques.

    Science.gov (United States)

    Ward, Erin P; Shiavazzi, Daniele; Sood, Divya; Marsden, Allison; Lane, John; Owens, Erik; Barleben, Andrew

    2017-01-01

    Currently, the gold standard diagnostic examination for significant aortoiliac lesions is angiography. Fractional flow reserve (FFR) has a growing body of literature in coronary artery disease as a minimally invasive diagnostic procedure. Improvements in numerical hemodynamics have allowed for an accurate and minimally invasive approach to estimating FFR utilizing cross-sectional imaging. We aim to demonstrate a similar approach to aortoiliac occlusive disease (AIOD). A retrospective review evaluated 7 patients with claudication and cross-sectional imaging showing AIOD. FFR was subsequently measured during conventional angiography with pull-back pressures in a retrograde fashion. To estimate computed tomography (CT) FFR, CT angiography (CTA) image data were analyzed using the SimVascular software suite to create a computational fluid dynamics model of the aortoiliac system. Inlet flow conditions were derived based on cardiac output, while 3-element Windkessel outlet boundary conditions were optimized to match the expected systolic and diastolic pressures, with outlet resistance distributed based on Murray's law. The data were evaluated with a Student's t-test and a receiver operating characteristic curve. All patients had evidence of AIOD on CT, and FFR was successfully measured during angiography. The modeled data showed high sensitivity and specificity, with no significant difference between the measured and CT FFR (P = 0.986, area under the curve = 1). The average difference between the measured and calculated FFRs was 0.136, with a range from 0.03 to 0.30. CT FFR successfully identified aortoiliac lesions with significant pressure drops that were identified with angiographically measured FFR. CT FFR has the potential to provide a minimally invasive approach to identify flow-limiting stenoses in AIOD. Copyright © 2016 Elsevier Inc. All rights reserved.
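
    For context, a 3-element Windkessel outlet couples a proximal resistance, a compliance, and a distal resistance. The sketch below (a toy integration, not the SimVascular implementation, with purely illustrative parameter values) advances the distal-pressure ODE for a pulsatile inflow and reports the resulting outlet pressure range.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative 3-element Windkessel parameters (CGS-like units, not patient data)
Rp, C, Rd = 100.0, 1e-4, 1500.0           # proximal R, compliance, distal R
T_card = 0.8                              # cardiac period (s)

def inflow(t):
    # Simple pulsatile inflow: half-sine during systole, zero in diastole
    tc = t % T_card
    return 80.0 * np.sin(np.pi * tc / 0.3) if tc < 0.3 else 0.0

def wk3(t, y):
    # Distal-pressure ODE: C * dPd/dt = Q(t) - Pd / Rd
    (pd,) = y
    return [(inflow(t) - pd / Rd) / C]

sol = solve_ivp(wk3, [0, 5 * T_card], [8e4], max_step=1e-3)
# Outlet pressure: P(t) = Q(t) * Rp + Pd(t)
p_outlet = np.array([inflow(t) for t in sol.t]) * Rp + sol.y[0]
print(f"outlet pressure range: {p_outlet.min():.0f} .. {p_outlet.max():.0f}")
```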

  17. Balancing related methods for minimal realization of periodic systems

    OpenAIRE

    Varga, A.

    1999-01-01

    We propose balancing related numerically reliable methods to compute minimal realizations of linear periodic systems with time-varying dimensions. The first method belongs to the family of square-root methods with guaranteed enhanced computational accuracy and can be used to compute balanced minimal order realizations. An alternative balancing-free square-root method has the advantage of a potentially better numerical accuracy in case of poorly scaled original systems. The key numerical co...

  18. Role of computer techniques for knowledge propagation about nuclear energetics safety

    International Nuclear Information System (INIS)

    Osachkin, V.S.

    1996-01-01

The development of nuclear power engineering depends on the achieved levels of nuclear, radiological and ecological safety. To ensure the approval of such levels by the community, knowledge on the safety of nuclear engineering must be spread in understandable forms. New computer technologies may play an important role in the safety education of the public and in upgrading the qualification of personnel. The progress in computer network development makes it possible to use, besides e-mail and BBS, the Internet for remote education. As an example, a computer course on Atomic Energy and its Safety is presented. This course, currently written in Russian, consists of 6 parts, namely: physical basis of the utilization of nuclear energy; technical bases of the uses of nuclear energy; nuclear reactors and their systems; safety principles, goals and nuclear safety regulation; the environmental impact of the use of nuclear power; and severe accident consequences and scenarios

  19. Minimally invasive surgical treatment of Bertolotti's Syndrome: case report.

    Science.gov (United States)

    Ugokwe, Kene T; Chen, Tsu-Lee; Klineberg, Eric; Steinmetz, Michael P

    2008-05-01

    This article aims to provide more insight into the presentation, diagnosis, and treatment of Bertolotti's syndrome, which is a rare spinal disorder that is very difficult to recognize and diagnose correctly. The syndrome was first described by Bertolotti in 1917 and affects approximately 4 to 8% of the population. It is characterized by an enlarged transverse process at the most caudal lumbar vertebra with a pseudoarticulation of the transverse process and the sacral ala. It tends to present with low back pain and may be confused with facet and sacroiliac joint disease. In this case report, we describe a 40-year-old man who presented with low back pain and was eventually diagnosed with Bertolotti's syndrome. The correct diagnosis was made based on imaging studies which included computed tomographic scans, plain x-rays, and magnetic resonance imaging scans. The patient experienced temporary relief when the abnormal pseudoarticulation was injected with a cocktail consisting of lidocaine and steroids. In order to minimize the trauma associated with surgical treatment, a minimally invasive approach was chosen to resect the anomalous transverse process with the accompanying pseudoarticulation. The patient did well postoperatively and had 97% resolution of his pain at 6 months after surgery. As with conventional surgical approaches, a complete knowledge of anatomy is required for minimally invasive spine surgery. This case is an example of the expanding utility of minimally invasive approaches in treating spinal disorders.

  20. USE OF ONTOLOGIES FOR KNOWLEDGE BASES CREATION TUTORING COMPUTER SYSTEMS

    OpenAIRE

    Cheremisina Lyubov

    2014-01-01

This paper deals with the use of ontologies in the design and development of intelligent tutoring systems. We consider the shortcomings of educational software and distance learning systems and the advantages of using ontologies in their design. The relevance stems from the need to create educational computer systems based on systematic knowledge. We consider the classification, properties, uses and benefits of ontologies. Approaches to the problem of ontology mapping are characterized, the first of which – manual mapping, the s...

  1. Computer vision syndrome: a study of the knowledge, attitudes and practices in Indian ophthalmologists.

    Science.gov (United States)

    Bali, Jatinder; Navin, Neeraj; Thakur, Bali Renu

    2007-01-01

To study the knowledge, attitude and practices (KAP) towards computer vision syndrome prevalent in Indian ophthalmologists and to assess whether 'computer use by practitioners' had any bearing on the knowledge and practices in computer vision syndrome (CVS). A random KAP survey was carried out on 300 Indian ophthalmologists using a 34-point spot-questionnaire in January 2005. All the doctors who responded were aware of CVS. The chief presenting symptoms were eyestrain (97.8%), headache (82.1%), tiredness and burning sensation (79.1%), watering (66.4%) and redness (61.2%). Ophthalmologists using computers reported that focusing from distance to near and vice versa (P = 0.006, chi2 test), blurred vision at a distance (P = 0.016, chi2 test) and blepharospasm (P = 0.026, chi2 test) formed part of the syndrome. The main mode of treatment used was tear substitutes. Half of ophthalmologists (50.7%) were not prescribing any spectacles. They did not have any preference for any special type of glasses (68.7%) or spectral filters. Computer-users were more likely to prescribe sedatives/anxiolytics (P = 0.04, chi2 test), spectacles (P = 0.02, chi2 test) and conscious frequent blinking (P = 0.003, chi2 test) than the non-computer-users. All respondents were aware of CVS. Confusion regarding treatment guidelines was observed in both groups. Computer-using ophthalmologists were more informed of symptoms and diagnostic signs but were misinformed about treatment modalities.

  2. Minimal DBM Substraction

    DEFF Research Database (Denmark)

    David, Alexandre; Håkansson, John; G. Larsen, Kim

In this paper we present an algorithm to compute DBM subtractions with a guaranteed minimal number of splits and disjoint DBMs to avoid any redundancy. The subtraction is one of the few operations that result in a non-convex zone, and thus, requires splitting. It is of prime importance to reduce...
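    As a baseline for what "splitting" means here (a textbook scheme, not the paper's split-minimizing algorithm): negate the subtrahend's constraints one at a time while asserting the ones already processed, which yields pairwise disjoint zones. The sketch below uses non-strict integer bounds only; real DBMs also track strictness bits:

```python
import numpy as np

INF = 10**9  # stand-in for "no bound"

def canonical(m):
    """Tighten all bounds via Floyd-Warshall shortest paths."""
    m = m.copy()
    n = len(m)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                m[i, j] = min(m[i, j], m[i, k] + m[k, j])
    return m

def is_empty(m):
    return any(np.diag(canonical(m)) < 0)

def subtract(z, zp):
    """Disjoint DBMs covering z minus zp; entry (i, j) bounds x_i - x_j,
    with index 0 the reference clock. Disjointness holds because each
    emitted piece violates one constraint of zp and satisfies all the
    constraints processed before it."""
    pieces, current = [], canonical(z)
    n = len(z)
    for i in range(n):
        for j in range(n):
            if i == j or zp[i, j] >= INF:
                continue                          # no real constraint here
            piece = current.copy()
            # negate  x_i - x_j <= b  into  x_j - x_i <= -b - 1
            piece[j, i] = min(piece[j, i], -zp[i, j] - 1)
            if not is_empty(piece):
                pieces.append(canonical(piece))
            current[i, j] = min(current[i, j], zp[i, j])  # assert constraint
    return pieces

z = np.array([[0, 0], [5, 0]])     # one clock x: 0 <= x <= 5
zp = np.array([[0, -2], [3, 0]])   # 2 <= x <= 3
print(subtract(z, zp))             # two pieces: 0 <= x <= 1 and 4 <= x <= 5
```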

  3. Computer Simulations of Developmental Change: The Contributions of Working Memory Capacity and Long-Term Knowledge

    Science.gov (United States)

    Jones, Gary; Gobet, Fernand; Pine, Julian M.

    2008-01-01

    Increasing working memory (WM) capacity is often cited as a major influence on children's development and yet WM capacity is difficult to examine independently of long-term knowledge. A computational model of children's nonword repetition (NWR) performance is presented that independently manipulates long-term knowledge and WM capacity to determine…

  4. The Variation Theorem Applied to H-2+: A Simple Quantum Chemistry Computer Project

    Science.gov (United States)

    Robiette, Alan G.

    1975-01-01

    Describes a student project which requires limited knowledge of Fortran and only minimal computing resources. The results illustrate such important principles of quantum mechanics as the variation theorem and the virial theorem. Presents sample calculations and the subprogram for energy calculations. (GS)
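    In the same spirit, the project's energy subprogram can be reproduced today with minimal resources in a few lines of Python; the sketch below uses the textbook LCAO integrals for H2+ in atomic units (standard results, not the article's Fortran listing):

```python
import numpy as np

def e_bonding(R):
    """Ground-state LCAO energy of H2+ (atomic units, 1s basis, zeta = 1)."""
    S = np.exp(-R) * (1 + R + R**2 / 3)        # overlap integral
    J = 1/R - np.exp(-2*R) * (1 + 1/R)         # <A| 1/r_B |A>
    K = np.exp(-R) * (1 + R)                   # <A| 1/r_A |B>
    H11 = -0.5 - J + 1/R                       # includes nuclear repulsion 1/R
    H12 = -0.5*S - K + S/R
    return (H11 + H12) / (1 + S)               # variational upper bound

R = np.linspace(1.0, 8.0, 1401)
E = e_bonding(R)
i = E.argmin()
print(f"R_e = {R[i]:.2f} bohr, E_min = {E[i]:.4f} hartree")
# expect roughly R_e = 2.49 bohr and E_min = -0.565 hartree; the variation
# theorem guarantees this lies above the exact value of about -0.603 hartree
```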

  5. IMPORTANCE, Minimal Cut Sets and System Availability from Fault Tree Analysis

    International Nuclear Information System (INIS)

    Lambert, H. W.

    1987-01-01

    1 - Description of problem or function: IMPORTANCE computes various measures of probabilistic importance of basic events and minimal cut sets to a fault tree or reliability network diagram. The minimal cut sets, the failure rates and the fault duration times (i.e., the repair times) of all basic events contained in the minimal cut sets are supplied as input data. The failure and repair distributions are assumed to be exponential. IMPORTANCE, a quantitative evaluation code, then determines the probability of the top event and computes the importance of minimal cut sets and basic events by a numerical ranking. Two measures are computed. The first describes system behavior at one point in time; the second describes sequences of failures that cause the system to fail in time. All measures are computed assuming statistical independence of basic events. In addition, system unavailability and expected number of system failures are computed by the code. 2 - Method of solution: Seven measures of basic event importance and two measures of cut set importance can be computed. Birnbaum's measure of importance (i.e., the partial derivative) and the probability of the top event are computed using the min cut upper bound. If there are no replicated events in the minimal cut sets, then the min cut upper bound is exact. If basic events are replicated in the minimal cut sets, then based on experience the min cut upper bound is accurate if the probability of the top event is less than 0.1. Simpson's rule is used in computing the time-integrated measures of importance. Newton's method for approximating the roots of an equation is employed in the options where the importance measures are computed as a function of the probability of the top event, and a shell sort puts the output in descending order of importance
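    The two central computations named here, the min cut upper bound for the top-event probability and Birnbaum's measure as a partial derivative, are compact enough to sketch (standard formulas, not the IMPORTANCE code itself):

```python
from math import prod

def cut_prob(cut, q):
    """Probability of a minimal cut set under independent basic events."""
    return prod(q[e] for e in cut)

def top_prob(cut_sets, q):
    """Min cut upper bound: P(top) <= 1 - prod_k (1 - P(C_k)).
    Exact when no basic event is replicated across cut sets."""
    return 1.0 - prod(1.0 - cut_prob(c, q) for c in cut_sets)

def birnbaum(cut_sets, q, event):
    """Birnbaum importance: P(top | e failed) - P(top | e working),
    which equals dP(top)/dq_e for the exact multilinear top-event function."""
    return (top_prob(cut_sets, {**q, event: 1.0})
            - top_prob(cut_sets, {**q, event: 0.0}))

# hypothetical example: two minimal cut sets {A, B} and {C}
cuts = [{"A", "B"}, {"C"}]
q = {"A": 1e-2, "B": 5e-3, "C": 1e-4}
print(top_prob(cuts, q))                          # about 1.5e-4
print({e: birnbaum(cuts, q, e) for e in q})       # C dominates the ranking
```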

  6. The minimal manual: is less really more?

    NARCIS (Netherlands)

    Lazonder, Adrianus W.; van der Meij, Hans

    1993-01-01

    Carroll, Smith-Kerker, Ford and Mazur-Rimetz (The minimal manual, Human-Computer Interaction , 3, 123-153, 1987) have introduced the minimal manual as an alternative to standard self-instruction manuals. While their research indicates strong gains, only a few attempts have been made to validate

  7. E-pharmacovigilance: development and implementation of a computable knowledge base to identify adverse drug reactions.

    Science.gov (United States)

    Neubert, Antje; Dormann, Harald; Prokosch, Hans-Ulrich; Bürkle, Thomas; Rascher, Wolfgang; Sojer, Reinhold; Brune, Kay; Criegee-Rieck, Manfred

    2013-09-01

    Computer-assisted signal generation is an important issue for the prevention of adverse drug reactions (ADRs). However, due to poor standardization of patients' medical data and a lack of computable medical drug knowledge the specificity of computerized decision support systems for early ADR detection is too low and thus those systems are not yet implemented in daily clinical practice. We report on a method to formalize knowledge about ADRs based on the Summary of Product Characteristics (SmPCs) and linking them with structured patient data to generate safety signals automatically and with high sensitivity and specificity. A computable ADR knowledge base (ADR-KB) that inherently contains standardized concepts for ADRs (WHO-ART), drugs (ATC) and laboratory test results (LOINC) was built. The system was evaluated in study populations of paediatric and internal medicine inpatients. A total of 262 different ADR concepts related to laboratory findings were linked to 212 LOINC terms. The ADR knowledge base was retrospectively applied to a study population of 970 admissions (474 internal and 496 paediatric patients), who underwent intensive ADR surveillance. The specificity increased from 7% without ADR-KB up to 73% in internal patients and from 19.6% up to 91% in paediatric inpatients, respectively. This study shows that contextual linkage of patients' medication data with laboratory test results is a useful and reasonable instrument for computer-assisted ADR detection and a valuable step towards a systematic drug safety process. The system enables automated detection of ADRs during clinical practice with a quality close to intensive chart review. © 2013 The Authors. British Journal of Clinical Pharmacology © 2013 The British Pharmacological Society.
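    A toy version of the linkage described, standardized ADR concepts tied to coded lab tests and drug classes with trigger conditions, might look as follows; the concept, the threshold, and the specific LOINC/ATC codes are our illustrative assumptions, not entries from the actual ADR-KB:

```python
# Each entry links an ADR concept to a coded lab test (LOINC), exposed drug
# classes (ATC prefixes) and a trigger predicate on the lab value.
ADR_KB = [
    {"adr": "hepatocellular damage",          # WHO-ART-style concept (illustrative)
     "loinc": "1742-6",                       # assumed here to denote serum ALT
     "atc_prefixes": ["J01"],                 # assumed: systemic antibacterials
     "trigger": lambda value, uln: value > 3 * uln},
]

def detect_signals(medications_atc, lab_results, uln_by_loinc):
    """Flag ADR signals: exposure to a linked drug class plus a lab value
    satisfying the entry's trigger condition."""
    signals = []
    for entry in ADR_KB:
        exposed = any(code.startswith(p)
                      for code in medications_atc for p in entry["atc_prefixes"])
        for loinc, value in lab_results:
            if (exposed and loinc == entry["loinc"]
                    and entry["trigger"](value, uln_by_loinc[loinc])):
                signals.append((entry["adr"], loinc, value))
    return signals

print(detect_signals(["J01CA04"],            # one dispensed drug, coded in ATC
                     [("1742-6", 180.0)],    # lab result (code, value in U/L)
                     {"1742-6": 50.0}))      # upper limit of normal per test
```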

  8. Computer vision syndrome: A study of the knowledge, attitudes and practices in Indian Ophthalmologists

    Directory of Open Access Journals (Sweden)

    Bali Jatinder

    2007-01-01

Full Text Available Purpose: To study the knowledge, attitude and practices (KAP) towards computer vision syndrome prevalent in Indian ophthalmologists and to assess whether 'computer use by practitioners' had any bearing on the knowledge and practices in computer vision syndrome (CVS). Materials and Methods: A random KAP survey was carried out on 300 Indian ophthalmologists using a 34-point spot-questionnaire in January 2005. Results: All the doctors who responded were aware of CVS. The chief presenting symptoms were eyestrain (97.8%), headache (82.1%), tiredness and burning sensation (79.1%), watering (66.4%) and redness (61.2%). Ophthalmologists using computers reported that focusing from distance to near and vice versa (P = 0.006, χ2 test), blurred vision at a distance (P = 0.016, χ2 test) and blepharospasm (P = 0.026, χ2 test) formed part of the syndrome. The main mode of treatment used was tear substitutes. Half of ophthalmologists (50.7%) were not prescribing any spectacles. They did not have any preference for any special type of glasses (68.7%) or spectral filters. Computer-users were more likely to prescribe sedatives/anxiolytics (P = 0.04, χ2 test), spectacles (P = 0.02, χ2 test) and conscious frequent blinking (P = 0.003, χ2 test) than the non-computer-users. Conclusions: All respondents were aware of CVS. Confusion regarding treatment guidelines was observed in both groups. Computer-using ophthalmologists were more informed of symptoms and diagnostic signs but were misinformed about treatment modalities.

  9. A Computer Knowledge Database of accidents at work in the construction industry

    Science.gov (United States)

    Hoła, B.; Szóstak, M.

    2017-10-01

At least 60,000 fatal accidents at work occur on building sites all over the world each year, which means that on average, every 10 minutes an employee dies during the execution of work. In 2015 on Polish building sites, 5,776 accidents at work happened, of which 69 resulted in the death of an employee. Accidents are an enormous social and economic burden for companies, communities and countries. The vast majority of accidents at work can be prevented by appropriate and effective preventive measures. Therefore, the Computer Knowledge Database (CKD) was formulated; it enables data and information on accidents at work in the construction industry to be collected and processed in order to obtain the necessary knowledge. The knowledge gained will form the basis for conclusions of a preventive nature.

  10. Teaching Surgical Hysteroscopy with a Computer

    Science.gov (United States)

    Lefebvre; Cote; Lefebvre

    1996-08-01

Using a hysteroscope can be simulated on a computer. It will improve physician training by measuring basic knowledge and abilities, allow different interventions and anatomic variations, minimize the trauma of surgical intervention, and reduce operative casualties. An integrated questionnaire covers instrumentation, fluid infusion, power source, indications and preparation for endometrial ablation, surgical techniques, and complications to evaluate the user's knowledge. The operation simulation then proceeds. In the endometrial cavity, by virtual simulation, the operating field should appear in real time to allow physicians to adapt the trajectory of the instruments. The computer is an IBM PC compatible. We use a modified joystick with optical encoders to track the instrument position. The simulation can be repeated as desired. An evaluation system is integrated in the software to keep the user informed on the amount of burn area(s) that have been completed. This prototype model is available.

  11. The Effect of Computer Simulations on Acquisition of Knowledge and Cognitive Load: A Gender Perspective

    Science.gov (United States)

    Kaheru, Sam J.; Kriek, Jeanne

    2016-01-01

    A study on the effect of the use of computer simulations (CS) on the acquisition of knowledge and cognitive load was undertaken with 104 Grade 11 learners in four schools in rural South Africa on the physics topic geometrical optics. Owing to the lack of resources a teacher-centred approach was followed in the use of computer simulations. The…

  12. Integrating design and production planning with knowledge-based inspection planning system

    International Nuclear Information System (INIS)

    Abbasi, Ghaleb Y.; Ketan, Hussein S.; Adil, Mazen B.

    2005-01-01

In this paper an intelligent environment to integrate design and inspection planning early in the design stage is presented. A hybrid knowledge-based approach integrating computer-aided design (CAD) and computer-aided inspection planning (CAIP) was developed, hereafter called computer-aided design and inspection planning (CADIP). CADIP was adopted for automated dimensional inspection planning. Critical functional features were screened, based on certain attributes of part features, for the inspection planning application. Testing the model resulted in a minimized number of probing vectors associated with the most important features of the inspected prismatic part, a significant reduction in inspection costs, and a release of human labor. In totality, this tends to increase customer satisfaction as the final goal of the developed system. (author)

  13. Harm minimization among teenage drinkers

    DEFF Research Database (Denmark)

    Jørgensen, Morten Hulvej; Curtis, Tine; Christensen, Pia Haudrup

    2007-01-01

    AIM: To examine strategies of harm minimization employed by teenage drinkers. DESIGN, SETTING AND PARTICIPANTS: Two periods of ethnographic fieldwork were conducted in a rural Danish community of approximately 2000 inhabitants. The fieldwork included 50 days of participant observation among 13....... In regulating the social context of drinking they relied on their personal experiences more than on formalized knowledge about alcohol and harm, which they had learned from prevention campaigns and educational programmes. CONCLUSIONS: In this study we found that teenagers may help each other to minimize alcohol...

  14. Minimal Marking: A Success Story

    Directory of Open Access Journals (Sweden)

    Anne McNeilly

    2014-11-01

    Full Text Available The minimal-marking project conducted in Ryerson’s School of Journalism throughout 2012 and early 2013 resulted in significantly higher grammar scores in two first-year classes of minimally marked university students when compared to two traditionally marked classes. The “minimal-marking” concept (Haswell, 1983, which requires dramatically more student engagement, resulted in more successful learning outcomes for surface-level knowledge acquisition than the more traditional approach of “teacher-corrects-all.” Results suggest it would be effective, not just for grammar, punctuation, and word usage, the objective here, but for any material that requires rote-memory learning, such as the Associated Press or Canadian Press style rules used by news publications across North America.

  15. Prospective evaluation of an internet-linked handheld computer critical care knowledge access system.

    Science.gov (United States)

    Lapinsky, Stephen E; Wax, Randy; Showalter, Randy; Martinez-Motta, J Carlos; Hallett, David; Mehta, Sangeeta; Burry, Lisa; Stewart, Thomas E

    2004-12-01

Critical care physicians may benefit from immediate access to medical reference material. We evaluated the feasibility and potential benefits of a handheld computer based knowledge access system linking a central academic intensive care unit (ICU) to multiple community-based ICUs. Four community hospital ICUs with 17 physicians participated in this prospective interventional study. Following training in the use of an internet-linked, updateable handheld computer knowledge access system, the physicians used the handheld devices in their clinical environment for a 12-month intervention period. Feasibility of the system was evaluated by tracking use of the handheld computer and by conducting surveys and focus group discussions. Before and after the intervention period, participants underwent simulated patient care scenarios designed to evaluate the information sources they accessed, as well as the speed and quality of their decision making. Participants generated admission orders during each scenario, which were scored by blinded evaluators. Ten physicians (59%) used the system regularly, predominantly for nonmedical applications (median 32.8/month, interquartile range [IQR] 28.3-126.8), with medical software accessed less often (median 9/month, IQR 3.7-13.7). Eight out of 13 physicians (62%) who completed the final scenarios chose to use the handheld computer for information access. The median time to access information on the handheld computer was 19 s (IQR 15-40 s). This group exhibited a significant improvement in admission order score as compared with those who used other resources (P = 0.018). Benefits and barriers to use of this technology were identified. An updateable handheld computer system is feasible as a means of point-of-care access to medical reference material and may improve clinical decision making. However, during the study, acceptance of the system was variable. Improved training and new technology may overcome some of the barriers we

  16. On the Necessary and Sufficient Assumptions for UC Computation

    DEFF Research Database (Denmark)

    Damgård, Ivan Bjerre; Nielsen, Jesper Buus; Orlandi, Claudio

    2010-01-01

    for all of them. Perhaps most interestingly we show that: •  For even the minimal meaningful KRA, where we only assume that the secret key is a value which is hard to compute from the public key, one can UC securely compute any poly-time functionality if there exists a passive secure oblivious...... that in the KRA model one-way functions are sufficient for UC commitment and UC zero-knowledge. These are the first examples of UC secure protocols for non-trivial tasks which do not assume the existence of public-key primitives. In particular, the protocols show that non-trivial UC computation is possible...

  17. The Impact of Learner's Prior Knowledge on Their Use of Chemistry Computer Simulations: A Case Study

    Science.gov (United States)

    Liu, Han-Chin; Andre, Thomas; Greenbowe, Thomas

    2008-01-01

    It is complicated to design a computer simulation that adapts to students with different characteristics. This study documented cases that show how college students' prior chemistry knowledge level affected their interaction with peers and their approach to solving problems with the use of computer simulations that were designed to learn…

  18. Use of declarative statements in creating and maintaining computer-interpretable knowledge bases for guideline-based care.

    Science.gov (United States)

    Tu, Samson W; Hrabak, Karen M; Campbell, James R; Glasgow, Julie; Nyman, Mark A; McClure, Robert; McClay, James; Abarbanel, Robert; Mansfield, James G; Martins, Susana M; Goldstein, Mary K; Musen, Mark A

    2006-01-01

Developing computer-interpretable clinical practice guidelines (CPGs) to provide decision support for guideline-based care is an extremely labor-intensive task. In the EON/ATHENA and SAGE projects, we formulated substantial portions of CPGs as computable statements that express declarative relationships between patient conditions and possible interventions. We developed query and expression languages that allow a decision-support system (DSS) to evaluate these statements in specific patient situations. A DSS can use these guideline statements in multiple ways, including: (1) as inputs for determining preferred alternatives in decision-making, and (2) as a way to provide targeted commentaries in the clinical information system. The use of these declarative statements significantly reduces the modeling expertise and effort required to create and maintain computer-interpretable knowledge bases for decision-support purposes. We discuss possible implications for sharing of such knowledge bases.
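    To give a rough flavor of such declarative statements (the rule format, condition and commentary below are invented; this is not the EON/ATHENA or SAGE expression language), a statement relates a patient condition to a candidate intervention and is evaluated by the DSS against patient data:

```python
# A declarative statement: a named relationship between a patient condition
# and a possible intervention, kept as data so it can be authored and
# maintained separately from the decision-support engine.
STATEMENTS = [
    {"id": "htn-ace-1",
     "condition": lambda pt: pt["systolic_bp"] >= 140 and "ACEI" not in pt["meds"],
     "intervention": "consider starting an ACE inhibitor",
     "commentary": "blood pressure above target on current regimen"},
]

def evaluate(patient):
    """Evaluate every statement against one patient situation; a DSS could
    rank the returned alternatives or surface the commentaries in the EHR."""
    return [(s["intervention"], s["commentary"])
            for s in STATEMENTS if s["condition"](patient)]

print(evaluate({"systolic_bp": 152, "meds": {"HCTZ"}}))
```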

  19. Distributed Submodular Minimization And Motion Coordination Over Discrete State Space

    KAUST Repository

    Jaleel, Hassan

    2017-09-21

Submodular set-functions are extensively used in large-scale combinatorial optimization problems arising in complex networks and machine learning. While there has been significant interest in distributed formulations of convex optimization, distributed minimization of submodular functions has not received significant attention. Thus, our main contribution is a framework for minimizing submodular functions in a distributed manner. The proposed framework is based on the ideas of Lovasz extension of submodular functions and distributed optimization of convex functions. The framework exploits a fundamental property of submodularity that the Lovasz extension of a submodular function is a convex function and can be computed efficiently. Moreover, a minimizer of a submodular function can be computed by computing the minimizer of its Lovasz extension. In the proposed framework, we employ a consensus based distributed optimization algorithm to minimize set-valued submodular functions as well as general submodular functions defined over set products. We also identify distributed motion coordination in multiagent systems as a new application domain for submodular function minimization. For demonstrating key ideas of the proposed framework, we select a complex setup of the capture the flag game, which offers a variety of challenges relevant to multiagent systems. We formulate the problem as a submodular minimization problem and verify through extensive simulations that the proposed framework results in feasible policies for the agents.
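    The two properties the framework exploits, that the Lovasz extension is convex and computable by a sort, and that its minimum matches the set-function minimum, fit in a short (centralized, not distributed) sketch; the example function, step sizes and rounding rule are our own illustrative choices:

```python
import numpy as np

def lovasz_subgradient(F, x):
    """Value and a subgradient of the Lovasz extension of F at x in [0,1]^n,
    computed greedily over the decreasing sort of x."""
    order = np.argsort(-x)
    g = np.zeros(len(x))
    S, prev = set(), F(set())
    for i in order:
        S.add(int(i))
        cur = F(S)
        g[i] = cur - prev          # marginal gain in sorted order
        prev = cur
    return float(g @ x), g

def minimize_submodular(F, n, steps=300):
    """Projected subgradient descent on [0,1]^n, rounding by thresholding."""
    x = np.full(n, 0.5)
    best_S, best_val = set(), F(set())
    for t in range(steps):
        _, g = lovasz_subgradient(F, x)
        x = np.clip(x - 0.5 / np.sqrt(t + 1) * g, 0.0, 1.0)
        S = {i for i in range(n) if x[i] > 0.5}
        if F(S) < best_val:
            best_S, best_val = S, F(S)
    return best_S, best_val

# illustrative submodular function: graph cut minus a modular term
edges = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 4.0), (0, 3, 1.0)]
a = [-2.0, -2.0, 3.0, 3.0]
def F(S):
    cut = sum(w for u, v, w in edges if (u in S) != (v in S))
    return cut - sum(a[i] for i in S)

print(minimize_submodular(F, 4))   # typically finds S = {2, 3}, value -4.0
```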

  20. ℓ0 Gradient Minimization Based Image Reconstruction for Limited-Angle Computed Tomography.

    Directory of Open Access Journals (Sweden)

    Wei Yu

Full Text Available In medical and industrial applications of computed tomography (CT) imaging, limited by the scanning environment and the risk of excessive X-ray radiation exposure imposed on the patients, reconstructing high quality CT images from limited projection data has become a hot topic. X-ray imaging in a limited scanning angular range is an effective imaging modality to reduce the radiation dose to the patients. As the projection data available in this modality are incomplete, limited-angle CT image reconstruction is actually an ill-posed inverse problem. To solve the problem, images reconstructed by the conventional filtered back projection (FBP) algorithm frequently exhibit conspicuous streak artifacts and gradually changing artifacts near edges. Image reconstruction based on total variation minimization (TVM) can significantly reduce streak artifacts in few-view CT, but it suffers from the gradually changing artifacts near edges in limited-angle CT. To suppress this kind of artifact, we develop an image reconstruction algorithm based on ℓ0 gradient minimization for limited-angle CT in this paper. The ℓ0-norm of the image gradient is taken as the regularization function in the framework of the developed reconstruction model. We transformed the optimization problem into a few optimization sub-problems and then solved these sub-problems in the manner of alternating iteration. Numerical experiments are performed to validate the efficiency and the feasibility of the developed algorithm. Statistical analysis of the performance evaluations, peak signal-to-noise ratio (PSNR) and normalized root mean square distance (NRMSD), shows that there are significant statistical differences between different algorithms from different scanning angular ranges (p<0.0001). The experimental results also indicate that the developed algorithm outperforms classical reconstruction algorithms in suppressing the streak artifacts and the gradually changing
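    The alternating scheme described above can be illustrated on a 1-D denoising analogue: introduce an auxiliary variable for the gradient, hard-threshold it (the ℓ0 step), solve a linear least-squares subproblem for the image, and increase the coupling weight between passes. This is a generic half-quadratic-splitting sketch with our own parameter choices, not the paper's reconstruction algorithm (which operates on projection data):

```python
import numpy as np

def l0_gradient_minimize(f, lam=0.02, beta_max=1e5, kappa=2.0):
    """1-D denoising with an l0 penalty on the gradient via half-quadratic
    splitting: min_u ||u - f||^2 + lam * ||D u||_0."""
    n = len(f)
    D = np.eye(n, k=1)[:-1] - np.eye(n)[:-1]   # forward-difference operator
    DtD = D.T @ D
    I = np.eye(n)
    u = f.astype(float).copy()
    beta = 2.0 * lam
    while beta < beta_max:
        g = D @ u
        h = np.where(g**2 > lam / beta, g, 0.0)   # l0 step: hard threshold
        # quadratic step: (I + beta D^T D) u = f + beta D^T h
        u = np.linalg.solve(I + beta * DtD, f + beta * (D.T @ h))
        beta *= kappa                              # tighten the coupling
    return u

rng = np.random.default_rng(0)
f = np.concatenate([np.zeros(60), np.ones(60)]) + 0.05 * rng.standard_normal(120)
u = l0_gradient_minimize(f)   # near piecewise-constant, ideally one jump kept
```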

  1. Single photon emission computed tomography and statistical parametric mapping analysis in cirrhotic patients with and without minimal hepatic encephalopathy

    International Nuclear Information System (INIS)

    Nakagawa, Yuri; Matsumura, Kaname; Iwasa, Motoh; Kaito, Masahiko; Adachi, Yukihiko; Takeda, Kan

    2004-01-01

The early diagnosis and treatment of cognitive impairment in cirrhotic patients is needed to improve the patients' daily living. In this study, alterations of regional cerebral blood flow (rCBF) were evaluated in cirrhotic patients using statistical parametric mapping (SPM). The relationships between rCBF and neuropsychological tests, severity of disease and biochemical data were also assessed. {sup 99m}Tc-ethyl cysteinate dimer single photon emission computed tomography was performed in 20 patients with non-alcoholic liver cirrhosis without overt hepatic encephalopathy (HE) and in 20 age-matched healthy subjects. Neuropsychological tests were performed in 16 patients; of these 7 had minimal HE. Regional CBF images were also analyzed in these groups using SPM. On SPM analysis, cirrhotic patients showed regions of significant hypoperfusion in the superior and middle frontal gyri, and inferior parietal lobules compared with the control group. These areas included parts of the premotor and parietal associated areas of the cortex. Among the cirrhotic patients, those with minimal HE had regions of significant hypoperfusion in the cingulate gyri bilaterally as compared with those without minimal HE. Abnormal function in the above regions may account for the relatively selective neuropsychological deficits in the cognitive status of patients with cirrhosis. These findings may be important in the identification and management of cirrhotic patients with minimal HE. (author)

  2. Knowledge-driven computational modeling in Alzheimer's disease research: Current state and future trends.

    Science.gov (United States)

    Geerts, Hugo; Hofmann-Apitius, Martin; Anastasio, Thomas J

    2017-11-01

    Neurodegenerative diseases such as Alzheimer's disease (AD) follow a slowly progressing dysfunctional trajectory, with a large presymptomatic component and many comorbidities. Using preclinical models and large-scale omics studies ranging from genetics to imaging, a large number of processes that might be involved in AD pathology at different stages and levels have been identified. The sheer number of putative hypotheses makes it almost impossible to estimate their contribution to the clinical outcome and to develop a comprehensive view on the pathological processes driving the clinical phenotype. Traditionally, bioinformatics approaches have provided correlations and associations between processes and phenotypes. Focusing on causality, a new breed of advanced and more quantitative modeling approaches that use formalized domain expertise offer new opportunities to integrate these different modalities and outline possible paths toward new therapeutic interventions. This article reviews three different computational approaches and their possible complementarities. Process algebras, implemented using declarative programming languages such as Maude, facilitate simulation and analysis of complicated biological processes on a comprehensive but coarse-grained level. A model-driven Integration of Data and Knowledge, based on the OpenBEL platform and using reverse causative reasoning and network jump analysis, can generate mechanistic knowledge and a new, mechanism-based taxonomy of disease. Finally, Quantitative Systems Pharmacology is based on formalized implementation of domain expertise in a more fine-grained, mechanism-driven, quantitative, and predictive humanized computer model. We propose a strategy to combine the strengths of these individual approaches for developing powerful modeling methodologies that can provide actionable knowledge for rational development of preventive and therapeutic interventions. Development of these computational approaches is likely to

  3. Knowledge Representation and Ontologies

    Science.gov (United States)

    Grimm, Stephan

Knowledge representation and reasoning aims at designing computer systems that reason about a machine-interpretable representation of the world. Knowledge-based systems have a computational model of some domain of interest in which symbols serve as surrogates for real world domain artefacts, such as physical objects, events, relationships, etc. [1]. The domain of interest can cover any part of the real world or any hypothetical system about which one desires to represent knowledge for computational purposes. A knowledge-based system maintains a knowledge base, which stores the symbols of the computational model in the form of statements about the domain, and it performs reasoning by manipulating these symbols. Applications can base their decisions on answers to domain-relevant questions posed to a knowledge base.

  4. Solving black box computation problems using expert knowledge theory and methods

    International Nuclear Information System (INIS)

    Booker, Jane M.; McNamara, Laura A.

    2004-01-01

    The challenge problems for the Epistemic Uncertainty Workshop at Sandia National Laboratories provide common ground for comparing different mathematical theories of uncertainty, referred to as General Information Theories (GITs). These problems also present the opportunity to discuss the use of expert knowledge as an important constituent of uncertainty quantification. More specifically, how do the principles and methods of eliciting and analyzing expert knowledge apply to these problems and similar ones encountered in complex technical problem solving and decision making? We will address this question, demonstrating how the elicitation issues and the knowledge that experts provide can be used to assess the uncertainty in outputs that emerge from a black box model or computational code represented by the challenge problems. In our experience, the rich collection of GITs provides an opportunity to capture the experts' knowledge and associated uncertainties consistent with their thinking, problem solving, and problem representation. The elicitation process is rightly treated as part of an overall analytical approach, and the information elicited is not simply a source of data. In this paper, we detail how the elicitation process itself impacts the analyst's ability to represent, aggregate, and propagate uncertainty, as well as how to interpret uncertainties in outputs. While this approach does not advocate a specific GIT, answers under uncertainty do result from the elicitation

  5. The benefits of computer-generated feedback for mathematics problem solving.

    Science.gov (United States)

    Fyfe, Emily R; Rittle-Johnson, Bethany

    2016-07-01

    The goal of the current research was to better understand when and why feedback has positive effects on learning and to identify features of feedback that may improve its efficacy. In a randomized experiment, second-grade children received instruction on a correct problem-solving strategy and then solved a set of relevant problems. Children were assigned to receive no feedback, immediate feedback, or summative feedback from the computer. On a posttest the following day, feedback resulted in higher scores relative to no feedback for children who started with low prior knowledge. Immediate feedback was particularly effective, facilitating mastery of the material for children with both low and high prior knowledge. Results suggest that minimal computer-generated feedback can be a powerful form of guidance during problem solving. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Wilson loops in minimal surfaces

    International Nuclear Information System (INIS)

    Drukker, Nadav; Gross, David J.; Ooguri, Hirosi

    1999-01-01

    The AdS/CFT correspondence suggests that the Wilson loop of the large N gauge theory with N = 4 supersymmetry in 4 dimensions is described by a minimal surface in AdS 5 x S 5 . The authors examine various aspects of this proposal, comparing gauge theory expectations with computations of minimal surfaces. There is a distinguished class of loops, which the authors call BPS loops, whose expectation values are free from ultra-violet divergence. They formulate the loop equation for such loops. To the extent that they have checked, the minimal surface in AdS 5 x S 5 gives a solution of the equation. The authors also discuss the zig-zag symmetry of the loop operator. In the N = 4 gauge theory, they expect the zig-zag symmetry to hold when the loop does not couple the scalar fields in the supermultiplet. They will show how this is realized for the minimal surface

  7. Wilson loops and minimal surfaces

    International Nuclear Information System (INIS)

    Drukker, Nadav; Gross, David J.; Ooguri, Hirosi

    1999-01-01

    The AdS-CFT correspondence suggests that the Wilson loop of the large N gauge theory with N=4 supersymmetry in four dimensions is described by a minimal surface in AdS 5 xS 5 . We examine various aspects of this proposal, comparing gauge theory expectations with computations of minimal surfaces. There is a distinguished class of loops, which we call BPS loops, whose expectation values are free from ultraviolet divergence. We formulate the loop equation for such loops. To the extent that we have checked, the minimal surface in AdS 5 xS 5 gives a solution of the equation. We also discuss the zigzag symmetry of the loop operator. In the N=4 gauge theory, we expect the zigzag symmetry to hold when the loop does not couple the scalar fields in the supermultiplet. We will show how this is realized for the minimal surface. (c) 1999 The American Physical Society
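    The proposal both records test can be summarized in one line: at strong coupling the loop expectation is controlled by the regularized area of the minimal surface ending on the loop. For the circular BPS loop the regularized area is -2π, and the exact planar answer (established in follow-up work via the Gaussian matrix model and localization) matches this leading behavior:

```latex
\langle W(C)\rangle \;\simeq\; e^{-\frac{\sqrt{\lambda}}{2\pi}\, A_{\min}(C)},
\qquad
\langle W_{\rm circle}\rangle \;=\; \frac{2}{\sqrt{\lambda}}\, I_1\!\left(\sqrt{\lambda}\right)
\;\sim\; e^{\sqrt{\lambda}} \quad (\lambda \gg 1).
```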

  8. Computer vision syndrome: a study of knowledge and practices in university students.

    Science.gov (United States)

    Reddy, S C; Low, C K; Lim, Y P; Low, L L; Mardina, F; Nursaleha, M P

    2013-01-01

    Computer vision syndrome (CVS) is a condition in which a person experiences one or more of eye symptoms as a result of prolonged working on a computer. To determine the prevalence of CVS symptoms, knowledge and practices of computer use in students studying in different universities in Malaysia, and to evaluate the association of various factors in computer use with the occurrence of symptoms. In a cross sectional, questionnaire survey study, data was collected in college students regarding the demography, use of spectacles, duration of daily continuous use of computer, symptoms of CVS, preventive measures taken to reduce the symptoms, use of radiation filter on the computer screen, and lighting in the room. A total of 795 students, aged between 18 and 25 years, from five universities in Malaysia were surveyed. The prevalence of symptoms of CVS (one or more) was found to be 89.9%; the most disturbing symptom was headache (19.7%) followed by eye strain (16.4%). Students who used computer for more than 2 hours per day experienced significantly more symptoms of CVS (p=0.0001). Looking at far objects in-between the work was significantly (p=0.0008) associated with less frequency of CVS symptoms. The use of radiation filter on the screen (p=0.6777) did not help in reducing the CVS symptoms. Ninety percent of university students in Malaysia experienced symptoms related to CVS, which was seen more often in those who used computer for more than 2 hours continuously per day. © NEPjOPH.

  9. Discrete Curvatures and Discrete Minimal Surfaces

    KAUST Repository

    Sun, Xiang

    2012-01-01

    This thesis presents an overview of some approaches to compute Gaussian and mean curvature on discrete surfaces and discusses discrete minimal surfaces. The variety of applications of differential geometry in visualization and shape design leads
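    As background on one widely used discretization (an assumption about the thesis's specific scheme, not a quote from it), the mean curvature vector at a mesh vertex can be computed from its one-ring by the cotangent formula, and a discrete minimal surface is a mesh where it vanishes at every interior vertex:

```latex
H_i\,\mathbf{n}_i \;=\; \frac{1}{2A_i}\sum_{j\in N(i)}
  \left(\cot\alpha_{ij}+\cot\beta_{ij}\right)\left(\mathbf{x}_i-\mathbf{x}_j\right),
```

    where A_i is a vertex area (e.g. the Voronoi area), N(i) the one-ring neighbors of vertex i, and α_ij, β_ij the two angles opposite the edge (i, j) in its incident triangles.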

  10. Knowledge based system for control rod programming of BWRs

    International Nuclear Information System (INIS)

    Fukuzaki, Takaharu; Yoshida, Ken-ichi; Kobayashi, Yasuhiro

    1988-01-01

A knowledge based system has been developed to support designers in control rod programming of BWRs. The programming searches for optimal control rod patterns to realize safe and effective burning of nuclear fuel. Knowledge of experienced designers plays the main role in minimizing the number of calculations by the core performance evaluation code. This code predicts the power distribution and thermal margins of the nuclear fuel. This knowledge is transformed into 'if-then' type rules and subroutines, and is stored in the knowledge base of the knowledge based system. The system consists of a working area, an inference engine and the knowledge base. The inference engine can detect those data which have to be regenerated, call those subroutines which control the user interface and numerical computations, and store competitive sets of data in different parts of the working area. Using this system, control rod programming of a BWR plant was traced with about 500 rules and 150 subroutines. Both the generation of control rod patterns for the first calculation of the code and the modification of a control rod pattern to reflect the calculation were completed more effectively than with a conventional method. (author)

  11. Linear Text vs. Non-Linear Hypertext in Handheld Computers: Effects on Declarative and Structural Knowledge, and Learner Motivation

    Science.gov (United States)

    Son, Chanhee; Park, Sanghoon; Kim, Minjeong

    2011-01-01

    This study compared linear text-based and non-linear hypertext-based instruction in a handheld computer regarding effects on two different levels of knowledge (declarative and structural knowledge) and learner motivation. Forty four participants were randomly assigned to one of three experimental conditions: linear text, hierarchical hypertext,…

  12. Factors affecting learning of vector math from computer-based practice: Feedback complexity and prior knowledge

    Directory of Open Access Journals (Sweden)

    Andrew F. Heckler

    2016-06-01

Full Text Available In experiments including over 450 university-level students, we studied the effectiveness and time efficiency of several levels of feedback complexity in simple, computer-based training utilizing static question sequences. The learning domain was simple vector math, an essential skill in introductory physics. In a unique full factorial design, we studied the relative effects of “knowledge of correct response” feedback and “elaborated feedback” (i.e., a general explanation) both separately and together. A number of other factors were analyzed, including training time, physics course grade, prior knowledge of vector math, and student beliefs about both their proficiency in and the importance of vector math. We hypothesize a simple model predicting how the effectiveness of feedback depends on prior knowledge, and the results confirm this knowledge-by-treatment interaction. Most notably, elaborated feedback is the most effective feedback, especially for students with low prior knowledge and low course grade. In contrast, knowledge of correct response feedback was less effective for low-performing students, and including both kinds of feedback did not significantly improve performance compared to elaborated feedback alone. Further, while elaborated feedback resulted in higher scores, the learning rate was at best only marginally higher because the training time was slightly longer. Training time data revealed that students spent significantly more time on the elaborated feedback after answering a training question incorrectly. Finally, we found that training improved student self-reported proficiency and that belief in the importance of the learned domain improved the effectiveness of training. Overall, we found that computer-based training with static question sequences and immediate elaborated feedback in the form of simple and general explanations can be an effective way to improve student performance on a physics essential skill

  13. Matrix factorizations, minimal models and Massey products

    International Nuclear Information System (INIS)

    Knapp, Johanna; Omer, Harun

    2006-01-01

    We present a method to compute the full non-linear deformations of matrix factorizations for ADE minimal models. This method is based on the calculation of higher products in the cohomology, called Massey products. The algorithm yields a polynomial ring whose vanishing relations encode the obstructions of the deformations of the D-branes characterized by these matrix factorizations. This coincides with the critical locus of the effective superpotential which can be computed by integrating these relations. Our results for the effective superpotential are in agreement with those obtained from solving the A-infinity relations. We point out a relation to the superpotentials of Kazama-Suzuki models. We will illustrate our findings by various examples, putting emphasis on the E 6 minimal model
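    For context (standard definitions in this field, not quoted from the paper): a matrix factorization of the Landau-Ginzburg superpotential W is an odd matrix Q squaring to W times the identity, and its deformations are expanded in the open-string cohomology of Q:

```latex
Q^2 \;=\; W\cdot\mathbb{1},
\qquad
Q(u) \;=\; Q + \sum_i u_i\,\psi_i + \mathcal{O}(u^2),
\qquad
Q(u)^2 \;\stackrel{!}{=}\; W\cdot\mathbb{1}.
```

    The higher orders in u are built from the Massey products, and the obstructions to solving the last condition are the vanishing relations whose integral gives the effective superpotential W_eff.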

  14. Knowledge-Based Systems in Biomedicine and Computational Life Science

    CERN Document Server

    Jain, Lakhmi

    2013-01-01

This book presents a sample of research on knowledge-based systems in biomedicine and computational life science. The contributions include: personalized stress diagnosis system; image analysis system for breast cancer diagnosis; analysis of neuronal cell images; structure prediction of protein; relationship between two mental disorders; detection of cardiac abnormalities; holistic medicine based treatment; and analysis of life-science data.

  15. A Comparative Study of University of Wisconsin-Stout Freshmen and Senior Education Majors Computing and Internet Technology Skills / Knowledge and Associated Learning Experiences

    OpenAIRE

    Sveum, Evan Charles

    2010-01-01

    A study comparing University of Wisconsin-Stout freshmen and senior education majors’ computing and Internet technology skills/knowledge and associated learning experiences was conducted. Instruments used in this study included the IC³® Exam by Certiport, Inc. and the investigator’s Computing and Internet Skills Learning Experiences survey. UW-Stout freshmen education majors participating in the study demonstrated poor computing and Internet technology skills/knowledge. UW-Stout senior educat...

  16. Support Minimized Inversion of Acoustic and Elastic Wave Scattering

    Science.gov (United States)

    Safaeinili, Ali

    Inversion of limited data is common in many areas of NDE such as X-ray Computed Tomography (CT), Ultrasonic and eddy current flaw characterization and imaging. In many applications, it is common to have a bias toward a solution with minimum (L^2)^2 norm without any physical justification. When it is a priori known that objects are compact as, say, with cracks and voids, by choosing "Minimum Support" functional instead of the minimum (L^2)^2 norm, an image can be obtained that is equally in agreement with the available data, while it is more consistent with what is most probably seen in the real world. We have utilized a minimum support functional to find a solution with the smallest volume. This inversion algorithm is most successful in reconstructing objects that are compact like voids and cracks. To verify this idea, we first performed a variational nonlinear inversion of acoustic backscatter data using minimum support objective function. A full nonlinear forward model was used to accurately study the effectiveness of the minimized support inversion without error due to the linear (Born) approximation. After successful inversions using a full nonlinear forward model, a linearized acoustic inversion was developed to increase speed and efficiency in imaging process. The results indicate that by using minimum support functional, we can accurately size and characterize voids and/or cracks which otherwise might be uncharacterizable. An extremely important feature of support minimized inversion is its ability to compensate for unknown absolute phase (zero-of-time). Zero-of-time ambiguity is a serious problem in the inversion of the pulse-echo data. The minimum support inversion was successfully used for the inversion of acoustic backscatter data due to compact scatterers without the knowledge of the zero-of-time. The main drawback to this type of inversion is its computer intensiveness. In order to make this type of constrained inversion available for common use, work
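    One standard form of such a stabilizer (a common choice in the inversion literature, assumed here rather than taken from this text) replaces the (L²)² norm with a functional that counts the volume of the model's support:

```latex
f_{\mathrm{MS}}(m) \;=\; \int_V \frac{m^2}{m^2+\epsilon^2}\,\mathrm{d}V
\;\;\xrightarrow{\;\epsilon\to 0\;}\;\; \mathrm{vol}\!\left(\operatorname{supp} m\right),
```

    so minimizing f_MS subject to fitting the scattered-field data favors the most compact scatterer consistent with the measurements, which is why it suits voids and cracks.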

  17. Discrete computational structures

    CERN Document Server

    Korfhage, Robert R

    1974-01-01

Discrete Computational Structures describes discrete mathematical concepts that are important to computing, covering necessary mathematical fundamentals, computer representation of sets, graph theory, storage minimization, and bandwidth. The book also explains the conceptual framework (Gorn trees, searching, subroutines) and directed graphs (flowcharts, critical paths, information networks). The text discusses algebra, concentrating on semigroups, groups, lattices, and propositional calculus, including a new tabular method of Boolean function minimization. The text emphasize

  18. Security and health protection while working with a computer. Survey into the knowledge of users about legal and other requirements.

    OpenAIRE

    Šmejkalová, Petra

    2005-01-01

    This bachelor thesis is aimed at the knowledge of general computer users with regards to work security and health protection. It summarizes the relevant legislation and recommendations of ergonomic specialists. The practical part analyses results of a survey, which examined the computer workplaces and user habits when working with a computer.

  19. Accurately computing the optical pathlength difference for a Michelson interferometer with minimal knowledge of the source spectrum.

    Science.gov (United States)

    Milman, Mark H

    2005-12-01

    Astrometric measurements using stellar interferometry rely on precise measurement of the central white light fringe to accurately obtain the optical pathlength difference of incoming starlight to the two arms of the interferometer. One standard approach to stellar interferometry uses a channeled spectrum to determine phases at a number of different wavelengths that are then converted to the pathlength delay. When throughput is low these channels are broadened to improve the signal-to-noise ratio. Ultimately the ability to use monochromatic models and algorithms in each of the channels to extract phase becomes problematic and knowledge of the spectrum must be incorporated to achieve the accuracies required of the astrometric measurements. To accomplish this an optimization problem is posed to estimate simultaneously the pathlength delay and spectrum of the source. Moreover, the nature of the parameterization of the spectrum that is introduced circumvents the need to solve directly for these parameters so that the optimization problem reduces to a scalar problem in just the pathlength delay variable. A number of examples are given to show the robustness of the approach.
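    The reduction to a scalar problem can be illustrated with a toy group-delay search: for each trial delay, remove the corresponding phase in every spectral channel and coherently sum; the unknown spectrum enters only as positive channel weights, so the peak stays at the true delay and the search is one-dimensional. All numbers below are invented:

```python
import numpy as np

def estimate_delay(vis, wavelengths, delays):
    """Scan candidate delays; score each by the magnitude of the coherent
    sum of channel visibilities after removing the trial delay phase."""
    vis, wl = np.asarray(vis), np.asarray(wavelengths)
    scores = [abs(np.sum(vis * np.exp(-2j * np.pi * d / wl))) for d in delays]
    return delays[int(np.argmax(scores))]

rng = np.random.default_rng(1)
wl = np.linspace(0.5, 0.9, 8)                    # channel wavelengths (um)
spectrum = rng.uniform(0.5, 1.5, 8)              # unknown source spectrum
vis = spectrum * np.exp(2j * np.pi * 1.25 / wl)  # true delay: 1.25 um
print(estimate_delay(vis, wl, np.linspace(-5, 5, 20001)))  # close to 1.25
```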

  20. Minimal canonical comprehensive Gröbner systems

    OpenAIRE

    Manubens, Montserrat; Montes, Antonio

    2009-01-01

This is the continuation of Montes' paper “On the canonical discussion of polynomial systems with parameters”. In this paper, we define the Minimal Canonical Comprehensive Gröbner System of a parametric ideal and establish under which hypotheses it exists and is computable. An algorithm to obtain a canonical description of the segments of the Minimal Canonical CGS is given, thus completing the whole MCCGS algorithm (implemented in Maple and Singular). We show its high utility for applications, suc...

  1. SAR image regularization with fast approximate discrete minimization.

    Science.gov (United States)

    Denis, Loïc; Tupin, Florence; Darbon, Jérôme; Sigelle, Marc

    2009-07-01

Synthetic aperture radar (SAR) images, like other coherent imaging modalities, suffer from speckle noise. The presence of this noise makes the automatic interpretation of images a challenging task and noise reduction is often a prerequisite for successful use of classical image processing algorithms. Numerous approaches have been proposed to filter speckle noise. Markov random field (MRF) modelization provides a convenient way to express both data fidelity constraints and desirable properties of the filtered image. In this context, total variation minimization has been extensively used to constrain the oscillations in the regularized image while preserving its edges. Speckle noise follows heavy-tailed distributions, and the MRF formulation leads to a minimization problem involving nonconvex log-likelihood terms. Such a minimization can be performed efficiently by computing minimum cuts on weighted graphs. Due to memory constraints, exact minimization, although theoretically possible, is not achievable on large images required by remote sensing applications. The computational burden of the state-of-the-art algorithm for approximate minimization (namely the alpha-expansion) is too heavy, especially when considering joint regularization of several images. We show that a satisfying solution can be reached, in few iterations, by performing a graph-cut-based combinatorial exploration of large trial moves. This algorithm is applied to joint regularization of the amplitude and interferometric phase in urban area SAR images.
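    Schematically, in our own notation rather than the authors', the energy minimized by such graph-cut methods combines a per-site speckle log-likelihood with a total-variation prior:

```latex
E(u) \;=\; \sum_{s} -\log p\left(v_s \mid u_s\right)
\;+\; \lambda \sum_{(s,t)\in\mathcal{N}} \left|u_s - u_t\right| ,
```

    where v is the observed SAR image, u the regularized image, and N a 4- or 8-connected neighborhood; the heavy-tailed likelihood makes the first term nonconvex, which is why combinatorial (graph-cut) minimization is used instead of convex solvers.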

  2. Computer Support for Knowledge Management within R&D and the Teaching of Bachelor Students

    Directory of Open Access Journals (Sweden)

    Stefan Svetsky

    2013-01-01

Full Text Available Abstract—Knowledge plays a key role within research, development and education. One of the major challenges for knowledge management is to select the right knowledge from numerous sources, including the know-how of individuals, and to transform it into useful, practicable knowledge. The focus should always be on supporting strategic organisational goals. In this context, from the organisation's strategic point of view, it is very important to link an institutional knowledge management system with the knowledge management systems of individuals. This paper presents personalised IT support for knowledge management within industrial R&D and especially for teaching and learning. The support is based on the use of long-term, in-house developed software that enables individuals (managers and teachers) to process and manage knowledge on their desktop computers in a user-friendly way. Within the implementation of “Technology-enhanced learning” at the Faculty of Materials Science and Technology, a pre-programmed work environment called BIKE (Batch Information and Knowledge Editor) was developed. This desktop environment also works as a teacher's personalised knowledge management system. It is programmed by the lead author of this paper, who is a teacher; therefore the outcomes are implemented directly into the classroom teaching of bachelor students. The paper also presents how such IT support complements, at a personalised level, the existing organisational knowledge management tool known as the university's Academic Information System. Some examples from teaching are presented, and communication channels (teacher-student forums) are also mentioned as part of the teacher's personalised knowledge management system. In this case, the BIKE environment is demonstrated as an alternative to learning management systems based on the so-called WEB 2.0 technologies.

  3. Algorithm for finding minimal cut sets in a fault tree

    International Nuclear Information System (INIS)

    Rosenberg, Ladislav

    1996-01-01

This paper presents several algorithms that have been used in a computer code for fault-tree analysis by the minimal cut sets method. The main algorithm is a more efficient version of the new CARA algorithm, which finds minimal cut sets with an auxiliary dynamical structure. The presented algorithm enables one to find the minimal cut sets according to defined requirements: by the order of minimal cut sets, by the number of minimal cut sets, or both. This algorithm is from three to six times faster when compared with the primary version of the CARA algorithm.
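    For contrast with the paper's CARA-based approach, the classic top-down expansion (MOCUS-style) that such codes improve upon fits in a few lines; the fault tree below is hypothetical:

```python
from itertools import product

def minimal_cut_sets(tree, top, order_limit=None):
    """Top-down expansion: OR gates union their children's cut sets, AND
    gates take cross-product merges; supersets are pruned at the end."""
    def expand(node):
        if node not in tree:                       # basic event
            return [frozenset([node])]
        kind, children = tree[node]
        child_sets = [expand(c) for c in children]
        if kind == "OR":
            return [s for sets in child_sets for s in sets]
        return [frozenset().union(*combo) for combo in product(*child_sets)]
    candidates = expand(top)
    if order_limit is not None:                    # cut sets up to a given order
        candidates = [s for s in candidates if len(s) <= order_limit]
    return sorted({s for s in candidates
                   if not any(t < s for t in candidates)},
                  key=lambda s: (len(s), sorted(s)))

tree = {  # hypothetical small fault tree
    "TOP": ("OR",  ["G1", "E3"]),
    "G1":  ("AND", ["E1", "G2"]),
    "G2":  ("OR",  ["E2", "E3"]),
}
print(minimal_cut_sets(tree, "TOP"))  # {E3} and {E1, E2}
```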

  4. An evaluation of The Great Escape: can an interactive computer game improve young children's fire safety knowledge and behaviors?

    Science.gov (United States)

    Morrongiello, Barbara A; Schwebel, David C; Bell, Melissa; Stewart, Julia; Davis, Aaron L

    2012-07-01

    Fire is a leading cause of unintentional injury and, although young children are at particularly increased risk, there are very few evidence-based resources available to teach them fire safety knowledge and behaviors. Using a pre-post randomized design, the current study evaluated the effectiveness of a computer game (The Great Escape) for teaching fire safety information to young children (3.5-6 years). Using behavioral enactment procedures, children's knowledge and behaviors related to fire safety were compared to a control group of children before and after receiving the intervention. The results indicated significant improvements in knowledge and fire safety behaviors in the intervention group but not the control. Using computer games can be an effective way to promote young children's understanding of safety and how to react in different hazardous situations.

  5. Minimally inconsistent reasoning in Semantic Web.

    Science.gov (United States)

    Zhang, Xiaowang

    2017-01-01

    Reasoning with inconsistencies is an important issue for the Semantic Web, as imperfect information is unavoidable in real applications. For this, different paraconsistent approaches, due to their capacity to draw nontrivial conclusions while tolerating inconsistencies, have been proposed for reasoning with inconsistent description logic knowledge bases. However, existing paraconsistent approaches are often criticized for being too skeptical. To this end, this paper presents a non-monotonic paraconsistent version of description logic reasoning, called minimally inconsistent reasoning, where the inconsistencies tolerated in the reasoning are minimized so that more reasonable conclusions can be inferred. Some desirable properties are studied, showing that the new semantics inherits the advantages of both non-monotonic reasoning and paraconsistent reasoning. A complete and sound tableau-based algorithm, called multi-valued tableaux, is developed to capture the minimally inconsistent reasoning. The tableaux algorithm is designed as a framework for multi-valued DL that allows for different underlying paraconsistent semantics, with the only difference lying in the clash conditions. Finally, the complexity of minimally inconsistent description logic reasoning is shown to be on the same level as that of (classical) description logic reasoning.

  6. Knowledge and inference

    CERN Document Server

    Nagao, Makoto

    1990-01-01

    Knowledge and Inference discusses an important problem for software systems: How do we treat knowledge and ideas on a computer and how do we use inference to solve problems on a computer? The book talks about the problems of knowledge and inference for the purpose of merging artificial intelligence and library science. The book begins by clarifying the concept of "knowledge" from many points of view, followed by a chapter on the current state of library science and the place of artificial intelligence in library science. Subsequent chapters cover central topics in the artificial intellig

  7. Impact of Knowledge Economy on the Participation of Women in Labor Market

    Directory of Open Access Journals (Sweden)

    Abeer Mohamed Ali Abd Elkhalek

    2017-07-01

    Full Text Available Purpose: To examine the influence of the knowledge economy on the participation of women in the labor market, whether negative or positive. Methodology: A quantitative research technique was applied to evaluate women's participation in the labor market and to minimize the negative impacts of the knowledge economy. Findings: Within the service and agricultural sectors, the outcomes demonstrated that the knowledge economy has a significant impact on women's labor force participation. The only drawback that discourages the employment of women is culture and social norms. Practical Implications: A higher participation of females in computer science, engineering and technology-oriented jobs would spur innovation and economic advances in all countries. Originality Statement: The research also outlines procedures to accomplish women's participation as a fundamental requirement for the achievement of developmental goals.

  8. Using SETS to find minimal cut sets in large fault trees

    International Nuclear Information System (INIS)

    Worrell, R.B.; Stack, D.W.

    1978-01-01

    An efficient algebraic algorithm for finding the minimal cut sets for a large fault tree was defined, and a new procedure implementing the algorithm was added to the Set Equation Transformation System (SETS). The algorithm includes the identification and separate processing of independent subtrees, the coalescing of consecutive gates of the same kind, the creation of additional independent subtrees, and the derivation of the fault tree stem equation in stages. The computer time required to determine the minimal cut sets using these techniques is shown to be substantially less than the computer time required when they are not employed. For a given example, the execution time required to determine the minimal cut sets can be reduced from 7,686 seconds to 7 seconds when all of these techniques are employed.
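
    One of the listed techniques, coalescing consecutive gates of the same kind, is easy to illustrate. The Python sketch below is a generic rendering of that idea on a dictionary tree representation, not the SETS implementation:

        def coalesce(node, tree):
            """Absorb a child gate into its parent when both have the same kind,
            reducing gate levels before cut-set expansion."""
            if node not in tree:
                return node                            # basic event
            kind, children = tree[node]
            merged = []
            for c in (coalesce(c, tree) for c in children):
                if c in tree and tree[c][0] == kind:   # same-kind child gate
                    merged.extend(tree[c][1])
                else:
                    merged.append(c)
            tree[node] = (kind, merged)
            return node

        t = {"TOP": ("OR", ["G1", "X"]), "G1": ("OR", ["A", "B"])}
        coalesce("TOP", t)
        print(t["TOP"])   # ('OR', ['A', 'B', 'X'])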

  9. A prototype system for perinatal knowledge engineering using an artificial intelligence tool.

    Science.gov (United States)

    Sokol, R J; Chik, L

    1988-01-01

    Though several perinatal expert systems are extant, the use of artificial intelligence has, as yet, had minimal impact in medical computing. In this evaluation of the potential of AI techniques in the development of a computer-based "Perinatal Consultant," a "top-down" approach to the development of a perinatal knowledge base was taken, using as the source for the knowledge base a 30-page manuscript of a chapter concerning high-risk pregnancy. The UNIX utility "style" was used to parse sentences and obtain key words and phrases, both as part of a natural language interface and to identify key perinatal concepts. Compared with the "gold standard" of sentences containing key facts as chosen by the experts, a semiautomated method using a nonmedical speller to identify key words and phrases in context functioned with a sensitivity of 79%, i.e., approximately 8 in 10 key sentences were detected as the basis for PROLOG rules and facts for the knowledge base. These encouraging results suggest that functional perinatal expert systems may well be expedited by using programming utilities in conjunction with AI tools and published literature.

  10. Genome-wide Studies of Mycolic Acid Bacteria: Computational Identification and Analysis of a Minimal Genome

    KAUST Repository

    Kamanu, Frederick Kinyua

    2012-12-01

    The mycolic acid bacteria are a distinct suprageneric group of asporogenous, Gram-positive, high GC-content bacteria, distinguished by the presence of mycolic acids in their cell envelope. They exhibit great diversity in cell morphology; although primarily non-pathogens, the group contains three major pathogens: Mycobacterium leprae, the Mycobacterium tuberculosis complex, and Corynebacterium diphtheriae. Although the mycolic acid bacteria are a clearly defined group, the taxonomic relationships between its constituent genera and species are less well defined. Two approaches were tested for their suitability in describing the taxonomy of the group. First, a Multilocus Sequence Typing (MLST) experiment was assessed and found to be superior to single-locus analysis (the 16S small ribosomal subunit) in delineating a total of 52 mycolic acid bacterial species. Phylogenetic inference was performed using the neighbor-joining method. To further refine the phylogenetic analysis and to take advantage of the widespread availability of bacterial genome data, a computational framework that simulates DNA-DNA hybridisation was developed and validated using multiscale bootstrap resampling. The tool classifies microbial genomes based on whole-genome DNA and was deployed as a web application using PHP and Javascript. It is accessible online at http://cbrc.kaust.edu.sa/dna_hybridization/ A third study applied computational and statistical methods to the identification and analysis of a putative minimal mycolic acid bacterial genome, so as to better understand (1) the genomic requirements to encode a mycolic acid bacterial cell and (2) the role and type of genes and genetic elements that lead to the massive increase in genome size in environmental mycolic acid bacteria. Using a reciprocal comparison approach, a total of 690 orthologous gene clusters forming a putative minimal genome were identified across 24 mycolic acid bacterial species. In order to identify new potential drug

  11. Normalizing biomedical terms by minimizing ambiguity and variability

    Directory of Open Access Journals (Sweden)

    McNaught John

    2008-04-01

    Full Text Available Abstract Background One of the difficulties in mapping biomedical named entities, e.g. genes, proteins, chemicals and diseases, to their concept identifiers stems from the potential variability of the terms. Soft string matching is a possible solution to the problem, but its inherently heavy computational cost discourages its use when the dictionaries are large or when real-time processing is required. A less computationally demanding approach is to normalize the terms by using heuristic rules, which enables us to look up a dictionary in constant time regardless of its size. The development of good heuristic rules, however, requires extensive knowledge of the terminology in question and is thus the bottleneck of the normalization approach. Results We present a novel framework for discovering a list of normalization rules from a dictionary in a fully automated manner. The rules are discovered in such a way that they minimize the ambiguity and variability of the terms in the dictionary. We evaluated our algorithm using two large dictionaries: a human gene/protein name dictionary built from BioThesaurus and a disease name dictionary built from UMLS. Conclusions The experimental results showed that automatically discovered rules can perform comparably to carefully crafted heuristic rules in term mapping tasks, and the computational overhead of rule application is small enough that a very fast implementation is possible. This work will help improve the performance of term-concept mapping tasks in biomedical information extraction, especially when good normalization heuristics for the target terminology are not fully known.
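
    As a toy illustration of the general approach (hand-written rules, not the rules the paper discovers automatically), the following Python sketch normalizes terms so that dictionary lookup becomes a constant-time hash access; the term-identifier pairs are made up:

        import re

        def normalize(term):
            """Illustrative heuristic rules: lowercase, unify separators,
            drop punctuation and spacing variability."""
            t = term.lower()
            t = re.sub(r"[-_/]", " ", t)      # unify separators
            t = re.sub(r"[^\w\s]", "", t)     # drop punctuation
            return re.sub(r"\s+", "", t)      # remove whitespace

        dictionary = {normalize(name): cid for name, cid in [
            ("IL-2", "CID:0001"), ("NF-kappa B", "CID:0002"),
        ]}

        print(dictionary.get(normalize("il 2")))        # CID:0001
        print(dictionary.get(normalize("NF kappaB")))   # CID:0002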

  12. A comprehensive program to minimize platelet outdating.

    Science.gov (United States)

    Fuller, Alice K; Uglik, Kristin M; Braine, Hayden G; King, Karen E

    2011-07-01

    Platelet (PLT) transfusions are essential for patients who are bleeding or have an increased risk of bleeding due to a decreased number or abnormal function of circulating PLTs. A shelf life of 5 days for PLT products presents an inventory management challenge. In 2006, greater than 10% of apheresis PLTs made in the United States outdated. It is imperative to have a sufficient number of products for patients requiring transfusion, but outdating PLTs is a financial burden and a waste of a resource. We present the approach used in our institution to anticipate inventory needs based on current patient census and usage. Strategies to predict usage and to identify changes in anticipated usage are examined. Annual outdating is reviewed for a 10-year period from 2000 through 2009. From January 1, 2000, through December 2009, there were 128,207 PLT transfusions given to 15,265 patients. The methods used to anticipate usage and adjust inventory resulted in an annual outdate rate of approximately 1% for the 10-year period reviewed. In addition we have not faced situations where inventory was inadequate to meet the needs of the patients requiring transfusions. We have identified three elements of our transfusion service that can minimize outdate: a knowledgeable proactive staff dedicated to PLT management, a comprehensive computer-based transfusion history for each patient, and a strong two-way relationship with the primary product supplier. Through our comprehensive program, based on the principles of providing optimal patient care, we have minimized PLT outdating for more than 10 years. © 2011 American Association of Blood Banks.

  13. Building a Minimal Mandrake Linux System Using an Initial RAM Disk

    OpenAIRE

    Wagito, Wagito

    2006-01-01

    A minimal Linux system is commonly used for special-purpose systems such as routers, gateways, Linux installers and diskless Linux systems. A minimal Linux system is a Linux system that uses only a few of all of Linux's facilities. Mandrake Linux, as one Linux distribution, is able to run as a minimal Linux system. RAM is a computer resource that is used primarily as main memory. Part of RAM can be repurposed as a disk, called a RAM disk. This RAM disk can be used to run the Linux system. This ...

  14. Knowledge base mechanisms

    Energy Technology Data Exchange (ETDEWEB)

    Suwa, M; Furukawa, K; Makinouchi, A; Mizoguchi, T; Mizoguchi, F; Yamasaki, H

    1982-01-01

    One of the principal goals of the Fifth Generation Computer System Project for the coming decade is to develop a methodology for building knowledge information processing systems which will provide people with intelligent agents. The key notion of the fifth generation computer system is knowledge used for problem solving. In this paper the authors describe the plan of R&D on knowledge base mechanisms. A knowledge representation system is to be designed to support knowledge acquisition for the knowledge information processing systems. The system will include a knowledge representation language, a knowledge base editor and a debugger. It is also expected to perform as a kind of meta-inference system. In order to develop the large scale knowledge base systems, a knowledge base mechanism based on the relational model is to be studied in the earlier stage of the project. Distributed problem solving is also one of the main issues of the project. 19 references.

  15. Knowledge management: Role of the the Radiation Safety Information Computational Center (RSICC)

    Science.gov (United States)

    Valentine, Timothy

    2017-09-01

    The Radiation Safety Information Computational Center (RSICC) at Oak Ridge National Laboratory (ORNL) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 software packages that have been provided by code developers from various federal and international agencies. RSICC's customers (scientists, engineers, and students from around the world) obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programs both domestically and internationally, as the majority of RSICC's customers are students attending U.S. universities. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC's activities, services, and systems that support knowledge management and education and training in the nuclear field.

  16. Learn with SAT to Minimize Büchi Automata

    Directory of Open Access Journals (Sweden)

    Stephan Barth

    2012-10-01

    Full Text Available We describe a minimization procedure for nondeterministic Büchi automata (NBA). For an automaton A, another automaton A_min with the minimal number of states is learned with the help of a SAT solver. This is done by successively computing automata A' that approximate A in the sense that they accept a given finite set of positive examples and reject a given finite set of negative examples. In the course of the procedure these example sets are successively increased. Thus, our method can be seen as an instance of a generic learning algorithm based on a "minimally adequate teacher" in the sense of Angluin. We use a SAT solver to find an NBA for given sets of positive and negative examples. We use complementation via construction of deterministic parity automata to check candidates computed in this manner for equivalence with A. Failure of equivalence yields new positive or negative examples. Our method proved successful on complete samplings of small automata and on a good number of larger automata. We successfully ran the minimization on over ten thousand automata, mostly with up to ten states, including the complements of all possible automata with two states and alphabet size three, and we discuss results and runtimes; single examples had over 100 states.
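
    The learner loop itself is compact enough to sketch. In the Python skeleton below, find_nba (a SAT encoding of "accept all positives, reject all negatives with n states"), counterexample (equivalence checking via complementation) and accepts are hypothetical stubs standing in for the paper's machinery:

        def minimize_nba(A, max_states):
            """Sketch of the minimally-adequate-teacher loop described above."""
            pos, neg = set(), set()            # finite sets of (lasso-shaped) example words
            n = 1
            while n <= max_states:
                cand = find_nba(n, pos, neg)   # hypothetical: SAT-solve for an n-state NBA
                if cand is None:
                    n += 1                     # no n-state automaton fits the examples
                    continue
                cex = counterexample(cand, A)  # hypothetical: parity-automata equivalence check
                if cex is None:
                    return cand                # equivalent to A, and minimal by construction
                (pos if accepts(A, cex) else neg).add(cex)
            return None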

  17. Fast nonconvex nonsmooth minimization methods for image restoration and reconstruction.

    Science.gov (United States)

    Nikolova, Mila; Ng, Michael K; Tam, Chi-Pan

    2010-12-01

    Nonconvex nonsmooth regularization has advantages over convex regularization for restoring images with neat edges. However, its practical use has been limited by the difficulty of the computational stage, which requires a nonconvex nonsmooth minimization. In this paper, we deal with nonconvex nonsmooth minimization methods for image restoration and reconstruction. Our theoretical results show that the solution of the nonconvex nonsmooth minimization problem is composed of constant regions surrounded by closed contours and neat edges. The main goal of this paper is to develop fast minimization algorithms to solve the nonconvex nonsmooth minimization problem. Our experimental results show the effectiveness and efficiency of the proposed algorithms.

  18. Pattern recognition algorithms for data mining scalability, knowledge discovery and soft granular computing

    CERN Document Server

    Pal, Sankar K

    2004-01-01

    Pattern Recognition Algorithms for Data Mining addresses different pattern recognition (PR) tasks in a unified framework with both theoretical and experimental results. Tasks covered include data condensation, feature selection, case generation, clustering/classification, and rule generation and evaluation. This volume presents various theories, methodologies, and algorithms, using both classical approaches and hybrid paradigms. The authors emphasize large datasets with overlapping, intractable, or nonlinear boundary classes, and datasets that demonstrate granular computing in soft frameworks. Organized into eight chapters, the book begins with an introduction to PR, data mining, and knowledge discovery concepts. The authors analyze the tasks of multi-scale data condensation and dimensionality reduction, then explore the problem of learning with support vector machine (SVM). They conclude by highlighting the significance of granular computing for different mining tasks in a soft paradigm.

  19. Approximate error conjugation gradient minimization methods

    Science.gov (United States)

    Kallman, Jeffrey S

    2013-05-21

    In one embodiment, a method includes selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, calculating an approximate error using the subset of rays, and calculating a minimum in a conjugate gradient direction based on the approximate error. In another embodiment, a system includes a processor for executing logic, logic for selecting a subset of rays from a set of all rays to use in an error calculation for a constrained conjugate gradient minimization problem, logic for calculating an approximate error using the subset of rays, and logic for calculating a minimum in a conjugate gradient direction based on the approximate error. In other embodiments, computer program products, methods, and systems are described that are capable of using approximate error in constrained conjugate gradient minimization problems.
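
    A loose NumPy rendering of the core trick (evaluating the error and gradient on a sampled subset of rays), with everything else invented for illustration and using plain least squares with Fletcher-Reeves directions, might look like this:

        import numpy as np

        def subset_cg_step(A, b, x, idx, d_prev, g_prev):
            """One CG-style step for min ||Ax - b||^2 with the gradient
            approximated on the subset `idx` of rays (rows of A)."""
            As, bs = A[idx], b[idx]
            g = As.T @ (As @ x - bs)                    # approximate gradient
            if d_prev is None:
                d = -g
            else:
                d = -g + ((g @ g) / (g_prev @ g_prev)) * d_prev   # Fletcher-Reeves
            Ad = As @ d
            alpha = -(g @ d) / (Ad @ Ad + 1e-12)        # exact line search on the subset
            return x + alpha * d, d, g

        rng = np.random.default_rng(0)
        A = rng.normal(size=(200, 50))
        b = A @ rng.normal(size=50)
        x, d, g = np.zeros(50), None, None
        for _ in range(200):
            idx = rng.choice(200, size=50, replace=False)   # the sampled subset of rays
            x, d, g = subset_cg_step(A, b, x, idx, d, g)
        print(np.linalg.norm(A @ x - b))                    # residual should shrink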

  20. Internal and External Regulation to Support Knowledge Construction and Convergence in Computer Supported Collaborative Learning (CSCL)

    Science.gov (United States)

    Romero, Margarida; Lambropoulos, Niki

    2011-01-01

    Computer Supported Collaborative Learning (CSCL) activities aim to promote collaborative knowledge construction and convergence. During the CSCL activity, the students should regulate their learning activity, at the individual and collective level. This implies an organisation cost related to the coordination of the activity with the team-mates…

  1. Minimally inconsistent reasoning in Semantic Web.

    Directory of Open Access Journals (Sweden)

    Xiaowang Zhang

    Full Text Available Reasoning with inconsistencies is an important issue for the Semantic Web, as imperfect information is unavoidable in real applications. For this, different paraconsistent approaches, due to their capacity to draw nontrivial conclusions while tolerating inconsistencies, have been proposed for reasoning with inconsistent description logic knowledge bases. However, existing paraconsistent approaches are often criticized for being too skeptical. To this end, this paper presents a non-monotonic paraconsistent version of description logic reasoning, called minimally inconsistent reasoning, where the inconsistencies tolerated in the reasoning are minimized so that more reasonable conclusions can be inferred. Some desirable properties are studied, showing that the new semantics inherits the advantages of both non-monotonic reasoning and paraconsistent reasoning. A complete and sound tableau-based algorithm, called multi-valued tableaux, is developed to capture the minimally inconsistent reasoning. The tableaux algorithm is designed as a framework for multi-valued DL that allows for different underlying paraconsistent semantics, with the only difference lying in the clash conditions. Finally, the complexity of minimally inconsistent description logic reasoning is shown to be on the same level as that of (classical) description logic reasoning.

  2. Computer-Aided Experiment Planning toward Causal Discovery in Neuroscience.

    Science.gov (United States)

    Matiasz, Nicholas J; Wood, Justin; Wang, Wei; Silva, Alcino J; Hsu, William

    2017-01-01

    Computers help neuroscientists to analyze experimental results by automating the application of statistics; however, computer-aided experiment planning is far less common, due to a lack of similar quantitative formalisms for systematically assessing evidence and uncertainty. While ontologies and other Semantic Web resources help neuroscientists to assimilate required domain knowledge, experiment planning requires not only ontological but also epistemological (e.g., methodological) information regarding how knowledge was obtained. Here, we outline how epistemological principles and graphical representations of causality can be used to formalize experiment planning toward causal discovery. We outline two complementary approaches to experiment planning: one that quantifies evidence per the principles of convergence and consistency, and another that quantifies uncertainty using logical representations of constraints on causal structure. These approaches operationalize experiment planning as the search for an experiment that either maximizes evidence or minimizes uncertainty. Despite work in laboratory automation, humans must still plan experiments and will likely continue to do so for some time. There is thus a great need for experiment-planning frameworks that are not only amenable to machine computation but also useful as aids in human reasoning.

  3. The Role of the Radiation Safety Information Computational Center (RSICC) in Knowledge Management

    International Nuclear Information System (INIS)

    Valentine, T.

    2016-01-01

    Full text: The Radiation Safety Information Computational Center (RSICC) is an information analysis center that collects, archives, evaluates, synthesizes and distributes information, data and codes that are used in various nuclear technology applications. RSICC retains more than 2,000 packages that have been provided by contributors from various agencies. RSICC's customers obtain access to such computing codes (source and/or executable versions) and processed nuclear data files to promote on-going research, to help ensure nuclear and radiological safety, and to advance nuclear technology. The role of such information analysis centers is critical for supporting and sustaining nuclear education and training programmes both domestically and internationally, as the majority of RSICC's customers are students attending U.S. universities. RSICC also supports and promotes workshops and seminars in nuclear science and technology to further the use and/or development of computational tools and data. Additionally, RSICC operates a secure CLOUD computing system to provide access to sensitive export-controlled modeling and simulation (M&S) tools that support both domestic and international activities. This presentation will provide a general review of RSICC's activities, services, and systems that support knowledge management and education and training in the nuclear field. (author)

  4. Knowledge, attitude and skills of dental practitioners of Puducherry on minimally invasive dentistry concepts: A questionnaire survey

    Science.gov (United States)

    Rayapudi, Jasmine; Usha, Carounanidy

    2018-01-01

    Background: Minimally invasive dentistry (MID) encompasses early caries diagnosis through caries risk assessment (CRA) and early detection of incipient carious lesions, including primary and secondary prevention, based on scientific evidence that remineralization of demineralized enamel and dentin is possible if detected early. Although the dental curriculum focuses on the advantages of MID in tooth preservation, this science is not usually translated into practice. Aim: This study aimed to evaluate the knowledge, attitude, and skills of dental practitioners of Puducherry regarding the concepts of MID. Subjects and Methods: Data were collected through an online survey questionnaire on awareness and practice of MID. Statistical evaluation was done in SPSS using the Chi-square test. Results: A total of 126 dentists responded, of whom only 55% were trained in MID during their undergraduate and internship period, mainly through lectures (49.6%). Nearly 81% agreed that CRA should be conducted for all patients. About 42.7% had heard of the International Caries Detection and Assessment System, but only 25.9% used a blunt explorer for caries detection. About 13.7% use magnification (loupes/microscope), but the majority (84.7%) use radiographs. More than 70% were unaware of newer methods of caries detection. Statistically significant differences were found (P < 0.05) by qualification and experience regarding the effectiveness of Atraumatic Restorative Treatment and the sandwich technique for the treatment of caries in permanent teeth and in high caries-risk children. Conclusion: Although dentists of Puducherry know the advantages of MID, this does not benefit patients, as many practitioners still follow the traditional principles of total caries removal. PMID:29899626

  5. Exploring gender and gender pairing in the knowledge elaboration processes of students using computer-supported collaborative learning

    NARCIS (Netherlands)

    Ding, N.; Bosker, R. J.; Harskamp, E. G.

    The aim of the study is to investigate the influence of gender and gender pairing on students' learning performances and knowledge elaboration processes in Computer-Supported Collaborative Learning (CSCL). A sample of ninety-six secondary school students, participated in a two-week experiment.

  6. BUILDING A MINIMAL MANDRAKE LINUX SYSTEM USING AN INITIAL RAM DISK

    OpenAIRE

    Wagito, Wagito

    2009-01-01

    A minimal Linux system is commonly used for special-purpose systems such as routers, gateways, Linux installers and diskless Linux systems. A minimal Linux system is a Linux system that uses only a few of all of Linux's facilities. Mandrake Linux, as one Linux distribution, is able to run as a minimal Linux system. RAM is a computer resource that is used primarily as main memory. Part of RAM can be repurposed as a disk, called a RAM disk. This RAM disk can be used to run the Linux syste...

  7. Minimal computational-space implementation of multiround quantum protocols

    International Nuclear Information System (INIS)

    Bisio, Alessandro; D'Ariano, Giacomo Mauro; Perinotti, Paolo; Chiribella, Giulio

    2011-01-01

    A single-party strategy in a multiround quantum protocol can be implemented by sequential networks of quantum operations connected by internal memories. Here, we provide an efficient realization in terms of computational-space resources.

  8. A Lightweight Distributed Framework for Computational Offloading in Mobile Cloud Computing

    Science.gov (United States)

    Shiraz, Muhammad; Gani, Abdullah; Ahmad, Raja Wasim; Adeel Ali Shah, Syed; Karim, Ahmad; Rahman, Zulkanain Abdul

    2014-01-01

    The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds to mitigate resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time-consuming and resource-intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and the turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC. PMID:25127245
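
    The underlying tradeoff can be captured by a back-of-envelope energy model common in the offloading literature. This Python sketch and its parameter values are purely illustrative; they are not the framework's actual cost model:

        def should_offload(cycles, cpu_speed, cpu_power,
                           data_bytes, bandwidth, tx_power,
                           cloud_speed, idle_power):
            """Offload when remote execution plus transfer costs less energy
            than running the computation locally."""
            local_energy = (cycles / cpu_speed) * cpu_power
            remote_energy = ((data_bytes / bandwidth) * tx_power
                             + (cycles / cloud_speed) * idle_power)
            return remote_energy < local_energy

        # Illustrative numbers: 2 Gcycles on a 1 GHz phone CPU at 0.9 W, versus
        # a 5 MB transfer over 1 MB/s Wi-Fi at 1.3 W plus 0.3 W idle waiting
        # on a 10 GHz cloud; here local execution wins, so the answer is False.
        print(should_offload(2e9, 1e9, 0.9, 5e6, 1e6, 1.3, 10e9, 0.3))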

  9. A lightweight distributed framework for computational offloading in mobile cloud computing.

    Directory of Open Access Journals (Sweden)

    Muhammad Shiraz

    Full Text Available The latest developments in mobile computing technology have enabled intensive applications on modern Smartphones. However, such applications are still constrained by limitations in the processing potential, storage capacity and battery lifetime of Smart Mobile Devices (SMDs). Therefore, Mobile Cloud Computing (MCC) leverages the application processing services of computational clouds to mitigate resource limitations in SMDs. Currently, a number of computational offloading frameworks have been proposed for MCC wherein the intensive components of the application are outsourced to computational clouds. Nevertheless, such frameworks focus on runtime partitioning of the application for computational offloading, which is time-consuming and resource-intensive. The resource-constrained nature of SMDs requires lightweight procedures for leveraging computational clouds. Therefore, this paper presents a lightweight framework which focuses on minimizing additional resource utilization in computational offloading for MCC. The framework employs the centralized monitoring, high availability and on-demand access services of computational clouds for computational offloading. As a result, the turnaround time and execution cost of the application are reduced. The framework is evaluated by testing a prototype application in a real MCC environment. The lightweight nature of the proposed framework is validated by employing computational offloading for the proposed framework and the latest existing frameworks. Analysis shows that by employing the proposed framework for computational offloading, the size of data transmission is reduced by 91%, energy consumption cost is minimized by 81% and the turnaround time of the application is decreased by 83.5% as compared to the existing offloading frameworks. Hence, the proposed framework minimizes additional resource utilization and therefore offers a lightweight solution for computational offloading in MCC.

  10. National Institutes of Health: Mixed waste minimization and treatment

    International Nuclear Information System (INIS)

    1995-08-01

    The Appalachian States Low-Level Radioactive Waste Commission requested the US Department of Energy's National Low-Level Waste Management Program (NLLWMP) to assist the biomedical community in becoming more knowledgeable about its mixed waste streams, to help minimize the mixed waste stream generated by the biomedical community, and to identify applicable treatment technologies for these mixed waste streams. As the first step in the waste minimization process, liquid low-level radioactive mixed waste (LLMW) streams generated at the National Institutes of Health (NIH) were characterized and combined into similar process categories. This report identifies possible waste minimization and treatment approaches for the LLMW generated by the biomedical community identified in DOE/LLW-208. In development of the report, on site meetings were conducted with NIH personnel responsible for generating each category of waste identified as lacking disposal options. Based on the meetings and general waste minimization guidelines, potential waste minimization options were identified

  11. National Institutes of Health: Mixed waste minimization and treatment

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-08-01

    The Appalachian States Low-Level Radioactive Waste Commission requested the US Department of Energy's National Low-Level Waste Management Program (NLLWMP) to assist the biomedical community in becoming more knowledgeable about its mixed waste streams, to help minimize the mixed waste stream generated by the biomedical community, and to identify applicable treatment technologies for these mixed waste streams. As the first step in the waste minimization process, liquid low-level radioactive mixed waste (LLMW) streams generated at the National Institutes of Health (NIH) were characterized and combined into similar process categories. This report identifies possible waste minimization and treatment approaches for the LLMW generated by the biomedical community identified in DOE/LLW-208. In development of the report, on site meetings were conducted with NIH personnel responsible for generating each category of waste identified as lacking disposal options. Based on the meetings and general waste minimization guidelines, potential waste minimization options were identified.

  12. Promoting elementary students' epistemology of science through computer-supported knowledge-building discourse and epistemic reflection

    Science.gov (United States)

    Lin, Feng; Chan, Carol K. K.

    2018-04-01

    This study examined the role of computer-supported knowledge-building discourse and epistemic reflection in promoting elementary-school students' scientific epistemology and science learning. The participants were 39 Grade 5 students who were collectively pursuing ideas and inquiry for knowledge advance using Knowledge Forum (KF) while studying a unit on electricity; they also reflected on the epistemic nature of their discourse. A comparison class of 22 students, taught by the same teacher, studied the same unit using the school's established scientific investigation method. We hypothesised that engaging students in idea-driven and theory-building discourse, as well as scaffolding them to reflect on the epistemic nature of their discourse, would help them understand their own scientific collaborative discourse as a theory-building process, and therefore understand scientific inquiry as an idea-driven and theory-building process. As hypothesised, we found that students engaged in knowledge-building discourse and reflection outperformed comparison students in scientific epistemology and science learning, and that students' understanding of collaborative discourse predicted their post-test scientific epistemology and science learning. To further understand the epistemic change process among knowledge-building students, we analysed their KF discourse to understand whether and how their epistemic practice had changed after epistemic reflection. The implications on ways of promoting epistemic change are discussed.

  13. The Influence of Perceived Information Overload on Student Participation and Knowledge Construction in Computer-Mediated Communication

    Science.gov (United States)

    Chen, Chun-Ying; Pedersen, Susan; Murphy, Karen L.

    2012-01-01

    Computer-mediated communication (CMC) has been used widely to engage learners in academic discourse for knowledge construction. Due to the features of the task environment, one of the main problems caused by the medium is information overload (IO). Yet the literature is unclear about the impact of IO on student learning. This study therefore…

  14. Comparison of Knowledge and Attitudes Using Computer-Based and Face-to-Face Personal Hygiene Training Methods in Food Processing Facilities

    Science.gov (United States)

    Fenton, Ginger D.; LaBorde, Luke F.; Radhakrishna, Rama B.; Brown, J. Lynne; Cutter, Catherine N.

    2006-01-01

    Computer-based training is increasingly favored by food companies for training workers due to convenience, self-pacing ability, and ease of use. The objectives of this study were to determine if personal hygiene training, offered through a computer-based method, is as effective as a face-to-face method in knowledge acquisition and improved…

  15. Technological Pedagogical Content Knowledge of Prospective Mathematics Teacher in Three Dimensional Material Based on Sex Differences

    Science.gov (United States)

    Aqib, M. A.; Budiarto, M. T.; Wijayanti, P.

    2018-01-01

    The effectiveness of learning in this era can be seen from three factors, namely technology, content, and pedagogy, which are covered by Technological Pedagogical Content Knowledge (TPCK). This was a qualitative study that aimed to describe each domain of TPCK: Content Knowledge, Pedagogical Knowledge, Pedagogical Content Knowledge, Technological Knowledge, Technological Content Knowledge, Technological Pedagogical Knowledge, and Technological, Pedagogical, and Content Knowledge. The subjects were male and female mathematics college students, at least in their 5th semester, with almost the same ability in courses such as innovative learning, innovative learning II, school mathematics I, school mathematics II, computer applications and instructional media. The research began with a questionnaire, followed by an assignment and an interview. The data were validated by time triangulation. The results show that male and female prospective teachers were relatively similar in the Content Knowledge and Pedagogical Knowledge domains but differed in the Technological Knowledge domain. The difference in this domain certainly affects the other domains that include a technology component, although it can be minimized by familiarization with the technology.

  16. Facilitating Preschoolers' Scientific Knowledge Construction via Computer Games Regarding Light and Shadow: The Effect of the Prediction-Observation-Explanation (POE) Strategy

    Science.gov (United States)

    Hsu, Chung-Yuan; Tsai, Chin-Chung; Liang, Jyh-Chong

    2011-01-01

    Educational researchers have suggested that computer games have a profound influence on students' motivation, knowledge construction, and learning performance, but little empirical research has targeted preschoolers. Thus, the purpose of the present study was to investigate the effects of implementing a computer game that integrates the…

  17. Accelerating an Ordered-Subset Low-Dose X-Ray Cone Beam Computed Tomography Image Reconstruction with a Power Factor and Total Variation Minimization.

    Science.gov (United States)

    Huang, Hsuan-Ming; Hsiao, Ing-Tsung

    2016-01-01

    In recent years, there has been increased interest in low-dose X-ray cone beam computed tomography (CBCT) in many fields, including dentistry, guided radiotherapy and small animal imaging. Despite reducing the radiation dose, low-dose CBCT has not gained widespread acceptance in routine clinical practice. In addition to performing more evaluation studies, developing a fast and high-quality reconstruction algorithm is required. In this work, we propose an iterative reconstruction method that accelerates ordered-subsets (OS) reconstruction using a power factor and combine it with the total-variation (TV) minimization method. Both simulation and phantom studies were conducted to evaluate the performance of the proposed method. Results show that the proposed method can accelerate conventional OS methods, greatly increasing the convergence speed in early iterations. Moreover, applying TV minimization to the power acceleration scheme can further improve image quality while preserving the fast convergence rate.
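
    A heavily simplified Python sketch of the overall structure, alternating ordered-subset least-squares updates (with a crude stand-in for the power-factor acceleration) and a smoothed-TV gradient step, assuming a linear projector split into subset blocks; this is schematic, not the authors' algorithm:

        import numpy as np

        def tv_grad(u, eps=1e-6):
            """Gradient of a smoothed anisotropic TV penalty, Neumann boundaries."""
            dx = np.zeros_like(u); dx[:-1, :] = u[1:, :] - u[:-1, :]
            dy = np.zeros_like(u); dy[:, :-1] = u[:, 1:] - u[:, :-1]
            nx = dx / np.sqrt(dx**2 + eps); ny = dy / np.sqrt(dy**2 + eps)
            g = np.zeros_like(u)
            g[0, :] -= nx[0, :]; g[1:, :] -= nx[1:, :] - nx[:-1, :]
            g[:, 0] -= ny[:, 0]; g[:, 1:] -= ny[:, 1:] - ny[:, :-1]
            return g

        def os_tv(A_blocks, y_blocks, shape, iters=20, p=1.2, lam=0.05, step=1e-3):
            """Ordered-subset gradient updates scaled by p**k, then a TV step."""
            u = np.zeros(int(np.prod(shape)))
            for k in range(iters):
                accel = min(p ** k, 10.0)            # crude stand-in for the power factor
                for A, y in zip(A_blocks, y_blocks): # one update per ordered subset
                    u -= accel * step * (A.T @ (A @ u - y))
                u -= step * lam * tv_grad(u.reshape(shape)).ravel()
            return u.reshape(shape)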

  18. An algorithm for reduct cardinality minimization

    KAUST Repository

    AbouEisha, Hassan M.

    2013-12-01

    This paper is devoted to the consideration of a new algorithm for reduct cardinality minimization. The algorithm transforms the initial table into a decision table of a special kind, simplifies this table, and uses a dynamic programming algorithm to finish the construction of an optimal reduct. Results of computer experiments with decision tables from the UCI ML Repository are discussed. © 2013 IEEE.
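
    For intuition about what a minimum-cardinality reduct is (though not the paper's transformation-plus-dynamic-programming method), here is a brute-force Python sketch on a made-up decision table:

        from itertools import combinations

        rows = [  # attribute values (a1, a2, a3) and a decision class
            ((0, 0, 1), "yes"), ((0, 1, 1), "no"),
            ((1, 0, 0), "yes"), ((1, 1, 0), "no"),
        ]

        def is_reduct(attrs):
            """attrs preserves the decision if no two rows agree on attrs
            but differ in their decision class."""
            seen = {}
            for vals, dec in rows:
                key = tuple(vals[a] for a in attrs)
                if seen.setdefault(key, dec) != dec:
                    return False
            return True

        def minimal_reduct(n_attrs=3):
            for size in range(1, n_attrs + 1):          # try smallest subsets first
                for attrs in combinations(range(n_attrs), size):
                    if is_reduct(attrs):
                        return attrs
            return tuple(range(n_attrs))

        print(minimal_reduct())   # (1,): attribute a2 alone decides the class here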

  19. An algorithm for reduct cardinality minimization

    KAUST Repository

    AbouEisha, Hassan M.; Al Farhan, Mohammed; Chikalov, Igor; Moshkov, Mikhail

    2013-01-01

    This paper is devoted to the consideration of a new algorithm for reduct cardinality minimization. The algorithm transforms the initial table into a decision table of a special kind, simplifies this table, and uses a dynamic programming algorithm to finish the construction of an optimal reduct. Results of computer experiments with decision tables from the UCI ML Repository are discussed. © 2013 IEEE.

  20. Minimal knotted polygons in cubic lattices

    International Nuclear Information System (INIS)

    Van Rensburg, E J Janse; Rechnitzer, A

    2011-01-01

    In this paper we examine numerically the properties of minimal length knotted lattice polygons in the simple cubic, face-centered cubic, and body-centered cubic lattices by sieving minimal length polygons from the data stream of a Monte Carlo algorithm, implemented as described in Aragão de Carvalho and Caracciolo (1983 Phys. Rev. B 27 1635), Aragão de Carvalho et al (1983 Nucl. Phys. B 215 209) and Berg and Foester (1981 Phys. Lett. B 106 323). The entropy, mean writhe, and mean curvature of minimal length polygons are computed (in some cases exactly). While the minimal length and mean curvature are found to be lattice dependent, the mean writhe is found to be only weakly dependent on the lattice type. Comparison of our results to numerical results for the writhe obtained elsewhere (see Janse van Rensburg et al 1999 Contributed to Ideal Knots (Series on Knots and Everything vol 19) ed Stasiak, Katritch and Kauffman (Singapore: World Scientific), Portillo et al 2011 J. Phys. A: Math. Theor. 44 275004) shows that the mean writhe is also insensitive to the length of a knotted polygon. Thus, while these results for the mean writhe and mean absolute writhe at minimal length are not universal, they demonstrate that these values are quite close to those of long polygons regardless of the underlying lattice and length

  1. Energy-efficient ECG compression on wireless biosensors via minimal coherence sensing and weighted ℓ₁ minimization reconstruction.

    Science.gov (United States)

    Zhang, Jun; Gu, Zhenghui; Yu, Zhu Liang; Li, Yuanqing

    2015-03-01

    Low energy consumption is crucial for body area networks (BANs). In BAN-enabled ECG monitoring, continuous monitoring requires the sensor nodes to transmit huge amounts of data to the sink node, which leads to excessive energy consumption. To reduce airtime over energy-hungry wireless links, this paper presents an energy-efficient compressed sensing (CS)-based approach for on-node ECG compression. First, an algorithm called minimal mutual coherence pursuit is proposed to construct sparse binary measurement matrices, which can be used to encode the ECG signals with superior performance and extremely low complexity. Second, in order to minimize the data rate required for faithful reconstruction, a weighted ℓ1 minimization model is derived by exploiting multisource prior knowledge in the wavelet domain. Experimental results on the MIT-BIH arrhythmia database reveal that the proposed approach can achieve a higher compression ratio than state-of-the-art CS-based methods. Together with its low encoding complexity, our approach can achieve significant energy savings in both the encoding process and wireless transmission.
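
    A simplified NumPy sketch of the pipeline: a sparse binary matrix for cheap on-node encoding, then weighted ℓ1 reconstruction by ISTA. The matrix construction, weights and test signal below are placeholders, not the paper's coherence-minimizing design or wavelet-derived weights:

        import numpy as np

        rng = np.random.default_rng(1)
        n, m, d = 256, 96, 4                  # signal length, measurements, ones per column

        # Sparse binary measurement matrix: d ones per column (cheap on-node encoding)
        Phi = np.zeros((m, n))
        for j in range(n):
            Phi[rng.choice(m, size=d, replace=False), j] = 1.0

        x = np.zeros(n)                       # sparse stand-in for a wavelet-domain ECG
        x[rng.choice(n, 8, replace=False)] = rng.normal(size=8)
        y = Phi @ x                           # compressed samples

        # Weighted ISTA for min ||Phi z - y||^2 + lam * sum(w_i * |z_i|)
        w = np.ones(n)                        # uniform weights here; the paper derives
                                              # them from wavelet-domain prior knowledge
        lam, L = 0.05, np.linalg.norm(Phi, 2) ** 2
        z = np.zeros(n)
        for _ in range(300):
            g = z - (Phi.T @ (Phi @ z - y)) / L
            z = np.sign(g) * np.maximum(np.abs(g) - lam * w / L, 0.0)
        print(np.linalg.norm(z - x))          # reconstruction error should be small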

  2. The minimal work cost of information processing

    Science.gov (United States)

    Faist, Philippe; Dupuis, Frédéric; Oppenheim, Jonathan; Renner, Renato

    2015-07-01

    Irreversible information processing cannot be carried out without some inevitable thermodynamical work cost. This fundamental restriction, known as Landauer's principle, is increasingly relevant today, as the energy dissipation of computing devices limits the development of their performance. Here we determine the minimal work required to carry out any logical process, for instance a computation. It is given by the entropy of the discarded information conditioned on the output of the computation. Our formula takes precisely into account the statistically fluctuating work requirement of the logical process. It enables the explicit calculation of practical scenarios, such as computational circuits or quantum measurements. On the conceptual level, our result gives a precise and operational connection between thermodynamic and information entropy, and it explains the emergence of the entropy state function in macroscopic thermodynamics.
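
    Schematically, and in notation of my own choosing rather than the paper's exact statement, the bound has the shape:

        % E = discarded (erased) information, S' = output of the computation;
        % an entropy of E conditioned on S' sets the minimal work scale.
        W_{\min} \;\approx\; k_{\mathrm B}\, T \ln 2 \;\cdot\; H(E \mid S')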

  3. Designing Knowledge Map for Knowledge Management projects Using Network Analysis

    Directory of Open Access Journals (Sweden)

    heidar najafi

    2017-09-01

    Full Text Available In this research, knowledge management has been studied as an interdisciplinary area. We aim to answer the question: what are the scientific structure and knowledge map of knowledge management projects with respect to subject areas and keywords? For this purpose, nearly 40000 scientific documents that include knowledge management among their keywords were selected from the Scopus database and studied across various subject areas. Bar charts were drawn for each index of subject areas and keywords. In addition, using a co-occurrence matrix, adjacency graphs were drawn and then clustered using the average-link algorithm. Bar charts and graphs were produced using R and Excel. The results show that, among knowledge management research worldwide, the scientific fields most relevant to knowledge management are Computer Sciences with 32.5%, Business, Management and Accounting with 14.5%, Engineering with 13.7%, Decision Sciences with 12.6%, Mathematics with 7.07%, and Social Sciences with 6.63%, respectively. The keywords that most frequently co-occur with knowledge management are Human-Computer Interaction, Information Management, Systems Management, Information Technology, Manufacturing, Acquisition of Knowledge, Semantics, Knowledge Transfer, Ontology and Information Retrieval.
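
    The co-occurrence-plus-average-link pipeline is straightforward to reproduce in miniature. The following Python sketch (with invented toy documents, using SciPy rather than the R tooling mentioned above) shows the mechanics:

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        docs = [  # toy documents represented as keyword sets
            {"knowledge management", "ontology", "semantics"},
            {"knowledge management", "information retrieval", "semantics"},
            {"knowledge management", "human-computer interaction"},
        ]
        terms = sorted(set().union(*docs))
        idx = {t: i for i, t in enumerate(terms)}

        # Co-occurrence matrix: counts of keyword pairs sharing a document
        C = np.zeros((len(terms), len(terms)))
        for d in docs:
            for a in d:
                for b in d:
                    if a != b:
                        C[idx[a], idx[b]] += 1

        # Average-link hierarchical clustering on a co-occurrence-based distance
        D = 1.0 / (1.0 + C)
        np.fill_diagonal(D, 0.0)
        Z = linkage(D[np.triu_indices(len(terms), 1)], method="average")
        print(dict(zip(terms, fcluster(Z, t=2, criterion="maxclust"))))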

  4. Computed Tomography Helps to Plan Minimally Invasive Aortic Valve Replacement Operations.

    Science.gov (United States)

    Stoliński, Jarosław; Plicner, Dariusz; Grudzień, Grzegorz; Kruszec, Paweł; Fijorek, Kamil; Musiał, Robert; Andres, Janusz

    2016-05-01

    This study evaluated the role of multidetector computed tomography (MDCT) in preparation for minimally invasive aortic valve replacement (MIAVR). An analysis of 187 patients scheduled for MIAVR between June 2009 and December 2014 was conducted. In the study group (n = 86), MDCT of the thorax, aorta, and femoral arteries was performed before the operation. In the control group (n = 101), patients qualified for MIAVR without receiving preoperative MDCT. The surgical strategy was changed preoperatively in 12.8% of patients from the study group and in 2.0% of patients from the control group (p = 0.010) and intraoperatively in 9.9% of patients from the control group and in none from the study group (p = 0.002). No conversion to median sternotomy was necessary in the study group; among the controls, there were 4.0% conversions. On the basis of the MDCT measurements, optimal access to the aortic valve was achieved when the angle between the aortic valve plane and the line to the second intercostal space was 91.9 ± 10.0 degrees and to the third intercostal space was 94.0 ± 1.4 degrees, with the distance to the valve being 94.8 ± 13.8 mm and 84.5 ± 9.9 mm for the second and third intercostal spaces, respectively. The right atrium covering the site of the aortotomy was present in 42.9% of cases when MIAVR had been performed through the third intercostal space and in 1.3% when through the second intercostal space (p = 0.001). Preoperative MDCT of the thorax, aorta, and femoral arteries makes it possible to plan MIAVR operations. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  5. Families of bitangent planes of space curves and minimal non-fibration families

    KAUST Repository

    Lubbes, Niels

    2014-01-01

    A cone curve is a reduced sextic space curve which lies on a quadric cone and does not pass through the vertex. We classify families of bitangent planes of cone curves. The methods we apply can be used for any space curve with ADE singularities, though in this paper we concentrate on cone curves. An embedded complex projective surface which is adjoint to a degree one weak Del Pezzo surface contains families of minimal degree rational curves, which cannot be defined by the fibers of a map. Such families are called minimal non-fibration families. Families of bitangent planes of cone curves correspond to minimal non-fibration families. The main motivation of this paper is to classify minimal non-fibration families. We present algorithms which compute all bitangent families of a given cone curve and their geometric genus. We consider cone curves to be equivalent if they have the same singularity configuration. For each equivalence class of cone curves we determine the possible number of bitangent families and the number of rational bitangent families. Finally we compute an example of a minimal non-fibration family on an embedded weak degree one Del Pezzo surface.

  6. The minimally tuned minimal supersymmetric standard model

    International Nuclear Information System (INIS)

    Essig, Rouven; Fortin, Jean-Francois

    2008-01-01

    The regions in the Minimal Supersymmetric Standard Model with the minimal amount of fine-tuning of electroweak symmetry breaking are presented for general messenger scale. No a priori relations among the soft supersymmetry breaking parameters are assumed and fine-tuning is minimized with respect to all the important parameters which affect electroweak symmetry breaking. The superpartner spectra in the minimally tuned region of parameter space are quite distinctive with large stop mixing at the low scale and negative squark soft masses at the high scale. The minimal amount of tuning increases enormously for a Higgs mass beyond roughly 120 GeV

  7. Context-dependent memory decay is evidence of effort minimization in motor learning: a computational study

    OpenAIRE

    Takiyama, Ken

    2015-01-01

    Recent theoretical models suggest that motor learning includes at least two processes: error minimization and memory decay. While learning a novel movement, a motor memory of the movement is gradually formed to minimize the movement error between the desired and actual movements in each training trial, but the memory is slightly forgotten in each trial. The learning effects of error minimization trained with a certain movement are partially available in other non-trained movements, and this t...
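
    The two processes named here are often written as a single-state linear dynamical system. The sketch below uses illustrative parameter values, not those of the paper:

        # Two-process sketch: retention (decay) factor A and learning rate B.
        A, B = 0.95, 0.1
        target, x = 1.0, 0.0
        trace = []
        for trial in range(60):
            e = target - x          # movement error on this trial
            x = A * x + B * e       # slight forgetting plus error-driven update
            trace.append(x)
        print(round(trace[-1], 3))  # settles near B / (1 - A + B) = 2/3 here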

  8. Subspace Correction Methods for Total Variation and $\\ell_1$-Minimization

    KAUST Repository

    Fornasier, Massimo

    2009-01-01

    This paper is concerned with the numerical minimization of energy functionals in Hilbert spaces involving convex constraints coinciding with a seminorm for a subspace. The optimization is realized by alternating minimizations of the functional on a sequence of orthogonal subspaces. On each subspace an iterative proximity-map algorithm is implemented via oblique thresholding, which is the main new tool introduced in this work. We provide convergence conditions for the algorithm in order to compute minimizers of the target energy. Analogous results are derived for a parallel variant of the algorithm. Applications are presented in domain decomposition methods for degenerate elliptic PDEs arising in total variation minimization and in accelerated sparse recovery algorithms based on ℓ1-minimization. We include numerical examples which show efficient solutions to classical problems in signal and image processing. © 2009 Society for Industrial and Applied Mathematics.

  9. A knowledge creation info-structure to acquire and crystallize the tacit knowledge of health-care experts.

    Science.gov (United States)

    Abidi, Syed Sibte Raza; Cheah, Yu-N; Curran, Janet

    2005-06-01

    Tacit knowledge of health-care experts is an important source of experiential know-how, yet due to various operational and technical reasons, such health-care knowledge is not entirely harnessed and put into professional practice. Emerging knowledge-management (KM) solutions suggest strategies to acquire the seemingly intractable and nonarticulated tacit knowledge of health-care experts. This paper presents a KM methodology, together with its computational implementation, to 1) acquire the tacit knowledge possessed by health-care experts; 2) represent the acquired tacit health-care knowledge in a computational formalism--i.e., clinical scenarios--that allows the reuse of stored knowledge to acquire tacit knowledge; and 3) crystallize the acquired tacit knowledge so that it is validated for health-care decision-support and medical education systems.

  10. Probabilistic Properties of Rectilinear Steiner Minimal Trees

    Directory of Open Access Journals (Sweden)

    V. N. Salnikov

    2015-01-01

    Full Text Available This work concerns the properties of Steiner minimal trees in the manhattan plane in the context of introducing a probability measure. This problem is important because exact algorithms for the Steiner problem are computationally expensive (NP-hard), while the solution, especially in the case of a large number of points to be connected, has a diversity of practical applications. That is why the work considers the possibility of ranking the possible topologies of the minimal trees with respect to the probability of their usage. To this end, the known facts about the structural properties of minimal trees for selected metrics have been analyzed for their usefulness to the problem in question. For a small number of boundary (fixed) vertices, the paper offers a way to introduce a probability measure as a corollary of a proved theorem about some structural properties of the minimal trees. This work extends previous similar work concerning the search for minimal fillings, and it opens the door to a more general (complicated) task. The stated method demonstrates the possibility of reaching the final result analytically, which suggests its applicability to the case of a larger number of boundary vertices (probably with the use of computer engineering). The introduced definition of an essential Steiner point allowed a considerable restriction of the ambiguity of the initial problem's solution and, at the same time, comparison of this approach with more classical works in the field. The paper also lists the main barriers preventing the use of classical approaches for the task of introducing a probability measure. In prospect, the application areas of the described method are expected to widen, both in terms of system enlargement (the number of boundary vertices) and in terms of other metric spaces (the Euclidean case is of special interest). The main interest is to find the classes of topologies with significantly
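
    Since exact rectilinear Steiner trees are NP-hard, a standard computational handle is the manhattan-metric minimum spanning tree, which overestimates the Steiner minimal tree length by at most a factor of 3/2 (Hwang's bound). A small Prim-style Python sketch, with made-up points:

        def manhattan_mst_length(points):
            """Prim's MST under the L1 metric: an upper bound on the rectilinear
            Steiner minimal tree (the SMT is at least 2/3 of this length)."""
            dist = lambda p, q: abs(p[0] - q[0]) + abs(p[1] - q[1])
            in_tree = {points[0]}
            total = 0.0
            while len(in_tree) < len(points):
                p, q = min(((p, q) for p in in_tree for q in points if q not in in_tree),
                           key=lambda pq: dist(*pq))
                in_tree.add(q)
                total += dist(p, q)
            return total

        print(manhattan_mst_length([(0, 0), (2, 0), (1, 2), (3, 3)]))  # 8.0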

  11. Zero-knowledge using garbled circuits

    DEFF Research Database (Denmark)

    Jawurek, Marek; Kerschbaum, Florian; Orlandi, Claudio

    2013-01-01

    Zero-knowledge protocols are one of the fundamental concepts in modern cryptography and have countless applications. However, after more than 30 years from their introduction, there are only very few languages (essentially those with a group structure) for which we can construct zero-knowledge protocols that are efficient enough to be used in practice. In this paper we address the problem of how to construct efficient zero-knowledge protocols for generic languages, and we propose a protocol based on Yao's garbled circuit technique. The motivation for our work is that in many cryptographic ... zero-knowledge is a special case of secure two-party computation (i.e., any protocol for generic secure computation can be used to do zero-knowledge). The main contribution of this paper is to construct an efficient protocol for the special case of secure two-party computation where only one party has input (like in the zero-knowledge case). The protocol ...

  12. The minimal non-minimal standard model

    International Nuclear Information System (INIS)

    Bij, J.J. van der

    2006-01-01

    In this Letter I discuss a class of extensions of the standard model that have a minimal number of possible parameters, but can in principle explain dark matter and inflation. It is pointed out that the so-called new minimal standard model contains a large number of parameters that can be put to zero, without affecting the renormalizability of the model. With the extra restrictions one might call it the minimal (new) non-minimal standard model (MNMSM). A few hidden discrete variables are present. It is argued that the inflaton should be higher-dimensional. Experimental consequences for the LHC and the ILC are discussed

  13. Claus sulphur recovery potential approaches 99% while minimizing cost

    Energy Technology Data Exchange (ETDEWEB)

    Berlie, E M

    1974-01-21

    In a summary of a paper presented to the fourth joint engineering conference of the American Institute of Chemical Engineers and the Canadian Society for Chemical Engineering, the Claus process is discussed in a modern setting. Some problems faced in the operation of sulfur recovery plants include (1) strict pollution control regulations; (2) design and operation of existing plants; (3) knowledge of process fundamentals; (4) performance testing; (5) specification of feed gas; (6) catalyst life; (7) instrumentation and process control; and (8) quality of feed gas. Some of the factors which must be considered in order to achieve the ultimate capability of the Claus process are listed. There is strong evidence to support the contention that plant operators are reluctant to accept new fundamental knowledge of the Claus sulfur recovery process and are not taking advantage of its inherent potential to achieve the emission standards required, to minimize cost of tail gas cleanup systems and to minimize operating costs.

  14. Hazardous waste minimization tracking system

    International Nuclear Information System (INIS)

    Railan, R.

    1994-01-01

    Under RCRA sections 3002(b) and 3005(h), hazardous waste generators and owners/operators of treatment, storage, and disposal facilities (TSDFs) are required to certify that they have a program in place to reduce the volume or quantity and toxicity of hazardous waste to the degree determined to be economically practicable. In many cases there are environmental as well as economic benefits for agencies that pursue pollution prevention options. Several state governments have already enacted waste minimization legislation (e.g., the Massachusetts Toxic Use Reduction Act of 1989, and the Oregon Toxic Use Reduction Act and Hazardous Waste Reduction Act of July 2, 1989). About twenty-six other states have established legislation that will mandate some type of waste minimization program and/or facility planning. The need to address the HAZMIN (Hazardous Waste Minimization) Program at government agencies and private industries has prompted us to identify the importance of managing the HAZMIN Program and tracking various aspects of the program, as well as the progress made in this area. "WASTE" is a tracking system which can be used and modified to maintain the information related to a Hazardous Waste Minimization Program in a manageable fashion. This program maintains, modifies, and retrieves information related to hazardous waste minimization and recycling, and provides automated report-generating capabilities. It has a built-in menu, which can be printed either in part or in full. There are instructions on preparing the Annual Waste Report and the Annual Recycling Report. The program is very user friendly. It is available on 3.5-inch or 5.25-inch floppy disks. A computer with 640K memory is required

  15. Grid workflow validation using ontology-based tacit knowledge: A case study for quantitative remote sensing applications

    Science.gov (United States)

    Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi

    2017-01-01

    Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled applications of remote sensing quantitative retrieval. Workflow averts low-level implementation details of the Grid and hence enables users to focus on higher levels of application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale complicated applications of remote sensing science. The validation of workflow is important in order to support large-scale sophisticated scientific computation processes with enhanced performance and to minimize potential waste of time and resources. To investigate the semantic correctness of user-defined workflows, in this paper we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting the domain tacit knowledge and expressing the knowledge with an ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data source consistency error validation and parameters matching error validation.
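
    As an illustration of what a "parameters matching error validation" can look like, here is a hypothetical, much-simplified sketch: a dictionary stands in for the ontology, the service names and types are invented for the example, and the check merely verifies that each step's inputs are produced, with matching types, by earlier steps. The paper's actual Protégé ontology and Grid workflow machinery are far richer.

      # Hypothetical mini-"ontology": each service maps to its input/output types.
      SERVICE_ONTOLOGY = {
          "AOD_retrieval":  {"inputs":  {"radiance": "TOA_Radiance"},
                             "outputs": {"aod": "Aerosol_Optical_Depth"}},
          "atm_correction": {"inputs":  {"aod": "Aerosol_Optical_Depth",
                                         "radiance": "TOA_Radiance"},
                             "outputs": {"reflectance": "Surface_Reflectance"}},
      }

      def validate_workflow(steps):
          # Check that every input of each step is produced by an earlier step
          # (or given initially) with a matching ontology type.
          available = {"radiance": "TOA_Radiance"}     # initial data sources
          errors = []
          for name in steps:
              svc = SERVICE_ONTOLOGY[name]
              for param, typ in svc["inputs"].items():
                  if available.get(param) != typ:
                      errors.append(f"{name}: input '{param}' of type {typ} not satisfied")
              available.update(svc["outputs"])
          return errors

      print(validate_workflow(["AOD_retrieval", "atm_correction"]))   # [] -> valid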

  16. Minimally Invasive Transforaminal Lumbar Interbody Fusion: A Perspective on Current Evidence and Clinical Knowledge

    Directory of Open Access Journals (Sweden)

    Ali Habib

    2012-01-01

    This paper reviews the current published data regarding open transforaminal lumbar interbody fusion (TLIF) in relation to minimally invasive transforaminal lumbar interbody fusion (MI-TLIF). Introduction. MI-TLIF, a modern method for lumbar interbody arthrodesis, has allowed for a minimally invasive method to treat degenerative spinal pathologies. Currently, there is limited literature that compares TLIF directly to MI-TLIF. Thus, we seek to discuss the current literature on these techniques. Methods. Using a PubMed search, we reviewed recent publications of open and MI-TLIF, dating from 2002 to 2012. We discuss these studies and their findings in this paper, focusing on patient-reported outcomes as well as complications. Results. Data found in 14 articles of the literature were analyzed. Using these reports, we found the mean follow-up was 20 months. The mean patient study size was 52. Seven of the articles directly compared outcomes of open TLIF with MI-TLIF, such as mean duration of surgery, length of post-operative stay, blood loss, and complications. Conclusion. Although high-quality data comparing these two techniques are lacking, the current evidence supports MI-TLIF, with outcomes comparable to those of the traditional open technique. Further prospective, randomized studies will help to further our understanding of this minimally invasive technique.

  17. Minimally invasive image-guided procedures

    International Nuclear Information System (INIS)

    Mora Guevara, Alejandro

    2011-01-01

    A literature review focused on minimally invasive procedures has been performed at the Department of Radiology of the Hospital Calderon Guardia. A multidisciplinary team has been assembled for decision making. The materials, possible complications, and the available imaging techniques, such as ultrasound, computed tomography, and magnetic resonance imaging, have been determined according to the procedure to be performed. The review has provided didactic support for medical interventions, ensuring the best materials, resources, and conditions for the successful performance of procedures and good results

  18. Minimal cut-set methodology for artificial intelligence applications

    International Nuclear Information System (INIS)

    Weisbin, C.R.; de Saussure, G.; Barhen, J.; Oblow, E.M.; White, J.C.

    1984-01-01

    This paper reviews minimal cut-set theory and illustrates its application with an example. The minimal cut-set approach uses disjunctive normal form in Boolean algebra and various Boolean operators to simplify very complicated tree structures composed of AND/OR gates. The simplification process is automated and performed off-line using existing computer codes that implement the Boolean reduction on the finite, but large, tree structure. With this approach, on-line expert diagnostic systems whose response time is critical could determine directly whether a goal is achievable by comparing the actual system state to a concisely stored set of preprocessed critical state elements
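
    To make the idea concrete, here is a hedged toy version of the underlying computation: expand an AND/OR tree into its cut sets in disjunctive normal form and then apply the absorption law to keep only the minimal ones. The tree and event names are invented; this is not the off-line production codes referenced in the paper.

      from itertools import product

      def cut_sets(node):
          # Expand an AND/OR tree into cut sets (frozensets of basic events).
          if isinstance(node, str):                    # basic event
              return [frozenset([node])]
          op, children = node[0], node[1:]
          child_sets = [cut_sets(c) for c in children]
          if op == "OR":                               # union of children's cut sets
              return [cs for sets in child_sets for cs in sets]
          if op == "AND":                              # cross product, merged
              return [frozenset().union(*combo) for combo in product(*child_sets)]

      def minimize(sets):
          # Keep only minimal cut sets (absorption law: drop proper supersets).
          return [s for s in sets if not any(t < s for t in sets)]

      # TOP = A OR (B AND (A OR C))
      tree = ("OR", "A", ("AND", "B", ("OR", "A", "C")))
      print(minimize(cut_sets(tree)))   # minimal cut sets: {A}, {B, C}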

  19. Minimalism

    CERN Document Server

    Obendorf, Hartmut

    2009-01-01

    The notion of Minimalism is proposed as a theoretical tool supporting a more differentiated understanding of reduction and thus forms a standpoint that allows definition of aspects of simplicity. This book traces the development of minimalism, defines the four types of minimalism in interaction design, and looks at how to apply it.

  20. Theory of planned behavior and knowledge sharing among nurses in patient computer management system: The role of distributive justice

    Directory of Open Access Journals (Sweden)

    Sarminah Samad

    2018-05-01

    This study examined the relationship between the Theory of Planned Behavior and knowledge sharing among nurses in a Patient Computer Management System, and determined the moderating effect of distributive justice on that relationship. A quantitative approach was employed. The research was based on a correlational, cross-sectional study which involved a total of 336 nurses. Data were collected by random sampling via self-administered questionnaires. Partial Least Squares (PLS) analysis (Version 3.0) was used to analyze the data. The study revealed that the Theory of Planned Behavior components were significantly related to knowledge sharing. These components were also found to have a significant and positive influence on knowledge sharing. The study further revealed that distributive justice significantly moderated the relationship between two components of the Theory of Planned Behavior (attitude and subjective norm) and knowledge sharing.

  1. On Time with Minimal Expected Cost!

    DEFF Research Database (Denmark)

    David, Alexandre; Jensen, Peter Gjøl; Larsen, Kim Guldstrand

    2014-01-01

    (Priced) timed games are two-player quantitative games involving an environment assumed to be completely antagonistic. Classical analysis consists in the synthesis of strategies ensuring safety, time-bounded or cost-bounded reachability objectives. Assuming a randomized environment, the (priced) timed game essentially defines an infinite-state Markov (reward) decision process. In this setting the objective is classically to find a strategy that will minimize the expected reachability cost, but with no guarantees on worst-case behaviour. In this paper, we provide efficient methods for computing reachability strategies that will both ensure worst-case time bounds as well as provide (near-)minimal expected cost. Our method extends the synthesis algorithms of the synthesis tool Uppaal-Tiga with suitably adapted reinforcement learning techniques, which exhibit several orders of magnitude improvements w...
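
    The expected-cost side of this objective can be illustrated by value iteration on a small finite Markov decision process: repeatedly replace each state's value by the cheapest action's cost plus the expected successor value. The MDP below is invented for illustration and omits all of the timed and worst-case machinery of Uppaal-Tiga.

      # States 0 and 1 are non-goal; state 2 is the goal.
      # Each action maps to (cost, distribution over successor states).
      MDP = {
          0: {"a": (1.0, {1: 1.0}),             # deterministic, cost 1
              "b": (0.5, {0: 0.5, 2: 0.5})},    # cheap but may stay in 0
          1: {"a": (1.0, {2: 1.0})},
      }
      GOAL = 2

      def value_iteration(mdp, goal, n_iter=100):
          V = {s: 0.0 for s in list(mdp) + [goal]}
          for _ in range(n_iter):
              for s, acts in mdp.items():
                  V[s] = min(c + sum(p * V[t] for t, p in dist.items())
                             for c, dist in acts.values())
          return V

      print(value_iteration(MDP, GOAL))   # minimal expected cost-to-goal: {0: ~1.0, 1: 1.0, 2: 0.0}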

  2. The COPD Knowledge Base: enabling data analysis and computational simulation in translational COPD research.

    Science.gov (United States)

    Cano, Isaac; Tényi, Ákos; Schueller, Christine; Wolff, Martin; Huertas Migueláñez, M Mercedes; Gomez-Cabrero, David; Antczak, Philipp; Roca, Josep; Cascante, Marta; Falciani, Francesco; Maier, Dieter

    2014-11-28

    Previously we generated a chronic obstructive pulmonary disease (COPD) specific knowledge base (http://www.copdknowledgebase.eu) from clinical and experimental data, text-mining results and public databases. This knowledge base allowed the retrieval of specific molecular networks together with integrated clinical and experimental data. The COPDKB has now been extended to integrate over 40 public data sources on functional interaction (e.g. signal transduction, transcriptional regulation, protein-protein interaction, gene-disease association). In addition we integrated COPD-specific expression and co-morbidity networks connecting over 6 000 genes/proteins with physiological parameters and disease states. Three mathematical models describing different aspects of systemic effects of COPD were connected to clinical and experimental data. We have completely redesigned the technical architecture of the user interface and now provide HTML- and web-browser-based access and form-based searches. A network search enables the use of interconnecting information and the generation of disease-specific sub-networks from general knowledge. Integration with the Synergy-COPD Simulation Environment enables multi-scale integrated simulation of individual computational models while integration with a Clinical Decision Support System allows delivery into clinical practice. The COPD Knowledge Base is the only publicly available knowledge resource dedicated to COPD and combining genetic information with molecular, physiological and clinical data as well as mathematical modelling. Its integrated analysis functions provide overviews about clinical trends and connections while its semantically mapped content enables complex analysis approaches. We plan to further extend the COPDKB by offering it as a repository to publish and semantically integrate data from relevant clinical trials. The COPDKB is freely available after registration at http://www.copdknowledgebase.eu.

  3. Assessment of knowledge and awareness among radiology personnel regarding current computed tomography technology and radiation dose

    Science.gov (United States)

    Karim, M. K. A.; Hashim, S.; Bradley, D. A.; Bahruddin, N. A.; Ang, W. C.; Salehhon, N.

    2016-03-01

    In this paper, we evaluate the level of knowledge and awareness among 120 radiology personnel working in 7 public hospitals in Johor, Malaysia, concerning computed tomography (CT) technology and radiation doses, based on a set of questionnaires. Subjects were divided into two groups: medical profession (Med, n=32) and allied health profession (AH, n=88). The questionnaires addressed: (1) demographic data, (2) relative radiation dose, and (3) knowledge of current CT technology. One-third of respondents from both groups were able to estimate the relative radiation dose for routine CT examinations. 68% of the allied health profession personnel knew of the Malaysian regulations entitled 'Basic Safety Standard (BSS) 2010', although notably 80% of them had previously attended a radiation protection course. No significant difference (at p < 0.05) in mean scores of CT technology knowledge was detected between the two groups, with the medical professions producing a mean score of 26.7 ± 2.7 and the allied health professions a mean score of 25.2 ± 4.3. This study points to considerable variation among the respondents concerning their knowledge and awareness of the risks of radiation and CT optimization techniques.

  5. Knowledge spaces

    CERN Document Server

    Doignon, Jean-Paul

    1999-01-01

    Knowledge spaces offer a rigorous mathematical foundation for various practical systems of knowledge assessment. An example is offered by the ALEKS system (Assessment and LEarning in Knowledge Spaces), a software for the assessment of mathematical knowledge. From a mathematical standpoint, knowledge spaces generalize partially ordered sets. They are investigated both from a combinatorial and a stochastic viewpoint. The results are applied to real and simulated data. The book gives a systematic presentation of research and extends the results to new situations. It is of interest to mathematically oriented readers in education, computer science and combinatorics at research and graduate levels. The text contains numerous examples and exercises and an extensive bibliography.

  6. [epiDRB--a new minimally invasive concept for referencing in the field of computer-assisted orthopaedic surgery].

    Science.gov (United States)

    Ohnsorge, J A K; Weisskopf, M; Siebert, C H

    2005-01-01

    Optoelectronic navigation for computer-assisted orthopaedic surgery (CAOS) is based on a firm connection of bone with passive reflectors or active light-emitting diodes in a specific three-dimensional pattern. Even a so-called "minimally invasive" dynamic reference base (DRB) requires fixation with screws or clamps via an incision of the skin. Consequently, an originally percutaneous intervention would unnecessarily be extended to an open procedure. Thus, computer-assisted navigation is rarely applied. Due to their tree-like design, most DRBs interfere with the surgeon's actions and are therefore at permanent risk of being accidentally dislocated. Accordingly, the optical communication between the camera and the operative site may repeatedly be interrupted. The aim of the research was the development of a less bulky, more comfortable, stable and safely trackable device that can be fixed truly percutaneously. With engineering support from the industrial partner, the radiolucent epiDRB was developed. It can be fixed with two or more pins and gains additional stability from its epicutaneous position. Its intraoperative applicability and reliability were experimentally tested. Its low centre of gravity and flat design allow the device to be located directly in the area of interest. Thanks to its epicutaneous position and particular shape, the epiDRB can be tracked continuously by the navigation system without hindering the surgeon's actions. Hence, the risk of accidental displacement is minimised and the line of sight remains unaffected. With the newly developed epiDRB, computer-assisted navigation becomes easier and safer to handle, even in punctures and other percutaneous procedures on the spine as well as on the extremities, without a disproportionate amount of additional trauma. Due to the special design, referencing of more than one vertebral body is possible at one time, thus decreasing radiation exposure and increasing efficiency.

  7. Modular invariance of N=2 minimal models

    International Nuclear Information System (INIS)

    Sidenius, J.

    1991-01-01

    We prove modular covariance of one-point functions at one loop in the diagonal N=2 minimal superconformal models. We use the recently derived general formalism for computing arbitrary conformal blocks in these models. Our result should be sufficient to guarantee modular covariance at arbitrary genus. It is thus an important check on the general formalism which is not manifestly modular covariant. (orig.)

  8. Image denoising by a direct variational minimization

    Directory of Open Access Journals (Sweden)

    Pilipović Stevan

    2011-01-01

    In this article we introduce a novel method for image denoising which combines the mathematical well-posedness of variational modeling with the efficiency of a patch-based approach in the field of image processing. It is based on direct minimization of an energy functional containing a minimal-surface regularizer that uses a fractional gradient. The minimization is performed on every predefined patch of the image independently. By doing so, we avoid the use of an artificial-time PDE model, with its inherent problems of finding the optimal stopping time as well as the optimal time step. Moreover, we control the level of image smoothing on each patch (and thus on the whole image) by adapting the Lagrange multiplier using information on the level of discontinuities on a particular patch, which we obtain by pre-processing. In order to reduce the average number of vectors in the approximation generator and still obtain minimal degradation, we combine a Ritz variational method for the actual minimization on a patch with a complementary fractional variational principle. Thus, the proposed method becomes computationally feasible and applicable for practical purposes. We confirm our claims with experimental results, comparing the proposed method with a couple of PDE-based methods, where we get significantly better denoising results, especially in oscillatory regions.
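
    As a hedged sketch of the patch-wise idea only: the code below minimizes, by plain gradient descent, a simple quadratic smoothing energy on each patch and picks the regularization weight per patch from a crude discontinuity indicator (the patch standard deviation, assuming intensities in [0, 1]). The patch size, step size and thresholds are invented; the paper's actual minimal-surface/fractional-gradient functional and Ritz method are substantially different.

      import numpy as np

      def denoise_patch(f, lam, n_iter=100, tau=0.1):
          # Gradient descent on E(u) = 0.5*||u - f||^2 + 0.5*lam*||grad u||^2.
          u = f.copy()
          for _ in range(n_iter):
              up = np.pad(u, 1, mode="edge")          # replicate the boundary
              lap = (up[:-2, 1:-1] + up[2:, 1:-1]
                     + up[1:-1, :-2] + up[1:-1, 2:] - 4.0 * u)
              u -= tau * ((u - f) - lam * lap)        # one explicit gradient step
          return u

      def denoise_by_patches(image, patch=16, smooth_lam=1.0, edge_lam=0.1):
          out = np.zeros_like(image)
          for i in range(0, image.shape[0], patch):
              for j in range(0, image.shape[1], patch):
                  block = image[i:i+patch, j:j+patch]
                  # crude discontinuity indicator: weak smoothing on busy patches
                  lam = edge_lam if block.std() > 0.2 else smooth_lam
                  out[i:i+patch, j:j+patch] = denoise_patch(block, lam)
          return out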

  9. Systems biology perspectives on minimal and simpler cells.

    Science.gov (United States)

    Xavier, Joana C; Patil, Kiran Raosaheb; Rocha, Isabel

    2014-09-01

    The concept of the minimal cell has fascinated scientists for a long time, from both fundamental and applied points of view. This broad concept encompasses extreme reductions of genomes, the last universal common ancestor (LUCA), the creation of semiartificial cells, and the design of protocells and chassis cells. Here we review these different areas of research and identify common and complementary aspects of each one. We focus on systems biology, a discipline that is greatly facilitating the classical top-down and bottom-up approaches toward minimal cells. In addition, we also review the so-called middle-out approach and its contributions to the field with mathematical and computational models. Owing to the advances in genomics technologies, much of the work in this area has been centered on minimal genomes, or rather minimal gene sets, required to sustain life. Nevertheless, a fundamental expansion has been taking place in the last few years wherein the minimal gene set is viewed as a backbone of a more complex system. Complementing genomics, progress is being made in understanding the system-wide properties at the levels of the transcriptome, proteome, and metabolome. Network modeling approaches are enabling the integration of these different omics data sets toward an understanding of the complex molecular pathways connecting genotype to phenotype. We review key concepts central to the mapping and modeling of this complexity, which is at the heart of research on minimal cells. Finally, we discuss the distinction between minimizing the number of cellular components and minimizing cellular complexity, toward an improved understanding and utilization of minimal and simpler cells. Copyright © 2014, American Society for Microbiology. All Rights Reserved.

  11. The integration of automated knowledge acquisition with computer-aided software engineering for space shuttle expert systems

    Science.gov (United States)

    Modesitt, Kenneth L.

    1990-01-01

    A prediction was made that the terms expert systems and knowledge acquisition would begin to disappear over the next several years. This is not because they are falling into disuse; it is rather that practitioners are realizing that they are valuable adjuncts to software engineering, in terms of problem domains addressed, user acceptance, and in development methodologies. A specific problem was discussed, that of constructing an automated test analysis system for the Space Shuttle Main Engine. In this domain, knowledge acquisition was part of requirements systems analysis, and was performed with the aid of a powerful inductive ESBT in conjunction with a computer aided software engineering (CASE) tool. The original prediction is not a very risky one -- it has already been accomplished.

  12. Computer programming and computer systems

    CERN Document Server

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge" of computer systems. This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated. This publication is inten

  13. Development of an Instrument to Measure Health Center (HC) Personnel's Computer Use, Knowledge and Functionality Demand for HC Computerized Information System in Thailand

    OpenAIRE

    Kijsanayotin, Boonchai; Pannarunothai, Supasit; Speedie, Stuart

    2005-01-01

    Knowledge about socio-technical aspects of information technology (IT) is vital for the success of health IT projects. The Thailand health administration anticipates using health IT to support the recently implemented national universal health care system. However, the national knowledge associated with the socio-technical aspects of health IT has not been studied in Thailand. A survey instrument measuring Thai health center (HC) personnel’s computer use, basic IT knowledge a...

  14. A Comparative Study of University of Wisconsin-Stout Freshmen and Senior Education Major's Computing and Internet Technology Skills/Knowledge and Associated Learning Experiences

    Science.gov (United States)

    Sveum, Evan Charles

    2010-01-01

    A study comparing University of Wisconsin-Stout freshmen and senior education majors' computing and Internet technology skills/knowledge and associated learning experiences was conducted. Instruments used in this study included the IC³® Exam by Certiport, Inc. and the investigator's Computing and Internet Skills Learning…

  15. Selecting Suitable Drainage Pattern to Minimize Flooding in ...

    African Journals Online (AJOL)

    Watershed analysis is a geographic information system (GIS) based technique designed to model the way surface water flows on the earth's surface. This was the method adopted to select a suitable drainage pattern to minimize flooding in some parts of Sangere. The process of watershed analysis computes the local direction of flow ...
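
    The "local direction of flow" computation mentioned here is commonly implemented as the D8 algorithm: each cell drains to whichever of its eight neighbors gives the steepest descent. The sketch below is a generic illustration of that standard step, with an invented toy elevation grid, not the specific GIS toolchain used in the study.

      import numpy as np

      # 8 neighbor offsets (D8): E, SE, S, SW, W, NW, N, NE
      OFFSETS = [(0, 1), (1, 1), (1, 0), (1, -1),
                 (0, -1), (-1, -1), (-1, 0), (-1, 1)]

      def d8_flow_direction(dem):
          # For each interior cell, index (0-7) of the steepest-descent neighbor,
          # or -1 for pits (no lower neighbor).
          rows, cols = dem.shape
          direction = np.full(dem.shape, -1, dtype=int)
          for r in range(1, rows - 1):
              for c in range(1, cols - 1):
                  drops = []
                  for k, (dr, dc) in enumerate(OFFSETS):
                      dist = np.hypot(dr, dc)              # 1 or sqrt(2)
                      drops.append(((dem[r, c] - dem[r + dr, c + dc]) / dist, k))
                  best_drop, best_k = max(drops)
                  if best_drop > 0:
                      direction[r, c] = best_k
          return direction

      dem = np.array([[5, 5, 5], [5, 4, 2], [5, 5, 5]], dtype=float)
      print(d8_flow_direction(dem))   # center cell flows east (index 0)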

  16. An Ontology for Knowledge Representation and Applications

    OpenAIRE

    Nhon Do

    2008-01-01

    Ontology is a term used in artificial intelligence with different meanings. Ontology research has an important role in computer science and practical applications, especially distributed knowledge systems. In this paper we present an ontology called the Computational Object Knowledge Base Ontology. It has been used in designing some knowledge base systems for solving problems, such as a system that supports studying knowledge and solving analytic ...

  17. Minimal changes in health status questionnaires: distinction between minimally detectable change and minimally important change

    Directory of Open Access Journals (Sweden)

    Knol Dirk L

    2006-08-01

    Changes in scores on health status questionnaires are difficult to interpret. Several methods to determine minimally important changes (MICs) have been proposed, which can broadly be divided into distribution-based and anchor-based methods. Comparisons of these methods have led to insight into essential differences between these approaches. Some authors have tried to arrive at a uniform measure for the MIC, such as 0.5 standard deviation or the value of one standard error of measurement (SEM). Others have emphasized the diversity of MIC values, depending on the type of anchor, the definition of minimal importance on the anchor, and characteristics of the disease under study. A closer look makes clear that some distribution-based methods have merely focused on minimally detectable changes. For assessing minimally important changes, anchor-based methods are preferred, as they include a definition of what is minimally important. Acknowledging the distinction between minimally detectable and minimally important changes is useful, not only to avoid confusion among MIC methods, but also to gain information on two important benchmarks on the scale of a health status measurement instrument. Appreciating the distinction, it becomes possible to judge whether the minimally detectable change of a measurement instrument is sufficiently small to detect minimally important changes.
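
    The distribution-based quantities discussed here follow standard formulas: SEM = SD * sqrt(1 - r), where r is a reliability coefficient, and the minimal detectable change at 95% confidence is MDC95 = 1.96 * sqrt(2) * SEM. A small sketch with invented example numbers:

      import math

      def sem(sd, reliability):
          # Standard error of measurement from the baseline SD and a
          # reliability coefficient (e.g. a test-retest ICC).
          return sd * math.sqrt(1.0 - reliability)

      def mdc95(sd, reliability):
          # Minimal detectable change with 95% confidence for a change score.
          return 1.96 * math.sqrt(2.0) * sem(sd, reliability)

      # e.g. a questionnaire with SD = 10 points and ICC = 0.90:
      print(round(sem(10, 0.90), 2), round(mdc95(10, 0.90), 2))   # 3.16 8.77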

  18. AN ENHANCED METHOD FOR EXTENDING COMPUTATION AND RESOURCES BY MINIMIZING SERVICE DELAY IN EDGE CLOUD COMPUTING

    OpenAIRE

    B.Bavishna*1, Mrs.M.Agalya2 & Dr.G.Kavitha3

    2018-01-01

    A lot of research has been done in the field of cloud computing. For effective performance, a variety of algorithms has been proposed. The role of virtualization is significant, and its performance depends on VM migration and allocation. Much energy is consumed in the cloud; therefore, numerous algorithms are required for saving energy and enhancing efficiency. In the proposed work, a green algorithm has been considered with ...

  19. An intelligent multi-media human-computer dialogue system

    Science.gov (United States)

    Neal, J. G.; Bettinger, K. E.; Byoun, J. S.; Dobes, Z.; Thielman, C. Y.

    1988-01-01

    Sophisticated computer systems are being developed to assist in the human decision-making process for very complex tasks performed under stressful conditions. The human-computer interface is a critical factor in these systems. The human-computer interface should be simple and natural to use, require a minimal learning period, assist the user in accomplishing his task(s) with a minimum of distraction, present output in a form that best conveys information to the user, and reduce cognitive load for the user. In pursuit of this ideal, the Intelligent Multi-Media Interfaces project is devoted to the development of interface technology that integrates speech, natural language text, graphics, and pointing gestures for human-computer dialogues. The objective of the project is to develop interface technology that uses the media/modalities intelligently in a flexible, context-sensitive, and highly integrated manner modelled after the manner in which humans converse in simultaneous coordinated multiple modalities. As part of the project, a knowledge-based interface system, called CUBRICON (CUBRC Intelligent CONversationalist) is being developed as a research prototype. The application domain being used to drive the research is that of military tactical air control.

  20. A Matrix Splitting Method for Composite Function Minimization

    KAUST Repository

    Yuan, Ganzhao

    2016-12-07

    Composite function minimization captures a wide spectrum of applications in both computer vision and machine learning. It includes bound constrained optimization and cardinality regularized optimization as special cases. This paper proposes and analyzes a new Matrix Splitting Method (MSM) for minimizing composite functions. It can be viewed as a generalization of the classical Gauss-Seidel method and the Successive Over-Relaxation method for solving linear systems in the literature. Incorporating a new Gaussian elimination procedure, the matrix splitting method achieves state-of-the-art performance. For convex problems, we establish the global convergence, convergence rate, and iteration complexity of MSM, while for non-convex problems, we prove its global convergence. Finally, we validate the performance of our matrix splitting method on two particular applications: nonnegative matrix factorization and cardinality regularized sparse coding. Extensive experiments show that our method outperforms existing composite function minimization techniques in terms of both efficiency and efficacy.
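
    The classical special case mentioned here, the Gauss-Seidel method, splits A into its lower triangle (including the diagonal) and the strict upper triangle, then repeatedly solves the triangular system. A minimal background sketch, assuming a matrix for which the iteration converges (e.g. symmetric positive definite); it is not the MSM algorithm itself:

      import numpy as np

      def gauss_seidel(A, b, n_iter=100):
          # Splitting A = (D + L) + U: iterate (D + L) x_{k+1} = b - U x_k.
          DL = np.tril(A)                      # lower triangle incl. the diagonal
          U = A - DL
          x = np.zeros_like(b, dtype=float)
          for _ in range(n_iter):
              x = np.linalg.solve(DL, b - U @ x)   # triangular solve per sweep
          return x

      A = np.array([[4.0, 1.0], [1.0, 3.0]])
      b = np.array([1.0, 2.0])
      print(gauss_seidel(A, b))                # close to np.linalg.solve(A, b)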

  2. Minimal surfaces

    CERN Document Server

    Dierkes, Ulrich; Sauvigny, Friedrich; Jakob, Ruben; Kuster, Albrecht

    2010-01-01

    Minimal Surfaces is the first volume of a three volume treatise on minimal surfaces (Grundlehren Nr. 339-341). Each volume can be read and studied independently of the others. The central theme is boundary value problems for minimal surfaces. The treatise is a substantially revised and extended version of the monograph Minimal Surfaces I, II (Grundlehren Nr. 295 & 296). The first volume begins with an exposition of basic ideas of the theory of surfaces in three-dimensional Euclidean space, followed by an introduction of minimal surfaces as stationary points of area, or equivalently

  3. Qualification guideline of the German X-ray association (DRG) and the German association for interventional radiology and minimal invasive therapy (DeGIR) for the performance of interventional-radiological minimal invasive procedures on arteries and veins

    International Nuclear Information System (INIS)

    Buecker, A.; Gross-Fengels, W.; Haage, P.; Huppert, P.; Landwehr, P.; Loose, R.; Reimer, P.; Tacke, J.; Vorwerk, D.; Fischer, J.

    2012-01-01

    The topics covered in the qualification guideline of the German X-ray association (DRG) and the German association for interventional radiology and minimal invasive therapy (DeGIR) for the performance of interventional-radiological minimal invasive procedures on arteries and veins are the following. Practical qualification: aorta, iliac vessels and vessels of the upper and lower extremities; kidney and visceral arteries; head and neck arteries; dialysis shunts; veins and pulmonary arteries; aortic aneurysms and peripheral artery aneurysms. Knowledge acquisition concerning radiation protection: legal fundamentals, education and training, knowledge actualization and quality control, definition of the user and the procedure, competence preservation.

  4. The Use of Trust Regions in Kohn-Sham Total Energy Minimization

    International Nuclear Information System (INIS)

    Yang, Chao; Meza, Juan C.; Wang, Lin-wang

    2006-01-01

    The Self-Consistent Field (SCF) iteration, widely used for computing the ground state energy and the corresponding single-particle wave functions associated with a many-electron atomistic system, is viewed in this paper as an optimization procedure that minimizes the Kohn-Sham (KS) total energy indirectly by minimizing a sequence of quadratic surrogate functions. We point out the similarity and difference between the total energy and the surrogate, and show how the SCF iteration can fail when the minimizer of the surrogate produces an increase in the KS total energy. A trust region technique is introduced as a way to restrict the update of the wave functions within a small neighborhood of an approximate solution at which the gradient of the total energy agrees with that of the surrogate. The use of trust regions in SCF is not new. However, it has been observed that directly applying a trust-region-based SCF (TRSCF) to the Kohn-Sham total energy often leads to slow convergence. We propose to use TRSCF within a direct constrained minimization (DCM) algorithm we developed previously. The key ingredients of the DCM algorithm involve projecting the total energy function into a sequence of subspaces of small dimensions and seeking the minimizer of the total energy function within each subspace. The minimizer of a subspace energy function, which is computed by TRSCF, not only provides a search direction along which the KS total energy function decreases but also gives an optimal 'step length' that yields a sufficient decrease in total energy. A numerical example is provided to demonstrate that the combination of TRSCF and DCM is more efficient than SCF
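
    As a generic illustration of the trust-region idea of restricting each update to a neighborhood where the surrogate can be trusted, the toy below simply clips the raw step of a fixed-point iteration to a fixed radius. Everything here (the map g, the radius, the clipping rule) is invented for illustration; it is not the TRSCF or DCM algorithm of the paper.

      import numpy as np

      def damped_fixed_point(g, x0, radius=0.1, n_iter=200, tol=1e-10):
          # Fixed-point iteration x <- g(x) where each raw update is clipped to
          # a trust radius, keeping the new iterate near the current one.
          x = np.asarray(x0, dtype=float)
          for _ in range(n_iter):
              step = g(x) - x
              norm = np.linalg.norm(step)
              if norm > radius:                # restrict the update
                  step *= radius / norm
              x = x + step
              if norm < tol:
                  break
          return x

      # toy "self-consistency" map whose plain iteration oscillates: g(x) = -0.9x + 1
      g = lambda x: -0.9 * x + np.ones_like(x)
      print(damped_fixed_point(g, np.zeros(2)))   # ~[0.526, 0.526] = 1/1.9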

  5. Profiles of Motivated Self-Regulation in College Computer Science Courses: Differences in Major versus Required Non-Major Courses

    Science.gov (United States)

    Shell, Duane F.; Soh, Leen-Kiat

    2013-12-01

    The goal of the present study was to utilize a profiling approach to understand differences in motivation and strategic self-regulation among post-secondary STEM students in major versus required non-major computer science courses. Participants were 233 students from required introductory computer science courses (194 men; 35 women; 4 unknown) at a large Midwestern state university. Cluster analysis identified five profiles: (1) a strategic profile of a highly motivated by-any-means good strategy user; (2) a knowledge-building profile of an intrinsically motivated autonomous, mastery-oriented student; (3) a surface learning profile of a utility motivated minimally engaged student; (4) an apathetic profile of an amotivational disengaged student; and (5) a learned helpless profile of a motivated but unable to effectively self-regulate student. Among CS majors and students in courses in their major field, the strategic and knowledge-building profiles were the most prevalent. Among non-CS majors and students in required non-major courses, the learned helpless, surface learning, and apathetic profiles were the most prevalent. Students in the strategic and knowledge-building profiles had significantly higher retention of computational thinking knowledge than students in other profiles. Students in the apathetic and surface learning profiles saw little instrumentality of the course for their future academic and career objectives. Findings show that students in STEM fields taking required computer science courses exhibit the same constellation of motivated strategic self-regulation profiles found in other post-secondary and K-12 settings.

  6. Atomic-level computer simulation

    International Nuclear Information System (INIS)

    Adams, J.B.; Rockett, Angus; Kieffer, John; Xu Wei; Nomura, Miki; Kilian, K.A.; Richards, D.F.; Ramprasad, R.

    1994-01-01

    This paper provides a broad overview of the methods of atomic-level computer simulation. It discusses methods of modelling atomic bonding, and computer simulation methods such as energy minimization, molecular dynamics, Monte Carlo, and lattice Monte Carlo. ((orig.))
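
    Among the simulation methods surveyed above, Metropolis Monte Carlo is compact enough to sketch. The potential below is an invented one-dimensional double well standing in for a real atomistic energy surface; the defining step is accepting an uphill move with probability exp(-beta * dE).

      import numpy as np

      rng = np.random.default_rng(0)

      def energy(x):
          # Toy 1-D double-well potential with minima at x = +/- 1.
          return (x**2 - 1.0)**2

      def metropolis(x0, beta=5.0, step=0.3, n_steps=10_000):
          # Metropolis Monte Carlo: propose a random move, accept it with
          # probability min(1, exp(-beta * dE)).
          x, e = x0, energy(x0)
          best_x, best_e = x, e
          for _ in range(n_steps):
              x_new = x + rng.uniform(-step, step)
              e_new = energy(x_new)
              if e_new < e or rng.random() < np.exp(-beta * (e_new - e)):
                  x, e = x_new, e_new
                  if e < best_e:
                      best_x, best_e = x, e
          return best_x, best_e

      print(metropolis(x0=2.0))   # ends near one of the minima at x = +/- 1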

  7. Minimally Invasive Spinal Surgery with Intraoperative Image-Guided Navigation

    Directory of Open Access Journals (Sweden)

    Terrence T. Kim

    2016-01-01

    We present our perioperative minimally invasive spine surgery technique using intraoperative computed tomography image-guided navigation for the treatment of various lumbar spine pathologies. We present an illustrative case of a patient undergoing minimally invasive percutaneous posterior spinal fusion assisted by the O-arm system with navigation. We discuss the literature and the advantages of the technique over fluoroscopic imaging methods: lower occupational radiation exposure for operating room personnel, reduced need for postoperative imaging, and decreased revision rates. Most importantly, use of intraoperative cone-beam CT image-guided navigation has been reported to increase accuracy.

  8. Physically Embedded Minimal Self-Replicating Systems

    DEFF Research Database (Denmark)

    Fellermann, Harold

    Self-replication is a fundamental property of all living organisms, yet has only been accomplished to a limited extent in man-made systems. This thesis is part of the ongoing research endeavor to bridge the two sides of this gap. In particular, we present simulation results of a minimal life... for any model above the atomistic scale. This is achieved by deriving an alternative scaling procedure for interaction parameters in the model. We perform system-level simulations of the design which attempt to account for theoretical and experimental knowledge, as well as results from other...

  9. Controllers with Minimal Observation Power (Application to Timed Systems)

    DEFF Research Database (Denmark)

    Bulychev, Petr; Cassez, Franck; David, Alexandre

    2012-01-01

    We consider the problem of controller synthesis under imperfect information in a setting where there is a set of available observable predicates equipped with a cost function. The problem that we address is the computation of a subset of predicates sufficient for control and whose cost is minimal...

  10. Discrete Curvatures and Discrete Minimal Surfaces

    KAUST Repository

    Sun, Xiang

    2012-06-01

    This thesis presents an overview of some approaches to compute Gaussian and mean curvature on discrete surfaces and discusses discrete minimal surfaces. The variety of applications of differential geometry in visualization and shape design leads to great interest in studying discrete surfaces. With the rich smooth surface theory in hand, one would hope that this elegant theory can still be applied to the discrete counterpart. Such a generalization, however, is not always successful. While discrete surfaces have the advantage of being finite-dimensional, and thus easier to treat, their geometric properties such as curvatures are not well defined in the classical sense. Furthermore, the powerful calculus tool can hardly be applied. The methods in this thesis, including the angular defect formula, the cotangent formula, parallel meshes, relative geometry, etc., are approaches based on offset meshes or generalized offset meshes. As an important application, we discuss discrete minimal surfaces and discrete Koenigs meshes.
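
    The angular defect formula mentioned above is the simplest of these discrete curvature notions: the Gaussian curvature concentrated at an interior vertex is 2*pi minus the sum of the angles of the incident triangles. A minimal sketch with an invented one-ring of neighbors:

      import numpy as np

      def angle_defect(vertex, neighbors):
          # Discrete Gaussian curvature at an interior mesh vertex:
          # 2*pi minus the sum of the incident triangle angles at the vertex.
          # `neighbors` lists the one-ring vertices in cyclic order.
          v = np.asarray(vertex, dtype=float)
          total = 0.0
          n = len(neighbors)
          for i in range(n):
              a = np.asarray(neighbors[i], dtype=float) - v
              b = np.asarray(neighbors[(i + 1) % n], dtype=float) - v
              cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
              total += np.arccos(np.clip(cos, -1.0, 1.0))
          return 2.0 * np.pi - total

      # flat one-ring -> zero defect; lifting the center -> positive defect
      ring = [(np.cos(t), np.sin(t), 0.0)
              for t in np.linspace(0, 2 * np.pi, 6, endpoint=False)]
      print(angle_defect((0.0, 0.0, 0.0), ring))   # ~0.0
      print(angle_defect((0.0, 0.0, 0.5), ring))   # > 0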

  11. Minimal Reducts with Grasp

    Directory of Open Access Journals (Sweden)

    Iris Iddaly Mendez Gurrola

    2011-03-01

    The proper detection of a patient's level of dementia is important in order to offer suitable treatment. The diagnosis is based on certain criteria, reflected in clinical examinations. From these examinations emerge the limitations and the degree each patient is in. In order to reduce the total number of limitations to be evaluated, we used rough set theory; this theory has been applied in areas of artificial intelligence such as decision analysis, expert systems, knowledge discovery, and classification with multiple attributes. In our case the theory is applied to find the minimal set of limitations, or reduct, that generates the same classification as considering all the limitations. To fulfill this purpose we developed a GRASP (Greedy Randomized Adaptive Search Procedure) algorithm.
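
    To show the flavor of a GRASP search for a reduct, here is a hedged toy: the decision table, attribute values, and restricted-candidate-list size are all invented, and "generates the same classification" is checked by the usual discernibility test from rough set theory. The authors' actual algorithm and clinical data are of course different.

      import random

      # toy decision table: rows of (attribute values, decision class)
      ROWS = [((0, 1, 0), "mild"), ((1, 1, 0), "mild"),
              ((1, 0, 1), "severe"), ((0, 0, 1), "severe"),
              ((0, 0, 0), "mild")]

      def discerns(attrs):
          # True if the attribute subset separates every pair of rows
          # that belong to different decision classes.
          for (x, cx) in ROWS:
              for (y, cy) in ROWS:
                  if cx != cy and all(x[a] == y[a] for a in attrs):
                      return False
          return True

      def grasp_reduct(n_attrs, n_starts=20, rcl_size=2):
          # GRASP: greedy randomized construction of a discerning subset,
          # then pruning of redundant attributes; keep the best over restarts.
          best = list(range(n_attrs))
          for _ in range(n_starts):
              subset = []
              while not discerns(subset):
                  # restricted candidate list: a few attributes chosen at random
                  rcl = random.sample([a for a in range(n_attrs) if a not in subset],
                                      k=min(rcl_size, n_attrs - len(subset)))
                  subset.append(random.choice(rcl))
              for a in list(subset):                   # local pruning
                  if discerns([b for b in subset if b != a]):
                      subset.remove(a)
              if len(subset) < len(best):
                  best = subset
          return best

      print(grasp_reduct(3))   # [2]: a single attribute suffices on this table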

  12. Minimizing energy consumption for wireless computers in Moby Dick

    NARCIS (Netherlands)

    Havinga, Paul J.M.; Smit, Gerardus Johannes Maria

    1997-01-01

    The Moby Dick project is a joint European project to develop and define the architecture of a new generation of mobile hand-held computers, called Pocket Companions. The Pocket Companion is a hand-held device that is resource-poor, i.e. small amount of memory, limited battery life, low processing

  13. Features of construction of the individual trajectory education to computer science on the basis dynamic integrated estimation of level of knowledge

    Directory of Open Access Journals (Sweden)

    Ольга Юрьевна Заславская

    2010-12-01

    This article considers features of the mechanism for constructing an optimal trajectory of computer science education on the basis of a dynamic integrated assessment of the level of knowledge.

  14. Retrospective Evaluation of Minimal-Preparation Multidetector Computed Tomography in the Detection of Colorectal Neoplasms

    Directory of Open Access Journals (Sweden)

    Naci Üngür

    2015-06-01

    Purpose: The aim of this study was to retrospectively assess the contribution of minimal-preparation CT to the diagnosis of colorectal cancer in patients who were referred to the department of gastroenterology with a colorectal cancer prediagnosis and had a consequent colonoscopically visible mass with histopathological proof. Materials and methods: 100 consecutive cases referred from the department of gastroenterology between September 2008 and December 2012 with a confirmed colonoscopic mass diagnosis were included in our study (age range: 18–90; 41 females and 59 males). Radiological findings were statistically compared with pathological findings as the gold standard. Results: In these patients with a colonoscopically visible mass, minimal-preparation CT revealed asymmetric wall thickening (n: 89), extracolonic mass (n: 3), symmetric wall thickening (n: 2), and normal wall thickness (n: 6). 79 cases had enlarged lymph nodes in the pericolonic mesenteric fat tissue, while the remaining 21 had no enlarged lymph nodes. 54 cases had stranding in the pericolonic mesenteric fat tissue, and the remaining individuals showed normal fat density. The masses were located in the rectum (n: 54), sigmoid colon (n: 17), descending colon (n: 10), transverse colon (n: 2), ascending colon (n: 14), and cecum (n: 3). Conclusion: For colorectal and extracolonic mass investigation we recommend minimal-preparation CT, which is highly sensitive and more acceptable to patients.

  15. The evolution and future of minimalism in neurological surgery.

    Science.gov (United States)

    Liu, Charles Y; Wang, Michael Y; Apuzzo, Michael L J

    2004-11-01

    The evolution of the field of neurological surgery has been marked by a progressive minimalism. This has been evident in the development of an entire arsenal of modern neurosurgical enterprises, including microneurosurgery, neuroendoscopy, stereotactic neurosurgery, endovascular techniques, radiosurgical systems, intraoperative and navigational devices, and in the last decade, cellular and molecular adjuvants. In addition to reviewing the major developments and paradigm shifts in the cyclic reinvention of the field as it currently stands, this paper attempts to identify forces and developments that are likely to fuel the irresistible escalation of minimalism into the future. These forces include discoveries in computational science, imaging, molecular science, biomedical engineering, and information processing as they relate to the theme of minimalism. These areas are explained in the light of future possibilities offered by the emerging field of nanotechnology with molecular engineering.

  16. Minimal Liouville gravity correlation numbers from Douglas string equation

    International Nuclear Information System (INIS)

    Belavin, Alexander; Dubrovin, Boris; Mukhametzhanov, Baur

    2014-01-01

    We continue the study of (q,p) Minimal Liouville Gravity with the help of the Douglas string equation. We generalize the results of http://dx.doi.org/10.1016/0550-3213(91)90548-C and http://dx.doi.org/10.1088/1751-8113/42/30/304004, where the Lee-Yang series (2,2s+1) was studied, to (3,3s+p_0) Minimal Liouville Gravity, where p_0 = 1, 2. We demonstrate that there exist coordinates τ_{m,n} on the space of the perturbed Minimal Liouville Gravity theories in which the partition function of the theory is determined by the Douglas string equation. The coordinates τ_{m,n} are related in a non-linear fashion to the natural coupling constants λ_{m,n} of the perturbations of Minimal Liouville Gravity by the physical operators O_{m,n}. We find this relation from the requirement that the correlation numbers in Minimal Liouville Gravity must satisfy the conformal and fusion selection rules. After fixing this relation we compute three- and four-point correlation numbers when they are not zero. The results are in agreement with the direct calculations in Minimal Liouville Gravity available in the literature (http://dx.doi.org/10.1103/PhysRevLett.66.2051, http://dx.doi.org/10.1007/s11232-005-0003-3, http://dx.doi.org/10.1007/s11232-006-0075-8).

  17. A minimal unified model of disease trajectories captures hallmarks of multiple sclerosis

    KAUST Repository

    Kannan, Venkateshan; Kiani, Narsis A.; Piehl, Fredrik; Tegner, Jesper

    2017-01-01

    Multiple Sclerosis (MS) is an autoimmune disease targeting the central nervous system (CNS) causing demyelination and neurodegeneration leading to accumulation of neurological disability. Here we present a minimal, computational model involving

  18. USE OF ONTOLOGIES FOR KNOWLEDGE BASE CREATION IN TUTORING COMPUTER SYSTEMS

    Directory of Open Access Journals (Sweden)

    Cheremisina Lyubov

    2014-11-01

    This paper deals with the use of ontologies for the development of intelligent tutoring systems. We consider the shortcomings of educational software and distance learning systems and the advantages of using ontologies in their design. Creating educational computer systems based on systematized knowledge is a topical problem. We consider the classification, properties, uses, and benefits of ontologies. We characterize approaches to the problem of ontology mapping: the first is manual mapping; the second compares the names of concepts based on their lexical similarity, using special dictionaries. We analyze the languages available for the formal description of ontologies. We consider a formal mathematical model of ontologies and the ontology consistency problem, which is that different developers may create ontologies for the same domain that are syntactically or semantically heterogeneous, so that their use requires compatible translation or mapping. We present an algorithm for merging ontologies. Finally, we characterize the practical value of developing an ontology for electronic educational resources and give recommendations for further research and development, such as implementing the other components of system integration, formalizing the integration processes, and developing universal ontology-expansion algorithms and software.

  19. Use of Tablet Computers to Promote Physical Therapy Students' Engagement in Knowledge Translation During Clinical Experiences

    Science.gov (United States)

    Loeb, Kathryn; Barbosa, Sabrina; Jiang, Fei; Lee, Karin T.

    2016-01-01

    Background and Purpose: Physical therapists strive to integrate research into daily practice. The tablet computer is a potentially transformational tool for accessing information within the clinical practice environment. The purpose of this study was to measure and describe patterns of tablet computer use among physical therapy students during clinical rotation experiences. Methods: Doctor of physical therapy students (n = 13 users) tracked their use of tablet computers (iPad), loaded with commercially available apps, during 16 clinical experiences (6-16 weeks in duration). Results: The tablets were used on 70% of 691 clinic days, averaging 1.3 uses per day. Information seeking represented 48% of uses; 33% of those were foreground searches for research articles and syntheses and 66% were for background medical information. Other common uses included patient education (19%), medical record documentation (13%), and professional communication (9%). The most frequently used app was Safari, the preloaded web browser (representing 281 [36.5%] incidents of use). Users accessed 56 total apps to support clinical practice. Discussion and Conclusions: Physical therapy students successfully integrated use of a tablet computer into their clinical experiences including regular activities of information seeking. Our findings suggest that the tablet computer represents a potentially transformational tool for promoting knowledge translation in the clinical practice environment. Video Abstract available for more insights from the authors (see Supplemental Digital Content 1, http://links.lww.com/JNPT/A127). PMID:26945431

  20. A brief measure of Smokers' knowledge of lung cancer screening with low-dose computed tomography

    Directory of Open Access Journals (Sweden)

    Lisa M. Lowenstein

    2016-12-01

    Full Text Available We describe the development and psychometric properties of a new, brief measure of smokers' knowledge of lung cancer screening with low-dose computed tomography (LDCT. Content experts identified key facts smokers should know in making an informed decision about lung cancer screening. Sample questions were drafted and iteratively refined based on feedback from content experts and cognitive testing with ten smokers. The resulting 16-item knowledge measure was completed by 108 heavy smokers in Houston, Texas, recruited from 12/2014 to 09/2015. Item difficulty, item discrimination, internal consistency and test-retest reliability were assessed. Group differences based upon education levels and smoking history were explored. Several items were dropped due to ceiling effects or overlapping constructs, resulting in a 12-item knowledge measure. Additional items with high item uncertainty were retained because of their importance in informed decision making about lung cancer screening. Internal consistency reliability of the final scale was acceptable (KR-20 = 0.66 and test-retest reliability of the overall scale was 0.84 (intraclass correlation. Knowledge scores differed across education levels (F = 3.36, p = 0.04, while no differences were observed between current and former smokers (F = 1.43, p = 0.24 or among participants who met or did not meet the 30-pack-year screening eligibility criterion (F = 0.57, p = 0.45. The new measure provides a brief, valid and reliable indicator of smokers' knowledge of key concepts central to making an informed decision about lung cancer screening with LDCT, and can be part of a broader assessment of the quality of smokers' decision making about lung cancer screening.

  1. 7th International Conference on Intelligent Systems and Knowledge Engineering

    CERN Document Server

    Li, Tianrui; Li, Hongbo

    2014-01-01

    These proceedings present technical papers selected from the 2012 International Conference on Intelligent Systems and Knowledge Engineering (ISKE 2012), held on December 15-17 in Beijing. The aim of this conference is to bring together experts from different fields of expertise to discuss the state-of-the-art in Intelligent Systems and Knowledge Engineering, and to present new findings and perspectives on future developments. The proceedings introduce current scientific and technical advances in the fields of artificial intelligence, machine learning, pattern recognition, data mining, knowledge engineering, information retrieval, information theory, knowledge-based systems, knowledge representation and reasoning, multi-agent systems, and natural-language processing, etc. Furthermore they include papers on new intelligent computing paradigms, which combine new computing methodologies, e.g., cloud computing, service computing and pervasive computing with traditional intelligent methods. By presenting new method...

  2. Generating inferences from knowledge structures based on general automata

    Energy Technology Data Exchange (ETDEWEB)

    Koenig, E C

    1983-01-01

    The author shows that the model for knowledge structures for computers based on general automata accommodates procedures for establishing inferences. Algorithms are presented which generate inferences as output of a computer when its sentence input names appropriate knowledge elements contained in an associated knowledge structure already stored in the memory of the computer. The inferences are found to have either a single graph tuple or more than one graph tuple of associated knowledge. Six algorithms pertain to a single graph tuple and a seventh pertains to more than one graph tuple of associated knowledge. A named term is either the automaton, environment, auxiliary receptor, principal receptor, auxiliary effector, or principal effector. The algorithm pertaining to more than one graph tuple requires that the input sentence names the automaton, transformation response, and environment of one of the tuples of associated knowledge in a sequence of tuples. Interaction with the computer may be either in a conversation or examination mode. The algorithms are illustrated by an example. 13 references.

  3. Contribution to computer aided design of digital circuits - Minimization of alphanumeric expressions - Program CHOPIN

    International Nuclear Information System (INIS)

    Blanca, Ernest

    1974-10-01

    Alphanumeric Boolean expressions, written as sums of products and/or products of sums with many brackets, may be minimized in two steps: syntactic recognition analysis using an operator-precedence grammar, followed by syntactic reduction analysis. These two execution phases and the corresponding programs of the machine algorithm are described. Also discussed are examples of the minimization of alphanumeric Boolean expressions written with brackets, a usage note for the program CHOPIN, and theoretical considerations related to languages, grammars, operator-precedence grammars, sequential systems, Boolean sets, Boolean representations and the treatment of Boolean expressions, and Boolean matrices and their use in grammar theory. (author) [fr

  4. Participatory knowledge-management design: A semiotic approach

    DEFF Research Database (Denmark)

    Valtolina, Stefano; Barricelli, Barbara Rita; Dittrich, Yvonne

    2012-01-01

    The aim of this paper is to present a design strategy for collaborative knowledge-management systems based on a semiotic approach. The contents and structure of experts' knowledge are highly dependent on professional or individual practice. Knowledge-management systems that support cooperation ... vocabularies, notations, and suitable visual structures for navigating among interface elements. To this end, the paper describes how our semiotic approach supports processes for representing, storing, accessing, and transferring knowledge, through which the information architecture of an interactive system can ... a semiotic perspective on computer applications and human–computer interaction. From a semiotic perspective, the computer application is both a message from the designer to the user about the structure of the problem domain, as well as about interaction with it, and a structured channel for the user ...

  5. Ubiquitous mobile knowledge construction in collaborative learning environments.

    Science.gov (United States)

    Baloian, Nelson; Zurita, Gustavo

    2012-01-01

    Knowledge management is a critical activity for any organization. It has been said to be a differentiating factor and an important source of competitiveness if this knowledge is constructed and shared among its members, thus creating a learning organization. Knowledge construction is critical for any collaborative organizational learning environment. Nowadays workers must perform knowledge creation tasks while in motion, not just in static physical locations; therefore it is also required that knowledge construction activities be performed in ubiquitous scenarios, and supported by mobile and pervasive computational systems. These knowledge creation systems should help people in or outside organizations convert their tacit knowledge into explicit knowledge, thus supporting the knowledge construction process. Therefore in our understanding, we consider highly relevant that undergraduate university students learn about the knowledge construction process supported by mobile and ubiquitous computing. This has been a little explored issue in this field. This paper presents the design, implementation, and an evaluation of a system called MCKC for Mobile Collaborative Knowledge Construction, supporting collaborative face-to-face tacit knowledge construction and sharing in ubiquitous scenarios. The MCKC system can be used by undergraduate students to learn how to construct knowledge, allowing them anytime and anywhere to create, make explicit and share their knowledge with their co-learners, using visual metaphors, gestures and sketches to implement the human-computer interface of mobile devices (PDAs).

  6. Y-formalism and b ghost in the non-minimal pure spinor formalism of superstrings

    International Nuclear Information System (INIS)

    Oda, Ichiro; Tonin, Mario

    2007-01-01

    We present the Y-formalism for the non-minimal pure spinor quantization of superstrings. In the framework of this formalism we compute, at the quantum level, the explicit form of the compound operators involved in the construction of the b ghost, their normal-ordering contributions and the relevant relations among them. We use these results to construct the quantum-mechanical b ghost in the non-minimal pure spinor formalism. Moreover we show that this non-minimal b ghost is cohomologically equivalent to the non-covariant b ghost

  7. Distributed multiscale computing

    NARCIS (Netherlands)

    Borgdorff, J.

    2014-01-01

    Multiscale models combine knowledge, data, and hypotheses from different scales. Simulating a multiscale model often requires extensive computation. This thesis evaluates distributing these computations, an approach termed distributed multiscale computing (DMC). First, the process of multiscale

  8. Data structures and apparatuses for representing knowledge

    Science.gov (United States)

    Hohimer, Ryan E; Thomson, Judi R; Harvey, William J; Paulson, Patrick R; Whiting, Mark A; Tratz, Stephen C; Chappell, Alan R; Butner, Robert S

    2014-02-18

    Data structures and apparatuses to represent knowledge are disclosed. The processes can comprise labeling elements in a knowledge signature according to concepts in an ontology and populating the elements with confidence values. The data structures can comprise knowledge signatures stored on computer-readable media. The knowledge signatures comprise a matrix structure having elements labeled according to concepts in an ontology, wherein the value of the element represents a confidence that the concept is present in an information space. The apparatus can comprise a knowledge representation unit having at least one ontology stored on a computer-readable medium, at least one data-receiving device, and a processor configured to generate knowledge signatures by comparing datasets obtained by the data-receiving devices to the ontologies.
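
    The signature itself is a simple data structure: a matrix (reduced here to one labeled vector for brevity) whose elements are indexed by ontology concepts and filled with confidence values obtained by comparing a dataset against the ontology. The Python sketch below is hypothetical — the concept list, evidence table, and scaling rule are invented for illustration, not taken from the patent:

      import numpy as np

      # Ontology concepts label the elements of the signature.
      CONCEPTS = ['vehicle', 'weapon', 'person', 'location']

      def knowledge_signature(tokens, evidence):
          """Populate each concept element with a confidence in [0, 1].

          evidence maps a concept to the tokens that count as support for
          it -- a crude stand-in for comparing a dataset to an ontology."""
          sig = np.zeros(len(CONCEPTS))
          for i, concept in enumerate(CONCEPTS):
              hits = sum(tok in evidence[concept] for tok in tokens)
              sig[i] = min(1.0, hits / 3.0)   # ad hoc confidence scaling
          return dict(zip(CONCEPTS, sig))

      evidence = {'vehicle': {'truck', 'car'}, 'weapon': {'rifle'},
                  'person': {'driver'}, 'location': {'bridge', 'road'}}
      print(knowledge_signature(['truck', 'driver', 'road', 'car'], evidence))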

  9. Knowledge Uncertainty and Composed Classifier

    Czech Academy of Sciences Publication Activity Database

    Klimešová, Dana; Ocelíková, E.

    2007-01-01

    Vol. 1, No. 2 (2007), pp. 101-105 ISSN 1998-0140 Institutional research plan: CEZ:AV0Z10750506 Keywords: Boosting architecture * contextual modelling * composed classifier * knowledge management * knowledge * uncertainty Subject RIV: IN - Informatics, Computer Science

  10. Experience with the EPA manual for waste minimization opportunity assessments

    International Nuclear Information System (INIS)

    Bridges, J.S.

    1990-01-01

    The EPA Waste Minimization Opportunity Assessment Manual (EPA/625/788/003) was published to assist those responsible for managing waste-minimization activities at the waste-generating facility and at corporate levels. The Manual sets forth a procedure that incorporates technical and managerial principles and motivates people to develop and implement pollution-prevention concepts and ideas. Environmental management has increasingly become a cooperative endeavor whereby, whether in government, industry, or other forms of enterprise, the effectiveness with which people work together toward the attainment of a clean environment is largely determined by the ability of those who hold managerial positions. This paper offers a description of the Manual's procedure, which supports the waste-minimization assessment as a systematic, planned procedure with the objective of identifying ways to reduce or eliminate waste generation. The Manual is a management tool that blends science and management principles. The practice of managing waste minimization/pollution prevention makes use of the underlying organized science and engineering knowledge and applies it in the light of realities to gain a desired, practical result. The early stages of EPA's Pollution Prevention Research Program centered on the development of the Manual and its use at a number of facilities within the private and public sectors. This paper identifies a number of case studies and waste-minimization opportunity assessment reports that demonstrate the value of using the Manual's approach. Several industry-specific waste-minimization assessment manuals have resulted from the Manual's generic approach, with some modifications where the waste stream has been other than industrial hazardous waste.

  11. Gener: a minimal programming module for chemical controllers based on DNA strand displacement.

    Science.gov (United States)

    Kahramanoğulları, Ozan; Cardelli, Luca

    2015-09-01

    Gener is a development module for programming chemical controllers based on DNA strand displacement. Gener is developed with the aim of providing a simple interface that minimizes the opportunities for programming errors: it allows the user to test the computations of DNA programs based on a simple two-domain strand displacement algebra, the minimal algebra available so far. The tool allows the user to perform stepwise computations with respect to the rules of the algebra, as well as an exhaustive search of the computation space with different options for exploration and visualization. Gener can be used in combination with existing tools; in particular, its programs can be exported to Microsoft Research's DSD tool as well as to LaTeX. Gener is available for download at the Cosbi website at http://www.cosbi.eu/research/prototypes/gener as a Windows executable that can be run on Mac OS X and Linux by using Mono. ozan@cosbi.eu. © The Author 2015. Published by Oxford University Press.
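
    The record describes stepwise computation and exhaustive exploration of a strand-displacement program's computation space. As a rough, hypothetical illustration of that idea (not Gener's actual syntax, and far simpler than the two-domain algebra's real semantics), the Python sketch below treats the chemical soup as a multiset of species and exhaustively enumerates the states reachable under a single abstract displacement rule:

      from collections import Counter

      # Toy species: ('sig', d) is a free signal strand carrying domain d;
      # ('gate', a, b) is a gate that consumes signal a and releases signal b.
      # This single rewrite rule stands in for toehold-mediated displacement.
      def steps(state):
          """Yield every state reachable by one displacement event."""
          for sp in list(state):
              if sp[0] != 'sig':
                  continue
              for g in list(state):
                  if g[0] == 'gate' and g[1] == sp[1]:
                      nxt = state.copy()
                      nxt[sp] -= 1
                      nxt[g] -= 1
                      nxt += Counter({('sig', g[2]): 1, ('waste',): 1})
                      yield nxt   # Counter '+=' drops non-positive counts

      def explore(initial):
          """Breadth-first, exhaustive search of the computation space."""
          seen, frontier = set(), [initial]
          while frontier:
              nxt_frontier = []
              for s in frontier:
                  key = frozenset(s.items())
                  if key not in seen:
                      seen.add(key)
                      nxt_frontier.extend(steps(s))
              frontier = nxt_frontier
          return seen

      # Signal 'a' driven through two transducer gates a->b and b->c:
      init = Counter({('sig', 'a'): 1, ('gate', 'a', 'b'): 1, ('gate', 'b', 'c'): 1})
      for state in explore(init):
          print(sorted(state))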

  12. Minimally Invasive Technique for PMMA Augmentation of Fenestrated Screws

    Directory of Open Access Journals (Sweden)

    Jan-Helge Klingler

    2015-01-01

    Full Text Available Purpose. To describe the minimally invasive technique for cement augmentation of cannulated and fenestrated screws using an injection cannula, as well as to report its safety and efficacy. Methods. A total of 157 cannulated and fenestrated pedicle screws had been cement-augmented during minimally invasive posterior screw-rod spondylodesis in 35 patients from January to December 2012. Retrospective evaluation of cement extravasation and screw loosening was carried out in postoperative plain radiographs and thin-sliced triplanar computed tomography scans. Results. Twenty-seven, largely prevertebral, cement extravasations were detected in 157 screws (17.2%). None of the cement extravasations caused a clinical sequela such as a new neurological deficit. One screw loosening was noted (0.6%) after a mean follow-up of 12.8 months. We observed no cementation-associated complication such as pulmonary embolism or hemodynamic insufficiency. Conclusions. The presented minimally invasive cement augmentation technique using an injection cannula facilitates convenient and safe cement delivery through polyaxial cannulated and fenestrated screws during minimally invasive screw-rod spondylodesis. Nevertheless, the optimal injection technique and design of fenestrated screws have yet to be identified. This trial is registered with German Clinical Trials DRKS00006726.

  13. A convergent overlapping domain decomposition method for total variation minimization

    KAUST Repository

    Fornasier, Massimo

    2010-06-22

    In this paper we are concerned with the analysis of convergent sequential and parallel overlapping domain decomposition methods for the minimization of functionals formed by a discrepancy term with respect to the data and a total variation constraint. To our knowledge, this is the first successful attempt of addressing such a strategy for the nonlinear, nonadditive, and nonsmooth problem of total variation minimization. We provide several numerical experiments, showing the successful application of the algorithm for the restoration of 1D signals and 2D images in interpolation/inpainting problems, respectively, and in a compressed sensing problem, for recovering piecewise constant medical-type images from partial Fourier ensembles. © 2010 Springer-Verlag.

  14. An investigation of the artifacts, outcomes, and processes of constructing computer games about environmental science in a fifth grade science classroom

    Science.gov (United States)

    Baytak, Ahmet

    Among educational researchers and practitioners, there is a growing interest in employing computer games for pedagogical purposes. The present research integrated a technology education class and a science class in which 5th graders learned about environmental issues by designing games that involved environmental concepts. The purposes of this study were to investigate how designing computer games affected the development of students' environmental knowledge, programming knowledge, environmental awareness, and interest in computers. It also explored the nature of the artifacts developed and the types of knowledge represented therein. A case study (Yin, 2003) was employed within the context of a 5th grade elementary science classroom. Fifth graders designed computer games about environmental issues to present to 2nd graders by using Scratch software. The analysis of this study was based on multiple data sources: students' pre- and post-test scores on environmental awareness, their environmental knowledge, their interest in computer science, and their game designs. Included in the analyses were also data from students' computer games, participant observations, and structured interviews. The results of the study showed that students were able to successfully design functional games that represented their understanding of the environment, even though the gains between pre- and post-tests of environmental knowledge and environmental awareness were minimal. The findings indicate that all students were able to use various game characteristics and programming concepts, but their prior experience with the design software affected their representations. The analyses of the interview transcriptions and games show that students improved their programming skills and that they wanted to do similar projects for other subject areas in the future. Observations showed that game design appeared to lead to knowledge-building, interaction, and collaboration among students. This, in turn...

  15. A computer-aided collection and construction system of terminology based on a statistically built knowledge base

    International Nuclear Information System (INIS)

    Konishi, O.; Miyahara, A.

    1988-04-01

    Since a terminology is a system of concepts to which terms are assigned, automatic processing of such concepts seems difficult to realize. In this paper, we propose a computer-aided collection and construction system for terminology that automatically creates a knowledge base, using the technical terms and their appearance-frequency information extracted from a bibliographic database, and selects the terminologies based on that knowledge base. With the system, experts in specialized fields can be offered a set of candidate technical terms and can determine the needed terminologies through the following important procedures: 1) the relations among important concepts in a specified field can be extracted; 2) the concepts in the lower classes can be searched by specifying those in the upper classes; 3) the important terms can be distinguished from the basic terms, and the most important terms can then be selected. An application of the present system to nuclear fusion research is described in detail. (author)

  16. Metacognitive Knowledge in Relation to Inquiry Skills and Knowledge Acquisition Within a Computer-Supported Inquiry Learning Environment

    Directory of Open Access Journals (Sweden)

    Zrinka Ristić Dedić

    2014-04-01

    Full Text Available The study examines two components of metacognitive knowledge in the context of inquiry learning: metatask and metastrategic. Existing work on the topic has shown that adolescents often lack the metacognitive understanding necessary for optimal inquiry learning (Keselman & Kuhn, 2002; Kuhn, 2002a; Kuhn, Black, Keselman, & Kaplan, 2000), but demonstrated that engagement with inquiry tasks may improve it (Keselman, 2003; Kuhn & Pearsall, 1998). The aim of the study is to investigate the gains in metacognitive knowledge that occur as a result of repeated engagement with an inquiry learning task, and to examine the relationship between metacognitive knowledge and performance on the task. The participants were 34 eighth-grade pupils, who participated in a self-directed experimentation task using the FILE programme (Hulshof, Wilhelm, Beishuizen, & van Rijn, 2005). The task required pupils to design and conduct experiments and to make inferences regarding the causal structure of a multivariable system. Pupils participated in four learning sessions over the course of one month. Metacognitive knowledge was assessed by questionnaire before and after working in FILE. The results indicate that pupils improved in metacognitive knowledge following engagement with the task. However, many pupils showed insufficient metacognitive knowledge in the post-test and failed to apply newly achieved knowledge to the transfer task. Pupils who attained a higher level of metacognitive knowledge were more successful on the task than pupils who did not improve on metacognitive knowledge. A particular level of metacognitive understanding is thus a necessary, but not sufficient, condition for successful performance on the task.

  17. Development of an instrument to measure health center (HC) personnel's computer use, knowledge and functionality demand for HC computerized information system in Thailand.

    Science.gov (United States)

    Kijsanayotin, Boonchai; Pannarunothai, Supasit; Speedie, Stuart

    2005-01-01

    Knowledge about socio-technical aspects of information technology (IT) is vital for the success of health IT projects. The Thai health administration anticipates using health IT to support the recently implemented national universal health care system. However, national knowledge associated with the socio-technical aspects of health IT has not been studied in Thailand. A survey instrument was developed to measure health center (HC) personnel's computer use, basic IT knowledge, and HC computerized information system functionality needs. The instrument shows acceptable test-retest reliability and reasonable internal consistency of the measures. A future nation-wide demonstration study will benefit from this work.

  18. Smooth surfaces from bilinear patches: Discrete affine minimal surfaces

    KAUST Repository

    Käferböck, Florian

    2013-06-01

    Motivated by applications in freeform architecture, we study surfaces which are composed of smoothly joined bilinear patches. These surfaces turn out to be discrete versions of negatively curved affine minimal surfaces and share many properties with their classical smooth counterparts. We present computational design approaches and study special cases which should be interesting for the architectural application. © 2013 Elsevier B.V.

  19. Why do people show minimal knowledge updating with task experience: inferential deficit or experimental artifact?

    Science.gov (United States)

    Hertzog, Christopher; Price, Jodi; Burpee, Ailis; Frentzel, William J; Feldstein, Simeon; Dunlosky, John

    2009-01-01

    Students generally do not have highly accurate knowledge about strategy effectiveness for learning, such as that imagery is superior to rote repetition. During multiple study-test trials using both strategies, participants' predictions about performance on List 2 do not markedly differ for the two strategies, even though List 1 recall is substantially greater for imagery. Two experiments evaluated whether such deficits in knowledge updating about the strategy effects were due to an experimental artifact or to inaccurate inferences about the effects the strategies had on recall. Participants studied paired associates on two study-test trials--they were instructed to study half using imagery and half using rote repetition. Metacognitive judgements tapped the quality of inferential processes about the strategy effects during the List 1 test and tapped gains in knowledge about the strategies across lists. One artifactual explanation--noncompliance with strategy instructions--was ruled out, whereas manipulations aimed at supporting the data available to inferential processes improved but did not fully repair knowledge updating.

  20. Applying Minimal Manual Principles for Documentation of Graphical User Interfaces.

    Science.gov (United States)

    Nowaczyk, Ronald H.; James, E. Christopher

    1993-01-01

    Investigates the need to include computer screens in documentation for software using a graphical user interface. Describes the uses and purposes of "minimal manuals" and their principles. Studies student reaction to their use of one of three on-screen manuals: screens, icon, and button. Finds some benefit for including icon and button…

  1. Logically automorphically equivalent knowledge bases

    OpenAIRE

    Aladova, Elena; Plotkin, Tatjana

    2017-01-01

    Knowledge base theory provides an important example of a field where applications of universal algebra and algebraic logic look very natural, and where their interaction with practical problems arising in computer science can be very productive. In this paper we study the equivalence problem for knowledge bases. Our interest is to find out how informational equivalence is related to the logical description of knowledge. Studying various equivalences of knowledge bases allows us to compare d...

  2. Cognitive radio adaptation for power consumption minimization using biogeography-based optimization

    International Nuclear Information System (INIS)

    Qi Pei-Han; Zheng Shi-Lian; Yang Xiao-Niu; Zhao Zhi-Jin

    2016-01-01

    Adaptation is one of the key capabilities of cognitive radio, which focuses on how to adjust the radio parameters to optimize the system performance based on the knowledge of the radio environment and its capability and characteristics. In this paper, we consider the cognitive radio adaptation problem for power consumption minimization. The problem is formulated as a constrained power consumption minimization problem, and the biogeography-based optimization (BBO) is introduced to solve this optimization problem. A novel habitat suitability index (HSI) evaluation mechanism is proposed, in which both the power consumption minimization objective and the quality of services (QoS) constraints are taken into account. The results show that under different QoS requirement settings corresponding to different types of services, the algorithm can minimize power consumption while still maintaining the QoS requirements. Comparison with particle swarm optimization (PSO) and cat swarm optimization (CSO) reveals that BBO works better, especially at the early stage of the search, which means that the BBO is a better choice for real-time applications. (paper)
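
    To make the search concrete, here is a minimal, self-contained BBO loop. A toy sphere objective stands in for the paper's power-consumption cost, and all parameter values are illustrative assumptions, not the paper's settings:

      import random

      def bbo_minimize(cost, dim, pop_size=20, generations=100,
                       lower=-1.0, upper=1.0, mutation_prob=0.05):
          """Minimal biogeography-based optimization loop (toy sketch)."""
          pop = [[random.uniform(lower, upper) for _ in range(dim)]
                 for _ in range(pop_size)]
          for _ in range(generations):
              pop.sort(key=cost)   # best (lowest-cost) habitats first
              # Rank-based rates: good habitats emigrate more, immigrate less.
              mu = [1 - i / (pop_size - 1) for i in range(pop_size)]   # emigration
              lam = [i / (pop_size - 1) for i in range(pop_size)]      # immigration
              new_pop = [pop[0][:]]   # elitism: keep the best habitat
              for i in range(1, pop_size):
                  habitat = pop[i][:]
                  for d in range(dim):
                      if random.random() < lam[i]:
                          # Copy feature d from a habitat chosen by emigration rate.
                          src = random.choices(range(pop_size), weights=mu)[0]
                          habitat[d] = pop[src][d]
                      if random.random() < mutation_prob:
                          habitat[d] = random.uniform(lower, upper)
                  new_pop.append(habitat)
              pop = new_pop
          return min(pop, key=cost)

      # Toy stand-in objective: minimize a sum of squares ("power").
      best = bbo_minimize(lambda x: sum(v * v for v in x), dim=5)
      print(best)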

  3. Local empathy provides global minimization of congestion in communication networks

    Science.gov (United States)

    Meloni, Sandro; Gómez-Gardeñes, Jesús

    2010-11-01

    We present a mechanism to avoid congestion in complex networks based on a local knowledge of traffic conditions and the ability of routers to self-coordinate their dynamical behavior. In particular, routers make use of local information about traffic conditions to either reject or accept information packets from their neighbors. We show that when nodes are only aware of their own congestion state they self-organize into a hierarchical configuration that remarkably delays the onset of congestion, although it leads to a sharp first-order-like congestion transition. We also consider the case when nodes are aware of the congestion state of their neighbors. In this case, we show that empathy between nodes is strongly beneficial to the overall performance of the system, and it is possible to achieve larger values for the critical load together with a smooth, second-order-like, transition. Finally, we show how local empathy minimizes the impact of congestion as much as global minimization. Therefore, here we present an outstanding example of how local dynamical rules can optimize the system’s functioning up to the levels reached using global knowledge.

  4. Computer-aided Fault Tree Analysis

    International Nuclear Information System (INIS)

    Willie, R.R.

    1978-08-01

    A computer-oriented methodology for deriving minimal cut and path set families associated with arbitrary fault trees is discussed first. Then the use of the Fault Tree Analysis Program (FTAP), an extensive FORTRAN computer package that implements the methodology is described. An input fault tree to FTAP may specify the system state as any logical function of subsystem or component state variables or complements of these variables. When fault tree logical relations involve complements of state variables, the analyst may instruct FTAP to produce a family of prime implicants, a generalization of the minimal cut set concept. FTAP can also identify certain subsystems associated with the tree as system modules and provide a collection of minimal cut set families that essentially expresses the state of the system as a function of these module state variables. Another FTAP feature allows a subfamily to be obtained when the family of minimal cut sets or prime implicants is too large to be found in its entirety; this subfamily consists only of sets that are interesting to the analyst in a special sense
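
    FTAP itself is a FORTRAN package, but the core idea — expand gates bottom-up into families of cut sets, then discard non-minimal members — is compact enough to sketch. The toy below (hypothetical fault tree; Python used for brevity) computes minimal cut sets only, not prime implicants or modules:

      from itertools import product

      # A fault tree is a basic-event name (str) or ('AND'|'OR', child, ...).
      def cut_sets(node):
          """Family of cut sets (frozensets of basic events) for a subtree."""
          if isinstance(node, str):
              return {frozenset([node])}
          op, *kids = node
          families = [cut_sets(k) for k in kids]
          if op == 'OR':    # union of the children's families
              return set().union(*families)
          if op == 'AND':   # merge one cut set from each child, all ways
              return {frozenset().union(*combo) for combo in product(*families)}
          raise ValueError(op)

      def minimal(family):
          """Discard any cut set that is a proper superset of another."""
          return {s for s in family if not any(t < s for t in family)}

      # TOP = pump OR (valve AND (power OR operator))
      tree = ('OR', 'pump', ('AND', 'valve', ('OR', 'power', 'operator')))
      for cs in minimal(cut_sets(tree)):
          print(sorted(cs))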

  5. Cyclone Simulation via Action Minimization

    Science.gov (United States)

    Plotkin, D. A.; Weare, J.; Abbot, D. S.

    2016-12-01

    A postulated impact of climate change is an increase in the intensity of tropical cyclones (TCs). This hypothesized effect results from the fact that TCs are powered by subsaturated boundary layer air picking up water vapor from the surface ocean as it flows inwards towards the eye. This water vapor serves as the energy input for TCs, which can be idealized as heat engines. The inflowing air has nearly the same temperature as the surface ocean; therefore, warming of the surface leads to a warmer atmospheric boundary layer. By the Clausius-Clapeyron relationship, warmer boundary layer air can hold more water vapor and thus results in more energetic storms. Changes in TC intensity are difficult to predict due to the presence of fine structures (e.g. convective structures and rainbands) with length scales of less than 1 km, while general circulation models (GCMs) generally have horizontal resolutions of tens of kilometers. The models are therefore unable to capture these features, which are critical to accurately simulating cyclone structure and intensity. Further, strong TCs are rare events, meaning that long multi-decadal simulations are necessary to generate meaningful statistics about intense TC activity. This adds to the computational expense, making it yet more difficult to generate accurate statistics about long-term changes in TC intensity due to global warming via direct simulation. We take an alternative approach, applying action minimization techniques developed in molecular dynamics to the WRF weather/climate model. We construct artificial model trajectories that lead from quiescent (TC-free) states to TC states, then minimize the deviation of these trajectories from true model dynamics. We can thus create Monte Carlo model ensembles that are biased towards cyclogenesis, which reduces computational expense by limiting time spent in non-TC states. This allows for: 1) selective interrogation of model states with TCs; 2) finding the likeliest paths for...

  6. A Novel Approach for Risk Minimization in Life-Cycle Oil Production Optimization

    DEFF Research Database (Denmark)

    Capolei, Andrea; Christiansen, Lasse Hjuler; Jørgensen, John Bagterp

    2017-01-01

    The oil research community has invested much effort into computer-aided optimization to enhance oil recovery. While simulation studies have demonstrated the potential of model-based technology to improve industrial standards, the largely unknown geology of subsurface reservoirs limits applications ... to commercial oil fields. In particular, uncertain model descriptions lead to risks of profit loss. To address the challenges of geological uncertainty, this paper proposes offset risk minimization. As opposed to existing methodologies of the oil literature, the offset approach minimizes risk of profit loss...

  7. Analyzing Subject Disciplines of Knowledge Originality and Knowledge Generality for Library & Information Science

    Directory of Open Access Journals (Sweden)

    Mu-Hsuan Huang

    2007-12-01

    Full Text Available This study used bibliometric methods to analyze the subject disciplines of knowledge originality and knowledge generality for Library and Information Science (LIS), based on citing and cited documents from 1997 to 2006. We found that the major subject discipline of both knowledge originality and knowledge generality is still LIS, and that computer science and LIS interact and influence each other closely. The number of subject disciplines of knowledge originality is higher than that of knowledge generality. The interdisciplinary character of LIS is illustrated by the variety of areas of knowledge originality and knowledge generality. Because the number of received subject disciplines is higher than that of given subject disciplines, this suggests that LIS is an application-oriented research area. [Article content in Chinese]

  8. Bridging the Science-Management Divide: Moving from Unidirectional Knowledge Transfer to Knowledge Interfacing and Sharing

    Directory of Open Access Journals (Sweden)

    Dirk J. Roux

    2006-06-01

    Full Text Available Sustainable ecosystem management relies on a diverse and multi-faceted knowledge system in which techniques are continuously updated to reflect current understanding and needs. The challenge is to minimize delay as ideas flow from intent through scientific capability, and finally to implementation to achieve desired outcomes. The best way to do this is by setting the stage for the flow of knowledge between researchers, policy makers, and resource managers. The cultural differences between these groups magnify the challenge. This paper highlights the importance of the tacit dimension of knowledge, and how this renders the concept of knowledge transfer much less useful than the concepts of information transfer and technology transfer. Instead of knowledge transfer, we propose that "co-production" of knowledge through collaborative learning between "experts" and "users" is a more suitable approach to building a knowledge system for the sustainable management of ecosystems. This can be achieved through knowledge interfacing and sharing, but requires a shift from a view of knowledge as a "thing" that can be transferred to viewing knowledge as a "process of relating" that involves negotiation of meaning among partners. Lessons from informal communities of practice provide guidance on how to nurture and promote knowledge interfacing between science and management in R&D programs.

  9. Minimal representations of supersymmetry and 1D N-extended σ-models

    International Nuclear Information System (INIS)

    Toppan, Francesco

    2008-01-01

    We discuss the minimal representations of the 1D N-extended supersymmetry algebra (the Z_2-graded symmetry algebra of supersymmetric quantum mechanics) linearly realized on a finite number of fields depending on a real parameter t, the time. Knowledge of these representations allows one to construct one-dimensional sigma-models with extended off-shell supersymmetries without using superfields (author)

  10. Estimating biological elementary flux modes that decompose a flux distribution by the minimal branching property

    DEFF Research Database (Denmark)

    Chan, Siu Hung Joshua; Solem, Christian; Jensen, Peter Ruhdal

    2014-01-01

    ... biologically feasible EFMs by considering their graphical properties. A previous study on the transcriptional regulation of metabolic genes found that distinct branches at a branch point metabolite usually belong to distinct metabolic pathways. This suggests an intuitive property of biologically feasible EFMs, i.e. minimal branching. RESULTS: We developed the concept of the minimal branching EFM and derived the minimal branching decomposition (MBD) to decompose flux distributions. Testing in the core Escherichia coli metabolic network indicated that MBD can distinguish branches at branch points and greatly ... knowledge, which facilitates interpretation. Comparison of the methods applied to a complex flux distribution in Lactococcus lactis similarly showed the advantages of MBD. The minimal branching EFM concept underlying MBD should be useful in other applications.

  11. Image reconstruction in circular cone-beam computed tomography by constrained, total-variation minimization

    International Nuclear Information System (INIS)

    Sidky, Emil Y; Pan Xiaochuan

    2008-01-01

    An iterative algorithm, based on recent work in compressive sensing, is developed for volume image reconstruction from a circular cone-beam scan. The algorithm minimizes the total variation (TV) of the image subject to the constraint that the estimated projection data is within a specified tolerance of the available data and that the values of the volume image are non-negative. The constraints are enforced by the use of projection onto convex sets (POCS) and the TV objective is minimized by steepest descent with an adaptive step-size. The algorithm is referred to as adaptive-steepest-descent-POCS (ASD-POCS). It appears to be robust against cone-beam artifacts, and may be particularly useful when the angular range is limited or when the angular sampling rate is low. The ASD-POCS algorithm is tested with the Defrise disk and jaw computerized phantoms. Some comparisons are performed with the POCS and expectation-maximization (EM) algorithms. Although the algorithm is presented in the context of circular cone-beam image reconstruction, it can also be applied to scanning geometries involving other x-ray source trajectories
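
    The alternation at the heart of ASD-POCS — project onto the constraint sets, then take a small steepest-descent step on the TV objective — can be sketched compactly. The toy below substitutes a simple inpainting constraint (known pixel values) for the cone-beam data constraint and uses a fixed rather than adaptive step size, so it illustrates the structure only:

      import numpy as np

      def tv_gradient(u, eps=1e-8):
          """Gradient of the smoothed isotropic TV of image u."""
          ux = np.roll(u, -1, axis=0) - u   # forward differences
          uy = np.roll(u, -1, axis=1) - u
          mag = np.sqrt(ux**2 + uy**2 + eps)
          px, py = ux / mag, uy / mag
          # adjoint (negative divergence) via backward differences
          return -((px - np.roll(px, 1, axis=0)) + (py - np.roll(py, 1, axis=1)))

      def asd_pocs_toy(data, known, n_iter=200, step=0.2):
          u = np.zeros_like(data)
          for _ in range(n_iter):
              u[known] = data[known]        # POCS: enforce data constraint
              np.clip(u, 0, None, out=u)    # POCS: enforce non-negativity
              g = tv_gradient(u)
              gnorm = np.linalg.norm(g)
              if gnorm > 0:
                  u -= step * g / gnorm     # steepest descent on TV
          return u

      # Recover a piecewise-constant image from 40% of its pixels.
      rng = np.random.default_rng(0)
      truth = np.zeros((32, 32)); truth[8:24, 8:24] = 1.0
      known = rng.random(truth.shape) < 0.4
      print(np.abs(asd_pocs_toy(truth, known) - truth).mean())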

  12. Computing the eigenvalues and eigenvectors of a fuzzy matrix

    Directory of Open Access Journals (Sweden)

    A. Kumar

    2012-08-01

    Full Text Available Computation of fuzzy eigenvalues and fuzzy eigenvectors of a fuzzy matrix is a challenging problem. Determining the maximal and minimal symmetric solutions can help to find the eigenvalues. So, we try to compute these eigenvalues by determining the maximal and minimal symmetric solutions of the fully fuzzy linear system $\widetilde{A}\widetilde{X} = \widetilde{\lambda}\widetilde{X}$.

  13. Complex functionality with minimal computation: Promise and pitfalls of reduced-tracer ocean biogeochemistry models

    Science.gov (United States)

    Galbraith, Eric D.; Dunne, John P.; Gnanadesikan, Anand; Slater, Richard D.; Sarmiento, Jorge L.; Dufour, Carolina O.; de Souza, Gregory F.; Bianchi, Daniele; Claret, Mariona; Rodgers, Keith B.; Marvasti, Seyedehsafoura Sedigh

    2015-12-01

    Earth System Models increasingly include ocean biogeochemistry models in order to predict changes in ocean carbon storage, hypoxia, and biological productivity under climate change. However, state-of-the-art ocean biogeochemical models include many advected tracers, that significantly increase the computational resources required, forcing a trade-off with spatial resolution. Here, we compare a state-of-the art model with 30 prognostic tracers (TOPAZ) with two reduced-tracer models, one with 6 tracers (BLING), and the other with 3 tracers (miniBLING). The reduced-tracer models employ parameterized, implicit biological functions, which nonetheless capture many of the most important processes resolved by TOPAZ. All three are embedded in the same coupled climate model. Despite the large difference in tracer number, the absence of tracers for living organic matter is shown to have a minimal impact on the transport of nutrient elements, and the three models produce similar mean annual preindustrial distributions of macronutrients, oxygen, and carbon. Significant differences do exist among the models, in particular the seasonal cycle of biomass and export production, but it does not appear that these are necessary consequences of the reduced tracer number. With increasing CO2, changes in dissolved oxygen and anthropogenic carbon uptake are very similar across the different models. Thus, while the reduced-tracer models do not explicitly resolve the diversity and internal dynamics of marine ecosystems, we demonstrate that such models are applicable to a broad suite of major biogeochemical concerns, including anthropogenic change. These results are very promising for the further development and application of reduced-tracer biogeochemical models that incorporate "sub-ecosystem-scale" parameterizations.

  14. Processes, data structures, and apparatuses for representing knowledge

    Science.gov (United States)

    Hohimer, Ryan E [West Richland, WA; Thomson, Judi R [Guelph, CA; Harvey, William J [Richland, WA; Paulson, Patrick R [Pasco, WA; Whiting, Mark A [Richland, WA; Tratz, Stephen C [Richland, WA; Chappell, Alan R [Seattle, WA; Butner, R Scott [Richland, WA

    2011-09-20

    Processes, data structures, and apparatuses to represent knowledge are disclosed. The processes can comprise labeling elements in a knowledge signature according to concepts in an ontology and populating the elements with confidence values. The data structures can comprise knowledge signatures stored on computer-readable media. The knowledge signatures comprise a matrix structure having elements labeled according to concepts in an ontology, wherein the value of the element represents a confidence that the concept is present in an information space. The apparatus can comprise a knowledge representation unit having at least one ontology stored on a computer-readable medium, at least one data-receiving device, and a processor configured to generate knowledge signatures by comparing datasets obtained by the data-receiving devices to the ontologies.

  15. Knowledge Representation: A Brief Review.

    Science.gov (United States)

    Vickery, B. C.

    1986-01-01

    Reviews different structures and techniques of knowledge representation: the structure of database records and files, data structures in computer programming, the syntactic and semantic structure of natural language, knowledge representation in artificial intelligence, and models of human memory. A prototype expert system that makes use of some of these…

  16. A new methodology for minimizing investment in the development of offshore fields

    International Nuclear Information System (INIS)

    Garcia-Diaz, J.C.; Startzman, R.; Hogg, G.L.

    1996-01-01

    The development of an offshore field is often a long, complex, and extremely expensive undertaking. The enormous amount of capital required for making investments of this type motivates one to try to optimize the development of a field. This paper provides an efficient computational method to minimize the initial investment in the development of a field. The problem of minimizing the investment in an offshore field is defined here as the problem of locating a number of offshore facilities and wells and allocating these wells to the facilities at minimum cost. Side constraints include restrictions on the total number of facilities of every type and design and various technology constraints. This problem is modeled as a 0/1 integer program. The solution method is based on an implicit enumeration scheme using efficient mathematical tools, such as Lagrangian relaxation and heuristics, to calculate good bounds and, consequently, to reduce the computation time. The solution method was implemented and tested on some typical field-development cases. Execution times were remarkably small for the size and complexity of the examples. Computational results indicate that the new methodology outperforms existing methods both in execution time and in memory required
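
    The flavor of the 0/1 model is easy to convey with a toy instance (all numbers hypothetical): open a subset of platforms at fixed cost and tie each well back to its cheapest opened platform. Real instances add capacity and technology constraints and prune the search with Lagrangian bounds and heuristics, as the paper does, rather than enumerating exhaustively:

      from itertools import product

      # Toy instance: fixed cost of installing each platform, and the cost
      # of tying back each well to each platform.
      open_cost = [50.0, 65.0]    # platforms A, B
      connect = [[10.0, 22.0],    # well 0 -> A, B
                 [18.0, 9.0],
                 [25.0, 11.0]]

      def investment(opened):
          """Total cost when each well uses its cheapest opened platform."""
          if not any(opened):
              return float('inf')   # infeasible: no platform installed
          fixed = sum(c for c, o in zip(open_cost, opened) if o)
          tieback = sum(min(row[p] for p, o in enumerate(opened) if o)
                        for row in connect)
          return fixed + tieback

      # Brute-force stand-in for implicit enumeration over 0/1 decisions.
      best = min(product([0, 1], repeat=len(open_cost)), key=investment)
      print(best, investment(best))   # -> (1, 0) 103.0 for this toy data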

  17. Comparing levels of school performance to science teachers' reports on knowledge/skills, instructional use and student use of computers

    Science.gov (United States)

    Kerr, Rebecca

    The purpose of this descriptive quantitative and basic qualitative study was to examine fifth and eighth grade science teachers' responses, perceptions of the role of technology in the classroom, and how they felt that computer applications, tools, and the Internet influence student understanding. The purposeful sample included survey and interview responses from fifth grade and eighth grade general and physical science teachers. Even though they may not be generalizable to other teachers or classrooms due to a low response rate, findings from this study indicated teachers with fewer years of teaching science had a higher level of computer use but less computer access, especially for students, in the classroom. Furthermore, teachers' choice of professional development moderated the relationship between the level of school performance and teachers' knowledge/skills, with the most positive relationship being with workshops that occurred outside of the school. Eighteen interviews revealed that teachers perceived the role of technology in classroom instruction mainly as teacher-centered and supplemental, rather than student-centered activities.

  18. A computational model for knowledge-driven monitoring of nuclear power plant operators based on information theory

    International Nuclear Information System (INIS)

    Kim, Man Cheol; Seong, Poong Hyun

    2006-01-01

    To develop operator behavior models such as IDAC, quantitative models for the cognitive activities of nuclear power plant (NPP) operators in abnormal situations are essential. Among them, only a few quantitative models for monitoring and detection have been developed. In this paper, we propose a computational model for the knowledge-driven monitoring, also known as model-driven monitoring, of NPP operators in abnormal situations, based on information theory. The basic assumption of the proposed model is that the probability that an operator shifts his or her attention to an information source is proportional to the expected information from that source. A small experiment performed to evaluate the feasibility of the proposed model shows that the predictions made by the model correlate highly with the experimental results. Even though it has been argued that heuristics might play an important role in human reasoning, we believe that the proposed model can provide part of the mathematical basis for developing quantitative models of knowledge-driven monitoring of NPP operators when operators are assumed to behave very logically.
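
    The model's central assumption — attention shifts to an information source with probability proportional to the expected information from that source — reduces to a normalized Shannon entropy. A minimal sketch follows; the indicator names and outcome probabilities are invented for illustration:

      import math

      def expected_information(probs):
          """Shannon entropy (bits) of an indicator's outcome distribution."""
          return -sum(p * math.log2(p) for p in probs if p > 0)

      # Hypothetical indicators with the operator's current outcome beliefs.
      indicators = {
          'steam generator level': [0.5, 0.5],    # 1.00 bit expected
          'pressurizer pressure':  [0.9, 0.1],    # ~0.47 bit
          'containment radiation': [0.99, 0.01],  # ~0.08 bit
      }

      info = {k: expected_information(p) for k, p in indicators.items()}
      total = sum(info.values())
      attention = {k: h / total for k, h in info.items()}
      print(attention)   # attention probability, proportional to expected info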

  19. Context-dependent memory decay is evidence of effort minimization in motor learning: a computational study.

    Science.gov (United States)

    Takiyama, Ken

    2015-01-01

    Recent theoretical models suggest that motor learning includes at least two processes: error minimization and memory decay. While learning a novel movement, a motor memory of the movement is gradually formed to minimize the movement error between the desired and actual movements in each training trial, but the memory is slightly forgotten in each trial. The learning effects of error minimization trained with a certain movement are partially available in other non-trained movements, and this transfer of the learning effect can be reproduced by certain theoretical frameworks. Although most theoretical frameworks have assumed that a motor memory trained with a certain movement decays at the same speed during performing the trained movement as non-trained movements, a recent study reported that the motor memory decays faster during performing the trained movement than non-trained movements, i.e., the decay rate of motor memory is movement or context dependent. Although motor learning has been successfully modeled based on an optimization framework, e.g., movement error minimization, the type of optimization that can lead to context-dependent memory decay is unclear. Thus, context-dependent memory decay raises the question of what is optimized in motor learning. To reproduce context-dependent memory decay, I extend a motor primitive framework. Specifically, I introduce motor effort optimization into the framework because some previous studies have reported the existence of effort optimization in motor learning processes and no conventional motor primitive model has yet considered the optimization. Here, I analytically and numerically revealed that context-dependent decay is a result of motor effort optimization. My analyses suggest that context-dependent decay is not merely memory decay but is evidence of motor effort optimization in motor learning.

  20. Context-dependent memory decay is evidence of effort minimization in motor learning: A computational study

    Directory of Open Access Journals (Sweden)

    Ken eTakiyama

    2015-02-01

    Full Text Available Recent theoretical models suggest that motor learning includes at least two processes: error minimization and memory decay. While learning a novel movement, a motor memory of the movement is gradually formed to minimize the movement error between the desired and actual movements in each training trial, but the memory is slightly forgotten in each trial. The learning effects of error minimization trained with a certain movement are partially available in other non-trained movements, and this transfer of the learning effect can be reproduced by certain theoretical frameworks. Although most theoretical frameworks have assumed that a motor memory trained with a certain movement decays at the same speed during performing the trained movement as non-trained movements, a recent study reported that the motor memory decays faster during performing the trained movement than non-trained movements, i.e., the decay rate of motor memory is movement or context dependent. Although motor learning has been successfully modeled based on an optimization framework, e.g., movement error minimization, the type of optimization that can lead to context-dependent memory decay is unclear. Thus, context-dependent memory decay raises the question of what is optimized in motor learning. To reproduce context-dependent memory decay, I extend a motor primitive framework. Specifically, I introduce motor effort optimization into the framework because some previous studies have reported the existence of effort optimization in motor learning processes and no conventional motor primitive model has yet considered the optimization. Here, I analytically and numerically revealed that context-dependent decay is a result of motor effort optimization. My analyses suggest that context-dependent decay is not merely memory decay but is evidence of motor effort optimization in motor learning.
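
    The two processes named in the abstract correspond to the standard state-space model of trial-by-trial adaptation. The sketch below uses illustrative parameter values, not the paper's fitted model; context dependence would amount to using a smaller retention factor a for the trained movement than for untrained ones:

      # Two-process model of motor adaptation:
      #   memory decay:        x <- a * x          (retention)
      #   error minimization:  x <- x + b * error  (learning)
      a, b = 0.98, 0.2        # retention and learning rates (illustrative)
      target = 1.0            # perturbation to be compensated
      x, adaptation = 0.0, []
      for trial in range(60):
          error = target - x         # movement error on this trial
          x = a * x + b * error      # forget a little, learn from the error
          adaptation.append(x)
      print(f"adaptation after 60 trials: {adaptation[-1]:.3f}")
      # Steady state is b*target/(1 - a + b), about 0.909 for these values.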

  1. Correction of harmonic motion and Kepler orbit based on the minimal momentum uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Won Sang, E-mail: mimip4444@hanmail.net [Department of Physics and Research Institute of Natural Science, College of Natural Science, Gyeongsang National University, Jinju 660-701 (Korea, Republic of); Hassanabadi, Hassan, E-mail: h.hasanabadi@shahroodut.ac.ir [Physics Department, Shahrood University of Technology, Shahrood (Iran, Islamic Republic of)

    2017-03-18

    In this paper we consider the deformed Heisenberg uncertainty principle with a minimal uncertainty in momentum, which is called the minimal momentum uncertainty principle (MMUP). We consider the MMUP in D dimensions and its classical analogue, and use these to investigate the MMUP effect for harmonic motion and the Kepler orbit. - Highlights: • We discuss the minimal momentum uncertainty relation. • We consider the MMUP in D dimensions and use the deformed Poisson bracket to find the classical mechanics based on it. • Using these we investigate the MMUP effect for harmonic motion and the Kepler orbit. • In particular, we compute the corrected precession angle for each case. • We find that the corrected precession angle is always positive.
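
    The record does not reproduce the deformed algebra itself. For orientation only, a commonly used one-dimensional form in the extended-uncertainty-principle literature is the following (an assumption here; the paper's conventions may differ):

      [\hat{x}, \hat{p}] = i\hbar\left(1 + \alpha\hat{x}^{2}\right),
      \qquad
      \Delta x\,\Delta p \ \ge\ \frac{\hbar}{2}\left(1 + \alpha(\Delta x)^{2}\right)
      \ \Rightarrow\ (\Delta p)_{\min} = \hbar\sqrt{\alpha}.

    Minimizing the right-hand side over \Delta x (attained at \Delta x = 1/\sqrt{\alpha}) yields the nonzero floor on \Delta p that gives the principle its name.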

  2. New trends in computational collective intelligence

    CERN Document Server

    Kim, Sang-Wook; Trawiński, Bogdan

    2015-01-01

    This book consists of 20 chapters in which the authors deal with different theoretical and practical aspects of new trends in Collective Computational Intelligence techniques. Computational Collective Intelligence methods and algorithms are among the current trending research topics in areas related to Artificial Intelligence, Soft Computing, and Data Mining. Computational Collective Intelligence is a rapidly growing field that is most often understood as an AI sub-field dealing with soft-computing methods which enable group decisions to be made and knowledge to be processed among autonomous units acting in distributed environments. Web-based Systems, Social Networks, and Multi-Agent Systems very often need these tools for working out consistent knowledge states, resolving conflicts, and making decisions. The chapters included in this volume cover a selection of topics and new trends in several domains related to Collective Computational Intelligence: Language and Knowledge Processing, Data Mining Methods an...

  3. Minimally invasive lateral trans-psoas approach for tuberculosis of lumbar spine

    Directory of Open Access Journals (Sweden)

    Nitin Garg

    2014-01-01

    Full Text Available Anterior, posterolateral and posterior approaches are used for managing lumbar tuberculosis. Minimally invasive methods are being used increasingly for various disorders of the spine. This report presents the utility of lateral trans-psoas approach to the lumbar spine (LS using minimal access techniques, also known as direct lateral lumbar interbody fusion in 2 cases with tuberculosis of LS. Two patients with tuberculosis at L2-3 and L4-5 presented with back pain. Both had destruction and deformity of the vertebral body. The whole procedure comprising debridement and placement of iliac crest graft was performed using tubular retractors and was augmented by posterior fixation using percutaneous transpedicular screws. Both patients recovered well with no significant procedure related morbidity. Post-operative computed tomography scans showed appropriate position of the graft and instrumentation. At follow-up, both patients are ambulant with no progression of the deformity. Minimal access direct lateral transpsoas approach can be used for debridement and reconstruction of ventral column in tuberculous of Lumbar spine. This paper highlights the growing applications of minimal access surgery for spine.

  4. Estimating absolute configurational entropies of macromolecules: the minimally coupled subspace approach.

    Directory of Open Access Journals (Sweden)

    Ulf Hensen

    Full Text Available We develop a general minimally coupled subspace approach (MCSA to compute absolute entropies of macromolecules, such as proteins, from computer generated canonical ensembles. Our approach overcomes limitations of current estimates such as the quasi-harmonic approximation which neglects non-linear and higher-order correlations as well as multi-minima characteristics of protein energy landscapes. Here, Full Correlation Analysis, adaptive kernel density estimation, and mutual information expansions are combined and high accuracy is demonstrated for a number of test systems ranging from alkanes to a 14 residue peptide. We further computed the configurational entropy for the full 67-residue cofactor of the TATA box binding protein illustrating that MCSA yields improved results also for large macromolecular systems.
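
    Of the ingredients combined in MCSA, the kernel-density entropy estimate is the simplest to sketch. The toy below uses a plain (non-adaptive) Gaussian KDE with the resubstitution estimator S ≈ −⟨ln p(x)⟩, checked against the closed-form differential entropy of a unit Gaussian; it illustrates one building block, not the full MCSA machinery:

      import numpy as np
      from scipy.stats import gaussian_kde

      def kde_entropy(samples):
          """Resubstitution estimate of differential entropy (nats)."""
          density = gaussian_kde(samples)
          return -np.mean(np.log(density(samples)))

      rng = np.random.default_rng(0)
      x = rng.normal(0.0, 1.0, size=5000)
      print(kde_entropy(x))                  # estimated entropy
      print(0.5 * np.log(2 * np.pi * np.e))  # exact value: ~1.4189 nats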

  5. Standard model of knowledge representation

    Science.gov (United States)

    Yin, Wensheng

    2016-09-01

    Knowledge representation is the core of artificial intelligence research. Knowledge representation methods include predicate logic, semantic networks, computer programming languages, databases, mathematical models, graphics languages, natural language, etc. To establish the intrinsic link between the various knowledge representation methods, a unified knowledge representation model is necessary. According to ontology, system theory, and control theory, a standard model of knowledge representation that reflects the change of the objective world is proposed. The model is composed of input, processing, and output. This knowledge representation method does not contradict traditional knowledge representation methods. It can express knowledge in multivariate and multidimensional terms. It can also express process knowledge, and at the same time it has a strong problem-solving ability. In addition, the standard model of knowledge representation provides a way to handle problems of imprecise and inconsistent knowledge.

  6. Knowledge-based utility

    International Nuclear Information System (INIS)

    Chwalowski, M.

    1997-01-01

    This presentation provides industry examples of successful marketing practices by companies facing deregulation and competition. The common thread through the examples is that the long-term survival of today's utility structure depends on the strategic role of knowledge. As opposed to regulated monopolies, which usually own huge physical assets and have very little intelligence about their customers, unregulated enterprises tend to be knowledge-based, characterized by a higher market value than book value. A knowledge-based enterprise gathers data, creates information and develops knowledge by leveraging it as a competitive weapon. It institutionalizes human knowledge as a corporate asset for use over and over again through databases, computer networks, patents, billing, collection and customer services (BCCS), branded interfaces and management capabilities. Activities for becoming knowledge-based, such as replacing inventory/fixed assets with information about material usage to reduce expenditure and achieve more efficient operations, and focusing on integration and value-adding delivery capabilities, were reviewed

  7. Control by personal computer and Interface 1

    International Nuclear Information System (INIS)

    Kim, Eung Mug; Park, Sun Ho

    1989-03-01

    This book consists of three chapters. The first chapter deals with basic knowledge of microcomputer control: computer systems, microcomputer systems, control by microcomputer, and control systems for calculators. The second chapter covers basic interface knowledge, such as the 8255 parallel interface, the 6821 parallel interface, the parallel interface of a personal computer, reading BCD code through a parallel interface, the IEEE-488 interface, the RS-232C interface, and transmitting data between a personal computer and a measuring instrument. The third chapter includes control experiments with a microcomputer, experiments with an eight-bit computer, and control experiments in machine code and BASIC.

  8. Minimizing size of decision trees for multi-label decision tables

    KAUST Repository

    Azad, Mohammad

    2014-09-29

    We use decision trees as a model to discover knowledge from multi-label decision tables, where each row has a set of decisions attached to it and the goal is to find one arbitrary decision from the set attached to a row. The size of the decision tree can be small as well as very large. We study different greedy as well as dynamic programming algorithms to minimize the size of the decision trees. When comparing against the optimal results from the dynamic programming algorithm, we found that some greedy algorithms produce results close to optimal for the minimization of the number of nodes (at most 18.92% difference), the number of nonterminal nodes (at most 20.76% difference), and the number of terminal nodes (at most 18.71% difference).
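
    The greedy side of such algorithms is easy to sketch. The toy below is our own illustrative heuristic, not one of the specific algorithms compared in the paper: it makes a leaf as soon as all rows reaching a node share a common decision, and otherwise splits on the attribute whose largest branch is smallest.

    ```python
    # Illustrative greedy construction of a decision tree for a multi-label
    # decision table (a sketch, not the paper's specific algorithms).
    from collections import defaultdict

    def build_tree(rows, attrs):
        # rows: list of (attribute_values_tuple, set_of_decisions)
        common = set.intersection(*(d for _, d in rows))
        if common or not attrs:
            # leaf: any decision shared by all rows reaching this node
            return {"leaf": min(common) if common else None, "size": 1}
        def largest_branch(a):
            groups = defaultdict(int)
            for v, _ in rows:
                groups[v[a]] += 1
            return max(groups.values())
        a = min(attrs, key=largest_branch)       # size-minimizing heuristic
        children = defaultdict(list)
        for v, d in rows:
            children[v[a]].append((v, d))
        subtrees = {val: build_tree(rws, [b for b in attrs if b != a])
                    for val, rws in children.items()}
        return {"attr": a, "children": subtrees,
                "size": 1 + sum(t["size"] for t in subtrees.values())}

    table = [((0, 0), {1, 2}), ((0, 1), {1}), ((1, 0), {2, 3}), ((1, 1), {3})]
    tree = build_tree(table, attrs=[0, 1])
    print("tree size (number of nodes):", tree["size"])
    ```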

  10. [Factors associated with condom use and knowledge about STD/AIDS among teenagers in public and private schools in São Paulo, Brazil].

    Science.gov (United States)

    Martins, Laura B Motta; da Costa-Paiva, Lúcia Helena S; Osis, Maria José D; de Sousa, Maria Helena; Pinto-Neto, Aarão M; Tadini, Valdir

    2006-02-01

    This study aimed to compare knowledge about STD/AIDS and identify the factors associated with adequate knowledge and consistent use of male condoms among teenagers from public and private schools in the city of São Paulo, Brazil. We selected 1,594 adolescents ranging from 12 to 19 years of age in 13 public schools and 5 private schools to complete a questionnaire on knowledge of STD/AIDS and use of male condoms. Prevalence ratios were computed with 95% confidence intervals. The STD knowledge score used a cutoff point corresponding to 50% of correct answers. Statistical tests were the chi-square test and Poisson multiple regression. Consistent use of male condoms was 60% in private and 57.1% in public schools (p > 0.05) and was associated with male gender and lower socioeconomic status. Female gender, higher schooling, enrollment in private school, Caucasian race, and being single were associated with higher knowledge of STDs. Teenagers from public and private schools have adequate knowledge of STD prevention; however, this knowledge does not translate into the adoption of effective prevention. Educational programs and STD/AIDS awareness-raising should be expanded in order to minimize vulnerability.

  11. Option Pricing under Risk-Minimization Criterion in an Incomplete Market with the Finite Difference Method

    Directory of Open Access Journals (Sweden)

    Xinfeng Ruan

    2013-01-01

    Full Text Available We study option pricing under a risk-minimization criterion in an incomplete market where the dynamics of the risky underlying asset are governed by a jump diffusion equation with stochastic volatility. We obtain the Radon-Nikodym derivative for the minimal martingale measure and a partial integro-differential equation (PIDE) for the European option. The finite difference method is employed to compute the European option valuation from the PIDE.
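
    The full PIDE with jumps and stochastic volatility is beyond a short example, but the finite-difference machinery itself can be sketched on the plain Black-Scholes special case (no jumps, constant volatility). All grid sizes and parameters below are illustrative assumptions.

    ```python
    # Minimal sketch: explicit finite differences for a European put under
    # plain Black-Scholes (a much simpler special case than the paper's PIDE),
    # marching backward from maturity to time zero.
    import numpy as np

    K, r, sigma, T = 100.0, 0.05, 0.2, 1.0
    S_max, N = 300.0, 300                       # grid upper bound, spatial steps
    S = np.linspace(0.0, S_max, N + 1)
    dS = S[1] - S[0]
    dt = 0.9 * dS**2 / (sigma**2 * S_max**2)    # conservative explicit-stability step
    M = int(np.ceil(T / dt))
    dt = T / M

    V = np.maximum(K - S, 0.0)                  # payoff at maturity
    for n in range(M):
        d2 = (V[2:] - 2 * V[1:-1] + V[:-2]) / dS**2      # V_SS
        d1 = (V[2:] - V[:-2]) / (2 * dS)                 # V_S
        V[1:-1] += dt * (0.5 * sigma**2 * S[1:-1]**2 * d2
                         + r * S[1:-1] * d1 - r * V[1:-1])
        tau = (n + 1) * dt                      # time to maturity after this step
        V[0] = K * np.exp(-r * tau)             # S = 0 boundary for a put
        V[-1] = 0.0                             # deep out-of-the-money boundary

    print(f"put value at S = K: {np.interp(K, S, V):.3f}")
    ```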

  12. The Tarsoft software: A computer program to learn the elemental theory for the urban wastewater treatment; Tarsoft: programa de ordenador para el aprendizaje de la teoria basica sobre la depuracion de las aguas residuales urbanas

    Energy Technology Data Exchange (ETDEWEB)

    Parra Narvaez, R.; Baldasano Recio, J.M. [Universitat Politecnica de Catalunya. Barcelona (Spain)

    1998-12-31

    A first version of the computer program Tarsoft has been developed, with the objective of being a computing contribution to education and training in urban wastewater treatment. This article describes, in general form, the main characteristics of this computer program, the minimal hardware and software requirements, the theoretical topics it includes, the form of presentation of the various alphanumerical and graphical interactive information, and the design modules for the different treatment processes. (Author) 20 refs.

  13. Radwaste minimization successes at Duke Power Company

    International Nuclear Information System (INIS)

    Lan, C.D.; Johnson, G.T.; Groves, D.C.; Smith, T.A.

    1996-01-01

    At Duke Power Company, "Culture Change" is a common term that we have used to describe an incredible transformation: we are becoming a cost-conscious, customer-driven, highly competitive business. Nowhere has this change been more evident than in the way we process and dispose of our solid radioactive waste. With top-down management support, we have used team-based, formalized problem-solving methods and have implemented many successful waste minimization programs. Through these programs, we have dramatically increased employees' awareness of the importance of waste minimization. As a result, we have been able to reduce both our burial volumes and our waste processing and disposal costs. In June 1994, we invited EPRI to conduct assessments of our waste minimization programs at the Oconee and Catawba nuclear stations. Included in the assessments were in-depth looks at contamination control, an inventory of items in the plant, the volume of waste generated in the plant and how it was processed, laundry reject data, site waste-handling operations, and plant "housekeeping" routines and processes. One of the most important aspects of the assessment is the "dumpster dive," an evaluation of site dry active waste composition carried out by sorting through approximately fifteen bags of radioactive waste. Finally, there was an evaluation of consumables used at each site in order to gain knowledge of items that could be standardized at all stations. With EPRI's recommendations, we made several changes and standardized the items used. We have made significant progress in waste reduction. We realize, however, that we are aiming at a moving target and we still have room for improvement. As the price of processing and disposal (or storage) increases, we will continue to evaluate our waste minimization programs

  14. Minimizing Leg Length Discrepancy After Intramedullary Nailing of Comminuted Femoral Shaft Fractures: A Quality Improvement Initiative Using the Scout Computed Tomography Scanogram.

    Science.gov (United States)

    Gheraibeh, Petra; Vaidya, Rahul; Hudson, Ian; Meehan, Robert; Tonnos, Frederick; Sethi, Anil

    2018-05-01

    To prevent leg length discrepancy (LLD) after locked femoral nailing in patients with comminuted femoral shaft fractures. Design: prospective consecutive case series aimed at quality improvement. Setting: Level 1 Trauma Center. Patients: Ninety-eight consecutive patients with a comminuted femoral shaft fracture underwent statically locked intramedullary nailing, with a focused attempt at minimizing LLD during surgery. A computed tomography scanogram of both legs was performed on postoperative day 1 to assess for residual LLD. Patients were offered the option to have an LLD >1.5 cm corrected before discharge. Main outcome measure: LLD >1.5 cm. Twenty-one patients (21.4%) were found to have an LLD >1.5 cm. An LLD >1.5 cm occurred in 10/55 (18%) antegrade nail patients and 11/43 (26%) retrograde nail patients (P = 0.27). No difference was noted based on the mechanism of injury, surgeon training, or OTA/AO type B versus C injury. Ninety of 98 patients left with an LLD <1.5 cm, and no patient left with an LLD >1.5 cm after locked intramedullary nailing for a comminuted femoral shaft fracture without being informed and offered the option of early correction. We recommend using a full-length computed tomography scanogram after IM nailing of comminuted femur fractures to prevent iatrogenic LLD. Therapeutic Level IV. See Instructions for Authors for a complete description of levels of evidence.

  15. Iterative CT reconstruction via minimizing adaptively reweighted total variation.

    Science.gov (United States)

    Zhu, Lei; Niu, Tianye; Petrongolo, Michael

    2014-01-01

    Iterative reconstruction via total variation (TV) minimization has demonstrated great success in accurate CT imaging from under-sampled projections. When projections are further reduced, over-smoothing artifacts appear in the reconstruction, especially around structure boundaries. We propose a practical algorithm to improve TV-minimization based CT reconstruction from very few projection data. Based on the theory of compressed sensing, the L0-norm approach is more desirable for further reducing the projection views. To overcome the computational difficulty of the non-convex optimization of the L0 norm, we implement an adaptive weighting scheme that approximates the solution via a series of TV minimizations for practical use in CT reconstruction. The weight on TV is initialized as uniform, and is automatically updated based on the gradient of the reconstructed image from the previous iteration. The iteration stops when a small difference between the weighted TV values is observed on two consecutive reconstructed images. We evaluate the proposed algorithm on both a digital phantom and a physical phantom. Using 20 equiangular projections, our method reduces the reconstruction errors of conventional TV minimization by a factor of more than 5, with improved spatial resolution. By adaptively reweighting TV in iterative CT reconstruction, we successfully further reduce the projection number needed for the same or better image quality.
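
    The reweighting loop can be sketched on plain image denoising, standing in for the CT data term (the actual method works on projection data, and stops on a small weighted-TV change rather than a fixed pass count); all parameters below are illustrative.

    ```python
    # Sketch of adaptively reweighted TV on image denoising: each outer pass
    # re-weights TV by the inverse gradient magnitude of the previous result,
    # approximating an L0-like penalty.
    import numpy as np

    def grad_mag(u):
        gx = np.diff(u, axis=1, append=u[:, -1:])
        gy = np.diff(u, axis=0, append=u[-1:, :])
        return np.sqrt(gx**2 + gy**2)

    def reweighted_tv_denoise(y, lam=0.1, outer=4, inner=400, step=0.02, eps=1e-2):
        x = y.copy()
        w = np.ones_like(y)                          # weights start uniform
        for _ in range(outer):
            for _ in range(inner):                   # gradient descent on the
                gx = np.diff(x, axis=1, append=x[:, -1:])  # smoothed weighted-TV
                gy = np.diff(x, axis=0, append=x[-1:, :])  # objective
                mag = np.sqrt(gx**2 + gy**2 + eps)
                px, py = w * gx / mag, w * gy / mag
                # divergence (periodic boundary via roll, for brevity)
                div = (px - np.roll(px, 1, axis=1)) + (py - np.roll(py, 1, axis=0))
                x -= step * ((x - y) - lam * div)
            # re-weight: small gradients get heavy TV penalties (L0-like)
            w = np.minimum(1.0 / (grad_mag(x) + eps), 10.0)
        return x

    rng = np.random.default_rng(1)
    clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0
    noisy = clean + 0.2 * rng.standard_normal(clean.shape)
    denoised = reweighted_tv_denoise(noisy)
    print("mean abs error before:", round(float(np.abs(noisy - clean).mean()), 4),
          "after:", round(float(np.abs(denoised - clean).mean()), 4))
    ```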

  16. OxMaR: open source free software for online minimization and randomization for clinical trials.

    Science.gov (United States)

    O'Callaghan, Christopher A

    2014-01-01

    Minimization is a valuable method for allocating participants between the control and experimental arms of clinical studies. The use of minimization reduces differences that might arise by chance between the study arms in the distribution of patient characteristics such as gender, ethnicity and age. However, unlike randomization, minimization requires real time assessment of each new participant with respect to the preceding distribution of relevant participant characteristics within the different arms of the study. For multi-site studies, this necessitates centralized computational analysis that is shared between all study locations. Unfortunately, there is no suitable freely available open source or free software that can be used for this purpose. OxMaR was developed to enable researchers in any location to use minimization for patient allocation and to access the minimization algorithm using any device that can connect to the internet such as a desktop computer, tablet or mobile phone. The software is complete in itself and requires no special packages or libraries to be installed. It is simple to set up and run over the internet using online facilities which are very low cost or even free to the user. Importantly, it provides real time information on allocation to the study lead or administrator and generates real time distributed backups with each allocation. OxMaR can readily be modified and customised and can also be used for standard randomization. It has been extensively tested and has been used successfully in a low budget multi-centre study. Hitherto, the logistical difficulties involved in minimization have precluded its use in many small studies and this software should allow more widespread use of minimization which should lead to studies with better matched control and experimental arms. OxMaR should be particularly valuable in low resource settings.
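
    OxMaR's exact algorithm is not reproduced here, but the core idea of minimization can be sketched generically in the Pocock-Simon style (factor names and levels below are illustrative assumptions): each new participant is assigned to the arm that currently minimizes the imbalance of their stratification factor levels, with ties broken at random.

    ```python
    # Generic minimization sketch (not OxMaR's exact algorithm): assign each
    # participant to the arm whose existing members share the fewest of this
    # participant's factor levels, so marginal totals stay balanced.
    import random

    ARMS = ("control", "experimental")
    FACTORS = ("sex", "age_group")                  # illustrative factor names
    counts = {arm: {f: {} for f in FACTORS} for arm in ARMS}

    def allocate(participant, rng=random):
        def imbalance(arm):
            # total count, over all factors, of existing participants in
            # `arm` sharing this participant's factor levels
            return sum(counts[arm][f].get(participant[f], 0) for f in FACTORS)
        scores = {arm: imbalance(arm) for arm in ARMS}
        best = min(scores.values())
        arm = rng.choice([a for a, s in scores.items() if s == best])
        for f in FACTORS:                           # update the running tallies
            counts[arm][f][participant[f]] = counts[arm][f].get(participant[f], 0) + 1
        return arm

    for p in [{"sex": "F", "age_group": "<40"}, {"sex": "F", "age_group": "40+"},
              {"sex": "M", "age_group": "<40"}, {"sex": "F", "age_group": "<40"}]:
        print(p, "->", allocate(p))
    ```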

  18. Computed tomography for preoperative planning in minimal-invasive total hip arthroplasty: Radiation exposure and cost analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huppertz, Alexander, E-mail: Alexander.Huppertz@charite.de [Imaging Science Institute Charite Berlin, Robert-Koch-Platz 7, D-10115 Berlin (Germany); Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Radmer, Sebastian, E-mail: s.radmer@immanuel.de [Department of Orthopedic Surgery and Rheumatology, Immanuel-Krankenhaus, Koenigstr. 63, D-14109, Berlin (Germany); Asbach, Patrick, E-mail: Patrick.Asbach@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Juran, Ralf, E-mail: ralf.juran@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Schwenke, Carsten, E-mail: carsten.schwenke@scossis.de [Biostatistician, Scossis Statistical Consulting, Zeltinger Str. 58G, D-13465 Berlin (Germany); Diederichs, Gerd, E-mail: gerd.diederichs@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Hamm, Bernd, E-mail: Bernd.Hamm@charite.de [Department of Radiology, Medical Physics, Charite-University Hospitals of Berlin, Chariteplatz 1, D-10117 Berlin (Germany); Sparmann, Martin, E-mail: m.sparmann@immanuel.de [Department of Orthopedic Surgery and Rheumatology, Immanuel-Krankenhaus, Koenigstr. 63, D-14109, Berlin (Germany)

    2011-06-15

    Computed tomography (CT) was used for preoperative planning of minimal-invasive total hip arthroplasty (THA). 92 patients (50 males, 42 females, mean age 59.5 years) with a mean body-mass-index (BMI) of 26.5 kg/m{sup 2} underwent 64-slice CT to depict the pelvis, the knee and the ankle in three independent acquisitions using combined x-, y-, and z-axis tube current modulation. Arthroplasty planning was performed using 3D-Hip Plan (Symbios, Switzerland) and the patient radiation dose exposure was determined. The effects of BMI, gender, and contralateral THA on the effective dose were evaluated by an analysis of variance. A process cost analysis was performed from the hospital perspective. All CT examinations were of sufficient image quality for 3D-THA planning. A mean effective dose of 4.0 mSv (SD 0.9 mSv) was calculated, significantly modeled by the BMI (p < 0.0001). The presence of a contralateral THA (9/92 patients; p = 0.15) and the difference between males and females were not significant (p = 0.08). Personnel involved were the radiologist (4 min), the surgeon (16 min), the radiographer (12 min), and administrative personnel (4 min). A CT operation time of 11 min and direct per-patient costs of 52.80 Euro were recorded. Preoperative CT for THA was associated with a slight and justifiable increase of radiation exposure in comparison to conventional radiographs and low per-patient costs.

  19. Minimal-Learning-Parameter Technique Based Adaptive Neural Sliding Mode Control of MEMS Gyroscope

    Directory of Open Access Journals (Sweden)

    Bin Xu

    2017-01-01

    Full Text Available This paper investigates an adaptive neural sliding mode controller for MEMS gyroscopes based on the minimal-learning-parameter technique. Considering the system uncertainty in the dynamics, a neural network is employed for approximation. The minimal-learning-parameter technique is constructed to decrease the number of update parameters, and in this way the computational burden is greatly reduced. Sliding mode control is designed to cancel the effect of the time-varying disturbance. The closed-loop stability analysis is established via a Lyapunov approach. Simulation results are presented to demonstrate the effectiveness of the method.

  20. Enumeration of minimal stoichiometric precursor sets in metabolic networks.

    Science.gov (United States)

    Andrade, Ricardo; Wannagat, Martin; Klein, Cecilia C; Acuña, Vicente; Marchetti-Spaccamela, Alberto; Milreu, Paulo V; Stougie, Leen; Sagot, Marie-France

    2016-01-01

    What an organism needs at least from its environment to produce a set of metabolites, e.g. target(s) of interest and/or biomass, has been called a minimal precursor set. Early approaches to enumerate all minimal precursor sets took into account only the topology of the metabolic network (topological precursor sets). Due to cycles and the stoichiometric values of the reactions, it is often not possible to produce the target(s) from a topological precursor set in the sense that there is no feasible flux. Although considering the stoichiometry makes the problem harder, it enables one to obtain biologically reasonable precursor sets, which we call stoichiometric. Recently a method to enumerate all minimal stoichiometric precursor sets was proposed in the literature. The relationship between topological and stoichiometric precursor sets had, however, not yet been studied; we highlight that relationship here. We also present two algorithms that enumerate all minimal stoichiometric precursor sets. The first one is of theoretical interest only and is based on the above-mentioned relationship. The second approach solves a series of mixed integer linear programming problems. We compared the computed minimal precursor sets to experimentally obtained growth media of several Escherichia coli strains using genome-scale metabolic networks. The results show that the second approach efficiently enumerates minimal precursor sets taking stoichiometry into account, and allows for broad in silico studies of strain or species interactions that may help to understand e.g. pathotype- and niche-specific metabolic capabilities. sasita is written in Java, uses cplex as its LP solver and can be downloaded together with all networks and input files used in this paper at http://www.sasita.gforge.inria.fr.
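
    A naive way to see what "minimal stoichiometric precursor set" means (illustration only; sasita's MILP formulation scales far beyond this brute force) is to test each candidate precursor subset for flux feasibility with a linear program and keep the feasible subsets that have no feasible proper subset. The toy network below is our own.

    ```python
    # Brute-force enumeration of minimal stoichiometric precursor sets on a
    # toy network. Metabolites: A, B, C, T; reactions: three uptakes,
    # A+B->T, C->T, and export of the target T.
    from itertools import combinations
    import numpy as np
    from scipy.optimize import linprog

    mets = ["A", "B", "C", "T"]
    rxns = ["uA", "uB", "uC", "A+B->T", "C->T", "exp_T"]
    S = np.array([[1, 0, 0, -1,  0,  0],     # A
                  [0, 1, 0, -1,  0,  0],     # B
                  [0, 0, 1,  0, -1,  0],     # C
                  [0, 0, 0,  1,  1, -1]])    # T

    def feasible(precursors):
        bounds = []
        for j in range(len(rxns)):
            if j < 3 and mets[j] not in precursors:
                bounds.append((0, 0))        # uptake closed: not available
            elif rxns[j] == "exp_T":
                bounds.append((1, None))     # demand one unit of the target
            else:
                bounds.append((0, None))
        res = linprog(np.zeros(len(rxns)), A_eq=S, b_eq=np.zeros(len(mets)),
                      bounds=bounds, method="highs")   # steady state S v = 0
        return res.status == 0

    minimal = []
    for size in range(0, 4):                 # smallest sets first, so any
        for cand in combinations("ABC", size):   # superset is non-minimal
            if feasible(cand) and not any(set(m) <= set(cand) for m in minimal):
                minimal.append(cand)
    print("minimal stoichiometric precursor sets:", minimal)
    ```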

  1. Accounting Knowledge Representation in PROLOG Language

    Directory of Open Access Journals (Sweden)

    Bogdan Patrut

    2010-03-01

    Full Text Available This paper presents some original techniques for implementing accounting knowledge in the PROLOG language. We represent rules for the operation of accounts, the texts of accounting operations, and how to compute depreciation. Keywords: accounting, knowledge representation, PROLOG, depreciation, natural language processing

  2. Discontinuity minimization for omnidirectional video projections

    Science.gov (United States)

    Alshina, Elena; Zakharchenko, Vladyslav

    2017-09-01

    Advances in display technologies, both for head mounted devices and television panels, demand a resolution increase beyond 4K for the source signal in virtual reality video streaming applications. This poses a problem of content delivery through bandwidth-limited distribution networks. Considering the fact that the source signal covers the entire surrounding space, investigation revealed that compression efficiency may fluctuate by 40% on average depending on origin selection at the conversion stage from 3D space to a 2D projection. Based on this knowledge, an origin selection algorithm for video compression applications has been proposed. Using a discontinuity entropy minimization function, the projection origin rotation may be defined to provide optimal compression results. The outcome of this research may be applied across various video compression solutions for omnidirectional content.

  3. Physiology in Medicine: Understanding dynamic alveolar physiology to minimize ventilator-induced lung injury.

    Science.gov (United States)

    Nieman, Gary F; Satalin, Josh; Kollisch-Singule, Michaela; Andrews, Penny; Aiash, Hani; Habashi, Nader M; Gatto, Louis A

    2017-06-01

    Acute respiratory distress syndrome (ARDS) remains a serious clinical problem with the main treatment being supportive in the form of mechanical ventilation. However, mechanical ventilation can be a double-edged sword: if set improperly, it can exacerbate the tissue damage caused by ARDS; this is known as ventilator-induced lung injury (VILI). To minimize VILI, we must understand the pathophysiologic mechanisms of tissue damage at the alveolar level. In this Physiology in Medicine paper, the dynamic physiology of alveolar inflation and deflation during mechanical ventilation will be reviewed. In addition, the pathophysiologic mechanisms of VILI will be reviewed, and this knowledge will be used to suggest an optimal mechanical breath profile (MBP: all airway pressures, volumes, flows, rates, and the durations for which they are applied during both inspiration and expiration) necessary to minimize VILI. Our review suggests that the current protective ventilation strategy, known as the "open lung strategy," would be the optimal lung-protective approach. However, the viscoelastic behavior of dynamic alveolar inflation and deflation has not yet been incorporated into protective mechanical ventilation strategies. Using our knowledge of dynamic alveolar mechanics (i.e., the dynamic change in alveolar and alveolar duct size and shape during tidal ventilation) to modify the MBP so as to minimize VILI will reduce the morbidity and mortality associated with ARDS. Copyright © 2017 the American Physiological Society.

  4. Computer vision syndrome prevalence, knowledge and associated factors among Saudi Arabia University Students: Is it a serious problem?

    Science.gov (United States)

    Al Rashidi, Sultan H; Alhumaidan, H

    2017-01-01

    Computers and other visual display devices are now an essential part of our daily life. With their increased use, a very large population globally is experiencing sundry ocular symptoms, such as dry eyes, eye strain, irritation, and redness of the eyes, to name a few. Collectively, such computer-related symptoms are usually referred to as computer vision syndrome (CVS). The current study aims to define the prevalence of CVS, knowledge of it in the community, its pathophysiology, associated factors, and its prevention. This is a cross-sectional study conducted at the Qassim University College of Medicine during a period of 1 year, from January 2015 to January 2016, using a questionnaire to collect relevant data including demographics and the various variables studied. 634 students were inducted from a public sector university of Qassim, Saudi Arabia, regardless of age and gender. The data were then statistically analyzed with SPSS version 22, and the descriptive data were expressed as percentages, mode, and median, using graphs where needed. A total of 634 students with a mean age of 21.40 (SD 1.997, range 7, 18-25) were included as study subjects, with a male predominance (77.28%). Of these, the majority (459, 72%) presented with acute symptoms while the rest had chronic problems. A clear majority had carried the symptoms for 1 month. The statistical analysis revealed serious symptoms in the majority of study subjects, especially permanent users of computers for long hours. Continuous use of computers for long hours was found to be associated with severe vision problems, especially in those using computers and similar devices for long durations.

  5. Cloud Computing Organizational Benefits : A Managerial concern

    OpenAIRE

    Mandala, Venkata Bhaskar Reddy; Chandra, Marepalli Sharat

    2012-01-01

    Context: The software industry is looking for new methods and opportunities to reduce project management problems and operational costs. The Cloud Computing concept provides answers to these problems. Cloud Computing is made possible by the availability of high internet bandwidth. Cloud Computing provides a wide range of services to a varied customer base. Cloud Computing has some key elements such as on-demand services, a large pool of configurable computing resources and minimal man...

  6. Minimally invasive orthognathic surgery.

    Science.gov (United States)

    Resnick, Cory M; Kaban, Leonard B; Troulis, Maria J

    2009-02-01

    Minimally invasive surgery is defined as the discipline in which operative procedures are performed in novel ways to diminish the sequelae of standard surgical dissections. The goals of minimally invasive surgery are to reduce tissue trauma and to minimize bleeding, edema, and injury, thereby improving the rate and quality of healing. In orthognathic surgery, there are two minimally invasive techniques that can be used separately or in combination: (1) endoscopic exposure and (2) distraction osteogenesis. This article describes the historical developments of the fields of orthognathic surgery and minimally invasive surgery, as well as the integration of the two disciplines. Indications, techniques, and the most current outcome data for specific minimally invasive orthognathic surgical procedures are presented.

  7. Regularity of Minimal Surfaces

    CERN Document Server

    Dierkes, Ulrich; Tromba, Anthony J; Kuster, Albrecht

    2010-01-01

    "Regularity of Minimal Surfaces" begins with a survey of minimal surfaces with free boundaries. Following this, the basic results concerning the boundary behaviour of minimal surfaces and H-surfaces with fixed or free boundaries are studied. In particular, the asymptotic expansions at interior and boundary branch points are derived, leading to general Gauss-Bonnet formulas. Furthermore, gradient estimates and asymptotic expansions for minimal surfaces with only piecewise smooth boundaries are obtained. One of the main features of free boundary value problems for minimal surfaces is t

  8. Minimizing Overhead for Secure Computation and Fully Homomorphic Encryption: Overhead

    Science.gov (United States)

    2015-11-01

    application for this technology is mobile devices: the preparation work can be performed while the phone is plugged into a power source, then it can later...handle large realistic security parameters. Therefore, we looked into the possibility of augmenting the SAGE system with a backend that could handle...limited mobile devices and yet have ready access to cloud-based computing resources. The techniques we propose form part of a growing line of work aimed

  9. Super-acceleration from massless, minimally coupled φ⁴

    CERN Document Server

    Onemli, V K

    2002-01-01

    We derive a simple form for the propagator of a massless, minimally coupled scalar in a locally de Sitter geometry of arbitrary spacetime dimension. We then employ it to compute the fully renormalized stress tensor at one- and two-loop orders for a massless, minimally coupled φ⁴ theory which is released in Bunch-Davies vacuum at t=0 in co-moving coordinates. In this system, the uncertainty principle elevates the scalar above the minimum of its potential, resulting in a phase of super-acceleration. With the non-derivative self-interaction the scalar's breaking of de Sitter invariance becomes observable. It is also worth noting that the weak-energy condition is violated on cosmological scales. An interesting subsidiary result is that cancelling overlapping divergences in the stress tensor requires a conformal counterterm which has no effect on purely scalar diagrams.

  10. Nuclear Knowledge Loss Risk Management (Lessons Learned, Implementation Experiences)

    International Nuclear Information System (INIS)

    Květoňová, Romana

    2014-01-01

    In the years 2007/2008, Knowledge Management emerged as one of the prime concerns in our HRM system. Based on KM best practice data gathering, surveys and analyses, a detailed concept was proposed and implemented, primarily in our nuclear production units. Main objectives: • To identify, maintain and develop unique knowledge; • To share critical knowledge and best practices; • To protect the organization from the loss of critical capabilities and minimize duplicated effort; • To set up a succession planning system for knowledge holders with potential knowledge loss; • To create an effective system for sharing and updating knowledge records; • Further implementation of KM within the production division as well as extension into other divisions

  11. Simulated lumbar minimally invasive surgery educational model with didactic and technical components.

    Science.gov (United States)

    Chitale, Rohan; Ghobrial, George M; Lobel, Darlene; Harrop, James

    2013-10-01

    The learning and development of technical skills are paramount for neurosurgical trainees. External influences and a need for maximizing efficiency and proficiency have encouraged advancements in simulator-based learning models. To confirm the importance of establishing an educational curriculum for teaching minimally invasive techniques of pedicle screw placement using a computer-enhanced physical model of percutaneous pedicle screw placement with simultaneous didactic and technical components. A 2-hour educational curriculum was created to educate neurosurgical residents on anatomy, pathophysiology, and technical aspects associated with image-guided pedicle screw placement. Predidactic and postdidactic practical and written scores were analyzed and compared. Scores were calculated for each participant on the basis of the optimal pedicle screw starting point and trajectory for both fluoroscopy and computed tomographic navigation. Eight trainees participated in this module. Average mean scores on the written didactic test improved from 78% to 100%. The technical component scores for fluoroscopic guidance improved from 58.8 to 52.9. Technical score for computed tomography-navigated guidance also improved from 28.3 to 26.6. Didactic and technical quantitative scores with a simulator-based educational curriculum improved objectively measured resident performance. A minimally invasive spine simulation model and curriculum may serve a valuable function in the education of neurosurgical residents and outcomes for patients.

  12. Assessment of a computer-based Taenia solium health education tool 'The Vicious Worm' on knowledge uptake among professionals and their attitudes towards the program.

    Science.gov (United States)

    Ertel, Rebekka Lund; Braae, Uffe Christian; Ngowi, Helena Aminiel; Johansen, Maria Vang

    2017-01-01

    Health education has been recognised as a specific intervention tool for the control of Taenia solium taeniosis/cysticercosis, but evaluation of the efficacy of the tool remains. The aim of our study was to assess the effect of a computer-based T. solium health education tool, 'The Vicious Worm', on knowledge uptake among professionals, and to investigate attitudes towards the program. The study was carried out between March and May 2014 in Mbeya Region, Tanzania, where T. solium is endemic. The study was a pre and post assessment of a health education tool based on questionnaire surveys and focus group discussions to investigate knowledge and attitudes. A total of 79 study subjects participated in the study, including study subjects from both the health and agriculture sectors. The health education consisted of 1½ h of individual practice with the computer program. The baseline questionnaire showed an overall knowledge of aspects of acquisition and transmission of T. solium infections (78%), porcine cysticercosis treatment (77%), the human tapeworm in general (72%), neurocysticercosis in general (49%), and porcine cysticercosis diagnosis (48%). However, there was a lack of knowledge on acquisition of neurocysticercosis (15%), prevention of T. solium taeniosis/cysticercosis (28%), and the relation between porcine cysticercosis, human cysticercosis, and taeniosis (32%). Overall, the study subjects' knowledge was significantly improved both immediately after (p=0.001) and two weeks after the health education, and knowledge regarding specific aspects was significantly improved in most aspects immediately after and two weeks after the health education. The focus group discussions showed positive attitudes towards the program, and the study subjects found 'The Vicious Worm' efficient, simple, and appealing. The study revealed a good effect of 'The Vicious Worm', suggesting that it could be a useful health education tool, which should be further assessed and thereafter integrated in T. solium

  13. Parent-administered computer-assisted tutoring targeting letter-sound knowledge: Evaluation via multiple-baseline across three preschool students.

    Science.gov (United States)

    DuBois, Matthew R; Volpe, Robert J; Burns, Matthew K; Hoffman, Jessica A

    2016-12-01

    Knowledge of letter sounds has been identified as a primary objective of preschool instruction and intervention. Despite this designation, large disparities exist in the number of letter sounds children know at school entry. Enhancing caregivers' ability to teach their preschool-aged children letter sounds may represent an effective practice for reducing this variability and ensuring that more children are prepared to experience early school success. This study used a non-concurrent multiple-baseline-across-participants design to evaluate the effectiveness of caregivers (N=3) delivering a computer-assisted tutoring program (Tutoring Buddy) targeting letter sound knowledge to their preschool-aged children. Visual analyses and effect size estimates derived from Percentage of All Non-Overlapping Data (PAND) statistics indicated consistent results for letter sound acquisition, as 6 weeks of intervention yielded large effects for letter sound knowledge (LSK) across all three children. Large effect sizes were also found for letter sound fluency (LSF) and nonsense word fluency (NWF) for two children. All three caregivers rated the intervention as highly usable and were able to administer it with high levels of fidelity. Taken together, the results of the present study found Tutoring Buddy to be an effective, simple, and usable way for the caregivers to support their children's literacy development. Copyright © 2016 Society for the Study of School Psychology. Published by Elsevier Ltd. All rights reserved.

  14. Product Knowledge Modelling and Management

    DEFF Research Database (Denmark)

    Zhang, Y.; MacCallum, K. J.; Duffy, Alex

    1996-01-01

    The term Product Knowledge is used to refer to two related but distinct concepts: the knowledge of a specific product (Specific Product Knowledge) and the knowledge of a product domain (Product Domain Knowledge). Modelling and managing Product Knowledge is an essential part of carrying out design. A scheme is presented in this paper to model, i.e. classify, structure and formalise, product knowledge for the purpose of supporting function-oriented design. The product design specification and four types of required attributes of a specific product have been identified to form the Specific Product Knowledge. Both Specific Product Knowledge and Product Domain Knowledge are modelled at two levels, a meta-model and an information level. Following that, a computer-based scheme to manage the proposed product knowledge models within a dynamically changing environment is presented.

  15. Minimal Poems Written in 1979

    Directory of Open Access Journals (Sweden)

    Sandra Sirangelo Maggio

    2008-04-01

    Full Text Available The reading of M. van der Slice's Minimal Poems Written in 1979 (the work, actually, has no title) reminded me of a book I have seen a long time ago, called Truth, which had not even a single word printed inside. In either case we have a sample of how often eccentricities can prove efficient means of artistic creativity, in this new literary trend known as Minimalism.

  16. Description of the TREBIL, CRESSEX and STREUSL computer programs, that belongs to RALLY computer code pack for the analysis of reliability systems

    International Nuclear Information System (INIS)

    Fernandes Filho, T.L.

    1982-11-01

    The RALLY computer code pack (RALLY pack) is a set of computer codes intended for the reliability analysis of complex systems, aiming at risk analysis. Three of the six codes are described, presenting their purpose, input description, calculation methods, and the results obtained with each of them. The computer codes are: TREBIL, to obtain the logical equivalent of the fault tree; CRESSEX, to obtain the minimal cuts and the point values of the unreliability and unavailability of the system; and STREUSL, for the calculation of the dispersion of those values around the mean. Although CRESSEX, in its version available at CNEN, uses a somewhat lengthy method to obtain the minimal cuts in an HB-CNEN system, the three computer programs show good results, especially STREUSL, which permits the simulation of various components. (E.G.) [pt

  17. Transformation of general binary MRF minimization to the first-order case.

    Science.gov (United States)

    Ishikawa, Hiroshi

    2011-06-01

    We introduce a transformation of general higher-order Markov random field with binary labels into a first-order one that has the same minima as the original. Moreover, we formalize a framework for approximately minimizing higher-order multi-label MRF energies that combines the new reduction with the fusion-move and QPBO algorithms. While many computer vision problems today are formulated as energy minimization problems, they have mostly been limited to using first-order energies, which consist of unary and pairwise clique potentials, with a few exceptions that consider triples. This is because of the lack of efficient algorithms to optimize energies with higher-order interactions. Our algorithm challenges this restriction that limits the representational power of the models so that higher-order energies can be used to capture the rich statistics of natural scenes. We also show that some minimization methods can be considered special cases of the present framework, as well as comparing the new method experimentally with other such techniques.
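
    The flavor of such reductions can be seen in the classic negative-coefficient case (Ishikawa's construction generalizes to arbitrary signs): a cubic term with coefficient a < 0 equals a pairwise minimization over one auxiliary binary variable, which the snippet verifies by brute force.

    ```python
    # For a < 0:  a*x1*x2*x3 == min over binary w of  a*w*(x1 + x2 + x3 - 2).
    # Brute-force check over all 8 binary assignments.
    from itertools import product

    a = -3.0
    for x1, x2, x3 in product((0, 1), repeat=3):
        higher_order = a * x1 * x2 * x3
        reduced = min(a * w * (x1 + x2 + x3 - 2) for w in (0, 1))
        assert higher_order == reduced, (x1, x2, x3)
    print("reduction verified for all 8 assignments")
    ```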

  18. A Variance Minimization Criterion to Feature Selection Using Laplacian Regularization.

    Science.gov (United States)

    He, Xiaofei; Ji, Ming; Zhang, Chiyuan; Bao, Hujun

    2011-10-01

    In many information processing tasks, one is often confronted with very high-dimensional data. Feature selection techniques are designed to find the meaningful feature subset of the original features which can facilitate clustering, classification, and retrieval. In this paper, we consider the feature selection problem in unsupervised learning scenarios, which is particularly difficult due to the absence of class labels that would guide the search for relevant information. Based on Laplacian regularized least squares, which finds a smooth function on the data manifold and minimizes the empirical loss, we propose two novel feature selection algorithms which aim to minimize the expected prediction error of the regularized regression model. Specifically, we select those features such that the size of the parameter covariance matrix of the regularized regression model is minimized. Motivated by experimental design, we use trace and determinant operators to measure the size of the covariance matrix. Efficient computational schemes are also introduced to solve the corresponding optimization problems. Extensive experimental results over various real-life data sets have demonstrated the superiority of the proposed algorithms.
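
    A sketch of the trace (A-optimality) flavor of this criterion, using a plain ridge covariance for brevity where the paper's criterion additionally involves the graph Laplacian; data and parameters are illustrative.

    ```python
    # Greedy A-optimal feature selection sketch: repeatedly add the feature
    # that most shrinks trace((X_S^T X_S + lam*I)^-1), the parameter covariance
    # of a ridge regression restricted to the selected columns S.
    import numpy as np

    def greedy_trace_selection(X, k, lam=1e-2):
        n, d = X.shape
        selected = []
        for _ in range(k):
            best, best_trace = None, np.inf
            for j in range(d):
                if j in selected:
                    continue
                cols = selected + [j]
                Xs = X[:, cols]
                cov = np.linalg.inv(Xs.T @ Xs + lam * np.eye(len(cols)))
                if np.trace(cov) < best_trace:
                    best, best_trace = j, np.trace(cov)
            selected.append(best)
        return selected

    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 8))
    X[:, 3] *= 5.0                    # a high-variance feature: picked first
    print("selected features:", greedy_trace_selection(X, k=3))
    ```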

  19. Correlates of minimal dating.

    Science.gov (United States)

    Leck, Kira

    2006-10-01

    Researchers have associated minimal dating with numerous factors. The present author tested shyness, introversion, physical attractiveness, performance evaluation, anxiety, social skill, social self-esteem, and loneliness to determine the nature of their relationships with 2 measures of self-reported minimal dating in a sample of 175 college students. For women, shyness, introversion, physical attractiveness, self-rated anxiety, social self-esteem, and loneliness correlated with 1 or both measures of minimal dating. For men, physical attractiveness, observer-rated social skill, social self-esteem, and loneliness correlated with 1 or both measures of minimal dating. The patterns of relationships were not identical for the 2 indicators of minimal dating, indicating the possibility that minimal dating is not a single construct as researchers previously believed. The present author discussed implications and suggestions for future researchers.

  20. Marrying Content and Process in Computer Science Education

    Science.gov (United States)

    Zendler, A.; Spannagel, C.; Klaudt, D.

    2011-01-01

    Constructivist approaches to computer science education emphasize that as well as knowledge, thinking skills and processes are involved in active knowledge construction. K-12 computer science curricula must not be based on fashions and trends, but on contents and processes that are observable in various domains of computer science, that can be…

  1. Finding minimal action sequences with a simple evaluation of actions

    Science.gov (United States)

    Shah, Ashvin; Gurney, Kevin N.

    2014-01-01

    Animals are able to discover the minimal number of actions that achieves an outcome (the minimal action sequence). In most accounts of this, actions are associated with a measure of behavior that is higher for actions that lead to the outcome with a shorter action sequence, and learning mechanisms find the actions associated with the highest measure. In this sense, previous accounts focus on more than the simple binary signal of “was the outcome achieved?”; they focus on “how well was the outcome achieved?” However, such mechanisms may not govern all types of behavioral development. In particular, in the process of action discovery (Redgrave and Gurney, 2006), actions are reinforced if they simply lead to a salient outcome because biological reinforcement signals occur too quickly to evaluate the consequences of an action beyond an indication of the outcome's occurrence. Thus, action discovery mechanisms focus on the simple evaluation of “was the outcome achieved?” and not “how well was the outcome achieved?” Notwithstanding this impoverishment of information, can the process of action discovery find the minimal action sequence? We address this question by implementing computational mechanisms, referred to in this paper as no-cost learning rules, in which each action that leads to the outcome is associated with the same measure of behavior. No-cost rules focus on “was the outcome achieved?” and are consistent with action discovery. No-cost rules discover the minimal action sequence in simulated tasks and execute it for a substantial amount of time. Extensive training, however, results in extraneous actions, suggesting that a separate process (which has been proposed in action discovery) must attenuate learning if no-cost rules participate in behavioral development. We describe how no-cost rules develop behavior, what happens when attenuation is disrupted, and relate the new mechanisms to wider computational and biological context. PMID:25506326

  2. Computer sciences

    Science.gov (United States)

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  3. Facilitating Preschoolers' Scientific Knowledge Construction via Computer Games Regarding Light and Shadow: The Effect of the Prediction-Observation-Explanation (POE) Strategy

    Science.gov (United States)

    Hsu, Chung-Yuan; Tsai, Chin-Chung; Liang, Jyh-Chong

    2011-10-01

    Educational researchers have suggested that computer games have a profound influence on students' motivation, knowledge construction, and learning performance, but little empirical research has targeted preschoolers. Thus, the purpose of the present study was to investigate the effects of implementing a computer game that integrates the prediction-observation-explanation (POE) strategy (White and Gunstone in Probing understanding. Routledge, New York, 1992) on facilitating preschoolers' acquisition of scientific concepts regarding light and shadow. The children's alternative conceptions were explored as well. Fifty participants were randomly assigned into either an experimental group that played a computer game integrating the POE model or a control group that played a non-POE computer game. By assessing the students' conceptual understanding through interviews, this study revealed that the students in the experimental group significantly outperformed their counterparts in the concepts regarding "shadow formation in daylight" and "shadow orientation." However, children in both groups, after playing the games, still expressed some alternative conceptions such as "Shadows always appear behind a person" and "Shadows should be on the same side as the sun."

  4. Minimal Super Technicolor

    DEFF Research Database (Denmark)

    Antola, M.; Di Chiara, S.; Sannino, F.

    2011-01-01

    We introduce novel extensions of the Standard Model featuring a supersymmetric technicolor sector (supertechnicolor). As the first minimal conformal supertechnicolor model we consider N=4 Super Yang-Mills which breaks to N=1 via the electroweak interactions. This is a well defined, economical ... between unparticle physics and Minimal Walking Technicolor. We consider also other N=1 extensions of the Minimal Walking Technicolor model. The new models allow all the standard model matter fields to acquire a mass.

  5. Aiming for knowledge information processing systems

    Energy Technology Data Exchange (ETDEWEB)

    Fuchi, K

    1982-01-01

    The Fifth Generation Computer Project in Japan intends to develop a new generation of computers through extensive research in many areas. This paper discusses many research topics which the Japanese hope will lead to a radically new knowledge information processing system. Topics discussed include new computer architectures, programming styles, the semantics of programming languages, relational databases, linguistic theory, artificial intelligence, functional images and inference systems.

  6. Image registration using stationary velocity fields parameterized by norm-minimizing Wendland kernel

    DEFF Research Database (Denmark)

    Pai, Akshay Sadananda Uppinakudru; Sommer, Stefan Horst; Sørensen, Lauge

    by the regularization term. In a variational formulation, this term is traditionally expressed as a squared norm which is a scalar inner product of the interpolating kernels parameterizing the velocity fields. The minimization of this term using the standard spline interpolation kernels (linear or cubic) is only approximative because of the lack of a compatible norm. In this paper, we propose to replace such interpolants with a norm-minimizing interpolant - the Wendland kernel - which has the same computational simplicity as B-splines. An application on the Alzheimer's disease neuroimaging initiative showed that Wendland SVF based measures separate (Alzheimer's disease vs. normal controls) better than both B-Spline SVFs (p
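
    For reference, the standard compactly supported Wendland C2 kernel (valid up to dimension three) has the closed form ψ(r) = (1 − r)⁴(4r + 1) for r in [0, 1] and zero outside; a minimal sketch, assuming this standard form is the one intended:

    ```python
    # Wendland C2 kernel: psi(r) = (1 - r)^4 * (4r + 1) on [0, 1], zero beyond.
    # Compact support keeps interpolation matrices sparse, which is what makes
    # these kernels as cheap to work with as B-splines.
    import numpy as np

    def wendland_c2(r):
        r = np.abs(np.asarray(r, dtype=float))
        return np.where(r < 1.0, (1.0 - r) ** 4 * (4.0 * r + 1.0), 0.0)

    r = np.linspace(0.0, 1.5, 7)
    print(np.round(wendland_c2(r), 4))   # decays smoothly to 0 at r = 1
    ```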

  7. Introducing Seismic Tomography with Computational Modeling

    Science.gov (United States)

    Neves, R.; Neves, M. L.; Teodoro, V.

    2011-12-01

    Learning seismic tomography principles and techniques involves advanced physical and computational knowledge. In-depth learning of such computational skills is a difficult cognitive process that requires a strong background in physics, mathematics and computer programming. The corresponding learning environments and pedagogic methodologies should therefore involve sets of computational modelling activities with computer software systems which allow students to improve their mathematical or programming knowledge and simultaneously focus on the learning of seismic wave propagation and inverse theory. To reduce the level of cognitive opacity associated with mathematical or programming knowledge, several computer modelling systems have already been developed (Neves & Teodoro, 2010). Among such systems, Modellus is particularly well suited to achieve this goal because it is a domain-general environment for explorative and expressive modelling with the following main advantages: 1) easy and intuitive creation of mathematical models using just standard mathematical notation; 2) the simultaneous exploration of images, tables, graphs and object animations; 3) the attribution of mathematical properties expressed in the models to animated objects; and finally 4) the computation and display of mathematical quantities obtained from the analysis of images and graphs. Here we describe virtual simulations and educational exercises which give students an easy grasp of the fundamentals of seismic tomography. The simulations make the lecture more interactive and allow students to overcome their lack of advanced mathematical or programming knowledge and focus on learning seismological concepts and processes, taking advantage of basic scientific computation methods and tools.

  8. Long-term changes of information environments and computer anxiety of nurse administrators in Japan.

    Science.gov (United States)

    Majima, Yukie; Izumi, Takako

    2013-01-01

    In Japan, medical information systems, including electronic medical records, are increasingly being introduced in medical and nursing fields. Nurse administrators, who are involved in the introduction of medical information systems and must exercise proper judgment about them, are particularly required to have at least minimal knowledge of computers and networks and the ability to think about easy-to-use medical information systems. However, few of the current generation of nurse administrators studied information science subjects in their basic education curriculum. Information education for nurse administrators has therefore become a pressing issue. Consequently, in this study, we conducted a survey of participants taking the first-level program of the education course for Japanese certified nurse administrators to ascertain their actual conditions, such as the information environments nurse administrators work in and their anxiety toward computers. Comparisons over the seven years since 2004 revealed that although the introduction of electronic medical records in hospitals was progressing, little change was observed in the attributes of participants taking the course, such as computer anxiety.

  9. Minimizing the Effect of Substantial Perturbations in Military Water Systems for Increased Resilience and Efficiency

    Directory of Open Access Journals (Sweden)

    Corey M. James

    2017-10-01

    Full Text Available A model predictive control (MPC) framework, exploiting both feedforward and feedback control loops, is employed to minimize the impact of large disturbances that occur in military water networks. Military installations’ need for resilient and efficient water supplies is often challenged by large disturbances like fires, terrorist activity, troop training rotations, and large-scale leaks. This work applies MPC to provide predictive capability and to compensate for vast geographical differences and varying phenomena time scales, using computational software and actual system dimensions and parameters. The results show that large disturbances are rapidly minimized while chlorine concentration is maintained within legal limits at the point of demand and overall water usage is minimized. The control framework also ensures pumping is minimized during peak electricity hours, so costs are kept lower than with simple proportional control. The control structure implemented in this work is able to support resiliency and increased efficiency on military bases by minimizing tank holdup, effectively countering large disturbances, and efficiently managing pumping.
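
    As a rough, self-contained illustration of the receding-horizon idea behind such a framework (not the authors' network-scale implementation), the sketch below optimizes pumping over a short horizon for a single toy tank and applies only the first move; the model, tariff profile, bounds, and weights are all invented assumptions.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    # Toy tank model: level[k+1] = level[k] + dt/area * (pump[k] - demand[k]).
    DT, AREA, HORIZON = 1.0, 50.0, 12
    SETPOINT = 5.0                             # target tank level (m)
    demand = np.full(HORIZON, 2.0)
    demand[3:6] = 8.0                          # forecast disturbance, e.g. a fire
    price = np.where(np.arange(HORIZON) < 8, 3.0, 1.0)   # peak-hour tariff

    def cost(pump, level0):
        level, j = level0, 0.0
        for k in range(HORIZON):
            level = level + DT / AREA * (pump[k] - demand[k])
            j += (level - SETPOINT) ** 2 + 0.01 * price[k] * pump[k]
        return j

    def mpc_step(level0):
        """Solve the horizon problem; apply only the first pumping move."""
        res = minimize(cost, np.full(HORIZON, 2.0), args=(level0,),
                       bounds=[(0.0, 10.0)] * HORIZON)
        return res.x[0]

    print(mpc_step(level0=4.0))
    ```

    In closed loop, this step is re-solved at every sample with the measured level (feedback) and the updated demand forecast (feedforward).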

  10. Integrating pedagogical content knowledge and pedagogical/psychological knowledge in mathematics

    Science.gov (United States)

    Harr, Nora; Eichler, Andreas; Renkl, Alexander

    2014-01-01

    In teacher education at universities, general pedagogical and psychological principles are often treated separately from subject matter knowledge and therefore run the risk of not being applied in the teaching subject. In an experimental study (N = 60 mathematics student teachers) we investigated the effects of providing aspects of general pedagogical/psychological knowledge (PPK) and pedagogical content knowledge (PCK) in an integrated or separated way. In both conditions (“integrated” vs. “separated”), participants individually worked on computer-based learning environments addressing the same topic: use and handling of multiple external representations, a central issue in mathematics. We experimentally varied whether PPK aspects and PCK aspects were treated integrated or apart from one another. As expected, the integrated condition led to greater application of pedagogical/psychological aspects and an increase in applying both knowledge types simultaneously compared to the separated condition. Overall, our findings indicate beneficial effects of an integrated design in teacher education. PMID:25191300

  11. Omnigradient Based Total Variation Minimization for Enhanced Defocus Deblurring of Omnidirectional Images

    Directory of Open Access Journals (Sweden)

    Yongle Li

    2014-01-01

    Full Text Available We propose a new method of image restoration for catadioptric defocus blur using omni-total variation (Omni-TV) minimization based on the omnigradient. Catadioptric omnidirectional imaging systems usually consist of conventional cameras and curved mirrors for capturing a 360° field of view. The problem of catadioptric omnidirectional imaging defocus blur, which is caused by lens aperture and mirror curvature, becomes more severe when high-resolution sensors and large apertures are used. In an omnidirectional image, two points near each other may not be close to one another in the 3D scene, so traditional gradient computation cannot be directly applied to omnidirectional image processing. Thus, an omnigradient computing method combined with the characteristics of catadioptric omnidirectional imaging is proposed. Following this, Omni-TV minimization is used as the constraint for deconvolution regularization, leading to the removal of defocus blur and the recovery of a fully sharp omnidirectional image. The proposed method is important for improving catadioptric omnidirectional imaging quality and promoting applications in related fields like omnidirectional video and image processing.
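
    The omnigradient itself depends on the catadioptric geometry, but the TV-regularized restoration step it feeds can be sketched with ordinary image gradients. The following minimal denoising sketch uses a smoothed-TV objective with invented step sizes; the paper's method additionally includes the blur operator and the omnidirectional gradient.

    ```python
    import numpy as np

    def tv_restore(observed, lam=0.1, steps=200, lr=0.2, eps=1e-6):
        """Gradient descent on 0.5*||x - b||^2 + lam * sum sqrt(|grad x|^2 + eps)."""
        x = observed.copy()
        for _ in range(steps):
            gx = np.diff(x, axis=1, append=x[:, -1:])   # horizontal differences
            gy = np.diff(x, axis=0, append=x[-1:, :])   # vertical differences
            mag = np.sqrt(gx ** 2 + gy ** 2 + eps)
            # divergence of the normalized gradient field = -grad of the TV term
            div = (np.diff(gx / mag, axis=1, prepend=0.0) +
                   np.diff(gy / mag, axis=0, prepend=0.0))
            x -= lr * ((x - observed) - lam * div)
        return x

    noisy = 1.0 + 0.3 * np.random.default_rng(0).normal(size=(64, 64))
    smooth = tv_restore(noisy)
    ```

    Roughly speaking, replacing the plain differences in gx and gy with differences taken along the mirror's projection paths is what distinguishes the omnigradient variant.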

  12. Research on Computer-Based Education for Reading Teachers: A 1989 Update. Results of the First National Assessment of Computer Competence.

    Science.gov (United States)

    Balajthy, Ernest

    Results of the 1985-86 National Assessment of Educational Progress (NAEP) survey of American students' knowledge of computers suggest that American schools have a long way to go before computers can be said to have made a significant impact. The survey covered the 3rd, 7th, and 11th grade levels and assessed competence in knowledge of computers,…

  13. Minimizing inner product data dependencies in conjugate gradient iteration

    Science.gov (United States)

    Vanrosendale, J.

    1983-01-01

    The amount of concurrency available in conjugate gradient iteration is limited by the summations required in the inner product computations. The inner product of two vectors of length N requires time c log(N), if N or more processors are available. This paper describes an algebraic restructuring of the conjugate gradient algorithm which minimizes data dependencies due to inner product calculations. After an initial start-up, the new algorithm can perform a conjugate gradient iteration in time c*log(log(N)).
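
    The two reductions that limit concurrency are easy to see in the textbook algorithm. The sketch below is standard conjugate gradient, not the restructured c*log(log(N)) variant from the paper; comments mark the inner products that force global synchronization on a parallel machine.

    ```python
    import numpy as np

    def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
        """Textbook CG; every iteration needs two inner products."""
        x = np.zeros_like(b)
        r = b - A @ x
        p = r.copy()
        rs = r @ r                            # reduction 1: r.r
        for _ in range(max_iter):
            Ap = A @ p
            alpha = rs / (p @ Ap)             # reduction 2: p.Ap, serializes the step
            x += alpha * p
            r -= alpha * Ap
            rs_new = r @ r
            if rs_new < tol ** 2:
                break
            p = r + (rs_new / rs) * p
            rs = rs_new
        return x
    ```

    Each of those dot products costs c*log(N) on N processors, which is exactly the bottleneck the restructured algorithm attacks.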

  14. COMPUTATIONAL MODELS USED FOR MINIMIZING THE NEGATIVE IMPACT OF ENERGY ON THE ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Oprea D.

    2012-04-01

    Full Text Available Optimizing energy systems is a problem that has been studied extensively for many years. The problem can be approached from different perspectives and with different computer programs. This paper characterizes one of the calculation methods used in Europe for modelling and optimizing power systems, a method based on reducing the impact of the energy system on the environment. The computer program used and characterized in this article is GEMIS.

  15. Sculpting proteins interactively: continual energy minimization embedded in a graphical modeling system.

    Science.gov (United States)

    Surles, M C; Richardson, J S; Richardson, D C; Brooks, F P

    1994-02-01

    We describe a new paradigm for modeling proteins in interactive computer graphics systems--continual maintenance of a physically valid representation, combined with direct user control and visualization. This is achieved by a fast algorithm for energy minimization, capable of real-time performance on all atoms of a small protein, plus graphically specified user tugs. The modeling system, called Sculpt, rigidly constrains bond lengths, bond angles, and planar groups (similar to existing interactive modeling programs), while it applies elastic restraints to minimize the potential energy due to torsions, hydrogen bonds, and van der Waals and electrostatic interactions (similar to existing batch minimization programs), and user-specified springs. The graphical interface can show bad and/or favorable contacts, and individual energy terms can be turned on or off to determine their effects and interactions. Sculpt finds a local minimum of the total energy that satisfies all the constraints using an augmented Lagrange-multiplier method; calculation time increases only linearly with the number of atoms because the matrix of constraint gradients is sparse and banded. On a 100-MHz MIPS R4000 processor (Silicon Graphics Indigo), Sculpt achieves 11 updates per second on a 20-residue fragment and 2 updates per second on an 80-residue protein, using all atoms except non-H-bonding hydrogens, and without electrostatic interactions. Applications of Sculpt are described: to reverse the direction of bundle packing in a designed 4-helix bundle protein, to fold up a 2-stranded beta-ribbon into an approximate beta-barrel, and to design the sequence and conformation of a 30-residue peptide that mimics one partner of a protein subunit interaction. Computer models that are both interactive and physically realistic (within the limitations of a given force field) have 2 significant advantages: (1) they make feasible the modeling of very large changes (such as needed for de novo design), and
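
    The core numerical idea, augmented Lagrange multipliers enforcing rigid constraints while soft energy terms are minimized, can be sketched compactly. The toy below is not Sculpt: it holds one bond length rigid while a user "tug" spring pulls an atom, with invented energies and gains, and it uses a generic optimizer rather than Sculpt's sparse banded solver.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    BOND_LEN, RHO = 1.0, 10.0     # rigid bond length; penalty weight (assumed)

    def energy(coords):
        """Soft terms: a user tug on atom 1 plus a weak restraint on atom 0."""
        x0, x1 = coords[:2], coords[2:]
        tug = np.array([2.0, 0.0])            # graphically specified user tug
        return 0.5 * np.sum((x1 - tug) ** 2) + 0.1 * np.sum(x0 ** 2)

    def constraint(coords):
        """Rigid constraint c(x) = 0: bond between atoms keeps length BOND_LEN."""
        x0, x1 = coords[:2], coords[2:]
        return np.linalg.norm(x1 - x0) - BOND_LEN

    coords, mu = np.array([0.0, 0.0, 1.0, 0.0]), 0.0
    for _ in range(20):                       # outer multiplier updates
        aug = lambda c: (energy(c) + mu * constraint(c)
                         + 0.5 * RHO * constraint(c) ** 2)
        coords = minimize(aug, coords).x      # inner unconstrained minimization
        mu += RHO * constraint(coords)        # standard multiplier step
    print(coords, constraint(coords))         # constraint ~ 0 at the minimum
    ```

    Sculpt's linear-time behavior comes from exploiting the sparse, banded structure of the constraint gradients, which a generic dense optimizer like this one does not.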

  16. Cloud Computing Task Scheduling Based on Cultural Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Li Jian-Wen

    2016-01-01

    Full Text Available A task scheduling strategy based on a cultural genetic algorithm (CGA) is proposed in order to improve the efficiency of task scheduling in the cloud computing platform, targeting the minimization of the total time and cost of task scheduling. The improved genetic algorithm is used to construct the main population space and the knowledge space under the cultural framework; the two spaces evolve independently in parallel, forming a mechanism of mutual promotion for dispatching cloud tasks. Simultaneously, in order to counter the genetic algorithm's tendency to fall into local optima, a non-uniform mutation operator is introduced to improve the search performance of the algorithm. The experimental results show that the CGA reduces the total time and lowers the cost of scheduling, making it an effective algorithm for cloud task scheduling.
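
    The non-uniform mutation operator mentioned above is commonly given in the Michalewicz form, where the mutation step shrinks as the run progresses. A minimal sketch assuming that form (the paper's exact operator and its interaction with the knowledge space are not reproduced):

    ```python
    import random

    def non_uniform_mutation(gene, lo, hi, gen, max_gen, b=5.0):
        """Michalewicz-style non-uniform mutation: wide moves early in the
        run, fine-grained local tuning as gen approaches max_gen."""
        frac = (1.0 - gen / max_gen) ** b
        delta = lambda y: y * (1.0 - random.random() ** frac)
        if random.random() < 0.5:
            return gene + delta(hi - gene)    # step toward the upper bound
        return gene - delta(gene - lo)        # step toward the lower bound

    # Early generation: a large exploratory step is likely.
    print(non_uniform_mutation(0.5, 0.0, 1.0, gen=5, max_gen=100))
    # Late generation: the step collapses toward zero.
    print(non_uniform_mutation(0.5, 0.0, 1.0, gen=95, max_gen=100))
    ```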

  17. Personal optical disk library (PODL) for knowledge engineering

    Science.gov (United States)

    Wang, Hong; Jia, Huibo; Xu, Duanyi

    2001-02-01

    This paper describes the structure of the Personal Optical Disk Library (PODL), a kind of large-capacity (40 GB) optical storage equipment for personal use. With knowledge engineering technology integrated into the PODL, it can be used for knowledge query, knowledge discovery, Computer-Aided Instruction (CAI) and Online Analytical Processing (OLAP).

  18. Minimal supersymmetric grand unified theory: Symmetry breaking and the particle spectrum

    International Nuclear Information System (INIS)

    Bajc, Borut; Melfo, Alejandra; Senjanovic, Goran; Vissani, Francesco

    2004-01-01

    We discuss in detail the symmetry breaking and related issues in the minimal renormalizable supersymmetric grand unified theory. We find all the possible patterns of symmetry breaking, compute the associated particle spectrum and study its impact on the physical scales of the theory. In particular, the complete mass matrices of the SU(2) doublets and the color triplets are computed in connection with the doublet-triplet splitting and the d=5 proton decay. We explicitly construct the two light Higgs doublets as a function of the Higgs superpotential parameters. This provides a framework for the analysis of phenomenological implications of the theory, to be carried out in a second paper

  19. Strong coupling in electromechanical computation

    CERN Document Server

    Fuezi, J

    2000-01-01

    A method is presented to carry out simultaneously electromagnetic field and force computation, electrical circuit analysis and mechanical computation to simulate the dynamic operation of electromagnetic actuators. The equation system is solved by a predictor-corrector scheme containing a Powell error minimization algorithm which ensures that every differential equation (coil current, field strength rate, flux rate, speed of the keeper) is fulfilled within the same time step.

  20. Strong coupling in electromechanical computation

    Energy Technology Data Exchange (ETDEWEB)

    Fuezi, J. E-mail: fuzi@leda.unitbv.ro; fuzi@evtsz.bme.hu

    2000-06-02

    A method is presented to carry out simultaneously electromagnetic field and force computation, electrical circuit analysis and mechanical computation to simulate the dynamic operation of electromagnetic actuators. The equation system is solved by a predictor-corrector scheme containing a Powell error minimization algorithm which ensures that every differential equation (coil current, field strength rate, flux rate, speed of the keeper) is fulfilled within the same time step.

  1. Knowledge Management for Shared Awareness

    Science.gov (United States)

    2013-05-01

    knowledge derived from experiences that can be communicated through mechanisms such as storytelling, debriefing etc., or summarized in databases… patterned on the neural architecture of the brain. Neural nets often consist of a large number of nodes connected by links that transmit signals… that allow speech generation by a computer. Storytelling: the use of stories in organizations as a way of sharing knowledge and helping learning

  2. Spontaneous and traumatic hepatic rupture: imaging findings and minimally invasive treatment

    International Nuclear Information System (INIS)

    Palacio, Glaucia Andrade e Silva; D'Ippolito, Giuseppe

    2003-01-01

    Spontaneous hepatic bleeding is a rare condition. Our aim was to describe the imaging findings and the minimally invasive treatment, using transcatheter arterial embolization, in patients with spontaneous and traumatic hepatic rupture. Three patients presented acute hemoperitoneum due to hepatic rupture caused by spontaneous rupture of hepatocellular carcinoma, HELLP syndrome and blunt hepatic trauma. The patients were submitted to ultrasound and computed tomography of the abdomen and subsequently treated by transcatheter arterial embolization. All patients underwent helical computed tomography before and after treatment. Computed tomography played an important role in the evaluation and follow-up of the therapeutic intervention, and different types of liver injuries were identified. Transcatheter arterial embolization blocked arterial hemorrhage in the patients who were hemodynamically unstable. We conclude that transcatheter arterial embolization is an effective and well-tolerated treatment method for the management of hepatic rupture, and that computed tomography is an excellent method for the diagnosis and follow-up of these patients. (author)

  3. Computational model for dosimetric purposes in dental procedures

    International Nuclear Information System (INIS)

    Kawamoto, Renato H.; Campos, Tarcisio R.

    2013-01-01

    This study aims to develop a computational model of the oral region for dosimetric purposes, based on the computational tools SISCODES and MCNP-5, to predict deterministic effects and minimize stochastic effects caused by ionizing radiation in radiodiagnosis. Based on a set of digital information provided by computed tomography, a three-dimensional voxel model was created with its tissues represented. The model was exported to the MCNP code. In association with SISCODES, we used the Monte Carlo N-Particle Transport Code (MCNP-5) to reproduce the statistical process of the interaction of nuclear particles with human tissues. The study will serve as a source of data for dosimetric studies in the oral region, helping to predict deterministic effects and minimize stochastic effects of ionizing radiation

  4. Fifth generation computer systems. Proceedings of the International conference

    Energy Technology Data Exchange (ETDEWEB)

    Moto-oka, T

    1982-01-01

    The following topics were dealt with: Fifth Generation Computer Project-social needs and impact; knowledge information processing research plan; architecture research plan; knowledge information processing topics; fifth generation computer architecture considerations.

  5. Children Develop Initial Orthographic Knowledge during Storybook Reading

    Science.gov (United States)

    Apel, Kenn; Brimo, Danielle; Wilson-Fowler, Elizabeth B.; Vorstius, Christian; Radach, Ralph

    2013-01-01

    We examined whether young children acquire orthographic knowledge during structured adult-led storybook reading even though minimal viewing time is devoted to print. Sixty-two kindergarten children were read 12 storybook "chapters" while their eye movements were tracked. Results indicated that the children quickly acquired initial mental…

  6. Semantic Health Knowledge Graph: Semantic Integration of Heterogeneous Medical Knowledge and Services

    Directory of Open Access Journals (Sweden)

    Longxiang Shi

    2017-01-01

    Full Text Available With the explosion of healthcare information, there has been a tremendous amount of heterogeneous textual medical knowledge (TMK), which plays an essential role in healthcare information systems. Existing works for integrating and utilizing the TMK mainly focus on establishing straightforward connections and pay less attention to making computers interpret and retrieve knowledge correctly and quickly. In this paper, we explore a novel model to organize and integrate the TMK into conceptual graphs. We then employ a framework to automatically retrieve knowledge in knowledge graphs with high precision. In order to perform reasonable inference on knowledge graphs, we propose a contextual inference pruning algorithm to achieve efficient chain inference. Our algorithm achieves a better inference result with precision and recall of 92% and 96%, respectively, and avoids most of the meaningless inferences. In addition, we implement two prototypes and provide services, and the results show our approach is practical and effective.

  7. Minimizing Mutual Coupling

    DEFF Research Database (Denmark)

    2010-01-01

    Disclosed herein are techniques, systems, and methods relating to minimizing mutual coupling between a first antenna and a second antenna.

  8. Information technology to support informal knowledge sharing

    NARCIS (Netherlands)

    Davison, R.M.; Ou, C.X.J.; Martinsons, M.G.

    2013-01-01

    The knowledge management (KM) literature largely focuses on the explicit and formal representation of knowledge in computer-based KM systems. Informal KM practices are widespread, but less is known about them. This paper aims to redress this imbalance by exploring the use of interactive information

  9. Media Richness, Knowledge sharing and computer progamming by virtual Software teams

    DEFF Research Database (Denmark)

    Williams, Idongesit; Gyamfi, Albert

    2018-01-01

    Software programming is a task with high analyzability. However, knowledge sharing is an integral part of the software programming process. Today, new media platforms have been adopted to enable knowledge sharing between virtual teams. Taking into consideration the high task analyzability and th…

  10. Designing Educational Games for Computer Programming: A Holistic Framework

    Science.gov (United States)

    Malliarakis, Christos; Satratzemi, Maya; Xinogalos, Stelios

    2014-01-01

    Computer science is continuously evolving during the past decades. This has also brought forth new knowledge that should be incorporated and new learning strategies must be adopted for the successful teaching of all sub-domains. For example, computer programming is a vital knowledge area within computer science with constantly changing curriculum…

  11. The Effects of Minimal Group Membership on Young Preschoolers’ Social Preferences, Estimates of Similarity, and Behavioral Attribution

    Directory of Open Access Journals (Sweden)

    Nadja Richter

    2016-07-01

    Full Text Available We investigate young children’s sensitivity to minimal group membership. Previous research has suggested that children do not show sensitivity to minimal cues to group membership until the age of five to six, contributing to claims that this is an important transition in the development of intergroup cognition and behavior. In this study, we investigated whether even younger children are sensitive to minimal cues to group membership. Random assignment to one of two color groups created a temporary, visually salient minimal group membership in 3- and 4-year-old study participants. Using explicit measures, we tested whether children preferred minimal group members when making social judgments. We find that, in the absence of any knowledge regarding the two groups, children expressed greater liking for ingroup than outgroup targets. Moreover, children estimated that ingroup members would share their preferences. Our findings demonstrate that from early in development, humans assess unknown others on the basis of minimal cues to social similarity and that the perception of group boundaries potentially underlies social assortment in strangers.

  12. Seventh International Conference on Intelligent Systems and Knowledge Engineering - Foundations and Applications of Intelligent Systems

    CERN Document Server

    Li, Tianrui; Li, Hongbo

    2014-01-01

    These proceedings present technical papers selected from the 2012 International Conference on Intelligent Systems and Knowledge Engineering (ISKE 2012), held on December 15-17 in Beijing. The aim of this conference is to bring together experts from different fields of expertise to discuss the state-of-the-art in Intelligent Systems and Knowledge Engineering, and to present new findings and perspectives on future developments. The proceedings introduce current scientific and technical advances in the fields of artificial intelligence, machine learning, pattern recognition, data mining, knowledge engineering, information retrieval, information theory, knowledge-based systems, knowledge representation and reasoning, multi-agent systems, and natural-language processing, etc. Furthermore, they include papers on new intelligent computing paradigms, which combine new computing methodologies, e.g., cloud computing, service computing and pervasive computing, with traditional intelligent methods. By presenting new method...

  13. Minimal-effort planning of active alignment processes for beam-shaping optics

    Science.gov (United States)

    Haag, Sebastian; Schranner, Matthias; Müller, Tobias; Zontar, Daniel; Schlette, Christian; Losch, Daniel; Brecher, Christian; Roßmann, Jürgen

    2015-03-01

    In science and industry, the alignment of beam-shaping optics is usually a manual procedure. Many industrial applications utilizing beam-shaping optical systems require more scalable production solutions, and therefore effort has been invested in research regarding the automation of optics assembly. In previous works, the authors and other researchers have proven the feasibility of automated alignment of beam-shaping optics such as collimation lenses or homogenization optics. Nevertheless, the planning efforts as well as the additional knowledge from the fields of automation and control required for such alignment processes are immense. This paper presents a novel approach to planning active alignment processes for beam-shaping optics, with a focus on minimizing the planning effort for active alignment. The approach utilizes optical simulation and the genetic programming paradigm from computer science to automatically extract, from a simulated data basis, features with a high correlation coefficient with respect to the individual degrees of freedom of alignment. The strategy is capable of finding active alignment strategies that can be executed by an automated assembly system. The paper presents a tool making the algorithm available to end-users and discusses the results of planning the active alignment of the well-known assembly of a fast-axis collimator. The paper concludes with an outlook on the transferability to other use cases, such as application-specific intensity distributions, which will benefit from reduced planning efforts.

  14. A strategy for minimizing common mode human error in executing critical functions and tasks

    International Nuclear Information System (INIS)

    Beltracchi, L.; Lindsay, R.W.

    1992-01-01

    Human error in execution of critical functions and tasks can be costly. The Three Mile Island and the Chernobyl Accidents are examples of results from human error in the nuclear industry. There are similar errors that could no doubt be cited from other industries. This paper discusses a strategy to minimize common mode human error in the execution of critical functions and tasks. The strategy consists of the use of human redundancy, and also diversity in human cognitive behavior: skill-, rule-, and knowledge-based behavior. The authors contend that the use of diversity in human cognitive behavior is possible, and it minimizes common mode error

  15. Minimally invasive single-site surgery for the digestive system: A technological review

    Directory of Open Access Journals (Sweden)

    Dhumane Parag

    2011-01-01

    Full Text Available Minimally Invasive Single Site (MISS) surgery is a better term to describe the novel concept of scarless surgery, which is increasingly making its way into clinical practice. But there are some difficulties. We review the existing technologies for MISS surgery with regard to single-port devices, endoscopes and cameras, instruments, retractors, and also the future perspectives for the evolution of MISS surgery. While we need to move ahead cautiously and wait for the development of appropriate technology, we believe that the "Ultimate form of Minimally Invasive Surgery" will be a hybrid form of MISS surgery and Natural Orifice Transluminal Endoscopic Surgery, complemented by technological innovations from the fields of robotics and computer-assisted surgery.

  16. MAINS: MULTI-AGENT INTELLIGENT SERVICE ARCHITECTURE FOR CLOUD COMPUTING

    Directory of Open Access Journals (Sweden)

    T. Joshva Devadas

    2014-04-01

    Full Text Available Computing has been transformed into a model of commoditized services, modeled on utility services such as water and electricity. The Internet has been stunningly successful over the course of the past three decades in supporting a multitude of distributed applications and a wide variety of network technologies. However, its popularity has become the biggest impediment to its further growth with handheld devices such as mobiles and laptops. Agents are intelligent software systems that work on behalf of others. Agents are incorporated into many innovative applications in order to improve the performance of the system, using their knowledge to interact with it. Agents are introduced into cloud computing to minimize the response time when similar requests are raised by end users across the globe. In this paper, we introduce a Multi-Agent Intelligent System (MAINS) layer that sits prior to the cloud service models, and we test it using sample datasets. The performance of the MAINS layer was analyzed in three aspects, and the outcome of the analysis shows that the MAINS layer provides a flexible model for creating cloud applications and deploying them in a variety of settings.

  17. Automated knowledge-base refinement

    Science.gov (United States)

    Mooney, Raymond J.

    1994-01-01

    Over the last several years, we have developed several systems for automatically refining incomplete and incorrect knowledge bases. These systems are given an imperfect rule base and a set of training examples and minimally modify the knowledge base to make it consistent with the examples. One of our most recent systems, FORTE, revises first-order Horn-clause knowledge bases. This system can be viewed as automatically debugging Prolog programs based on examples of correct and incorrect I/O pairs. In fact, we have already used the system to debug simple Prolog programs written by students in a programming language course. FORTE has also been used to automatically induce and revise qualitative models of several continuous dynamic devices from qualitative behavior traces. For example, it has been used to induce and revise a qualitative model of a portion of the Reaction Control System (RCS) of the NASA Space Shuttle. By fitting a correct model of this portion of the RCS to simulated qualitative data from a faulty system, FORTE was also able to correctly diagnose simple faults in this system.

  18. Studi Perbandingan Layanan Cloud Computing

    Directory of Open Access Journals (Sweden)

    Afdhal Afdhal

    2014-03-01

    Full Text Available In the past few years, cloud computing has become a dominant topic in the IT area. Cloud computing offers hardware, infrastructure, platforms and applications without requiring end-users' knowledge of the physical location and configuration of the providers who deliver the services. It has been a good solution for increasing reliability, reducing computing cost, and creating opportunities for IT industries to gain more advantages. The purpose of this article is to present a better understanding of cloud delivery services, their correlation and inter-dependency. This article compares and contrasts the different levels of delivery services and the development models, and identifies issues and future directions in cloud computing. End-users' comprehension of the cloud computing delivery service classification will equip them with the knowledge to determine and decide which business model to choose and adopt securely and comfortably. The last part of this article provides several recommendations for cloud computing service providers and end-users.

  19. Learning Management Systems: Are They Knowledge Management Tools?

    Directory of Open Access Journals (Sweden)

    Bayan Aref Abu Shawar

    2010-03-01

    Full Text Available The new adventure of the online world has helped to improve many domains and sectors. The knowledge management era, originally related to the business sector, is now required in industry, health, or any institution that needs to manage its knowledge. Education is no exception! The advancement in computer speed and memory, and the growth of Internet usage, are behind the inspiration for the e-learning approach, in which the computer is used as a medium to deliver and share educational materials and knowledge instead of face-to-face tutoring. This makes education available to anyone, any place, and any time, as learners need. This paper presents the relationship between knowledge management and the learning management system (LMS) used in e-learning paradigms. A detailed description of the LMS used at Arab Open University (AOU) is included in this paper. We claim that the LMS used at AOU can be considered a knowledge management tool.

  20. Impact of Knowledge Economy on the Participation of Women in Labor Market

    OpenAIRE

    Abeer Mohamed Ali Abd Elkhalek

    2017-01-01

    Purpose: To examine the influence of the knowledge economy on women's participation in the labor market, whether negative or positive. Methodology: A quantitative research technique has been applied to evaluate women's participation in the labor market and how to minimize the negative impacts of the knowledge economy. Findings: Within the service and agricultural sectors, the outcomes demonstrated that the knowledge economy has a significant impact on the participation of women's labor for...

  1. Computer science handbook. Vol. 13.3. Environmental computer science. Computer science methods for environmental protection and environmental research

    International Nuclear Information System (INIS)

    Page, B.; Hilty, L.M.

    1994-01-01

    Environmental computer science is a new partial discipline of applied computer science, which makes use of methods and techniques of information processing in environmental protection. Thanks to the inter-disciplinary nature of environmental problems, computer science acts as a mediator between numerous disciplines and institutions in this sector. The handbook reflects the broad spectrum of state-of-the art environmental computer science. The following important subjects are dealt with: Environmental databases and information systems, environmental monitoring, modelling and simulation, visualization of environmental data and knowledge-based systems in the environmental sector. (orig.) [de

  2. BCS Glossary of Computing and ICT

    CERN Document Server

    Panel, BCS Education and Training Expert; Burkhardt, Diana; Cumming, Aline; Hunter, Alan; Hurvid, Frank; Jaworski, John; Ng, Thomas; Scheer, Marianne; Southall, John; Vella, Alfred

    2008-01-01

    A glossary of computing designed to support those taking computer courses or courses where computers are used, including GCSE, A-Level, ECDL and 14-19 Diplomas in Functional Skills, in schools and Further Education colleges. It helps the reader build up knowledge and understanding of computing.

  3. Shredder: GPU-Accelerated Incremental Storage and Computation

    OpenAIRE

    Bhatotia, Pramod; Rodrigues, Rodrigo; Verma, Akshat

    2012-01-01

    Redundancy elimination using data deduplication and incremental data processing has emerged as an important technique to minimize storage and computation requirements in data center computing. In this paper, we present the design, implementation and evaluation of Shredder, a high performance content-based chunking framework for supporting incremental storage and computation systems. Shredder exploits the massively parallel processing power of GPUs to overcome the CPU bottlenecks of content-ba...
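
    The heart of content-based chunking is a rolling hash that cuts wherever the hash matches a bit pattern, so chunk boundaries follow the content instead of fixed offsets and survive insertions. A minimal CPU sketch of that idea (a toy shift-xor hash, not Shredder's GPU pipeline or a real Rabin fingerprint):

    ```python
    def chunk_boundaries(data: bytes, min_size=32, mask=(1 << 6) - 1):
        """Cut where the low bits of a content hash are all ones; identical
        content then yields identical chunks regardless of byte offsets."""
        h, start, chunks = 0, 0, []
        for i, byte in enumerate(data):
            h = ((h << 1) ^ byte) & 0xFFFFFFFF    # toy hash over the stream
            if i - start >= min_size and (h & mask) == mask:
                chunks.append((start, i + 1))
                start, h = i + 1, 0
        chunks.append((start, len(data)))
        return chunks

    print(chunk_boundaries(bytes(range(256)) * 8))
    ```

    Deduplication then stores each chunk once, keyed by a strong hash of its contents; Shredder's contribution is pushing this chunking stage onto the GPU to remove the CPU bottleneck.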

  4. Innovation system and knowledge-intensive entrepreneurship

    DEFF Research Database (Denmark)

    Timmermans, Bram

    2011-01-01

    The goal of this deliverable is to investigate the properties and the nature of knowledge-intensive entrepreneurship as a largely distributed phenomenon at firm, sector and national levels in Denmark. Following the guidelines previously developed in Deliverable 2.2.1 “Innovation systems and knowledge-intensive entrepreneurship: Analytical framework and guidelines for case study research”, I will investigate the interplay between national innovation systems and knowledge-intensive entrepreneurship by focusing on two main sectors: machine tools, and computer and related activities.

  5. Towards a Knowledge Communication Perspective on Designing Artefacts Supporting Knowledge Work

    Directory of Open Access Journals (Sweden)

    Niclas Eberhagen

    2015-02-01

    Full Text Available The design of computer-based artefacts to support knowledge work is far from a straightforward rational process. The characteristics of knowledge work have a bearing upon how developers (or designers), together with users, come to approach and capture the rich and tacit knowing of the practice. As all knowledge work is about the production of knowledge, transforming it, so the design practice for developing artefacts comes to occupy space within that same practice. There is a need for a conceptual language that better reflects the nature of this design work, one that goes beyond languages dressed in the managerial (or rational) terms of planned activities and deliverables. Towards this end, a conceptual frame is presented that makes several important aspects of the design practice visible. The frame brings together both the nature of design work and the characteristics of knowledge work to extend the frame of knowledge in user-developer communication of Kensing and Munk-Madsen. It thereby provides a means to focus attention and frame debate on what situated designing is. By using explicit concepts, such as the types of knowledge domains embedded in the design situation, the transitional paths between them, and design engagements, it arms practitioners with specific linguistic constructs to direct attention and effort in planning and organizing development undertakings. Purpose – the purpose of this work is to present and argue for a perspective on the design of computer-based artefacts supporting knowledge work. This is done to inform practitioners, directing their attention and framing debate, and providing a conceptual language to better capture design activities in planning and organizing development undertakings. Design/Methodology/Approach – The approach presented in this article is conceptual, insofar as a model or frame providing linguistic constructs is constructed and argued, building upon scholarly work in knowledge communication and drawing upon

  6. NASA/DOD Aerospace Knowledge Diffusion Research Project. Report 35: The use of computer networks in aerospace engineering

    Science.gov (United States)

    Bishop, Ann P.; Pinelli, Thomas E.

    1995-01-01

    This research used survey research to explore and describe the use of computer networks by aerospace engineers. The study population included 2000 randomly selected U.S. aerospace engineers and scientists who subscribed to Aerospace Engineering. A total of 950 usable questionnaires were received by the cutoff date of July 1994. Study results contribute to existing knowledge about both computer network use and the nature of engineering work and communication. We found that 74 percent of mail survey respondents personally used computer networks. Electronic mail, file transfer, and remote login were the most widely used applications. Networks were used less often than face-to-face interactions in performing work tasks, but about equally with reading and telephone conversations, and more often than mail or fax. Network use was associated with a range of technical, organizational, and personal factors: lack of compatibility across systems, cost, inadequate access and training, and unwillingness to embrace new technologies and modes of work appear to discourage network use. The greatest positive impacts from networking appear to be increases in the amount of accurate and timely information available, better exchange of ideas across organizational boundaries, and enhanced work flexibility, efficiency, and quality. Involvement with classified or proprietary data and type of organizational structure did not distinguish network users from nonusers. The findings can be used by people involved in the design and implementation of networks in engineering communities to inform the development of more effective networking systems, services, and policies.

  7. Older Adults' Knowledge of Internet Hazards

    Science.gov (United States)

    Grimes, Galen A.; Hough, Michelle G.; Mazur, Elizabeth; Signorella, Margaret L.

    2010-01-01

    Older adults are less likely to be using computers and less knowledgeable about Internet security than are younger users. The two groups do not differ on trust of Internet information. The younger group shows no age or gender differences. Within the older group, computer users are more trusting of Internet information, and along with those with…

  8. Minimization of Linear Functionals Defined on Solutions of Large-Scale Discrete Ill-Posed Problems

    DEFF Research Database (Denmark)

    Elden, Lars; Hansen, Per Christian; Rojas, Marielba

    2003-01-01

    The minimization of linear functionals defined on the solutions of discrete ill-posed problems arises, e.g., in the computation of confidence intervals for these solutions. In 1990, Elden proposed an algorithm for this minimization problem based on a parametric-programming reformulation involving the solution of a sequence of trust-region problems, and using matrix factorizations. In this paper, we describe MLFIP, a large-scale version of this algorithm where a limited-memory trust-region solver is used on the subproblems. We illustrate the use of our algorithm in connection with an inverse heat...

  9. Optimal design method to minimize users' thinking mapping load in human-machine interactions.

    Science.gov (United States)

    Huang, Yanqun; Li, Xu; Zhang, Jie

    2015-01-01

    The discrepancy between human cognition and machine requirements/behaviors usually results in serious mental thinking-mapping loads, or even disasters, in product operation. It is important to help people avoid confusion and difficulty in human-machine interaction in today's mentally demanding work settings, by improving the usability of a product and minimizing the user's thinking-mapping and interpreting load. An optimal human-machine interface design method is introduced, based on minimizing the mental load of the thinking-mapping process between users' intentions and the affordances of product interface states. By analyzing the users' thinking-mapping problem, an operating action model is constructed. According to human natural instincts and acquired knowledge, an expected ideal design with minimized thinking load is first uniquely determined. Then, creative alternatives, in terms of the way humans obtain operational information, are provided as digital interface state datasets. Finally, using the cluster analysis method, an optimum solution is picked out from the alternatives by calculating the distances between two datasets. Considering multiple factors to minimize users' thinking-mapping loads, a solution nearest to the ideal value is found in a human-car interaction design case. The clustering results show the method's effectiveness in finding an optimum solution to the mental load minimization problem in human-machine interaction design.

  10. Legal incentives for minimizing waste

    International Nuclear Information System (INIS)

    Clearwater, S.W.; Scanlon, J.M.

    1991-01-01

    Waste minimization, or pollution prevention, has become an integral component of federal and state environmental regulation. Minimizing waste offers many economic and public relations benefits. In addition, waste minimization efforts can also dramatically reduce potential criminal requirements. This paper addresses the legal incentives for minimizing waste under current and proposed environmental laws and regulations

  11. Application of computers in a Radiological Survey Program

    International Nuclear Information System (INIS)

    Berven, B.A.; Blair, M.S.; Doane, R.W.; Little, C.A.; Perdue, P.T.

    1984-01-01

    A brief description of some of the applications of computers in a radiological survey program is presented. It has been our experience that computers and computer software have allowed our staff to use their time more productively by delegating the mechanical acquisition, analysis, and storage of data to computers. It is hoped that other organizations may similarly profit from this experience. This effort will ultimately minimize errors and reduce program costs

  12. Computational Fluid Dynamics (CFD) Modeling for High Rate Pulverized Coal Injection (PCI) to Blast Furnaces

    International Nuclear Information System (INIS)

    Zhou, Chenn

    2008-01-01

    Pulverized coal injection (PCI) into the blast furnace (BF) has been recognized as an effective way to decrease coke and total energy consumption along with minimizing environmental impacts. However, increasing the amount of coal injected into the BF is currently limited by the lack of knowledge of some issues related to the process. It is therefore important to understand the complex physical and chemical phenomena in the PCI process. Due to the difficulty of attaining true BF measurements, computational fluid dynamics (CFD) modeling has been identified as a useful technology to provide such knowledge. CFD simulation is powerful for providing detailed information on flow properties and performing parametric studies for process design and optimization. In this project, comprehensive 3-D CFD models have been developed to simulate the PCI process under actual furnace conditions. These models provide raceway size and flow property distributions. The results have provided guidance for optimizing the PCI process

  13. MOCUS, Minimal Cut Sets and Minimal Path Sets from Fault Tree Analysis

    International Nuclear Information System (INIS)

    Fussell, J.B.; Henry, E.B.; Marshall, N.H.

    1976-01-01

    1 - Description of problem or function: From a description of the Boolean failure logic of a system, called a fault tree, and control parameters specifying the minimal cut set length to be obtained, MOCUS determines the system failure modes, or minimal cut sets, and the system success modes, or minimal path sets. 2 - Method of solution: MOCUS uses direct resolution of the fault tree into the cut and path sets. The algorithm starts with the main failure of interest, the top event, and proceeds to basic independent component failures, called primary events, to resolve the fault tree and obtain the minimal sets. A key point of the algorithm is that an AND gate alone always increases the number of path sets, while an OR gate alone always increases the number of cut sets and increases the size of path sets. Other types of logic gates must be described in terms of AND and OR logic gates. 3 - Restrictions on the complexity of the problem: Output from MOCUS can include minimal cut and path sets for up to 20 gates
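
    The top-down resolution that MOCUS performs can be sketched directly: OR gates union their children's cut sets, AND gates take cross-products, and absorption keeps only the minimal sets. A small illustrative sketch (invented tree encoding, not the original program):

    ```python
    from itertools import product

    def minimal_cut_sets(event, tree):
        """Resolve a fault tree top-down into its minimal cut sets.
        tree maps gate -> ("AND" | "OR", [children]); names missing from
        tree are primary events."""
        if event not in tree:
            return {frozenset([event])}
        op, children = tree[event]
        child_sets = [minimal_cut_sets(c, tree) for c in children]
        if op == "OR":    # OR: any child's cut set fails the gate
            expanded = set().union(*child_sets)
        else:             # AND: need one cut set from every child
            expanded = {frozenset().union(*combo)
                        for combo in product(*child_sets)}
        # absorption: drop any set that strictly contains another cut set
        return {s for s in expanded if not any(t < s for t in expanded)}

    tree = {"TOP": ("OR", ["G1", "C"]),
            "G1":  ("AND", ["A", "B"])}
    print(minimal_cut_sets("TOP", tree))  # {frozenset({'C'}), frozenset({'A', 'B'})}
    ```

    Minimal path sets follow by the dual resolution, swapping the roles of the AND and OR gates.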

  14. Voice-enabled Knowledge Engine using Flood Ontology and Natural Language Processing

    Science.gov (United States)

    Sermet, M. Y.; Demir, I.; Krajewski, W. F.

    2015-12-01

    The Iowa Flood Information System (IFIS) is a web-based platform developed by the Iowa Flood Center (IFC) to provide access to flood inundation maps, real-time flood conditions, flood forecasts, flood-related data, information and interactive visualizations for communities in Iowa. The IFIS is designed for use by general public, often people with no domain knowledge and limited general science background. To improve effective communication with such audience, we have introduced a voice-enabled knowledge engine on flood related issues in IFIS. Instead of navigating within many features and interfaces of the information system and web-based sources, the system provides dynamic computations based on a collection of built-in data, analysis, and methods. The IFIS Knowledge Engine connects to real-time stream gauges, in-house data sources, analysis and visualization tools to answer natural language questions. Our goal is the systematization of data and modeling results on flood related issues in Iowa, and to provide an interface for definitive answers to factual queries. The goal of the knowledge engine is to make all flood related knowledge in Iowa easily accessible to everyone, and support voice-enabled natural language input. We aim to integrate and curate all flood related data, implement analytical and visualization tools, and make it possible to compute answers from questions. The IFIS explicitly implements analytical methods and models, as algorithms, and curates all flood related data and resources so that all these resources are computable. The IFIS Knowledge Engine computes the answer by deriving it from its computational knowledge base. The knowledge engine processes the statement, access data warehouse, run complex database queries on the server-side and return outputs in various formats. This presentation provides an overview of IFIS Knowledge Engine, its unique information interface and functionality as an educational tool, and discusses the future plans

  15. Eighth International Conference on Intelligent Systems and Knowledge Engineering

    CERN Document Server

    Li, Tianrui; ISKE 2013; Foundations of Intelligent Systems; Knowledge Engineering and Management; Practical Applications of Intelligent Systems

    2014-01-01

    "Foundations of Intelligent Systems" presents selected papers from the 2013 International Conference on Intelligent Systems and Knowledge Engineering (ISKE2013). The aim of this conference is to bring together experts from different expertise areas to discuss the state-of-the-art in Intelligent Systems and Knowledge Engineering, and to present new research results and perspectives on future development. The topics in this volume include, but not limited to: Artificial Intelligence Theories, Pattern Recognition, Intelligent System Models, Speech Recognition, Computer Vision, Multi-Agent Systems, Machine Learning, Soft Computing and Fuzzy Systems, Biological Inspired Computation, Game Theory, Cognitive Systems and Information Processing, Computational Intelligence, etc. The proceedings are benefit for both researchers and practitioners who want to utilize intelligent methods in their specific research fields. Dr. Zhenkun Wen is a Professor at the College of Computer and Software Engineering, Shenzhen University...

  16. Free Energy Minimization Calculation of Complex Chemical Equilibria. Reduction of Silicon Dioxide with Carbon at High Temperature.

    Science.gov (United States)

    Wai, C. M.; Hutchinson, S. G.

    1989-01-01

    Discusses the calculation of free energy in reactions between silicon dioxide and carbon. Describes several computer programs for calculating the free energy minimization and their uses in chemistry classrooms. Lists 16 references. (YP)
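
    The simplest version of the underlying calculation, deciding at what temperature the reduction becomes spontaneous from dG = dH - T*dS, fits in a few lines. The thermochemical values below are rough textbook-level assumptions for SiO2 + 2C -> Si + 2CO and should be taken from proper tables for real work; full equilibrium programs instead minimize the total free energy over all species.

    ```python
    # Approximate reaction data (assumed): dH in kJ/mol, dS in kJ/(mol K).
    DH, DS = 689.0, 0.361

    def delta_g(temp_k):
        """Gibbs free energy change of the reaction at temperature temp_k."""
        return DH - temp_k * DS

    print(f"dG(1500 K) = {delta_g(1500.0):7.1f} kJ/mol")   # positive: no reaction
    print(f"dG(2000 K) = {delta_g(2000.0):7.1f} kJ/mol")   # negative: spontaneous
    print(f"crossover near {DH / DS:.0f} K")
    ```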

  17. Is non-minimal inflation eternal?

    International Nuclear Information System (INIS)

    Feng, Chao-Jun; Li, Xin-Zhou

    2010-01-01

    The possibility that the non-minimal coupling inflation could be eternal is investigated. We calculate the quantum fluctuation of the inflaton in a Hubble time and find that it has the same value as that in the minimal case in the slow-roll limit. Armed with this result, we have studied some concrete non-minimal inflationary models including the chaotic inflation and the natural inflation, in which the inflaton is non-minimally coupled to the gravity. We find that the non-minimal coupling inflation could be eternal in some parameter spaces.

  18. Effects of Playing a Serious Computer Game on Body Mass Index and Nutrition Knowledge in Women.

    Science.gov (United States)

    Shiyko, Mariya; Hallinan, Sean; Seif El-Nasr, Magy; Subramanian, Shree; Castaneda-Sceppa, Carmen

    2016-06-02

    Obesity and weight gain are critical public health concerns. Serious digital games are gaining popularity in the context of health interventions. They use persuasive and fun design features to engage users in health-related behaviors in a non-game context. In this young field, research about the effectiveness and acceptability of such games for weight loss is sparse. The goal of this study was to evaluate real-world play patterns of SpaPlay and its impact on body mass index (BMI) and nutritional knowledge. SpaPlay is a computer game designed to help women adopt healthier dietary and exercise behaviors, developed based on Self-Determination theory and the Player Experience of Need Satisfaction (PENS) model. Progress in the game is tied to real-life activities (e.g., eating a healthy snack, taking a flight of stairs). We recruited 47 women to partake in a within-subject 90-day longitudinal study, with assessments taken at baseline, 1, 2, and 3 months. Women were on average 29.8 years old (±7.3), highly educated (80.9% had a BA or higher), 39% non-White, with baseline BMI 26.98 (±5.6), and reported at least contemplating making changes in their diet and exercise routine based on the Stages of Change Model. We computed 9 indices from game utilization data to evaluate game play. We used general linear models to examine inter-individual differences between levels of play, and multilevel models to assess temporal changes in BMI and nutritional knowledge. Patterns of game play were mixed. Participants who reported being in the preparation or action stages of behavior change exhibited more days of play and more play regularity compared to those who were in the contemplation stage. Additionally, women who reported playing video games 1-2 hours per session demonstrated more sparse game play. Brief activities, such as one-time actions related to physical activity or healthy food, were preferred over activities that require a longer commitment (e.g., taking stairs every day for a week

  19. A computational model for determining the minimal cost expansion alternatives in transmission systems planning

    International Nuclear Information System (INIS)

    Pinto, L.M.V.G.; Pereira, M.V.F.; Nunes, A.

    1989-01-01

    A computational model for determining an economical transmission expansion plan, based on decomposition techniques, is presented. The algorithm was applied to the Brazilian South System and was able to find an optimal solution with modest computational resources. Some extensions of this methodology are being investigated: a probabilistic version and expansion under financial restrictions. (C.G.C.). 4 refs, 7 figs

  20. DSS and GIS in Knowledge Transformation Process

    Directory of Open Access Journals (Sweden)

    Klimešová Dana

    2009-06-01

    Full Text Available Knowledge is an important resource for successful decision-making processes throughout society today, so special procedures for the control and management of knowledge have to be used. In the areas of knowledge management and knowledge engineering, the basic terms are data, information, knowledge and knowledge transformation. Knowledge can be defined as a dynamic human process of justifying personal beliefs, and is a product of a successful decision-making process. Knowledge transformation is a spiralling process of interactions between explicit and tacit knowledge that leads to new knowledge. Nonaka et al. show that the combination of these two categories makes it possible to conceptualise four conversion steps: Socialisation, Externalisation, Combination and Internalisation (the SECI model). Another model of knowledge creation is the Knowledge Transformation Continuum (BCI Knowledge Group), which begins with the articulation of a specific instruction representing the best way that a specific task, or series of tasks, should be performed. Knowledge modelling and knowledge representation are also important fields of research in Computer Science and Artificial Intelligence. The definition of knowledge in Artificial Intelligence is noticeably different, because Artificial Intelligence typically deals with formalized knowledge (e.g. ontologies). The development of knowledge-based systems was seen as a process of transferring human knowledge to an implemented knowledge base. Decision Support Systems (DSS), Geographical Information Systems (GIS) and Operations Research/Management Science (OR/MS) modelling processes support decision making, and therefore they also produce new knowledge. Decision Support Systems are interactive computer-based systems that help decision makers complete the decision process. Geographic Information Systems provide essential marketing and customer intelligence solutions that lead to better

  1. Home, Hearth and Computing.

    Science.gov (United States)

    Seelig, Anita

    1982-01-01

    Advantages of having children use microcomputers at school and home include learning about sophisticated concepts early in life without a great deal of prodding, playing games that expand knowledge, and becoming literate in computer knowledge needed later in life. Includes comments from parents on their experiences with microcomputers and…

  2. Knowledge Translation Capacity of Arts-informed Dissemination: A Narrative Study

    OpenAIRE

    Jennifer L Lapum; Linda Liu; Kathryn Church; Sarah Hume; Bailey Harding; Siyuan Wang; Megan Nguyen; Gideon Cohen; Terrence M Yau

    2016-01-01

    Background: Arts-informed dissemination is an expanding approach to enhancing knowledge translation in the health sciences. Problematic are the minimal evaluation studies and the rare reporting of the factors influencing knowledge translation. “The 7,024th Patient” is a research-derived art installation created to disseminate findings about patients’ experiences of heart surgery and the importance of humanistic patient-centred care approaches. The current study’s purpose was to explor...

  3. Thinking about computational thinking

    NARCIS (Netherlands)

    Lu, J.J.; Fletcher, G.H.L.; Fitzgerald, S.; Guzdial, M.; Lewandowski, G.; Wolfman, S.A.

    2009-01-01

    Jeannette Wing's call for teaching Computational Thinking (CT) as a formative skill on par with reading, writing, and arithmetic places computer science in the category of basic knowledge. Just as proficiency in basic language arts helps us to effectively communicate and in basic math helps us to

  4. Motivating Contributions for Home Computer Security

    Science.gov (United States)

    Wash, Richard L.

    2009-01-01

    Recently, malicious computer users have been compromising computers en masse and combining them to form coordinated botnets. The rise of botnets has brought the problem of home computers to the forefront of security. Home computer users commonly have insecure systems; these users do not have the knowledge, experience, and skills necessary to…

  5. Hexavalent Chromium Minimization Strategy

    Science.gov (United States)

    2011-05-01

    Report documentation page for the DoD Hexavalent Chromium Minimization Strategy (Office of the Secretary of Defense, May 2011), covering the DoD Logistics initiative to minimize hexavalent chromium, Cr(VI), a cancer hazard, including the adoption of non-chrome primers.

  6. The limits of quantum computers

    International Nuclear Information System (INIS)

    Aaronson, S.

    2008-01-01

    Future computers, which work with quantum bits, would indeed solve some special problems extremely fast, but for most problems they would hardly be superior to contemporary computers. This knowledge could manifest a new fundamental physical principle

  7. A novel particle swarm optimization algorithm for permutation flow-shop scheduling to minimize makespan

    International Nuclear Information System (INIS)

    Lian Zhigang; Gu Xingsheng; Jiao Bin

    2008-01-01

    It is well known that the flow-shop scheduling problem (FSSP), a branch of production scheduling, is NP-hard. Many different approaches have been applied to permutation flow-shop scheduling to minimize makespan, but current algorithms cannot guarantee optimality even for moderate-size problems. Several papers report PSO applied to continuous optimization problems, but papers applying PSO to discrete scheduling problems are few. In this paper, according to the discrete characteristics of the FSSP, a novel particle swarm optimization (NPSO) algorithm is presented and successfully applied to permutation flow-shop scheduling to minimize makespan. Computational experiments on seven representative instances (Taillard), based on practical data, were carried out; comparing the NPSO with a standard GA, we find that the NPSO is clearly more effective than the standard GA for the FSSP when minimizing makespan
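
    As an illustration of the objective such algorithms search over (a minimal sketch, not the paper's NPSO; the processing-time matrix is hypothetical), the makespan of a job permutation follows from the standard flow-shop recurrence:

      def makespan(permutation, times):
          """Completion time of the last job on the last machine.

          times[j][m] is the processing time of job j on machine m.
          """
          n_machines = len(times[0])
          completion = [0.0] * n_machines  # running completion time per machine
          for job in permutation:
              for m in range(n_machines):
                  previous = completion[m - 1] if m > 0 else 0.0
                  completion[m] = max(completion[m], previous) + times[job][m]
          return completion[-1]

      # A swarm or GA would evaluate this for each candidate permutation.
      times = [[3, 2], [1, 4], [2, 2]]
      print(makespan([1, 2, 0], times))  # 9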

  8. A Hybrid ACO Approach to the Matrix Bandwidth Minimization Problem

    Science.gov (United States)

    Pintea, Camelia-M.; Crişan, Gloria-Cerasela; Chira, Camelia

    The evolution of human society poses more and more difficult problems, and for some real-life problems the restriction on computing time adds to their complexity. The Matrix Bandwidth Minimization Problem (MBMP) seeks a simultaneous permutation of the rows and the columns of a square matrix in order to keep its nonzero entries close to the main diagonal. The MBMP is a highly investigated NP-complete problem, as it has broad applications in industry, logistics, artificial intelligence and information recovery. This paper describes a new attempt to use the Ant Colony Optimization framework in tackling the MBMP. The introduced model is based on the hybridization of the Ant Colony System technique with new local search mechanisms. Computational experiments confirm a good performance of the proposed algorithm on the considered set of MBMP instances.
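
    The quantity being minimized is easy to state in code. A minimal sketch (matrix and permutations are illustrative; perm[i] gives the new position of row/column i):

      def bandwidth(matrix, perm):
          """Largest |i - j| over nonzero entries after permuting rows and columns."""
          width = 0
          for i, row in enumerate(matrix):
              for j, value in enumerate(row):
                  if value != 0:
                      width = max(width, abs(perm[i] - perm[j]))
          return width

      m = [[1, 0, 1],
           [0, 1, 0],
           [1, 0, 1]]
      print(bandwidth(m, [0, 1, 2]))  # 2: nonzeros far from the diagonal
      print(bandwidth(m, [0, 2, 1]))  # 1: a better simultaneous permutation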

  9. Computational Complexity of Bosons in Linear Networks

    Science.gov (United States)

    2017-03-01

    ...is between one and two orders of magnitude more efficient than current heralded multiphoton sources based on spontaneous parametric downconversion... expected to perform tasks intractable for a classical computer, yet requiring minimal non-classical resources compared to full-scale quantum computers... implementations to date employed sources based on an inefficient process (spontaneous parametric downconversion) that only simulates heralded single...

  10. DC Control Effort Minimized for Magnetic-Bearing-Supported Shaft

    Science.gov (United States)

    Brown, Gerald V.

    2001-01-01

    A magnetic-bearing-supported shaft may have a number of concentricity and alignment problems. One of these involves the relationship of the position sensors, the centerline of the backup bearings, and the magnetic center of the magnetic bearings. For magnetic bearings with permanent magnet biasing, the average control current for a given control axis that is not bearing the shaft weight will be minimized if the shaft is centered, on average over a revolution, at the magnetic center of the bearings. That position may not yield zero sensor output or center the shaft in the backup bearing clearance. The desired shaft position that gives zero average current can be achieved if a simple additional term is added to the control law. Suppose that the instantaneous control currents from each bearing are available from measurements and can be input into the control computer. If each control current is integrated with a very small rate of accumulation and the result is added to the control output, the shaft will gradually move to a position where the control current averages to zero over many revolutions. This will occur regardless of any offsets of the position sensor inputs. At that position, the average control effort is minimized in comparison to other possible locations of the shaft. Nonlinearities of the magnetic bearing are minimized at that location as well.
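
    A minimal sketch of this idea, assuming a PD-type control law for one axis (all gains and signal names are hypothetical, not taken from the article):

      class BearingAxisController:
          """PD control of one magnetic-bearing axis plus a slow current trim."""

          def __init__(self, kp=5.0, kd=0.1, k_trim=1e-4):
              self.kp, self.kd, self.k_trim = kp, kd, k_trim
              self.trim = 0.0  # slowly accumulated integral of control current

          def update(self, position_error, velocity, measured_current, dt):
              # Integrating the measured control current with a very small gain
              # and adding it to the output drives the average current to zero
              # over many revolutions, regardless of sensor offsets.
              self.trim += self.k_trim * measured_current * dt
              return self.kp * position_error + self.kd * velocity + self.trim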

  11. Computational neurogenetic modeling

    CERN Document Server

    Benuskova, Lubica

    2010-01-01

    Computational Neurogenetic Modeling is a student text, introducing the scope and problems of a new scientific discipline - Computational Neurogenetic Modeling (CNGM). CNGM is concerned with the study and development of dynamic neuronal models for modeling brain functions with respect to genes and dynamic interactions between genes. These include neural network models and their integration with gene network models. This new area brings together knowledge from various scientific disciplines, such as computer and information science, neuroscience and cognitive science, genetics and molecular biol

  12. Mobile, Collaborative Situated Knowledge Creation for Urban Planning

    Directory of Open Access Journals (Sweden)

    Nelson Baloian

    2012-05-01

    Geo-collaboration is an emerging research area in computer science studying the way spatial, geographically referenced information and communication technologies can support collaborative activities. Scenarios in which information associated with its physical location is of paramount importance are often referred to as Situated Knowledge Creation scenarios. To date there are few computer systems supporting knowledge creation that explicitly incorporate physical context as part of the knowledge being managed in mobile face-to-face scenarios. This work presents a collaborative software application supporting visually geo-referenced knowledge creation in mobile working scenarios while the users interact face-to-face. The system manages data and information associated with specific physical locations for knowledge creation processes in the field, such as urban planning, identification of specific physical locations, and territorial management, using Tablet-PCs and GPS to geo-reference data and information. It presents a model for developing mobile applications supporting situated knowledge creation in the field, introducing the requirements for such an application and the functionalities it should have in order to fulfill them. The paper also presents the results of utility and usability evaluations.

  13. Complications of minimally invasive cosmetic procedures: Prevention and management

    Directory of Open Access Journals (Sweden)

    Lauren L Levy

    2012-01-01

    Over the past decade, facial rejuvenation procedures to circumvent traditional surgery have become increasingly popular. Office-based, minimally invasive procedures can promote a youthful appearance with minimal downtime and low risk of complications. Injectable botulinum toxin (BoNT), soft-tissue fillers, and chemical peels are among the most popular non-invasive rejuvenation procedures, and each has unique applications for improving facial aesthetics. Despite the simplicity and reliability of office-based procedures, complications can occur even with an astute and experienced injector. The goal of any procedure is to perform it properly and safely; thus, early recognition of complications when they do occur is paramount in preventing long-term sequelae. The most common complications from BoNT and soft-tissue filler injection are bruising, erythema and pain. With chemical peels, it is not uncommon to have erythema, irritation and burning. Fortunately, these side effects are normally transient and have simple remedies. More serious complications include muscle paralysis from BoNT, granuloma formation from soft-tissue filler placement and scarring from chemical peels. Thankfully, these complications are rare and can be avoided with excellent procedure technique, knowledge of facial anatomy, proper patient selection, and appropriate pre- and post-procedure skin care. This article reviews complications of office-based, minimally invasive procedures, with emphasis on prevention and management. Practitioners providing these treatments should be well versed in this subject matter in order to deliver the highest quality care.

  14. TELMA: Technology-enhanced learning environment for minimally invasive surgery.

    Science.gov (United States)

    Sánchez-González, Patricia; Burgos, Daniel; Oropesa, Ignacio; Romero, Vicente; Albacete, Antonio; Sánchez-Peralta, Luisa F; Noguera, José F; Sánchez-Margallo, Francisco M; Gómez, Enrique J

    2013-06-01

    Cognitive skills training for minimally invasive surgery has traditionally relied upon diverse tools, such as seminars or lectures. Web technologies for e-learning have been adopted to provide ubiquitous training and serve as structured repositories for the vast amount of laparoscopic video sources available. However, these technologies fail to offer such features as formative and summative evaluation, guided learning, or collaborative interaction between users. The "TELMA" environment is presented as a new technology-enhanced learning platform that increases the user's experience using a four-pillared architecture: (1) an authoring tool for the creation of didactic contents; (2) a learning content and knowledge management system that incorporates a modular and scalable system to capture, catalogue, search, and retrieve multimedia content; (3) an evaluation module that provides learning feedback to users; and (4) a professional network for collaborative learning between users. Face validation of the environment and the authoring tool are presented. Face validation of TELMA reveals the positive perception of surgeons regarding the implementation of TELMA and their willingness to use it as a cognitive skills training tool. Preliminary validation data also reflect the importance of providing an easy-to-use, functional authoring tool to create didactic content. The TELMA environment is currently installed and used at the Jesús Usón Minimally Invasive Surgery Centre and several other Spanish hospitals. Face validation results ascertain the acceptance and usefulness of this new minimally invasive surgery training environment. Copyright © 2013 Elsevier Inc. All rights reserved.

  15. Data identification for improving gene network inference using computational algebra.

    Science.gov (United States)

    Dimitrova, Elena; Stigler, Brandilyn

    2014-11-01

    Identification of models of gene regulatory networks is sensitive to the amount of data used as input. Considering the substantial costs in conducting experiments, it is of value to have an estimate of the amount of data required to infer the network structure. To minimize wasted resources, it is also beneficial to know which data are necessary to identify the network. Knowledge of the data and knowledge of the terms in polynomial models are often required a priori in model identification. In applications, it is unlikely that the structure of a polynomial model will be known, which may force data sets to be unnecessarily large in order to identify a model. Furthermore, none of the known results provides any strategy for constructing data sets to uniquely identify a model. We provide a specialization of an existing criterion for deciding when a set of data points identifies a minimal polynomial model when its monomial terms have been specified. Then, we relax the requirement of the knowledge of the monomials and present results for model identification given only the data. Finally, we present a method for constructing data sets that identify minimal polynomial models.
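
    For concreteness, a sketch of the step the paper's criterion presupposes: once the monomial terms are specified, identifying the model from data is a linear problem in the coefficients (the data and terms below are illustrative, not the paper's method for choosing them):

      import numpy as np

      def identify(points, values, monomials):
          """Solve for coefficients of a polynomial model with given terms.

          points: list of input tuples; monomials: one callable per term.
          """
          design = np.array([[term(*x) for term in monomials] for x in points])
          coeffs, *_ = np.linalg.lstsq(design, np.array(values), rcond=None)
          return coeffs

      # Example: identify f(x1, x2) = c0 + c1*x1*x2 from three data points.
      points = [(0, 0), (1, 1), (2, 1)]
      values = [1.0, 3.0, 5.0]
      terms = [lambda x1, x2: 1.0, lambda x1, x2: x1 * x2]
      print(identify(points, values, terms))  # [1. 2.]

    Whether a given set of points uniquely identifies such a model is exactly the kind of question the paper's criterion decides.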

  16. Factors influencing exemplary science teachers' levels of computer use

    Science.gov (United States)

    Hakverdi, Meral

    This study examines exemplary science teachers' use of technology in science instruction, factors influencing their level of computer use, their level of knowledge/skills in using specific computer applications for science instruction, their use of computer-related applications/tools during their instruction, and their students' use of computer applications/tools in or for their science class. After a relevant review of the literature certain variables were selected for analysis. These variables included personal self-efficacy in teaching with computers, outcome expectancy, pupil-control ideology, level of computer use, age, gender, teaching experience, personal computer use, professional computer use and science teachers' level of knowledge/skills in using specific computer applications for science instruction. The sample for this study includes middle and high school science teachers who received the Presidential Award for Excellence in Science Teaching Award (sponsored by the White House and the National Science Foundation) between the years 1997 and 2003 from all 50 states and U.S. territories. Award-winning science teachers were contacted about the survey via e-mail or letter with an enclosed return envelope. Of the 334 award-winning science teachers, usable responses were received from 92 science teachers, which made a response rate of 27.5%. Analysis of the survey responses indicated that exemplary science teachers have a variety of knowledge/skills in using computer related applications/tools. The most commonly used computer applications/tools are information retrieval via the Internet, presentation tools, online communication, digital cameras, and data collection probes. Results of the study revealed that students' use of technology in their science classroom is highly correlated with the frequency of their science teachers' use of computer applications/tools. The results of the multiple regression analysis revealed that personal self-efficacy related to

  17. Minimal and non-minimal standard models: Universality of radiative corrections

    International Nuclear Information System (INIS)

    Passarino, G.

    1991-01-01

    The possibility of describing electroweak processes by means of models with a non-minimal Higgs sector is analyzed. The renormalization procedure which leads to a set of fitting equations for the bare parameters of the lagrangian is first reviewed for the minimal standard model. A solution of the fitting equations is obtained, which correctly includes large higher-order corrections. Predictions for physical observables, notably the W-boson mass and the Z⁰ partial widths, are discussed in detail. Finally the extension to non-minimal models is described under the assumption that new physics will appear only inside the vector boson self-energies, and the concept of universality of radiative corrections is introduced, showing that to a large extent they are insensitive to the details of the enlarged Higgs sector. Consequences for the bounds on the top quark mass are also discussed. (orig.)

  18. From computer to brain foundations of computational neuroscience

    CERN Document Server

    Lytton, William W

    2002-01-01

    Biology undergraduates, medical students and life-science graduate students often have limited mathematical skills. Similarly, physics, math and engineering students have little patience for the detailed facts that make up much of biological knowledge. Teaching computational neuroscience as an integrated discipline requires that both groups be brought forward onto common ground. This book does this by making ancillary material available in an appendix and providing basic explanations without becoming bogged down in unnecessary details. The book will be suitable for undergraduates and beginning graduate students taking a computational neuroscience course and also to anyone with an interest in the uses of the computer in modeling the nervous system.

  19. Computational neuroanatomy: ontology-based representation of neural components and connectivity.

    Science.gov (United States)

    Rubin, Daniel L; Talos, Ion-Florin; Halle, Michael; Musen, Mark A; Kikinis, Ron

    2009-02-05

    A critical challenge in neuroscience is organizing, managing, and accessing the explosion in neuroscientific knowledge, particularly anatomic knowledge. We believe that explicit knowledge-based approaches to make neuroscientific knowledge computationally accessible will be helpful in tackling this challenge and will enable a variety of applications exploiting this knowledge, such as surgical planning. We developed ontology-based models of neuroanatomy to enable symbolic lookup, logical inference and mathematical modeling of neural systems. We built a prototype model of the motor system that integrates descriptive anatomic and qualitative functional neuroanatomical knowledge. In addition to modeling normal neuroanatomy, our approach provides an explicit representation of abnormal neural connectivity in disease states, such as common movement disorders. The ontology-based representation encodes both structural and functional aspects of neuroanatomy. The ontology-based models can be evaluated computationally, enabling development of automated computer reasoning applications. Neuroanatomical knowledge can be represented in machine-accessible format using ontologies. Computational neuroanatomical approaches such as described in this work could become a key tool in translational informatics, leading to decision support applications that inform and guide surgical planning and personalized care for neurological disease in the future.

  20. Current trends on knowledge-based systems

    CERN Document Server

    Valencia-García, Rafael

    2017-01-01

    This book presents innovative and high-quality research on the implementation of conceptual frameworks, strategies, techniques, methodologies, informatics platforms and models for developing advanced knowledge-based systems and their application in different fields, including Agriculture, Education, Automotive, Electrical Industry, Business Services, Food Manufacturing, Energy Services, Medicine and others. Knowledge-based technologies employ artificial intelligence methods to heuristically address problems that cannot be solved by means of formal techniques. These technologies draw on standard and novel approaches from various disciplines within Computer Science, including Knowledge Engineering, Natural Language Processing, Decision Support Systems, Artificial Intelligence, Databases, Software Engineering, etc. As a combination of different fields of Artificial Intelligence, the area of Knowledge-Based Systems applies knowledge representation, case-based reasoning, neural networks, Semantic Web and TICs used...

  1. The representation of knowledge within model-based control systems

    International Nuclear Information System (INIS)

    Weygand, D.P.; Koul, R.

    1987-01-01

    The ability to represent knowledge is often considered essential to building systems with reasoning capabilities. In computer science, a good solution often depends on a good representation. The first step in the development of most computer applications is the selection of a representation for the input, output, and intermediate results that the program will operate upon. For applications in artificial intelligence, this initial choice of representation is especially important, because the possible representational paradigms are diverse and the forcing criteria for the choice are usually not clear in the beginning. Yet the consequences of an inadequate choice can be devastating in the later stages of a project if it is discovered that critical information cannot be encoded within the chosen representational paradigm. Problems arise when designing representational systems to support any kind of knowledge-based system, that is, a computer system that uses knowledge to perform some task. The general case of knowledge-based systems can be thought of as reasoning agents applying knowledge to achieve goals. Artificial Intelligence (AI) research involves building computer systems to perform tasks of perception and reasoning, as well as storage and retrieval of data. The problem of automatically perceiving large patterns in data is a perceptual task that is beginning to be important for many expert systems applications. Most AI research assumes that what needs to be represented is known a priori; an AI researcher's job is just figuring out how to encode the information in the system's data structures and procedures. 10 refs

  2. NP-hardness of the cluster minimization problem revisited

    Science.gov (United States)

    Adib, Artur B.

    2005-10-01

    The computational complexity of the 'cluster minimization problem' is revisited (Wille and Vennik 1985 J. Phys. A: Math. Gen. 18 L419). It is argued that the original NP-hardness proof does not apply to pairwise potentials of physical interest, such as those that depend on the geometric distance between the particles. A geometric analogue of the original problem is formulated, and a new proof for such potentials is provided by polynomial time transformation from the independent set problem for unit disk graphs. Limitations of this formulation are pointed out, and new subproblems that bear more direct consequences to the numerical study of clusters are suggested.

  3. NP-hardness of the cluster minimization problem revisited

    International Nuclear Information System (INIS)

    Adib, Artur B

    2005-01-01

    The computational complexity of the 'cluster minimization problem' is revisited (Wille and Vennik 1985 J. Phys. A: Math. Gen. 18 L419). It is argued that the original NP-hardness proof does not apply to pairwise potentials of physical interest, such as those that depend on the geometric distance between the particles. A geometric analogue of the original problem is formulated, and a new proof for such potentials is provided by polynomial time transformation from the independent set problem for unit disk graphs. Limitations of this formulation are pointed out, and new subproblems that bear more direct consequences to the numerical study of clusters are suggested

  4. NP-hardness of the cluster minimization problem revisited

    Energy Technology Data Exchange (ETDEWEB)

    Adib, Artur B [Physics Department, Brown University, Providence, RI 02912 (United States)

    2005-10-07

    The computational complexity of the 'cluster minimization problem' is revisited (Wille and Vennik 1985 J. Phys. A: Math. Gen. 18 L419). It is argued that the original NP-hardness proof does not apply to pairwise potentials of physical interest, such as those that depend on the geometric distance between the particles. A geometric analogue of the original problem is formulated, and a new proof for such potentials is provided by polynomial time transformation from the independent set problem for unit disk graphs. Limitations of this formulation are pointed out, and new subproblems that bear more direct consequences to the numerical study of clusters are suggested.
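
    The objective in these records is simple to write down; its global minimization is what is hard. A sketch using the Lennard-Jones potential, a standard example of a pairwise potential that depends only on the geometric distance between particles (the coordinates are illustrative):

      import itertools, math

      def lennard_jones(r):
          """Pairwise potential depending only on the distance r."""
          return 4.0 * (r ** -12 - r ** -6)

      def cluster_energy(positions):
          """Total energy: sum of the pair potential over all particle pairs."""
          return sum(
              lennard_jones(math.dist(p, q))
              for p, q in itertools.combinations(positions, 2)
          )

      # Finding coordinates that globally minimize this sum is the
      # cluster minimization problem discussed above.
      print(cluster_energy([(0, 0, 0), (1.12, 0, 0), (0.56, 0.97, 0)]))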

  5. Memetic Algorithms, Domain Knowledge, and Financial Investing

    Science.gov (United States)

    Du, Jie

    2012-01-01

    While the question of how to use human knowledge to guide evolutionary search is long-recognized, much remains to be done to answer this question adequately. This dissertation aims to further answer this question by exploring the role of domain knowledge in evolutionary computation as applied to real-world, complex problems, such as financial…

  6. Political significance of knowledge in Southeast Europe.

    Science.gov (United States)

    Slaus, Ivo

    2003-02-01

    The processes of globalization and transition are inevitable, full of dangers and threats, but offer enormous opportunities. Surveys of public opinion show that citizens are not aware of the fact that their countries are governed by the will of the people and a large majority considers that their country and the world are not going in the right direction. Presently, knowledge is becoming a dominant political power. This article outlines a strategy for building a knowledge-based society to minimize dangers, avoid threats, and take advantage of most of the opportunities, bringing a concrete action plan for Croatia, applicable to countries with similar history and socioeconomic structure.

  7. Randomization in clinical trials: stratification or minimization? The HERMES free simulation software.

    Science.gov (United States)

    Fron Chabouis, Hélène; Chabouis, Francis; Gillaizeau, Florence; Durieux, Pierre; Chatellier, Gilles; Ruse, N Dorin; Attal, Jean-Pierre

    2014-01-01

    Operative clinical trials are often small and open-label. Randomization is therefore very important. Stratification and minimization are two randomization options in such trials. The first aim of this study was to compare stratification and minimization in terms of predictability and balance in order to help investigators choose the most appropriate allocation method. Our second aim was to evaluate the influence of various parameters on the performance of these techniques. The created software generated patients according to chosen trial parameters (e.g., number of important prognostic factors, number of operators or centers, etc.) and computed predictability and balance indicators for several stratification and minimization methods over a given number of simulations. Block size and proportion of random allocations could be chosen. A reference trial was chosen (50 patients, 1 prognostic factor, and 2 operators) and eight other trials derived from this reference trial were modeled. Predictability and balance indicators were calculated from 10,000 simulations per trial. Minimization performed better with complex trials (e.g., smaller sample size, increasing number of prognostic factors, and operators); stratification imbalance increased when the number of strata increased. An inverse correlation between imbalance and predictability was observed. A compromise between predictability and imbalance still has to be found by the investigator but our software (HERMES) gives concrete reasons for choosing between stratification and minimization; it can be downloaded free of charge. This software will help investigators choose the appropriate randomization method in future two-arm trials.
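
    A minimal sketch of the minimization idea (in the spirit of Pocock and Simon; HERMES itself may implement different variants): each new patient goes to the arm with the smaller accumulated imbalance over that patient's prognostic factor levels, with ties broken at random.

      import random
      from collections import defaultdict

      ARMS = ("A", "B")
      counts = defaultdict(int)  # (arm, factor, level) -> patients so far

      def allocate(patient_factors):
          """patient_factors: dict such as {'sex': 'F', 'operator': 2}."""
          def imbalance(arm):
              return sum(counts[(arm, f, lvl)] for f, lvl in patient_factors.items())

          scores = {arm: imbalance(arm) for arm in ARMS}
          best = min(scores.values())
          arm = random.choice([a for a in ARMS if scores[a] == best])
          for f, lvl in patient_factors.items():
              counts[(arm, f, lvl)] += 1
          return arm

      print(allocate({"sex": "F", "operator": 1}))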

  8. Cloud computing can simplify HIT infrastructure management.

    Science.gov (United States)

    Glaser, John

    2011-08-01

    Software as a Service (SaaS), built on cloud computing technology, is emerging as the forerunner in IT infrastructure because it helps healthcare providers reduce capital investments. Cloud computing leads to predictable, monthly, fixed operating expenses for hospital IT staff. Outsourced cloud computing facilities are state-of-the-art data centers boasting some of the most sophisticated networking equipment on the market. The SaaS model helps hospitals safeguard against technology obsolescence, minimizes maintenance requirements, and simplifies management.

  9. Gravity-assisted exact unification in minimal supersymmetric SU(5) and its gaugino mass spectrum

    International Nuclear Information System (INIS)

    Tobe, Kazuhiro; Wells, James D.

    2004-01-01

    Minimal supersymmetric SU(5) with exact unification is naively inconsistent with proton decay constraints. However, it can be made viable by a gravity-induced non-renormalizable operator connecting the adjoint Higgs boson and adjoint vector boson representations. We compute the allowed coupling space for this theory and find natural compatibility with proton decay constraints even for relatively light superpartner masses. The modifications away from the naive SU(5) theory have an impact on the gaugino mass spectrum, which we calculate. A combination of precision linear collider and large hadron collider measurements of superpartner masses would enable interesting tests of the high-scale form of minimal supersymmetric SU(5)

  10. A software system for oilfield facility investment minimization

    International Nuclear Information System (INIS)

    Ding, Z.X.; Startzman, R.A.

    1996-01-01

    Minimizing investment in oilfield development is an important subject that has attracted a considerable amount of industry attention. One method to reduce investment involves the optimal placement and selection of production facilities. Because of the large amount of capital used in this process, saving even a small percentage of the total investment may represent a large monetary value. The literature reports algorithms using mathematical programming techniques that were designed to solve the proposed problem in a globally optimal manner. Owing to the high computational complexity and the lack of user-friendly interfaces for data entry and results display, mathematical programming techniques have not been given enough attention in practice. This paper describes an interactive, graphical software system that provides a globally optimal solution to the problem of placement and selection of production facilities in oilfield development processes. This software system can be used as an investment minimization tool and a scenario-study simulator. The developed software system consists of five basic modules: (1) an interactive data-input unit, (2) a cost function generator, (3) an optimization unit, (4) a graphic-output display, and (5) a sensitivity-analysis unit

  11. Computational problems in engineering

    CERN Document Server

    Mladenov, Valeri

    2014-01-01

    This book provides readers with modern computational techniques for solving variety of problems from electrical, mechanical, civil and chemical engineering. Mathematical methods are presented in a unified manner, so they can be applied consistently to problems in applied electromagnetics, strength of materials, fluid mechanics, heat and mass transfer, environmental engineering, biomedical engineering, signal processing, automatic control and more.   • Features contributions from distinguished researchers on significant aspects of current numerical methods and computational mathematics; • Presents actual results and innovative methods that provide numerical solutions, while minimizing computing times; • Includes new and advanced methods and modern variations of known techniques that can solve difficult scientific problems efficiently.  

  12. Minimal Gromov-Witten rings

    International Nuclear Information System (INIS)

    Przyjalkowski, V V

    2008-01-01

    We construct an abstract theory of Gromov-Witten invariants of genus 0 for quantum minimal Fano varieties (a minimal class of varieties which is natural from the quantum cohomological viewpoint). Namely, we consider the minimal Gromov-Witten ring: a commutative algebra whose generators and relations are of the form used in the Gromov-Witten theory of Fano varieties (of unspecified dimension). The Gromov-Witten theory of any quantum minimal variety is a homomorphism from this ring to C. We prove an abstract reconstruction theorem which says that this ring is isomorphic to the free commutative ring generated by 'prime two-pointed invariants'. We also find solutions of the differential equation of type DN for a Fano variety of dimension N in terms of the generating series of one-pointed Gromov-Witten invariants

  13. Minimal Liouville gravity on the torus via the Douglas string equation

    International Nuclear Information System (INIS)

    Spodyneiko, Lev

    2015-01-01

    In this paper we assume that the partition function in minimal Liouville gravity (MLG) obeys the Douglas string equation. This conjecture makes it possible to compute the torus correlation numbers in (3,p) MLG. We perform this calculation using also the resonance relations between the coupling constants in the KdV frame and in the Liouville frame. We obtain explicit expressions for the torus partition function and for the one- and two-point correlation numbers. (paper)

  14. Minimal time spiking in various ChR2-controlled neuron models.

    Science.gov (United States)

    Renault, Vincent; Thieullen, Michèle; Trélat, Emmanuel

    2018-02-01

    We use conductance-based neuron models and the mathematical modeling of optogenetics to define controlled neuron models, and we address the minimal-time control of these affine systems for the first spike from equilibrium. We apply tools of geometric optimal control theory to study singular extremals, and we implement a direct method to compute optimal controls. When the system is too large to theoretically investigate the existence of singular optimal controls, we observe numerically the optimal bang-bang controls.

  15. Foundations of Intelligent Systems : Proceedings of the Sixth International Conference on Intelligent Systems and Knowledge Engineering

    CERN Document Server

    Li, Tianrui

    2012-01-01

    Proceedings of The Sixth International Conference on Intelligent Systems and Knowledge Engineering presents selected papers from the conference ISKE 2011, held December 15-17 in Shanghai, China. These proceedings not only examine original research and approaches in the broad areas of intelligent systems and knowledge engineering, but also present new methodologies and practices in intelligent computing paradigms. The book introduces the current scientific and technical advances in the fields of artificial intelligence, machine learning, pattern recognition, data mining, information retrieval, knowledge-based systems, knowledge representation and reasoning, multi-agent systems, natural-language processing, etc. Furthermore, new computing methodologies are presented, including cloud computing, service computing and pervasive computing combined with traditional intelligent methods. The proceedings will be beneficial for both researchers and practitioners who want to utilize intelligent methods in their specific resea...

  16. Optimizing Ship Speed to Minimize Total Fuel Consumption with Multiple Time Windows

    Directory of Open Access Journals (Sweden)

    Jae-Gon Kim

    2016-01-01

    We study the ship speed optimization problem with the objective of minimizing the total fuel consumption. We consider multiple time windows for each port call as constraints and formulate the problem as a nonlinear mixed integer program. We derive intrinsic properties of the problem and develop an exact algorithm based on the properties. Computational experiments show that the suggested algorithm is very efficient in finding an optimal solution.
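
    A sketch of the trade-off at the heart of such models, under the common assumption (not necessarily the paper's exact formulation) that fuel consumption per unit time grows roughly with the cube of speed, so fuel per leg scales with the square of speed; arrival times must land inside the windows:

      K_FUEL = 0.004  # hypothetical fuel coefficient

      def total_fuel(legs, speeds, windows, start=0.0):
          """legs: leg distances; windows: (earliest, latest) arrival per port."""
          t, fuel = start, 0.0
          for dist, v, (earliest, latest) in zip(legs, speeds, windows):
              t += dist / v  # sailing faster saves time ...
              if not earliest <= t <= latest:
                  return float("inf")  # infeasible arrival time
              fuel += K_FUEL * v ** 2 * dist  # ... but costs fuel quadratically
          return fuel

      # An optimizer would search over the speed vector subject to the windows.
      print(total_fuel([200.0, 350.0], [14.0, 16.0], [(0, 48), (0, 48)]))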

  17. Innovations in knowledge management the impact of social media, semantic web and cloud computing

    CERN Document Server

    Phillips-Wren, Gloria; Jain, Lakhmi

    2016-01-01

    This book discusses emerging trends in the field of managing knowledge work due to technological innovations. The book is organized in 3 sections. The first section, entitled "Managing Knowledge, Projects and Networks", discusses knowledge processes and their use, reuse or generation in the context of an organization. The second section, entitled "Managing Knowledge using Social Media", focuses on factors influencing adoption and usage, the role of social media in managing knowledge, and factors that influence employees' acceptance and participation. The third section brings into discussion new approaches and technologies for acquiring knowledge. The book will be useful to both academics engaged in research in knowledge management and practitioners who are considering or implementing strategies for managing one of their most important resources.

  18. Improving Computational Efficiency of Prediction in Model-Based Prognostics Using the Unscented Transform

    Science.gov (United States)

    Daigle, Matthew John; Goebel, Kai Frank

    2010-01-01

    Model-based prognostics captures system knowledge in the form of physics-based models of components, and how they fail, in order to obtain accurate predictions of end of life (EOL). EOL is predicted based on the estimated current state distribution of a component and expected profiles of future usage. In general, this requires simulations of the component using the underlying models. In this paper, we develop a simulation-based prediction methodology that achieves computational efficiency by performing only the minimal number of simulations needed in order to accurately approximate the mean and variance of the complete EOL distribution. This is performed through the use of the unscented transform, which predicts the means and covariances of a distribution passed through a nonlinear transformation. In this case, the EOL simulation acts as that nonlinear transformation. In this paper, we review the unscented transform, and describe how this concept is applied to efficient EOL prediction. As a case study, we develop a physics-based model of a solenoid valve, and perform simulation experiments to demonstrate improved computational efficiency without sacrificing prediction accuracy.
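
    A generic sketch of the unscented transform itself (the paper wraps a full EOL simulation in place of the toy nonlinearity f below): 2n+1 sigma points are propagated and recombined, so only 2n+1 "simulations" are needed to approximate the output mean and covariance.

      import numpy as np

      def unscented_transform(f, mean, cov, kappa=2.0):
          """Approximate mean and covariance of f(x) for x ~ N(mean, cov)."""
          n = len(mean)
          root = np.linalg.cholesky((n + kappa) * cov)
          sigma = ([mean]
                   + [mean + root[:, i] for i in range(n)]
                   + [mean - root[:, i] for i in range(n)])
          weights = np.full(2 * n + 1, 0.5 / (n + kappa))
          weights[0] = kappa / (n + kappa)
          outputs = np.array([f(s) for s in sigma])  # the "simulations"
          y_mean = weights @ outputs
          deviations = outputs - y_mean
          y_cov = (weights[:, None] * deviations).T @ deviations
          return y_mean, y_cov

      # Example: for x ~ N(1, 1), E[x^2] = 2 and Var(x^2) = 6 are recovered.
      m, c = unscented_transform(lambda x: x ** 2, np.array([1.0]), np.eye(1))
      print(m, c)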

  19. Efficient universal computing architectures for decoding neural activity.

    Directory of Open Access Journals (Sweden)

    Benjamin I Rapoport

    The ability to decode neural activity into meaningful control signals for prosthetic devices is critical to the development of clinically useful brain-machine interfaces (BMIs). Such systems require input from tens to hundreds of brain-implanted recording electrodes in order to deliver robust and accurate performance; in serving that primary function they should also minimize power dissipation in order to avoid damaging neural tissue; and they should transmit data wirelessly in order to minimize the risk of infection associated with chronic, transcutaneous implants. Electronic architectures for brain-machine interfaces must therefore minimize size and power consumption, while maximizing the ability to compress data to be transmitted over limited-bandwidth wireless channels. Here we present a system of extremely low computational complexity, designed for real-time decoding of neural signals, and suited for highly scalable implantable systems. Our programmable architecture is an explicit implementation of a universal computing machine emulating the dynamics of a network of integrate-and-fire neurons; it requires no arithmetic operations except for counting, and decodes neural signals using only computationally inexpensive logic operations. The simplicity of this architecture does not compromise its ability to compress raw neural data by factors greater than [Formula: see text]. We describe a set of decoding algorithms based on this computational architecture, one designed to operate within an implanted system, minimizing its power consumption and data transmission bandwidth, and a complementary set of algorithms for learning, programming the decoder, and postprocessing the decoded output, designed to operate in an external, non-implanted unit. The implementation of the implantable portion is estimated to require fewer than 5000 operations per second. A proof-of-concept, 32-channel field-programmable gate array (FPGA) implementation of this portion
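
    A toy sketch of the counting-only principle described above (threshold and wiring hypothetical, not the paper's design): each decoder unit merely counts incoming spikes and emits its symbol when a threshold is crossed, so no multiplication is ever needed.

      THRESHOLD = 8  # hypothetical firing threshold

      def decode(spike_events, n_units, synapses):
          """spike_events: iterable of input channel ids;
          synapses: dict mapping channel -> list of target unit ids."""
          counts = [0] * n_units
          decoded = []
          for channel in spike_events:
              for unit in synapses.get(channel, []):
                  counts[unit] += 1               # counting only
                  if counts[unit] >= THRESHOLD:   # integrate-and-fire
                      decoded.append(unit)
                      counts[unit] = 0            # reset after firing
          return decoded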

  20. Charge transfer interaction using quasiatomic minimal-basis orbitals in the effective fragment potential method

    International Nuclear Information System (INIS)

    Xu, Peng; Gordon, Mark S.

    2013-01-01

    The charge transfer (CT) interaction, the most time-consuming term in the general effective fragment potential method, is made much more computationally efficient. This is accomplished by the projection of the quasiatomic minimal-basis-set orbitals (QUAMBOs) as the atomic basis onto the self-consistent field virtual molecular orbital (MO) space to select a subspace of the full virtual space called the valence virtual space. The diagonalization of the Fock matrix in terms of QUAMBOs recovers the canonical occupied orbitals and, more importantly, gives rise to the valence virtual orbitals (VVOs). The CT energies obtained using VVOs are generally as accurate as those obtained with the full virtual space canonical MOs because the QUAMBOs span the valence part of the virtual space, which can generally be regarded as “chemically important.” The number of QUAMBOs is the same as the number of minimal-basis MOs of a molecule. Therefore, the number of VVOs is significantly smaller than the number of canonical virtual MOs, especially for large atomic basis sets. This leads to a dramatic decrease in the computational cost

  1. CLAss-Specific Subspace Kernel Representations and Adaptive Margin Slack Minimization for Large Scale Classification.

    Science.gov (United States)

    Yu, Yinan; Diamantaras, Konstantinos I; McKelvey, Tomas; Kung, Sun-Yuan

    2018-02-01

    In kernel-based classification models, given limited computational power and storage capacity, operations over the full kernel matrix become prohibitive. In this paper, we propose a new supervised learning framework using kernel models for sequential data processing. The framework is based on two components that both aim at enhancing the classification capability with a subset selection scheme. The first part is a subspace projection technique in the reproducing kernel Hilbert space using a CLAss-specific Subspace Kernel representation for kernel approximation. In the second part, we propose a novel structural risk minimization algorithm called adaptive margin slack minimization, which iteratively improves the classification accuracy through adaptive data selection. We motivate each part separately, and then integrate them into learning frameworks for large-scale data. We propose two such frameworks: memory-efficient sequential processing for sequential data processing, and parallelized sequential processing for distributed computing with sequential data acquisition. We test our methods on several benchmark data sets and compare them with state-of-the-art techniques to verify the validity of the proposed techniques.

  2. Teacher-Education Students' Views about Knowledge Building Theory and Practice

    Science.gov (United States)

    Hong, Huang-Yao; Chen, Fei-Ching; Chai, Ching Sing; Chan, Wen-Ching

    2011-01-01

    This study investigated the effects of engaging students to collectively learn and work with knowledge in a computer-supported collaborative learning environment called Knowledge Forum on their views about knowledge building theory and practice. Participants were 24 teacher-education students who took a required course titled "Integrating Theory…

  3. Minimal Marking: A Success Story

    Science.gov (United States)

    McNeilly, Anne

    2014-01-01

    The minimal-marking project conducted in Ryerson's School of Journalism throughout 2012 and early 2013 resulted in significantly higher grammar scores in two first-year classes of minimally marked university students when compared to two traditionally marked classes. The "minimal-marking" concept (Haswell, 1983), which requires…

  4. Computers as components principles of embedded computing system design

    CERN Document Server

    Wolf, Marilyn

    2012-01-01

    Computers as Components: Principles of Embedded Computing System Design, 3e, presents essential knowledge on embedded systems technology and techniques. Updated for today's embedded systems design methods, this edition features new examples including digital signal processing, multimedia, and cyber-physical systems. Author Marilyn Wolf covers the latest processors from Texas Instruments, ARM, and Microchip Technology plus software, operating systems, networks, consumer devices, and more. Like the previous editions, this textbook: Uses real processors to demonstrate both technology and tec

  5. An improved algorithm for finding all minimal paths in a network

    International Nuclear Information System (INIS)

    Bai, Guanghan; Tian, Zhigang; Zuo, Ming J.

    2016-01-01

    Minimal paths (MPs) play an important role in network reliability evaluation. In this paper, we report an efficient recursive algorithm for finding all MPs in two-terminal networks, which consist of a source node and a sink node. A linked path structure indexed by nodes is introduced, which accepts both directed and undirected form of networks. The distance between each node and the sink node is defined, and a simple recursive algorithm is presented for labeling the distance for each node. Based on the distance between each node and the sink node, additional conditions for backtracking are incorporated to reduce the number of search branches. With the newly introduced linked node structure, the distances between each node and the sink node, and the additional backtracking conditions, an improved backtracking algorithm for searching for all MPs is developed. In addition, the proposed algorithm can be adapted to search for all minimal paths for each source–sink pair in networks consisting of multiple source nodes and/or multiple sink nodes. Through computational experiments, it is demonstrated that the proposed algorithm is more efficient than existing algorithms when the network size is not too small. The proposed algorithm becomes more advantageous as the size of the network grows. - Highlights: • A linked path structure indexed by nodes is introduced to represent networks. • Additional conditions for backtracking are proposed based on the distance of each node. • An efficient algorithm is developed to find all MPs for two-terminal networks. • The computational efficiency of the algorithm for two-terminal networks is investigated. • The computational efficiency of the algorithm for multi-terminal networks is investigated.
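
    A simplified sketch of the approach (the paper's backtracking conditions are stronger than the reachability pruning used here; the network is assumed undirected and given as an adjacency dict):

      from collections import deque

      def distances_to_sink(adj, sink):
          """Breadth-first labelling of each node's hop distance to the sink."""
          dist = {sink: 0}
          queue = deque([sink])
          while queue:
              u = queue.popleft()
              for v in adj[u]:
                  if v not in dist:
                      dist[v] = dist[u] + 1
                      queue.append(v)
          return dist

      def all_minimal_paths(adj, source, sink):
          dist = distances_to_sink(adj, sink)
          paths, path = [], [source]

          def search(u):
              if u == sink:
                  paths.append(list(path))
                  return
              for v in adj[u]:
                  if v not in path and v in dist:  # prune dead-end branches
                      path.append(v)
                      search(v)
                      path.pop()

          search(source)
          return paths

      bridge = {1: [2, 3], 2: [1, 3, 4], 3: [1, 2, 4], 4: [2, 3]}
      print(all_minimal_paths(bridge, 1, 4))  # the four MPs of this network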

  6. Lightgrid-an agile distributed computing architecture for Geant4

    International Nuclear Information System (INIS)

    Young, Jason; Perry, John O.; Jevremovic, Tatjana

    2010-01-01

    A lightweight grid-based computing architecture has been developed to accelerate Geant4 computations on a variety of network architectures. This new software is called LightGrid. LightGrid has a variety of features designed to overcome current limitations of other grid-based computing platforms, particularly on smaller network architectures. By focusing on smaller, local grids, LightGrid is able to simplify the grid computing process with minimal changes to existing Geant4 code. LightGrid allows for integration between Geant4 and MySQL, which both increases flexibility in the grid and provides a faster, more reliable, and more portable method for accessing results than traditional data storage systems. This unique method of data acquisition allows for more fault-tolerant runs as well as instant results from simulations as they occur. The performance increases brought about by using LightGrid allow simulation times to be decreased linearly. LightGrid also allows for pseudo-parallelization with minimal Geant4 code changes.

  7. Source Authenticity in the UMLS – A Case Study of the Minimal Standard Terminology

    Science.gov (United States)

    Elhanan, Gai; Huang, Kuo-Chuan; Perl, Yehoshua

    2010-01-01

    As the UMLS integrates multiple source vocabularies, the integration process requires that certain adaptation be applied to the source. Our interest is in examining the relationship between the UMLS representation of a source vocabulary and the source vocabulary itself. We investigated the integration of the Minimal Standard Terminology (MST) into the UMLS in order to examine how close its UMLS representation is to the source MST. The MST was conceived as a “minimal” list of terms and structure intended for use within computer systems to facilitate standardized reporting of gastrointestinal endoscopic examinations. Although the MST has an overall schema and implied relationship structure, many of the UMLS integrated MST terms were found to be hierarchically orphaned, and with lateral relationships that do not closely adhere to the source MST. Thus, the MST representation within the UMLS significantly differs from that of the source MST. These representation discrepancies may affect the usability of the MST representation in the UMLS for knowledge acquisition. Furthermore, they pose a problem from the perspective of application developers. While these findings may not necessarily apply to other source terminologies, they highlight the conflict between preservation of authentic concept orientation and the UMLS overall desire to provide fully specified names for all source terms. PMID:20692366

  8. Design-for-Six-Sigma To Develop a Bioprocess Knowledge Management Framework.

    Science.gov (United States)

    Junker, Beth; Maheshwari, Gargi; Ranheim, Todd; Altaras, Nedim; Stankevicz, Michael; Harmon, Lori; Rios, Sandra; D'anjou, Marc

    2011-01-01

    Owing to the high costs associated with biopharmaceutical development, considerable pressure has developed for the biopharmaceutical industry to increase productivity by becoming more lean and flexible. The ability to reuse knowledge was identified as one key advantage to streamline productivity, efficiently use resources, and ultimately perform better than the competition. A knowledge management (KM) strategy was assembled for bioprocess-related information using the technique of Design-for-Six-Sigma (DFSS). This strategy supported quality-by-design and process validation efforts for pipeline as well as licensed products. The DFSS technique was selected because it was both streamlined and efficient. These characteristics permitted development of a KM strategy with minimized team leader and team member resources. DFSS also placed a high emphasis on the voice of the customer, information considered crucial to the selection of solutions most appropriate for the current knowledge-based challenges of the organization. The KM strategy developed was comprised of nine workstreams, constructed from related solution buckets which in turn were assembled from the individual solution tasks that were identified. Each workstream's detailed design was evaluated against published and established best practices, as well as the KM strategy project charter and design inputs. Gaps and risks were identified and mitigated as necessary to improve the robustness of the proposed strategy. Aggregated resources (specifically expense/capital funds and staff) and timing were estimated to obtain vital management sponsorship for implementation. Where possible, existing governance and divisional/corporate information technology efforts were leveraged to minimize the additional bioprocess resources required for implementation. Finally, leading and lagging indicator metrics were selected to track the success of pilots and eventual implementation. A knowledge management framework was assembled for

  9. Waste minimization assessment procedure

    International Nuclear Information System (INIS)

    Kellythorne, L.L.

    1993-01-01

    Perry Nuclear Power Plant began developing a waste minimization plan early in 1991. In March of 1991 the plan was documented, following a format similar to that described in the EPA Waste Minimization Opportunity Assessment Manual. Initial implementation involved obtaining management's commitment to support a waste minimization effort. The primary assessment goal was to identify all hazardous waste streams and to evaluate those streams for minimization opportunities. As implementation of the plan proceeded, non-hazardous waste streams routinely generated in large volumes were also evaluated for minimization opportunities. The next step included collection of process and facility data which would be useful in helping the facility accomplish its assessment goals. This paper describes the resources that were used and which were most valuable in identifying both the hazardous and non-hazardous waste streams that existed on site. For each material identified as a waste stream, additional information regarding the material's use, manufacturer, EPA hazardous waste number and DOT hazard class was also gathered. Waste streams were then evaluated for potential source reduction, recycling, re-use, re-sale, or burning for heat recovery, with disposal treated as the last viable alternative

  10. Westinghouse Hanford Company waste minimization actions

    International Nuclear Information System (INIS)

    Greenhalgh, W.O.

    1988-09-01

    Companies that generate hazardous waste materials are now required by national regulations to establish a waste minimization program. Accordingly, in FY88 the Westinghouse Hanford Company formed a waste minimization team organization. The purpose of the team is to assist the company in its efforts to minimize the generation of waste, train personnel on waste minimization techniques, document successful waste minimization effects, track dollar savings realized, and to publicize and administer an employee incentive program. A number of significant actions have been successful, resulting in the savings of materials and dollars. The team itself has been successful in establishing some worthwhile minimization projects. This document briefly describes the waste minimization actions that have been successful to date. 2 refs., 26 figs., 3 tabs

  11. Assessment of a computer-based Taenia solium health education tool ‘The Vicious Worm’ on knowledge uptake among professionals and their attitudes towards the program

    DEFF Research Database (Denmark)

    Ertel, Rebekka Lund; Braae, Uffe Christian; Ngowi, Helena Aminiel

    2017-01-01

    This study assessed the computer-based Taenia solium health education tool ‘The Vicious Worm’ in terms of knowledge uptake among professionals and investigated attitudes towards the program. The study was carried out between March and May 2014 in Mbeya Region, Tanzania, where T. solium is endemic. The study was a pre and post assessment of a health education tool, based on questionnaire surveys and focus group discussions to investigate knowledge and attitudes. A total of 79 study subjects participated in the study, including subjects from both the health and agriculture sectors. The health education consisted of 1½ hours of individual practice with the computer program. The baseline questionnaire showed an overall… and knowledge regarding specific aspects was significantly improved in most aspects immediately after and two weeks after the health education. The focus group discussions showed positive attitudes towards the program, and the study subjects found ‘The Vicious Worm’ efficient, simple, and appealing. The study…

  12. Computer assisted radiology

    International Nuclear Information System (INIS)

    Lemke, H.U.; Jaffe, C.C.; Felix, R.

    1993-01-01

    The proceedings of the CAR'93 symposium present the 126 oral papers and the 58 posters contributed to the four Technical Sessions entitled: (1) Image Management, (2) Medical Workstations, (3) Digital Image Generation - DIG, and (4) Application Systems - AS. Topics discussed in Session (1) are: picture archiving and communication systems, teleradiology, hospital information systems and radiological information systems, technology assessment and implications, standards, and data bases. Session (2) deals with computer vision, computer graphics, design and application, and man-computer interaction. Session (3) goes into the details of the diagnostic examination methods such as digital radiography, MRI, CT, nuclear medicine, ultrasound, digital angiography, and multimodality imaging. Session (4) is devoted to computer-assisted techniques, as there are: computer assisted radiological diagnosis, knowledge based systems, computer assisted radiation therapy and computer assisted surgical planning. (UWA). 266 figs

  13. Minimal but non-minimal inflation and electroweak symmetry breaking

    Energy Technology Data Exchange (ETDEWEB)

    Marzola, Luca [National Institute of Chemical Physics and Biophysics,Rävala 10, 10143 Tallinn (Estonia); Institute of Physics, University of Tartu,Ravila 14c, 50411 Tartu (Estonia); Racioppi, Antonio [National Institute of Chemical Physics and Biophysics,Rävala 10, 10143 Tallinn (Estonia)

    2016-10-07

    We consider the most minimal scale invariant extension of the standard model that allows for successful radiative electroweak symmetry breaking and inflation. The framework involves an extra scalar singlet, which plays the rôle of the inflaton, and is compatible with current experimental bounds owing to the non-minimal coupling of the latter to gravity. This inflationary scenario predicts a very low tensor-to-scalar ratio r ≈ 10⁻³, typical of Higgs-inflation models, but in contrast yields a scalar spectral index n_s ≃ 0.97 which departs from the Starobinsky limit. We briefly discuss the collider phenomenology of the framework.

  14. 7th International Conference on Knowledge Management in Organizations (KMO)

    CERN Document Server

    Herrera, Francisco; Pérez, Javier; Rodríguez, Juan; 7th International Conference on Knowledge Management in Organizations: Service and Cloud Computing

    2013-01-01

    The seventh International Conference on Knowledge Management in Organizations (KMO) brings together researchers and developers from industry and the academic world to report on the latest scientific and technical advances on knowledge management in organisations. KMO 2012 provides an international forum for authors to present and discuss research focused on the role of knowledge management for innovative services in industries, to shed light on recent advances in cloud computing for KM, as well as to identify future directions for researching the role of knowledge management in service innovation and how cloud computing can be used to address many of the issues currently facing KM in academia and industrial sectors. The conference took place in Salamanca, Spain, on 11-13 July 2012.

  15. Verification of product design using regulation knowledge base and Web services

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ik June [KAERI, Daejeon (Korea, Republic of); Lee, Jae Chul; Mun Du Hwan [Kyungpook National University, Daegu (Korea, Republic of); Kim, Byung Chul [Dong-A University, Busan (Korea, Republic of); Hwang, Jin Sang [PartDB Co., Ltd., Daejeon (Korea, Republic of); Lim, Chae Ho [Korea Institute of Industrial Technology, Incheon (Korea, Republic of)

    2015-11-15

    Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of product designs against the regulations related to a product is necessary. To this end, this study presents a new method for the verification of product design using a regulation knowledge base and Web services. The regulation knowledge base, consisting of product ontology and rules, was built with a hybrid technique combining ontology and programming languages. A Web service for design verification was developed, ensuring flexible extension of the knowledge base. By virtue of these two technical features, design verification can be provided for various products while changes to the system architecture are minimized.

  16. Verification of product design using regulation knowledge base and Web services

    International Nuclear Information System (INIS)

    Kim, Ik June; Lee, Jae Chul; Mun Du Hwan; Kim, Byung Chul; Hwang, Jin Sang; Lim, Chae Ho

    2015-01-01

    Since product regulations contain important rules or codes that manufacturers must follow, automatic verification of product designs against the regulations related to a product is necessary. To this end, this study presents a new method for the verification of product design using a regulation knowledge base and Web services. The regulation knowledge base, consisting of product ontology and rules, was built with a hybrid technique combining ontology and programming languages. A Web service for design verification was developed, ensuring flexible extension of the knowledge base. By virtue of these two technical features, design verification can be provided for various products while changes to the system architecture are minimized.
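
    The rule-plus-ontology checking described in this record can be illustrated with a toy sketch. The part attributes, rule names, and thresholds below are invented for illustration; they are not from the paper's regulation knowledge base.

    ```python
    # Hypothetical regulation rules over a toy product "ontology":
    # each rule is (name, predicate over a part record).
    rules = [
        ("lead-free",       lambda part: part["material"] != "lead"),
        ("min-edge-radius", lambda part: part["edge_radius_mm"] >= 0.5),
    ]

    def verify(design):
        """Return all (part, rule) violations, or note conformance."""
        violations = [(part["name"], rule_name)
                      for part in design
                      for rule_name, ok in rules
                      if not ok(part)]
        return violations or "design conforms"

    design = [{"name": "wheel", "material": "ABS",  "edge_radius_mm": 1.2},
              {"name": "axle",  "material": "lead", "edge_radius_mm": 0.3}]
    print(verify(design))  # -> [('axle', 'lead-free'), ('axle', 'min-edge-radius')]
    ```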

  17. Introduction to reversible computing

    CERN Document Server

    Perumalla, Kalyan S

    2013-01-01

    Few books comprehensively cover the software and programming aspects of reversible computing. Filling this gap, Introduction to Reversible Computing offers an expanded view of the field that includes the traditional energy-motivated hardware viewpoint as well as the emerging application-motivated software approach. Collecting scattered knowledge into one coherent account, the book provides a compendium of both classical and recently developed results on reversible computing. It explores up-and-coming theories, techniques, and tools for the application of rever

  18. Ruled Laguerre minimal surfaces

    KAUST Repository

    Skopenkov, Mikhail

    2011-10-30

    A Laguerre minimal surface is an immersed surface in ℝ³ being an extremal of the functional ∫(H²/K − 1) dA. In the present paper, we prove that the only ruled Laguerre minimal surfaces are, up to isometry, the surfaces R(φ, λ) = (Aφ, Bφ, Cφ + D cos 2φ) + λ(sin φ, cos φ, 0), where A, B, C, D ∈ ℝ are fixed. To achieve invariance under Laguerre transformations, we also derive all Laguerre minimal surfaces that are enveloped by a family of cones. The methodology is based on the isotropic model of Laguerre geometry. In this model a Laguerre minimal surface enveloped by a family of cones corresponds to the graph of a biharmonic function carrying a family of isotropic circles. We classify such functions by showing that the top view of the family of circles is a pencil. © 2011 Springer-Verlag.

  19. Clashing and Emerging Genres: The interplay of knowledge forms in educational gaming

    Directory of Open Access Journals (Sweden)

    Thorkhild Hanghøj

    2011-09-01

    Full Text Available Based upon a series of design interventions with the educational computer game series Global Conflicts at various secondary schools, this article explores how educational gaming can be understood as a complex interplay between four knowledge forms, i.e. students' everyday knowledge (non-specialised knowledge), the institutionalised knowledge forms of schooling, teachers' subject-specific knowledge (specialised knowledge forms), and game-specific knowledge forms such as professional journalism, which is one of the inspirations for the game scenario. Depending on how the GC series was enacted by different teachers and students, these knowledge forms were brought into play rather differently. More specifically, several students experienced genre clashes in relation to their expectations of what it means to play a computer game, whereas other students experienced emerging genres, e.g. when one student was able to transform the game experience into a journalistic article that challenged her classmates' understanding of journalistic writing.

  20. A Distributed Computational Infrastructure for Science and Education

    Directory of Open Access Journals (Sweden)

    Rustam K. Bazarov

    2014-06-01

    Full Text Available Researchers have lately been paying increasingly more attention to parallel and distributed algorithms for solving high-dimensionality problems. In this regard, the issue of acquiring or renting computational resources becomes topical for employees of scientific and educational institutions. This article examines technology and methods for organizing a distributed computational infrastructure. The author addresses the experience of creating a high-performance system powered by existing clustering and grid computing technology. The approach examined in the article helps minimize financial costs, aggregates territorially distributed computational resources, and ensures a more rational use of available computer equipment, eliminating downtime.

  1. Exploiting Genomic Knowledge in Optimising Molecular Breeding Programmes: Algorithms from Evolutionary Computing

    Science.gov (United States)

    O'Hagan, Steve; Knowles, Joshua; Kell, Douglas B.

    2012-01-01

    Comparatively few studies have addressed directly the question of quantifying the benefits to be had from using molecular genetic markers in experimental breeding programmes (e.g. for improved crops and livestock), nor the question of which organisms should be mated with each other to best effect. We argue that this requires in silico modelling, an approach for which there is a large literature in the field of evolutionary computation (EC), but which has not really been applied in this way to experimental breeding programmes. EC seeks to optimise measurable outcomes (phenotypic fitnesses) by optimising in silico the mutation, recombination and selection regimes that are used. We review some of the approaches from EC, and compare experimentally, using a biologically relevant in silico landscape, some algorithms that have knowledge of where they are in the (genotypic) search space (G-algorithms) with some (albeit well-tuned ones) that do not (F-algorithms). For the present kinds of landscapes, F- and G-algorithms were broadly comparable in quality and effectiveness, although we recognise that the G-algorithms were not equipped with any ‘prior knowledge’ of epistatic pathway interactions. This use of algorithms based on machine learning has important implications for the optimisation of experimental breeding programmes in the post-genomic era when we shall potentially have access to the full genome sequence of every organism in a breeding population. The non-proprietary code that we have used is made freely available (via Supplementary information). PMID:23185279
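
    The selection-recombination-mutation loop that evolutionary computation optimises in silico can be sketched as a plain genetic algorithm. Everything below is an illustrative assumption: the bit-string genome, the additive fitness standing in for phenotypic fitness, and all parameter values; the paper's tuned F- and G-algorithms and its biologically relevant landscape are not reproduced here.

    ```python
    import random

    GENOME_LEN, POP_SIZE, GENERATIONS = 32, 40, 60

    def fitness(genome):
        # Illustrative additive "phenotype": count of favourable alleles.
        return sum(genome)

    def select(pop):
        # Tournament selection: the fitter of two random individuals breeds.
        a, b = random.sample(pop, 2)
        return max(a, b, key=fitness)

    def recombine(mum, dad):
        # Single crossover point, mimicking one recombination event.
        cut = random.randrange(1, GENOME_LEN)
        return mum[:cut] + dad[cut:]

    def mutate(genome, rate=0.01):
        return [g ^ 1 if random.random() < rate else g for g in genome]

    pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        pop = [mutate(recombine(select(pop), select(pop))) for _ in range(POP_SIZE)]
    print("best fitness:", max(map(fitness, pop)))
    ```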

  2. Minimal variance hedging of natural gas derivatives in exponential Lévy models: Theory and empirical performance

    International Nuclear Information System (INIS)

    Ewald, Christian-Oliver; Nawar, Roy; Siu, Tak Kuen

    2013-01-01

    We consider the problem of hedging European options written on natural gas futures, in a market where prices of traded assets exhibit jumps, by trading in the underlying asset. We provide a general expression for the hedging strategy which minimizes the variance of the terminal hedging error, in terms of stochastic integral representations of the payoffs of the options involved. This formula is then applied to compute hedge ratios for common options in various models with jumps, leading to easily computable expressions. As a benchmark we take the standard Black–Scholes and Merton delta hedges. We show that in natural gas option markets minimal variance hedging with the underlying consistently outperforms the benchmarks by quite a margin. - Highlights: ► We derive hedging strategies for European-type options written on natural gas futures. ► These are tested empirically using Henry Hub natural gas futures and options data. ► We find that our hedges systematically outperform classical benchmarks
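
    The Black–Scholes delta used as a benchmark can be computed directly, as in the sketch below. The option parameters are illustrative only, and options on futures would usually use the closely related Black-76 variant; this is not the paper's minimal-variance hedge.

    ```python
    from math import log, sqrt
    from statistics import NormalDist

    def call_delta(S, K, r, sigma, T):
        """Black-Scholes delta of a European call: N(d1)."""
        d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
        return NormalDist().cdf(d1)

    # Illustrative numbers only: spot 3.0, strike 3.2, 6 months to expiry.
    print(call_delta(S=3.0, K=3.2, r=0.02, sigma=0.45, T=0.5))
    ```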

  3. Using minimal spanning trees to compare the reliability of network topologies

    Science.gov (United States)

    Leister, Karen J.; White, Allan L.; Hayhurst, Kelly J.

    1990-01-01

    Graph theoretic methods are applied to compute the reliability for several types of networks of moderate size. The graph theory methods used are minimal spanning trees for networks with bi-directional links and the related concept of strongly connected directed graphs for networks with uni-directional links. A comparison is conducted of ring networks and braided networks. The case is covered where just the links fail and the case where both links and nodes fail. Two different failure modes for the links are considered. For one failure mode, the link no longer carries messages. For the other failure mode, the link delivers incorrect messages. There is a description and comparison of link-redundancy versus path-redundancy as methods to achieve reliability. All the computations are carried out by means of a fault tree program.
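
    The minimal-spanning-tree notion at the heart of the comparison can be illustrated with Kruskal's algorithm. The toy ring-plus-braid network below is invented; the paper's actual reliability numbers come from a fault-tree program, not from this sketch.

    ```python
    def minimal_spanning_tree(n_nodes, edges):
        """Kruskal's algorithm; edges are (weight, u, v) tuples."""
        parent = list(range(n_nodes))

        def find(x):                      # union-find with path compression
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x

        tree = []
        for w, u, v in sorted(edges):
            ru, rv = find(u), find(v)
            if ru != rv:                  # edge joins two components: keep it
                parent[ru] = rv
                tree.append((u, v, w))
        return tree

    # A 4-node ring with one braid (chord) edge; weights are illustrative.
    ring = [(0.1, 0, 1), (0.1, 1, 2), (0.1, 2, 3), (0.1, 3, 0), (0.05, 0, 2)]
    print(minimal_spanning_tree(4, ring))
    ```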

  4. Advances in photonic reservoir computing

    Directory of Open Access Journals (Sweden)

    Van der Sande Guy

    2017-05-01

    Full Text Available We review a novel paradigm that has emerged in analogue neuromorphic optical computing. The goal is to implement a reservoir computer in optics, where information is encoded in the intensity and phase of the optical field. Reservoir computing is a bio-inspired approach especially suited for processing time-dependent information. The reservoir’s complex and high-dimensional transient response to the input signal is capable of universal computation. The reservoir does not need to be trained, which makes it very well suited for optics. As such, much of the promise of photonic reservoirs lies in their minimal hardware requirements, a tremendous advantage over other hardware-intensive neural network models. We review the two main approaches to optical reservoir computing: networks implemented with multiple discrete optical nodes and the continuous system of a single nonlinear device coupled to delayed feedback.

  5. Advances in photonic reservoir computing

    Science.gov (United States)

    Van der Sande, Guy; Brunner, Daniel; Soriano, Miguel C.

    2017-05-01

    We review a novel paradigm that has emerged in analogue neuromorphic optical computing. The goal is to implement a reservoir computer in optics, where information is encoded in the intensity and phase of the optical field. Reservoir computing is a bio-inspired approach especially suited for processing time-dependent information. The reservoir's complex and high-dimensional transient response to the input signal is capable of universal computation. The reservoir does not need to be trained, which makes it very well suited for optics. As such, much of the promise of photonic reservoirs lies in their minimal hardware requirements, a tremendous advantage over other hardware-intensive neural network models. We review the two main approaches to optical reservoir computing: networks implemented with multiple discrete optical nodes and the continuous system of a single nonlinear device coupled to delayed feedback.
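
    A software analogue of the optical reservoirs reviewed above is the echo state network: a fixed random recurrent layer whose high-dimensional transient response is tapped by a linear readout, the only part that is trained. The reservoir size, spectral-radius scaling, and one-step-recall task below are illustrative assumptions, not a model of any particular photonic system.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, T = 100, 500                              # reservoir size, sequence length

    # Fixed random reservoir: only the linear readout is ever trained.
    w_in = rng.uniform(-0.5, 0.5, N)
    W = rng.normal(0, 1, (N, N))
    W *= 0.9 / max(abs(np.linalg.eigvals(W)))    # scale spectral radius below 1

    u = rng.uniform(-1, 1, T)                    # input signal
    y = np.roll(u, 1)                            # task: recall the previous input
    x = np.zeros(N)
    states = np.empty((T, N))
    for t in range(T):
        x = np.tanh(W @ x + w_in * u[t])         # nonlinear transient response
        states[t] = x

    W_out, *_ = np.linalg.lstsq(states, y, rcond=None)  # train readout only
    print("train error:", np.mean((states @ W_out - y) ** 2))
    ```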

  6. Workshop on Computational Optimization

    CERN Document Server

    2015-01-01

    Our everyday life is unthinkable without optimization. We try to minimize our effort and to maximize the achieved profit. Many real-world and industrial problems arising in engineering, economics, medicine and other domains can be formulated as optimization tasks. This volume is a comprehensive collection of extended contributions from the Workshop on Computational Optimization 2013. It presents recent advances in computational optimization. The volume includes important real-life problems, such as parameter settings for controlling processes in a bioreactor, resource-constrained project scheduling, problems arising in transport services, error-correcting codes, and optimal system performance and energy consumption. It shows how to develop algorithms for them based on new metaheuristic methods such as evolutionary computation, ant colony optimization, constraint programming and others.

  7. Global Analysis of Minimal Surfaces

    CERN Document Server

    Dierkes, Ulrich; Tromba, Anthony J

    2010-01-01

    Many properties of minimal surfaces are of a global nature, and this is already true for the results treated in the first two volumes of the treatise. Part I of the present book can be viewed as an extension of these results. For instance, the first two chapters deal with existence, regularity and uniqueness theorems for minimal surfaces with partially free boundaries. Here one of the main features is the possibility of 'edge-crawling' along free parts of the boundary. The third chapter deals with a priori estimates for minimal surfaces in higher dimensions and for minimizers of singular integ

  8. Minimal Surfaces for Hitchin Representations

    DEFF Research Database (Denmark)

    Li, Qiongling; Dai, Song

    2018-01-01

    . In this paper, we investigate the properties of immersed minimal surfaces inside symmetric space associated to a subloci of Hitchin component: $q_n$ and $q_{n-1}$ case. First, we show that the pullback metric of the minimal surface dominates a constant multiple of the hyperbolic metric in the same conformal...... class and has a strong rigidity property. Secondly, we show that the immersed minimal surface is never tangential to any flat inside the symmetric space. As a direct corollary, the pullback metric of the minimal surface is always strictly negatively curved. In the end, we find a fully decoupled system...

  9. Practical Applications of Intelligent Systems : Proceedings of the Sixth International Conference on Intelligent Systems and Knowledge Engineering

    CERN Document Server

    Li, Tianrui

    2012-01-01

    Proceedings of The Sixth International Conference on Intelligent System and Knowledge Engineering presents selected papers from the conference ISKE 2011, held December 15-17 in Shanghai, China. These proceedings not only examine original research and approaches in the broad areas of intelligent systems and knowledge engineering, but also present new methodologies and practices in intelligent computing paradigms. The book introduces the current scientific and technical advances in the fields of artificial intelligence, machine learning, pattern recognition, data mining, information retrieval, knowledge-based systems, knowledge representation and reasoning, multi-agent systems, natural-language processing, etc. Furthermore, new computing methodologies are presented, including cloud computing, service computing and pervasive computing with traditional intelligent methods. The proceedings will be beneficial for both researchers and practitioners who want to utilize intelligent methods in their specific res...

  10. Fermilab Tevatron and CERN LEP II probes of minimal and string-motivated supergravity models

    International Nuclear Information System (INIS)

    Baer, H.; Gunion, J.F.; Kao, C.; Pois, H.

    1995-01-01

    We explore the ability of the Fermilab Tevatron to probe minimal supersymmetry with high-energy-scale boundary conditions motivated by supersymmetry breaking in the context of minimal and string-motivated supergravity theory. A number of boundary condition possibilities are considered: dilatonlike string boundary conditions applied at the standard GUT unification scale or alternatively at the string scale; and extreme (''no-scale'') minimal supergravity boundary conditions imposed at the GUT scale or string scale. For numerous specific cases within each scenario the sparticle spectra are computed and then fed into ISAJET 7.07 so that explicit signatures can be examined in detail. We find that, for some of the boundary condition choices, large regions of parameter space can be explored via same-sign dilepton and isolated trilepton signals. For other choices, the mass reach of Tevatron collider experiments is much more limited. We also compare the mass reach of Tevatron experiments with the corresponding reach at CERN LEP 200

  11. Minimal Webs in Riemannian Manifolds

    DEFF Research Database (Denmark)

    Markvorsen, Steen

    2008-01-01

    For a given combinatorial graph $G$ a {\\it geometrization} $(G, g)$ of the graph is obtained by considering each edge of the graph as a $1-$dimensional manifold with an associated metric $g$. In this paper we are concerned with {\\it minimal isometric immersions} of geometrized graphs $(G, g......)$ into Riemannian manifolds $(N^{n}, h)$. Such immersions we call {\\em{minimal webs}}. They admit a natural 'geometric' extension of the intrinsic combinatorial discrete Laplacian. The geometric Laplacian on minimal webs enjoys standard properties such as the maximum principle and the divergence theorems, which...... are of instrumental importance for the applications. We apply these properties to show that minimal webs in ambient Riemannian spaces share several analytic and geometric properties with their smooth (minimal submanifold) counterparts in such spaces. In particular we use appropriate versions of the divergence...

  12. Waste minimization handbook, Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Boing, L.E.; Coffey, M.J.

    1995-12-01

    This technical guide presents various methods used by industry to minimize low-level radioactive waste (LLW) generated during decommissioning and decontamination (D and D) activities. Such activities generate significant amounts of LLW during their operations. Waste minimization refers to any measure, procedure, or technique that reduces the amount of waste generated during a specific operation or project. Preventive waste minimization techniques implemented when a project is initiated can significantly reduce waste. Techniques implemented during decontamination activities reduce the cost of decommissioning. The application of waste minimization techniques is not limited to D and D activities; it is also useful during any phase of a facility's life cycle. This compendium will be supplemented with a second volume of abstracts of hundreds of papers related to minimizing low-level nuclear waste. This second volume is expected to be released in late 1996.

  13. Waste minimization handbook, Volume 1

    International Nuclear Information System (INIS)

    Boing, L.E.; Coffey, M.J.

    1995-12-01

    This technical guide presents various methods used by industry to minimize low-level radioactive waste (LLW) generated during decommissioning and decontamination (D and D) activities. Such activities generate significant amounts of LLW during their operations. Waste minimization refers to any measure, procedure, or technique that reduces the amount of waste generated during a specific operation or project. Preventive waste minimization techniques implemented when a project is initiated can significantly reduce waste. Techniques implemented during decontamination activities reduce the cost of decommissioning. The application of waste minimization techniques is not limited to D and D activities; it is also useful during any phase of a facility's life cycle. This compendium will be supplemented with a second volume of abstracts of hundreds of papers related to minimizing low-level nuclear waste. This second volume is expected to be released in late 1996

  14. IPython interactive computing and visualization cookbook

    CERN Document Server

    Rossant, Cyrille

    2014-01-01

    Intended for anyone interested in numerical computing and data science: students, researchers, teachers, engineers, analysts, hobbyists... Basic knowledge of Python/NumPy is recommended. Some skills in mathematics will help you understand the theory behind the computational methods.

  15. A new paradigm of knowledge management

    African Journals Online (AJOL)

    kirstam

    (R&D) flow of innovations, or the flow of knowledge needed to develop new ...... emergent effects that are not inherently related to the behaviour of individuals within ..... Genetic algorithms (GAs) are computational models of the evolution.

  16. Extending Landauer's bound from bit erasure to arbitrary computation

    Science.gov (United States)

    Wolpert, David

    The minimal thermodynamic work required to erase a bit, known as Landauer's bound, has been extensively investigated both theoretically and experimentally. However, when viewed as a computation that maps inputs to outputs, bit erasure has a very special property: the output does not depend on the input. Existing analyses of thermodynamics of bit erasure implicitly exploit this property, and thus cannot be directly extended to analyze the computation of arbitrary input-output maps. Here we show how to extend these earlier analyses of bit erasure to analyze the thermodynamics of arbitrary computations. Doing this establishes a formal connection between the thermodynamics of computers and much of theoretical computer science. We use this extension to analyze the thermodynamics of the canonical ``general purpose computer'' considered in computer science theory: a universal Turing machine (UTM). We consider a UTM which maps input programs to output strings, where inputs are drawn from an ensemble of random binary sequences, and prove: i) The minimal work needed by a UTM to run some particular input program X and produce output Y is the Kolmogorov complexity of Y minus the log of the ``algorithmic probability'' of Y. This minimal amount of thermodynamic work has a finite upper bound, which is independent of the output Y, depending only on the details of the UTM. ii) The expected work needed by a UTM to compute some given output Y is infinite. As a corollary, the overall expected work to run a UTM is infinite. iii) The expected work needed by an arbitrary Turing machine T (not necessarily universal) to compute some given output Y can either be infinite or finite, depending on Y and the details of T. To derive these results we must combine ideas from nonequilibrium statistical physics with fundamental results from computer science, such as Levin's coding theorem and other theorems about universal computation. I would like to acknowledge the Santa Fe Institute, Grant No
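
    Result (i) can be restated symbolically. The abstract's phrasing ("the Kolmogorov complexity of Y minus the log of the algorithmic probability of Y") leaves the sign convention ambiguous; the reading below, with m_U(Y) denoting the algorithmic probability of Y under the UTM U, is the one consistent with the stated Y-independent upper bound, because Levin's coding theorem gives K_U(Y) ≤ -log₂ m_U(Y) + c_U with c_U depending only on the UTM. This is an interpretive sketch of the claim, not a derivation:

    ```latex
    % Minimal work to produce output Y on a UTM U (interpretive reading):
    0 \;\le\; \frac{W_{\min}(Y)}{kT\ln 2}
      \;=\; K_U(Y) + \log_2 m_U(Y)
      \;\le\; c_U .
    ```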

  17. Evolution of a minimal parallel programming model

    International Nuclear Information System (INIS)

    Lusk, Ewing; Butler, Ralph; Pieper, Steven C.

    2017-01-01

    Here, we take a historical approach to our presentation of self-scheduled task parallelism, a programming model with its origins in early irregular and nondeterministic computations encountered in automated theorem proving and logic programming. We show how an extremely simple task model has evolved into a system, asynchronous dynamic load balancing (ADLB), and a scalable implementation capable of supporting sophisticated applications on today’s (and tomorrow’s) largest supercomputers; and we illustrate the use of ADLB with a Green’s function Monte Carlo application, a modern, mature nuclear physics code in production use. Our lesson is that by surrendering a certain amount of generality and thus applicability, a minimal programming model (in terms of its basic concepts and the size of its application programmer interface) can achieve extreme scalability without introducing complexity.
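
    ADLB itself is an MPI-based library; the self-scheduled task-pool idea it embodies can be sketched with Python's multiprocessing, where idle workers pull the next task as they finish. The task sizes below are illustrative; this shows the programming model, not the ADLB API.

    ```python
    from multiprocessing import Pool

    def work(task):
        # Placeholder for an irregular, unpredictable-cost unit of work.
        return sum(i * i for i in range(task))

    if __name__ == "__main__":
        tasks = [10_000, 500, 80_000, 2_000, 60_000]   # uneven task sizes
        with Pool(processes=4) as pool:
            # imap_unordered hands out tasks as workers free up:
            # self-scheduling, so no worker idles while work remains.
            for result in pool.imap_unordered(work, tasks):
                print(result)
    ```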

  18. Swarm robotics and minimalism

    Science.gov (United States)

    Sharkey, Amanda J. C.

    2007-09-01

    Swarm Robotics (SR) is closely related to Swarm Intelligence, and both were initially inspired by studies of social insects. Their guiding principles are based on their biological inspiration and take the form of an emphasis on decentralized local control and communication. Earlier studies went a step further in emphasizing the use of simple reactive robots that only communicate indirectly through the environment. More recently SR studies have moved beyond these constraints to explore the use of non-reactive robots that communicate directly, and that can learn and represent their environment. There is no clear agreement in the literature about how far such extensions of the original principles could go. Should there be any limitations on the individual abilities of the robots used in SR studies? Should knowledge of the capabilities of social insects lead to constraints on the capabilities of individual robots in SR studies? There is a lack of explicit discussion of such questions, and researchers have adopted a variety of constraints for a variety of reasons. A simple taxonomy of swarm robotics is presented here with the aim of addressing and clarifying these questions. The taxonomy distinguishes subareas of SR based on the emphases and justifications for minimalism and individual simplicity.

  19. Expert knowledge elicitation using computer simulation: the organization of frail elderly case management as an illustration.

    Science.gov (United States)

    Chiêm, Jean-Christophe; Van Durme, Thérèse; Vandendorpe, Florence; Schmitz, Olivier; Speybroeck, Niko; Cès, Sophie; Macq, Jean

    2014-08-01

    Various elderly case management projects have been implemented in Belgium. This type of long-term health care intervention involves contextual factors and human interactions. These underlying complex mechanisms can be usefully informed with field experts' knowledge, which is hard to make explicit. However, computer simulation has been suggested as one possible method of overcoming the difficulty of articulating such elicited qualitative views. A simulation model of case management was designed using an agent-based methodology, based on the initial qualitative research material. Variables and rules of interaction were formulated into a simple conceptual framework. This model has been implemented and was used as a support for a structured discussion with experts in case management. The rigorous formulation provided by the agent-based methodology clarified the descriptions of the interventions and the problems encountered regarding: the diverse network topologies of health care actors in the project; the adaptation time required by the intervention; the communication between the health care actors; the institutional context; the organization of the care; and the role of the case manager and his or her personal ability to interpret the informal demands of the frail older person. The simulation model should be seen primarily as a tool for thinking and learning. A number of insights were gained as part of a valuable cognitive process. Computer simulation supporting field experts' elicitation can lead to better-informed decisions in the organization of complex health care interventions. © 2013 John Wiley & Sons, Ltd.

  20. The acknowledge project: toward improved efficiency in the knowledge acquisition process

    International Nuclear Information System (INIS)

    Marty, J.C.; Ramparany, F.

    1990-01-01

    This paper presents a general overview of the ACKnowledge Project (Acquisition of Knowledge). Knowledge acquisition is a critical and time-consuming phase in the development of expert systems. The ACKnowledge project aims at improving the efficiency of knowledge acquisition by analyzing and evaluating knowledge acquisition techniques, and by developing a Knowledge Engineering Workbench that supports the knowledge engineer from the early stages of knowledge acquisition up to the implementation of the knowledge base in large and complex application domains such as the diagnosis of dynamic computer networks

  1. Data Mining and Knowledge Discover - IBM Cognitive Alternatives for NASA KSC

    Science.gov (United States)

    Velez, Victor Hugo

    2016-01-01

    Skillful cognitive computing tools for transforming industries have been found favorable and profitable for different Directorates at NASA KSC. This study shows how cognitive computing systems can be useful for NASA when computers are trained, much as humans are, to gain knowledge over time. Increasing knowledge through senses, learning, and an accumulation of events is how the applications created by IBM empower the artificial intelligence in a cognitive computing system. NASA has explored and applied the artificial intelligence approach for decades, specifically with cognitive computing in a few projects adopting models similar to those proposed by IBM Watson. However, the use of semantic technologies by IBM's dedicated business unit allows these cognitive computing applications to outperform in-house tools and to deliver analyses that facilitate decision making for managers and leads in a management information system.

  2. Knowledge discovery from data streams

    CERN Document Server

    Gama, Joao

    2010-01-01

    Since the beginning of the Internet age and the increased use of ubiquitous computing devices, the large volume and continuous flow of distributed data have imposed new constraints on the design of learning algorithms. Exploring how to extract knowledge structures from evolving and time-changing data, Knowledge Discovery from Data Streams presents a coherent overview of state-of-the-art research in learning from data streams.The book covers the fundamentals that are imperative to understanding data streams and describes important applications, such as TCP/IP traffic, GPS data, sensor networks,

  3. Minimizing E-factor in the continuous-flow synthesis of diazepam and atropine.

    Science.gov (United States)

    Bédard, Anne-Catherine; Longstreet, Ashley R; Britton, Joshua; Wang, Yuran; Moriguchi, Hideki; Hicklin, Robert W; Green, William H; Jamison, Timothy F

    2017-12-01

    Minimizing the waste stream associated with the synthesis of active pharmaceutical ingredients (APIs) and commodity chemicals is of high interest within the chemical industry from an economic and environmental perspective. In exploring solutions in this area, we herein report a highly optimized and environmentally conscious continuous-flow synthesis of two APIs identified as essential medicines by the World Health Organization, namely diazepam and atropine. Notably, these approaches significantly reduced the E-factor of previously published routes through the combination of continuous-flow chemistry techniques, computational calculations and solvent minimization. The E-factor associated with the synthesis of atropine was reduced by 94-fold (about two orders of magnitude), from 2245 to 24, while the E-factor for the synthesis of diazepam was reduced by 4-fold, from 36 to 9. Copyright © 2017 Elsevier Ltd. All rights reserved.
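
    The E-factor is simply the mass of waste generated per mass of isolated product, so the reported fold-reductions can be checked in a couple of lines. The function below is a generic sketch of the metric, not the authors' mass accounting.

    ```python
    def e_factor(total_waste_kg, product_kg):
        """Sheldon's E-factor: kg of waste per kg of isolated product."""
        return total_waste_kg / product_kg

    # Fold-reductions implied by the abstract's figures:
    print(2245 / 24)   # atropine: ~94-fold
    print(36 / 9)      # diazepam:  4-fold
    ```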

  4. Computer Support for Vicarious Learning.

    Science.gov (United States)

    Monthienvichienchai, Rachada; Sasse, M. Angela

    This paper investigates how computer support for vicarious learning can be implemented by taking a principled approach to selecting and combining different media to capture educational dialogues. The main goal is to create vicarious learning materials of appropriate pedagogic content and production quality, and at the same time minimize the…

  5. Surfaces of Minimal Paths from Topological Structures and Applications to 3D Object Segmentation

    KAUST Repository

    Algarni, Marei

    2017-10-24

    Extracting surfaces, representing boundaries of objects of interest, from volumetric images has important applications in various scientific domains, from medicine to geology. In this thesis, I introduce novel mathematical, computational, and algorithmic machinery for the extraction of sheet-like surfaces (with boundary) whose boundary is unknown a priori, a particularly important case for which no convenient methods exist. Surfaces with boundary arise in extracting faults, among other geological structures, from seismic images, and in extracting structures of the lung from computed tomography (CT) images. Although many methods have been developed in computer vision for the extraction of surfaces, including level sets, convex optimization approaches, and graph cut methods, none of these appears applicable to the case of surfaces with boundary. The novel methods for surface extraction derived in this thesis are built on the theory of Minimal Paths, which has been used primarily to extract curves in noisy or corrupted images and has had wide applicability in 2D computer vision. This thesis extends such methods to surfaces, based on the novel observation that surfaces can be determined by extracting topological structures from the solution of the eikonal partial differential equation (PDE), which is the basis of Minimal Path theory. Although topological structures are known to be difficult to extract from images, which are both noisy and discrete, this thesis builds robust methods based on Morse theory and computational topology to address such issues. The algorithms have run-time complexity O(N log N), less complex than existing approaches. The thesis details the algorithms and theory, and shows an extensive experimental evaluation on seismic and medical images. Experiments show out-performance in accuracy, computational speed, and user convenience.
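
    The core object of Minimal Path theory is the solution T of the eikonal PDE |∇T| = 1/f for a speed map f. On a discrete grid this can be approximated with Dijkstra's algorithm, as sketched below; this illustrates the travel-time computation only, not the thesis's topological extraction of surfaces, and the grid and speed map are illustrative.

    ```python
    import heapq
    import numpy as np

    def travel_time(speed, source):
        """Dijkstra approximation of the eikonal solution |grad T| = 1/speed."""
        T = np.full(speed.shape, np.inf)
        T[source] = 0.0
        heap = [(0.0, source)]
        while heap:
            t, (i, j) = heapq.heappop(heap)
            if t > T[i, j]:
                continue                            # stale heap entry
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < speed.shape[0] and 0 <= nj < speed.shape[1]:
                    nt = t + 1.0 / speed[ni, nj]    # cost of entering the cell
                    if nt < T[ni, nj]:
                        T[ni, nj] = nt
                        heapq.heappush(heap, (nt, (ni, nj)))
        return T

    print(travel_time(np.ones((5, 5)), (0, 0))[4, 4])   # 8 unit steps
    ```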

  6. Selection of personalized patient therapy through the use of knowledge-based computational models that identify tumor-driving signal transduction pathways.

    Science.gov (United States)

    Verhaegh, Wim; van Ooijen, Henk; Inda, Márcia A; Hatzis, Pantelis; Versteeg, Rogier; Smid, Marcel; Martens, John; Foekens, John; van de Wiel, Paul; Clevers, Hans; van de Stolpe, Anja

    2014-06-01

    Increasing knowledge about signal transduction pathways as drivers of cancer growth has elicited the development of "targeted drugs," which inhibit aberrant signaling pathways. They require a companion diagnostic test that identifies the tumor-driving pathway; however, currently available tests like estrogen receptor (ER) protein expression for hormonal treatment of breast cancer do not reliably predict therapy response, at least in part because they do not adequately assess functional pathway activity. We describe a novel approach to predict signaling pathway activity based on knowledge-based Bayesian computational models, which interpret quantitative transcriptome data as the functional output of an active signaling pathway, by using expression levels of transcriptional target genes. Following calibration on only a small number of cell lines or cohorts of patient data, they provide a reliable assessment of signaling pathway activity in tumors of different tissue origin. As proof of principle, models for the canonical Wnt and ER pathways are presented, including initial clinical validation on independent datasets from various cancer types. ©2014 American Association for Cancer Research.
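
    The knowledge-based idea, scoring pathway activity from the observed expression of the pathway's transcriptional target genes, can be caricatured as a naive-Bayes log-odds sum. The genes named below are canonical Wnt targets, but the conditional probabilities are invented placeholders; the published calibrated Bayesian models are far richer than this sketch.

    ```python
    import math

    # Hypothetical calibration: P(target gene "up" | pathway active / inactive).
    p_up_active   = {"AXIN2": 0.9, "LGR5": 0.8, "CCND1": 0.7}
    p_up_inactive = {"AXIN2": 0.2, "LGR5": 0.3, "CCND1": 0.4}

    def pathway_log_odds(up_regulated):
        """Naive-Bayes log-odds that the pathway drives this tumour sample."""
        score = 0.0
        for gene in p_up_active:
            pa, pi = p_up_active[gene], p_up_inactive[gene]
            if gene in up_regulated:
                score += math.log(pa / pi)
            else:
                score += math.log((1 - pa) / (1 - pi))
        return score

    print(pathway_log_odds({"AXIN2", "LGR5"}))   # positive => likely active
    ```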

  7. Computers in engineering. 1988

    International Nuclear Information System (INIS)

    Tipnis, V.A.; Patton, E.M.

    1988-01-01

    These proceedings discuss the following subjects: knowledge-base systems; computers in design; uses of artificial intelligence; engineering optimization and expert systems for accelerators; and parallel processing in design

  8. Computing in Hydraulic Engineering Education

    Science.gov (United States)

    Duan, J. G.

    2011-12-01

    Civil engineers, pioneers of our civilization, are rarely perceived as leaders and innovators in modern society because the profession has lagged in technology innovation. This crisis has resulted in a decline in the prestige of the civil engineering profession, reduced federal funding for deteriorating infrastructure, and problems with attracting the most talented high-school students. Infusing cutting-edge computer technology and stimulating creativity and innovation are therefore the critical challenges for civil engineering education. To better prepare our graduates to innovate, this paper discusses the adoption of problem-based collaborative learning techniques and the integration of civil engineering computing into a traditional civil engineering curriculum. Three interconnected courses, Open Channel Flow, Computational Hydraulics, and Sedimentation Engineering, were developed with emphasis on computational simulations. In Open Channel Flow, the focus is on principles of free-surface flow and the application of computational models. This prepares students for the second course, Computational Hydraulics, which introduces the fundamental principles of computational hydraulics, including finite difference and finite element methods. This course complements Open Channel Flow to provide students with an in-depth understanding of computational methods. The third course, Sedimentation Engineering, covers the fundamentals of sediment transport and river engineering, so students can apply the knowledge and programming skills gained from the previous courses to develop computational models for simulating sediment transport. These courses effectively equipped students with important skills and knowledge to complete thesis and dissertation research.

  9. Computational intelligence and quantitative software engineering

    CERN Document Server

    Succi, Giancarlo; Sillitti, Alberto

    2016-01-01

    In a down-to-earth manner, the volume lucidly presents how the fundamental concepts, methodology, and algorithms of Computational Intelligence are efficiently exploited in Software Engineering and opens up a novel and promising avenue of comprehensive analysis and advanced design of software artifacts. It shows how the paradigm and the best practices of Computational Intelligence can be creatively explored to carry out comprehensive software requirement analysis and to support design, testing, and maintenance. Software Engineering is an intensive knowledge-based endeavor of inherently human-centric nature, which profoundly relies on acquiring semiformal knowledge and then processing it to produce a running system. The knowledge spans a wide variety of artifacts, from requirements, captured in the interaction with customers, to design practices, testing, and code management strategies, which rely on the knowledge of the running system. This volume consists of contributions written by widely acknowledged experts ...

  10. Advanced software development workstation. Knowledge base design: Design of knowledge base for flight planning application

    Science.gov (United States)

    Izygon, Michel E.

    1992-01-01

    The development process of the knowledge base for the generation of Test Libraries for Mission Operations Computer (MOC) Command Support focused on a series of information-gathering interviews. These knowledge capture sessions are supporting the development of a prototype for evaluating the capabilities of INTUIT on such an application. The prototype includes functions related to POCC (Payload Operation Control Center) processing. It prompts the end users for input through a series of panels and then generates the Meds associated with the initialization and the update of hazardous command tables for a POCC Processing TLIB.

  11. Relationship of the Duke jeopardy score combined with minimal lumen diameter as assessed by computed tomography angiography to the hemodynamic relevance of coronary artery stenosis.

    Science.gov (United States)

    Yu, Mengmeng; Zhao, Yonghong; Li, Wenbin; Lu, Zhigang; Wei, Meng; Zhou, Wenxiao; Zhang, Jiayin

    2018-03-02

    To study the diagnostic performance of the ratio between the Duke jeopardy score (DJS) and the minimal lumen diameter (MLD) (DJS/MLD-CT ratio) as assessed by coronary computed tomographic angiography (CTA) for differentiating functionally significant from non-significant coronary artery stenoses, with reference to invasive fractional flow reserve (FFR). Patients who underwent both coronary CTA and FFR measurement during invasive coronary angiography (ICA) within 2 weeks were retrospectively included in the study. Invasive FFR measurement was performed in patients with intermediate to severe coronary stenoses. The DJS/MLD-CT ratio and anatomical parameters were recorded. Lesions with FFR ≤ 0.80 were considered functionally significant. One hundred and sixty-one patients with 175 lesions were included in the analysis. Diameter stenosis on CT, area stenosis, plaque burden, lesion length (LL), ICA-based stenosis degree, DJS, LL/MLD⁴ ratio, DJS/MLA ratio, and DJS/MLD ratio were all significantly different between hemodynamically significant and non-significant lesions. The optimal cutoff value for the DJS/MLD-CT ratio was 1.96 (area under curve = 0.863, 95% confidence interval = 0.803-0.910), yielding a high diagnostic accuracy (86.9%, 152/175). In coronary artery stenoses detected by coronary CTA, the DJS/MLD ratio is able to predict hemodynamic relevance. Copyright © 2018 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
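
    Operationally the proposed index is a single ratio compared against the ROC-derived cutoff. A minimal sketch using the abstract's threshold of 1.96 follows; the example lesion values are invented.

    ```python
    def djs_mld_ratio(duke_jeopardy_score, minimal_lumen_diameter_mm):
        """CT-derived DJS/MLD ratio; higher values suggest hemodynamic relevance."""
        return duke_jeopardy_score / minimal_lumen_diameter_mm

    # Abstract's ROC-derived cutoff: ratio > 1.96 predicts FFR <= 0.80.
    r = djs_mld_ratio(duke_jeopardy_score=4.0, minimal_lumen_diameter_mm=1.5)
    print(r, "significant" if r > 1.96 else "non-significant")   # 2.67 significant
    ```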

  12. Data Linkage Graph: computation, querying and knowledge discovery of life science database networks

    Directory of Open Access Journals (Sweden)

    Lange Matthias

    2007-12-01

    Full Text Available To support the interpretation of measured molecular facts, like gene expression experiments or EST sequencing, the functional or systems-biological context has to be considered. In doing so, the relationship to existing biological knowledge has to be discovered. In general, biological knowledge is represented worldwide in a network of databases. In this paper we present a method for knowledge extraction from life science databases which spares scientists screen-scraping and web-clicking approaches.

  13. Computer science - A bridge between nuclear knowledge and practice

    International Nuclear Information System (INIS)

    Pavelescu, A.O.; Ghizdeanu, E.N.

    2004-01-01

    The paper analyzes the horizon of the new information technologies in a way that regards both parties involved in the education process: trainers and trainees. The case study refers to the Nuclear Power Plant Department within the Power Engineering Faculty at University 'POLITEHNICA' of Bucharest in Romania. Attracting qualified people to the nuclear field is essential for sustainable development, especially for developing countries such as Romania. One way to accomplish this goal is to utilize the interest of the students in interconnected domains like computer-based tools. The study took into account the feedback from the students, as well as the international framework (recommendations from the IAEA), in order to postulate what can be changed or improved to create a better learning environment. The work revealed the fact that students have relatively poor education in computer programming, although some applications are currently being studied, such as ACSL (Advanced Continuous Simulation Language). There is a need for better utilization of general-purpose technical programs such as Mathcad and Matlab, and of specialized programs such as the CANDU-9 Compact Reactor Simulator, Advanced Reactor Simulator, MMS (Modular Modelling System), MicroShield, and FISPACT, designed to simulate, in an attractive and user-friendly way, the behavior of a nuclear power plant and to support radiological assessment. This would increase the degree of understanding of the extremely complex processes that take place in such an installation. A more practical approach is imperative in order to capture the interest of the students. Because this is not always possible, the importance of computer-simulated processes is emphasized. (author)

  14. Software support environment design knowledge capture

    Science.gov (United States)

    Dollman, Tom

    1990-01-01

    The objective of this task is to assess the potential for using the software support environment (SSE) workstations and associated software for design knowledge capture (DKC) tasks. This assessment will include the identification of required capabilities for DKC and hardware/software modifications needed to support DKC. Several approaches to achieving this objective are discussed and interim results are provided: (1) research into the problem of knowledge engineering in a traditional computer-aided software engineering (CASE) environment, like the SSE; (2) research into the problem of applying SSE CASE tools to develop knowledge based systems; and (3) direct utilization of SSE workstations to support a DKC activity.

  15. The Potential for Computer Based Systems in Modular Engineering

    DEFF Research Database (Denmark)

    Miller, Thomas Dedenroth

    1998-01-01

    The paper elaborates on knowledge management and the possibility for computer support of the design process of pharmaceutical production plants in relation to the ph.d. project modular engineering.......The paper elaborates on knowledge management and the possibility for computer support of the design process of pharmaceutical production plants in relation to the ph.d. project modular engineering....

  16. Multi-terminal pipe routing by Steiner minimal tree and particle swarm optimisation

    Science.gov (United States)

    Liu, Qiang; Wang, Chengen

    2012-08-01

    Computer-aided design of pipe routing is of fundamental importance for the development of complex equipment. In this article, non-rectilinear branch pipe routing with multiple terminals, which can be formulated as a Euclidean Steiner Minimal Tree with Obstacles (ESMTO) problem, is studied in the context of integrated aeroengine design engineering. Unlike traditional methods that connect pipe terminals sequentially, this article presents a new branch pipe routing algorithm based on Steiner tree theory. The article begins with a new algorithm for solving the ESMTO problem by using particle swarm optimisation (PSO), and then extends the method to the surface case by using geodesics, to meet the requirements of routing non-rectilinear pipes on the surfaces of aeroengines. Subsequently, the adaptive region strategy and the basic visibility graph method are adopted to increase computational efficiency. Numerical computations show that the proposed routing algorithm can find satisfactory routing layouts while running in polynomial time.
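
    A flavour of the PSO component can be given by searching for a single Steiner point that minimizes total Euclidean length to fixed terminals (for three terminals this is the Fermat point). The real algorithm handles many Steiner points, obstacles, and geodesics on surfaces; the terminals and PSO constants below are illustrative assumptions.

    ```python
    import math
    import random

    terminals = [(0, 0), (4, 0), (2, 3)]          # fixed pipe terminals

    def tree_length(p):
        # Total length of the star joining one Steiner point to all terminals.
        return sum(math.dist(p, t) for t in terminals)

    # Plain PSO over candidate Steiner-point positions.
    pts = [[random.uniform(0, 4), random.uniform(0, 3)] for _ in range(20)]
    vel = [[0.0, 0.0] for _ in pts]
    best = [p[:] for p in pts]                    # per-particle best positions
    gbest = min(best, key=tree_length)            # swarm-wide best position
    for _ in range(200):
        for i, p in enumerate(pts):
            for d in range(2):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * random.random() * (best[i][d] - p[d])
                             + 1.5 * random.random() * (gbest[d] - p[d]))
                p[d] += vel[i][d]
            if tree_length(p) < tree_length(best[i]):
                best[i] = p[:]
        gbest = min(best + [gbest], key=tree_length)
    print(gbest, tree_length(gbest))
    ```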

  17. A Cross-Cultural Study of the Effect of a Graph-Oriented Computer-Assisted Project-Based Learning Environment on Middle School Students' Science Knowledge and Argumentation Skills

    Science.gov (United States)

    Hsu, P.-S.; Van Dyke, M.; Chen, Y.; Smith, T. J.

    2016-01-01

    The purpose of this mixed-methods study was to explore how seventh graders in a suburban school in the United States and sixth graders in an urban school in Taiwan developed argumentation skills and science knowledge in a project-based learning environment that incorporated a graph-oriented, computer-assisted application (GOCAA). A total of 42…

  18. Hybrid classifiers methods of data, knowledge, and classifier combination

    CERN Document Server

    Wozniak, Michal

    2014-01-01

    This book delivers definite and compact knowledge on how hybridization can help improve the quality of computer classification systems. To give readers a clear grasp of hybridization, it focuses on introducing the different levels of hybridization and illuminating the problems that arise when dealing with such projects. The data and knowledge incorporated in hybridization are considered first, followed by a still-growing area of classifier systems known as combined classifiers. The book covers these state-of-the-art topics and the latest research results of the author and his team from the Department of Systems and Computer Networks, Wroclaw University of Technology, including classifiers based on feature-space splitting, one-class classification, imbalanced data, and data-stream classification.

  19. A Simple Method for Dynamic Scheduling in a Heterogeneous Computing System

    OpenAIRE

    Žumer, Viljem; Brest, Janez

    2002-01-01

    A simple method for dynamic scheduling on a heterogeneous computing system is proposed in this paper. It was implemented to minimize the parallel program execution time. The proposed method decomposes the program workload into computationally homogeneous subtasks, which may be of different sizes, depending on the current load of each machine in the heterogeneous computing system.
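
    The decomposition into homogeneous subtasks whose sizes depend on machine load can be sketched as a proportional split. The machine speeds below are illustrative, and this is a generic sketch rather than the paper's exact method.

    ```python
    def split_workload(total_items, effective_speeds):
        """Chunk sizes proportional to each machine's effective speed."""
        total_speed = sum(effective_speeds)
        chunks = [int(total_items * s / total_speed) for s in effective_speeds]
        chunks[0] += total_items - sum(chunks)   # rounding remainder to machine 0
        return chunks

    # Three machines whose speeds are already discounted by current load.
    print(split_workload(1000, [4.0, 2.0, 1.0]))   # -> [573, 285, 142]
    ```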

  20. Computer-Aided Parts Estimation

    OpenAIRE

    Cunningham, Adam; Smart, Robert

    1993-01-01

    In 1991, Ford Motor Company began deployment of CAPE (computer-aided parts estimating system), a highly advanced knowledge-based system designed to generate, evaluate, and cost automotive part manufacturing plans. CAPE is engineered on an innovative, extensible, declarative process-planning and estimating knowledge representation language, which underpins the CAPE kernel architecture. Many manufacturing processes have been modeled to date, but eventually every significant process in motor veh...

  1. Exploring Students' Knowledge Construction Strategies in Computer-Supported Collaborative Learning Discussions Using Sequential Analysis

    NARCIS (Netherlands)

    Shukor, N.B.A.; Tasir, Z.; Meijden, H.A.T. van der; Harun, J.

    2014-01-01

    Online collaborative learning allows discussion to occur at greater depth, where knowledge can be constructed remotely. However, students were found to construct knowledge at a low level, discussing by sharing and comparing opinions; this is inadequate for new knowledge creation. As such,

  2. A simultaneous minimally invasive approach to treat a patient with coronary artery disease and metastatic lung cancer.

    Science.gov (United States)

    Fu, Yuanhao; Zhang, Lufeng; Ji, Ling; Xu, Chenyang

    2016-01-01

    Concurrent lung cancer and coronary artery disease requiring treatment with percutaneous coronary intervention or coronary artery bypass grafting is not rare. An individualized perioperative anticoagulation regimen and minimal surgical trauma will benefit the patient's postoperative recovery. We successfully treated a 68-year-old female patient with a lesion in the left anterior descending artery and metastatic right lung carcinoma by simultaneous minimally invasive direct coronary artery bypass grafting via a small left thoracotomy and thoracoscopic wedge resection of the lung lesion. She recovered and was discharged on the eighth postoperative day. The patient showed no symptoms of myocardial ischemia postoperatively. Computed tomography scan did not indicate metastatic lesion of lung carcinoma at 1-year follow-up. In conclusion, minimally invasive direct coronary artery bypass grafting combined with thoracoscopic wedge resection is an effective minimally invasive treatment for concurrent lung cancer and coronary artery disease. This technique eliminates the risk of perioperative bleeding and provides satisfactory mid-term follow-up results.

  3. Ruled Laguerre minimal surfaces

    KAUST Repository

    Skopenkov, Mikhail; Pottmann, Helmut; Grohs, Philipp

    2011-01-01

    A Laguerre minimal surface is an immersed surface in ℝ³ being an extremal of the functional ∫(H²/K − 1) dA. In the present paper, we prove that the only ruled Laguerre minimal surfaces are, up to isometry, the surfaces R(φ, λ) = (Aφ, Bφ, Cφ + D cos 2φ

  4. Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.

    Science.gov (United States)

    Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P

    2010-12-22

    Comparative genomics resources, such as ortholog detection tools and repositories, are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource, Roundup, using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool Roundup as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
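
    The paper's cost model predicts each job's runtime in advance and orders submissions to minimize waste; the classic longest-processing-time-first heuristic captures the flavour of such ordering. The genome-pair job names and predicted hours below are invented, and this sketch is not the authors' model.

    ```python
    import heapq

    def schedule_lpt(predicted_runtimes, n_nodes):
        """Longest-processing-time-first assignment of jobs to cloud nodes."""
        nodes = [(0.0, i, []) for i in range(n_nodes)]   # (load, node id, jobs)
        heapq.heapify(nodes)
        for job, t in sorted(predicted_runtimes.items(), key=lambda kv: -kv[1]):
            load, i, jobs = heapq.heappop(nodes)         # least-loaded node so far
            jobs.append(job)
            heapq.heappush(nodes, (load + t, i, jobs))
        return sorted(nodes)

    # Hypothetical genome-pair jobs with model-predicted hours.
    print(schedule_lpt({"hs-mm": 9.0, "hs-dm": 4.5, "mm-dm": 4.0, "hs-sc": 1.5}, 2))
    ```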

  5. A Model of an Expanded-Frame Hypermedia Knowledge-Base for Instruction.

    Science.gov (United States)

    Lacy, Mark J.; Wood, R. Kent

    1993-01-01

    Argues that current computer-based instruction does not exploit the instructional possibilities of computers. Critiques current models of computer-based instruction: behaviorist as too linear and constructivist as too unstructured. Offers a design model of Expanded-frame Hypermedia Knowledge-bases as an instructional approach allowing hypermedia…

  6. A Matter of Computer Time

    Science.gov (United States)

    Celano, Donna; Neuman, Susan B.

    2010-01-01

    Many low-income children do not have the opportunity to develop the computer skills necessary to succeed in our technological economy. Their only access to computers and the Internet, through school, afterschool programs, and community organizations, is woefully inadequate. Educators must work to close this knowledge gap and to ensure that low-income…

  7. A Systematic Procedure for the Generation of Cost-Minimized Designs

    DEFF Research Database (Denmark)

    Becker, Peter W.; Jarkler, Bjorn

    1972-01-01

    We present a procedure for the generation of cost-minimized designs of circuits and systems. Suppose a designer has decided upon the topology of his product. Also suppose he knows the cost and quality of the different grades of the N components required to implement the product. The designer then faces the following problem: how should he proceed to find the combination of grades that will give him the desired manufacturing yield at minimum product cost? We discuss the problem and suggest a policy by which the designer, with a reasonable computational effort, can find a set of "good…
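    A minimal sketch of the grade-selection problem the abstract poses, assuming a placeholder yield model in which component grades pass independently; the data in COMPONENT_GRADES and TARGET_YIELD are invented. Note that this brute-force enumeration grows exponentially in N, which is precisely why the paper proposes a cheaper policy.

    ```python
    # Hypothetical sketch: choose one grade per component so that the
    # estimated manufacturing yield meets a target at minimum total cost.
    # The independence assumption for per-grade pass probabilities is a
    # stand-in, not the paper's actual yield model.

    from itertools import product

    # For each component: a list of (cost, pass_probability) per grade.
    COMPONENT_GRADES = [
        [(1.0, 0.90), (2.0, 0.97), (4.0, 0.995)],  # component 1
        [(0.5, 0.85), (1.5, 0.95), (3.0, 0.99)],   # component 2
        [(2.0, 0.92), (3.5, 0.98)],                # component 3
    ]

    TARGET_YIELD = 0.90


    def cheapest_design(grades, target):
        """Exhaustively search all grade combinations; return the cheapest
        one whose estimated yield meets the target, or None."""
        best = None  # (cost, combination)
        for combo in product(*grades):
            yield_est = 1.0
            cost = 0.0
            for c, p in combo:
                cost += c
                yield_est *= p
            if yield_est >= target and (best is None or cost < best[0]):
                best = (cost, combo)
        return best


    cost, combo = cheapest_design(COMPONENT_GRADES, TARGET_YIELD)
    print(f"minimum cost {cost:.2f} at yield >= {TARGET_YIELD}")
    print("chosen grades:", combo)
    ```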

  8. Y-12 Plant waste minimization strategy

    International Nuclear Information System (INIS)

    Kane, M.A.

    1987-01-01

    The 1984 Amendments to the Resource Conservation and Recovery Act (RCRA) mandate that waste minimization be a major element of hazardous waste management. In response to this mandate and the increasing costs for waste treatment, storage, and disposal, the Oak Ridge Y-12 Plant developed a waste minimization program to encompass all types of wastes. Thus, waste minimization has become an integral part of the overall waste management program. Unlike traditional approaches, waste minimization focuses on controlling waste at the beginning of production instead of the end. This approach includes: (1) substituting nonhazardous process materials for hazardous ones, (2) recycling or reusing waste effluents, (3) segregating nonhazardous waste from hazardous and radioactive waste, and (4) modifying processes to generate less waste or less toxic waste. An effective waste minimization program must provide the appropriate incentives for generators to reduce their waste and provide the necessary support mechanisms to identify opportunities for waste minimization. This presentation focuses on the Y-12 Plant's strategy to implement a comprehensive waste minimization program. This approach consists of four major program elements: (1) promotional campaign, (2) process evaluation for waste minimization opportunities, (3) waste generation tracking system, and (4) information exchange network. The presentation also examines some of the accomplishments of the program and issues that need to be resolved.

  9. Towards a computational spatial knowledge acquisition model in architectural space

    NARCIS (Netherlands)

    Lyu, J.; Vries, de B.; Sun, C.; Sun, C.; Zhang, J.

    2013-01-01

    Existing research related to spatial knowledge acquisition often shows a limited scope because of the complexity of the cognition process. Research in spatial representation, such as space syntax, presumes that vision drives movement. This assumption is only true under certain…

  10. Minimal Composite Inflation

    DEFF Research Database (Denmark)

    Channuie, Phongpichit; Jark Joergensen, Jakob; Sannino, Francesco

    2011-01-01

    We investigate models in which the inflaton emerges as a composite field of a four dimensional, strongly interacting and nonsupersymmetric gauge theory featuring purely fermionic matter. We show that it is possible to obtain successful inflation via non-minimal coupling to gravity, and that the u…

  11. Students' inductive reasoning skills and the relevance of prior knowledge: an exploratory study with a computer-based training course on the topic of acne vulgaris.

    Science.gov (United States)

    Horn-Ritzinger, Sabine; Bernhardt, Johannes; Horn, Michael; Smolle, Josef

    2011-04-01

    The importance of inductive instruction in medical education is growing, yet little is known about how prior knowledge relates to students' inductive reasoning abilities. The purpose is to evaluate this inductive teaching method as a means of fostering higher levels of learning and to explore how individual differences in prior knowledge (high [HPK] vs. low [LPK]) contribute to students' inductive reasoning skills. Twenty-six LPK and 18 HPK students could train twice with an interactive computer-based training object to discover the underlying concept before doing the final comprehension check. Students had a median of 76.9% correct answers in the first training, 90.9% in the second, and answered 92% of the final assessment questions correctly. More importantly, 86% of all students succeeded with inductive learning, among them 83% of the HPK students and 89% of the LPK students. Prior knowledge did not predict performance on overall comprehension. This inductive instructional strategy fostered students' deep approaches to learning in a time-effective way.

  12. Knowledge management: High energy physics as model case

    International Nuclear Information System (INIS)

    Trabelsi, A.

    2004-01-01

    Full text: The world-wide High Energy Physics (HEP) community has emerged as one of the major forces in developing new tools and concepts to enhance the overall quality of knowledge management and to support technological innovation in this field. Though joint research and academic activities in HEP represent a tradition of more than 50 years, collaboration in this field has changed over the decades. In coming years, bigger and more distributed collaborations than ever before, with several thousand physicists and engineers, will concentrate on fewer major HEP experiments. They will face unprecedented challenges in accomplishing their work at the leading laboratories where large accelerators are being constructed. These challenges arise primarily from the rapidly increasing size and complexity of the datasets to be collected and the enormous computational, storage and networking resources to be deployed by global collaborations in order to process, distribute and analyze information. During the last two decades, the Web was the HEP community's response to the new wave of scientific collaborations. Almost all data networking in the HEP community is today based on the Internet, which has since grown into a global information highway. Currently, the HEP community needs to progress beyond structured information towards automated knowledge management of scientific data, which requires extremely capable computing infrastructures supporting several key areas. Together with computer scientists, the HEP community, recognised as a driving force, is extremely well positioned to continue this successful strategy with respect to the initiative to build 'the next generation internet'. Facing growing requirements for knowledge sharing, acquisition and organisation, HEP scientists invented the preprint concept in order to facilitate and speed up access to ongoing research developments and results. The preprint archive has since become a global repository for research, particularly in physics

  13. Angiomyolipoma with minimal fat: Differentiation from papillary renal cell carcinoma by helical CT

    International Nuclear Information System (INIS)

    Zhang, Y.-Y.; Luo, S.; Liu, Y.; Xu, R.-T.

    2013-01-01

    Aim: To evaluate whether helical computed tomography (CT) images can be used to differentiate angiomyolipomas (AMLs) with minimal fat from papillary renal cell carcinomas (PRCCs) based on their morphological characteristics and enhancement features. Materials and methods: This retrospective study was approved by the institutional review board; informed consent was waived. Forty-four patients (21 with AMLs with minimal fat and 23 with PRCCs) who underwent enhanced helical CT before total or partial nephrectomy were included. Two radiologists, blinded to the histopathology results, read the CT images and recorded the attenuation values, morphological characteristics, and enhancement features of the tumours. An independent-samples t-test, χ² test, and rank-sum test were performed to compare the two tumour types, and the predictive value of each CT finding was determined by multivariate logistic regression analysis. Results: AML with minimal fat showed a clear female predominance (p < 0.01). Intra-tumoural vessels were noted in 11 cases of AML with minimal fat and three PRCC cases (p < 0.01). The unenhanced attenuation characteristic differed significantly between the two diseases (p < 0.001). The absolute attenuation values (AAVs) and the corrected attenuation values (CAVs) of the AML-with-minimal-fat group on unenhanced and both enhanced phases of images were greater than those of the PRCC group (p < 0.05). After contrast medium injection, the tumour enhancement value (TEV) of the AML-with-minimal-fat group in the corticomedullary phase was greater than that of the PRCC group (p < 0.01). Most cases of both tumour types demonstrated early enhancement; the enhancement value of the AML-with-minimal-fat group was greater than that of the PRCC group (p < 0.01). The unenhanced attenuation characteristic, intra-tumoural vessels, and the CAVs of unenhanced and early excretory phase scans were valuable parameters to…
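    The multivariate analysis described is, in outline, a logistic regression over CT features. The following sketch uses synthetic data; the feature names come from the abstract, but the values, distributions, and helper code are invented for illustration and do not reproduce the study's measurements.

    ```python
    # Illustrative sketch (synthetic data) of a multivariate logistic
    # regression distinguishing AML with minimal fat from PRCC using CT
    # features. All numbers are invented placeholders.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Features per case: [unenhanced attenuation (HU),
    #                     intra-tumoural vessels (0/1),
    #                     corrected attenuation value (corticomedullary)]
    aml = np.column_stack([
        rng.normal(40, 5, 21),         # 21 AML-with-minimal-fat cases
        rng.binomial(1, 11 / 21, 21),  # vessels seen in 11/21 AML cases
        rng.normal(1.3, 0.15, 21),
    ])
    prcc = np.column_stack([
        rng.normal(32, 5, 23),         # 23 PRCC cases
        rng.binomial(1, 3 / 23, 23),   # vessels seen in 3/23 PRCC cases
        rng.normal(1.0, 0.15, 23),
    ])

    X = np.vstack([aml, prcc])
    y = np.array([1] * 21 + [0] * 23)  # 1 = AML with minimal fat

    model = LogisticRegression().fit(X, y)
    print("coefficients:", model.coef_)
    print("training accuracy:", model.score(X, y))
    ```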

  14. Guideline Knowledge Representation Model (GLIKREM)

    Czech Academy of Sciences Publication Activity Database

    Buchtela, David; Peleška, Jan; Veselý, Arnošt; Zvárová, Jana; Zvolský, Miroslav

    2008-01-01

    Vol. 4, No. 1 (2008), pp. 17-23, ISSN 1801-5603. R&D Projects: GA MŠk(CZ) 1M06014. Institutional research plan: CEZ:AV0Z10300504. Keywords: knowledge representation; GLIF model; guidelines. Subject RIV: IN - Informatics, Computer Science. http://www.ejbi.org/articles/200812/34/1.html

  15. INFORMATION SYSTEM QUALITY CONTROL KNOWLEDGE

    Directory of Open Access Journals (Sweden)

    Vladimir Nikolaevich Babeshko

    2017-02-01

    The development of the educational system is associated with the need to control the quality of educational services. Quality control of knowledge is an important part of the scientific process. The penetration of computers into all areas of activity is changing the approaches and technologies that were previously used.

  16. Electromagnetic Induction: A Computer-Assisted Experiment

    Science.gov (United States)

    Fredrickson, J. E.; Moreland, L.

    1972-01-01

    By using minimal equipment it is possible to demonstrate Faraday's Law. An electronic desk calculator enables sophomore students to solve a difficult mathematical expression for the induced EMF. Polaroid pictures of the plot of the induced EMF, together with the computer facility, enable students to make comparisons. (PS)
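    The expression in question is presumably Faraday's law of induction for an N-turn coil (the exact coil geometry used in the experiment is not specified in the record):

    \[
    \mathcal{E} \;=\; -\,N\,\frac{d\Phi_B}{dt},
    \qquad
    \Phi_B \;=\; \int_S \mathbf{B}\cdot d\mathbf{A},
    \]

    so that sampled values of the flux \(\Phi_B\) over time allow the calculator to approximate the derivative numerically and compare it against the measured EMF.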

  17. Minimal abdominal incisions

    Directory of Open Access Journals (Sweden)

    João Carlos Magi

    2017-04-01

    Minimally invasive procedures aim to resolve the disease with minimal trauma to the body, resulting in a rapid return to activities and in reductions of infection, complications, costs and pain. Minimally incised laparotomy, sometimes referred to as minilaparotomy, is an example of such minimally invasive procedures. The aim of this study is to demonstrate the feasibility and utility of laparotomy with minimal incision based on the literature, exemplified with a case of reconstruction of the intestinal transit through such an incision: a young, HIV-positive male patient in the late postoperative period of ileotiflectomy, terminal ileostomy and closure of the ascending colon for an acute perforated abdomen due to ileocolonic tuberculosis. The barium enema showed a proximal stump of the right colon near the ileostomy. Access to the cavity was gained through the orifice left by the release of the stoma, with a side-to-side ileocolonic anastomosis using a 25 mm circular stapler and manual closure of the ileal stump. These surgeries require their own tactics, such as rigour in the lysis of adhesions, tissue traction and haemostasis, in addition to requiring surgeon dexterity, but without the need for investments in technology; moreover, the learning curve is reported as being lower than that for videolaparoscopy. Laparotomy with minimal incision should be considered a valid and viable option in the treatment of surgical conditions.

  18. Development of educational system for nuclear power plant operators using knowledge processing techniques

    International Nuclear Information System (INIS)

    Yoshimura, Seiichi

    1990-01-01

    It is important to carry out effective education, optimally adapted to the operator's knowledge level, to enhance the operator's ability to deal with abnormal situations. This paper outlines an educational system that realizes such education using knowledge-processing techniques. The system is composed of three devices: a knowledge-processing computer that evaluates the operator's knowledge level and presents educational materials optimally adapted to it; a computer for displaying transients and plant equipment; and a computer for voice input and output. The educational materials utilize cause-and-effect relationships, which make effective education possible by pointing out the parts the operator failed to understand. An evaluation test was performed with several tens of operators actually operating the system, after which impressions were gathered by questionnaire. As a result, the cause-and-effect relationships were shown to be useful for understanding the transients, and the contents of the educational materials and the display pictures were also deemed to have practical value. (author)
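    A hypothetical sketch of how such cause-and-effect relationships might be represented and traversed to point out the intermediate steps an operator failed to connect; the event names, graph structure, and function explain_chain are illustrative assumptions, not taken from the described system.

    ```python
    # Hypothetical cause-and-effect representation for plant transients:
    # a directed graph of events, traversed to reconstruct the causal
    # chain between an initiating event and an observed symptom.

    CAUSE_EFFECT = {
        "loss of feedwater": ["steam generator level drops"],
        "steam generator level drops": ["reduced heat removal"],
        "reduced heat removal": ["primary coolant temperature rises"],
        "primary coolant temperature rises": ["pressurizer pressure rises"],
    }


    def explain_chain(cause, effect, graph=CAUSE_EFFECT, path=None):
        """Return the chain of events linking cause to effect via
        depth-first search, or None if no causal path exists."""
        path = (path or []) + [cause]
        if cause == effect:
            return path
        for nxt in graph.get(cause, []):
            found = explain_chain(nxt, effect, graph, path)
            if found:
                return found
        return None


    chain = explain_chain("loss of feedwater", "pressurizer pressure rises")
    print(" -> ".join(chain))
    ```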

  19. Inference with minimal Gibbs free energy in information field theory

    International Nuclear Information System (INIS)

    Ensslin, Torsten A.; Weig, Cornelius

    2010-01-01

    Non-linear and non-Gaussian signal inference problems are difficult to tackle. Renormalization techniques permit us to construct good estimators for the posterior signal mean within information field theory (IFT), but the approximations and assumptions made are not very obvious. Here we introduce the simple concept of minimal Gibbs free energy to IFT, and show that previous renormalization results emerge naturally. They can be understood as being the Gaussian approximation to the full posterior probability, which has maximal cross information with it. We derive optimized estimators for three applications, to illustrate the usage of the framework: (i) reconstruction of a log-normal signal from Poissonian data with background counts and point spread function, as it is needed for gamma ray astronomy and for cosmography using photometric galaxy redshifts, (ii) inference of a Gaussian signal with unknown spectrum, and (iii) inference of a Poissonian log-normal signal with unknown spectrum, the combination of (i) and (ii). Finally we explain how Gaussian knowledge states constructed by the minimal Gibbs free energy principle at different temperatures can be combined into a more accurate surrogate of the non-Gaussian posterior.
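    In outline (following standard information field theory notation; the precise conventions are assumed here, not quoted from the paper), the construction minimizes the Gibbs free energy of a Gaussian trial distribution:

    \[
    G[m, D] \;=\; \big\langle H(d,s) \big\rangle_{\mathcal{G}(s-m,\,D)} \;-\; T\,S_{\mathcal{G}},
    \qquad
    H(d,s) \;\equiv\; -\ln P(d,s),
    \]

    where \(\mathcal{G}(s-m, D)\) is a Gaussian with mean \(m\) and covariance \(D\), \(S_{\mathcal{G}}\) is its Boltzmann entropy, and minimizing \(G\) with respect to \(m\) and \(D\) at temperature \(T = 1\) yields the Gaussian approximation with maximal cross information with the full posterior.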

  20. Effective Fault-Tolerant Quantum Computation with Slow Measurements

    International Nuclear Information System (INIS)

    DiVincenzo, David P.; Aliferis, Panos

    2007-01-01

    How important is fast measurement for fault-tolerant quantum computation? Using a combination of existing and new ideas, we argue that measurement times as long as even 1000 gate times or more have a very minimal effect on the quantum accuracy threshold. This shows that slow measurement, which appears to be unavoidable in many implementations of quantum computing, poses no essential obstacle to scalability.