WorldWideScience

Sample records for human computer debating

  1. Technological Imperatives: Using Computers in Academic Debate.

    Science.gov (United States)

    Ticku, Ravinder; Phelps, Greg

    Intended for forensic educators and debate teams, this document details how one university debate team, at the University of Iowa, makes use of computer resources on campus to facilitate storage and retrieval of information useful to debaters. The introduction notes the problem of storing and retrieving the amount of information required by debate…

  2. Human Computation

    CERN Document Server

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  3. Human embryonic stem cell research debates: a confucian argument.

    Science.gov (United States)

    Tsai, D F-C

    2005-11-01

    Human embryonic stem cell research can bring about major biomedical breakthroughs and thus contribute enormously to human welfare, yet it raises serious moral problems because it involves using human embryos for experiment. The "moral status of the human embryo" remains the core of such debates. Three different positions regarding the moral status of the human embryo can be categorised: the "all" position, the "none" position, and the "gradualist" position. The author proposes that the "gradualist" position is more plausible than the other two positions. Confucius's moral principle of jen, which proposes a unique theory of "love of gradation", and the principle of yi, which advocates "due treatment for persons", are then explored. The author then argues that our moral obligations to do good to other living organisms, persons, and our families are different. Putting together the "gradualist" position on the human embryo, and Confucius's theories of "love of gradation" and "due treatment for persons", the author concludes that the early embryo has less ethical significance than the later fetus and adult human. The moral obligation we have toward persons is clearer and stronger than that which we have toward human embryos. Embryo research is justifiable if it brings enormous welfare to human persons that cannot be otherwise achieved. The "love of gradation" requires us, however, to extend love and respect towards other entities according to their different status. We should therefore be very cautious in using human embryos for research, acknowledging the gradualist nature of their moral status.

  4. Procreative liberty, enhancement and commodification in the human cloning debate.

    Science.gov (United States)

    Shapshay, Sandra

    2012-12-01

    The aim of this paper is to scrutinize a contemporary standoff in the American debate over the moral permissibility of human reproductive cloning in its prospective use as a eugenic enhancement technology. I shall argue that there is some significant and under-appreciated common ground between the defenders and opponents of human cloning. Champions of the moral and legal permissibility of cloning support the technology based on the right to procreative liberty provided it were to become as safe as in vitro fertilization and that it be used only by adults who seek to rear their clone children. However, even champions of procreative liberty oppose the commodification of cloned embryos, and, by extension, the resulting commodification of the cloned children who would be produced via such embryos. I suggest that a Kantian moral argument against the use of cloning as an enhancement technology can be shown to be already implicitly accepted to some extent by champions of procreative liberty on the matter of commodification of cloned embryos. It is in this argument against commodification that the most vocal critics of cloning such as Leon Kass and defenders of cloning such as John Robertson can find greater common ground. Thus, I endeavor to advance the debate by revealing a greater degree of moral agreement on some fundamental premises than hitherto recognized.

  5. Computing Education in Children's Early Years: A Call for Debate

    Science.gov (United States)

    Manches, Andrew; Plowman, Lydia

    2017-01-01

    International changes in policy and curricula (notably recent developments in England) have led to a focus on the role of computing education in the early years. As interest in the potential of computing education has increased, there has been a proliferation of programming tools designed for young children. While these changes are broadly to be…

  6. Human Dispersal Out of Africa: A Lasting Debate.

    Science.gov (United States)

    López, Saioa; van Dorp, Lucy; Hellenthal, Garrett

    2015-01-01

    Unraveling the first migrations of anatomically modern humans out of Africa has invoked great interest among researchers from a wide range of disciplines. Available fossil, archeological, and climatic data offer many hypotheses, and as such genetics, with the advent of genome-wide genotyping and sequencing techniques and an increase in the availability of ancient samples, offers another important tool for testing theories relating to our own history. In this review, we report the ongoing debates regarding how and when our ancestors left Africa, how many waves of dispersal there were and what geographical routes were taken. We explore the validity of each, using current genetic literature coupled with some of the key archeological findings.

  7. An Interdisciplinary Bibliography for Computers and the Humanities Courses.

    Science.gov (United States)

    Ehrlich, Heyward

    1991-01-01

    Presents an annotated bibliography of works related to the subject of computers and the humanities. Groups items into textbooks and overviews; introductions; human and computer languages; literary and linguistic analysis; artificial intelligence and robotics; social issue debates; computers' image in fiction; anthologies; writing and the…

  8. Parliamentary cultures and human embryos: the Dutch and British debates compared

    NARCIS (Netherlands)

    Kirejczyk, Marta

    1999-01-01

Twenty years ago, the technology of in vitro fertilization created a new artefact: the human embryo outside the woman's body. In many countries, political debates developed around this artefact. One of the central questions in these debates is whether it is permissible to use human embryos in resear…

  9. Methodological debates in human rights research: a case study of human trafficking in South Africa

    NARCIS (Netherlands)

    Vigneswaran, D.

    2012-01-01

Debates over human trafficking are riddled with methodological dilemmas. Agencies with vested interests in the anti-trafficking agenda advance claims about numbers of victims, level of organized trafficking and scale of exploitation, but with limited data and using questionable techniques. Skeptics…

  10. Ubiquitous Human Computing

    OpenAIRE

    Zittrain, Jonathan L.

    2008-01-01

    Ubiquitous computing means network connectivity everywhere, linking devices and systems as small as a thumb tack and as large as a worldwide product distribution chain. What could happen when people are so readily networked? This short essay explores issues arising from two possible emerging models of ubiquitous human computing: fungible networked brainpower and collective personal vital sign monitoring.

  11. When computers were human

    CERN Document Server

    Grier, David Alan

    2013-01-01

Before Palm Pilots and iPods, PCs and laptops, the term "computer" referred to the people who did scientific calculations by hand. These workers were neither calculating geniuses nor idiot savants but knowledgeable people who, in other circumstances, might have become scientists in their own right. When Computers Were Human represents the first in-depth account of this little-known, 200-year epoch in the history of science and technology. Beginning with the story of his own grandmother, who was trained as a human computer, David Alan Grier provides a poignant introduction to the wider wo…

  12. Computational human body models

    NARCIS (Netherlands)

    Wismans, J.S.H.M.; Happee, R.; Dommelen, J.A.W. van

    2005-01-01

Computational human body models are widely used for automotive crash-safety research and design and as such have significantly contributed to a reduction of traffic injuries and fatalities. Currently crash simulations are mainly performed using models based on crash dummies. However, crash dummies dif…

  14. A Computational Model of Self-Efficacy's Various Effects on Performance: Moving the Debate Forward.

    Science.gov (United States)

    Vancouver, Jeffrey B; Purl, Justin D

    2016-12-19

Self-efficacy, which is one's belief in one's capacity, has been found to both positively and negatively influence effort and performance. The reasons for these different effects have been a major topic of debate among social-cognitive and perceptual control theorists. In particular, work on the various self-efficacy effects has been motivated by a perceptual control theory view of self-regulation that social-cognitive theorists question. To provide more clarity to the theoretical arguments, a computational model of the multiple processes presumed to create the positive, negative, and null effects of self-efficacy is presented. Building on an existing computational model of goal choice that produces a positive effect for self-efficacy, the current article adds a symbolic processing structure used during goal striving that explains the negative self-efficacy effect observed in recent studies. Moreover, the multiple processes, operating together, allow the model to recreate the various effects found in a published study of feedback ambiguity's moderating role on the self-efficacy-to-performance relationship (Schmidt & DeShon, 2010). Discussion focuses on the implications of the model for the self-efficacy debate, alternative computational models, the overlap between control theory and social-cognitive theory explanations, the value of using computational models for resolving theoretical disputes, and future research directions the model inspires.

  15. Handbook of human computation

    CERN Document Server

    Michelucci, Pietro

    2013-01-01

This volume addresses the emerging area of human computation. The chapters, written by leading international researchers, explore existing and future opportunities to combine the respective strengths of both humans and machines in order to create powerful problem-solving capabilities. The book bridges scientific communities, capturing and integrating the unique perspective and achievements of each. It coalesces contributions from industry and across related disciplines in order to motivate, define, and anticipate the future of this exciting new frontier in science and cultural evolution. Reade…

  16. New perspectives in the debate about human nature

    Directory of Open Access Journals (Sweden)

    Alfredo Marcos

    2016-02-01

In a previous article, entitled Philosophy of Human Nature, I stated my criticism of the denial of human nature, of its complete naturalization, and of its complete artificialization. Here I give a brief summary of these arguments (section 1). I then try to deepen the criticism in dialogue with several contemporary authors who mark a crucial change in current perspectives on human nature (section 2). Taken together, these authors present a lucid review of radical naturalism. They also suggest a new and more accurate theory of human nature. I focus especially on the contributions of Thomas Nagel, because I believe that he lays the foundation for a possible constructive interchange on human nature between theistic and non-theistic positions (section 3). This dialogue should be oriented, as I argue in the concluding section, by the guidelines of a critical common sense.

  17. Childbirth care: contributing to the debate on human development.

    Science.gov (United States)

    Garcia de Lima Parada, Cristina Maria; Leite Carvalhaes, Maria Antonieta de Barros

    2007-01-01

This study aimed to evaluate care during childbirth and neonatal development in the interior of São Paulo in order to support managers responsible for formulating public policies on human development and allocating public resources to women's healthcare. This epidemiological study focused on the evaluation of health services, based on observation of the assistance delivered by the Single Health System in 12 maternities and 134 deliveries. Brazilian Health Ministry and World Health Organization standards were adopted for comparison. The results revealed problems related to the structure of some maternities: some well-proven practices in normal childbirth are still little used, whereas other harmful or ineffective ones are used routinely. Reversing this picture is essential in order to offer humanized, quality care to women, with consequent reductions in maternal and neonatal mortality rates, so that the region achieves the millennium goals established for improving human development.

  18. Theories of Human Evolution. A Century of Debate, 1844-1944.

    Science.gov (United States)

    Bowler, Peter J.

    The question of human origin has always been disputed by evolution theorists. This book provides a comprehensive survey of the debates over human evolution from the time of Darwin to the 1940s. Part 1 discusses the early controversies, noting that they focused on philosophical issues rather than causes or details of the evolutionary process. A…

  19. Euthanasia and international human rights law: prolegomena for an international debate.

    Science.gov (United States)

    Van den Akker, B; Janssens, R M; Ten Have, H A

    1997-10-01

In this paper we examine in what respects international human rights law can provide a basis for the establishment of an international debate on euthanasia. Such a debate seems imperative, as in many countries euthanasia is considered taboo in the context of medical practice, yet at the same time, supposedly, decisions are taken to intentionally shorten patients' lives. In the Netherlands, the act of euthanasia will not lead to the prosecution of the physician involved if the physician has complied with certain procedures. The Dutch debate centres on procedures, marginalizing important moral aspects of euthanasia. An international debate, addressing the fundamental morality of euthanasia and of other medical decisions involving the end of life, will eventually enhance medical practice in the Netherlands as well as in other countries.

  20. Study duration and earnings: A test in relation to the human capital versus screening debate

    NARCIS (Netherlands)

    H. Oosterbeek

    1992-01-01

In this paper we propose a simple test in relation to the human capital versus screening debate. It is argued that these theories lead to different predictions with respect to the earnings effects of deviations between actual and nominal durations of a study. Earnings and study duration equations ar…

  1. Bringing a European perspective to the health human resources debate: a scoping study.

    NARCIS (Netherlands)

    Kuhlmann, E.; Batenburg, R.; Groenewegen, P.P.; Larsen, C.

    2013-01-01

Healthcare systems across the world are increasingly challenged by workforce shortages and misdistribution of skills. Yet, no comprehensive European approach to health human resources (HHR) policy exists and action remains fragmented. This scoping study seeks to contribute to the debates by providin…

  2. Thinking and Caring about Indigenous Peoples' Human Rights: Swedish Students Writing History beyond Scholarly Debate

    Science.gov (United States)

    Nygren, Thomas

    2016-01-01

    According to national and international guidelines, schools should promote historical thinking and foster moral values. Scholars have debated, but not analysed in depth in practice, whether history education can and should hold a normative dimension. This study analyses current human rights education in two Swedish senior high school groups, in…

  4. Visualizing Humans by Computer.

    Science.gov (United States)

    Magnenat-Thalmann, Nadia

    1992-01-01

    Presents an overview of the problems and techniques involved in visualizing humans in a three-dimensional scene. Topics discussed include human shape modeling, including shape creation and deformation; human motion control, including facial animation and interaction with synthetic actors; and human rendering and clothing, including textures and…

  5. Evaluation the Impact of Human Interaction/Debate on Online News to Improve User Interfaces for Debate Applications

    Directory of Open Access Journals (Sweden)

    Abdulrahman Alqahtani

    2016-01-01

Many people trust online comments on news as much as personal recommendations [1], [2]. In this paper, we analyze the impact of comments on online news in order to evaluate threading models for electronic debates, using online surveys. Based on the results of our online survey of 500 participants, we evaluate whether forums with comments on online news are appropriate for the study of debates. In particular, we verify whether the nature of discussions around news is argumentative and whether participants expect to engage in multiple rounds of argument. We present the DirectDemocracyP2P application as a user interface for decentralized debates, and we evaluate and analyze the comments collected from the online surveys in order to improve the DirectDemocracyP2P application. We also verify whether the comments commonly submitted around news go beyond the simple advertisement of one's own merchandise and attacks on competitors, into fair reviews of news features and quality.

  6. The pros and cons of human therapeutic cloning in the public debate.

    Science.gov (United States)

    Nippert, Irmgard

    2002-09-11

Few issues linked to genetic research have raised as much controversial debate as the use of somatic cell nuclear transfer technology to create embryos specifically for stem cell research. Whereas European countries unanimously agree that reproductive cloning should be prohibited, there is no agreement to be found on whether or not research into therapeutic cloning should be permitted. Since the UK took the lead and voted in favour of regulations allowing therapeutic cloning, the public debate has intensified on the Continent. This debate reflects the wide spectrum of diverse religious and secular moralities that are prevalent in modern multicultural European democratic societies. Arguments range from putting forward strictly utilitarian views that weigh the moral issues involved against the potential benefits that embryonic stem cell research may harbour, to considering the embryo a human being, endowed with human dignity and human rights from the moment of its creation, and concluding that its use for research is unethical and should be strictly prohibited. Given the current state of dissension among the various European states, it is difficult to predict whether 'non-harmonisation' will prevail or whether, in the long run, 'harmonisation' of legislation allowing stem cell research will evolve in the EU.

  7. Minimal mobile human computer interaction

    NARCIS (Netherlands)

    el Ali, A.

    2013-01-01

In the last 20 years, the widespread adoption of personal, mobile computing devices in everyday life, has allowed entry into a new technological era in Human Computer Interaction (HCI). The constant change of the physical and social context in a user's situation made possible by the portability of m…

  8. Are animal models useful or confusing in understanding the human feto-maternal relationship? A debate.

    Science.gov (United States)

    Chaouat, Gérard; Clark, David A

    2015-04-01

The proposition "This house agrees that the proper study of man is woman" was debated. For those negating the proposition, the alternative was that "animal models are useful in understanding the human feto-maternal relationship." Evidence for the proposition emphasized molecular and structural differences between the human and animal placenta and placentation. Evidence against the proposition and in favor of the alternative focused on functional and structural homologies, emphasizing that different molecules could be used in humans to achieve similar functional effects seen in animal (e.g., mouse) models. It was agreed that one always needed to test the validity of animal data by studying humans. The advantages and limitations of animal models were discussed.

  9. The question of disability in the post-human debate. Critical remarks.

    Science.gov (United States)

    Gatti, Chiara

    2014-01-01

The issue of disability represents a test case for the practical and theoretical sustainability of transhumanist theories that lead to the advent of a posthuman era. In fact, dealing with mankind implies also the possibility that a person has a disability. Seeing whether persons with disability are respected in the posthuman debate will therefore show us whether every human being is respected. In this paper we start by analyzing the definition of disability given by the posthuman theorists. As we will see, this definition is deficient because it is strictly linked with the transhumanists' refusal of the distinction between therapeutic treatment and enhancement. The field of enhancement is very wide, and moral judgment on it cannot be generalized. Nowadays, many developments made possible by human enhancement theories remain speculative. However, those theories are already influential in the study of the beginning of life. Indeed, the possibility "to choose children" is real: here the issue of disability is decisive and the risk of discrimination is very high. Looking at the issue of disability will thus allow us to explore the ethical weight of the post-human project. In the background, it will be possible to glimpse the question of what the essence of man is, an issue not sufficiently considered in the post-human debate. On the contrary, it is a fundamental question which should be answered before proceeding to a substantial alteration of humanity.

  10. Making IBM's Computer, Watson, Human

    Science.gov (United States)

    Rachlin, Howard

    2012-01-01

This essay uses the recent victory of an IBM computer (Watson) in the TV game show Jeopardy to speculate on the abilities Watson would need, in addition to those it has, to be human. The essay's basic premise is that to be human is to behave as humans behave and to function in society as humans function. Alternatives to this premise are considered and rejected. The viewpoint of the essay is that of teleological behaviorism. Mental states are defined as temporally extended patterns of overt behavior. From this viewpoint (although Watson does not currently have them), essential human attributes such as consciousness, the ability to love, to feel pain, to sense, to perceive, and to imagine may all be possessed by a computer. Most crucially, a computer may possess self-control and may act altruistically. However, the computer's appearance, its ability to make specific movements, its possession of particular internal structures (e.g., whether those structures are organic or inorganic), and the presence of any nonmaterial "self," are all incidental to its humanity. PMID:22942530

  11. Humans, computers and wizards human (simulated) computer interaction

    CERN Document Server

    Fraser, Norman; McGlashan, Scott; Wooffitt, Robin

    2013-01-01

    Using data taken from a major European Union funded project on speech understanding, the SunDial project, this book considers current perspectives on human computer interaction and argues for the value of an approach taken from sociology which is based on conversation analysis.

  12. The misuse of Kant in the debate about a market for human body parts.

    Science.gov (United States)

    Gerrand, N

    1999-01-01

    Passages from the writings of Immanuel Kant concerning how a person should treat her body are often cited in the present-day debate about a market for human body parts. In this paper, I demonstrate that this has been a misuse of Kant because unlike those who cite him, Kant was not primarily concerned with prohibiting the sale of body parts. In the first section, I argue that once these particular passages are understood against the background of Kant's moral philosophy, they indicate he had much broader concerns relating to the correct moral relationship a rational person should have with her body. In the second section, I examine Stephen Munzer's unusually detailed analysis of these passages, but conclude that like those who have provided less detailed analyses, he also fails fully to understand the rationale for Kant's various prescriptions and prohibitions concerning the treatment of human body parts, and in doing so misrepresents Kant's position.

  13. Human-computer interface design

    Energy Technology Data Exchange (ETDEWEB)

    Bowser, S.E.

    1995-04-01

Modern military forces assume that computer-based information is reliable, timely, available, usable, and shared. The importance of computer-based information is based on the assumption that "shared situation awareness, coupled with the ability to conduct continuous operations, will allow information age armies to observe, decide, and act faster, more correctly and more precisely than their enemies" (Sullivan and Dubik 1994). Human-Computer Interface (HCI) design standardization is critical to the realization of the previously stated assumptions. Given that a key factor of a high-performance, high-reliability system is an easy-to-use, effective design of the interface between the hardware, software, and the user, it follows logically that the interface between the computer and the military user is critical to the success of the information-age military. The proliferation of computer technology has resulted in the development of an extensive variety of computer-based systems and the implementation of varying HCI styles on these systems. To accommodate the continued growth in computer-based systems, minimize HCI diversity, and improve system performance and reliability, the U.S. Department of Defense (DoD) is continuing to adopt interface standards for developing computer-based systems.

  14. Human dignity and the future of the voluntary active euthanasia debate in South Africa

    Directory of Open Access Journals (Sweden)

    Donrich W Jordaan

    2017-05-01

The issue of voluntary active euthanasia was thrust into the public policy arena by the Stransham-Ford lawsuit. The High Court legalised voluntary active euthanasia, although ostensibly only in the specific case of Mr Stransham-Ford. The Supreme Court of Appeal overturned the High Court judgment on technical grounds, not on the merits. This means that in future the courts can be approached again to consider the legalisation of voluntary active euthanasia. As such, Stransham-Ford presents a learning opportunity for both sides of the legalisation divide. In particular, conceptual errors pertaining to human dignity were made in Stransham-Ford, and can be avoided in future. In this article, I identify these errors and propose the following three corrective principles to inform future debate on the subject: (i) human dignity is violable; (ii) human suffering violates human dignity; and (iii) the 'natural' causes of suffering due to terminal illness do not exclude the application of human dignity.

  15. Human papillomavirus vaccination: the policy debate over the prevention of cervical cancer--a commentary.

    Science.gov (United States)

    Hoops, Katherine E M; Twiggs, Leo B

    2008-07-01

    The human papillomavirus (HPV) family causes a variety of benign, premalignant, and malignant lesions in men and women. HPV types 16 and 18 are responsible for causing 70% of all cases of cervical cancer each year. Recently, a vaccine that can prevent cervical cancer by protecting women from infection with the most common types of HPV has been made available. Following Food and Drug Administration approval and endorsement by the Centers for Disease Control and Prevention, it is the right and the duty of the state legislatures to implement vaccination programs. This vaccine, a vaccine for a sexually transmitted disease, has stirred a fierce debate. Religion and sexuality have dominated the discussion, and political calculations are inherent to the process; nonetheless, epidemiological analyses are also essential to the decision to mandate the HPV vaccine. HPV vaccine program implementation processes are at many stages in many states, and programs vary widely. Some provide information to families, whereas others allot a range of funding for voluntary vaccination. Virginia is, thus far, the only state to have enacted a mandate. This article discusses the various programs in place, the proposed legislation, and the debate surrounding the political process.

  16. Programming Anxiety amongst Computing Students--A Key in the Retention Debate?

    Science.gov (United States)

    Connolly, C.; Murphy, E.; Moore, S.

    2009-01-01

    Low retention rates in third-level computing courses, despite continuing research into new and improved computer teaching methods, present a worrying concern. For some computing students learning programming is intimidating, giving rise to lack of confidence and anxiety. The noncognitive domain of anxiety with regard to learning computer…

  17. The Quantum Human Computer (QHC) Hypothesis

    Science.gov (United States)

    Salmani-Nodoushan, Mohammad Ali

    2008-01-01

    This article attempts to suggest the existence of a human computer called Quantum Human Computer (QHC) on the basis of an analogy between human beings and computers. To date, there are two types of computers: Binary and Quantum. The former operates on the basis of binary logic where an object is said to exist in either of the two states of 1 and…

  18. Human ear recognition by computer

    CERN Document Server

    Bhanu, Bir; Chen, Hui

    2010-01-01

    Biometrics deals with recognition of individuals based on their physiological or behavioral characteristics. The human ear is a new feature in biometrics that has several merits over the more common face, fingerprint and iris biometrics. Unlike the fingerprint and iris, it can be easily captured from a distance without a fully cooperative subject, although sometimes it may be hidden by hair, a scarf or jewellery. Also, unlike a face, the ear is a relatively stable structure that does not change much with age or facial expression. "Human Ear Recognition by Computer" is the first book o…

  19. Nicotine and the Developing Human: A Neglected Element in the Electronic Cigarette Debate.

    Science.gov (United States)

    England, Lucinda J; Bunnell, Rebecca E; Pechacek, Terry F; Tong, Van T; McAfee, Tim A

    2015-08-01

    The elimination of cigarettes and other combusted tobacco products in the U.S. would prevent tens of millions of tobacco-related deaths. It has been suggested that the introduction of less harmful nicotine delivery devices, such as electronic cigarettes or other electronic nicotine delivery systems, will accelerate progress toward ending combustible cigarette use. However, careful consideration of the potential adverse health effects from nicotine itself is often absent from public health debates. Human and animal data support that nicotine exposure during periods of developmental vulnerability (fetal through adolescent stages) has multiple adverse health consequences, including impaired fetal brain and lung development, and altered development of cerebral cortex and hippocampus in adolescents. Measures to protect the health of pregnant women and children are needed and could include (1) strong prohibitions on marketing that increase youth uptake; (2) youth access laws similar to those in effect for other tobacco products; (3) appropriate health warnings for vulnerable populations; (4) packaging to prevent accidental poisonings; (5) protection of non-users from exposure to secondhand electronic cigarette aerosol; (6) pricing that helps minimize youth initiation and use; (7) regulations to reduce product addiction potential and appeal for youth; and (8) the age of legal sale. Published by Elsevier Inc.

  20. The question of disability in the post-human debate. Critical remarks

    Directory of Open Access Journals (Sweden)

    Chiara Gatti

    2014-01-01

    Full Text Available The issue of disability represents a test of the practical and theoretical sustainability of the transhumanist theories that point toward the arrival of a post-human era. Indeed, in dealing with humanity one must take into account the possibility that a human being may have a disability. For this reason, determining whether the post-human debate respects persons with disabilities can show us whether it respects every human being. This paper analyses the definition of disability given by post-humanist theorists. It shows that this definition is deficient because it is strictly tied to the transhumanists' rejection of the distinction between therapeutic treatment and enhancement. The field of enhancement is very broad, so no general moral judgment about it can be made. Today, many of the possibilities projected by "human enhancement" remain mere speculation. Nevertheless, these theories are already influential in questions concerning the beginning of life. Indeed, the possibility of "choosing one's children" is real: here the question of disability is decisive and the risk of discrimination is very high. For these reasons, examining the issue of disability allows us to explore the ethical weight of the post-humanist project. In the end, we glimpse the question of the essence of the human being, a question that remains bracketed in the post-humanist debate. It is, on the contrary, a fundamental question that must be answered before proceeding to any substantial alteration of humanity.

  1. Human-centered Computing: Toward a Human Revolution

    OpenAIRE

    Jaimes, Alejandro; Gatica-Perez, Daniel; Sebe, Nicu; Thomas S. Huang

    2007-01-01

    Human-centered computing studies the design, development, and deployment of mixed-initiative human-computer systems. HCC is emerging from the convergence of multiple disciplines that are concerned both with understanding human beings and with the design of computational artifacts.

  2. Computational Techniques of Electromagnetic Dosimetry for Humans

    Science.gov (United States)

    Hirata, Akimasa; Fujiwara, Osamu

    There has been increasing public concern about the adverse health effects of human exposure to electromagnetic fields. This paper reviews the rationale of international safety guidelines for human protection against electromagnetic fields, and then presents computational techniques for conducting dosimetry in anatomically based human body models. Computational examples and remaining problems are also described briefly.

  3. The Social Computer: Combining Machine and Human Computation

    OpenAIRE

    Giunchiglia, Fausto; Robertson, Dave

    2010-01-01

    The social computer is a future computational system that harnesses the innate problem solving, action and information gathering powers of humans and the environments in which they live in order to tackle large scale social problems that are beyond our current capabilities. The hardware of a social computer is supplied by people’s brains and bodies, the environment where they live, including artifacts, e.g., buildings and roads, sensors into the environment, networks and computers; while the ...

  4. Cooperation in human-computer communication

    OpenAIRE

    Kronenberg, Susanne

    2000-01-01

    The goal of this thesis is to simulate cooperation in human-computer communication, modelling the communicative interaction process of agents in natural dialogs in order to provide advanced human-computer interaction in which coherence is maintained between the contributions of both agents, i.e. the human user and the computer. This thesis contributes to certain aspects of understanding and generation and their interaction in the German language. In spontaneous dialogs agents cooperate by the pro...

  5. Debate: Forced Labour, Slavery and Human Trafficking: When do definitions matter?

    Directory of Open Access Journals (Sweden)

    Roger Plant

    2015-09-01

    Full Text Available We can spend a lot of time debating the connections or essential differences between the concepts of trafficking, forced labour, slavery and modern slavery, or slavery-like practices. Some insist that trafficking is a subset of forced labour, others the reverse. The arguments between academics, bureaucracies and even government agencies have often been vitriolic.

  6. Language evolution and human-computer interaction

    Science.gov (United States)

    Grudin, Jonathan; Norman, Donald A.

    1991-01-01

    Many of the issues that confront designers of interactive computer systems also appear in natural language evolution. Natural languages and human-computer interfaces share as their primary mission the support of extended 'dialogues' between responsive entities. Because in each case one participant is a human being, some of the pressures operating on natural languages, causing them to evolve in order to better support such dialogue, also operate on human-computer 'languages' or interfaces. This does not necessarily push interfaces in the direction of natural language - since one entity in this dialogue is not a human, this is not to be expected. Nonetheless, by discerning where the pressures that guide natural language evolution also appear in human-computer interaction, we can contribute to the design of computer systems and obtain a new perspective on natural languages.

  8. Computer Vision Method in Human Motion Detection

    Institute of Scientific and Technical Information of China (English)

    FU Li; FANG Shuai; XU Xin-he

    2007-01-01

    Human motion detection based on computer vision is a frontier research topic that is attracting increasing attention in the field of computer vision research. The wavelet transform is used to sharpen the ambiguous edges in human motion images, and the shadow's effect on image processing is also removed, so that edge extraction can be successfully realized. This is an effective method for research on human motion analysis systems.
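    The wavelet-sharpening idea in this abstract can be sketched in miniature: decompose a signal into low-frequency averages and high-frequency details, amplify the details, and reconstruct. The 1-D Haar transform below is only an illustrative stand-in (the paper's actual wavelet and 2-D processing are not specified); the row of pixel values and the gain parameter are assumptions.

```python
# Illustrative 1-D Haar wavelet "edge sharpening" sketch (not the paper's
# implementation): amplifying detail coefficients steepens edges.

def haar_forward(signal):
    """One level of the Haar transform for an even-length signal."""
    avg = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    det = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return avg, det

def haar_inverse(avg, det):
    """Exact inverse of haar_forward."""
    out = []
    for a, d in zip(avg, det):
        out.extend([a + d, a - d])
    return out

def sharpen(signal, gain=2.0):
    """Amplify the high-frequency (detail) band to emphasise edges."""
    avg, det = haar_forward(signal)
    return haar_inverse(avg, [gain * d for d in det])

row = [10, 10, 10, 40, 80, 80, 80, 80]   # one image row with a blurred edge
print(sharpen(row, gain=1.0))  # gain 1.0 reconstructs the row exactly
print(sharpen(row, gain=2.0))  # gain 2.0 steepens the ramp across the edge
```

    Applying the same transform separably to image rows and columns, with a smoother wavelet, extends the idea to 2-D motion images.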

  9. The Social and Ethical Acceptability of NBICs for Purposes of Human Enhancement: Why Does the Debate Remain Mired in Impasse?

    Science.gov (United States)

    Béland, Jean-Pierre; Patenaude, Johane; Legault, Georges A; Boissy, Patrick; Parent, Monelle

    2011-12-01

    The emergence and development of convergent technologies for the purpose of improving human performance, including nanotechnology, biotechnology, information sciences, and cognitive science (NBICs), open up new horizons in the debates and moral arguments that must be engaged by philosophers who hope to take seriously the question of the ethical and social acceptability of these technologies. This article advances an analysis of the factors that contribute to confusion and discord on the topic, in order to help in understanding why arguments that form a part of the debate between transhumanism and humanism result in a philosophical and ethical impasse: 1. The lack of clarity that emerges from the fact that any given argument deployed (arguments based on nature and human nature, dignity, the good life) can serve as the basis for both the positive and the negative evaluation of NBICs. 2. The impossibility of providing these arguments with foundations that will enable others to deem them acceptable. 3. The difficulty of applying these same arguments to a specific situation. 4. The ineffectiveness of moral argument in a democratic society. The present effort at communication about the difficulties of the argumentation process is intended as a necessary first step towards developing an interdisciplinary response to those difficulties.

  10. Object categorization: computer and human vision perspectives

    National Research Council Canada - National Science Library

    Dickinson, Sven J

    2009-01-01

    .... The result of a series of four highly successful workshops on the topic, the book gathers many of the most distinguished researchers from both computer and human vision to reflect on their experience...

  11. Handling emotions in human-computer dialogues

    CERN Document Server

    Pittermann, Johannes; Minker, Wolfgang

    2010-01-01

    This book presents novel methods to perform robust speech-based emotion recognition at low complexity. It describes a flexible dialogue model to conveniently integrate emotions and other dialogue-influencing parameters in human-computer interaction.

  12. Language and values in the human cloning debate: a web-based survey of scientists and Christian fundamentalist pastors.

    Science.gov (United States)

    Weasel, Lisa H; Jensen, Eric

    2005-04-01

    Over the last seven years, a major debate has arisen over whether human cloning should remain legal in the United States. Given that this may be the 'first real global and simultaneous news story on biotechnology' (Einsiedel et al., 2002, p.313), nations around the world have struggled with the implications of this newly viable scientific technology, which is often also referred to as somatic cell nuclear transfer. Since the successful cloning of Dolly the sheep in 1997, and with increasing media attention paid to the likelihood of a successful human reproductive clone coupled with research suggesting the medical potential of therapeutic cloning in humans, members of the scientific community and Christian fundamentalist leaders have become increasingly vocal in the debate over U.S. policy decisions regarding human cloning (Wilmut, 2000). Yet despite a surfeit of public opinion polls and widespread opining in the news media on the topic of human cloning, there have been no empirical studies comparing the views of scientists and Christian fundamentalists in this debate (see Evans, 2002a for a recent study of opinion polls assessing religion and attitudes toward cloning). In order to further investigate the values that underlie scientists' and Christian fundamentalist leaders' understanding of human cloning, as well as their differential use of language in communicating about this issue, we conducted an open-ended, exploratory survey of practicing scientists in the field of molecular biology and Christian fundamentalist pastors. We then analyzed the responses from this survey using qualitative discourse analysis.
While this was not necessarily a representative sample (in quantitative terms, see Gaskell & Bauer, 2000) of each of the groups and the response rate was limited, this approach was informative in identifying both commonalities between the two groups, such as a focus on ethical concerns about reproductive cloning and the use of scientific terminology, as well

  13. Microscopic computation in human brain evolution.

    Science.gov (United States)

    Wallace, R

    1995-04-01

    When human psychological performance is viewed in terms of cognitive modules, our species displays remarkable differences in computational power. Algorithmically simple computations are generally difficult to perform, whereas optimal routing or "Traveling Salesman" Problems (TSP) of far greater complexity are solved on an everyday basis. It is argued that even "simple" instances of TSP are not purely Euclidian problems in human computations, but involve emotional, autonomic, and cognitive constraints. They therefore require a level of parallel processing not possible in a macroscopic system to complete the algorithm within a brief period of time. A microscopic neurobiological model emphasizing the computational power of excited atoms within the neuronal membrane is presented as an alternative to classical connectionist approaches. The evolution of the system is viewed in terms of specific natural selection pressures driving satisficing computations toward global optimization. The relationship of microscopic computation to the nature of consciousness is examined, and possible mathematical models as a basis for simulation studies are briefly discussed.
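    For contrast with the abstract's claim that humans routinely approximate such tours, a standard nearest-neighbour heuristic (a textbook method, not the paper's neurobiological model) finds an approximate Euclidean tour in O(n²) time; the city coordinates below are an assumed example.

```python
# Greedy nearest-neighbour heuristic for Euclidean TSP (illustrative only).
# At each step, visit the closest unvisited city; ties break on lowest index.
import math

def tour_length(points, order):
    """Total length of the closed tour visiting points in the given order."""
    return sum(math.dist(points[order[i]], points[order[(i + 1) % len(order)]])
               for i in range(len(order)))

def nearest_neighbour(points, start=0):
    unvisited = set(range(len(points))) - {start}
    order = [start]
    while unvisited:
        last = points[order[-1]]
        nxt = min(sorted(unvisited), key=lambda i: math.dist(last, points[i]))
        order.append(nxt)
        unvisited.remove(nxt)
    return order

cities = [(0, 0), (0, 1), (1, 1), (1, 0)]      # corners of a unit square
order = nearest_neighbour(cities)
print(order, tour_length(cities, order))       # [0, 1, 2, 3] 4.0
```

    On a unit square the heuristic happens to find the optimal tour; in general it can be noticeably worse than optimal, which is part of why human near-optimal performance on TSP-like tasks is considered striking.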

  14. Human-Computer Interactions and Decision Behavior

    Science.gov (United States)

    1984-01-01

    software interfaces. The major components of the research program included the Dialogue Management System (DMS) operating environment, the role of...specification; and new methods for modeling, designing, and developing human-computer interfaces based on syntactic and semantic specification. The DMS...achieving communication is language. Accordingly, the transaction model employs a linguistic model consisting of parts that relate computer responses

  15. Human Adaptation to the Computer.

    Science.gov (United States)

    1986-09-01

    Keywords: resistance to change; stress; adaptation to computers. This thesis is a study of resistance to change in human adaptation to the computer, covering the causes of resistance to change, overcoming resistance to change, and specific recommendations to overcome resistance. The greater a person's bewilderment, the greater his resistance will be [Ref. 7:p. 539]. Overcoming man's resistance to change

  16. Contemporary debates on social-environmental conflicts, extractivism and human rights in Latin America

    DEFF Research Database (Denmark)

    Raftopoulos, Malayna

    2017-01-01

    This opening contribution to ‘Social-Environmental Conflicts, Extractivism and Human Rights’ analyses how human rights have emerged as a weapon in the political battleground over the environment as natural resource extraction has become an increasingly contested and politicised form of development. It examines the link between human rights abuses and extractivism, arguing that this new cycle of protests has opened up new political spaces for human rights based resistance. Furthermore, the explosion of socio-environmental conflicts that have accompanied the expansion and politicisation of natural resources has highlighted the different conceptualisations of nature, development and human rights that exist within Latin America. While new human rights perspectives are emerging in the region, mainstream human rights discourses are providing social movements and activists with the legal power to challenge extractivism and critique the current development agenda. However, while the application of human rights discourses can put pressure on governments, it has yielded limited concrete results, largely because the state as a guardian of human rights remains fragile in Latin America and is willing…

  18. Fundamentals of human-computer interaction

    CERN Document Server

    Monk, Andrew F

    1985-01-01

    Fundamentals of Human-Computer Interaction aims to sensitize the systems designer to the problems faced by the user of an interactive system. The book grew out of a course entitled "The User Interface: Human Factors for Computer-based Systems" which has been run annually at the University of York since 1981. This course has been attended primarily by systems managers from the computer industry. The book is organized into three parts. Part One focuses on the user as processor of information with studies on visual perception; extracting information from printed and electronically presented

  19. Deep architectures for Human Computer Interaction

    NARCIS (Netherlands)

    Noulas, A.K.; Kröse, B.J.A.

    2008-01-01

    In this work we present the application of Conditional Restricted Boltzmann Machines in Human Computer Interaction. These provide a well suited framework to model the complex temporal patterns produced from humans in the audio and video modalities. They can be trained in a semisupervised fashion and

  20. Exploring human inactivity in computer power consumption

    Science.gov (United States)

    Candrawati, Ria; Hashim, Nor Laily Binti

    2016-08-01

    Managing computer power consumption has become an important challenge, consistent with a trend in which computer systems are ever more central to modern life while demand for computing power and functionality grows continuously. Unfortunately, previous approaches remain inadequate for the power consumption problem because system workload is unpredictable, driven by unpredictable human behaviour. This stems from a lack of knowledge within the software system, and software self-adaptation is one approach to dealing with this source of uncertainty. Human inactivity is handled by adapting to the behavioural changes of the users. This paper observes human inactivity in computer usage and finds that computer power usage can be reduced if idle periods can be intelligently sensed from user activities. The study introduces the Control, Learn and Knowledge model, which adapts the Monitor, Analyze, Plan, Execute control loop and integrates a Q-learning algorithm to learn human inactivity periods and so minimize computer power consumption. An experiment to evaluate this model was conducted using three case studies with the same activities. The results show that with the proposed model, 5 out of 12 activities exhibited decreased power consumption compared to the others.
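    The control loop described in this abstract can be illustrated with a toy Q-learning sketch. Everything concrete below (the two discretised inactivity states, the two power actions, and the reward values) is an assumption for illustration, not the authors' implementation.

```python
# Toy Q-learning sketch of learning a power policy from user inactivity
# (assumed states/actions/rewards; not the paper's Control-Learn-Knowledge code).
import random

random.seed(0)

STATES = ["short_idle", "long_idle"]   # discretised inactivity periods
ACTIONS = ["stay_on", "sleep"]
ALPHA, GAMMA, EPSILON = 0.5, 0.9, 0.1

def reward(state, action):
    # Hypothetical rewards: sleeping in a long idle period saves power (+1);
    # sleeping in a short one disrupts the user (-1); staying on is neutral.
    if action == "sleep":
        return 1.0 if state == "long_idle" else -1.0
    return 0.0

Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}

for _ in range(500):                        # simulated monitoring episodes
    s = random.choice(STATES)               # observed inactivity bucket
    if random.random() < EPSILON:           # epsilon-greedy exploration
        a = random.choice(ACTIONS)
    else:
        a = max(ACTIONS, key=lambda x: Q[(s, x)])
    r = reward(s, a)
    best_next = max(Q[(s, x)] for x in ACTIONS)
    Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])

policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in STATES}
print(policy)   # learned policy: sleep only when the idle period is long
```

    In a real system the states would come from the Monitor step (measured inactivity durations) and the chosen action would feed the Execute step of the control loop.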

  1. Chimeras, moral status, and public policy: implications of the abortion debate for public policy on human/nonhuman chimera research.

    Science.gov (United States)

    Streiffer, Robert

    2010-01-01

    Researchers are increasingly interested in creating chimeras by transplanting human embryonic stem cells (hESCs) into animals early in development. One concern is that such research could confer upon an animal the moral status of a normal human adult but then impermissibly fail to accord it the protections it merits in virtue of its enhanced moral status. Understanding the public policy implications of this ethical conclusion, though, is complicated by the fact that claims about moral status cannot play an unfettered role in public policy. Arguments like those employed in the abortion debate for the conclusion that abortion should be legally permissible even if abortion is not morally permissible also support, to a more limited degree, a liberal policy on hESC research involving the creation of chimeras.

  2. Reproductive cloning in humans and therapeutic cloning in primates: is the ethical debate catching up with the recent scientific advances?

    Science.gov (United States)

    Camporesi, S; Bortolotti, L

    2008-09-01

    After years of failure, in November 2007 primate embryonic stem cells were derived by somatic cell nuclear transfer, also known as therapeutic cloning. The first embryo transfer for human reproductive cloning purposes was also attempted in 2006, albeit with negative results. These two events force us to think carefully about the possibility of human cloning, which is now much closer to becoming a reality. In this paper we tackle this issue from two sides, first summarising what scientists have achieved so far, then discussing some of the ethical arguments for and against human cloning that are debated in the context of policy making and public consultation. Therapeutic cloning as a means to improve and save lives has uncontroversial moral value. As to human reproductive cloning, we consider and assess some common objections, and find none of them conclusive. We do recognise, though, that there will be problems at the level of policy and regulation that might either impair the implementation of human reproductive cloning or make its accessibility restricted in a way that could become difficult to justify on moral grounds. We suggest using the time still available before human reproductive cloning is attempted successfully to create policies and institutions that can offer clear directives on its legitimate applications on the basis of solid arguments, coherent moral principles, and extensive public consultation.

  3. Can humanization theory contribute to the philosophical debate in public health?

    Science.gov (United States)

    Hemingway, A

    2012-05-01

    This paper will explore the humanization value framework for research, policy and practice with regard to its relevance for public health, specifically the reduction of inequities in health. This proposed framework introduces humanizing values to influence research, policy and practice. The framework is articulated through eight specific constituents of what it is to be human. These dimensions are articulated as humanizing and dehumanizing dimensions that have the potential to guide both research and practice. The paper will then go on to consider these dimensions in relation to the emergent qualities of the potential 'fifth-wave' of public health intervention. The humanization dimensions outlined in this paper were presented as emerging from Husserl's notion of lifeworld, Heidegger's contemplations about human freedom and being with others, and Merleau-Ponty's ideas about body subject and body object. Husserl's ideas about the dimensions that make up 'lifeworld', such as embodiment, temporality and spatiality, underpin the suggested dimensions of what it is to be human. They are proposed in the paper as together informing a value base for considering the potentially humanizing and dehumanizing elements in systems and interactions. It is then proposed that such a framework is useful when considering methods in public health, particularly in relation to developing new knowledge of what is both humanizing and dehumanizing within research and practice.

  4. From Human Nature to Moral Judgments : Reframing Debates about Disability and Enhancement

    NARCIS (Netherlands)

    Harnacke, C.E.

    2015-01-01

    My goal in my dissertation is to develop an account of how a theory of human nature should be integrated into bioethics and to show what bioethics can gain from using this account. I explore the relevance of human nature for moral argumentation, and especially for bioethics. Thereby, I focus on deba

  5. CORPORATIONS AND HUMAN RIGHTS: THE DEBATE BETWEEN VOLUNTARISTS AND OBLIGATIONISTS AND THE UNDERMINING EFFECT OF SANCTIONS

    National Research Council Canada - National Science Library

    Leandro Martins Zanitelli

    2011-01-01

    .... Drawing on research on the undermining effect of sanctions, the article discusses the risk of such an effect should the method of promoting respect for human rights advocated by obligationists be applied, i.e. through regulation...

  6. Upper Pleistocene Human Dispersals out of Africa: A Review of the Current State of the Debate

    OpenAIRE

    Amanuel Beyin

    2011-01-01

    Although there is a general consensus on African origin of early modern humans, there is disagreement about how and when they dispersed to Eurasia. This paper reviews genetic and Middle Stone Age/Middle Paleolithic archaeological literature from northeast Africa, Arabia, and the Levant to assess the timing and geographic backgrounds of Upper Pleistocene human colonization of Eurasia. At the center of the discussion lies the question of whether eastern Africa alone was the source of Upper Plei...

  7. Human Computer Interaction: An intellectual approach

    Directory of Open Access Journals (Sweden)

    Kuntal Saroha

    2011-08-01

    Full Text Available This paper discusses the research that has been done in the field of Human Computer Interaction (HCI) relating to human psychology. Human-computer interaction (HCI) is the study of how people design, implement, and use interactive computer systems and how computers affect individuals, organizations, and society. This encompasses not only ease of use but also new interaction techniques for supporting user tasks, providing better access to information, and creating more powerful forms of communication. It involves input and output devices and the interaction techniques that use them; how information is presented and requested; how the computer's actions are controlled and monitored; all forms of help, documentation, and training; the tools used to design, build, test, and evaluate user interfaces; and the processes that developers follow when creating interfaces.

  8. Upper Pleistocene Human Dispersals out of Africa: A Review of the Current State of the Debate

    Science.gov (United States)

    Beyin, Amanuel

    2011-01-01

    Although there is a general consensus on African origin of early modern humans, there is disagreement about how and when they dispersed to Eurasia. This paper reviews genetic and Middle Stone Age/Middle Paleolithic archaeological literature from northeast Africa, Arabia, and the Levant to assess the timing and geographic backgrounds of Upper Pleistocene human colonization of Eurasia. At the center of the discussion lies the question of whether eastern Africa alone was the source of Upper Pleistocene human dispersals into Eurasia or were there other loci of human expansions outside of Africa? The reviewed literature hints at two modes of early modern human colonization of Eurasia in the Upper Pleistocene: (i) from multiple Homo sapiens source populations that had entered Arabia, South Asia, and the Levant prior to and soon after the onset of the Last Interglacial (MIS-5), (ii) from a rapid dispersal out of East Africa via the Southern Route (across the Red Sea basin), dating to ~74–60 kya. PMID:21716744

  11. Human/computer control of undersea teleoperators

    Science.gov (United States)

    Sheridan, T. B.; Verplank, W. L.; Brooks, T. L.

    1978-01-01

    The potential of supervisory controlled teleoperators for accomplishment of manipulation and sensory tasks in deep ocean environments is discussed. Teleoperators and supervisory control are defined, the current problems of human divers are reviewed, and some assertions are made about why supervisory control has potential use to replace and extend human diver capabilities. The relative roles of man and computer and the variables involved in man-computer interaction are next discussed. Finally, a detailed description of a supervisory controlled teleoperator system, SUPERMAN, is presented.

  12. Integrated But Not Whole? Applying an Ontological Account of Human Organismal Unity to the Brain Death Debate.

    Science.gov (United States)

    Moschella, Melissa

    2016-10-01

    As is clear in the 2008 report of the President's Council on Bioethics, the brain death debate is plagued by ambiguity in the use of such key terms as 'integration' and 'wholeness'. Addressing this problem, I offer a plausible ontological account of organismal unity drawing on the work of Hoffman and Rosenkrantz, and then apply that account to the case of brain death, concluding that a brain dead body lacks the unity proper to a human organism, and has therefore undergone a substantial change. I also show how my view can explain hard cases better than one in which biological integration (as understood by Alan Shewmon and the President's Council) is taken to imply ontological wholeness or unity.

  13. Democratizing Human Genome Project Information: A Model Program for Education, Information and Debate in Public Libraries.

    Science.gov (United States)

    Pollack, Miriam

    The "Mapping the Human Genome" project demonstrated that librarians can help whomever they serve in accessing information resources in the areas of biological and health information, whether it is the scientists who are developing the information or a member of the public who is using the information. Public libraries can guide library…

  14. Applying Human Computation Methods to Information Science

    Science.gov (United States)

    Harris, Christopher Glenn

    2013-01-01

    Human Computation methods such as crowdsourcing and games with a purpose (GWAP) have each recently drawn considerable attention for their ability to synergize the strengths of people and technology to accomplish tasks that are challenging for either to do well alone. Despite this increased attention, much of this transformation has been focused on…

  15. Soft Computing in Humanities and Social Sciences

    CERN Document Server

    González, Veronica

    2012-01-01

    The field of Soft Computing in Humanities and Social Sciences is at a turning point. The strong distinction between “science” and “humanities” has been criticized from many fronts and, at the same time, an increasing cooperation between the so-called “hard sciences” and “soft sciences” is taking place in a wide range of scientific projects dealing with very complex and interdisciplinary topics. In the last fifteen years the area of Soft Computing has also experienced a gradual rapprochement with disciplines in the Humanities and Social Sciences, and also with the fields of Medicine, Biology and even the Arts, a phenomenon that had not occurred much in previous years. This book presents a generous sampling of the new and burgeoning field of Soft Computing in Humanities and Social Sciences, bringing together a wide array of authors and subject matters from different disciplines. Some of the contributors of the book belong to the scientific and technical areas of Soft Computing w...

  16. Introduction to human-computer interaction

    CERN Document Server

    Booth, Paul

    2014-01-01

    Originally published in 1989 this title provided a comprehensive and authoritative introduction to the burgeoning discipline of human-computer interaction for students, academics, and those from industry who wished to know more about the subject. Assuming very little knowledge, the book provides an overview of the diverse research areas that were at the time only gradually building into a coherent and well-structured field. It aims to explain the underlying causes of the cognitive, social and organizational problems typically encountered when computer systems are introduced. It is clear and co

  17. Human brain mapping: Experimental and computational approaches

    Energy Technology Data Exchange (ETDEWEB)

    Wood, C.C.; George, J.S.; Schmidt, D.M.; Aine, C.J. [Los Alamos National Lab., NM (US); Sanders, J. [Albuquerque VA Medical Center, NM (US); Belliveau, J. [Massachusetts General Hospital, Boston, MA (US)

    1998-11-01

    This is the final report of a three-year, Laboratory-Directed Research and Development (LDRD) project at the Los Alamos National Laboratory (LANL). This project combined Los Alamos' and collaborators' strengths in noninvasive brain imaging and high performance computing to develop potential contributions to the multi-agency Human Brain Project led by the National Institute of Mental Health. The experimental component of the project emphasized the optimization of spatial and temporal resolution of functional brain imaging by combining: (a) structural MRI measurements of brain anatomy; (b) functional MRI measurements of blood flow and oxygenation; and (c) MEG measurements of time-resolved neuronal population currents. The computational component of the project emphasized development of a high-resolution 3-D volumetric model of the brain based on anatomical MRI, in which structural and functional information from multiple imaging modalities can be integrated into a single computational framework for modeling, visualization, and database representation.

  19. Brain-Computer Interfaces Revolutionizing Human-Computer Interaction

    CERN Document Server

    Graimann, Bernhard; Allison, Brendan

    2010-01-01

    A brain-computer interface (BCI) establishes a direct output channel between the human brain and external devices. BCIs infer user intent via direct measures of brain activity and thus enable communication and control without movement. This book, authored by experts in the field, provides an accessible introduction to the neurophysiological and signal-processing background required for BCI, presents state-of-the-art non-invasive and invasive approaches, gives an overview of current hardware and software solutions, and reviews the most interesting as well as new, emerging BCI applications. The book is intended not only for students and young researchers, but also for newcomers and other readers from diverse backgrounds keen to learn about this vital scientific endeavour.

  20. Computer Simulation of the Beating Human Heart

    Science.gov (United States)

    Peskin, Charles S.; McQueen, David M.

    2001-06-01

    The mechanical function of the human heart couples together the fluid mechanics of blood and the soft tissue mechanics of the muscular heart walls and flexible heart valve leaflets. We discuss a unified mathematical formulation of this problem in which the soft tissue looks like a specialized part of the fluid in which additional forces are applied. This leads to a computational scheme known as the Immersed Boundary (IB) method for solving the coupled equations of motion of the whole system. The IB method is used to construct a three-dimensional Virtual Heart, including representations of all four chambers of the heart and all four valves, in addition to the large arteries and veins that connect the heart to the rest of the circulation. The chambers, valves, and vessels are all modeled as collections of elastic (and where appropriate, actively contractile) fibers immersed in viscous incompressible fluid. Results are shown as a computer-generated video animation of the beating heart.
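    The coupled fluid-structure formulation summarized above has a standard mathematical form in the Immersed Boundary literature; as a reference sketch (written from the general IB formulation, not transcribed from this paper), the fluid obeys the incompressible Navier-Stokes equations with an extra body force spread from the immersed elastic fibers, which in turn move at the local fluid velocity:

    ```latex
    \begin{aligned}
    &\rho\left(\frac{\partial\mathbf{u}}{\partial t} + \mathbf{u}\cdot\nabla\mathbf{u}\right)
      = -\nabla p + \mu\,\Delta\mathbf{u} + \mathbf{f}, \qquad \nabla\cdot\mathbf{u} = 0,\\
    &\mathbf{f}(\mathbf{x},t) = \int \mathbf{F}(s,t)\,\delta\big(\mathbf{x}-\mathbf{X}(s,t)\big)\,ds,\\
    &\frac{\partial\mathbf{X}}{\partial t}(s,t) = \mathbf{u}\big(\mathbf{X}(s,t),t\big)
      = \int \mathbf{u}(\mathbf{x},t)\,\delta\big(\mathbf{x}-\mathbf{X}(s,t)\big)\,d\mathbf{x},
    \end{aligned}
    ```

    where X(s,t) is the configuration of the immersed fibers, F the Lagrangian elastic force density, and the Dirac delta kernel couples the Eulerian (fluid) and Lagrangian (tissue) descriptions.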

  1. The epistemology and ontology of human-computer interaction

    NARCIS (Netherlands)

    Brey, Philip

    2005-01-01

    This paper analyzes epistemological and ontological dimensions of Human-Computer Interaction (HCI) through an analysis of the functions of computer systems in relation to their users. It is argued that the primary relation between humans and computer systems has historically been epistemic: computer

  2. Human-Computer Interaction in Smart Environments

    Directory of Open Access Journals (Sweden)

    Gianluca Paravati

    2015-08-01

    Full Text Available Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.

  3. Latin American debate

    Directory of Open Access Journals (Sweden)

    Mabel Thwaites Rey

    2008-01-01

    Full Text Available Once the cycle of structural adjustment and pro-market, neoliberal state reforms of the 1990s was complete, a new stage began in Latin America. In the context of globalization, classic problems such as development, dependency and the role of the nation state have regained theoretical and practical relevance. These pages review a very rich critical tradition, running from CEPAL's vision of development to "dependency theory" (including the contributions of Marxist and neo-Marxist authors), which has made an important contribution to analyzing the limits and possibilities of the nation state for establishing a space of autonomy in the face of global capitalism. We then see how old debates connect today with new political configurations and experiences in various countries of the region, reintroducing onto the agenda such live questions as development and dependency.

  4. Computer Aided Design in Digital Human Modeling for Human Computer Interaction in Ergonomic Assessment: A Review

    Directory of Open Access Journals (Sweden)

    Suman Mukhopadhyay, Sanjib Kumar Das and Tania Chakraborty

    2012-12-01

    Full Text Available Research in Human-Computer Interaction (HCI) has been enormously successful in the area of computer-aided ergonomics or human-centric design. A perfect fit for people has always been a target of product design. Designers traditionally used anthropometric dimensions for 3D product design, which created a lot of fitting problems when dealing with the complexities of human body shapes. Computer-aided design (CAD), also known as computer-aided design and drafting (CADD), is the computer technology used for design processing and design documentation. CAD has now been used extensively in many applications such as the automotive, shipbuilding and aerospace industries, architectural and industrial design, prosthetics, computer animation for special effects in movies, advertising and technical manuals. As a technology, digital human modeling (DHM) has rapidly emerged as a technology that creates, manipulates and controls human representations and human-machine system scenes on computers for interactive ergonomic design problem solving. DHM promises to profoundly change how products or systems are designed, how ergonomics analysis is performed, how disorders and impairments are assessed, and how therapies and surgeries are conducted. The imperative and emerging need for DHM appears to be consistent with the fact that the past decade has witnessed significant growth in both the software systems offering DHM capabilities and the corporations adopting the technology. The authors dwell at length on how research in DHM has finally brought about an enhanced HCI, in the context of computer-aided ergonomics or human-centric design, and discuss future trends in this context.

  5. The 1988 Electoral Debates and Debate Theory.

    Science.gov (United States)

    Weiler, Michael

    1989-01-01

    Discusses the relationship of debate theory to the 1988 presidential and vice presidential debates. Proposes that the press's involvement retrieves the debates from the category of "joint appearances." Argues that major definitional difficulties are resolved by recognizing the press as one of the adversaries in the debate process. (MM)

  6. Computers vs. Humans in Galaxy Classification

    Science.gov (United States)

    Kohler, Susanna

    2016-04-01

    In this age of large astronomical surveys, one major scientific bottleneck is the analysis of enormous data sets. Traditionally, this task requires human input, but could computers eventually take over? A pair of scientists explore this question by testing whether computers can classify galaxies as well as humans.

    [Figure caption: Examples of disagreement: galaxies that Galaxy Zoo humans classified as spirals with 95% agreement, but the computer algorithm classified as ellipticals with 70% certainty. Most are cases where the computer got it wrong, but not all of them. Adapted from Kuminski et al. 2016]

    Limits of Citizen Science. Galaxy Zoo is an internet-based citizen science project that uses non-astronomer volunteers to classify galaxy images. This is an innovative way to provide more manpower, but it is still only practical for limited catalog sizes. How do we handle the data from upcoming surveys like the Large Synoptic Survey Telescope (LSST), which will produce billions of galaxy images when it comes online?

    In a recent study by Evan Kuminski and Lior Shamir, two computer scientists at Lawrence Technological University in Michigan, a machine-learning algorithm known as Wndchrm was used to classify a dataset of Sloan Digital Sky Survey (SDSS) galaxies into ellipticals and spirals. The authors' goal is to determine whether their algorithm can classify galaxies as accurately as the human volunteers of Galaxy Zoo.

    Automatic Classification. After training their classifier on a small set of spiral and elliptical galaxies, Kuminski and Shamir set it loose on a catalog of ~3 million SDSS galaxies. The classifier first computes a set of 2,885 numerical descriptors (like textures, edges, and shapes) for each galaxy image, and then uses these descriptors to categorize the galaxy as spiral or elliptical.

    [Figure caption: Rate of agreement of the computer classification with human classification (for the Galaxy Zoo superclean subset) for different ranges of computed classification certainties.] For certainties above
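    The descriptor-then-classify pipeline described here can be illustrated with a toy sketch. The two descriptors and the nearest-centroid rule below are hypothetical stand-ins for Wndchrm's 2,885 features and its actual classifier, chosen only to show the shape of the computation:

    ```python
    # Toy version of a descriptor-based galaxy classifier: compute a few
    # numerical descriptors per image, then assign the label of the
    # closest class centroid in descriptor space.

    def descriptors(img):
        """Compute two toy descriptors for a square grayscale image
        (a list of equal-length rows of numbers)."""
        n = len(img)
        total = sum(sum(row) for row in img) or 1.0
        # Descriptor 1: central concentration (fraction of light in the
        # inner half of the frame) -- high for smooth elliptical-like blobs.
        c0, c1 = n // 4, 3 * n // 4
        central = sum(img[i][j] for i in range(c0, c1)
                      for j in range(c0, c1)) / total
        # Descriptor 2: mean absolute horizontal gradient, a crude
        # edge/texture proxy -- high for structured spiral-like images.
        grad = sum(abs(img[i][j + 1] - img[i][j])
                   for i in range(n) for j in range(n - 1)) / (n * (n - 1))
        return (central, grad)

    def nearest_centroid(train, x):
        """Classify descriptor vector x by the nearest class centroid.
        `train` maps label -> list of descriptor vectors."""
        centroids = {}
        for label, vecs in train.items():
            k = len(vecs)
            centroids[label] = tuple(sum(v[d] for v in vecs) / k
                                     for d in range(2))
        return min(centroids,
                   key=lambda lb: sum((x[d] - centroids[lb][d]) ** 2
                                      for d in range(2)))
    ```

    With synthetic images (a centrally concentrated blob versus a high-texture striped pattern), the two descriptors separate the classes cleanly; the real pipeline differs only in scale, not in structure.
    
    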

  7. Human-Computer Interaction The Agency Perspective

    CERN Document Server

    Oliveira, José

    2012-01-01

    Agent-centric theories, approaches and technologies are contributing to enrich interactions between users and computers. This book aims at highlighting the influence of the agency perspective in Human-Computer Interaction through a careful selection of research contributions. Split into five sections (Users as Agents, Agents and Accessibility, Agents and Interactions, Agent-centric Paradigms and Approaches, and Collective Agents), the book covers a wealth of novel, original and fully updated material, and aims: to provide coherent, in-depth and timely material on the agency perspective in HCI; to offer an authoritative treatment of the subject matter by carefully selected authors; to offer balanced and broad coverage of the subject area, including human, organizational and social as well as technological concerns; and to offer hands-on experience through representative case studies and essential design guidelines. The book will appeal to a broad audience of resea...

  8. The moral status of the embryo: the human embryo in the UK Human Fertilisation and Embryology (Research Purposes) Regulation 2001 debate.

    Science.gov (United States)

    Bahadur, G

    2003-01-01

    The use of the embryo in research into birth defects, infertility and the possible therapeutic value of embryonic stem cells, has given rise to vigorous discussion of the ethical, moral and legal status of the embryo. This paper considers the parliamentary debate that surrounded the passing of legislation in the UK in 2000 governing the use of the embryo in research. Underlying disagreement by members of Parliament as to whether embryo research was permissible, were considerable differences regarding when life was thought to begin--whether at the moment of fertilization of the egg, or whether after 14 days, at the time of the beginnings of cell differentiation, and the point after which the embryo can no longer split to form twins. Those who favoured the latter view argued that, while the conceptus might possess a unique genetic formula, it had only the potential for life before 14 days, the development of human life being a gradual and continuous process. They considered it mistaken to accord the embryo full human rights. Those who adopted an opposed standpoint insisted that life was present and actual from the moment of conception and therefore sacrosanct and inviolable. The notion of the pre-embryo, they maintained, merely serves to disguise the embryo's humanity.

  9. Environment Debate

    African Journals Online (AJOL)

    komla

    resource diversity, environmental variability and global influences on local ... these relationship has moved from the biased technocratic objective assessment of ... The environment of a particular human group includes both cultural ... and analysis using a livelihood approach that incorporates local knowledge, perceptions, ...

  10. Debating Taoism

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Scholars resort to "the Way" to strike a balance between man and nature. A forum on how to integrate Taoist philosophy with social reality, and how to build a harmonious world for humanity, was held at the Hengshan Mountain, a scenic site renowned for Taoist culture in

  11. Debating Taoism

    Institute of Scientific and Technical Information of China (English)

    BAI SHI

    2011-01-01

    A forum on how to integrate Taoist philosophy with social reality, and how to build a harmonious world for humanity, was held at the Hengshan Mountain, a scenic site renowned for Taoist culture in central China, from October 23-25.

  12. Affective Learning and the Classroom Debate

    Science.gov (United States)

    Jagger, Suzy

    2013-01-01

    A commonly used teaching method to promote student engagement is the classroom debate. This study evaluates how affective characteristics, as defined in Bloom's taxonomy, were stimulated during debates that took place on a professional ethics module for first year computing undergraduates. The debates led to lively interactive group discussions…

  13. Human computer interaction using hand gestures

    CERN Document Server

    Premaratne, Prashan

    2014-01-01

    Human-computer interaction (HCI) plays a vital role in bridging the 'Digital Divide', bringing people closer to consumer electronics control in the 'lounge'. Keyboards, mice and remotes alienate old and new generations alike from control interfaces. Hand gesture recognition systems bring hope of connecting people with machines in a natural way. This will lead to consumers being able to use their hands naturally to communicate with any electronic equipment in their 'lounge'. This monograph covers state-of-the-art hand gesture recognition approaches and how they evolved from their inception. The author also details his own research in this area over the past 8 years and considers how the future of HCI might turn out. This monograph will serve as a valuable guide for researchers venturing into the world of HCI.

  14. Human Computation An Integrated Approach to Learning from the Crowd

    CERN Document Server

    Law, Edith

    2011-01-01

    Human computation is a new and evolving research area that centers around harnessing human intelligence to solve computational problems that are beyond the scope of existing Artificial Intelligence (AI) algorithms. With the growth of the Web, human computation systems can now leverage the abilities of an unprecedented number of people via the Web to perform complex computation. There are various genres of human computation applications that exist today. Games with a purpose (e.g., the ESP Game) specifically target online gamers who generate useful data (e.g., image tags) while playing an enjoy

  15. Advanced Technologies, Embedded and Multimedia for Human-Centric Computing

    CERN Document Server

    Chao, Han-Chieh; Deng, Der-Jiunn; Park, James; HumanCom and EMC 2013

    2014-01-01

    The theme of HumanCom is focused on the various aspects of human-centric computing for advances in computer science and its applications, while the theme of EMC (Advances in Embedded and Multimedia Computing) is focused on the various aspects of embedded systems, smart grid, cloud and multimedia computing; together they provide an opportunity for academic and industry professionals to discuss the latest issues and progress in these areas. This book therefore includes various theories and practical applications in human-centric computing and in embedded and multimedia computing.

  16. Towards a Human Rights Framework to Advance the Debate on the Role of Private Actors in Education

    Science.gov (United States)

    Aubry, Sylvain; Dorsi, Delphine

    2016-01-01

    Part of the debate on the impact of privatisation in and of education lies in determining against which standards of evidence should the phenomenon be assessed. The questions "what impacts of privatisation in education are we measuring?" and its corollary "what education system do we wish to have?" are crucial to determining…

  17. Computational Models to Synthesize Human Walking

    Institute of Scientific and Technical Information of China (English)

    Lei Ren; David Howard; Laurence Kenney

    2006-01-01

    The synthesis of human walking is of great interest in biomechanics and biomimetic engineering due to its predictive capabilities and potential applications in clinical biomechanics, rehabilitation engineering and biomimetic robotics. In this paper, the various methods that have been used to synthesize human walking are reviewed from an engineering viewpoint. This involves a wide spectrum of approaches, from simple passive walking theories to large-scale computational models integrating the nervous, muscular and skeletal systems. These methods are roughly categorized under four headings: models inspired by the concept of a CPG (Central Pattern Generator), methods based on the principles of control engineering, predictive gait simulation using optimisation, and models inspired by passive walking theory. The shortcomings and advantages of these methods are examined, and future directions are discussed in the context of providing insights into the neural control objectives driving gait and improving the stability of the predicted gaits. Future advancements are likely to be motivated by improved understanding of neural control strategies and the subtle complexities of the musculoskeletal system during human locomotion. It is only a matter of time before predictive gait models become a practical and valuable tool in clinical diagnosis, rehabilitation engineering and robotics.
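    As a minimal illustration of the CPG-inspired category (a toy model with hypothetical parameters, not any specific published controller), two phase oscillators coupled so as to lock in antiphase can generate the alternating left-right rhythm characteristic of gait:

    ```python
    import math

    # Toy CPG: two phase oscillators, each advancing at its natural rate
    # and pulled toward a pi phase difference with its partner, so that
    # the left and right "legs" settle into antiphase (alternating) motion.
    def simulate_cpg(steps=1000, dt=0.001, freq_hz=1.0, coupling=5.0):
        """Forward-Euler integration; returns the (left, right) phase
        trajectory as a list of tuples."""
        omega = 2.0 * math.pi * freq_hz
        phase = [0.0, 2.0]  # start away from the antiphase solution
        history = []
        for _ in range(steps):
            left, right = phase
            dl = omega + coupling * math.sin(right - left - math.pi)
            dr = omega + coupling * math.sin(left - right - math.pi)
            phase = [left + dl * dt, right + dr * dt]
            history.append(tuple(phase))
        return history
    ```

    After a fraction of a second of simulated time, the phase difference converges to pi regardless of the (non-degenerate) initial condition; joint-angle trajectories would then be read out as functions of these phases.
    
    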

  18. 2012 International Conference on Human-centric Computing

    CERN Document Server

    Jin, Qun; Yeo, Martin; Hu, Bin; Human Centric Technology and Service in Smart Space, HumanCom 2012

    2012-01-01

    The theme of HumanCom is focused on the various aspects of human-centric computing for advances in computer science and its applications and provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of human-centric computing. In addition, the conference will publish high quality papers which are closely related to the various theories and practical applications in human-centric computing. Furthermore, we expect that the conference and its publications will be a trigger for further related research and technology improvements in this important subject.

  19. Human-Computer Interaction and Information Management Research Needs

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — In a visionary future, Human-Computer Interaction (HCI) and Information Management (IM) have the potential to enable humans to better manage their lives through the use...

  20. Computational insight into nitration of human myoglobin.

    Science.gov (United States)

    Lin, Ying-Wu; Shu, Xiao-Gang; Du, Ke-Jie; Nie, Chang-Ming; Wen, Ge-Bo

    2014-10-01

    Protein nitration is an important post-translational modification regulating protein structure and function, especially for heme proteins. Myoglobin (Mb) is an ideal protein model for investigating the structure and function relationship of heme proteins. With limited structural information available for nitrated heme proteins from experiments, we herein performed a molecular dynamics study of human Mb with successive nitration of Tyr103, Tyr146, Trp7 and Trp14. We made a detailed comparison of protein motions, intramolecular contacts and internal cavities of nitrated Mbs with that of native Mb. It showed that although nitration of both Tyr103 and Tyr146 slightly alters the local conformation of heme active site, further nitration of both Trp7 and Trp14 shifts helix A apart from the rest of protein, which results in altered internal cavities and forms a water channel, representing an initial stage of Mb unfolding. The computational study provides an insight into the nitration of heme proteins at an atomic level, which is valuable for understanding the structure and function relationship of heme proteins in non-native states by nitration. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Lightness computation by the human visual system

    Science.gov (United States)

    Rudd, Michael E.

    2017-05-01

    A model of achromatic color computation by the human visual system is presented, which is shown to account in an exact quantitative way for a large body of appearance-matching data collected with simple visual displays. The model equations are closely related to those of the original Retinex model of Land and McCann. However, the present model differs in important ways from Land and McCann's theory in that it invokes additional biological and perceptual mechanisms, including contrast gain control, different inherent neural gains for incremental and decremental luminance steps, and two types of top-down influence on the perceptual weights applied to local luminance steps in the display: edge classification and attentional windowing of spatial integration. Arguments are presented to support the claim that these various visual processes must be instantiated by a particular underlying neural architecture. By pointing to correspondences between the architecture of the model and findings from visual neurophysiology, this paper suggests that edge classification involves a top-down gating of neural edge responses in early visual cortex (cortical areas V1 and/or V2), while spatial integration windowing occurs in cortical area V4 or beyond.
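    The edge-integration idea shared with Retinex can be sketched as a toy one-dimensional computation (the gain values below are hypothetical illustrations, not the model's fitted weights): perceived lightness at a target is a weighted sum of log luminance ratios across the edges crossed on a path to it, with decremental steps weighted differently from incremental ones:

    ```python
    import math

    # Toy 1-D edge-integration sketch of a Retinex-style lightness model.
    # Gains are hypothetical, chosen only to illustrate the asymmetry
    # between incremental and decremental luminance steps.
    GAIN_INCREMENT = 1.0   # weight for steps where luminance goes up
    GAIN_DECREMENT = 1.3   # decremental steps weighted more strongly

    def lightness(luminances, start=0, target=None):
        """Sum weighted log-luminance ratios across the edges crossed
        between `start` and `target` in a 1-D luminance profile."""
        if target is None:
            target = len(luminances) - 1
        total = 0.0
        step = 1 if target >= start else -1
        for i in range(start, target, step):
            ratio = math.log(luminances[i + step] / luminances[i])
            gain = GAIN_INCREMENT if ratio > 0 else GAIN_DECREMENT
            total += gain * ratio
        return total
    ```

    For example, the same 40 cd/m^2 patch yields a higher value on a dark surround (`lightness([10.0, 40.0])`) than on a bright one (`lightness([160.0, 40.0])`), the direction of simultaneous contrast.
    
    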

  2. Computational Analysis of Human Blood Flow

    Science.gov (United States)

    Panta, Yogendra; Marie, Hazel; Harvey, Mark

    2009-11-01

    Fluid flow modeling with commercially available computational fluid dynamics (CFD) software is widely used to visualize and predict physical phenomena related to various biological systems. In this presentation, a typical human aorta model was analyzed, assuming the blood flow to be laminar with compliant cardiac muscle wall boundaries. FLUENT, a commercially available finite volume software package, coupled with SolidWorks, a modeling software package, was employed for the preprocessing, simulation and postprocessing of all the models. The analysis mainly consists of a fluid-dynamics analysis, including a calculation of the velocity field and pressure distribution in the blood, and a mechanical analysis of the deformation of the tissue and artery in terms of wall shear stress. A number of other models (e.g. T-branches, angle-shaped) were previously analyzed and their results compared for consistency under similar boundary conditions. The velocities, pressures and wall shear stress distributions achieved in all models were as expected given the similar boundary conditions. A three-dimensional, time-dependent analysis of blood flow, accounting for the effect of body forces with a compliant boundary, was also performed.
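    For reference, the quantities computed in such an analysis follow from the standard incompressible Navier-Stokes equations, with wall shear stress defined from the near-wall velocity gradient (textbook definitions, not equations quoted from this presentation):

    ```latex
    \rho\left(\frac{\partial\mathbf{u}}{\partial t} + \mathbf{u}\cdot\nabla\mathbf{u}\right)
      = -\nabla p + \mu\,\Delta\mathbf{u}, \qquad
    \nabla\cdot\mathbf{u} = 0, \qquad
    \tau_w = \mu\left.\frac{\partial u_t}{\partial n}\right|_{\mathrm{wall}},
    ```

    where u_t is the velocity component tangential to the wall and n the wall-normal coordinate; the pressure and velocity fields come from the first two relations, and the wall shear stress reported in the mechanical analysis from the third.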

  3. Intermediality between Games and Fiction: The “Ludology vs. Narratology” Debate in Computer Game Studies: A Response to Gonzalo Frasca

    Directory of Open Access Journals (Sweden)

    Kokonis Michalis

    2014-12-01

    Full Text Available In the last ten or fourteen years there has been a debate among the so-called ludologists and narratologists in Computer Game Studies as to the best methodological approach for the academic study of electronic games. The aim of this paper is to propose a way out of the dilemma, suggesting that both ludology and narratology can be helpful methodologically. However, there is need for a wider theoretical perspective, that of semiotics, in which both approaches can be operative. The semiotic perspective proposed allows research in the field to focus on the similarities between games and traditional narrative forms (since they share narrativity to a greater or lesser extent) as well as on their differences (they have different degrees of interaction); it will facilitate communication among theorists if we want to understand each other when talking about games and stories, and it will lead to a better understanding of the hybrid nature of the game medium. In this sense the present paper aims to complement Gonzalo Frasca's reconciliatory attempt made a few years back and expand on his proposal.

  4. Cloning: revisiting an old debate.

    Science.gov (United States)

    Verhey, Allen D

    1994-09-01

    The debate about cloning that took place 25 years ago, although directed toward a different sort of cloning, elucidates fundamental issues currently at stake in reproductive technologies and research. Paul Ramsey and Joseph Fletcher were participants in this early debate. The differences between Ramsey and Fletcher about the meaning and sufficiency of freedom, the understanding and weighing of good and evil, the connection between embodiment and personhood, the relationship of humans with nature, and the meaning of parenthood suggest both a broader agenda for the debate about cloning and a cautious move forward in the development of embryo-splitting.

  5. Bajtín en la encrucijada de las ciencias humanas europeas “en crisis”. Revisión de un debate / Bakhtin at the crossroads of the European Human Sciences “in crisis”. Review of a debate

    Directory of Open Access Journals (Sweden)

    Bénédicte Vauthier

    2009-10-01

    Full Text Available In this article, we evaluate the contribution of the Bakhtin Circle to the human sciences. After sketching out the writing context, we consider the implicit dialogue that is established between these authors and some German theoreticians. Bakhtin’s texts of the twenties (Toward a Philosophy of the Act, Author and Hero in Aesthetic Activity, The Problem of Content, Material, and Form in Verbal Art) constitute a consistent set, the true basis of the “aesthetics of verbal creation”. Bakhtin plays a part in the debate between Husserl and Dilthey. Voloshinov’s Marxism and the Philosophy of Language and Freudianism as well as P. Medvedev’s The Formal Method in Literary Scholarship made the philosophical ideas of the young Bakhtin understandable to a larger public. It is, therefore, necessary not to establish a hermeneutic break between texts which make one another clearer.

  6. On the Rhetorical Contract in Human-Computer Interaction.

    Science.gov (United States)

    Wenger, Michael J.

    1991-01-01

    An exploration of the rhetorical contract--i.e., the expectations for appropriate interaction--as it develops in human-computer interaction revealed that direct manipulation interfaces were more likely to establish social expectations. Study results suggest that the social nature of human-computer interactions can be examined with reference to the…

  7. Computer Modeling of Human Delta Opioid Receptor

    Directory of Open Access Journals (Sweden)

    Tatyana Dzimbova

    2013-04-01

    Full Text Available The development of selective agonists of the δ-opioid receptor, as well as models of the interaction of ligands with this receptor, is a subject of increasing interest. In the absence of crystal structures of opioid receptors, 3D homology models with different templates have been reported in the literature. The problem is that these models are not available for widespread use. The aims of our study are: (1) to choose, among recently published crystallographic structures, templates for homology modeling of the human δ-opioid receptor (DOR); (2) to evaluate the models with different computational tools; and (3) to identify the most reliable model based on the correlation between docking data and in vitro bioassay results. The enkephalin analogues used as ligands in this study were previously synthesized by our group, and their biological activity was evaluated. Several models of DOR were generated using different templates. All these models were evaluated with PROCHECK and MolProbity, and the relationship between docking data and in vitro results was determined. The best correlations for the tested models of DOR were found between the efficacy (erel) of the compounds, calculated from in vitro experiments, and the Fitness scoring function from docking studies. A new model of DOR was generated and evaluated by different approaches. This model has a good GA341 value (0.99) from MODELLER and good values from PROCHECK (92.6% of residues in most favored regions) and MolProbity (99.5% in favored regions). Its scoring function correlates (Pearson r = -0.7368, p-value = 0.0097) with the erel of a series of enkephalin analogues calculated from in vitro experiments. This investigation thus suggests a reliable model of DOR. The newly generated model could be used for further in silico experiments, enabling faster and more accurate design of selective and effective ligands for the δ-opioid receptor.
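    The model-selection step described above hinges on computing a Pearson correlation between docking scoring-function values and in vitro efficacies (erel). The sketch below illustrates only that statistical step; the numbers are invented placeholders, not data from the study.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

fitness = [42.1, 55.3, 48.7, 61.0, 39.8, 52.4]   # hypothetical docking Fitness scores
erel    = [0.90, 0.55, 0.70, 0.40, 0.95, 0.60]   # hypothetical in vitro efficacies

r = pearson_r(fitness, erel)
print(round(r, 4))  # strongly negative, like the reported r = -0.7368
```

    A strong negative r, as reported, means higher docking scores track lower measured efficacy on this scale, which is what lets the correlation pick out the most predictive homology model.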

  8. Changing Human-Animal Relationships in Sport: An Analysis of the UK and Australian Horse Racing Whips Debates

    Directory of Open Access Journals (Sweden)

    Raewyn Graham

    2016-05-01

    Full Text Available Changing social values and new technologies have contributed to increasing media attention and debate about the acceptable use of animals in sport. This paper focuses on the use of the whip in thoroughbred horse racing. Those who defend its use argue it is a necessary tool needed for safety, correction and encouragement, and that it does not cause the horse any pain. For those who oppose its use, it is an instrument of cruelty. Media framing is employed to unpack the discourses played out in print and social media in the UK (2011) and Australia (2009) during key periods of the whip debate following the introduction of new whip rules. Media coverage for the period August 2014–August 2015 for both countries is also considered. This paper seeks to identify the perceptions of advocates and opponents of the whip as portrayed in conventional and social media in Australia and the UK, to consider if these perceptions have changed over time, and whose voices are heard in these platforms. This paper contributes to discussions on the impacts that media sites have either in reinforcing existing perspectives or creating new perspectives; and importantly how this impacts on equine welfare.

  9. Changing Human-Animal Relationships in Sport: An Analysis of the UK and Australian Horse Racing Whips Debates.

    Science.gov (United States)

    Graham, Raewyn; McManus, Phil

    2016-05-03

    Changing social values and new technologies have contributed to increasing media attention and debate about the acceptable use of animals in sport. This paper focuses on the use of the whip in thoroughbred horse racing. Those who defend its use argue it is a necessary tool needed for safety, correction and encouragement, and that it does not cause the horse any pain. For those who oppose its use, it is an instrument of cruelty. Media framing is employed to unpack the discourses played out in print and social media in the UK (2011) and Australia (2009) during key periods of the whip debate following the introduction of new whip rules. Media coverage for the period August 2014-August 2015 for both countries is also considered. This paper seeks to identify the perceptions of advocates and opponents of the whip as portrayed in conventional and social media in Australia and the UK, to consider if these perceptions have changed over time, and whose voices are heard in these platforms. This paper contributes to discussions on the impacts that media sites have either in reinforcing existing perspectives or creating new perspectives; and importantly how this impacts on equine welfare.

  10. The inhuman computer/the too-human psychotherapist.

    Science.gov (United States)

    Nadelson, T

    1987-10-01

    There has been an understandable rejection by psychotherapists of any natural language processing (computer/human interaction by means of ordinary language exchange) intended to embrace aspects of psychotherapy. For at least twenty years, however, therapists have experimented with computer programs for specific and general purposes, with reported success. This paper describes some aspects of artificial intelligence used in computer-mediated or computer-assisted therapy and the utility of such efforts in a general reevaluation of human-to-human psychotherapy.

  11. Brain-Computer Interfaces and Human-Computer Interaction

    NARCIS (Netherlands)

    Tan, Desney S.; Nijholt, Anton

    2010-01-01

    Advances in cognitive neuroscience and brain imaging technologies have started to provide us with the ability to interface directly with the human brain. This ability is made possible through the use of sensors that can monitor some of the physical processes that occur within the brain that correspond…

  13. Simulating Human Cognition Using Computational Verb Theory

    Institute of Scientific and Technical Information of China (English)

    YANG Tao

    2004-01-01

    Modeling and simulation of a life system are closely connected to the modeling of cognition, especially for advanced life systems. The primary difference between an advanced life system and a digital computer is that the advanced life system consists of a body with a mind, while a digital computer is only a mind in a formal sense. To model an advanced life system, one needs to ground symbols into a body in which a digital computer is embedded. In this paper, a computational verb theory is proposed as a new paradigm for grounding symbols into the outputs of sensors. On the one hand, a computational verb can preserve the physical "meanings" of the dynamics of sensor data, so that a symbolic system can be used to manipulate physical meanings instead of abstract tokens in the digital computer. On the other hand, the physical meanings of an abstract symbol/token, which is usually the output of a reasoning process in the digital computer, can be restored and fed back to the actuators. Therefore, the computational verb theory bridges the gap between symbols and physical reality from the perspective of dynamic cognition.
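    The grounding idea can be illustrated with a toy sketch: summarize a numeric sensor time series with a verb-like label that preserves its dynamics. This is an illustration of the spirit of the approach only; the function name, thresholds, and labels are hypothetical and are not the paper's actual formalism.

```python
def verbify(samples, tol=0.05):
    """Map a sensor time series to a coarse verb-like label describing its trend."""
    if len(samples) < 2:
        return "hold"
    # Average slope over the series stands in for its dynamics.
    slope = (samples[-1] - samples[0]) / (len(samples) - 1)
    if slope > tol:
        return "rise"
    if slope < -tol:
        return "fall"
    return "hold"

print(verbify([20.0, 20.5, 21.2, 21.9]))  # rise
print(verbify([5.0, 4.6, 4.1, 3.8]))      # fall
print(verbify([1.00, 1.01, 0.99, 1.00]))  # hold
```

    A symbolic reasoner can then manipulate labels like "rise" while each label stays tied to the physical sensor dynamics it came from, which is the gap-bridging the abstract describes.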

  14. Human-Computer Interaction (HCI) in Educational Environments: Implications of Understanding Computers as Media.

    Science.gov (United States)

    Berg, Gary A.

    2000-01-01

    Reviews literature in the field of human-computer interaction (HCI) as it applies to educational environments. Topics include the origin of HCI; human factors; usability; computer interface design; goals, operations, methods, and selection (GOMS) models; command language versus direct manipulation; hypertext; visual perception; interface…

  15. Human-Computer Etiquette Cultural Expectations and the Design Implications They Place on Computers and Technology

    CERN Document Server

    Hayes, Caroline C

    2010-01-01

    Written by experts from various fields, this edited collection explores a wide range of issues pertaining to how computers evoke human social expectations. The book illustrates how socially acceptable conventions can strongly impact the effectiveness of human-computer interactions and how to consider such norms in the design of human-computer interfaces. Providing a complete introduction to the design of social responses to computers, the text emphasizes the value of social norms in the development of usable and enjoyable technology. It also describes the role of socially correct behavior in t

  16. Safety Metrics for Human-Computer Controlled Systems

    Science.gov (United States)

    Leveson, Nancy G; Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  17. Classical Humanism and the Challenge of Modernity. Debates on classical education in Germany c. 1770-1860

    NARCIS (Netherlands)

    van Bommel, S.P.

    2013-01-01

    Classical humanism was a living tradition until far into the nineteenth century. In scholarship, classical (Renaissance) humanism is usually strictly distinguished from so-called ‘neo-humanism,’ which, especially in Germany, reigned supreme at the beginning of the nineteenth century. While most clas

  18. O debate cosmopolitismo x comunitarismo sobre direitos humanos e a esquizofrenia das relações internacionais The cosmopolitanism vs. communitarianism debate on human rights and the schizophrenia of international relations

    Directory of Open Access Journals (Sweden)

    Leonardo Carvalho Braga

    2008-04-01

    Full Text Available Considering the Nation State as the privileged actor in International Relations since the creation of the Westphalian System reveals a congenital schizophrenia. The classical principles of International Relations - self-determination of peoples and non-intervention - suggest, on the one hand, a right of each State to determine itself sovereignly and, on the other, a right of States not to suffer intervention from other States. The first right is exclusionary in nature; the second, inclusive. Only the State itself guarantees its self-determination and thus excludes the others, whereas non-intervention depends on all States respecting it - which includes the others. The debate about human rights in International Relations follows the same logic. Cosmopolitans defend inclusion; communitarians, exclusion. These are mutually exclusive rights, and they make International Relations schizophrenic. Rawls tries to resolve this dilemma with his "Law of Peoples", but fails. The proposal, then, may be to think International Relations from another angle, starting from post-modernism, through which we consider the satisfaction of global human demands that cross the borders created at Westphalia by means of a political formation other than the Nation State.

  19. Rationale awareness for quality assurance in iterative human computation processes

    CERN Document Server

    Xiao, Lu

    2012-01-01

    Human computation refers to the outsourcing of computation tasks to human workers. It offers a new direction for solving a variety of problems and calls for innovative ways of managing human computation processes. The majority of human computation tasks take a parallel approach, whereas the potential of an iterative approach, i.e., having workers iteratively build on each other's work, has not been sufficiently explored. This study investigates whether and how human workers' awareness of previous workers' rationales affects the performance of the iterative approach in a brainstorming task and a rating task. Rather than viewing this work as a conclusive piece, the author believes that this research endeavor is just the beginning of a new research focus that examines and supports meta-cognitive processes in crowdsourcing activities.

  20. Pedagogical Strategies for Human and Computer Tutoring.

    Science.gov (United States)

    Reiser, Brian J.

    The pedagogical strategies of human tutors in problem solving domains are described and the possibility of incorporating these techniques into computerized tutors is examined. GIL (Graphical Instruction in LISP), an intelligent tutoring system for LISP programming, is compared to human tutors teaching the same material in order to identify how the…

  1. Shared resource control between human and computer

    Science.gov (United States)

    Hendler, James; Wilson, Reid

    1989-01-01

    The advantages of an AI system that actively monitors human control of a shared resource (such as a telerobotic manipulator) are presented. A system is described in which a simple AI planning program gains efficiency by monitoring human actions and recognizing when those actions change the system's assumed state of the world. This enables the planner to recognize when an interaction occurs between human actions and system goals, and to maintain up-to-date knowledge of the state of the world. The planner can thus inform the operator when a human action would undo a goal achieved by the system or would render a system goal unachievable, and can efficiently replan the establishment of goals after human intervention.
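    The monitoring idea above can be sketched minimally: the planner keeps an assumed world state as a set of facts and, when it observes an action, updates that state and reports any previously achieved goal the action has undone. The class name, fact strings, and add/delete action format below are illustrative assumptions, not the paper's actual representation.

```python
class SharedResourceMonitor:
    """Toy planner-side monitor for a resource shared between human and computer."""

    def __init__(self, goals):
        self.state = set()          # facts assumed true in the world
        self.goals = set(goals)     # facts the planner wants to be true

    def apply_action(self, adds, deletes):
        """Apply an observed (human or system) action; return any undone goals."""
        adds, deletes = set(adds), set(deletes)
        undone = self.goals & self.state & deletes   # achieved goals this action destroys
        self.state = (self.state - deletes) | adds
        return undone

monitor = SharedResourceMonitor(goals={"arm_docked", "valve_closed"})
monitor.apply_action(adds={"arm_docked", "valve_closed"}, deletes=set())

# A human operator moves the arm, undoing a goal the system had achieved:
undone = monitor.apply_action(adds={"arm_free"}, deletes={"arm_docked"})
print(undone)  # {'arm_docked'}
```

    Reporting `undone` to the operator corresponds to the abstract's point that the planner can warn when a human action undoes an achieved goal, and the updated state is what replanning would start from.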

  2. Computational Intelligence in a Human Brain Model

    Directory of Open Access Journals (Sweden)

    Viorel Gaftea

    2016-06-01

    Full Text Available This paper focuses on current trends in the brain research domain and the current stage of development of research into software and hardware solutions, communication capabilities between human beings and machines, new technologies, nano-science, and Internet of Things (IoT) devices. The proposed model for the human brain assumes a fundamental similarity between human intelligence and the thinking process of the chess game. Tactical and strategic reasoning and the need to follow the rules of the chess game are all very similar to the activities of the human brain. The main objectives for a living being and for the chess player are the same: securing a position, surviving, and eliminating adversaries. The brain resolves these goals, and, moreover, a being's movement, actions, and speech are sustained by the five vital senses and equilibrium. The chess game strategy helps us better understand the human brain and replicate it more easily in the proposed ‘Software and Hardware’ (SAH) Model.

  4. Debate: Lessons Learnt from 10 Years and USD 50 million of Grant Making to End Human Trafficking

    Directory of Open Access Journals (Sweden)

    Randy Newcomb

    2014-09-01

    Full Text Available On 26 September 2013, Humanity United, with our partners Legatum Foundation and Walk Free, announced the creation of the USD 100 million Freedom Fund to combat human trafficking around the world. This fund is the first of its kind, organised by three private foundations and born in part of Humanity United's experience as a donor over the past decade, during which time we provided more than USD 50 million to fund anti-trafficking efforts globally. During this time, we also worked closely with the donor community as well as with organisations and activists working on the frontlines of the struggle to end human trafficking. Over this period, four themes have emerged that help us better understand how to work more effectively and provide grants to combat human trafficking.

  5. A Glance into the Future of Human Computer Interactions

    CERN Document Server

    Farooq, Umer; Nazir, Sohail

    2011-01-01

    Computers have a direct impact on our lives nowadays. Human interaction with the computer has changed over time: as technology improved, human-computer interaction became better. Today we are facilitated by operating systems that hide the complexity of the hardware, so that we carry out our computation conveniently, irrespective of the processes occurring at the hardware level. Though human-computer interaction has improved, it is not done yet. In the future, the computer's role in our lives will be far greater, and much of life will involve artificial intelligence. In the future the scarcest resource will be time, and wasting it on a keyboard entry or a mouse input will be unbearable; what will be needed is an interaction environment that, along with reducing complexity, also minimizes the time wasted in human-computer interaction. Accordingly, in the future, computation will also increase; it would n...

  7. Can the human brain do quantum computing?

    Science.gov (United States)

    Rocha, A F; Massad, E; Coutinho, F A B

    2004-01-01

    The electrical properties of the membrane have been the key issue in the understanding of cerebral physiology for almost two centuries. But molecular neurobiology has now discovered that biochemical transactions play an important role in neuronal computations. Quantum computing (QC) is becoming a reality, both from the theoretical point of view and in practical applications. Quantum mechanics is the most accurate description at the atomic level, and it lies behind all chemistry that provides the basis for biology ... maybe the magic of entanglement is also crucial for life. The purpose of the present paper is to discuss the dendritic spine as a quantum computing device, taking into account what is known about the physiology of glutamate receptors and the cascade of biochemical transactions triggered by glutamate binding to these receptors.

  8. Human-computer interaction and management information systems

    CERN Document Server

    Galletta, Dennis F

    2014-01-01

    ""Human-Computer Interaction and Management Information Systems: Applications"" offers state-of-the-art research by a distinguished set of authors who span the MIS and HCI fields. The original chapters provide authoritative commentaries and in-depth descriptions of research programs that will guide 21st century scholars, graduate students, and industry professionals. Human-Computer Interaction (or Human Factors) in MIS is concerned with the ways humans interact with information, technologies, and tasks, especially in business, managerial, organizational, and cultural contexts. It is distinctiv

  9. STUDY ON HUMAN-COMPUTER SYSTEM FOR STABLE VIRTUAL DISASSEMBLY

    Institute of Scientific and Technical Information of China (English)

    Guan Qiang; Zhang Shensheng; Liu Jihong; Cao Pengbing; Zhong Yifang

    2003-01-01

    The cooperative work between human beings and computers based on virtual reality (VR) is investigated in order to plan disassembly sequences more efficiently. A three-layer model of human-computer cooperative virtual disassembly is built, and the corresponding human-computer system for stable virtual disassembly is developed. In this system, an immersive and interactive virtual disassembly environment has been created to provide planners with a more visual working scene. For cooperative disassembly, an intelligent module for the stability analysis of disassembly operations is embedded into the human-computer system to help planners carry out disassembly tasks better. The supporting matrix for the stability analysis of disassembly operations is defined, and the method of stability analysis is detailed. Based on this approach, the stability of any disassembly operation can be analyzed to guide manual virtual disassembly. Finally, a disassembly case in the virtual environment is given to prove the validity of the above ideas.
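    A support-matrix stability check of the kind mentioned above can be sketched as follows. This is an illustration in the spirit of the abstract, not the paper's actual definition: `support[i, j]` means part i supports part j, parts in `grounded` rest on the base, and a removal is deemed stable if every remaining non-grounded part still has at least one remaining supporter.

```python
def removal_is_stable(support, grounded, remaining, part_to_remove):
    """True if removing `part_to_remove` leaves every remaining part supported."""
    left = remaining - {part_to_remove}
    for p in left:
        if p in grounded:
            continue  # rests on the base, needs no supporter
        # p must still be supported by some part that is itself still present
        if not any(support.get((q, p), False) for q in left):
            return False
    return True

# Tiny assembly: part 0 on the ground, 1 rests on 0, 2 rests on 1.
support = {(0, 1): True, (1, 2): True}
grounded = {0}
parts = {0, 1, 2}

print(removal_is_stable(support, grounded, parts, 2))  # True: the top part can go
print(removal_is_stable(support, grounded, parts, 1))  # False: part 2 would drop
```

    In a planner, a check like this would run before each candidate operation, so the virtual environment only lets the user perform disassembly steps that leave the remaining assembly stable.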

  10. Cognition beyond the brain computation, interactivity and human artifice

    CERN Document Server

    Cowley, Stephen J

    2013-01-01

    Arguing that a collective dimension has given cognitive flexibility to human intelligence, this book shows that traditional cognitive psychology underplays the role of bodies, dialogue, diagrams, tools, talk, customs, habits, computers and cultural practices.

  11. Researching on SHRM: An analysis of the debate over the role played by human resources in firm success

    OpenAIRE

    Martín Alcázar, Fernando; Romero Fernández, Pedro Miguel; Sánchez Gardey, Gonzalo

    2005-01-01

    Many different models have been recently proposed to explain the contribution of human resource management to organizational performance, drawing on diverse theoretical frameworks and using many different methodologies. Trying to shed light on the complex state of the art in this field of research, this paper proposes an analysis of the discipline, drawing both on a review of the literature and data obtained from an online questionnaire distributed to human resource management scholars.

  12. Debate - The Trafficking Protocol has Advanced the Global Movement against Human Exploitation: The case of the United Kingdom

    Directory of Open Access Journals (Sweden)

    Caroline Parkes

    2015-04-01

    Full Text Available When politicians, responding to public campaigns focused on human trafficking, make bold and over-emotive statements, invoking William Wilberforce and the pressing need to lead the global fight against slavery, the Trafficking Protocol[1] proves its worth.  Insulated from national political rhetoric, international treaties, be it the Trafficking Protocol or regional instruments, provide an invaluable structure for governments’ national legislative responses to human trafficking. As the United Kingdom’s (UK) Solicitor General noted,[2] The UK’s legal framework has been directly influenced by UN [United Nations] and EU [European Union] Conventions and Directives (emphasis added … [and] The ‘Palermo Protocol’ continues to shape the UK’s response to human trafficking and in particular the care and support afforded to identified human trafficking victims. [1] In full: Protocol to Prevent, Suppress and Punish Trafficking in Persons, Especially Women and Children. [2] Speech, Solicitor General, Oliver Heald QC MP, ‘Prosecuting human trafficking and slavery: The law and the UK response’, UK Government, 12 October 2012, retrieved 6 January 2015 https://www.gov.uk/government/speeches/prosecuting-human-trafficking-and-slavery-the-law-and-the-uk-response

  13. Computer games as a new ontological reality of human existence

    Directory of Open Access Journals (Sweden)

    Maksim Shymeiko

    2015-05-01

    Full Text Available The article considers the ontological dimension of the phenomenon of computer games and their role in modern man's perception of the world and of himself. It describes the characteristic ontological features of the computer game as a virtual world of intangible character, and reveals the positive and negative roles of computer games in the formation of the meaning of human life.

  14. Use of Computers in Human Factors Engineering

    Science.gov (United States)

    1974-11-01

    SENSES (PHYSIOLOGY), THERMOPLASTIC RESINS, VISUAL ACUITY. (U) Research concerns determination of the information presentation requirements of human data... the geometry of the work station, is currently being developed. It is called COMBIMAN, an acronym for Computerized Biomechanical Man-Model.

  15. A Review of the Organisation for Economic Cooperation and Development's International Education Surveys: Governance, Human Capital Discourses, and Policy Debates

    Science.gov (United States)

    Morgan, Clara; Volante, Louis

    2016-01-01

    Given the influential role that the Organisation for Economic Cooperation and Development (OECD) plays in educational governance, we believe it is timely to provide an in-depth review of its education surveys and their associated human capital discourses. By reviewing and summarizing the OECD's suite of education surveys, this paper identifies the…

  16. The North/South Debate: Technology, Basic Human Needs and the New International Economic Order. Working Paper Number Twelve. 1980.

    Science.gov (United States)

    Galtung, Johan

    This document contains two articles by Johan Galtung presented as papers at an international conference on the relationship of technology to the environment and human needs. The monograph is part of a series intended to stimulate research, education, dialogue, and political action toward a just world order. The first paper, "Towards a New…

  19. [Affective computing--a mysterious tool to explore human emotions].

    Science.gov (United States)

    Li, Xin; Li, Honghong; Dou, Yi; Hou, Yongjie; Li, Changwu

    2013-12-01

    Perception, affection and consciousness are basic psychological functions of the human being, and affection is the subjective reflection of different kinds of objects; these three basic functions constitute the foundation of human thinking. Affective computing is an effective tool for revealing human affect in order to understand the world. Our research on affective computing focuses on the relations among different affections, their generation, and the factors that influence them. In this paper, the affective mechanism, the basic theory of affective computing, is studied; methods for acquiring and recognizing affective information are discussed; and applications of affective computing are summarized, in order to attract more researchers into this working area.

  20. Proactive human-computer collaboration for information discovery

    Science.gov (United States)

    DiBona, Phil; Shilliday, Andrew; Barry, Kevin

    2016-05-01

    Lockheed Martin Advanced Technology Laboratories (LM ATL) is researching methods, representations, and processes for human/autonomy collaboration to scale analysis and hypothesis substantiation for intelligence analysts. This research establishes a machine-readable hypothesis representation that is commonsensical to the human analyst. The representation unifies context between the human and computer, enabling autonomy, in the form of analytic software, to support the analyst by proactively acquiring, assessing, and organizing high-value information that is needed to inform and substantiate hypotheses.
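
    The record above describes a machine-readable hypothesis representation that stays commonsensical to the analyst. The paper's actual representation is not given here; the following is a minimal sketch of what such a structure might look like, with all names, fields, and the weighting scheme assumed purely for illustration:

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Evidence:
        # One piece of acquired information and how strongly it bears on a claim.
        source: str
        summary: str
        weight: float  # assumed convention: -1 (refutes) .. +1 (supports)

    @dataclass
    class Hypothesis:
        # Hypothetical structure: a plain-language claim plus machine-usable evidence links.
        claim: str
        evidence: list = field(default_factory=list)

        def support(self):
            # Net support: mean of evidence weights (0.0 when no evidence yet).
            if not self.evidence:
                return 0.0
            return sum(e.weight for e in self.evidence) / len(self.evidence)

    h = Hypothesis("Port activity at X indicates increased shipments")
    h.evidence.append(Evidence("satellite-imagery", "more containers visible", 0.6))
    h.evidence.append(Evidence("manifest-records", "no change in declared cargo", -0.2))
    print(round(h.support(), 2))  # 0.2
    ```

    Linking each claim to weighted evidence is one simple way analytic software could proactively rank which hypotheses still need substantiating information.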

  1. Unmanned Surface Vehicle Human-Computer Interface for Amphibious Operations

    Science.gov (United States)

    2013-08-01

    Figure 1 shows the baseline MOCU HCI, which uses both aerial-photo and Digital Nautical Chart (DNC) maps to control and monitor land, sea, and air vehicles. Acronyms: DNC, Digital Nautical Chart; FNC, Future Naval Capability; HCI, Human-Computer Interface; HRI, Human-Robot Interface; HSI, Human-Systems Integration. 3.2 Baseline MOCU HCI: the baseline MOCU interface is a tiled…

  2. Studying Collective Human Decision Making and Creativity with Evolutionary Computation.

    Science.gov (United States)

    Sayama, Hiroki; Dionne, Shelley D

    2015-01-01

    We report a summary of our interdisciplinary research project "Evolutionary Perspective on Collective Decision Making" that was conducted through close collaboration between computational, organizational, and social scientists at Binghamton University. We redefined collective human decision making and creativity as evolution of ecologies of ideas, where populations of ideas evolve via continual applications of evolutionary operators such as reproduction, recombination, mutation, selection, and migration of ideas, each conducted by participating humans. Based on this evolutionary perspective, we generated hypotheses about collective human decision making, using agent-based computer simulations. The hypotheses were then tested through several experiments with real human subjects. Throughout this project, we utilized evolutionary computation (EC) in non-traditional ways-(1) as a theoretical framework for reinterpreting the dynamics of idea generation and selection, (2) as a computational simulation model of collective human decision-making processes, and (3) as a research tool for collecting high-resolution experimental data on actual collaborative design and decision making from human subjects. We believe our work demonstrates untapped potential of EC for interdisciplinary research involving human and social dynamics.
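
    The evolutionary operators named above (reproduction, recombination, mutation, selection) can be illustrated with a toy model. This is not the authors' simulation: here "ideas" are single numbers and a hypothetical fitness function stands in for human evaluation:

    ```python
    import random

    random.seed(0)

    def fitness(idea, target=0.7):
        # Toy stand-in for human evaluation: ideas nearer the target are "better".
        return -abs(idea - target)

    def evolve(pop, generations=50, mu=0.1):
        for _ in range(generations):
            offspring = [x + random.gauss(0, mu) for x in pop]        # reproduction + mutation
            blends = [(random.choice(pop) + random.choice(pop)) / 2   # recombination
                      for _ in pop]
            pool = pop + offspring + blends
            pop = sorted(pool, key=fitness, reverse=True)[:len(pop)]  # selection
        return pop

    population = [random.random() for _ in range(10)]
    best = max(evolve(population), key=fitness)
    print(round(best, 2))  # converges near the target 0.7
    ```

    In the project described above, the human participants play the role that `fitness`, mutation, and recombination play here, which is what makes the EC loop usable as both a model and a data-collection tool.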

  3. What Money Cannot Buy and What Money Ought Not Buy: Dignity, Motives, and Markets in Human Organ Procurement Debates.

    Science.gov (United States)

    Gillespie, Ryan

    2017-01-06

    Given the current organ shortage, a prevalent alternative to the altruism-based policy is a market-based solution: pay people for their organs. Receiving much popular and scholarly attention, a salient normative argument against neoliberal pressures is the preservation of human dignity. This article examines how advocates of both the altruistic status quo and market challengers reason and weigh the central normative concept of dignity, meant as inherent worth and/or rank. Key rhetorical strategies, including motivations and broader social visions, of the two positions are analyzed and evaluated, and the separation of morally normative understandings of dignity from market encroachment is defended.

  4. HOME COMPUTER USE AND THE DEVELOPMENT OF HUMAN CAPITAL*

    Science.gov (United States)

    Malamud, Ofer; Pop-Eleches, Cristian

    2012-01-01

    This paper uses a regression discontinuity design to estimate the effect of home computers on child and adolescent outcomes by exploiting a voucher program in Romania. Our main results indicate that home computers have both positive and negative effects on the development of human capital. Children who won a voucher to purchase a computer had significantly lower school grades but show improved computer skills. There is also some evidence that winning a voucher increased cognitive skills, as measured by Raven’s Progressive Matrices. We do not find much evidence for an effect on non-cognitive outcomes. Parental rules regarding homework and computer use attenuate the effects of computer ownership, suggesting that parental monitoring and supervision may be important mediating factors. PMID:22719135

  5. HOME COMPUTER USE AND THE DEVELOPMENT OF HUMAN CAPITAL.

    Science.gov (United States)

    Malamud, Ofer; Pop-Eleches, Cristian

    2011-05-01

    This paper uses a regression discontinuity design to estimate the effect of home computers on child and adolescent outcomes by exploiting a voucher program in Romania. Our main results indicate that home computers have both positive and negative effects on the development of human capital. Children who won a voucher to purchase a computer had significantly lower school grades but show improved computer skills. There is also some evidence that winning a voucher increased cognitive skills, as measured by Raven's Progressive Matrices. We do not find much evidence for an effect on non-cognitive outcomes. Parental rules regarding homework and computer use attenuate the effects of computer ownership, suggesting that parental monitoring and supervision may be important mediating factors.
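
    A regression discontinuity design, as used in both versions of this study, compares units just below and just above an eligibility cutoff. Below is a self-contained sketch on synthetic data; the cutoff, effect size, and noise level are invented for illustration, and the study's actual estimator is more elaborate:

    ```python
    import random

    random.seed(1)

    def fit_line(xs, ys):
        # Ordinary least squares for y = a + b * x; returns (intercept a, slope b).
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
        return my - b * mx, b

    # Synthetic data: the outcome rises smoothly with the running variable,
    # plus a jump of 2.0 past a hypothetical eligibility cutoff at 0.
    data = []
    for _ in range(400):
        x = random.uniform(-1, 1)
        y = 1.5 * x + (2.0 if x >= 0 else 0.0) + random.gauss(0, 0.1)
        data.append((x, y))

    left = [(x, y) for x, y in data if x < 0]
    right = [(x, y) for x, y in data if x >= 0]

    # The RD estimate is the gap between the two fitted lines at the cutoff.
    a_left, _ = fit_line(*zip(*left))
    a_right, _ = fit_line(*zip(*right))
    print(round(a_right - a_left, 1))  # close to the true jump of 2.0
    ```

    The identifying idea is that units just on either side of the cutoff are comparable, so the jump in fitted outcomes at the cutoff estimates the causal effect of (here) winning the voucher.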

  6. Are debatable scientific questions debatable? (Invited)

    Science.gov (United States)

    Oreskes, N.

    2010-12-01

    Are debatable scientific questions debatable? In 2000, the physicist-philosopher John Ziman posed this pithy—and crucial—question. He noted that scientists were at a disadvantage in public debate, because the rules of engagement are different in scientific discourse than in public discourse in ways that make it difficult for scientists to ‘win’ public arguments, even when the facts are on their side. In this paper, I revisit Ziman’s arguments in light of the difficulties that climate scientists have had in communicating the reality and gravity of global warming. In addition to the problem posed by Ziman, I also address the role of organized disinformation in further increasing the challenges that climate scientists face.

  7. Gene editing advance re-ignites debate on the merits and risks of animal to human transplantation.

    Science.gov (United States)

    Fung, R K F; Kerridge, I H

    2016-09-01

    In Australia, and internationally, the shortage of organ and tissue donors significantly limits the number of patients with critical organ or tissue failure who are able to receive a transplant each year. The rationale for xenotransplantation - the transplantation of living cells, tissues or organs from one species to another - is to meet this shortfall in human donor material. While early clinical trials showed promise, particularly in patients with type I diabetes whose insulin dependence could be temporarily reversed by the transplantation of porcine islet cells, these benefits have been balanced with scientific, clinical and ethical concerns revolving around the risks of immune rejection and the potential transmission of porcine endogenous retroviruses or other infectious agents from porcine grafts to human recipients. However, the advent of CRISPR/Cas9, a revolutionary gene editing technology, has reignited interest in the field with the possibility of genetically engineering porcine organs and tissues that are less immunogenic and have virtually no risk of transmission of porcine endogenous retroviruses. At the same time, CRISPR/Cas9 may also open up a myriad of possibilities for tissue engineering and stem cell research, which may complement xenotransplantation research by providing an additional source of donor cells, tissues and organs for transplantation into patients. The recent international symposium on gene editing, organised by the US National Academy of Sciences, highlights both the enormous therapeutic potential of CRISPR/Cas9 and the raft of ethical and regulatory challenges that may follow its utilisation in transplantation and in medicine more generally.

  8. Speech Dialogue with Facial Displays Multimodal Human-Computer Conversation

    CERN Document Server

    Nagao, K; Nagao, Katashi; Takeuchi, Akikazu

    1994-01-01

    Human face-to-face conversation is an ideal model for human-computer dialogue. One of the major features of face-to-face communication is its multiplicity of communication channels that act on multiple modalities. To realize a natural multimodal dialogue, it is necessary to study how humans perceive information and determine the information to which humans are sensitive. A face is an independent communication channel that conveys emotional and conversational signals, encoded as facial expressions. We have developed an experimental system that integrates speech dialogue and facial animation, to investigate the effect of introducing communicative facial expressions as a new modality in human-computer conversation. Our experiments have shown that facial expressions are helpful, especially upon first contact with the system. We have also discovered that featuring facial expressions at an early stage improves subsequent interaction.

  9. The UK Human Genome Mapping Project online computing service.

    Science.gov (United States)

    Rysavy, F R; Bishop, M J; Gibbs, G P; Williams, G W

    1992-04-01

    This paper presents an overview of computing and networking facilities developed by the Medical Research Council to provide online computing support to the Human Genome Mapping Project (HGMP) in the UK. The facility is connected to a number of other computing facilities in various centres of genetics and molecular biology research excellence, either directly via high-speed links or through national and international wide-area networks. The paper describes the design and implementation of the current system, a 'client/server' network of Sun, IBM, DEC and Apple servers, gateways and workstations. A short outline of online computing services currently delivered by this system to the UK human genetics research community is also provided. More information about the services and their availability can be obtained by approaching the UK HGMP-RC directly.

  10. Linguistics in the digital humanities: (computational corpus linguistics

    Directory of Open Access Journals (Sweden)

    Kim Ebensgaard Jensen

    2014-12-01

    Full Text Available Corpus linguistics has been closely intertwined with digital technology since the introduction of university computer mainframes in the 1960s. Making use of both digitized data in the form of the language corpus and computational methods of analysis involving concordancers and statistics software, corpus linguistics arguably has a place in the digital humanities. Still, it remains obscure and figures only sporadically in the literature on the digital humanities. This article provides an overview of the main principles of corpus linguistics and the role of computer technology in relation to data and method and also offers a bird's-eye view of the history of corpus linguistics with a focus on its intimate relationship with digital technology and how digital technology has impacted the very core of corpus linguistics and shaped the identity of the corpus linguist. Ultimately, the article is oriented towards an acknowledgment of corpus linguistics' alignment with the digital humanities.
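
    The concordancer mentioned above is a classic corpus-linguistics tool; a minimal keyword-in-context (KWIC) sketch, with the sample text and context width chosen arbitrarily:

    ```python
    import re

    def kwic(corpus, keyword, width=3):
        # Keyword-in-context: every hit shown with `width` words of context per side.
        tokens = re.findall(r"\w+", corpus.lower())
        lines = []
        for i, tok in enumerate(tokens):
            if tok == keyword:
                left = " ".join(tokens[max(0, i - width):i])
                right = " ".join(tokens[i + 1:i + 1 + width])
                lines.append(f"{left} [{tok}] {right}".strip())
        return lines

    text = ("Corpus linguistics has been closely intertwined with digital technology. "
            "The corpus is the digitized data of corpus linguistics.")
    for line in kwic(text, "corpus"):
        print(line)
    ```

    Real concordancers add sorting by left or right context, frequency counts, and collocation statistics, but the core operation is exactly this alignment of hits.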

  11. The Human Genome Project as a case study in the debate about the relationship between theology and natural science

    Directory of Open Access Journals (Sweden)

    Johan Buitendag

    2005-10-01

    Full Text Available The author presents a review article on the book, Brave new world? Theology, ethics and the human genome, edited by Celia Deane-Drummond and published in 2003 by T&T Clark International in London. After a rather elaborate exposition, he appraises the collection of essays in terms of the dialogue between theology and the natural sciences. As an acid test, he assesses the challenge that Kant dealt with, namely to combine and to separate the right things. Kant pushed this to extremes and ended up with both solipsism and dualism. This article tackles the challenge differently and concludes that theology is an a posteriori science and that by means of différance, knowledge of the noumenon is indeed possible. The author therefore appreciates the different contributions in the book in this light. Deane-Drummond’s proposal that a virtue ethic should be complemented by certain biblical values is therefore viewed rather sceptically. This remains a transcendental enterprise where epistemology precedes ontology.

  12. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full-scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  13. Supporting Negotiation Behavior with Haptics-Enabled Human-Computer Interfaces

    OpenAIRE

    Küçükyılmaz, Ayşe; Sezgin, Tevfik Metin; Başdoğan, Çağatay

    2012-01-01

    An active research goal for human-computer interaction is to allow humans to communicate with computers in an intuitive and natural fashion, especially in real-life interaction scenarios. One approach that has been advocated to achieve this has been to build computer systems with human-like qualities and capabilities. In this paper, we present insight on how human-computer interaction can be enriched by endowing computers with behavioral patterns that naturally appear in human-human nego...

  14. From humans to computers cognition through visual perception

    CERN Document Server

    Alexandrov, Viktor Vasilievitch

    1991-01-01

    This book considers computer vision to be an integral part of the artificial intelligence system. The core of the book is an analysis of possible approaches to the creation of artificial vision systems, which simulate human visual perception. Much attention is paid to the latest achievements in visual psychology and physiology, the description of the functional and structural organization of the human perception mechanism, the peculiarities of artistic perception and the expression of reality. Computer vision models based on these data are investigated. They include the processes of external d

  15. Human computer interaction issues in Clinical Trials Management Systems.

    Science.gov (United States)

    Starren, Justin B; Payne, Philip R O; Kaufman, David R

    2006-01-01

    Clinical trials increasingly rely upon web-based Clinical Trials Management Systems (CTMS). As with clinical care systems, Human Computer Interaction (HCI) issues can greatly affect the usefulness of such systems. Evaluation of the user interface of one web-based CTMS revealed a number of potential human-computer interaction problems, in particular, increased workflow complexity associated with a web application delivery model and potential usability problems resulting from the use of ambiguous icons. Because these design features are shared by a large fraction of current CTMS, the implications extend beyond this individual system.

  16. A Human-Centred Tangible approach to learning Computational Thinking

    Directory of Open Access Journals (Sweden)

    Tommaso Turchi

    2016-08-01

    Full Text Available Computational Thinking has recently become a focus of many teaching and research domains; it encapsulates those thinking skills integral to solving complex problems using a computer, thus being widely applicable in our society. It is influencing research across many disciplines and also coming into the limelight of education, mostly thanks to public initiatives such as the Hour of Code. In this paper we present our arguments for promoting Computational Thinking in education through the Human-centred paradigm of Tangible End-User Development, namely by exploiting objects whose interactions with the physical environment are mapped to digital actions performed on the system.

  17. A computational model of the human hand 93-ERI-053

    Energy Technology Data Exchange (ETDEWEB)

    Hollerbach, K.; Axelrod, T.

    1996-03-01

    The objectives of the Computational Hand Modeling project were to prove the feasibility of applying the Laboratory's NIKE3D finite element code to orthopaedic problems. Because of the great complexity of anatomical structures and the nonlinearity of their behavior, we have focused on a subset of joints of the hand and lower extremity and have developed algorithms to model their behavior. The algorithms developed here solve fundamental problems in computational biomechanics and can be expanded to describe any other joints of the human body. This kind of computational modeling has never successfully been attempted before, due in part to a lack of biomaterials data and a lack of computational resources. With the computational resources available at the National Laboratories and the collaborative relationships we have established with experimental and other modeling laboratories, we have been in a position to pursue our innovative approach to biomechanical and orthopedic modeling.

  18. Interactive Evolutionary Computation for Analyzing Human Awareness Mechanisms

    Directory of Open Access Journals (Sweden)

    Hideyuki Takagi

    2012-01-01

    Full Text Available We discuss the importance of establishing a science of awareness and present the idea of using interactive evolutionary computation (IEC) as a tool for analyzing awareness mechanisms and building awareness models. First, we describe the importance of human factors in computational intelligence and note that IEC is one approach to so-called humanized computational intelligence. Second, we show examples in which IEC is used as an analysis tool for human science; since analyzing the human awareness mechanism is this kind of analysis of human characteristics and capabilities, IEC may be usable for this purpose as well. Based on this expectation, we present one idea for analyzing the awareness mechanism: construct an equivalent model of an IEC user with a learning model, and find latent variables that connect the inputs and outputs of the user model and help to explain the input-output relationship. Although there are several possible definitions of awareness, this idea rests on the definition that awareness is finding unknown variables that aid our understanding. If we establish a method for finding such latent variables automatically, we can realize an awareness model in a computer.
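
    The idea sketched in the abstract, fitting a model of the IEC user and looking for a latent variable linking its inputs and outputs, can be illustrated with a toy linear recovery. The data, the single hidden direction, and the rank-1 estimator below are all assumptions for illustration, not the authors' method:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical IEC log: 200 candidate designs, 5 observable features each.
    X = rng.normal(size=(200, 5))

    # Simulated user: ratings actually depend on one hidden combination of
    # features, i.e. the latent variable we hope the user model recovers.
    w_true = np.array([0.8, 0.0, 0.6, 0.0, 0.0])  # unit length
    ratings = X @ w_true + rng.normal(scale=0.05, size=200)

    # Rank-1 "user model": the single direction in feature space that best
    # explains the ratings (here simply the normalized feature-rating covariance).
    w_est = X.T @ ratings
    w_est /= np.linalg.norm(w_est)

    alignment = abs(w_est @ w_true)  # near 1.0 means the latent direction was recovered
    print(alignment > 0.95)
    ```

    Inspecting the recovered direction (here, features 1 and 3 dominate) is the kind of interpretable latent variable the abstract suggests could expose what an IEC user is attending to.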

  19. Debating complexity in modeling

    Science.gov (United States)

    Hunt, Randall J.; Zheng, Chunmiao

    1999-01-01

    Complexity in modeling would seem to be an issue of universal importance throughout the geosciences, perhaps throughout all science, if the debate last year among groundwater modelers is any indication. During the discussion the following questions and observations made up the heart of the debate.

  20. Computational Virtual Reality (VR) as a human-computer interface in the operation of telerobotic systems

    Science.gov (United States)

    Bejczy, Antal K.

    1995-01-01

    This presentation focuses on the application of computer graphics or 'virtual reality' (VR) techniques as a human-computer interface tool in the operation of telerobotic systems. VR techniques offer very valuable task realization aids for planning, previewing and predicting robotic actions, operator training, and for visual perception of non-visible events like contact forces in robotic tasks. The utility of computer graphics in telerobotic operation can be significantly enhanced by high-fidelity calibration of virtual reality images to actual TV camera images. This calibration will even permit the creation of artificial (synthetic) views of task scenes for which no TV camera views are available.

  1. A Software Framework for Multimodal Human-Computer Interaction Systems

    NARCIS (Netherlands)

    Shen, Jie; Pantic, Maja

    2009-01-01

    This paper describes a software framework we designed and implemented for the development and research in the area of multimodal human-computer interface. The proposed framework is based on publish / subscribe architecture, which allows developers and researchers to conveniently configure, test and
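
    The publish/subscribe architecture described above decouples producers of events from their consumers; a minimal sketch (class and topic names hypothetical, not the paper's API):

    ```python
    from collections import defaultdict

    class Hub:
        # Minimal publish/subscribe hub: modules register callbacks per topic,
        # and any module can publish without knowing who is listening.
        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, topic, callback):
            self._subscribers[topic].append(callback)

        def publish(self, topic, message):
            for callback in self._subscribers[topic]:
                callback(message)

    hub = Hub()
    log = []
    hub.subscribe("gesture", lambda m: log.append(f"recognizer got {m}"))
    hub.subscribe("gesture", lambda m: log.append(f"logger got {m}"))
    hub.publish("gesture", "swipe-left")
    print(log)  # ['recognizer got swipe-left', 'logger got swipe-left']
    ```

    This decoupling is what lets researchers "conveniently configure and test" modules: a new recognizer or logger is just another subscription, with no changes to the publisher.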

  2. Formal modelling techniques in human-computer interaction

    NARCIS (Netherlands)

    Haan, de G.; Veer, van der G.C.; Vliet, van J.C.

    1991-01-01

    This paper is a theoretical contribution, elaborating the concept of models as used in Cognitive Ergonomics. A number of formal modelling techniques in human-computer interaction will be reviewed and discussed. The analysis focusses on different related concepts of formal modelling techniques in hum

  3. Computed tomography of the human developing anterior skull base

    NARCIS (Netherlands)

    J. van Loosen (J.); A.I.J. Klooswijk (A. I J); D. van Velzen (D.); C.D.A. Verwoerd (Carel)

    1990-01-01

    Abstract: The ossification of the anterior skull base, especially the lamina cribrosa, has been studied by computed tomography and histopathology. Sixteen human fetuses (referred to our laboratory for pathological examination after spontaneous abortion between 18 and 32 weeks of ge

  4. CHI '13 Extended Abstracts on Human Factors in Computing Systems

    DEFF Research Database (Denmark)

    The CHI Papers and Notes program is continuing to grow along with many of our sister conferences. We are pleased that CHI is still the leading venue for research in human-computer interaction. CHI 2013 continued the use of subcommittees to manage the review process. Authors selected the subcommit...

  6. Studying Collective Human Decision Making and Creativity with Evolutionary Computation

    OpenAIRE

    Sayama, Hiroki; Dionne, Shelley D.

    2014-01-01

    We report a summary of our interdisciplinary research project "Evolutionary Perspective on Collective Decision Making" that was conducted through close collaboration between computational, organizational and social scientists at Binghamton University. We redefined collective human decision making and creativity as evolution of ecologies of ideas, where populations of ideas evolve via continual applications of evolutionary operators such as reproduction, recombination, mutation, selection, and...

  7. Homo ludens in the loop playful human computation systems

    CERN Document Server

    Krause, Markus

    2014-01-01

    The human mind is incredible. It solves problems with ease that will elude machines even for the next decades. This book explores what happens when humans and machines work together to solve problems machines cannot yet solve alone. It explains how machines and computers can work together and how humans can have fun helping to face some of the most challenging problems of artificial intelligence. In this book, you will find designs for games that are entertaining and yet able to collect data to train machines for complex tasks such as natural language processing or image understanding. You wil

  8. The DSM5/RDoC debate on the future of mental health research: implication for studies on human stress and presentation of the signature bank.

    Science.gov (United States)

    Lupien, S J; Sasseville, M; François, N; Giguère, C E; Boissonneault, J; Plusquellec, P; Godbout, R; Xiong, L; Potvin, S; Kouassi, E; Lesage, A

    2017-01-01

    In 2008, the National Institute of Mental Health (NIMH) announced that in the next few decades, it will be essential to study the various biological, psychological and social "signatures" of mental disorders. Along with this new "signature" approach to mental health disorders, modifications of the DSM were introduced. One major modification consisted of incorporating a dimensional approach to mental disorders, which involved analyzing, using a transnosological approach, various factors that are commonly observed across different types of mental disorders. Although this new methodology led to interesting discussions within the DSM5 working groups, it was not incorporated in the final version of the DSM5. Consequently, the NIMH launched the "Research Domain Criteria" (RDoC) framework in order to provide new ways of classifying mental illnesses based on dimensions of observable behavioral and neurobiological measures. The NIMH emphasizes that it is important to consider the benefits of dimensional measures from the perspective of psychopathology and environmental influences, and it is also important to build these dimensions on neurobiological data. The first goal of this paper is to present the perspectives of the DSM5 and RDoC on the science of mental health disorders and the impact of this debate on the future of human stress research. The second goal is to present the "Signature Bank" developed by the Institut Universitaire en Santé Mentale de Montréal (IUSMM) in line with a dimensional and transnosological approach to mental illness.

  9. Computational Fluid and Particle Dynamics in the Human Respiratory System

    CERN Document Server

    Tu, Jiyuan; Ahmadi, Goodarz

    2013-01-01

    Traditional research methodologies in the human respiratory system have always been challenging due to their invasive nature. Recent advances in medical imaging and computational fluid dynamics (CFD) have accelerated this research. This book compiles and details recent advances in the modelling of the respiratory system for researchers, engineers, scientists, and health practitioners. It breaks down the complexities of this field and provides both students and scientists with an introduction and starting point to the physiology of the respiratory system, fluid dynamics and advanced CFD modeling tools. In addition to a brief introduction to the physics of the respiratory system and an overview of computational methods, the book contains best-practice guidelines for establishing high-quality computational models and simulations. Inspiration for new simulations can be gained through innovative case studies as well as hands-on practice using pre-made computational code. Last but not least, students and researcher...

  10. A novel polar-based human face recognition computational model

    Directory of Open Access Journals (Sweden)

    Y. Zana

    2009-07-01

    Full Text Available Motivated by a recently proposed biologically inspired face recognition approach, we investigated the relation between human behavior and a computational model based on Fourier-Bessel (FB) spatial patterns. We measured human recognition performance of FB-filtered face images using an 8-alternative forced-choice method. Test stimuli were generated by converting the images from the spatial to the FB domain, filtering the resulting coefficients with a band-pass filter, and finally taking the inverse FB transformation of the filtered coefficients. The performance of the computational models was tested using a simulation of the psychophysical experiment. In the FB model, face images were first filtered by simulated V1-type neurons and later analyzed globally for their content of FB components. In general, there was a higher human contrast sensitivity to radially than to angularly filtered images, but both functions peaked at the 11.3-16 frequency interval. The FB-based model presented similar behavior with regard to peak position and relative sensitivity, but had a wider frequency bandwidth and a narrower response range. The response patterns of two alternative models, based on local FB analysis and on raw luminance, strongly diverged from the human behavior patterns. These results suggest that human performance can be constrained by the type of information conveyed by polar patterns, and consequently that humans might use FB-like spatial patterns in face processing.
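
    The filtering procedure described (transform, keep a band of radial frequencies, invert) can be sketched with an ordinary 2-D Fourier transform standing in for the paper's Fourier-Bessel transform; this is a deliberate simplification, and the band limits below merely echo the 11.3-16 interval mentioned above:

    ```python
    import numpy as np

    def radial_bandpass(img, low, high):
        # Band-pass in the frequency domain: zero out all components whose
        # radial frequency falls outside [low, high], then invert.
        F = np.fft.fftshift(np.fft.fft2(img))
        n = img.shape[0]
        fy, fx = np.meshgrid(np.arange(n) - n // 2, np.arange(n) - n // 2,
                             indexing="ij")
        r = np.hypot(fx, fy)
        F[(r < low) | (r > high)] = 0
        return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

    rng = np.random.default_rng(0)
    img = rng.normal(size=(64, 64))
    filtered = radial_bandpass(img, 11.3, 16.0)
    print(filtered.shape)  # (64, 64)
    ```

    In the actual study the decomposition is into Fourier-Bessel components, which are natural for polar (radial/angular) patterns; the band-keep-and-invert structure of the stimulus generation is the same.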

  11. Neuromolecular computing: a new approach to human brain evolution.

    Science.gov (United States)

    Wallace, R; Price, H

    1999-09-01

    Evolutionary approaches in human cognitive neurobiology traditionally emphasize macroscopic structures. It may soon be possible to supplement these studies with models of human information-processing at the molecular level. Thin-film, simulation, fluorescence microscopy, and high-resolution X-ray crystallographic studies provide evidence for transiently organized neural membrane molecular systems with possible computational properties. This review article examines evidence for hydrophobic-mismatch molecular interactions within phospholipid microdomains of a neural membrane bilayer. It is proposed that these interactions constitute a massively parallel algorithm which can rapidly compute near-optimal solutions to complex cognitive and physiological problems. Coupling of microdomain activity to permeant ion movements at ligand-gated and voltage-gated channels permits the conversion of molecular computations into neuron frequency codes. Evidence for microdomain transport of proteins to specific locations within the bilayer suggests that neuromolecular computation may be under some genetic control and thus modifiable by natural selection. A possible experimental approach for examining evolutionary changes in neuromolecular computation is briefly discussed.

  12. Hand Gesture and Neural Network Based Human Computer Interface

    Directory of Open Access Journals (Sweden)

    Aekta Patel

    2014-06-01

    Full Text Available Computers are used by almost everyone, whether at work or at home. Our aim is to make computers that can understand human language and to develop user-friendly human-computer interfaces (HCI). Human gestures are perceived through vision, and this research aims to determine human gestures for creating an HCI. Coding these gestures into machine language demands a complex programming algorithm. In this project, we first detected, recognized, and pre-processed the hand gestures using a general method of recognition. We then extracted the recognized image's properties and used them to control mouse movement, clicks, and the VLC media player. Finally, we implemented the same functions using a neural network technique and compared it with the general recognition method, concluding that the neural network technique performs better than the general method of recognition. We present results based on the neural network technique and a comparison between the neural network and general methods.
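    The pipeline this record describes (hand features in, gesture class out via a neural network) can be sketched with a minimal one-hidden-layer network trained on toy feature vectors. The features, classes, and network size below are hypothetical stand-ins for illustration, not the paper's actual data or architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for hand-gesture feature vectors (hypothetical features such as
# area fraction, aspect ratio, finger count); 3 gesture classes, 30 samples each.
centers = np.array([[0.2, 1.0, 1.0], [0.5, 1.5, 2.0], [0.8, 2.0, 4.0]])
X = np.vstack([c + 0.05 * rng.standard_normal((30, 3)) for c in centers])
y = np.repeat(np.arange(3), 30)
T = np.eye(3)[y]  # one-hot targets

# Minimal one-hidden-layer MLP trained with batch gradient descent.
W1 = 0.5 * rng.standard_normal((3, 8)); b1 = np.zeros(8)
W2 = 0.5 * rng.standard_normal((8, 3)); b2 = np.zeros(3)

def forward(X):
    H = np.tanh(X @ W1 + b1)
    Z = H @ W2 + b2
    P = np.exp(Z - Z.max(axis=1, keepdims=True))
    return H, P / P.sum(axis=1, keepdims=True)

lr = 0.5
for _ in range(500):
    H, P = forward(X)
    G = (P - T) / len(X)           # softmax cross-entropy gradient
    GH = (G @ W2.T) * (1 - H**2)   # backprop through tanh (uses pre-update W2)
    W2 -= lr * H.T @ G; b2 -= lr * G.sum(0)
    W1 -= lr * X.T @ GH; b1 -= lr * GH.sum(0)

_, P = forward(X)
accuracy = (P.argmax(axis=1) == y).mean()
```

    On well-separated toy features like these, the network reaches near-perfect training accuracy; real gesture images would of course require an image-processing front end to produce the feature vectors.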

  13. Human -Computer Interface using Gestures based on Neural Network

    Directory of Open Access Journals (Sweden)

    Aarti Malik

    2014-10-01

    Full Text Available Gestures are powerful tools for non-verbal communication. Human computer interface (HCI) is a growing field which reduces the complexity of interaction between human and machine, in which gestures are used for conveying information or controlling the machine. In the present paper, static hand gestures are utilized for this purpose. The paper presents a novel technique of recognizing hand gestures, i.e. A-Z alphabets, 0-9 numbers and 6 additional control signals (for keyboard and mouse control), by extracting various features of the hand, creating a feature vector table and training a neural network. The proposed work has a recognition rate of 99%.

  14. Human-Computer Interaction, Tourism and Cultural Heritage

    Science.gov (United States)

    Cipolla Ficarra, Francisco V.

    We present a state of the art of human-computer interaction aimed at tourism and cultural heritage in some cities of the European Mediterranean. The work analyzes the main problems deriving from training understood as a business, which can derail the continuous growth of HCI, the new technologies and the tourism industry. Through a semiotic and epistemological study, the current mistakes in the context of the interrelations of the formal and factual sciences are detected, along with the human factors that influence the professionals devoted to the development of interactive systems intended to safeguard and boost cultural heritage.

  15. A computer simulation approach to measurement of human control strategy

    Science.gov (United States)

    Green, J.; Davenport, E. L.; Engler, H. F.; Sears, W. E., III

    1982-01-01

    Human control strategy is measured through use of a psychologically-based computer simulation which reflects a broader theory of control behavior. The simulation is called the human operator performance emulator, or HOPE. HOPE was designed to emulate control learning in a one-dimensional preview tracking task and to measure control strategy in that setting. When given a numerical representation of a track and information about current position in relation to that track, HOPE generates positions for a stick controlling the cursor to be moved along the track. In other words, HOPE generates control stick behavior corresponding to that which might be used by a person learning preview tracking.
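    The kind of loop HOPE implements can be illustrated with a toy one-dimensional preview tracker: given the track and the current cursor position, it generates stick commands that steer the cursor along the track. The gain, preview horizon, and weighting below are invented for illustration and are not taken from HOPE:

```python
import numpy as np

# Hypothetical one-dimensional preview tracking loop in the spirit of HOPE.
track = np.sin(np.linspace(0, 4 * np.pi, 400))   # target path
dt, gain, preview = 1.0, 0.3, 5                   # assumed parameters

pos = 0.0
errors = []
for t in range(len(track) - preview):
    # Look ahead over the preview window, weighting nearer points more.
    ahead = track[t:t + preview]
    weights = np.linspace(1.0, 0.2, preview)
    target = np.average(ahead, weights=weights)
    stick = gain * (target - pos)    # proportional "stick" deflection
    pos += stick * dt                # cursor responds to the stick
    errors.append(abs(track[t] - pos))

rms_error = float(np.sqrt(np.mean(np.square(errors))))
```

    A model of learning, as in HOPE, would additionally adapt parameters such as the gain and preview weighting from trial to trial; here they are fixed.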

  16. Visual Interpretation Of Hand Gestures For Human Computer Interaction

    Directory of Open Access Journals (Sweden)

    M.S.Sahane

    2014-01-01

    Full Text Available The use of hand gestures provides an attractive alternative to cumbersome interface devices for human-computer interaction (HCI). In particular, visual interpretation of hand gestures can help in achieving the ease and naturalness desired for HCI. This discussion is organized on the basis of the method used for modeling, analyzing, and recognizing gestures. We propose pointing gesture-based large display interaction using a depth camera. A user interacts with applications on a large display by using pointing gestures with the bare hand. The calibration between the large display and the depth camera can be performed automatically using an RGB-D camera. We also discuss implemented gestural systems as well as other potential applications of vision-based gesture recognition. We discuss directions of future research in gesture recognition, including its integration with other natural modes of human-computer interaction.

  17. Computer aided systems human engineering: A hypermedia tool

    Science.gov (United States)

    Boff, Kenneth R.; Monk, Donald L.; Cody, William J.

    1992-01-01

    The Computer Aided Systems Human Engineering (CASHE) system, Version 1.0, is a multimedia ergonomics database on CD-ROM for the Apple Macintosh II computer, being developed for use by human system designers, educators, and researchers. It will initially be available on CD-ROM and will allow users to access ergonomics data and models stored electronically as text, graphics, and audio. The CASHE CD-ROM, Version 1.0 will contain the Boff and Lincoln (1988) Engineering Data Compendium, MIL-STD-1472D and a unique, interactive simulation capability, the Perception and Performance Prototyper. Its features also include a specialized data retrieval, scaling, and analysis capability and the state of the art in information retrieval, browsing, and navigation.

  18. The Human-Computer Domain Relation in UX Models

    DEFF Research Database (Denmark)

    Clemmensen, Torkil

    This paper argues that the conceptualization of the human, the computer and the domain of use in competing lines of UX research have problematic similarities and superficial differences. The paper qualitatively analyses concepts and models in five research papers that together represent two influential lines of UX research: aesthetics and temporal UX, and two use situations: using a website and starting to use a smartphone. The results suggest that the two lines of UX research share a focus on users' evaluative judgments of technology, both focus on product qualities rather than activity domains, give few details about users, and treat human-computer interaction as perception. The conclusion gives similarities and differences between the approaches to UX. The implications for theory building are indicated.

  19. Developing a computational model of human hand kinetics using AVS

    Energy Technology Data Exchange (ETDEWEB)

    Abramowitz, Mark S. [State Univ. of New York, Binghamton, NY (United States)

    1996-05-01

    As part of an ongoing effort to develop a finite element model of the human hand at the Institute for Scientific Computing Research (ISCR), this project extended existing computational tools for analyzing and visualizing hand kinetics. These tools employ a commercial, scientific visualization package called AVS. FORTRAN and C code, originally written by David Giurintano of the Gillis W. Long Hansen's Disease Center, was ported to a different computing platform, debugged, and documented. Usability features were added and the code was made more modular and readable. When the code is used to visualize bone movement and tendon paths for the thumb, graphical output is consistent with expected results. However, numerical values for forces and moments at the thumb joints do not yet appear to be accurate enough to be included in ISCR's finite element model. Future work includes debugging the parts of the code that calculate forces and moments and verifying the correctness of these values.

  20. Human-computer interaction: psychology as a science of design.

    Science.gov (United States)

    Carroll, J M

    1997-01-01

    Human-computer interaction (HCI) study is the region of intersection between psychology and the social sciences, on the one hand, and computer science and technology, on the other. HCI researchers analyze and design specific user interface technologies (e.g. pointing devices). They study and improve the processes of technology development (e.g. task analysis, design rationale). They develop and evaluate new applications of technology (e.g. word processors, digital libraries). Throughout the past two decades, HCI has progressively integrated its scientific concerns with the engineering goal of improving the usability of computer systems and applications, which has resulted in a body of technical knowledge and methodology. HCI continues to provide a challenging test domain for applying and developing psychological and social theory in the context of technology development and use.

  1. Human Computer Interface Design Criteria. Volume 1. User Interface Requirements

    Science.gov (United States)

    2010-03-19

    2 entitled Human Computer Interface (HCI) Design Criteria Volume 1: User Interface Requirements which contains the following major changes from...MISSILE SYSTEMS CENTER Air Force Space Command 483 N. Aviation Blvd. El Segundo, CA 90245 4. This standard has been approved for use on all Space and...and efficient model of how the system works and can generalize this knowledge to other systems. According to Mayhew in Principles and Guidelines in

  2. Human-computer systems interaction backgrounds and applications 3

    CERN Document Server

    Kulikowski, Juliusz; Mroczek, Teresa; Wtorek, Jerzy

    2014-01-01

    This book contains an interesting and state-of-the-art collection of papers on recent progress in Human-Computer System Interaction (H-CSI). It contributes a profound description of the actual status of the H-CSI field and also provides a solid base for further development and research in the discussed area. The contents of the book are divided into the following parts: I. General human-system interaction problems; II. Health monitoring and disabled people helping systems; and III. Various information processing systems. This book is intended for a wide audience of readers who are not necessarily experts in computer science, machine learning or knowledge engineering, but are interested in Human-Computer Systems Interaction. The level of the particular papers and their arrangement into parts make this volume fascinating reading, giving the reader a much deeper insight than he/she might glean from research papers or talks at conferences. It touches on all deep issues that ...

  3. Computational Hemodynamic Simulation of Human Circulatory System under Altered Gravity

    Science.gov (United States)

    Kim, Chang Sung; Kiris, Cetin; Kwak, Dochan

    2003-01-01

    A computational hemodynamics approach is presented to simulate the blood flow through the human circulatory system under altered gravity conditions. Numerical techniques relevant to hemodynamics issues are introduced, covering non-Newtonian modeling for flow characteristics governed by red blood cells, distensible wall motion due to the heart pulse, and capillary bed modeling for outflow boundary conditions. Gravitational body force terms are added to the Navier-Stokes equations to study the effects of gravity on internal flows. Six types of gravity benchmark problems are presented to provide a fundamental understanding of gravitational effects on the human circulatory system. For code validation, computed results are compared with steady and unsteady experimental data for non-Newtonian flows in a carotid bifurcation model and a curved circular tube, respectively. This computational approach is then applied to the blood circulation in the human brain as a target problem. A three-dimensional, idealized Circle of Willis configuration is developed with minor arteries truncated based on anatomical data. Demonstrated is not only the mechanism of the collateral circulation but also the effects of gravity on the distensible wall motion and resultant flow patterns.
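    The modification described, adding gravitational body-force terms to the Navier-Stokes equations, takes the following generic form for incompressible flow with a shear-rate-dependent (non-Newtonian) viscosity; the specific viscosity model used in the paper is not reproduced here:

```latex
\nabla \cdot \mathbf{u} = 0, \qquad
\rho \left( \frac{\partial \mathbf{u}}{\partial t}
          + (\mathbf{u} \cdot \nabla)\mathbf{u} \right)
= -\nabla p
+ \nabla \cdot \left[ \mu(\dot{\gamma})
      \left( \nabla \mathbf{u} + \nabla \mathbf{u}^{\mathsf{T}} \right) \right]
+ \rho \mathbf{g},
```

    where mu(gamma-dot) is the shear-rate-dependent viscosity and the added term rho*g is the gravitational body force, with g varied to represent the altered gravity conditions of the benchmark problems.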

  4. Reconfiguring Interactivity, Agency and Pleasure in the Education and Computer Games Debate--Using Zizek's Concept of Interpassivity to Analyse Educational Play

    Science.gov (United States)

    Pelletier, Caroline

    2005-01-01

    Digital or computer games have recently attracted the interest of education researchers and policy-makers for two main reasons: their interactivity, which is said to allow greater agency, and their inherent pleasures, which are linked to increased motivation to learn. However, the relationship between pleasure, agency and motivation in educational…

  5. Criteria of Human-computer Interface Design for Computer Assisted Surgery Systems

    Institute of Scientific and Technical Information of China (English)

    ZHANG Jian-guo; LIN Yan-ping; WANG Cheng-tao; LIU Zhi-hong; YANG Qing-ming

    2008-01-01

    In recent years, computer assisted surgery (CAS) systems have become more and more common in clinical practice, but few specific design criteria have been proposed for the human-computer interface (HCI) in CAS systems. This paper attempts to give universal criteria of HCI design for CAS systems through a demonstration application: total knee replacement (TKR) with a nonimage-based navigation system. A typical computer assisted process can be divided into four phases: the preoperative planning phase, the intraoperative registration phase, the intraoperative navigation phase and finally the postoperative assessment phase. The interface design for each of the four phases is described in the demonstration application. The criteria summarized in this paper can help software developers achieve reliable and effective interfaces for new CAS systems more easily.

  6. Issues in human/computer control of dexterous remote hands

    Science.gov (United States)

    Salisbury, K.

    1987-01-01

    Much research on dexterous robot hands has been aimed at the design and control problems associated with their autonomous operation, while relatively little research has addressed the problem of direct human control. It is likely that these two modes can be combined in a complementary manner yielding more capability than either alone could provide. While many of the issues in mixed computer/human control of dexterous hands parallel those found in supervisory control of traditional remote manipulators, the unique geometry and capabilities of dexterous hands pose many new problems. Among these are the control of redundant degrees of freedom, grasp stabilization and specification of non-anthropomorphic behavior. An overview is given of progress made at the MIT AI Laboratory in control of the Salisbury 3 finger hand, including experiments in grasp planning and manipulation via controlled slip. It is also suggested how we might introduce human control into the process at a variety of functional levels.

  7. Advancements in Violin-Related Human-Computer Interaction

    DEFF Research Database (Denmark)

    Overholt, Daniel

    2014-01-01

    Finesse is required while performing with many traditional musical instruments, as they are extremely responsive to human inputs. The violin is specifically examined here, as it excels at translating a performer's gestures into sound in manners that evoke a wide range of affective qualities. ... of human intelligence and emotion is at the core of the Musical Interface Technology Design Space (MITDS). This is a framework that endeavors to retain and enhance such traits of traditional instruments in the design of interactive live performance interfaces. Utilizing the MITDS, advanced Human-Computer Interaction technologies for the violin are developed in order to allow musicians to explore new methods of creating music. Through this process, the aim is to provide musicians with control systems that let them transcend the interface itself, and focus on musically compelling performances.

  8. Nuclear energy debate

    CERN Document Server

    Healey, Justin

    2012-01-01

    The debate over the introduction of nuclear power in Australia has recently become more heated in light of safety concerns over the nuclear reactor meltdown emergency in Japan. Australia has also just committed to a carbon trading scheme to address its reliance on coal-fired energy and reduce greenhouse emissions. With 40% of the world's uranium located in Australia, the economic, environmental and health considerations are significant. This book contains an overview of global nuclear energy use and production, and presents a range of current opinions debating the pros and cons of Australia's

  9. Computed tomography of human joints and radioactive waste drums

    Energy Technology Data Exchange (ETDEWEB)

    Ashby, E; Bernardi, R; Hollerbach, K; Logan, C; Martz, H; Roberson, G P

    1999-06-01

    X- and gamma-ray imaging techniques in nondestructive evaluation (NDE) and assay (NDA) have seen increasing use in an array of industrial, environmental, military, and medical applications. Much of this growth in recent years is attributed to the rapid development of computed tomography (CT) and the use of NDE throughout the life-cycle of a product. Two diverse examples of CT are discussed. (1) The computational approach to normal joint kinematics and prosthetic joint analysis offers an opportunity to evaluate and improve prosthetic human joint replacements before they are manufactured or surgically implanted. Computed tomography data from scanned joints are segmented, resulting in the identification of bone and other tissues of interest, with emphasis on the articular surfaces. (2) They are developing NDE and NDA techniques to analyze closed waste drums accurately and quantitatively. Active and passive computed tomography (A&PCT) is a comprehensive and accurate gamma-ray NDA method that can identify all detectable radioisotopes present in a container and measure their radioactivity.

  10. Gesture controlled human-computer interface for the disabled.

    Science.gov (United States)

    Szczepaniak, Oskar M; Sawicki, Dariusz J

    2017-02-28

    The possibility of using a computer by a disabled person is one of the difficult problems of human-computer interaction (HCI), while professional activity (employment) is one of the most important factors affecting quality of life, especially for disabled people. The aim of the project has been to propose a new HCI system that would allow people who have lost the possibility of standard computer operation to resume employment. The basic requirement was to replace all functions of a standard mouse without the need to perform precise hand movements or use fingers. Microsoft's Kinect motion controller was selected as the device to recognize hand movements. Several tests were made in order to create an optimal working environment with the new device. A new communication system consisting of the Kinect device and appropriate software was built. The proposed system was tested by means of standard subjective evaluations and objective metrics according to the standard ISO 9241-411:2012. The overall rating of the new HCI system shows acceptance of the solution. The objective tests show that although the new system is a bit slower, it may effectively replace the computer mouse. The new HCI system fulfilled its task for a specific disabled person, which resulted in the ability to return to work. Additionally, the project confirmed the possibility of effective but nonstandard use of the Kinect device. Med Pr 2017;68(1):1-21.
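    ISO 9241-411, cited in this record, evaluates pointing devices via Fitts-law throughput: the effective target width is derived from the scatter of selection endpoints, and throughput is the effective index of difficulty divided by movement time. A minimal sketch of that calculation (the trial data below are invented):

```python
import math

# Fitts-throughput calculation as defined in ISO 9241-411 (and its
# predecessor ISO 9241-9): effective width from endpoint scatter,
# effective index of difficulty, throughput = IDe / MT.
def effective_width(endpoint_deviations):
    n = len(endpoint_deviations)
    mean = sum(endpoint_deviations) / n
    var = sum((x - mean) ** 2 for x in endpoint_deviations) / (n - 1)
    return 4.133 * math.sqrt(var)            # We = 4.133 * SDx

def throughput(distance, endpoint_deviations, mean_movement_time_s):
    we = effective_width(endpoint_deviations)
    ide = math.log2(distance / we + 1)       # effective index of difficulty (bits)
    return ide / mean_movement_time_s        # throughput in bits per second

# Hypothetical trial data: 256-px target distance, endpoint x-deviations
# in pixels, mean movement time 0.9 s.
tp = throughput(256, [-6, 4, 2, -3, 5, -2, 1, 3], 0.9)
```

    Comparing such throughput figures between the Kinect-based system and a mouse is what makes a "a bit slower, but usable" verdict quantitative.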

  11. Patient-Specific Computational Modeling of Human Phonation

    Science.gov (United States)

    Xue, Qian; Zheng, Xudong; University of Maine Team

    2013-11-01

    Phonation is a common biological process resulting from the complex nonlinear coupling between glottal aerodynamics and vocal fold vibrations. In the past, simplified symmetric straight geometric models were commonly employed for experimental and computational studies. The shape of the larynx lumen and vocal folds is in fact highly three-dimensional, and the complex realistic geometry produces profound impacts on both glottal flow and vocal fold vibrations. To elucidate the effect of geometric complexity on voice production and improve the fundamental understanding of human phonation, a full flow-structure interaction simulation is carried out on a patient-specific larynx model. To the best of our knowledge, this is the first patient-specific flow-structure interaction study of human phonation. The simulation results compare well with established human data. The effects of realistic geometry on glottal flow and vocal fold dynamics are investigated. It is found that both glottal flow and vocal fold dynamics differ substantially from the previous simplified model. This study also takes an important step toward the development of computer models for voice disease diagnosis and surgical planning. The project described was supported by Grant Number R01DC007125 from the National Institute on Deafness and Other Communication Disorders (NIDCD).

  12. Derailing the Growth Debate

    DEFF Research Database (Denmark)

    Nørgaard, Jørgen

    2009-01-01

    that we know today implies that the report was in any sense fundamentally wrong. A cohort of critics at the time, it can be said, was seriously in error when they managed to derail the debate by rejecting the report’s conclusions, and a lot of the critique was not related to the content of the report...

  13. Vitalism and the Darwin Debate

    Science.gov (United States)

    Henderson, James

    2012-01-01

    There are currently both scientific and public debates surrounding Darwinism. In the scientific debate, the details of evolution are in dispute, but not the central thesis of Darwin's theory; in the public debate, Darwinism itself is questioned. I concentrate on the public debate because of its direct impact on education in the United States. Some…


  15. Cyborg and education, an unfinished feminist debate / Cyborg y educación, un debate feminista inconcluso

    Directory of Open Access Journals (Sweden)

    José-Luis Anta Félez

    2012-10-01

    Full Text Available Western culture has been built on a series of dualisms. Thus was born the cyborg, half machine and half human, as a paradigm against the informatics of domination: the cyborg is a creature in a post-gender world; machine/computer/network and women share mimetic similarities in terms of flexibility, fluidity and plenipotentiality, which suggests an alliance between women and machines. Understanding and critically analyzing cyberfeminism provides a theoretical base from which to reflect on the changes and continuities in the discourse articulated around the idea of female representation, and on its ability to generate a debate that is suggestive, multifunctional and educationally coherent.

  16. Identification of Enhancers In Human: Advances In Computational Studies

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2016-03-24

    Roughly 50% of the human genome contains noncoding sequences serving as regulatory elements responsible for the diverse gene expression of the cells in the body. One very well studied category of regulatory elements is the category of enhancers. Enhancers increase the transcriptional output in cells through chromatin remodeling or recruitment of complexes of binding proteins. Identification of enhancers using computational techniques is an interesting area of research, and up to now several approaches have been proposed. However, the current state-of-the-art methods face limitations since, although the function of enhancers is clarified, their mechanism of function is not well understood. This PhD thesis presents a bioinformatics/computer science study that focuses on the problem of identifying enhancers in different human cells using computational techniques. The dissertation is decomposed into four main tasks that we present in different chapters. First, since many of the enhancers' functions are not well understood, we study the basic biological models by which enhancers trigger transcriptional functions and we survey comprehensively over 30 bioinformatics approaches for identifying enhancers. Next, we elaborate on the availability of enhancer data as produced by different enhancer identification methods and experimental procedures. In particular, we analyze the advantages and disadvantages of existing solutions and we report obstacles that require further consideration. To mitigate these problems we developed the Database of Integrated Human Enhancers (DENdb), a centralized online repository that archives enhancer data from 16 ENCODE cell-lines. The integrated enhancer data are also combined with many other experimental data that can be used to interpret the enhancers' content and generate a novel enhancer annotation that complements the existing integrative annotation proposed by the ENCODE consortium. Next, we propose the first deep-learning computational

  17. Shape perception in human and computer vision an interdisciplinary perspective

    CERN Document Server

    Dickinson, Sven J

    2013-01-01

    This comprehensive and authoritative text/reference presents a unique, multidisciplinary perspective on Shape Perception in Human and Computer Vision. Rather than focusing purely on the state of the art, the book provides viewpoints from world-class researchers reflecting broadly on the issues that have shaped the field. Drawing upon many years of experience, each contributor discusses the trends followed and the progress made, in addition to identifying the major challenges that still lie ahead. Topics and features: examines each topic from a range of viewpoints, rather than promoting a speci

  18. Computer simulations of human interferon gamma mutated forms

    Science.gov (United States)

    Lilkova, E.; Litov, L.; Petkov, P.; Petkov, P.; Markov, S.; Ilieva, N.

    2010-01-01

    In the general framework of computer-aided drug design, the method of molecular-dynamics simulations is applied to investigate the binding of human interferon-gamma (hIFN-γ) to its two known ligands (its extracellular receptor and heparin-derived oligosaccharides). A study of 100 mutated hIFN-γ forms is presented, the mutations encompassing residues 86-88. The structural changes are investigated by comparing the lengths of the α-helices in which these residues are included in the native hIFN-γ molecule and in the mutated forms. The most intriguing cases are examined in detail.

  19. Study on Human-Computer Interaction in Immersive Virtual Environment

    Institute of Scientific and Technical Information of China (English)

    段红; 黄柯棣

    2002-01-01

    Human-computer interaction is one of the most important issues in research on virtual environments. This paper introduces interaction software developed for a virtual operating environment for space experiments. The core components of the interaction software are: an object-oriented database for behavior management of virtual objects, a software agent called the virtual eye for viewpoint control, and a software agent called the virtual hand for object manipulation. Based on the above components, some instance programs for object manipulation have been developed. The user can observe the virtual environment through a head-mounted display system, control the viewpoint with a head tracker and/or keyboard, and select and manipulate virtual objects with a 3D mouse.

  20. A computational model of human auditory signal processing and perception

    DEFF Research Database (Denmark)

    Jepsen, Morten Løve; Ewert, Stephan D.; Dau, Torsten

    2008-01-01

    A model of computational auditory signal-processing and perception that accounts for various aspects of simultaneous and nonsimultaneous masking in human listeners is presented. The model is based on the modulation filterbank model described by Dau et al. [J. Acoust. Soc. Am. 102, 2892 (1997)] but includes major changes at the peripheral and more central stages of processing. The model contains outer- and middle-ear transformations, a nonlinear basilar-membrane processing stage, a hair-cell transduction stage, a squaring expansion, an adaptation stage, a 150-Hz lowpass modulation filter, a bandpass...
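    Two of the early stages named in this record, hair-cell-style transduction and the 150-Hz lowpass modulation filter, can be illustrated with a toy envelope extractor: half-wave rectification followed by a one-pole lowpass smoother. This is an illustrative sketch only, not the published model's actual filter structure:

```python
import numpy as np

fs = 16000.0                                    # sample rate (Hz)
t = np.arange(int(0.1 * fs)) / fs
carrier = np.sin(2 * np.pi * 1000 * t)          # 1-kHz tone
am = 1 + 0.5 * np.sin(2 * np.pi * 8 * t)        # 8-Hz amplitude modulation
x = am * carrier

rectified = np.maximum(x, 0.0)                  # half-wave rectification

fc = 150.0                                      # lowpass cutoff (Hz)
alpha = 1 - np.exp(-2 * np.pi * fc / fs)        # one-pole smoothing coefficient
env = np.empty_like(rectified)
acc = 0.0
for i, v in enumerate(rectified):
    acc += alpha * (v - acc)                    # first-order IIR lowpass
    env[i] = acc
```

    The smoothed output follows the 8-Hz modulation while largely suppressing the 1-kHz carrier, which is the job the modulation-filter front end performs before the more central stages of the model.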

  1. Atoms of recognition in human and computer vision.

    Science.gov (United States)

    Ullman, Shimon; Assif, Liav; Fetaya, Ethan; Harari, Daniel

    2016-03-01

    Discovering the visual features and representations used by the brain to recognize objects is a central problem in the study of vision. Recently, neural network models of visual object recognition, including biological and deep network models, have shown remarkable progress and have begun to rival human performance in some challenging tasks. These models are trained on image examples and learn to extract features and representations and to use them for categorization. It remains unclear, however, whether the representations and learning processes discovered by current models are similar to those used by the human visual system. Here we show, by introducing and using minimal recognizable images, that the human visual system uses features and processes that are not used by current models and that are critical for recognition. We found by psychophysical studies that at the level of minimal recognizable images a minute change in the image can have a drastic effect on recognition, thus identifying features that are critical for the task. Simulations then showed that current models cannot explain this sensitivity to precise feature configurations and, more generally, do not learn to recognize minimal images at a human level. The role of the features shown here is revealed uniquely at the minimal level, where the contribution of each feature is essential. A full understanding of the learning and use of such features will extend our understanding of visual recognition and its cortical mechanisms and will enhance the capacity of computational models to learn from visual experience and to deal with recognition and detailed image interpretation.

  2. 'Homeopathy': untangling the debate.

    Science.gov (United States)

    Relton, Clare; O'Cathain, Alicia; Thomas, Kate J

    2008-07-01

    There are active public campaigns both for and against homeopathy, and its continuing availability in the NHS is debated in the medical, scientific and popular press. However, there is a lack of clarity in key terms used in the debate, and in how the evidence base of homeopathy is described and interpreted. The term 'homeopathy' is used with several different meanings including: the therapeutic system, homeopathic medicine, treatment by a homeopath, and the principles of 'homeopathy'. Conclusions drawn from one of these aspects are often inappropriately applied to another aspect. In interpreting the homeopathy evidence it is important to understand that the existing clinical experimental (randomised controlled trial) evidence base provides evidence as to the efficacy of homeopathic medicines, but not the effectiveness of treatment by a homeopath. The observational evidence base provides evidence as to the effectiveness of treatment by a homeopath. We make four recommendations to promote clarity in the reporting, design and interpretation of homeopathy research.

  3. Anonymity in Classroom Voting and Debating

    Science.gov (United States)

    Ainsworth, Shaaron; Gelmini-Hornsby, Giulia; Threapleton, Kate; Crook, Charles; O'Malley, Claire; Buda, Marie

    2011-01-01

    The advent of networked environments into the classroom is changing classroom debates in many ways. This article addresses one key attribute of these environments, namely anonymity, to explore its consequences for co-present adolescents who are anonymous, by virtue of the computer system, to their peers but not to their teachers. Three studies with 16-17 year-olds used a…

  4. Computational modeling of hypertensive growth in the human carotid artery

    Science.gov (United States)

    Sáez, Pablo; Peña, Estefania; Martínez, Miguel Angel; Kuhl, Ellen

    2014-06-01

    Arterial hypertension is a chronic medical condition associated with an elevated blood pressure. Chronic arterial hypertension initiates a series of events that collectively lead to arterial wall thickening. However, the correlation between macrostructural mechanical loading, microstructural cellular changes, and macrostructural adaptation remains unclear. Here, we present a microstructurally motivated computational model for chronic arterial hypertension through smooth muscle cell growth. To model growth, we adopt a classical concept based on the multiplicative decomposition of the deformation gradient into an elastic part and a growth part. Motivated by clinical observations, we assume that the driving force for growth is the stretch sensed by the smooth muscle cells. We embed our model into a finite element framework, where growth is stored locally as an internal variable. First, to demonstrate the features of our model, we investigate the effects of hypertensive growth in a real human carotid artery. Our results agree both qualitatively and quantitatively with experimental data reported in the literature.
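    The growth kinematics described above can be stated compactly. The evolution law below is one representative stretch-driven form with an isotropic growth tensor, an assumption for illustration rather than the paper's exact equations:

```latex
% Multiplicative split of the deformation gradient into elastic and growth parts,
% with a scalar growth multiplier \vartheta driven by the smooth-muscle stretch
% \lambda relative to a homeostatic set point \lambda^h (the Macaulay brackets
% \langle\cdot\rangle keep growth active only above the set point).
\[
  \mathbf{F} = \mathbf{F}^{e}\,\mathbf{F}^{g},
  \qquad
  \mathbf{F}^{g} = \vartheta\,\mathbf{I},
  \qquad
  \dot{\vartheta} = k(\vartheta)\,\langle \lambda - \lambda^{h} \rangle .
\]
```

    Storing \(\vartheta\) at each integration point is what the abstract means by growth being "stored locally as an internal variable" in the finite element framework.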

  5. Human-computer interface glove using flexible piezoelectric sensors

    Science.gov (United States)

    Cha, Youngsu; Seo, Jeonggyu; Kim, Jun-Sik; Park, Jung-Min

    2017-05-01

    In this note, we propose a human-computer interface glove based on flexible piezoelectric sensors. We select polyvinylidene fluoride as the piezoelectric material for the sensors because of advantages such as a steady piezoelectric characteristic and good flexibility. The sensors are installed in a fabric glove by means of pockets and Velcro bands. We detect changes in the angles of the finger joints from the outputs of the sensors, and use them for controlling a virtual hand that is utilized in virtual object manipulation. To assess the sensing ability of the piezoelectric sensors, we compare the processed angles from the sensor outputs with the real angles from a camera recording. With good agreement between the processed and real angles, we successfully demonstrate the user interaction system with the virtual hand and interface glove based on the flexible piezoelectric sensors, for four hand motions: fist clenching, pinching, touching, and grasping.
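    Piezoelectric film responds to the rate of bending, so one plausible way to recover a joint angle is to scale and integrate the sensor output. The sketch below works under that assumption; the calibration gain `k_cal` and the idealized rate signal are hypothetical, not the authors' calibration procedure.

```python
import numpy as np

def angle_from_piezo(v, fs, k_cal=1.0):
    """Integrate sensor output v (proportional to bending rate, sampled at fs Hz)
    into a joint angle in degrees, using an assumed calibration gain k_cal."""
    return np.cumsum(v) * (k_cal / fs)

fs = 100.0
t = np.arange(0, 1, 1 / fs)
true_angle = 45.0 * (1 - np.cos(np.pi * t)) / 2   # smooth 0 -> 45 deg flexion
v = np.gradient(true_angle, 1 / fs)               # idealized rate-type sensor signal
est = angle_from_piezo(v, fs)                     # recovered angle trajectory
```

    In practice the integration drifts with sensor offset, which is why glove systems typically re-zero at a known hand pose.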

  6. Human-computer interface incorporating personal and application domains

    Science.gov (United States)

    Anderson, Thomas G.

    2011-03-29

    The present invention provides a human-computer interface. The interface includes provision of an application domain, for example corresponding to a three-dimensional application. The user is allowed to navigate and interact with the application domain. The interface also includes a personal domain, offering the user controls and interaction distinct from the application domain. The separation into two domains allows the most suitable interface methods in each: for example, three-dimensional navigation in the application domain, and two- or three-dimensional controls in the personal domain. Transitions between the application domain and the personal domain are under control of the user, and the transition method is substantially independent of the navigation in the application domain. For example, the user can fly through a three-dimensional application domain, and always move to the personal domain by moving a cursor near one extreme of the display.

  7. Combining Natural Human-Computer Interaction and Wireless Communication

    Directory of Open Access Journals (Sweden)

    Ştefan Gheorghe PENTIUC

    2011-01-01

    Full Text Available In this paper we present how human-computer interaction can be improved by using wireless communication between devices. Devices that offer a natural user interaction, like the Microsoft Surface Table and tablet PCs, can work together to enhance the experience of an application. Users can use physical objects for a more natural way of handling the virtual world on one hand, and interact with other users wirelessly connected on the other. Physical objects that interact with the surface table have a tag attached to them, allowing us to identify them and take the required action. The TCP/IP protocol was used to handle the communication over the wireless network. A server and a client application were developed for the devices used. To target a wide range of mobile devices, different frameworks for developing cross-platform applications were analyzed.

  8. Wearable joystick for gloves-on human/computer interaction

    Science.gov (United States)

    Bae, Jaewook; Voyles, Richard M.

    2006-05-01

    In this paper, we present preliminary work on a novel wearable joystick for gloves-on human/computer interaction in hazardous environments. Interacting with traditional input devices can be clumsy and inconvenient for the operator in hazardous environments due to the bulkiness of multiple system components and troublesome wires. During a collapsed structure search, for example, protective clothing, uneven footing, and "snag" points in the environment can render traditional input devices impractical. Wearable computing has been studied by various researchers to increase the portability of devices and to improve the proprioceptive sense of the wearer's intentions. Specifically, glove-like input devices to recognize hand gestures have been developed for general-purpose applications. But, regardless of their performance, prior gloves have been fragile and cumbersome to use in rough environments. In this paper, we present a new wearable joystick to remove the wires from a simple, two-degree of freedom glove interface. Thus, we develop a wearable joystick that is low cost, durable and robust, and wire-free at the glove. In order to evaluate the wearable joystick, we take into consideration two metrics during operator tests of a commercial robot: task completion time and path tortuosity. We employ fractal analysis to measure path tortuosity. Preliminary user test results are presented that compare the performance of both a wearable joystick and a traditional joystick.
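    The abstract says only that fractal analysis measures path tortuosity. As an illustrative stand-in, the sketch below computes the Katz fractal dimension of a 2-D path, which is 1.0 for a straight line and grows as the path meanders; the choice of Katz's estimator is an assumption, not necessarily the authors' method.

```python
import numpy as np

def katz_fd(path):
    """Katz fractal dimension of an (N, 2) array of path points."""
    steps = np.diff(path, axis=0)
    L = np.sum(np.linalg.norm(steps, axis=1))            # total path length
    d = np.max(np.linalg.norm(path - path[0], axis=1))   # planar extent from start
    n = len(steps)                                       # number of steps
    return np.log(n) / (np.log(n) + np.log(d / L))

straight = np.column_stack([np.linspace(0, 1, 50), np.zeros(50)])
zigzag = np.column_stack([np.linspace(0, 1, 50),
                          0.2 * (-1.0) ** np.arange(50)])
```

    A lower dimension for the wearable joystick than for the traditional one would indicate straighter, less corrective robot paths.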

  9. Debating China's assertiveness

    DEFF Research Database (Denmark)

    He, Kai; Feng, Huiyun

    2012-01-01

    Engaging the recent debate on China's assertive foreign policy, we suggest that it is normal for China – a rising power – to change its policy to a confident or even assertive direction because of its transformed national interests. We argue also that it is better to understand future US–China relations as a bargaining process. Whereas China negotiates for a new status in the system with redefined interests, the United States and other countries need to adjust their old political practices. China's ‘core interest’ diplomacy launched in 2009 is the first step in revealing ‘private information’…

  10. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry’s most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  11. Framing the patent troll debate.

    Science.gov (United States)

    Risch, Michael

    2014-02-01

    The patent troll debate has reached a fevered pitch in the USA. This editorial seeks to frame the debate by pointing out the lack of clarity in defining patent trolls and their allegedly harmful actions. It then frames the debate by asking currently unanswered questions: Where do troll patents come from? What are the effects of troll assertions? Will policy changes improve the system?

  12. Are Debates Helpful to Voters?

    Science.gov (United States)

    Chaffee, Steven H.

    The usefulness of presidential debates to the electorate and to the total political system is evaluated in this paper. The paper first reports the results of opinion polls concerning the value of the 1976 debates and cites studies showing the types of information that people obtained from watching the debates. It then considers whether voters'…

  13. A computational model for dynamic analysis of the human gait.

    Science.gov (United States)

    Vimieiro, Claysson; Andrada, Emanuel; Witte, Hartmut; Pinotti, Marcos

    2015-01-01

    Biomechanical models are important tools in the study of human motion. This work proposes a computational model to analyse the dynamics of lower limb motion using a kinematic chain to represent the body segments and rotational joints linked by viscoelastic elements. The model uses anthropometric parameters, ground reaction forces and joint Cardan angles from subjects to analyse lower limb motion during the gait. The model allows evaluating these data in each body plane. Six healthy subjects walked on a treadmill to record the kinematic and kinetic data. In addition, anthropometric parameters were recorded to construct the model. The viscoelastic parameter values were fitted for the model joints (hip, knee and ankle). The proposed model demonstrated that manipulating the viscoelastic parameters between the body segments could fit the amplitudes and frequencies of motion. The data collected in this work have viscoelastic parameter values that follow a normal distribution, indicating that these values are directly related to the gait pattern. To validate the model, we used the values of the joint angles to perform a comparison between the model results and previously published data. The model results show the same pattern and range of values found in the literature for the human gait motion.
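    The viscoelastic elements between segments can be pictured as rotational spring-dampers. A minimal single-joint sketch, with illustrative parameter values rather than the fitted ones from the study:

```python
import numpy as np

def simulate_joint(k, c, inertia, theta0=0.0, theta_init=0.5,
                   dt=1e-3, steps=5000):
    """Integrate one viscoelastic rotational joint (semi-implicit Euler).
    Joint torque: tau = -k*(theta - theta0) - c*omega (spring-damper)."""
    theta, omega = theta_init, 0.0
    history = []
    for _ in range(steps):
        tau = -k * (theta - theta0) - c * omega  # viscoelastic restoring torque
        omega += (tau / inertia) * dt            # update angular velocity first
        theta += omega * dt                      # then angle (symplectic Euler)
        history.append(theta)
    return np.array(history)

# Illustrative stiffness (N·m/rad), damping (N·m·s/rad) and segment inertia (kg·m²)
traj = simulate_joint(k=20.0, c=2.0, inertia=0.1)
```

    Varying `k` and `c` changes the amplitude and frequency of the joint's response, which is the mechanism the abstract uses to fit the recorded gait.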

  14. A multisegment computer simulation of normal human gait.

    Science.gov (United States)

    Gilchrist, L A; Winter, D A

    1997-12-01

    The goal of this project was to develop a computer simulation of normal human walking that would use as driving moments resultant joint moments from a gait analysis. The system description, initial conditions and driving moments were taken from an inverse dynamics analysis of a normal walking trial. A nine-segment three-dimensional (3-D) model, including a two-part foot, was used. Torsional, linear springs and dampers were used at the hip joints to keep the trunk vertical and at the knee and ankle joints to prevent nonphysiological motion. Dampers at other joints were required to ensure a smooth and realistic motion. The simulated human successfully completed one step (550 ms), including both single and double support phases. The model proved to be sensitive to changes in the spring stiffness values of the trunk controllers. Similar sensitivity was found with the springs used to prevent hyperextension of the knee at heel contact and of the metatarsal-phalangeal joint at push-off. In general, there was much less sensitivity to the damping coefficients. This simulation improves on previous efforts because it incorporates some features necessary in simulations designed to answer clinical science questions. Other control algorithms are required, however, to ensure that the model can be realistically adapted to different subjects.

  15. Debate internacional sobre pobreza

    Directory of Open Access Journals (Sweden)

    Neritza Alvarado Chacín

    2016-01-01

    Full Text Available The aim of this article is to organize the proliferation of ideas about poverty, which are often scattered, diffuse and confused in the literature, by systematically outlining a theoretical frame of reference useful for the scientific study of this phenomenon, both in theoretical and applied research and in the teaching and discussion of these topics at the university level. To this end, diverse approaches to poverty are compiled, organized and classified according to the criteria present in the proposals of relevant authors and institutions that have contributed to this reflection from different readings, bearing in mind that there is no consensus on a definition. The research is exploratory and documentary. The techniques applied are electronic and physical surveys of the information, computerized note-taking, and its reduction into tables of content. The temporal scope of these approaches is specified according to the conceptual evolution they have undergone in the international debate. Some advantages and disadvantages of the main approaches, acknowledged in that same debate, are noted.

  16. Hybrid Human-Computing Distributed Sense-Making: Extending the SOA Paradigm for Dynamic Adjudication and Optimization of Human and Computer Roles

    Science.gov (United States)

    Rimland, Jeffrey C.

    2013-01-01

    In many evolving systems, inputs can be derived from both human observations and physical sensors. Additionally, many computation and analysis tasks can be performed by either human beings or artificial intelligence (AI) applications. For example, weather prediction, emergency event response, assistive technology for various human sensory and…

  17. Computational modeling and analysis of the hydrodynamics of human swimming

    Science.gov (United States)

    von Loebbecke, Alfred

    Computational modeling and simulations are used to investigate the hydrodynamics of competitive human swimming. The simulations employ an immersed boundary (IB) solver that allows us to simulate viscous, incompressible, unsteady flow past complex, moving/deforming three-dimensional bodies on stationary Cartesian grids. This study focuses on the hydrodynamics of the "dolphin kick". Three female and two male Olympic level swimmers are used to develop kinematically accurate models of this stroke for the simulations. A simulation of a dolphin undergoing its natural swimming motion is also presented for comparison. CFD enables the calculation of flow variables throughout the domain and over the swimmer's body surface during the entire kick cycle. The feet are responsible for all thrust generation in the dolphin kick. Moreover, it is found that the down-kick (ventral position) produces more thrust than the up-kick. A quantity of interest to the swimming community is the drag of a swimmer in motion (active drag). Accurate estimates of this quantity have been difficult to obtain in experiments but are easily calculated with CFD simulations. Propulsive efficiencies of the human swimmers are found to be in the range of 11% to 30%. The dolphin simulation case has a much higher efficiency of 55%. Investigation of vortex structures in the wake indicate that the down-kick can produce a vortex ring with a jet of accelerated fluid flowing through its center. This vortex ring and the accompanying jet are the primary thrust generating mechanisms in the human dolphin kick. In an attempt to understand the propulsive mechanisms of surface strokes, we have also conducted a computational analysis of two different styles of arm-pulls in the backstroke and the front crawl. These simulations involve only the arm and no air-water interface is included. Two of the four strokes are specifically designed to take advantage of lift-based propulsion by undergoing lateral motions of the hand

  18. Open-Box Muscle-Computer Interface: Introduction to Human-Computer Interactions in Bioengineering, Physiology, and Neuroscience Courses

    Science.gov (United States)

    Landa-Jiménez, M. A.; González-Gaspar, P.; Pérez-Estudillo, C.; López-Meraz, M. L.; Morgado-Valle, C.; Beltran-Parrazal, L.

    2016-01-01

    A Muscle-Computer Interface (muCI) is a human-machine system that uses electromyographic (EMG) signals to communicate with a computer. Surface EMG (sEMG) signals are currently used to command robotic devices, such as robotic arms and hands, and mobile robots, such as wheelchairs. These signals reflect the motor intention of a user before the…

  20. The Human-Computer Interface and Information Literacy: Some Basics and Beyond.

    Science.gov (United States)

    Church, Gary M.

    1999-01-01

    Discusses human/computer interaction research, human/computer interface, and their relationships to information literacy. Highlights include communication models; cognitive perspectives; task analysis; theory of action; problem solving; instructional design considerations; and a suggestion that human/information interface may be a more appropriate…

  1. Mutations that Cause Human Disease: A Computational/Experimental Approach

    Energy Technology Data Exchange (ETDEWEB)

    Beernink, P; Barsky, D; Pesavento, B

    2006-01-11

    International genome sequencing projects have produced billions of nucleotides (letters) of DNA sequence data, including the complete genome sequences of 74 organisms. These genome sequences have created many new scientific opportunities, including the ability to identify sequence variations among individuals within a species. These genetic differences, which are known as single nucleotide polymorphisms (SNPs), are particularly important in understanding the genetic basis for disease susceptibility. Since the report of the complete human genome sequence, over two million human SNPs have been identified, including a large-scale comparison of an entire chromosome from twenty individuals. Of the protein coding SNPs (cSNPs), approximately half lead to a single amino acid change in the encoded protein (non-synonymous coding SNPs). Most of these changes are functionally silent, while the remainder negatively impact the protein and sometimes cause human disease. To date, over 550 SNPs have been found to cause single locus (monogenic) diseases and many others have been associated with polygenic diseases. SNPs have been linked to specific human diseases, including late-onset Parkinson disease, autism, rheumatoid arthritis and cancer. The ability to predict accurately the effects of these SNPs on protein function would represent a major advance toward understanding these diseases. To date, several attempts have been made toward predicting the effects of such mutations. The most successful of these is a computational approach called "Sorting Intolerant From Tolerant" (SIFT). This method uses sequence conservation among many similar proteins to predict which residues in a protein are functionally important. However, this method suffers from several limitations. First, a query sequence must have a sufficient number of relatives to infer sequence conservation. Second, this method does not make use of or provide any information on protein structure, which…
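    SIFT itself involves more machinery, but the core idea the abstract describes, using conservation across similar sequences to judge a substitution, can be sketched as follows. The toy alignment and the entropy-based threshold are assumptions for illustration:

```python
import math
from collections import Counter

def column_conservation(column):
    """1 minus the normalized Shannon entropy of residues in one alignment column
    (1.0 = fully conserved, 0.0 = maximally variable over 20 amino acids)."""
    counts = Counter(column)
    total = len(column)
    entropy = -sum((c / total) * math.log2(c / total) for c in counts.values())
    return 1.0 - entropy / math.log2(20)

def predict_deleterious(alignment, pos, new_residue, threshold=0.8):
    """Flag a substitution that hits a conserved position with an unseen residue."""
    column = [seq[pos] for seq in alignment]
    conserved = column_conservation(column) >= threshold
    return conserved and new_residue not in column

# Hypothetical toy alignment of four homologous sequences
alignment = ["MKTAY", "MKTGY", "MKSAY", "MKTAY"]
# position 0 is invariant ('M'); substituting an unseen residue there is flagged
```

    The first limitation the abstract notes is visible here: with too few related sequences the entropy estimate, and hence the prediction, is unreliable.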

  2. Human Pacman: A Mobile Augmented Reality Entertainment System Based on Physical, Social, and Ubiquitous Computing

    Science.gov (United States)

    Cheok, Adrian David

    This chapter details the Human Pacman system to illuminate entertainment computing which ventures to embed the natural physical world seamlessly with a fantasy virtual playground by capitalizing on infrastructure provided by mobile computing, wireless LAN, and ubiquitous computing. With Human Pacman, we have a physical role-playing computer fantasy together with real human-social and mobile-gaming that emphasizes collaboration and competition between players in a wide outdoor physical area that allows natural wide-area human-physical movements. Pacmen and Ghosts are now real human players in the real world experiencing mixed computer graphics fantasy-reality provided by using the wearable computers on them. Virtual cookies and actual tangible physical objects are incorporated into the game play to provide novel experiences of seamless transitions between the real and virtual worlds. This is an example of a new form of gaming that anchors on physicality, mobility, social interaction, and ubiquitous computing.

  3. Effective Use of Human Computer Interaction in Digital Academic Supportive Devices

    OpenAIRE

    Thuseethan, S.; Kuhanesan, S.

    2015-01-01

    In this research, the literature on human-computer interaction is reviewed, and the technological aspects of human-computer interaction related to digital academic supportive devices are analyzed. Based on these concerns, recommendations for designing good human-computer digital academic supportive devices are analyzed and proposed. Due to improvements in both hardware and software, digital devices have unveiled continuous advances in efficiency and processing capacity. However, many of th...

  4. The Danish Biofuel Debate

    DEFF Research Database (Denmark)

    Hansen, Janus

    2014-01-01

    What role does scientific claims-making play in the worldwide promotion of biofuels for transport, which continues despite serious concerns about its potentially adverse social and environmental effects? And how do actors with very different and conflicting viewpoints on the benefits and drawbacks of biofuels enrol scientific authority to support their positions? The sociological theory of functional differentiation combined with the concept of advocacy coalition can help in exploring this relationship between scientific claims-making and the policy stance of different actors in public debates about… One perspective works upwards from the molecular level and envisions positive synergies in the use of biomass. The other is a holistic bioscarcity perspective originating in life-cycle analysis and ecology. This perspective works downwards from global resource scope conditions, and envisions negative consequences from an increased reliance…

  5. Computational Thinking: A Digital Age Skill for Everyone

    Science.gov (United States)

    Barr, David; Harrison, John; Conery, Leslie

    2011-01-01

    In a seminal article published in 2006, Jeanette Wing described computational thinking (CT) as a way of "solving problems, designing systems, and understanding human behavior by drawing on the concepts fundamental to computer science." Wing's article gave rise to an often controversial discussion and debate among computer scientists,…

  6. Energies: the real debate; Energies: Le Vrai Debat

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    In parallel with the National Debate on energy, a 'real debate' has been proposed by seven associations for the protection and improvement of the environment. This international debate proposes: a panorama of the stakes, a presentation of nuclear power as an energy source that is not necessarily dangerous, the relation between climate and employment, and the conditions for the existence and development of a local energy policy. (A.L.B.)

  7. Computational lipidology: predicting lipoprotein density profiles in human blood plasma.

    Directory of Open Access Journals (Sweden)

    Katrin Hübner

    2008-05-01

    Full Text Available Monitoring cholesterol levels is strongly recommended to identify patients at risk for myocardial infarction. However, clinical markers beyond "bad" and "good" cholesterol are needed to precisely predict individual lipid disorders. Our work contributes to this aim by bringing together experiment and theory. We developed a novel computer-based model of the human plasma lipoprotein metabolism in order to simulate the blood lipid levels in high resolution. Instead of focusing on a few conventionally used predefined lipoprotein density classes (LDL, HDL), we consider the entire protein and lipid composition spectrum of individual lipoprotein complexes. Subsequently, their distribution over density (which equals the lipoprotein profile) is calculated. As our main results, we (i) successfully reproduced clinically measured lipoprotein profiles of healthy subjects; (ii) assigned lipoproteins to narrow density classes, named high-resolution density sub-fractions (hrDS), revealing heterogeneous lipoprotein distributions within the major lipoprotein classes; and (iii) present model-based predictions of changes in the lipoprotein distribution elicited by disorders in underlying molecular processes. In its present state, the model offers a platform for many future applications aimed at understanding the reasons for inter-individual variability, identifying new sub-fractions of potential clinical relevance and a patient-oriented diagnosis of the potential molecular causes for individual dyslipidemia.
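    The model's central bookkeeping, in which a particle's buoyant density follows from its component masses and then places it in a density sub-fraction, can be sketched as follows. The component densities and bin edges are rough illustrative values, not the paper's parameters:

```python
# Assumed component densities in g/mL (protein is denser than lipid)
RHO = {"protein": 1.35, "lipid": 0.92}

def particle_density(masses):
    """Buoyant density of a particle: total mass over total volume,
    for a dict of component masses in grams."""
    volume = sum(m / RHO[c] for c, m in masses.items())
    return sum(masses.values()) / volume

def density_subfraction(rho, edges=(1.006, 1.019, 1.063, 1.21)):
    """Index of the narrow density bin (hrDS-style) containing rho;
    the bin edges here are illustrative, not the paper's."""
    return sum(rho > e for e in edges)

protein_rich = particle_density({"protein": 0.5, "lipid": 0.5})  # denser particle
lipid_rich = particle_density({"protein": 0.2, "lipid": 0.8})    # buoyant particle
```

    Sweeping over the full composition spectrum and histogramming the resulting densities yields the high-resolution profile the abstract describes.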

  8. Brain computer interface to enhance episodic memory in human participants

    Directory of Open Access Journals (Sweden)

    John F Burke

    2015-01-01

    Full Text Available Recent research has revealed that neural oscillations in the theta (4-8 Hz) and alpha (9-14 Hz) bands are predictive of future success in memory encoding. Because these signals occur before the presentation of an upcoming stimulus, they are considered stimulus-independent in that they correlate with enhanced memory encoding independent of the item being encoded. Thus, such stimulus-independent activity has important implications for the neural mechanisms underlying episodic memory as well as the development of cognitive neural prosthetics. Here, we developed a brain computer interface (BCI) to test the ability of such pre-stimulus activity to modulate subsequent memory encoding. We recorded intracranial electroencephalography (iEEG) in neurosurgical patients as they performed a free recall memory task, and detected iEEG theta and alpha oscillations that correlated with optimal memory encoding. We then used these detected oscillatory changes to trigger the presentation of items in the free recall task. We found that item presentation contingent upon the presence of prestimulus theta and alpha oscillations modulated memory performance in more sessions than expected by chance. Our results suggest that an electrophysiological signal may be causally linked to a specific behavioral condition, and contingent stimulus presentation has the potential to modulate human memory encoding.
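    The contingent-presentation loop can be sketched as: estimate band power in a window of the recording and present the next item only when that power crosses a threshold. The window length, FFT-based power estimate, and threshold below are illustrative assumptions, not the authors' detection pipeline:

```python
import numpy as np

def band_power(window, fs, lo=4.0, hi=8.0):
    """Mean spectral power of `window` between lo and hi Hz (theta by default)."""
    freqs = np.fft.rfftfreq(len(window), 1.0 / fs)
    power = np.abs(np.fft.rfft(window)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return power[mask].mean()

def should_present(window, fs, threshold):
    """Trigger stimulus presentation only during elevated theta-band power."""
    return band_power(window, fs) > threshold

fs = 500  # assumed iEEG sampling rate (Hz)
t = np.arange(0, 1, 1 / fs)
theta_burst = np.sin(2 * np.pi * 6 * t)       # strong 6-Hz oscillation: trigger
beta_only = 0.2 * np.sin(2 * np.pi * 20 * t)  # weak, outside the band: no trigger
```

    In a closed-loop session this check would run continuously on the most recent window, gating when the next free-recall item appears.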

  9. Psychosocial and Cultural Modeling in Human Computation Systems: A Gamification Approach

    Energy Technology Data Exchange (ETDEWEB)

    Sanfilippo, Antonio P.; Riensche, Roderick M.; Haack, Jereme N.; Butner, R. Scott

    2013-11-20

    “Gamification”, the application of gameplay to real-world problems, enables the development of human computation systems that support decision-making through the integration of social and machine intelligence. One of gamification’s major benefits includes the creation of a problem solving environment where the influence of cognitive and cultural biases on human judgment can be curtailed through collaborative and competitive reasoning. By reducing biases on human judgment, gamification allows human computation systems to exploit human creativity relatively unhindered by human error. Operationally, gamification uses simulation to harvest human behavioral data that provide valuable insights for the solution of real-world problems.

  10. Human-Centered Software Engineering: Software Engineering Architectures, Patterns, and Models for Human Computer Interaction

    Science.gov (United States)

    Seffah, Ahmed; Vanderdonckt, Jean; Desmarais, Michel C.

    The Computer-Human Interaction and Software Engineering (CHISE) series of edited volumes originated from a number of workshops and discussions over the latest research and developments in the field of Human Computer Interaction (HCI) and Software Engineering (SE) integration, convergence and cross-pollination. A first volume in this series (CHISE Volume I - Human-Centered Software Engineering: Integrating Usability in the Development Lifecycle) aims at bridging the gap between the field of SE and HCI, and addresses specifically the concerns of integrating usability and user-centered systems design methods and tools into the software development lifecycle and practices. This has been done by defining techniques, tools and practices that can fit into the entire software engineering lifecycle as well as by defining ways of addressing the knowledge and skills needed, and the attitudes and basic values that a user-centered development methodology requires. The first volume has been edited as Vol. 8 in the Springer HCI Series (Seffah, Gulliksen and Desmarais, 2005).

  11. Human-Centered Design of Human-Computer-Human Dialogs in Aerospace Systems

    Science.gov (United States)

    Mitchell, Christine M.

    1998-01-01

    A series of ongoing research programs at Georgia Tech established a need for a simulation support tool for aircraft computer-based aids. This led to the design and development of the Georgia Tech Electronic Flight Instrument Research Tool (GT-EFIRT). GT-EFIRT is a part-task flight simulator specifically designed to study aircraft display design and single pilot interaction. The simulator, using commercially available graphics and Unix workstations, replicates to a high level of fidelity the Electronic Flight Instrument Systems (EFIS), Flight Management Computer (FMC) and Auto Flight Director System (AFDS) of the Boeing 757/767 aircraft. The simulator can be configured to present information using conventional looking B757/767 displays or next generation Primary Flight Displays (PFD) such as found on the Beech Starship and MD-11.

  12. The euthanasia debate.

    Science.gov (United States)

    Harris, N M

    2001-10-01

    Debates about the moral dilemmas of euthanasia date back to ancient times. Many of the historical arguments used for and against the practice remain valid today. Indeed, any form of discussion on the subject often provokes emotive responses, both from members of the medical profession and the general public. For this reason alone, the issue will continue to be debated at all levels of society. There are, however, other factors that ensure euthanasia will remain a subject of major controversy within medical, legal and governmental bodies. Firstly, the act of euthanasia itself is illegal, yet in its passive form occurs on a daily basis in many of our hospitals (1). Secondly, medical advances have made it possible to artificially prolong the life of an increasing number of patients far beyond what was possible only a few years ago. Furthermore, we must all contend with the reality that financial constraints are an important consideration in modern health care provision. Finally, there is an ethical difficulty in interpreting the concept of a patient's right, or autonomy, versus the rights and duty of a doctor. Before attempting to answer the questions posed by these issues, it is important to have some accurate definitions of both euthanasia and of the concept of morality. According to the House of Lords Select Committee on Medical Ethics, the precise definition of euthanasia is "a deliberate intervention undertaken with the express intention of ending a life, to relieve intractable suffering" (2). The term can be further divided into voluntary and involuntary euthanasia. The former is said to occur if a competent patient makes an informed request for a life terminating event and the latter can be used if a patient does not give informed and specific consent for such treatment. It is the occurrence of involuntary euthanasia which forms one of the main arguments against legalisation. This is discussed in greater detail below. Euthanasia is frequently separated into

  13. 'To arrive where we started, and know the place for the first time': Heidegger, phenomenology, the way human beings first appear in the world, and fresh perspectives on the abortion debate.

    Science.gov (United States)

    Mumford, James

    2013-01-01

    Intellectual stalemate in the abortion debate can be traced in part to its being framed as a standoff between religion and secular philosophy. While the former is thought to generate a broadly 'pro-life' position, the latter is associated with more 'pro-choice' thinking. This essay attempts to break free of this framing by criticising the philosophy informing 'pro-choice' positions, not by resorting immediately to religious arguments, but by drawing upon a rival philosophical tradition--the movement within twentieth- and twenty-first-century Continental philosophy which was and is phenomenology. A phenomenological approach to human 'emergence', and in particular an application of the framework Heidegger developed in Being and Time (1927), leads to a radical questioning of whether contemporary English-speaking beginning-of-life ethics has adequately taken into account the way human beings come forth in the world.

  14. Moving beyond the GM debate.

    Science.gov (United States)

    Leyser, Ottoline

    2014-06-01

    Once again, there are calls to reopen the debate on genetically modified (GM) crops. I find these calls frustrating and unnecessarily divisive. In my opinion the GM debate, on both sides, continues to hamper the urgent need to address the diverse and pressing challenges of global food security and environmental sustainability. The destructive power of the debate comes from its conflation of unrelated issues, coupled with deeply rooted misconceptions of the nature of agriculture.

  15. Reflections on the debriefing debate.

    Science.gov (United States)

    Robinson, Robyn

    2008-01-01

    This article examines the debate on debriefing that has persisted for two decades and remains largely unresolved to this day. A brief history of Critical Incident Stress Management (CISM) and Critical Incident Stress Debriefing (CISD) is given, these being the subject of the debate, followed by a summary of the development and current status of the debate. Discussion follows on why the opposing positions appear to be at a stalemate.

  16. Moving beyond the GM debate.

    Directory of Open Access Journals (Sweden)

    Ottoline Leyser

    2014-06-01

    Full Text Available Once again, there are calls to reopen the debate on genetically modified (GM) crops. I find these calls frustrating and unnecessarily divisive. In my opinion the GM debate, on both sides, continues to hamper the urgent need to address the diverse and pressing challenges of global food security and environmental sustainability. The destructive power of the debate comes from its conflation of unrelated issues, coupled with deeply rooted misconceptions of the nature of agriculture.

  17. Debate in EFL Classroom

    Directory of Open Access Journals (Sweden)

    Mirjana Želježič

    2017-06-01

    Full Text Available Relying primarily on the Common European Framework of Reference for Languages (CEFR) and The National EFL Syllabus, this paper focuses on the highest-ranking goals within formal foreign language (L2) education: the development of communicative competence (which the communicative paradigm regards as the most important goal of contemporary language teaching), and of critical thinking (CT) ability, which is widely recognised as the main general education goal. It also points to some of the discrepancies generated by tensions between the fact that language is a social and cultural phenomenon that exists and evolves only through interaction with others, and individual-student-centred pedagogical practices of teaching (and assessment) – which jeopardise the validity of these practices. Next, it links the official educational goals to the cultivation of oral interaction (rather than oral production) in argumentative discursive practices in general and in structured debate formats in particular, which are proposed as an effective pedagogical method for developing CT skills and oral interactional competence in argumentative discursive events, especially at B2+ levels.

  18. The Changing Face of Human-Computer Interaction in the Age of Ubiquitous Computing

    Science.gov (United States)

    Rogers, Yvonne

    HCI is reinventing itself. No longer only about being user-centered, it has set its sights on pastures new, embracing a much broader and far-reaching set of interests. From emotional, eco-friendly, embodied experiences to context, constructivism and culture, HCI research is changing apace: from what it looks at, the lenses it uses and what it has to offer. Part of this is as a reaction to what is happening in the world; ubiquitous technologies are proliferating and transforming how we live our lives. We are becoming more connected and more dependent on technology. The home, the crèche, outdoors, public places and even the human body are now being experimented with as potential places to embed computational devices, even to the extent of invading previously private and taboo aspects of our lives. In this paper, I examine the diversity of lifestyle and technological transformations in our midst and outline some 'difficult' questions these raise together with alternative directions for HCI research and practice.

  19. Debates in English Teaching. The Debates in Subject Teaching Series

    Science.gov (United States)

    Davison, Jon, Ed.; Daly, Caroline, Ed.; Moss, John, Ed.

    2012-01-01

    "Debates in English Teaching" explores the major issues all English teachers encounter in their daily professional lives. It engages with established and contemporary debates, promotes and supports critical reflection and aims to stimulate both novice and experienced teachers to reach informed judgements and argue their point of view with deeper…

  1. Computer science security research and human subjects: emerging considerations for research ethics boards.

    Science.gov (United States)

    Buchanan, Elizabeth; Aycock, John; Dexter, Scott; Dittrich, David; Hvizdak, Erin

    2011-06-01

    This paper explores the growing concerns with computer science research, and in particular, computer security research and its relationship with the committees that review human subjects research. It offers cases that review boards are likely to confront, and provides a context for appropriate consideration of such research, as issues of bots, clouds, and worms enter the discourse of human subjects review.

  2. Human Computer Interaction Approach in Developing Customer Relationship Management

    Directory of Open Access Journals (Sweden)

    Mohd H.N.M. Nasir

    2008-01-01

    Full Text Available Problem statement: Many published studies have found that more than 50% of Customer Relationship Management (CRM) system implementations have failed because of poor system usability and failure to fulfil user expectations. This study presents the issues that contributed to the failures of CRM systems and proposes a prototype CRM system developed using Human Computer Interaction approaches in order to resolve the identified issues. Approach: To capture the users' requirements, a single in-depth case study of a multinational company was chosen, in which the background, current conditions and environmental interactions were observed, recorded and analyzed for patterns in relation to internal and external influences. Blended data-gathering techniques, namely interviews, naturalistic observation and the study of user documentation, were employed, and a prototype CRM system was then developed incorporating a User-Centered Design (UCD) approach, Hierarchical Task Analysis (HTA), metaphor and the identification of users' behaviors and characteristics. The implementation of these techniques was then measured in terms of usability. Results: Based on the usability testing conducted, most users agreed that the system is comfortable to work with, taking the quality attributes of learnability, memorability, utility, sortability, font, visualization, user metaphor, ease of information viewing and color as measurement parameters. Conclusions/Recommendations: By combining all these techniques, a comfort level that leads to user satisfaction and a higher degree of usability can be achieved in the proposed CRM system. Companies should therefore take these usability quality attributes into consideration before developing or procuring a CRM system to ensure its successful implementation.

  3. A Real-Time Model-Based Human Motion Tracking and Analysis for Human-Computer Interface Systems

    Directory of Open Access Journals (Sweden)

    Chung-Lin Huang

    2004-09-01

    Full Text Available This paper introduces a real-time model-based human motion tracking and analysis method for human computer interface (HCI). This method tracks and analyzes the human motion from two orthogonal views without using any markers. The motion parameters are estimated by pattern matching between the extracted human silhouette and the human model. First, the human silhouette is extracted and then the body definition parameters (BDPs) can be obtained. Second, the body animation parameters (BAPs) are estimated by a hierarchical tritree overlapping searching algorithm. To verify the performance of our method, we demonstrate different human posture sequences and use a hidden Markov model (HMM) for posture recognition testing.
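    As a sketch of how an HMM can score a posture sequence, the forward algorithm below computes the likelihood of an observation sequence under a model; the two-state model, its matrices, and the observation symbols are invented for illustration and are not the paper's posture models.

```python
import numpy as np

def forward_likelihood(pi, A, B, obs):
    """Forward algorithm: P(obs | model).
    pi: initial state probabilities (N,)
    A:  state transition matrix (N, N)
    B:  emission probabilities (N, M) over M observation symbols
    """
    alpha = pi * B[:, obs[0]]          # initialize with the first symbol
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # propagate one step, then emit
    return alpha.sum()

# Illustrative two-state model (not from the paper)
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3],
              [0.4, 0.6]])
B = np.array([[0.9, 0.1],   # state 0 mostly emits symbol 0
              [0.2, 0.8]])  # state 1 mostly emits symbol 1
print(forward_likelihood(pi, A, B, obs=[0, 0, 1]))  # ~0.13623
```

    In recognition testing, one such model would typically be trained per posture class, and the class whose model yields the highest likelihood would be chosen.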

  4. Eliciting Children's Recall of Events: How Do Computers Compare with Humans?

    Science.gov (United States)

    Powell, Martine B.; Wilson, J. Clare; Thomson, Donald M.

    2002-01-01

    Describes a study that investigated the usefulness of an interactive computer program in eliciting children's reports about an event. Compared results of interviews by computer with interviews with humans with children aged five through eight that showed little benefit in computers over face-to-face interviews. (Author/LRW)

  5. VDT Emissions Radiate Debate.

    Science.gov (United States)

    Morgan, Bill

    1990-01-01

    Discusses the possible health effects of electromagnetic fields of radiation that are emitted from video display terminals (VDTs). Responses from vendors in the computer industry are related, steps to reduce possible risks are suggested, and additional sources of information on VDTs are listed. (LRW)

  6. Childbirth care: contributing to the debate on human development

    Directory of Open Access Journals (Sweden)

    Cristina Maria Garcia de Lima Parada

    2007-10-01

    Full Text Available This study aimed to evaluate care during childbirth and neonatal development in the interior of São Paulo in order to support managers responsible for formulating public policies on human development and allocating public resources to women's healthcare. This epidemiological study focused on the evaluation of health services, based on observation of the assistance delivered by the Single Health System in 12 maternities and 134 deliveries. Brazilian Health Ministry and World Health Organization standards were adopted for comparison. The results revealed problems related to the structure of some maternities, where some well-proven practices in normal childbirth are still little used, whereas other prejudicial or ineffective ones are used routinely. Reversing this picture is essential in order to offer humanized, quality care to women, with consequent reductions in maternal and neonatal mortality rates, so that the region achieves the millennium goals established for improving human development.

  7. Intercollegiate Debate: An Intrapersonal View

    Science.gov (United States)

    Walwik, Theodore; Mehrley, R. Samuel

    1971-01-01

    The thesis of the paper is that there is a need to distinguish between debating as a means of training for public policy decision making and debating as a means of training students in the cognitive processes necessary for effective decision making. The author views debating as fundamental training in interpersonal communication. (Author/MS)

  8. Developing Educational Computer Animation Based on Human Personality Types

    Science.gov (United States)

    Musa, Sajid; Ziatdinov, Rushan; Sozcu, Omer Faruk; Griffiths, Carol

    2015-01-01

    Computer animation in the past decade has become one of the most noticeable features of technology-based learning environments. By its definition, it refers to simulated motion pictures showing movement of drawn objects, and is often defined as the art in movement. Its educational application known as educational computer animation is considered…

  9. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    Science.gov (United States)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  10. Activity-based computing: computational management of activities reflecting human intention

    DEFF Research Database (Denmark)

    Bardram, Jakob E; Jeuris, Steven; Houben, Steven

    2015-01-01

    An important research topic in artificial intelligence is automatic sensing and inferencing of contextual information, which is used to build computer models of the user’s activity. One approach to building such activity-aware systems is the notion of activity-based computing (ABC). ABC is a computing paradigm that has been applied in personal information management applications as well as in ubiquitous, multidevice, and interactive surface computing. ABC has emerged as a response to the traditional application- and file-centered computing paradigm, which is oblivious to a notion of a user’s activity context spanning heterogeneous devices, multiple applications, services, and information sources. In this article, we present ABC as an approach to contextualize information, and present our research into designing activity-centric computing technologies.

  12. Appearance-based human gesture recognition using multimodal features for human computer interaction

    Science.gov (United States)

    Luo, Dan; Gao, Hua; Ekenel, Hazim Kemal; Ohya, Jun

    2011-03-01

    The use of gesture as a natural interface plays a crucial role in achieving intelligent Human Computer Interaction (HCI). Human gestures include different components of visual actions, such as motion of the hands, facial expression, and torso, to convey meaning. So far, in the field of gesture recognition, most previous work has focused on the manual component of gestures. In this paper, we present an appearance-based multimodal gesture recognition framework, which combines different groups of features, such as facial expression features and hand motion features, extracted from image frames captured by a single web camera. We consider 12 classes of human gestures with facial expressions conveying neutral, negative and positive meanings from American Sign Language (ASL). We combine the features at two levels by employing two fusion strategies. At the feature level, an early feature combination can be performed by concatenating and weighting different feature groups, and LDA is used to choose the most discriminative elements by projecting the features onto a discriminative expression space. The second strategy is applied at the decision level: weighted decisions from single modalities are fused at a later stage. A condensation-based algorithm is adopted for classification. We collected a data set with three to seven recording sessions and conducted experiments with the combination techniques. Experimental results showed that facial analysis improves hand gesture recognition, and that decision-level fusion performs better than feature-level fusion.
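    The two fusion strategies can be sketched as follows; the weights, feature dimensions and per-class scores are illustrative assumptions, not the paper's values.

```python
import numpy as np

def feature_level_fusion(face_feats, hand_feats, w_face=0.4, w_hand=0.6):
    """Early fusion: weight each feature group, then concatenate.
    (A discriminative projection such as LDA would follow.)"""
    return np.concatenate([w_face * face_feats, w_hand * hand_feats])

def decision_level_fusion(face_scores, hand_scores, w_face=0.4, w_hand=0.6):
    """Late fusion: combine per-class scores from each single modality
    and pick the winning gesture class."""
    fused = w_face * face_scores + w_hand * hand_scores
    return int(np.argmax(fused))

face_scores = np.array([0.2, 0.5, 0.3])  # scores over 3 gesture classes
hand_scores = np.array([0.1, 0.3, 0.6])
print(decision_level_fusion(face_scores, hand_scores))  # class 2 wins
```

    Note that late fusion weights each modality's final decision, whereas early fusion lets the classifier exploit correlations between the two feature groups.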

  13. A serious case of Strasbourg-bashing? : An evaluation of the debates on the legitimacy of the European Court of Human Rights in the Netherlands

    NARCIS (Netherlands)

    Oomen, Barbara

    Over the past several years, there has been an increase in critiques of the European Court of Human Rights, most notably and surprisingly amongst its founding members, like the Netherlands. These critiques are often understood as a crisis of legitimacy. In order to assess whether this is the case,

  14. La Société des Nations suppose la Société des Esprits: The Debate on Modern Humanism

    NARCIS (Netherlands)

    van Heerikhuizen, A.

    2015-01-01

    This article focuses on the themes of the two conferences organized by the League of Nations—"Modern Man" and "The Foundations of Modern Humanism"—which were held in Nice and Budapest in 1935 and 1936, respectively. It was a time of deepening crisis, when the pervasive belief was that European civil

  15. Operational characteristics optimization of human-computer system

    OpenAIRE

    Zulquernain Mallick; Irfan Anjum Badruddin magami; Khaleed Hussain Tandur

    2010-01-01

    Computer operational parameters have a vital influence on operator efficiency from a readability viewpoint. Four parameters, namely font, text/background color, viewing angle and viewing distance, are analyzed. The text reading task, in the form of English text, was presented on the computer screen to the participating subjects and their performance, measured in terms of number of words read per minute (NWRPM), was recorded. For the purpose of optimization, the Taguchi method is used.
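    For a larger-is-better response such as NWRPM, the Taguchi method compares factor levels by a signal-to-noise ratio. A minimal sketch, with invented readings:

```python
import math

def sn_larger_is_better(values):
    """Taguchi larger-is-better S/N ratio: -10 * log10(mean(1 / y_i^2))."""
    return -10 * math.log10(sum(1 / v ** 2 for v in values) / len(values))

# Hypothetical words-per-minute readings for one factor level
print(round(sn_larger_is_better([180, 190, 175]), 2))  # ~45.17
```

    The level with the highest S/N ratio would be selected for each factor.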

  16. The video violence debate.

    Science.gov (United States)

    Lande, R G

    1993-04-01

    Some researchers and theorists are convinced that graphic scenes of violence on television and in movies are inextricably linked to human aggression. Others insist that a link has not been conclusively established. This paper summarizes scientific studies that have informed these two perspectives. Although many instances of children and adults imitating video violence have been documented, no court has imposed liability for harm allegedly resulting from a video program, an indication that considerable doubt still exists about the role of video violence in stimulating human aggression. The author suggests that a small group of vulnerable viewers are probably more impressionable and therefore more likely to suffer deleterious effects from violent programming. He proposes that research on video violence be narrowed to identifying and describing the vulnerable viewer.

  17. Simulation of Human Episodic Memory by Using a Computational Model of the Hippocampus

    Directory of Open Access Journals (Sweden)

    Naoyuki Sato

    2010-01-01

    Full Text Available The episodic memory, the memory of personal events and history, is essential for understanding the mechanism of human intelligence. Neuroscience evidence has shown that the hippocampus, a part of the limbic system, plays an important role in the encoding and retrieval of episodic memory. This paper reviews computational models of the hippocampus and introduces our own computational model of human episodic memory based on neural synchronization. Results from computer simulations demonstrate that our model provides advantages for instantaneous memory formation and selective retrieval enabling memory search. Moreover, this model was found to have the ability to predict human memory recall by integrating human eye movement data during encoding. This combined approach of computational models and experiments is effective for theorizing about human episodic memory.

  18. Applying systemic-structural activity theory to design of human-computer interaction systems

    CERN Document Server

    Bedny, Gregory Z; Bedny, Inna

    2015-01-01

    Human-Computer Interaction (HCI) is an interdisciplinary field that has gained recognition as an important field in ergonomics. HCI draws on ideas and theoretical concepts from computer science, psychology, industrial design, and other fields. Human-Computer Interaction is no longer limited to trained software users. Today people interact with various devices such as mobile phones, tablets, and laptops. How can you make such interaction user friendly, even when user proficiency levels vary? This book explores methods for assessing the psychological complexity of computer-based tasks. It also p

  19. Comparison of human face matching behavior and computational image similarity measure

    Institute of Scientific and Technical Information of China (English)

    CHEN WenFeng; LIU ChangHong; LANDER Karen; FU XiaoLan

    2009-01-01

    Computational similarity measures have been evaluated in a variety of ways, but few of the validated computational measures are based on a high-level, cognitive criterion of objective similarity. In this paper, we evaluate two popular objective similarity measures by comparing them with face matching performance in human observers. The results suggest that these measures are still limited in predicting human behavior, especially rejection behavior, but objective measures taking advantage of global and local face characteristics may improve the prediction. It is also suggested that humans may set different criteria for "hit" and "rejection", and this may provide implications for biologically inspired computational systems.

  20. Aiding human reliance decision making using computational models of trust

    NARCIS (Netherlands)

    Maanen, P.P. van; Klos, T.; Dongen, C.J. van

    2007-01-01

    This paper involves a human-agent system in which there is an operator charged with a pattern recognition task, using an automated decision aid. The objective is to make this human-agent system operate as effectively as possible. Effectiveness is gained by an increase of appropriate reliance on the

  1. Adapting the human-computer interface for reading literacy and computer skill to facilitate collection of information directly from patients.

    Science.gov (United States)

    Lobach, David F; Arbanas, Jennifer M; Mishra, Dharani D; Campbell, Marci; Wildemuth, Barbara M

    2004-01-01

    Clinical information collected directly from patients is critical to the practice of medicine. Past efforts to collect this information using computers have had limited utility because these efforts required users to be facile with the computerized information collecting system. In this paper we describe the design, development, and function of a computer system that uses recent technology to overcome the limitations of previous computer-based data collection tools by adapting the human-computer interface to the native language, reading literacy, and computer skills of the user. Specifically, our system uses a numerical representation of question content, multimedia, and touch screen technology to adapt the computer interface to the native language, reading literacy, and computer literacy of the user. In addition, the system supports health literacy needs throughout the data collection session and provides contextually relevant disease-specific education to users based on their responses to the questions. The system has been successfully used in an academically affiliated family medicine clinic and in an indigent adult medicine clinic.

  2. Exploring the compassion deficit debate.

    Science.gov (United States)

    Stenhouse, Rosie; Ion, Robin; Roxburgh, Michelle; Devitt, Patric Ffrench; Smith, Stephen D M

    2016-04-01

    Several recent high profile failures in the UK health care system have promoted strong debate on compassion and care in nursing. A number of papers articulating a range of positions within this debate have been published in this journal over the past two and a half years. These articulate a diverse range of theoretical perspectives and have been drawn together here in an attempt to bring some coherence to the debate and provide an overview of the key arguments and positions taken by those involved. In doing this we invite the reader to consider their own position in relation to the issues raised and to consider the impact of this for their own practice. Finally the paper offers some sense of how individual practitioners might use their understanding of the debates to ensure delivery of good nursing care.

  3. Supporting Negotiation Behavior with Haptics-Enabled Human-Computer Interfaces.

    Science.gov (United States)

    Oguz, S O; Kucukyilmaz, A; Sezgin, Tevfik Metin; Basdogan, C

    2012-01-01

    An active research goal for human-computer interaction is to allow humans to communicate with computers in an intuitive and natural fashion, especially in real-life interaction scenarios. One approach that has been advocated to achieve this has been to build computer systems with human-like qualities and capabilities. In this paper, we present insight into how human-computer interaction can be enriched by endowing computers with behavioral patterns that naturally appear in human-human negotiation scenarios. For this purpose, we introduce a two-party negotiation game specifically built for studying the effectiveness of haptic and audio-visual cues in conveying negotiation-related behaviors. The game is centered around a real-time continuous two-party negotiation scenario based on the existing game-theory and negotiation literature. During the game, humans are confronted with a computer opponent, which can display different behaviors, such as concession, competition, and negotiation. Through a user study, we show that the behaviors associated with human negotiation can be incorporated into human-computer interaction, and that the addition of haptic cues provides a statistically significant increase in the human-recognition accuracy of machine-displayed behaviors. In addition to aspects of conveying these negotiation-related behaviors, we also focus on and report game-theoretical aspects of the overall interaction experience. In particular, we show that, as reported in the game-theory literature, certain negotiation strategies such as tit-for-tat may generate maximum combined utility for the negotiating parties, providing an excellent balance between the energy spent by the user and the combined utility of the negotiating parties.
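    The tit-for-tat strategy mentioned above can be sketched in a few lines; the cooperate/defect ("C"/"D") encoding and the always-defecting opponent are illustrative, not parts of the paper's negotiation game.

```python
def tit_for_tat(opponent_history):
    """Cooperate on the first move; then mirror the opponent's last move."""
    return "C" if not opponent_history else opponent_history[-1]

def play(strategy_a, strategy_b, rounds=5):
    """Run two strategies against each other; each sees the other's history."""
    hist_a, hist_b = [], []
    for _ in range(rounds):
        a = strategy_a(hist_b)
        b = strategy_b(hist_a)
        hist_a.append(a)
        hist_b.append(b)
    return hist_a, hist_b

always_defect = lambda history: "D"
print(play(tit_for_tat, always_defect, rounds=4))
# tit-for-tat cooperates once, then retaliates every remaining round
```

    Against a cooperative opponent, the same strategy settles into mutual cooperation, which is why it tends to maximize combined utility in repeated play.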

  4. Human-Computer Interaction Software: Lessons Learned, Challenges Ahead

    Science.gov (United States)

    1989-01-01

    Intelligent support systems; domain communication. Users familiar with problem domains but inexperienced with computers... April 1987, pp. 73-78. His research interests include artificial intelligence. Creating better HCI software will have a... 8. S.K. Card, T.P. Moran, and

  5. [Attempt at computer modeling of evolution of human society].

    Science.gov (United States)

    Levchenko, V F; Menshutkin, V V

    2009-01-01

    A model of the evolution of human society and the biosphere, based on the concepts of V. I. Vernadskii about the noosphere and of L. N. Gumilev about ethnogenesis, is developed and studied. The mathematical apparatus of the model is a composition of finite stochastic automata. Using this model, the possibility of a global ecological crisis is demonstrated in the case that the current tendencies in the interaction of the biosphere and human civilization are preserved.
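    A composition of finite stochastic automata, the model's mathematical core, can be illustrated with a toy ensemble of two-state automata whose transition probabilities depend on a shared load variable. All states and probabilities here are invented for illustration; they are not the model's actual parameters.

```python
import random

# Each automaton (e.g., a region) is either "stable" or "crisis".
# The stable->crisis probability scales with a global load, loosely echoing
# the biosphere/civilization coupling in the model (numbers are invented).
def step(state, load, rng):
    if state == "stable":
        return "crisis" if rng.random() < 0.1 * load else "stable"
    return "stable" if rng.random() < 0.05 else "crisis"

def simulate(n_automata=1000, steps=200, load=1.0, seed=0):
    """Run the ensemble forward and return how many end in 'crisis'."""
    rng = random.Random(seed)
    states = ["stable"] * n_automata
    for _ in range(steps):
        states = [step(s, load, rng) for s in states]
    return states.count("crisis")

# Higher load drives a larger fraction of automata into the crisis state.
print(simulate(load=0.5), simulate(load=3.0))
```

The long-run crisis fraction is 0.1·load / (0.1·load + 0.05), so raising the load parameter pushes the ensemble toward a crisis-dominated state, the qualitative behavior the abstract describes.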

  6. Investigating Students’ Achievements in Computing Science Using Human Metric

    Directory of Open Access Journals (Sweden)

    Ezekiel U. Okike

    2014-05-01

    Full Text Available This study investigates the role of personality traits, motivation for career choice, and study habits in students' academic achievements in the computing sciences. A quantitative research method was employed. Data were collected from 60 computing science students using the Myers-Briggs Type Indicator (MBTI) with additional questionnaires. A model of the form y_ij = β_0 + β_1 x_1j + β_2 x_2j + β_3 x_3j + β_4 x_4j + … + β_n x_nj was used, where y_ij represents the dependent variable and x_1j, …, x_nj the independent variables. Data analysis was performed using the Statistical Package for the Social Sciences (SPSS). Linear regression was done in order to fit the model and establish its significance or non-significance at the 0.05 level of significance. The regression results were also used to determine the impact of the independent variables on students' performance. Results from this study suggest that the strongest motivator for a choice of career in the computing sciences is the desire to become a computing professional. Students' achievements, especially in the computing sciences, depend not only on temperamental ability or personality traits, motivation for choice of course of study, and reading habits, but also on the use of Internet-based sources more than going to the university library to read book materials available in all areas
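    The linear model above can be fitted by ordinary least squares. The sketch below uses synthetic data; the predictor names and coefficient values are hypothetical stand-ins for the MBTI/motivation/study-habit variables, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 60  # one row per student, matching the study's sample size

# Hypothetical predictors x_1..x_4: personality score, career motivation,
# study hours, Internet-source use (all invented for illustration).
X = rng.normal(size=(n, 4))
beta_true = np.array([2.0, 0.5, 1.0, 0.3, 1.5])  # beta_0 .. beta_4 (invented)
y = beta_true[0] + X @ beta_true[1:] + rng.normal(scale=0.1, size=n)

# Fit y_ij = beta_0 + beta_1*x_1j + ... + beta_4*x_4j by least squares:
# prepend a column of ones so beta_0 is estimated as the intercept.
design = np.column_stack([np.ones(n), X])
beta_hat, *_ = np.linalg.lstsq(design, y, rcond=None)
print(np.round(beta_hat, 2))
```

With only mild noise, the recovered coefficients land close to the generating ones, which is what a significant regression fit in SPSS would likewise indicate.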

  7. Moving research beyond the spanking debate.

    Science.gov (United States)

    MacMillan, Harriet L; Mikton, Christopher R

    2017-02-26

    Despite numerous studies identifying a broad range of harms associated with the use of spanking and other types of physical punishment, debate continues about its use as a form of discipline. In this commentary, we recommend four strategies to move the field forward and beyond the spanking debate including: 1) use of methodological approaches that allow for stronger causal inference; 2) consideration of human rights issues; 3) a focus on understanding the causes of spanking and reasons for its decline in certain countries; and 4) more emphasis on evidence-based approaches to changing social norms to reject spanking as a form of discipline. Physical punishment needs to be recognized as an important public health problem.

  8. Deep Space Network (DSN), Network Operations Control Center (NOCC) computer-human interfaces

    Science.gov (United States)

    Ellman, Alvin; Carlton, Magdi

    1993-01-01

    The Network Operations Control Center (NOCC) of the DSN is responsible for scheduling the resources of the DSN and for monitoring all multi-mission spacecraft tracking activities in real time. Operations performs this job with computer systems at JPL connected to over 100 computers at Goldstone, Australia, and Spain. The old computer system became obsolete, and the first version of the new system was installed in 1991. Significant improvements to the computer-human interfaces became the dominant theme of the replacement project. Major issues required innovative problem solving. Among these issues were: How to present several thousand data elements on displays without overloading the operator? What is the best graphical representation of DSN end-to-end data flow? How to operate the system without memorizing the mnemonics of hundreds of operator directives? Which computing environment will meet the competing performance requirements? This paper presents the technical challenges, engineering solutions, and results of the NOCC computer-human interface design.

  9. Secure Human-Computer Identification against Peeping Attacks (SecHCI): A Survey

    OpenAIRE

    Li, SJ; Shum, HY

    2003-01-01

    This paper focuses on human-computer identification systems secure against peeping attacks, in which adversaries can observe (and even control) interactions between humans (provers) and computers (verifiers). Real cases of peeping attacks were reported by Ross J. Anderson ten years earlier. Fixed passwords are insecure against peeping attacks since adversaries can simply replay the observed passwords. Some identification techniques can be used to defeat peeping attacks, but auxiliary devices must be used ...
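    The replay weakness of fixed passwords can be illustrated with a toy challenge-response scheme. This is a generic HMAC construction shown only for contrast; the SecHCI protocols surveyed in the paper are designed to be computable by unaided humans, which HMAC is not.

```python
import hashlib
import hmac
import secrets

SECRET = b"shared-secret"  # known to prover and verifier, never transmitted

def respond(challenge: bytes, key: bytes = SECRET) -> bytes:
    """Prover's response depends on the fresh challenge, so a response
    observed by a peeping adversary is useless in a later session."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify(challenge: bytes, response: bytes, key: bytes = SECRET) -> bool:
    return hmac.compare_digest(respond(challenge, key), response)

# Session 1: the adversary observes (challenge, response).
c1 = secrets.token_bytes(16)
r1 = respond(c1)
assert verify(c1, r1)

# Session 2: the verifier issues a fresh challenge; replaying r1 fails,
# unlike a fixed password, which would still be accepted.
c2 = secrets.token_bytes(16)
assert not verify(c2, r1)
```

A fixed password corresponds to `respond` ignoring the challenge, which is exactly why observation plus replay defeats it.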

  10. Design of Food Management Information System Based on Human-computer Interaction

    Directory of Open Access Journals (Sweden)

    Xingkai Cui

    2015-07-01

    Full Text Available Food safety is directly related to public health. This study takes the necessity of establishing a food management information system as its point of departure; through an interpretation of human-computer interaction technology and the conceptual framework of human-computer interaction, it discusses the construction of a food management information system, with the aim of advancing China's food safety management and thereby safeguarding public health.

  11. The human-computer interaction design of self-operated mobile telemedicine devices

    OpenAIRE

    Zheng, Shaoqing

    2015-01-01

    Human-computer interaction (HCI) is an important issue in the area of medicine, for example, the operation of surgical simulators, virtual rehabilitation systems, telemedicine treatments, and so on. In this thesis, the human-computer interaction of a self-operated mobile telemedicine device is designed. The mobile telemedicine device (i.e. intelligent Medication Box or iMedBox) is used for remotely monitoring patient health and activity information such as ECG (electrocardiogram) signals, hom...

  12. Developing Educational Computer Animation Based on Human Personality Types

    Directory of Open Access Journals (Sweden)

    Sajid Musa

    2015-03-01

    Full Text Available Computer animation has in the past decade become one of the most noticeable features of technology-based learning environments. By definition, it refers to simulated motion pictures showing the movement of drawn objects, and is often described as the art of movement. Its educational application, known as educational computer animation, is considered one of the most elegant ways of preparing teaching materials, and its importance in helping learners process, understand, and remember information efficiently has grown vastly since the advent of powerful graphics-oriented computers. Based on theories and facts from psychology, colour science, computer animation, geometric modelling, and technical aesthetics, this study intends to establish an interdisciplinary area of research towards greater educational effectiveness. With today's high educational demands, as well as the limited time provided for certain courses, classical educational methods have shown deficiencies in keeping up with the drastic changes observed in the digital era. Generally speaking, without taking into account significant factors such as gender, age, level of interest, and memory level, educational animations may turn out to be insufficient for learners or fail to meet their needs. We have noticed, however, that the application of animation to education has been given only inadequate attention, and that students' temperament types (sanguine, choleric, melancholic, phlegmatic, etc.) have never been taken into account. We suggest there is an interesting relationship here, and propose essential factors for creating educational animations based on students' personality types. In particular, we study how information in computer animation may be presented in a more preferable way based on font types and their families, colours and colour schemes, emphasis of text, and shapes of characters designed by planar quadratic Bernstein-Bézier curves

  13. Design Science in Human-Computer Interaction: A Model and Three Examples

    Science.gov (United States)

    Prestopnik, Nathan R.

    2013-01-01

    Humanity has entered an era where computing technology is virtually ubiquitous. From websites and mobile devices to computers embedded in appliances on our kitchen counters and automobiles parked in our driveways, information and communication technologies (ICTs) and IT artifacts are fundamentally changing the ways we interact with our world.…

  14. Using Noninvasive Brain Measurement to Explore the Psychological Effects of Computer Malfunctions on Users during Human-Computer Interactions

    Directory of Open Access Journals (Sweden)

    Leanne M. Hirshfield

    2014-01-01

    Full Text Available In today's technologically driven world, there is a need to better understand the ways that common computer malfunctions affect computer users. These malfunctions may have measurable influences on computer users' cognitive, emotional, and behavioral responses. An experiment was conducted where participants performed a series of web search tasks while wearing functional near-infrared spectroscopy (fNIRS) and galvanic skin response sensors. Two computer malfunctions were introduced during the sessions which had the potential to influence correlates of user trust and suspicion. Surveys were given after each session to measure users' perceived emotional state, cognitive load, and perceived trust. Results suggest that fNIRS can be used to measure the different cognitive and emotional responses associated with computer malfunctions. These cognitive and emotional changes were correlated with users' self-reported levels of suspicion and trust, and they in turn suggest future work that further explores the capability of fNIRS for the measurement of user experience during human-computer interactions.

  15. Computer models of the human immunoglobulins shape and segmental flexibility.

    Science.gov (United States)

    Pumphrey, R

    1986-06-01

    At present there is interest in the design and deployment of engineered biosensor molecules. Antibodies are the most versatile of the naturally occurring biosensors, and it is important to understand their mechanical properties and the ways in which they can interact with their natural ligands. Two-dimensional representations are clearly inadequate, and three-dimensional representations are too complicated to manipulate except as numerical abstractions in computers. Recent improvements in computer graphics allow these coordinate matrices to be seen and more easily comprehended, and interactive programs permit the modification and reassembly of molecular fragments. The models that result have distinct advantages both over those of lower resolution and over those showing every atom, which are limited to the few fragments (2-5) or mutant molecules for which the X-ray crystallographic coordinates are known. In this review Richard Pumphrey describes the shape and flexibility of immunoglobulin molecules in relation to their three-dimensional structure. Copyright © 1986. Published by Elsevier B.V.

  16. Parallel computing-based sclera recognition for human identification

    Science.gov (United States)

    Lin, Yong; Du, Eliza Y.; Zhou, Zhi

    2012-06-01

    Compared to iris recognition, sclera recognition using a line descriptor can achieve comparable recognition accuracy in visible wavelengths. However, the method is too time-consuming to be implemented in a real-time system. In this paper, we propose a GPU-based parallel computing approach to reduce the sclera recognition time. We define a new descriptor to which the information of the KD-tree structure and the sclera edge is added. The registration and matching task is divided into subtasks of various sizes according to their computational complexities. Affine transform parameters are generated by searching the KD tree. Texture memory, constant memory, and shared memory are used to store templates and transform matrices. The experimental results show that the proposed method executed on a GPU can dramatically improve sclera matching speed, by a factor of several hundred, without decreasing accuracy.
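    The KD-tree search at the heart of the descriptor-matching step can be sketched with SciPy's `cKDTree`. This is a generic nearest-neighbour match on synthetic 2-D points standing in for descriptors; the paper's actual line descriptors, affine search, and GPU kernels are not reproduced.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(42)

# Synthetic "template" descriptors, and a query set that is the template
# plus small noise, standing in for two sclera images of the same eye.
template = rng.uniform(size=(500, 2))
query = template + rng.normal(scale=0.001, size=template.shape)

tree = cKDTree(template)             # build once per enrolled template
dist, idx = tree.query(query, k=1)   # nearest template point per query point

# A simple match score: the fraction of query points whose nearest
# neighbour is close enough to count as the same descriptor.
score = np.mean(dist < 0.01)
print(score)
```

Building the tree once and batching the queries is what makes the search parallelizable; the paper pushes the same idea onto the GPU, splitting registration and matching into subtasks by computational cost.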

  17. Social effects of an anthropomorphic help agent: humans versus computers.

    Science.gov (United States)

    David, Prabu; Lu, Tingting; Kline, Susan; Cai, Li

    2007-06-01

    The purpose of this study was to examine perceptions of fairness of a computer-administered quiz as a function of the anthropomorphic features of the help agent offered within the quiz environment. The addition of simple anthropomorphic cues to a computer help agent reduced the perceived friendliness of the agent, perceived intelligence of the agent, and the perceived fairness of the quiz. These differences were observed only for male anthropomorphic cues, but not for female anthropomorphic cues. The results were not explained by the social attraction of the anthropomorphic agents used in the quiz or by gender identification with the agents. Priming of visual cues provides the best account of the data. Practical implications of the study are discussed.

  18. Human cardiac systems electrophysiology and arrhythmogenesis: iteration of experiment and computation.

    Science.gov (United States)

    Holzem, Katherine M; Madden, Eli J; Efimov, Igor R

    2014-11-01

    Human cardiac electrophysiology (EP) is a unique system for computational modelling at multiple scales. Due to the complexity of the cardiac excitation sequence, coordinated activity must occur from the single channel to the entire myocardial syncytium. Thus, sophisticated computational algorithms have been developed to investigate cardiac EP at the level of ion channels, cardiomyocytes, multicellular tissues, and the whole heart. Although understanding of each functional level will ultimately be important to thoroughly understand mechanisms of physiology and disease, cardiac arrhythmias are expressly the product of cardiac tissue, which contains enough cardiomyocytes to sustain a reentrant loop of activation. In addition, several properties of cardiac cellular EP that are critical for arrhythmogenesis are significantly altered by cell-to-cell coupling. However, relevant human cardiac EP data, upon which to develop or validate models at all scales, have been lacking. Thus, over several years, we have developed a paradigm for multiscale investigation of human heart physiology and have recovered and studied over 300 human hearts. We have generated a rich experimental dataset, from which we better understand mechanisms of arrhythmia in humans and can improve models of human cardiac EP. In addition, in collaboration with computational physiologists, we are developing a database for the deposition of human heart experimental data, including thorough experimental documentation. We anticipate that accessibility to this human heart dataset will further human EP computational investigations, as well as encourage greater data transparency within the field of cardiac EP.
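    Single-cell EP models of the kind referred to above are systems of coupled ODEs. A minimal sketch using the two-variable FitzHugh-Nagumo model, a standard didactic stand-in far simpler than the human ventricular models the authors discuss, shows the basic excitable-cell behaviour:

```python
# Forward-Euler integration of the FitzHugh-Nagumo excitable-cell model:
#   dv/dt = v - v^3/3 - w + I_ext     (fast "membrane potential" variable)
#   dw/dt = eps * (v + a - b*w)       (slow recovery variable)
# Parameter values are the textbook defaults, not fitted to human data.
def simulate(i_ext=0.5, a=0.7, b=0.8, eps=0.08, dt=0.01, steps=50000):
    v, w = -1.0, 1.0
    trace = []
    for _ in range(steps):
        dv = v - v**3 / 3 - w + i_ext
        dw = eps * (v + a - b * w)
        v += dt * dv
        w += dt * dw
        trace.append(v)
    return trace

trace = simulate()
# With a sustained stimulus the membrane variable oscillates
# (repetitive firing), the cellular substrate of the tissue-level
# reentry discussed in the abstract.
print(min(trace), max(trace))
```

Detailed human ventricular models replace these two variables with dozens of ion-channel state variables, but the numerical integration loop has the same shape.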

  19. Supporting Human Activities - Exploring Activity-Centered Computing

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Bardram, Jakob

    2002-01-01

    In this paper we explore an activity-centered computing paradigm that is aimed at supporting work processes that are radically different from the ones known from office work. Our main inspiration is healthcare work that is characterized by an extreme degree of mobility, many interruptions, ad...... objects. We also present an exploratory prototype design and first implementation and present some initial results from evaluations in a healthcare environment....

  20. Reflections on the Debate between "Human Good" and "Human Evil" -- On the Marxist View of Human Nature

    Institute of Scientific and Technical Information of China (English)

    唐忠宝; 王丽梅

    2012-01-01

    The substance of the debate between "human good" and "human evil" is the understanding of human nature and essence. The theory of innate goodness, the theory of innate evil, and the theory that human nature is neither good nor evil all presuppose that humans have a fixed, unchanging nature. Sartre's thesis that existence precedes essence, and Marx's view that the human essence is, in its reality, the ensemble of all social relations, provide a broader perspective for a deeper analysis of human nature and essence.

  1. The Human Dimension of Computer-Mediated Communications: Implications for International Educational Computer Conferences.

    Science.gov (United States)

    Scott, Douglass J.

    This article presents a conceptual framework for the research and practice of educational computer conferences that shifts the focus from the on-line messages being exchanged to the participants' engagement with the conference. This framework, known as the "Iceberg Metaphor" or the "Michigan Model of educational…

  2. Preface (to: Brain-Computer Interfaces. Applying our Minds to Human-Computer Interaction)

    NARCIS (Netherlands)

    Tan, Desney; Tan, Desney S.; Nijholt, Antinus

    2010-01-01

    The advances in cognitive neuroscience and brain imaging technologies provide us with the increasing ability to interface directly with activity in the brain. Researchers have begun to use these technologies to build brain-computer interfaces. Originally, these interfaces were meant to allow

  3. Data Bases and Other Computer Tools in the Humanities.

    Science.gov (United States)

    Collegiate Microcomputer, 1990

    1990-01-01

    Describes 38 database projects sponsored by the National Endowment for the Humanities (NEH). Information on hardware, software, and access and dissemination is given for projects in the areas of art and architectural history; folklore; history; medicinal plants; interdisciplinary topics; language and linguistics; literature; and music and music…

  4. The Human Genome Project: Biology, Computers, and Privacy.

    Science.gov (United States)

    Cutter, Mary Ann G.; Drexler, Edward; Gottesman, Kay S.; Goulding, Philip G.; McCullough, Laurence B.; McInerney, Joseph D.; Micikas, Lynda B.; Mural, Richard J.; Murray, Jeffrey C.; Zola, John

    This module, for high school teachers, is the second of two modules about the Human Genome Project (HGP) produced by the Biological Sciences Curriculum Study (BSCS). The first section of this module provides background information for teachers about the structure and objectives of the HGP, aspects of the science and technology that underlie the…

  5. Computational biology in human aging : an omics data integration approach

    NARCIS (Netherlands)

    Akker, Erik Ben van den

    2015-01-01

    Throughout this thesis, human aging and its relation to health are studied in the context of two parallel though complementary lines of research: biomarkers and genetics. The search for informative biomarkers of aging focuses on easy accessible and quantifiable substances of the body that can be u

  6. Teaching Speaking Through Debate Technique

    Directory of Open Access Journals (Sweden)

    . Suranto

    2016-07-01

    Full Text Available Abstract: Teaching Speaking Through the Debate Technique. Speaking is one of the four basic competences (listening, speaking, reading, and writing). Speaking ability should be mastered by every student; to achieve this competence, students should be given the right technique for studying speaking. The success of students' speaking can be seen from their ability to express ideas, thoughts, and feelings through speech. The objective of this action research is to improve students' oral communication skill through the debate technique. This study was conducted at MA Ma'arif NU 5 Sekampung, Lampung Timur, from March to April 2014. The research data were taken from the 28 students of the eleventh class and analyzed qualitatively and quantitatively. The research findings indicate that there are improvements in students' English speaking skill through the debate technique. Analyzing the data qualitatively and quantitatively from the end of the first cycle to the second cycle, it was found that the students' English speaking skill increased 20.9% over the 65% standard set by the researcher. The researcher concludes that students' English speaking skill can be improved through the debate technique in the learning process. Key words: action research, debate technique, English speaking skill

  7. Recent Advances in Computational Mechanics of the Human Knee Joint

    Directory of Open Access Journals (Sweden)

    M. Kazemi

    2013-01-01

    Full Text Available Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling.

  8. Recent advances in computational mechanics of the human knee joint.

    Science.gov (United States)

    Kazemi, M; Dabiri, Y; Li, L P

    2013-01-01

    Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling.

  9. Individual Difference Effects in Human-Computer Interaction

    Science.gov (United States)

    1991-10-01

    evaluated in terms of the amount of sales revenue after deducting production costs. The time variable was measured in terms of the amount of time a subject... subject acted as an inventory/production manager of a hypothetical firm which was simulated by a computer program. The subject's task was to obtain the... "search list" will be examined. Thus, the user will probably match "apple pie" but not "apple cider" or "apple butter" because these items would not

  10. APPLYING ARTIFICIAL INTELLIGENCE TECHNIQUES TO HUMAN-COMPUTER INTERFACES

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.

    1988-01-01

    A description is given of UIMS (User Interface Management System), a system using a variety of artificial intelligence techniques to build knowledge-based user interfaces combining functionality and information from a variety of computer systems that maintain, test, and configure customer telephone...... and data networks. Three artificial intelligence (AI) techniques used in UIMS are discussed, namely, frame representation, object-oriented programming languages, and rule-based systems. The UIMS architecture is presented, and the structure of the UIMS is explained in terms of the AI techniques....

  12. Advanced approaches to characterize the human intestinal microbiota by computational meta-analysis

    NARCIS (Netherlands)

    Nikkilä, J.; Vos, de W.M.

    2010-01-01

    GOALS: We describe advanced approaches for the computational meta-analysis of a collection of independent studies, including over 1000 phylogenetic array datasets, as a means to characterize the variability of human intestinal microbiota. BACKGROUND: The human intestinal microbiota is a complex micr

  13. Human Inspired Self-developmental Model of Neural Network (HIM): Introducing Content/Form Computing

    Science.gov (United States)

    Krajíček, Jiří

    This paper presents cross-disciplinary research connecting medical/psychological evidence on human abilities with the need in informatics to update current models in computer science to support alternative methods of computation and communication. In [10] we have already proposed a hypothesis introducing the concept of a human information model (HIM) as a cooperative system. Here we continue with the HIM design in detail. In our design, we first introduce the Content/Form computing system, a new principle extending present methods in evolutionary computing (genetic algorithms, genetic programming). We then apply this system to the HIM (a type of artificial neural network) as its basic network self-developmental paradigm. The main inspiration for our natural/human design comes from the well-known concept of artificial neural networks, from medical/psychological evidence, and from Sheldrake's theory of "Nature as Alive" [22].

  14. Operational characteristics optimization of human-computer system

    Directory of Open Access Journals (Sweden)

    Zulquernain Mallick

    2010-09-01

    Full Text Available Computer display parameters have a vital influence on operator efficiency from a readability viewpoint. Four parameters, namely font, text/background color, viewing angle, and viewing distance, are analyzed. A text reading task, in the form of English text, was presented on the computer screen to the participating subjects, and their performance, measured in terms of the number of words read per minute (NWRPM), was recorded. For the purpose of optimization, the Taguchi method is used to find the optimal parameters that maximize operators' efficiency on the readability task. Two levels of each parameter were considered in this study. An orthogonal array, the signal-to-noise (S/N) ratio, and the analysis of variance (ANOVA) were employed to investigate the operators' performance/efficiency. Results showed that with Times Roman font, black text on a white background, a 40-degree viewing angle, and a 60 cm viewing distance, the subjects were quite comfortable and efficient, and read the maximum number of words per minute. Text/background color was the dominant parameter with a percentage contribution of 76.18% towards the stated objective, followed by font type at 18.17%, viewing distance at 7.04%, and viewing angle at 0.58%. Experimental results are provided to confirm the effectiveness of this approach.
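    Since NWRPM is a quantity to be maximized, the Taguchi analysis above uses the larger-is-better S/N ratio, which can be computed directly. The readings below are made-up words-per-minute values for illustration; the paper's actual data are not reproduced here.

```python
import math

def sn_larger_is_better(values):
    """Taguchi larger-is-better S/N ratio: -10 * log10(mean(1 / y_i^2))."""
    n = len(values)
    return -10 * math.log10(sum(1 / y**2 for y in values) / n)

# Hypothetical NWRPM readings for two levels of one factor.
level_1 = [180, 175, 190]   # e.g., black text on a white background
level_2 = [120, 130, 125]   # e.g., a low-contrast color scheme

print(round(sn_larger_is_better(level_1), 2))
print(round(sn_larger_is_better(level_2), 2))
# The level with the higher S/N ratio is selected as the optimal setting.
```

Computing this ratio for each level of each factor in the orthogonal array, then comparing level means, is how the dominant factors and percentage contributions (via ANOVA) are obtained.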

  15. Measuring Human Performance within Computer Security Incident Response Teams

    Energy Technology Data Exchange (ETDEWEB)

    McClain, Jonathan T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva, Austin Ray [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Avina, Glory Emmanuel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Forsythe, James C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    Human performance has become a pertinent issue within cyber security. However, this research has been stymied by the limited availability of expert cyber security professionals. This is partly attributable to the ongoing workload faced by cyber security professionals, which is compounded by the limited number of qualified personnel and turnover of personnel across organizations. Additionally, it is difficult to conduct research, and particularly, openly published research, due to the sensitivity inherent to cyber operations at most organizations. As an alternative, the current research has focused on data collection during cyber security training exercises. These events draw individuals with a range of knowledge and experience extending from seasoned professionals to recent college graduates to college students. The current paper describes research involving data collection at two separate cyber security exercises. This data collection involved multiple measures which included behavioral performance based on human-machine transactions and questionnaire-based assessments of cyber security experience.

  16. Computational Human Performance Modeling For Alarm System Design

    Energy Technology Data Exchange (ETDEWEB)

    Jacques Hugo

    2012-07-01

    The introduction of new technologies like adaptive automation systems and advanced alarm processing and presentation techniques in nuclear power plants is already having an impact on the safety and effectiveness of plant operations, and also on the role of the control room operator. This impact is expected to escalate dramatically as more and more nuclear power utilities embark on upgrade projects in order to extend the lifetime of their plants. One of the most visible impacts in control rooms will be the need to replace aging alarm systems. Because most of these alarm systems use obsolete technologies, the methods, techniques and tools that were used to design the previous generation of alarm systems are no longer effective and need to be updated. The same applies to the need to analyze and redefine operators' alarm handling tasks. In the past, methods for analyzing human tasks and workload have relied on crude, paper-based methods that often lacked traceability. New approaches are needed to allow analysts to model and represent the new concepts of alarm operation and human-system interaction. State-of-the-art task simulation tools are now available that offer a cost-effective and efficient method for examining the effect of operator performance in different conditions and operational scenarios. A discrete event simulation system was used by human factors researchers at the Idaho National Laboratory to develop a generic alarm handling model to examine the effect of operator performance with a simulated modern alarm system. It allowed analysts to evaluate alarm generation patterns as well as critical task times and human workload predicted by the system.
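
    The discrete-event operator/alarm loop described here can be sketched in a few lines. The arrival and handling times below are invented placeholders, not parameters of the INL model; the point is how queueing delay and operator workload fall out of a simple event simulation.

    ```python
    import random

    def simulate(n_alarms=1000, mean_interarrival=30.0, handle_time=20.0, seed=1):
        """Single operator handling a Poisson alarm stream with fixed service time.
        Returns (mean wait before acknowledgement, operator utilization)."""
        rng = random.Random(seed)
        t, free_at, waits = 0.0, 0.0, []
        for _ in range(n_alarms):
            t += rng.expovariate(1.0 / mean_interarrival)  # next alarm arrives
            start = max(t, free_at)                        # operator may still be busy
            waits.append(start - t)                        # time the alarm sits unhandled
            free_at = start + handle_time                  # deterministic handling time
        utilization = n_alarms * handle_time / free_at
        return sum(waits) / len(waits), utilization

    avg_wait, util = simulate()
    print(f"avg wait {avg_wait:.1f} s, operator utilization {util:.0%}")
    ```

    Raising the alarm rate (smaller mean interarrival) drives both wait times and predicted workload up sharply, which is the kind of alarm-generation-pattern effect such tools are used to evaluate.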

  17. Behind Human Error: Cognitive Systems, Computers and Hindsight

    Science.gov (United States)

    1994-12-01

    squeeze became on the powers of the operator.... And as Norbert Wiener noted some years later (1964, p. 63): The gadget-minded people often have the...for one exception see Woods and Elias, 1988). This failure to develop representations that reveal change and highlight events in the monitored...Woods, D. D., and Elias, G. (1988). Significance messages: An integral display concept. In Proceedings of the 32nd Annual Meeting of the Human

  18. Consciousness operationalized, a debate realigned.

    Science.gov (United States)

    Carruthers, Peter; Veillet, Bénédicte

    2017-08-10

    This paper revisits the debate about cognitive phenomenology. It elaborates, defends, and improves on our earlier proposal for resolving that debate, according to which the test for irreducible phenomenology is the presence of explanatory gaps. After showing how proposals like ours have been misunderstood or misused by others, we deploy our operationalization to argue that the correct way to align the debate over cognitive phenomenology is not between sensory and (alleged) cognitive phenomenology, but rather between non-conceptual and (alleged) conceptual or propositional phenomenology. In doing so we defend three varieties of non-sensory (amodal) non-conceptual phenomenology: valence, a sense of approximate number, and a sense of elapsed time. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Transversal Lines of the Debates

    Directory of Open Access Journals (Sweden)

    Yolanda Onghena

    1998-12-01

    Full Text Available The Transversal Lines of the Debates gathers for publication the presentations of the scholars invited to the seminar. In the papers, Yolanda Onghena observes that the evolution from the cultural to the inter-cultural travels along four axes: the relations between culture and society; the processes of change within identity-based dynamics; the representations of the Other; and interculturality. Throughout the presentations and subsequent debates, whenever the different participants referred to aspects of the cultural-identity problematic--"angst", "obsession", "deficit", "manipulation", and others--these same participants in the Transversal Lines of the Debates also showed that, in certain areas, an optimistic viewpoint is not out of the question.

  20. Collection of Information Directly from Patients through an Adaptive Human-computer Interface

    Science.gov (United States)

    Lobach, David F.; Arbanas, Jennifer M.; Mishra, Dharani D.; Wildemuth, Barbara; Campbell, Marci

    2002-01-01

    Clinical information collected directly from patients is critical to the practice of medicine. Past efforts to collect this information using computers have had limited utility because these efforts required users to be facile with the information collecting system. This poster describes the development and function of a computer system that uses technology to overcome the limitations of previous computer-based data collection tools by adapting the human-computer interface to fit the skills of the user. The system has been successfully used at two diverse clinical sites.

  1. Brain-Computer Interfaces Applying Our Minds to Human-computer Interaction

    CERN Document Server

    Tan, Desney S

    2010-01-01

    For generations, humans have fantasized about the ability to create devices that can see into a person's mind and thoughts, or to communicate and interact with machines through thought alone. Such ideas have long captured the imagination of humankind in the form of ancient myths and modern science fiction stories. Recent advances in cognitive neuroscience and brain imaging technologies have started to turn these myths into a reality, and are providing us with the ability to interface directly with the human brain. This ability is made possible through the use of sensors that monitor physical p

  2. Debates in Music Teaching. The Debates in Subject Teaching Series

    Science.gov (United States)

    Philpott, Chris, Ed.; Spruce, Gary, Ed.

    2012-01-01

    "Debates in Music Teaching" encourages student and practising teachers to engage with contemporary issues and developments in music education. It aims to introduce a critical approach to the central concepts and practices that have influenced major interventions and initiatives in music teaching, and supports the development of new ways of looking…

  3. Digging into data using new collaborative infrastructures supporting humanities-based computer science research

    OpenAIRE

    2011-01-01

    This paper explores infrastructure supporting humanities–computer science research in large–scale image data by asking: Why is collaboration a requirement for work within digital humanities projects? What is required for fruitful interdisciplinary collaboration? What are the technical and intellectual approaches to constructing such an infrastructure? What are the challenges associated with digital humanities collaborative work? We reveal that digital humanities collaboration requ...

  4. Computational analysis of expression of human embryonic stem cell-associated signatures in tumors

    OpenAIRE

    Wang, Xiaosheng

    2011-01-01

    Background The cancer stem cell model has been proposed based on the linkage between human embryonic stem cells and human cancer cells. However, the evidence supporting the cancer stem cell model remains to be collected. In this study, we extensively examined the expression of human embryonic stem cell-associated signatures including core genes, transcription factors, pathways and microRNAs in various cancers using a computational biology approach. Results We used the class comparison analy...

  5. Computational analysis of expression of human embryonic stem cell-associated signatures in tumors

    OpenAIRE

    Wang Xiaosheng

    2011-01-01

    Abstract Background The cancer stem cell model has been proposed based on the linkage between human embryonic stem cells and human cancer cells. However, the evidence supporting the cancer stem cell model remains to be collected. In this study, we extensively examined the expression of human embryonic stem cell-associated signatures including core genes, transcription factors, pathways and microRNAs in various cancers using a computational biology approach. Results We used the class compari...

  6. Brain-Computer Interfaces. Applying our Minds to Human-Computer Interaction

    NARCIS (Netherlands)

    Tan, Desney S.; Nijholt, Antinus

    2010-01-01

    For generations, humans have fantasized about the ability to create devices that can see into a person’s mind and thoughts, or to communicate and interact with machines through thought alone. Such ideas have long captured the imagination of humankind in the form of ancient myths and modern science

  7. Brain-Computer Interfaces: Applying our Minds to Human-Computer Interaction

    NARCIS (Netherlands)

    Tan, Desney S.; Nijholt, Anton

    2010-01-01

    For generations, humans have fantasized about the ability to create devices that can see into a person’s mind and thoughts, or to communicate and interact with machines through thought alone. Such ideas have long captured the imagination of humankind in the form of ancient myths and modern science f

  8. Evolution of Neural Computations: Mantis Shrimp and Human Color Decoding

    Directory of Open Access Journals (Sweden)

    Qasim Zaidi

    2014-10-01

    Full Text Available Mantis shrimp and primates both possess good color vision, but the neural implementation in the two species is very different, a reflection of the largely unrelated evolutionary lineages of these creatures. Mantis shrimp have scanning compound eyes with 12 classes of photoreceptors, and have evolved a system to decode color information at the front-end of the sensory stream. Primates have image-focusing eyes with three classes of cones, and decode color further along the visual-processing hierarchy. Despite these differences, we report a fascinating parallel between the computational strategies at the color-decoding stage in the brains of stomatopods and primates. Both species appear to use narrowly tuned cells that support interval decoding color identification.

  9. Evolution of neural computations: Mantis shrimp and human color decoding.

    Science.gov (United States)

    Zaidi, Qasim; Marshall, Justin; Thoen, Hanne; Conway, Bevil R

    2014-01-01

    Mantis shrimp and primates both possess good color vision, but the neural implementation in the two species is very different, a reflection of the largely unrelated evolutionary lineages of these creatures. Mantis shrimp have scanning compound eyes with 12 classes of photoreceptors, and have evolved a system to decode color information at the front-end of the sensory stream. Primates have image-focusing eyes with three classes of cones, and decode color further along the visual-processing hierarchy. Despite these differences, we report a fascinating parallel between the computational strategies at the color-decoding stage in the brains of stomatopods and primates. Both species appear to use narrowly tuned cells that support interval decoding color identification.
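
    The "narrowly tuned cells that support interval decoding" described in these two abstracts amount to a labelled-line scheme: each unit responds only near its preferred wavelength, so the identity of the most active unit labels the stimulus. A toy sketch, with all tuning centres and widths invented for illustration:

    ```python
    import math

    CENTRES = [430, 470, 510, 550, 590, 630]  # assumed preferred wavelengths (nm)
    SIGMA = 15.0                              # assumed narrow tuning width (nm)

    def responses(wavelength):
        """Gaussian tuning-curve response of each narrowly tuned unit."""
        return [math.exp(-((wavelength - c) / SIGMA) ** 2) for c in CENTRES]

    def decode(wavelength):
        """Interval decoding: the stimulus is labelled by the preferred value
        of the most active unit, with no population averaging."""
        r = responses(wavelength)
        return CENTRES[r.index(max(r))]

    print(decode(505))  # nearest labelled line wins
    ```

    Because the tuning is narrow, the decoder reports discrete colour bins directly at the front end, in contrast to the broadly tuned, comparison-based decoding further along the primate visual hierarchy.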

  10. A Study of Electromyogram Based on Human-Computer Interface

    Institute of Scientific and Technical Information of China (English)

    Jun-Ru Ren; Tie-Jun Liu; Yu Huang; De-Zhong Yao

    2009-01-01

    In this paper, a new control system based on forearm electromyogram (EMG) is proposed for computer peripheral control and artificial prosthesis control. This control system intends to realize the commands of six pre-defined hand poses: up, down, left, right, yes, and no. In order to research the possibility of using a unified amplifier for both electroencephalogram (EEG) and EMG, the surface forearm EMG data is acquired by a 4-channel EEG measurement system. The Bayesian classifier is used to classify the power spectral density (PSD) of the signal. The experiment result verifies that this control system can supply a high command recognition rate (average 48%) even when the EMG data is collected with an EEG system with only single-electrode measurement.
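
    The PSD-plus-Bayesian-classifier pipeline this record describes can be sketched as follows, using synthetic signals in place of the 4-channel recordings. The sampling rate, frequency bands, and two-pose setup are assumptions made for the sketch, not the paper's configuration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    FS = 256  # assumed sampling rate (Hz)

    def band_powers(x, fs=FS, bands=((20, 60), (60, 100))):
        """Mean periodogram (PSD estimate) power in each frequency band."""
        psd = np.abs(np.fft.rfft(x)) ** 2 / len(x)
        freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
        return np.array([psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in bands])

    def make_trial(pose):
        """Synthetic 1-s EMG burst whose dominant band depends on the pose."""
        t = np.arange(FS) / FS
        f = 40.0 if pose == 0 else 80.0
        return np.sin(2 * np.pi * f * t) + 0.3 * rng.standard_normal(FS)

    # Train a Gaussian (naive) Bayes rule: per-class feature means and variances.
    X = np.array([band_powers(make_trial(p)) for p in (0, 1) for _ in range(50)])
    y = np.repeat([0, 1], 50)
    mu = np.array([X[y == c].mean(0) for c in (0, 1)])
    var = np.array([X[y == c].var(0) + 1e-9 for c in (0, 1)])

    def predict(feats):
        """Class with the highest Gaussian log-likelihood of the PSD features."""
        ll = -0.5 * (np.log(2 * np.pi * var) + (feats - mu) ** 2 / var).sum(1)
        return int(np.argmax(ll))

    acc = np.mean([predict(band_powers(make_trial(p))) == p for p in (0, 1) for _ in range(20)])
    print(f"held-out accuracy: {acc:.2f}")
    ```

    With cleanly separated synthetic bands the toy accuracy is near-perfect; real single-electrode EMG, as the abstract notes, yields much lower recognition rates.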

  11. Human Computation in Visualization: Using Purpose Driven Games for Robust Evaluation of Visualization Algorithms.

    Science.gov (United States)

    Ahmed, N; Zheng, Ziyi; Mueller, K

    2012-12-01

    Due to the inherent characteristics of the visualization process, most of the problems in this field have strong ties with human cognition and perception. This makes the human brain and sensory system the only truly appropriate evaluation platform for evaluating and fine-tuning a new visualization method or paradigm. However, getting humans to volunteer for these purposes has always been a significant obstacle, and thus this phase of the development process has traditionally formed a bottleneck, slowing down progress in visualization research. We propose to take advantage of the newly emerging field of Human Computation (HC) to overcome these challenges. HC promotes the idea that rather than considering humans as users of the computational system, they can be made part of a hybrid computational loop consisting of traditional computation resources and the human brain and sensory system. This approach is particularly successful in cases where part of the computational problem is considered intractable using known computer algorithms but is trivial to common sense human knowledge. In this paper, we focus on HC from the perspective of solving visualization problems and also outline a framework by which humans can be easily seduced to volunteer their HC resources. We introduce a purpose-driven game titled "Disguise" which serves as a prototypical example for how the evaluation of visualization algorithms can be mapped into a fun and addicting activity, allowing this task to be accomplished in an extensive yet cost effective way. Finally, we sketch out a framework that transcends from the pure evaluation of existing visualization methods to the design of a new one.

  12. Impact of Cognitive Architectures on Human-Computer Interaction

    Science.gov (United States)

    2014-09-01

    simulation. In this work they were preparing for the Synthetic Theatre of War-1997 exercise where between 10,000 and 50,000 automated agents would...work with up to 1,000 humans.27 The results of this exercise are documented by Laird et al.28 5. Conclusions and Future Work To assess whether cognitive...RW, MacKenzie IS. Towards a standard for pointing device evaluation, perspectives on 27 years of Fitts’ law research in HCI. International Journal of
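
    The snippet above cites the pointing-device evaluation literature built on Fitts' law. As a reminder of the model itself, here is the Shannon formulation; the coefficients a and b are illustrative regression values, not numbers from that work:

    ```python
    import math

    def fitts_mt(distance, width, a=0.1, b=0.15):
        """Predicted movement time (s) for a pointing task, Shannon formulation:
        MT = a + b * log2(D/W + 1), with a, b fit per device/user."""
        index_of_difficulty = math.log2(distance / width + 1.0)  # bits
        return a + b * index_of_difficulty

    for d, w in [(64, 32), (256, 32), (256, 8)]:
        print(f"D={d} W={w}: ID={math.log2(d / w + 1):.2f} bits, MT={fitts_mt(d, w):.3f} s")
    ```

    Movement time grows with the index of difficulty, so farther or smaller targets take measurably longer, which is what standardized pointing-device evaluations exploit.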

  13. Glove-Enabled Computer Operations (GECO): Design and Testing of an Extravehicular Activity Glove Adapted for Human-Computer Interface

    Science.gov (United States)

    Adams, Richard J.; Olowin, Aaron; Krepkovich, Eileen; Hannaford, Blake; Lindsay, Jack I. C.; Homer, Peter; Patrie, James T.; Sands, O. Scott

    2013-01-01

    The Glove-Enabled Computer Operations (GECO) system enables an extravehicular activity (EVA) glove to be dual-purposed as a human-computer interface device. This paper describes the design and human participant testing of a right-handed GECO glove in a pressurized glove box. As part of an investigation into the usability of the GECO system for EVA data entry, twenty participants were asked to complete activities including (1) a Simon Says game in which they attempted to duplicate random sequences of targeted finger strikes and (2) a Text Entry activity in which they used the GECO glove to enter target phrases in two different virtual keyboard modes. In a within-subjects design, both activities were performed both with and without vibrotactile feedback. Participants' mean accuracies in correctly generating finger strikes with the pressurized glove were surprisingly high, both with and without the benefit of tactile feedback. Five of the subjects achieved mean accuracies exceeding 99% in both conditions. In Text Entry, tactile feedback provided a statistically significant performance benefit, quantified by characters entered per minute, as well as a reduction in error rate. Secondary analyses of responses to NASA Task Load Index (TLX) subjective workload assessments reveal a benefit for tactile feedback in GECO glove use for data entry. This first-ever investigation of employment of a pressurized EVA glove for human-computer interface opens up a wide range of future applications, including text chat communications, manipulation of procedures/checklists, cataloguing/annotating images, scientific note taking, human-robot interaction, and control of suit and/or other EVA systems.

  14. Metaphors for the Nature of Human-Computer Interaction in an Empowering Environment: Interaction Style Influences the Manner of Human Accomplishment.

    Science.gov (United States)

    Weller, Herman G.; Hartson, H. Rex

    1992-01-01

    Describes human-computer interface needs for empowering environments in computer usage in which the machine handles the routine mechanics of problem solving while the user concentrates on its higher order meanings. A closed-loop model of interaction is described, interface as illusion is discussed, and metaphors for human-computer interaction are…

  15. DEBATES

    African Journals Online (AJOL)

    tionally ostracised minority regimes to sovereign state structures legitimised ..... a chapter entitled 'The Pitfalls of National Consciousness ' he predicted that .... ous other contributions (see Melber 2000,2001a, 2001b and 2002) in a modified.

  16. Debat

    DEFF Research Database (Denmark)

    Adler-Nissen, Rebecca

    2005-01-01

    That Norway and Iceland are doing well because of the absence of EU membership is a truth with modifications. On the contrary, the two countries are highly dependent on the EU, and the notion that they benefit from extensive independence is an illusion. Through the EEA agreement, Norway and Iceland are obliged to implement the EU's legislation...

  17. [Debat

    DEFF Research Database (Denmark)

    Myong, Lene; Müller, Anders Riel

    2015-01-01

    Criticism of racism is systematically dismissed as either abstract intellectual spin or individual emotional outbursts. Most recently in the text 'Tanker om en hottentot-karussel' ('Thoughts on a Hottentot carousel'), where racialized minorities are asked to tone down their criticism and instead appeal to the white heart...

  18. Debate

    Directory of Open Access Journals (Sweden)

    Asmita Naik

    2003-01-01

    Full Text Available FMR 15 included two articles on the need to protect children from sexual exploitation and abuse in humanitarian crises. Since then, the UN has carried out its own investigation into the matter. Asmita Naik – author of one of the articles in FMR 15 – responds here to the UN’s report.

  19. A conceptual and computational model of moral decision making in human and artificial agents.

    Science.gov (United States)

    Wallach, Wendell; Franklin, Stan; Allen, Colin

    2010-07-01

    Recently, there has been a resurgence of interest in general, comprehensive models of human cognition. Such models aim to explain higher-order cognitive faculties, such as deliberation and planning. Given a computational representation, the validity of these models can be tested in computer simulations such as software agents or embodied robots. The push to implement computational models of this kind has created the field of artificial general intelligence (AGI). Moral decision making is arguably one of the most challenging tasks for computational approaches to higher-order cognition. The need for increasingly autonomous artificial agents to factor moral considerations into their choices and actions has given rise to another new field of inquiry variously known as Machine Morality, Machine Ethics, Roboethics, or Friendly AI. In this study, we discuss how LIDA, an AGI model of human cognition, can be adapted to model both affective and rational features of moral decision making. Using the LIDA model, we will demonstrate how moral decisions can be made in many domains using the same mechanisms that enable general decision making. Comprehensive models of human cognition typically aim for compatibility with recent research in the cognitive and neural sciences. Global workspace theory, proposed by the neuropsychologist Bernard Baars (1988), is a highly regarded model of human cognition that is currently being computationally instantiated in several software implementations. LIDA (Franklin, Baars, Ramamurthy, & Ventura, 2005) is one such computational implementation. LIDA is both a set of computational tools and an underlying model of human cognition, which provides mechanisms that are capable of explaining how an agent's selection of its next action arises from bottom-up collection of sensory data and top-down processes for making sense of its current situation. We will describe how the LIDA model helps integrate emotions into the human decision-making process, and we

  20. A debate on open inflation

    Science.gov (United States)

    Hawking, S. W.

    1999-07-01

    This is a reproduction of Professor Stephen Hawking's part in a debate, which took place at the COSMO 98 Conference in Monterey, California. Two other physicists, Andrei Linde and Alexander Vilenkin, also took part. Professor Hawking is the Lucasian Professor of Mathematics at the University of Cambridge, in England.

  1. Cooling Signs in Wake Debate

    Science.gov (United States)

    Samuels, Christina A.

    2011-01-01

    More than a year after dismantling a student-assignment policy based on socioeconomic diversity and setting off a wave of reaction that drew national attention, the Wake County, North Carolina, school board took a step that may turn down the temperature of the intense debate. The board, which has been deeply split on an assignment plan for the…

  2. Can Computers Foster Human Users’ Creativity? Theory and Praxis of Mixed-Initiative Co-Creativity

    Directory of Open Access Journals (Sweden)

    Antonios Liapis

    2016-07-01

    Full Text Available This article discusses the impact of artificially intelligent computers on the process of design, play and educational activities. A computational process which has the necessary intelligence and creativity to take a proactive role in such activities can not only support human creativity but also foster it and prompt lateral thinking. The argument is made both from the perspective of human creativity, where the computational input is treated as an external stimulus which triggers re-framing of humans' routines and mental associations, and from the perspective of computational creativity, where human input and initiative constrains the search space of the algorithm, enabling it to focus on specific possible solutions to a problem rather than searching globally for the optimum. The article reviews four mixed-initiative tools (for design and educational play) based on how they contribute to human-machine co-creativity. These paradigms serve different purposes, afford different human interaction methods and incorporate different computationally creative processes. Assessing how co-creativity is facilitated on a per-paradigm basis strengthens the theoretical argument and provides an initial seed for future work in the burgeoning domain of mixed-initiative interaction.

  3. Intimate Debate Technique: Medicinal Use of Marijuana

    Science.gov (United States)

    Herreid, Clyde Freeman; DeRei, Kristie

    2007-01-01

    Classroom debates used to be familiar exercises to students schooled in past generations. In this article, the authors describe the technique called "intimate debate". To cooperative learning specialists, the technique is known as "structured debate" or "constructive debate". It is a powerful method for dealing with case topics that involve…

  4. Students in Action: Debating the "Mighty Opposites."

    Science.gov (United States)

    Insights on Law & Society, 2000

    2000-01-01

    Focuses on the hate speech, gun, and privacy debates that today's youth will have to address in their future. Includes articles addressing the arguments in each issue: (1) "Debating Hate Speech" (Frank Kopecky); (2) "Debating the Gun Issue" (Denise Barr); and (3) "Debating the Right to Privacy" (Pinky Wassenberg).

  5. Computation of particle detachment from floors due to human walking

    Science.gov (United States)

    Elhadidi, Basman; Khalifa, Ezzat

    2005-11-01

    A computational model for detachment of fine particles due to the unsteady flow under a foot is developed. As the foot approaches the floor, fluid volume is displaced laterally as a wall jet from the perimeter of the contact area at high velocity and acceleration. Unsteady aerodynamic forces on particles attached to the floor are considered. Results show that the jet velocity is ~40 m/s for a foot idealized as a 15 cm circular disk approaching the floor at 1 m/s with a final gap of 0.8 mm. This velocity is sufficient to detach small particles (~1 μm). The flow accelerates at ~400 m/s^2, which affects the detachment of larger-sized particles (~100 μm). As the disk is brought to rest, the unsteady jet expands outwards, advecting a vortex ring closely attached to it. At the disk edge, a counter-rotating vortex is generated by the sudden deceleration of the disk. Both vortices can play a role in entrainment of the suspended particles in the flowfield. Numerical studies also show that the maximum jet velocity is ~20 m/s for a simplified foot immediately after heel contact in the stance phase of the gait.
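
    The quoted jet speed can be recovered to the right order of magnitude from a one-line mass-conservation estimate for a rigid disk approaching a floor: the volume flux swept by the disk face must leave through the cylindrical gap at its rim, so pi R^2 V_disk = 2 pi R h V_jet. Values are taken from the abstract; the rigid-disk idealization is of course cruder than the paper's unsteady computation.

    ```python
    # Squeeze-film continuity estimate of the rim jet speed under a foot.
    R = 0.15 / 2      # disk radius (m): 15 cm diameter "foot"
    V_disk = 1.0      # approach speed (m/s)
    h = 0.8e-3        # final gap (m)

    # pi * R^2 * V_disk = 2 * pi * R * h * V_jet  =>  V_jet = R * V_disk / (2 * h)
    V_jet = R * V_disk / (2 * h)
    print(f"peak rim jet speed ~ {V_jet:.0f} m/s")  # same order as the ~40 m/s reported
    ```

    The estimate (~47 m/s) overshoots slightly because the real gap flow is decelerating and viscous, but it shows why a slow 1 m/s footfall produces a jet fast enough to detach micrometre particles.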

  6. Computer-assisted learning in human and dental medicine.

    Science.gov (United States)

    Höhne, S; Schumann, R R

    2004-04-01

    This article describes the development and application of new didactic methods for use in computer-assisted teaching and learning systems for training doctors and dentists. Taking the Meducase project as an example, didactic models and their technological implementation are explained, together with the limitations of imparting knowledge with the "new media". In addition, legal concepts for a progressive, pragmatic, and innovative distribution of knowledge to undergraduate students are presented. In conclusion, potential and visions for the wide use of electronic learning in the German and European universities in the future are discussed. Self-directed learning (SDL) is a key component in both undergraduate education and lifelong learning for medical practitioners. E-learning can already be used to promote SDL at undergraduate level. The Meducase project uses self-directed, constructive, case- and problem-oriented learning within a learning platform for medical and dental students. In the long run, e-learning programs can only be successful in education if there is consistent analysis and implementation of value-added factors and the development and use of media-didactic concepts matched to electronic learning. The use of innovative forms of licensing - open source licenses for software and similar licenses for content - facilitates continuous, free access to these programs for all students and teachers. These legal concepts offer the possibility of innovative knowledge distribution, quality assurance and standardization across specializations, university departments, and possibly even national borders.

  7. Computational model of soft tissues in the human upper airway.

    Science.gov (United States)

    Pelteret, J-P V; Reddy, B D

    2012-01-01

    This paper presents a three-dimensional finite element model of the tongue and surrounding soft tissues with potential application to the study of sleep apnoea and of linguistics and speech therapy. The anatomical data was obtained from the Visible Human Project, and the underlying histological data was also extracted and incorporated into the model. Hyperelastic constitutive models were used to describe the material behaviour, and material incompressibility was accounted for. An active Hill three-element muscle model was used to represent the muscular tissue of the tongue. The neural stimulus for each muscle group was determined through the use of a genetic algorithm-based neural control model. The fundamental behaviour of the tongue under gravitational and breathing-induced loading is investigated. It is demonstrated that, when a time-dependent loading is applied to the tongue, the neural model is able to control the position of the tongue and produce a physiologically realistic response for the genioglossus.
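
    A minimal sketch of the active Hill-type contractile element mentioned above: force is the product of activation, a force-length curve, and a force-velocity curve. The Gaussian and hyperbolic shapes and all constants here are generic textbook assumptions, not the paper's fitted tongue-muscle parameters.

    ```python
    import math

    def hill_force(activation, lm, vm, f_max=1.0, l_opt=1.0, v_max=10.0):
        """Active fibre force of a Hill-type muscle model.
        activation in [0, 1]; lm = normalized fibre length;
        vm = shortening velocity (valid here for vm >= 0)."""
        f_length = math.exp(-((lm - l_opt) / 0.45) ** 2)                      # force-length
        f_velocity = max(0.0, (1.0 - vm / v_max) / (1.0 + 4.0 * vm / v_max))  # force-velocity
        return activation * f_max * f_length * f_velocity

    print(hill_force(1.0, 1.0, 0.0))  # isometric at optimal length -> f_max
    ```

    In a full three-element model this active force acts in parallel and series with passive elastic elements, and a neural control signal (here, `activation`) is what the paper's genetic-algorithm controller tunes per muscle group.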

  8. Interactive 3D computer model of the human corneolimbal region

    DEFF Research Database (Denmark)

    Molvaer, Rikke Kongshaug; Andreasen, Arne; Heegaard, Steffen;

    2013-01-01

    PURPOSE: This study aims to clarify the existence of and to map the localization of different proposed stem cell niches in the corneal limbal region. MATERIALS AND METHODS: One human eye was cut into 2200 consecutive sections. Every other section was stained with haematoxylin and eosin, digitized... in the limbal region: limbal epithelial crypts (LECs), limbal crypts (LCs) and focal stromal projections (FSPs). In all, eight LECs, 25 LCs and 105 FSPs were identified in the limbal region. The LECs, LCs and FSPs were predominantly located in the superior limbal region, with seven LECs, 19 LCs and 93 FSPs in the superior limbal region and one LEC, six LCs and 12 FSPs in the inferior limbal region. Only few LECs, LCs and FSPs were localized nasally and temporally. CONCLUSION: Interactive 3D models are a powerful tool that may help to shed more light on the existence and spatial localization of the different stem...

  9. Situated dialog in speech-based human-computer interaction

    CERN Document Server

    Raux, Antoine; Lane, Ian; Misu, Teruhisa

    2016-01-01

    This book provides a survey of the state-of-the-art in the practical implementation of Spoken Dialog Systems for applications in everyday settings. It includes contributions on key topics in situated dialog interaction from a number of leading researchers and offers a broad spectrum of perspectives on research and development in the area. In particular, it presents applications in robotics, knowledge access and communication and covers the following topics: dialog for interacting with robots; language understanding and generation; dialog architectures and modeling; core technologies; and the analysis of human discourse and interaction. The contributions are adapted and expanded contributions from the 2014 International Workshop on Spoken Dialog Systems (IWSDS 2014), where researchers and developers from industry and academia alike met to discuss and compare their implementation experiences, analyses and empirical findings.

  10. When a Talking-Face Computer Agent Is Half-Human and Half-Humanoid: Human Identity and Consistency Preference

    Science.gov (United States)

    Gong, Li; Nass, Clifford

    2007-01-01

    Computer-generated anthropomorphic characters are a growing type of communicator that is deployed in digital communication environments. An essential theoretical question is how people identify humanlike but clearly artificial, hence humanoid, entities in comparison to natural human ones. This identity categorization inquiry was approached under…

  11. Computational model of sustained acceleration effects on human cognitive performance.

    Science.gov (United States)

    McKinley, Richard A; Gallimore, Jennie J

    2013-08-01

    Extreme acceleration maneuvers encountered in modern agile fighter aircraft can wreak havoc on human physiology, thereby significantly influencing cognitive task performance. As oxygen content declines under acceleration stress, the activity of higher-order cortical tissue is reduced to ensure sufficient metabolic resources are available for critical life-sustaining autonomic functions. Consequently, cognitive abilities reliant on these affected areas suffer significant performance degradations. The goal was to develop and validate a model capable of predicting human cognitive performance under acceleration stress. Development began with creation of a proportional-control cardiovascular model that produced predictions of several hemodynamic parameters, including eye-level blood pressure and regional cerebral oxygen saturation (rSo2). An algorithm was derived to relate changes in rSo2 within specific brain structures to performance on cognitive tasks that require engagement of different brain areas. Data from the "precision timing" experiment were then used to validate the model predicting cognitive performance as a function of G(z) profile. Results, reported below as value ranges, showed high agreement between the measured and predicted values for the rSo2 (correlation coefficient: 0.7483-0.8687; linear best-fit slope: 0.5760-0.9484; mean percent error: 0.75-3.33) and cognitive performance models (motion inference task--correlation coefficient: 0.7103-0.9451; linear best-fit slope: 0.7416-0.9144; mean percent error: 6.35-38.21; precision timing task--correlation coefficient: 0.6856-0.9726; linear best-fit slope: 0.5795-1.027; mean percent error: 6.30-17.28). The evidence suggests that the model is capable of accurately predicting cognitive performance of simplistic tasks under high acceleration stress.
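
    The three agreement statistics reported in this record (correlation coefficient, linear best-fit slope of predicted against measured, and mean percent error) can be computed as below. The measured/predicted pairs are invented for illustration; only the metric definitions are the point.

    ```python
    import statistics  # correlation/linear_regression require Python 3.10+

    measured  = [60.0, 65.0, 70.0, 72.0, 68.0, 75.0]  # illustrative observations
    predicted = [58.0, 66.0, 69.0, 74.0, 66.0, 73.0]  # illustrative model output

    def validation_stats(meas, pred):
        """Pearson r, slope of pred ~ slope*meas + b, and mean absolute percent error."""
        r = statistics.correlation(meas, pred)
        slope = statistics.linear_regression(meas, pred).slope
        mpe = 100.0 * sum(abs(p - m) / m for m, p in zip(meas, pred)) / len(meas)
        return r, slope, mpe

    r, slope, mpe = validation_stats(measured, predicted)
    print(f"r={r:.3f}, slope={slope:.3f}, mean % error={mpe:.2f}")
    ```

    A well-validated model shows r near 1, slope near 1 (no systematic over- or under-prediction), and a small mean percent error, which is how the ranges quoted above should be read.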

  12. Computational analysis of splicing errors and mutations in human transcripts

    Directory of Open Access Journals (Sweden)

    Gelfand Mikhail S

    2008-01-01

    Full Text Available Abstract Background Most retained introns found in human cDNAs generated by high-throughput sequencing projects seem to result from underspliced transcripts, and thus they capture intermediate steps of pre-mRNA splicing. On the other hand, mutations in splice sites cause skipping of the respective exon or activation of pre-existing cryptic sites. Both types of events reflect properties of the splicing mechanism. Results The retained introns were significantly shorter than constitutive ones, and skipped exons were shorter than exons with cryptic sites. Both donor and acceptor splice sites of retained introns were weaker than splice sites of constitutive introns. The authentic acceptor sites affected by mutations were significantly weaker in exons with activated cryptic sites than in skipped exons. The distance from a mutated splice site to the nearest equivalent site was significantly shorter in cases of activated cryptic sites than in exon skipping events. The prevalence of retained introns within genes monotonically increased in the 5'-to-3' direction (more retained introns close to the 3'-end), consistent with the model of co-transcriptional splicing. The density of exonic splicing enhancers was higher, and the density of exonic splicing silencers lower, in retained introns compared to constitutive ones, and in exons with cryptic sites compared to skipped exons. Conclusion Thus the analysis of retained introns in human cDNA, of exons skipped due to mutations in splice sites, and of exons with cryptic sites produced results consistent with the intron definition mechanism of splicing of short introns, co-transcriptional splicing, and the dependence of splicing efficiency on splice site strength and the density of candidate exonic splicing enhancers and silencers. These results are consistent with other, recently published analyses.

  13. Rugoscopy: Human identification by computer-assisted photographic superimposition technique

    Directory of Open Access Journals (Sweden)

    Rezwana Begum Mohammed

    2013-01-01

    Full Text Available Background: Human identification has been studied since the fourteenth century and has gradually advanced for forensic purposes. Traditional methods such as dental, fingerprint, and DNA comparisons are probably the most common techniques used in this context, allowing fast and secure identification processes. However, in circumstances where identification of an individual by fingerprint or dental record comparison is difficult, palatal rugae may be considered as an alternative source of material. Aim: The present study was done to evaluate the individualistic nature and use of palatal rugae patterns for personal identification, and also to test the efficiency of computerized software for forensic identification by photographic superimposition of palatal photographs obtained from casts. Materials and Methods: Two sets of alginate impressions were made from the upper arches of 100 individuals (50 males and 50 females) with a one-month interval in between, and the casts were poured. All the teeth except the incisors were removed to ensure that only the palate could be used in the identification process. In one set of the casts, the palatal rugae were highlighted with a graphite pencil. All 200 casts were randomly numbered and then photographed with a 10.1-megapixel Kodak digital camera using a standardized method. Using computerized software, the digital photographs of the casts without highlighted palatal rugae were overlapped over transparent images of the casts with highlighted palatal rugae, in order to identify the pairs by superimposition. The incisors were retained and used as landmarks to determine the magnification required to bring the two sets of photographs to the same size, in order to make a perfect superimposition of images. Results: Overlapping the digital photographs of the casts with highlighted palatal rugae over the normal set of casts resulted in 100% positive identification.

  14. Proceedings of the Third International Conference on Intelligent Human Computer Interaction

    CERN Document Server

    Pokorný, Jaroslav; Snášel, Václav; Abraham, Ajith

    2013-01-01

    The Third International Conference on Intelligent Human Computer Interaction 2011 (IHCI 2011) was held at Charles University, Prague, Czech Republic from August 29 - August 31, 2011. This conference was third in the series, following IHCI 2009 and IHCI 2010 held in January at IIIT Allahabad, India. Human computer interaction is a fast growing research area and an attractive subject of interest for both academia and industry. There are many interesting and challenging topics that need to be researched and discussed. This book aims to provide excellent opportunities for the dissemination of interesting new research and discussion about presented topics. It can be useful for researchers working on various aspects of human computer interaction. Topics covered in this book include user interface and interaction, theoretical background and applications of HCI and also data mining and knowledge discovery as a support of HCI applications.

  15. Real Time Multiple Hand Gesture Recognition System for Human Computer Interaction

    Directory of Open Access Journals (Sweden)

    Siddharth S. Rautaray

    2012-05-01

    Full Text Available With the increasing use of computing devices in day-to-day life, the need for user-friendly interfaces has led to the evolution of different types of interfaces for human-computer interaction. Real-time vision-based hand gesture recognition affords users the ability to interact with computers in more natural and intuitive ways. Direct use of the hands as an input device is an attractive method: a hand can communicate much more information by itself than mice, joysticks, etc., allowing a greater variety of recognition systems for human-computer interaction applications. The gesture recognition system consists of three main modules: hand segmentation, hand tracking, and gesture recognition from hand features. The designed system is further integrated with different applications, such as an image browser and a virtual game, broadening the possibilities for human-computer interaction. Computer-vision-based systems have the potential to provide more natural, non-contact solutions. The present research work focuses on the design and development of a practical framework for real-time hand gesture recognition.

  16. Genetic advances require comprehensive bioethical debate.

    Science.gov (United States)

    ten Have, Henk A M J

    2003-10-01

    In the popular media and scientific literature, the idea of medical utopia seems to have been revived. Medical science and technology are expected to provide solutions for all kinds of daily problems in human existence. The utopian context and optimistic atmosphere are influencing deeply the bio-ethical debate concerning bio-molecular technologies. They a priori direct this debate towards individual perspectives, emphasizing the benefits among which an autonomous person can make his or her choice, and towards practical applications the potential beneficial effects of which are almost there. It is argued that the concept of "geneticization" is useful for the analysis of the interrelations between genetics, medicine, society, and culture. This concept focuses on conceptual issues--the use of genetic vocabulary to define problems; institutional issues--the emergence of bio-ethics experts; cultural issues--the transformation of individual and social attitudes under the influence of genetic knowledge and technology; and philosophical issues--changing views of human identity, interpersonal relationships, and individual responsibility.

  17. Human-computer interaction handbook fundamentals, evolving technologies and emerging applications

    CERN Document Server

    Sears, Andrew

    2007-01-01

    This second edition of The Human-Computer Interaction Handbook provides an updated, comprehensive overview of the most important research in the field, including insights that are directly applicable throughout the process of developing effective interactive information technologies. It features cutting-edge advances to the scientific knowledge base, as well as visionary perspectives and developments that fundamentally transform the way in which researchers and practitioners view the discipline. As the seminal volume of HCI research and practice, The Human-Computer Interaction Handbook features…

  18. Application of next generation sequencing to human gene fusion detection: computational tools, features and perspectives.

    Science.gov (United States)

    Wang, Qingguo; Xia, Junfeng; Jia, Peilin; Pao, William; Zhao, Zhongming

    2013-07-01

    Gene fusions are important genomic events in human cancer because their fusion gene products can drive the development of cancer and thus are potential prognostic tools or therapeutic targets in anti-cancer treatment. Major advancements have been made in computational approaches for fusion gene discovery over the past 3 years due to improvements and widespread applications of high-throughput next generation sequencing (NGS) technologies. To identify fusions from NGS data, existing methods typically leverage the strengths of both sequencing technologies and computational strategies. In this article, we review the NGS and computational features of existing methods for fusion gene detection and suggest directions for future development.

  19. Digital image processing and analysis human and computer vision applications with CVIPtools

    CERN Document Server

    Umbaugh, Scott E

    2010-01-01

    Section I: Introduction to Digital Image Processing and Analysis. Digital Image Processing and Analysis: Overview; Image Analysis and Computer Vision; Image Processing and Human Vision; Key Points; Exercises; References; Further Reading. Computer Imaging Systems: Imaging Systems Overview; Image Formation and Sensing; CVIPtools Software; Image Representation; Key Points; Exercises; Supplementary Exercises; References; Further Reading. Section II: Digital Image Analysis and Computer Vision. Introduction to Digital Image Analysis: Introduction; Preprocessing; Binary Image Analysis; Key Points; Exercises; Supplementary Exercises; References; Further Read…

  20. Signatures of a Statistical Computation in the Human Sense of Confidence.

    Science.gov (United States)

    Sanders, Joshua I; Hangya, Balázs; Kepecs, Adam

    2016-05-04

    Human confidence judgments are thought to originate from metacognitive processes that provide a subjective assessment about one's beliefs. Alternatively, confidence is framed in mathematics as an objective statistical quantity: the probability that a chosen hypothesis is correct. Despite similar terminology, it remains unclear whether the subjective feeling of confidence is related to the objective, statistical computation of confidence. To address this, we collected confidence reports from humans performing perceptual and knowledge-based psychometric decision tasks. We observed two counterintuitive patterns relating confidence to choice and evidence: apparent overconfidence in choices based on uninformative evidence, and decreasing confidence with increasing evidence strength for erroneous choices. We show that these patterns lawfully arise from statistical confidence, and therefore occur even for perfectly calibrated confidence measures. Furthermore, statistical confidence quantitatively accounted for human confidence in our tasks without necessitating heuristic operations. Accordingly, we suggest that the human feeling of confidence originates from a mental computation of statistical confidence.
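    The second counterintuitive pattern (confidence decreasing with evidence strength on erroneous choices) falls out of a minimal signal-detection sketch, under the standard assumptions rather than the paper's exact model: the percept is drawn from a Gaussian centred on the stimulus strength, the choice is the sign of the percept, and confidence grows with the percept's distance from the decision boundary. All parameter values below are illustrative.

    ```python
    import random

    def mean_error_confidence(strength, n_trials=200_000, seed=0):
        """Simulate one stimulus strength; return the mean boundary distance
        (a confidence proxy) on error trials only."""
        rng = random.Random(seed)
        total, errors = 0.0, 0
        for _ in range(n_trials):
            x = rng.gauss(strength, 1.0)   # noisy percept around the true strength
            if x < 0:                      # sign(x) disagrees with the stimulus: error
                total += abs(x)            # distance from the boundary at 0
                errors += 1
        return total / errors

    weak, strong = mean_error_confidence(0.5), mean_error_confidence(2.0)
    # Errors under strong evidence cluster near the boundary, so their
    # mean confidence proxy is lower than under weak evidence.
    ```

    The intuition: when evidence is strong, an error can only happen if the percept lands barely on the wrong side of the boundary, so error-trial confidence is low precisely when evidence is strong.
    
    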

  1. Signatures of a statistical computation in the human sense of confidence

    Science.gov (United States)

    Sanders, Joshua I.; Hangya, Balázs; Kepecs, Adam

    2017-01-01

    Summary Human confidence judgments are thought to originate from metacognitive processes that provide a subjective assessment about one’s beliefs. Alternatively, confidence is framed in mathematics as an objective statistical quantity: the estimated probability that a chosen hypothesis is correct. Despite similar terminology, it remains unclear whether the subjective feeling of confidence is related to the objective, statistical computation of confidence. To address this, we collected confidence reports from humans performing perceptual and knowledge-based psychometric decision tasks. We observed two counterintuitive patterns relating confidence to choice and evidence: apparent overconfidence in choices based on uninformative evidence, and for erroneous choices, that confidence decreased with increasing evidence strength. We show that these patterns lawfully arise when statistical confidence qualifies a decision. Furthermore, statistical confidence quantitatively accounted for human confidence in our tasks without necessitating heuristic operations. Accordingly, we suggest that the human feeling of confidence originates from a mental computation of statistical confidence. PMID:27151640

  2. Green grabbing debate and Madagascar

    DEFF Research Database (Denmark)

    Casse, Thorkil; Razafy, Fara Lala; Wurtzebach, Zachary

    2017-01-01

    Green grabbing is a scholarly critique of conservation efforts. Scholars of green grabbing argue that many conservation strategies - such as the designation of protected areas and the creation of market-based conservation mechanisms - are designed with the intent to dispossess local peoples and capitalise natural assets. First, to provide some context on the green grabbing debate, we discuss the trade-offs between conservation and development objectives. In addition, we refer briefly to the broader land grabbing debate, of which green grabbing is a sub-component. Second, we question the theoretical foundations of green grabbing, the concepts of primitive accumulation and commodification of nature. Third, we compare data collected by the green grabbing scholars and conservation NGOs from the very same site in Madagascar. We conclude that rigorous post-intervention stakeholder analysis, rather than pre...

  3. Vitalism and the Darwin Debate

    Science.gov (United States)

    Henderson, James

    2012-08-01

    There are currently both scientific and public debates surrounding Darwinism. In the scientific debate, the details of evolution are in dispute, but not the central thesis of Darwin's theory; in the public debate, Darwinism itself is questioned. I concentrate on the public debate because of its direct impact on education in the United States. Some critics of Darwin advocate the teaching of intelligent design theory along with Darwin's theory, and others seek to eliminate even the mention of evolution from science classes altogether. Many of these critics base their objections on the claim that non-living matter cannot give rise to living matter. After considering some of the various meanings assigned to `vitalism' over the years, I argue that a considerable portion of Darwin deniers support a literal version of vitalism that is not scientifically respectable. Their position seems to be that since life cannot arise naturally, Darwin's theory accomplishes nothing: If it can only account for life forms changing from one to another (even this is disputed by some) but not how life arose in the first place, what's the point? I argue that there is every reason to believe that living and non-living matter differ only in degree, not in kind, and that all conversation about Darwinism should start with the assumption that abiogenesis is possible unless or until compelling evidence of its impossibility is presented. That is, I advocate a position that the burden of proof lies with those who claim "Life only comes from life." Until that case is made, little weight should be given to their position.

  4. Debate on globalization. A comment

    OpenAIRE

    Schilirò, Daniele

    2003-01-01

    Globalization means the affirmation of a single market at the global level. More generally, the word globalization is usually used to indicate a unified world that tends to homogenize products and consumption patterns. In addition to the undoubtedly positive effects that the processes of globalization have on the overall well-being and the possibilities of consumption in all countries, a lively debate has developed among economists, but also among philosophers, sociologists and other scholars...

  5. A Computational Approach for Automated Posturing of a Human Finite Element Model

    Science.gov (United States)

    2016-07-01

    Current posturing approaches include obtaining source geometries already in the posture being tested, or so-called posturing "by hand", where geometries are moved to what "looks correct". (ARL-MR-0934, July 2016, US Army Research Laboratory: A Computational Approach for Automated Posturing of a Human Finite Element Model, by Justin McKee, Bennett Aerospace, Inc., Cary, NC, and Adam Sokolow, Weapons and Materials Research.)

  6. Die menswaardigheid van die menslike embrio: die debat tot dusver [The human dignity of the human embryo: the debate thus far]

    Directory of Open Access Journals (Sweden)

    J.M. Vorster

    2011-06-01

    Full Text Available The human dignity of the human embryo: the debate thus far. This article examines some recent arguments regarding the ethics of stem cell research as they are discussed in the various essays in the publication of Gruen et al. (2007), "Stem cell research: the ethical issues". Regarding the use of human embryos in stem cell research, these essays discuss, among other things, the potential of the human embryo, the moral status (human dignity) of the human embryo, the creation of chimeras, the sale of oocytes, and other ethical issues in modern bioethics. Finally, the article draws attention to the main ethical problems at stake, to be dealt with by Christian ethics using a deontological ethical theory. Christian ethics should focus on these problems in the ongoing ethical debate regarding stem cell research.

  7. National debate on the energies; Debat national sur les energies

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    This document gathers the addresses presented at the national debate on energies held on 18 March 2003. The full texts of the presentations by the Minister of Industry, N. Fontaine, and the Prime Minister, J.P. Raffarin, are provided. A synthesis of the answers to questions on the following topics is also presented: understanding energy, the growth of energy demand, international consumption, the necessary changes in consumption and production modes, the environmental impact, the resources, and decision making and decision makers. (A.L.B.)

  8. Human vs. Computer Diagnosis of Students' Natural Selection Knowledge: Testing the Efficacy of Text Analytic Software

    Science.gov (United States)

    Nehm, Ross H.; Haertig, Hendrik

    2012-01-01

    Our study examines the efficacy of Computer Assisted Scoring (CAS) of open-response text relative to expert human scoring within the complex domain of evolutionary biology. Specifically, we explored whether CAS can diagnose the explanatory elements (or Key Concepts) that comprise undergraduate students' explanatory models of natural selection with…

  9. A Project-Based Learning Setting to Human-Computer Interaction for Teenagers

    Science.gov (United States)

    Geyer, Cornelia; Geisler, Stefan

    2012-01-01

    Knowledge of the fundamentals of human-computer interaction and usability engineering is becoming more and more important in technical domains. However, this interdisciplinary field of work and the corresponding degree programs are not broadly known. Therefore, at the Hochschule Ruhr West, University of Applied Sciences, a program was developed to give…

  10. Human-competitive evolution of quantum computing artefacts by Genetic Programming.

    Science.gov (United States)

    Massey, Paul; Clark, John A; Stepney, Susan

    2006-01-01

    We show how Genetic Programming (GP) can be used to evolve useful quantum computing artefacts of increasing sophistication and usefulness: firstly specific quantum circuits, then quantum programs, and finally system-independent quantum algorithms. We conclude the paper by presenting a human-competitive Quantum Fourier Transform (QFT) algorithm evolved by GP.

  11. Characteristics of an Intelligent Computer Assisted Instruction Shell with an Example in Human Physiology.

    Science.gov (United States)

    Dori, Yehudit J.; Yochim, Jerome M.

    1992-01-01

    Discusses exemplary teacher and student characteristics that can provide the base to generate an Intelligent Computer Assisted Instruction (ICAI) shell. Outlines the expertise, learning, student-model, and inference modules of an ICAI shell. Describes the development of an ICAI shell for an undergraduate course in human physiology. (33 references)…

  12. Human Computer Collaboration at the Edge: Enhancing Collective Situation Understanding with Controlled Natural Language

    Science.gov (United States)

    2016-09-06


  13. Enhancing Human-Computer Interaction Design Education: Teaching Affordance Design for Emerging Mobile Devices

    Science.gov (United States)

    Faiola, Anthony; Matei, Sorin Adam

    2010-01-01

    The evolution of human-computer interaction design (HCID) over the last 20 years suggests that there is a growing need for educational scholars to consider new and more applicable theoretical models of interactive product design. The authors suggest that such paradigms would call for an approach that would equip HCID students with a better…

  14. The Human-Computer Interaction of Cross-Cultural Gaming Strategy

    Science.gov (United States)

    Chakraborty, Joyram; Norcio, Anthony F.; Van Der Veer, Jacob J.; Andre, Charles F.; Miller, Zachary; Regelsberger, Alexander

    2015-01-01

    This article explores the cultural dimensions of the human-computer interaction that underlies gaming strategies. The article is a desktop study of existing literature and is organized into five sections. The first examines the cultural aspects of knowledge processing. The interaction between social constructs and technology is then discussed. Following this, the…

  15. Study on Speciation of Pr(III) in Human Blood Plasma by Computer Simulation

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    Speciation of Pr(III) in human blood plasma has been investigated by computer simulation, and the speciation and distribution of Pr(III) have been obtained. It has been found that most of the Pr(III) is bound to phosphate and forms a precipitate. The results obtained are in accord with experimental observations.

  16. HCI^2 Workbench: A Development Tool for Multimodal Human-Computer Interaction Systems

    NARCIS (Netherlands)

    Shen, Jie; Wenzhe, Shi; Pantic, Maja

    2011-01-01

    In this paper, we present a novel software tool designed and implemented to simplify the development process of Multimodal Human-Computer Interaction (MHCI) systems. This tool, which is called the HCI^2 Workbench, exploits a Publish / Subscribe (P/S) architecture [13] [14] to facilitate efficient an

  17. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems

    DEFF Research Database (Denmark)

    The CHI Papers and Notes program is continuing to grow along with many of our sister conferences. We are pleased that CHI is still the leading venue for research in human-computer interaction. CHI 2013 continued the use of subcommittees to manage the review process. Authors selected the subcommit...

  18. Use of computational modeling approaches in studying the binding interactions of compounds with human estrogen receptors.

    Science.gov (United States)

    Wang, Pan; Dang, Li; Zhu, Bao-Ting

    2016-01-01

    Estrogens have a whole host of physiological functions in many human organs and systems, including the reproductive, cardiovascular, and central nervous systems. Many naturally-occurring compounds with estrogenic or antiestrogenic activity are present in our environment and food sources. Synthetic estrogens and antiestrogens are also important therapeutic agents. At the molecular level, estrogen receptors (ERs) mediate most of the well-known actions of estrogens. Given recent advances in computational modeling tools, it is now highly practical to use these tools to study the interaction of human ERs with various types of ligands. There are two common categories of modeling techniques: one is the quantitative structure activity relationship (QSAR) analysis, which uses the structural information of the interacting ligands to predict the binding site properties of a macromolecule, and the other one is molecular docking-based computational analysis, which uses the 3-dimensional structural information of both the ligands and the receptor to predict the binding interaction. In this review, we discuss recent results that employed these and other related computational modeling approaches to characterize the binding interaction of various estrogens and antiestrogens with the human ERs. These examples clearly demonstrate that the computational modeling approaches, when used in combination with other experimental methods, are powerful tools that can precisely predict the binding interaction of various estrogenic ligands and their derivatives with the human ERs.
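    As a minimal illustration of the QSAR idea mentioned above (predicting binding activity from structural descriptors of the ligands), the sketch below fits a one-descriptor least-squares model and uses it to predict the activity of a new ligand. The descriptor (logP, hydrophobicity) and all numeric values are invented for illustration and do not come from the review.

    ```python
    def fit_qsar(descriptors, activities):
        """One-descriptor QSAR: least-squares line activity ~ a*descriptor + b."""
        n = len(descriptors)
        mx = sum(descriptors) / n
        my = sum(activities) / n
        sxy = sum((x - mx) * (y - my) for x, y in zip(descriptors, activities))
        sxx = sum((x - mx) ** 2 for x in descriptors)
        a = sxy / sxx          # slope: activity change per unit descriptor
        b = my - a * mx        # intercept
        return a, b

    # Invented training set: hydrophobicity (logP) vs. binding activity (pIC50).
    logp  = [1.2, 2.0, 2.8, 3.5, 4.1]
    pic50 = [5.0, 5.6, 6.1, 6.8, 7.2]
    a, b = fit_qsar(logp, pic50)
    predicted = a * 3.0 + b    # predicted pIC50 for a new ligand with logP = 3.0
    ```

    Real QSAR models use many descriptors and regularized or nonlinear regression, but the core step is the same: learn a mapping from ligand structure to measured activity, then apply it to untested compounds.
    
    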

  19. A Framework and Implementation of User Interface and Human-Computer Interaction Instruction

    Science.gov (United States)

    Peslak, Alan

    2005-01-01

    Researchers have suggested that up to 50% of the effort in the development of information systems is devoted to user interface development (Douglas, Tremaine, Leventhal, Wills, & Manaris, 2002; Myers & Rosson, 1992). Yet little study has been performed on the inclusion of important interface and human-computer interaction topics into a current…

  20. Computer-based personality judgments are more accurate than those made by humans

    Science.gov (United States)

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-01

    Judging others’ personalities is an essential skill in successful social living, as personality is a key driver behind people’s interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants’ Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy. PMID:25583507

  1. Computer-based personality judgments are more accurate than those made by humans.

    Science.gov (United States)

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-27

    Judging others' personalities is an essential skill in successful social living, as personality is a key driver behind people's interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants' Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy.

  2. Influence of media on collective debates

    CERN Document Server

    Quattrociocchi, Walter; Scala, Antonio

    2013-01-01

    The information system (TV, newspapers, blogs, social network platforms) and its inner dynamics play a fundamental role in the evolution of collective debates and thus in public opinion. In this work we address such a process, focusing on how the current inner strategies of the information system (competition, customer satisfaction), once combined with gossip, may affect opinion dynamics. A reinforcement effect is particularly evident in social network platforms where several incompatible cultures coexist (e.g., for or against the existence of chemical trails and reptilians, the new world order conspiracy, and so forth). We introduce a computational model of opinion dynamics which accounts for the coexistence of media and gossip as separate but interdependent mechanisms influencing the evolution of opinions. Individuals may change their opinions under the simultaneous pressure of the information supplied by the media and the opinions of their social contacts. We stress the effect of the media ...
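    The media-plus-gossip mechanism described above can be sketched as a toy agent-based model. This is not the paper's formulation: agent count, update rule, and all weights below are assumptions chosen only to show the two pressures acting together.

    ```python
    import random

    def run_debate(n_agents=100, media_opinion=0.8, media_weight=0.3,
                   gossip_weight=0.3, steps=200, seed=1):
        """Each step, every agent moves partway toward the media signal
        (media pressure) and toward a random contact's opinion (gossip)."""
        rng = random.Random(seed)
        opinions = [rng.random() for _ in range(n_agents)]  # opinions in [0, 1]
        for _ in range(steps):
            for i in range(n_agents):
                peer = opinions[rng.randrange(n_agents)]    # gossip partner
                opinions[i] += media_weight * (media_opinion - opinions[i])
                opinions[i] += gossip_weight * (peer - opinions[i])
        return sum(opinions) / n_agents

    mean_opinion = run_debate()
    # With a persistent media signal, the population mean drifts toward it.
    ```

    Varying `media_weight` against `gossip_weight` in such a sketch is one way to explore when peer influence can sustain opinions that diverge from the media signal.
    
    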

  3. MoCog1: A computer simulation of recognition-primed human decision making, considering emotions

    Science.gov (United States)

    Gevarter, William B.

    1992-01-01

    The successful results of the first stage of a research effort to develop a versatile computer model of motivated human cognitive behavior are reported. Most human decision making appears to be an experience-based, relatively straightforward, largely automatic response to situations, utilizing cues and opportunities perceived from the current environment. The development, considering emotions, of the architecture and computer program associated with such 'recognition-primed' decision-making is described. The resultant computer program (MoCog1) was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.

  4. Exploring Effective Decision Making through Human-Centered and Computational Intelligence Methods

    Energy Technology Data Exchange (ETDEWEB)

    Han, Kyungsik; Cook, Kristin A.; Shih, Patrick C.

    2016-06-13

    Decision-making has long been studied to understand the psychological, cognitive, and social process of selecting an effective choice from alternative options. Such studies have been extended from the personal level to the group and collaborative level, and many computer-aided decision-making systems have been developed to help people make the right decisions. There has been significant research growth in the computational aspects of decision-making systems, yet comparatively little effort has gone into identifying and articulating user needs and requirements in assessing system outputs, and the extent to which human judgments could be utilized for making accurate and reliable decisions. Our research focus is decision-making through human-centered and computational intelligence methods in a collaborative environment, and the objectives of this position paper are to bring our research ideas to the workshop and to share and discuss them.

  5. Human-Computer Interaction and Operators' Performance Optimizing Work Design with Activity Theory

    CERN Document Server

    Bedny, Gregory Z

    2010-01-01

    Directed to a broad and interdisciplinary audience, this book provides a complete account of what has been accomplished in applied and systemic-structural activity theory. It presents a new approach to applied psychology and the study of human work that has derived from activity theory. The selected articles demonstrate the basic principles of studying human work, and particularly computer-based work, in complex sociotechnical systems. The book includes examples of applying systemic-structural activity theory to HCI and man-machine systems, aviation, safety, design and optimization of human p...

  6. [The human body and the computer as pedagogic tools for anatomy: review of the literature].

    Science.gov (United States)

    Captier, G; Canovas, F; Bonnel, F

    2005-09-01

    Since the first dissections, the human body has been the main tool for the teaching of anatomy in medical courses. For the last 30 years, university anatomy laboratory dissection has been brought into question and the total hours of anatomy teaching have decreased. In parallel, new technologies have progressed and become more competitive and more attractive than dissection. The aim of this review of the literature was to evaluate the use of the human body as a pedagogic tool compared with today's computer tools. Twenty comparative studies were reviewed. Their analysis showed that the human body remains the main tool in anatomy teaching, even if anatomic demonstration (prosection) can replace dissection, and that computer tools are complementary to, but not a substitute for, dissection.

  7. Dry needling versus acupuncture: the ongoing debate.

    Science.gov (United States)

    Zhou, Kehua; Ma, Yan; Brogan, Michael S

    2015-12-01

    Although Western medical acupuncture (WMA) is commonly practised in the UK, a particular approach called dry needling (DN) is becoming increasingly popular in other countries. The legitimacy of the use of DN by conventional non-physician healthcare professionals is questioned by acupuncturists. This article describes the ongoing debate over the practice of DN between physical therapists and acupuncturists, with a particular emphasis on the USA. DN and acupuncture share many similarities but may differ in certain aspects. Currently, little information is available from the literature regarding the relationship between the two needling techniques. Through reviewing their origins, theory, and practice, we found that DN and acupuncture overlap in terms of needling technique with solid filiform needles as well as some fundamental theories. Both WMA and DN are based on modern biomedical understandings of the human body, although DN arguably represents only one subcategory of WMA. The increasing volume of research into needling therapy explains its growing popularity in the musculoskeletal field including sports medicine. To resolve the debate over DN practice, we call for the establishment of a regulatory body to accredit DN courses and a formal, comprehensive educational component and training for healthcare professionals who are not physicians or acupuncturists. Because of the close relationship between DN and acupuncture, collaboration rather than dispute between acupuncturists and other healthcare professionals should be encouraged with respect to education, research, and practice for the benefit of patients with musculoskeletal conditions who require needling therapy.

  8. Debatable Premises in Telecom Policy

    DEFF Research Database (Denmark)

    HURWITZ, Justin; Layton, Roslyn

    2014-01-01

    Around the world, telecommunications policy is one of the most important areas of public policy. The modern economy is driven by telecom technologies, and many telecom-related firms – Google, Apple, Facebook, and myriad fixed and mobile Internet service providers – are among the largest companies... in the world. The Internet is opening up new platforms for business, education, government, and civic engagement. It has literally been a driving force in toppling governments. Telecommunications policy is important to every government in the world, and debates over what policies should be implemented...

  9. The globalization debate: The skeptics

    Directory of Open Access Journals (Sweden)

    Tadić Tadija

    2006-01-01

    Full Text Available A devastating criticism of the "hard core" argumentation, stemming from skeptical authors, has strongly challenged the enthusiasm noticeable in most theoretical analyses of globalization, bringing to light many "darker sides" of the globalization phenomenon. A detailed critical re-examination of globalists' often unrealistic assumptions has presented a very serious challenge to them and has made room for the rise of the so-called "great globalization debate", which has over time begun to shape the mainstream of contemporary social philosophy. In this paper we look closely into the way in which skeptics carry out their devastating criticism of the globalists' argumentation.

  10. Debating the viability of ethnicity

    Directory of Open Access Journals (Sweden)

    Vilna Bashi

    2004-01-01

    Full Text Available [First paragraph] Immigration and the Political Economy of Home: West Indian Brooklyn and American Indian Minneapolis, 1945-1992. RACHEL BUFF. Berkeley: University of California Press, 2001. xv + 240 pp. (Paper US$ 18.95) Black Cuban, Black American: A Memoir. EVELIO GRILLO. Houston TX: Arte Público Press, 2000. xvi + 134 pp. (Paper US$ 13.95) West Indian in the West: Self Representations in an Immigrant Community. PERCY C. HINTZEN. New York: New York University Press, 2001. x + 200 pp. (Paper US$ 18.50) Caribbean Families in Britain and the Transatlantic World. HARRY GOULBOURNE & MARY CHAMBERLAIN (eds). Oxford UK: Macmillan, 2001. xvi + 270 pp. (Paper £15.50) Legacies: The Story of the Immigrant Second Generation. ALEJANDRO PORTES & RUBÉN G. RUMBAUT. Berkeley: University of California Press / New York: Russell Sage Foundation, 2001. xxiv + 406 pp. (Paper US$ 19.95) "Ethnicity" and its meaning, both as an identity and as a resilient cultural influence, has dominated late twentieth-century social scientific analyses of the process of immigrant incorporation. Perhaps we may mark the crowning of the term with the publication of Glazer and Moynihan's The Melting Pot, one famous tome that "explained" varying "assimilation" outcomes among the "new" (post-1965) newcomers by examining their ethnic culture for flaws or strengths that justified socioeconomic failure or success. Muddying the ensuing policy debate was the use of buzzwords, like mainstream, deviant, assimilated, minority, black matriarch, absent father, and underclass, that were themselves categorizing and hierarchical. The tautology of hierarchically labeling groups and then asking why groups with different labels have different outcomes seems to be perpetually invisible to the parties in the assimilation debate, but the debate itself rages on. Newer scholarship has added a different voice to that debate, arguing that variance in "assimilation" is instead explained by incorporation into...

  11. Understanding digital humanities

    CERN Document Server

    Berry, D

    2012-01-01

    The application of new computational techniques and visualisation technologies in the Arts and Humanities is resulting in fresh approaches and methodologies for the study of new and traditional corpora. This 'computational turn' takes methods and techniques from computer science to create innovative means of close and distant reading. This book discusses the implications and applications of 'Digital Humanities' and the questions raised when using algorithmic techniques. Key researchers in the field provide a comprehensive introduction to important debates surrounding issues such as th...

  12. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers at the end of November; it will take about two weeks. The Computing Shifts procedure was tested at full scale during this period and proved to be very efficient: 30 Computing Shift Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  13. The Importance of High School Debate

    Science.gov (United States)

    Hooley, Diana

    2007-01-01

    One of the most important educational objectives of high school is to teach critical-thinking skills, and no class does this better than strategic debate. Professor Mike Allen, lead author in a definitive study on debate and critical thinking, lauded debate's promotion of critical-thinking skills. Additionally, researcher Joe Bellon discusses the…

  14. Speech and Debate as Civic Education

    Science.gov (United States)

    Hogan, J. Michael; Kurr, Jeffrey A.; Johnson, Jeremy D.; Bergmaier, Michael J.

    2016-01-01

    In light of the U.S. Senate's designation of March 15, 2016 as "National Speech and Debate Education Day" (S. Res. 398, 2016), it only seems fitting that "Communication Education" devote a special section to the role of speech and debate in civic education. Speech and debate have been at the heart of the communication…

  15. The Power of In-Class Debates

    Science.gov (United States)

    Kennedy, Ruth R.

    2009-01-01

    The students in three sections of a class rated their knowledge and identified their view before and after each of five in-class debates. The degree of self-reported knowledge was significantly different after four of the five debates. Between 31% and 58% of participants changed their views after participating in or observing each debate. Some…

  16. Literacy as Social Action in City Debate

    Science.gov (United States)

    Cridland-Hughes, Susan

    2012-01-01

    This study examines critical literacy and the intersections of oral, aural, written, and performative literate practices in City Debate, an afterschool program dedicated to providing debate instruction to students in a major Southeastern city. Previous research into definitions and beliefs about literacy in an urban debate program over its twenty…

  17. HEADING RECOVERY FROM OPTIC FLOW: COMPARING PERFORMANCE OF HUMANS AND COMPUTATIONAL MODELS

    Directory of Open Access Journals (Sweden)

    Andrew John Foulkes

    2013-06-01

    Full Text Available Human observers can perceive their direction of heading with a precision of about a degree. Several computational models of the processes underpinning the perception of heading have been proposed. In the present study we set out to assess which of four candidate models best captured human performance; the four models we selected reflected key differences in terms of approach and methods to modelling optic flow processing to recover movement parameters. We first generated a performance profile for human observers by measuring how performance changed as we systematically manipulated both the quantity (number of dots in the stimulus per frame) and quality (amount of 2D directional noise) of the flow field information. We then generated comparable performance profiles for the four candidate models. Models varied markedly in terms of both their performance and similarity to the human data. To formally assess the match between the models and human performance we regressed the output of each of the four models against the human performance data. We were able to rule out two models that produced very different performance profiles to human observers. The remaining two shared some similarities with the human performance profiles in terms of the magnitude and pattern of thresholds. However, none of the models tested could capture all aspects of the human data.
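
    The regression step described in this abstract, fitting each model's performance profile to the human profile and comparing goodness of fit, can be sketched as below. The threshold numbers and model names are hypothetical, purely for illustration:

```python
import numpy as np

def fit_to_human(human, model):
    """Least-squares fit of a model's performance profile to the human one.

    Returns (slope, intercept, r_squared); a model whose profile is merely a
    rescaling of the human profile scores r_squared near 1.
    """
    human = np.asarray(human, dtype=float)
    model = np.asarray(model, dtype=float)
    slope, intercept = np.polyfit(model, human, 1)
    pred = slope * model + intercept
    ss_res = np.sum((human - pred) ** 2)
    ss_tot = np.sum((human - human.mean()) ** 2)
    return slope, intercept, 1.0 - ss_res / ss_tot

# Hypothetical heading-error thresholds (degrees) as dot count increases.
human   = [4.1, 2.6, 1.8, 1.3, 1.1]
model_a = [8.0, 5.1, 3.4, 2.5, 2.0]   # similar shape to the human profile
model_b = [2.0, 2.1, 1.9, 2.0, 2.1]   # flat profile, unlike the human one
r2_a = fit_to_human(human, model_a)[2]
r2_b = fit_to_human(human, model_b)[2]
```

    Ranking candidate models by such an R² against the human data would rule out model_b here, mirroring the elimination of the two dissimilar models in the study.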

  18. Computation of electrostatic fields in anisotropic human tissues using the Finite Integration Technique (FIT)

    Science.gov (United States)

    Motresc, V. C.; van Rienen, U.

    2004-05-01

    The exposure of the human body to electromagnetic fields has in recent years become a matter of great interest for scientists working in the areas of biology and biomedicine. Due to the difficulty of performing measurements, accurate models of the human body, in the form of a computer data set, are used to compute the fields inside the body by employing numerical methods such as the one used for our calculations, namely the Finite Integration Technique (FIT). A fact that has to be taken into account when computing electromagnetic fields in the human body is that some tissue classes, e.g. cardiac and skeletal muscles, have higher electrical conductivity and permittivity along fibers than across them. This property leads to diagonal conductivity and permittivity tensors only when they are expressed in a local coordinate system; in a global coordinate system they become full tensors. The Finite Integration Technique (FIT) in its classical form can handle diagonally anisotropic materials quite effectively, but it needed an extension for handling fully anisotropic materials. New electric voltages were placed on the grid and a new method of averaging conductivity and permittivity on the grid was found. In this paper, we present results from electrostatic computations performed with the extended version of FIT for fully anisotropic materials.
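
    The local-to-global transformation this abstract refers to can be illustrated with a short sketch: a conductivity tensor that is diagonal in a fiber-aligned frame becomes a full symmetric tensor after rotation into global coordinates (sigma_global = R sigma_local R^T). The conductivity values below are illustrative, not taken from the paper:

```python
import numpy as np

def global_conductivity(sigma_local_diag, fiber_direction):
    """Rotate a fiber-aligned diagonal conductivity tensor into global coordinates.

    sigma_local_diag: (s_along, s_across, s_across) conductivities in the
    local fiber frame; fiber_direction: vector along the fiber axis.
    """
    e1 = np.asarray(fiber_direction, dtype=float)
    e1 /= np.linalg.norm(e1)
    # Build an orthonormal basis whose first axis is the fiber direction.
    helper = np.array([0.0, 0.0, 1.0]) if abs(e1[2]) < 0.9 else np.array([1.0, 0.0, 0.0])
    e2 = np.cross(e1, helper); e2 /= np.linalg.norm(e2)
    e3 = np.cross(e1, e2)
    R = np.column_stack([e1, e2, e3])          # local -> global rotation
    sigma_local = np.diag(sigma_local_diag)
    return R @ sigma_local @ R.T               # full symmetric tensor in general

# Muscle-like case: higher conductivity along fibers than across them
# (illustrative values in S/m, fiber running diagonally in the x-y plane).
sigma = global_conductivity((0.40, 0.09, 0.09), fiber_direction=(1.0, 1.0, 0.0))
```

    The off-diagonal entries of the result are exactly what forces the extension of FIT beyond diagonally anisotropic materials.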

  20. Towards passive brain-computer interfaces: applying brain-computer interface technology to human-machine systems in general

    Science.gov (United States)

    Zander, Thorsten O.; Kothe, Christian

    2011-04-01

    Cognitive monitoring is an approach utilizing real-time brain signal decoding (RBSD) to gain information on the ongoing cognitive state of the user. In recent decades this approach has brought valuable insight into the cognition of an interacting human. Automated RBSD can be used to set up a brain-computer interface (BCI) providing a novel input modality for technical systems based solely on brain activity. In BCIs the user usually sends voluntary and directed commands to control the connected computer system or to communicate through it. In this paper we propose an extension of this approach that fuses BCI technology with cognitive monitoring, providing valuable information about the user's intentions, situational interpretations and emotional states to the technical system. We call this approach passive BCI. In the following we give an overview of studies which utilize passive BCI, as well as other novel types of applications resulting from BCI technology. We especially focus on applications for healthy users, and on the specific requirements and demands of this user group. Since the presented approach of combining cognitive monitoring with BCI technology is very similar to the concept of BCIs itself, we propose a unifying categorization of BCI-based applications, including the novel approach of passive BCI.

  1. The use of analytical models in human-computer interface design

    Science.gov (United States)

    Gugerty, Leo

    1993-01-01

    Recently, a large number of human-computer interface (HCI) researchers have investigated building analytical models of the user, which are often implemented as computer models. These models simulate the cognitive processes and task knowledge of the user in ways that allow a researcher or designer to estimate various aspects of an interface's usability, such as when user errors are likely to occur. This information can lead to design improvements. Analytical models can supplement design guidelines by providing designers rigorous ways of analyzing the information-processing requirements of specific tasks (i.e., task analysis). These models offer the potential of improving early designs and replacing some of the early phases of usability testing, thus reducing the cost of interface design. This paper describes some of the many analytical models that are currently being developed and evaluates the usefulness of analytical models for human-computer interface design. This paper will focus on computational, analytical models, such as the GOMS model, rather than less formal, verbal models, because the more exact predictions and task descriptions of computational models may be useful to designers. The paper also discusses some of the practical requirements for using analytical models in complex design organizations such as NASA.
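
    As an illustration of the kind of analytical model discussed here, below is a minimal sketch of the keystroke-level variant of the GOMS family. The operator times are the values commonly cited from Card, Moran and Newell; treat both the times and the function name as illustrative assumptions, not the paper's own implementation:

```python
# Keystroke-Level Model (KLM) sketch: predict task execution time by
# summing per-operator times for a sequence of primitive operators.
KLM_SECONDS = {
    "K": 0.28,  # keystroke (average skilled typist)
    "P": 1.10,  # point with mouse at a target
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
    "B": 0.10,  # mouse button press or release
}

def klm_estimate(operators: str) -> float:
    """Return the predicted execution time (s) for a sequence like 'MHPBB'."""
    return round(sum(KLM_SECONDS[op] for op in operators), 2)

# Example: mentally prepare, home on the mouse, point at a menu item,
# click it (press + release), then type a three-letter command.
t = klm_estimate("MHPBB" + "K" * 3)
```

    Even this crude model lets a designer compare two candidate interaction sequences before any usability testing, which is exactly the cost-saving role the paper attributes to analytical models.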

  2. Application of high-performance computing to numerical simulation of human movement

    Science.gov (United States)

    Anderson, F. C.; Ziegler, J. M.; Pandy, M. G.; Whalen, R. T.

    1995-01-01

    We have examined the feasibility of using massively-parallel and vector-processing supercomputers to solve large-scale optimization problems for human movement. Specifically, we compared the computational expense of determining the optimal controls for the single support phase of gait using a conventional serial machine (SGI Iris 4D25), a MIMD parallel machine (Intel iPSC/860), and a parallel-vector-processing machine (Cray Y-MP 8/864). With the human body modeled as a 14 degree-of-freedom linkage actuated by 46 musculotendinous units, computation of the optimal controls for gait could take up to 3 months of CPU time on the Iris. Both the Cray and the Intel are able to reduce this time to practical levels. The optimal solution for gait can be found with about 77 hours of CPU on the Cray and with about 88 hours of CPU on the Intel. Although the overall speeds of the Cray and the Intel were found to be similar, the unique capabilities of each machine are better suited to different portions of the computational algorithm used. The Intel was best suited to computing the derivatives of the performance criterion and the constraints whereas the Cray was best suited to parameter optimization of the controls. These results suggest that the ideal computer architecture for solving very large-scale optimal control problems is a hybrid system in which a vector-processing machine is integrated into the communication network of a MIMD parallel machine.
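
    The parallel structure exploited in such studies, many independent evaluations of the performance criterion to form finite-difference derivatives, can be sketched as follows. A thread pool and a toy quadratic criterion stand in here for the message-passing workers and the forward-dynamics simulation of the actual study; all names are illustrative:

```python
from concurrent.futures import ThreadPoolExecutor

import numpy as np

def fd_gradient(J, x, h=1e-6, workers=4):
    """Central-difference gradient of a scalar criterion J at x.

    Each partial derivative needs two independent evaluations of J, so the
    2*n evaluations form an embarrassingly parallel workload that can be
    farmed out to a pool of workers.
    """
    n = len(x)
    points = []
    for i in range(n):
        e = np.zeros(n); e[i] = h
        points.append(x + e)
        points.append(x - e)
    with ThreadPoolExecutor(max_workers=workers) as pool:
        vals = list(pool.map(J, points))
    return np.array([(vals[2 * i] - vals[2 * i + 1]) / (2 * h) for i in range(n)])

# Stand-in criterion (the real one would be a musculoskeletal simulation).
J = lambda u: float(np.sum(u ** 2))
g = fd_gradient(J, np.array([1.0, -2.0, 0.5]))
```

    For a CPU-bound criterion one would use a process pool or MPI rather than threads; the distribution pattern is the same.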

  3. A Review on the Computational Methods for Emotional State Estimation from the Human EEG

    Science.gov (United States)

    Kim, Min-Ki; Kim, Miyoung; Oh, Eunmi

    2013-01-01

    A growing number of affective computing studies have recently developed computer systems that can recognize the emotional state of the human user in order to establish affective human-computer interactions. Various measures have been used to estimate emotional states, including self-report, startle response, behavioral response, autonomic measurement, and neurophysiologic measurement. Among them, inferring emotional states from electroencephalography (EEG) has received considerable attention, as EEG can directly reflect emotional states with relatively low cost and complexity. Yet EEG-based emotional state estimation requires well-designed computational methods to extract information from complex and noisy multichannel EEG data. In this paper, we review the computational methods that have been developed to derive EEG indices of emotion, to extract emotion-related features, or to classify EEG signals into one of many emotional states. We also propose using sequential Bayesian inference to estimate the continuous emotional state in real time. We present current challenges for building an EEG-based emotion recognition system and suggest some future directions.   PMID:23634176
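
    The sequential Bayesian inference proposed in this review can be illustrated with a minimal one-dimensional Gaussian (Kalman-style) filter over a latent emotional state. The random-walk state model, noise parameters, and simulated "valence" signal below are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def kalman_update(mean, var, z, q=0.05, r=0.5):
    """One sequential Bayesian step for a scalar latent emotional state.

    Random-walk state model with process noise q; z is a noisy EEG-derived
    observation with observation noise r. Returns the posterior (mean, var).
    """
    # Predict: the latent state may drift between samples.
    var_pred = var + q
    # Update: fuse the prediction with the new observation.
    k = var_pred / (var_pred + r)          # Kalman gain
    mean_post = mean + k * (z - mean)
    var_post = (1.0 - k) * var_pred
    return mean_post, var_post

# Track a simulated valence index from a stream of noisy observations.
rng = np.random.default_rng(0)
true_valence = 0.8
mean, var = 0.0, 1.0                       # broad prior
for _ in range(200):
    z = true_valence + rng.normal(0.0, 0.7)
    mean, var = kalman_update(mean, var, z)
```

    The appeal for real-time use is that each EEG sample triggers one constant-time update, so the posterior over the continuous emotional state is always current.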

  4. Automatic procedure for realistic 3D finite element modelling of human brain for bioelectromagnetic computations

    Energy Technology Data Exchange (ETDEWEB)

    Aristovich, K Y; Khan, S H, E-mail: kirill.aristovich.1@city.ac.u [School of Engineering and Mathematical Sciences, City University London, Northampton Square, London EC1V 0HB (United Kingdom)

    2010-07-01

    Realistic computer modelling of biological objects requires building very accurate and realistic computer models based on geometric and material data and on the type and accuracy of the numerical analyses. This paper presents some of the automatic tools and algorithms that were used to build an accurate and realistic 3D finite element (FE) model of the whole brain. These models were used to solve the forward problem in magnetic field tomography (MFT) based on magnetoencephalography (MEG). The forward problem involves modelling and computation of the magnetic fields produced by the human brain during cognitive processing. The geometric parameters of the model were obtained from accurate Magnetic Resonance Imaging (MRI) data and the material properties from Diffusion Tensor MRI (DTMRI). The 3D FE models of the brain built using this approach have been shown to be very accurate in terms of both geometric and material properties. The model is stored on the computer in Computer-Aided Parametrical Design (CAD) format. This allows the model to be used with a wide range of methods of analysis, such as the finite element method (FEM), the boundary element method (BEM), Monte Carlo simulations, etc. The generic model-building approach presented here could be used for accurate and realistic modelling of the human brain and many other biological objects.

  5. Soviet debate on missile defense

    Energy Technology Data Exchange (ETDEWEB)

    Parrott, B.

    1987-04-01

    Although the Strategic Defense Initiative (SDI) is meant to cope with the danger of a Soviet nuclear attack, the recent US debate over SDI has paid surprisingly little attention to Soviet views of ballistic missile defense. Despite the existence of a substantial body of pertinent scholarship, the debate has failed to take adequate account of major changes in Soviet ballistic missile defense policy since the mid-1960s. It has also neglected the links between current Soviet military policy and broader Soviet political and economic choices. The Soviets regard SDI not as a novel undertaking to reduce the risks of nuclear war but as an extension of the geopolitical competition between the superpowers. This competition has been dominated in the 1980s, in the Soviet view, by sharply increased US assertiveness and the decline of detente. Viewing SDI as a manifestation of these general trends, Soviet decision makers find the prospect of an unregulated race in ballistic missile defenses and military space technologies deeply unsettling. The deterioration of superpower relations has raised serious doubts in Moscow about the wisdom of Soviet external policy during the 1970s and has provoked sharp internal differences over policy toward the US. Already highly suspicious of the Reagan administration, the elite is united by a general conviction that SDI is an American gambit that may ultimately undercut past Soviet strategic gains and pose a grave new threat to Soviet security. 14 references.

  6. [The debate over drug legalization].

    Science.gov (United States)

    Babín Vich, Francisco de Asís

    2013-01-01

    The debate over drug legalization appears frequently in the media as a potential solution to issues such as drug trafficking and other problems related to drug use. In Spain, private consumption, and even the production of small quantities of certain plants whose active ingredients are considered illegal drugs, are not practices criminalized by any law, provided they are clearly for personal consumption. In addition, a drug addict is considered a person who is ill. It has not always been like that, even in the countries that have called for this debate, where at times the law has prosecuted consumers. The population of our country, according to the views expressed in opinion polls, prefers to increase preventive measures, to foster treatment freely entered into by drug addicts, and to tighten the repression of drug trafficking. Therefore, when speaking of "legalization" we should be scrupulous with the semantics: to legalize and to decriminalize are not the same thing, decriminalizing consumption is not the same as decriminalizing trafficking, and decriminalizing private consumption is not the same as decriminalizing public consumption. Decriminalized private consumption is a fact in our country. Beyond this, we advocate the strict need to analyze, from a scientific perspective, the hypothetical benefits that would result from drug legalization. Certainly, from the public health perspective, they are hard to find. We believe that the same logic applied to tobacco, increasing the restrictions on its use, is the path to follow with any addictive substance.

  7. A computational predictor of human episodic memory based on a theta phase precession network.

    Directory of Open Access Journals (Sweden)

    Naoyuki Sato

    Full Text Available In the rodent hippocampus, a phase precession phenomenon of place cell firing relative to the local field potential (LFP) theta rhythm is called "theta phase precession" and is considered to contribute to memory formation through spike-timing-dependent plasticity (STDP). In the primate hippocampus, on the other hand, the existence of theta phase precession is unclear. Our computational studies have demonstrated that theta phase precession dynamics could contribute to primate-hippocampus-dependent memory formation, such as object-place association memory. In this paper, we evaluate human theta phase precession by using a combined theory-experiment analysis. Human memory recall of object-place associations was analyzed by an individual hippocampal network simulated by theta phase precession dynamics of human eye movement and EEG data during memory encoding. It was found that the computational recall of the resultant network is significantly correlated with human memory recall performance, while other computational predictors without theta phase precession are not significantly correlated with subsequent memory recall. Moreover, the correlation is larger than the correlation between human recall and traditional experimental predictors. These results indicate that theta phase precession dynamics are necessary for better prediction of human recall performance from eye movement and EEG data. In this analysis, theta phase precession dynamics appear useful for the extraction of memory-dependent components from the spatio-temporal pattern of eye movement and EEG data as an associative network. Theta phase precession may be a neural dynamic common to rodents and humans for the formation of environmental memories.

  8. El nacimiento del Caníbal: un debate conceptual.

    Directory of Open Access Journals (Sweden)

    Yobenj Aucardo Chicangana-Bayona

    2008-12-01

    Full Text Available This article offers a theoretical debate regarding the origins of cannibalism based on the writings of Christopher Columbus. The manner in which the cannibal was constructed is of vital importance to understanding the arguments later used not only to justify the conquest of the Caribbean but also to legitimize the stigmatization of Amerindian groups accused of consuming human flesh.

  9. Issues in Sociobiology: The Nature vs. Nurture Debate.

    Science.gov (United States)

    Lorenzen, Eric

    2001-01-01

    Explains the two theories on the origins of human and animal behavior. Introduces the new discipline of sociobiology, a merging of biology and sociology. Describes the central dogma of sociobiology and its societal implications, and discusses criticism of sociobiology. Presents the nature vs. nurture debate. (YDS)

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created to support the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, which collects user experience and feedback during analysis activities and develops tools to increase efficiency. The development plan for DMWM for 2009/2011 was drawn up at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  12. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year's; this results in longer reconstruction times and events that are harder to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load are close to the planning predictions. All computing centre tiers performed their expected functions. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference, where a large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  13. COMPUTING

    CERN Multimedia

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  14. Cross-cultural human-computer interaction and user experience design a semiotic perspective

    CERN Document Server

    Brejcha, Jan

    2015-01-01

    This book describes patterns of language and culture in human-computer interaction (HCI). Through numerous examples, it shows why these patterns matter and how to exploit them to design a better user experience (UX) with computer systems. It provides scientific information on the theoretical and practical areas of the interaction and communication design for research experts and industry practitioners and covers the latest research in semiotics and cultural studies, bringing a set of tools and methods to benefit the process of designing with the cultural background in mind.

  15. Portable tongue-supported human computer interaction system design and implementation.

    Science.gov (United States)

    Quain, Rohan; Khan, Masood Mehmood

    2014-01-01

    Tongue supported human-computer interaction (TSHCI) systems can help critically ill patients interact with both computers and people. These systems can be particularly useful for patients suffering injuries above C7 on their spinal vertebrae. Despite recent successes in their application, several limitations restrict performance of existing TSHCI systems and discourage their use in real life situations. This paper proposes a low-cost, less-intrusive, portable and easy to use design for implementing a TSHCI system. Two applications of the proposed system are reported. Design considerations and performance of the proposed system are also presented.

  16. The ethics of algorithms: Mapping the debate

    Directory of Open Access Journals (Sweden)

    Brent Daniel Mittelstadt

    2016-11-01

    Full Text Available In information societies, operations, decisions and choices previously left to humans are increasingly delegated to algorithms, which may advise, if not decide, about how data should be interpreted and what actions should be taken as a result. More and more often, algorithms mediate social processes, business transactions, governmental decisions, and how we perceive, understand, and interact among ourselves and with the environment. Gaps between the design and operation of algorithms and our understanding of their ethical implications can have severe consequences affecting individuals as well as groups and whole societies. This paper makes three contributions to clarify the ethical importance of algorithmic mediation. It provides a prescriptive map to organise the debate. It reviews the current discussion of ethical aspects of algorithms. And it assesses the available literature in order to identify areas requiring further work to develop the ethics of algorithms.

  17. Conversational flow in Oxford-style debates

    CERN Document Server

    Zhang, Justine; Ravi, Sujith; Danescu-Niculescu-Mizil, Cristian

    2016-01-01

    Public debates are a common platform for presenting and juxtaposing diverging views on important issues. In this work we propose a methodology for tracking how ideas flow between participants throughout a debate. We use this approach in a case study of Oxford-style debates---a competitive format where the winner is determined by audience votes---and show how the outcome of a debate depends on aspects of conversational flow. In particular, we find that winners tend to make better use of a debate's interactive component than losers, by actively pursuing their opponents' points rather than promoting their own ideas over the course of the conversation.
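The finding above (winners pursue their opponents' points rather than their own) can be made concrete with a toy measure. This is purely illustrative and not the authors' actual methodology: score each turn by how much of its content overlaps the opponent's previous turn versus the speaker's own.

```python
# Illustrative sketch (not the paper's method): score each debate turn by
# whether it engages more with the opponent's last turn than with the
# speaker's own, using simple content-word overlap.

STOPWORDS = {"the", "a", "an", "of", "to", "and", "is", "in", "that", "we"}

def content_words(turn):
    """Lowercase content words of a turn, stopwords removed."""
    return {w for w in turn.lower().split() if w not in STOPWORDS}

def engagement_score(turn, opponent_prev, own_prev):
    """Positive when the turn overlaps more with the opponent's previous
    turn than with the speaker's own previous turn."""
    words = content_words(turn)
    if not words:
        return 0.0
    opp = len(words & content_words(opponent_prev)) / len(words)
    own = len(words & content_words(own_prev)) / len(words)
    return opp - own

score = engagement_score(
    "your tax proposal ignores inflation entirely",
    "our tax proposal funds schools without raising inflation",
    "we believe schools need local control",
)
```

A positive score flags a turn that actively pursues the opponent's point, the behavior the study associates with winning sides.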

  18. Is the corticomedullary index valid to distinguish human from nonhuman bones: a multislice computed tomography study.

    Science.gov (United States)

    Rérolle, Camille; Saint-Martin, Pauline; Dedouit, Fabrice; Rousseau, Hervé; Telmon, Norbert

    2013-09-10

    The first step in the identification process of bone remains is to determine whether they are of human or nonhuman origin. This issue may arise when only a fragment of bone is available, as the species of origin is usually easily determined on a complete bone. The present study aims to assess the validity of a morphometric method used by French forensic anthropologists to determine the species of origin: the corticomedullary index (CMI), defined by the ratio of the diameter of the medullary cavity to the total diameter of the bone. We studied the constancy of the CMI from measurements made on computed tomography images (CT scans) of different human bones, and compared our measurements with reference values selected in the literature. The measurements obtained on CT scans at three different sites of 30 human femurs, 24 tibias, and 24 fibulas were compared between themselves and with the CMI reference values for humans, pigs, dogs and sheep. Our results differed significantly from these reference values, with three exceptions: the proximal quarter of the femur and mid-fibular measurements for the human CMI, and the proximal quarter of the tibia for the sheep CMI. Mid-tibial, mid-femoral, and mid-fibular measurements also differed significantly between themselves. Only 22.6% of CT scans of human bones were correctly identified as human. We concluded that the CMI is not an effective method for determining the human origin of bone remains.
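The CMI described above is a simple ratio, which can be sketched directly. The reference intervals below are placeholders for illustration only, not the published values the study tested against; overlapping intervals are one reason a single ratio discriminates species poorly.

```python
# Sketch of the corticomedullary index (CMI): the ratio of the medullary
# cavity diameter to the total bone diameter at the same measurement site.
# Reference intervals here are invented placeholders, not published values.

def cmi(medullary_diameter_mm, total_diameter_mm):
    """CMI = medullary cavity diameter / total bone diameter."""
    if total_diameter_mm <= 0 or medullary_diameter_mm > total_diameter_mm:
        raise ValueError("diameters must satisfy 0 < medullary <= total")
    return medullary_diameter_mm / total_diameter_mm

# Placeholder reference intervals (illustrative only).
REFERENCE_INTERVALS = {
    "human_mid_femur": (0.40, 0.60),
    "sheep_mid_tibia": (0.25, 0.40),
}

def matching_species(index):
    """All reference intervals containing the measured index; when several
    match, the CMI alone cannot settle the species question."""
    return [k for k, (lo, hi) in REFERENCE_INTERVALS.items() if lo <= index <= hi]

value = cmi(16.0, 32.0)   # medullary 16 mm, total 32 mm
```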

  19. Human Factors Principles in Design of Computer-Mediated Visualization for Robot Missions

    Energy Technology Data Exchange (ETDEWEB)

    David I Gertman; David J Bruemmer

    2008-12-01

    With increased use of robots as a resource in missions supporting countermine, improvised explosive devices (IEDs), and chemical, biological, radiological nuclear and conventional explosives (CBRNE), fully understanding the best means by which to complement the human operator’s underlying perceptual and cognitive processes could not be more important. Consistent with control and display integration practices in many other high technology computer-supported applications, current robotic design practices rely highly upon static guidelines and design heuristics that reflect the expertise and experience of the individual designer. In order to use what we know about human factors (HF) to drive human robot interaction (HRI) design, this paper reviews underlying human perception and cognition principles and shows how they were applied to a threat detection domain.

  20. Toward a Theory of Vice Presidential Debate Purposes: An Analysis of the 1992 Vice Presidential Debate.

    Science.gov (United States)

    Carlin, Diana B.; Bicak, Peter J.

    1993-01-01

    Describes why televised vice presidential debates are worthy of more sustained study. Identifies five purposes of vice presidential debates. Examines critically the 1992 vice presidential debate in light of these purposes. Considers the debate format's effects and the argument strategies of the participants. (HB)

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources such as the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 TB. Figure 3: Number of events per month (data) In LS1, our emphasis is on increasing the efficiency and flexibility of the infrastructure and operations. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  2. Computational Characterization of Exogenous MicroRNAs that Can Be Transferred into Human Circulation.

    Directory of Open Access Journals (Sweden)

    Jiang Shu

    Full Text Available MicroRNAs have long been considered to be synthesized endogenously, until very recent discoveries showed that humans can absorb dietary microRNAs of animal and plant origin, although the mechanism remains unknown. Compelling evidence of microRNAs from rice, milk, and honeysuckle being transported into human blood and tissues has created a high volume of interest in the fundamental questions of which exogenous microRNAs can be transferred into human circulation, how, and whether they can exert functions in humans. Here we present an integrated genomics and computational analysis to study the potential deciding features of transportable microRNAs. Specifically, we analyzed all publicly available microRNAs, a total of 34,612 from 194 species, with 1,102 features derived from the microRNA sequence and structure. Through in-depth bioinformatics analysis, 8 groups of discriminative features have been used to characterize human circulating microRNAs and infer the likelihood that a microRNA will be transferred into human circulation. For example, 345 dietary microRNAs have been predicted as highly transportable candidates; 117 of them have sequences identical to their human homologs, and 73 are known to be associated with exosomes. Through a milk-feeding experiment, we validated 9 cow-milk microRNAs in human plasma using microRNA-sequencing analysis, including top-ranked microRNAs such as bta-miR-487b, miR-181b, and miR-421. The implications for health-related processes are illustrated in the functional analysis. This work demonstrates that data-driven computational analysis is highly promising for studying the molecular characteristics of transportable microRNAs while bypassing complex mechanistic details.

  3. Advancements in remote physiological measurement and applications in human-computer interaction

    Science.gov (United States)

    McDuff, Daniel

    2017-04-01

    Physiological signals are important for tracking health and emotional states. Imaging photoplethysmography (iPPG) is a set of techniques for remotely recovering cardio-pulmonary signals from video of the human body. Advances in iPPG methods over the past decade, combined with the ubiquity of digital cameras, present the possibility of many new, low-cost applications of physiological monitoring. This talk will highlight methods for recovering physiological signals, work characterizing the impact of video parameters and hardware on these measurements, and applications of this technology in human-computer interfaces.
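A minimal iPPG-style sketch of the signal-recovery step: assuming the spatially averaged green-channel intensity of a face video is available as a 1-D trace, the pulse rate can be estimated from the dominant spectral peak in the cardiac band. The synthetic trace and all parameters below are illustrative stand-ins; a real pipeline adds face tracking, detrending, and motion robustness.

```python
import numpy as np

# Synthetic stand-in for the mean green-channel trace of 20 s of face video:
# a subtle cardiac ripple plus slow illumination drift and sensor noise.
fs = 30.0                     # camera frame rate (Hz)
t = np.arange(0, 20, 1 / fs)
true_bpm = 72.0
rng = np.random.default_rng(0)
green = (0.05 * np.sin(2 * np.pi * (true_bpm / 60) * t)   # pulse ripple
         + 0.5 * np.sin(2 * np.pi * 0.1 * t)              # illumination drift
         + 0.02 * rng.standard_normal(t.size))            # noise

def estimate_bpm(signal, fs, lo_hz=0.7, hi_hz=4.0):
    """Dominant spectral peak inside the plausible cardiac band (42-240 bpm)."""
    spectrum = np.abs(np.fft.rfft(signal - signal.mean()))
    freqs = np.fft.rfftfreq(signal.size, d=1 / fs)
    band = (freqs >= lo_hz) & (freqs <= hi_hz)
    return 60.0 * freqs[band][np.argmax(spectrum[band])]

bpm = estimate_bpm(green, fs)
```

Restricting the search to the cardiac band is what rejects the large but slow illumination drift.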

  4. Toward Scalable Trustworthy Computing Using the Human-Physiology-Immunity Metaphor

    Energy Technology Data Exchange (ETDEWEB)

    Hively, Lee M [ORNL; Sheldon, Frederick T [ORNL

    2011-01-01

    The cybersecurity landscape consists of an ad hoc patchwork of solutions. Optimal cybersecurity is difficult for various reasons: complexity, immense data and processing requirements, resource-agnostic cloud computing, practical time-space-energy constraints, inherent flaws in 'Maginot Line' defenses, and the growing number and sophistication of cyberattacks. This article defines the high-priority problems and examines the potential solution space. In that space, achieving scalable trustworthy computing and communications is possible through real-time knowledge-based decisions about cyber trust. This vision is based on the human-physiology-immunity metaphor and the human brain's ability to extract knowledge from data and information. The article outlines future steps toward scalable trustworthy systems requiring a long-term commitment to solve the well-known challenges.

  5. Human perceptual deficits as factors in computer interface test and evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Bowser, S.E.

    1992-06-01

    Issues related to testing and evaluating human computer interfaces are usually based on the machine rather than on the human portion of the computer interface. Perceptual characteristics of the expected user are rarely investigated, and interface designers ignore known population perceptual limitations. For these reasons, environmental impacts on the equipment will more likely be defined than will user perceptual characteristics. The investigation of user population characteristics is most often directed toward intellectual abilities and anthropometry. This problem is compounded by the fact that some perceptual deficits occur in certain user groups at higher rates than in the overall population. The test and evaluation community can address the issue from two primary aspects. First, assessing user characteristics should be extended to include tests of perceptual capability. Second, interface designs should use multimode information coding.

  6. Interactions among human behavior, social networks, and societal infrastructures: A Case Study in Computational Epidemiology

    Science.gov (United States)

    Barrett, Christopher L.; Bisset, Keith; Chen, Jiangzhuo; Eubank, Stephen; Lewis, Bryan; Kumar, V. S. Anil; Marathe, Madhav V.; Mortveit, Henning S.

    Human behavior, social networks, and the civil infrastructures are closely intertwined. Understanding their co-evolution is critical for designing public policies and decision support for disaster planning. For example, human behaviors and day to day activities of individuals create dense social interactions that are characteristic of modern urban societies. These dense social networks provide a perfect fabric for fast, uncontrolled disease propagation. Conversely, people’s behavior in response to public policies and their perception of how the crisis is unfolding as a result of disease outbreak can dramatically alter the normally stable social interactions. Effective planning and response strategies must take these complicated interactions into account. In this chapter, we describe a computer simulation based approach to study these issues using public health and computational epidemiology as an illustrative example. We also formulate game-theoretic and stochastic optimization problems that capture many of the problems that we study empirically.
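The simulation-based approach described above can be illustrated with a deliberately tiny SIR model on a random contact network. All parameters are arbitrary toy values; real computational-epidemiology studies use detailed synthetic populations and activity data.

```python
import random

# Toy illustration of the chapter's theme: dense contact networks provide
# the fabric for disease spread. SIR dynamics on a random static contact
# graph; every parameter below is an arbitrary demonstration value.

def simulate_sir(n=200, contacts_per_person=8, p_transmit=0.05,
                 p_recover=0.2, steps=100, seed=1):
    rng = random.Random(seed)
    # Random static contact network: each person gets a fixed contact list.
    neighbors = {i: rng.sample([j for j in range(n) if j != i],
                               contacts_per_person) for i in range(n)}
    state = {i: "S" for i in range(n)}   # Susceptible / Infected / Recovered
    state[0] = "I"                       # single index case
    for _ in range(steps):
        infections, recoveries = [], []
        for i, s in state.items():
            if s != "I":
                continue
            for j in neighbors[i]:
                if state[j] == "S" and rng.random() < p_transmit:
                    infections.append(j)
            if rng.random() < p_recover:
                recoveries.append(i)
        for j in infections:             # apply updates synchronously
            state[j] = "I"
        for i in recoveries:
            state[i] = "R"
    return sum(1 for s in state.values() if s != "S")  # final outbreak size

size = simulate_sir()
```

Behavioral response to an outbreak, the coupling the chapter emphasizes, would enter here as a change to `neighbors` or `p_transmit` during the run.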

  7. AFFECTIVE AND EMOTIONAL ASPECTS OF HUMAN-COMPUTER INTERACTION: Game-Based and Innovative Learning Approaches

    Directory of Open Access Journals (Sweden)

    A. Askim GULUMBAY, Anadolu University, TURKEY

    2006-07-01

    Full Text Available This book was edited by Maja Pivec, an educator at the University of Applied Sciences, and published by IOS Press in 2006. The learning process can be seen as an emotional and personal experience that is addictive and leads learners to proactive behavior. New research methods in this field are related to affective and emotional approaches to computer-supported learning and human-computer interaction. Bringing together scientists and research aspects from psychology, educational sciences, cognitive sciences, various aspects of communication and human-computer interaction, interface design and computer science on the one hand, and educators and the game industry on the other, should open gates to evolutionary changes in the learning industry. The major topics discussed are emotions, motivation, games and game experience.

  8. Computer

    CERN Document Server

    Atkinson, Paul

    2011-01-01

    The pixelated rectangle we spend most of our day staring at in silence is not the television, as many long feared, but the computer: the ubiquitous portal of our work and personal lives. At this point, the computer is so common we hardly notice it in our view. It's difficult to envision that, not that long ago, it was a gigantic, room-sized structure accessible only to a few, inspiring as much awe and respect as fear and mystery. Now that the machine has decreased in size and increased in popular use, the computer has become a prosaic appliance, little more noted than a toaster. These dramati

  9. Fetal pain: an infantile debate.

    Science.gov (United States)

    Derbyshire, S W G

    2001-02-01

    The question of whether a fetus can experience pain is an immense challenge. The issue demands consideration of the physical and psychological basis of being and the relation between the two. At the center of this debate is the question of how it is that we are conscious, a question that has inspired the writing of some of our most brilliant contemporary philosophers and scientists, with one commentary suggesting surrender. In my earlier review I attempted to draw together the various strands of thinking that had attacked the question of fetal pain and relate them back to the bigger question of consciousness. In their vituperative response, Benatar and Benatar bite off my finger before looking to where I am pointing. I will examine each of their criticisms.

  10. Debatable Premises in Telecom Policy

    DEFF Research Database (Denmark)

    Hurwitz, Justin (Gus); Layton, Roslyn

    2015-01-01

    Around the world, telecommunications policy is one of the most important areas of public policy. The modern economy is driven by telecom technologies, and many telecom-related firms – Google, Apple, Facebook, and myriad fixed and mobile Internet service providers – are among the largest companies...... in the world. The Internet is opening up new platforms for business, education, government, and civic engagement. It has literally been a driving force in toppling governments. Telecommunications policy is important to every government in the world, and debates over what policies should be implemented......‟t stand up well to critical analysis. This paper collects and responds to a number of these premises that, collectively, underlie much popular, political, and academic support for increased telecommunications regulation in the United States and Europe – as well as much of the rest of the world....

  11. Levinas and the euthanasia debate.

    Science.gov (United States)

    Nuyen, A T

    2000-01-01

    The philosophers' tendency to characterize euthanasia in terms of either the right or the responsibility to die is, in some ways, problematic. Stepping outside of the analytic framework, the author draws out the implications of the ethics of Emmanuel Levinas for the euthanasia debate, tracing the ways Levinas' position differs not only from the philosophical consensus but also from the theological one. The article shows that, according to Levinas, there is no ethical case for suicide or assisted suicide. Death cannot be assumed or chosen--not only because suicide is a logically and metaphysically contradictory concept but also because in the choice of death ethical responsibility turns into irresponsibility. However, since Levinas holds that one must be responsible to the point of expiation, he can be said to approve certain actions that may have the consequence of hastening death.

  12. [The climate debate: the facts].

    Science.gov (United States)

    van den Broeke, Michiel R

    2009-01-01

    The first report by the Intergovernmental Panel on Climate Change (IPCC) appeared almost 20 years ago. Environmental contamination has a negative effect on the environment in which we live. However, the public at large is confused about the ins and outs of climate change. Managers, politicians, various kinds of advisors, scientists, so-called experts, sceptics and journalists have all taken it upon themselves to lead the debate. Whose task is it to ensure a sound discussion? Surely it is the IPCC's task. However, most politicians and many journalists, and even many scientists, do not take the trouble to read the entire IPCC report or parts of it. As a consequence, much nonsense is published and broadcast. An effective procedure to deal with the climate problem starts with a fair discussion of the scientific evidence. My advice is: just read the free IPCC report: http://www.ipcc.ch/ and click on 'WG I The Physical Science Basis'.

  14. Conformational effects on the circular dichroism of Human Carbonic Anhydrase II: a multilevel computational study.

    Directory of Open Access Journals (Sweden)

    Tatyana G Karabencheva-Christova

    Full Text Available Circular Dichroism (CD) spectroscopy is a powerful method for investigating conformational changes in proteins and therefore has numerous applications in structural and molecular biology. Here a computational investigation of the CD spectrum of Human Carbonic Anhydrase II (HCAII), with the main focus on the near-UV CD spectra of the wild-type enzyme and its seven tryptophan mutant forms, is presented and compared to experimental studies. Multilevel computational methods (Molecular Dynamics, Semiempirical Quantum Mechanics, Time-Dependent Density Functional Theory) were applied in order to gain insight into the mechanisms of interaction between the aromatic chromophores within the protein environment and to understand how the conformational flexibility of the protein influences these mechanisms. The analysis suggests that combining semiempirical CD calculations, crystal structures and molecular dynamics (MD) could help achieve better agreement between the computed and experimental protein spectra and provide unique insight into the dynamic nature of the mechanisms of chromophore interactions.

  15. Building HAL: computers that sense, recognize, and respond to human emotion

    Science.gov (United States)

    Picard, Rosalind W.

    2001-06-01

    The HAL 9000 computer, the inimitable star of the classic Kubrick and Clarke film '2001: A Space Odyssey,' displayed image understanding capabilities vastly beyond today's computer systems. HAL could not only instantly recognize who he was interacting with, but he could also lip read, judge the aesthetics of visual sketches, recognize emotions subtly expressed by scientists on board the ship, and respond to these emotions in an adaptive, personalized way. Of course, HAL also had capabilities that we might not want to give to machines, like the ability to terminate life support or otherwise take the lives of people. This presentation highlights recent research in giving machines certain affective abilities that aim to make them more intelligent, shows examples of some of these systems, and describes the role that affective abilities may play in future human-computer interaction.

  16. The ANWR debate rages on

    Energy Technology Data Exchange (ETDEWEB)

    Lorenz, A.

    2001-03-26

    Americans opposed to drilling in the Arctic National Wildlife Refuge (ANWR) have scored a victory this week by convincing 55 members of Congress to oppose inclusion of future revenues from Arctic drilling, an item very dear to the heart of U.S. President George W. Bush. In their letter to Congressional Committee Chairman Nussle, the members of Congress argued that 'inclusion in the budget of the projected revenues would sacrifice the untrammeled and pristine nature of this existing wilderness for a limited supply of oil and an increase in federal receipts of an estimated $1.2 billion by 2004'. The offending budget item was subsequently removed from the draft budget before Congress. The debate over oil exploration in the ANWR has generated emotional volleys on both sides, exemplifying how differently political battles are fought in the United States, with powerful lobbyists with limitless funds on one side and well-organized grassroots movements mobilizing on the other. At the same time, apparently little attention is paid to the views of Alaska's native people, who make up 16 per cent of the state's population. In Canada, oil and gas development in the Yukon and Northwest Territories is championed by prominent native leaders as well as by political leaders. It is also supported by Canadians at large. The official Canadian position is that ANWR should retain its protected status; the long-standing agreement that each country will consult with the other before drilling in the Arctic should also remain in force. With both President Bush and Vice-President Cheney eager proponents of exploration in the wildlife refuge, expert opinion is that the debate over the ANWR will heat up again during the summer.

  17. Measuring human emotions with modular neural networks and computer vision based applications

    Directory of Open Access Journals (Sweden)

    Veaceslav Albu

    2015-05-01

    Full Text Available This paper describes a neural network architecture for emotion recognition for human-computer interfaces and applied systems. In the current research, we propose a combination of the most recent biometric techniques with the neural networks (NN approach for real-time emotion and behavioral analysis. The system will be tested in real-time applications of customers' behavior for distributed on-land systems, such as kiosks and ATMs.

  18. Rapid Human-Computer Interactive Conceptual Design of Mobile and Manipulative Robot Systems

    Science.gov (United States)

    2015-05-19

    Learning Comparative User Models for Accelerating Human-Computer Collaborative Search; Evolutionary and Biologically Inspired Music, Sound, Art and...has been investigated theoretically to some extent ([12]) and successfully applied to artistic tasks ([11, 5]). Our hypothesis is that it is possible...model's prediction to the sign of the original entry. If the signs coincide for all entries, the network is considered to be successfully trained.

  19. Computers in a human perspective: an alternative way of teaching informatics to health professionals.

    Science.gov (United States)

    Schneider, W

    1989-11-01

    An alternative way of teaching informatics, especially health informatics, to health professionals of different categories has been developed and practiced. The essentials of human competence and skill in handling and processing information are presented in parallel with the essentials of the computer-assisted methodologies and technologies of formal language-based informatics. Requirements are established for how computer-based tools will have to be designed in order to be well adapted to genuine human skill and competence in handling tools in various work contexts. On the basis of such balanced knowledge, methods for work analysis are introduced, including how the existing problems at a workplace can be identified and analyzed in relation to the goals to be achieved. Special emphasis is given to new ways of information analysis, i.e. methods which allow the comprehension and documentation of those parts of actually practiced 'human' information handling and processing that are normally overlooked, e.g. non-verbal communication processes and so-called 'tacit knowledge'-based information handling and processing activities. Different ways of problem solving are discussed, involving, in an integrated human perspective, alternative staffing, enhancement of the competence of the staff, optimal planning of premises, and organizational and technical means. The main result of this alternative way of education has been considerably improved user competence, which in turn has led to very different designs of computer assistance and man-computer interfaces. The purpose of this paper is to give a brief outline of the teaching material and a short presentation of the above-mentioned results. (ABSTRACT TRUNCATED AT 250 WORDS)

  20. OPTIMIZATION DESIGN OF HYDRAULIC MANIFOLD BLOCKS BASED ON HUMAN-COMPUTER COOPERATIVE GENETIC ALGORITHM

    Institute of Scientific and Technical Information of China (English)

    Feng Yi; Li Li; Tian Shujun

    2003-01-01

    Optimization design of hydraulic manifold blocks (HMB) is studied as a complex solid spatial layout problem. Based on comprehensive research into the structural features and design rules of HMB, an optimal mathematical model for this problem is presented. Using a human-computer cooperative genetic algorithm (GA) and its hybrid optimization strategies, integrated layout and connection design schemes of HMB can be automatically optimized. An example is given to demonstrate the approach.
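A plain GA over a toy layout objective gives the flavor of the approach. The human-cooperative evaluation step and the real 3-D HMB constraints (connection routing, wall thickness) are omitted; the component sizes and GA parameters below are invented for illustration.

```python
import random

# Toy GA for a layout-style objective: place components on a line so that
# none of them overlap. Selection keeps the best half; children are built
# by one-point crossover plus a single point mutation.

SIZES = [4, 3, 5, 2, 6]      # component lengths (arbitrary)
TRACK = 30                    # available placement length

def fitness(positions):
    """Negative total pairwise overlap; 0 means a feasible layout."""
    overlap = 0
    for i in range(len(SIZES)):
        for j in range(i + 1, len(SIZES)):
            lo = max(positions[i], positions[j])
            hi = min(positions[i] + SIZES[i], positions[j] + SIZES[j])
            overlap += max(0, hi - lo)
    return -overlap

def evolve(pop_size=40, generations=80, seed=3):
    rng = random.Random(seed)
    def individual():
        return [rng.uniform(0, TRACK - s) for s in SIZES]
    pop = [individual() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(SIZES))
            child = a[:cut] + b[cut:]                 # one-point crossover
            k = rng.randrange(len(SIZES))             # point mutation
            child[k] = rng.uniform(0, TRACK - SIZES[k])
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()
```

In the human-computer cooperative variant described by the paper, a designer would periodically inspect and steer candidate layouts instead of relying on the automatic fitness alone.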

  1. 08292 Abstracts Collection -- The Study of Visual Aesthetics in Human-Computer Interaction

    OpenAIRE

    Hassenzahl, Marc; Lindgaard, Gitte; Platz, Axel; Tractinsky, Noam

    2008-01-01

    From 13.07. to 16.07.2008, the Dagstuhl Seminar 08292 ``The Study of Visual Aesthetics in Human-Computer Interaction'' was held in the International Conference and Research Center (IBFI), Schloss Dagstuhl. During the seminar, several participants presented their current research, and ongoing work and open problems were discussed. Abstracts of the presentations given during the seminar as well as abstracts of seminar results and ideas are put together in this paper. The first secti...

  2. AFFECTIVE AND EMOTIONAL ASPECTS OF HUMAN-COMPUTER INTERACTION: Game-Based and Innovative Learning Approaches

    OpenAIRE

    A. Askim GULUMBAY, Anadolu University, TURKEY

    2006-01-01

    This book was edited by Maja Pivec, an educator at the University of Applied Sciences, and published by IOS Press in 2006. The learning process can be seen as an emotional and personal experience that is addictive and leads learners to proactive behavior. New research methods in this field are related to affective and emotional approaches to computer-supported learning and human-computer interactions. Bringing together scientists and research aspects from psychology, educational sciences, cogni...

  3. Towards a semio-cognitive theory of human-computer interaction

    OpenAIRE

    Scolari, Carlos Alberto

    2001-01-01

    The research here presented is theoretical and introduces a critical analysis of instrumental approaches in Human-Computer Interaction (HCI). From a semiotic point of view, interfaces are not "natural" or "neutral" instruments, but rather complex sense-production devices. Interaction, in other words, is far from being a "transparent" process. In this abstract we present the fundamentals of a theoretical model that combines Semiotics with Cognitive Science approaches.

  4. US Army Weapon Systems Human-Computer Interface (WSHCI) style guide, Version 1

    Energy Technology Data Exchange (ETDEWEB)

    Avery, L.W.; O'Mara, P.A.; Shepard, A.P.

    1996-09-30

    A stated goal of the U.S. Army has been the standardization of the human-computer interfaces (HCIs) of its systems. Some of the tools being used to accomplish this standardization are HCI design guidelines and style guides. Currently, the Army is employing a number of style guides. While these style guides provide good guidance for the command, control, communications, computers, and intelligence (C4I) domain, they do not necessarily represent the more unique requirements of the Army's real time and near-real time (RT/NRT) weapon systems. The Office of the Director of Information for Command, Control, Communications, and Computers (DISC4), in conjunction with the Weapon Systems Technical Architecture Working Group (WSTAWG), recognized this need as part of their activities to revise the Army Technical Architecture (ATA). To address this need, DISC4 tasked the Pacific Northwest National Laboratory (PNNL) to develop an Army weapon systems unique HCI style guide. This document, the U.S. Army Weapon Systems Human-Computer Interface (WSHCI) Style Guide, represents the first version of that style guide. The purpose of this document is to provide HCI design guidance for RT/NRT Army systems across the weapon systems domains of ground, aviation, missile, and soldier systems. Each domain should customize and extend this guidance by developing its domain-specific style guides, which will be used to guide the development of future systems within that domain.

  5. The experience of agency in human-computer interactions: a review.

    Science.gov (United States)

    Limerick, Hannah; Coyle, David; Moore, James W

    2014-01-01

    The sense of agency is the experience of controlling both one's body and the external environment. Although the sense of agency has been studied extensively, there is a paucity of studies in applied "real-life" situations. One applied domain that seems highly relevant is human-computer interaction (HCI), as an increasing number of our everyday agentive interactions involve technology. Indeed, HCI has long recognized the feeling of control as a key factor in how people experience interactions with technology. The aim of this review is to summarize and examine the possible links between the sense of agency and the understanding of control in HCI. We explore the overlap between HCI and the sense of agency for computer input modalities and system feedback, computer assistance, and joint actions between humans and computers. An overarching consideration is how agency research can inform HCI and vice versa. Finally, we discuss the potential ethical implications of personal responsibility in an ever-increasing society of technology users and intelligent machine interfaces.

  6. An Efficient and Secure m-IPS Scheme of Mobile Devices for Human-Centric Computing

    Directory of Open Access Journals (Sweden)

    Young-Sik Jeong

    2014-01-01

    Full Text Available Recent rapid developments in wireless and mobile IT technologies have led to their application in many real-life areas, such as disasters, home networks, mobile social networks, medical services, industry, schools, and the military. Business/work environments have become wired/wireless, integrated with wireless networks. Although the increasing use of mobile devices on wireless networks improves work efficiency and convenience, wireless access to networks represents a security threat. Currently, wireless intrusion prevention systems (IPSs) are used to prevent wireless security threats. However, these are not an ideal security measure for businesses that utilize mobile devices, because they do not take temporal-spatial and role information into account. Therefore, in this paper, an efficient and secure mobile IPS (m-IPS) is proposed for businesses utilizing mobile devices in mobile environments for human-centric computing. The m-IPS incorporates temporal-spatial awareness in human-centric computing with various mobile devices and checks users' temporal-spatial information, profiles, and role information to provide precise access control. It can also be extended to the Internet of Things (IoT), one of the important advanced technologies supporting human-centric computing environments, for truly ubiquitous deployment with mobile devices.
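
    The combined role, time, and location check can be reduced to a small sketch. The policy table, role names and zones below are hypothetical illustrations of the idea, not the m-IPS scheme itself:

    ```python
    from datetime import datetime, time

    # Hypothetical policy entries: role -> (allowed zones, allowed time window).
    # An illustrative reduction of m-IPS's temporal-spatial + role checks.
    POLICY = {
        "engineer": ({"plant-A", "plant-B"}, (time(8, 0), time(18, 0))),
        "visitor":  ({"lobby"},              (time(9, 0), time(17, 0))),
    }

    def allow_access(role, zone, when):
        """Grant wireless access only if role, location, and time all match policy."""
        if role not in POLICY:
            return False
        zones, (start, end) = POLICY[role]
        return zone in zones and start <= when.time() <= end

    print(allow_access("engineer", "plant-A", datetime(2014, 5, 12, 10, 30)))  # True
    print(allow_access("visitor", "plant-A", datetime(2014, 5, 12, 10, 30)))   # False
    ```

    Denying by default when any of the three factors fails is what distinguishes this style of control from a conventional IPS that inspects traffic alone.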

  7. Adaptation of hybrid human-computer interaction systems using EEG error-related potentials.

    Science.gov (United States)

    Chavarriaga, Ricardo; Biasiucci, Andrea; Förster, Kilian; Roggen, Daniel; Tröster, Gerhard; Millán, José del R.

    2010-01-01

    Performance improvement in both humans and artificial systems relies strongly on the ability to recognize erroneous behavior or decisions. This paper, which builds upon previous studies on EEG error-related signals, presents a hybrid approach for human-computer interaction that uses human gestures to send commands to a computer and exploits brain activity to provide implicit feedback about the recognition of such commands. Using a simple computer game as a case study, we show that EEG activity evoked by erroneous gesture recognition can be classified in single trials above random levels. Automatic artifact rejection techniques are used, taking into account that subjects are allowed to move during the experiment. Moreover, we present a simple adaptation mechanism that uses the EEG signal to label newly acquired samples and can be used to re-calibrate the gesture recognition system in a supervised manner. Offline analyses show that, although the achieved EEG decoding accuracy is far from perfect, these signals convey sufficient information to significantly improve the overall system performance.
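
    The adaptation mechanism can be sketched with a toy nearest-centroid "gesture" classifier on 1-D features (not the paper's EEG pipeline): after each prediction, a binary error signal, standing in for the decoded error-related potential, labels the sample, which is then folded back into the model.

    ```python
    # Sketch of error-signal-driven re-calibration, assuming a two-class
    # nearest-centroid classifier; all features and labels are illustrative.
    class NearestCentroid:
        def __init__(self, centroids):
            self.data = {label: [c] for label, c in centroids.items()}

        def centroid(self, label):
            pts = self.data[label]
            return sum(pts) / len(pts)

        def predict(self, x):
            return min(self.data, key=lambda lbl: abs(x - self.centroid(lbl)))

        def feedback(self, x, predicted, error_detected):
            # If the (EEG-derived) error signal fires, the true label is the
            # other class in this two-class toy; otherwise the prediction is
            # confirmed. Either way the labeled sample re-calibrates the model.
            true = predicted if not error_detected else \
                next(lbl for lbl in self.data if lbl != predicted)
            self.data[true].append(x)

    clf = NearestCentroid({"left": 0.0, "right": 10.0})
    # Drifted "right" samples near 6 are initially far from their centroid;
    # feedback pulls the centroid toward them.
    for x in [6.0, 6.5, 5.8]:
        pred = clf.predict(x)
        clf.feedback(x, pred, error_detected=(pred != "right"))
    print(clf.centroid("right"))
    ```

    In the paper the error signal itself is decoded imperfectly from EEG, so in practice each feedback label would be weighted by the decoder's confidence.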

  8. Computational analysis of expression of human embryonic stem cell-associated signatures in tumors

    Directory of Open Access Journals (Sweden)

    Wang Xiaosheng

    2011-10-01

    Full Text Available Abstract Background The cancer stem cell model has been proposed based on the linkage between human embryonic stem cells and human cancer cells. However, the evidence supporting the cancer stem cell model remains to be collected. In this study, we extensively examined the expression of human embryonic stem cell-associated signatures, including core genes, transcription factors, pathways and microRNAs, in various cancers using a computational biology approach. Results We used class comparison analysis and survival analysis algorithms to identify differentially expressed genes and their associated transcription factors, pathways and microRNAs among normal vs. tumor or good prognosis vs. poor prognosis phenotype classes based on numerous human cancer gene expression data. We found that most of the human embryonic stem cell-associated signatures were frequently identified in the analysis, suggesting a strong linkage between human embryonic stem cells and cancer cells. Conclusions The present study revealed the close linkage between human embryonic stem cell-associated gene expression profiles and cancer-associated gene expression profiles, and therefore offers indirect support for the cancer stem cell theory. However, many issues of interest remain to be addressed.

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann and P. McBride; edited by M-C. Sawley with contributions from: P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini, M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  11. Human Computer Confluence in Rehabilitation: Digital Media Plasticity and Human Performance Plasticity

    DEFF Research Database (Denmark)

    Brooks, Anthony Lewis

    2013-01-01

    Digital media plasticity evocative to embodied interaction is presented as a utilitarian tool when mixed and matched to target human performance potentials specific to nuance of development for those with impairment. A distinct intervention strategy trains via alternative channeling of external... approaches promoting mindsets and activities commonly considered enduring, mundane and boring. The concept focuses on sensor-based interfaces mapped to control tailored content that acts as direct and immediate feedback mirroring input. These flexible, adaptive, and ‘plastic’ options offer facilitators new...

  12. Development of human reliability analysis methodology and its computer code during low power/shutdown operation

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Chang Hyun; You, Young Woo; Huh, Chang Wook; Kim, Ju Yeul; Kim, Do Hyung; Kim, Yoon Ik; Yang, Hui Chang [Seoul National University, Seoul (Korea, Republic of)]; Jae, Moo Sung [Hansung University, Seoul (Korea, Republic of)]

    1997-07-01

    The objective of this study is to develop an appropriate procedure for evaluating human error during low power/shutdown (LP/S) operation, together with a computer code that calculates human error probabilities (HEPs) within this framework. The applicability of typical HRA methodologies to LP/S is assessed, and a new HRA procedure, SEPLOT (Systematic Evaluation Procedure for LP/S Operation Tasks), which captures the characteristics of LP/S, is developed by selecting and categorizing human actions from a review of existing studies. This procedure is applied to evaluate the LOOP (Loss of Off-site Power) sequence, and the HEPs obtained using SEPLOT are used for quantitative evaluation of the core uncovery frequency. In this evaluation DYLAM-3, a dynamic reliability computer code with advantages over the ET/FT approach, is used. The SEPLOT procedure developed in this study provides a basis and framework for human error evaluation. It also makes it possible to assess the dynamic aspects of accidents leading to core uncovery by applying the HEPs obtained with SEPLOT as input data to the DYLAM-3 code. Eventually, it is expected that the results of this study will contribute to improved safety in LP/S operation and reduced uncertainties in risk. 57 refs. 17 tabs., 33 figs. (author)

  13. Human Computation: Object Recognition for Mobile Games Based on Single Player

    Directory of Open Access Journals (Sweden)

    Mohamed Sakr

    2014-07-01

    Full Text Available Smart phones and their applications have gained great popularity. Many people depend on them to accomplish tasks such as banking, social networking, and entertainment. Games with a purpose (GWAPs) and microtask crowdsourcing are two techniques of human computation. GWAPs depend on humans to accomplish their tasks, and porting GWAPs to smart phones can greatly increase the number of human participants. One human-computation system is the ESP Game, a game with a purpose and a good candidate for porting to smart phones. This paper presents a new mobile game called MemoryLabel, a single-player mobile game that helps label images and generate descriptions for them. In addition, the game gives descriptions for objects in the image, not just the image as a whole. We deployed the game at the University of Menoufia for evaluation and also published it on the Google Play market for Android applications. In this trial, we first focused on measuring the total number of labels generated by our game and the number of objects that were labeled. The results reveal that the proposed game shows promise in describing images and objects.
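
    A core problem for any single-player labeling game is deciding when a proposed label is trustworthy without a second simultaneous player. A common rule, sketched below with a hypothetical agreement threshold (the record does not state MemoryLabel's actual rule), is to accept a label once enough independent sessions propose it:

    ```python
    from collections import Counter

    # Illustrative aggregation for a single-player labeling game: a label for a
    # given object is accepted once `threshold` independent players propose it.
    def accepted_labels(proposals, threshold=3):
        """proposals: list of (object_id, label) pairs from many game sessions."""
        counts = Counter(proposals)
        return {(obj, label) for (obj, label), n in counts.items() if n >= threshold}

    proposals = ([("img1/obj1", "dog")] * 4 + [("img1/obj1", "cat")] +
                 [("img1/obj2", "ball")] * 3)
    print(accepted_labels(proposals))
    ```

    Per-object keys are what let the game label individual objects rather than whole images, matching the distinction the abstract draws.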

  14. CaPSID: A bioinformatics platform for computational pathogen sequence identification in human genomes and transcriptomes

    Directory of Open Access Journals (Sweden)

    Borozan Ivan

    2012-08-01

    Full Text Available Abstract Background It is now well established that nearly 20% of human cancers are caused by infectious agents, and the list of human oncogenic pathogens will grow in the future for a variety of cancer types. Whole tumor transcriptome and genome sequencing by next-generation sequencing technologies presents an unparalleled opportunity for pathogen detection and discovery in human tissues but requires development of new genome-wide bioinformatics tools. Results Here we present CaPSID (Computational Pathogen Sequence IDentification), a comprehensive bioinformatics platform for identifying, querying and visualizing both exogenous and endogenous pathogen nucleotide sequences in tumor genomes and transcriptomes. CaPSID includes a scalable, high performance database for data storage and a web application that integrates the genome browser JBrowse. CaPSID also provides useful metrics for sequence analysis of pre-aligned BAM files, such as gene and genome coverage, and is optimized to run efficiently on multiprocessor computers with low memory usage. Conclusions To demonstrate the usefulness and efficiency of CaPSID, we carried out a comprehensive analysis of both a simulated dataset and transcriptome samples from ovarian cancer. CaPSID correctly identified all of the human and pathogen sequences in the simulated dataset, while in the ovarian dataset CaPSID’s predictions were successfully validated in vitro.
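
    The subtractive idea behind this class of tools can be shown in miniature: reads that map to the human reference are set aside, and the remainder is matched against pathogen references. The exact-substring "alignment" and the tiny reference strings below are toy stand-ins for a real aligner and genome, not CaPSID's actual pipeline:

    ```python
    # Hypothetical miniature references for illustration only.
    HUMAN_REF = "ACGTACGTTTGACC"
    PATHOGEN_REFS = {"virus-X": "GGGTTTCCC"}

    def classify_read(read):
        """Assign a read to human, a named pathogen, or 'unmapped'."""
        if read in HUMAN_REF:            # subtract host sequence first
            return "human"
        for name, ref in PATHOGEN_REFS.items():
            if read in ref:
                return name
        return "unmapped"

    reads = ["ACGTACGT", "GGGTTT", "AAAAAA"]
    print([classify_read(r) for r in reads])  # ['human', 'virus-X', 'unmapped']
    ```

    In a real platform the two matching steps are performed by a short-read aligner against full genomes, and the "unmapped" bin is where novel pathogen discovery happens.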

  15. Evaluating the microstructure of human brain tissues using synchrotron radiation-based micro-computed tomography

    Science.gov (United States)

    Schulz, Georg; Morel, Anne; Imholz, Martha S.; Deyhle, Hans; Weitkamp, Timm; Zanette, Irene; Pfeiffer, Franz; David, Christian; Müller-Gerbl, Magdalena; Müller, Bert

    2010-09-01

    Minimally invasive deep brain neurosurgical interventions require a profound knowledge of the morphology of the human brain. Generic brain atlases are based on histology including multiple preparation steps during the sectioning and staining. In order to correct the distortions induced in the anisotropic, inhomogeneous soft matter and therefore improve the accuracy of brain atlases, a non-destructive 3D imaging technique with the required spatial and density resolution is of great significance. Micro computed tomography provides true micrometer resolution. The application to post mortem human brain, however, is questionable because the differences of the components concerning X-ray absorption are weak. Therefore, magnetic resonance tomography has become the method of choice for three-dimensional imaging of human brain. Because the spatial resolution of this method is limited, an alternative has to be found for the three-dimensional imaging of cellular microstructures within the brain. Therefore, the present study relies on the synchrotron radiation-based micro computed tomography in the recently developed grating-based phase contrast mode. Using data acquired at the beamline ID 19 (ESRF, Grenoble, France) we demonstrate that grating-based tomography yields premium images of human thalamus, which can be used for the correction of histological distortions by 3D non-rigid registration.

  16. Cognitive engineering models: A prerequisite to the design of human-computer interaction in complex dynamic systems

    Science.gov (United States)

    Mitchell, Christine M.

    1993-01-01

    This chapter examines a class of human-computer interaction applications, specifically the design of human-computer interaction for the operators of complex systems. Such systems include space systems (e.g., manned systems such as the Shuttle or space station, and unmanned systems such as NASA scientific satellites), aviation systems (e.g., the flight deck of 'glass cockpit' airplanes or air traffic control) and industrial systems (e.g., power plants, telephone networks, and sophisticated, e.g., 'lights out,' manufacturing facilities). The main body of human-computer interaction (HCI) research complements but does not directly address the primary issues involved in human-computer interaction design for operators of complex systems. Interfaces to complex systems are somewhat special. The 'user' in such systems - i.e., the human operator responsible for safe and effective system operation - is highly skilled, someone who in human-machine systems engineering is sometimes characterized as 'well trained, well motivated'. The 'job' or task context is paramount and, thus, human-computer interaction is subordinate to human job interaction. The design of human interaction with complex systems, i.e., the design of human job interaction, is sometimes called cognitive engineering.

  17. Study of movement coordination in human ensembles via a novel computer-based set-up

    CERN Document Server

    Alderisio, Francesco; Fiore, Gianfranco; di Bernardo, Mario

    2016-01-01

    Movement coordination in human ensembles has received little attention in the literature. Existing experimental work has investigated situations where all subjects are connected with each other through direct visual and auditory coupling, so that social interaction affects their coordination. Here, we study coordination in human ensembles via a novel computer-based set-up that enables individuals to coordinate each other's motion from a distance so as to minimize the influence of social interaction. The proposed platform makes it possible to implement different visual interaction patterns among the players, so that participants take into consideration the motion of a designated subset of the others. This allows the evaluation of the exclusive effects on coordination of the structure of interconnections among the players and their own dynamics. Our set-up also enables the deployment of virtual players to investigate dyadic interaction between a human and a virtual agent, as well as group synchron...

  18. [Geomagnetic storm decreases coherence of electric oscillations of human brain while working at the computer].

    Science.gov (United States)

    Novik, O B; Smirnov, F A

    2013-01-01

    The effect of geomagnetic storms at the latitude of Moscow on the electric oscillations of the human cerebral cortex was studied. Electroencephalogram measurements showed that when volunteers aged 18-23 years performed tasks on a computer during a moderate magnetic storm, or no later than 24 hours after it, the value of the coherence function of electric oscillations of the human brain in the frontal and occipital areas in the range of 4.0-7.9 Hz (the so-called theta rhythm of the human brain) decreased by a factor of two or more, sometimes reaching zero, although arterial blood pressure, respiratory rate and the electrocardiogram registered during the electroencephalogram measurements remained within normal values.

  19. Simulation-based computation of dose to humans in radiological environments

    Energy Technology Data Exchange (ETDEWEB)

    Breazeal, N.L. [Sandia National Labs., Livermore, CA (United States); Davis, K.R.; Watson, R.A. [Sandia National Labs., Albuquerque, NM (United States); Vickers, D.S. [Brigham Young Univ., Provo, UT (United States). Dept. of Electrical and Computer Engineering; Ford, M.S. [Battelle Pantex, Amarillo, TX (United States). Dept. of Radiation Safety

    1996-03-01

    The Radiological Environment Modeling System (REMS) quantifies dose to humans working in radiological environments using the IGRIP (Interactive Graphical Robot Instruction Program) and Deneb/ERGO simulation software. These commercially available products are augmented with custom C code to provide radiation exposure information to, and collect radiation dose information from, workcell simulations. Through the use of any radiation transport code or measured data, a radiation exposure input database may be formulated. User-specified IGRIP simulations utilize these databases to compute and accumulate dose to programmable human models operating around radiation sources. Timing, distances, shielding, and human activity may be modeled accurately in the simulations. The accumulated dose is recorded in output files, and the user is able to process and view this output. The entire REMS capability can be operated from a single graphical user interface.
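
    The dose-accumulation loop at the heart of such a simulation can be sketched in a drastically simplified form, assuming a single point source, inverse-square falloff and a scalar shielding factor (REMS instead draws exposure rates from transport-code or measured databases along a full workcell trajectory):

    ```python
    # Toy sketch of dose accumulation along a simulated worker trajectory.
    def accumulated_dose(path, dose_rate_1m, dt, shielding=1.0):
        """path: distances (m) from the source at successive timesteps of length dt (s)."""
        dose = 0.0
        for r in path:
            dose += shielding * dose_rate_1m / (r * r) * dt  # inverse-square falloff
        return dose

    # Worker spends 10 s at 2 m, then 10 s at 4 m from a 1 mSv/h-at-1-m source.
    rate = 1.0 / 3600.0                  # mSv per second at 1 m
    print(round(accumulated_dose([2.0] * 10 + [4.0] * 10, rate, dt=1.0), 6))  # 0.000868
    ```

    Timing, distance, and shielding enter exactly as the abstract describes; the simulation's job is to supply a realistic `path` from the animated human model.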

  20. Human Factors and Human-Computer Considerations in Teleradiology and Telepathology

    Directory of Open Access Journals (Sweden)

    Elizabeth A. Krupinski

    2014-02-01

    Full Text Available Radiology and pathology are unique among other clinical specialties that incorporate telemedicine technologies into clinical practice, as, for the most part in traditional practice, there are few or no direct patient encounters. The majority of teleradiology and telepathology involves viewing images, which is exactly what occurs without the “tele” component. The images used are generally quite large, require dedicated displays and software for viewing, and present challenges to the clinician who must navigate through the presented data to render a diagnostic decision or interpretation. This digital viewing environment is very different from the more traditional reading environment (i.e., film and microscopy), necessitating a new look at how to optimize reading environments and address human factors issues. This paper will review some of the key components that need to be optimized for effective and efficient practice of teleradiology and telepathology using traditional workstations as well as some of the newer mobile viewing applications.

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  2. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  3. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  4. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  5. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  6. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  7. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  8. The use of computers to teach human anatomy and physiology to allied health and nursing students

    Science.gov (United States)

    Bergeron, Valerie J.

    Educational institutions are under tremendous pressure to adopt the newest technologies in order to prepare their students to meet the challenges of the twenty-first century. For the last twenty years huge amounts of money have been spent on computers, printers, software, multimedia projection equipment, and so forth. A reasonable question is, "Has it worked?" Has this infusion of resources, financial as well as human, resulted in improved learning? Are the students meeting the intended learning goals? Any attempt to develop answers to these questions should include examining the intended goals and exploring the effects of the changes on students and faculty. This project investigated the impact of a specific application of a computer program in a community college setting on students' attitudes and understanding of human anatomy and physiology. In this investigation two sites of the same community college with seemingly similar student populations, seven miles apart, used different laboratory activities to teach human anatomy and physiology. At one site nursing students were taught using traditional dissections and laboratory activities; at the other site two of the dissections, specifically cat and sheep pluck, were replaced with the A.D.A.M.RTM (Animated Dissection of Anatomy for Medicine) computer program. Analysis of the attitude data indicated that students at both sites were extremely positive about their laboratory experiences. Analysis of the content data indicated a statistically significant difference in performance between the two sites in two of the eight content areas that were studied. For both topics the students using the computer program scored higher. A detailed analysis of the surveys, interviews with faculty and students, examination of laboratory materials and observations of laboratory facilities at both sites, and a cost-benefit analysis led to the development of seven recommendations. The recommendations call for action at the level of the

  9. The heterogeneity of mental representation: Ending the imagery debate.

    Science.gov (United States)

    Pearson, Joel; Kosslyn, Stephen M

    2015-08-18

    The possible ways that information can be represented mentally have been discussed often over the past thousand years. However, this issue could not be addressed rigorously until late in the 20th century. Initial empirical findings spurred a debate about the heterogeneity of mental representation: Is all information stored in propositional, language-like, symbolic internal representations, or can humans use at least two different types of representations (and possibly many more)? Here, in historical context, we describe recent evidence that humans do not always rely on propositional internal representations but, instead, can also rely on at least one other format: depictive representation. We propose that the debate should now move on to characterizing all of the different forms of human mental representation.

  10. Experimental verification of a computational technique for determining ground reactions in human bipedal stance.

    Science.gov (United States)

    Audu, Musa L; Kirsch, Robert F; Triolo, Ronald J

    2007-01-01

We have developed a three-dimensional (3D) biomechanical model of human standing that enables us to study the mechanisms of posture and balance simultaneously in various directions in space. Since both feet are on the ground, the system forms a kinematically closed chain, which has redundancy problems that cannot be resolved by the laws of mechanics alone. We have developed a computational (optimization) technique that avoids the problems of the closed-chain formulation, giving users of such models the ability to predict joint moments and, potentially, muscle activations using more sophisticated musculoskeletal models. This paper describes the experimental verification of the computational technique, which estimates the ground reaction vector acting on an unconstrained foot while the other foot is attached to the ground, thus allowing human bipedal standing to be analyzed as an open-chain system. The computational approach was verified in terms of its ability to predict lower-extremity joint moments derived from inverse dynamic simulations performed on data acquired from four able-bodied volunteers standing in various postures on force platforms. Sensitivity analyses performed with model simulations indicated which ground reaction force (GRF) and center of pressure (COP) components were most critical for providing better estimates of the joint moments. Overall, the joint moments predicted by the optimization approach were strongly correlated with the joint moments computed using the experimentally measured GRF and COP (correlation 0.78, unity slope: experimental = computational) for the postures of the four subjects examined. These results indicate that this model-based technique can be relied upon to produce reasonable and consistent estimates of the joint moments from the predicted GRF and COP for most standing postures.
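The open-chain reformulation rests on estimating the reaction under one foot so that the rest of the body can be analyzed with standard inverse dynamics. A minimal planar sketch of the underlying idea (a deliberate simplification of ours; the paper's actual method is a 3D optimization over GRF and COP components) splits body weight between two feet using static equilibrium alone:

```python
def foot_reactions(weight, x_left, x_right, x_com):
    """Split the total vertical load between two feet in static stance.

    Planar, static simplification: vertical force balance plus moment
    balance about the left foot determine both reactions uniquely.
    (The real 3D closed-chain problem is redundant and requires the
    optimization technique described in the abstract.)
    """
    if not (x_left < x_com < x_right):
        raise ValueError("centre of mass must lie between the feet")
    # Moment balance about the left foot:
    # F_right * (x_right - x_left) = weight * (x_com - x_left)
    f_right = weight * (x_com - x_left) / (x_right - x_left)
    f_left = weight - f_right  # vertical force balance
    return f_left, f_right
```

With the centre of mass shifted toward one foot, that foot carries proportionally more of the load, which is the quantity the full technique must predict for the unconstrained foot.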

  11. Chinese and Foreign Students Debate Gender Equality

    Institute of Scientific and Technical Information of China (English)

    1994-01-01

With the Fourth World Conference on Women drawing near, more and more people are showing a strong interest in women's issues. On April 12 the debate "Is the Equality Between Men and Women Impossible?" was held at Beijing Language and Culture University. Hundreds of students and teachers crowded into the 200-seat university theater where the debate was held. The debate was contested by two teams representing the affirmative and negative sides.

  12. Abortion: taking the debate seriously.

    Science.gov (United States)

    Kottow Lang, Miguel Hugo

    2015-05-19

Voluntarily induced abortion has been under permanent dispute and legal regulation, because societies invariably condemn extramarital pregnancies. In recent decades, a measure of societal tolerance has led to the decriminalization and legalization of abortion under one of two models: a more restricted, conservative model known as therapeutic abortion, and a model that accepts voluntary abortion within the first trimester of pregnancy. Liberalization of abortion aims at ending clandestine abortions and decriminalizing the practice in order to increase reproductive education and access to contraceptive methods, dissuade women from interrupting their pregnancy and, ultimately, make abortion a medically safe procedure within the boundaries of the law, inspired by efforts to reduce the incidence of the practice. The current legal initiative to decriminalize abortion in Chile proposes a notably rigid set of indications which would not resolve the three main objectives that need to be considered: 1) establish the legal framework of abortion; 2) contribute to reducing social unrest; 3) solve the public health issue of clandestine, illegal abortions. Debate must urgently be opened to include alternatives in line with the general tendency to respect women's decisions within the first trimester of pregnancy.

  13. Debating about the climate warming

    Institute of Scientific and Technical Information of China (English)

    WANG Shaowu; LUO Yong; ZHAO Zongci; DONG Wenje; YANG Bao

    2006-01-01

The debate about climate warming is reviewed. Discussions have focused on the validity of the temperature reconstruction for the last millennium made by Mann et al. Arguments against and for the reconstruction are introduced. Temperature reconstructions by other authors are examined, including the one carried out by Wang et al. in 1996. It is concluded that: (1) The ability to reproduce temperature variability on time scales of less than 10 years is limited, so there is insufficient evidence to prove that the 1990s was the warmest decade and 1998 the warmest year of the last millennium. (2) All of the temperature reconstructions by different authors demonstrate the occurrence of the MWP (Medieval Warm Period) and LIA (Little Ice Age) in the low-frequency band of temperature variations, though the peak of the MWP and the trough of the LIA vary from one reconstruction to another; the terms MWP and LIA can therefore be used in studies of climate change. (3) The warming from 1975 to 2000 was significant, but we do not know whether it was the strongest of the last millennium; this needs to be proved by more evidence.

  14. Inhibitory surround and grouping effects in human and computational multiple object tracking

    Science.gov (United States)

    Yilmaz, Ozgur; Guler, Sadiye; Ogmen, Haluk

    2008-02-01

Multiple Object Tracking (MOT) experiments show that human observers can track up to five moving targets among several moving distractors over several seconds. We extended these studies by designing modified MOT experiments to investigate the spatio-temporal characteristics of the human visuo-cognitive mechanisms for tracking, and applied the findings and insights obtained from these experiments to the design of computational multiple-object-tracking algorithms. Recent studies indicate that attention both enhances the neural activity of relevant information and suppresses irrelevant visual information in the surround. The results of our experiments suggest that the suppressive surround of attention extends up to 4 deg from the target stimulus and takes at least 100 ms to build. We suggest that when the attentional windows corresponding to separate target regions are spatially close, they can be grouped into a single attentional window to avoid interference originating from the suppressive surrounds. The grouping experiment results indicate that attentional windows are grouped into one when the distance between them is less than 1.5 deg. A preliminary implementation of the suppressive-surround concept in our computational video object tracker resulted in fewer unnecessary object merges in video tracking experiments.
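The grouping rule the experiments suggest (merge attentional windows whose centres are closer than about 1.5 deg, so that each target does not fall in another window's suppressive surround) can be sketched as a simple threshold clustering. The function names and the flat 2D coordinate representation below are our illustration, not code from the paper:

```python
import math

def group_windows(targets, merge_dist=1.5):
    """Group target positions (in deg of visual angle) into attentional windows.

    Targets closer than merge_dist end up in one window, mirroring the
    psychophysical finding that nearby attentional windows fuse to avoid
    interference from each other's suppressive surrounds.
    Uses union-find so chains of close targets merge transitively.
    """
    parent = list(range(len(targets)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path compression
            i = parent[i]
        return i

    for i in range(len(targets)):
        for j in range(i + 1, len(targets)):
            if math.dist(targets[i], targets[j]) < merge_dist:
                parent[find(i)] = find(j)

    groups = {}
    for i, t in enumerate(targets):
        groups.setdefault(find(i), []).append(t)
    return list(groups.values())
```

For example, two targets 1 deg apart form a single window, while a third target 4 deg away (inside a hypothetical suppressive surround if it shared a window) stays in its own group.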

  15. Kansei Colour Concepts to Improve Effective Colour Selection in Designing Human Computer Interfaces

    Directory of Open Access Journals (Sweden)

    Tharangie K G D

    2010-05-01

Colours have a major impact on human-computer interaction. Although there is a very thin line between appropriate and inappropriate use of colours, used properly, colours can be a powerful tool for improving the usefulness of a computer interface in a wide variety of areas. Many designers consider mostly the physical aspect of colour and tend to forget that a psychological aspect exists. The findings of this study confirm that the psychological, or affective, dimension of colour also plays an important role in colour interface design and in user satisfaction. Using Kansei Engineering principles, the study explores the affective variability of colours and how it can be manipulated to provide better design guidance and solutions. A group of twenty adults from Sri Lanka, ranging in age from 30 to 40, took part in the study. A survey was conducted using a Kansei colour questionnaire in normal atmospheric conditions. The results reveal that the affective variability of colours plays an important role in human-computer interaction as an influential factor drawing the user towards, or withdrawing the user from, the interface, thereby improving or degrading user satisfaction.

  16. Delays and user performance in human-computer-network interaction tasks.

    Science.gov (United States)

    Caldwell, Barrett S; Wang, Enlie

    2009-12-01

This article describes a series of studies conducted to examine factors affecting user perceptions, responses, and tolerance for network-based computer delays affecting distributed human-computer-network interaction (HCNI) tasks. HCNI tasks, even with increasing computing and network bandwidth capabilities, are still affected by human perceptions of delay and of appropriate waiting times for information-flow latencies. Six laboratory studies were conducted with university participants in China (Preliminary Experiments 1 through 3) and the United States (Experiments 4 through 6) to examine users' perceptions of elapsed time, the effect of perceived network task performance partners on delay tolerance, and expectations of appropriate delays based on task, situation, and network conditions. Results across the six experiments indicate that users' delay tolerance and estimated delay were affected by multiple task and expectation factors, including task complexity and importance, situation urgency and time availability, file size, and network bandwidth capacity. Results also suggest a range of user strategies for incorporating delay tolerance into task planning and performance. The HCNI user experience is influenced by combinations of task requirements, constraints, and understandings of system performance; tolerance is a nonlinear function of time-constraint ratios or decay. Appropriate user interface tools providing delay feedback can help modify user expectations and delay tolerance. These tools are especially valuable when delay conditions exceed a few seconds or when task constraints and system demands are high. Interface designs for HCNI tasks should consider assistant-style presentations of delay feedback, information freshness, and network characteristics. Assistants should also gather awareness of user time constraints.
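The reported nonlinearity of tolerance in the time-constraint ratio can be illustrated with a toy model. The logistic form and its parameters below are entirely our assumption, chosen only to show the shape of such a relationship, and are not the authors' fitted model:

```python
import math

def tolerance_probability(delay, time_available, steepness=6.0, midpoint=0.5):
    """Toy model: probability that a user tolerates a network delay.

    The predictor is the time-constraint ratio delay / time_available.
    Tolerance decays along a logistic curve (a hypothetical functional
    form chosen to illustrate the nonlinearity reported in the abstract,
    not an empirical result from the paper).
    """
    ratio = delay / time_available
    return 1.0 / (1.0 + math.exp(steepness * (ratio - midpoint)))
```

Under this sketch a delay consuming 10% of the available time is almost always tolerated, while one consuming 90% almost never is, with a steep drop in between rather than a linear decline.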

  17. COMPUTING

    CERN Multimedia

    Matthias Kasemann

Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and of regular computing shifts monitoring the services and infrastructure, as well as interfacing to the data operations tasks, are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  18. COMPUTING

    CERN Multimedia

    M. Kasemann

Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives, as well as network links, are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing a full-scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  19. COMPUTING

    CERN Multimedia

    P. MacBride

The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format, and running the samples through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  20. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  1. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users; we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in the data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  2. COMPUTING

    CERN Document Server

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  3. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

Computing activity has been lower as the Run 1 samples are completed and smaller samples for upgrades and preparations ramp up. Much of the computing activity is focused on preparations for Run 2 and on improvements in data access and flexibility of resource use. Operations Office Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of the 2011 data being processed at the sites.   Figure 1: MC production and processing were more in demand, with a peak of over 750 million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently, transferring on average close to 520 TB per week with peaks close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   Tape utilisation was a focus for the operations teams, with frequent deletion campaigns moving deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  4. Computational fluid dynamics modeling of Bacillus anthracis spore deposition in rabbit and human respiratory airways

    Energy Technology Data Exchange (ETDEWEB)

    Kabilan, S.; Suffield, S. R.; Recknagle, K. P.; Jacob, R. E.; Einstein, D. R.; Kuprat, A. P.; Carson, J. P.; Colby, S. M.; Saunders, J. H.; Hines, S. A.; Teeguarden, J. G.; Straub, T. M.; Moe, M.; Taft, S. C.; Corley, R. A.

    2016-09-01

    Three-dimensional computational fluid dynamics and Lagrangian particle deposition models were developed to compare the deposition of aerosolized Bacillus anthracis spores in the respiratory airways of a human with that of the rabbit, a species commonly used in the study of anthrax disease. The respiratory airway geometries for each species were derived respectively from computed tomography (CT) and µCT images. Both models encompassed airways that extended from the external nose to the lung with a total of 272 outlets in the human model and 2878 outlets in the rabbit model. All simulations of spore deposition were conducted under transient, inhalation–exhalation breathing conditions using average species-specific minute volumes. Two different exposure scenarios were modeled in the rabbit based upon experimental inhalation studies. For comparison, human simulations were conducted at the highest exposure concentration used during the rabbit experimental exposures. Results demonstrated that regional spore deposition patterns were sensitive to airway geometry and ventilation profiles. Due to the complex airway geometries in the rabbit nose, higher spore deposition efficiency was predicted in the nasal sinus compared to the human at the same air concentration of anthrax spores. In contrast, higher spore deposition was predicted in the lower conducting airways of the human compared to the rabbit lung due to differences in airway branching pattern. This information can be used to refine published and ongoing biokinetic models of inhalation anthrax spore exposures, which currently estimate deposited spore concentrations based solely upon exposure concentrations and inhaled doses that do not factor in species-specific anatomy and physiology for deposition.

  5. Computational Fluid Dynamics Modeling of Bacillus anthracis Spore Deposition in Rabbit and Human Respiratory Airways

    Energy Technology Data Exchange (ETDEWEB)

    Kabilan, Senthil; Suffield, Sarah R.; Recknagle, Kurtis P.; Jacob, Rick E.; Einstein, Daniel R.; Kuprat, Andrew P.; Carson, James P.; Colby, Sean M.; Saunders, James H.; Hines, Stephanie; Teeguarden, Justin G.; Straub, Tim M.; Moe, M.; Taft, Sarah; Corley, Richard A.

    2016-09-30

    Three-dimensional computational fluid dynamics and Lagrangian particle deposition models were developed to compare the deposition of aerosolized Bacillus anthracis spores in the respiratory airways of a human with that of the rabbit, a species commonly used in the study of anthrax disease. The respiratory airway geometries for each species were derived from computed tomography (CT) or µCT images. Both models encompassed airways that extended from the external nose to the lung with a total of 272 outlets in the human model and 2878 outlets in the rabbit model. All simulations of spore deposition were conducted under transient, inhalation-exhalation breathing conditions using average species-specific minute volumes. The highest exposure concentration was modeled in the rabbit based upon prior acute inhalation studies. For comparison, human simulation was also conducted at the same concentration. Results demonstrated that regional spore deposition patterns were sensitive to airway geometry and ventilation profiles. Due to the complex airway geometries in the rabbit nose, higher spore deposition efficiency was predicted in the upper conducting airways compared to the human at the same air concentration of anthrax spores. As a result, higher particle deposition was predicted in the conducting airways and deep lung of the human compared to the rabbit lung due to differences in airway branching pattern. This information can be used to refine published and ongoing biokinetic models of inhalation anthrax spore exposures, which currently estimate deposited spore concentrations based solely upon exposure concentrations and inhaled doses that do not factor in species-specific anatomy and physiology.

  6. Dynamic management of multi-channel interfaces for human interactions with computer-based intelligent assistants

    Energy Technology Data Exchange (ETDEWEB)

    Strickland, T.D. Jr.

    1989-01-01

    For complex man-machine tasks where multi-media interaction with computer-based assistants is appropriate, a portion of the assistant's intelligence must be devoted to managing its communication processes with the user. Since people often serve the role of assistants, the conventions of human communication provide a basis for designing the communication processes of the computer-based assistant. Human decision making for communication requires knowledge of the user's style, the task demands, and communication practices, and knowledge of the current situation. Decisions necessary for effective communication, when, how, and what to communicate, can be expressed using these knowledge sources. A system based on human communication rules was developed to manage the communication decisions of an intelligent assistant. The Dynamic Communication Management (DCM) system consists of four components, three models and a manager. The model of the user describes the user's communication preferences for different task situations. The model of the task is used to establish the user's current activity and to describe how communication should be conducted for this activity. The communication model provides the rules needed to make decisions: when to communicate the message, how to present the message to the user, and what information should be communicated. The Communication Manager controls and coordinates these models to conduct all communication with the user. Performance with DCM as the interface to a simulated Flexible Manufacturing System (FMS) control task was established to learn about the potential benefits of the concept.

  7. Evidence for model-based computations in the human amygdala during Pavlovian conditioning.

    Science.gov (United States)

    Prévost, Charlotte; McNamee, Daniel; Jessup, Ryan K; Bossaerts, Peter; O'Doherty, John P

    2013-01-01

    Contemporary computational accounts of instrumental conditioning have emphasized a role for a model-based system in which values are computed with reference to a rich model of the structure of the world, and a model-free system in which values are updated without encoding such structure. Much less studied is the possibility of a similar distinction operating at the level of Pavlovian conditioning. In the present study, we scanned human participants while they participated in a Pavlovian conditioning task with a simple structure while measuring activity in the human amygdala using a high-resolution fMRI protocol. After fitting a model-based algorithm and a variety of model-free algorithms to the fMRI data, we found evidence for the superiority of a model-based algorithm in accounting for activity in the amygdala compared to the model-free counterparts. These findings support an important role for model-based algorithms in describing the processes underpinning Pavlovian conditioning, as well as providing evidence of a role for the human amygdala in model-based inference.
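The model-free/model-based contrast in this abstract can be made concrete with two toy learners for a single conditioned stimulus (our illustration only; the paper fits richer algorithms to fMRI data): a Rescorla-Wagner delta rule that updates a value without encoding task structure, and a "model-based" learner that estimates the outcome probability explicitly and derives the value from that structure.

```python
def model_free_value(rewards, alpha=0.1):
    """Model-free (Rescorla-Wagner / delta-rule) learning:
    V <- V + alpha * (r - V), no representation of task structure."""
    v = 0.0
    for r in rewards:
        v += alpha * (r - v)
    return v

def model_based_value(rewards):
    """Model-based learning: estimate the outcome probability from
    observed counts (an explicit model of the contingency), then
    compute the expected value from that model."""
    p_reward = sum(rewards) / len(rewards)
    return p_reward * 1.0  # expected value of a unit reward
```

On a schedule delivering reward on 80% of trials, the model-based learner recovers the contingency exactly, while the delta rule hovers near it, lagging each recent outcome.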

  8. Evidence for model-based computations in the human amygdala during Pavlovian conditioning.

    Directory of Open Access Journals (Sweden)

    Charlotte Prévost

Contemporary computational accounts of instrumental conditioning have emphasized a role for a model-based system in which values are computed with reference to a rich model of the structure of the world, and a model-free system in which values are updated without encoding such structure. Much less studied is the possibility of a similar distinction operating at the level of Pavlovian conditioning. In the present study, we scanned human participants while they participated in a Pavlovian conditioning task with a simple structure while measuring activity in the human amygdala using a high-resolution fMRI protocol. After fitting a model-based algorithm and a variety of model-free algorithms to the fMRI data, we found evidence for the superiority of a model-based algorithm in accounting for activity in the amygdala compared to the model-free counterparts. These findings support an important role for model-based algorithms in describing the processes underpinning Pavlovian conditioning, as well as providing evidence of a role for the human amygdala in model-based inference.

  9. Wearable Computing System with Input-Output Devices Based on Eye-Based Human Computer Interaction Allowing Location Based Web Services

    Directory of Open Access Journals (Sweden)

    Kohei Arai

    2013-08-01

Wearable computing with input-output devices based on Eye-Based Human Computer Interaction (EBHCI), which allows location-based web services including navigation and location/attitude/health-condition monitoring, is proposed. Through implementation of the proposed wearable computing system, all of its functionality is confirmed and the system is found to work well. It is easy to use and inexpensive. Experimental results for EBHCI show excellent performance in terms of key-in accuracy as well as input speed. The system is connected to the Internet and has search-engine capability.

  10. Study of human performance in computer-aided architectural design: methods and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cuomo, D.L.

    1988-01-01

The goal of this study was to develop a performance methodology useful for evaluating human performance across different types of tasks on a given system and across different levels of complexity within a single task. To meet these goals, performance measures that reflect meaningful changes in human behavior during CAAD tasks were developed, based on models of human information processing. Two cognitively different architectural tasks were formulated that differed in the compatibility between stimulus, central processing, and response, and in the structuredness of their problem spaces. Methods of varying task complexity within each task were also developed, both to test the sensitivity of the performance measures across levels of complexity and to introduce variability into the subjects' design behavior. The developed performance measures revealed the effects of task complexity, task type, and individual subjects on performance. Some measures more directly reflected the computer-interaction aspects of the task, while other measures reflected the cognitive design activity of the human.

  11. Single-photon emission computed tomography in human immunodeficiency virus encephalopathy: A preliminary report

    Energy Technology Data Exchange (ETDEWEB)

Masdeu, J.C.; Yudd, A.; Van Heertum, R.L.; Grundman, M.; Hriso, E.; O'Connell, R.A.; Luck, D.; Camli, U.; King, L.N. (St. Vincent's Medical Center, New York, NY (USA))

    1991-08-01

    Depression or psychosis in a previously asymptomatic individual infected with the human immunodeficiency virus (HIV) may be psychogenic, related to brain involvement by the HIV or both. Although prognosis and treatment differ depending on etiology, computed tomography (CT) and magnetic resonance imaging (MRI) are usually unrevealing in early HIV encephalopathy and therefore cannot differentiate it from psychogenic conditions. Thirty of 32 patients (94%) with HIV encephalopathy had single-photon emission computed tomography (SPECT) findings that differed from the findings in 15 patients with non-HIV psychoses and 6 controls. SPECT showed multifocal cortical and subcortical areas of hypoperfusion. In 4 cases, cognitive improvement after 6-8 weeks of zidovudine (AZT) therapy was reflected in amelioration of SPECT findings. CT remained unchanged. SPECT may be a useful technique for the evaluation of HIV encephalopathy.

  12. Using Noninvasive Wearable Computers to Recognize Human Emotions from Physiological Signals

    Directory of Open Access Journals (Sweden)

    Nasoz Fatma

    2004-01-01

We discuss the strong relationship between affect and cognition and the importance of emotions in multimodal human computer interaction (HCI) and user modeling. We introduce the overall paradigm for our multimodal system that aims at recognizing its users' emotions and at responding to them accordingly depending upon the current context or application. We then describe the design of the emotion elicitation experiment we conducted by collecting, via wearable computers, physiological signals from the autonomic nervous system (galvanic skin response, heart rate, temperature) and mapping them to certain emotions (sadness, anger, fear, surprise, frustration, and amusement). We show the results of three different supervised learning algorithms that categorize these collected signals in terms of emotions, and generalize their learning to recognize emotions from new collections of signals. We finally discuss possible broader impact and potential applications of emotion recognition for multimodal intelligent systems.
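The classification step this abstract describes (mapping physiological features to emotion labels with supervised learning) can be sketched with a nearest-centroid classifier. The feature values and labels below are made-up toy data, and this is not one of the three learners the paper actually compares:

```python
def train_centroids(samples):
    """samples: list of ((gsr, heart_rate, temperature), emotion_label).
    Returns one mean feature vector (centroid) per emotion label."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for k, x in enumerate(features):
            acc[k] += x
        counts[label] = counts.get(label, 0) + 1
    return {label: tuple(s / counts[label] for s in acc)
            for label, acc in sums.items()}

def classify(centroids, features):
    """Assign the emotion whose centroid is nearest in feature space."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sqdist(centroids[label], features))
```

A new physiological reading is then labeled by proximity to the training centroids, which is the "generalize to new collections of signals" step in its simplest form.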

  13. Large Metasurface Aperture for Millimeter Wave Computational Imaging at the Human-Scale

    Science.gov (United States)

    Gollub, J. N.; Yurduseven, O.; Trofatter, K. P.; Arnitz, D.; F. Imani, M.; Sleasman, T.; Boyarsky, M.; Rose, A.; Pedross-Engel, A.; Odabasi, H.; Zvolensky, T.; Lipworth, G.; Brady, D.; Marks, D. L.; Reynolds, M. S.; Smith, D. R.

    2017-02-01

    We demonstrate a low-profile holographic imaging system at millimeter wavelengths based on an aperture composed of frequency-diverse metasurfaces. Utilizing measurements of spatially-diverse field patterns, diffraction-limited images of human-sized subjects are reconstructed. The system is driven by a single microwave source swept over a band of frequencies (17.5–26.5 GHz) and switched between a collection of transmit and receive metasurface panels. High fidelity image reconstruction requires a precise model for each field pattern generated by the aperture, as well as the manner in which the field scatters from objects in the scene. This constraint makes scaling of computational imaging systems inherently challenging for electrically large, coherent apertures. To meet the demanding requirements, we introduce computational methods and calibration approaches that enable rapid and accurate imaging performance.

  14. Using Noninvasive Wearable Computers to Recognize Human Emotions from Physiological Signals

    Science.gov (United States)

    Lisetti, Christine Lætitia; Nasoz, Fatma

    2004-12-01

    We discuss the strong relationship between affect and cognition and the importance of emotions in multimodal human computer interaction (HCI) and user modeling. We introduce the overall paradigm for our multimodal system that aims at recognizing its users' emotions and at responding to them accordingly depending upon the current context or application. We then describe the design of the emotion elicitation experiment we conducted by collecting, via wearable computers, physiological signals from the autonomic nervous system (galvanic skin response, heart rate, temperature) and mapping them to certain emotions (sadness, anger, fear, surprise, frustration, and amusement). We show the results of three different supervised learning algorithms that categorize these collected signals in terms of emotions, and generalize their learning to recognize emotions from new collections of signals. We finally discuss possible broader impact and potential applications of emotion recognition for multimodal intelligent systems.
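    The supervised-learning step described above can be illustrated with a minimal sketch. The data below is synthetic and the feature ranges are hypothetical (this is not the authors' dataset or their chosen algorithms): it trains a classifier on GSR/heart-rate/temperature features for the six labeled emotions and evaluates generalization on held-out signals.

```python
# Hypothetical sketch: classifying emotions from physiological features
# (GSR, heart rate, temperature). Synthetic data, illustrative only.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
EMOTIONS = ["sadness", "anger", "fear", "surprise", "frustration", "amusement"]

# Assumed per-emotion feature centroids: [GSR (uS), heart rate (bpm), temp (C)]
centroids = rng.uniform([1, 60, 33], [10, 110, 37], size=(len(EMOTIONS), 3))
X = np.vstack([c + rng.normal(0, 0.3, (40, 3)) for c in centroids])
y = np.repeat(np.arange(len(EMOTIONS)), 40)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)
clf = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
accuracy = clf.score(X_te, y_te)  # fraction of held-out signals labeled correctly
print(f"held-out accuracy: {accuracy:.2f}")
```

    Any of the three supervised algorithms the paper compares could be slotted in place of the k-nearest-neighbour classifier here; the pipeline (collect, label, train, evaluate on unseen signals) is the same.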

  15. Computational high-resolution optical imaging of the living human retina

    Science.gov (United States)

    Shemonski, Nathan D.; South, Fredrick A.; Liu, Yuan-Zhi; Adie, Steven G.; Scott Carney, P.; Boppart, Stephen A.

    2015-07-01

    High-resolution in vivo imaging is of great importance for the fields of biology and medicine. The introduction of hardware-based adaptive optics (HAO) has pushed the limits of optical imaging, enabling high-resolution near diffraction-limited imaging of previously unresolvable structures. In ophthalmology, when combined with optical coherence tomography, HAO has enabled a detailed three-dimensional visualization of photoreceptor distributions and individual nerve fibre bundles in the living human retina. However, the introduction of HAO hardware and supporting software adds considerable complexity and cost to an imaging system, limiting the number of researchers and medical professionals who could benefit from the technology. Here we demonstrate a fully automated computational approach that enables high-resolution in vivo ophthalmic imaging without the need for HAO. The results demonstrate that computational methods in coherent microscopy are applicable in highly dynamic living systems.

  16. Computational methods to extract meaning from text and advance theories of human cognition.

    Science.gov (United States)

    McNamara, Danielle S

    2011-01-01

    Over the past two decades, researchers have made great advances in the area of computational methods for extracting meaning from text. This research has to a large extent been spurred by the development of latent semantic analysis (LSA), a method for extracting and representing the meaning of words using statistical computations applied to large corpora of text. Since the advent of LSA, researchers have developed and tested alternative statistical methods designed to detect and analyze meaning in text corpora. This research exemplifies how statistical models of semantics play an important role in our understanding of cognition and contribute to the field of cognitive science. Importantly, these models afford large-scale representations of human knowledge and allow researchers to explore various questions regarding knowledge, discourse processing, text comprehension, and language. This topic includes the latest progress by the leading researchers in the endeavor to go beyond LSA.
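    The core LSA idea can be sketched in a few lines (the corpus and parameter choices below are illustrative assumptions, not the materials or tooling of the work above): build a TF-IDF term-document matrix, reduce it with a truncated SVD, and compare documents in the resulting latent semantic space.

```python
# Minimal LSA-style sketch: latent semantic document vectors via truncated SVD
# of a TF-IDF matrix, compared by cosine similarity. Illustrative corpus.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

corpus = [
    "the doctor examined the patient in the clinic",
    "the physician treated the patient at the hospital",
    "the astronomer observed the distant galaxy with a telescope",
]
tfidf = TfidfVectorizer().fit_transform(corpus)     # documents x terms
svd = TruncatedSVD(n_components=2, random_state=0)  # latent "semantic" axes
doc_vecs = svd.fit_transform(tfidf)                 # documents x latent dims

sims = cosine_similarity(doc_vecs)
print(sims.round(2))  # the two medical documents land closer together
```

    Real LSA applications use corpora of millions of words and hundreds of latent dimensions, but the statistical machinery is the same.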

  17. Computational approaches towards understanding human long non-coding RNA biology.

    Science.gov (United States)

    Jalali, Saakshi; Kapoor, Shruti; Sivadas, Ambily; Bhartiya, Deeksha; Scaria, Vinod

    2015-07-15

    Long non-coding RNAs (lncRNAs) form the largest class of non-protein coding genes in the human genome. While a small subset of well-characterized lncRNAs has demonstrated their significant role in diverse biological functions like chromatin modifications, post-transcriptional regulation, imprinting etc., the functional significance of a vast majority of them still remains an enigma. Increasing evidence of the implications of lncRNAs in various diseases including cancer and major developmental processes has further enhanced the need to gain mechanistic insights into the lncRNA functions. Here, we present a comprehensive review of the various computational approaches and tools available for the identification and annotation of long non-coding RNAs. We also discuss a conceptual roadmap to systematically explore the functional properties of the lncRNAs using computational approaches.

  18. Computational drug design strategies applied to the modelling of human immunodeficiency virus-1 reverse transcriptase inhibitors

    Directory of Open Access Journals (Sweden)

    Lucianna Helene Santos

    2015-11-01

    Reverse transcriptase (RT) is a multifunctional enzyme in the human immunodeficiency virus (HIV)-1 life cycle and represents a primary target for drug discovery efforts against HIV-1 infection. Two classes of RT inhibitors, the nucleoside RT inhibitors (NRTIs) and the nonnucleoside transcriptase inhibitors, are prominently used in the highly active antiretroviral therapy in combination with other anti-HIV drugs. However, the rapid emergence of drug-resistant viral strains has limited the successful rate of the anti-HIV agents. Computational methods are a significant part of the drug design process and indispensable to study drug resistance. In this review, recent advances in computer-aided drug design for the rational design of new compounds against HIV-1 RT using methods such as molecular docking, molecular dynamics, free energy calculations, quantitative structure-activity relationships, pharmacophore modelling and absorption, distribution, metabolism, excretion and toxicity prediction are discussed. Successful applications of these methodologies are also highlighted.

  19. Computational drug design strategies applied to the modelling of human immunodeficiency virus-1 reverse transcriptase inhibitors.

    Science.gov (United States)

    Santos, Lucianna Helene; Ferreira, Rafaela Salgado; Caffarena, Ernesto Raúl

    2015-11-01

    Reverse transcriptase (RT) is a multifunctional enzyme in the human immunodeficiency virus (HIV)-1 life cycle and represents a primary target for drug discovery efforts against HIV-1 infection. Two classes of RT inhibitors, the nucleoside RT inhibitors (NRTIs) and the nonnucleoside transcriptase inhibitors are prominently used in the highly active antiretroviral therapy in combination with other anti-HIV drugs. However, the rapid emergence of drug-resistant viral strains has limited the successful rate of the anti-HIV agents. Computational methods are a significant part of the drug design process and indispensable to study drug resistance. In this review, recent advances in computer-aided drug design for the rational design of new compounds against HIV-1 RT using methods such as molecular docking, molecular dynamics, free energy calculations, quantitative structure-activity relationships, pharmacophore modelling and absorption, distribution, metabolism, excretion and toxicity prediction are discussed. Successful applications of these methodologies are also highlighted.

  20. 3D virtual human atria: A computational platform for studying clinical atrial fibrillation.

    Science.gov (United States)

    Aslanidi, Oleg V; Colman, Michael A; Stott, Jonathan; Dobrzynski, Halina; Boyett, Mark R; Holden, Arun V; Zhang, Henggui

    2011-10-01

    Despite a vast amount of experimental and clinical data on the underlying ionic, cellular and tissue substrates, the mechanisms of common atrial arrhythmias (such as atrial fibrillation, AF) arising from the functional interactions at the whole atria level remain unclear. Computational modelling provides a quantitative framework for integrating such multi-scale data and understanding the arrhythmogenic behaviour that emerges from the collective spatio-temporal dynamics in all parts of the heart. In this study, we have developed a multi-scale hierarchy of biophysically detailed computational models for the human atria--the 3D virtual human atria. Primarily, diffusion tensor MRI reconstruction of the tissue geometry and fibre orientation in the human sinoatrial node (SAN) and surrounding atrial muscle was integrated into the 3D model of the whole atria dissected from the Visible Human dataset. The anatomical models were combined with the heterogeneous atrial action potential (AP) models, and used to simulate the AP conduction in the human atria under various conditions: SAN pacemaking and atrial activation in the normal rhythm, break-down of regular AP wave-fronts during rapid atrial pacing, and the genesis of multiple re-entrant wavelets characteristic of AF. Contributions of different properties of the tissue to mechanisms of the normal rhythm and arrhythmogenesis were investigated. Primarily, the simulations showed that tissue heterogeneity caused the break-down of the normal AP wave-fronts at rapid pacing rates, which initiated a pair of re-entrant spiral waves; and tissue anisotropy resulted in a further break-down of the spiral waves into multiple meandering wavelets characteristic of AF. The 3D virtual atria model itself was incorporated into the torso model to simulate the body surface ECG patterns in the normal and arrhythmic conditions. Therefore, a state-of-the-art computational platform has been developed, which can be used for studying multi

  1. Abortion: taking the debate seriously

    Directory of Open Access Journals (Sweden)

    Miguel Hugo Kottow Lang

    2015-05-01

    Voluntarily induced abortion has persisted throughout history as a prevalent practice shrouded in secrecy and clandestinity, because all extramarital conception has been socially rejected. Since the mid-20th century, an attitude of tolerance has led to the decriminalization and legalization of abortion under two legal models: the indications model, known as therapeutic abortion, adopted in conservative nations, and the term-limits model, which allows a woman to request an abortion within the first trimester of pregnancy. The liberalization of abortion follows the consistent social policy of eliminating clandestine practice and its harmful effects, in order to educate, dissuade and, eventually, treat abortion as a safe and accessible medical service within legally established frameworks, all measures aimed at reducing the incidence of induced abortion. The bill to decriminalize abortion presented to the Chilean Parliament follows the indications model, with indications framed so restrictively that they fail to meet the three objectives that should guide such legislation: 1) to provide a legal framework for the practice of abortion; 2) to contribute to social peace; 3) to resolve the public health problem of clandestine abortion. It is urgent to open the debate to more decisive alternatives, in line with the general tendency to prefer the term-limits model, which includes respect for the woman's decision.

  2. Real-time non-invasive eyetracking and gaze-point determination for human-computer interaction and biomedicine

    Science.gov (United States)

    Talukder, Ashit; Morookian, John-Michael; Monacos, S.; Lam, R.; Lebaw, C.; Bond, A.

    2004-01-01

    Eyetracking is one of the latest technologies that has shown potential in several areas including human-computer interaction for people with and without disabilities, and for noninvasive monitoring, detection, and even diagnosis of physiological and neurological problems in individuals.

  3. Computational Strategy for Quantifying Human Pesticide Exposure based upon a Saliva Measurement

    Directory of Open Access Journals (Sweden)

    Charles eTimchalk

    2015-05-01

    Quantitative exposure data is important for evaluating toxicity risk and biomonitoring is a critical tool for evaluating human exposure. Direct personal monitoring provides the most accurate estimation of a subject's true dose, and non-invasive methods are advocated for quantifying exposure to xenobiotics. In this regard, there is a need to identify chemicals that are cleared in saliva at concentrations that can be quantified to support the implementation of this approach. This manuscript reviews the computational modeling approaches that are coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics and provides additional insight on species-dependent differences in partitioning that are of key importance for extrapolation. The primary mechanism by which xenobiotics leave the blood and enter saliva involves paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of xenobiotics transferred by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals in plasma to saliva has been computationally modeled using compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of the Schmitt algorithm that calculates partitioning based upon the tissue composition, pH, chemical pKa, and plasma protein-binding. Sensitivity analysis identified that both protein-binding and pKa (for weak acids and bases) have significant impact on determining partitioning and species-dependent differences based upon physiological variance. Future strategies are focused on an in vitro salivary acinar cell based system to experimentally determine and computationally predict salivary gland uptake and clearance for xenobiotics. It is envisioned that a combination of salivary biomonitoring and computational modeling will enable the non-invasive measurement of chemical exposures in human populations.

  4. Computational strategy for quantifying human pesticide exposure based upon a saliva measurement.

    Science.gov (United States)

    Timchalk, Charles; Weber, Thomas J; Smith, Jordan N

    2015-01-01

    Quantitative exposure data is important for evaluating toxicity risk and biomonitoring is a critical tool for evaluating human exposure. Direct personal monitoring provides the most accurate estimation of a subject's true dose, and non-invasive methods are advocated for quantifying exposure to xenobiotics. In this regard, there is a need to identify chemicals that are cleared in saliva at concentrations that can be quantified to support the implementation of this approach. This manuscript reviews the computational modeling approaches that are coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics and provides additional insight on species-dependent differences in partitioning that are of key importance for extrapolation. The primary mechanism by which xenobiotics leave the blood and enter saliva involves paracellular transport, passive transcellular diffusion, or transcellular active transport with the majority of xenobiotics transferred by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals in plasma to saliva has been computationally modeled using compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of the Schmitt algorithm that calculates partitioning based upon the tissue composition, pH, chemical pKa, and plasma protein-binding. Sensitivity analysis identified that both protein-binding and pKa (for weak acids and bases) have significant impact on determining partitioning and species dependent differences based upon physiological variance. Future strategies are focused on an in vitro salivary acinar cell based system to experimentally determine and computationally predict salivary gland uptake and clearance for xenobiotics. It is envisioned that a combination of salivary biomonitoring and computational modeling will enable the non-invasive measurement of chemical exposures in human populations.
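    For illustration, a classical pH-partition estimate of the saliva-to-plasma (S/P) concentration ratio can be sketched from the quantities this review highlights (pKa, compartment pH, protein binding). This is a simplified Henderson-Hasselbalch-based model, not the Schmitt algorithm itself, and the default pH values and example pKa values are assumptions:

```python
# Hedged sketch: pH-partition estimate of the saliva:plasma (S/P) ratio for a
# weak acid or base. Illustrative simplification, not the Schmitt algorithm.
def saliva_plasma_ratio(pka, ph_plasma=7.4, ph_saliva=6.5,
                        fu_plasma=1.0, fu_saliva=1.0, acid=True):
    """Estimate S/P from Henderson-Hasselbalch ionization plus free fractions."""
    if acid:
        num = 1 + 10 ** (ph_saliva - pka)   # total/unionized factor in saliva
        den = 1 + 10 ** (ph_plasma - pka)   # same factor in plasma
    else:  # weak base: ionized fraction grows as pH falls below pKa
        num = 1 + 10 ** (pka - ph_saliva)
        den = 1 + 10 ** (pka - ph_plasma)
    return (num / den) * (fu_plasma / fu_saliva)

# A weak acid is less ionized in (more acidic) saliva, so S/P < 1:
print(saliva_plasma_ratio(pka=3.5, acid=True))
# A weak base is more ionized ("trapped") in saliva, so S/P > 1:
print(saliva_plasma_ratio(pka=9.0, acid=False))
```

    The sensitivity the review reports follows directly from this form: the ratio depends exponentially on pKa relative to the two pH values, and linearly on the unbound (non-protein-bound) fractions.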

  5. The Human Toxome Collaboratorium: a shared environment for multi-omic computational collaboration within a consortium

    Directory of Open Access Journals (Sweden)

    Rick A Fasani

    2016-02-01

    The Human Toxome Project is part of a long-term vision to modernize toxicity testing for the 21st century. In the initial phase of the project, a consortium of six academic, commercial, and government organizations has partnered to map pathways of toxicity, using endocrine disruption as a model hazard. Experimental data is generated at multiple sites, and analyzed using a range of computational tools. While effectively gathering, managing, and analyzing the data for high-content experiments is a challenge in its own right, doing so for a growing number of -omics technologies, with larger data sets, across multiple institutions complicates the process. Interestingly, one of the most difficult, ongoing challenges has been the computational collaboration between the geographically separate institutions. Existing solutions cannot handle the growing heterogeneous data, provide a computational environment for consistent analysis, accommodate different workflows, and adapt to the constantly evolving methods and goals of a research project. To meet the needs of the project, we have created and managed The Human Toxome Collaboratorium, a shared computational environment hosted on third-party cloud services. The Collaboratorium provides a familiar virtual desktop, with a mix of commercial, open-source, and custom-built applications. It shares some of the challenges of traditional information technology, but with unique and unexpected constraints that emerge from the cloud. Here we describe the problems we faced, the current architecture of the solution, an example of its use, the major lessons we learned, and the future potential of the concept. In particular, the Collaboratorium represents a novel distribution method that could increase the reproducibility and reusability of results from similar large, multi-omic studies.

  6. U.S. Army weapon systems human-computer interface style guide. Version 2

    Energy Technology Data Exchange (ETDEWEB)

    Avery, L.W.; O'Mara, P.A.; Shepard, A.P.; Donohoo, D.T.

    1997-12-31

    A stated goal of the US Army has been the standardization of the human-computer interfaces (HCIs) of its systems. Some of the tools being used to accomplish this standardization are HCI design guidelines and style guides. Currently, the Army is employing a number of HCI design guidance documents. While these style guides provide good guidance for the command, control, communications, computers, and intelligence (C4I) domain, they do not necessarily represent the more unique requirements of the Army's real-time and near-real-time (RT/NRT) weapon systems. The Office of the Director of Information for Command, Control, Communications, and Computers (DISC4), in conjunction with the Weapon Systems Technical Architecture Working Group (WSTAWG), recognized this need as part of their activities to revise the Army Technical Architecture (ATA), now termed the Joint Technical Architecture-Army (JTA-A). To address this need, DISC4 tasked the Pacific Northwest National Laboratory (PNNL) to develop an HCI style guide unique to Army weapon systems, which resulted in the US Army Weapon Systems Human-Computer Interface (WSHCI) Style Guide Version 1. Based on feedback from the user community, DISC4 further tasked PNNL to revise Version 1 and publish Version 2. The intent was to update some of the research and incorporate some enhancements. This document provides that revision. The purpose of this document is to provide HCI design guidance for the RT/NRT Army system domain across the weapon systems subdomains of ground, aviation, missile, and soldier systems. Each subdomain should customize and extend this guidance by developing domain-specific style guides, which will be used to guide the development of future systems within their subdomains.

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS components are now installed and deployed at CERN, adding to the GlideInWMS factory already in place in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  8. Computational prediction of vaccine strains for human influenza A (H3N2) viruses.

    Science.gov (United States)

    Steinbrück, L; Klingen, T R; McHardy, A C

    2014-10-01

    Human influenza A viruses are rapidly evolving pathogens that cause substantial morbidity and mortality in seasonal epidemics around the globe. To ensure continued protection, the strains used for the production of the seasonal influenza vaccine have to be regularly updated, which involves data collection and analysis by numerous experts worldwide. Computer-guided analysis is becoming increasingly important in this problem due to the vast amounts of generated data. We here describe a computational method for selecting a suitable strain for production of the human influenza A virus vaccine. It interprets available antigenic and genomic sequence data based on measures of antigenic novelty and rate of propagation of the viral strains throughout the population. For viral isolates sampled between 2002 and 2007, we used this method to predict the antigenic evolution of the H3N2 viruses in retrospective testing scenarios. When seasons were scored as true or false predictions, our method returned six true positives, three false negatives, eight true negatives, and one false positive, or 78% accuracy overall. In comparison to the recommendations by the WHO, we identified the correct antigenic variant once at the same time and twice one season ahead. Even though it cannot be ruled out that practical reasons such as lack of a sufficiently well-growing candidate strain may in some cases have prevented recommendation of the best-matching strain by the WHO, our computational decision procedure allows quantitative interpretation of the growing amounts of data and may help to match the vaccine better to predominating strains in seasonal influenza epidemics. Importance: Human influenza A viruses continuously change antigenically to circumvent the immune protection evoked by vaccination or previously circulating viral strains. To maintain vaccine protection and thereby reduce the mortality and morbidity caused by infections, regular updates of the vaccine strains are required. We
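    The season-level counts quoted above (six true positives, three false negatives, eight true negatives, one false positive) can be checked with a line of arithmetic: overall accuracy is (TP + TN) divided by the total number of scored seasons.

```python
# Verify the reported 78% accuracy from the stated confusion counts.
tp, fn, tn, fp = 6, 3, 8, 1
total = tp + fn + tn + fp          # 18 scored seasons
accuracy = (tp + tn) / total       # correct predictions / total
print(f"{accuracy:.0%}")           # prints 78%
```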

  9. Computer Simulation of Gd(III) Speciation in Human Interstitial Fluid

    Institute of Scientific and Technical Information of China (English)

    2003-01-01

    The speciation and distribution of Gd(III) in human interstitial fluid were studied by computer simulation. In addition, an artificial neural network was applied to the estimation of log β values of complexes. The results show that the precipitate species, GdPO4 and Gd2(CO3)3, are the predominant species. Among soluble species, free Gd(III), [Gd(HSA)], [Gd(Ox)] and the ternary complexes of Gd(III) with citrate are the main species, and [Gd3(OH)4] becomes the predominant species at a total Gd(III) concentration of 2.2×10⁻² mol/L.
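    A speciation calculation of this kind reduces, in the simplest 1:1 case, to solving a mass-balance equation for the formation equilibrium M + L ⇌ ML with β = [ML]/([M][L]). The sketch below is illustrative only: the log β value and total concentrations are hypothetical, and a real interstitial-fluid model solves many coupled equilibria (and precipitation) simultaneously.

```python
# Illustrative 1:1 speciation: solve the mass-balance quadratic for M + L <-> ML.
import math

def speciate_1to1(log_beta, m_total, l_total):
    """Return (free metal, free ligand, complex) concentrations in mol/L."""
    beta = 10.0 ** log_beta
    # c = beta*(Mt - c)*(Lt - c)  =>  beta*c^2 - (beta*(Mt+Lt) + 1)*c + beta*Mt*Lt = 0
    a = beta
    b = -(beta * (m_total + l_total) + 1.0)
    q = beta * m_total * l_total
    c = (-b - math.sqrt(b * b - 4.0 * a * q)) / (2.0 * a)  # physical root
    return m_total - c, l_total - c, c

# Hypothetical log beta and totals, loosely in the spirit of Gd(III)-HSA binding:
gd_free, hsa_free, gd_hsa = speciate_1to1(log_beta=6.0, m_total=1e-3, l_total=1e-3)
print(f"free Gd: {gd_free:.2e} M, [Gd(HSA)]: {gd_hsa:.2e} M")
```

    The smaller root of the quadratic is taken because the complex concentration cannot exceed either total; multi-ligand systems repeat the same mass-balance logic across all equilibria at once.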

  10. Human-Computer Interaction Handbook Fundamentals, Evolving Technologies, and Emerging Applications

    CERN Document Server

    Jacko, Julie A

    2012-01-01

    The third edition of a groundbreaking reference, The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies, and Emerging Applications raises the bar for handbooks in this field. It is the largest, most complete compilation of HCI theories, principles, advances, case studies, and more that exist within a single volume. The book captures the current and emerging sub-disciplines within HCI related to research, development, and practice that continue to advance at an astonishing rate. It features cutting-edge advances to the scientific knowledge base as well as visionary perspe

  11. Machine takeover the growing threat to human freedom in a computer-controlled society

    CERN Document Server

    George, Frank Honywill

    1977-01-01

    Machine Takeover: The Growing Threat to Human Freedom in a Computer-Controlled Society discusses the implications of technological advancement. The title identifies the changes in society that no one is aware of, along with what these changes entail. The text first covers information science, particularly the aspect of an automated system for information processing. Next, the selection deals with the social implications of information science, such as information pollution. The text also tackles concerns about the utilization of technology to manipulate the lives of people without th

  12. Advances in Human-Computer Interaction: Graphics and Animation Components for Interface Design

    Science.gov (United States)

    Cipolla Ficarra, Francisco V.; Nicol, Emma; Cipolla-Ficarra, Miguel; Richardson, Lucy

    We present an analysis of a communicability methodology for graphics and animation components in interface design, called CAN (Communicability, Acceptability and Novelty). This methodology was developed between 2005 and 2010, obtaining excellent results in cultural heritage, education and microcomputing contexts, in studies where there is a bidirectional interrelation among ergonomics, usability, user-centered design, software quality and human-computer interaction. We also present heuristic results on iconography and layout design in blogs and websites from the following countries: Spain, Italy, Portugal and France.

  13. Debate Revives Old Arguments on HPV Vaccine

    Science.gov (United States)

    Shah, Nirvi

    2011-01-01

    The author reports on a Republican presidential debate which revives the contention over requiring middle school girls to be vaccinated against the virus that causes cervical cancer. At the September 12 debate, U.S. Representative Michele Bachmann, of Minnesota, and Rick Santorum, a former U.S. senator from Pennsylvania, attacked Texas Governor…

  14. Colleges Call Debate Contests out of Order

    Science.gov (United States)

    Young, Jeffrey R.

    2008-01-01

    Competitive debate has traditionally served as a laboratory for the democratic process and an important training ground for future policy makers. In recent years, a growing number of teams have played the game out of traditional bounds. They have turned events into commentaries on debate itself, in performances that bear little resemblance to the…

  15. Using Debates to Teach Information Ethics

    Science.gov (United States)

    Peace, A. Graham

    2011-01-01

    This experience report details the use of debates in a course on Information Ethics. Formal debates have been used in academia for centuries and create an environment in which students must think critically, communicate well and, above all, synthesize and evaluate the relevant classroom material. They also provide a break from the standard…

  16. Media Nihilism and the Presidential Debates.

    Science.gov (United States)

    Hogan, J. Michael

    1989-01-01

    Discusses the function of media nihilism--the rhetoric of "crisis and failure"--in the 1988 Presidential Debates. Examines journalists' debate questions, noting that they painted an almost wholly negative portrait of America. Suggests that the candidate who effectively "skewers" the media on its own hypocrisy should be declared…

  17. The Affirmative Action Debate: A Critical Reflection

    Science.gov (United States)

    van Wyk, Berte

    2010-01-01

    In this article I contend that we cannot divorce affirmative action from issues about race and racism. Further, debates on affirmative action have to acknowledge the power of words/concepts/definitions and how they can be constructed and used for the purposes of domination or liberation. I argue that, in debating affirmative action, we have to…

  18. Political Campaign Debating: A Selected, Annotated Bibliography.

    Science.gov (United States)

    Ritter, Kurt; Hellweg, Susan A.

    Noting that television debates have become a regular feature of the media politics by which candidates seek office, this annotated bibliography is particularly intended to assist teachers and researchers of debate, argumentation, and political communication. The 40 citations are limited to the television era of American politics and categorized as…

  20. Orientaciones sobre los debates virtuales, setiembre 2010

    OpenAIRE

    Guitert Catasús, Montse; Romeu Fontanillas, Teresa

    2010-01-01

    These teaching materials aim to provide students with tools and resources for participating in online forums.