WorldWideScience

Sample records for human computer debating

  1. Human Computation

    CERN Multimedia

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  2. Dreams versus Reality: Plenary Debate Session on Quantum Computing

    OpenAIRE

    Abbott, Derek

    2003-01-01

    This is a transcript of a debate on quantum computing that took place at 6:00pm, Wednesday, 4th June 2003, La Fonda Hotel, Santa Fe, USA. Transcript editor: Derek Abbott. Pro Team: Carlton M. Caves, Daniel Lidar, Howard Brandt, Alex Hamilton. Con Team: David Ferry, Julio Gea-Banacloche, Sergey Bezrukov, Laszlo Kish.

  3. Debat

    DEFF Research Database (Denmark)

    Brodersen, John

    2017-01-01

    DEBATE: It is encouraging that more commentators acknowledge that overtreatment and overdiagnosis occur in the healthcare system. The next step is for the profession to acknowledge medicine's shortcomings, writes John Brodersen.

  4. The Debates in Marx's Scholarship on Dimensions of Human nature ...

    African Journals Online (AJOL)

    Debates in Marx scholarship revolve around whether Karl Marx recognizes the individual and social dimensions of human nature and which of the two he prefers. This paper considers the debates in two ways. The first relates to Marx scholarship in favour of the individual dimension of human nature. The second concerns ...

  5. New dates reignite human evolution debate

    International Nuclear Information System (INIS)

    Nolch, G.

    2000-01-01

    Australian research into the Asian fossil record is unearthing controversial evidence with implications for the evolution of humans. Dr Jian-xin Zhao and Prof Ken Collerson from the University of Queensland's Department of Earth Sciences have been studying the fossil record in East Asia for clues to the early migration of hominids out of Africa, in collaboration with Chinese archaeologists Dr Kai Hu of Nanjing University and Hankui Xu of the Nanjing Institute of Palaeontology, Academia Sinica. Together they have been studying the remains of Nanjing Man, the name given to two Homo erectus skulls and the tooth of a third individual discovered in Tangshan Cave, 250 km north-west of Shanghai. Dr Zhao and Prof Collerson have now employed more accurate dating techniques and materials, using a mass spectrometer to measure the amounts of thorium-230 and uranium-234 in a calcite flowstone above the Nanjing Man fossil bed. Unlike in fossil teeth, uranium and thorium were locked into the flowstone's crystal lattice when the calcite crystallised, so the U-series decay in the calcite reliably records when the calcite formed. Taking into account the half-lives of uranium-234 and thorium-230, Dr Zhao and Prof Collerson determined the age of the calcite flowstone to be 577,000 years (+44,000/-34,000 years). As the flowstone overlies the fossil bed, this date defines only the minimum age of the Nanjing Man fossil bed. For comparison, the dentine and enamel components of one fossil deer tooth collected from the fossil bed yielded discordant mass spectrometric U-series ages of 388,000 and 130,100 years, respectively. Dr Zhao says that this 'strongly demonstrates the unreliability of fossil teeth as a chronometer'. Other evidence in the sediments surrounding the fossils includes flora and fauna typical of a glacial period, and Dr Zhao therefore believes that the skulls could have been deposited during a glacial period.
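
    To make the dating arithmetic concrete, the sketch below solves the simplest closed-system U-series age equation, R = 1 - exp(-lambda*t), where R is the thorium-230/uranium-238 activity ratio and lambda is the thorium-230 decay constant. It assumes secular equilibrium in uranium-234/uranium-238 and zero initial thorium; the measured ratio in the example is invented for illustration and is not the study's datum.

    ```python
    # Minimal sketch of closed-system U-series age arithmetic. Assumes
    # secular equilibrium in 234U/238U and zero initial 230Th; the
    # activity ratio below is hypothetical, not the Nanjing measurement.
    import math

    HALF_LIFE_TH230 = 75_584.0               # years
    LAM = math.log(2) / HALF_LIFE_TH230      # 230Th decay constant (1/yr)

    def u_series_age(activity_ratio: float) -> float:
        """Age in years from the 230Th/238U activity ratio R = 1 - exp(-LAM*t)."""
        return -math.log(1.0 - activity_ratio) / LAM

    ratio = 0.995                            # hypothetical measured ratio
    print(f"age = {u_series_age(ratio):,.0f} years")  # about 578,000 years
    ```

    Because the ratio approaches 1 asymptotically, small measurement errors near saturation translate into large, asymmetric age uncertainties, consistent with the +44,000/-34,000-year bounds quoted for the flowstone.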

  6. Digital Humanities: the Next Big Thing? Enkele notities bij een ontluikend debat

    NARCIS (Netherlands)

    Besser, S.; Vaessens, T.

    2013-01-01

    In the form of provisional notes, the authors offer suggestions for an intensification of the theoretical debate on the digital humanities and computational literary studies in particular. From the perspective of poststructuralist theory, they address some of the epistemological underpinnings of

  7. Institutional violence and humanization in health: notes to debate.

    Science.gov (United States)

    Azeredo, Yuri Nishijima; Schraiber, Lilia Blima

    2017-09-01

    This paper starts from humanization policies and the academic debate around them to reflect on institutional violence inside health services. Based on a review of scientific publications in Collective Health, it observes that violence in relationships between health professionals and users - which is at the core of the humanization debate - is conceptualized as an excess of power in the exercise of professional authority. Using Hannah Arendt's thought on the concepts of 'authority', 'power' and 'violence' as a theoretical contribution, our objective is to define and rethink these phenomena. Combining these reflections with the history of the institutionalization of health in Brazil, and especially the changes in medical work during the twentieth century, we conclude that the problem of institutional violence in health services stems not from an excess of authority and power among professionals, but rather from its opposite. When there is a vacuum of professional authority, and relationships between people do not happen through power relations, space opens for the phenomenon of violence.

  8. Controlled human infection models for vaccine development: Zika virus debate.

    Science.gov (United States)

    Gopichandran, Vijayaprasad

    2018-01-01

    An ethics panel, convened by the National Institutes of Health and other research bodies in the USA, did not allow researchers from Johns Hopkins University and the University of Vermont to perform controlled human infection of healthy volunteers to develop a vaccine against Zika virus infection. The members published their ethical analysis and recommendations in February 2017. They elaborated on the risks that human challenge with Zika virus poses to the volunteers and to uninvolved third parties, and systematically analysed the social value of such a human challenge experiment. They also posited some mandatory ethical requirements that should be met before infection of healthy volunteers with the Zika virus is allowed. This commentary elaborates on the debate over the ethics of the human challenge model for the development of a Zika virus vaccine and the role of systematic ethical analysis in protecting the interests of research participants. It further analyses the importance of this debate to the development of a Zika vaccine in India.

  9. Procreative liberty, enhancement and commodification in the human cloning debate.

    Science.gov (United States)

    Shapshay, Sandra

    2012-12-01

    The aim of this paper is to scrutinize a contemporary standoff in the American debate over the moral permissibility of human reproductive cloning in its prospective use as a eugenic enhancement technology. I shall argue that there is some significant and under-appreciated common ground between the defenders and opponents of human cloning. Champions of the moral and legal permissibility of cloning support the technology based on the right to procreative liberty provided it were to become as safe as in vitro fertilization and that it be used only by adults who seek to rear their clone children. However, even champions of procreative liberty oppose the commodification of cloned embryos, and, by extension, the resulting commodification of the cloned children who would be produced via such embryos. I suggest that a Kantian moral argument against the use of cloning as an enhancement technology can be shown to be already implicitly accepted to some extent by champions of procreative liberty on the matter of commodification of cloned embryos. It is in this argument against commodification that the most vocal critics of cloning such as Leon Kass and defenders of cloning such as John Robertson can find greater common ground. Thus, I endeavor to advance the debate by revealing a greater degree of moral agreement on some fundamental premises than hitherto recognized.

  10. Approaching Engagement towards Human-Engaged Computing

    DEFF Research Database (Denmark)

    Niksirat, Kavous Salehzadeh; Sarcar, Sayan; Sun, Huatong

    2018-01-01

    Debates regarding the nature and role of HCI research and practice have intensified in recent years, given the ever more intertwined relations between humans and technologies. The framework of Human-Engaged Computing (HEC) was proposed and developed over a series of scholarly workshops to...

  11. Human Computer Music Performance

    OpenAIRE

    Dannenberg, Roger B.

    2012-01-01

    Human Computer Music Performance (HCMP) is the study of music performance by live human performers and real-time computer-based performers. One goal of HCMP is to create a highly autonomous artificial performer that can fill the role of a human, especially in a popular music setting. This will require advances in automated music listening and understanding, new representations for music, techniques for music synchronization, real-time human-computer communication, music generation, sound synt...

  12. Methodological debates in human rights research: a case study of human trafficking in South Africa

    NARCIS (Netherlands)

    Vigneswaran, D.

    2012-01-01

    Debates over human trafficking are riddled with methodological dilemmas. Agencies with vested interests in the anti-trafficking agenda advance claims about numbers of victims, level of organized trafficking and scale of exploitation, but with limited data and using questionable techniques. Skeptics,

  13. An Interdisciplinary Bibliography for Computers and the Humanities Courses.

    Science.gov (United States)

    Ehrlich, Heyward

    1991-01-01

    Presents an annotated bibliography of works related to the subject of computers and the humanities. Groups items into textbooks and overviews; introductions; human and computer languages; literary and linguistic analysis; artificial intelligence and robotics; social issue debates; computers' image in fiction; anthologies; writing and the…

  14. Parliamentary cultures and human embryos: the Dutch and British debates compared

    NARCIS (Netherlands)

    Kirejczyk, Marta

    1999-01-01

    Twenty years ago, the technology of in vitro fertilization created a new artefact: the human embryo outside the woman's body. In many countries, political debates developed around this artefact. One of the central questions in these debates is whether it is permissible to use human embryos in

  15. Ubiquitous human computing.

    Science.gov (United States)

    Zittrain, Jonathan

    2008-10-28

    Ubiquitous computing means network connectivity everywhere, linking devices and systems as small as a drawing pin and as large as a worldwide product distribution chain. What could happen when people are so readily networked? This paper explores issues arising from two possible emerging models of ubiquitous human computing: fungible networked brainpower and collective personal vital sign monitoring.

  16. When computers were human

    CERN Document Server

    Grier, David Alan

    2013-01-01

    Before Palm Pilots and iPods, PCs and laptops, the term "computer" referred to the people who did scientific calculations by hand. These workers were neither calculating geniuses nor idiot savants but knowledgeable people who, in other circumstances, might have become scientists in their own right. When Computers Were Human represents the first in-depth account of this little-known, 200-year epoch in the history of science and technology. Beginning with the story of his own grandmother, who was trained as a human computer, David Alan Grier provides a poignant introduction to the wider wo

  17. Handbook of human computation

    CERN Document Server

    Michelucci, Pietro

    2013-01-01

    This volume addresses the emerging area of human computation. The chapters, written by leading international researchers, explore existing and future opportunities to combine the respective strengths of both humans and machines in order to create powerful problem-solving capabilities. The book bridges scientific communities, capturing and integrating the unique perspective and achievements of each. It coalesces contributions from industry and across related disciplines in order to motivate, define, and anticipate the future of this exciting new frontier in science and cultural evolution. Reade

  18. Public opinions about human enhancement can enhance the expert-only debate. A review study

    NARCIS (Netherlands)

    Dijkstra, Anne M.; Schuijff, Mirjam

    2016-01-01

    Human enhancement, the non-medical use of biomedical technologies to improve the human body or performance beyond their ‘natural’ limitations, is a growing trend. At the same time, the use of these technologies has societal consequences. In societal debates about human enhancement, however, it is

  19. National Insecurity and Human Rights: Democracies Debate Counterterrorism

    OpenAIRE

    Brysk, Alison; Shafir, Gershon

    2007-01-01

    Human rights is all too often the first casualty of national insecurity. How can democracies cope with the threat of terror while protecting human rights? This timely volume compares the lessons of the United States and Israel with the "best-case scenarios" of the United Kingdom, Canada, Spain, and Germany. It demonstrates that threatened democracies have important options, and democratic governance, the rule of law, and international cooperation are crucial foundations for counterterror policy.

  20. Competitive debate classroom as a cooperative learning technique for the human resources subject

    Directory of Open Access Journals (Sweden)

    Guillermo A. SANCHEZ PRIETO

    2018-01-01

    Full Text Available The paper shows an academic debate model as a cooperative learning technique for teaching human resources at university. The general objective of this paper is to determine whether academic debate can be included in the category of cooperative learning. The specific objective is to present a model for implementing this technique. Thus the first part of the paper presents the concept of cooperative learning and its main characteristics. The second part presents the debate model proposed for classification as cooperative learning. The last part concludes with the characteristics of the model that do or do not match different aspects of cooperative learning.

  1. A computational model of self-efficacy's various effects on performance: Moving the debate forward.

    Science.gov (United States)

    Vancouver, Jeffrey B; Purl, Justin D

    2017-04-01

    Self-efficacy, which is one's belief in one's capacity, has been found to both positively and negatively influence effort and performance. The reasons for these different effects have been a major topic of debate among social-cognitive and perceptual control theorists. In particular, research into these various self-efficacy effects has been motivated by a perceptual control theory view of self-regulation that social-cognitive theorists question. To provide more clarity to the theoretical arguments, a computational model of the multiple processes presumed to create the positive, negative, and null effects of self-efficacy is presented. Building on an existing computational model of goal choice that produces a positive effect for self-efficacy, the current article adds a symbolic processing structure used during goal striving that explains the negative self-efficacy effect observed in recent studies. Moreover, the multiple processes, operating together, allow the model to recreate the various effects found in a published study of feedback ambiguity's moderating role in the self-efficacy-to-performance relationship (Schmidt & DeShon, 2010). Discussion focuses on the implications of the model for the self-efficacy debate, alternative computational models, the overlap between control theory and social-cognitive theory explanations, the value of using computational models for resolving theoretical disputes, and future research directions the model inspires. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
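
    As a rough illustration of the control-theoretic reasoning at issue, the toy loop below shows one way high self-efficacy can reduce effort during goal striving: an agent that overestimates its progress perceives a smaller goal discrepancy and allocates fewer resources. This is a hypothetical sketch in the spirit of such models, not Vancouver and Purl's actual implementation; the function strive and all parameters are invented.

    ```python
    # Toy negative-feedback (control-theory) loop: effort tracks the
    # perceived discrepancy between a goal and (possibly inflated)
    # perceived progress. All numbers are illustrative assumptions.
    def strive(self_efficacy: float, goal: float = 100.0, steps: int = 50) -> float:
        progress = 0.0
        total_effort = 0.0
        for _ in range(steps):
            perceived = progress * (1.0 + 0.5 * self_efficacy)  # inflated estimate
            discrepancy = max(goal - perceived, 0.0)
            effort = 0.1 * discrepancy       # allocate effort to the shortfall
            progress += effort               # effort produces real progress
            total_effort += effort
        return total_effort

    for se in (0.0, 0.5, 1.0):
        print(f"self-efficacy={se:.1f}  total effort={strive(se):.1f}")
    ```

    Total allocated effort falls as self-efficacy rises, the negative effect at issue in the debate; the positive effect arises elsewhere in such models, for example through the choice of more ambitious goals.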

  2. How to depolarise the ethical debate over human embryonic stem cell research (and other ethical debates too!)

    NARCIS (Netherlands)

    Espinoza, N.; Peterson, M.B.

    2012-01-01

    The contention of this paper is that the current ethical debate over embryonic stem cell research is polarised to an extent that is not warranted by the underlying ethical conflict. It is argued that the ethical debate can be rendered more nuanced, and less polarised, by introducing non-binary

  3. How to depolarise the ethical debate over human embryonic stem cell research (and other ethical debates too!).

    Science.gov (United States)

    Espinoza, Nicolas; Peterson, Martin

    2012-08-01

    The contention of this paper is that the current ethical debate over embryonic stem cell research is polarised to an extent that is not warranted by the underlying ethical conflict. It is argued that the ethical debate can be rendered more nuanced, and less polarised, by introducing non-binary notions of moral rightness and wrongness. According to the view proposed, embryonic stem cell research--and possibly other controversial activities too--can be considered 'a little bit right and a little bit wrong'. If this idea were to become widely accepted, the ethical debate would, for conceptual reasons, become less polarised.

  4. Thinking and Caring about Indigenous Peoples' Human Rights: Swedish Students Writing History beyond Scholarly Debate

    Science.gov (United States)

    Nygren, Thomas

    2016-01-01

    According to national and international guidelines, schools should promote historical thinking and foster moral values. Scholars have debated, but not analysed in depth in practice, whether history education can and should hold a normative dimension. This study analyses current human rights education in two Swedish senior high school groups, in…

  5. Political Minimalism and Social Debates: The Case of Human-Enhancement Technologies.

    Science.gov (United States)

    Rodríguez-Alcázar, Javier

    2017-09-01

    A faulty understanding of the relationship between morality and politics encumbers many contemporary debates on human enhancement. As a result, some ethical reflections on enhancement undervalue its social dimensions, while some social approaches to the topic lack normative import. In this essay, I use my own conception of the relationship between ethics and politics, which I call "political minimalism," in order to support and strengthen the existing social perspectives on human-enhancement technologies.

  6. Improvement of debate competence: an outcome of an introductory course for medical humanities

    Directory of Open Access Journals (Sweden)

    Kyung Hee Chun

    2016-03-01

    Full Text Available Purpose: Academic debate is an effective method to enhance the competences of critical thinking, problem solving, communication skills and cooperation skills. The present study examined the improvement of debate competence as an outcome of debate-based flipped learning. Methods: A questionnaire was administered to second-year premedical school students at Yeungnam University. In total 45 students participated in the survey. The survey questionnaire was composed of 60 items on eight subfactors of debate competence. To investigate the homogeneity of the low and high achievement groups, 18 items on empathy and 75 items on critical thinking scales were used. To compare the pretest with posttest scores, the data were analyzed using a paired sample t-test. Results: There were no significant differences between the low and high achievement groups by average grade at the beginning of the semester. There was a significant improvement in high achievers on logical argumentation (p<0.001), proficiency in inquiry (p<0.01), active participation (p<0.001), ability to investigate and analyze (p<0.001), observance of debate rules (p<0.05), and acceptability (p<0.05). Even in low achievers, active participation (p<0.05) and ability to investigate and analyze (p<0.01) were significantly improved. Conclusion: The results showed that students could improve their debate competence through debate-based flipped learning. A prospective and comparative study on the communication and teamwork competences needs to be conducted in the future. It is suggested that in-depth discussion of curriculum design and teaching will be needed in terms of the effectiveness and outcomes of the medical humanities.
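
    For readers unfamiliar with the analysis named above, the snippet below shows what a paired sample t-test of pretest versus posttest scores looks like in practice. The score arrays are invented stand-ins, not the study's data.

    ```python
    # Paired sample t-test: each student's posttest score is compared
    # with their own pretest score. Scores here are hypothetical.
    from scipy import stats

    pretest  = [3.1, 2.8, 3.5, 3.0, 2.6, 3.3, 2.9, 3.2]
    posttest = [3.6, 3.1, 3.9, 3.4, 3.0, 3.8, 3.1, 3.7]

    t_stat, p_value = stats.ttest_rel(posttest, pretest)
    print(f"t = {t_stat:.2f}, p = {p_value:.4f}")   # p < 0.05 suggests improvement
    ```

    A paired test fits this design because each student serves as their own control, removing between-student variation from the pre/post comparison.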

  7. Making IBM's Computer, Watson, Human

    Science.gov (United States)

    Rachlin, Howard

    2012-01-01

    This essay uses the recent victory of an IBM computer (Watson) in the TV game, "Jeopardy," to speculate on the abilities Watson would need, in addition to those it has, to be human. The essay's basic premise is that to be human is to behave as humans behave and to function in society as humans function. Alternatives to this premise are considered…

  8. The Artilect Debate

    Science.gov (United States)

    de Garis, Hugo; Halioris, Sam

    Twenty-first-century technologies will allow the creation of massively intelligent machines, many trillions of times as smart, fast, and durable as humans. Issues concerning industrial, consumer, and military applications of mobile autonomous robots, cyborgs, and computer-based AI systems could divisively split humanity into ideological camps regarding whether "artilects" (artificial intellects) should be built or not. The artilect debate, unlike any before it, could dominate the 21st-century political landscape, and has the potential to cause conflict on a global scale. Research is needed to inform policy and individual decisions; and healthy debate should be initiated now to prepare institutions and individuals alike for the impact of AI.

  9. The pros and cons of human therapeutic cloning in the public debate.

    Science.gov (United States)

    Nippert, Irmgard

    2002-09-11

    Few issues linked to genetic research have raised as much controversial debate as the use of somatic cell nuclear transfer technology to create embryos specifically for stem cell research. Whereas European countries unanimously agree that reproductive cloning should be prohibited, there is no agreement to be found on whether or not research into therapeutic cloning should be permitted. Since the UK took the lead and voted in favour of regulations allowing therapeutic cloning, the public debate has intensified on the Continent. This debate reflects the wide spectrum of diverse religious and secular moralities that are prevalent in modern multicultural European democratic societies. Arguments range from strictly utilitarian views that weigh the moral issues involved against the potential benefits that embryonic stem cell research may harbour, to considering the embryo a human being, endowed with human dignity and human rights from the moment of its creation, whose use for research is therefore unethical and should be strictly prohibited. Given the current state of dissension among the various European states, it is difficult to predict whether 'non-harmonisation' will prevail or whether in the long run 'harmonisation' of legislation that will allow stem cell research will evolve in the EU.

  10. Human dignity and the future of the voluntary active euthanasia debate in South Africa.

    Science.gov (United States)

    Jordaan, Donrich W

    2017-04-25

    The issue of voluntary active euthanasia was thrust into the public policy arena by the Stransham-Ford lawsuit. The High Court legalised voluntary active euthanasia - however, ostensibly only in the specific case of Mr Stransham-Ford. The Supreme Court of Appeal overturned the High Court judgment on technical grounds, not on the merits. This means that in future the courts can be approached again to consider the legalisation of voluntary active euthanasia. As such, Stransham-Ford presents a learning opportunity for both sides of the legalisation divide. In particular, conceptual errors pertaining to human dignity were made in Stransham-Ford, and can be avoided in future. In this article, I identify these errors and propose the following three corrective principles to inform future debate on the subject: (i) human dignity is violable; (ii) human suffering violates human dignity; and (iii) the 'natural' causes of suffering due to terminal illness do not exclude the application of human dignity.

  11. Darfur debated

    Directory of Open Access Journals (Sweden)

    Roberta Cohen

    2007-12-01

    Full Text Available Bruising debates within the human rights and humanitarian communities have centered on the numbers who have died in Darfur, the use of the term genocide, the efficacy of military versus political solutions and the extent to which human rights advocacy can undermine humanitarian programmes on the ground.

  12. Minimal mobile human computer interaction

    NARCIS (Netherlands)

    el Ali, A.

    2013-01-01

    In the last 20 years, the widespread adoption of personal, mobile computing devices in everyday life, has allowed entry into a new technological era in Human Computer Interaction (HCI). The constant change of the physical and social context in a user's situation made possible by the portability of

  13. Steve Clarke, Julian Savulescu, Tony Coady, Alberto Giubilini, and Sagar Sanyal: the ethics of human enhancement: understanding the debate

    NARCIS (Netherlands)

    Frank, L.E.

    2017-01-01

    A decade of research on the ethics of human enhancement has produced a vast literature. This collection is an excellent contribution to the field; it fulfills and exceeds the promises of its two subsections: understanding and advancing the debate. Section 1, Understanding the Debate includes eight

  14. Humans, computers and wizards human (simulated) computer interaction

    CERN Document Server

    Fraser, Norman; McGlashan, Scott; Wooffitt, Robin

    2013-01-01

    Using data taken from a major European Union funded project on speech understanding, the SunDial project, this book considers current perspectives on human computer interaction and argues for the value of an approach taken from sociology which is based on conversation analysis.

  15. The misuse of Kant in the debate about a market for human body parts.

    Science.gov (United States)

    Gerrand, N

    1999-01-01

    Passages from the writings of Immanuel Kant concerning how a person should treat her body are often cited in the present-day debate about a market for human body parts. In this paper, I demonstrate that this has been a misuse of Kant because unlike those who cite him, Kant was not primarily concerned with prohibiting the sale of body parts. In the first section, I argue that once these particular passages are understood against the background of Kant's moral philosophy, they indicate he had much broader concerns relating to the correct moral relationship a rational person should have with her body. In the second section, I examine Stephen Munzer's unusually detailed analysis of these passages, but conclude that like those who have provided less detailed analyses, he also fails fully to understand the rationale for Kant's various prescriptions and prohibitions concerning the treatment of human body parts, and in doing so misrepresents Kant's position.

  16. Artificial Intelligence for Human Computing

    NARCIS (Netherlands)

    Huang, Th.S.; Nijholt, Antinus; Pantic, Maja; Pentland, A.

    2007-01-01

    This book constitutes the thoroughly refereed post-proceedings of two events discussing AI for Human Computing: one Special Session during the Eighth International ACM Conference on Multimodal Interfaces (ICMI 2006), held in Banff, Canada, in November 2006, and a Workshop organized in conjunction

  17. Human dignity and the future of the voluntary active euthanasia debate in South Africa

    Directory of Open Access Journals (Sweden)

    Donrich W Jordaan

    2017-05-01

    Full Text Available The issue of voluntary active euthanasia was thrust into the public policy arena by the Stransham-Ford lawsuit. The High Court legalised voluntary active euthanasia – however, ostensibly only in the specific case of Mr Stransham-Ford. The Supreme Court of Appeal overturned the High Court judgment on technical grounds, not on the merits. This means that in future the courts can be approached again to consider the legalisation of voluntary active euthanasia. As such, Stransham-Ford presents a learning opportunity for both sides of the legalisation divide. In particular, conceptual errors pertaining to human dignity were made in Stransham-Ford, and can be avoided in future. In this article, I identify these errors and propose the following three corrective principles to inform future debate on the subject: (i) human dignity is violable; (ii) human suffering violates human dignity; and (iii) the 'natural' causes of suffering due to terminal illness do not exclude the application of human dignity.

  18. Guest Editorial Special Issue on Human Computing

    NARCIS (Netherlands)

    Pantic, Maja; Santos, E.; Pentland, A.; Nijholt, Antinus

    2009-01-01

    The seven articles in this special issue focus on human computing. Most focus on two challenging issues in human computing, namely, machine analysis of human behavior in group interactions and context-sensitive modeling.

  19. Human ear recognition by computer

    CERN Document Server

    Bhanu, Bir; Chen, Hui

    2010-01-01

    Biometrics deals with recognition of individuals based on their physiological or behavioral characteristics. The human ear is a new feature in biometrics that has several merits over the more common face, fingerprint and iris biometrics. Unlike the fingerprint and iris, it can be easily captured from a distance without a fully cooperative subject, although sometimes it may be hidden with hair, scarf and jewellery. Also, unlike a face, the ear is a relatively stable structure that does not change much with age and facial expressions. "Human Ear Recognition by Computer" is the first book o

  20. Human-centered Computing: Toward a Human Revolution

    OpenAIRE

    Jaimes, Alejandro; Gatica-Perez, Daniel; Sebe, Nicu; Huang, Thomas S.

    2007-01-01

    Human-centered computing studies the design, development, and deployment of mixed-initiative human-computer systems. HCC is emerging from the convergence of multiple disciplines that are concerned both with understanding human beings and with the design of computational artifacts.

  1. Improvement of debate competence: an outcome of an introductory course for medical humanities.

    Science.gov (United States)

    Chun, Kyung Hee; Lee, Young Hwan

    2016-03-01

    Academic debate is an effective method to enhance the competences of critical thinking, problem solving, communication skills and cooperation skills. The present study examined the improvement of debate competence as an outcome of debate-based flipped learning. A questionnaire was administered to second-year premedical school students at Yeungnam University. In total 45 students participated in the survey. The survey questionnaire was composed of 60 items on eight subfactors of debate competence. To investigate the homogeneity of the low and high achievement groups, 18 items on empathy and 75 items on critical thinking scales were used. To compare the pretest with posttest scores, the data were analyzed using a paired sample t-test. There were no significant differences between the low and high achievement groups by average grade at the beginning of the semester. There was a significant improvement in high achievers on logical argumentation (p<0.001), proficiency in inquiry (p<0.01), active participation (p<0.001), ability to investigate and analyze (p<0.001), observance of debate rules (p<0.05), and acceptability (p<0.05). Even in low achievers, active participation (p<0.05) and ability to investigate and analyze (p<0.01) were significantly improved. The results showed that students could improve their debate competence through debate-based flipped learning, and suggest that in-depth discussion of curriculum design and teaching is needed in terms of the effectiveness and outcomes of the medical humanities.

  2. Manual Labour, Intellectual Labour and Digital (Academic) Labour. The Practice/Theory Debate in the Digital Humanities

    Directory of Open Access Journals (Sweden)

    Christophe Magis

    2018-01-01

    Full Text Available Although it hasn't much been considered as such, the Digital Humanities movement (or at least its most theoretically informed parts) offers a critique "from within" of the recent mutation of the higher education and research systems. This paper offers an analysis, from a Critical Theory perspective, of a key element of this critique: the theory vs. practice debate, which, in the Digital Humanities, is translated into the famous "hack" versus "yack" motto, where DHers usually call for the pre-eminence of the former over the latter. I show how this debate aims to criticize the social situation of employment in academia in the digital age and can further be interpreted, through the theoretical concept of the culture industry, as a continuation of the domination of intellectual labour (i.e. yack, in this case) over manual labour (hack). Nevertheless, I argue that, pushing this debate to its dialectical limit in the post-industrial academic labour situation, one realizes that the two terms are no longer in opposition: in academic labour, actual theory as well as actual practice fall short of their critical concepts. Therefore, I call for a reconfiguration of this debate, aimed at rediscovering an actual theory in academic production, as well as rediscovering a praxis, the latter lying outside the scientific realm and its rules: it is political.

  3. Human action in a Genomic Era: debates on human nature

    Directory of Open Access Journals (Sweden)

    Tatiana Gomes Rotondaro

    2009-03-01

    Full Text Available The supposed properties of 'genes' have led natural scientists to claim authority to explain the reasons for human action, behavior, and even human nature, which has traditionally been the object of study of the humanities. The aim of this paper is to discuss the possibilities of sociological theory dealing with the biological reductionism that establishes a strict articulation between 'human nature' and 'human action', which is present in several speeches and papers by scientists and journalists and is supported by supposed features of 'genes'. I argue that sociological theories may broaden their scope of analysis by encompassing biological dimensions, which does not necessarily mean adopting a biologically reductionist approach.

  4. Cooperation in human-computer communication

    OpenAIRE

    Kronenberg, Susanne

    2000-01-01

    The goal of this thesis is to simulate cooperation in human-computer communication to model the communicative interaction process of agents in natural dialogs in order to provide advanced human-computer interaction in that coherence is maintained between contributions of both agents, i.e. the human user and the computer. This thesis contributes to certain aspects of understanding and generation and their interaction in the German language. In spontaneous dialogs agents cooperate by the pro...

  5. The Slippery Slope Argument in the Ethical Debate on Genetic Engineering of Humans.

    Science.gov (United States)

    Walton, Douglas

    2017-12-01

    This article applies tools from argumentation theory to slippery slope arguments used in current ethical debates on genetic engineering. Among the tools used are argumentation schemes, value-based argumentation, critical questions, and burden of proof. It is argued that so-called drivers such as social acceptance and rapid technological development are also important factors that need to be taken into account alongside the argumentation scheme. It is shown that the slippery slope argument is basically a reasonable (but defeasible) form of argument, but is often flawed when used in ethical debates because of failures to meet the requirements of its scheme.

  6. Human Computing and Machine Understanding of Human Behavior: A Survey

    NARCIS (Netherlands)

    Pantic, Maja; Pentland, Alex; Nijholt, Antinus; Huang, Thomas; Quek, F.; Yang, Yie

    2006-01-01

    A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. If this prediction is to come true, then next generation computing, which we will call human computing, should

  7. Debate: Forced Labour, Slavery and Human Trafficking: When do definitions matter?

    Directory of Open Access Journals (Sweden)

    Roger Plant

    2015-09-01

    Full Text Available We can spend a lot of time debating the connections or essential differences between the concepts of trafficking, forced labour, slavery and modern slavery, or slavery-like practices. Some insist that trafficking is a subset of forced labour, others the reverse. The arguments between academics, bureaucracies and even government agencies have often been vitriolic.

  8. Language evolution and human-computer interaction

    Science.gov (United States)

    Grudin, Jonathan; Norman, Donald A.

    1991-01-01

    Many of the issues that confront designers of interactive computer systems also appear in natural language evolution. Natural languages and human-computer interfaces share as their primary mission the support of extended 'dialogues' between responsive entities. Because in each case one participant is a human being, some of the pressures operating on natural languages, causing them to evolve in order to better support such dialogue, also operate on human-computer 'languages' or interfaces. This does not necessarily push interfaces in the direction of natural language - since one entity in this dialogue is not a human, this is not to be expected. Nonetheless, by discerning where the pressures that guide natural language evolution also appear in human-computer interaction, we can contribute to the design of computer systems and obtain a new perspective on natural languages.

  9. Occupational stress in human computer interaction.

    Science.gov (United States)

    Smith, M J; Conway, F T; Karsh, B T

    1999-04-01

    There have been a variety of research approaches that have examined the stress issues related to human computer interaction including laboratory studies, cross-sectional surveys, longitudinal case studies and intervention studies. A critical review of these studies indicates that there are important physiological, biochemical, somatic and psychological indicators of stress that are related to work activities where human computer interaction occurs. Many of the stressors of human computer interaction at work are similar to those stressors that have historically been observed in other automated jobs. These include high workload, high work pressure, diminished job control, inadequate employee training to use new technology, monotonous tasks, poor supervisory relations, and fear for job security. New stressors have emerged that can be tied primarily to human computer interaction. These include technology breakdowns, technology slowdowns, and electronic performance monitoring. The effects of the stress of human computer interaction in the workplace are increased physiological arousal; somatic complaints, especially of the musculoskeletal system; mood disturbances, particularly anxiety, fear and anger; and diminished quality of working life, such as reduced job satisfaction. Interventions to reduce the stress of computer technology have included improved technology implementation approaches and increased employee participation in implementation. Recommendations for ways to reduce the stress of human computer interaction at work are presented. These include proper ergonomic conditions, increased organizational support, improved job content, proper workload to decrease work pressure, and enhanced opportunities for social support. A model approach to the design of human computer interaction at work that focuses on the system "balance" is proposed.

  10. Human Adaptation to the Computer.

    Science.gov (United States)

    1986-09-01

    8217"’ TECHNOSTRESS " 5 5’..,:. VI I. CONCLUSIONS-------------------------59 -- LIST OF REFERENCES-------------------------61 BI BLI OGRAPHY...computer has not developed. Instead, what has developed is a "modern disease of adaptation" called " technostress ," a phrase coined by Brod. Craig...34 technostress ." Managers (according to Brod) have been implementing computers in ways that contribute directly to this stress: [Ref. 3:p. 38) 1. They

  11. Challenges for Virtual Humans in Human Computing

    NARCIS (Netherlands)

    Reidsma, Dennis; Ruttkay, Z.M.; Huang, T; Nijholt, Antinus; Pantic, Maja; Pentland, A.

    The vision of Ambient Intelligence (AmI) presumes a plethora of embedded services and devices that all endeavor to support humans in their daily activities as unobtrusively as possible. Hardware gets distributed throughout the environment, occupying even the fabric of our clothing. The environment

  12. Issues around radiological protection of the environment and its integration with protection of humans: promoting debate on the way forward

    International Nuclear Information System (INIS)

    Brownless, G P

    2007-01-01

    This paper explores issues to consider around integrating direct, explicit protection of the environment into the current system of radiological protection, which is focused on the protection of humans. Many issues around environmental radiological protection have been discussed, and ready-to-use toolboxes have been constructed for assessing harm to non-human biota, but it is not clear how (or even if) these should be fitted into the current system of protection. Starting from the position that the current approach to protecting the environment (namely, that it follows from adequately protecting humans) is generally effective, this paper considers how explicit radiological protection of the environment can be integrated with the current system, by developing a 'worked example' of how this could be done and highlighting issues peculiar to protection of the environment. The aim of the paper is to promote debate on this topic, with the ultimate aim of ensuring that any changes to the system are consensual and robust.

  13. Language and values in the human cloning debate: a web-based survey of scientists and Christian fundamentalist pastors.

    Science.gov (United States)

    Weasel, Lisa H; Jensen, Eric

    2005-04-01

    Over the last seven years, a major debate has arisen over whether human cloning should remain legal in the United States. Given that this may be the 'first real global and simultaneous news story on biotechnology' (Einsiedel et al., 2002, p.313), nations around the world have struggled with the implications of this newly viable scientific technology, which is often also referred to as somatic cell nuclear transfer. Since the successful cloning of Dolly the sheep in 1997, and with increasing media attention paid to the likelihood of a successful human reproductive clone coupled with research suggesting the medical potential of therapeutic cloning in humans, members of the scientific community and Christian fundamentalist leaders have become increasingly vocal in the debate over U.S. policy decisions regarding human cloning (Wilmut, 2000). Yet despite a surfeit of public opinion polls and widespread opining in the news media on the topic of human cloning, there have been no empirical studies comparing the views of scientists and Christian fundamentalists in this debate (see Evans, 2002a for a recent study of opinion polls assessing religion and attitudes toward cloning). In order to further investigate the values that underlie scientists' and Christian fundamentalist leaders' understanding of human cloning, as well as their differential use of language in communicating about this issue, we conducted an open-ended, exploratory survey of practicing scientists in the field of molecular biology and Christian fundamentalist pastors. We then analyzed the responses from this survey using qualitative discourse analysis. While this was not necessarily a representative sample (in quantitative terms, see Gaskell & Bauer, 2000) of each of the groups and the response rate was limited, this approach was informative in identifying both commonalities between the two groups, such as a focus on ethical concerns about reproductive cloning and the use of scientific terminology, as well

  14. Object categorization: computer and human vision perspectives

    National Research Council Canada - National Science Library

    Dickinson, Sven J

    2009-01-01

    .... The result of a series of four highly successful workshops on the topic, the book gathers many of the most distinguished researchers from both computer and human vision to reflect on their experience...

  15. Human law and computer law comparative perspectives

    CERN Document Server

    Hildebrandt, Mireille

    2014-01-01

    This book probes the epistemological and hermeneutic implications of data science and artificial intelligence for democracy and the Rule of Law, and the challenges posed by computing technologies to traditional legal thinking and the regulation of human affairs.

  16. The bereavement gap: grief, human dignity and legal personhood in the debate over Zoe's law.

    Science.gov (United States)

    Robert, Hannah

    2014-12-01

    A Bill before the New South Wales Parliament attempted to re-frame harm to late-term fetuses as grievous bodily harm to the fetus itself rather than (under the existing law) grievous bodily harm to the mother. To achieve this, the Bill extended legal personhood to the fetus for a limited number of offences. The Bill was brought on behalf of Brodie Donegan, who lost her daughter Zoe at 32 weeks' gestation when Donegan was hit by a drug-affected driver. This article asks what the perspective of a grieving mother can bring to the debate, in terms of helping the criminal law accurately come to grips with the complexity of pregnancy and the specific harm of fetal loss. It assesses the likely impacts of a change to fetal personhood and suggests an alternative legislative approach which is less likely to result in an erosion of bodily autonomy for pregnant women.

  17. Contemporary debates on social-environmental conflicts, extractivism and human rights in Latin America

    DEFF Research Database (Denmark)

    Raftopoulos, Malayna

    2017-01-01

    This opening contribution to 'Social-Environmental Conflicts, Extractivism and Human Rights' analyses how human rights have emerged as a weapon in the political battleground over the environment as natural resource extraction has become an increasingly contested and politicised form of development. It examines the link between human rights abuses and extractivism, arguing that this new cycle of protests has opened up new political spaces for human rights based resistance. Furthermore, the explosion of socio-environmental conflicts that have accompanied the expansion and politicisation of natural resources has highlighted the different conceptualisations of nature, development and human rights that exist within Latin America. While new human rights perspectives are emerging in the region, mainstream human rights discourses are providing social movements and activists with the legal power

  18. Fundamentals of human-computer interaction

    CERN Document Server

    Monk, Andrew F

    1985-01-01

    Fundamentals of Human-Computer Interaction aims to sensitize the systems designer to the problems faced by the user of an interactive system. The book grew out of a course entitled "The User Interface: Human Factors for Computer-based Systems" which has been run annually at the University of York since 1981. This course has been attended primarily by systems managers from the computer industry. The book is organized into three parts. Part One focuses on the user as processor of information with studies on visual perception; extracting information from printed and electronically presented

  19. Modeling multimodal human-computer interaction

    NARCIS (Netherlands)

    Obrenovic, Z.; Starcevic, D.

    2004-01-01

    Incorporating the well-known Unified Modeling Language into a generic modeling framework makes research on multimodal human-computer interaction accessible to a wide range of software engineers. Multimodal interaction is part of everyday human discourse: We speak, move, gesture, and shift our gaze

  20. From Human Nature to Moral Judgments : Reframing Debates about Disability and Enhancement

    NARCIS (Netherlands)

    Harnacke, C.E.

    2015-01-01

    My goal in my dissertation is to develop an account of how a theory of human nature should be integrated into bioethics and to show what bioethics can gain from using this account. I explore the relevance of human nature for moral argumentation, and especially for bioethics. Thereby, I focus on

  1. Reproductive cloning in humans and therapeutic cloning in primates: is the ethical debate catching up with the recent scientific advances?

    Science.gov (United States)

    Camporesi, S; Bortolotti, L

    2008-09-01

    After years of failure, in November 2007 primate embryonic stem cells were derived by somatic cell nuclear transfer, also known as therapeutic cloning. The first embryo transfer for human reproductive cloning purposes was also attempted in 2006, albeit with negative results. These two events force us to think carefully about the possibility of human cloning, which is now much closer to becoming a reality. In this paper we tackle this issue from two sides, first summarising what scientists have achieved so far, then discussing some of the ethical arguments for and against human cloning that are debated in the context of policy making and public consultation. Therapeutic cloning as a means to improve and save lives has uncontroversial moral value. As to human reproductive cloning, we consider and assess some common objections and fail to see them as conclusive. We do recognise, though, that there will be problems at the level of policy and regulation that might either impair the implementation of human reproductive cloning or make its accessibility restricted in a way that could become difficult to justify on moral grounds. We suggest using the time still available before human reproductive cloning is attempted successfully to create policies and institutions that can offer clear directives on its legitimate applications on the basis of solid arguments, coherent moral principles, and extensive public consultation.

  2. Human computing and machine understanding of human behavior: A survey

    NARCIS (Netherlands)

    Pentland, Alex; Huang, Thomas S.; Nijholt, Antinus; Pantic, Maja

    2007-01-01

    A widely accepted prediction is that computing will move to the background, weaving itself into the fabric of our everyday living spaces and projecting the human user into the foreground. If this prediction is to come true, then next generation computing should be about anticipatory user interfaces

  3. An analysis of the Human Papilloma Virus vaccine debate on MySpace blogs.

    Science.gov (United States)

    Keelan, Jennifer; Pavri, Vera; Balakrishnan, Ravin; Wilson, Kumanan

    2010-02-10

    The roll out of HPV immunization programs across the United States was hindered by controversy. We tracked the debate in the United States through MySpace, then the most popular social networking site, in order to better understand the public's reaction to the vaccine. We searched MySpace for all blog discourse related to HPV immunization. We analyzed each blog according to its overall portrayal of HPV immunization, identified the characteristics of the bloggers, and developed a content analysis to categorize the types of supporting arguments made. 303 blogs met our inclusion criteria. 157 (52%) of the blogs were classified as positive, 129 (43%) as negative, and 17 (6%) were ambivalent toward HPV immunization. Positive blogs generally argued that HPV immunization was effective and that there were no reasonable alternatives to immunizing. Negative blogs focused on the risks of immunizing and relied heavily on vaccine-critical publications to support their viewpoint. Of the blogs where gender could be identified, 75 (25%) were posted by men and 214 (71%) by women. 60% of blogs posted by men were explicitly critical of HPV immunization versus 36% of women's blogs. Male bloggers also had larger networks of friends. We describe a novel and promising approach to the surveillance of public opinions and attitudes toward immunization. In our analysis, men were far more likely than women to hold negative views about HPV immunization and to disseminate negative messages through larger social networks. Blog analysis is a useful tool for public health officials to profile vaccine criticism and to design appropriate educational information tailored to respond to alternative media/alternative information actively disseminated via social media tools. Public health officials should examine mechanisms by which to leverage this media to better communicate their message through existing networks and to engage in on-going dialogue with the public. Copyright (c) 2009 Elsevier Ltd. All rights

  4. Pilots of the future - Human or computer?

    Science.gov (United States)

    Chambers, A. B.; Nagel, D. C.

    1985-01-01

    In connection with the occurrence of aircraft accidents and the evolution of the air-travel system, questions arise regarding the computer's potential for making fundamental contributions to improving the safety and reliability of air travel. An important result of an analysis of the causes of aircraft accidents is the conclusion that humans - 'pilots and other personnel' - are implicated in well over half of the accidents which occur. Over 70 percent of the incident reports contain evidence of human error. In addition, almost 75 percent show evidence of an 'information-transfer' problem. Thus, the question arises whether improvements in air safety could be achieved by removing humans from control situations. In an attempt to answer this question, it is important also to take into account certain advantages that humans have over computers. Attention is given to human error and the effects of technology, the motivation to automate, aircraft automation at the crossroads, the evolution of cockpit automation, and pilot factors.

  5. Upper Pleistocene Human Dispersals out of Africa: A Review of the Current State of the Debate

    Science.gov (United States)

    Beyin, Amanuel

    2011-01-01

    Although there is a general consensus on the African origin of early modern humans, there is disagreement about how and when they dispersed to Eurasia. This paper reviews genetic and Middle Stone Age/Middle Paleolithic archaeological literature from northeast Africa, Arabia, and the Levant to assess the timing and geographic backgrounds of Upper Pleistocene human colonization of Eurasia. At the center of the discussion lies the question of whether eastern Africa alone was the source of Upper Pleistocene human dispersals into Eurasia or whether there were other loci of human expansion outside of Africa. The reviewed literature hints at two modes of early modern human colonization of Eurasia in the Upper Pleistocene: (i) from multiple Homo sapiens source populations that had entered Arabia, South Asia, and the Levant prior to and soon after the onset of the Last Interglacial (MIS-5), (ii) from a rapid dispersal out of East Africa via the Southern Route (across the Red Sea basin), dating to ~74–60 kya. PMID:21716744

  6. Parallel structures in human and computer memory

    Science.gov (United States)

    Kanerva, Pentti

    1986-08-01

    If we think of our experiences as being recorded continuously on film, then human memory can be compared to a film library that is indexed by the contents of the film strips stored in it. Moreover, approximate retrieval cues suffice to retrieve information stored in this library: We recognize a familiar person in a fuzzy photograph or a familiar tune played on a strange instrument. This paper is about how to construct a computer memory that would allow a computer to recognize patterns and to recall sequences the way humans do. Such a memory is remarkably similar in structure to a conventional computer memory and also to the neural circuits in the cortex of the cerebellum of the human brain. The paper concludes that the frame problem of artificial intelligence could be solved by the use of such a memory if we were able to encode information about the world properly.
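
    The memory the paper describes is Kanerva's sparse distributed memory; a minimal NumPy sketch of its read/write mechanics follows (the dimensions, activation radius, and stored pattern are illustrative choices, not the paper's parameters):

        import numpy as np

        rng = np.random.default_rng(0)
        N, M, R = 256, 2000, 111   # word size (bits), hard locations, activation radius

        hard_addresses = rng.integers(0, 2, size=(M, N))  # fixed random addresses
        counters = np.zeros((M, N), dtype=int)            # storage counters

        def active(address):
            # locations whose hard address lies within Hamming distance R of the cue
            return np.count_nonzero(hard_addresses != address, axis=1) <= R

        def write(address, data):
            # increment counters for 1-bits, decrement for 0-bits
            counters[active(address)] += np.where(data == 1, 1, -1)

        def read(address):
            # majority vote over the counters of the activated locations
            sums = counters[active(address)].sum(axis=0)
            return (sums > 0).astype(int)

        # Store a pattern, then retrieve it from an approximate (noisy) cue
        pattern = rng.integers(0, 2, size=N)
        write(pattern, pattern)
        cue = pattern.copy()
        cue[rng.choice(N, size=20, replace=False)] ^= 1   # corrupt 20 of 256 bits
        print(np.count_nonzero(read(cue) != pattern))     # ideally 0: exact recall

    The approximate-cue retrieval is the point: the noisy address still activates mostly the same hard locations, so the vote over counters reconstructs the original word, much as a fuzzy photograph suffices to recognize a familiar face.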

  7. The Contribution of the Human Development Index Literacy Theory to the Debate on Literacy and Development

    Science.gov (United States)

    Biao, Idowu; Mogotsi, Kebadire; Maruatona, Tonic; Raditloaneng, Wapula; Tladi, Flora; Chawawa, Morgan; Kheru, Obakeng

    2014-01-01

    The Human Development Index Literacy (HDIL) theory was developed in 2011 to eliminate or minimise the negative impact of issues underlying the failure of previous literacy programmes in promoting socio-economic development. This theory was tested for the first time between July 2013 and February 2014 in two rural communities of Botswana. A…

  8. Democratizing Human Genome Project Information: A Model Program for Education, Information and Debate in Public Libraries.

    Science.gov (United States)

    Pollack, Miriam

    The "Mapping the Human Genome" project demonstrated that librarians can help whomever they serve in accessing information resources in the areas of biological and health information, whether it is the scientists who are developing the information or a member of the public who is using the information. Public libraries can guide library…

  9. Applying Human Computation Methods to Information Science

    Science.gov (United States)

    Harris, Christopher Glenn

    2013-01-01

    Human Computation methods such as crowdsourcing and games with a purpose (GWAP) have each recently drawn considerable attention for their ability to synergize the strengths of people and technology to accomplish tasks that are challenging for either to do well alone. Despite this increased attention, much of this transformation has been focused on…

  10. Feedback Loops in Communication and Human Computing

    NARCIS (Netherlands)

    op den Akker, Hendrikus J.A.; Heylen, Dirk K.J.; Pantic, Maja; Pentland, Alex; Nijholt, Antinus; Huang, Thomas S.

    Building systems that are able to analyse communicative behaviours or take part in conversations requires a sound methodology in which the complex organisation of conversations is understood and tested on real-life samples. The data-driven approaches to human computing not only have a value for the

  11. Human Memory Organization for Computer Programs.

    Science.gov (United States)

    Norcio, A. F.; Kerst, Stephen M.

    1983-01-01

    Results of study investigating human memory organization in processing of computer programming languages indicate that algorithmic logic segments form a cognitive organizational structure in memory for programs. Statement indentation and internal program documentation did not enhance organizational process of recall of statements in five Fortran…

  12. The Debate.

    Science.gov (United States)

    Current Issues in Language and Society, 1997

    1997-01-01

    The transcript of a debate within a group of specialists in translation is presented. The discussion addresses: translator "visibility" in translations and reader reception; the relationship of functionalism in translation, comparative linguistics, and intercultural communication; the client's power; literary translation; the…

  13. Computational Complexity and Human Decision-Making.

    Science.gov (United States)

    Bossaerts, Peter; Murawski, Carsten

    2017-12-01

    The rationality principle postulates that decision-makers always choose the best action available to them. It underlies most modern theories of decision-making. The principle does not take into account the difficulty of finding the best option. Here, we propose that computational complexity theory (CCT) provides a framework for defining and quantifying the difficulty of decisions. We review evidence showing that human decision-making is affected by computational complexity. Building on this evidence, we argue that most models of decision-making, and metacognition, are intractable from a computational perspective. To be plausible, future theories of decision-making will need to take into account both the resources required for implementing the computations implied by the theory, and the resource constraints imposed on the decision-maker by biology. Copyright © 2017 Elsevier Ltd. All rights reserved.
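
    A standard way to make this point concrete is the 0/1 knapsack problem: the 'best option' is well defined, yet exhaustive search over bundles grows as 2^n (a brute-force sketch; the instance is hypothetical):

        from itertools import combinations

        def best_bundle(values, weights, capacity):
            """Exhaustive 0/1 knapsack: scanning all 2**n bundles is what the
            rationality principle implicitly demands of the decision-maker."""
            n, best = len(values), (0, ())
            for r in range(n + 1):
                for idx in combinations(range(n), r):
                    if sum(weights[i] for i in idx) <= capacity:
                        value = sum(values[i] for i in idx)
                        best = max(best, (value, idx))
            return best

        # Four options are easy; forty would already need ~10**12 evaluations.
        print(best_bundle(values=[6, 5, 4, 3], weights=[4, 3, 2, 1], capacity=6))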

  14. Introduction to human-computer interaction

    CERN Document Server

    Booth, Paul

    2014-01-01

    Originally published in 1989 this title provided a comprehensive and authoritative introduction to the burgeoning discipline of human-computer interaction for students, academics, and those from industry who wished to know more about the subject. Assuming very little knowledge, the book provides an overview of the diverse research areas that were at the time only gradually building into a coherent and well-structured field. It aims to explain the underlying causes of the cognitive, social and organizational problems typically encountered when computer systems are introduced. It is clear and co

  15. Steve Clarke, Julian Savulescu, C. A. J. Coady, Alberto Giubilini, and Sagar Sanyal (eds.), The Ethics of Human Enhancement: Understanding the Debate, Oxford University Press, 2016, 269pp., $74.00 (hbk), ISBN 9780198754855.

    NARCIS (Netherlands)

    Nyholm, S.R.

    2017-01-01

    The Ethics of Human Enhancement: Understanding the Debate has two chief aims. These aims are to help readers understand the existing debate and to move the debate forward. The book consists of an introductory chapter by Alberto Giubilini and Sagar Sanyal (which lays out some prominent

  16. Proxemics in Human-Computer Interaction

    OpenAIRE

    Greenberg, Saul; Honbaek, Kasper; Quigley, Aaron; Reiterer, Harald; Rädle, Roman

    2014-01-01

    In 1966, anthropologist Edward Hall coined the term "proxemics." Proxemics is an area of study that identifies the culturally dependent ways in which people use interpersonal distance to understand and mediate their interactions with others. Recent research has demonstrated the use of proxemics in human-computer interaction (HCI) for supporting users' explicit and implicit interactions in a range of uses, including remote office collaboration, home entertainment, and games. One promise of pro...

  17. Human-Computer Interaction in Smart Environments

    Science.gov (United States)

    Paravati, Gianluca; Gatteschi, Valentina

    2015-01-01

    Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.

  18. Debates atuais em humanização e saúde: quem somos nós? Current debates on humanization and health: who are we?

    Directory of Open Access Journals (Sweden)

    Rosângela Minardi Mitre Cotta

    2013-01-01

    Full Text Available INTRODUCTION: Considering that the Brazilian Unified Health System (SUS) is a social process under construction, and that health professionals are important actors in this process, permanent education stands out as a relevant instrument for guaranteeing humanized care. OBJECTIVE: To discuss the experience of a training course for the health professionals of a public outpatient health unit, based on the perspective of humanization and aimed at implementing a health care model committed to the essential values embodied in the ideals of the SUS. METHODS: The teaching-learning methodology was based on problematization, working through a problem situation drawn from the instructors' experience. RESULTS: The professionals identified that the established pattern of thinking and acting in health care is insufficient to meet the challenges faced by the sector. The strategies used helped to systematize the content through reflection on the theoretical frameworks presented, stimulating reflective and critical thinking, aspects that are fundamental to broadening and deepening the professionals' empowerment. CONCLUSION: The course fostered group cohesion, placing the discussion of the humanization of health care actions on the agenda.

  19. Human-computer interaction : Guidelines for web animation

    OpenAIRE

    Galyani Moghaddam, Golnessa; Moballeghi, Mostafa

    2006-01-01

    Human-computer interaction in the large is an interdisciplinary area which attracts researchers, educators, and practitioners from many different fields. Human-computer interaction studies a human and a machine in communication; it draws from supporting knowledge on both the machine and the human side. This paper is related to the human side of human-computer interaction and focuses on animations. The growing use of animation in Web pages testifies to the increasing ease with which such multim...

  20. Brain-Computer Interfaces Revolutionizing Human-Computer Interaction

    CERN Document Server

    Graimann, Bernhard; Allison, Brendan

    2010-01-01

    A brain-computer interface (BCI) establishes a direct output channel between the human brain and external devices. BCIs infer user intent via direct measures of brain activity and thus enable communication and control without movement. This book, authored by experts in the field, provides an accessible introduction to the neurophysiological and signal-processing background required for BCI, presents state-of-the-art non-invasive and invasive approaches, gives an overview of current hardware and software solutions, and reviews the most interesting as well as new, emerging BCI applications. The book is intended not only for students and young researchers, but also for newcomers and other readers from diverse backgrounds keen to learn about this vital scientific endeavour.

  1. Human-Computer Interaction in Smart Environments

    Directory of Open Access Journals (Sweden)

    Gianluca Paravati

    2015-08-01

    Full Text Available Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.

  2. Human-Computer Interaction The Agency Perspective

    CERN Document Server

    Oliveira, José

    2012-01-01

    Agent-centric theories, approaches and technologies are helping to enrich interactions between users and computers. This book aims at highlighting the influence of the agency perspective in Human-Computer Interaction through a careful selection of research contributions. Split into five sections (Users as Agents, Agents and Accessibility, Agents and Interactions, Agent-centric Paradigms and Approaches, and Collective Agents), the book covers a wealth of novel, original and fully updated material, offering: coherent, in-depth and timely material on the agency perspective in HCI; an authoritative treatment of the subject matter presented by carefully selected authors; balanced and broad coverage of the subject area, including human, organizational and social as well as technological concerns; and hands-on experience through representative case studies and essential design guidelines. The book will appeal to a broad audience of resea...

  3. Measuring Multimodal Synchrony for Human-Computer Interaction

    NARCIS (Netherlands)

    Reidsma, Dennis; Nijholt, Antinus; Tschacher, Wolfgang; Ramseyer, Fabian; Sourin, A.

    2010-01-01

    Nonverbal synchrony is an important and natural element in human-human interaction. It can also play various roles in human-computer interaction. In particular this is the case in the interaction between humans and the virtual humans that inhabit our cyberworlds. Virtual humans need to adapt their

  4. Human computer interaction using hand gestures

    CERN Document Server

    Premaratne, Prashan

    2014-01-01

    Human computer interaction (HCI) plays a vital role in bridging the 'Digital Divide', bringing people closer to consumer electronics control in the 'lounge'. Keyboards, mice and remotes alienate old and new generations alike from control interfaces. Hand gesture recognition systems bring hope of connecting people with machines in a natural way. This will lead to consumers being able to use their hands naturally to communicate with any electronic equipment in their 'lounge.' This monograph will cover state-of-the-art hand gesture recognition approaches and how they evolved from their inception. The author also details his research in this area over the past 8 years and how the future might turn out to be using HCI. This monograph will serve as a valuable guide for researchers who venture into the world of HCI.

  5. Human Computation An Integrated Approach to Learning from the Crowd

    CERN Document Server

    Law, Edith

    2011-01-01

    Human computation is a new and evolving research area that centers around harnessing human intelligence to solve computational problems that are beyond the scope of existing Artificial Intelligence (AI) algorithms. With the growth of the Web, human computation systems can now leverage the abilities of an unprecedented number of people via the Web to perform complex computation. There are various genres of human computation applications that exist today. Games with a purpose (e.g., the ESP Game) specifically target online gamers who generate useful data (e.g., image tags) while playing an enjoy

  6. Advanced Technologies, Embedded and Multimedia for Human-Centric Computing

    CERN Document Server

    Chao, Han-Chieh; Deng, Der-Jiunn; Park, James; HumanCom and EMC 2013

    2014-01-01

    The theme of HumanCom is focused on the various aspects of human-centric computing for advances in computer science and its applications, embedded and multimedia computing, and provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of human-centric computing. The theme of EMC (Advances in Embedded and Multimedia Computing) is focused on the various aspects of embedded systems, smart grids, cloud and multimedia computing, and provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of embedded and multimedia computing. This book therefore includes the various theories and practical applications in human-centric computing and embedded and multimedia computing.

  7. The epistemology and ontology of human-computer interaction

    NARCIS (Netherlands)

    Brey, Philip A.E.

    2005-01-01

    This paper analyzes epistemological and ontological dimensions of Human-Computer Interaction (HCI) through an analysis of the functions of computer systems in relation to their users. It is argued that the primary relation between humans and computer systems has historically been epistemic:

  8. Global Warming: A Review of the Debates on the Causes ...

    African Journals Online (AJOL)


    Herath (2011), the debates are human versus natural, small amount of warming versus ... computer model simulations and supported by Kyoto Protocol since it is without scientific ... priorities, the Kyoto Protocol was a battleground between businesses and ... as OPEC (Organization of Petroleum Exporting Countries).

  9. Affective Learning and the Classroom Debate

    Science.gov (United States)

    Jagger, Suzy

    2013-01-01

    A commonly used teaching method to promote student engagement is the classroom debate. This study evaluates how affective characteristics, as defined in Bloom's taxonomy, were stimulated during debates that took place on a professional ethics module for first year computing undergraduates. The debates led to lively interactive group discussions…

  10. 2012 International Conference on Human-centric Computing

    CERN Document Server

    Jin, Qun; Yeo, Martin; Hu, Bin; Human Centric Technology and Service in Smart Space, HumanCom 2012

    2012-01-01

    The theme of HumanCom is focused on the various aspects of human-centric computing for advances in computer science and its applications and provides an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of human-centric computing. In addition, the conference will publish high quality papers which are closely related to the various theories and practical applications in human-centric computing. Furthermore, we expect that the conference and its publications will be a trigger for further related research and technology improvements in this important subject.

  11. Computational Analysis of Human Blood Flow

    Science.gov (United States)

    Panta, Yogendra; Marie, Hazel; Harvey, Mark

    2009-11-01

    Fluid flow modeling with commercially available computational fluid dynamics (CFD) software is widely used to visualize and predict physical phenomena related to various biological systems. In this presentation, a typical human aorta model was analyzed, treating the blood flow as laminar with compliant cardiac muscle wall boundaries. FLUENT, a commercially available finite volume software, coupled with Solidworks, a modeling software, was employed for the preprocessing, simulation and postprocessing of all the models. The analysis mainly consists of a fluid-dynamics analysis, including a calculation of the velocity field and pressure distribution in the blood, and a mechanical analysis of the deformation of the tissue and artery in terms of wall shear stress. A number of other models, e.g. T-branched and angle-shaped vessels, were previously analyzed and their results compared for consistency under similar boundary conditions. The velocities, pressures and wall shear stress distributions achieved in all models were as expected given the similar boundary conditions. A three-dimensional, time-dependent analysis of blood flow accounting for the effect of body forces with a compliant boundary was also performed.
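
    While the full simulations require a CFD package, the order of magnitude of the laminar wall shear stress can be sanity-checked by hand with the Hagen-Poiseuille relation (a back-of-envelope sketch; the aortic values are illustrative, and pulsatile flow in a compliant vessel will differ):

        import math

        mu = 3.5e-3   # dynamic viscosity of blood, Pa*s (illustrative)
        Q = 8.3e-5    # volumetric flow rate, m^3/s (about 5 L/min cardiac output)
        R = 1.25e-2   # vessel radius, m (typical adult aorta)

        # Steady laminar pipe flow: tau_wall = 4*mu*Q / (pi*R^3)
        tau_wall = 4 * mu * Q / (math.pi * R**3)
        print(f"wall shear stress ~ {tau_wall:.2f} Pa")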

  12. Human-Computer Interaction and Information Management Research Needs

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — In a visionary future, Human-Computer Interaction HCI and Information Management IM have the potential to enable humans to better manage their lives through the use...

  13. Changing Human-Animal Relationships in Sport: An Analysis of the UK and Australian Horse Racing Whips Debates

    Directory of Open Access Journals (Sweden)

    Raewyn Graham

    2016-05-01

    Full Text Available Changing social values and new technologies have contributed to increasing media attention and debate about the acceptable use of animals in sport. This paper focuses on the use of the whip in thoroughbred horse racing. Those who defend its use argue it is a tool necessary for safety, correction and encouragement, and that it does not cause the horse any pain. For those who oppose its use, it is an instrument of cruelty. Media framing is employed to unpack the discourses played out in print and social media in the UK (2011) and Australia (2009) during key periods of the whip debate following the introduction of new whip rules. Media coverage for the period August 2014–August 2015 for both countries is also considered. This paper seeks to identify the perceptions of advocates and opponents of the whip as portrayed in conventional and social media in Australia and the UK, to consider whether these perceptions have changed over time, and to ask whose voices are heard on these platforms. It contributes to discussions of the impact that media sites have in either reinforcing existing perspectives or creating new ones and, importantly, of how this bears on equine welfare.

  14. Bajtín en la encrucijada de las ciencias humanas europeas “en crisis”. Revisión de un debate / Bakhtin at the crossroads of the European Human Sciences “in crisis”. Review of a debate

    Directory of Open Access Journals (Sweden)

    Bénédicte Vauthier

    2009-10-01

    Full Text Available ABSTRACT: In this article, we evaluate the contribution of the Bakhtin Circle to the human sciences. After sketching out the writing context, we consider the implicit dialogue that is established between these authors and some German theoreticians. Bakhtin's texts of the twenties (Toward a Philosophy of the Act, Author and Hero in Aesthetic Activity, The Problem of Content, Material, and Form in Verbal Art) constitute a consistent set, the true basis of the 'aesthetics of verbal creation'. Bakhtin plays a part in the debate between Husserl and Dilthey. Voloshinov's Marxism and the Philosophy of Language and Freudianism: A Critical Sketch, as well as P. Medvedev's The Formal Method in Literary Scholarship, made the philosophical ideas of the young Bakhtin understandable for a larger public. It is, therefore, necessary not to establish a hermeneutic break between texts which make one another clearer.

  15. Computer Modeling of Human Delta Opioid Receptor

    Directory of Open Access Journals (Sweden)

    Tatyana Dzimbova

    2013-04-01

    Full Text Available The development of selective agonists of the δ-opioid receptor, as well as the modeling of ligand interactions with this receptor, are subjects of increased interest. In the absence of crystal structures of opioid receptors, 3D homology models with different templates have been reported in the literature. The problem is that these models are not available for widespread use. The aims of our study are: (1) to choose, from recently published crystallographic structures, templates for homology modeling of the human δ-opioid receptor (DOR); (2) to evaluate the models with different computational tools; and (3) to identify the most reliable model based on the correlation between docking data and in vitro bioassay results. The enkephalin analogues used as ligands in this study were previously synthesized by our group and their biological activity was evaluated. Several models of DOR were generated using different templates. All these models were evaluated by PROCHECK and MolProbity, and the relationship between docking data and in vitro results was determined. The best correlations for the tested models of DOR were found between the efficacy (erel) of the compounds, calculated from in vitro experiments, and the Fitness scoring function from the docking studies. A new model of DOR was generated and evaluated by different approaches. This model has a good GA341 value (0.99) from MODELLER, and good values from PROCHECK (92.6% of residues in most favored regions) and MolProbity (99.5% of residues in favored regions). The scoring function correlates (Pearson r = -0.7368, p-value = 0.0097) with the erel of a series of enkephalin analogues calculated from in vitro experiments. This investigation thus suggests a reliable model of DOR. The newly generated model of DOR could be used for further in silico experiments and will enable faster and more accurate design of selective and effective ligands for the δ-opioid receptor.
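
    The model-selection criterion reduces to correlating a docking score against measured efficacy across the ligand series; a minimal sketch with hypothetical paired values (scipy.stats.pearsonr returns the same two statistics the abstract reports):

        from scipy.stats import pearsonr

        # Hypothetical paired values for a series of enkephalin analogues
        fitness = [42.1, 38.5, 51.2, 47.8, 35.0, 44.3, 49.9, 40.7, 36.8, 53.4]  # docking score
        erel    = [0.90, 1.10, 0.35, 0.55, 1.40, 0.80, 0.45, 1.00, 1.25, 0.30]  # in vitro efficacy

        r, p = pearsonr(fitness, erel)   # expect a strong negative correlation
        print(f"Pearson r = {r:.4f}, p-value = {p:.4f}")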

  16. Multimodal Information Presentation for High-Load Human Computer Interaction

    NARCIS (Netherlands)

    Cao, Y.

    2011-01-01

    This dissertation addresses multimodal information presentation in human computer interaction. Information presentation refers to the manner in which computer systems/interfaces present information to human users. More specifically, the focus of our work is not on which information to present, but

  17. Stereo Vision for Unrestricted Human-Computer Interaction

    OpenAIRE

    Eldridge, Ross; Rudolph, Heiko

    2008-01-01

    Human computer interfaces have come a long way in recent years, but the goal of a computer interpreting unrestricted human movement remains elusive. The use of stereo vision in this field has enabled the development of systems that begin to approach this goal. As computer technology advances we come ever closer to a system that can react to the ambiguities of human movement in real-time. In the foreseeable future stereo computer vision is not likely to replace the keyboard or mouse. There is at...

  18. Intermediality between Games and Fiction: The “Ludology vs. Narratology” Debate in Computer Game Studies: A Response to Gonzalo Frasca

    Directory of Open Access Journals (Sweden)

    Kokonis Michalis

    2014-12-01

    Full Text Available In the last ten or fourteen years there has been a debate between the so-called ludologists and narratologists in Computer Game Studies as to the best methodological approach for the academic study of electronic games. The aim of this paper is to propose a way out of the dilemma by suggesting that both ludology and narratology can be helpful methodologically. However, there is need for a wider theoretical perspective, that of semiotics, within which both approaches can be operative. The semiotic perspective proposed allows research in the field to focus on the similarities between games and traditional narrative forms (since they share narrativity to a greater or lesser extent) as well as on their differences (they have different degrees of interaction); it will facilitate communication among theorists if we want to understand each other when talking about games and stories, and it will lead to a better understanding of the hybrid nature of the game medium. In this sense the present paper aims to complement Gonzalo Frasca's reconciliatory attempt made a few years back and to expand on his proposal.

  19. Benefits of Subliminal Feedback Loops in Human-Computer Interaction

    OpenAIRE

    Walter Ritter

    2011-01-01

    Much effort has been directed at enriching human-computer interaction to make the user experience more pleasing or efficient. In this paper, we briefly present work in the fields of subliminal perception and affective computing, before we outline a new approach to add analog communication channels to the human-computer interaction experience. In this approach, in addition to symbolic predefined mappings of input to output, a subliminal feedback loop is used that provides feedback in evo...

  20. Human computer confluence applied in healthcare and rehabilitation.

    Science.gov (United States)

    Viaud-Delmon, Isabelle; Gaggioli, Andrea; Ferscha, Alois; Dunne, Stephen

    2012-01-01

    Human computer confluence (HCC) is an ambitious research program studying how the emerging symbiotic relation between humans and computing devices can enable radically new forms of sensing, perception, interaction, and understanding. It is an interdisciplinary field, bringing together researchers from areas as varied as pervasive computing, bio-signal processing, neuroscience, electronics, robotics, and virtual & augmented reality, and it offers great potential for applications in medicine and rehabilitation.

  1. Classical Humanism and the Challenge of Modernity. Debates on classical education in Germany c. 1770-1860

    NARCIS (Netherlands)

    van Bommel, S.P.

    2013-01-01

    Classical humanism was a living tradition until far into the nineteenth century. In scholarship, classical (Renaissance) humanism is usually strictly distinguished from so-called ‘neo-humanism,’ which, especially in Germany, reigned supreme at the beginning of the nineteenth century. While most

  2. From Human-Computer Interaction to Human-Robot Social Interaction

    OpenAIRE

    Toumi, Tarek; Zidani, Abdelmadjid

    2014-01-01

    Human-Robot Social Interaction has become one of the active research fields in which researchers from different areas propose solutions and directives leading robots to improve their interactions with humans. In this paper we propose to introduce work in both human-robot interaction and human-computer interaction and to build a bridge between them, i.e. to integrate the emotion and capability concepts of the robot into a human-computer model so that it becomes adequate for human-robot interaction, and to discuss chall...

  3. Safety Metrics for Human-Computer Controlled Systems

    Science.gov (United States)

    Leveson, Nancy G; Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  4. Building a human rights framework for workers' compensation in the United States: opening the debate on first principles.

    Science.gov (United States)

    Hilgert, Jeffrey A

    2012-06-01

    This article introduces the idea of human rights to the topic of workers' compensation in the United States. It discusses what constitutes a human rights approach and explains how this approach conflicts with those policy ideas that have provided the foundation historically for workers' compensation in the United States. Using legal and historical research, key international labor and human rights standards on employment injury benefits and influential writings in the development of the U.S. workers' compensation system are cited. Workers' injury and illness compensation in the United States does not conform to basic international human rights norms. A comprehensive review of the U.S. workers' compensation system under international human rights standards is needed. Examples of policy changes are highlighted that would begin the process of moving workers' compensation into conformity with human rights standards. Copyright © 2012 Wiley Periodicals, Inc.

  5. Human performance models for computer-aided engineering

    Science.gov (United States)

    Elkind, Jerome I. (Editor); Card, Stuart K. (Editor); Hochberg, Julian (Editor); Huey, Beverly Messick (Editor)

    1989-01-01

    This report discusses a topic important to the field of computational human factors: models of human performance and their use in computer-based engineering facilities for the design of complex systems. It focuses on a particular human factors design problem -- the design of cockpit systems for advanced helicopters -- and on a particular aspect of human performance -- vision and related cognitive functions. By focusing in this way, the authors were able to address the selected topics in some depth and develop findings and recommendations that they believe have application to many other aspects of human performance and to other design domains.

  6. Image Visual Realism: From Human Perception to Machine Computation.

    Science.gov (United States)

    Fan, Shaojing; Ng, Tian-Tsong; Koenig, Bryan L; Herberg, Jonathan S; Jiang, Ming; Shen, Zhiqi; Zhao, Qi

    2017-08-30

    Visual realism is defined as the extent to which an image appears to people as a photo rather than computer generated. Assessing visual realism is important in applications like computer graphics rendering and photo retouching. However, current realism evaluation approaches use either labor-intensive human judgments or automated algorithms largely dependent on comparing renderings to reference images. We develop a reference-free computational framework for visual realism prediction to overcome these constraints. First, we construct a benchmark dataset of 2520 images with comprehensive human annotated attributes. From statistical modeling on this data, we identify image attributes most relevant for visual realism. We propose both empirically-based (guided by our statistical modeling of human data) and CNN-learned features to predict visual realism of images. Our framework has the following advantages: (1) it creates an interpretable and concise empirical model that characterizes human perception of visual realism; (2) it links computational features to latent factors of human image perception.
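
    The empirically-based side of such a framework amounts to regressing human realism ratings on interpretable image attributes; a minimal least-squares sketch (the attributes, ratings, and data are synthetic placeholders, not the benchmark dataset):

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic stand-in: one row per image, columns = annotated attributes
        # (e.g., color naturalness, texture regularity, shading smoothness)
        n_images, n_attrs = 200, 3
        X = rng.uniform(0, 1, (n_images, n_attrs))
        ratings = X @ np.array([1.5, -0.8, 0.6]) + 2.0 + rng.normal(0, 0.1, n_images)

        # Ordinary least squares with an intercept column; the fitted weights
        # are the interpretable part: how much each attribute drives realism
        A = np.column_stack([np.ones(n_images), X])
        w, *_ = np.linalg.lstsq(A, ratings, rcond=None)
        print("intercept and attribute weights:", np.round(w, 2))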

  7. Computational Intelligence in a Human Brain Model

    Directory of Open Access Journals (Sweden)

    Viorel Gaftea

    2016-06-01

    Full Text Available This paper focuses on current trends in the brain research domain and the current state of research into software and hardware solutions, communication capabilities between human beings and machines, new technologies, nano-science and Internet of Things (IoT) devices. The proposed model of the human brain assumes a strong similarity between human intelligence and the thinking process in a chess game. Tactical and strategic reasoning and the need to follow the rules of the chess game are all very similar to the activities of the human brain. The main objectives of a living being and of a chess player are the same: securing a position, surviving and eliminating adversaries. The brain pursues these goals, and beyond that, the being's movement, actions and speech are sustained by the five vital senses and equilibrium. Chess strategy helps us understand the human brain better and replicate it more easily in the proposed 'Software and Hardware' (SAH) Model.

  8. The Next Wave: Humans, Computers, and Redefining Reality

    Science.gov (United States)

    Little, William

    2018-01-01

    The Augmented/Virtual Reality (AVR) Lab at KSC is dedicated to "exploration into the growing computer fields of Extended Reality and the Natural User Interface"; it is "a proving ground for new technologies that can be integrated into future NASA projects and programs." The topics of Human Computer Interface, Human Computer Interaction, Augmented Reality, Virtual Reality, and Mixed Reality are defined; examples of work being done in these fields in the AVR Lab are given. Current and future work in Computer Vision, Speech Recognition, and Artificial Intelligence is also outlined.

  9. Accident sequence analysis of human-computer interface design

    International Nuclear Information System (INIS)

    Fan, C.-F.; Chen, W.-H.

    2000-01-01

    It is important to predict potential accident sequences of human-computer interaction in a safety-critical computing system so that vulnerable points can be disclosed and removed. We address this issue by proposing a Multi-Context human-computer interaction Model along with its analysis techniques, an Augmented Fault Tree Analysis, and a Concurrent Event Tree Analysis. The proposed augmented fault tree can identify the potential weak points in software design that may induce unintended software functions or erroneous human procedures. The concurrent event tree can enumerate possible accident sequences due to these weak points
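
    For readers unfamiliar with the underlying formalism, a toy fault-tree evaluation shows how weak points combine into an accident probability (an independence-assuming sketch of generic gates, not the paper's augmented method; all event probabilities are hypothetical):

        # Toy fault-tree gates assuming independent basic events
        def and_gate(*ps):
            p = 1.0
            for x in ps:
                p *= x
            return p

        def or_gate(*ps):
            q = 1.0
            for x in ps:
                q *= 1.0 - x
            return 1.0 - q

        # Hypothetical basic events in a human-computer control loop
        p_display_misleads = 1e-3    # software presents a misleading state
        p_operator_misreads = 5e-2   # human misreads a correct display
        p_interlock_fails = 1e-4     # automatic safeguard does not intervene

        # Accident = (software error OR human error) AND failed safeguard
        p_wrong_belief = or_gate(p_display_misleads, p_operator_misreads)
        print(f"P(accident) = {and_gate(p_wrong_belief, p_interlock_fails):.2e}")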

  10. Let's Put "Debate" into "Presidential Debates."

    Science.gov (United States)

    Benoit, William L.

    Presidential debates come in all shapes and sizes. The presence and length of opening statements and closing remarks, the opportunity and length of rebuttal, the nature of the questioner, and other factors have created a bewildering variety of formats. However, most scholars agree that these confrontations are not "really" debates but merely…

  11. Homo faber or homo credente? What defines humans, and what could Homo naledi contribute to this debate?

    Directory of Open Access Journals (Sweden)

    Detlev L. Tönsing

    2017-10-01

    Full Text Available The transition from pre-human to human has, for a long time, been associated with tool use and construction. The implicit self-definition of humans in this is that of planned control over the life-world. This is reflected in the work of Hannah Arendt on homo faber and in Max Frisch's novel of that name. However, this definition has become problematic in a number of ways: planned tool use has been seen to occur outside the human species, and the focus on control of the environment has become suspect because of the environmental crisis. The burial practices of Homo naledi indicate high-level self-awareness and social communication, with little tool use being evident. This article asks whether this might be an occasion to redefine our conception of what it means to be human away from the focus on mastery and control and towards including trust, also religious trust, as the true mark of humanity.

  12. It's no debate, debates are great.

    Science.gov (United States)

    Dy-Boarman, Eliza A; Nisly, Sarah A; Costello, Tracy J

    A debate can be a pedagogical method used to instill essential functions in pharmacy students. This non-traditional teaching method may help to further develop a number of skills that are highlighted in the current Accreditation Council for Pharmacy Education Standards 2016 and Center for the Advancement of Pharmacy Education Educational Outcomes 2013. Debates have also been used as an educational tool in other health disciplines. Current pharmacy literature does illustrate the use of debates in various areas within the pharmacy curriculum in both required and elective courses; however, the current body of literature would suggest that debates are an underutilized teaching tool in pharmacy experiential education. With all potential benefits of debates as a teaching tool, pharmacy experiential preceptors should further explore their use in the experiential setting. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Human-computer interaction and management information systems

    CERN Document Server

    Galletta, Dennis F

    2014-01-01

    ""Human-Computer Interaction and Management Information Systems: Applications"" offers state-of-the-art research by a distinguished set of authors who span the MIS and HCI fields. The original chapters provide authoritative commentaries and in-depth descriptions of research programs that will guide 21st century scholars, graduate students, and industry professionals. Human-Computer Interaction (or Human Factors) in MIS is concerned with the ways humans interact with information, technologies, and tasks, especially in business, managerial, organizational, and cultural contexts. It is distinctiv

  14. Mobile human-computer interaction perspective on mobile learning

    CSIR Research Space (South Africa)

    Botha, Adèle

    2010-10-01

    Full Text Available Applying a Mobile Human Computer Interaction (MHCI) view to the domain of education using Mobile Learning (Mlearning), the research outlines its understanding of the influences and effects of different interactions on the use of mobile technology...

  15. Cognition beyond the brain computation, interactivity and human artifice

    CERN Document Server

    Cowley, Stephen J

    2013-01-01

    Arguing that a collective dimension has given cognitive flexibility to human intelligence, this book shows that traditional cognitive psychology underplays the role of bodies, dialogue, diagrams, tools, talk, customs, habits, computers and cultural practices.

  16. Computers, the Human Mind, and My In-Laws' House.

    Science.gov (United States)

    Esque, Timm J.

    1996-01-01

    Discussion of human memory, computer memory, and the storage of information focuses on a metaphor that can account for memory without storage and can set the stage for systemic research around a more comprehensive, understandable theory. (Author/LRW)

  17. The Emotiv EPOC interface paradigm in Human-Computer Interaction

    OpenAIRE

    Ancău Dorina; Roman Nicolae-Marius; Ancău Mircea

    2017-01-01

    Numerous studies have suggested the use of decoded error potentials in the brain to improve human-computer communication. Together with state-of-the-art scientific equipment, experiments have also tested instruments with more limited performance for the time being, such as Emotiv EPOC. This study presents a review of these trials and a summary of the results obtained. However, the level of these results indicates a promising prospect for using this headset as a human-computer interface for error decoding.

  18. A Review of the Organisation for Economic Cooperation and Development's International Education Surveys: Governance, Human Capital Discourses, and Policy Debates

    Science.gov (United States)

    Morgan, Clara; Volante, Louis

    2016-01-01

    Given the influential role that the Organisation for Economic Cooperation and Development (OECD) plays in educational governance, we believe it is timely to provide an in-depth review of its education surveys and their associated human capital discourses. By reviewing and summarizing the OECD's suite of education surveys, this paper identifies the…

  19. Where computers disappear, virtual humans appear

    NARCIS (Netherlands)

    Nijholt, Antinus; Sourin, A.

    2004-01-01

    In this paper, we survey the role of virtual humans (or embodied conversational agents) in smart and ambient intelligence environments. Research in this area can profit from research done earlier in virtual reality environments and research on verbal and nonverbal interaction. We discuss virtual

  20. ‘Bound Coolies’ and Other Indentured Workers in the Caribbean: Implications for debates about human trafficking and modern slavery

    Directory of Open Access Journals (Sweden)

    Kamala Kempadoo

    2017-09-01

    Full Text Available Under systems of indenture in the Caribbean, Europeans such as Irish, Scots and Portuguese, as well as Asians, primarily Indians, Chinese and Indonesians, were recruited, often under false pretences, and transported to the ‘New World’, where they were bound to an employer and the plantation in a state of ‘interlocking incarceration’. Indentureship not only preceded, co-existed with, and survived slavery in the Caribbean, but was distinct in law and in practice from slavery. This article argues that the conditions of Caribbean indenture can be seen to be much more analogous to those represented in contemporary discussions about human trafficking and ‘modern slavery’ than those of slavery. Caribbean histories of indenture, it is proposed, can provide more appropriate conceptual tools for thinking about unfree labour today—whether state or privately sponsored—than the concept of slavery, given the parallels between this past migrant labour system in the Caribbean and those we witness and identify today as ‘modern slavery’ or human trafficking. This article thus urges a move away from the conflation of slavery and human trafficking with all forced, bonded and migrant labour, as is commonly the case, and for greater attention for historical evidence.

  1. Audio Technology and Mobile Human Computer Interaction

    DEFF Research Database (Denmark)

    Chamberlain, Alan; Bødker, Mads; Hazzard, Adrian

    2017-01-01

    Audio-based mobile technology is opening up a range of new interactive possibilities. This paper brings some of those possibilities to light by offering a range of perspectives based in this area. It is not only the technical systems that are developing; novel approaches to the design and understanding of audio-based mobile systems are also evolving, offering new perspectives on interaction and design and supporting the application of such systems in areas such as the humanities.

  2. Object recognition in images by human vision and computer vision

    NARCIS (Netherlands)

    Chen, Q.; Dijkstra, J.; Vries, de B.

    2010-01-01

    Object recognition plays a major role in human behaviour research in the built environment. Computer based object recognition techniques using images as input are challenging, but not an adequate representation of human vision. This paper reports on the differences in object shape recognition

  3. Exemelification of parliamentary debates

    NARCIS (Netherlands)

    Gielissen, T.; Marx, M.

    2009-01-01

    Parliamentary debates are an interesting domain to apply state-of-the-art information retrieval technology. Parliamentary debates are highly structured transcripts of meetings of politicians in parliament. These debates are an important part of the cultural heritage of countries; they are often free

  4. A Perspective on Computational Human Performance Models as Design Tools

    Science.gov (United States)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  5. Evidence of the Possible Harm of Endocrine-Disrupting Chemicals in Humans: Ongoing Debates and Key Issues

    Directory of Open Access Journals (Sweden)

    Duk-Hee Lee

    2018-03-01

    Full Text Available Evidence has emerged that endocrine-disrupting chemicals (EDCs) can produce adverse effects, even at low doses that are assumed safe. However, systematic reviews and meta-analyses focusing on human studies, especially of EDCs with short half-lives, have demonstrated inconsistent results. Epidemiological studies have insuperable methodological limitations, including the unpredictable net effects of mixtures, non-monotonic dose-response relationships, the non-existence of unexposed groups, and the low reliability of exposure assessment. Thus, despite increases in EDC-linked diseases, traditional epidemiological studies based on individual measurements of EDCs in bio-specimens may fail to provide consistent results. The exposome has been suggested as a promising approach to address the uncertainties surrounding human studies, but it is never free from these methodological issues. Although exposure to EDCs during critical developmental periods is a major concern, continuous exposure to EDCs during non-critical periods is also harmful. Indeed, the evolutionary aspects of epigenetic programming triggered by EDCs during development should be considered, because such programming is a key mechanism for developmental plasticity. Presently, living without EDCs is impossible due to their omnipresence. Importantly, there are lifestyles which can increase the excretion of EDCs or mitigate their harmful effects through the activation of mitohormesis or xenohormesis. The effectiveness of lifestyle interventions should be evaluated as a practical countermeasure against EDCs in the real world.

  6. HOME COMPUTER USE AND THE DEVELOPMENT OF HUMAN CAPITAL*

    Science.gov (United States)

    Malamud, Ofer; Pop-Eleches, Cristian

    2012-01-01

    This paper uses a regression discontinuity design to estimate the effect of home computers on child and adolescent outcomes by exploiting a voucher program in Romania. Our main results indicate that home computers have both positive and negative effects on the development of human capital. Children who won a voucher to purchase a computer had significantly lower school grades but show improved computer skills. There is also some evidence that winning a voucher increased cognitive skills, as measured by Raven’s Progressive Matrices. We do not find much evidence for an effect on non-cognitive outcomes. Parental rules regarding homework and computer use attenuate the effects of computer ownership, suggesting that parental monitoring and supervision may be important mediating factors. PMID:22719135
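
    The estimator behind a sharp regression discontinuity design can be sketched in a few lines: fit local linear regressions on each side of the eligibility cutoff and difference the intercepts (synthetic data; the cutoff variable, bandwidth, and effect size are illustrative, not the paper's):

        import numpy as np

        rng = np.random.default_rng(1)

        # Synthetic data: 'score' is eligibility relative to the voucher cutoff (0);
        # 'grade' is the schooling outcome, with a built-in effect of -0.4
        n = 1000
        score = rng.uniform(-1, 1, n)
        won = (score >= 0).astype(float)
        grade = 7.0 - 0.4 * won + 0.8 * score + rng.normal(0, 0.5, n)

        def intercept_at_cutoff(mask):
            # local linear fit; the intercept predicts the outcome at score = 0
            A = np.column_stack([np.ones(mask.sum()), score[mask]])
            beta, *_ = np.linalg.lstsq(A, grade[mask], rcond=None)
            return beta[0]

        h = 0.5  # bandwidth around the cutoff
        effect = (intercept_at_cutoff((score >= 0) & (score < h))
                  - intercept_at_cutoff((score < 0) & (score > -h)))
        print(f"estimated voucher effect on grades: {effect:.2f}")  # about -0.4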

  7. The UK Human Genome Mapping Project online computing service.

    Science.gov (United States)

    Rysavy, F R; Bishop, M J; Gibbs, G P; Williams, G W

    1992-04-01

    This paper presents an overview of computing and networking facilities developed by the Medical Research Council to provide online computing support to the Human Genome Mapping Project (HGMP) in the UK. The facility is connected to a number of other computing facilities in various centres of genetics and molecular biology research excellence, either directly via high-speed links or through national and international wide-area networks. The paper describes the design and implementation of the current system, a 'client/server' network of Sun, IBM, DEC and Apple servers, gateways and workstations. A short outline of online computing services currently delivered by this system to the UK human genetics research community is also provided. More information about the services and their availability could be obtained by a direct approach to the UK HGMP-RC.

  8. A Research Roadmap for Computation-Based Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States); Groth, Katrina [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  9. A Research Roadmap for Computation-Based Human Reliability Analysis

    International Nuclear Information System (INIS)

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-01-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  10. The Past, Present and Future of Human Computer Interaction

    KAUST Repository

    Churchill, Elizabeth

    2018-01-16

    Human Computer Interaction (HCI) focuses on how people interact with, and are transformed by, computation. Our current technology landscape is changing rapidly. Interactive applications, devices and services are increasingly becoming embedded into our environments, from our homes to the urban and rural spaces we traverse every day. We are increasingly able to, and often required to, manage and configure multiple, interconnected devices and program their interactions. Artificial intelligence (AI) techniques are being used to create dynamic services that learn about us and others, that draw conclusions about our intents and affiliations, and that mould our digital interactions based on predictions about our actions and needs, nudging us toward certain behaviors. Computation is also increasingly embedded into our bodies. During this lecture, Elizabeth Churchill (Director of User Experience at Google) will talk about understanding human interactions in everyday digital and physical contexts: how this emerging landscape invites us to revisit old methods and tactics for understanding how people interact with computers and computation, and how it challenges us to think about new methods and frameworks for understanding the future of human-centered computation.

  11. The Emotiv EPOC interface paradigm in Human-Computer Interaction

    Directory of Open Access Journals (Sweden)

    Ancău Dorina

    2017-01-01

    Numerous studies have suggested the use of decoded error potentials in the brain to improve human-computer communication. Together with state-of-the-art scientific equipment, experiments have also tested instruments with more limited performance for the time being, such as Emotiv EPOC. This study presents a review of these trials and a summary of the results obtained. However, the level of these results indicates a promising prospect for using this headset as a human-computer interface for error decoding.

  12. From humans to computers cognition through visual perception

    CERN Document Server

    Alexandrov, Viktor Vasilievitch

    1991-01-01

    This book considers computer vision to be an integral part of the artificial intelligence system. The core of the book is an analysis of possible approaches to the creation of artificial vision systems, which simulate human visual perception. Much attention is paid to the latest achievements in visual psychology and physiology, the description of the functional and structural organization of the human perception mechanism, the peculiarities of artistic perception and the expression of reality. Computer vision models based on these data are investigated. They include the processes of external d

  13. An intelligent multi-media human-computer dialogue system

    Science.gov (United States)

    Neal, J. G.; Bettinger, K. E.; Byoun, J. S.; Dobes, Z.; Thielman, C. Y.

    1988-01-01

    Sophisticated computer systems are being developed to assist in the human decision-making process for very complex tasks performed under stressful conditions. The human-computer interface is a critical factor in these systems. The human-computer interface should be simple and natural to use, require a minimal learning period, assist the user in accomplishing his task(s) with a minimum of distraction, present output in a form that best conveys information to the user, and reduce cognitive load for the user. In pursuit of this ideal, the Intelligent Multi-Media Interfaces project is devoted to the development of interface technology that integrates speech, natural language text, graphics, and pointing gestures for human-computer dialogues. The objective of the project is to develop interface technology that uses the media/modalities intelligently in a flexible, context-sensitive, and highly integrated manner modelled after the manner in which humans converse in simultaneous coordinated multiple modalities. As part of the project, a knowledge-based interface system, called CUBRICON (CUBRC Intelligent CONversationalist) is being developed as a research prototype. The application domain being used to drive the research is that of military tactical air control.

  14. A Human-Centred Tangible approach to learning Computational Thinking

    Directory of Open Access Journals (Sweden)

    Tommaso Turchi

    2016-08-01

    Computational Thinking has recently become a focus of many teaching and research domains; it encapsulates those thinking skills integral to solving complex problems using a computer, thus being widely applicable in our society. It is influencing research across many disciplines and also coming into the limelight of education, mostly thanks to public initiatives such as the Hour of Code. In this paper we present our arguments for promoting Computational Thinking in education through the Human-centred paradigm of Tangible End-User Development, namely by exploiting objects whose interactions with the physical environment are mapped to digital actions performed on the system.

  15. Choice of Human-Computer Interaction Mode in Stroke Rehabilitation.

    Science.gov (United States)

    Mousavi Hondori, Hossein; Khademi, Maryam; Dodakian, Lucy; McKenzie, Alison; Lopes, Cristina V; Cramer, Steven C

    2016-03-01

    Advances in technology are providing new forms of human-computer interaction. The current study examined one form of human-computer interaction, augmented reality (AR), whereby subjects train in the real-world workspace with virtual objects projected by the computer. Motor performances were compared with those obtained while subjects used a traditional human-computer interaction, that is, a personal computer (PC) with a mouse. Patients used goal-directed arm movements to play AR and PC versions of the Fruit Ninja video game. The 2 versions required the same arm movements to control the game but had different cognitive demands. With AR, the game was projected onto the desktop, where subjects viewed the game plus their arm movements simultaneously, in the same visual coordinate space. In the PC version, subjects used the same arm movements but viewed the game by looking up at a computer monitor. Among 18 patients with chronic hemiparesis after stroke, the AR game was associated with 21% higher game scores (P = .0001), 19% faster reaching times (P = .0001), and 15% less movement variability (P = .0068), as compared to the PC game. Correlations between game score and arm motor status were stronger with the AR version. Motor performances during the AR game were superior to those during the PC game. This result is due in part to the greater cognitive demands imposed by the PC game, a feature problematic for some patients but clinically useful for others. Mode of human-computer interface influences rehabilitation therapy demands and can be individualized for patients.

  16. Plants and Human Affairs: Educational Enhancement Via a Computer.

    Science.gov (United States)

    Crovello, Theodore J.; Smith, W. Nelson

    To enhance both teaching and learning in an advanced undergraduate elective course on the interrelationships of plants and human affairs, the computer was used for information retrieval, multiple choice course review, and the running of three simulation models--plant related systems (e.g., the rise in world coffee prices after the 1975 freeze in…

  17. Humor in Human-Computer Interaction : A Short Survey

    NARCIS (Netherlands)

    Nijholt, Anton; Niculescu, Andreea; Valitutti, Alessandro; Banchs, Rafael E.; Joshi, Anirudha; Balkrishan, Devanuj K.; Dalvi, Girish; Winckler, Marco

    2017-01-01

    This paper is a short survey on humor in human-computer interaction. It describes how humor is designed and interacted with in social media, virtual agents, social robots and smart environments. Benefits and future use of humor in interactions with artificial entities are discussed based on

  18. A Software Framework for Multimodal Human-Computer Interaction Systems

    NARCIS (Netherlands)

    Shen, Jie; Pantic, Maja

    2009-01-01

    This paper describes a software framework we designed and implemented for the development and research in the area of multimodal human-computer interface. The proposed framework is based on publish / subscribe architecture, which allows developers and researchers to conveniently configure, test and

  19. Computational 3-D Model of the Human Respiratory System

    Science.gov (United States)

    We are developing a comprehensive, morphologically-realistic computational model of the human respiratory system that can be used to study the inhalation, deposition, and clearance of contaminants, while being adaptable for age, race, gender, and health/disease status. The model ...

  20. Why computer games can be essential for human flourishing

    NARCIS (Netherlands)

    Fröding, B.; Peterson, M.B.

    2013-01-01

    Traditionally, playing computer games and engaging in other online activities has been seen as a threat to well-being, health and long-term happiness. It is feared that spending many hours per day in front of the screen leads the individual to forsake other, more worthwhile activities, such as human

  1. Homo ludens in the loop playful human computation systems

    CERN Document Server

    Krause, Markus

    2014-01-01

    The human mind is incredible. It solves problems with ease that will elude machines even for the next decades. This book explores what happens when humans and machines work together to solve problems machines cannot yet solve alone. It explains how machines and computers can work together and how humans can have fun helping to face some of the most challenging problems of artificial intelligence. In this book, you will find designs for games that are entertaining and yet able to collect data to train machines for complex tasks such as natural language processing or image understanding. You wil

  2. The DSM5/RDoC debate on the future of mental health research: implication for studies on human stress and presentation of the signature bank.

    Science.gov (United States)

    Lupien, S J; Sasseville, M; François, N; Giguère, C E; Boissonneault, J; Plusquellec, P; Godbout, R; Xiong, L; Potvin, S; Kouassi, E; Lesage, A

    2017-01-01

    In 2008, the National Institute of Mental Health (NIMH) announced that in the next few decades, it will be essential to study the various biological, psychological and social "signatures" of mental disorders. Along with this new "signature" approach to mental health disorders, modifications of the DSM were introduced. One major modification consisted of incorporating a dimensional approach to mental disorders, which involved analyzing, using a transnosological approach, various factors that are commonly observed across different types of mental disorders. Although this new methodology led to interesting discussions in the DSM5 working groups, it was not incorporated in the final version of the DSM5. Consequently, the NIMH launched the "Research Domain Criteria" (RDoC) framework in order to provide new ways of classifying mental illnesses based on dimensions of observable behavioral and neurobiological measures. The NIMH emphasizes that it is important to consider the benefits of dimensional measures from the perspective of psychopathology and environmental influences, and it is also important to build these dimensions on neurobiological data. The first goal of this paper is to present the perspectives of the DSM5 and RDoC on the science of mental health disorders and the impact of this debate on the future of human stress research. The second goal is to present the "Signature Bank" of the Institut Universitaire en Santé Mentale de Montréal (IUSMM), which has been developed in line with a dimensional and transnosological approach to mental illness.

  3. Computational Fluid and Particle Dynamics in the Human Respiratory System

    CERN Document Server

    Tu, Jiyuan; Ahmadi, Goodarz

    2013-01-01

    Traditional research methodologies in the human respiratory system have always been challenging due to their invasive nature. Recent advances in medical imaging and computational fluid dynamics (CFD) have accelerated this research. This book compiles and details recent advances in the modelling of the respiratory system for researchers, engineers, scientists, and health practitioners. It breaks down the complexities of this field and provides both students and scientists with an introduction and starting point to the physiology of the respiratory system, fluid dynamics and advanced CFD modeling tools. In addition to a brief introduction to the physics of the respiratory system and an overview of computational methods, the book contains best-practice guidelines for establishing high-quality computational models and simulations. Inspiration for new simulations can be gained through innovative case studies as well as hands-on practice using pre-made computational code. Last but not least, students and researcher...

  4. A novel polar-based human face recognition computational model

    Directory of Open Access Journals (Sweden)

    Y. Zana

    2009-07-01

    Motivated by a recently proposed biologically inspired face recognition approach, we investigated the relation between human behavior and a computational model based on Fourier-Bessel (FB) spatial patterns. We measured human recognition performance of FB-filtered face images using an 8-alternative forced-choice method. Test stimuli were generated by converting the images from the spatial to the FB domain, filtering the resulting coefficients with a band-pass filter, and finally taking the inverse FB transformation of the filtered coefficients. The performance of the computational models was tested using a simulation of the psychophysical experiment. In the FB model, face images were first filtered by simulated V1-type neurons and later analyzed globally for their content of FB components. In general, there was a higher human contrast sensitivity to radially than to angularly filtered images, but both functions peaked at the 11.3-16 frequency interval. The FB-based model presented similar behavior with regard to peak position and relative sensitivity, but had a wider frequency bandwidth and a narrower response range. The response patterns of two alternative models, based on local FB analysis and on raw luminance, strongly diverged from the human behavior patterns. These results suggest that human performance can be constrained by the type of information conveyed by polar patterns, and consequently that humans might use FB-like spatial patterns in face processing.
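    To make the stimulus-generation pipeline above concrete (forward transform, band-pass the coefficients, inverse transform), here is a minimal sketch. Note the hedges: an ordinary 2-D FFT with a radial-frequency mask stands in for the Fourier-Bessel transform, and the placeholder image and band edges are invented for illustration.

```python
import numpy as np

# Sketch of the pipeline: transform -> band-pass coefficients -> inverse.
# A plain 2-D FFT is used here as a stand-in for the Fourier-Bessel
# transform; the image and the band edges are illustrative only.
rng = np.random.default_rng(0)
image = rng.random((128, 128))              # placeholder "face" image

coeffs = np.fft.fftshift(np.fft.fft2(image))
fy, fx = np.indices(image.shape) - 64       # centred frequency coordinates
radial = np.hypot(fx, fy)
band = (radial >= 11.3) & (radial <= 16.0)  # keep one radial frequency band
filtered = np.fft.ifft2(np.fft.ifftshift(coeffs * band)).real

print("retained energy fraction:",
      float(np.abs(coeffs[band]).sum() / np.abs(coeffs).sum()))
print("filtered image std:", float(filtered.std()))
```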

  5. Human-Computer Interaction, Tourism and Cultural Heritage

    Science.gov (United States)

    Cipolla Ficarra, Francisco V.

    We present a state of the art of human-computer interaction aimed at tourism and cultural heritage in some cities of the European Mediterranean. The work analyzes the main problems deriving from training treated as a business, problems which can derail the continuous growth of HCI, the new technologies and the tourism industry. Through a semiotic and epistemological study we detect the current mistakes in the context of the interrelations of the formal and factual sciences, as well as the human factors that influence the professionals devoted to the development of interactive systems for safeguarding and boosting cultural heritage.

  6. Computer aided systems human engineering: A hypermedia tool

    Science.gov (United States)

    Boff, Kenneth R.; Monk, Donald L.; Cody, William J.

    1992-01-01

    The Computer Aided Systems Human Engineering (CASHE) system, Version 1.0, is a multimedia ergonomics database on CD-ROM for the Apple Macintosh II computer, being developed for use by human system designers, educators, and researchers. It will initially be available on CD-ROM and will allow users to access ergonomics data and models stored electronically as text, graphics, and audio. The CASHE CD-ROM, Version 1.0 will contain the Boff and Lincoln (1988) Engineering Data Compendium, MIL-STD-1472D and a unique, interactive simulation capability, the Perception and Performance Prototyper. Its features also include a specialized data retrieval, scaling, and analysis capability and the state of the art in information retrieval, browsing, and navigation.

  7. The Danish Biofuel Debate

    DEFF Research Database (Denmark)

    Hansen, Janus

    2014-01-01

    …of biofuels enrol scientific authority to support their positions? The sociological theory of functional differentiation combined with the concept of advocacy coalition can help in exploring this relationship between scientific claims-making and the policy stance of different actors in public debates about … biofuels. In Denmark two distinct scientific perspectives about biofuels map onto the policy debates through articulation by two competing advocacy coalitions. One is a reductionist biorefinery perspective originating in biochemistry and neighbouring disciplines. This perspective works upwards from…

  8. Overview Electrotactile Feedback for Enhancing Human Computer Interface

    Science.gov (United States)

    Pamungkas, Daniel S.; Caesarendra, Wahyu

    2018-04-01

    To achieve effective interaction between a human and a computing device or machine, adequate feedback from the computing device or machine is required. Recently, haptic feedback has increasingly been utilised to improve the interactivity of the Human Computer Interface (HCI). Most existing haptic feedback enhancements aim at producing forces or vibrations to enrich the user's interactive experience. However, these force- and/or vibration-actuated haptic feedback systems can be bulky and uncomfortable to wear, and are only capable of delivering a limited amount of information to the user, which can limit both their effectiveness and the applications they can be applied to. To address this deficiency, electrotactile feedback is used. This involves delivering haptic sensations to the user by electrically stimulating nerves in the skin via electrodes placed on the surface of the skin. This paper presents a review and explores the capability of electrotactile feedback for HCI applications. In addition, it describes the sensory receptors within the skin that sense tactile stimuli and electric currents, and explains several factors that influence how electric signals are transmitted to the brain via human skin.

  9. Human-computer systems interaction backgrounds and applications 3

    CERN Document Server

    Kulikowski, Juliusz; Mroczek, Teresa; Wtorek, Jerzy

    2014-01-01

    This book contains an interesting and state-of-the-art collection of papers on recent progress in Human-Computer System Interaction (H-CSI). It provides a profound description of the current status of the H-CSI field and also a solid base for further development and research in the discussed area. The contents of the book are divided into the following parts: I. General human-system interaction problems; II. Health monitoring and disabled people helping systems; and III. Various information processing systems. This book is intended for a wide audience of readers who are not necessarily experts in computer science, machine learning or knowledge engineering, but are interested in Human-Computer Systems Interaction. The level of the individual papers and their arrangement into the particular parts make this volume fascinating reading, giving the reader a much deeper insight than he/she might glean from research papers or talks at conferences. It touches on all the deep issues that ...

  10. Computer simulation of human motion in sports biomechanics.

    Science.gov (United States)

    Vaughan, C L

    1984-01-01

    This chapter has covered some important aspects of the computer simulation of human motion in sports biomechanics. First, the definition and the advantages and limitations of computer simulation were discussed; second, research on various sporting activities was reviewed. These activities included basic movements, aquatic sports, track and field athletics, winter sports, gymnastics, and striking sports. This list was not exhaustive and certain material has, of necessity, been omitted. However, it was felt that a sufficiently broad and interesting range of activities was chosen to illustrate both the advantages and the pitfalls of simulation. It is almost a decade since Miller [53] wrote a review chapter similar to this one. One might be tempted to say that things have changed radically since then--that computer simulation is now a widely accepted and readily applied research tool in sports biomechanics. This is simply not true, however. Biomechanics researchers still tend to emphasize the descriptive type of study, often unfortunately, when a little theoretical explanation would have been more helpful [29]. What will the next decade bring? Of one thing we can be certain: the power of computers, particularly the readily accessible and portable microcomputer, will expand beyond all recognition. The memory and storage capacities will increase dramatically on the hardware side, and on the software side the trend will be toward "user-friendliness". It is likely that a number of software simulation packages designed specifically for studying human motion [31, 96] will be extensively tested and could gain wide acceptance in the biomechanics research community. Nevertheless, a familiarity with Newtonian and Lagrangian mechanics, optimization theory, and computers in general, as well as practical biomechanical insight, will still be a prerequisite for successful simulation models of human motion. Above all, the biomechanics researcher will still have to bear in mind that

  11. Electromagnetic Modeling of Human Body Using High Performance Computing

    Science.gov (United States)

    Ng, Cho-Kuen; Beall, Mark; Ge, Lixin; Kim, Sanghoek; Klaas, Ottmar; Poon, Ada

    Realistic simulation of electromagnetic wave propagation in the actual human body can expedite the investigation of wirelessly powering implanted devices from external sources. The parallel electromagnetics code suite ACE3P, developed at SLAC National Accelerator Laboratory, is based on the finite element method for high-fidelity accelerator simulation and can be enhanced to model electromagnetic wave propagation in the human body. Starting with a CAD model of a human phantom that is characterized by a number of tissues, a finite element mesh representing the complex geometries of the individual tissues is built for simulation. Employing an optimal power source with a specific pattern of field distribution, the propagation and focusing of electromagnetic waves in the phantom have been demonstrated. Substantial speedup of the simulation is achieved by using multiple compute cores on supercomputers.

  12. Intermittent control: a computational theory of human control.

    Science.gov (United States)

    Gawthrop, Peter; Loram, Ian; Lakie, Martin; Gollee, Henrik

    2011-02-01

    The paradigm of continuous control using internal models has advanced understanding of human motor control. However, this paradigm ignores some aspects of human control, including intermittent feedback, serial ballistic control, triggered responses and refractory periods. It is shown that event-driven intermittent control provides a framework to explain the behaviour of the human operator under a wider range of conditions than continuous control. Continuous control is included as a special case, but sampling, system matched hold, an intermittent predictor and an event trigger allow serial open-loop trajectories using intermittent feedback. The implementation here may be described as "continuous observation, intermittent action". Beyond explaining unimodal regulation distributions in common with continuous control, these features naturally explain refractoriness and bimodal stabilisation distributions observed in double stimulus tracking experiments and quiet standing, respectively. Moreover, given that human control systems contain significant time delays, a biological-cybernetic rationale favours intermittent over continuous control: intermittent predictive control is computationally less demanding than continuous predictive control. A standard continuous-time predictive control model of the human operator is used as the underlying design method for an event-driven intermittent controller. It is shown that when event thresholds are small and sampling is regular, the intermittent controller can masquerade as the underlying continuous-time controller and thus, under these conditions, the continuous-time and intermittent controller cannot be distinguished. This explains why the intermittent control hypothesis is consistent with the continuous control hypothesis for certain experimental conditions.
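    The "continuous observation, intermittent action" idea lends itself to a toy demonstration. The sketch below is our own construction, not the authors' model: a first-order plant is regulated by a proportional controller whose held output is recomputed only when the observed state drifts past an event threshold; all constants are illustrative.

```python
# Toy event-driven intermittent controller: the state is observed at
# every step, but the held control action is recomputed only when the
# state has drifted more than a threshold from its value at the last
# event. Constants are illustrative, not taken from the paper.
dt, steps = 0.01, 1000            # time step (s) and number of steps
a, b = -1.0, 1.0                  # first-order plant: dx/dt = a*x + b*u
k, threshold = 2.0, 0.05          # proportional gain and event threshold
ref = 1.0                         # setpoint
x, x_held, u = 0.0, float("inf"), 0.0
events = 0

for _ in range(steps):
    if abs(x - x_held) > threshold:   # event trigger
        x_held = x                    # sample-and-hold the observation
        u = k * (ref - x_held)        # intermittent control update
        events += 1
    x += dt * (a * x + b * u)         # plant integration (Euler)

print(f"final state {x:.3f}, control updated {events} of {steps} steps")
```

    Only a handful of control updates are needed out of a thousand observation steps, which is the computational saving the abstract attributes to intermittent predictive control.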

  13. Computed tomography of human joints and radioactive waste drums

    International Nuclear Information System (INIS)

    Martz, Harry E.; Roberson, G. Patrick; Hollerbach, Karin; Logan, Clinton M.; Ashby, Elaine; Bernardi, Richard

    1999-01-01

    X- and gamma-ray imaging techniques in nondestructive evaluation (NDE) and assay (NDA) have seen increasing use in an array of industrial, environmental, military, and medical applications. Much of this growth in recent years is attributed to the rapid development of computed tomography (CT) and the use of NDE throughout the life-cycle of a product. Two diverse examples of CT are discussed: (1) Our computational approach to normal joint kinematics and prosthetic joint analysis offers an opportunity to evaluate and improve prosthetic human joint replacements before they are manufactured or surgically implanted. Computed tomography data from scanned joints are segmented, resulting in the identification of bone and other tissues of interest, with emphasis on the articular surfaces. (2) We are developing NDE and NDA techniques to analyze closed waste drums accurately and quantitatively. Active and passive computed tomography (A&PCT) is a comprehensive and accurate gamma-ray NDA method that can identify all detectable radioisotopes present in a container and measure their radioactivity.

  14. Identification of Enhancers In Human: Advances In Computational Studies

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2016-03-24

    Roughly 50% of the human genome contains noncoding sequences serving as regulatory elements responsible for the diverse gene expression of the cells in the body. One very well studied category of regulatory elements is enhancers. Enhancers increase the transcriptional output in cells through chromatin remodeling or recruitment of complexes of binding proteins. Identification of enhancers using computational techniques is an interesting area of research and up to now several approaches have been proposed. However, the current state-of-the-art methods face limitations, since the function of enhancers is clarified but their mechanism of function is not well understood. This PhD thesis presents a bioinformatics/computer science study that focuses on the problem of identifying enhancers in different human cells using computational techniques. The dissertation is decomposed into four main tasks that we present in different chapters. First, since many of the enhancers' functions are not well understood, we study the basic biological models by which enhancers trigger transcriptional functions and we survey comprehensively over 30 bioinformatics approaches for identifying enhancers. Next, we elaborate on the availability of enhancer data as produced by different enhancer identification methods and experimental procedures. In particular, we analyze the advantages and disadvantages of existing solutions and we report obstacles that require further consideration. To mitigate these problems we developed the Database of Integrated Human Enhancers (DENdb), a centralized online repository that archives enhancer data from 16 ENCODE cell-lines. The integrated enhancer data are also combined with many other experimental data that can be used to interpret the enhancer content and generate a novel enhancer annotation that complements the existing integrative annotation proposed by the ENCODE consortium. Next, we propose the first deep-learning computational

  15. CHI '13 Extended Abstracts on Human Factors in Computing Systems

    DEFF Research Database (Denmark)

    …also deeply appreciate the huge amount of time donated to this process by the 211-member program committee, who paid their own way to attend the face-to-face program committee meeting, an event larger than the average ACM conference. We are proud of the work of the CHI 2013 program committee and hope … a tremendous amount of work from all areas of the human-computer interaction community. As co-chairs of the process, we are amazed at the ability of the community to organize itself to accomplish this task. We would like to thank the 2680 individual reviewers for their careful consideration of these papers. We…

  16. Code system to compute radiation dose in human phantoms

    International Nuclear Information System (INIS)

    Ryman, J.C.; Cristy, M.; Eckerman, K.F.; Davis, J.L.; Tang, J.S.; Kerr, G.D.

    1986-01-01

    A Monte Carlo photon transport code and a code using Monte Carlo integration of a point kernel have been revised to incorporate human phantom models for an adult female, juveniles of various ages, and a pregnant female at the end of the first trimester of pregnancy, in addition to the adult male used earlier. An analysis code has been developed for deriving recommended values of specific absorbed fractions of photon energy. The computer code system and calculational method are described, emphasizing recent improvements in methods.
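    As a rough illustration of what "Monte Carlo integration of a point kernel" means in this context (a toy setup, not the code system described above): the response at a detector point can be estimated by averaging an attenuated inverse-square kernel over source positions sampled uniformly in a volume. The geometry, units, and attenuation coefficient below are invented.

```python
import math
import random

def point_kernel_estimate(detector, half_width=1.0, mu=0.1, n=100_000):
    """Monte Carlo integration of a point kernel: average
    exp(-mu*r) / (4*pi*r^2) over source points sampled uniformly in a
    cube of the given half-width centred at the origin. mu and the
    geometry are illustrative, not physical constants from the report."""
    random.seed(42)
    total = 0.0
    for _ in range(n):
        src = (random.uniform(-half_width, half_width),
               random.uniform(-half_width, half_width),
               random.uniform(-half_width, half_width))
        r = math.dist(detector, src)                  # source-detector distance
        total += math.exp(-mu * r) / (4.0 * math.pi * r * r)
    return total / n                                  # volume-averaged kernel

print(point_kernel_estimate(detector=(5.0, 0.0, 0.0)))
```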

  17. Shape perception in human and computer vision an interdisciplinary perspective

    CERN Document Server

    Dickinson, Sven J

    2013-01-01

    This comprehensive and authoritative text/reference presents a unique, multidisciplinary perspective on Shape Perception in Human and Computer Vision. Rather than focusing purely on the state of the art, the book provides viewpoints from world-class researchers reflecting broadly on the issues that have shaped the field. Drawing upon many years of experience, each contributor discusses the trends followed and the progress made, in addition to identifying the major challenges that still lie ahead. Topics and features: examines each topic from a range of viewpoints, rather than promoting a speci

  18. Virtual reality/ augmented reality technology : the next chapter of human-computer interaction

    OpenAIRE

    Huang, Xing

    2015-01-01

    No matter how many different sizes and shapes computers take, their basic components remain the same. If we look at the history of computing from the user's perspective, we find, perhaps surprisingly, that it is the input/output devices that have led the development of the industry; in a word, human-computer interaction has shaped the course of computer history. Human-computer interaction has gone through three stages; the first stage relied on the inpu...

  19. My4Sight: A Human Computation Platform for Improving Flu Predictions

    OpenAIRE

    Akupatni, Vivek Bharath

    2015-01-01

    While many human computation (human-in-the-loop) systems exist in the field of Artificial Intelligence (AI) to solve problems that can't be solved by computers alone, comparatively fewer platforms exist for collecting human knowledge and for evaluating the various techniques for harnessing human insights to improve forecasting models for infectious diseases such as Influenza and Ebola. In this thesis, we present the design and implementation of My4Sight, a human computation system develope...

  20. The biofuels in debate

    International Nuclear Information System (INIS)

    Rigaud, Ch.

    2007-01-01

    As the development of biofuels increases around the world, many voices are rising to denounce the environmental risks and the competition of green fuels with food farming. The debate points out the problems to be solved to develop a sustainable supply chain. (A.L.B.)

  1. Derailing the Growth Debate

    DEFF Research Database (Denmark)

    Nørgaard, Jørgen

    2009-01-01

    …that we know today implies that the report was in any sense fundamentally wrong. A cohort of critics at the time, it can be said, was seriously in error when they managed to derail the debate by rejecting the report's conclusions, and a lot of the critique was not related to the content of the report…

  2. Debating China's assertiveness

    DEFF Research Database (Denmark)

    He, Kai; Feng, Huiyun

    2012-01-01

    Engaging the recent debate on China's assertive foreign policy, we suggest that it is normal for China – a rising power – to change its policy in a confident or even assertive direction because of its transformed national interests. We argue also that it is better to understand future US–China re...

  3. WORKSHOP: Discussion, debate, deliberation

    NARCIS (Netherlands)

    Jeliazkova, Margarita I.

    2014-01-01

    Discussing, deliberating and debating are a core part of any democratic process. To organise these processes well, a great deal of knowledge and skill is required. It is not simple to find a good balance between a number of elements: appropriate language and terminology; paying attention to solid

  4. Debates in Teaching Bioethics

    Science.gov (United States)

    Kedraka, Katerina; Kourkoutas, Yiannis

    2018-01-01

    In this small-scale study in higher education, a good educational practice for the teaching of Bioethics, based on transformative learning and realized through debates, is presented. The research was carried out in June 2016 at the Department of Molecular Biology and Genetics, Democritus University of Thrace, Greece, and it includes the assessment of…

  5. Vitalism and the Darwin Debate

    Science.gov (United States)

    Henderson, James

    2012-01-01

    There are currently both scientific and public debates surrounding Darwinism. In the scientific debate, the details of evolution are in dispute, but not the central thesis of Darwin's theory; in the public debate, Darwinism itself is questioned. I concentrate on the public debate because of its direct impact on education in the United States. Some…

  6. Aspects of computer control from the human engineering standpoint

    International Nuclear Information System (INIS)

    Huang, T.V.

    1979-03-01

    A computer control system includes data acquisition, information display and output control signals. In order to design such a system effectively we must first determine the required operational mode: automatic control (closed loop), computer assisted (open loop), or hybrid control. The choice of operating mode will depend on the nature of the plant, the complexity of the operation, the funds available, and the technical expertise of the operating staff, among many other factors. Once the mode has been selected, consideration must be given to the method (man/machine interface) by which the operator interacts with the system. Human engineering factors are of prime importance to achieving high operating efficiency, and very careful attention must be given to this aspect of the work if full operator acceptance is to be achieved. This paper discusses these topics and draws on experience gained in setting up the computer control system in the Main Control Center of Stanford University's Accelerator Center (a high-energy physics research facility).

  7. Evidence Report: Risk of Inadequate Human-Computer Interaction

    Science.gov (United States)

    Holden, Kritina; Ezer, Neta; Vos, Gordon

    2013-01-01

    Human-computer interaction (HCI) encompasses all the methods by which humans and computer-based systems communicate, share information, and accomplish tasks. When HCI is poorly designed, crews have difficulty entering, navigating, accessing, and understanding information. HCI has rarely been studied in an operational spaceflight context, and detailed performance data that would support evaluation of HCI have not been collected; thus, we draw much of our evidence from post-spaceflight crew comments, and from other safety-critical domains like ground-based power plants, and aviation. Additionally, there is a concern that any potential or real issues to date may have been masked by the fact that crews have near constant access to ground controllers, who monitor for errors, correct mistakes, and provide additional information needed to complete tasks. We do not know what types of HCI issues might arise without this "safety net". Exploration missions will test this concern, as crews may be operating autonomously due to communication delays and blackouts. Crew survival will be heavily dependent on available electronic information for just-in-time training, procedure execution, and vehicle or system maintenance; hence, the criticality of the Risk of Inadequate HCI. Future work must focus on identifying the most important contributing risk factors, evaluating their contribution to the overall risk, and developing appropriate mitigations. The Risk of Inadequate HCI includes eight core contributing factors based on the Human Factors Analysis and Classification System (HFACS): (1) Requirements, policies, and design processes, (2) Information resources and support, (3) Allocation of attention, (4) Cognitive overload, (5) Environmentally induced perceptual changes, (6) Misperception and misinterpretation of displayed information, (7) Spatial disorientation, and (8) Displays and controls.

  8. Gaze-and-brain-controlled interfaces for human-computer and human-robot interaction

    Directory of Open Access Journals (Sweden)

    Shishkin S. L.

    2017-09-01

    Background. Human-machine interaction technology has greatly evolved during the last decades, but manual and speech modalities remain the sole output channels, with their typical constraints imposed by the motor system's information transfer limits. Will brain-computer interfaces (BCIs) and gaze-based control be able to convey human commands or even intentions to machines in the near future? We provide an overview of basic approaches in this new area of applied cognitive research. Objective. We test the hypothesis that the use of communication paradigms and a combination of eye tracking with unobtrusive forms of registering brain activity can improve human-machine interaction. Methods and Results. Three groups of ongoing experiments at the Kurchatov Institute are reported. First, we discuss the communicative nature of human-robot interaction, and approaches to building a more efficient technology. Specifically, "communicative" patterns of interaction can be based on joint attention paradigms from developmental psychology, including a mutual "eye-to-eye" exchange of looks between human and robot. Further, we provide an example of "eye mouse" superiority over the computer mouse, here in emulating the task of selecting a moving robot from a swarm. Finally, we demonstrate a passive, noninvasive BCI that uses EEG correlates of expectation. This may become an important filter to separate intentional gaze dwells from non-intentional ones. Conclusion. The current noninvasive BCIs are not well suited for human-robot interaction, and their performance, when they are employed by healthy users, is critically dependent on the impact of the gaze on selection of spatial locations. The new approaches discussed show a high potential for creating alternative output pathways for the human brain. When support from passive BCIs becomes mature, the hybrid technology of the eye-brain-computer interface (EBCI) will have a chance to enable natural, fluent, and the

  9. The great climate debate

    International Nuclear Information System (INIS)

    Sudhakara Reddy, B.; Assenza, Gaudenz B.

    2009-01-01

    For over two decades, scientific and political communities have debated whether and how to act on climate change. The present paper revisits these debates and synthesizes the longstanding arguments. Firstly, it provides an overview of the development of international climate policy and discusses clashing positions, represented by sceptics and supporters of action on climate change. Secondly, it discusses market-based measures as a means to increase win-win opportunities and to attract profit-minded investors to invest in climate change mitigation. Finally, the paper examines whether climate protection policies can yield benefits both for the environment and the economy. A new breed of analysts is identified who are convinced of the climate change problem while remaining sceptical of the proposed solutions. The paper suggests integrating climate policies with development priorities that are vitally important for developing countries, and stresses the need to use sustainable development as a framework for climate change policies.

  10. 'Homeopathy': untangling the debate.

    Science.gov (United States)

    Relton, Clare; O'Cathain, Alicia; Thomas, Kate J

    2008-07-01

    There are active public campaigns both for and against homeopathy, and its continuing availability in the NHS is debated in the medical, scientific and popular press. However, there is a lack of clarity in key terms used in the debate, and in how the evidence base of homeopathy is described and interpreted. The term 'homeopathy' is used with several different meanings including: the therapeutic system, homeopathic medicine, treatment by a homeopath, and the principles of 'homeopathy'. Conclusions drawn from one of these aspects are often inappropriately applied to another aspect. In interpreting the homeopathy evidence it is important to understand that the existing clinical experimental (randomised controlled trial) evidence base provides evidence as to the efficacy of homeopathic medicines, but not the effectiveness of treatment by a homeopath. The observational evidence base provides evidence as to the effectiveness of treatment by a homeopath. We make four recommendations to promote clarity in the reporting, design and interpretation of homeopathy research.

  11. Darfur a debate

    OpenAIRE

    Cohen, Roberta

    2008-01-01

    The bitter debates within the humanitarian and human rights communities centre on the number of victims in Darfur, the use of the term "genocide", the effectiveness of military solutions compared with political ones, and the extent to which human rights advocacy can weaken humanitarian programmes on the ground.

  12. Human-computer interface incorporating personal and application domains

    Science.gov (United States)

    Anderson, Thomas G [Albuquerque, NM]

    2011-03-29

    The present invention provides a human-computer interface. The interface includes provision of an application domain, for example corresponding to a three-dimensional application. The user is allowed to navigate and interact with the application domain. The interface also includes a personal domain, offering the user controls and interaction distinct from the application domain. The separation into two domains allows the most suitable interface methods in each: for example, three-dimensional navigation in the application domain, and two- or three-dimensional controls in the personal domain. Transitions between the application domain and the personal domain are under control of the user, and the transition method is substantially independent of the navigation in the application domain. For example, the user can fly through a three-dimensional application domain, and always move to the personal domain by moving a cursor near one extreme of the display.

  13. A computational model of human auditory signal processing and perception

    DEFF Research Database (Denmark)

    Jepsen, Morten Løve; Ewert, Stephan D.; Dau, Torsten

    2008-01-01

    A model of computational auditory signal-processing and perception that accounts for various aspects of simultaneous and nonsimultaneous masking in human listeners is presented. The model is based on the modulation filterbank model described by Dau et al. [J. Acoust. Soc. Am. 102, 2892 (1997)] … discrimination with pure tones and broadband noise, tone-in-noise detection, spectral masking with narrow-band signals and maskers, forward masking with tone signals and tone or noise maskers, and amplitude-modulation detection with narrow- and wideband noise carriers. The model can account for most of the key properties of the data and is more powerful than the original model. The model might be useful as a front end in technical applications…

  14. Human-computer interface glove using flexible piezoelectric sensors

    Science.gov (United States)

    Cha, Youngsu; Seo, Jeonggyu; Kim, Jun-Sik; Park, Jung-Min

    2017-05-01

    In this note, we propose a human-computer interface glove based on flexible piezoelectric sensors. We select polyvinylidene fluoride as the piezoelectric material for the sensors because of advantages such as a steady piezoelectric characteristic and good flexibility. The sensors are installed in a fabric glove by means of pockets and Velcro bands. We detect changes in the angles of the finger joints from the outputs of the sensors, and use them for controlling a virtual hand that is utilized in virtual object manipulation. To assess the sensing ability of the piezoelectric sensors, we compare the processed angles from the sensor outputs with the real angles from a camera recording. With good agreement between the processed and real angles, we successfully demonstrate the user interaction system with the virtual hand and interface glove based on the flexible piezoelectric sensors, for four hand motions: fist clenching, pinching, touching, and grasping.
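    Because PVDF-type piezoelectric sensors respond roughly to the rate of bending rather than to a static bend, one plausible way to recover a joint angle is to integrate the sensor output over time. The sketch below is our illustration of that idea, not the authors' processing chain; the gain and the synthetic signal are invented.

```python
import math

# Toy reconstruction of a finger-joint angle from a flexible
# piezoelectric (PVDF) sensor. The output is treated as proportional
# to the bending rate, so integrating it tracks the angle change.
# The gain and the fake sensor trace are invented for illustration.
dt, gain = 0.01, 90.0                                        # s, deg per V*s
samples = [math.sin(math.pi * i * dt) for i in range(100)]   # fake volts, 1 s

angle = 0.0
for v in samples:
    angle += gain * v * dt        # integrate the rate-type sensor output

print(f"estimated joint flexion after 1 s: {angle:.1f} degrees")
```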

  15. Simple, accurate equations for human blood O2 dissociation computations.

    Science.gov (United States)

    Severinghaus, J W

    1979-03-01

    Hill's equation can be slightly modified to fit the standard human blood O2 dissociation curve to within ±0.0055 fractional saturation (S) over the range 0 < S < 1. Other modifications of Hill's equation may be used to compute Po2 (Torr) from S (Eq. 2), and the temperature coefficient of Po2 (Eq. 3). Variation of the Bohr coefficient with Po2 is given by Eq. 4.

    S = ((Po2^3 + 150·Po2)^(-1) × 23,400 + 1)^(-1)   (1)
    ln Po2 = 0.385 × ln((S^(-1) − 1)^(-1)) + 3.32 − (72·S)^(-1) − 0.17·S^6   (2)
    Δln Po2 / ΔT = 0.058 × ((0.243 × Po2/100)^3.88 + 1)^(-1) + 0.013   (3)
    Δln Po2 / ΔpH = (Po2/26.6)^0.184 − 2.2   (4)

    Procedures are described to determine Po2 and S of blood iteratively after extraction or addition of a defined amount of O2 and to compute P50 of blood from a single sample after measuring Po2, pH, and S.
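    The equations above transcribe directly into code. Below is a minimal sketch of Eqs. (1) and (2) as reconstructed here, with variable names of our choosing; it reproduces the familiar checkpoints of the dissociation curve (about 97.7% saturation at 100 Torr, P50 near 26.7 Torr).

```python
import math

def sat_from_po2(po2):
    """Eq. (1): fractional saturation S from Po2 in Torr."""
    return 1.0 / (23400.0 / (po2 ** 3 + 150.0 * po2) + 1.0)

def po2_from_sat(s):
    """Eq. (2): Po2 in Torr from fractional saturation S (0 < S < 1)."""
    ln_po2 = (0.385 * math.log(1.0 / (1.0 / s - 1.0))
              + 3.32 - 1.0 / (72.0 * s) - 0.17 * s ** 6)
    return math.exp(ln_po2)

# Sanity checks against well-known values of the standard curve:
print(round(sat_from_po2(100.0), 4))   # ~0.9775 (arterial blood)
print(round(po2_from_sat(0.5), 1))     # ~26.8 Torr (P50)
```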

  16. Assessing Human Judgment of Computationally Generated Swarming Behavior

    Directory of Open Access Journals (Sweden)

    John Harvey

    2018-02-01

    Computer-based swarm systems, aiming to replicate the flocking behavior of birds, were first introduced by Reynolds in 1987. In his initial work, Reynolds noted that while it was difficult to quantify the dynamics of the behavior from the model, observers of his model immediately recognized them as a representation of a natural flock. Considerable analysis has been conducted since then on quantifying the dynamics of flocking/swarming behavior. However, no systematic analysis has been conducted on human identification of swarming. In this paper, we assess subjects' judgments of the behavior of a simplified version of Reynolds' model. Factors that affect the identification of swarming are discussed and future applications of the resulting models are proposed. Differences in decision times for swarming-related questions asked during the study indicate that different brain mechanisms may be involved in different elements of the behavior assessment task. The relatively simple but finely tunable model used in this study provides a useful methodology for assessing individual human judgment of swarming behavior.
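    For readers unfamiliar with the underlying model, a bare-bones rendition of Reynolds-style flocking, steering each agent by cohesion, alignment, and separation against its neighbours, looks roughly like the sketch below. The weights, radii, and speed cap are our own illustrative choices, not those of the study.

```python
import numpy as np

# Minimal Reynolds-style flocking: each agent steers toward the local
# centre (cohesion), matches neighbours' velocity (alignment), and
# moves away from crowding neighbours (separation). Parameters are
# illustrative only.
rng = np.random.default_rng(0)
n, radius = 30, 2.0
pos = rng.uniform(0, 10, (n, 2))
vel = rng.uniform(-1, 1, (n, 2))

def step(pos, vel, dt=0.1, w_coh=0.05, w_ali=0.1, w_sep=0.2):
    new_vel = vel.copy()
    for i in range(n):
        d = np.linalg.norm(pos - pos[i], axis=1)
        nbr = (d < radius) & (d > 0)          # neighbours within the radius
        if nbr.any():
            coh = pos[nbr].mean(axis=0) - pos[i]   # steer toward local centre
            ali = vel[nbr].mean(axis=0) - vel[i]   # match neighbours' velocity
            sep = (pos[i] - pos[nbr]).sum(axis=0)  # push away from crowding
            new_vel[i] += w_coh * coh + w_ali * ali + w_sep * sep
    speed = np.maximum(np.linalg.norm(new_vel, axis=1, keepdims=True), 1e-9)
    new_vel = np.where(speed > 2.0, new_vel * (2.0 / speed), new_vel)  # cap speed
    return pos + dt * new_vel, new_vel

for _ in range(100):
    pos, vel = step(pos, vel)
print("flock centroid:", pos.mean(axis=0), "spread:", pos.std(axis=0))
```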

  17. Hybrid Human-Computing Distributed Sense-Making: Extending the SOA Paradigm for Dynamic Adjudication and Optimization of Human and Computer Roles

    Science.gov (United States)

    Rimland, Jeffrey C.

    2013-01-01

    In many evolving systems, inputs can be derived from both human observations and physical sensors. Additionally, many computation and analysis tasks can be performed by either human beings or artificial intelligence (AI) applications. For example, weather prediction, emergency event response, assistive technology for various human sensory and…

  18. Mutations that Cause Human Disease: A Computational/Experimental Approach

    Energy Technology Data Exchange (ETDEWEB)

    Beernink, P; Barsky, D; Pesavento, B

    2006-01-11

    International genome sequencing projects have produced billions of nucleotides (letters) of DNA sequence data, including the complete genome sequences of 74 organisms. These genome sequences have created many new scientific opportunities, including the ability to identify sequence variations among individuals within a species. These genetic differences, which are known as single nucleotide polymorphisms (SNPs), are particularly important in understanding the genetic basis for disease susceptibility. Since the report of the complete human genome sequence, over two million human SNPs have been identified, including a large-scale comparison of an entire chromosome from twenty individuals. Of the protein-coding SNPs (cSNPs), approximately half lead to a single amino acid change in the encoded protein (non-synonymous coding SNPs). Most of these changes are functionally silent, while the remainder negatively impact the protein and sometimes cause human disease. To date, over 550 SNPs have been found to cause single-locus (monogenic) diseases and many others have been associated with polygenic diseases. SNPs have been linked to specific human diseases, including late-onset Parkinson disease, autism, rheumatoid arthritis and cancer. The ability to predict accurately the effects of these SNPs on protein function would represent a major advance toward understanding these diseases. To date several attempts have been made toward predicting the effects of such mutations. The most successful of these is a computational approach called "Sorting Intolerant From Tolerant" (SIFT). This method uses sequence conservation among many similar proteins to predict which residues in a protein are functionally important. However, this method suffers from several limitations. First, a query sequence must have a sufficient number of relatives to infer sequence conservation. Second, this method does not make use of or provide any information on protein structure, which
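    SIFT itself is considerably more elaborate, but its core signal, position-specific conservation across aligned homologues, can be caricatured in a few lines. In the sketch below the alignment, cutoff, and helper name are invented for illustration; it is not the SIFT algorithm itself.

```python
from collections import Counter

# Toy conservation-based tolerance call in the spirit of SIFT: estimate
# per-position amino acid frequencies from aligned homologues and flag
# a substitution as intolerant when its frequency at that position falls
# below a cutoff. The alignment and cutoff are invented examples.
alignment = [
    "MKTAYIAK",
    "MKTAHIAK",
    "MRTAYIGK",
    "MKTAYLAK",
    "MKSAYIAK",
]

def tolerance(position, new_residue, cutoff=0.1):
    """Return (frequency, verdict) for substituting new_residue at a
    0-based alignment position."""
    column = [seq[position] for seq in alignment]
    freq = Counter(column)[new_residue] / len(column)
    return freq, ("tolerated" if freq >= cutoff else "predicted deleterious")

print(tolerance(1, "R"))  # K->R observed once in homologues: tolerated
print(tolerance(0, "W"))  # M->W never observed: predicted deleterious
```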

  19. Great software debates

    CERN Document Server

    Davis, A

    2004-01-01

    The industry's most outspoken and insightful critic explains how the software industry REALLY works. In Great Software Debates, Al Davis shares what he has learned about the difference between the theory and the realities of business and encourages you to question and think about software engineering in ways that will help you succeed where others fail. In short, provocative essays, Davis fearlessly reveals the truth about process improvement, productivity, software quality, metrics, agile development, requirements documentation, modeling, software marketing and sales, empiricism, start-up financing, software research, requirements triage, software estimation, and entrepreneurship.

  20. The public debate on CIGEO

    International Nuclear Information System (INIS)

    2014-01-01

    This document first describes the two laws that govern the public debate on the storage of high-activity and long-lived wastes. It reports the progress of this public debate, which started with a statement by 45 associations committed to the protection of the environment saying they would not participate in the debate. A first debate in Bures had to be stopped very quickly as these opponents burst into the room. The opponents' vision is very briefly presented, as is the reaction of the public debate organizers. The results of the debate are briefly discussed. It appears that the ethical aspect is often raised by the opponents, and this document points out that their reactions were mostly irrational. The major issues of the debate were: risks related to water, hydrogen and earthquakes; costs and financing; transport safety; the loss of geological resources; job creation; and governance. The various aspects of this public debate are commented on and discussed.

  1. Human Pacman: A Mobile Augmented Reality Entertainment System Based on Physical, Social, and Ubiquitous Computing

    Science.gov (United States)

    Cheok, Adrian David

    This chapter details the Human Pacman system to illuminate entertainment computing, which ventures to embed the natural physical world seamlessly with a fantasy virtual playground by capitalizing on infrastructure provided by mobile computing, wireless LAN, and ubiquitous computing. With Human Pacman, we have a physical role-playing computer fantasy together with real human-social and mobile-gaming that emphasizes collaboration and competition between players in a wide outdoor physical area that allows natural wide-area human-physical movements. Pacmen and Ghosts are now real human players in the real world experiencing mixed computer-graphics fantasy-reality provided by the wearable computers on them. Virtual cookies and actual tangible physical objects are incorporated into the game play to provide novel experiences of seamless transitions between the real and virtual worlds. This is an example of a new form of gaming that anchors on physicality, mobility, social interaction, and ubiquitous computing.

  2. Open-Box Muscle-Computer Interface: Introduction to Human-Computer Interactions in Bioengineering, Physiology, and Neuroscience Courses

    Science.gov (United States)

    Landa-Jiménez, M. A.; González-Gaspar, P.; Pérez-Estudillo, C.; López-Meraz, M. L.; Morgado-Valle, C.; Beltran-Parrazal, L.

    2016-01-01

    A Muscle-Computer Interface (muCI) is a human-machine system that uses electromyographic (EMG) signals to communicate with a computer. Surface EMG (sEMG) signals are currently used to command robotic devices, such as robotic arms and hands, and mobile robots, such as wheelchairs. These signals reflect the motor intention of a user before the…

  3. Enrichment of Human-Computer Interaction in Brain-Computer Interfaces via Virtual Environments

    Directory of Open Access Journals (Sweden)

    Alonso-Valerdi Luz María

    2017-01-01

    Tridimensional representations stimulate cognitive processes that are the core and foundation of human-computer interaction (HCI). Those cognitive processes take place while a user navigates and explores a virtual environment (VE) and are mainly related to spatial memory storage, attention, and perception. VEs have many distinctive features (e.g., involvement, immersion, and presence) that can significantly improve HCI in highly demanding and interactive systems such as brain-computer interfaces (BCIs). A BCI is a nonmuscular communication channel that attempts to reestablish the interaction between an individual and his/her environment. Although BCI research started in the sixties, this technology is not yet efficient or reliable for everyone at any time. Over the past few years, researchers have argued that the main BCI flaws could be associated with HCI issues. The evidence presented thus far shows that VEs can (1) set out working environmental conditions, (2) maximize the efficiency of BCI control panels, (3) implement navigation systems based not only on user intentions but also on user emotions, and (4) regulate the user's mental state to increase the differentiation between control and noncontrol modalities.

  4. The nuclear power debate

    International Nuclear Information System (INIS)

    Woerndl, B.

    1992-01-01

    This material-intensive analysis of the public dispute about nuclear power plants uses the fundamental ideas of Georg Simmel's conflict theory approach, linking them to results of recent value-change research. Through a qualitative content analysis of arguments for and against nuclear energy, it is shown how values are expressed and shift, how they differentiate and get modified, in conflicting argumentation patterns. The first part reconstructs the history of the nuclear power conflict under the aspect of its changing subject priorities. The second part shows, based on three debate priorities, how recognized social value patterns changed in and through the conflict: the argument is that the nuclear power controversy has led to a relativization of science's claim to recognition; it has created a problem awareness with regard to purely quantitatively oriented growth objectives and developed criteria for an ecologically controlled satisfaction of needs; and the debate has paved the way, in the area of political regulation models, for the advancement of basic democratic elements within a representative democracy. (orig./HP) [de]

  5. The Crisis in Policy Debate.

    Science.gov (United States)

    Rowland, Robert C.; Deatherage, Scott

    1988-01-01

    Asserts that policy debate is declining, mainly because of incomprehensible argumentation and speaking. Claims that judges should intervene in the debate process to demand certain minimums of effective argument. Advocates the creation of a debate coach organization that would establish general norms for judging behavior. (MM)

  6. Public debate - radioactive wastes management

    International Nuclear Information System (INIS)

    2005-01-01

    Between September 2005 and January 2006, a national debate was organized on radioactive waste management. The debate aimed to inform the public and allow them to express their opinion. This document presents the reasons for the debate, how it was run, a synthesis of its results, and technical documents providing background information on radioactive waste management. (A.L.B.)

  7. Human-Centered Design of Human-Computer-Human Dialogs in Aerospace Systems

    Science.gov (United States)

    Mitchell, Christine M.

    1998-01-01

    A series of ongoing research programs at Georgia Tech established a need for a simulation support tool for aircraft computer-based aids. This led to the design and development of the Georgia Tech Electronic Flight Instrument Research Tool (GT-EFIRT). GT-EFIRT is a part-task flight simulator specifically designed to study aircraft display design and single-pilot interaction. The simulator, using commercially available graphics and Unix workstations, replicates to a high level of fidelity the Electronic Flight Instrument Systems (EFIS), Flight Management Computer (FMC) and Auto Flight Director System (AFDS) of the Boeing 757/767 aircraft. The simulator can be configured to present information using conventional-looking B757/767 displays or next-generation Primary Flight Displays (PFD) such as those found on the Beech Starship and MD-11.

  8. A Human/Computer Learning Network to Improve Biodiversity Conservation and Research

    OpenAIRE

    Kelling, Steve; Gerbracht, Jeff; Fink, Daniel; Lagoze, Carl; Wong, Weng-Keen; Yu, Jun; Damoulas, Theodoros; Gomes, Carla

    2012-01-01

    In this paper we describe eBird, a citizen-science project that takes advantage of the human observational capacity to identify birds to species, which is then used to accurately represent patterns of bird occurrences across broad spatial and temporal extents. eBird employs artificial intelligence techniques such as machine learning to improve data quality by taking advantage of the synergies between human computation and mechanical computation. We call this a Human-Computer Learning Network,...

  9. A collaborative brain-computer interface for improving human performance.

    Directory of Open Access Journals (Sweden)

    Yijun Wang

    Full Text Available Electroencephalogram (EEG) based brain-computer interfaces (BCI) have been studied since the 1970s. Currently, the main focus of BCI research lies on clinical use, which aims to provide a new communication channel to patients with motor disabilities to improve their quality of life. However, BCI technology can also be used to improve human performance for normal healthy users. Although this application has been proposed for a long time, little progress has been made in real-world practice due to the technical limits of EEG. To overcome the bottleneck of low single-user BCI performance, this study proposes a collaborative paradigm to improve overall BCI performance by integrating information from multiple users. To test the feasibility of a collaborative BCI, this study quantitatively compares the classification accuracies of collaborative and single-user BCI applied to EEG data collected from 20 subjects in a movement-planning experiment. This study also explores three different methods for fusing and analyzing EEG data from multiple subjects: (1) event-related potential (ERP) averaging, (2) feature concatenating, and (3) voting. In a demonstration system using the voting method, the classification accuracy of predicting movement directions (reaching left vs. reaching right) was enhanced substantially from 66% to 80%, 88%, 93%, and 95% as the number of subjects increased from 1 to 5, 10, 15, and 20, respectively. Furthermore, the decision on reaching direction could be made around 100-250 ms earlier than the subject's actual motor response by decoding the ERP activities arising mainly from the posterior parietal cortex (PPC), which are related to the processing of visuomotor transmission. Taken together, these results suggest that a collaborative BCI can effectively fuse the brain activities of a group of people to improve the overall performance of natural human behavior.
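
    The record's voting fusion can be sketched with a toy simulation: each subject's single-user classifier votes for a reaching direction and the group takes the majority. The 66% per-subject accuracy echoes the figure reported above, but the independence assumption and the simulated data are ours, so the numbers only illustrate the trend.

        # Toy majority-vote fusion across simulated subjects (not the study's data).
        import numpy as np

        rng = np.random.default_rng(1)
        n_trials = 1000
        true = rng.integers(0, 2, n_trials)             # ground-truth directions

        def simulate_votes(n_subjects, acc=0.66):
            """Each subject votes correctly with probability `acc`, independently."""
            correct = rng.random((n_subjects, n_trials)) < acc
            return np.where(correct, true, 1 - true)

        for n in (1, 5, 10, 15, 20):
            votes = simulate_votes(n)
            # Majority vote; ties (possible for even n) default to class 0.
            group = (votes.mean(axis=0) > 0.5).astype(int)
            print(n, "subjects:", (group == true).mean())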

  10. [Computational prediction of human immunodeficiency resistance to reverse transcriptase inhibitors].

    Science.gov (United States)

    Tarasova, O A; Filimonov, D A; Poroikov, V V

    2017-10-01

    Human immunodeficiency virus (HIV) causes acquired immunodeficiency syndrome (AIDS) and leads to over one million deaths annually. Highly active antiretroviral treatment (HAART) is the gold standard in HIV/AIDS therapy. Nucleoside and non-nucleoside inhibitors of HIV reverse transcriptase (RT) are an important component of HAART, but their effect depends on the susceptibility/resistance of the virus. HIV resistance mainly occurs due to mutations leading to conformational changes in the three-dimensional structure of HIV RT. The aim of our work was to develop and test a computational method for predicting HIV resistance associated with mutations in HIV RT. Earlier we developed a method for predicting HIV type 1 (HIV-1) resistance based on position-specific descriptors. These descriptors are generated from a particular amino acid residue and its position, with the position of each residue determined in a multiple alignment. The training set consisted of more than 1900 sequences of HIV RT from the Stanford HIV Drug Resistance Database; for these HIV RT variants, experimental data on resistance to ten inhibitors are available. Balanced prediction accuracy varies from 80% to 99% depending on the classification method (support vector machine, naive Bayes, random forest, convolutional neural networks) and the drug for which resistance is predicted. Maximal balanced accuracy was obtained for prediction of resistance to zidovudine, stavudine, didanosine and efavirenz by the random forest classifier. Average prediction accuracy is 89%.
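
    A minimal sketch of the general approach, assuming the position-specific descriptors can be approximated by one-hot encoding the residue at each aligned position; the toy sequence fragments and resistance labels below are invented, whereas the real study trained on more than 1900 RT sequences.

        # Toy position-specific encoding + random forest (illustrative only).
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.preprocessing import OneHotEncoder

        seqs = ["MVLK", "MVLR", "TVLK", "TALR", "MALK", "TVLR"]  # aligned fragments
        resistant = np.array([0, 1, 0, 1, 0, 1])                 # toy labels

        X = np.array([list(s) for s in seqs])                    # one residue per position
        enc = OneHotEncoder(handle_unknown="ignore")
        X_oh = enc.fit_transform(X)                              # position-specific descriptors

        clf = RandomForestClassifier(n_estimators=100, random_state=0)
        clf.fit(X_oh, resistant)
        print(clf.predict(enc.transform([list("TVLK")])))        # predict for a new variant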

  11. Institutionalizing human-computer interaction for global health.

    Science.gov (United States)

    Gulliksen, Jan

    2017-06-01

    Digitalization is the societal change process in which new ICT-based solutions bring forward completely new ways of doing things, new businesses and new movements in society. Digitalization also provides completely new ways of addressing issues related to global health. This paper provides an overview of the field of human-computer interaction (HCI) and the ways in which the field has contributed to international development in different regions of the world. Additionally, it outlines the United Nations' new sustainability goals from December 2015 and what these could contribute to the development of global health and its relationship to digitalization. Finally, it argues why and how HCI could be adopted and adapted to fit contextual needs, the need for localization, and the development of new digital innovations. The research methodology is mostly qualitative, following an action research paradigm in which the actual change process that digitalization evokes is as important as the scientific conclusions that can be drawn. In conclusion, the paper argues that digitalization is fundamentally changing society through the development and use of digital technologies and may have a profound effect on the digital development of every country in the world. But it needs to be developed based on local practices, it needs international support, and it must not be limited by technological constraints. Digitalization to support global health in particular requires a profound understanding of users and their context, arguing for user-centred systems design methodologies as particularly suitable.

  12. Remotely Telling Humans and Computers Apart: An Unsolved Problem

    Science.gov (United States)

    Hernandez-Castro, Carlos Javier; Ribagorda, Arturo

    The ability to tell humans and computers apart is imperative to protect many services from misuse and abuse. For this purpose, tests called CAPTCHAs or HIPs have been designed and put into production. Recent history shows that most (if not all) can be broken given enough time and commercial interest: CAPTCHA design seems to be a much more difficult problem than previously thought. The assumption that difficult AI problems can be easily converted into valid CAPTCHAs is misleading. There are also some extrinsic problems that do not help, especially the large number of in-house designs that are put into production without any prior public critique. In this paper we present a state-of-the-art survey of current HIPs, including proposals that are now in production. We classify them according to their basic design ideas. We discuss current attacks as well as future attack paths, and we also present common errors in design and show how implementation flaws can transform a not necessarily bad idea into a weak CAPTCHA. We present examples of these flaws using specific well-known CAPTCHAs. In a more theoretical way, we discuss the threat model: the risks confronted and the countermeasures. Finally, we introduce and discuss some desirable properties that new HIPs should have, concluding with some proposals for future work, including methodologies for design, implementation and security assessment.
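
    To make the point about weak in-house designs concrete, here is a hypothetical toy HIP that a few lines of script defeat without any human perception; the challenge format is invented and does not correspond to any specific CAPTCHA discussed in the paper.

        # A deliberately weak "human test" and the trivial bot that solves it.
        import random
        import re

        def make_challenge():
            a, b = random.randint(1, 9), random.randint(1, 9)
            return f"What is {a} plus {b}?", a + b

        def robot_solver(challenge):
            """A regex and an addition defeat the scheme -- no perception needed."""
            a, b = map(int, re.findall(r"\d+", challenge))
            return a + b

        q, answer = make_challenge()
        print(q, "-> robot answers", robot_solver(q),
              "| correct:", robot_solver(q) == answer)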

  13. Inferring Human Activity in Mobile Devices by Computing Multiple Contexts.

    Science.gov (United States)

    Chen, Ruizhi; Chu, Tianxing; Liu, Keqiang; Liu, Jingbin; Chen, Yuwei

    2015-08-28

    This paper introduces a framework for inferring human activities in mobile devices by computing spatial contexts, temporal contexts, spatiotemporal contexts, and user contexts. A spatial context is a significant location that is defined as a geofence, which can be a node associated with a circle, or a polygon; a temporal context contains time-related information such as a local time tag, a time difference between geographical locations, or a timespan; a spatiotemporal context is defined as a dwelling length at a particular spatial context; and a user context includes user-related information such as the user's mobility contexts, environmental contexts, psychological contexts or social contexts. Using the measurements of the built-in sensors and radio signals in mobile devices, we can snapshot a contextual tuple every second including the aforementioned contexts. Given a contextual tuple, the framework evaluates the posterior probability of each candidate activity in real time using a Naïve Bayes classifier. A large dataset containing 710,436 contextual tuples was recorded over one week in an experiment carried out at Texas A&M University Corpus Christi with three participants. The test results demonstrate that the multi-context solution significantly outperforms the spatial-context-only solution. A classification accuracy of 61.7% is achieved for the spatial-context-only solution, while 88.8% is achieved for the multi-context solution.
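
    A hedged sketch of this kind of multi-context classification: a contextual tuple (geofence, time band, dwell band, mobility state) is mapped to an activity with a categorical Naive Bayes model. The feature names, categories, and the tiny training set are illustrative assumptions, not the paper's 710,436-tuple dataset.

        # Naive Bayes over invented contextual tuples (illustration only).
        from sklearn.naive_bayes import CategoricalNB
        from sklearn.preprocessing import OrdinalEncoder

        # (geofence, hour-band, dwell-band, mobility) -> activity
        tuples = [("library",   "morning", "long",  "still"),
                  ("library",   "evening", "long",  "still"),
                  ("cafeteria", "noon",    "short", "walking"),
                  ("cafeteria", "noon",    "long",  "still"),
                  ("gym",       "evening", "long",  "active"),
                  ("gym",       "morning", "short", "active")]
        activities = ["studying", "studying", "passing-by",
                      "eating", "workout", "workout"]

        enc = OrdinalEncoder()                     # map categories to integer codes
        X = enc.fit_transform(tuples)
        clf = CategoricalNB().fit(X, activities)

        query = enc.transform([("cafeteria", "noon", "long", "walking")])
        print(clf.predict(query), clf.predict_proba(query))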

  14. The nuclear energy debate

    International Nuclear Information System (INIS)

    Hardy, D.

    1984-01-01

    We have not been able to obtain closure in the nuclear energy debate because the public perception of nuclear energy is out of sync with reality. The industry has not been able to deal with the concerns of those opposed to nuclear energy because its reaction has been to generate and disseminate more facts rather than dealing with the serious moral and ethical questions that are being asked. Nuclear proponents and opponents appeal to different moral communities, and those outside each community cannot concede that the other might be right. The Interfaith Program for Public Awareness of Nuclear Issues (IPPANI) has been formed, sponsored by members of the Jewish, Baha'i, Roman Catholic, United, and Anglican faiths, to provide for a balanced discussion of the ethical aspects of energy. (L.L.)

  15. Psychosocial and Cultural Modeling in Human Computation Systems: A Gamification Approach

    Energy Technology Data Exchange (ETDEWEB)

    Sanfilippo, Antonio P.; Riensche, Roderick M.; Haack, Jereme N.; Butner, R. Scott

    2013-11-20

    “Gamification”, the application of gameplay to real-world problems, enables the development of human computation systems that support decision-making through the integration of social and machine intelligence. One of gamification’s major benefits includes the creation of a problem solving environment where the influence of cognitive and cultural biases on human judgment can be curtailed through collaborative and competitive reasoning. By reducing biases on human judgment, gamification allows human computation systems to exploit human creativity relatively unhindered by human error. Operationally, gamification uses simulation to harvest human behavioral data that provide valuable insights for the solution of real-world problems.

  16. A debate about the merits of debate in nurse education.

    Science.gov (United States)

    Hartin, Peter; Birks, Melanie; Bodak, Marie; Woods, Cindy; Hitchins, Marnie

    2017-09-01

    In this 'Issues for Debate' paper, the issue is debate. Today's nurses must be able to advocate, lead, and grow 'big ideas', as well as knowing their way around a patient's body and mind. This paper reports, partly, on a research study into the use of debate to develop clinical reasoning and thinking skills in nursing students. The study was conducted with first and third-year nursing students enrolled at an Australian regional university. Students were asked to comment on the effectiveness of debate as an educational strategy. We combine the results of this research study with literature and discussion into the educational uses of debate to put the argument that using debate in nursing education can be an effective way to foster the type of creative, intelligent, thoughtful and forward-thinking nurses needed in the modern healthcare system. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Frames in the Ethiopian Debate on Biofuels

    Directory of Open Access Journals (Sweden)

    Brigitte Portner

    2013-01-01

    Full Text Available Biofuel production, while highly contested, is supported by a number of policies worldwide. Ethiopia was among the first sub-Saharan countries to devise a biofuel policy strategy to guide the associated demand toward sustainable development. In this paper, I discuss Ethiopia’s biofuel policy from an interpretative research position using a frames approach and argue that useful insights can be obtained by paying more attention to national contexts and values represented in the debates on whether biofuel production can or will contribute to sustainable development. To this end, I was able to distinguish three major frames used in the Ethiopian debate on biofuels: an environmental rehabilitation frame, a green revolution frame and a legitimacy frame. The article concludes that actors advocating for frames related to social and human issues have difficulties entering the debate and forming alliances, and that those voices need to be included in order for Ethiopia to develop a sustainable biofuel sector.

  18. Energies: the real debate; Energies: Le Vrai Debat

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    In parallel with the National Debate on energies, a 'real debate' was proposed by seven associations for the protection and improvement of the environment. This international debate offers a panorama of the stakes, a presentation of nuclear power as an energy source that is not necessarily dangerous, a discussion of the relation between climate and employment, and the conditions for the existence and development of a local energy policy. (A.L.B.)

  19. Human-Centred Computing for Assisting Nuclear Safeguards

    International Nuclear Information System (INIS)

    Szoke, I.

    2015-01-01

    With the rapid evolution of enabling hardware and software, technologies including 3D simulation, virtual reality (VR), augmented reality (AR), advanced user interfaces (UI), and geographical information systems (GIS) are increasingly employed in many aspects of modern life. In line with this, the nuclear industry is rapidly adopting emerging technologies to improve efficiency and safety by supporting planning and optimization of maintenance and decommissioning work, as well as for knowledge management, surveillance, training and briefing of field operatives, education, etc. For many years, the authors have been involved in research and development (R&D) into the application of 3D simulation, VR, and AR, for mobile, desktop, and immersive 3D systems, to provide a greater sense of presence and situation awareness for training, briefing, and in situ work by field operators. This work has resulted in a unique software base and experience (documented in numerous reports) from evaluating the effects of the design of training programmes and briefing sessions on human performance and training efficiency when applying various emerging technologies. In addition, the authors are involved in R&D into the use of 3D simulation, advanced UIs, mobile computing, and GIS systems to support realistic visualization of the combined radiological and geographical environment, as well as the acquisition, analysis, visualization and sharing of radiological and other data within nuclear installations and their surroundings. The toolkit developed by the authors, and the associated knowledge base, has been successfully applied to various aspects of the nuclear industry and has great potential within the safeguards domain. It can be used to train safeguards inspectors, brief inspectors before inspections, assist inspectors in situ (data registration, analysis, and communication), support the design and verification of safeguards systems, conserve data and experience, educate future safeguards

  20. [Bioethics and abortion. Debate].

    Science.gov (United States)

    Diniz, D; Gonzalez Velez, A C

    1998-06-01

    Although abortion has been the most debated of all issues analyzed in bioethics, no moral consensus has been achieved. The problem of abortion exemplifies the difficulty of establishing social dialogue in the face of distinct moral positions, and of creating an independent academic discussion based on writings that are passionately argumentative. The greatest difficulty posed by the abortion literature is to identify consistent philosophical and scientific arguments amid the rhetorical manipulation. A few illustrative texts were selected to characterize the contemporary debate. The terms used to describe abortion are full of moral meaning and must be analyzed for their underlying assumptions. Of the four main types of abortion, only 'eugenic abortion', as exemplified by the Nazis, does not consider the wishes of the woman or couple--a fundamental difference for most bioethicists. The terms 'selective abortion' and 'therapeutic abortion' are often confused, and selective abortion is often called eugenic abortion by opponents. The terms used to describe abortion practitioners, abortion opponents, and the 'product' are also of interest in determining the style of the article. The video entitled "The Silent Scream" was a classic example of violent and seductive rhetoric. Its type of discourse, freely mixing scientific arguments and moral beliefs, hinders analysis. Within writings about abortion three extreme positions may be identified: heteronomy (the belief that life is a gift that does not belong to one) versus reproductive autonomy; sanctity of life versus tangibility of life; and abortion as a crime versus abortion as morally neutral. Most individuals show an inconsistent array of beliefs, and few groups or individuals identify with the extreme positions. The principal argument of proponents of legalization is respect for the reproductive autonomy of the woman or couple based on the principle of individual liberty, while heteronomy is the main principle of

  1. L'ordinateur a visage humain (The Computer in Human Guise).

    Science.gov (United States)

    Otman, Gabriel

    1986-01-01

    Discusses the tendency of humans to describe parts and functions of a computer with terminology that refers to human characteristics; for example, parts of the body (electronic brain), intellectual activities (optical memory), and physical activities (command). Computers are also described through metaphors, connotations, allusions, and analogies…

  2. Computer science security research and human subjects: emerging considerations for research ethics boards.

    Science.gov (United States)

    Buchanan, Elizabeth; Aycock, John; Dexter, Scott; Dittrich, David; Hvizdak, Erin

    2011-06-01

    This paper explores the growing concerns with computer science research, and in particular, computer security research and its relationship with the committees that review human subjects research. It offers cases that review boards are likely to confront, and provides a context for appropriate consideration of such research, as issues of bots, clouds, and worms enter the discourse of human subjects review.

  3. Debate in EFL Classroom

    Directory of Open Access Journals (Sweden)

    Mirjana Želježič

    2017-06-01

    Full Text Available Relying primarily on the Common European Framework of Reference for Languages (CEFR) and The National EFL Syllabus, this paper focuses on the highest-ranking goals within formal foreign language (L2) education: the development of communicative competence (which the communicative paradigm regards as the most important goal of contemporary language teaching) and of critical thinking (CT) ability, which is widely recognised as the main general education goal. It also points to some of the discrepancies generated by tensions between the fact that language is a social and cultural phenomenon that exists and evolves only through interaction with others, and individual-student-centred pedagogical practices of teaching (and assessment) – which jeopardise the validity of these practices. Next, it links the official educational goals to the cultivation of oral interaction (rather than oral production) in argumentative discursive practices in general and in structured debate formats in particular, which are proposed as an effective pedagogical method for developing CT skills and oral interactional competence in argumentative discursive events, especially at B2+ levels.

  4. Grounding Political Debate

    Directory of Open Access Journals (Sweden)

    Benjamin Marks

    2009-03-01

    Full Text Available This essay is intentionally one-sided. Almost all other essays by either defenders of capitalism (libertarians or defenders of government (statists are oppositely one-sided. They claim that capitalism’s voluntariness or government’s coerciveness mean that capitalism or government better fosters such things as art, happiness, education, jobs and world peace, and never much emphasise factors that may undermine their commentary. This essay emphasises the mitigating factors that others gloss over.Arguments about the advantages or disadvantages of capitalism or government dominate political debate. This essay contends that these arguments, when they are not just about their author’s feelings, are usually incorrect or misleading. They often use value-judgments on behalf of others, disguised by false measures of happiness invented from economic data or surveys, and then applied across demographics and time. Another common error is to talk only of the positive side of something and ignore the negative. Libertarians spot these errors in statists, yet often do not hold themselves to the same standard.

  5. Preguntas, interpretaciones y debates

    Directory of Open Access Journals (Sweden)

    Gastón Souroujon

    2014-01-01

    Full Text Available The political-economic direction taken by the government of Carlos S. Menem after he took office in July 1989 gave the community of political scientists an incentive to reflect on several central problems of the discipline, allowing Argentine political science in the 1990s to be enriched with new topics of discussion. In this article we address two of the crucial questions that structured the academic debates of those years: (1) the reasons for the active and passive consensus, sustained by a large part of the population for more than five years, in favour of a government that carried out measures that had traditionally been resisted and that would generate economic costs for broad sectors of society; and (2) the negative as well as positive consequences that the Menem government generated for democratic consolidation. In each case we analyse the differing answers and the epistemological assumptions that underpinned these readings.

  6. Implementations of the CC'01 Human-Computer Interaction Guidelines Using Bloom's Taxonomy

    Science.gov (United States)

    Manaris, Bill; Wainer, Michael; Kirkpatrick, Arthur E.; Stalvey, RoxAnn H.; Shannon, Christine; Leventhal, Laura; Barnes, Julie; Wright, John; Schafer, J. Ben; Sanders, Dean

    2007-01-01

    In today's technology-laden society human-computer interaction (HCI) is an important knowledge area for computer scientists and software engineers. This paper surveys existing approaches to incorporate HCI into computer science (CS) and such related issues as the perceived gap between the interests of the HCI community and the needs of CS…

  7. Domain Decomposition for Computing Extremely Low Frequency Induced Current in the Human Body

    OpenAIRE

    Perrussel , Ronan; Voyer , Damien; Nicolas , Laurent; Scorretti , Riccardo; Burais , Noël

    2011-01-01

    Computation of electromagnetic fields in high resolution computational phantoms requires solving large linear systems. We present an application of Schwarz preconditioners with Krylov subspace methods for computing extremely low frequency induced fields in a phantom issued from the Visible Human.
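
    One way to read the recipe, sketched under our own assumptions rather than the paper's formulation: a non-overlapping additive Schwarz (block-Jacobi) preconditioner applied within the conjugate-gradient Krylov method, here on a 1-D Laplacian standing in for the phantom system.

        # Block-Jacobi (non-overlapping additive Schwarz) preconditioned CG sketch.
        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        n, nblocks = 400, 8
        A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csc")
        b = np.ones(n)

        # Factorize each diagonal block once; applying M^-1 solves the subdomains.
        bounds = np.linspace(0, n, nblocks + 1, dtype=int)
        solves = [spla.splu(A[s:e, s:e].tocsc())
                  for s, e in zip(bounds[:-1], bounds[1:])]

        def apply_prec(r):
            z = np.empty_like(r)
            for (s, e), lu in zip(zip(bounds[:-1], bounds[1:]), solves):
                z[s:e] = lu.solve(r[s:e])
            return z

        M = spla.LinearOperator((n, n), matvec=apply_prec, dtype=float)
        x, info = spla.cg(A, b, M=M)
        print("converged" if info == 0 else "failed",
              np.linalg.norm(A @ x - b))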

  8. Human-Computer Interfaces for Wearable Computers: A Systematic Approach to Development and Evaluation

    OpenAIRE

    Witt, Hendrik

    2007-01-01

    The research presented in this thesis examines user interfaces for wearable computers. Wearable computers are a special kind of mobile computer that can be worn on the body. Furthermore, they integrate themselves even more seamlessly into different activities than a mobile phone or a personal digital assistant can. The thesis investigates the development and evaluation of user interfaces for wearable computers. In particular, it presents fundamental research results as well as supporting softw...

  9. Moving beyond the GM debate.

    Directory of Open Access Journals (Sweden)

    Ottoline Leyser

    2014-06-01

    Full Text Available Once again, there are calls to reopen the debate on genetically modified (GM) crops. I find these calls frustrating and unnecessarily divisive. In my opinion the GM debate, on both sides, continues to hamper the urgent need to address the diverse and pressing challenges of global food security and environmental sustainability. The destructive power of the debate comes from its conflation of unrelated issues, coupled with deeply rooted misconceptions of the nature of agriculture.

  10. Appearance-based human gesture recognition using multimodal features for human computer interaction

    Science.gov (United States)

    Luo, Dan; Gao, Hua; Ekenel, Hazim Kemal; Ohya, Jun

    2011-03-01

    The use of gesture as a natural interface plays a vitally important role in achieving intelligent Human Computer Interaction (HCI). Human gestures include different components of visual actions, such as motion of the hands, facial expression, and torso, to convey meaning. So far, in the field of gesture recognition, most previous works have focused on the manual component of gestures. In this paper, we present an appearance-based multimodal gesture recognition framework which combines different groups of features, such as facial expression features and hand motion features, extracted from image frames captured by a single web camera. We consider 12 classes of human gestures with facial expressions conveying neutral, negative and positive meanings, drawn from American Sign Language (ASL). We combine the features at two levels by employing two fusion strategies. At the feature level, an early feature combination is performed by concatenating and weighting different feature groups, and LDA is used to choose the most discriminative elements by projecting the features onto a discriminative expression space. The second strategy is applied at the decision level: weighted decisions from single modalities are fused at a later stage. A condensation-based algorithm is adopted for classification. We collected a data set with three to seven recording sessions and conducted experiments with the combination techniques. Experimental results showed that facial analysis improves hand gesture recognition, and that decision-level fusion performs better than feature-level fusion.
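
    The feature-level fusion described above can be sketched as weighted concatenation followed by LDA projection; the random "face" and "hand" features, the fusion weights, and the use of LDA's built-in classifier are all illustrative assumptions, not the paper's actual features or condensation classifier.

        # Weighted early fusion of two feature groups + LDA projection (sketch).
        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(2)
        n, classes = 120, 12                    # 12 gesture classes as in the record
        y = rng.integers(0, classes, n)
        face = rng.standard_normal((n, 40)) + y[:, None] * 0.3  # facial features
        hand = rng.standard_normal((n, 60)) + y[:, None] * 0.2  # hand-motion features

        w_face, w_hand = 0.6, 0.4               # early-fusion weights (assumed)
        X = np.hstack([w_face * face, w_hand * hand])

        lda = LinearDiscriminantAnalysis(n_components=classes - 1)
        X_disc = lda.fit_transform(X, y)        # discriminative "expression space"
        print(X_disc.shape, lda.score(X, y))    # resubstitution accuracy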

  11. Integrating Human and Computer Intelligence. Technical Report No. 32.

    Science.gov (United States)

    Pea, Roy D.

    This paper explores the thesis that advances in computer applications and artificial intelligence have important implications for the study of development and learning in psychology. Current approaches to the use of computers as devices for problem solving, reasoning, and thinking--i.e., expert systems and intelligent tutoring systems--are…

  12. Developing Educational Computer Animation Based on Human Personality Types

    Science.gov (United States)

    Musa, Sajid; Ziatdinov, Rushan; Sozcu, Omer Faruk; Griffiths, Carol

    2015-01-01

    Computer animation in the past decade has become one of the most noticeable features of technology-based learning environments. By definition, it refers to simulated motion pictures showing the movement of drawn objects, and is often described as the art of movement. Its educational application, known as educational computer animation, is considered…

  13. Computerized Cognitive Rehabilitation: Comparing Different Human-Computer Interactions.

    Science.gov (United States)

    Quaglini, Silvana; Alloni, Anna; Cattani, Barbara; Panzarasa, Silvia; Pistarini, Caterina

    2017-01-01

    In this work we describe an experiment involving aphasic patients, where the same speech rehabilitation exercise was administered in three different modalities, two of which are computer-based. In particular, one modality exploits the "Makey Makey", an electronic board which allows interacting with the computer using physical objects.

  14. Debates in Religious Education. The Debates in Subject Teaching Series

    Science.gov (United States)

    Barnes, L. Philip, Ed.

    2011-01-01

    What are the key debates in Religious Education teaching today? "Debates in Religious Education" explores the major issues all RE teachers encounter in their daily professional lives. It encourages critical reflection and aims to stimulate both novice and experienced teachers to think more deeply about their practice, and link research…

  15. A Model-based Framework for Risk Assessment in Human-Computer Controlled Systems

    Science.gov (United States)

    Hatanaka, Iwao

    2000-01-01

    The rapid growth of computer technology and innovation has played a significant role in the rise of computer automation of human tasks in modern production systems across all industries. Although the rationale for automation has been to eliminate "human error" or to relieve humans from manual repetitive tasks, various computer-related hazards and accidents have emerged as a direct result of increased system complexity attributed to computer automation. The risk assessment techniques utilized for electromechanical systems are not suitable for today's software-intensive systems or complex human-computer controlled systems. This thesis will propose a new systemic model-based framework for analyzing risk in safety-critical systems where both computers and humans are controlling safety-critical functions. A new systems accident model will be developed based upon modern systems theory and human cognitive processes to better characterize system accidents, the role of human operators, and the influence of software in its direct control of significant system functions. Better risk assessments will then be achievable through the application of this new framework to complex human-computer controlled systems.

  16. La Société des Nations suppose la Société des Esprits: The Debate on Modern Humanism

    NARCIS (Netherlands)

    van Heerikhuizen, A.

    2015-01-01

    This article focuses on the themes of the two conferences organized by the League of Nations—"Modern Man" and "The Foundations of Modern Humanism"—which were held in Nice and Budapest in 1935 and 1936, respectively. It was a time of deepening crisis, when the pervasive belief was that European

  17. Constructing a Computer Model of the Human Eye Based on Tissue Slice Images

    OpenAIRE

    Dai, Peishan; Wang, Boliang; Bao, Chunbo; Ju, Ying

    2010-01-01

    Computer simulation of the biomechanical and biological heat transfer in ophthalmology greatly relies on having a reliable computer model of the human eye. This paper proposes a novel method for the construction of a geometric model of the human eye based on tissue slice images. Slice images were obtained from an in vitro Chinese human eye through embryo specimen processing methods. A level set algorithm was used to extract contour points of eye tissues while a principal component analysi…

  18. Proceedings of the topical meeting on advances in human factors research on man/computer interactions

    International Nuclear Information System (INIS)

    Anon.

    1990-01-01

    This book discusses the following topics: expert systems and knowledge engineering-I; verification and validation of software; methods for modeling human/computer performance; man/computer interaction problems in producing procedures-1-2; progress and problems with automation-1-2; experience with electronic presentation of procedures-2; intelligent displays and monitors; modeling the user/computer interface; and computer-based human decision-making aids

  19. Biomedical ontologies: toward scientific debate.

    Science.gov (United States)

    Maojo, V; Crespo, J; García-Remesal, M; de la Iglesia, D; Perez-Rey, D; Kulikowski, C

    2011-01-01

    Biomedical ontologies have been very successful in structuring knowledge for many different applications, receiving widespread praise for their utility and potential. Yet, the role of computational ontologies in scientific research, as opposed to knowledge management applications, has not been extensively discussed. We aim to stimulate further discussion on the advantages and challenges presented by biomedical ontologies from a scientific perspective. We review various aspects of biomedical ontologies going beyond their practical successes, and focus on some key scientific questions in two ways. First, we analyze and discuss current approaches to improve biomedical ontologies that are based largely on classical, Aristotelian ontological models of reality. Second, we raise various open questions about biomedical ontologies that require further research, analyzing in more detail those related to visual reasoning and spatial ontologies. We outline significant scientific issues that biomedical ontologies should consider, beyond current efforts of building practical consensus between them. For spatial ontologies, we suggest an approach for building "morphospatial" taxonomies, as an example that could stimulate research on fundamental open issues for biomedical ontologies. Analysis of a large number of problems with biomedical ontologies suggests that the field is very much open to alternative interpretations of current work, and in need of scientific debate and discussion that can lead to new ideas and research directions.

  20. The Great Mini-Debate

    Science.gov (United States)

    Benucci, Heather

    2017-01-01

    Debates remain popular in English language courses, and this activity gives students a low-stress opportunity to develop their speaking and debating skills. This lesson plan is appropriate for upper-intermediate or advanced students. Goals of the activity are to present an oral argument using evidence and to use functional language related to agreeing,…

  1. Green grabbing debate and Madagascar

    DEFF Research Database (Denmark)

    Casse, Thorkil; Razafy, Fara Lala; Wurtzebach, Zachary

    2017-01-01

    and capitalise natural assets. First, to provide some context on the green grabbing debate, we discuss the trade-offs between conservation and development objectives. In addition, we refer briefly to the broader land grabbing debate of which green grabbing is a sub-component. Second, we question the theoretical...

  2. Student Pressure Subject of Debate

    Science.gov (United States)

    Gewertz, Catherine

    2006-01-01

    This article discusses student pressure as a subject of debate. The latest debate about schoolwork is being fueled by three recent books: "The Homework Myth" by Alfie Kohn, "The Case Against Homework" by Sara Bennett and Nancy Kalish, and "The Overachievers", by Alexandra Robbins, which depicts overextended high…

  3. Identification of Enhancers In Human: Advances In Computational Studies

    KAUST Repository

    Kleftogiannis, Dimitrios A.

    2016-01-01

    Finally, we take a step further by developing a novel feature selection method suitable for defining a computational framework capable of analyzing the genomic content of enhancers and reporting cell-line specific predictive signatures.

  4. Human face recognition using eigenface in cloud computing environment

    Science.gov (United States)

    Siregar, S. T. M.; Syahputra, M. F.; Rahmat, R. F.

    2018-02-01

    Recognizing a single face does not take long to process, but implementing an attendance or security system for a company with many faces to recognize can take a long time. Cloud computing is a computing service performed not on a local device but on internet-connected data center infrastructure. Cloud computing also provides a scalability solution: resources can be increased when larger data processing is required. This research applies the eigenface method, with training data collected through a REST interface that provides the resources, after which the server processes the data through the stages of the method. After developing the application, we conclude that face recognition can be implemented with eigenfaces, using REST endpoints to exchange the information that serves as a resource for building the model used for face recognition.
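
    A minimal eigenface sketch, independent of any cloud deployment: flattened face images are projected onto principal components ("eigenfaces") and a query face is recognized by nearest neighbour in the reduced space. The random gallery, image size, and component count are placeholders, not the paper's data.

        # Eigenface recognition via PCA and nearest neighbour (illustration).
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(3)
        gallery = rng.random((50, 32 * 32))      # 50 flattened 32x32 face images
        labels = np.repeat(np.arange(10), 5)     # 10 people, 5 images each

        pca = PCA(n_components=20).fit(gallery)  # eigenfaces = pca.components_
        proj = pca.transform(gallery)

        def recognize(face):
            """Return the label of the gallery image closest in eigenface space."""
            q = pca.transform(face.reshape(1, -1))
            return labels[np.linalg.norm(proj - q, axis=1).argmin()]

        noisy_probe = gallery[7] + 0.01 * rng.standard_normal(32 * 32)
        print(recognize(noisy_probe))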

  5. Computational analysis of human miRNAs phylogenetics

    African Journals Online (AJOL)

    User

    2011-05-02

    Snippet only; the full abstract was garbled in extraction. The recoverable fragment references alignment hits of human miRNA sequences against human DNA clones (e.g., AL138714 from clone RP11-121J7 on chromosome 13q32.1-32.3, containing the 3' end of a novel gene and the 5' end of the GPC5 gene for glypican 5) and a comparison of miRNAs across primates including human, chimpanzee, orangutan, and macaque, finding that miRNAs were …

  6. Activity-based computing: computational management of activities reflecting human intention

    DEFF Research Database (Denmark)

    Bardram, Jakob E; Jeuris, Steven; Houben, Steven

    2015-01-01

    paradigm that has been applied in personal information management applications as well as in ubiquitous, multidevice, and interactive surface computing. ABC has emerged as a response to the traditional application- and file-centered computing paradigm, which is oblivious to a notion of a user’s activity...

  7. Advancements in Violin-Related Human-Computer Interaction

    DEFF Research Database (Denmark)

    Overholt, Daniel

    2014-01-01

    of human intelligence and emotion is at the core of the Musical Interface Technology Design Space, MITDS. This is a framework that endeavors to retain and enhance such traits of traditional instruments in the design of interactive live performance interfaces. Utilizing the MITDS, advanced Human...

  8. Applying systemic-structural activity theory to design of human-computer interaction systems

    CERN Document Server

    Bedny, Gregory Z; Bedny, Inna

    2015-01-01

    Human-Computer Interaction (HCI) is an interdisciplinary field that has gained recognition as an important field in ergonomics. HCI draws on ideas and theoretical concepts from computer science, psychology, industrial design, and other fields. Human-Computer Interaction is no longer limited to trained software users. Today people interact with various devices such as mobile phones, tablets, and laptops. How can you make such interaction user friendly, even when user proficiency levels vary? This book explores methods for assessing the psychological complexity of computer-based tasks. It also p

  9. Winning the sustainable development debate

    International Nuclear Information System (INIS)

    Ritch, John; Cornish, Emma

    2002-01-01

    Full text: This year - in Johannesburg from 26 August to 4 September - the world will witness what is expected to be the largest environmental gathering yet: the World Summit on Sustainable Development. Some 60,000 participants, including Heads of State, government officials, intergovernmental organizations, and environmental, business and scientific lobbies, will debate the world's progress in implementing 'Agenda 21' - the sustainable development principles agreed in Rio de Janeiro in 1992. Some kind of deal, perhaps in the form of a declaration, will emerge from Johannesburg, reasserting international commitment to sustainable development. At this stage the content cannot be predicted. Experience warns us to expect a strong and virulent anti-nuclear lobby, not only as part of the 'environmental community', but within some of the governments themselves. Their role will be to achieve a text declaring nuclear an unsustainable energy source. The nuclear industry has six months to make its case, in the preparatory fora and elsewhere, that nuclear energy must be recognized - and at a minimum, not excluded - as a sustainable development technology. Twin goals of sustainable development: meeting human need and achieving environmental security. The principle of sustainable development aims at the long-term environmental protection of the planet - sparing our children and their children from living on a planet irredeemably spoilt through human action. An equally pressing issue is that of bridging the wealth gap between the North and South. In this vein, UN Secretary General Kofi Annan recently published his priorities for attention at the World Summit. These include: - Poverty eradication and achieving sustainable livelihoods; - Promoting health through sustainable development; - Access to energy and energy efficiency; - Managing the world's freshwater resources; - Sustainable development initiatives for Africa. The central element of sustainable development: clean energy

  10. HuRECA: Human Reliability Evaluator for Computer-based Control Room Actions

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Lee, Seung Jun; Jang, Seung Cheol

    2011-01-01

    As computer-based design features such as computer-based procedures (CBP), soft controls (SCs), and integrated information systems are being adopted in main control rooms (MCR) of nuclear power plants, a human reliability analysis (HRA) method capable of dealing with the effects of these design features on human reliability is needed. From observations of human factors engineering verification and validation experiments, we have identified some important characteristics of operator behaviors and design-related influencing factors (DIFs) from the perspective of human reliability. Firstly, there are new DIFs that should be considered in developing an HRA method for computer-based control rooms, especially the CBP and SCs. In the case of a computer-based procedure, rather than a paper-based procedure, the structural and managerial elements should be considered as important PSFs in addition to the procedural contents. In the case of the soft controllers, the so-called interface management tasks (or secondary tasks) should be reflected in the assessment of human error probability. Secondly, computer-based control rooms can provide more effective error recovery features than conventional control rooms. Major error recovery features for computer-based control rooms include the automatic logic checking function of the computer-based procedure and the information sharing feature of general computer-based designs

  11. Computational Modeling of Human Multiple-Task Performance

    National Research Council Canada - National Science Library

    Kieras, David E; Meyer, David

    2005-01-01

    This is the final report for a project that was a continuation of an earlier, long-term project on the development and validation of the EPIC cognitive architecture for modeling human cognition and performance...

  12. Human-Computer Interaction Software: Lessons Learned, Challenges Ahead

    Science.gov (United States)

    1989-01-01


  13. Supporting Human Activities - Exploring Activity-Centered Computing

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Bardram, Jakob

    2002-01-01

    In this paper we explore an activity-centered computing paradigm that is aimed at supporting work processes that are radically different from the ones known from office work. Our main inspiration is healthcare work that is characterized by an extreme degree of mobility, many interruptions, ad-hoc...

  14. Can human experts predict solubility better than computers?

    Science.gov (United States)

    Boobier, Samuel; Osbourn, Anne; Mitchell, John B O

    2017-12-13

    In this study, we design and carry out a survey, asking human experts to predict the aqueous solubility of druglike organic compounds. We investigate whether these experts, drawn largely from the pharmaceutical industry and academia, can match or exceed the predictive power of algorithms. Alongside this, we implement 10 typical machine learning algorithms on the same dataset. The best algorithm, a variety of neural network known as a multi-layer perceptron, gave an RMSE of 0.985 log S units and an R 2 of 0.706. We would not have predicted the relative success of this particular algorithm in advance. We found that the best individual human predictor generated an almost identical prediction quality with an RMSE of 0.942 log S units and an R 2 of 0.723. The collection of algorithms contained a higher proportion of reasonably good predictors, nine out of ten compared with around half of the humans. We found that, for either humans or algorithms, combining individual predictions into a consensus predictor by taking their median generated excellent predictivity. While our consensus human predictor achieved very slightly better headline figures on various statistical measures, the difference between it and the consensus machine learning predictor was both small and statistically insignificant. We conclude that human experts can predict the aqueous solubility of druglike molecules essentially equally well as machine learning algorithms. We find that, for either humans or algorithms, combining individual predictions into a consensus predictor by taking their median is a powerful way of benefitting from the wisdom of crowds.
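
    The median-consensus idea is easy to demonstrate on synthetic data: combining biased, noisy individual predictions by their per-compound median typically beats most individuals. The prediction matrix below is simulated and does not reproduce the RMSE figures quoted above.

        # Median consensus of simulated log S predictors (illustration only).
        import numpy as np

        rng = np.random.default_rng(4)
        true_logS = rng.uniform(-8, 0, 25)          # 25 hypothetical compounds
        # 10 predictors, each with its own bias and random error
        preds = (true_logS
                 + rng.normal(0, 1.0, (10, 25))     # per-prediction noise
                 + rng.normal(0, 0.3, (10, 1)))     # per-predictor bias

        def rmse(p):
            return np.sqrt(np.mean((p - true_logS) ** 2))

        individual = [rmse(p) for p in preds]
        consensus = rmse(np.median(preds, axis=0))  # wisdom of the crowd
        print(f"best individual RMSE: {min(individual):.2f}, "
              f"consensus RMSE: {consensus:.2f}")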

  15. Experimental evaluation of multimodal human computer interface for tactical audio applications

    NARCIS (Netherlands)

    Obrenovic, Z.; Starcevic, D.; Jovanov, E.; Oy, S.

    2002-01-01

    Mission critical and information overwhelming applications require careful design of the human computer interface. Typical applications include night vision or low visibility mission navigation, guidance through a hostile territory, and flight navigation and orientation. Additional channels of

  16. Exploring the compassion deficit debate.

    Science.gov (United States)

    Stenhouse, Rosie; Ion, Robin; Roxburgh, Michelle; Devitt, Patric Ffrench; Smith, Stephen D M

    2016-04-01

    Several recent high profile failures in the UK health care system have promoted strong debate on compassion and care in nursing. A number of papers articulating a range of positions within this debate have been published in this journal over the past two and a half years. These articulate a diverse range of theoretical perspectives and have been drawn together here in an attempt to bring some coherence to the debate and provide an overview of the key arguments and positions taken by those involved. In doing this we invite the reader to consider their own position in relation to the issues raised and to consider the impact of this for their own practice. Finally the paper offers some sense of how individual practitioners might use their understanding of the debates to ensure delivery of good nursing care. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Design Science in Human-Computer Interaction: A Model and Three Examples

    Science.gov (United States)

    Prestopnik, Nathan R.

    2013-01-01

    Humanity has entered an era where computing technology is virtually ubiquitous. From websites and mobile devices to computers embedded in appliances on our kitchen counters and automobiles parked in our driveways, information and communication technologies (ICTs) and IT artifacts are fundamentally changing the ways we interact with our world.…

  18. Eyewear Computing – Augmenting the Human with Head-mounted Wearable Assistants (Dagstuhl Seminar 16042)

    OpenAIRE

    Bulling, Andreas; Cakmakci, Ozan; Kunze, Kai; Rehg, James M.

    2016-01-01

    The seminar was composed of workshops and tutorials on head-mounted eye tracking, egocentric vision, optics, and head-mounted displays. The seminar welcomed 30 academic and industry researchers from Europe, the US, and Asia with a diverse background, including wearable and ubiquitous computing, computer vision, developmental psychology, optics, and human-computer interaction. In contrast to several previous Dagstuhl seminars, we used an ignite talk format to reduce the time of talks to...

  19. The Human Genome Project: Biology, Computers, and Privacy.

    Science.gov (United States)

    Cutter, Mary Ann G.; Drexler, Edward; Gottesman, Kay S.; Goulding, Philip G.; McCullough, Laurence B.; McInerney, Joseph D.; Micikas, Lynda B.; Mural, Richard J.; Murray, Jeffrey C.; Zola, John

    This module, for high school teachers, is the second of two modules about the Human Genome Project (HGP) produced by the Biological Sciences Curriculum Study (BSCS). The first section of this module provides background information for teachers about the structure and objectives of the HGP, aspects of the science and technology that underlie the…

  20. Using Noninvasive Brain Measurement to Explore the Psychological Effects of Computer Malfunctions on Users during Human-Computer Interactions

    Directory of Open Access Journals (Sweden)

    Leanne M. Hirshfield

    2014-01-01

    Full Text Available In today's technologically driven world, there is a need to better understand the ways that common computer malfunctions affect computer users. These malfunctions may have measurable influences on computer users' cognitive, emotional, and behavioral responses. An experiment was conducted where participants carried out a series of web search tasks while wearing functional near-infrared spectroscopy (fNIRS) and galvanic skin response sensors. Two computer malfunctions were introduced during the sessions which had the potential to influence correlates of user trust and suspicion. Surveys were given after each session to measure users' perceived emotional state, cognitive load, and perceived trust. Results suggest that fNIRS can be used to measure the different cognitive and emotional responses associated with computer malfunctions. These cognitive and emotional changes were correlated with users' self-reported levels of suspicion and trust, and they in turn suggest future work that further explores the capability of fNIRS for the measurement of user experience during human-computer interactions.

  1. Moving research beyond the spanking debate.

    Science.gov (United States)

    MacMillan, Harriet L; Mikton, Christopher R

    2017-09-01

    Despite numerous studies identifying a broad range of harms associated with the use of spanking and other types of physical punishment, debate continues about its use as a form of discipline. In this commentary, we recommend four strategies to move the field forward and beyond the spanking debate including: 1) use of methodological approaches that allow for stronger causal inference; 2) consideration of human rights issues; 3) a focus on understanding the causes of spanking and reasons for its decline in certain countries; and 4) more emphasis on evidence-based approaches to changing social norms to reject spanking as a form of discipline. Physical punishment needs to be recognized as an important public health problem. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. Recent Advances in Computational Mechanics of the Human Knee Joint

    Science.gov (United States)

    Kazemi, M.; Dabiri, Y.; Li, L. P.

    2013-01-01

    Computational mechanics has been advanced in every area of orthopedic biomechanics. The objective of this paper is to provide a general review of the computational models used in the analysis of the mechanical function of the knee joint in different loading and pathological conditions. Major review articles published in related areas are summarized first. The constitutive models for soft tissues of the knee are briefly discussed to facilitate understanding the joint modeling. A detailed review of the tibiofemoral joint models is presented thereafter. The geometry reconstruction procedures as well as some critical issues in finite element modeling are also discussed. Computational modeling can be a reliable and effective method for the study of mechanical behavior of the knee joint, if the model is constructed correctly. Single-phase material models have been used to predict the instantaneous load response for the healthy knees and repaired joints, such as total and partial meniscectomies, ACL and PCL reconstructions, and joint replacements. Recently, poromechanical models accounting for fluid pressurization in soft tissues have been proposed to study the viscoelastic response of the healthy and impaired knee joints. While the constitutive modeling has been considerably advanced at the tissue level, many challenges still exist in applying a good material model to three-dimensional joint simulations. A complete model validation at the joint level seems impossible presently, because only simple data can be obtained experimentally. Therefore, model validation may be concentrated on the constitutive laws using multiple mechanical tests of the tissues. Extensive model verifications at the joint level are still crucial for the accuracy of the modeling. PMID:23509602

  3. Computational simulation of chromosome breaks in human liver

    International Nuclear Information System (INIS)

    Yang Jianshe; Li Wenjian; Jin Xiaodong

    2006-01-01

    An easy method was established for computing chromosome breaks in cells exposed to heavy charged particles. The chromosome break value for cells irradiated with 12C6+ ions was calculated theoretically and tested against experimental data on chromosome breaks obtained using a premature chromosome condensation technique. The theoretical chromosome break value agreed well with the experimental data. The higher relative biological effectiveness of the heavy ions was closely correlated with their physical characteristics. In addition, the chromosome break value can be predicted offline. (authors)

  4. APPLYING ARTIFICIAL INTELLIGENCE TECHNIQUES TO HUMAN-COMPUTER INTERFACES

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.

    1988-01-01

    A description is given of UIMS (User Interface Management System), a system using a variety of artificial intelligence techniques to build knowledge-based user interfaces combining functionality and information from a variety of computer systems that maintain, test, and configure customer telephone...... and data networks. Three artificial intelligence (AI) techniques used in UIMS are discussed, namely, frame representation, object-oriented programming languages, and rule-based systems. The UIMS architecture is presented, and the structure of the UIMS is explained in terms of the AI techniques....

  5. Distinguishing humans from computers in the game of go: A complex network approach

    Science.gov (United States)

    Coquidé, C.; Georgeot, B.; Giraud, O.

    2017-08-01

    We compare complex networks built from the game of go and obtained from databases of human-played games with those obtained from computer-played games. Our investigations show that statistical features of the human-based networks and the computer-based networks differ, and that these differences can be statistically significant on a relatively small number of games using specific estimators. We show that the deterministic or stochastic nature of the computer algorithm playing the game can also be distinguished from these quantities. This can be seen as a tool to implement a Turing-like test for go simulators.
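    The record above describes a concrete, reproducible pipeline: build a network from move sequences, then compare distributional statistics between the human and computer corpora. Below is a minimal sketch under stated assumptions: games are abstracted as sequences of move tokens, and a two-sample Kolmogorov-Smirnov test on out-degree distributions stands in for the paper's specific estimators.

```python
from itertools import pairwise          # Python 3.10+

import networkx as nx
from scipy.stats import ks_2samp

def build_move_network(games):
    """Directed network linking each move token to the one played next."""
    g = nx.DiGraph()
    for game in games:                  # each game: a sequence of move tokens
        for a, b in pairwise(game):
            if g.has_edge(a, b):
                g[a][b]["weight"] += 1
            else:
                g.add_edge(a, b, weight=1)
    return g

def out_degrees(g):
    return sorted(d for _, d in g.out_degree())

def compare_corpora(human_games, computer_games):
    """Two-sample KS test on out-degree distributions of the two networks."""
    return ks_2samp(out_degrees(build_move_network(human_games)),
                    out_degrees(build_move_network(computer_games)))
```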

  6. MoCog1: A computer simulation of recognition-primed human decision making

    Science.gov (United States)

    Gevarter, William B.

    1991-01-01

    The results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior are described. Most human decision making is an experience-based, relatively straightforward, largely automatic response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. The development of the architecture and computer program (MoCog1) associated with such 'recognition-primed' decision making is discussed. The resultant computer program was successfully utilized as a vehicle to simulate earlier findings on how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to the environment.

  7. Computer modelling of HT gas metabolism in humans

    International Nuclear Information System (INIS)

    Peterman, B.F.

    1982-01-01

    A mathematical model was developed to simulate the metabolism of HT gas in humans. The rate constants of the model were estimated by fitting the calculated curves to the experimental data of Pinson and Langham from 1957. The calculations suggest that the oxidation of HT gas (which probably occurs as a result of the enzymatic action of hydrogenase present in bacteria of the human gut) proceeds at a relatively low rate, with a half-time of 10-12 hours. Including the dose due to the production of the HT oxidation product (HTO) in the soft tissues lowers the derived air concentration by about 50%. Furthermore, the relationship between the concentration of HTO in urine and the dose to the lung from HT in the air in the lungs is linear after short HT exposures, and hence HTO concentrations in urine can be used to estimate upper limits on the lung dose from HT exposures. (author)
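    To make the kinetics concrete, here is a minimal two-compartment sketch of the HT-to-HTO conversion described above. The oxidation half-time is taken from the quoted 10-12 h range; the HTO clearance half-life (about 10 days) and the unit initial burden are illustrative assumptions, not the paper's fitted rate constants.

```python
import numpy as np
from scipy.integrate import odeint

T_OX = 11.0 * 3600      # HT oxidation half-time, s (mid of 10-12 h range)
T_CL = 10.0 * 86400     # HTO biological half-life, s (assumed ~10 days)
K_OX = np.log(2) / T_OX
K_CL = np.log(2) / T_CL

def rates(y, t):
    ht, hto = y
    return [-K_OX * ht,                # HT oxidized away
            K_OX * ht - K_CL * hto]    # HTO produced, then cleared

t = np.linspace(0, 30 * 86400, 1000)           # 30 days
ht, hto = odeint(rates, [1.0, 0.0], t).T       # unit initial HT burden
print(f"peak HTO fraction of initial burden: {hto.max():.3f}")
```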

  8. Measuring Human Performance within Computer Security Incident Response Teams

    Energy Technology Data Exchange (ETDEWEB)

    McClain, Jonathan T. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Silva, Austin Ray [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Avina, Glory Emmanuel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Forsythe, James C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    Human performance has become a pertinent issue within cyber security. However, this research has been stymied by the limited availability of expert cyber security professionals. This is partly attributable to the ongoing workload faced by cyber security professionals, which is compounded by the limited number of qualified personnel and turnover of personnel across organizations. Additionally, it is difficult to conduct research, and particularly, openly published research, due to the sensitivity inherent to cyber operations at most organizations. As an alternative, the current research has focused on data collection during cyber security training exercises. These events draw individuals with a range of knowledge and experience extending from seasoned professionals to recent college graduates to college students. The current paper describes research involving data collection at two separate cyber security exercises. This data collection involved multiple measures which included behavioral performance based on human-machine transactions and questionnaire-based assessments of cyber security experience.

  9. Computer simulation of mucosal waves on vibrating human vocal folds

    Czech Academy of Sciences Publication Activity Database

    Vampola, T.; Horáček, Jaromír; Klepáček, I.

    2016-01-01

    Roč. 36, č. 3 (2016), s. 451-465 ISSN 0208-5216 R&D Projects: GA ČR GA16-01246S; GA ČR(CZ) GAP101/12/1306 Institutional support: RVO:61388998 Keywords : biomechanics of human voice * 3D FE model of human larynx * finite element method * proper orthogonal decomposition analysis Subject RIV: BI - Acoustics Impact factor: 1.031, year: 2016 http://ac.els-cdn.com/S0208521616300298/1-s2.0-S0208521616300298-main.pdf

  10. Transnational HCI: Humans, Computers and Interactions in Global Contexts

    DEFF Research Database (Denmark)

    Vertesi, Janet; Lindtner, Silvia; Shklovski, Irina

    2011-01-01

    , but as evolving in relation to global processes, boundary crossings, frictions and hybrid practices. In doing so, we expand upon existing research in HCI to consider the effects, implications for individuals and communities, and design opportunities in times of increased transnational interactions. We hope...... to broaden the conversation around the impact of technology in global processes by bringing together scholars from HCI and from related humanities, media arts and social sciences disciplines....

  11. Computational Human Performance Modeling For Alarm System Design

    Energy Technology Data Exchange (ETDEWEB)

    Jacques Hugo

    2012-07-01

    The introduction of new technologies like adaptive automation systems and advanced alarm processing and presentation techniques in nuclear power plants is already having an impact on the safety and effectiveness of plant operations, as well as on the role of the control room operator. This impact is expected to escalate dramatically as more and more nuclear power utilities embark on upgrade projects in order to extend the lifetime of their plants. One of the most visible impacts in control rooms will be the need to replace aging alarm systems. Because most of these alarm systems use obsolete technologies, the methods, techniques and tools that were used to design the previous generation of alarm systems are no longer effective and need to be updated. The same applies to the need to analyze and redefine operators' alarm handling tasks. In the past, methods for analyzing human tasks and workload have relied on crude, paper-based methods that often lacked traceability. New approaches are needed to allow analysts to model and represent the new concepts of alarm operation and human-system interaction. State-of-the-art task simulation tools are now available that offer a cost-effective and efficient method for examining the effect of operator performance in different conditions and operational scenarios. A discrete event simulation system was used by human factors researchers at the Idaho National Laboratory to develop a generic alarm handling model to examine operator performance with a simulated modern alarm system. It allowed analysts to evaluate alarm generation patterns as well as critical task times and human workload predicted by the system.
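    The kind of generic alarm-handling model described above can be prototyped with an off-the-shelf discrete event simulation library. A minimal sketch using Python's simpy (not the tool used at INL; the arrival and handling rates are invented placeholders) that estimates queueing delay as a crude workload proxy:

```python
import random
import simpy

ARRIVAL_MEAN = 30.0   # mean time between alarms, s (assumed)
HANDLE_MEAN = 20.0    # mean alarm handling time, s (assumed)
waits = []

def alarm_source(env, operator):
    while True:
        yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN))
        env.process(handle_alarm(env, operator))

def handle_alarm(env, operator):
    raised = env.now
    with operator.request() as req:    # single operator = workload bottleneck
        yield req
        waits.append(env.now - raised)
        yield env.timeout(random.expovariate(1.0 / HANDLE_MEAN))

env = simpy.Environment()
operator = simpy.Resource(env, capacity=1)
env.process(alarm_source(env, operator))
env.run(until=8 * 3600)                # one 8-hour shift
print(f"alarms: {len(waits)}, mean queueing delay: {sum(waits)/len(waits):.1f} s")
```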

  12. Advanced approaches to characterize the human intestinal microbiota by computational meta-analysis

    NARCIS (Netherlands)

    Nikkilä, J.; Vos, de W.M.

    2010-01-01

    GOALS: We describe advanced approaches for the computational meta-analysis of a collection of independent studies, including over 1000 phylogenetic array datasets, as a means to characterize the variability of human intestinal microbiota. BACKGROUND: The human intestinal microbiota is a complex

  13. The Socioemotional Effects of a Computer-Simulated Animal on Children's Empathy and Humane Attitudes

    Science.gov (United States)

    Tsai, Yueh-Feng Lily; Kaufman, David M.

    2009-01-01

    This study investigated the potential of using a computer-simulated animal in a handheld virtual pet videogame to improve children's empathy and humane attitudes. Also investigated was whether sex differences existed in children's development of empathy and humane attitudes resulting from play, as well as their feelings for a virtual pet. The…

  14. Operational characteristics optimization of human-computer system

    Directory of Open Access Journals (Sweden)

    Zulquernain Mallick

    2010-09-01

    Full Text Available Computer operational parameters have a vital influence on operator efficiency from a readability viewpoint. Four parameters, namely font, text/background color, viewing angle and viewing distance, are analyzed. The text reading task, in the form of English text, was presented on the computer screen to the participating subjects and their performance, measured in terms of number of words read per minute (NWRPM), was recorded. For the purpose of optimization, the Taguchi method is used to find the optimal parameters that maximize operators' efficiency on the readability task. Two levels of each parameter were considered in this study. An orthogonal array, the signal-to-noise (S/N) ratio and the analysis of variance (ANOVA) were employed to investigate the operators' performance/efficiency. Results showed that with Times Roman font, black text on a white background, a 40 degree viewing angle and a 60 cm viewing distance, the subjects were quite comfortable, efficient and read the maximum number of words per minute. Text/background color was the dominant parameter, with a percentage contribution of 76.18% towards the stated objective, followed by font type at 18.17%, viewing distance at 7.04% and viewing angle at 0.58%. Experimental results are provided to confirm the effectiveness of this approach.
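    For readers unfamiliar with the Taguchi machinery, the per-trial statistic behind such rankings is the signal-to-noise ratio; since more words read per minute is better, the larger-the-better form applies. A minimal sketch with invented replicate data, not the study's measurements:

```python
import numpy as np

def sn_larger_is_better(y):
    """Larger-the-better S/N = -10*log10(mean(1/y^2)); higher is better."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(1.0 / y**2))

# Replicated NWRPM readings for one row of the orthogonal array (assumed).
trial = [182.0, 175.0, 190.0]
print(f"S/N = {sn_larger_is_better(trial):.2f} dB")
```

    Percentage contributions like the 76.18% quoted above then come from an ANOVA decomposition of these S/N values across the orthogonal array rows.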

  15. Computational modeling of human oral bioavailability: what will be next?

    Science.gov (United States)

    Cabrera-Pérez, Miguel Ángel; Pham-The, Hai

    2018-06-01

    The oral route is the most convenient way of administering drugs. Therefore, accurate determination of oral bioavailability is paramount during drug discovery and development. Quantitative structure-property relationship (QSPR), rule-of-thumb (RoT) and physiologically based pharmacokinetic (PBPK) approaches are promising alternatives for early oral bioavailability prediction. Areas covered: The authors give insight into the factors affecting bioavailability, the fundamental theoretical framework and the practical aspects of computational methods for predicting this property. They also give their perspectives on future computational models for estimating oral bioavailability. Expert opinion: Oral bioavailability is a multi-factorial pharmacokinetic property whose accurate prediction is challenging. For RoT and QSPR modeling, the reliability of datasets, the significance of molecular descriptor families and the diversity of chemometric tools used are important factors that define model predictability and interpretability. Likewise, for PBPK modeling the integrity of the pharmacokinetic data, the number of input parameters, the complexity of statistical analysis and the software packages used are relevant factors in bioavailability prediction. Although these approaches have been utilized independently, the tendency to use hybrid QSPR-PBPK approaches together with the exploration of ensemble and deep-learning systems for QSPR modeling of oral bioavailability has opened new avenues for developing promising tools for oral bioavailability prediction.

  16. Automated planning target volume generation: an evaluation pitting a computer-based tool against human experts

    International Nuclear Information System (INIS)

    Ketting, Case H.; Austin-Seymour, Mary; Kalet, Ira; Jacky, Jon; Kromhout-Schiro, Sharon; Hummel, Sharon; Unger, Jonathan; Fagan, Lawrence M.; Griffin, Tom

    1997-01-01

    Purpose: Software tools are seeing increased use in three-dimensional treatment planning. However, the development of these tools frequently omits careful evaluation before placing them in clinical use. This study demonstrates the application of a rigorous evaluation methodology using blinded peer review to an automated software tool that produces ICRU-50 planning target volumes (PTVs). Methods and Materials: Seven physicians from three different institutions involved in three-dimensional treatment planning participated in the evaluation. Four physicians drew partial PTVs on nine test cases, consisting of four nasopharynx and five lung primaries. Using the same information provided to the human experts, the computer tool generated PTVs for comparison. The remaining three physicians, designated evaluators, individually reviewed the PTVs for acceptability. To exclude bias, the evaluators were blinded to the source (human or computer) of the PTVs they reviewed. Their scorings of the PTVs were statistically examined to determine if the computer tool performed as well as the human experts. Results: The computer tool was as successful as the human experts in generating PTVs. Failures were primarily attributable to insufficient margins around the clinical target volume and to encroachment upon critical structures. In a qualitative analysis, the human and computer experts displayed similar types and distributions of errors. Conclusions: Rigorous evaluation of computer-based radiotherapy tools requires comparison to current practice and can reveal areas for improvement before the tool enters clinical practice

  17. Human Inspired Self-developmental Model of Neural Network (HIM): Introducing Content/Form Computing

    Science.gov (United States)

    Krajíček, Jiří

    This paper presents cross-disciplinary research between medical/psychological evidence on human abilities and the needs of informatics to update current models in computer science to support alternative methods for computation and communication. In [10] we have already proposed a hypothesis introducing the concept of the human information model (HIM) as a cooperative system. Here we continue with the HIM design in detail. In our design, we first introduce the Content/Form computing system, which is a new principle relative to present methods in evolutionary computing (genetic algorithms, genetic programming). We then apply this system to the HIM (a type of artificial neural network) model as a basic network self-developmental paradigm. The main inspiration for our natural/human design comes from the well-known concept of artificial neural networks, medical/psychological evidence and Sheldrake's theory of "Nature as Alive" [22].

  18. Flow velocity-driven differentiation of human mesenchymal stromal cells in silk fibroin scaffolds: A combined experimental and computational approach.

    Directory of Open Access Journals (Sweden)

    Jolanda Rita Vetsch

    Full Text Available Mechanical loading plays a major role in bone remodeling and fracture healing. Mimicking the concept of mechanical loading of bone has been widely studied in bone tissue engineering by perfusion cultures. Nevertheless, there is still debate regarding the in-vitro mechanical stimulation regime. This study aims at investigating the effect of two different flow rates (vlow = 0.001 m/s and vhigh = 0.061 m/s) on the growth of mineralized tissue produced by human mesenchymal stromal cells cultured on 3-D silk fibroin scaffolds. The flow rates applied were chosen to mimic the mechanical environment during early fracture healing or during bone remodeling, respectively. Scaffolds cultured under static conditions served as a control. Time-lapsed micro-computed tomography showed that mineralized extracellular matrix formation was completely inhibited at vlow compared to vhigh and the static group. Biochemical assays and histology confirmed these results and showed enhanced osteogenic differentiation at vhigh, whereas the amount of DNA was increased at vlow. The biological response at vlow might correspond to the early stage of fracture healing, where cell proliferation and matrix production are prominent. Visual mapping of shear stresses, simulated by computational fluid dynamics, onto 3-D micro-computed tomography data revealed that shear stresses up to 0.39 mPa induced a higher DNA amount and shear stresses between 0.55 mPa and 24 mPa induced osteogenic differentiation. This study demonstrates the feasibility of driving the behavior of human mesenchymal stromal cells by the flow velocity applied, in agreement with mechanical loading mimicking early fracture healing (vlow) or bone remodeling (vhigh). These results can be used in the future to tightly control the behavior of human mesenchymal stromal cells towards proliferation or differentiation. Additionally, the combination of experiment and simulation presented is a strong tool to link biological responses to
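    The reported thresholds lend themselves to a simple post-processing step: label each CFD-resolved region of the scaffold by the biological response its local shear stress is expected to induce. A minimal sketch using the thresholds quoted above; the input stresses are placeholders, not the study's CFD output.

```python
import numpy as np

def classify_shear(tau_mpa):
    """Label scaffold regions by local wall shear stress tau (in mPa)."""
    tau = np.asarray(tau_mpa, dtype=float)
    labels = np.full(tau.shape, "unclassified", dtype=object)
    labels[tau <= 0.39] = "proliferation"                    # higher DNA amount
    labels[(tau >= 0.55) & (tau <= 24.0)] = "differentiation"  # osteogenic
    return labels

tau = np.array([0.1, 0.5, 3.0, 30.0])   # example CFD-derived stresses, mPa
print(dict(zip(tau, classify_shear(tau))))
```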

  19. The Study on Human-Computer Interaction Design Based on the Users’ Subconscious Behavior

    Science.gov (United States)

    Li, Lingyuan

    2017-09-01

    Human-computer interaction is human-centered. An excellent interaction design should focus on the study of user experience, which depends greatly on the consistency between a design and users' behavioral habits. However, users' behavioral habits often arise from the subconscious. Therefore, it is smart to utilize users' subconscious behavior to achieve the design's intention and maximize the value of a product's functions, an approach that is gradually becoming a new trend in this field.

  20. USING RESEARCH METHODS IN HUMAN COMPUTER INTERACTION TO DESIGN TECHNOLOGY FOR RESILIENCE

    OpenAIRE

    Lopes, Arminda Guerra

    2016-01-01

    ABSTRACT Research in human computer interaction (HCI) covers both technological and human behavioural concerns. As a consequence, the contributions made in HCI research tend to be oriented toward either engineering or the social sciences. In HCI the purpose of practical research contributions is to reveal unknown insights about human behaviour and its relationship to technology. Practical research methods normally used in HCI include formal experiments, field experiments, field studies, interviews, ...

  1. Optimal design methods for a digital human-computer interface based on human reliability in a nuclear power plant

    International Nuclear Information System (INIS)

    Jiang, Jianjun; Zhang, Li; Xie, Tian; Wu, Daqing; Li, Min; Wang, Yiqun; Peng, Yuyuan; Peng, Jie; Zhang, Mengjia; Li, Peiyao; Ma, Congmin; Wu, Xing

    2017-01-01

    Highlights: • A complete optimization process is established for digital human-computer interfaces of NPPs. • A quick convergence search method is proposed. • The authors propose an affinity error probability mapping function to test human reliability. - Abstract: This is the second in a series of papers describing optimal design methods for a digital human-computer interface of a nuclear power plant (NPP) from three different perspectives based on human reliability. The purpose of this series is to explore different optimization methods from varying perspectives. The present paper mainly discusses the optimal design method for the quantity of components of the same factor. In the monitoring process, the quantity of components places a heavy burden on operators; thus, human errors are easily triggered. To solve this problem, the authors propose an optimization process, a quick convergence search method and an affinity error probability mapping function. Two balanceable parameter values of the affinity error probability function are obtained by experiments. The experimental results show that the affinity error probability mapping function for the human-computer interface has very good sensitivity and stability, and that the quick convergence search method for fuzzy segments divided by component quantity performs better than a general algorithm.

  2. Computational Modelling of the Human Islet Amyloid Polypeptide

    DEFF Research Database (Denmark)

    Skeby, Katrine Kirkeby

    2014-01-01

    to interpret results correctly. Computational studies and molecular dynamics (MD) simulations in particular have become important tools in the effort to understand biological mechanisms. The strength of these methods is the high resolution in time and space, and the ability to specifically design the system....... Using MD simulations we have investigated the binding of 13 different imaging agents to a fibril segment. Using clustering analysis and binding energy calculations we have identified a common binding mode for the 13 agents in the surface grooves of the fibril, which are present on all amyloid fibrils....... This information combined with specific knowledge about the AD amyloid fibril is the building block for the design of highly specific amyloid imaging agents. We have also used MD simulations to study the interaction between hIAPP and a phospholipid membrane. At neutral pH, we find that the attraction is mainly...

  3. Computing Stability Effects of Mutations in Human Superoxide Dismutase 1

    DEFF Research Database (Denmark)

    Kepp, Kasper Planeta

    2014-01-01

    Protein stability is affected in several diseases and is of substantial interest in efforts to correlate genotypes to phenotypes. Superoxide dismutase 1 (SOD1) is a suitable test case for such correlations due to its abundance, stability, available crystal structures and thermochemical data......, and physiological importance. In this work, stability changes of SOD1 mutations were computed with five methods, CUPSAT, I-Mutant2.0, I-Mutant3.0, PoPMuSiC, and SDM, with emphasis on structural sensitivity as a potential issue in structure-based protein calculation. The large correlation between experimental...... literature data of SOD1 dimers and monomers (r = 0.82) suggests that mutations in separate protein monomers are mostly additive. PoPMuSiC was most accurate (typical MAE ∼ 1 kcal/mol, r ∼ 0.5). The relative performance of the methods was not very structure-dependent, and the more accurate methods also...

  4. A computational model of blast loading on the human eye.

    Science.gov (United States)

    Bhardwaj, Rajneesh; Ziegler, Kimberly; Seo, Jung Hee; Ramesh, K T; Nguyen, Thao D

    2014-01-01

    Ocular injuries from blast have increased in recent wars, but the injury mechanism associated with the primary blast wave is unknown. We employ a three-dimensional fluid-structure interaction computational model to understand the stresses and deformations incurred by the globe due to blast overpressure. Our numerical results demonstrate that the blast wave reflections off the facial features around the eye increase the pressure loading on and around the eye. The blast wave produces asymmetric loading on the eye, which causes globe distortion. The deformation response of the globe under blast loading was evaluated, and regions of high stresses and strains inside the globe were identified. Our numerical results show that the blast loading results in globe distortion and large deviatoric stresses in the sclera. These large deviatoric stresses may be an indicator of the risk of interfacial failure between the tissues of the sclera and the orbit.

  5. Human anatomy nomenclature rules for the computer age.

    Science.gov (United States)

    Neumann, Paul E; Baud, Robert; Sprumont, Pierre

    2017-04-01

    Information systems are increasing in importance in biomedical sciences and medical practice. The nomenclature rules of human anatomy were reviewed for adequacy with respect to modern needs. New rules are proposed here to ensure that each Latin term is uniquely associated with an anatomical entity, as short and simple as possible, and machine-interpretable. Observance of these recommendations will also benefit students and translators of the Latin terms into other languages. Clin. Anat. 30:300-302, 2017. © 2016 Wiley Periodicals, Inc.

  6. Brain-Computer Interfaces Applying Our Minds to Human-computer Interaction

    CERN Document Server

    Tan, Desney S

    2010-01-01

    For generations, humans have fantasized about the ability to create devices that can see into a person's mind and thoughts, or to communicate and interact with machines through thought alone. Such ideas have long captured the imagination of humankind in the form of ancient myths and modern science fiction stories. Recent advances in cognitive neuroscience and brain imaging technologies have started to turn these myths into a reality, and are providing us with the ability to interface directly with the human brain. This ability is made possible through the use of sensors that monitor physical p

  7. National debate (Slovenia)

    International Nuclear Information System (INIS)

    Pecnik, Maks; Veselic, Miran

    2003-01-01

    be adopted for a maximum of four years. Based on legislation, a number of measures were implemented to protect the environment and human society against the harmful impact of radioactive waste and spent fuel. The most important measure was the definition of the actors in the area of radioactive waste management. A clear requirement for the management of radioactive waste and spent fuel is set in the new Act on Ionising Radiation Protection and Nuclear Safety, which provides that the holder of radioactive waste and spent fuel must ensure that the radioactive waste and spent fuel are handled in the way prescribed and that the transfer of the burden of disposing of radioactive waste and spent fuel to future generations is avoided as far as possible. The person responsible for the occurrence of radioactive waste and spent fuel must ensure that waste radioactive substances arise in the smallest possible quantities. The costs of radioactive waste and spent fuel management shall be paid by the person responsible for the occurrence of the radioactive waste, or by the holder of the waste if possession of it was transferred from the person responsible for its occurrence, or if he acquired it in any other way. If the person responsible for the occurrence of radioactive waste or spent fuel is not known, the state shall take responsibility for its management. Liabilities are determined by legally binding instruments such as laws and subsidiary legislation. They are verified by regulators and approved annually through the report on Nuclear and Radiation Safety by the Slovenian Parliament. The Krsko NPP is also obliged to secure the funds for the decommissioning and the final disposal of radioactive waste and spent nuclear fuel. The decommissioning of the NPP Krsko is regulated through the Act on the Fund for Financing the Decommissioning of the Krsko NPP and on Radioactive Waste Disposal from the Krsko NPP. Based on this Act, the Fund for Decommissioning of the NPP Krsko was

  8. Teaching Speaking Through Debate Technique

    Directory of Open Access Journals (Sweden)

    . Suranto

    2016-07-01

    Full Text Available Abstract : Teaching Speaking Through the Debate Technique. Speaking is one of the four basic competences (listening, speaking, reading and writing). Speaking ability should be mastered by every student; in order to achieve that competence, students should be taught with the right technique for learning speaking. The success of the students' speaking can be seen from their ability to express ideas, thoughts and feelings through speaking. The objective of this action research is to improve students' oral communication skill through the debate technique. This study was conducted at MA Ma'arif Nu 5 Sekampung, Lampung Timur from March to April 2014. The research data were taken from the 28 students of the eleventh class and analyzed qualitatively and quantitatively. The research findings indicate that there are improvements in the students' English speaking skill through the debate technique. By analyzing the data qualitatively and quantitatively from the end of the first cycle to the second cycle, it was found that the students' English speaking skill increased by 20.9% over the 65% standard that had been set by the researcher. The researcher concludes that the students' English speaking skill can be improved through the debate technique in the learning process.   Key words : action research, debate technique, english speaking skill

  9. Brain-Computer Interfaces. Applying our Minds to Human-Computer Interaction

    NARCIS (Netherlands)

    Tan, Desney S.; Nijholt, Antinus

    2010-01-01

    For generations, humans have fantasized about the ability to create devices that can see into a person’s mind and thoughts, or to communicate and interact with machines through thought alone. Such ideas have long captured the imagination of humankind in the form of ancient myths and modern science

  10. Situated dialog in speech-based human-computer interaction

    CERN Document Server

    Raux, Antoine; Lane, Ian; Misu, Teruhisa

    2016-01-01

    This book provides a survey of the state-of-the-art in the practical implementation of Spoken Dialog Systems for applications in everyday settings. It includes contributions on key topics in situated dialog interaction from a number of leading researchers and offers a broad spectrum of perspectives on research and development in the area. In particular, it presents applications in robotics, knowledge access and communication and covers the following topics: dialog for interacting with robots; language understanding and generation; dialog architectures and modeling; core technologies; and the analysis of human discourse and interaction. The contributions are adapted and expanded contributions from the 2014 International Workshop on Spoken Dialog Systems (IWSDS 2014), where researchers and developers from industry and academia alike met to discuss and compare their implementation experiences, analyses and empirical findings.

  11. Computational model of soft tissues in the human upper airway.

    Science.gov (United States)

    Pelteret, J-P V; Reddy, B D

    2012-01-01

    This paper presents a three-dimensional finite element model of the tongue and surrounding soft tissues with potential application to the study of sleep apnoea and of linguistics and speech therapy. The anatomical data was obtained from the Visible Human Project, and the underlying histological data was also extracted and incorporated into the model. Hyperelastic constitutive models were used to describe the material behaviour, and material incompressibility was accounted for. An active Hill three-element muscle model was used to represent the muscular tissue of the tongue. The neural stimulus for each muscle group was determined through the use of a genetic algorithm-based neural control model. The fundamental behaviour of the tongue under gravitational and breathing-induced loading is investigated. It is demonstrated that, when a time-dependent loading is applied to the tongue, the neural model is able to control the position of the tongue and produce a physiologically realistic response for the genioglossus.
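    As a rough illustration of the active Hill three-element muscle model mentioned above, the contractile element's force is commonly written as the product of an activation level, a force-length curve and a force-velocity curve. The gaussian and hyperbolic shapes and all constants below are generic textbook choices, not the paper's calibrated tongue model.

```python
import math

F_MAX = 1.0    # peak isometric force (normalized)
L_OPT = 1.0    # optimal fibre length (normalized)

def force_length(l):
    """Gaussian active force-length relationship, peaking at L_OPT."""
    return math.exp(-((l / L_OPT - 1.0) / 0.45) ** 2)

def force_velocity(v, v_max=1.0):
    """Concentric branch of the Hill hyperbola (shortening: v >= 0)."""
    return (1.0 - v / v_max) / (1.0 + 4.0 * v / v_max)

def active_force(activation, l, v):
    """Contractile-element force = a(t) * F_max * f_L(l) * f_V(v)."""
    return activation * F_MAX * force_length(l) * force_velocity(v)

print(f"force at a=0.8, l=1.05, v=0.1: {active_force(0.8, 1.05, 0.1):.3f}")
```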

  12. Consciousness operationalized, a debate realigned.

    Science.gov (United States)

    Carruthers, Peter; Veillet, Bénédicte

    2017-10-01

    This paper revisits the debate about cognitive phenomenology. It elaborates, defends, and improves on our earlier proposal for resolving that debate, according to which the test for irreducible phenomenology is the presence of explanatory gaps. After showing how proposals like ours have been misunderstood or misused by others, we deploy our operationalization to argue that the correct way to align the debate over cognitive phenomenology is not between sensory and (alleged) cognitive phenomenology, but rather between non-conceptual and (alleged) conceptual or propositional phenomenology. In doing so we defend three varieties of non-sensory (amodal) non-conceptual phenomenology: valence, a sense of approximate number, and a sense of elapsed time. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. Unpacking the great transmission debate

    Science.gov (United States)

    Denning, Kathryn

    2010-12-01

    The debate about the wisdom of sending interstellar transmissions is well-known to those involved in SETI, and frustrating for many. Its tendency towards intractability is a result of multiple factors, including: different models of the scientist's role as citizen and/or leader; disparate ideas about society's readiness to cope with frontier science; variable political substrates, particularly ideas concerning individual freedom and state control; competing ideologies of globalization; and the perceived relative risks and benefits of contact. (Variations in the latter, i.e. assessments of the risks and benefits of contact, derive partly from different thinking styles, including tolerance for risk, and partly from inferences based upon episodes of biological and cultural contact on Earth.) Unpacking the debate into its components may be of use to those debating policy about SETI transmissions, or at the very least, help keep in focus what, precisely, the perennial arguments are really about.

  14. Transversal Lines of the Debates

    Directory of Open Access Journals (Sweden)

    Yolanda Onghena

    1998-12-01

    Full Text Available The Transversal Lines of the Debates gathers for publication the presentations of the scholars invited to the seminar. In the papers, Yolanda Onghena observes that the evolution from the cultural to the inter-cultural travels along four axes: the relations between culture and society; the processes of change within identity-based dynamics; the representations of the Other; and interculturality. Throughout the presentations and subsequent debates, whenever the different participants referred to aspects of the cultural identity problematic--"angst", "obsession", "deficit", "manipulation", and others--these same participants in the Transversal Lines of the Debates also showed that, in certain areas, an optimistic viewpoint is not out of the question.

  15. Debates in History Teaching. The Debates in Subject Teaching Series

    Science.gov (United States)

    Davies, Ian, Ed.

    2010-01-01

    "Debates in History Teaching" explores the major issues all history teachers encounter in their daily professional lives. It encourages critical reflection and aims to stimulate both novice and experienced teachers to think more deeply about their practice, and link research and evidence to what they have observed in schools. Written by a range of…

  16. The Debate over Inclusive Fitness as a Debate over Methodologies

    NARCIS (Netherlands)

    Rubin, Hannah

    This article analyzes the recent debate surrounding inclusive fitness and argues that certain limitations ascribed to it by critics—such as requiring weak selection or providing dynamically insufficient models—are better thought of as limitations of the methodological framework most often used with

  17. Glove-Enabled Computer Operations (GECO): Design and Testing of an Extravehicular Activity Glove Adapted for Human-Computer Interface

    Science.gov (United States)

    Adams, Richard J.; Olowin, Aaron; Krepkovich, Eileen; Hannaford, Blake; Lindsay, Jack I. C.; Homer, Peter; Patrie, James T.; Sands, O. Scott

    2013-01-01

    The Glove-Enabled Computer Operations (GECO) system enables an extravehicular activity (EVA) glove to be dual-purposed as a human-computer interface device. This paper describes the design and human participant testing of a right-handed GECO glove in a pressurized glove box. As part of an investigation into the usability of the GECO system for EVA data entry, twenty participants were asked to complete activities including (1) a Simon Says game in which they attempted to duplicate random sequences of targeted finger strikes and (2) a Text Entry activity in which they used the GECO glove to enter target phrases in two different virtual keyboard modes. In a within-subjects design, both activities were performed both with and without vibrotactile feedback. Participants' mean accuracies in correctly generating finger strikes with the pressurized glove were surprisingly high, both with and without the benefit of tactile feedback. Five of the subjects achieved mean accuracies exceeding 99% in both conditions. In Text Entry, tactile feedback provided a statistically significant performance benefit, quantified by characters entered per minute, as well as a reduction in error rate. Secondary analyses of responses to NASA Task Load Index (TLX) subjective workload assessments reveal a benefit for tactile feedback in GECO glove use for data entry. This first-ever investigation of the employment of a pressurized EVA glove for human-computer interface opens up a wide range of future applications, including text chat communications, manipulation of procedures/checklists, cataloguing/annotating images, scientific note taking, human-robot interaction, and control of suit and/or other EVA systems.

  18. Can Computers Foster Human Users’ Creativity? Theory and Praxis of Mixed-Initiative Co-Creativity

    Directory of Open Access Journals (Sweden)

    Antonios Liapis

    2016-07-01

    Full Text Available This article discusses the impact of artificially intelligent computers on the process of design, play and educational activities. A computational process which has the necessary intelligence and creativity to take a proactive role in such activities can not only support human creativity but also foster it and prompt lateral thinking. The argument is made both from the perspective of human creativity, where the computational input is treated as an external stimulus which triggers re-framing of humans' routines and mental associations, and from the perspective of computational creativity, where human input and initiative constrain the search space of the algorithm, enabling it to focus on specific possible solutions to a problem rather than searching globally for the optimum. The article reviews four mixed-initiative tools (for design and educational play) based on how they contribute to human-machine co-creativity. These paradigms serve different purposes, afford different human interaction methods and incorporate different computationally creative processes. Assessing how co-creativity is facilitated on a per-paradigm basis strengthens the theoretical argument and provides an initial seed for future work in the burgeoning domain of mixed-initiative interaction.

  19. Debate

    DEFF Research Database (Denmark)

    Adler-Nissen, Rebecca

    2005-01-01

    That Norway and Iceland are doing well because of the absence of EU membership is a truth that needs qualification. On the contrary, the two countries are highly dependent on the EU, and the idea that they enjoy far-reaching independence is an illusion. Through the EEA Agreement, Norway and Iceland are obliged to implement the EU's legisl...

  20. [Debate]

    DEFF Research Database (Denmark)

    Myong, Lene; Müller, Anders Riel

    2015-01-01

    Criticism of racism is systematically dismissed as either abstract intellectual spin or individual emotional outbursts. Most recently in the text 'Tanker om en hottentot-karussel', where racialized minorities are asked to tone down the criticism and instead appeal to the white heart...

  1. Proceedings of the Third International Conference on Intelligent Human Computer Interaction

    CERN Document Server

    Pokorný, Jaroslav; Snášel, Václav; Abraham, Ajith

    2013-01-01

    The Third International Conference on Intelligent Human Computer Interaction 2011 (IHCI 2011) was held at Charles University, Prague, Czech Republic from August 29 to August 31, 2011. This conference was the third in the series, following IHCI 2009 and IHCI 2010 held in January at IIIT Allahabad, India. Human computer interaction is a fast-growing research area and an attractive subject of interest for both academia and industry. There are many interesting and challenging topics that need to be researched and discussed. This book aims to provide excellent opportunities for the dissemination of interesting new research and discussion about presented topics. It can be useful for researchers working on various aspects of human computer interaction. Topics covered in this book include user interface and interaction, theoretical background and applications of HCI and also data mining and knowledge discovery as a support of HCI applications.

  2. Treatment of human-computer interface in a decision support system

    International Nuclear Information System (INIS)

    Heger, A.S.; Duran, F.A.; Cox, R.G.

    1992-01-01

    One of the most challenging applications facing the computer community is the development of effective adaptive human-computer interfaces. This challenge stems from the complex nature of the human part of this symbiosis. The application of this discipline to environmental restoration and waste management is further complicated by the nature of environmental data. The information that is required to manage the environmental impacts of human activity is fundamentally complex. This paper will discuss efforts at Sandia National Laboratories to develop the adaptive conceptual model manager within the constraints of environmental decision-making. A computer workstation that hosts the Conceptual Model Manager and the Sandia Environmental Decision Support System will also be discussed.

  3. Investigation and evaluation into the usability of human-computer interfaces using a typical CAD system

    Energy Technology Data Exchange (ETDEWEB)

    Rickett, J D

    1987-01-01

    This research program covers three topics relating to the human-computer interface, namely voice recognition, tools and techniques for evaluation, and user and interface modeling. An investigation into the implementation of voice-recognition technologies examines how voice recognizers may be evaluated in commercial software. A prototype system was developed in collaboration with FEMVIEW Ltd. (which markets a CAD package). A theoretical approach to evaluation leads to the hypothesis that human-computer interaction is affected by personality, influencing types of dialogue, preferred methods for providing help, etc. A user model based on personality traits, or habitual behavior patterns (HBP), is presented. Finally, a practical framework is provided for the evaluation of human-computer interfaces. It suggests that evaluation is an integral part of design and that the iterative use of evaluation techniques throughout the conceptualization, design, implementation and post-implementation stages will ensure systems that satisfy the needs of the users and fulfill the goal of usability.

  4. The nuclear debate in Austria

    International Nuclear Information System (INIS)

    Weish, P.

    1977-01-01

    This report was published during the debate about the construction of nuclear power plants in Austria and before the national referendum which prevented the commissioning of Zwentendorf, Austria's first nuclear power plant. The report gives an overview of the events in the discussion about Austria's nuclear future. (kancsar)

  5. A debate on open inflation

    Science.gov (United States)

    Hawking, S. W.

    1999-07-01

    This is a reproduction of Professor Stephen Hawking's part in a debate which took place at the COSMO 98 Conference in Monterey, California. Two other physicists, Andrei Linde and Alexander Vilenkin, also took part. Professor Hawking is the Lucasian Professor of Mathematics at the University of Cambridge, in England.

  6. The debate on minimal deterrence

    International Nuclear Information System (INIS)

    Arbatov, A.; Karp, R.C.; Toth, T.

    1993-01-01

    Revitalization of the debate on minimal nuclear deterrence at the present time is induced by the end of the Cold War and a number of unilateral and bilateral actions by the great powers to curtail the nuclear arms race and reduce nuclear weapons arsenals

  7. Distribution of absorbed dose in human eye simulated by SRNA-2KG computer code

    International Nuclear Information System (INIS)

    Ilic, R.; Pesic, M.; Pavlovic, R.; Mostacci, D.

    2003-01-01

    The rapidly increasing performance of personal computers and the development of proton transport codes based on Monte Carlo methods will very soon allow the introduction of computer-planned proton therapy as a normal activity in regular hospital procedures. A description of the SRNA code used for such applications and results of calculated distributions of proton-absorbed dose in the human eye are given in this paper. (author)
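    A deliberately toy-level sketch of what such a Monte Carlo calculation tallies: transport proton histories in small steps, deposit energy with a stopping power that grows as the proton slows, and histogram dose versus depth to recover a Bragg-like peak. The ~1/E stopping-power law, the straggling model and all constants below are crude assumptions, not SRNA-2KG physics.

```python
import numpy as np

rng = np.random.default_rng(0)
N, E0, STEP = 20000, 60.0, 0.01        # histories, initial MeV, step (cm)
dose = np.zeros(400)                   # depth-dose tally over 0-4 cm

for _ in range(N):
    e = E0 * (1.0 + 0.01 * rng.standard_normal())   # 1% energy straggling
    x = 0.0
    while e > 0.0 and x < 4.0:
        de = min(e, STEP * 600.0 / max(e, 1.0))     # ~1/E stopping power
        dose[int(x / STEP)] += de                   # deposit along the track
        e -= de
        x += STEP

print(f"Bragg-like peak near {dose.argmax() * STEP:.2f} cm depth")
```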

  8. 11 CFR 110.13 - Candidate debates.

    Science.gov (United States)

    2010-01-01

    ... debates include at least two candidates; and (2) The staging organization(s) does not structure the... PROHIBITIONS § 110.13 Candidate debates. (a) Staging organizations. (1) Nonprofit organizations described in 26..., subparts D and E. (b) Debate structure. The structure of debates staged in accordance with this section and...

  9. Human-computer interaction handbook fundamentals, evolving technologies and emerging applications

    CERN Document Server

    Sears, Andrew

    2007-01-01

    This second edition of The Human-Computer Interaction Handbook provides an updated, comprehensive overview of the most important research in the field, including insights that are directly applicable throughout the process of developing effective interactive information technologies. It features cutting-edge advances to the scientific knowledge base, as well as visionary perspectives and developments that fundamentally transform the way in which researchers and practitioners view the discipline. As the seminal volume of HCI research and practice, The Human-Computer Interaction Handbook feature

  10. Digital image processing and analysis human and computer vision applications with CVIPtools

    CERN Document Server

    Umbaugh, Scott E

    2010-01-01

    Section I, Introduction to Digital Image Processing and Analysis: Digital Image Processing and Analysis; Overview; Image Analysis and Computer Vision; Image Processing and Human Vision; Key Points; Exercises; References; Further Reading. Computer Imaging Systems: Imaging Systems Overview; Image Formation and Sensing; CVIPtools Software; Image Representation; Key Points; Exercises; Supplementary Exercises; References; Further Reading. Section II, Digital Image Analysis and Computer Vision: Introduction to Digital Image Analysis; Introduction; Preprocessing; Binary Image Analysis; Key Points; Exercises; Supplementary Exercises; References; Further Read

  11. Computational Controversy

    NARCIS (Netherlands)

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have

  12. Human Computer Collaboration at the Edge: Enhancing Collective Situation Understanding with Controlled Natural Language

    Science.gov (United States)

    2016-09-06

    conversational agent with information exchange disabled until the end of the experiment run. The meaning of the indicator in the top-right of the agent...

  13. Modelling flow and heat transfer around a seated human body by computational fluid dynamics

    DEFF Research Database (Denmark)

    Sørensen, Dan Nørtoft; Voigt, Lars Peter Kølgaard

    2003-01-01

    A database (http://www.ie.dtu.dk/manikin) containing a detailed representation of the surface geometry of a seated female human body was created from a surface scan of a thermal manikin (minus clothing and hair). The radiative heat transfer coefficient and the natural convection flow around...... of the computational manikin has all surface features of a human being; (2) the geometry is an exact copy of an experimental thermal manikin, enabling detailed comparisons between calculations and experiments....
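    For the radiative part mentioned above, a linearized radiative heat transfer coefficient is commonly computed from the surface and mean radiant temperatures. A minimal sketch of that standard formula; the emissivity and temperatures are illustrative assumptions rather than values from the manikin study.

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def h_radiative(t_surf_c: float, t_mrt_c: float, emissivity: float = 0.95) -> float:
    """Linearized h_r = eps * sigma * (Ts^2 + Tr^2) * (Ts + Tr), in W/m2K."""
    ts, tr = t_surf_c + 273.15, t_mrt_c + 273.15
    return emissivity * SIGMA * (ts**2 + tr**2) * (ts + tr)

# Skin surface at 34 C facing surroundings at 22 C:
print(f"h_r = {h_radiative(34.0, 22.0):.2f} W/m2K")
```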

  14. Developing Human-Computer Interface Models and Representation Techniques(Dialogue Management as an Integral Part of Software Engineering)

    OpenAIRE

    Hartson, H. Rex; Hix, Deborah; Kraly, Thomas M.

    1987-01-01

    The Dialogue Management Project at Virginia Tech is studying the poorly understood problem of human-computer dialogue development. This problem often leads to low usability in human-computer dialogues. The Dialogue Management Project approaches solutions to low usability in interfaces by addressing human-computer dialogue development as an integral and equal part of the total system development process. This project consists of two rather distinct, but dependent, parts. One is development of ...

  15. Ergonomic guidelines for using notebook personal computers. Technical Committee on Human-Computer Interaction, International Ergonomics Association.

    Science.gov (United States)

    Saito, S; Piccoli, B; Smith, M J; Sotoyama, M; Sweitzer, G; Villanueva, M B; Yoshitake, R

    2000-10-01

    In the 1980s, the visual display terminal (VDT) was introduced in workplaces in many countries. Soon thereafter, an upsurge in reported cases of related health problems, such as musculoskeletal disorders and eyestrain, was seen. Recently, the flat panel display or notebook personal computer (PC) became the most remarkable feature in modern workplaces with VDTs and even in homes. A proactive approach must be taken to avert foreseeable ergonomic and occupational health problems arising from the use of this new technology. Because of its distinct physical and optical characteristics, the ergonomic requirements for notebook PCs, in terms of machine layout, workstation design and lighting conditions, among others, should differ from those for CRT-based computers. The Japan Ergonomics Society (JES) technical committee came up with a set of guidelines for notebook PC use following exploratory discussions that dwelt on its ergonomic aspects. To keep in stride with this development, the Technical Committee on Human-Computer Interaction under the auspices of the International Ergonomics Association worked towards the international issuance of the guidelines. This paper unveils the result of this collaborative effort.

  16. Quantum mechanics interpretation: the so-called debate

    International Nuclear Information System (INIS)

    Sanchez Gomez, J. L.

    2000-01-01

    This paper discusses the two main issues of the so-called quantum debate, which started in 1927 with the famous Bohr-Einstein controversy: namely, non-separability and the projection postulate. Relevant interpretations and formulations of quantum mechanics are critically analyzed in the light of these issues. The treatment focuses chiefly on fundamental points, so that technical ones are practically not dealt with here. (Author) 20 refs

  17. General Assembly debate on IAEA

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1960-01-15

    On 3 November 1959, the General Assembly of the United Nations considered the annual report of the International Atomic Energy Agency, the first report to cover a full operational year of the Agency - 1 July 1958 to 30 June 1959, more recent developments having been summarized in a preface. At the end of the debate the Assembly adopted a resolution, submitted jointly by Czechoslovakia, the Union of South Africa and the United Arab Republic, taking note of the report

  18. Religious organizations debate nuclear energy

    International Nuclear Information System (INIS)

    Dowell, T.

    1984-08-01

    This paper reviews the history of the religious debate on nuclear energy over the last thirty years. In the 1950s, religious statements recognized the peaceful uses of atomic energy as a blessing from God and called upon world leaders to promote its use. Nuclear energy programmes were launched in this decade. In the 1960s, there was still religious approval of nuclear energy, but questions about ethics arose. It was not until the 1970s, after the oil crisis, that serious questioning and criticism of nuclear energy emerged. This was particularly true in the United States, where the majority of statements originated - especially in 1979, the year of the Three Mile Island accident. Around this time, the World Council of Churches developed the concept of the just, participatory and sustainable society. The meaning and use of these terms in the nuclear energy debate is examined. This paper also compares the balanced debate of the World Council with the case against the plutonium economy prepared by the National Council of the Churches of Christ in the USA. Three religious statements from the 1980s are examined. A United Church of Canada resolution, critical of nuclear energy, is compared with a favourable report from the Methodist Church in England. Both use similar values: in one case, justice, participation and sustainability; in the other case, concern for others, participation and stewardship. There are not many Catholic statements on nuclear energy. One which is cautious and favourable is examined in detail. It is concluded that the use of concepts of justice, participation and sustainability (or their equivalents) has not clarified the nuclear debate

  19. The nuclear debate in Canada

    International Nuclear Information System (INIS)

    Macaulay, H.L.

    1981-06-01

    The author argues that the nuclear debate in Canada is concerned less with the safety of nuclear power plants and more with arguments of economics and social decision-making. The nuclear industry cannot afford to neglect the continuing need to inform the public about nuclear risks. But there is also a need to develop specific arguments to increase public acceptance of nuclear energy as an economic, democratic and equitable energy option

  20. Human vs. Computer Diagnosis of Students' Natural Selection Knowledge: Testing the Efficacy of Text Analytic Software

    Science.gov (United States)

    Nehm, Ross H.; Haertig, Hendrik

    2012-01-01

    Our study examines the efficacy of Computer Assisted Scoring (CAS) of open-response text relative to expert human scoring within the complex domain of evolutionary biology. Specifically, we explored whether CAS can diagnose the explanatory elements (or Key Concepts) that comprise undergraduate students' explanatory models of natural selection with…
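    At its simplest, computer-assisted scoring of the kind examined above can be approximated by matching concept lexicons against a student's open response. The sketch below is a bare-bones stand-in with invented cue lists, far cruder than the text analytic software the study actually evaluates.

```python
# Illustrative Key Concept lexicons for natural selection (assumed cues).
KEY_CONCEPTS = {
    "variation":    {"variation", "vary", "differ", "different"},
    "heritability": {"inherit", "heritable", "passed on", "genes"},
    "selection":    {"survive", "survival", "reproduce", "fitness"},
}

def diagnose(answer: str) -> set[str]:
    """Return the set of Key Concepts whose cues appear in the answer."""
    text = answer.lower()
    return {concept for concept, cues in KEY_CONCEPTS.items()
            if any(cue in text for cue in cues)}

print(diagnose("Finches with beak variation survive drought and pass on genes"))
```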

  1. A hybrid approach to the computational aeroacoustics of human voice production

    Czech Academy of Sciences Publication Activity Database

    Šidlof, Petr; Zörner, S.; Huppe, A.

    2015-01-01

    Roč. 14, č. 3 (2015), s. 473-488 ISSN 1617-7959 R&D Projects: GA ČR(CZ) GAP101/11/0207 Institutional support: RVO:61388998 Keywords : computational aeroacoustics * parallel CFD * human voice * vocal folds * ventricular folds Subject RIV: BI - Acoustics Impact factor: 3.032, year: 2015

  2. HCI^2 Workbench: A Development Tool for Multimodal Human-Computer Interaction Systems

    NARCIS (Netherlands)

    Shen, Jie; Wenzhe, Shi; Pantic, Maja

    In this paper, we present a novel software tool designed and implemented to simplify the development process of Multimodal Human-Computer Interaction (MHCI) systems. This tool, which is called the HCI^2 Workbench, exploits a Publish / Subscribe (P/S) architecture [13] [14] to facilitate efficient
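    The publish/subscribe pattern referenced above decouples modality-specific modules by routing messages through topics rather than direct calls. A minimal sketch of that pattern in Python; the class, method and topic names are invented for illustration, not the HCI^2 Workbench API.

```python
from collections import defaultdict
from typing import Callable

class Broker:
    """Tiny in-process publish/subscribe message broker."""

    def __init__(self) -> None:
        self._subs: dict[str, list[Callable]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, payload) -> None:
        for handler in self._subs[topic]:
            handler(payload)

broker = Broker()
# e.g. a fusion module consuming events from an independent face tracker
broker.subscribe("face.expression", lambda e: print("fusion got:", e))
broker.publish("face.expression", {"label": "smile", "confidence": 0.93})
```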

  3. HCI^2 Framework: A software framework for multimodal human-computer interaction systems

    NARCIS (Netherlands)

    Shen, Jie; Pantic, Maja

    2013-01-01

    This paper presents a novel software framework for the development and research in the area of multimodal human-computer interface (MHCI) systems. The proposed software framework, which is called the HCI∧2 Framework, is built upon publish/subscribe (P/S) architecture. It implements a

  4. Research Summary 3-D Computational Fluid Dynamics (CFD) Model Of The Human Respiratory System

    Science.gov (United States)

    The U.S. EPA’s Office of Research and Development (ORD) has developed a 3-D computational fluid dynamics (CFD) model of the human respiratory system that allows for the simulation of particulate based contaminant deposition and clearance, while being adaptable for age, ethnicity,...

  5. The Human-Computer Interaction of Cross-Cultural Gaming Strategy

    Science.gov (United States)

    Chakraborty, Joyram; Norcio, Anthony F.; Van Der Veer, Jacob J.; Andre, Charles F.; Miller, Zachary; Regelsberger, Alexander

    2015-01-01

    This article explores the cultural dimensions of the human-computer interaction that underlies gaming strategies. The article is a desktop study of existing literature and is organized into five sections. The first examines the cultural aspects of knowledge processing. The social constructs of technology interaction are then discussed. Following this, the…

  6. Enhancing Human-Computer Interaction Design Education: Teaching Affordance Design for Emerging Mobile Devices

    Science.gov (United States)

    Faiola, Anthony; Matei, Sorin Adam

    2010-01-01

    The evolution of human-computer interaction design (HCID) over the last 20 years suggests that there is a growing need for educational scholars to consider new and more applicable theoretical models of interactive product design. The authors suggest that such paradigms would call for an approach that would equip HCID students with a better…

  7. Computational Model-Based Prediction of Human Episodic Memory Performance Based on Eye Movements

    Science.gov (United States)

    Sato, Naoyuki; Yamaguchi, Yoko

    Subjects' episodic memory performance is not simply reflected by eye movements. We use a ‘theta phase coding’ model of the hippocampus to predict subjects' memory performance from their eye movements. Results demonstrate the ability of the model to predict subjects' memory performance. These studies provide a novel approach to computational modeling in the human-machine interface.

  8. Rational behavior in decision making. A comparison between humans, computers and fast and frugal strategies

    NARCIS (Netherlands)

    Snijders, C.C.P.

    2007-01-01

    Rational behavior in decision making. A comparison between humans, computers, and fast and frugal strategies Chris Snijders and Frits Tazelaar (Eindhoven University of Technology, The Netherlands) Real life decisions often have to be made in "noisy" circumstances: not all crucial information is

  9. Human brain as the model of a new computer system. II

    Energy Technology Data Exchange (ETDEWEB)

    Holtz, K; Langheld, E

    1981-12-09

    For Pt. I see ibid., Vol. 29, No. 22, p. 13 (1981). The authors describe the self-generating system of connections of a self-teaching, program-free associative computer. The self-generating systems of connections are regarded as simulation models of the human brain and compared with the brain structure. The system hardware comprises a microprocessor, PROM, memory, VDU, and keyboard unit.

  10. Manifesto of Historia a Debate; Manifiesto de Historia a Debate.

    Directory of Open Access Journals (Sweden)

    Historia a Debate

    2011-08-01

    Full Text Available After eight years of contacts, reflections and debates, through congresses, surveys and, lately, the Internet, we have felt the urgency of making explicit and updating our position, in critical dialogue with other historiographical currents likewise developed in the last decade of the twentieth century: (1) the continuism of the 1960s and 1970s, (2) postmodernism, and (3) the return to the old history, the latest historiographical "novelty". We are living through a historical and historiographical transition whose results are still uncertain. Historia a Debate, as a historiographical tendency, wants to contribute to the configuration of a common and plural paradigm for the historians of the twenty-first century, one that secures a new spring for history and its writing. To this end we have drawn up 18 methodological, historiographical and epistemological proposals, which we present to the historians of the world for debate and, where appropriate, critical adherence and further development.

  11. Seismic-load-induced human errors and countermeasures using computer graphics in plant-operator communication

    International Nuclear Information System (INIS)

    Hara, Fumio

    1988-01-01

    This paper highlights the importance of seismic-load-induced human errors in plant operation by delineating the characteristics of human task performance under seismic loads. It focuses on man-machine communication via multidimensional data like that conventionally displayed on large panels in a plant control room. It demonstrates a countermeasure to human errors using a computer graphics technique that conveys the global state of the plant operation to operators through cartoon-like, colored graphs in the form of faces that, with different facial expressions, show the plant safety status. (orig.)

  12. MoCog1: A computer simulation of recognition-primed human decision making, considering emotions

    Science.gov (United States)

    Gevarter, William B.

    1992-01-01

    The successful results of the first stage of a research effort to develop a versatile computer model of motivated human cognitive behavior are reported. Most human decision making appears to be an experience-based, relatively straightforward, largely automatic response to situations, utilizing cues and opportunities perceived from the current environment. The development, considering emotions, of the architecture and computer program associated with such 'recognition-primed' decision-making is described. The resultant computer program (MoCog1) was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.

  13. Heuristic and optimal policy computations in the human brain during sequential decision-making.

    Science.gov (United States)

    Korn, Christoph W; Bach, Dominik R

    2018-01-23

    Optimal decisions across extended time horizons require value calculations over multiple probabilistic future states. Humans may circumvent such complex computations by resorting to easy-to-compute heuristics that approximate optimal solutions. To probe the potential interplay between heuristic and optimal computations, we develop a novel sequential decision-making task, framed as virtual foraging in which participants have to avoid virtual starvation. Rewards depend only on final outcomes over five-trial blocks, necessitating planning over five sequential decisions and probabilistic outcomes. Here, we report model comparisons demonstrating that participants primarily rely on the best available heuristic but also use the normatively optimal policy. FMRI signals in medial prefrontal cortex (MPFC) relate to heuristic and optimal policies and associated choice uncertainties. Crucially, reaction times and dorsal MPFC activity scale with discrepancies between heuristic and optimal policies. Thus, sequential decision-making in humans may emerge from integration between heuristic and optimal policies, implemented by controllers in MPFC.
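
    The optimal policy in such a finite-horizon task can be computed by backward induction. The sketch below uses an invented toy version of the task (energy levels 0-10, one safe and one risky action, survival rewarded only after the fifth trial); none of these parameters come from the study. It contrasts the resulting optimal first actions with an easy-to-compute one-step expected-gain heuristic:

```python
"""Backward-induction sketch of optimal vs. heuristic policies in a
five-trial foraging-like task. All task parameters (state space, gains,
probabilities) are illustrative assumptions, not the published design."""
import numpy as np

N_TRIALS = 5
MAX_ENERGY = 10
# action 0: certain gain of 1; action 1: +4 with prob 0.6, -3 otherwise
OUTCOMES = {0: [(1.0, +1)], 1: [(0.6, +4), (0.4, -3)]}

def step(energy, delta):
    """Energy is clipped to [0, MAX_ENERGY]; 0 means starvation."""
    return max(0, min(MAX_ENERGY, energy + delta))

# V[t][e] = probability of surviving the block from trial t with energy e.
V = np.zeros((N_TRIALS + 1, MAX_ENERGY + 1))
V[N_TRIALS] = (np.arange(MAX_ENERGY + 1) > 0).astype(float)  # reward only at the end
policy = np.zeros((N_TRIALS, MAX_ENERGY + 1), dtype=int)

for t in range(N_TRIALS - 1, -1, -1):
    for e in range(MAX_ENERGY + 1):
        if e == 0:                     # starvation is absorbing
            continue
        q = [sum(p * V[t + 1][step(e, d)] for p, d in OUTCOMES[a]) for a in (0, 1)]
        policy[t][e] = int(np.argmax(q))
        V[t][e] = max(q)

def heuristic_choice(_energy):
    """One-step heuristic: maximise the expected immediate gain."""
    gains = {a: sum(p * d for p, d in OUTCOMES[a]) for a in OUTCOMES}
    return max(gains, key=gains.get)

for e in (1, 2, 5):
    print(f"energy={e}: optimal first action={policy[0][e]}, "
          f"heuristic action={heuristic_choice(e)}")
```

    With these toy numbers the heuristic always prefers the risky action (higher expected gain), while the optimal policy avoids it at low energy, where one bad outcome means starvation; it is exactly this kind of discrepancy that reaction times and dorsal MPFC activity were reported to track.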

  14. Computer-based personality judgments are more accurate than those made by humans

    Science.gov (United States)

    Youyou, Wu; Kosinski, Michal; Stillwell, David

    2015-01-01

    Judging others’ personalities is an essential skill in successful social living, as personality is a key driver behind people’s interactions, behaviors, and emotions. Although accurate personality judgments stem from social-cognitive skills, developments in machine learning show that computer models can also make valid judgments. This study compares the accuracy of human and computer-based personality judgments, using a sample of 86,220 volunteers who completed a 100-item personality questionnaire. We show that (i) computer predictions based on a generic digital footprint (Facebook Likes) are more accurate (r = 0.56) than those made by the participants’ Facebook friends using a personality questionnaire (r = 0.49); (ii) computer models show higher interjudge agreement; and (iii) computer personality judgments have higher external validity when predicting life outcomes such as substance use, political attitudes, and physical health; for some outcomes, they even outperform the self-rated personality scores. Computers outpacing humans in personality judgment presents significant opportunities and challenges in the areas of psychological assessment, marketing, and privacy. PMID:25583507
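
    A minimal sketch of the kind of pipeline the abstract describes: regressing questionnaire scores on a sparse binary matrix of digital-footprint features (Likes) and scoring accuracy as a Pearson correlation. The data below are synthetic and the LASSO model choice is an assumption, not necessarily the authors' exact method:

```python
"""Sketch of computer-based personality judgment from binary footprint
features. Synthetic data; the real study used 86,220 volunteers and a
100-item questionnaire, with reported accuracies of r = 0.56 (computer)
vs. r = 0.49 (Facebook friends)."""
import numpy as np
from numpy.random import default_rng
from scipy.stats import pearsonr
from sklearn.linear_model import LassoCV

rng = default_rng(0)
n_users, n_likes = 2000, 500
likes = (rng.random((n_users, n_likes)) < 0.05).astype(float)   # sparse 0/1 matrix
true_weights = rng.normal(0, 1, n_likes) * (rng.random(n_likes) < 0.1)
trait = likes @ true_weights + rng.normal(0, 1.0, n_users)      # noisy trait score

train, test = slice(0, 1500), slice(1500, None)
model = LassoCV(cv=5).fit(likes[train], trait[train])           # model choice assumed
predicted = model.predict(likes[test])

r, _ = pearsonr(predicted, trait[test])                          # accuracy as Pearson r
print(f"computer-judgment accuracy r = {r:.2f}")
```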

  16. Development and evaluation of a computer-aided system for analyzing human error in railway operations

    International Nuclear Information System (INIS)

    Kim, Dong San; Baek, Dong Hyun; Yoon, Wan Chul

    2010-01-01

    As human error has been recognized as one of the major contributors to accidents in safety-critical systems, there has been a strong need for techniques that can analyze human error effectively. Although many techniques have been developed so far, much room for improvement remains. As human error analysis is a cognitively demanding and time-consuming task, it is particularly necessary to develop a computerized system supporting this task. This paper presents a computer-aided system for analyzing human error in railway operations, called Computer-Aided System for Human Error Analysis and Reduction (CAS-HEAR). It supports analysts in finding multiple levels of error causes and their causal relations by using predefined links between contextual factors and causal factors, as well as links between causal factors. In addition, it is based on a complete accident model; hence, it helps analysts to conduct a thorough analysis without missing any important part of human error analysis. A prototype of CAS-HEAR was evaluated by nine field investigators from six railway organizations in Korea. Its overall usefulness in human error analysis was confirmed, although development of a simplified version and some modification of the contextual factors and causal factors are required in order to ensure its practical use.

  17. Teaching the Mantle Plumes Debate

    Science.gov (United States)

    Foulger, G. R.

    2010-12-01

    There is an ongoing debate regarding whether or not mantle plumes exist. This debate has highlighted a number of issues regarding how Earth science is currently practised, and how this feeds into approaches toward teaching students. The plume model is an hypothesis, not a proven fact. And yet many researchers assume a priori that plumes exist. This assumption feeds into teaching. That the plume model is unproven, and that many practising researchers are skeptical, may be at best only mentioned in passing to students, with most teachers assuming that plumes are proven to exist. There is typically little emphasis, in particular in undergraduate teaching, that the origin of melting anomalies is currently uncertain and that scientists do not know all the answers. Little encouragement is given to students to become involved in the debate and to consider the pros and cons for themselves. Typically teachers take the approach that “an answer” (or even “the answer”) must be taught to students. Such a pedagogic approach misses an excellent opportunity to allow students to participate in an important ongoing debate in Earth sciences. It also misses the opportunity to illustrate to students several critical aspects regarding correct application of the scientific method. The scientific method involves attempting to disprove hypotheses, not to prove them. A priori assumptions should be kept uppermost in mind and reconsidered at all stages. Multiple working hypotheses should be entertained. The predictions of a hypothesis should be tested, and unpredicted observations taken as weakening the original hypothesis. Hypotheses should not be endlessly adapted to fit unexpected observations. The difficulty with pedagogic treatment of the mantle plumes debate highlights a general uncertainty about how to teach issues in Earth science that are not yet resolved with certainty. It also represents a missed opportunity to let students experience how scientific theories evolve, warts and all.

  18. Human Environmental Disease Network: A computational model to assess toxicology of contaminants.

    Science.gov (United States)

    Taboureau, Olivier; Audouze, Karine

    2017-01-01

    During the past decades, many epidemiological, toxicological and biological studies have been performed to assess the role of environmental chemicals as potential toxicants associated with diverse human disorders. However, the relationships between diseases based on chemical exposure rarely have been studied by computational biology. We developed a human environmental disease network (EDN) to explore and suggest novel disease-disease and chemical-disease relationships. The presented scored EDN model is built upon the integration of systems biology and chemical toxicology using information on chemical contaminants and their disease relationships reported in the TDDB database. The resulting human EDN takes into consideration the level of evidence of the toxicant-disease relationships, allowing inclusion of some degrees of significance in the disease-disease associations. Such a network can be used to identify uncharacterized connections between diseases. Examples are discussed for type 2 diabetes (T2D). Additionally, this computational model allows confirmation of already known links between chemicals and diseases (e.g., between bisphenol A and behavioral disorders) and also reveals unexpected associations between chemicals and diseases (e.g., between chlordane and olfactory alteration), thus predicting which chemicals may be risk factors to human health. The proposed human EDN model allows exploration of common biological mechanisms of diseases associated with chemical exposure, helping us to gain insight into disease etiology and comorbidity. This computational approach is an alternative to animal testing supporting the 3R concept.
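
    One way to read the construction described above is as a bipartite graph of toxicant-disease links projected onto diseases, where a disease-disease edge is inferred whenever two diseases share a chemical, weighted by the evidence of the shared links. The sketch below uses invented placeholder triples, not entries from the database used in the paper:

```python
"""Toy environmental-disease network (EDN): project chemical-disease
links onto a disease-disease graph. All triples and scores below are
placeholders for illustration only."""
import networkx as nx
from itertools import combinations

toxicant_disease = [  # (chemical, disease, evidence score) - invented values
    ("bisphenol A", "behavioral disorders", 0.9),
    ("bisphenol A", "type 2 diabetes", 0.6),
    ("chlordane", "olfactory alteration", 0.5),
    ("chlordane", "type 2 diabetes", 0.4),
]

B = nx.Graph()
for chem, dis, score in toxicant_disease:
    B.add_node(chem, kind="chemical")
    B.add_node(dis, kind="disease")
    B.add_edge(chem, dis, weight=score)

# Project onto diseases: connect diseases that share at least one chemical,
# accumulating the weaker of the two evidence scores per shared chemical.
edn = nx.Graph()
for chem in (n for n, d in B.nodes(data=True) if d["kind"] == "chemical"):
    for d1, d2 in combinations(B[chem], 2):
        w = min(B[chem][d1]["weight"], B[chem][d2]["weight"])
        prev = edn.get_edge_data(d1, d2, {"weight": 0})["weight"]
        edn.add_edge(d1, d2, weight=prev + w)

for d1, d2, data in edn.edges(data=True):
    print(f"{d1} -- {d2}: {data['weight']:.2f}")
```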

  19. The human dignity of the human embryo: the debate thus far; Die menswaardigheid van die menslike embrio: die debat tot dusver

    Directory of Open Access Journals (Sweden)

    J.M. Vorster

    2011-06-01

    Full Text Available The human dignity of the human embryo: the debate thus far. This article examines some recent arguments regarding the ethics of stem cell research as they are discussed in the various essays in the publication of Gruen et al. (2007), “Stem cell research: the ethical issues”. Regarding the use of human embryos in stem cell research, these essays discuss among other things the potential of the human embryo, the moral status (human dignity) of the human embryo, the creation of chimeras, the sale of oocytes and other ethical issues in modern bioethics. Eventually the article draws attention to the main ethical problems at stake to be dealt with by Christian ethics using a deontological ethical theory. Christian ethics should focus on these problems in the ongoing ethical debate regarding stem cell research.

  1. Cognitive engineering in the design of human-computer interaction and expert systems

    International Nuclear Information System (INIS)

    Salvendy, G.

    1987-01-01

    The 68 papers contributing to this book cover the following areas: Theories of Interface Design; Methodologies of Interface Design; Applications of Interface Design; Software Design; Human Factors in Speech Technology and Telecommunications; Design of Graphic Dialogues; Knowledge Acquisition for Knowledge-Based Systems; Design, Evaluation and Use of Expert Systems. This demonstrates the dual role of cognitive engineering. On the one hand, cognitive engineering is utilized to design computing systems which are compatible with human cognition and can be effectively and easily utilized by all individuals. On the other hand, cognitive engineering is utilized to transfer human cognition into the computer for the purpose of building expert systems. Two papers are of interest to INIS

  2. Human factors with nonhumans - Factors that affect computer-task performance

    Science.gov (United States)

    Washburn, David A.

    1992-01-01

    There are two general strategies that may be employed for 'doing human factors research with nonhuman animals'. First, one may use the methods of traditional human factors investigations to examine the nonhuman animal-to-machine interface. Alternatively, one might use performance by nonhuman animals as a surrogate for or model of performance by a human operator. Each of these approaches is illustrated with data in the present review. Chronic ambient noise was found to have a significant but inconsequential effect on computer-task performance by rhesus monkeys (Macaca mulatta). Additional data supported the generality of findings such as these to humans, showing that rhesus monkeys are appropriate models of human psychomotor performance. It is argued that ultimately the interface between comparative psychology and technology will depend on the coordinated use of both strategies of investigation.

  3. National debate on energy; Debat national sur les energies

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-07-01

    This document gathers the addresses presented at the national debate on energy of 18 March 2003. The full texts of the presentations by the Minister for Industry, N. Fontaine, and the Prime Minister, J.P. Raffarin, are provided. A synthesis of the answers to the following questions is also presented: understanding energy, the growth of energy demand, international consumption, the necessary changes in consumption and production modes, the environmental impact, the resources, decision making and the decision makers. (A.L.B.)

  4. The role of beliefs in lexical alignment: evidence from dialogs with humans and computers.

    Science.gov (United States)

    Branigan, Holly P; Pickering, Martin J; Pearson, Jamie; McLean, Janet F; Brown, Ash

    2011-10-01

    Five experiments examined the extent to which speakers' alignment (i.e., convergence) on words in dialog is mediated by beliefs about their interlocutor. To do this, we told participants that they were interacting with another person or a computer in a task in which they alternated between selecting pictures that matched their 'partner's' descriptions and naming pictures themselves (though in reality all responses were scripted). In both text- and speech-based dialog, participants tended to repeat their partner's choice of referring expression. However, they showed a stronger tendency to align with 'computer' than with 'human' partners, and with computers that were presented as less capable than with computers that were presented as more capable. The tendency to align therefore appears to be mediated by beliefs, with the relevant beliefs relating to an interlocutor's perceived communicative capacity. Copyright © 2011 Elsevier B.V. All rights reserved.

  5. A conceptual and computational model of moral decision making in human and artificial agents.

    Science.gov (United States)

    Wallach, Wendell; Franklin, Stan; Allen, Colin

    2010-07-01

    Recently, there has been a resurgence of interest in general, comprehensive models of human cognition. Such models aim to explain higher-order cognitive faculties, such as deliberation and planning. Given a computational representation, the validity of these models can be tested in computer simulations such as software agents or embodied robots. The push to implement computational models of this kind has created the field of artificial general intelligence (AGI). Moral decision making is arguably one of the most challenging tasks for computational approaches to higher-order cognition. The need for increasingly autonomous artificial agents to factor moral considerations into their choices and actions has given rise to another new field of inquiry variously known as Machine Morality, Machine Ethics, Roboethics, or Friendly AI. In this study, we discuss how LIDA, an AGI model of human cognition, can be adapted to model both affective and rational features of moral decision making. Using the LIDA model, we will demonstrate how moral decisions can be made in many domains using the same mechanisms that enable general decision making. Comprehensive models of human cognition typically aim for compatibility with recent research in the cognitive and neural sciences. Global workspace theory, proposed by the neuropsychologist Bernard Baars (1988), is a highly regarded model of human cognition that is currently being computationally instantiated in several software implementations. LIDA (Franklin, Baars, Ramamurthy, & Ventura, 2005) is one such computational implementation. LIDA is both a set of computational tools and an underlying model of human cognition, which provides mechanisms that are capable of explaining how an agent's selection of its next action arises from bottom-up collection of sensory data and top-down processes for making sense of its current situation. We will describe how the LIDA model helps integrate emotions into the human decision-making process, and we

  6. Human Computer Confluence in Rehabilitation: Digital Media Plasticity and Human Performance Plasticity

    DEFF Research Database (Denmark)

    Brooks, Anthony Lewis

    2013-01-01

    Digital media plasticity evocative to embodied interaction is presented as a utilitarian tool when mixed and matched to target human performance potentials specific to nuance of development for those with impairment. A distinct intervention strategy trains via alternative channeling of external s...

  7. Automatic procedure for realistic 3D finite element modelling of human brain for bioelectromagnetic computations

    International Nuclear Information System (INIS)

    Aristovich, K Y; Khan, S H

    2010-01-01

    Realistic computer modelling of biological objects requires building very accurate and realistic computer models based on geometric and material data, and on the type and accuracy of the numerical analyses. This paper presents some of the automatic tools and algorithms that were used to build an accurate and realistic 3D finite element (FE) model of the whole brain. These models were used to solve the forward problem in magnetic field tomography (MFT) based on magnetoencephalography (MEG). The forward problem involves modelling and computation of the magnetic fields produced by the human brain during cognitive processing. The geometric parameters of the model were obtained from accurate Magnetic Resonance Imaging (MRI) data, and the material properties from Diffusion Tensor MRI (DTMRI) data. The 3D FE models of the brain built using this approach have been shown to be very accurate in terms of both geometric and material properties. The model is stored on the computer in Computer-Aided Parametrical Design (CAD) format. This allows the model to be used in a wide range of methods of analysis, such as the finite element method (FEM), the Boundary Element Method (BEM), Monte Carlo simulations, etc. The generic model-building approach presented here could be used for accurate and realistic modelling of the human brain and many other biological objects.

  8. Debatable Premises in Telecom Policy

    DEFF Research Database (Denmark)

    Hurwitz, Justin (Gus); Layton, Roslyn

    2015-01-01

    Around the world, telecommunications policy is one of the most important areas of public policy. The modern economy is driven by telecom technologies, and many telecom-related firms – Google, Apple, Facebook, and myriad fixed and mobile Internet service providers – are among the largest companies in the world. The Internet is opening up new platforms for business, education, government, and civic engagement. It has literally been a driving force in toppling governments. Telecommunications policy is important to every government in the world, and debates over what policies should be implemented often rest on premises that don’t stand up well to critical analysis. This paper collects and responds to a number of these premises that, collectively, underlie much popular, political, and academic support for increased telecommunications regulation in the United States and Europe – as well as much of the rest of the world.

  9. Debating the viability of ethnicity

    Directory of Open Access Journals (Sweden)

    Vilna Bashi

    2004-01-01

    Full Text Available [First paragraph] Immigration and the Political Economy of Home: West Indian Brooklyn and American Indian Minneapolis, 1945-1992. RACHEL BUFF. Berkeley: University of California Press, 2001. xv + 240 pp. (Paper US$ 18.95) Black Cuban, Black American: A Memoir. EVELIO GRILLO. Houston TX: Arte Público Press, 2000. xvi + 134 pp. (Paper US$ 13.95) West Indian in the West: Self Representations in an Immigrant Community. PERCY C. HINTZEN. New York: New York University Press, 2001. x + 200 pp. (Paper US$ 18.50) Caribbean Families in Britain and the Transatlantic World. HARRY GOULBOURNE & MARY CHAMBERLAIN (eds.). Oxford UK: Macmillan, 2001. xvi + 270 pp. (Paper £15.50) Legacies: The Story of the Immigrant Second Generation. ALEJANDRO PORTES & RUBÉN G. RUMBAUT. Berkeley: University of California Press / New York: Russell Sage Foundation, 2001. xxiv + 406 pp. (Paper US$ 19.95) "Ethnicity" and its meaning, both as an identity and as a resilient cultural influence, has dominated late twentieth-century social scientific analyses of the process of immigrant incorporation. Perhaps we may mark the crowning of the term with the publication of Glazer and Moynihan's Beyond the Melting Pot, one famous tome that "explained" varying "assimilation" outcomes among the "new" (post-1965) newcomers by examining their ethnic culture for flaws or strengths that justified socioeconomic failure or success. Muddying the ensuing policy debate was the use of buzzwords, like mainstream, deviant, assimilated, minority, black matriarch, absent father, and underclass, that were themselves categorizing and hierarchical. The tautology of hierarchically labeling groups and then asking why groups with different labels have different outcomes seems to be perpetually invisible to the parties in the assimilation debate, but the debate itself rages on. Newer scholarship has added a different voice to that debate, arguing that variance in "assimilation" is instead explained by incorporation into…

  10. Debatable Premises in Telecom Policy

    DEFF Research Database (Denmark)

    HURWITZ, Justin; Layton, Roslyn

    2014-01-01

    Around the world, telecommunications policy is one of the most important areas of public policy. The modern economy is driven by telecom technologies, and many telecom-related firms – Google, Apple, Facebook, and myriad fixed and mobile Internet service providers – are among the largest companies in the world. The Internet is opening up new platforms for business, education, government, and civic engagement. It has literally been a driving force in toppling governments. Telecommunications policy is important to every government in the world, and debates over what policies should be implemented…

  11. The globalization debate: The skeptics

    Directory of Open Access Journals (Sweden)

    Tadić Tadija

    2006-01-01

    Full Text Available A devastating criticism of the "hard core" argumentation, stemming from skeptical authors, has strongly challenged the enthusiasm noticeable in most theoretical analyses of globalization, bringing to light many "darker sides" of the globalization phenomenon. A detailed critical re-examination of their often unrealistic assumptions has presented a very serious challenge to globalists and has made room for the rise of the so-called "great globalization debate", which has over time come to shape the mainstream of contemporary social philosophy. In this paper we look closely into the way in which skeptics mount their devastating criticism of the globalists' argumentation.

  12. National debate on energy

    International Nuclear Information System (INIS)

    2003-01-01

    This document gathers the addresses presented at the national debate on energy of 18 March 2003. The full texts of the presentations by the Minister for Industry, N. Fontaine, and the Prime Minister, J.P. Raffarin, are provided. A synthesis of the answers to the following questions is also presented: understanding energy, the growth of energy demand, international consumption, the necessary changes in consumption and production modes, the environmental impact, the resources, decision making and the decision makers. (A.L.B.)

  13. Towards passive brain-computer interfaces: applying brain-computer interface technology to human-machine systems in general.

    Science.gov (United States)

    Zander, Thorsten O; Kothe, Christian

    2011-04-01

    Cognitive monitoring is an approach utilizing realtime brain signal decoding (RBSD) for gaining information on the ongoing cognitive user state. In recent decades this approach has brought valuable insight into the cognition of an interacting human. Automated RBSD can be used to set up a brain-computer interface (BCI) providing a novel input modality for technical systems solely based on brain activity. In BCIs the user usually sends voluntary and directed commands to control the connected computer system or to communicate through it. In this paper we propose an extension of this approach by fusing BCI technology with cognitive monitoring, providing valuable information about the users' intentions, situational interpretations and emotional states to the technical system. We call this approach passive BCI. In the following we give an overview of studies which utilize passive BCI, as well as other novel types of applications resulting from BCI technology. We especially focus on applications for healthy users, and the specific requirements and demands of this user group. Since the presented approach of combining cognitive monitoring with BCI technology is very similar to the concept of BCIs itself we propose a unifying categorization of BCI-based applications, including the novel approach of passive BCI.
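
    As an illustration of the kind of realtime brain signal decoding a passive BCI rests on, the sketch below classifies a cognitive state from EEG band power without any voluntary command from the user. The signal model, band choices, labels, and classifier are all assumptions made for demonstration; a real system would use recorded EEG and calibrated state labels:

```python
"""Minimal passive-BCI style pipeline: band-power features + LDA on
synthetic 'EEG' epochs labelled with a made-up high/low-load state."""
import numpy as np
from scipy.signal import welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)
fs, n_epochs, n_samples = 250, 200, 500     # 2-second epochs at 250 Hz

def synthetic_epoch(high_load):
    """Stylized assumption: alpha amplitude drops under high load."""
    t = np.arange(n_samples) / fs
    alpha_amp = 0.5 if high_load else 1.5
    return alpha_amp * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 1, n_samples)

labels = rng.integers(0, 2, n_epochs)
epochs = np.array([synthetic_epoch(y) for y in labels])

def band_power(x, lo, hi):
    f, pxx = welch(x, fs=fs, nperseg=256)
    return pxx[(f >= lo) & (f <= hi)].mean()

features = np.column_stack([
    [band_power(e, 8, 12) for e in epochs],   # alpha band
    [band_power(e, 4, 7) for e in epochs],    # theta band
])

clf = LinearDiscriminantAnalysis().fit(features[:150], labels[:150])
print("held-out state-decoding accuracy:", clf.score(features[150:], labels[150:]))
```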

  14. Design of a computer-aided system for analyzing human error in Indonesian railways; Perancangan Computer Aided System dalam Menganalisa Human Error di Perkeretaapian Indonesia

    Directory of Open Access Journals (Sweden)

    Wiwik Budiawan

    2013-06-01

    … the occurrence of a train crash in Indonesia. However, it is not clear how this analysis technique is done. Studies of human error made by the National Transportation Safety Committee (NTSC) are still relatively limited and are not equipped with a systematic method. There are several methods that have been developed to date, but few have been developed for railway transportation. The Human Factors Analysis and Classification System (HFACS) is a human error analysis method that was developed and adapted to the Indonesian railway system. To improve the reliability of human error analysis, HFACS was then developed in the form of a web-based application that can be accessed on a computer or smartphone. The results could be used by the NTSC as a railway accident analysis method, particularly for accidents associated with human error. Keywords: human error, HFACS, CAS, railways

  15. The data base management system alternative for computing in the human services.

    Science.gov (United States)

    Sircar, S; Schkade, L L; Schoech, D

    1983-01-01

    The traditional incremental approach to computerization presents substantial problems as systems develop and grow. The Data Base Management System approach to computerization was developed to overcome the problems resulting from implementing computer applications one at a time. The authors describe the applications approach and the alternative Data Base Management System (DBMS) approach through their developmental history, discuss the technology of DBMS components, and consider the implications of choosing the DBMS alternative. Human service managers need an understanding of the DBMS alternative and its applicability to their agency data processing needs. The basis for a conscious selection of computing alternatives is outlined.
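
    The contrast between the two approaches can be made concrete: under the DBMS alternative, separate human-service functions query one shared database rather than each maintaining its own files. A minimal sketch with an in-memory SQLite database; the table and field names are invented for illustration:

```python
"""Sketch of the DBMS alternative: two 'applications' (intake, billing)
read the same shared client table instead of keeping separate files."""
import sqlite3

conn = sqlite3.connect(":memory:")  # one shared data base, not per-app files
conn.execute("""CREATE TABLE clients (
    id INTEGER PRIMARY KEY, name TEXT, program TEXT, monthly_fee REAL)""")
conn.execute("INSERT INTO clients (name, program, monthly_fee) VALUES "
             "('A. Smith', 'counseling', 40.0), ('B. Jones', 'housing', 0.0)")

# 'Intake' application: case listing per program.
for (name,) in conn.execute("SELECT name FROM clients WHERE program = 'counseling'"):
    print("intake view:", name)

# 'Billing' application: totals drawn from the same shared table.
total, = conn.execute("SELECT SUM(monthly_fee) FROM clients").fetchone()
print("billing view: total fees =", total)
```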

  16. Cross-cultural human-computer interaction and user experience design a semiotic perspective

    CERN Document Server

    Brejcha, Jan

    2015-01-01

    This book describes patterns of language and culture in human-computer interaction (HCI). Through numerous examples, it shows why these patterns matter and how to exploit them to design a better user experience (UX) with computer systems. It provides scientific information on the theoretical and practical areas of the interaction and communication design for research experts and industry practitioners and covers the latest research in semiotics and cultural studies, bringing a set of tools and methods to benefit the process of designing with the cultural background in mind.

  17. Human-computer interfaces applied to numerical solution of the Plateau problem

    Science.gov (United States)

    Elias Fabris, Antonio; Soares Bandeira, Ivana; Ramos Batista, Valério

    2015-09-01

    In this work we present Matlab code to solve the Plateau problem numerically; the code includes a human-computer interface. The Plateau problem has applications in areas of knowledge such as computer graphics. The solution method is the same as that of the Surface Evolver, but the difference is a complete graphical interface with the user. This will enable us to implement other kinds of interfaces, such as ocular mouse, voice, and touch. To date, Evolver does not include any graphical interface, which restricts its use by the scientific community. In particular, its use is practically impossible for most physically challenged people.
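
    For readers unfamiliar with the numerics, a toy relaxation in the spirit of the classical Douglas approach is sketched below: each coordinate of the spanning surface is made discretely harmonic over a square parameter grid while the boundary stays pinned to a given contour. This is a didactic stand-in, not the Surface Evolver's area-gradient method, and the example contour is invented:

```python
"""Toy Plateau-problem relaxation: componentwise harmonic surface
spanning a fixed closed contour (Jacobi iteration on a parameter grid)."""
import numpy as np

n = 41                                     # parameter grid resolution
u = np.linspace(0, 2 * np.pi, 4 * (n - 1), endpoint=False)
# Example spanning contour: a closed non-planar curve (an assumption).
contour = np.stack([np.cos(u), np.sin(u), 0.3 * np.sin(2 * u)], axis=1)

surf = np.zeros((n, n, 3))
bottom, right, top, left = np.split(contour, 4)
surf[0, :-1] = bottom                      # walk the contour around the
surf[:-1, -1] = right                      # four edges of the square
surf[-1, :0:-1] = top                      # parameter domain, keeping the
surf[:0:-1, 0] = left                      # corners consistent

for _ in range(2000):                      # Jacobi relaxation: each coordinate
    surf[1:-1, 1:-1] = 0.25 * (            # becomes discretely harmonic while
        surf[:-2, 1:-1] + surf[2:, 1:-1]   # the boundary stays fixed
        + surf[1:-1, :-2] + surf[1:-1, 2:])

print("relaxed z-range:", surf[..., 2].min(), surf[..., 2].max())
```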

  18. Cholesterol: the debate should be terminated.

    Science.gov (United States)

    Nathan, David G

    2017-07-01

    Here, I offer personal perspectives on cholesterol homeostasis that reflect my belief that certain aspects of the debate have been overstated.-Nathan, D. G. Cholesterol: the debate should be terminated. © FASEB.

  19. Is the corticomedullary index valid to distinguish human from nonhuman bones: a multislice computed tomography study.

    Science.gov (United States)

    Rérolle, Camille; Saint-Martin, Pauline; Dedouit, Fabrice; Rousseau, Hervé; Telmon, Norbert

    2013-09-10

    The first step in the identification process of bone remains is to determine whether they are of human or nonhuman origin. This issue may arise when only a fragment of bone is available, as the species of origin is usually easily determined from a complete bone. The present study aims to assess the validity of a morphometric method used by French forensic anthropologists to determine the species of origin: the corticomedullary index (CMI), defined as the ratio of the diameter of the medullary cavity to the total diameter of the bone. We studied the constancy of the CMI from measurements made on computed tomography (CT) images of different human bones, and compared our measurements with reference values selected from the literature. The measurements obtained on CT scans at three different sites of 30 human femurs, 24 tibias, and 24 fibulas were compared with one another and with the CMI reference values for humans, pigs, dogs and sheep. Our results differed significantly from these reference values, with three exceptions: the proximal quarter of the femur and mid-fibular measurements for the human CMI, and the proximal quarter of the tibia for the sheep CMI. Mid-tibial, mid-femoral, and mid-fibular measurements also differed significantly from one another. Only 22.6% of CT scans of human bones were correctly identified as human. We concluded that the CMI is not an effective method for determining the human origin of bone remains. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
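
    Although the study concludes that the index is unreliable, the quantity itself is simple to compute. A sketch follows, with hypothetical calliper readings and a placeholder reference interval (not the published values):

```python
"""Corticomedullary index (CMI) as described in the record: the ratio of
medullary-cavity diameter to total bone diameter. Measurements and the
reference interval below are invented placeholders."""

def cmi(medullary_diameter_mm: float, total_diameter_mm: float) -> float:
    """CMI = medullary cavity diameter / total external diameter."""
    if total_diameter_mm <= 0 or medullary_diameter_mm < 0:
        raise ValueError("diameters must be positive")
    return medullary_diameter_mm / total_diameter_mm

# Hypothetical mid-femoral measurements from a CT slice (assumed values).
value = cmi(medullary_diameter_mm=14.2, total_diameter_mm=27.5)

HUMAN_RANGE = (0.45, 0.55)  # placeholder interval for illustration only
verdict = "consistent with" if HUMAN_RANGE[0] <= value <= HUMAN_RANGE[1] else "outside"
print(f"CMI = {value:.2f} -> {verdict} the assumed human reference range")
```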

  20. Human Factors Principles in Design of Computer-Mediated Visualization for Robot Missions

    Energy Technology Data Exchange (ETDEWEB)

    David I Gertman; David J Bruemmer

    2008-12-01

    With increased use of robots as a resource in missions supporting countermine, improvised explosive devices (IEDs), and chemical, biological, radiological nuclear and conventional explosives (CBRNE), fully understanding the best means by which to complement the human operator’s underlying perceptual and cognitive processes could not be more important. Consistent with control and display integration practices in many other high technology computer-supported applications, current robotic design practices rely highly upon static guidelines and design heuristics that reflect the expertise and experience of the individual designer. In order to use what we know about human factors (HF) to drive human robot interaction (HRI) design, this paper reviews underlying human perception and cognition principles and shows how they were applied to a threat detection domain.

  1. Soviet debate on missile defense

    Energy Technology Data Exchange (ETDEWEB)

    Parrott, B.

    1987-04-01

    Although the Strategic Defense Initiative (SDI) is meant to cope with the danger of a Soviet nuclear attack, the recent US debate over SDI has paid surprisingly little attention to Soviet views of ballistic missile defense. Despite the existence of a substantial body of pertinent scholarship, the debate has failed to take adequate account of major changes in Soviet ballistic missile defense policy since the mid-1960s. It has also neglected the links between current Soviet military policy and broader Soviet political and economic choices. The Soviets regard SDI not as a novel undertaking to reduce the risks of nuclear war but as an extension of the geopolitical competition between the superpowers. This competition has been dominated in the 1980s, in the Soviet view, by sharply increased US assertiveness and the decline of detente. Viewing SDI as a manifestation of these general trends, Soviet decision makers find the prospect of an unregulated race in ballistic missile defenses and military space technologies deeply unsettling. The deterioration of superpower relations has raised serious doubts in Moscow about the wisdom of Soviet external policy during the 1970s and has provoked sharp internal differences over policy toward the US. Already highly suspicious of the Reagan administration, the elite is united by a general conviction that SDI is an American gambit that may ultimately undercut past Soviet strategic gains and pose a grave new threat to Soviet security. 14 references.

  2. Speech and Debate as Civic Education

    Science.gov (United States)

    Hogan, J. Michael; Kurr, Jeffrey A.; Johnson, Jeremy D.; Bergmaier, Michael J.

    2016-01-01

    In light of the U.S. Senate's designation of March 15, 2016 as "National Speech and Debate Education Day" (S. Res. 398, 2016), it only seems fitting that "Communication Education" devote a special section to the role of speech and debate in civic education. Speech and debate have been at the heart of the communication…

  3. The Power of In-Class Debates

    Science.gov (United States)

    Kennedy, Ruth R.

    2009-01-01

    The students in three sections of a class rated their knowledge and identified their view before and after each of five in-class debates. The degree of self-reported knowledge was significantly different after four of the five debates. Between 31% and 58% of participants changed their views after participating in or observing each debate. Some…

  4. 11 CFR 100.154 - Candidate debates.

    Science.gov (United States)

    2010-01-01

    11 CFR 100.154, Candidate debates (Federal Election Commission, General Scope and Definitions (2 U.S.C. 431), Exceptions to Expenditures). Funds used to defray costs incurred in staging candidate debates in...

  5. 11 CFR 100.92 - Candidate debates.

    Science.gov (United States)

    2010-01-01

    11 CFR 100.92, Candidate debates (Federal Election Commission, General Scope and Definitions (2 U.S.C. 431), Exceptions to Contributions). Funds provided to defray costs incurred in staging candidate debates...

  6. Literacy as Social Action in City Debate

    Science.gov (United States)

    Cridland-Hughes, Susan

    2012-01-01

    This study examines critical literacy and the intersections of oral, aural, written, and performative literate practices in City Debate, an afterschool program dedicated to providing debate instruction to students in a major Southeastern city. Previous research into definitions and beliefs about literacy in an urban debate program over its twenty…

  7. Computational Characterization of Exogenous MicroRNAs that Can Be Transferred into Human Circulation.

    Directory of Open Access Journals (Sweden)

    Jiang Shu

    Full Text Available MicroRNAs had long been considered to be synthesized endogenously until very recent discoveries showed that humans can absorb dietary microRNAs of animal and plant origin, while the mechanism remains unknown. Compelling evidence of microRNAs from rice, milk, and honeysuckle being transported into human blood and tissues has created great interest in the fundamental questions of which exogenous microRNAs can be transferred into human circulation, how this occurs, and whether they possibly exert functions in humans. Here we present an integrated genomics and computational analysis to study the potential deciding features of transportable microRNAs. Specifically, we analyzed all publicly available microRNAs, a total of 34,612 from 194 species, with 1,102 features derived from the microRNA sequence and structure. Through in-depth bioinformatics analysis, 8 groups of discriminative features have been used to characterize human circulating microRNAs and infer the likelihood that a microRNA will be transferred into human circulation. For example, 345 dietary microRNAs have been predicted as highly transportable candidates, of which 117 have sequences identical to their human homologs and 73 are known to be associated with exosomes. Through a milk-feeding experiment, we validated 9 cow-milk microRNAs in human plasma using microRNA-sequencing analysis, including top-ranked microRNAs such as bta-miR-487b, miR-181b, and miR-421. The implications for health-related processes are illustrated in the functional analysis. This work demonstrates that data-driven computational analysis is highly promising for studying novel molecular characteristics of transportable microRNAs while bypassing the complex mechanistic details.

  8. Computational Characterization of Exogenous MicroRNAs that Can Be Transferred into Human Circulation

    Science.gov (United States)

    Shu, Jiang; Chiang, Kevin; Zempleni, Janos; Cui, Juan

    2015-01-01

    MicroRNAs had long been considered to be synthesized endogenously until very recent discoveries showed that humans can absorb dietary microRNAs of animal and plant origin, while the mechanism remains unknown. Compelling evidence of microRNAs from rice, milk, and honeysuckle being transported into human blood and tissues has created great interest in the fundamental questions of which exogenous microRNAs can be transferred into human circulation, how this occurs, and whether they possibly exert functions in humans. Here we present an integrated genomics and computational analysis to study the potential deciding features of transportable microRNAs. Specifically, we analyzed all publicly available microRNAs, a total of 34,612 from 194 species, with 1,102 features derived from the microRNA sequence and structure. Through in-depth bioinformatics analysis, 8 groups of discriminative features have been used to characterize human circulating microRNAs and infer the likelihood that a microRNA will be transferred into human circulation. For example, 345 dietary microRNAs have been predicted as highly transportable candidates, of which 117 have sequences identical to their human homologs and 73 are known to be associated with exosomes. Through a milk-feeding experiment, we validated 9 cow-milk microRNAs in human plasma using microRNA-sequencing analysis, including top-ranked microRNAs such as bta-miR-487b, miR-181b, and miR-421. The implications for health-related processes are illustrated in the functional analysis. This work demonstrates that data-driven computational analysis is highly promising for studying novel molecular characteristics of transportable microRNAs while bypassing the complex mechanistic details. PMID:26528912
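
    In outline, the approach scores a candidate microRNA for transportability from sequence-derived descriptors. The sketch below uses a handful of toy features, synthetic labels, and a random-forest classifier, all stand-ins for the 1,102 features and the curated training sets used in the study:

```python
"""Illustrative transportability scoring of microRNAs from sequence
features. Features, training data, and the candidate sequence are all
synthetic placeholders."""
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(42)

def sequence_features(seq: str) -> list:
    """A few toy descriptors: length, GC content, and 5'-end nucleotide."""
    gc = (seq.count("G") + seq.count("C")) / len(seq)
    return [float(len(seq)), gc, float(seq.startswith("U"))]

# Synthetic training set of (sequence, circulating?) pairs - placeholders.
alphabet = np.array(list("AUGC"))
seqs = ["".join(rng.choice(alphabet, rng.integers(19, 25))) for _ in range(300)]
labels = rng.integers(0, 2, len(seqs))

X = np.array([sequence_features(s) for s in seqs])
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels)

candidate = "UGGACGUGACAUUCGUGAAAGC"  # hypothetical dietary microRNA
score = clf.predict_proba([sequence_features(candidate)])[0, 1]
print("transportability score:", score)
```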

  9. SnapAnatomy, a computer-based interactive tool for independent learning of human anatomy.

    Science.gov (United States)

    Yip, George W; Rajendran, Kanagasuntheram

    2008-06-01

    Computer-aided instruction materials are becoming increasingly popular in medical education, particularly in the teaching of human anatomy. This paper describes SnapAnatomy, a new interactive program that the authors designed for independent learning of anatomy. SnapAnatomy is primarily tailored for the beginner student to encourage the learning of anatomy by developing a three-dimensional visualization of human structure that is essential to applications in clinical practice and the understanding of function. The program allows the student to take apart and to accurately put together body components in an interactive, self-paced and variable manner to achieve the learning outcome.

  10. Issues in Sociobiology: The Nature vs. Nurture Debate.

    Science.gov (United States)

    Lorenzen, Eric

    2001-01-01

    Explains the two theories on the origins of human and animal behavior. Introduces the new discipline of sociobiology, a merging of biology and sociology. Describes the central dogma of sociobiology and its societal implications, and discusses criticism of sociobiology. Presents the nature vs. nurture debate. (YDS)

  11. Engageability: a new sub-principle of the learnability principle in human-computer interaction

    Directory of Open Access Journals (Sweden)

    B Chimbo

    2011-12-01

    Full Text Available The learnability principle relates to improving the usability of software, as well as users’ performance and productivity. A gap has been identified, as the current definition of the principle does not distinguish between users of different ages. To determine the extent of the gap, this article compares the ways in which two user groups, adults and children, learn how to use an unfamiliar software application. In doing this, we bring together the research areas of human-computer interaction (HCI), adult and child learning, learning theories and strategies, usability evaluation and interaction design. A literature survey conducted on learnability and learning processes considered the meaning of learnability of software applications across generations. In an empirical investigation, users aged from 9 to 12 and from 35 to 50 were observed in a usability laboratory while learning to use educational software applications. Insights that emerged from data analysis showed different tactics and approaches that children and adults use when learning unfamiliar software. Eye-tracking data was also recorded. Findings indicated that subtle re-interpretation of the learnability principle and its associated sub-principles was required. An additional sub-principle, namely engageability, was proposed to incorporate aspects of learnability that are not covered by the existing sub-principles. Our re-interpretation of the learnability principle and the resulting design recommendations should help designers to fulfil the varying needs of different-aged users, and improve the learnability of their designs. Keywords: Child-computer interaction, Design principles, Eye tracking, Generational differences, Human-computer interaction, Learning theories, Learnability, Engageability, Software applications, Usability. Disciplines: Human-Computer Interaction (HCI) Studies, Computer science, Observational Studies

  12. Conformational effects on the circular dichroism of Human Carbonic Anhydrase II: a multilevel computational study.

    Directory of Open Access Journals (Sweden)

    Tatyana G Karabencheva-Christova

    Full Text Available Circular Dichroism (CD) spectroscopy is a powerful method for investigating conformational changes in proteins and therefore has numerous applications in structural and molecular biology. Here a computational investigation of the CD spectrum of Human Carbonic Anhydrase II (HCAII), with main focus on the near-UV CD spectra of the wild-type enzyme and its seven tryptophan mutant forms, is presented and compared to experimental studies. Multilevel computational methods (Molecular Dynamics, Semiempirical Quantum Mechanics, Time-Dependent Density Functional Theory) were applied in order to gain insight into the mechanisms of interaction between the aromatic chromophores within the protein environment and understand how the conformational flexibility of the protein influences these mechanisms. The analysis suggests that combining semiempirical CD calculations, crystal structures and molecular dynamics (MD) could help in achieving a better agreement between the computed and experimental protein spectra and provide some unique insight into the dynamic nature of the mechanisms of chromophore interactions.

  13. Sustaining Economic Exploitation of Complex Ecosystems in Computational Models of Coupled Human-Natural Networks

    OpenAIRE

    Martinez, Neo D.; Tonin, Perrine; Bauer, Barbara; Rael, Rosalyn C.; Singh, Rahul; Yoon, Sangyuk; Yoon, Ilmi; Dunne, Jennifer A.

    2012-01-01

    Understanding ecological complexity has stymied scientists for decades. Recent elucidation of the famously coined "devious strategies for stability in enduring natural systems" has opened up a new field of computational analyses of complex ecological networks where the nonlinear dynamics of many interacting species can be more realistically modeled and understood. Here, we describe the first extension of this field to include coupled human-natural systems. This extension elucidates new strat...

  14. Computer-assisted image analysis assay of human neutrophil chemotaxis in vitro

    DEFF Research Database (Denmark)

    Jensen, P; Kharazmi, A

    1991-01-01

    We have developed a computer-based image analysis system to measure in-filter migration of human neutrophils in the Boyden chamber. This method is compared with the conventional manual counting techniques. Neutrophils from healthy individuals and from patients with reduced chemotactic activity were....... Another advantage of the assay is that it can be used to show the migration pattern of different populations of neutrophils from both healthy individuals and patients....

  15. An experimental and computational framework to build a dynamic protein atlas of human cell division

    OpenAIRE

    Kavur, Marina; Ellenberg, Jan; Peters, Jan-Michael; Ladurner, Rene; Martinic, Marina; Kueblbeck, Moritz; Nijmeijer, Bianca; Wachsmuth, Malte; Koch, Birgit; Walther, Nike; Politi, Antonio; Heriche, Jean-Karim; Hossain, M.

    2017-01-01

    Essential biological functions of human cells, such as division, require the tight coordination of the activity of hundreds of proteins in space and time. While live cell imaging is a powerful tool to study the distribution and dynamics of individual proteins after fluorescence tagging, it has not yet been used to map protein networks due to the lack of systematic and quantitative experimental and computational approaches. Using the cell and nuclear boundaries as landmarks, we generated a 4D ...

  16. US Army Weapon Systems Human-Computer Interface (WSHCI) style guide, Version 1

    Energy Technology Data Exchange (ETDEWEB)

    Avery, L.W.; O'Mara, P.A.; Shepard, A.P.

    1996-09-30

    A stated goal of the U.S. Army has been the standardization of the human-computer interfaces (HCIs) of its systems. Some of the tools being used to accomplish this standardization are HCI design guidelines and style guides. Currently, the Army is employing a number of style guides. While these style guides provide good guidance for the command, control, communications, computers, and intelligence (C4I) domain, they do not necessarily represent the unique requirements of the Army's real-time and near-real-time (RT/NRT) weapon systems. The Office of the Director of Information for Command, Control, Communications, and Computers (DISC4), in conjunction with the Weapon Systems Technical Architecture Working Group (WSTAWG), recognized this need as part of their activities to revise the Army Technical Architecture (ATA). To address this need, DISC4 tasked the Pacific Northwest National Laboratory (PNNL) to develop an HCI style guide specific to Army weapon systems. This document, the U.S. Army Weapon Systems Human-Computer Interface (WSHCI) Style Guide, represents the first version of that style guide. The purpose of this document is to provide HCI design guidance for RT/NRT Army systems across the weapon system domains of ground, aviation, missile, and soldier systems. Each domain should customize and extend this guidance by developing domain-specific style guides, which will be used to guide the development of future systems within each domain.

  17. HCIDL: Human-computer interface description language for multi-target, multimodal, plastic user interfaces

    Directory of Open Access Journals (Sweden)

    Lamia Gaouar

    2018-06-01

    Full Text Available From the human-computer interface perspective, the challenges to be faced are related to the consideration of new, multiple interactions and the diversity of devices. The large panel of interactions (touching, shaking, voice dictation, positioning…) and the diversification of interaction devices can be seen as a factor of flexibility, albeit one introducing incidental complexity. Our work is part of the field of user interface description languages. After an analysis of the scientific context of our work, this paper introduces HCIDL, a modelling language staged in a model-driven engineering approach. Among the properties related to human-computer interfaces, our proposition is intended for modelling multi-target, multimodal, plastic interaction interfaces using user interface description languages. By combining plasticity and multimodality, HCIDL improves the usability of user interfaces through adaptive behaviour, providing end-users with an interaction set adapted to the input/output of terminals and an optimum layout. Keywords: Model-driven engineering, Human-computer interface, User interface description languages, Multimodal applications, Plastic user interfaces

  18. An Efficient and Secure m-IPS Scheme of Mobile Devices for Human-Centric Computing

    Directory of Open Access Journals (Sweden)

    Young-Sik Jeong

    2014-01-01

Full Text Available Recent rapid developments in wireless and mobile IT technologies have led to their application in many real-life areas, such as disasters, home networks, mobile social networks, medical services, industry, schools, and the military. Business and work environments have become both wired and wireless, integrated with wireless networks. Although the growing use of mobile devices on wireless networks increases work efficiency and provides greater convenience, wireless access to networks represents a security threat. Currently, wireless intrusion prevention systems (IPSs) are used to prevent wireless security threats. However, these are not an ideal security measure for businesses that utilize mobile devices, because they do not take account of temporal-spatial and role information factors. Therefore, in this paper, an efficient and secure mobile IPS (m-IPS) is proposed for businesses utilizing mobile devices in mobile environments for human-centric computing. The m-IPS system incorporates temporal-spatial awareness in human-centric computing with various mobile devices and checks users' temporal-spatial information, profiles, and role information to provide precise access control. The application of m-IPS can also be extended to the Internet of Things (IoT), one of the important advanced technologies for fully supporting human-centric computing environments, bringing it to truly ubiquitous settings with mobile devices.
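
    A minimal sketch of the kind of access decision described above, combining role, zone, and time-of-day checks (Python). The policy table, roles, and zone names are hypothetical illustrations, not part of the proposed m-IPS:

      from dataclasses import dataclass
      from datetime import time

      @dataclass
      class AccessRequest:
          user_role: str
          location: str      # zone reported for the mobile device (hypothetical)
          request_time: time

      # Hypothetical policy: role -> (allowed zones, allowed time window)
      POLICY = {
          "engineer": ({"lab", "office"}, (time(8, 0), time(20, 0))),
          "visitor":  ({"lobby"},         (time(9, 0), time(17, 0))),
      }

      def allow(req: AccessRequest) -> bool:
          """Grant wireless access only if role, zone, and time all match policy."""
          if req.user_role not in POLICY:
              return False
          zones, (start, end) = POLICY[req.user_role]
          return req.location in zones and start <= req.request_time <= end

      print(allow(AccessRequest("engineer", "lab", time(10, 30))))  # True
      print(allow(AccessRequest("visitor", "lab", time(10, 30))))   # False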

  19. Canadian natural gas price debate

    International Nuclear Information System (INIS)

    Wight, G.

    1998-01-01

Sunoco Inc. is a subsidiary of Suncor Energy, one of Canada's largest integrated energy companies, with total assets of $2.8 billion. As one of the major energy suppliers in the country, Sunoco Inc. has a substantial stake in the emerging trends in the natural gas industry, including the Canadian natural gas price debate. Traditionally, natural gas prices have been determined by the number of pipeline expansions, weather, energy supply and demand, and storage levels. In addition to all these traditional factors, which still apply today, the present-day natural gas industry also has to deal with deregulation, open competition and the global energy situation, all of which also have an impact on prices. How to face up to these challenges is the subject of this discourse. tabs., figs

  20. The debate on nuclear power

    International Nuclear Information System (INIS)

    Bethe, H.A.

    1977-01-01

The need for nuclear power is pointed out. The Study Group on Nuclear Fuel Cycles of the American Physical Society has studied the problem of waste disposal in detail and has found that geological emplacement leads to safe waste disposal. The relation between nuclear power and weapons proliferation is discussed. The problem of preventing proliferation is primarily a political problem, and the availability of nuclear power will contribute little to the potential for proliferation. However, to further reduce this contribution, it may be desirable to keep fast-breeder reactors under international control and to use only converters for national reactors. The desirable converter is one which has a high conversion ratio, probably one using the thorium cycle, ²³³U, and heavy water as the moderator. The nuclear debate in the United States of America is discussed. Work in the USA on physical and technical safeguards against diversion of fissile materials is mentioned. (author)

  1. The nuclear debate in Sweden

    International Nuclear Information System (INIS)

    Sandstrom, S.

    1976-01-01

The current preoccupation with conservation among widespread factions in the Swedish populace dates back to the 1960s. Co-ordinated by a central organisation, Miljocentrum, a variety of environmental protection groups concentrated at first on such things as fluorine in drinking water, colouring matter in foodstuffs, and poisonous industrial effluents such as phosphates in detergents and mercury. In the early 1970s attention became more and more directed against nuclear energy, the arguments generally following the same lines as the U.S. debate but with some time lag. Nuclear energy has since become the focal point of environmental protest, both among the public and within parliament. Public opposition to a reprocessing plant site at Sannas may lead to a decision to opt for a 'fuel cycle centre' on a site suitable for final disposal of high-level radioactive waste. (author)

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  3. Cognitive engineering models: A prerequisite to the design of human-computer interaction in complex dynamic systems

    Science.gov (United States)

    Mitchell, Christine M.

    1993-01-01

    This chapter examines a class of human-computer interaction applications, specifically the design of human-computer interaction for the operators of complex systems. Such systems include space systems (e.g., manned systems such as the Shuttle or space station, and unmanned systems such as NASA scientific satellites), aviation systems (e.g., the flight deck of 'glass cockpit' airplanes or air traffic control) and industrial systems (e.g., power plants, telephone networks, and sophisticated, e.g., 'lights out,' manufacturing facilities). The main body of human-computer interaction (HCI) research complements but does not directly address the primary issues involved in human-computer interaction design for operators of complex systems. Interfaces to complex systems are somewhat special. The 'user' in such systems - i.e., the human operator responsible for safe and effective system operation - is highly skilled, someone who in human-machine systems engineering is sometimes characterized as 'well trained, well motivated'. The 'job' or task context is paramount and, thus, human-computer interaction is subordinate to human job interaction. The design of human interaction with complex systems, i.e., the design of human job interaction, is sometimes called cognitive engineering.

  4. [Geomagnetic storm decreases coherence of electric oscillations of human brain while working at the computer].

    Science.gov (United States)

    Novik, O B; Smirnov, F A

    2013-01-01

The effect of geomagnetic storms at the latitude of Moscow on the electric oscillations of the human cerebral cortex was studied. Electroencephalogram measurements showed that when volunteers aged 18-23 years performed tasks at a computer during a moderate magnetic storm, or no later than 24 hours after it, the value of the coherence function of electric oscillations of the human brain in the frontal and occipital areas in the range of 4.0-7.9 Hz (the so-called theta rhythm of the human brain) decreased by a factor of two or more, sometimes reaching zero, although arterial blood pressure, respiratory rate and the electrocardiogram registered during the electroencephalogram measurements remained within standard values.
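
    The coherence function referenced above can be computed with standard signal-processing tools. A minimal sketch (Python), assuming synthetic stand-ins for the frontal and occipital channels and a 256 Hz sampling rate:

      import numpy as np
      from scipy.signal import coherence

      fs = 256                          # sampling rate in Hz (assumed)
      t = np.arange(0, 60, 1 / fs)      # 60 s of data
      rng = np.random.default_rng(0)
      theta = np.sin(2 * np.pi * 6.0 * t)            # shared 6 Hz component
      frontal = theta + 0.5 * rng.standard_normal(t.size)
      occipital = theta + 0.5 * rng.standard_normal(t.size)

      # Magnitude-squared coherence, then averaged over the theta band
      f, cxy = coherence(frontal, occipital, fs=fs, nperseg=1024)
      band = (f >= 4.0) & (f <= 7.9)
      print(f"mean theta-band coherence: {cxy[band].mean():.2f}")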

  5. Simulation-based computation of dose to humans in radiological environments

    Energy Technology Data Exchange (ETDEWEB)

    Breazeal, N.L. [Sandia National Labs., Livermore, CA (United States); Davis, K.R.; Watson, R.A. [Sandia National Labs., Albuquerque, NM (United States); Vickers, D.S. [Brigham Young Univ., Provo, UT (United States). Dept. of Electrical and Computer Engineering; Ford, M.S. [Battelle Pantex, Amarillo, TX (United States). Dept. of Radiation Safety

    1996-03-01

    The Radiological Environment Modeling System (REMS) quantifies dose to humans working in radiological environments using the IGRIP (Interactive Graphical Robot Instruction Program) and Deneb/ERGO simulation software. These commercially available products are augmented with custom C code to provide radiation exposure information to, and collect radiation dose information from, workcell simulations. Through the use of any radiation transport code or measured data, a radiation exposure input database may be formulated. User-specified IGRIP simulations utilize these databases to compute and accumulate dose to programmable human models operating around radiation sources. Timing, distances, shielding, and human activity may be modeled accurately in the simulations. The accumulated dose is recorded in output files, and the user is able to process and view this output. The entire REMS capability can be operated from a single graphical user interface.
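
    A minimal sketch (Python) of the dose-accumulation step such a simulation performs, assuming a hypothetical unshielded point source with inverse-square fall-off; REMS itself draws its exposure databases from a transport code or measured data:

      import numpy as np

      SOURCE = np.array([0.0, 0.0, 1.0])   # source position in m (hypothetical)
      GAMMA = 0.5                          # dose-rate constant, mSv*m^2/h (hypothetical)

      def dose_rate(pos):
          """Dose rate (mSv/h) at a point, for an inverse-square unshielded source."""
          r2 = np.sum((pos - SOURCE) ** 2)
          return GAMMA / r2

      # Timed trajectory of the human model: (position, dwell time in hours)
      trajectory = [(np.array([2.0, 0.0, 1.0]), 0.25),
                    (np.array([1.0, 0.0, 1.0]), 0.10),
                    (np.array([3.0, 1.0, 1.0]), 0.50)]

      accumulated = sum(dose_rate(p) * dt for p, dt in trajectory)
      print(f"accumulated dose: {accumulated:.3f} mSv")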

  6. Computational Thermodynamics Analysis of Vaporizing Fuel Droplets in the Human Upper Airways

    Science.gov (United States)

    Zhang, Zhe; Kleinstreuer, Clement

    The detailed knowledge of air flow structures as well as particle transport and deposition in the human lung for typical inhalation flow rates is an important precursor for dosimetry-and-health-effect studies of toxic particles as well as for targeted drug delivery of therapeutic aerosols. Focusing on highly toxic JP-8 fuel aerosols, 3-D airflow and fluid-particle thermodynamics in a human upper airway model starting from mouth to Generation G3 (G0 is the trachea) are simulated using a user-enhanced and experimentally validated finite-volume code. The temperature distributions and their effects on airflow structures, fuel vapor deposition and droplet motion/evaporation are discussed. The computational results show that the thermal effect on vapor deposition is minor, but it may greatly affect droplet deposition in human airways.

  7. Simulation-based computation of dose to humans in radiological environments

    International Nuclear Information System (INIS)

    Breazeal, N.L.; Davis, K.R.; Watson, R.A.; Vickers, D.S.; Ford, M.S.

    1996-03-01

    The Radiological Environment Modeling System (REMS) quantifies dose to humans working in radiological environments using the IGRIP (Interactive Graphical Robot Instruction Program) and Deneb/ERGO simulation software. These commercially available products are augmented with custom C code to provide radiation exposure information to, and collect radiation dose information from, workcell simulations. Through the use of any radiation transport code or measured data, a radiation exposure input database may be formulated. User-specified IGRIP simulations utilize these databases to compute and accumulate dose to programmable human models operating around radiation sources. Timing, distances, shielding, and human activity may be modeled accurately in the simulations. The accumulated dose is recorded in output files, and the user is able to process and view this output. The entire REMS capability can be operated from a single graphical user interface

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  9. COMPUTING

    CERN Multimedia

    P. McBride

The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS-specific tests to be included in Service Availa...

  10. COMPUTING

    CERN Multimedia

    M. Kasemann

Overview During the past three months activities were focused on data operations, on testing and re-enforcing shift and operational procedures for data production and transfer, on MC production, and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  11. The use of computers to teach human anatomy and physiology to allied health and nursing students

    Science.gov (United States)

    Bergeron, Valerie J.

Educational institutions are under tremendous pressure to adopt the newest technologies in order to prepare their students to meet the challenges of the twenty-first century. For the last twenty years huge amounts of money have been spent on computers, printers, software, multimedia projection equipment, and so forth. A reasonable question is, "Has it worked?" Has this infusion of resources, financial as well as human, resulted in improved learning? Are the students meeting the intended learning goals? Any attempt to develop answers to these questions should include examining the intended goals and exploring the effects of the changes on students and faculty. This project investigated the impact of a specific application of a computer program in a community college setting on students' attitudes and understanding of human anatomy and physiology. In this investigation two sites of the same community college, seven miles apart and with seemingly similar student populations, used different laboratory activities to teach human anatomy and physiology. At one site nursing students were taught using traditional dissections and laboratory activities; at the other site two of the dissections, specifically cat and sheep pluck, were replaced with the A.D.A.M.RTM (Animated Dissection of Anatomy for Medicine) computer program. Analysis of the attitude data indicated that students at both sites were extremely positive about their laboratory experiences. Analysis of the content data indicated a statistically significant difference in performance between the two sites in two of the eight content areas that were studied. For both topics the students using the computer program scored higher. A detailed analysis of the surveys, interviews with faculty and students, examination of laboratory materials, observations of laboratory facilities at both sites, and cost-benefit analysis led to the development of seven recommendations. The recommendations call for action at the level of the

  12. Computational Controversy

    OpenAIRE

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

Climate change, vaccination, abortion, Trump: many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have appeared, using new data sources such as Wikipedia, which now help us better understand these phenomena. However, compared to what social sciences have discovered about such debates, the existing computati...

  13. Computational study of depth completion consistent with human bi-stable perception for ambiguous figures.

    Science.gov (United States)

    Mitsukura, Eiichi; Satoh, Shunji

    2018-03-01

We propose a computational model that is consistent with human perception of depth in "ambiguous regions," in which no binocular disparity exists. Results obtained from our model reveal a new characteristic of depth perception. Random dot stereograms (RDS) are often used as examples because they provide sufficient disparity for depth calculation. A simple question confronts us: "How can we estimate the depth of a no-texture image region, such as one on white paper?" In such ambiguous regions, mathematical solutions related to binocular disparities are not unique or are indefinite. We examine a mathematical description of depth completion that is consistent with human perception of depth for ambiguous regions. Using computer simulation, we demonstrate that the resultant depth maps qualitatively reproduce human depth perception of two kinds. The resultant depth maps produced using our model depend on the initial depth in the ambiguous region. Considering this dependence from psychological viewpoints, we conjecture that humans perceive completed surfaces that are affected by prior stimuli corresponding to the initial condition of depth. We conducted psychological experiments to verify the model prediction. An ambiguous stimulus was presented after a prior stimulus removed ambiguity. An inter-stimulus interval (ISI) was inserted between the prior stimulus and the post-stimulus. Results show that the correlation of perception between the prior stimulus and the post-stimulus depends on the ISI duration. The correlation is positive, negative, and nearly zero in the respective cases of short (0-200 ms), medium (200-400 ms), and long (>400 ms) ISIs. Furthermore, building on our model, we propose a computational model that can explain this dependence. Copyright © 2017 Elsevier Ltd. All rights reserved.
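
    As a rough illustration of initial-condition-dependent depth completion (a 1-D toy smoother, not the authors' model), the sketch below (Python) diffuses depth inward from disparity-defined boundaries with a weak attachment to the initial depth, so different priors yield different completed surfaces:

      import numpy as np

      def complete_depth(b_left, b_right, init, n=50, mu=0.05, iters=5000):
          """Relax toward smoothness with a weak data term pulling toward `init`."""
          d = np.full(n, float(init))
          d[0], d[-1] = b_left, b_right          # depths fixed by binocular disparity
          for _ in range(iters):
              d[1:-1] = (0.5 * (d[:-2] + d[2:]) + mu * init) / (1.0 + mu)
          return d

      near = complete_depth(1.0, 1.0, init=0.0)  # prior stimulus biased "near"
      far = complete_depth(1.0, 1.0, init=2.0)   # prior stimulus biased "far"
      print(f"mid-region depth: near prior {near[25]:.2f}, far prior {far[25]:.2f}")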

  14. Computing the influences of different Intraocular Pressures on the human eye components using computational fluid-structure interaction model.

    Science.gov (United States)

    Karimi, Alireza; Razaghi, Reza; Navidbakhsh, Mahdi; Sera, Toshihiro; Kudo, Susumu

    2017-01-01

Intraocular Pressure (IOP) is defined as the pressure of the aqueous humour in the eye. It has been reported among ophthalmologists that the normal range of IOP should be within 10-20 mmHg, with an average of 15.50 mmHg. Keratoconus is a non-inflammatory eye disorder in which the debilitated cornea is unable to preserve its normal structure against the IOP in the eye. Consequently, the cornea bulges outward and assumes a conical shape, followed by distorted vision. In addition, it is known that any alterations in the structure and composition of the lens and cornea would induce changes in the eye globe as well as in the mechanical and optical properties of the eye. Understanding the precise alteration of the eye components' stresses and deformations due to different IOPs could help elucidate etiology and pathogenesis and support the development of treatments, not only for keratoconus but also for other diseases of the eye. In this study, at three different IOPs of 10, 20, and 30 mmHg, the stresses and deformations of the human eye components were quantified using a three-dimensional (3D) computational Fluid-Structure Interaction (FSI) model of the human eye. The results revealed the highest von Mises stress in the bulged region of the cornea, 245 kPa at the IOP of 30 mmHg. The lens also showed a von Mises stress of 19.38 kPa at the IOP of 30 mmHg. In addition, by increasing the IOP from 10 to 30 mmHg, the radius of curvature of the cornea and lens increased accordingly. In contrast, the sclera showed its highest stress at the IOP of 10 mmHg due to an overpressure phenomenon. The variation of IOP had little influence on the stress in, and the resultant displacement of, the optic nerve. These results can be used for understanding the stresses and deformations in the human eye components under different IOPs, and for clarifying the significant role of IOP in the radius of curvature of the cornea and the lens.
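
    The von Mises stresses reported above are derived from the local stress tensor by the standard formula; a minimal sketch (Python) applied to one made-up corneal stress state in kPa:

      import numpy as np

      def von_mises(sigma):
          """Von Mises equivalent stress from a symmetric 3x3 stress tensor."""
          s = sigma - np.trace(sigma) / 3.0 * np.eye(3)   # deviatoric part
          return np.sqrt(1.5 * np.sum(s * s))

      stress_kpa = np.array([[240.0,  10.0, 0.0],     # illustrative values only
                             [ 10.0, -20.0, 0.0],
                             [  0.0,   0.0, 5.0]])
      print(f"von Mises stress: {von_mises(stress_kpa):.1f} kPa")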

  15. Histomorphometric quantification of human pathological bones from synchrotron radiation 3D computed microtomography

    International Nuclear Information System (INIS)

    Nogueira, Liebert P.; Braz, Delson

    2011-01-01

Conventional bone histomorphometry is an important method for the quantitative evaluation of bone microstructure. X-ray computed microtomography is a noninvasive technique that can be used to evaluate histomorphometric indices of trabecular bone (BV/TV, BS/BV, Tb.N, Tb.Th, Tb.Sp). In this technique, the output 3D images are used to quantify the whole sample, unlike the conventional method, in which quantification is performed on 2D slices and extrapolated to the 3D case. In this work, histomorphometric quantification using synchrotron 3D X-ray computed microtomography was performed on pathological samples of human bone. Samples of human bones were cut into small blocks (8 mm x 8 mm x 10 mm) with a precision saw and then imaged. The computed microtomographies were obtained at the SYRMEP (Synchrotron Radiation for MEdical Physics) beamline at the ELETTRA synchrotron radiation facility (Italy). The obtained 3D images yielded excellent resolution and details of intra-trabecular bone structures, including marrow present inside trabeculae. The histomorphometric quantification was also compared to the literature. (author)
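
    Two of the listed indices can be computed directly from a segmented microtomography volume. A minimal sketch (Python) on a synthetic binary array, using a crude face-counting surface estimate; dedicated tools use better estimators such as marching cubes:

      import numpy as np

      def bv_tv(vol):
          """Bone volume fraction BV/TV of a binary 3D array (1 = bone)."""
          return vol.sum() / vol.size

      def bs_bv(vol, voxel=1.0):
          """Bone surface to bone volume ratio BS/BV for voxel edge length `voxel`."""
          faces = 0
          for axis in range(3):
              a = np.swapaxes(vol, 0, axis)
              faces += np.sum(a[1:] != a[:-1])    # interior bone/background faces
          return (faces * voxel ** 2) / (vol.sum() * voxel ** 3)

      rng = np.random.default_rng(1)
      vol = (rng.random((64, 64, 64)) < 0.3).astype(np.uint8)  # synthetic "bone"
      print(f"BV/TV = {bv_tv(vol):.3f}, BS/BV = {bs_bv(vol, voxel=0.01):.1f} 1/mm")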

  16. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. (Figure 3: number of events per month, data.) In LS1, our emphasis is on increasing the efficiency and flexibility of the infrastructure and operations. Computing Operations is working on separating disk and tape at the Tier-1 sites and on the full implementation of the xrootd federation ...

  17. Computational fluid dynamics modeling of Bacillus anthracis spore deposition in rabbit and human respiratory airways

    Energy Technology Data Exchange (ETDEWEB)

    Kabilan, S.; Suffield, S. R.; Recknagle, K. P.; Jacob, R. E.; Einstein, D. R.; Kuprat, A. P.; Carson, J. P.; Colby, S. M.; Saunders, J. H.; Hines, S. A.; Teeguarden, J. G.; Straub, T. M.; Moe, M.; Taft, S. C.; Corley, R. A.

    2016-09-01

    Three-dimensional computational fluid dynamics and Lagrangian particle deposition models were developed to compare the deposition of aerosolized Bacillus anthracis spores in the respiratory airways of a human with that of the rabbit, a species commonly used in the study of anthrax disease. The respiratory airway geometries for each species were derived respectively from computed tomography (CT) and µCT images. Both models encompassed airways that extended from the external nose to the lung with a total of 272 outlets in the human model and 2878 outlets in the rabbit model. All simulations of spore deposition were conducted under transient, inhalation–exhalation breathing conditions using average species-specific minute volumes. Two different exposure scenarios were modeled in the rabbit based upon experimental inhalation studies. For comparison, human simulations were conducted at the highest exposure concentration used during the rabbit experimental exposures. Results demonstrated that regional spore deposition patterns were sensitive to airway geometry and ventilation profiles. Due to the complex airway geometries in the rabbit nose, higher spore deposition efficiency was predicted in the nasal sinus compared to the human at the same air concentration of anthrax spores. In contrast, higher spore deposition was predicted in the lower conducting airways of the human compared to the rabbit lung due to differences in airway branching pattern. This information can be used to refine published and ongoing biokinetic models of inhalation anthrax spore exposures, which currently estimate deposited spore concentrations based solely upon exposure concentrations and inhaled doses that do not factor in species-specific anatomy and physiology for deposition.

  18. Computational Fluid Dynamics Modeling of Bacillus anthracis Spore Deposition in Rabbit and Human Respiratory Airways

    Energy Technology Data Exchange (ETDEWEB)

    Kabilan, Senthil; Suffield, Sarah R.; Recknagle, Kurtis P.; Jacob, Rick E.; Einstein, Daniel R.; Kuprat, Andrew P.; Carson, James P.; Colby, Sean M.; Saunders, James H.; Hines, Stephanie; Teeguarden, Justin G.; Straub, Tim M.; Moe, M.; Taft, Sarah; Corley, Richard A.

    2016-09-30

Three-dimensional computational fluid dynamics and Lagrangian particle deposition models were developed to compare the deposition of aerosolized Bacillus anthracis spores in the respiratory airways of a human with that of the rabbit, a species commonly used in the study of anthrax disease. The respiratory airway geometries for each species were derived from computed tomography (CT) or µCT images. Both models encompassed airways that extended from the external nose to the lung with a total of 272 outlets in the human model and 2878 outlets in the rabbit model. All simulations of spore deposition were conducted under transient, inhalation-exhalation breathing conditions using average species-specific minute volumes. The highest exposure concentration was modeled in the rabbit based upon prior acute inhalation studies. For comparison, a human simulation was also conducted at the same concentration. Results demonstrated that regional spore deposition patterns were sensitive to airway geometry and ventilation profiles. Due to the complex airway geometries in the rabbit nose, higher spore deposition efficiency was predicted in the upper conducting airways compared to the human at the same air concentration of anthrax spores. As a result, higher particle deposition was predicted in the conducting airways and deep lung of the human compared to the rabbit lung due to differences in airway branching pattern. This information can be used to refine published and ongoing biokinetic models of inhalation anthrax spore exposures, which currently estimate deposited spore concentrations based solely upon exposure concentrations and inhaled doses that do not factor in species-specific anatomy and physiology.

  19. A Chinese Visible Human-based computational female pelvic phantom for radiation dosimetry simulation

    International Nuclear Information System (INIS)

    Nan, H.; Jinlu, S.; Shaoxiang, Z.; Qing, H.; Li-wen, T.; Chengjun, G.; Tang, X.; Jiang, S. B.; Xiano-lin, Z.

    2010-01-01

An accurate voxel phantom is needed for dosimetric simulation in radiation therapy for malignant tumors in the female pelvic region. However, most existing voxel phantoms are constructed on the basis of Caucasian or other non-Chinese populations. Materials and Methods: A computational framework for constructing a female pelvic voxel phantom for radiation dosimetry was developed based on the Chinese Visible Human datasets. First, several organs within the pelvic region were segmented from the Chinese Visible Human datasets. Then, polygonization and voxelization were performed on the segmented organs, and a 3D computational phantom was built in the form of a set of voxel arrays. Results: The generated phantom can be converted and loaded into a treatment planning system for radiation dosimetry calculation. From the observed dosimetric results for those organs and structures, we can evaluate their absorbed dose and implement simulation studies. Conclusion: A voxel female pelvic phantom was developed from the Chinese Visible Human datasets. It can be utilized for dosimetry evaluation and planning simulation, which would be very helpful for improving clinical performance and reducing radiation toxicity to organs at risk.

  20. Computer graphics of SEM images facilitate recognition of chromosome position in isolated human metaphase plates.

    Science.gov (United States)

    Hodge, L D; Barrett, J M; Welter, D A

    1995-04-01

There is general agreement that at the time of mitosis chromosomes occupy precise positions and that these positions likely affect subsequent nuclear function in interphase. However, before such ideas can be investigated in human cells, it is necessary first to determine the precise position of each chromosome with regard to its neighbors. It has occurred to us that stereo images of isolated metaphase plates, produced by scanning electron microscopy, could form the basis whereby these positions could be ascertained. In this paper we describe a computer graphic technique that permits us to keep track of individual chromosomes in a metaphase plate and to compare chromosome positions in different metaphase plates. Moreover, the computer graphics provide permanent, easily manipulated, rapid recall of stored chromosome profiles. These advantages are demonstrated by a comparison of the relative positions of group A-specific and groups D- and G-specific chromosomes with the full complement of chromosomes in metaphase plates isolated from a nearly triploid human-derived cell (HeLa S3) and from a hypodiploid human fetal lung cell.

  1. Foundations for Reasoning in Cognition-Based Computational Representations of Human Decision Making; TOPICAL

    International Nuclear Information System (INIS)

    SENGLAUB, MICHAEL E.; HARRIS, DAVID L.; RAYBOURN, ELAINE M.

    2001-01-01

    In exploring the question of how humans reason in ambiguous situations or in the absence of complete information, we stumbled onto a body of knowledge that addresses issues beyond the original scope of our effort. We have begun to understand the importance that philosophy, in particular the work of C. S. Peirce, plays in developing models of human cognition and of information theory in general. We have a foundation that can serve as a basis for further studies in cognition and decision making. Peircean philosophy provides a foundation for understanding human reasoning and capturing behavioral characteristics of decision makers due to cultural, physiological, and psychological effects. The present paper describes this philosophical approach to understanding the underpinnings of human reasoning. We present the work of C. S. Peirce, and define sets of fundamental reasoning behavior that would be captured in the mathematical constructs of these newer technologies and would be able to interact in an agent type framework. Further, we propose the adoption of a hybrid reasoning model based on his work for future computational representations or emulations of human cognition

  2. Direct estimation of human trabecular bone stiffness using cone beam computed tomography.

    Science.gov (United States)

    Klintström, Eva; Klintström, Benjamin; Pahr, Dieter; Brismar, Torkel B; Smedby, Örjan; Moreno, Rodrigo

    2018-04-10

    The aim of this study was to evaluate the possibility of estimating the biomechanical properties of trabecular bone through finite element simulations by using dental cone beam computed tomography data. Fourteen human radius specimens were scanned in 3 cone beam computed tomography devices: 3-D Accuitomo 80 (J. Morita MFG., Kyoto, Japan), NewTom 5 G (QR Verona, Verona, Italy), and Verity (Planmed, Helsinki, Finland). The imaging data were segmented by using 2 different methods. Stiffness (Young modulus), shear moduli, and the size and shape of the stiffness tensor were studied. Corresponding evaluations by using micro-CT were regarded as the reference standard. The 3-D Accuitomo 80 (J. Morita MFG., Kyoto, Japan) showed good performance in estimating stiffness and shear moduli but was sensitive to the choice of segmentation method. NewTom 5 G (QR Verona, Verona, Italy) and Verity (Planmed, Helsinki, Finland) yielded good correlations, but they were not as strong as Accuitomo 80 (J. Morita MFG., Kyoto, Japan). The cone beam computed tomography devices overestimated both stiffness and shear compared with the micro-CT estimations. Finite element-based calculations of biomechanics from cone beam computed tomography data are feasible, with strong correlations for the Accuitomo 80 scanner (J. Morita MFG., Kyoto, Japan) combined with an appropriate segmentation method. Such measurements might be useful for predicting implant survival by in vivo estimations of bone properties. Copyright © 2018 Elsevier Inc. All rights reserved.

  3. The heterogeneity of mental representation: Ending the imagery debate.

    Science.gov (United States)

    Pearson, Joel; Kosslyn, Stephen M

    2015-08-18

    The possible ways that information can be represented mentally have been discussed often over the past thousand years. However, this issue could not be addressed rigorously until late in the 20th century. Initial empirical findings spurred a debate about the heterogeneity of mental representation: Is all information stored in propositional, language-like, symbolic internal representations, or can humans use at least two different types of representations (and possibly many more)? Here, in historical context, we describe recent evidence that humans do not always rely on propositional internal representations but, instead, can also rely on at least one other format: depictive representation. We propose that the debate should now move on to characterizing all of the different forms of human mental representation.

  4. 3D virtual human atria: A computational platform for studying clinical atrial fibrillation.

    Science.gov (United States)

    Aslanidi, Oleg V; Colman, Michael A; Stott, Jonathan; Dobrzynski, Halina; Boyett, Mark R; Holden, Arun V; Zhang, Henggui

    2011-10-01

    Despite a vast amount of experimental and clinical data on the underlying ionic, cellular and tissue substrates, the mechanisms of common atrial arrhythmias (such as atrial fibrillation, AF) arising from the functional interactions at the whole atria level remain unclear. Computational modelling provides a quantitative framework for integrating such multi-scale data and understanding the arrhythmogenic behaviour that emerges from the collective spatio-temporal dynamics in all parts of the heart. In this study, we have developed a multi-scale hierarchy of biophysically detailed computational models for the human atria--the 3D virtual human atria. Primarily, diffusion tensor MRI reconstruction of the tissue geometry and fibre orientation in the human sinoatrial node (SAN) and surrounding atrial muscle was integrated into the 3D model of the whole atria dissected from the Visible Human dataset. The anatomical models were combined with the heterogeneous atrial action potential (AP) models, and used to simulate the AP conduction in the human atria under various conditions: SAN pacemaking and atrial activation in the normal rhythm, break-down of regular AP wave-fronts during rapid atrial pacing, and the genesis of multiple re-entrant wavelets characteristic of AF. Contributions of different properties of the tissue to mechanisms of the normal rhythm and arrhythmogenesis were investigated. Primarily, the simulations showed that tissue heterogeneity caused the break-down of the normal AP wave-fronts at rapid pacing rates, which initiated a pair of re-entrant spiral waves; and tissue anisotropy resulted in a further break-down of the spiral waves into multiple meandering wavelets characteristic of AF. The 3D virtual atria model itself was incorporated into the torso model to simulate the body surface ECG patterns in the normal and arrhythmic conditions. Therefore, a state-of-the-art computational platform has been developed, which can be used for studying multi

  5. Using Noninvasive Wearable Computers to Recognize Human Emotions from Physiological Signals

    Directory of Open Access Journals (Sweden)

    Nasoz Fatma

    2004-01-01

Full Text Available We discuss the strong relationship between affect and cognition and the importance of emotions in multimodal human-computer interaction (HCI) and user modeling. We introduce the overall paradigm for our multimodal system, which aims at recognizing its users' emotions and responding to them accordingly depending upon the current context or application. We then describe the design of the emotion elicitation experiment we conducted by collecting, via wearable computers, physiological signals from the autonomic nervous system (galvanic skin response, heart rate, temperature) and mapping them to certain emotions (sadness, anger, fear, surprise, frustration, and amusement). We show the results of three different supervised learning algorithms that categorize these collected signals in terms of emotions and generalize their learning to recognize emotions from new collections of signals. We finally discuss the possible broader impact and potential applications of emotion recognition for multimodal intelligent systems.
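
    A minimal sketch (Python) of the supervised-learning step on synthetic data, with k-nearest neighbours standing in for one of the three algorithms (not specified here); the feature columns mimic the (galvanic skin response, heart rate, temperature) triple:

      import numpy as np
      from sklearn.model_selection import train_test_split
      from sklearn.neighbors import KNeighborsClassifier

      rng = np.random.default_rng(42)
      EMOTIONS = ["sadness", "anger", "fear", "surprise", "frustration", "amusement"]

      # Fake dataset: 120 samples x 3 features, loosely separated by emotion
      X = rng.normal(size=(120, 3)) + np.repeat(np.arange(6), 20)[:, None] * 0.8
      y = np.repeat(EMOTIONS, 20)

      X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
      clf = KNeighborsClassifier(n_neighbors=5).fit(X_tr, y_tr)
      print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")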

  6. Investigation on human serum albumin and Gum Tragacanth interactions using experimental and computational methods.

    Science.gov (United States)

    Moradi, Sajad; Taran, Mojtaba; Shahlaei, Mohsen

    2018-02-01

A study of the interaction between human serum albumin and Gum Tragacanth, a biodegradable biopolymer, has been undertaken. For this purpose, several experimental and computational methods were used. Thermodynamic parameters and the mode of interaction were investigated using fluorescence spectroscopy at 300 and 310 K. Fourier-transform infrared spectroscopy and synchronous fluorescence spectroscopy were also performed. To give detailed insight into the possible interactions, docking and molecular dynamics simulations were also applied. The results show that the interaction is based on hydrogen bonding and van der Waals forces. Structural analysis indicates no adverse change in protein conformation upon binding of GT. Furthermore, the computational methods confirm evidence of secondary-structure enhancement of the protein in the presence of Gum Tragacanth. Copyright © 2017 Elsevier B.V. All rights reserved.

  7. Computational drug design strategies applied to the modelling of human immunodeficiency virus-1 reverse transcriptase inhibitors

    Directory of Open Access Journals (Sweden)

    Lucianna Helene Santos

    2015-11-01

Full Text Available Reverse transcriptase (RT) is a multifunctional enzyme in the human immunodeficiency virus (HIV-1) life cycle and represents a primary target for drug discovery efforts against HIV-1 infection. Two classes of RT inhibitors, the nucleoside RT inhibitors (NRTIs) and the non-nucleoside RT inhibitors, are prominently used in highly active antiretroviral therapy in combination with other anti-HIV drugs. However, the rapid emergence of drug-resistant viral strains has limited the success rate of the anti-HIV agents. Computational methods are a significant part of the drug design process and indispensable for studying drug resistance. In this review, recent advances in computer-aided drug design for the rational design of new compounds against HIV-1 RT using methods such as molecular docking, molecular dynamics, free energy calculations, quantitative structure-activity relationships, pharmacophore modelling and absorption, distribution, metabolism, excretion and toxicity prediction are discussed. Successful applications of these methodologies are also highlighted.

  8. Single-photon emission computed tomography in human immunodeficiency virus encephalopathy: A preliminary report

    International Nuclear Information System (INIS)

    Masdeu, J.C.; Yudd, A.; Van Heertum, R.L.; Grundman, M.; Hriso, E.; O'Connell, R.A.; Luck, D.; Camli, U.; King, L.N.

    1991-01-01

    Depression or psychosis in a previously asymptomatic individual infected with the human immunodeficiency virus (HIV) may be psychogenic, related to brain involvement by the HIV or both. Although prognosis and treatment differ depending on etiology, computed tomography (CT) and magnetic resonance imaging (MRI) are usually unrevealing in early HIV encephalopathy and therefore cannot differentiate it from psychogenic conditions. Thirty of 32 patients (94%) with HIV encephalopathy had single-photon emission computed tomography (SPECT) findings that differed from the findings in 15 patients with non-HIV psychoses and 6 controls. SPECT showed multifocal cortical and subcortical areas of hypoperfusion. In 4 cases, cognitive improvement after 6-8 weeks of zidovudine (AZT) therapy was reflected in amelioration of SPECT findings. CT remained unchanged. SPECT may be a useful technique for the evaluation of HIV encephalopathy

  9. U.S. Army weapon systems human-computer interface style guide. Version 2

    Energy Technology Data Exchange (ETDEWEB)

Avery, L.W.; O'Mara, P.A.; Shepard, A.P.; Donohoo, D.T.

    1997-12-31

A stated goal of the US Army has been the standardization of the human-computer interfaces (HCIs) of its systems. Some of the tools being used to accomplish this standardization are HCI design guidelines and style guides. Currently, the Army is employing a number of HCI design guidance documents. While these style guides provide good guidance for the command, control, communications, computers, and intelligence (C4I) domain, they do not necessarily represent the more unique requirements of the Army's real-time and near-real-time (RT/NRT) weapon systems. The Office of the Director of Information for Command, Control, Communications, and Computers (DISC4), in conjunction with the Weapon Systems Technical Architecture Working Group (WSTAWG), recognized this need as part of their activities to revise the Army Technical Architecture (ATA), now termed the Joint Technical Architecture-Army (JTA-A). To address this need, DISC4 tasked the Pacific Northwest National Laboratory (PNNL) to develop an Army weapon-systems-unique HCI style guide, which resulted in the US Army Weapon Systems Human-Computer Interface (WSHCI) Style Guide Version 1. Based on feedback from the user community, DISC4 further tasked PNNL to revise Version 1 and publish Version 2. The intent was to update some of the research and incorporate some enhancements. This document provides that revision. The purpose of this document is to provide HCI design guidance for the RT/NRT Army system domain across the weapon systems subdomains of ground, aviation, missile, and soldier systems. Each subdomain should customize and extend this guidance by developing domain-specific style guides, which will be used to guide the development of future systems within their subdomains.

  10. Computational Strategy for Quantifying Human Pesticide Exposure based upon a Saliva Measurement

    Directory of Open Access Journals (Sweden)

    Charles eTimchalk

    2015-05-01

Full Text Available Quantitative exposure data are important for evaluating toxicity risk, and biomonitoring is a critical tool for evaluating human exposure. Direct personal monitoring provides the most accurate estimation of a subject's true dose, and non-invasive methods are advocated for quantifying exposure to xenobiotics. In this regard, there is a need to identify chemicals that are cleared in saliva at concentrations that can be quantified to support the implementation of this approach. This manuscript reviews the computational modeling approaches that are coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics, and provides additional insight on species-dependent differences in partitioning that are of key importance for extrapolation. The primary mechanisms by which xenobiotics leave the blood and enter saliva involve paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of xenobiotics transferred by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals from plasma to saliva has been computationally modeled using compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of the Schmitt algorithm, which calculates partitioning based upon tissue composition, pH, chemical pKa and plasma protein binding. Sensitivity analysis identified that both protein binding and pKa (for weak acids and bases) have significant impact on partitioning, with species-dependent differences arising from physiological variance. Future strategies are focused on an in vitro salivary acinar cell based system to experimentally determine and computationally predict salivary gland uptake and clearance of xenobiotics. It is envisioned that a combination of salivary biomonitoring and computational modeling will enable the non-invasive measurement of chemical exposures in human
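
    A minimal sketch (Python) of a plasma:saliva partitioning estimate for a weak acid, combining the Henderson-Hasselbalch relation with plasma protein binding; this simplifies the Schmitt-type tissue-composition approach cited above, and the example pKa and unbound fraction are hypothetical:

      def unionized_fraction_acid(ph, pka):
          """Fraction of a weak acid in the neutral (diffusible) form at a given pH."""
          return 1.0 / (1.0 + 10.0 ** (ph - pka))

      def saliva_plasma_ratio(pka, fu_plasma, ph_plasma=7.4, ph_saliva=6.5):
          """Steady-state saliva/plasma ratio: only unbound, neutral drug crosses."""
          return fu_plasma * (unionized_fraction_acid(ph_plasma, pka)
                              / unionized_fraction_acid(ph_saliva, pka))

      # Hypothetical chemical: pKa 4.5, 10% unbound in plasma
      print(f"saliva/plasma ratio: {saliva_plasma_ratio(4.5, 0.10):.4f}")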

  11. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping up, and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, which was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger, at roughly 11 MB per event of RAW. The central collisions are more complex and...

  12. COMPUTING

    CERN Multimedia

M. Kasemann, P. McBride; edited by M-C. Sawley with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini, M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  13. Real-time non-invasive eyetracking and gaze-point determination for human-computer interaction and biomedicine

    Science.gov (United States)

    Talukder, Ashit; Morookian, John-Michael; Monacos, S.; Lam, R.; Lebaw, C.; Bond, A.

    2004-01-01

    Eyetracking is one of the latest technologies that has shown potential in several areas including human-computer interaction for people with and without disabilities, and for noninvasive monitoring, detection, and even diagnosis of physiological and neurological problems in individuals.

  14. Human-Computer Interaction Handbook Fundamentals, Evolving Technologies, and Emerging Applications

    CERN Document Server

    Jacko, Julie A

    2012-01-01

    The third edition of a groundbreaking reference, The Human--Computer Interaction Handbook: Fundamentals, Evolving Technologies, and Emerging Applications raises the bar for handbooks in this field. It is the largest, most complete compilation of HCI theories, principles, advances, case studies, and more that exist within a single volume. The book captures the current and emerging sub-disciplines within HCI related to research, development, and practice that continue to advance at an astonishing rate. It features cutting-edge advances to the scientific knowledge base as well as visionary perspe

  15. Proceedings of the SIGCHI Conference on Human Factors in Computing Systems

    DEFF Research Database (Denmark)

The selection process required a tremendous amount of work from all areas of the human-computer interaction community. As co-chairs of the process, we are amazed at the ability of the community to organize itself to accomplish this task. We would like to thank the 2680 individual reviewers for their careful consideration of these papers. We also deeply appreciate the huge amount of time donated to this process by the 211-member program committee, who paid their own way to attend the face-to-face program committee meeting, an event larger than the average ACM conference. We are proud of the work of the CHI 2013 program committee and hope...

  16. Advances in Human-Computer Interaction: Graphics and Animation Components for Interface Design

    Science.gov (United States)

    Cipolla Ficarra, Francisco V.; Nicol, Emma; Cipolla-Ficarra, Miguel; Richardson, Lucy

We present an analysis of a communicability methodology for graphics and animation components in interface design, called CAN (Communicability, Acceptability and Novelty). This methodology was developed between 2005 and 2010, obtaining excellent results in cultural heritage, education and microcomputing contexts, in studies where there is a bi-directional interrelation between ergonomics, usability, user-centered design, software quality and human-computer interaction. We also present heuristic results on iconography and layout design in blogs and websites from the following countries: Spain, Italy, Portugal and France.

  17. Observation of human tissue with phase-contrast x-ray computed tomography

    Science.gov (United States)

    Momose, Atsushi; Takeda, Tohoru; Itai, Yuji; Tu, Jinhong; Hirano, Keiichi

    1999-05-01

Human tissues obtained from cancerous kidneys fixed in formalin were observed with phase-contrast X-ray computed tomography (CT) using 17.7-keV synchrotron X-rays. By measuring the distributions of the X-ray phase shift caused by the samples using an X-ray interferometer, sectional images that map the distribution of the refractive index were reconstructed. Because of the high sensitivity of phase-contrast X-ray CT, a cancerous lesion was differentiated from normal tissue and a variety of other structures were revealed without the need for staining.

  18. Machine takeover the growing threat to human freedom in a computer-controlled society

    CERN Document Server

    George, Frank Honywill

    1977-01-01

Machine Takeover: The Growing Threat to Human Freedom in a Computer-Controlled Society discusses the implications of technological advancement. The title identifies the changes in society that no one is aware of, along with what these changes entail. The text first covers information science, particularly the aspect of an automated system for information processing. Next, the selection deals with the social implications of information science, such as information pollution. The text also tackles concerns about the utilization of technology to manipulate the lives of people without th

  19. South African sign language human-computer interface in the context of the national accessibility portal

    CSIR Research Space (South Africa)

    Olivrin, GJ

    2006-02-01

Full Text Available (... for example, between a deaf person who can sign and an able person, or a person with a different disability, who cannot sign.) METHODOLOGY: A signing avatar is set up to work together with a chatterbot. The chatterbot is a natural language dialogue interface ... Replies are then offered in sign language as they are interpreted by a signing avatar, a living character that can reproduce human-like gestures and expressions. To make South African Sign Language (SASL) available digitally, computational models of the language ...

  20. Abortion: taking the debate seriously

    Directory of Open Access Journals (Sweden)

    Miguel Hugo Kottow Lang

    2015-05-01

Full Text Available Voluntarily induced abortion has persisted throughout history as a prevalent practice shrouded in obscurity and clandestinity, because all extramarital conception has been socially rejected. Since the mid-20th century, an attitude of tolerance has emerged, leading to the decriminalization and legalization of abortion under two legal models: the indications model, known as therapeutic abortion, adopted in conservative nations; and the term-limit model, which allows a woman to request an abortion within the first trimester of pregnancy. The liberalization of abortion follows the invariable social policy of eliminating clandestine practice and its harmful effects, in order to educate, to dissuade and, eventually, to provide abortion as a safe and accessible medical service within legally established frameworks, all of these being regulations aimed at reducing the incidence of procured abortion. The bill decriminalizing abortion presented to the Chilean Parliament follows the indications model, with indications framed so restrictively that they fail to meet the three objectives that should guide such legislation: (1) to provide a legal framework for the practice of abortion; (2) to contribute to social peace; (3) to solve the public health problem of clandestine abortion. It is urgent to open the debate to more decisive alternatives, in line with the general tendency to prefer the term-limit model, which includes respect for the woman's decision.

  1. A structural approach to constructing perspective efficient and reliable human-computer interfaces

    International Nuclear Information System (INIS)

    Balint, L.

    1989-01-01

    The principles of human-computer interface (HCI) realizations are investigated with the aim of getting closer to a general framework and thus to a more or less solid background for constructing perspective efficient, reliable and cost-effective human-computer interfaces. On the basis of characterizing and classifying the different HCI solutions, the fundamental problems of interface construction are pointed out, especially with respect to the possibilities of human error occurrence. The evolution of HCI realizations is illustrated by summarizing the main properties of past, present and foreseeable future interface generations. HCI modeling is pointed out to be a crucial problem in theoretical and practical investigations. Suggestions are presented concerning HCI structure (hierarchy and modularity), HCI functional dynamics (mapping from input to output information), minimization of system failures caused by human error (error tolerance, error recovery and error correction), as well as cost-effective HCI design and realization methodology (universal and application-oriented vs. application-specific solutions). The concept of RISC-based and SCAMP-type HCI components is introduced with the aim of having a reduced interaction scheme in communication and a well-defined architecture in the internal structure of HCI components. HCI efficiency and reliability are dealt with by taking into account complexity and flexibility. The application of fast computerized prototyping is also briefly investigated as an experimental means of achieving simple, parametrized, invariant HCI models. Finally, a concise outline of an approach to constructing ideal HCIs is suggested, emphasizing the open questions and the need for future work related to the proposals. (author). 14 refs, 6 figs

  2. Human errors and violations in computer and information security: the viewpoint of network administrators and security specialists.

    Science.gov (United States)

    Kraemer, Sara; Carayon, Pascale

    2007-03-01

    This paper describes human errors and violations of end users and network administrators in computer and information security. This information is summarized in a conceptual framework for examining the human and organizational factors contributing to computer and information security. This framework includes human error taxonomies to describe the work conditions that contribute adversely to computer and information security, i.e. to security vulnerabilities and breaches. The issue of human error and violation in computer and information security was explored through a series of 16 interviews with network administrators and security specialists. The interviews were audio-taped, transcribed, and analyzed by coding specific themes in a node structure. The result is an expanded framework that classifies types of human error and identifies specific human and organizational factors that contribute to computer and information security. Network administrators tended to view errors created by end users as more intentional than unintentional, while viewing errors created by network administrators as more unintentional than intentional. Organizational factors, such as communication, security culture, policy, and organizational structure, were the most frequently cited factors associated with computer and information security.

  3. Human spaceflight and space adaptations: Computational simulation of gravitational unloading on the spine

    Science.gov (United States)

    Townsend, Molly T.; Sarigul-Klijn, Nesrin

    2018-04-01

    Living in reduced gravitational environments for a prolonged duration, such as a fly-by mission to Mars or an extended stay at the International Space Station, affects the human body, in particular the spine. As the spine adapts to spaceflight, morphological and physiological changes cause the mechanical integrity of the spinal column to be compromised, potentially endangering internal organs, nervous health, and human body mechanical function. Therefore, a high-fidelity computational model and simulation of the whole human spine was created and validated for the purpose of investigating the mechanical integrity of the spine in crew members during exploratory space missions. A spaceflight-exposed spine has been developed through the adaptation of a three-dimensional nonlinear finite element model, with the updated Lagrangian formulation, of a healthy ground-based human spine in vivo. Simulation of the porohyperelastic response of the intervertebral disc to mechanical unloading resulted in a model capable of accurately predicting spinal swelling/lengthening, spinal motion, and internal stress distribution. The curvature of this space-adaptation-exposed spine model was compared to a control terrestrial-based finite element model, indicating how the shape changed. Finally, potential injury sites for crew members are predicted for a typical 9-day mission.

  4. Impact of familiarity on information complexity in human-computer interfaces

    Directory of Open Access Journals (Sweden)

    Bakaev Maxim

    2016-01-01

    Full Text Available A quantitative measure of information complexity remains very much desirable in the HCI field, since it may aid in the optimization of user interfaces, especially in human-computer systems for controlling complex objects. Our paper is dedicated to the exploration of the subjective (subject-dependent) aspect of complexity, conceptualized as information familiarity. Although research on familiarity in human cognition and behaviour has been done in several fields, the accepted models in HCI, such as the Model Human Processor or the Hick-Hyman law, do not generally consider this issue. In our experimental study, the subjects performed search and selection of digits and letters, whose familiarity was conceptualized as frequency of occurrence in numbers and texts. The analysis showed a significant effect of information familiarity on selection time and throughput in regression models, although the R2 values were somewhat low. Still, we hope that our results might aid in the quantification of information complexity and its further application for optimizing interaction in human-machine systems.
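
    To make the regression idea above concrete, here is a minimal sketch that fits selection time against choice entropy (the Hick-Hyman law) plus a familiarity term derived from frequency of occurrence. The data, coefficient values and the exact form of the familiarity term are invented assumptions for illustration, not the study's model or measurements.

    ```python
    import numpy as np

    # Hick-Hyman law: selection time T grows with the information content
    # H = log2(N + 1) of an N-alternative choice. A familiarity effect can be
    # added as an extra regressor, e.g. -log(frequency) of the target symbol.
    rng = np.random.default_rng(0)
    n_alternatives = np.array([2, 4, 8, 16, 32])
    frequency = rng.uniform(0.01, 0.2, size=5)   # hypothetical occurrence rates

    H = np.log2(n_alternatives + 1.0)            # choice entropy (bits)
    F = -np.log(frequency)                       # "unfamiliarity" of the item
    T = 0.2 + 0.15 * H + 0.05 * F + rng.normal(0, 0.01, 5)  # simulated times (s)

    # Least-squares fit of T ~ a + b*H + c*F
    X = np.column_stack([np.ones_like(H), H, F])
    coef, *_ = np.linalg.lstsq(X, T, rcond=None)
    print("a=%.3f  b=%.3f s/bit  c=%.3f" % tuple(coef))
    ```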

  5. COMPUTING

    CERN Multimedia

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  7. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  8. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  9. COMPUTING

    CERN Multimedia

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  11. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  12. Using the Electrocorticographic Speech Network to Control a Brain-Computer Interface in Humans

    Science.gov (United States)

    Leuthardt, Eric C.; Gaona, Charles; Sharma, Mohit; Szrama, Nicholas; Roland, Jarod; Freudenberg, Zac; Solis, Jamie; Breshears, Jonathan; Schalk, Gerwin

    2013-01-01

    Electrocorticography (ECoG) has emerged as a new signal platform for brain-computer interface (BCI) systems. Classically, the cortical physiology that has been commonly investigated and utilized for device control in humans has been brain signals from sensorimotor cortex. Hence, it was unknown whether other neurophysiological substrates, such as the speech network, could be used to further improve on or complement existing motor-based control paradigms. We demonstrate here for the first time that ECoG signals associated with different overt and imagined phoneme articulation can enable invasively monitored human patients to control a one-dimensional computer cursor rapidly and accurately. This phonetic content was distinguishable within higher gamma frequency oscillations and enabled users to achieve final target accuracies between 68 and 91% within 15 minutes. Additionally, one of the patients achieved robust control using recordings from a microarray consisting of 1 mm spaced microwires. These findings suggest that the cortical network associated with speech could provide an additional cognitive and physiologic substrate for BCI operation and that these signals can be acquired from a cortical array that is small and minimally invasive. PMID:21471638
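
    The control scheme described above amounts to turning band-limited cortical power into a cursor command. The sketch below extracts high-gamma (70-110 Hz) power from one ECoG channel and maps it to a one-dimensional cursor velocity; the sampling rate, band edges, baseline scheme and control law are assumptions for illustration, not the authors' decoder.

    ```python
    import numpy as np
    from scipy.signal import welch

    FS = 1000  # assumed ECoG sampling rate (Hz)

    def high_gamma_power(segment, fs=FS, band=(70.0, 110.0)):
        """Mean power spectral density of one segment inside the band."""
        freqs, psd = welch(segment, fs=fs, nperseg=256)
        mask = (freqs >= band[0]) & (freqs <= band[1])
        return psd[mask].mean()

    def cursor_velocity(segment, baseline, gain=5.0):
        """Power above baseline moves the cursor one way, below the other."""
        return gain * (high_gamma_power(segment) - baseline)

    # usage with fake 300 ms windows
    rng = np.random.default_rng(1)
    baseline = high_gamma_power(rng.normal(size=300))
    print(cursor_velocity(1.5 * rng.normal(size=300), baseline))
    ```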

  13. Effects of muscle fatigue on the usability of a myoelectric human-computer interface.

    Science.gov (United States)

    Barszap, Alexander G; Skavhaug, Ida-Maria; Joshi, Sanjay S

    2016-10-01

    Electromyography-based human-computer interface development is an active field of research. However, knowledge on the effects of muscle fatigue for specific devices is limited. We have developed a novel myoelectric human-computer interface in which subjects continuously navigate a cursor to targets by manipulating a single surface electromyography (sEMG) signal. Two-dimensional control is achieved through simultaneous adjustments of power in two frequency bands through a series of dynamic low-level muscle contractions. Here, we investigate the potential effects of muscle fatigue during the use of our interface. In the first session, eight subjects completed 300 cursor-to-target trials without breaks; four using a wrist muscle and four using a head muscle. The wrist subjects returned for a second session in which a static fatiguing exercise took place at regular intervals in-between cursor-to-target trials. In the first session we observed no declines in performance as a function of use, even after the long period of use. In the second session, we observed clear changes in cursor trajectories, paired with a target-specific decrease in hit rates. Copyright © 2016 Elsevier B.V. All rights reserved.
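
    The two-band control idea above can be sketched in a few lines: estimate power in a low and a high sEMG frequency band from each analysis window, and let the two powers drive the x and y cursor coordinates. Band edges, window length and gains are assumptions, not the authors' calibration.

    ```python
    import numpy as np
    from scipy.signal import welch

    FS = 2000  # assumed sEMG sampling rate (Hz)

    def band_power(x, band, fs=FS):
        f, psd = welch(x, fs=fs, nperseg=512)
        sel = (f >= band[0]) & (f < band[1])
        return psd[sel].mean()

    def cursor_step(emg_window, gains=(40.0, 40.0)):
        """Return (dx, dy) for one window via two band powers."""
        px = band_power(emg_window, (60.0, 120.0))    # hypothetical "x" band
        py = band_power(emg_window, (120.0, 250.0))   # hypothetical "y" band
        return gains[0] * px, gains[1] * py

    # usage with one fake half-second window
    window = np.random.default_rng(2).normal(size=FS // 2)
    print(cursor_step(window))
    ```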

  14. Eye Tracking Based Control System for Natural Human-Computer Interaction

    Directory of Open Access Journals (Sweden)

    Xuebai Zhang

    2017-01-01

    Full Text Available Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disability. In order to improve the reliability, mobility, and usability of the eye tracking technique in user-computer dialogue, a novel eye control system integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode by using only the user's eyes. The usage flow of the proposed system is designed to follow human natural habits. Additionally, a magnifier module is proposed to allow accurate operation. In the experiment, two interactive tasks of different difficulty (searching an article and browsing multimedia web pages) were done to compare the proposed eye control tool with an existing system. The Technology Acceptance Model (TAM) measures are used to evaluate the perceived effectiveness of our system. It is demonstrated that the proposed system is very effective with regard to usability and interface design.

  15. Eye Tracking Based Control System for Natural Human-Computer Interaction.

    Science.gov (United States)

    Zhang, Xuebai; Liu, Xiaolong; Yuan, Shyan-Ming; Lin, Shu-Fan

    2017-01-01

    Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disability. In order to improve the reliability, mobility, and usability of the eye tracking technique in user-computer dialogue, a novel eye control system integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode by using only the user's eyes. The usage flow of the proposed system is designed to follow human natural habits. Additionally, a magnifier module is proposed to allow accurate operation. In the experiment, two interactive tasks of different difficulty (searching an article and browsing multimedia web pages) were done to compare the proposed eye control tool with an existing system. The Technology Acceptance Model (TAM) measures are used to evaluate the perceived effectiveness of our system. It is demonstrated that the proposed system is very effective with regard to usability and interface design.
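
    Eye-controlled pointing systems like the one in the two records above typically replace the mouse click with a dwell criterion: if the gaze stays within a small radius for long enough, a click fires. The sketch below is a generic dwell-to-click loop with invented radius and dwell values; it is not the selection mechanism of the cited system.

    ```python
    import time

    DWELL_S = 0.8     # seconds of stable gaze that trigger a click (assumed)
    RADIUS_PX = 30    # gaze jitter tolerance in pixels (assumed)

    class DwellClicker:
        def __init__(self):
            self.anchor = None   # (x, y) where the current dwell started
            self.t0 = None

        def update(self, x, y, now=None):
            """Feed one gaze sample; return True when a dwell-click fires."""
            now = time.monotonic() if now is None else now
            moved = (self.anchor is None or
                     (x - self.anchor[0]) ** 2 + (y - self.anchor[1]) ** 2
                     > RADIUS_PX ** 2)
            if moved:
                self.anchor, self.t0 = (x, y), now   # gaze moved: restart dwell
                return False
            if now - self.t0 >= DWELL_S:
                self.anchor, self.t0 = None, None    # fire once, then re-arm
                return True
            return False

    # usage: steady gaze at (100, 100) sampled every 0.1 s clicks at 0.8 s
    dc = DwellClicker()
    clicks = [dc.update(100, 100, now=0.1 * i) for i in range(12)]
    print(clicks.index(True) * 0.1, "s to first click")
    ```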

  16. Computational strategy for quantifying human pesticide exposure based upon a saliva measurement

    Energy Technology Data Exchange (ETDEWEB)

    Timchalk, Charles; Weber, Thomas J.; Smith, Jordan N.

    2015-05-27

    The National Research Council of the National Academies report, Toxicity Testing in the 21st Century: A Vision and Strategy, highlighted the importance of quantitative exposure data for evaluating human toxicity risk and noted that biomonitoring is a critical tool for quantitatively evaluating exposure from both environmental and occupational settings. Direct measurement of chemical exposures using personal monitoring provides the most accurate estimation of a subject's true exposure, and non-invasive methods have also been advocated for quantifying the pharmacokinetics and bioavailability of drugs and xenobiotics. In this regard, there is a need to identify chemicals that are readily cleared in saliva at concentrations that can be quantified to support the implementation of this approach. The current manuscript describes the use of computational modeling approaches that are closely coupled to in vivo and in vitro experiments to predict salivary uptake and clearance of xenobiotics. The primary mechanism by which xenobiotics leave the blood and enter saliva is thought to involve paracellular transport, passive transcellular diffusion, or transcellular active transport, with the majority of drugs and xenobiotics cleared from plasma into saliva by passive diffusion. The transcellular or paracellular diffusion of unbound chemicals in plasma to saliva has been computationally modeled using a combination of compartmental and physiologically based approaches. Of key importance for determining the plasma:saliva partitioning was the utilization of a modified Schmitt algorithm that calculates partitioning based upon the tissue composition, pH, chemical pKa and plasma protein-binding. Sensitivity analysis of key model parameters specifically identified that both protein-binding and pKa (for weak acids and bases) had the most significant impact on the determination of partitioning and that there were clear species-dependent differences based upon physiological variance between
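
    The pKa/protein-binding dependence described above can be illustrated with the classical pH-partition form for a weak acid, in which only the unbound, un-ionized fraction equilibrates between plasma and saliva. This is a textbook (Rasmussen-type) approximation offered purely as an illustration; it is not the modified Schmitt algorithm the authors used.

    ```python
    def sp_ratio_weak_acid(pka, ph_saliva=6.8, ph_plasma=7.4,
                           fu_plasma=0.1, fu_saliva=1.0):
        """Saliva-to-plasma concentration ratio for a weak acid.

        fu_* are assumed unbound fractions in each fluid.
        """
        ionization_s = 1.0 + 10.0 ** (ph_saliva - pka)
        ionization_p = 1.0 + 10.0 ** (ph_plasma - pka)
        return (ionization_s / ionization_p) * (fu_plasma / fu_saliva)

    # Example: a weak acid with pKa 4.5 that is 90% plasma-protein bound
    print(round(sp_ratio_weak_acid(4.5), 4))  # small S/P: ionized and bound
    ```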

  17. Rhetorical Legitimacy, and the Presidential Debates.

    Science.gov (United States)

    Lucaites, John Louis

    1989-01-01

    Explores the negative popular reaction to the 1988 Presidential Debates. Examines how these events function as ritualistic enactments, thus providing rhetorical legitimacy for the electoral process. Suggests how the 1988 debates failed to satisfy that function. (MM)

  18. Media Nihilism and the Presidential Debates.

    Science.gov (United States)

    Hogan, J. Michael

    1989-01-01

    Discusses the function of media nihilism--the rhetoric of "crisis and failure"--in the 1988 Presidential Debates. Examines journalists' debate questions, noting that they painted an almost wholly negative portrait of America. Suggests that the candidate who effectively "skewers" the media on its own hypocrisy should be declared…

  19. The debate on international revitalisation of labour

    DEFF Research Database (Denmark)

    Søborg, Henrik

    Globalisation has sparked off a new debate on international labour and trade unions in different disciplines such as industrrial relations, labour history, sociology and geography......Globalisation has sparked off a new debate on international labour and trade unions in different disciplines such as industrrial relations, labour history, sociology and geography...

  20. Using Debates to Teach Information Ethics

    Science.gov (United States)

    Peace, A. Graham

    2011-01-01

    This experience report details the use of debates in a course on Information Ethics. Formal debates have been used in academia for centuries and create an environment in which students must think critically, communicate well and, above all, synthesize and evaluate the relevant classroom material. They also provide a break from the standard…

  1. Debate: a strategy for teaching critical thinking.

    Science.gov (United States)

    Bell, E A

    1991-01-01

    Nurses in advanced practice require high-level critical thinking skills. Two elements of critical thinking are discovery and justification. The process of justification is focused on argumentation skills. Using the debate process to analyze, critique, and construct arguments may be an effective teaching-learning technique. Suggestions for the use of debate in graduate nursing curricula are included.

  2. Advanced information access to parliamentary debates

    NARCIS (Netherlands)

    Marx, M.

    2009-01-01

    Parliamentary debates are highly structured transcripts of meetings of politicians in parliament. These debates are an important part of the cultural heritage of many countries; they are often free of copy-right; citizens often have a legal right to inspect them; and several countries make great

  3. Advanced Information Access to Parliamentary Debates

    NARCIS (Netherlands)

    Marx, M.

    2009-01-01

    Parliamentary debates are highly structured transcripts of meetings of politicians in parliament. These debates are an important part of the cultural heritage of many countries; they are often free of copy-right; citizens often have a legal right to inspect them; and several countries make great

  4. Rhinology Future Debates, an EUFOREA Report

    NARCIS (Netherlands)

    Fokkens, W. J.; Bachert, C.; Bernal-Sprekelsen, M.; Bousquet, J.; Djandji, M.; Dorenbaum, A.; Hakimi-Mehr, D.; Hendry, S.; Hopkins, C.; Leunig, A.; Mannent, L.; Mucha, D.; Onerci, M.; Pugin, B.; Toppila-Salmi, S.; Rowe, P.; Seys, S. F.; Stimson, S.; Strzembosz, A.; Hellings, P. W.

    2017-01-01

    The first Rhinology Future Debates was held in Brussels in December 2016, organized by EUFOREA (European Forum for Research and Education in Allergy and Airways diseases). The purpose of these debates is to bring novel developments in the field of Rhinology to the attention of the medical,

  5. Leagues Revive Debate in City Schools

    Science.gov (United States)

    Keller, Bess

    2008-01-01

    This article describes how the National Association for Urban Debate Leagues is reviving debate competitions among high school students in city schools. Starting in Atlanta in 1985 and boosted by seed money from the billionaire George Soros' Open Society Institute, urban educators and their supporters in 2002 formed the National Association for…

  6. Debate Revives Old Arguments on HPV Vaccine

    Science.gov (United States)

    Shah, Nirvi

    2011-01-01

    The author reports on a Republican presidential debate which revives the contention over requiring middle school girls to be vaccinated against the virus that causes cervical cancer. At the September 12 debate, U.S. Representative Michele Bachmann, of Minnesota, and Rick Santorum, a former U.S. senator from Pennsylvania, attacked Texas Governor…

  7. Distributed and grid computing projects with research focus in human health.

    Science.gov (United States)

    Diomidous, Marianna; Zikos, Dimitrios

    2012-01-01

    Distributed systems and grid computing systems are used to connect several computers to obtain a higher level of performance in order to solve a problem. During the last decade, projects have used the World Wide Web to aggregate individuals' CPU power for research purposes. This paper presents the existing active large-scale distributed and grid computing projects with a research focus on human health. Eleven active projects with more than 2,000 Processing Units (PUs) each were found and are presented. The research focus for most of them is molecular biology, specifically understanding or predicting protein structure through simulation, comparing proteins, genomic analysis for disease-provoking genes and drug design. Though not always explicitly stated, common target diseases include HIV, dengue, Duchenne muscular dystrophy, Parkinson's disease, various types of cancer and influenza; others include malaria, anthrax and Alzheimer's disease. The need for national initiatives and European collaboration for larger-scale projects is stressed, to raise citizens' awareness and participation and so create a culture of internet volunteering altruism.

  8. Multi-step EMG Classification Algorithm for Human-Computer Interaction

    Science.gov (United States)

    Ren, Peng; Barreto, Armando; Adjouadi, Malek

    A three-electrode human-computer interaction system, based on digital processing of the Electromyogram (EMG) signal, is presented. This system can effectively help disabled individuals paralyzed from the neck down to interact with computers or communicate with people through computers using point-and-click graphic interfaces. The three electrodes are placed on the right frontalis, the left temporalis and the right temporalis muscles in the head, respectively. The signal processing algorithm used translates the EMG signals during five kinds of facial movements (left jaw clenching, right jaw clenching, eyebrows up, eyebrows down, simultaneous left & right jaw clenching) into five corresponding types of cursor movements (left, right, up, down and left-click), to provide basic mouse control. The classification strategy is based on three principles: the EMG energy of one channel is typically larger than the others during one specific muscle contraction; the spectral characteristics of the EMG signals produced by the frontalis and temporalis muscles during different movements are different; the EMG signals from adjacent channels typically have correlated energy profiles. The algorithm is evaluated on 20 pre-recorded EMG signal sets, using Matlab simulations. The results show that this method provides improvements and is more robust than other previous approaches.
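
    The three classification principles listed above translate almost directly into code. The sketch below picks the dominant-energy channel, uses a coarse spectral feature to separate movements on that channel, and treats comparable energy across all channels as the simultaneous-clench (left-click) case; every threshold here is invented for illustration and would need calibration against real EMG.

    ```python
    import numpy as np

    def classify(window, fs=1000):
        """window: array of shape (3, n) = [frontalis, left temporalis,
        right temporalis] EMG samples for one analysis frame."""
        energy = (window ** 2).sum(axis=1)
        if energy.min() > 0.6 * energy.max():     # all channels comparably active
            return "left-click"                   # simultaneous L+R jaw clench
        ch = int(np.argmax(energy))               # dominant-energy channel
        spectrum = np.abs(np.fft.rfft(window[ch])) ** 2
        freqs = np.fft.rfftfreq(window.shape[1], d=1.0 / fs)
        centroid = (freqs * spectrum).sum() / spectrum.sum()
        if ch == 0:                               # frontalis: eyebrow moves
            return "up" if centroid < 90.0 else "down"   # hypothetical split
        return "left" if ch == 1 else "right"     # temporalis: jaw clenches

    # usage with a fake frame in which the left temporalis dominates
    rng = np.random.default_rng(3)
    frame = np.vstack([g * rng.normal(size=500) for g in (0.2, 1.0, 0.3)])
    print(classify(frame))   # -> "left"
    ```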

  9. FDTD computation of human eye exposure to ultra-wideband electromagnetic pulses

    Energy Technology Data Exchange (ETDEWEB)

    Simicevic, Neven [Center for Applied Physics Studies, Louisiana Tech University, Ruston, LA 71272 (United States)], E-mail: neven@phys.latech.edu

    2008-03-21

    With an increase in the application of ultra-wideband (UWB) electromagnetic pulses in the communications industry, radar, biotechnology and medicine, comes an interest in UWB exposure safety standards. Despite an increase of the scientific research on bioeffects of exposure to non-ionizing UWB pulses, characterization of those effects is far from complete. A numerical computational approach, such as a finite-difference time domain (FDTD) method, is required to visualize and understand the complexity of broadband electromagnetic interactions. The FDTD method has almost no limits in the description of the geometrical and dispersive properties of the simulated material, it is numerically robust and appropriate for current computer technology. In this paper, a complete calculation of exposure of the human eye to UWB electromagnetic pulses in the frequency range of 3.1-10.6, 22-29 and 57-64 GHz is performed. Computation in this frequency range required a geometrical resolution of the eye of 0.1 mm and an arbitrary precision in the description of its dielectric properties in terms of the Debye model. New results show that the interaction of UWB pulses with the eye tissues exhibits the same properties as the interaction of the continuous electromagnetic waves (CWs) with the frequencies from the pulse's frequency spectrum. It is also shown that under the same exposure conditions the exposure to UWB pulses is from one to many orders of magnitude safer than the exposure to CW.
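
    For readers unfamiliar with the method named above, the core of FDTD is a leapfrog update of staggered electric and magnetic field grids. The following one-dimensional free-space toy (grid size, source and Courant number all chosen arbitrarily) shows that loop; the actual eye-exposure model is three-dimensional, resolved at 0.1 mm, and uses Debye-dispersive tissue parameters.

    ```python
    import numpy as np

    nz, nt = 400, 1000
    ez = np.zeros(nz)        # electric field samples
    hy = np.zeros(nz - 1)    # magnetic field, staggered half a cell

    for t in range(nt):
        hy += 0.5 * (ez[1:] - ez[:-1])           # H update (Courant number 0.5)
        ez[1:-1] += 0.5 * (hy[1:] - hy[:-1])     # E update
        ez[50] += np.exp(-((t - 60.0) / 20.0) ** 2)  # short Gaussian (UWB-like) pulse

    print("peak |Ez| =", float(np.abs(ez).max()))
    ```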

  10. FDTD computation of human eye exposure to ultra-wideband electromagnetic pulses.

    Science.gov (United States)

    Simicevic, Neven

    2008-03-21

    With an increase in the application of ultra-wideband (UWB) electromagnetic pulses in the communications industry, radar, biotechnology and medicine, comes an interest in UWB exposure safety standards. Despite an increase of the scientific research on bioeffects of exposure to non-ionizing UWB pulses, characterization of those effects is far from complete. A numerical computational approach, such as a finite-difference time domain (FDTD) method, is required to visualize and understand the complexity of broadband electromagnetic interactions. The FDTD method has almost no limits in the description of the geometrical and dispersive properties of the simulated material, it is numerically robust and appropriate for current computer technology. In this paper, a complete calculation of exposure of the human eye to UWB electromagnetic pulses in the frequency range of 3.1-10.6, 22-29 and 57-64 GHz is performed. Computation in this frequency range required a geometrical resolution of the eye of 0.1 mm and an arbitrary precision in the description of its dielectric properties in terms of the Debye model. New results show that the interaction of UWB pulses with the eye tissues exhibits the same properties as the interaction of the continuous electromagnetic waves (CWs) with the frequencies from the pulse's frequency spectrum. It is also shown that under the same exposure conditions the exposure to UWB pulses is from one to many orders of magnitude safer than the exposure to CW.

  11. FDTD computation of human eye exposure to ultra-wideband electromagnetic pulses

    International Nuclear Information System (INIS)

    Simicevic, Neven

    2008-01-01

    With an increase in the application of ultra-wideband (UWB) electromagnetic pulses in the communications industry, radar, biotechnology and medicine, comes an interest in UWB exposure safety standards. Despite an increase of the scientific research on bioeffects of exposure to non-ionizing UWB pulses, characterization of those effects is far from complete. A numerical computational approach, such as a finite-difference time domain (FDTD) method, is required to visualize and understand the complexity of broadband electromagnetic interactions. The FDTD method has almost no limits in the description of the geometrical and dispersive properties of the simulated material, it is numerically robust and appropriate for current computer technology. In this paper, a complete calculation of exposure of the human eye to UWB electromagnetic pulses in the frequency range of 3.1-10.6, 22-29 and 57-64 GHz is performed. Computation in this frequency range required a geometrical resolution of the eye of 0.1 mm and an arbitrary precision in the description of its dielectric properties in terms of the Debye model. New results show that the interaction of UWB pulses with the eye tissues exhibits the same properties as the interaction of the continuous electromagnetic waves (CWs) with the frequencies from the pulse's frequency spectrum. It is also shown that under the same exposure conditions the exposure to UWB pulses is from one to many orders of magnitude safer than the exposure to CW

  12. MRI Reconstructions of Human Phrenic Nerve Anatomy and Computational Modeling of Cryoballoon Ablative Therapy.

    Science.gov (United States)

    Goff, Ryan P; Spencer, Julianne H; Iaizzo, Paul A

    2016-04-01

    The primary goal of this computational modeling study was to better quantify the relative distance of the phrenic nerves to areas where cryoballoon ablations may be applied within the left atrium. Phrenic nerve injury can be a significant complication of applied ablative therapies for treatment of drug-refractory atrial fibrillation. To date, published reports suggest that such injuries may occur more frequently in cryoballoon ablations than in radiofrequency therapies. Ten human heart-lung blocs were prepared in an end-diastolic state, scanned with MRI, and analyzed using Mimics software as a means to make anatomical measurements. Next, generated computer models of Arctic Front cryoballoons (23 and 28 mm) were mated with reconstructed pulmonary vein ostia to determine relative distances between the phrenic nerves and projected balloon placements, simulating pulmonary vein isolation. The effects of deep-seating the balloons were also investigated. Interestingly, the relative anatomical differences in placement of 23 and 28 mm cryoballoons were quite small, e.g., the difference in mid-spline distance to the phrenic nerves between the two cryoballoon sizes was only 1.7 ± 1.2 mm. Furthermore, the right phrenic nerves were commonly closer to the pulmonary veins than the left, and surprisingly, the balloon tips were farther from the nerves, yet balloon size choice did not significantly alter the calculated distance to the nerves. Such computational modeling is considered a useful tool for both clinicians and device designers to better understand these associated anatomies that, in turn, may lead to optimization of therapeutic treatments.

  13. The UF family of hybrid phantoms of the developing human fetus for computational radiation dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Maynard, Matthew R; Geyer, John W; Bolch, Wesley [Department of Nuclear and Radiological Engineering, University of Florida, Gainesville, FL (United States); Aris, John P [Department of Anatomy and Cell Biology, University of Florida, Gainesville, FL (United States); Shifrin, Roger Y, E-mail: wbolch@ufl.edu [Department of Radiology, University of Florida, Gainesville, FL (United States)

    2011-08-07

    Historically, the development of computational phantoms for radiation dosimetry has primarily been directed at capturing and representing adult and pediatric anatomy, with less emphasis devoted to models of the human fetus. As concern grows over possible radiation-induced cancers from medical and non-medical exposures of the pregnant female, the need to better quantify fetal radiation doses, particularly at the organ-level, also increases. Studies such as the European Union's SOLO (Epidemiological Studies of Exposed Southern Urals Populations) hope to improve our understanding of cancer risks following chronic in utero radiation exposure. For projects such as SOLO, currently available fetal anatomic models do not provide sufficient anatomical detail for organ-level dose assessment. To address this need, two fetal hybrid computational phantoms were constructed using high-quality magnetic resonance imaging and computed tomography image sets obtained for two well-preserved fetal specimens aged 11.5 and 21 weeks post-conception. Individual soft tissue organs, bone sites and outer body contours were segmented from these images using 3D-DOCTOR(TM) and then imported to the 3D modeling software package Rhinoceros(TM) for further modeling and conversion of soft tissue organs, certain bone sites and outer body contours to deformable non-uniform rational B-spline surfaces. The two specimen-specific phantoms, along with a modified version of the 38 week UF hybrid newborn phantom, comprised a set of base phantoms from which a series of hybrid computational phantoms was derived for fetal ages 8, 10, 15, 20, 25, 30, 35 and 38 weeks post-conception. The methodology used to construct the series of phantoms accounted for the following age-dependent parameters: (1) variations in skeletal size and proportion, (2) bone-dependent variations in relative levels of bone growth, (3) variations in individual organ masses and total fetal masses and (4) statistical percentile variations

  14. The UF family of hybrid phantoms of the developing human fetus for computational radiation dosimetry

    International Nuclear Information System (INIS)

    Maynard, Matthew R; Geyer, John W; Bolch, Wesley; Aris, John P; Shifrin, Roger Y

    2011-01-01

    Historically, the development of computational phantoms for radiation dosimetry has primarily been directed at capturing and representing adult and pediatric anatomy, with less emphasis devoted to models of the human fetus. As concern grows over possible radiation-induced cancers from medical and non-medical exposures of the pregnant female, the need to better quantify fetal radiation doses, particularly at the organ-level, also increases. Studies such as the European Union's SOLO (Epidemiological Studies of Exposed Southern Urals Populations) hope to improve our understanding of cancer risks following chronic in utero radiation exposure. For projects such as SOLO, currently available fetal anatomic models do not provide sufficient anatomical detail for organ-level dose assessment. To address this need, two fetal hybrid computational phantoms were constructed using high-quality magnetic resonance imaging and computed tomography image sets obtained for two well-preserved fetal specimens aged 11.5 and 21 weeks post-conception. Individual soft tissue organs, bone sites and outer body contours were segmented from these images using 3D-DOCTOR(TM) and then imported to the 3D modeling software package Rhinoceros(TM) for further modeling and conversion of soft tissue organs, certain bone sites and outer body contours to deformable non-uniform rational B-spline surfaces. The two specimen-specific phantoms, along with a modified version of the 38 week UF hybrid newborn phantom, comprised a set of base phantoms from which a series of hybrid computational phantoms was derived for fetal ages 8, 10, 15, 20, 25, 30, 35 and 38 weeks post-conception. The methodology used to construct the series of phantoms accounted for the following age-dependent parameters: (1) variations in skeletal size and proportion, (2) bone-dependent variations in relative levels of bone growth, (3) variations in individual organ masses and total fetal masses and (4) statistical percentile variations in

  15. POLYAR, a new computer program for prediction of poly(A) sites in human sequences

    Directory of Open Access Journals (Sweden)

    Qamar Raheel

    2010-11-01

    Full Text Available Abstract Background mRNA polyadenylation is an essential step of pre-mRNA processing in eukaryotes. Accurate prediction of the pre-mRNA 3'-end cleavage/polyadenylation sites is important for defining the gene boundaries and understanding gene expression mechanisms. Results 28761 mapped human poly(A) sites have been classified into three classes containing different known forms of polyadenylation signal (PAS) or none of them (PAS-strong, PAS-weak and PAS-less, respectively), and a new computer program POLYAR for the prediction of poly(A) sites of each class was developed. In comparison with polya_svm (to date the most accurate computer program for prediction of poly(A) sites) while searching for PAS-strong poly(A) sites in human sequences, POLYAR had a significantly higher prediction sensitivity (80.8% versus 65.7%) and specificity (66.4% versus 51.7%). However, when a similar sort of search was conducted for PAS-weak and PAS-less poly(A) sites, both programs had a very low prediction accuracy, which indicates that our knowledge about factors involved in the determination of the poly(A) sites is not sufficient to identify such polyadenylation regions. Conclusions We present a new classification of polyadenylation sites into three classes and a novel computer program POLYAR for prediction of poly(A) sites/regions of each of the classes. In tests, POLYAR shows high accuracy of prediction of the PAS-strong poly(A) sites, though this program's efficiency in searching for PAS-weak and PAS-less poly(A) sites is not very high but is comparable to other available programs. These findings suggest that additional characteristics of such poly(A) sites remain to be elucidated. The POLYAR program, with a stand-alone version for downloading, is available at http://cub.comsats.edu.pk/polyapredict.htm.
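
    The reported accuracy comparison reduces to standard confusion-matrix arithmetic; the helper below makes the quoted percentages concrete. The counts are invented solely so the output matches the figures in the abstract.

    ```python
    def sens_spec(tp, fn, tn, fp):
        sensitivity = tp / (tp + fn)   # fraction of true poly(A) sites found
        specificity = tn / (tn + fp)   # fraction of non-sites correctly rejected
        return sensitivity, specificity

    # e.g. 808/1000 true PAS-strong sites detected, 664/1000 non-sites rejected
    print(sens_spec(808, 192, 664, 336))   # -> (0.808, 0.664), cf. 80.8%/66.4%
    ```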

  16. Dual-Modality Imaging of the Human Finger Joint Systems by Using Combined Multispectral Photoacoustic Computed Tomography and Ultrasound Computed Tomography

    Directory of Open Access Journals (Sweden)

    Yubin Liu

    2016-01-01

    Full Text Available We developed a homemade dual-modality imaging system that combines multispectral photoacoustic computed tomography and ultrasound computed tomography for reconstructing the structural and functional information of human finger joint systems. The fused multispectral photoacoustic-ultrasound computed tomography (MPAUCT) system was examined through phantom and in vivo experimental tests. The imaging results indicate that hard tissues such as the bones and soft tissues including the blood vessels, tendons, skin, and subcutaneous tissues in the finger joint system can be effectively recovered using our multimodality MPAUCT system. The developed MPAUCT system is able to provide more comprehensive information about the human finger joints, which shows its potential for the characterization and diagnosis of bone or joint diseases.

  17. High School Students' Written Argumentation Qualities with Problem-Based Computer-Aided Material (PBCAM) Designed about Human Endocrine System

    Science.gov (United States)

    Vekli, Gülsah Sezen; Çimer, Atilla

    2017-01-01

    This study investigated development of students' scientific argumentation levels in the applications made with Problem-Based Computer-Aided Material (PBCAM) designed about Human Endocrine System. The case study method was used: The study group was formed of 43 students in the 11th grade of the science high school in Rize. Human Endocrine System…

  18. Human-Computer Interaction and Sociological Insight: A Theoretical Examination and Experiment in Building Affinity in Small Groups

    Science.gov (United States)

    Oren, Michael Anthony

    2011-01-01

    The juxtaposition of classic sociological theory and the, relatively, young discipline of human-computer interaction (HCI) serves as a powerful mechanism for both exploring the theoretical impacts of technology on human interactions as well as the application of technological systems to moderate interactions. It is the intent of this dissertation…

  19. Formal monkey linguistics : The debate

    NARCIS (Netherlands)

    Schlenker, Philippe; Chemla, Emmanuel; Schel, Anne M.|info:eu-repo/dai/nl/413333450; Fuller, James; Gautier, Jean Pierre; Kuhn, Jeremy; Veselinović, Dunja; Arnold, Kate; Cäsar, Cristiane; Keenan, Sumir; Lemasson, Alban; Ouattara, Karim; Ryder, Robin; Zuberbühler, Klaus

    2016-01-01

    We explain why general techniques from formal linguistics can and should be applied to the analysis of monkey communication - in the areas of syntax and especially semantics. An informed look at our recent proposals shows that such techniques needn't rely excessively on categories of human language:

  20. Astronomers debate diamonds in space

    Science.gov (United States)

    1999-04-01

    This is not the first time the intriguing carbonaceous compound has been detected in space. A peculiar elite of twelve stars are known to produce it. The star now added by ISO to this elite is one of the best representatives of this exclusive family, since it emits a very strong signal of the compound. Additionally, ISO found a second new member of the group with weaker emission, and also observed, with a spectral resolution never achieved before, other already known stars in this class. Astronomers think these ISO results will help solve the mystery of the true nature of the compound. Their publication by two different groups, from Spain and Canada, has triggered a debate on the topic, both in astronomy institutes and in chemistry laboratories. At present, mixed teams of astrophysicists and chemists are investigating in the lab compounds whose chemical signature or "fingerprint" matches that detected by ISO. Neither diamonds nor fullerenes have ever been detected in space, but their presence has been predicted. Tiny diamonds of pre-solar origin (older than the Solar System) have been found in meteorites, which supports the as yet unconfirmed theory of their presence in interstellar space. The fullerene molecule, made of 60 carbon atoms linked to form a sphere (hence the name "buckyball"), has also been extensively searched for in space but never found. If the carbonaceous compound detected by ISO is a fullerene or a diamond, there will be new data on the production of these industrially interesting materials. Fullerenes are being investigated as "capsules" to deliver new pharmaceuticals to the body. Diamonds are commonly used in the electronics industry and for the development of new materials; if they are formed in the dust surrounding some stars, at relatively low temperatures and conditions of low pressure, companies could learn more about the ideal physical conditions to produce them. A textbook case: the latest star in which the compound has been found is

  1. Understanding conservationists' perspectives on the new-conservation debate.

    Science.gov (United States)

    Holmes, George; Sandbrook, Chris; Fisher, Janet A

    2017-04-01

    A vibrant debate about the future direction of biodiversity conservation centers on the merits of the so-called new conservation. Proponents of the new conservation advocate a series of positions on key conservation ideas, such as the importance of human-dominated landscapes and conservation's engagement with capitalism. These have been fiercely contested in a debate dominated by a few high-profile individuals, and so far there has been no empirical exploration of existing perspectives on these issues among a wider community of conservationists. We used Q methodology to examine empirically perspectives on the new conservation held by attendees at the 2015 International Congress for Conservation Biology (ICCB). Although we identified a consensus on several key issues, three distinct positions emerged: in favor of conservation to benefit people but opposed to links with capitalism and corporations; in favor of biocentric approaches but with less emphasis on wilderness protection than prominent opponents of new conservation; and in favor of the published new-conservation perspective but with less emphasis on increasing human well-being as a goal of conservation. Our results revealed differences between the debate on the new conservation in the literature and views held within a wider, but still limited, conservation community and demonstrated the existence of at least one viewpoint (in favor of conservation to benefit people but opposed to links with capitalism and corporations) that is almost absent from the published debate. We hope that the fuller understanding we present of the variety of views that exist but have not yet been heard will improve the quality and tone of debates on the subject. © 2016 The Authors. Conservation Biology published by Wiley Periodicals, Inc. on behalf of Society for Conservation Biology.

  2. Comparison of computational to human observer detection for evaluation of CT low dose iterative reconstruction

    Science.gov (United States)

    Eck, Brendan; Fahmi, Rachid; Brown, Kevin M.; Raihani, Nilgoun; Wilson, David L.

    2014-03-01

    Model observers were created and compared to human observers for the detection of low-contrast targets in computed tomography (CT) images reconstructed with an advanced, knowledge-based, iterative image reconstruction method for low x-ray dose imaging. A 5-channel Laguerre-Gauss Hotelling Observer (CHO) was used with internal noise added to the decision variable (DV) and/or channel outputs (CO). Models were defined by parameters: (k1) DV-noise with standard deviation (std) proportional to DV std; (k2) DV-noise with constant std; (k3) CO-noise with constant std across channels; and (k4) CO-noise in each channel with std proportional to CO variance. Four-alternative forced choice (4AFC) human observer studies were performed on sub-images extracted from phantom images with and without a "pin" target. Model parameters were estimated using maximum likelihood comparison to human probability correct (PC) data. PC for the human and all model observers increased with dose, contrast, and size, and was much higher for advanced iterative reconstruction (IMR) as compared to filtered back projection (FBP). Detection in IMR was better than FBP at 1/3 dose, suggesting significant dose savings. Model(k1,k2,k3,k4) gave the best overall fit to humans across independent variables (dose, size, contrast, and reconstruction) at fixed display window. However, Model(k1) performed better when considering model complexity using the Akaike information criterion. Model(k1) fit the extraordinary detectability difference between IMR and FBP, despite the different noise quality. It is anticipated that the model observer will predict results from iterative reconstruction methods having similar noise characteristics, enabling rapid comparison of methods.
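
    The "k1" internal-noise model above can be sketched compactly: channel outputs are the image projected onto a channel matrix, the Hotelling template is the covariance-weighted mean difference of the two classes, and internal noise with std proportional to the decision-variable std is added at decision time. The channel matrix and data below are random stand-ins, not Laguerre-Gauss channels or CT images.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    npix, nchan, ntrain = 64 * 64, 5, 200
    U = rng.normal(size=(npix, nchan))            # stand-in channel matrix

    signal_img = rng.normal(0.5, 1.0, size=(ntrain, npix))   # target present
    noise_img = rng.normal(0.0, 1.0, size=(ntrain, npix))    # target absent

    v1, v0 = signal_img @ U, noise_img @ U        # channel outputs
    S = 0.5 * (np.cov(v1.T) + np.cov(v0.T))       # pooled channel covariance
    w = np.linalg.solve(S, v1.mean(0) - v0.mean(0))   # Hotelling template

    def decision_variable(img, k1=0.5):
        dv = img @ U @ w
        dv_std = np.sqrt(w @ S @ w)               # DV std from channel noise
        return dv + rng.normal(0.0, k1 * dv_std)  # k1-type internal noise

    print(decision_variable(signal_img[0]) > decision_variable(noise_img[0]))
    ```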

  3. Public debate - radioactive wastes management; Debat public - gestion des dechets radioactifs

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-07-01

    Between September 2005 and January 2006, a national debate was organized on the management of radioactive wastes. This debate aimed to inform the public and to allow them to express their opinion. This document presents the reasons for this debate, how it was conducted, a synthesis of the results, and technical documents providing background information in the field of radioactive waste management. (A.L.B.)

  4. COMPUTING

    CERN Multimedia

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  5. COMPUTING

    CERN Multimedia

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  6. COMPUTING

    CERN Multimedia

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identifying possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transfer¬ring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  7. COMPUTING

    CERN Multimedia

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  8. COMPUTING

    CERN Multimedia

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, and conversion to RAW format; the samples were then run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  9. COMPUTING

    CERN Multimedia

    I. Fisk

    2013-01-01

    Computing activity has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and on improvements in data access and flexibility of resource use. Operations Office: Data processing was slow in the second half of 2013, with only the legacy re-reconstruction pass of 2011 data being processed at the sites. Figure 1: MC production and processing was more in demand, with a peak of over 750 million GEN-SIM events in a single month. Figure 2: The transfer system worked reliably and efficiently, transferring on average close to 520 TB per week with peaks close to 1.2 PB. Figure 3: The volume of data moved between CMS sites in the last six months. Tape utilisation was a focus for the operations teams, with frequent campaigns deleting deprecated 7 TeV MC GEN-SIM samples by moving them to INVALID datasets, which could be cleaned up...

  10. COMPUTING

    CERN Multimedia

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently. Operations Office (Figure 2: Number of events per month for 2012): Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in the data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  11. Reciprocity in computer-human interaction: source-based, norm-based, and affect-based explanations.

    Science.gov (United States)

    Lee, Seungcheol Austin; Liang, Yuhua Jake

    2015-04-01

    Individuals often apply social rules when they interact with computers, and this is known as the Computers Are Social Actors (CASA) effect. Following previous work, one approach to understanding the mechanism responsible for CASA is to utilize computer agents and have the agents attempt to gain human compliance (e.g., completing a pattern recognition task). The current study focuses on three key factors frequently cited to influence traditional notions of compliance: evaluations toward the source (competence and warmth), normative influence (reciprocity), and affective influence (mood). Structural equation modeling assessed the effects of these factors on human compliance with a computer agent's request. The final model shows that norm-based influence (reciprocity) increased the likelihood of compliance, while evaluations toward the computer agent did not significantly influence compliance.

  12. Computer program for assessing the human dose due to stationary release of tritium

    International Nuclear Information System (INIS)

    Saito, Masahiro; Raskob, Wolfgang

    2003-01-01

    The computer program TriStat (Tritium dose assessment for stationary release) has been developed to assess the dose to humans assuming a stationary release of tritium as HTO and/or HT from nuclear facilities. A Gaussian dispersion model describes the behavior of HT gas and HTO vapor in the atmosphere. Tritium concentrations in soil, vegetables and forage were estimated on the basis of specific tritium concentrations in the free water component and the organic component. The uptake of contamination via food by humans was modeled by assuming a forage compartment, a vegetable component, and an animal compartment. A standardized vegetable and a standardized animal with the relative content of major nutrients, i.e. proteins, lipids and carbohydrates, representing a standard Japanese diet, were included. A standardized forage was defined in a similar manner by using the forage composition for typical farm animals. These standard feed- and foodstuffs are useful to simplify the tritium dosimetry and the food chain related to the tritium transfer to the human body. (author)
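
    A minimal sketch of the ground-level concentration calculation behind a Gaussian dispersion model of the kind TriStat employs (the release rate, stack height and dispersion parameters below are illustrative assumptions, not values from the program):

```python
import numpy as np

def gaussian_plume(Q, u, y, z, H, sigma_y, sigma_z):
    """Gaussian plume concentration with ground reflection (Bq/m^3).

    Q: release rate (Bq/s), u: wind speed (m/s), H: effective release
    height (m), sigma_y/sigma_z: dispersion parameters (m) evaluated at
    the downwind distance of interest.
    """
    lateral = np.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (np.exp(-(z - H)**2 / (2.0 * sigma_z**2))
                + np.exp(-(z + H)**2 / (2.0 * sigma_z**2)))  # ground reflection
    return Q / (2.0 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Illustrative only: HTO release from a 30 m stack, 3 m/s wind, receptor at
# breathing height on the plume centreline (sigma values assumed).
chi = gaussian_plume(Q=1.0e9, u=3.0, y=0.0, z=1.5, H=30.0,
                     sigma_y=36.0, sigma_z=18.0)
```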

  13. Three-dimensional computer-aided human factors engineering analysis of a grafting robot.

    Science.gov (United States)

    Chiu, Y C; Chen, S; Wu, G J; Lin, Y H

    2012-07-01

    The objective of this research was to conduct a human factors engineering analysis of a grafting robot design using computer-aided 3D simulation technology. A prototype tubing-type grafting robot for fruits and vegetables was the subject of a series of case studies. To facilitate the incorporation of human models into the operating environment of the grafting robot, I-DEAS graphic software was applied to establish individual models of the grafting robot in line with Jack ergonomic analysis. Six human models (95th percentile, 50th percentile, and 5th percentile by height for both males and females) were employed to simulate the operating conditions and working postures in a real operating environment. The lower back and upper limb stresses of the operators were analyzed using the lower back analysis (LBA) and rapid upper limb assessment (RULA) functions in Jack. The experimental results showed that if a leg space is introduced under the robot, the operator can sit closer to the robot, which reduces the operator's level of lower back and upper limb stress. The proper environmental layout for Taiwanese operators, minimizing lower back and upper limb stress, is to place the grafting operation 23.2 cm away from the operator at a height of 85 cm, with 45 cm between the rootstock and scion units.

  14. Rana computatrix to human language: towards a computational neuroethology of language evolution.

    Science.gov (United States)

    Arbib, Michael A

    2003-10-15

    Walter's Machina speculatrix inspired the name Rana computatrix for a family of models of visuomotor coordination in the frog, which contributed to the development of computational neuroethology. We offer here an 'evolutionary' perspective on models in the same tradition for rat, monkey and human. For rat, we show how the frog-like taxon affordance model provides a basis for the spatial navigation mechanisms that involve the hippocampus and other brain regions. For monkey, we recall two models of neural mechanisms for visuomotor coordination. The first, for saccades, shows how interactions between the parietal and frontal cortex augment superior colliculus seen as the homologue of frog tectum. The second, for grasping, continues the theme of parieto-frontal interactions, linking parietal affordances to motor schemas in premotor cortex. It further emphasizes the mirror system for grasping, in which neurons are active both when the monkey executes a specific grasp and when it observes a similar grasp executed by others. The model of human-brain mechanisms is based on the mirror-system hypothesis of the evolution of the language-ready brain, which sees the human Broca's area as an evolved extension of the mirror system for grasping.

  15. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-04-01

    The importance of human reliability analysis (HRA), which predicts the possibility of human error occurrence in quantitative and qualitative manners, is gradually increasing because of the effects of human error on system safety. HRA requires a task analysis as a prerequisite step, but extant task analysis techniques have the problem that the collection of information about the situations in which human errors occur depends entirely on the HRA analysts. This problem makes the results of task analysis inconsistent and unreliable. To address this problem, KAERI developed the structural information analysis (SIA) method, which helps to analyze a task's structure and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed to support performing HRA with the SIA method. Additionally, by applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the database of CASIA. CASIA is expected to help HRA analysts perform the analysis more easily and consistently. If more analyses are performed and more data accumulated in CASIA's database, HRA analysts can freely share and smoothly spread their analysis experience, and thereby the quality of HRA will be improved. 35 refs., 38 figs., 25 tabs. (Author)

  16. Electrophysiological properties of computational human ventricular cell action potential models under acute ischemic conditions.

    Science.gov (United States)

    Dutta, Sara; Mincholé, Ana; Quinn, T Alexander; Rodriguez, Blanca

    2017-10-01

    Acute myocardial ischemia is one of the main causes of sudden cardiac death. The mechanisms have been investigated primarily in experimental and computational studies using different animal species, but human studies remain scarce. In this study, we assess the ability of four human ventricular action potential models (ten Tusscher and Panfilov, 2006; Grandi et al., 2010; Carro et al., 2011; O'Hara et al., 2011) to simulate key electrophysiological consequences of acute myocardial ischemia in single cell and tissue simulations. We specifically focus on evaluating the effect of extracellular potassium concentration and activation of the ATP-sensitive inward-rectifying potassium current on action potential duration, post-repolarization refractoriness, and conduction velocity, as the most critical factors in determining reentry vulnerability during ischemia. Our results show that the Grandi and O'Hara models required modifications to reproduce expected ischemic changes, specifically modifying the intracellular potassium concentration in the Grandi model and the sodium current in the O'Hara model. With these modifications, the four human ventricular cell AP models analyzed in this study reproduce the electrophysiological alterations in repolarization, refractoriness, and conduction velocity caused by acute myocardial ischemia. However, quantitative differences are observed between the models and overall, the ten Tusscher and modified O'Hara models show closest agreement to experimental data. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
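
    The two ischemic perturbations evaluated above can be illustrated with textbook relations: raising extracellular potassium depolarizes the K+ Nernst potential, and I_K(ATP) activation adds a repolarizing current. A hedged sketch using a generic formulation (conductance and open fraction are assumed for illustration, not taken from any of the four models):

```python
import numpy as np

R, T, F = 8.314, 310.0, 96485.0      # gas constant, body temperature (K), Faraday

def e_k(k_out, k_in=140.0):
    """K+ Nernst potential in mV; ischemic hyperkalaemia raises k_out."""
    return 1000.0 * (R * T / F) * np.log(k_out / k_in)

def i_katp(v, k_out, g_katp=0.05, f_atp=0.1):
    """Generic ATP-sensitive K+ current; conductance and open fraction assumed."""
    return g_katp * f_atp * (v - e_k(k_out))

# E_K moves from about -87 mV (k_out = 5.4 mM) to about -73 mV (9 mM),
# depolarizing the resting membrane; the added I_KATP shortens the AP.
print(e_k(5.4), e_k(9.0))
```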

  17. Computational and human observer image quality evaluation of low dose, knowledge-based CT iterative reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Eck, Brendan L.; Fahmi, Rachid; Miao, Jun [Department of Biomedical Engineering, Case Western Reserve University, Cleveland, Ohio 44106 (United States); Brown, Kevin M.; Zabic, Stanislav; Raihani, Nilgoun [Philips Healthcare, Cleveland, Ohio 44143 (United States); Wilson, David L., E-mail: dlw@case.edu [Department of Biomedical Engineering, Case Western Reserve University, Cleveland, Ohio 44106 and Department of Radiology, Case Western Reserve University, Cleveland, Ohio 44106 (United States)

    2015-10-15

    Purpose: Aims in this study are to (1) develop a computational model observer which reliably tracks the detectability of human observers in low dose computed tomography (CT) images reconstructed with knowledge-based iterative reconstruction (IMR™, Philips Healthcare) and filtered back projection (FBP) across a range of independent variables, (2) use the model to evaluate detectability trends across reconstructions and make predictions of human observer detectability, and (3) perform human observer studies based on model predictions to demonstrate applications of the model in CT imaging. Methods: Detectability (d′) was evaluated in phantom studies across a range of conditions. Images were generated using a numerical CT simulator. Trained observers performed 4-alternative forced choice (4-AFC) experiments across dose (1.3, 2.7, 4.0 mGy), pin size (4, 6, 8 mm), contrast (0.3%, 0.5%, 1.0%), and reconstruction (FBP, IMR), at fixed display window. A five-channel Laguerre–Gauss channelized Hotelling observer (CHO) was developed with internal noise added to the decision variable and/or to channel outputs, creating six different internal noise models. Semianalytic internal noise computation was tested against Monte Carlo and used to accelerate internal noise parameter optimization. Model parameters were estimated from all experiments at once using maximum likelihood on the probability correct, P_C. Akaike information criterion (AIC) was used to compare models of different orders. The best model was selected according to AIC and used to predict detectability in blended FBP-IMR images, analyze trends in IMR detectability improvements, and predict dose savings with IMR. Predicted dose savings were compared against 4-AFC study results using physical CT phantom images. Results: Detection in IMR was greater than FBP in all tested conditions. The CHO with internal noise proportional to channel output standard deviations, Model-k4, showed the best trade-off between fit
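
    A minimal sketch of a channelized Hotelling observer detectability computation with one simple internal-noise model (the diagonal-inflation scheme below is an illustrative stand-in for the six internal noise models compared in the paper):

```python
import numpy as np

def cho_dprime(signal_imgs, noise_imgs, channels, internal_noise=0.0):
    """Channelized Hotelling observer detectability d'.

    signal_imgs/noise_imgs: (n_images, n_pixels) arrays of signal-present
    and signal-absent images; channels: (n_pixels, n_channels), e.g. five
    Laguerre-Gauss channels. internal_noise scales extra variance added to
    the channel-output covariance (one simple internal-noise model).
    """
    vs = signal_imgs @ channels                   # channel outputs, signal
    vn = noise_imgs @ channels                    # channel outputs, no signal
    dv = vs.mean(axis=0) - vn.mean(axis=0)        # mean channel difference
    S = 0.5 * (np.cov(vs, rowvar=False) + np.cov(vn, rowvar=False))
    S = S + internal_noise * np.diag(np.diag(S))  # inflate diagonal variance
    return float(np.sqrt(dv @ np.linalg.solve(S, dv)))
```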

  18. Idealism and realism in International Relations: an ontological debate

    Directory of Open Access Journals (Sweden)

    Vitor Ramon Fernandes

    2016-11-01

    Full Text Available The debate between realism and idealism continues to mark the discipline of International Relations. On the one hand, realism argues that international politics is a struggle for power and a quest for survival, which results in a condition of permanent conflict between States without any possibility of evolution or progress. On the other hand, idealism considers it possible to build a world of peaceful coexistence, prosperity and well-being, achieved through cooperation and based on values and aspirations shared by humans. The objective of this article is to analyse the debate between idealism and realism, considering it as an ontological debate and taking into account the controversy it has generated. The argument presented here is that both realism and idealism are two responses to the creation and maintenance of international order, that is, to how States relate in international society; however, these responses are not mutually exclusive and can coexist in constant tension with one another. An analysis of the internationalist thought of two authors, Hans Morgenthau and Raymond Aron, is also presented, focusing on how they are positioned in this debate as well as in International Relations as a whole.

  19. COMPUTING

    CERN Multimedia

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions: The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations: Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS components are now also deployed at CERN, adding to the GlideInWMS factory located in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  20. Creating Communications, Computing, and Networking Technology Development Road Maps for Future NASA Human and Robotic Missions

    Science.gov (United States)

    Bhasin, Kul; Hayden, Jeffrey L.

    2005-01-01

    For human and robotic exploration missions in the Vision for Exploration, roadmaps are needed for capability development and investments based on advanced technology developments. A roadmap development process was undertaken for the communications and networking capabilities and technologies needed for future human and robotic missions. The underlying processes are derived from work carried out during development of the future space communications architecture, and NASA's Space Architect Office (SAO) defined formats and structures for accumulating data. Interrelationships were established among emerging requirements, the capability analysis and technology status, and performance data. After developing an architectural communications and networking framework structured around the assumed needs for human and robotic exploration, in the vicinity of Earth, the Moon, along the path to Mars, and in the vicinity of Mars, information was gathered from expert participants. This information was used to identify the capabilities expected from the new infrastructure and the technological gaps in the way of obtaining them. We define realistic, long-term space communication architectures based on emerging needs and translate the needs into interfaces, functions, and computer processing that will be required. In developing our roadmapping process, we defined requirements for achieving end-to-end activities that will be carried out by future NASA human and robotic missions. This paper describes: 1) the architectural framework developed for analysis; 2) our approach to gathering and analyzing data from NASA, industry, and academia; 3) an outline of the technology research to be done, including milestones for technology research and demonstrations with timelines; and 4) the technology roadmaps themselves.

  1. Imaging cellular and subcellular structure of human brain tissue using micro computed tomography

    Science.gov (United States)

    Khimchenko, Anna; Bikis, Christos; Schweighauser, Gabriel; Hench, Jürgen; Joita-Pacureanu, Alexandra-Teodora; Thalmann, Peter; Deyhle, Hans; Osmani, Bekim; Chicherova, Natalia; Hieber, Simone E.; Cloetens, Peter; Müller-Gerbl, Magdalena; Schulz, Georg; Müller, Bert

    2017-09-01

    Brain tissues have been an attractive subject for investigations in neuropathology, neuroscience, and neurobiology. Nevertheless, existing imaging methodologies have intrinsic limitations in three-dimensional (3D) label-free visualisation of extended tissue samples down to the (sub)cellular level. For a long time, these morphological features were visualised by electron or light microscopies. In addition to being time-consuming, microscopic investigation includes specimen fixation, embedding, sectioning, staining, and imaging, with the associated artefacts. Moreover, optical microscopy remains hampered by a fundamental limit in spatial resolution that is imposed by the diffraction of the visible-light wavefront. In contrast, various tomography approaches do not require a complex specimen preparation and can now reach a true (sub)cellular resolution. Even laboratory-based micro computed tomography in the absorption-contrast mode of formalin-fixed paraffin-embedded (FFPE) human cerebellum yields an image contrast comparable to conventional histological sections. Data of superior image quality were obtained by means of synchrotron radiation-based single-distance X-ray phase-contrast tomography, enabling the visualisation of non-stained Purkinje cells down to the subcellular level and automated cell counting. The question arises whether the data quality of hard X-ray tomography can be superior to optical microscopy. Herein, we discuss the label-free investigation of human brain ultramorphology by means of synchrotron radiation-based hard X-ray magnified phase-contrast in-line tomography at the nano-imaging beamline ID16A (ESRF, Grenoble, France). As an example, we present images of an FFPE human cerebellum block. Hard X-ray tomography can provide detailed information on human tissues in health and disease with a spatial resolution below the optical limit, improving understanding of the neuro-degenerative diseases.

  2. A soft-contact model for computing safety margins in human prehension.

    Science.gov (United States)

    Singh, Tarkeshwar; Ambike, Satyajit

    2017-10-01

    The soft human digit tip forms contact with grasped objects over a finite area and applies a moment about an axis normal to the area. These moments are important for ensuring stability during precision grasping. However, the contribution of these moments to grasp stability is rarely investigated in prehension studies. The more popular hard-contact model assumes that the digits exert a force vector but no free moment on the grasped object. Many sensorimotor studies use this model and show that humans estimate friction coefficients to scale the normal force to grasp objects stably, i.e. the smoother the surface, the tighter the grasp. The difference between the applied normal force and the minimal normal force needed to prevent slipping is called safety margin and this index is widely used as a measure of grasp planning. Here, we define and quantify safety margin using a more realistic contact model that allows digits to apply both forces and moments. Specifically, we adapt a soft-contact model from robotics and demonstrate that the safety margin thus computed is a more accurate and robust index of grasp planning than its hard-contact variant. Previously, we have used the soft-contact model to propose two indices of grasp planning that show how humans account for the shape and inertial properties of an object. A soft-contact based safety margin offers complementary insights by quantifying how humans may account for surface properties of the object and skin tissue during grasp planning and execution. Copyright © 2017 Elsevier B.V. All rights reserved.
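
    The distinction between the two contact models reduces to what counts as the minimum slip-preventing normal force. A hedged sketch under Coulomb friction, with the soft-contact force/moment coupling approximated by an elliptical limit (a common robotics simplification; all numbers are made up for illustration):

```python
import numpy as np

def safety_margin_hard(f_n, f_t, mu):
    """Hard contact: margin = applied normal force minus the minimum normal
    force that prevents linear slip under Coulomb friction."""
    return f_n - abs(f_t) / mu

def safety_margin_soft(f_n, f_t, m_n, mu, mu_m):
    """Soft contact: the digit also applies a moment m_n about the contact
    normal; an elliptical limit couples tangential force and moment
    (mu_m carries units of metres)."""
    n_min = np.hypot(f_t / mu, m_n / mu_m)   # smallest slip-free normal force
    return f_n - n_min

print(safety_margin_hard(5.0, 2.0, 0.8))               # 2.5 N margin
print(safety_margin_soft(5.0, 2.0, 0.01, 0.8, 0.005))  # ~1.8 N, a tighter margin
```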

  3. Debate: Prevention and Victim Compensation

    Directory of Open Access Journals (Sweden)

    Nisha Varia

    2014-09-01

    Full Text Available Afroza, a Bangladeshi woman who worked for sixteen years without getting paid and was not allowed to go home to visit her family. Keni, an Indonesian woman whose employers injured her with a hot iron, leaving disfiguring third-degree burns all over her body. Kartika, an older Sri Lankan woman whose employers made her work around the clock without pay, shaved her head to humiliate her and gouged pieces of flesh out of her arm with knives. These are some of the women whose faces and stories still haunt me after ten years of investigating human rights abuses against migrant domestic workers in Asia and the Middle East.

  4. Young Voters’ Responses to Polemical Debate

    DEFF Research Database (Denmark)

    Kock, Christian Erik J

    I will present an authentic case: 24 young voters in a Danish “Folk high school” watching a televised, very polemical debate between the two contenders for the office of Prime Minister of Denmark shortly before the parliamentary election in 2015. I asked this group to note down all their evaluative...... of alert young voters like or dislike debaters to do in a mediated polemical debate to which they are spectators: what speech act types, rhetorical maneuvers, argument types, etc., make them—metaphorically speaking—either cheer or hiss? This picture, in turn, may be held against various normative...

  5. Standardized Procedure Content And Data Structure Based On Human Factors Requirements For Computer-Based Procedures

    International Nuclear Information System (INIS)

    Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L

    2015-01-01

    Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received from either users or real time data from plant status databases. Without the ability for logical operations the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions to create the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step will determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the
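
    The step-as-element idea can be sketched as follows; the element and attribute names are hypothetical illustrations of the approach, not the Idaho National Laboratory schema:

```python
import xml.etree.ElementTree as ET

# Hypothetical procedure fragment: the 'type' attribute of each step tells
# the CBPS what functionality to generate (instruction, decision, data input).
procedure_xml = """
<procedure id="EOP-001" rev="3">
  <step id="1" type="instruction">Verify reactor trip.</step>
  <step id="2" type="decision" input="user">Is RCS pressure &gt; 2000 psig?</step>
  <step id="3" type="input" source="plant_db" tag="RCS_PRESS">Record RCS pressure.</step>
</procedure>
"""

root = ET.fromstring(procedure_xml)
for step in root.findall("step"):
    # Render each step according to its declared type
    print(step.get("id"), step.get("type"), step.text.strip())
```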

  6. Standardized Procedure Content And Data Structure Based On Human Factors Requirements For Computer-Based Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L

    2015-02-01

    Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received from either users or real time data from plant status databases. Without the ability for logical operations the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions to create the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step will determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the

  7. Human Capital, (Human) Capabilities and Higher Education

    Science.gov (United States)

    Le Grange, L.

    2011-01-01

    In this article I initiate a debate into the (de)merits of human capital theory and human capability theory and discuss implications of the debate for higher education. Human capital theory holds that economic growth depends on investment in education and that economic growth is the basis for improving the quality of human life. Human capable…

  8. MAPPS (Maintenance Personnel Performance Simulation): a computer simulation model for human reliability analysis

    International Nuclear Information System (INIS)

    Knee, H.E.; Haas, P.M.

    1985-01-01

    A computer model capable of generating reliable estimates of human performance measures in the nuclear power plant (NPP) maintenance context has been developed, sensitivity tested, and evaluated. The model, entitled MAPPS (Maintenance Personnel Performance Simulation), is of the simulation type and is task-oriented. It addresses a number of person-machine, person-environment, and person-person variables and is capable of providing the user with a rich spectrum of important performance measures, including the mean time for successful task performance by a maintenance team and the maintenance team's probability of task success. These two measures are particularly important as input to probabilistic risk assessment (PRA) studies, which were the primary impetus for the development of MAPPS. The simulation nature of the model, along with its generous input parameters and output variables, allows its usefulness to extend beyond its input to PRA

  9. Human-computer interaction for alert warning and attention allocation systems of the multimodal watchstation

    Science.gov (United States)

    Obermayer, Richard W.; Nugent, William A.

    2000-11-01

    The SPAWAR Systems Center San Diego is currently developing an advanced Multi-Modal Watchstation (MMWS); design concepts and software from this effort are intended for transition to future United States Navy surface combatants. The MMWS features multiple flat panel displays and several modes of user interaction, including voice input and output, natural language recognition, 3D audio, and stylus and gestural inputs. In 1999, an extensive literature review was conducted on basic and applied research concerned with alerting and warning systems. After summarizing that literature, a human-computer interaction (HCI) designer's guide was prepared to support the design of an attention allocation subsystem (AAS) for the MMWS. The resultant HCI guidelines are being applied in the design of a fully interactive AAS prototype. An overview of key findings from the literature review, a proposed design methodology with illustrative examples, and an assessment of progress made in implementing the HCI designer's guide are presented.

  10. Single photon emission computed tomography study of human pulmonary perfusion: preliminary findings

    Energy Technology Data Exchange (ETDEWEB)

    Carratu, L; Sofia, M [Naples Univ. (Italy). Facolta di Medicina e Chirurgia; Salvatore, M; Muto, P; Ariemma, G [Istituto Nazionale per la Prevenzione, Lo Studio e La Cura dei Tumori Fondazione Pascale, Naples (Italy); Lopez-Majano, V [Cook County Hospital, Chicago, IL (USA). Nuclear Medicine Div.

    1984-02-01

    Single photon emission computed tomography (SPECT) was performed with 99mTc-albumin macroaggregates to study human pulmonary perfusion in healthy subjects and patients with respiratory diseases such as chronic obstructive pulmonary disease (COPD) and lung neoplasms. The reconstructed SPECT data were displayed in coronal, transverse, and sagittal plane sections and compared to conventional perfusion scans. The SPECT data gave more detailed anatomical information about the extent of damage and the morphology of the pulmonary vascular bed. In healthy subjects and COPD patients, qualitative and quantitative assessment of pulmonary perfusion could be obtained from serial SPECT scans with respect to the distribution and relative concentration of the injected radiopharmaceutical. Furthermore, SPECT of pulmonary perfusion has been useful in detecting the extent of damage to the pulmonary circulation. This is useful for the preoperative evaluation and staging of lung cancer.

  11. A Single Camera Motion Capture System for Human-Computer Interaction

    Science.gov (United States)

    Okada, Ryuzo; Stenger, Björn

    This paper presents a method for markerless human motion capture using a single camera. It uses tree-based filtering to efficiently propagate a probability distribution over poses of a 3D body model. The pose vectors and associated shapes are arranged in a tree, which is constructed by hierarchical pairwise clustering, in order to efficiently evaluate the likelihood in each frame. A new likelihood function based on silhouette matching is proposed that improves the pose estimation of thinner body parts, i.e. the limbs. The dynamic model takes self-occlusion into account by increasing the variance of occluded body parts, thus allowing for recovery when the body part reappears. We present two applications of our method that work in real-time on a Cell Broadband Engine™: a computer game and a virtual clothing application.

  12. Human thyroid specimen imaging by fluorescent x-ray computed tomography with synchrotron radiation

    Science.gov (United States)

    Takeda, Tohoru; Yu, Quanwen; Yashiro, Toru; Yuasa, Tetsuya; Hasegawa, Yasuo; Itai, Yuji; Akatsuka, Takao

    1999-09-01

    Fluorescent x-ray computed tomography (FXCT) is being developed to detect non-radioactive contrast materials in living specimens. The FXCT system consists of a silicon (111) channel-cut monochromator, an x-ray slit and a collimator for fluorescent x-ray detection, a scanning table for the target organ, and an x-ray detector for the fluorescent and transmission x rays. To reduce the Compton scattering overlapping the fluorescent K(alpha) line, the incident monochromatic x-ray energy was set at 37 keV. FXCT clearly imaged a human thyroid gland, and the iodine content was estimated quantitatively. In a case of hyperthyroidism, the two-dimensional distribution of iodine content was not uniform, and a thyroid cancer contained only a small amount of iodine. FXCT can be used to detect iodine within the thyroid gland quantitatively and to delineate its distribution.

  13. Application of computer-assisted imaging technology in human musculoskeletal joint research

    Directory of Open Access Journals (Sweden)

    Xudong Liu

    2014-01-01

    Full Text Available Computer-assisted imaging analysis technology has been widely used in musculoskeletal joint biomechanics research in recent years. Imaging techniques can accurately reconstruct the anatomic features of the target joint and reproduce its in vivo motion characteristics. These data have greatly improved our understanding of normal joint function, joint injury mechanisms, and surgical treatment, and can provide a foundation for using reverse-engineering methods to develop biomimetic artificial joints. In this paper, we systematically review the investigation of in vivo kinematics of the human knee, shoulder, lumbar spine, and ankle using advanced imaging technologies, especially those using a dual fluoroscopic imaging system (DFIS). We also briefly discuss future developments of imaging analysis technology in musculoskeletal joint research.

  14. Histogram analysis for age change of human lung with computed tomography

    International Nuclear Information System (INIS)

    Shirabe, Ichiju

    1990-01-01

    In order to evaluate physiological changes of the normal lung with aging by computed tomography (CT), the peak position (PP) and full width at half maximum (FWHM) of the CT histogram were studied in 77 normal human lungs. Above 30 years of age, PP tended to shift toward lower attenuation values with advancing age, yielding the following equation: PP (CT attenuation value) = -0.87 x age - 815. For example, at age 60 the expected peak lies near -867 Hounsfield units. The peak position reached its highest CT attenuation range in subjects in their 30s. FWHM did not change with advancing age. There were no differences in peak value or FWHM among the upper, middle, and lower lung fields. In this study, physiological changes of the lung were evaluated quantitatively. Furthermore, this study was considered to be useful for the diagnosis and treatment of lung diseases. (author)

  15. Accuracy of computer-guided implantation in a human cadaver model.

    Science.gov (United States)

    Yatzkair, Gustavo; Cheng, Alice; Brodie, Stan; Raviv, Eli; Boyan, Barbara D; Schwartz, Zvi

    2015-10-01

    To examine the accuracy of computer-guided implantation using a human cadaver model with reduced experimental variability. Twenty-eight (28) dental implants representing 12 clinical cases were placed in four cadaver heads using a static guided implantation template. All planning and surgeries were performed by one clinician. All radiographs and measurements were performed by two examiners. The distance of the implants from buccal and lingual bone and mesial implant or tooth was analyzed at the apical and coronal levels, and measurements were compared to the planned values. No significant differences were seen between planned and implanted measurements. Average deviation of an implant from its planning radiograph was 0.8 mm, which is within the range of variability expected from CT analysis. Guided implantation can be used safely with a margin of error of 1 mm. © 2014 The Authors. Clinical Oral Implants Research Published by John Wiley & Sons Ltd.

  16. Soft Electronics Enabled Ergonomic Human-Computer Interaction for Swallowing Training

    Science.gov (United States)

    Lee, Yongkuk; Nicholls, Benjamin; Sup Lee, Dong; Chen, Yanfei; Chun, Youngjae; Siang Ang, Chee; Yeo, Woon-Hong

    2017-04-01

    We introduce a skin-friendly electronic system that enables human-computer interaction (HCI) for swallowing training in dysphagia rehabilitation. For an ergonomic HCI, we utilize a soft, highly compliant (“skin-like”) electrode, which addresses critical issues of an existing rigid and planar electrode combined with a problematic conductive electrolyte and adhesive pad. The skin-like electrode offers a highly conformal, user-comfortable interaction with the skin for long-term wearable, high-fidelity recording of swallowing electromyograms on the chin. Mechanics modeling and experimental quantification captures the ultra-elastic mechanical characteristics of an open mesh microstructured sensor, conjugated with an elastomeric membrane. Systematic in vivo studies investigate the functionality of the soft electronics for HCI-enabled swallowing training, which includes the application of a biofeedback system to detect swallowing behavior. The collection of results demonstrates clinical feasibility of the ergonomic electronics in HCI-driven rehabilitation for patients with swallowing disorders.

  17. Monochromatic computed tomography of the human brain using synchrotron x rays: Technical feasibility

    International Nuclear Information System (INIS)

    Nachaliel, E.; Dilmanian, F.A.; Garrett, R.F.; Thomlinson, W.C.; Chapman, L.D.; Gmuer, N.F.; Lazarz, N.M.; Moulin, H.R.; Rivers, M.L.; Rarback, H.; Stefan, P.M.; Spanne, P.; Luke, P.N.; Pehl, R.; Thompson, A.C.; Miller, M.

    1991-01-01

    A monochromatic computed tomography (CT) scanner is being developed at the X17 superconducting wiggler beamline at the National Synchrotron Light Source (NSLS), Brookhaven National Laboratory, to image the human head and neck. The system configuration is one of a horizontal fan beam and an upright, seated, rotating subject. The purposes of the project are to demonstrate the improvement in image contrast and in quantitative image accuracy that can be obtained with monochromatic CT, and to apply the system to specific clinical research programs in neuroradiology. This paper describes the first phantom studies carried out with a prototype system, using the dual photon absorptiometry (DPA) method at energies of 20 and 39 keV. The results show that improvements in image contrast and quantitative accuracy are possible with monochromatic DPA CT. Estimates of the clinical performance of the planned CT system are made on the basis of these initial results
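
    Dual photon absorptiometry amounts to solving a small linear system: the log attenuation measured along one ray at the two energies is modeled as the sum of two basis-material contributions. A hedged sketch with assumed attenuation coefficients (illustrative values, not the beamline's calibration):

```python
import numpy as np

# Rows: scan energies (20, 39 keV); columns: assumed linear attenuation
# coefficients (1/cm) for two basis materials (soft tissue, bone-like).
MU = np.array([[0.80, 4.20],
               [0.25, 0.60]])

def dpa_decompose(log_i0_over_i):
    """Solve MU @ t = ln(I0/I) for basis-material path lengths t (cm)."""
    return np.linalg.solve(MU, log_i0_over_i)

# One ray, two energy measurements (made-up log attenuations)
t_tissue, t_bone = dpa_decompose(np.array([3.0, 0.85]))
```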

  18. Twenty Years of Creativity Research in Human-Computer Interaction: Current State and Future Directions

    DEFF Research Database (Denmark)

    Frich Pedersen, Jonas; Biskjaer, Michael Mose; Dalsgaard, Peter

    2018-01-01

    Creativity has been a growing topic within the ACM community since the 1990s. However, no clear overview of this trend has been offered. We present a thorough survey of 998 creativity-related publications in the ACM Digital Library, collected using keyword search, to determine prevailing approaches, topics, and characteristics of creativity-oriented Human-Computer Interaction (HCI) research. A selected sample based on yearly citations yielded 221 publications, which were analyzed using constant comparison analysis. We found that HCI is almost exclusively responsible for creativity-oriented publications; they focus on collaborative creativity rather than individual creativity; there is a general lack of definition of the term ‘creativity’; empirically based contributions are prevalent; and many publications focus on new tools, often developed by researchers. On this basis, we present three...

  19. Radiotherapy infrastructure and human resources in Switzerland: Present status and projected computations for 2020.

    Science.gov (United States)

    Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar; Zwahlen, Daniel; Bodis, Stephan

    2016-09-01

    The purpose of this study was to evaluate the present status of radiotherapy infrastructure and human resources in Switzerland and compute projections for 2020. The European Society of Therapeutic Radiation Oncology "Quantification of Radiation Therapy Infrastructure and Staffing" guidelines (ESTRO-QUARTS) and those of the International Atomic Energy Agency (IAEA) were applied to estimate the requirements for teleradiotherapy (TRT) units, radiation oncologists (RO), medical physicists (MP) and radiotherapy technologists (RTT). The databases used for computation of the present gap and additional requirements are (a) Global Cancer Incidence, Mortality and Prevalence (GLOBOCAN) for cancer incidence (b) the Directory of Radiotherapy Centres (DIRAC) of the IAEA for existing TRT units (c) human resources from the recent ESTRO "Health Economics in Radiation Oncology" (HERO) survey and (d) radiotherapy utilization (RTU) rates for each tumour site, published by the Ingham Institute for Applied Medical Research (IIAMR). In 2015, 30,999 of 45,903 cancer patients would have required radiotherapy. By 2020, this will have increased to 34,041 of 50,427 cancer patients. Switzerland presently has an adequate number of TRTs, but a deficit of 57 ROs, 14 MPs and 36 RTTs. By 2020, an additional 7 TRTs, 72 ROs, 22 MPs and 66 RTTs will be required. In addition, a realistic dynamic model for calculation of staff requirements due to anticipated changes in future radiotherapy practices has been proposed. This model could be tailor-made and individualized for any radiotherapy centre. A 9.8 % increase in radiotherapy requirements is expected for cancer patients over the next 5 years. The present study should assist the stakeholders and health planners in designing an appropriate strategy for meeting future radiotherapy needs for Switzerland.
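
    The gap analysis reduces to dividing the projected radiotherapy caseload by per-resource norms. A hedged sketch; the patients-per-resource figures below are illustrative stand-ins, not the actual ESTRO-QUARTS/IAEA values applied in the paper:

```python
import math

def required_staff(patients_rt, per_trt=450, per_ro=250, per_mp=500, per_rtt=120):
    """Resources required for a given annual radiotherapy caseload, under
    assumed patients-per-resource norms (illustrative values only)."""
    norms = {"TRT": per_trt, "RO": per_ro, "MP": per_mp, "RTT": per_rtt}
    return {k: math.ceil(patients_rt / n) for k, n in norms.items()}

print(required_staff(30999))   # 2015 caseload from the abstract
print(required_staff(34041))   # projected 2020 caseload (the quoted 9.8% increase)
```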

  20. Radiotherapy infrastructure and human resources in Switzerland. Present status and projected computations for 2020

    International Nuclear Information System (INIS)

    Datta, Niloy Ranjan; Khan, Shaka; Marder, Dietmar; Zwahlen, Daniel; Bodis, Stephan

    2016-01-01

    The purpose of this study was to evaluate the present status of radiotherapy infrastructure and human resources in Switzerland and compute projections for 2020. The European Society of Therapeutic Radiation Oncology “Quantification of Radiation Therapy Infrastructure and Staffing” guidelines (ESTRO-QUARTS) and those of the International Atomic Energy Agency (IAEA) were applied to estimate the requirements for teleradiotherapy (TRT) units, radiation oncologists (RO), medical physicists (MP) and radiotherapy technologists (RTT). The databases used for computation of the present gap and additional requirements are (a) Global Cancer Incidence, Mortality and Prevalence (GLOBOCAN) for cancer incidence (b) the Directory of Radiotherapy Centres (DIRAC) of the IAEA for existing TRT units (c) human resources from the recent ESTRO “Health Economics in Radiation Oncology” (HERO) survey and (d) radiotherapy utilization (RTU) rates for each tumour site, published by the Ingham Institute for Applied Medical Research (IIAMR). In 2015, 30,999 of 45,903 cancer patients would have required radiotherapy. By 2020, this will have increased to 34,041 of 50,427 cancer patients. Switzerland presently has an adequate number of TRTs, but a deficit of 57 ROs, 14 MPs and 36 RTTs. By 2020, an additional 7 TRTs, 72 ROs, 22 MPs and 66 RTTs will be required. In addition, a realistic dynamic model for calculation of staff requirements due to anticipated changes in future radiotherapy practices has been proposed. This model could be tailor-made and individualized for any radiotherapy centre. A 9.8 % increase in radiotherapy requirements is expected for cancer patients over the next 5 years. The present study should assist the stakeholders and health planners in designing an appropriate strategy for meeting future radiotherapy needs for Switzerland. (orig.)

  1. Computationally derived points of fragility of a human cascade are consistent with current therapeutic strategies.

    Directory of Open Access Journals (Sweden)

    Deyan Luan

    2007-07-01

    Full Text Available The role that mechanistic mathematical modeling and systems biology will play in molecular medicine and clinical development remains uncertain. In this study, mathematical modeling and sensitivity analysis were used to explore the working hypothesis that mechanistic models of human cascades, despite model uncertainty, can be computationally screened for points of fragility, and that these sensitive mechanisms could serve as therapeutic targets. We tested our working hypothesis by screening a model of the well-studied coagulation cascade, developed and validated from literature. The predicted sensitive mechanisms were then compared with the treatment literature. The model, composed of 92 proteins and 148 protein-protein interactions, was validated using 21 published datasets generated from two different quiescent in vitro coagulation models. Simulated platelet activation and thrombin generation profiles in the presence and absence of natural anticoagulants were consistent with measured values, with a mean correlation of 0.87 across all trials. Overall state sensitivity coefficients, which measure the robustness or fragility of a given mechanism, were calculated using a Monte Carlo strategy. In the absence of anticoagulants, fluid and surface phase factor X/activated factor X (fX/FXa activity and thrombin-mediated platelet activation were found to be fragile, while fIX/FIXa and fVIII/FVIIIa activation and activity were robust. Both anti-fX/FXa and direct thrombin inhibitors are important classes of anticoagulants; for example, anti-fX/FXa inhibitors have FDA approval for the prevention of venous thromboembolism following surgical intervention and as an initial treatment for deep venous thrombosis and pulmonary embolism. Both in vitro and in vivo experimental evidence is reviewed supporting the prediction that fIX/FIXa activity is robust. When taken together, these results support our working hypothesis that computationally derived points of
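
    The fragility screen rests on overall state sensitivity coefficients estimated by Monte Carlo. A generic sketch of the idea (the sampling distribution, perturbation size, and toy model below are assumptions for illustration, not the paper's 92-protein cascade):

```python
import numpy as np

def overall_sensitivity(simulate, p_nominal, n_samples=50, delta=0.05, seed=0):
    """Monte Carlo estimate of overall state sensitivity coefficients.

    simulate(p) -> state trajectory; for each randomly sampled parameter set,
    each parameter is nudged by `delta` and the normalized trajectory response
    is averaged. Large coefficients flag fragile mechanisms.
    """
    rng = np.random.default_rng(seed)
    p_nominal = np.asarray(p_nominal, dtype=float)
    coeffs = np.zeros(p_nominal.size)
    for _ in range(n_samples):
        p = p_nominal * rng.lognormal(0.0, 0.25, p_nominal.size)
        base = simulate(p)
        for j in range(p.size):
            p_up = p.copy()
            p_up[j] *= 1.0 + delta
            coeffs[j] += (np.linalg.norm(simulate(p_up) - base)
                          / (delta * np.linalg.norm(base)))
    return coeffs / n_samples

# Toy two-parameter kinetics, used only to exercise the function.
t = np.linspace(0.0, 10.0, 50)
print(overall_sensitivity(lambda p: p[0] * (1.0 - np.exp(-p[1] * t)), [1.0, 0.5]))
```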

  2. COMPUTING

    CERN Multimedia

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning in order to increase availability of more sites such that they are available to participate in CSA07 and are ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned, it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhE-DEx topology. Since mid-February, a transfer volume of about 12 P...

  3. Integrated multimodal human-computer interface and augmented reality for interactive display applications

    Science.gov (United States)

    Vassiliou, Marius S.; Sundareswaran, Venkataraman; Chen, S.; Behringer, Reinhold; Tam, Clement K.; Chan, M.; Bangayan, Phil T.; McGee, Joshua H.

    2000-08-01

    We describe new systems for improved integrated multimodal human-computer interaction and augmented reality for a diverse array of applications, including future advanced cockpits, tactical operations centers, and others. We have developed an integrated display system featuring: speech recognition of multiple concurrent users equipped with both standard air-coupled microphones and novel throat-coupled sensors (developed at Army Research Labs for increased noise immunity); lip reading for improving speech recognition accuracy in noisy environments; three-dimensional spatialized audio for improved display of warnings, alerts, and other information; wireless, coordinated handheld-PC control of a large display; real-time display of data and inferences from wireless integrated networked sensors with on-board signal processing and discrimination; gesture control with disambiguated point-and-speak capability; head- and eye-tracking coupled with speech recognition for 'look-and-speak' interaction; and integrated tetherless augmented reality on a wearable computer. The various interaction modalities (speech recognition, 3D audio, eye-tracking, etc.) are implemented as 'modality servers' in an Internet-based client-server architecture. Each modality server encapsulates and exposes commercial and research software packages, presenting a socket network interface that is abstracted to a high-level interface, minimizing both vendor dependencies and required changes on the client side as the server's technology improves.
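
    The 'modality server' pattern described above can be sketched as a small line-oriented socket service wrapping an underlying engine; the protocol and the stub recognizer are illustrative assumptions, not the actual MMWS interfaces:

```python
import socketserver

class ModalityHandler(socketserver.StreamRequestHandler):
    """Wraps one modality engine behind a simple line-oriented protocol,
    hiding vendor APIs from clients."""
    def handle(self):
        for line in self.rfile:                    # one request per line
            if line.decode().strip() == "RECOGNIZE":
                result = "look and speak"          # stub for the real engine
                self.wfile.write(f"RESULT {result}\n".encode())
            else:
                self.wfile.write(b"ERROR unknown command\n")

if __name__ == "__main__":
    with socketserver.TCPServer(("localhost", 9000), ModalityHandler) as srv:
        srv.serve_forever()
```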

  4. Human performance across decision making, selective attention, and working memory tasks: Experimental data and computer simulations

    Directory of Open Access Journals (Sweden)

    Andrea Stocco

    2018-04-01

    Full Text Available This article describes the data analyzed in the paper “Individual differences in the Simon effect are underpinned by differences in the competitive dynamics in the basal ganglia: An experimental verification and a computational model” (Stocco et al., 2017) [1]. The data includes behavioral results from participants performing three cognitive tasks (Probabilistic Stimulus Selection (Frank et al., 2004) [2], Simon task (Craft and Simon, 1970) [3], and Automated Operation Span (Unsworth et al., 2005) [4]), as well as simulated traces generated by a computational neurocognitive model that accounts for individual variations in human performance across the tasks. The experimental data encompasses individual data files (in both preprocessed and native output format) as well as group-level summary files. The simulation data includes the entire model code, the results of a full-grid search of the model's parameter space, and the code used to partition the model space and parallelize the simulations. Finally, the repository includes the R scripts used to carry out the statistical analyses reported in the original paper.

  5. Human performance across decision making, selective attention, and working memory tasks: Experimental data and computer simulations.

    Science.gov (United States)

    Stocco, Andrea; Yamasaki, Brianna L; Prat, Chantel S

    2018-04-01

    This article describes the data analyzed in the paper "Individual differences in the Simon effect are underpinned by differences in the competitive dynamics in the basal ganglia: An experimental verification and a computational model" (Stocco et al., 2017) [1]. The data includes behavioral results from participants performing three cognitive tasks (Probabilistic Stimulus Selection (Frank et al., 2004) [2], Simon task (Craft and Simon, 1970) [3], and Automated Operation Span (Unsworth et al., 2005) [4]), as well as simulated traces generated by a computational neurocognitive model that accounts for individual variations in human performance across the tasks. The experimental data encompasses individual data files (in both preprocessed and native output format) as well as group-level summary files. The simulation data includes the entire model code, the results of a full-grid search of the model's parameter space, and the code used to partition the model space and parallelize the simulations. Finally, the repository includes the R scripts used to carry out the statistical analyses reported in the original paper.

  6. USING OLFACTORY DISPLAYS AS A NONTRADITIONAL INTERFACE IN HUMAN COMPUTER INTERACTION

    Directory of Open Access Journals (Sweden)

    Alper Efe

    2017-07-01

    Full Text Available Smell has its limitations and disadvantages as a display medium, but it also has its strengths, and many have recognized its potential. At present, in communications and virtual technologies, smell is either forgotten or improperly stimulated, because uncontrolled odorants are present in the physical space surrounding the user. Nonetheless, a controlled presentation of olfactory information can give advantages in various application fields. Therefore, two enabling technologies, electronic noses and especially olfactory displays, are reviewed. Scenarios of usage are discussed together with relevant psycho-physiological issues. End-to-end systems including olfactory interfaces are quantitatively characterised in many respects. Recent work done by the authors in this field is reported. The article touches briefly on the control of scent emissions, an important factor to consider when building scented computer systems. As a sample application, the SUBSMELL system is investigated. A look at areas of human-computer interaction where olfactory output may prove useful is presented. The article finishes with some brief conclusions and discusses some shortcomings and gaps of the topic. In particular, the addition of olfactory cues to a virtual environment increased the user's sense of presence and memory of the environment. This article also discusses the educational aspect of SUBSMELL systems.

  7. Flat panel computed tomography of human ex vivo heart and bone specimens: initial experience

    Energy Technology Data Exchange (ETDEWEB)

    Nikolaou, Konstantin; Becker, Christoph R.; Reiser, Maximilian F. [Ludwig-Maximilians-University, Department of Clinical Radiology, Munich (Germany); Flohr, Thomas; Stierstorfer, Karl [CT Division, Siemens Medical Solutions, Forchheim (Germany)

    2005-02-01

    The aim of this technical investigation was the detailed description of a prototype flat panel detector computed tomography system (FPCT) and its initial evaluation in an ex vivo setting. The prototype FPCT scanner consists of a conventional radiographic flat panel detector, mounted on a multi-slice CT scanner gantry. Explanted human ex vivo heart and foot specimens were examined. Images were reformatted with various reconstruction algorithms and were evaluated for high-resolution anatomic information. For comparison purposes, the ex vivo specimens were also scanned with a conventional 16-detector-row CT scanner (Sensation 16, Siemens Medical Solutions, Forchheim, Germany). With the FPCT prototype used, a 1,024 x 768 resolution matrix can be obtained, resulting in an isotropic voxel size of 0.25 x 0.25 x 0.25 mm at the iso-center. Due to the high spatial resolution, very small structures such as trabecular bone or third-order distal branches of the coronary arteries could be visualized. This first evaluation showed that flat panel detector systems can be used in a cone-beam computed tomography scanner and that very high spatial resolutions can be achieved. However, there are limitations for in vivo use due to constraints in low-contrast resolution and slow scan speed. (orig.)

  8. The Dimensions of the Orbital Cavity Based on High-Resolution Computed Tomography of Human Cadavers

    DEFF Research Database (Denmark)

    Felding, Ulrik Ascanius; Bloch, Sune Land; Buchwald, Christian von

    2016-01-01

    for surface area. To the authors' knowledge, this study is the first to have measured the entire surface area of the orbital cavity. The volume and surface area of the orbital cavity were estimated in computed tomography scans of 11 human cadavers using unbiased stereological sampling techniques. The mean (± SD) total volume and total surface area of the orbital cavities were 24.27 ± 3.88 cm³ and 32.47 ± 2.96 cm², respectively. There was no significant difference in volume (P = 0.315) or surface area (P = 0.566) between the 2 orbital cavities. The stereological technique proved to be a robust and unbiased method... that may be used as a gold standard for comparison with automated computer software. Future imaging studies in blow-out fracture patients may be based on individual and relative calculation involving both herniated volume and fractured surface area in relation to the total volume and surface area...
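
    The unbiased stereological volume estimate referred to above is, in essence, Cavalieri point counting: volume is estimated as the slice spacing times the area each grid point represents times the total number of grid points hitting the structure. A minimal Python sketch under that assumption follows; the counts and constants are invented, not the study's data.

      # Cavalieri point-counting volume estimate: V = t * a(p) * sum(P),
      # i.e. slice spacing times area represented by each grid point
      # times total points hitting the cavity. Numbers are invented.
      def cavalieri_volume(points_per_slice, slice_spacing_cm, area_per_point_cm2):
          return slice_spacing_cm * area_per_point_cm2 * sum(points_per_slice)

      counts = [4, 9, 14, 16, 13, 8, 3]            # grid hits per CT slice
      print(cavalieri_volume(counts, 0.2, 0.5), "cm^3")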

  9. A Novel Wearable Forehead EOG Measurement System for Human Computer Interfaces.

    Science.gov (United States)

    Heo, Jeong; Yoon, Heenam; Park, Kwang Suk

    2017-06-23

    Amyotrophic lateral sclerosis (ALS) patients whose voluntary muscles are paralyzed commonly communicate with the outside world using eye movement. There have been many efforts to support this method of communication by tracking or detecting eye movement. An electrooculogram (EOG), an electro-physiological signal, is generated by eye movements and can be measured with electrodes placed around the eye. In this study, we proposed a new, practical electrode position on the forehead for measuring EOG signals, and we developed a wearable forehead EOG measurement system for use in Human Computer/Machine Interfaces (HCIs/HMIs). Four electrodes, including the ground electrode, were placed on the forehead. The two channels were arranged vertically and horizontally, sharing a positive electrode. Additionally, a real-time eye movement classification algorithm was developed based on the characteristics of the forehead EOG. Three applications were employed to evaluate the proposed system: two virtual keyboards (a modified Bremen BCI speller and an automatic sequential row-column scanner) and a drivable power wheelchair. The mean typing speeds of the modified Bremen brain-computer interface (BCI) speller and the automatic row-column scanner were 10.81 and 7.74 letters per minute, and the mean classification accuracies were 91.25% and 95.12%, respectively. In the power wheelchair demonstration, the user drove the wheelchair through a figure-eight course without colliding with obstacles.
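
    The paper's real-time classification algorithm is not reproduced in the abstract; the following minimal Python sketch only illustrates the general idea of threshold-based eye-movement labeling from baseline-corrected horizontal and vertical EOG amplitudes. The threshold value and sign conventions are assumptions, not the study's calibrated values.

      # Threshold-based eye-movement labeling from baseline-corrected
      # horizontal (h) and vertical (v) EOG amplitudes in microvolts.
      # The 100 uV threshold and sign conventions are assumptions.
      def classify_eog(h, v, thresh=100.0):
          if abs(h) < thresh and abs(v) < thresh:
              return "fixation"
          if abs(h) >= abs(v):
              return "right" if h > 0 else "left"
          return "up" if v > 0 else "down"

      for h, v in [(5.0, 8.0), (180.0, 30.0), (-20.0, -250.0)]:
          print(classify_eog(h, v))   # fixation, right, down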

  10. HumanComputer Systems Interaction Backgrounds and Applications 2 Part 2

    CERN Document Server

    Kulikowski, Juliusz; Mroczek, Teresa

    2012-01-01

    This volume of the book contains a collection of chapters selected from the papers which originally (in shortened form) were presented at the 3rd International Conference on Human-Systems Interaction held in Rzeszow, Poland, in 2010. The chapters are divided into four sections: IV. Environment monitoring and robotic systems, V. Diagnostic systems, VI. Educational systems, and VII. General problems. The novel concepts and realizations of humanoid robots, talking robots, and orthopedic surgical robots, as well as those of a direct brain-computer interface, are examples of particularly interesting topics presented in Sec. IV. In Sec. V, the problems of skin cancer recognition, colonoscopy diagnosis, and brain stroke diagnosis, as well as the more general problem of ontology design for medical diagnostic knowledge, are presented. An example of an industrial diagnostic system and a concept of a new algorithm for edge detection in computer-analyzed images are also presented in this section. Among the edu...

  11. Direct Monte Carlo dose calculation using polygon-surface computational human model

    International Nuclear Information System (INIS)

    Jeong, Jong Hwi; Kim, Chan Hyeong; Yeom, Yeon Su; Cho, Sungkoo; Chung, Min Suk; Cho, Kun-Woo

    2011-01-01

    In the present study, a voxel-type computational human model was converted to a polygon-surface model, after which it was imported directly into the Geant4 code without using a voxelization process, that is, without converting back to a voxel model. The original voxel model was also imported into the Geant4 code, in order to compare the calculated dose values and the computational speed. The average polygon size of the polygon-surface model was ∼0.5 cm², whereas the voxel resolution of the voxel model was 1.981 × 1.981 × 2.0854 mm³. The results showed a good agreement between the calculated dose values of the two models. The polygon-surface model was, however, slower than the voxel model by a factor of 6–9 for the photon energies and irradiation geometries considered in the present study, which nonetheless is considered acceptable, given that direct use of the polygon-surface model does not require a separate voxelization process. (author)
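
    The study's voxel-to-polygon conversion pipeline and its Geant4 import are not shown in the abstract. As a rough illustration of the general technique, the following Python sketch extracts a triangle surface from a toy voxel phantom with marching cubes and reports the mean polygon size; all details are assumptions for illustration, not the authors' method.

      # Extract a triangle surface from a toy voxel phantom with marching
      # cubes and report the mean polygon size. Illustration only; the
      # study's conversion pipeline and Geant4 import are not shown.
      import numpy as np
      from skimage import measure

      z, y, x = np.mgrid[:40, :40, :40]             # toy voxel model: a sphere
      voxels = ((x - 20)**2 + (y - 20)**2 + (z - 20)**2 < 15**2).astype(float)

      verts, faces, normals, values = measure.marching_cubes(voxels, level=0.5)
      a, b, c = verts[faces[:, 0]], verts[faces[:, 1]], verts[faces[:, 2]]
      areas = 0.5 * np.linalg.norm(np.cross(b - a, c - a), axis=1)
      print(len(faces), "triangles, mean area", areas.mean(), "voxel^2")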

  12. Neural and cortisol responses during play with human and computer partners in children with autism

    Science.gov (United States)

    Edmiston, Elliot Kale; Merkle, Kristen

    2015-01-01

    Children with autism spectrum disorder (ASD) exhibit impairment in reciprocal social interactions, including play, which can manifest as a failure to show social preference or discrimination between social and nonsocial stimuli. To explore the mechanisms underlying these deficits, we collected salivary cortisol from 42 children aged 8–12 years with ASD or typical development (TD) during a playground interaction with a confederate child. Participants underwent functional MRI during a prisoner's dilemma game requiring cooperation or defection with a human (confederate) or computer partner. Region-of-interest analyses were based on previous research (e.g. insula, amygdala, and temporal parietal junction, TPJ). There were significant group differences in neural activation based on partner and response pattern. When playing with a human partner, children with ASD showed limited engagement of a social salience brain circuit during defection. Reduced insula activation during defection in children with ASD relative to TD children, regardless of partner type, was also a prominent finding. Insula and TPJ BOLD signal during defection was also associated with stress responsivity and behavior in the ASD group under playground conditions. Children with ASD engage social salience networks less than TD children during conditions of social salience, supporting a fundamental disturbance of social engagement. PMID:25552572

  13. X-ray micro computed tomography for the visualization of an atherosclerotic human coronary artery

    Science.gov (United States)

    Matviykiv, Sofiya; Buscema, Marzia; Deyhle, Hans; Pfohl, Thomas; Zumbuehl, Andreas; Saxer, Till; Müller, Bert

    2017-06-01

    Atherosclerosis refers to the narrowing or blocking of blood vessels, which can lead to a heart attack, chest pain, or stroke. Constricted segments of diseased arteries exhibit considerably increased wall shear stress compared to healthy ones. One possibility for improving patient treatment is the application of nano-therapeutic approaches based on shear-stress-sensitive nano-containers. In order to tailor the chemical composition and the resulting physical properties of such liposomes, one has to know precisely the morphology of critically stenosed arteries at micrometre resolution. This morphology is often obtained by means of histology, which has the drawback of offering only two-dimensional information. Additionally, histology requires the artery to be decalcified before sectioning, which might lead to deformations within the tissue. Micro computed tomography (μCT) enables the three-dimensional (3D) visualization of soft and hard tissues at the micrometre level. μCT allows lumen segmentation, which is crucial for subsequent flow-simulation analysis. In this communication, tomographic images of a human coronary artery before and after decalcification are qualitatively and quantitatively compared. We analyse the cross section of the diseased human coronary artery before and after decalcification and calculate the lumen area of both samples.
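
    Once the lumen is segmented, its cross-sectional area follows from a simple pixel count scaled by the pixel area. A minimal Python sketch with an invented mask and an assumed pixel size, not the paper's data:

      # Lumen cross-sectional area from one segmented slice: pixel count
      # times pixel area. Mask and pixel size are invented placeholders.
      import numpy as np

      pixel_size_um = 6.5                                # assumed isotropic pixel
      y, x = np.mgrid[:512, :512]
      lumen_mask = (x - 260)**2 + (y - 250)**2 < 80**2   # toy segmentation

      area_mm2 = lumen_mask.sum() * pixel_size_um**2 / 1e6
      print("lumen area:", round(area_mm2, 3), "mm^2")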

  14. Computational Fluid Dynamics Ventilation Study for the Human Powered Centrifuge at the International Space Station

    Science.gov (United States)

    Son, Chang H.

    2012-01-01

    The Human Powered Centrifuge (HPC) is a facility planned to be installed on board the International Space Station (ISS) to enable crew exercises under artificial gravity conditions. The HPC equipment includes a "bicycle" for long-term exercises of a crewmember that provides power for rotation of the HPC at a speed of 30 rpm. A crewmember exercising vigorously on the centrifuge generates about twice as much carbon dioxide as a crewmember under ordinary conditions. The goal of the study is to analyze the airflow and carbon dioxide distribution within the Pressurized Multipurpose Module (PMM) cabin when the HPC is operating. A fully unsteady formulation is used for CFD-based modeling of airflow and CO2 transport with the so-called sliding-mesh concept: the HPC equipment, together with the adjacent Bay 4 cabin volume, is considered in a rotating reference frame, while the rest of the cabin volume is considered in a stationary reference frame. The rotating part of the computational domain also includes a human body model. Localized effects of carbon dioxide dispersion are examined. The strong influence of the rotating HPC equipment on the detected CO2 distribution is discussed.
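
    The full 3D unsteady sliding-mesh simulation cannot be condensed into a few lines, but the CO2-transport ingredient can be illustrated with a drastically simplified 1D advection-diffusion model. Everything below (geometry, speeds, source strength) is an illustrative assumption, not the study's setup.

      # Drastically simplified 1D advection-diffusion surrogate for CO2
      # transport in a ventilated cabin; all parameters are illustrative.
      import numpy as np

      n, dx, dt = 100, 0.1, 0.01     # cells, cell size (m), time step (s)
      u, D = 0.2, 1e-3               # airflow speed (m/s), diffusivity (m^2/s)
      c = np.zeros(n)                # CO2 concentration field
      src = 50                       # cell of the exercising crewmember

      for _ in range(2000):          # 20 s of simulated time
          c[src] += 0.01 * dt        # constant CO2 release
          adv = -u * (c - np.roll(c, 1)) / dx                   # upwind advection
          dif = D * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2
          c += dt * (adv + dif)

      print("peak CO2 at cell", int(c.argmax()))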

  15. Computational dissection of human episodic memory reveals mental process-specific genetic profiles

    Science.gov (United States)

    Luksys, Gediminas; Fastenrath, Matthias; Coynel, David; Freytag, Virginie; Gschwind, Leo; Heck, Angela; Jessen, Frank; Maier, Wolfgang; Milnik, Annette; Riedel-Heller, Steffi G.; Scherer, Martin; Spalek, Klara; Vogler, Christian; Wagner, Michael; Wolfsgruber, Steffen; Papassotiropoulos, Andreas; de Quervain, Dominique J.-F.

    2015-01-01

    Episodic memory performance is the result of distinct mental processes, such as learning, memory maintenance, and emotional modulation of memory strength. Such processes can be effectively dissociated using computational models. Here we performed gene set enrichment analyses of model parameters estimated from the episodic memory performance of 1,765 healthy young adults. We report robust and replicated associations of the amine compound SLC (solute-carrier) transporters gene set with the learning rate, of the collagen formation and transmembrane receptor protein tyrosine kinase activity gene sets with the modulation of memory strength by negative emotional arousal, and of the L1 cell adhesion molecule (L1CAM) interactions gene set with the repetition-based memory improvement. Furthermore, in a large functional MRI sample of 795 subjects we found that the association between L1CAM interactions and memory maintenance revealed large clusters of differences in brain activity in frontal cortical areas. Our findings provide converging evidence that distinct genetic profiles underlie specific mental processes of human episodic memory. They also provide empirical support to previous theoretical and neurobiological studies linking specific neuromodulators to the learning rate and linking neural cell adhesion molecules to memory maintenance. Furthermore, our study suggests additional memory-related genetic pathways, which may contribute to a better understanding of the neurobiology of human memory. PMID:26261317

  16. Computational Prediction of Human Salivary Proteins from Blood Circulation and Application to Diagnostic Biomarker Identification

    Science.gov (United States)

    Wang, Jiaxin; Liang, Yanchun; Wang, Yan; Cui, Juan; Liu, Ming; Du, Wei; Xu, Ying

    2013-01-01

    Proteins can move from blood circulation into the salivary glands through active transport, passive diffusion, or ultrafiltration, and some are then released into saliva; if accurately identified, these proteins can potentially serve as biomarkers for diseases. We present a novel computational method for predicting salivary proteins that come from circulation. The basis for the prediction is a set of physicochemical and sequence features that we found to discriminate between human proteins known to be movable from circulation to saliva and proteins deemed not to be in saliva. A classifier was trained on these features using a support vector machine to predict protein secretion into saliva. The classifier achieved 88.56% average recall and 90.76% average precision in 10-fold cross-validation on the training data, indicating that the selected features are informative. Considering the possibility that our negative training data (i.e., proteins deemed not to be in saliva) may not be highly reliable, we also trained a ranking method based on the same features, aiming to rank the known salivary proteins from circulation highest among the proteins in the general background. This prediction capability can be used to identify potential biomarker proteins for specific human diseases when coupled with information about differentially expressed proteins in diseased versus healthy control tissues and a prediction capability for blood-secretory proteins. Using such integrated information, we predicted 31 candidate biomarker proteins in saliva for breast cancer. PMID:24324552
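
    As a rough illustration of the classification setup described above (a feature-based SVM evaluated with 10-fold cross-validated recall and precision), here is a minimal Python sketch using scikit-learn on synthetic stand-in data; the real features are physicochemical and sequence-derived, and the numbers below do not reproduce the paper's results.

      # Feature-based SVM with 10-fold cross-validated recall and
      # precision, on synthetic stand-in features.
      import numpy as np
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_validate

      rng = np.random.default_rng(0)
      X = rng.normal(size=(400, 20))                  # 20 stand-in features
      y = (X[:, :3].sum(axis=1) > 0).astype(int)      # 1 = salivary, 0 = not

      cv = cross_validate(SVC(kernel="rbf", C=1.0), X, y, cv=10,
                          scoring=("recall", "precision"))
      print("recall", cv["test_recall"].mean(),
            "precision", cv["test_precision"].mean())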

  17. Neural mechanisms of transient neocortical beta rhythms: Converging evidence from humans, computational modeling, monkeys, and mice

    Science.gov (United States)

    Sherman, Maxwell A.; Lee, Shane; Law, Robert; Haegens, Saskia; Thorn, Catherine A.; Hämäläinen, Matti S.; Moore, Christopher I.; Jones, Stephanie R.

    2016-01-01

    Human neocortical 15–29-Hz beta oscillations are strong predictors of perceptual and motor performance. However, the mechanistic origin of beta in vivo is unknown, hindering understanding of its functional role. Combining human magnetoencephalography (MEG), computational modeling, and laminar recordings in animals, we present a new theory that accounts for the origin of spontaneous neocortical beta. In our MEG data, spontaneous beta activity from somatosensory and frontal cortex emerged as noncontinuous beta events typically lasting <150 ms. Modeling showed that such events could emerge from the integration of nearly synchronous bursts of excitatory synaptic drive targeting proximal and distal dendrites of pyramidal neurons, where the defining feature of a beta event was a strong distal drive that lasted one beta period (∼50 ms). This beta mechanism rigorously accounted for the beta event profiles; several other mechanisms did not. The spatial location of synaptic drive in the model to supragranular and infragranular layers was critical to the emergence of beta events and led to the prediction that beta events should be associated with a specific laminar current profile. Laminar recordings in somatosensory neocortex from anesthetized mice and awake monkeys supported these predictions, suggesting this beta mechanism is conserved across species and recording modalities. These findings make several predictions about optimal states for perceptual and motor performance and guide causal interventions to modulate beta for optimal function. PMID:27469163
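
    Transient beta events of the kind described above are commonly detected by band-pass filtering, computing a power envelope, and thresholding. The following minimal Python sketch follows that common recipe on a synthetic signal; the filter order, threshold, and other parameters are generic assumptions, not the paper's exact pipeline.

      # Generic beta-event detection: band-pass 15-29 Hz, Hilbert power
      # envelope, threshold at 3x median, report event durations.
      import numpy as np
      from scipy.signal import butter, filtfilt, hilbert

      fs = 600.0
      t = np.arange(0, 10, 1 / fs)
      sig = np.random.default_rng(0).normal(size=t.size)
      sig[3000:3090] += 3 * np.sin(2 * np.pi * 21 * t[3000:3090])  # ~150 ms burst

      b, a = butter(4, [15 / (fs / 2), 29 / (fs / 2)], btype="band")
      env = np.abs(hilbert(filtfilt(b, a, sig))) ** 2
      d = np.diff((env > 3 * np.median(env)).astype(int))
      rises, falls = np.flatnonzero(d == 1), np.flatnonzero(d == -1)
      for start, stop in zip(rises, falls):
          print("event:", round((stop - start) / fs * 1000), "ms")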

  18. Computed aided system for separation and classification of the abnormal erythrocytes in human blood

    Science.gov (United States)

    Wąsowicz, Michał; Grochowski, Michał; Kulka, Marek; Mikołajczyk, Agnieszka; Ficek, Mateusz; Karpieńko, Katarzyna; Cićkiewicz, Maciej

    2017-12-01

    The human peripheral blood consists of cells (red cells, white cells, and platelets) suspended in plasma. In the following research, the team assessed the influence of nanodiamond particles on blood elements over various periods of time. The material used in the study consisted of samples taken from ten healthy humans of various ages, different blood types, and both sexes. The measurements were carried out by adding unmodified and oxidation-modified nanodiamonds to the blood. The blood was exposed to two diamond doses: 20 μl and 100 μl. The number of abnormal cells increased with time. The percentage of echinocytes resulting from interaction with nanodiamonds at the various time intervals was small for the individual specimens. The impact of the two diamond types on red blood cells had no clinical importance. It is supposed that dehydration of the red cells takes place as a result of long-lasting exposure, owing to the function of these cells. The analysis of the influence of nanodiamond particles on blood elements was supported by a computer system designed for automatic counting and classification of red blood cells (RBCs). The system utilizes advanced image-processing methods for RBC separation and counting, and the Eigenfaces method coupled with neural networks for classifying RBCs into normal and abnormal cells.
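
    As a rough illustration of the Eigenfaces-plus-neural-network classification stage, the following Python sketch projects (synthetic) cell image patches onto principal components and trains a small network to separate normal from abnormal cells; all data, dimensions, and hyperparameters are invented stand-ins, not the authors' system.

      # Eigenfaces-style PCA features feeding a small neural network to
      # label cell patches as normal vs abnormal. Data are invented.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.neural_network import MLPClassifier
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      X = rng.normal(size=(600, 32 * 32))     # flattened 32x32 cell patches
      y = rng.integers(0, 2, size=600)        # 0 = normal, 1 = echinocyte
      X[y == 1, :50] += 0.8                   # planted class difference

      Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
      pca = PCA(n_components=30).fit(Xtr)     # "eigen-cells"
      clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=500,
                          random_state=0).fit(pca.transform(Xtr), ytr)
      print("test accuracy:", clf.score(pca.transform(Xte), yte))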

  19. Computational dissection of human episodic memory reveals mental process-specific genetic profiles.

    Science.gov (United States)

    Luksys, Gediminas; Fastenrath, Matthias; Coynel, David; Freytag, Virginie; Gschwind, Leo; Heck, Angela; Jessen, Frank; Maier, Wolfgang; Milnik, Annette; Riedel-Heller, Steffi G; Scherer, Martin; Spalek, Klara; Vogler, Christian; Wagner, Michael; Wolfsgruber, Steffen; Papassotiropoulos, Andreas; de Quervain, Dominique J-F

    2015-09-01

    Episodic memory performance is the result of distinct mental processes, such as learning, memory maintenance, and emotional modulation of memory strength. Such processes can be effectively dissociated using computational models. Here we performed gene set enrichment analyses of model parameters estimated from the episodic memory performance of 1,765 healthy young adults. We report robust and replicated associations of the amine compound SLC (solute-carrier) transporters gene set with the learning rate, of the collagen formation and transmembrane receptor protein tyrosine kinase activity gene sets with the modulation of memory strength by negative emotional arousal, and of the L1 cell adhesion molecule (L1CAM) interactions gene set with the repetition-based memory improvement. Furthermore, in a large functional MRI sample of 795 subjects we found that the association between L1CAM interactions and memory maintenance revealed large clusters of differences in brain activity in frontal cortical areas. Our findings provide converging evidence that distinct genetic profiles underlie specific mental processes of human episodic memory. They also provide empirical support to previous theoretical and neurobiological studies linking specific neuromodulators to the learning rate and linking neural cell adhesion molecules to memory maintenance. Furthermore, our study suggests additional memory-related genetic pathways, which may contribute to a better understanding of the neurobiology of human memory.
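
    The gene set enrichment analyses above used dedicated tooling on real genotype data; the underlying logic can nevertheless be illustrated with a simple permutation test on per-gene association scores. A minimal Python sketch with simulated scores and a planted signal, purely for illustration:

      # Permutation test for enrichment of a gene set in per-gene
      # association scores. Scores and set membership are simulated.
      import numpy as np

      rng = np.random.default_rng(0)
      scores = rng.normal(size=20000)                 # per-gene scores
      gene_set = rng.choice(20000, 150, replace=False)
      scores[gene_set] += 0.3                         # planted enrichment

      observed = scores[gene_set].mean()
      null = np.array([scores[rng.choice(20000, 150, replace=False)].mean()
                       for _ in range(10000)])
      p = (1 + (null >= observed).sum()) / (1 + null.size)
      print("enrichment p =", p)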

  20. Comparison between a Computational Seated Human Model and Experimental Verification Data

    Directory of Open Access Journals (Sweden)

    Christian G. Olesen

    2014-01-01

    Full Text Available Sitting-acquired deep tissue injuries (SADTI) are the most serious type of pressure ulcers. In order to investigate the aetiology of SADTI, a new approach is under development: a musculo-skeletal model which can predict forces between the chair and the human body at different seated postures. This study focuses on comparing results from a model developed in the AnyBody Modeling System with data collected from an experimental setup. A chair with force-measuring equipment was developed, an experiment was conducted with three subjects, and the experimental results were compared with the predictions of the computational model. The results show that the model predicted the reaction forces for the different chair postures well. The correlation coefficients between experiment and model for the seat angle, backrest angle, and footrest height were 0.93, 0.96, and 0.95, respectively. The study shows good agreement between experimental data and model predictions of the forces between a human body and a chair. The model can in the future be used in designing wheelchairs or automotive seats.
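
    The comparison metric reported above is a correlation coefficient between measured and model-predicted forces. A minimal Python sketch of that computation follows; the force values are invented placeholders, not the study's measurements.

      # Pearson correlation between measured and model-predicted chair
      # reaction forces. The force values are invented placeholders.
      from scipy.stats import pearsonr

      measured  = [312.0, 298.5, 276.2, 401.8, 388.0, 365.4]   # N, experiment
      predicted = [305.1, 301.0, 270.9, 410.2, 379.5, 360.8]   # N, model

      r, p = pearsonr(measured, predicted)
      print(f"r = {r:.2f}, p = {p:.3g}")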