WorldWideScience

Sample records for making computers laugh

  1. A Review of Humor for Computer Games: Play, Laugh and More

    Dormann, Claire; Biddle, Robert

    2009-01-01

    Computer games are now becoming ways to communicate, teach, and influence attitudes and behavior. In this article, we address the role of humor in computer games, especially in support of serious purposes. We begin with a review of the main theories of humor, including superiority, incongruity, and relief. These theories and their…

  2. Make Computer Learning Stick.

    Casella, Vicki

    1985-01-01

    Teachers are using computer programs in conjunction with many classroom staples such as art supplies, math manipulatives, and science reference books. Twelve software programs and related activities are described which teach visual and auditory memory and spatial relations, as well as subject areas such as anatomy and geography. (MT)

  3. Computer Graphics and Administrative Decision-Making.

    Yost, Michael

    1984-01-01

    Reduction in prices now makes it possible for almost any institution to use computer graphics for administrative decision making and research. Current and potential uses of computer graphics in these two areas are discussed. (JN)

  4. Laughing rats are optimistic.

    Rafal Rygula

    Emotions can bias human decisions: for example, depressed or anxious people tend to make pessimistic judgements, while those in positive affective states are often more optimistic. Several studies have reported that affect-contingent judgement biases can also be produced in animals. The animals, however, cannot self-report; therefore, the valence of their emotions could, to date, only be assumed. Here we present the results of an experiment in which an affect-contingent judgement bias was produced by objectively measured positive emotions. We trained rats in operant Skinner boxes to press one lever in response to one tone to receive a food reward, and to press another lever in response to a different tone to avoid punishment by electric foot shock. After attaining a stable level of discrimination performance, the animals were subjected to either handling or playful, experimenter-administered manual stimulation - tickling. This procedure has been confirmed to induce a positive affective state in rats, and the 50-kHz ultrasonic vocalisations (rat laughter) emitted by animals in response to tickling have been postulated to index positive emotions akin to human joy. During the tickling and handling sessions, the numbers of emitted high-frequency 50-kHz calls were scored. Immediately after tickling or handling, the animals were tested for their responses to a tone of intermediate frequency, and the pattern of their responses to this ambiguous cue was taken as an indicator of the animals' optimism. Our findings indicate that tickling-induced positive emotions, which are directly indexed in rats by laughter, can make animals more optimistic. We demonstrate for the first time a link between a directly measured positive affective state and decision making under uncertainty in an animal model. We also introduce an innovative tandem approach for studying the emotional-cognitive interplay in animals, which may be of great value for understanding the emotional…
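    The judgement-bias readout described in this abstract can be sketched numerically: responses to the ambiguous intermediate tone are classified as "positive" (reward-lever press) or "negative" (avoidance-lever press), and the fraction of positive responses serves as an optimism index. A minimal illustration, with invented response codes and data (not the study's actual coding):

```python
# Hypothetical sketch of the judgement-bias ("optimism") index used in
# ambiguous-cue tasks: rats press a "reward" lever or an "avoid" lever in
# response to an intermediate tone; more reward-lever presses = more optimistic.

def optimism_index(responses):
    """Fraction of ambiguous-cue trials answered with the reward lever.

    responses: list of strings, each 'reward' or 'avoid' (invented coding).
    """
    if not responses:
        raise ValueError("no trials recorded")
    positive = sum(1 for r in responses if r == "reward")
    return positive / len(responses)

# Invented example data: a tickled rat vs. a merely handled rat.
tickled = ["reward", "reward", "avoid", "reward", "reward", "avoid"]
handled = ["reward", "avoid", "avoid", "avoid", "reward", "avoid"]

print(optimism_index(tickled))  # 4/6 ≈ 0.667
print(optimism_index(handled))  # 2/6 ≈ 0.333
```

    A higher index after tickling than after handling would correspond to the optimistic bias the study reports.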

  5. Parathion alters incubation behavior of laughing gulls

    White, D.H.; Mitchell, C.A.; Hill, E.F.

    1983-01-01

    One member of each pair of incubating laughing gulls at 9 nests was trapped, orally dosed with either 6 mg/kg parathion in corn oil or corn oil alone, and marked about the neck with red dye. Each nest was marked with a numbered stake and the treatment was recorded. A pilot study with captive laughing gulls had determined the proper dosage of parathion that would significantly inhibit their brain AChE activity (to about 50% of normal) without overt signs of poisoning. After dosing, birds were released and the nests were observed for 2 1/2 days from a blind on the nesting island. The activities of the birds at each marked nest were recorded at 10-minute intervals. Results indicated that on the day of treatment there was no difference (P > 0.05, chi-square test) in the proportion of time spent on the nest between treated and control birds. However, birds dosed with 6 mg/kg parathion spent significantly less time incubating on days 2 and 3 than did birds receiving only corn oil. By noon on the third day, sharing of nest duties between pair members in the treated group had approached normal, indicating recovery from parathion intoxication. These findings suggest that sublethal exposure of nesting birds to an organophosphate (OP) insecticide, such as parathion, may result in decreased nest attentiveness, thereby making the clutch more susceptible to predation or egg failure. Behavioral changes caused by sublethal OP exposure could be especially detrimental in avian species where only one pair member incubates, or where both members of a pair sharing nest duties are exposed.
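    The chi-square comparison of nest-attendance proportions mentioned in the abstract can be illustrated with a 2x2 contingency test. The counts below are invented for illustration; the statistic is the standard Pearson chi-square, compared against the 3.841 critical value at 1 degree of freedom (α = 0.05):

```python
# Pearson chi-square test on a hypothetical 2x2 table of observation intervals:
# rows = treated / control birds, columns = on nest / off nest.
# Counts are invented for illustration only.

def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 contingency table."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row_totals = [a + b, c + d]
    col_totals = [a + c, b + d]
    stat = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical day-of-treatment data: treated and control birds on the nest
# for a similar share of 10-minute observation intervals.
stat = chi_square_2x2([(70, 30), (74, 26)])
print(round(stat, 3))  # 0.397: well below the 3.841 critical value (df = 1), so P > 0.05
```

    A statistic below the critical value, as here, corresponds to the "no difference on day 1" finding; the significant day-2 and day-3 differences would yield statistics above it.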

  6. Having The Last Laugh

    Aggerholm, Kenneth; Ronglan, Lars Tore

    2012-01-01

    This paper provides an existential analysis of humour as a social virtue in invasion games at the elite sport level. The main argument is that humour in this particular context can be valuable both in the competitive social training environment and in game performance. This is investigated through philosophical and psychological conceptualisations of humour that are used to reveal and analyse the appearance and possible value of a humorous approach in various social situations experienced during invasion games and the associated training situations. It is concluded that humour can help balance and structure the social training environment as well as facilitate creative game performance. On this basis it is suggested that the existential perspectives on humour presented could make a fruitful contribution to talent development in the domain of invasion games.

  7. Computer-aided decision making.

    Keith M. Reynolds; Daniel L. Schmoldt

    2006-01-01

    Several major classes of software technologies have been used in decision-making for forest management applications over the past few decades. These computer-based technologies include mathematical programming, expert systems, network models, multi-criteria decision-making, and integrated systems. Each technology possesses unique advantages and disadvantages, and has…

  8. Making IBM's Computer, Watson, Human

    Rachlin, Howard

    2012-01-01

    This essay uses the recent victory of an IBM computer (Watson) in the TV game, "Jeopardy," to speculate on the abilities Watson would need, in addition to those it has, to be human. The essay's basic premise is that to be human is to behave as humans behave and to function in society as humans function. Alternatives to this premise are considered…

  9. Making computers work for people

    Fricke, D.C.

    1994-01-01

    In May of 1992, NSP's Prairie Island Nuclear Plant was operating custom-built software on five different site computers to support Work Management, Records Management, Emergency Response, Word Processing, Radiation Protection, and User Applications. A nine-month study was conducted to evaluate the efficiency of this arrangement and to determine whether improvements were warranted. The study included interviews with users and management for analysis of business needs. A consultant was engaged to assist the study team in defining what the future computer-assisted work environment should be. The vision of future needs resulted in the following strategic objectives: reduce the number of platforms; standardize programmer skills; allow purchase of standard applications; upgrade to state-of-the-art technology; and reduce hardware costs by $750,000 per year. NSP's Prairie Island and Monticello Nuclear Plants have successfully implemented an integrated work management system at both sites. The transition period was approximately one year from the decision to full implementation. This was accomplished through the commitment of ALL site employees and outstanding support from Corporate and Vendor organizations.

  10. Computational Complexity and Human Decision-Making.

    Bossaerts, Peter; Murawski, Carsten

    2017-12-01

    The rationality principle postulates that decision-makers always choose the best action available to them. It underlies most modern theories of decision-making. The principle does not take into account the difficulty of finding the best option. Here, we propose that computational complexity theory (CCT) provides a framework for defining and quantifying the difficulty of decisions. We review evidence showing that human decision-making is affected by computational complexity. Building on this evidence, we argue that most models of decision-making, and metacognition, are intractable from a computational perspective. To be plausible, future theories of decision-making will need to take into account both the resources required for implementing the computations implied by the theory, and the resource constraints imposed on the decision-maker by biology. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. LAUGHING AT OURSELVES: REFLECTING MALAYSIAN ETHNIC DISPARITIES

    SWAGATA SINHA ROY

    2014-05-01

    Malaysia’s various ethnic groups make an interesting study, both sociologically and culturally. With such a heady mix of cultural elements to explore, it is natural that the many groups stumble upon ‘rare gems’ that reflect their ‘Malaysianness’. Have Malaysians ever really appreciated the many and varied aspects of culture into which they are seemingly suddenly thrown? Do we embrace these happily, or are we constantly rejecting them? Fortunately, through the medium of film, we are from time to time allowed to reflect on our obvious similarities and even more apparent disparities. In this paper, we explore the culture and perceptions of people from the major ethnic groups that form the human base of this country. When did we last laugh at ourselves … heartily? Nasi Lemak 2.0 provides an interesting, if not disturbing, insight into the workings of the Malaysian ‘mind’. Nasi Lemak 2.0 was released on 8 September 2011 and made an impact on a whole generation of Malaysians. The characters have been well chosen and do a wonderful job of representing the various communities of this nation. Ethnocentrism is a reality and often rears its head, ‘ugly’ or otherwise, in several situations. Are we able to grapple with the levels of ethnocentrism that we encounter? These are some of the issues that will trigger much debate and discussion among ourselves and perhaps also reflect our cores.

  12. Laugh and Interview with Antonia Baehr

    Alicia G. Hierro

    2013-12-01

    On the occasion of the Secció Irregular programme held at the Mercat de les Flors in Barcelona on 16 January, the German artist Antonia Baehr presented the result of an extensive investigation into laughter that has occupied the last seven years of her career. Rire, Laugh, Lachen (the original title, in three languages) was presented on Madrid's performance circuit in 2010 as part of Escena Contemporánea. On this occasion, in 2013, she repeats the same performance, acclaimed on the international scene.

  13. Wearable computing: Will it make people prosocial?

    Nasiopoulos, Eleni; Risko, Evan F; Foulsham, Tom; Kingstone, Alan

    2015-05-01

    We recently reported that people who wear an eye tracker modify their natural looking behaviour in a prosocial manner. This change in looking behaviour represents a potential concern for researchers who wish to use eye trackers to understand the functioning of human attention. On the other hand, it may offer a real boon to manufacturers and consumers of wearable computing (e.g., Google Glass), for if wearable computing causes people to behave in a prosocial manner, then the public's fear that people with wearable computing will invade their privacy is unfounded. Critically, both of these divergent implications are grounded on the assumption that the prosocial behavioural effect of wearing an eye tracker is sustained for a prolonged period of time. Our study reveals that on the very first wearing of an eye tracker, and in less than 10 min, the prosocial effect of an eye tracker is abolished, but by drawing attention back to the eye tracker, the implied presence effect is easily reactivated. This suggests that eye trackers induce a transient social presence effect, which is rendered dormant when attention is shifted away from the source of implied presence. This is good news for researchers who use eye trackers to measure attention and behaviour; and could be bad news for advocates of wearable computing in everyday life. © 2014 The British Psychological Society.

  14. Making Informed Decisions: Management Issues Influencing Computers in the Classroom.

    Strickland, James

    A number of noninstructional factors appear to determine the extent to which computers make a difference in writing instruction. Once computers have been purchased and installed, it is generally school administrators who make management decisions, often from an uninformed pedagogical orientation. Issues such as what hardware and software to buy,…

  15. Computers make rig life extension an option

    NONE

    1996-10-01

    The worldwide semisubmersible drilling rig fleet is approaching retirement. But replacement is not an attractive option, even though dayrates are reaching record highs. In 1991, Schlumberger Sedco Forex managers decided that an alternative might exist if regulators and insurers could be convinced to extend rig life expectancy through restoration. Sedco Forex chose their No. 704 semisubmersible, an 18-year North Sea veteran, to test their process. The first step was to determine what required restoration, meaning fatigue-life analysis of each weld on the huge vessel. Done by physical inspection, the task would have been unacceptably time-consuming and of questionable accuracy. Instead, a suite of computer programs modeled the stress seen by each weld, statistically estimated the sea states encountered by the rig throughout its North Sea service, and calibrated a beam-element model on which to run the simulations. The elastic stiffness of the structure and detailed stress analysis of each weld were performed with ANSYS, a commercially available finite-element analysis program. The use of computer codes to evaluate service-life extension is described.

  16. Perceptual Computing Aiding People in Making Subjective Judgments

    Mendel, Jerry

    2010-01-01

    Explains for the first time how "computing with words" can aid in making subjective judgments. Lotfi Zadeh, the father of fuzzy logic, coined the phrase "computing with words" (CWW) to describe a methodology in which the objects of computation are words and propositions drawn from a natural language. Perceptual Computing explains how to implement CWW to aid in the important area of making subjective judgments, using a methodology that leads to an interactive device—a "Perceptual Computer"—that propagates random and linguistic uncertainties into the subjective judg…
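    A core building block of CWW is modelling each word as a fuzzy set over a numeric judgement scale, so that a numeric opinion can be mapped back to the word it fits best. The triangular membership functions below are invented placeholders, not the codebook from Mendel's book:

```python
# Toy computing-with-words sketch: words modelled as triangular fuzzy sets
# on a 0-10 judgement scale. All parameters are invented placeholders.

def triangular(a, b, c):
    """Return a triangular membership function with support (a, c) and peak b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)
    return mu

WORDS = {
    "low": triangular(-0.01, 0.0, 4.0),
    "moderate": triangular(2.0, 5.0, 8.0),
    "high": triangular(6.0, 10.0, 10.01),
}

def best_word(x):
    """Map a numeric judgement to the word with the highest membership."""
    return max(WORDS, key=lambda w: WORDS[w](x))

print(best_word(3.0))  # "moderate" (membership 0.33 vs. 0.25 for "low")
```

    A full Perceptual Computer adds an encoder, an aggregation engine, and a decoder around word models of this kind; this sketch only shows the word-to-number-to-word mapping idea.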

  17. How Can We Make Computing Lessons More Inclusive?

    Shelton, Chris

    2017-01-01

    Part 3: Computer Science Education and Its Future Focus and Development. Whilst there is a substantial body of research showing how Information and Communications Technologies (ICTs) can support schools and teachers in making their classrooms more inclusive, more evidence is needed on how best to ensure that the teaching of computing itself is inclusive. This paper reports on a literature review of inclusive education in school computing lessons. It…

  18. A Reflective Study into Children's Cognition When Making Computer Games

    Allsop, Yasemin

    2016-01-01

    In this paper, children's mental activities when making digital games are explored. Where previous studies have mainly focused on children's learning, this study aimed to unfold the children's thinking process for learning when making computer games. As part of an ongoing larger scale study, which adopts an ethnographic approach, this research…

  19. Decision Making about Computer Acquisition and Use in American Schools.

    Becker, Henry Jay

    1993-01-01

    Discusses the centralization and decentralization of decision making about computer use in elementary and secondary schools based on results of a 1989 national survey. Results unexpectedly indicate that more successful programs are the result of districtwide planning than individual teacher or school-level decision making. (LRW)

  20. Computer Supported Decision Making in Therapy of Arterial Hypertension

    Peleška, Jan; Švejda, David; Zvárová, Jana

    1997-01-01

    Vol. 45, No. 1/2 (1997), pp. 25-29. ISSN 1386-5056. R&D Projects: GA ČR GA313/93/0616. Grant - others: COPERNICUS (XE) JRP-10053. Keywords: computer-supported decision making; Microsoft Access language; therapy of arterial hypertension

  1. Grid Computing Making the Global Infrastructure a Reality

    Fox, Geoffrey C; Hey, Anthony J G

    2003-01-01

    Grid computing is applying the resources of many computers in a network to a single problem at the same time. Grid computing appears to be a promising trend for three reasons: (1) its ability to make more cost-effective use of a given amount of computer resources; (2) as a way to solve problems that cannot be approached without an enormous amount of computing power; and (3) because it suggests that the resources of many computers can be cooperatively, and perhaps synergistically, harnessed and managed as a collaboration toward a common objective. A number of corporations, professional groups, university consortiums, and other groups have developed or are developing frameworks and software for managing grid computing projects. The European Community (EU) is sponsoring a project for a grid for high-energy physics, earth observation, and biology applications. In the United States, the National Technology Grid is prototyping a computational grid for infrastructure and an access grid for people. Sun Microsystems offers Gri…

  2. Presidential laugh lines. Candidate display behavior and audience laughter in the 2008 primary debates.

    Stewart, Patrick A

    2010-09-01

    Political humor has long been used by candidates to mobilize supporters by enhancing status or denigrating the opposition. Research concerning laughter provides insight into the building of social bonds; however, little research has focused on the nonverbal cues displayed by the individual making humorous comments. This study first investigates whether there is a relationship between facial display behavior and the presence and strength of laughter. Next, the analysis explores whether specific candidate displays during a humorous comment depend on the target of the comment. This paper analyzes the use of humor by Republican and Democratic candidates during ten 2008 presidential primary debates. Data analyzed here employs laughter as an indicator of a successful humorous comment and documents candidate display behavior in the seconds immediately preceding and during each laughter event. Findings suggest specific facial displays play an important communication role. Different types of smiles, whether felt, false, or fear-based, are related to who laughs as well as how intensely the audience is judged to laugh.

  3. Developing Decision-Making Skill: Experiential Learning in Computer Games

    Kurt A. April; Katja M. J. Goebel; Eddie Blass; Jonathan Foster-Pedley

    2012-01-01

    This paper explores the value that computer and video games bring to learning and leadership, how games work as learning environments, and the impact they have on personal development. The study looks at decisiveness, decision-making ability and styles, and at how this leadership-related skill is learnt through different paradigms. The paper compares the learning from a lecture with the learning from a designed computer game, both of which have the same content, through the use of a s…

  4. Helper contributions in the cooperatively breeding laughing kookaburra: feeding young is no laughing matter.

    Legge

    2000-05-01

    I studied the contributions of individuals to incubation and nestling feeding in a population of cooperatively breeding laughing kookaburras, Dacelo novaeguineae. In most cooperatively breeding birds where nest success is limited by nestling starvation, related helpers increase the overall level of provisioning to the nest, thus boosting the production of nondescendent kin. However, although partial brood loss is the largest cause of lost productivity in kookaburra nests, additional helpers failed to increase overall provisioning. Instead, all group members, but especially helpers, reduced their feeding contributions as group size increased. Breeders and helpers reduced the size of prey delivered, and helpers also reduced the number of feeding visits. An important benefit of helping in kookaburras may be to allow all group members to reduce their effort. Within groups, contributions to care depended on status, sex, group size and the brood size. Breeding males delivered the most food. Breeding females provisioned less than their partner, but their effort was comparable to that of male helpers. Female helpers contributed the least food. Incubation effort followed similar patterns. The relatedness of helpers to the brood had no impact on their provisioning. Across all group sizes, helpers generally brought larger items to the nest than breeders. Copyright 2000 The Association for the Study of Animal Behaviour.

  5. [Neurology of laughter and humour: pathological laughing and crying].

    Arias, Manuel

    2011-10-01

    Laughter, which is usually a healthy biological phenomenon, may also be a symptom of several severe brain pathologies. To review the neurobiological bases of laughter and humour, as well as those of the pathological laughing and crying syndrome. At the mesencephalic-pontine junction there is a central coordinator of the nuclei that innervate the muscles involved in laughter (facial expression, respiratory and phonatory). This centre receives connections from three systems: inhibitory (pre-motor and motor cortex), excitatory (temporal cortex, amygdala, hypothalamus) and modulatory (cerebellum). Humour is a complex phenomenon with a range of components: the perception of unexpected incongruence (occipitotemporal area, prefrontal cortex), emotional (reward circuit) and volitional (temporal and frontal cortex). Functional magnetic resonance imaging studies do not reveal a markedly prominent role of the right frontal lobe in processing humour, as had been suggested in the classical studies. The causes of pathological laughing and crying syndrome can be classified into two groups: altered behaviour with unmotivated happiness (Angelman syndrome, schizophrenia, mania, dementia) and interference with the inhibitory/excitatory mechanisms (gelastic epilepsy, fou rire prodromique in strokes, multiple sclerosis, amyotrophic lateral sclerosis, Parkinson's disease and Parkinson-plus, traumatic injuries, tumours). Serotonin and noradrenalin reuptake inhibitors, levodopa, lamotrigine and the dextromethorphan/quinidine combination can be effective in certain cases of pathological laughing and crying. As human neurobiological phenomena, laughter and humour also belong to the field of clinical neurology; their processing is affected in a number of different diseases and, in certain cases, effective treatment can be established.

  6. Granular computing and decision-making interactive and iterative approaches

    Chen, Shyi-Ming

    2015-01-01

    This volume is devoted to interactive and iterative processes of decision-making – I2 Fuzzy Decision Making, in brief. Decision-making is inherently interactive. Fuzzy sets help realize human-machine communication in an efficient way by facilitating two-way interaction in a friendly and transparent manner. Human-centric interaction is of paramount relevance as a leading guiding design principle of decision support systems. The volume provides the reader with updated and in-depth material on the conceptually appealing and practically sound methodology and practice of I2 Fuzzy Decision Making. The book engages a wealth of methods of fuzzy sets and Granular Computing, and brings new concepts, architectures and practice of fuzzy decision-making, providing the reader with various application studies. The book is aimed at a broad audience of researchers and practitioners in numerous disciplines in which decision-making processes play a pivotal role and serve as a vehicle to produce solutions to existing prob…

  7. Sequential decision making in computational sustainability via adaptive submodularity

    Krause, Andreas; Golovin, Daniel; Converse, Sarah J.

    2015-01-01

    Many problems in computational sustainability require making a sequence of decisions in complex, uncertain environments. Such problems are notoriously difficult. In this article, we review the recently discovered notion of adaptive submodularity, an intuitive diminishing-returns condition that generalizes the classical notion of submodular set functions to sequential decision problems. Problems exhibiting the adaptive submodularity property can be efficiently and provably near-optimally solved using simple myopic policies. We illustrate this concept in several case studies of interest in computational sustainability: first, we demonstrate how it can be used to efficiently plan for resolving uncertainty in adaptive management scenarios; second, we show how it applies to dynamic conservation planning for protecting endangered species, a case study carried out in collaboration with the US Geological Survey and the US Fish and Wildlife Service.
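    The "simple myopic policy" behind these guarantees is greedy selection by marginal gain. The sketch below shows the non-adaptive special case, greedy maximum coverage (coverage is a classic submodular function); the adaptive version applies the same pick-the-best-next-action rule after each observation. The site and species names are invented:

```python
# Simplified illustration of the greedy (myopic) policy that is provably
# near-optimal for submodular objectives, here plain set cover.
# All data is invented for illustration.

def greedy_max_coverage(candidates, k):
    """Pick up to k sets greedily by marginal coverage gain.

    candidates: dict mapping a set's name to the elements it covers.
    """
    covered = set()
    chosen = []
    for _ in range(k):
        # Myopic step: pick the candidate adding the most new elements.
        best = max(candidates, key=lambda name: len(candidates[name] - covered))
        if not candidates[best] - covered:
            break  # no remaining marginal gain
        chosen.append(best)
        covered |= candidates[best]
    return chosen, covered

# Hypothetical monitoring sites and the species each would protect.
sites = {
    "wetland_A": {"crane", "heron", "frog"},
    "forest_B": {"owl", "crane"},
    "prairie_C": {"frog", "vole", "hawk"},
}
chosen, covered = greedy_max_coverage(sites, k=2)
print(chosen, covered)  # picks prairie_C second: 2 new species vs. 1 for forest_B
```

    For monotone submodular objectives this greedy rule achieves at least a (1 - 1/e) fraction of the optimal coverage; adaptive submodularity extends the same guarantee to sequential, observation-driven settings.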

  8. Computational intelligence paradigms in economic and financial decision making

    Resta, Marina

    2016-01-01

    The book focuses on a set of cutting-edge research techniques, highlighting the potential of soft computing tools in the analysis of economic and financial phenomena and in providing support for the decision-making process. In the first part the textbook presents a comprehensive and self-contained introduction to the field of self-organizing maps, elastic maps and social network analysis tools and provides necessary background material on the topic, including a discussion of more recent developments in the field. In the second part the focus is on practical applications, with particular attention paid to budgeting problems, market simulations, and decision-making processes, and on how such problems can be effectively managed by developing proper methods to automatically detect certain patterns. The book offers a valuable resource for both students and practitioners with an introductory-level college math background.

  9. Computational Psychometrics in Communication and Implications in Decision Making.

    Cipresso, Pietro; Villani, Daniela; Repetto, Claudia; Bosone, Lucia; Balgera, Anna; Mauri, Maurizio; Villamira, Marco; Antonietti, Alessandro; Riva, Giuseppe

    2015-01-01

    Recent investigations emphasized the role of communication features on behavioral trust and reciprocity in economic decision making, but no studies have focused on the effect of communication on affective states in such a context. Thanks to advanced methods of computational psychometrics, in this study affective states were examined in depth using simultaneous and synchronized recordings of gazes and psychophysiological signals in 28 female students during an investment game. Results showed that participants experienced different affective states according to the type of communication (personal versus impersonal). In particular, participants involved in personal communication felt more relaxed than participants involved in impersonal communication. Moreover, personal communication influenced reciprocity and participants' perceptions of trust and reciprocity. Findings were interpreted in the light of the Arousal/Valence Model and the self-disclosure process.

  10. Judy Collins shares a laugh with First Lady Hillary Clinton

    1999-01-01

    Singer Judy Collins (left) shares a laugh with First Lady Hillary Rodham Clinton in the Apollo/Saturn V Facility. Both women are at KSC to view the launch of Space Shuttle mission STS-93 scheduled for 12:36 a.m. EDT July 20. Much attention has been generated over the launch due to Commander Eileen M. Collins, the first woman to serve as commander of a Shuttle mission. Judy Collins has honored the commander with a song, 'Beyond the Sky,' which was commissioned by NASA through the NASA Art Program.

  11. Aping expressions? Chimpanzees produce distinct laugh types when responding to laughter of others.

    Davila-Ross, Marina; Allcock, Bethan; Thomas, Chris; Bard, Kim A

    2011-10-01

    Humans have the ability to replicate the emotional expressions of others even when they undergo different emotions. Such distinct responses of expressions, especially positive expressions, play a central role in everyday social communication of humans and may give the responding individuals important advantages in cooperation and communication. The present work examined laughter in chimpanzees to test whether nonhuman primates also use their expressions in such distinct ways. The approach was first to examine the form and occurrence of laugh replications (laughter after the laughter of others) and spontaneous laughter of chimpanzees during social play and then to test whether their laugh replications represented laugh-elicited laugh responses (laughter triggered by the laughter of others) by using a quantitative method designed to measure responses in natural social settings. The results of this study indicated that chimpanzees produce laugh-elicited laughter that is distinct in form and occurrence from their spontaneous laughter. These findings provide the first empirical evidence that nonhuman primates have the ability to replicate the expressions of others by producing expressions that differ in their underlying emotions and social implications. The data further showed that the laugh-elicited laugh responses of the subjects were closely linked to play maintenance, suggesting that chimpanzees might gain important cooperative and communicative advantages by responding with laughter to the laughter of their social partners. Notably, some chimpanzee groups of this study responded more with laughter than others, an outcome that provides empirical support of a socialization of expressions in great apes similar to that of humans.

  12. ENIAC in action making and remaking the modern computer

    Haigh, Thomas; Rope, Crispin

    2016-01-01

    Conceived in 1943, completed in 1945, and decommissioned in 1955, ENIAC (the Electronic Numerical Integrator and Computer) was the first general-purpose programmable electronic computer. But ENIAC was more than just a milestone on the road to the modern computer. During its decade of operational life, ENIAC calculated sines and cosines and tested for statistical outliers, plotted the trajectories of bombs and shells, and ran the first numerical weather simulations. "ENIAC in Action" tells the whole story for the first time, from ENIAC's design, construction, testing, and use to its afterlife as part of computing folklore. It highlights the complex relationship of ENIAC and its designers to the revolutionary approaches to computer architecture and coding first documented by John von Neumann in 1945. Within this broad sweep, the authors emphasize the crucial but previously neglected years of 1947 to 1948, when ENIAC was reconfigured to run what the authors claim was the first modern computer program to be exe…

  13. He who laughs last - Jesus and laughter in the Synoptic and Gnostic ...

    The aim of the article is to examine the meaning of references to laughter in the Synoptic Gospels and a number of Gnostic texts. Whereas Jesus is depicted as an object of ridicule (Mk 5:40 par.) and as condemning those who laugh in the Synoptic Gospels (Lk 6:25), it is he who often laughs derisively at the ignorance of ...

  14. Blind quantum computation protocol in which Alice only makes measurements

    Morimae, Tomoyuki; Fujii, Keisuke

    2013-05-01

    Blind quantum computation is a new secure quantum computing protocol that enables Alice (who does not have sufficient quantum technology) to delegate her quantum computation to Bob (who has a full-fledged quantum computer) in such a way that Bob cannot learn anything about Alice's input, output, and algorithm. In previous protocols, Alice needs to have a device which generates quantum states, such as single-photon states. Here we propose another type of blind computing protocol in which Alice only performs measurements, such as polarization measurements with a threshold detector. In several experimental setups, such as optical systems, the measurement of a state is much easier than the generation of a single-qubit state. Therefore, our protocols ease Alice's burden. Furthermore, the security of our protocol is based on the no-signaling principle, which is more fundamental than quantum physics. Finally, our protocols are device-independent in the sense that Alice does not need to trust her measurement device in order to guarantee security.

  15. Computational Intelligence and Decision Making Trends and Applications

    Madureira, Ana; Marques, Viriato

    2013-01-01

    This book provides a general overview and original analysis of new developments and applications in several areas of Computational Intelligence and Information Systems. Computational Intelligence has become an important tool for engineers to develop and analyze novel techniques to solve problems in basic sciences such as physics, chemistry, biology, engineering, environment and social sciences.   The material contained in this book addresses the foundations and applications of Artificial Intelligence and Decision Support Systems, Complex and Biological Inspired Systems, Simulation and Evolution of Real and Artificial Life Forms, Intelligent Models and Control Systems, Knowledge and Learning Technologies, Web Semantics and Ontologies, Intelligent Tutoring Systems, Intelligent Power Systems, Self-Organized and Distributed Systems, Intelligent Manufacturing Systems and Affective Computing. The contributions have all been written by international experts, who provide current views on the topics discussed and pr...

  16. Terapias complementarias en los cuidados: Humor y risoterapia Complementary therapies in the cares: humour and laugh

    M. Carmen Ruiz Gómez

    2005-06-01

    Full Text Available Among complementary therapies, laughter therapy offers benefits both in health and in situations of illness, and it is an inexpensive "care instrument" with no side effects. The current trend towards all things natural, the health-care traditions of different cultures, and the WHO's recommendation that nurses use "traditional and complementary methods" to improve the health of the population together make laughter therapy a viable optional care tool. We present a literature review aimed at understanding the application of laughter therapy in health care and, more specifically, in nursing care. From the analysis of the results we can conclude that laughter is worked with in several professional fields, not only in health care; it is in the field of communication that this therapy is most widely disseminated. We found few nursing publications, but those we did find are very valuable, since they work in the areas of research and teaching. It would be worthwhile for nursing professionals to use this therapy, which improves the quality of care and offers an independent field well suited to research.

  17. Effectiveness of an Electronic Performance Support System on Computer Ethics and Ethical Decision-Making Education

    Kert, Serhat Bahadir; Uz, Cigdem; Gecu, Zeynep

    2014-01-01

    This study examined the effectiveness of an electronic performance support system (EPSS) on computer ethics education and the ethical decision-making processes. There were five different phases to this ten-month study: (1) Writing computer ethics scenarios, (2) Designing a decision-making framework, (3) Developing EPSS software, (4) Using EPSS in a…

  18. Could one make a diamond-based quantum computer?

    Stoneham, A Marshall; Harker, A H; Morley, Gavin W

    2009-01-01

    We assess routes to a diamond-based quantum computer, where we specifically look towards scalable devices, with at least 10 linked quantum gates. Such a computer should satisfy the DiVincenzo criteria and might be used at convenient temperatures. The specific examples that we examine are based on the optical control of electron spins. For some such devices, nuclear spins give additional advantages. Since there have already been demonstrations of basic initialization and readout, our emphasis is on routes to two-qubit quantum gate operations and the linking of perhaps 10-20 such gates. We analyse the dopant properties necessary, especially centres containing N and P, and give results using simple scoping calculations for the key interactions determining gate performance. Our conclusions are cautiously optimistic: it may be possible to develop a useful quantum information processor that works above cryogenic temperatures.

  19. PRISIM: a computer program that makes PRA useful

    Fussell, J.B.; Campbell, D.J.; Glynn, J.C.; Burdick, G.R.

    1986-01-01

    PRISIM is an IBM personal computer program that translates probabilistic risk assessment (PRA) information and calculates additional PRA-type information for use by those who are not PRA experts. Specifically, PRISIM was developed for the US Nuclear Regulatory Commission for use by its resident inspectors at nuclear power plants. Inspector activities are either scheduled or in response to the particular status of a plant. PRISIM is useful for either type of activity.

  20. Neutron visual sensing techniques making good use of computer science

    Kureta, Masatoshi

    2009-01-01

    Neutron visual sensing is a nondestructive visualization and image-sensing technique. In this article, some advanced neutron visual sensing techniques are introduced, including the most up-to-date high-speed neutron radiography, neutron 3D CT, high-speed scanning neutron 3D/4D CT, and multi-beam neutron 4D CT techniques, together with some fundamental application results. Oil flow in a car engine was visualized by the high-speed neutron radiography technique to clarify previously unexplained phenomena. 4D visualization of painted sand in an hourglass was reported as a demonstration of the high-speed scanning neutron 4D CT technique. The purpose of developing these techniques is to clarify unknown phenomena and to measure void fraction, velocity, etc. at high speed or in 3D/4D for many industrial applications. (author)

  1. Fear of Being Laughed at in Borderline Personality Disorder

    Carolin Brück

    2018-01-01

    Full Text Available Building on the assumption of a possible link between biases in social information processing frequently associated with borderline personality disorder (BPD) and the occurrence of gelotophobia (i.e., a fear of being laughed at), the present study aimed at evaluating the prevalence rate of gelotophobia among BPD patients. Using the GELOPH<15>, a questionnaire that allows a standardized assessment of the presence and severity of gelotophobia symptoms, rates of gelotophobia were assessed in a group of 30 female BPD patients and compared to data gathered in clinical and non-clinical reference groups. Results indicate a high prevalence of gelotophobia among BPD patients, with 87% of BPD patients meeting the GELOPH<15> criterion for being classified as gelotophobic. Compared to other clinical and non-clinical reference groups, the rate of gelotophobia among BPD patients appears to be remarkably high, far exceeding the numbers reported for other groups in the literature to date, with 30% of BPD patients reaching extreme levels, 37% pronounced levels, and 20% slight levels of gelotophobia.

  2. ElectroEncephaloGraphics: Making waves in computer graphics research.

    Mustafa, Maryam; Magnor, Marcus

    2014-01-01

    Electroencephalography (EEG) is a novel modality for investigating perceptual graphics problems. Until recently, EEG has predominantly been used for clinical diagnosis, in psychology, and by the brain-computer-interface community. Researchers are extending it to help understand the perception of visual output from graphics applications and to create approaches based on direct neural feedback. Researchers have applied EEG to graphics to determine perceived image and video quality by detecting typical rendering artifacts, to evaluate visualization effectiveness by calculating the cognitive load, and to automatically optimize rendering parameters for images and videos on the basis of implicit neural feedback.

  3. What makes computational open source software libraries successful?

    Bangerth, Wolfgang; Heister, Timo

    2013-01-01

    Software is the backbone of scientific computing. Yet, while we regularly publish detailed accounts about the results of scientific software, and while there is a general sense of which numerical methods work well, our community is largely unaware of best practices in writing the large-scale, open source scientific software upon which our discipline rests. This is particularly apparent in the commonly held view that writing successful software packages is largely the result of simply ‘being a good programmer’ when in fact there are many other factors involved, for example the social skill of community building. In this paper, we consider what we have found to be the necessary ingredients for successful scientific software projects and, in particular, for software libraries upon which the vast majority of scientific codes are built today. In particular, we discuss the roles of code, documentation, communities, project management and licenses. We also briefly comment on the impact on academic careers of engaging in software projects.

  5. MoCog1: A computer simulation of recognition-primed human decision making

    Gevarter, William B.

    1991-01-01

    The results of the first stage of a research effort to develop a 'sophisticated' computer model of human cognitive behavior are described. Most human decision making is an experience-based, relatively straightforward, largely automatic response to internal goals and drives, utilizing cues and opportunities perceived from the current environment. The development of the architecture and computer program (MoCog1) associated with such 'recognition-primed' decision making is discussed. The resultant computer program was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.

  6. He who laughs last – Jesus and laughter in the Synoptic and ...

    2014-05-06

    May 6, 2014 ... a result of this reappraisal of laughter as healthy for both the individual and for .... stomach with the pleasure of eating), difficult for people to ..... in the Gnostic myths, human beings laugh at Jahweh (Gilhus 1997:71–73).

  7. Behaviour, Physiology and Experience of Pathological Laughing and Crying in Amyotrophic Lateral Sclerosis

    Olney, Nicholas T.; Goodkind, Madeleine S.; Lomen-Hoerth, Catherine; Whalen, Patrick K.; Williamson, Craig A.; Holley, Deborah E.; Verstaen, Alice; Brown, Laurel M.; Miller, Bruce L.; Kornak, John; Levenson, Robert W.; Rosen, Howard J.

    2011-01-01

    Pathological laughing and crying is a disorder of emotional expression seen in a number of neurological diseases. The aetiology is poorly understood, but clinical descriptions suggest a disorder of emotion regulation. The goals of this study were: (i) to characterize the subjective, behavioural and physiological emotional reactions that occur…

  8. Media Detectives: Bridging the Relationship among Empathy, Laugh Tracks, and Gender in Childhood

    Kanthan, Sruti; Graham, James A.; Azarchi, Lynne

    2016-01-01

    Empathy in college-age students is decreasing at unprecedented rates. Understanding empathy in children can act as primary prevention in tackling the problem. This study considers laugh tracks' capacity to bias reality, foster empathy, and investigate differences across time and gender in 181 fifth grade students. Findings from this…

  9. Laugh and Smile upon the Holy Quran: The Study of Analytical Objectivities

    al-Domi, Mohammad Mahmoud

    2015-01-01

    This study aims to examine the positive impact of laughter and smiling in The Holy Quran, focusing on derivatives of these words used in contexts of praise and of expressing happiness and joy in a positive sense. Everyone needs to relieve the heart, so that happiness and joy can be seen on people's faces. Laughter is also among the attributes…

  10. Preaching What We Practice: Teaching Ethical Decision-Making to Computer Security Professionals

    Fleischmann, Kenneth R.

    The biggest challenge facing computer security researchers and professionals is not learning how to make ethical decisions; rather it is learning how to recognize ethical decisions. All too often, technology development suffers from what Langdon Winner terms technological somnambulism - we sleepwalk through our technology design, following past precedents without a second thought, and fail to consider the perspectives of other stakeholders [1]. Computer security research and practice involves a number of opportunities for ethical decisions. For example, decisions about whether or not to automatically provide security updates involve tradeoffs related to caring versus user autonomy. Decisions about online voting include tradeoffs between convenience and security. Finally, decisions about routinely screening e-mails for spam involve tradeoffs of efficiency and privacy. It is critical that these and other decisions facing computer security researchers and professionals are confronted head on as value-laden design decisions, and that computer security researchers and professionals consider the perspectives of various stakeholders in making these decisions.

  11. Reason, emotion and decision-making: risk and reward computation with feeling

    Quartz, Steven R.

    2009-01-01

    Many models of judgment and decision-making posit distinct cognitive and emotional contributions to decision-making under uncertainty. Cognitive processes typically involve exact computations according to a cost-benefit calculus, whereas emotional processes typically involve approximate, heuristic processes that deliver rapid evaluations without mental effort. However, it remains largely unknown what specific parameters of uncertain decision the brain encodes, the extent to which these parame...

  12. Rational behavior in decision making. A comparison between humans, computers and fast and frugal strategies

    Snijders, C.C.P.

    2007-01-01

    Rational behavior in decision making: a comparison between humans, computers, and fast and frugal strategies. Chris Snijders and Frits Tazelaar (Eindhoven University of Technology, The Netherlands). Real-life decisions often have to be made in "noisy" circumstances: not all crucial information is

  13. Persons with Alzheimer's Disease Make Phone Calls Independently Using a Computer-Aided Telephone System

    Perilli, Viviana; Lancioni, Giulio E.; Singh, Nirbhay N.; O'Reilly, Mark F.; Sigafoos, Jeff; Cassano, Germana; Cordiano, Noemi; Pinto, Katia; Minervini, Mauro G.; Oliva, Doretta

    2012-01-01

    This study assessed whether four patients with a diagnosis of Alzheimer's disease could make independent phone calls via a computer-aided telephone system. The study was carried out according to a non-concurrent multiple baseline design across participants. All participants started with baseline during which the telephone system was not available,…

  14. Great expectations: neural computations underlying the use of social norms in decision-making

    Chang, L.J.; Sanfey, A.G.

    2011-01-01

    Social expectations play a critical role in everyday decision-making. However, their precise neuro-computational role in the decision process remains unknown. Here we adopt a decision neuroscience framework by combining methods and theories from psychology, economics and neuroscience to outline a

  15. Great expectations: neural computations underlying the use of social norms in decision-making

    Chang, L.J.; Sanfey, A.G.

    2013-01-01

    Social expectations play a critical role in everyday decision-making. However, their precise neuro-computational role in the decision process remains unknown. Here we adopt a decision neuroscience framework by combining methods and theories from psychology, economics and neuroscience to outline a

  16. Effectiveness of a computer-based tutorial for teaching how to make a blood smear.

    Preast, Vanessa; Danielson, Jared; Bender, Holly; Bousson, Maury

    2007-09-01

    Computer-aided instruction (CAI) was developed to teach veterinary students how to make blood smears. This instruction was intended to replace the traditional instructional method in order to promote efficient use of faculty resources while maintaining learning outcomes and student satisfaction. The purpose of this study was to evaluate the effect of a computer-aided blood smear tutorial on 1) instructor's teaching time, 2) students' ability to make blood smears, and 3) students' ability to recognize smear quality. Three laboratory sessions for senior veterinary students were taught using traditional methods (control group) and 4 sessions were taught using the CAI tutorial (experimental group). Students in the control group received a short demonstration and lecture by the instructor at the beginning of the laboratory and then practiced making blood smears. Students in the experimental group received their instruction through the self-paced, multimedia tutorial on a laptop computer and then practiced making blood smears. Data was collected from observation, interview, survey questionnaires, and smear evaluation by students and experts using a scoring rubric. Students using the CAI made better smears and were better able to recognize smear quality. The average time the instructor spent in the room was not significantly different between groups, but the quality of the instructor time was improved with the experimental instruction. The tutorial implementation effectively provided students and instructors with a teaching and learning experience superior to the traditional method of instruction. Using CAI is a viable method of teaching students to make blood smears.

  17. MoCog1: A computer simulation of recognition-primed human decision making, considering emotions

    Gevarter, William B.

    1992-01-01

    The successful results of the first stage of a research effort to develop a versatile computer model of motivated human cognitive behavior are reported. Most human decision making appears to be an experience-based, relatively straightforward, largely automatic response to situations, utilizing cues and opportunities perceived from the current environment. The development, considering emotions, of the architecture and computer program associated with such 'recognition-primed' decision-making is described. The resultant computer program (MoCog1) was successfully utilized as a vehicle to simulate earlier findings that relate how an individual's implicit theories orient the individual toward particular goals, with resultant cognitions, affects, and behavior in response to their environment.

  18. Heuristic and optimal policy computations in the human brain during sequential decision-making.

    Korn, Christoph W; Bach, Dominik R

    2018-01-23

    Optimal decisions across extended time horizons require value calculations over multiple probabilistic future states. Humans may circumvent such complex computations by resorting to easy-to-compute heuristics that approximate optimal solutions. To probe the potential interplay between heuristic and optimal computations, we develop a novel sequential decision-making task, framed as virtual foraging in which participants have to avoid virtual starvation. Rewards depend only on final outcomes over five-trial blocks, necessitating planning over five sequential decisions and probabilistic outcomes. Here, we report model comparisons demonstrating that participants primarily rely on the best available heuristic but also use the normatively optimal policy. FMRI signals in medial prefrontal cortex (MPFC) relate to heuristic and optimal policies and associated choice uncertainties. Crucially, reaction times and dorsal MPFC activity scale with discrepancies between heuristic and optimal policies. Thus, sequential decision-making in humans may emerge from integration between heuristic and optimal policies, implemented by controllers in MPFC.
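The dissociation probed above, between an easy-to-compute heuristic and a normatively optimal policy obtained by planning over probabilistic future states, can be sketched in a toy version of the foraging task. All numbers below (threshold, payoffs, horizon) are invented for illustration and are not the parameters of the actual task; the point is only that a myopic expected-value rule and a backward-induction policy can give different survival probabilities.

```python
from functools import lru_cache

# Hypothetical toy parameters (not from the paper): the agent "survives" a
# five-trial block only if its final energy reaches THRESHOLD.
THRESHOLD = 6
SAFE_GAIN = 1                                # certain gain of the safe option
RISKY_WIN, RISKY_LOSS, P_WIN = 3, -1, 0.5    # risky option: +3 or -1, 50/50

@lru_cache(maxsize=None)
def optimal(t, e):
    """Normatively optimal policy via backward induction:
    max survival probability with t trials left and energy e."""
    if t == 0:
        return 1.0 if e >= THRESHOLD else 0.0
    safe = optimal(t - 1, e + SAFE_GAIN)
    risky = (P_WIN * optimal(t - 1, e + RISKY_WIN)
             + (1 - P_WIN) * optimal(t - 1, e + RISKY_LOSS))
    return max(safe, risky)

@lru_cache(maxsize=None)
def heuristic(t, e):
    """Myopic heuristic: always take the larger one-trial expected gain
    (ties go to the safe option), ignoring the final threshold."""
    if t == 0:
        return 1.0 if e >= THRESHOLD else 0.0
    ev_risky = P_WIN * RISKY_WIN + (1 - P_WIN) * RISKY_LOSS
    if SAFE_GAIN >= ev_risky:    # both options have expected gain 1 -> safe
        return heuristic(t - 1, e + SAFE_GAIN)
    return (P_WIN * heuristic(t - 1, e + RISKY_WIN)
            + (1 - P_WIN) * heuristic(t - 1, e + RISKY_LOSS))

# The myopic rule never gambles, ends at energy 5 < 6, and always "starves";
# the optimal policy gambles when behind the threshold and usually survives.
print(heuristic(5, 0), optimal(5, 0))
```

Because final outcomes only matter at the end of the block, the state-dependent optimal policy outperforms a rule that is locally just as good on every single trial.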

  19. He who laughs last – Jesus and laughter in the Synoptic and Gnostic traditions

    Marius J. Nel

    2014-05-01

    Full Text Available The aim of the article is to examine the meaning of references to laughter in the Synoptic Gospels and a number of Gnostic texts. Whereas Jesus is depicted as an object of ridicule (Mk 5:40 par.) and as condemning those who laugh in the Synoptic Gospels (Lk 6:25), it is he who often laughs derisively at the ignorance of others in Gnostic texts. The meaning of laughter in the Synoptic Gospels and a number of Gnostic texts is examined in the light of the general Greco-Roman attitude towards laughter and, more specifically, in regard to the archetypical distinction between playful and consequential laughter in Greek culture.

  20. Computer versus paper--does it make any difference in test performance?

    Karay, Yassin; Schauber, Stefan K; Stosch, Christoph; Schüttpelz-Brauns, Katrin

    2015-01-01

    CONSTRUCT: In this study, we examine the differences in test performance between the paper-based and the computer-based version of the Berlin formative Progress Test. In this context it is the first study that allows controlling for students' prior performance. Computer-based tests make possible a more efficient examination procedure for test administration and review. Although university staff will benefit largely from computer-based tests, the question arises if computer-based tests influence students' test performance. A total of 266 German students from the 9th and 10th semester of medicine (comparable with the 4th-year North American medical school schedule) participated in the study (paper = 132, computer = 134). The allocation of the test format was conducted as a randomized matched-pair design in which students were first sorted according to their prior test results. The organizational procedure, the examination conditions, the room, and seating arrangements, as well as the order of questions and answers, were identical in both groups. The sociodemographic variables and pretest scores of both groups were comparable. The test results from the paper and computer versions did not differ. The groups remained within the allotted time, but students using the computer version (particularly the high performers) needed significantly less time to complete the test. In addition, we found significant differences in guessing behavior. Low performers using the computer version guess significantly more than low-performing students in the paper-pencil version. Participants in computer-based tests are not at a disadvantage in terms of their test results. The computer-based test required less processing time. The reason for the longer processing time when using the paper-pencil version might be due to the time needed to write the answer down, controlling for transferring the answer correctly. It is still not known why students using the computer version (particularly low

  1. Neural correlates and neural computations in posterior parietal cortex during perceptual decision-making

    Alexander eHuk

    2012-10-01

    Full Text Available A recent line of work has found remarkable success in relating perceptual decision-making and the spiking activity in the macaque lateral intraparietal area (LIP). In this review, we focus on questions about the neural computations in LIP that are not answered by demonstrations of neural correlates of psychological processes. We highlight three areas of limitations in our current understanding of the precise neural computations that might underlie neural correlates of decisions: (1) empirical questions not yet answered by existing data; (2) implementation issues related to how neural circuits could actually implement the mechanisms suggested by both physiology and psychology; and (3) ecological constraints related to the use of well-controlled laboratory tasks and whether they provide an accurate window on sensorimotor computation. These issues motivate the adoption of a more general encoding-decoding framework that will be fruitful for more detailed contemplation of how neural computations in LIP relate to the formation of perceptual decisions.

  2. Effectiveness of a Case-Based Computer Program on Students' Ethical Decision Making.

    Park, Eun-Jun; Park, Mihyun

    2015-11-01

    The aim of this study was to test the effectiveness of a case-based computer program, using an integrative ethical decision-making model, on the ethical decision-making competency of nursing students in South Korea. This study used a pre- and posttest comparison design. Students in the intervention group used a computer program for case analysis assignments, whereas students in the standard group used a traditional paper assignment for case analysis. The findings showed that using the case-based computer program as a complementary tool for the ethics courses offered at the university enhanced students' ethical preparedness and satisfaction with the course. On the basis of the findings, it is recommended that nurse educators use a case-based computer program as a complementary self-study tool in ethics courses to supplement student learning without an increase in course hours, particularly in terms of analyzing ethics cases with dilemma scenarios and exercising ethical decision making. Copyright 2015, SLACK Incorporated.

  3. A conceptual and computational model of moral decision making in human and artificial agents.

    Wallach, Wendell; Franklin, Stan; Allen, Colin

    2010-07-01

    Recently, there has been a resurgence of interest in general, comprehensive models of human cognition. Such models aim to explain higher-order cognitive faculties, such as deliberation and planning. Given a computational representation, the validity of these models can be tested in computer simulations such as software agents or embodied robots. The push to implement computational models of this kind has created the field of artificial general intelligence (AGI). Moral decision making is arguably one of the most challenging tasks for computational approaches to higher-order cognition. The need for increasingly autonomous artificial agents to factor moral considerations into their choices and actions has given rise to another new field of inquiry variously known as Machine Morality, Machine Ethics, Roboethics, or Friendly AI. In this study, we discuss how LIDA, an AGI model of human cognition, can be adapted to model both affective and rational features of moral decision making. Using the LIDA model, we will demonstrate how moral decisions can be made in many domains using the same mechanisms that enable general decision making. Comprehensive models of human cognition typically aim for compatibility with recent research in the cognitive and neural sciences. Global workspace theory, proposed by the neuropsychologist Bernard Baars (1988), is a highly regarded model of human cognition that is currently being computationally instantiated in several software implementations. LIDA (Franklin, Baars, Ramamurthy, & Ventura, 2005) is one such computational implementation. LIDA is both a set of computational tools and an underlying model of human cognition, which provides mechanisms that are capable of explaining how an agent's selection of its next action arises from bottom-up collection of sensory data and top-down processes for making sense of its current situation. We will describe how the LIDA model helps integrate emotions into the human decision-making process, and we…

  4. Chimpanzees (Pan troglodytes) Produce the Same Types of 'Laugh Faces' when They Emit Laughter and when They Are Silent.

    Davila-Ross, Marina; Jesus, Goncalo; Osborne, Jade; Bard, Kim A

    2015-01-01

    The ability to flexibly produce facial expressions and vocalizations has a strong impact on the way humans communicate, as it promotes more explicit and versatile forms of communication. Whereas facial expressions and vocalizations are unarguably closely linked in primates, the extent to which these expressions can be produced independently in nonhuman primates is unknown. The present work, thus, examined if chimpanzees produce the same types of facial expressions with and without accompanying vocalizations, as do humans. Forty-six chimpanzees (Pan troglodytes) were video-recorded during spontaneous play with conspecifics at the Chimfunshi Wildlife Orphanage. ChimpFACS was applied, a standardized coding system to measure chimpanzee facial movements, based on FACS developed for humans. Data showed that the chimpanzees produced the same 14 configurations of open-mouth faces when laugh sounds were present and when they were absent. Chimpanzees, thus, produce these facial expressions flexibly without being morphologically constrained by the accompanying vocalizations. Furthermore, the data indicated that the facial expression plus vocalization and the facial expression alone were used differently in social play, i.e., when in physical contact with the playmates and when matching the playmates' open-mouth faces. These findings provide empirical evidence that chimpanzees produce distinctive facial expressions independently from a vocalization, and that their multimodal use affects communicative meaning, important traits for a more explicit and versatile way of communication. As it is still uncertain how human laugh faces evolved, the ChimpFACS data were also used to empirically examine the evolutionary relation between open-mouth faces with laugh sounds of chimpanzees and laugh faces of humans. The ChimpFACS results revealed that laugh faces of humans must have gradually emerged from laughing open-mouth faces of ancestral apes. This work examines the main evolutionary

  5. The emergence of understanding in a computer model of concepts and analogy-making

    Mitchell, Melanie; Hofstadter, Douglas R.

    1990-06-01

    This paper describes Copycat, a computer model of the mental mechanisms underlying the fluidity and adaptability of the human conceptual system in the context of analogy-making. Copycat creates analogies between idealized situations in a microworld that has been designed to capture and isolate many of the central issues of analogy-making. In Copycat, an understanding of the essence of a situation and the recognition of deep similarity between two superficially different situations emerge from the interaction of a large number of perceptual agents with an associative, overlapping, and context-sensitive network of concepts. Central features of the model are: a high degree of parallelism; competition and cooperation among a large number of small, locally acting agents that together create a global understanding of the situation at hand; and a computational temperature that measures the amount of perceptual organization as processing proceeds and that in turn controls the degree of randomness with which decisions are made in the system.
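    Copycat's computational temperature is essentially a randomness dial over competing choices: high temperature (little perceptual organization) makes decisions nearly random, low temperature makes them nearly deterministic. A minimal sketch of that idea as a softmax-style choice rule (illustrative only, not Copycat's actual code):

```python
import math
import random

def softmax_choice(options, scores, temperature, rng=random):
    """Pick an option with probability proportional to exp(score/temperature).

    At high temperature the choice is nearly uniform (exploratory, as when
    perceptual organization is low); as temperature drops, the highest-scoring
    option dominates (deterministic, as when a coherent view has emerged).
    """
    weights = [math.exp(s / temperature) for s in scores]
    r = rng.random() * sum(weights)
    for opt, w in zip(options, weights):
        r -= w
        if r <= 0:
            return opt
    return options[-1]
```

    At temperature 0.1 the higher-scoring option wins almost every time; at temperature 100 the same two options are chosen nearly uniformly, mirroring how Copycat loosens its decisions while its perception of the problem is still disorganized.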

  6. Reason, emotion and decision-making: risk and reward computation with feeling.

    Quartz, Steven R

    2009-05-01

    Many models of judgment and decision-making posit distinct cognitive and emotional contributions to decision-making under uncertainty. Cognitive processes typically involve exact computations according to a cost-benefit calculus, whereas emotional processes typically involve approximate, heuristic processes that deliver rapid evaluations without mental effort. However, it remains largely unknown which specific parameters of uncertain decisions the brain encodes, the extent to which these parameters correspond to various decision-making frameworks, and their correspondence to emotional and rational processes. Here, I review research suggesting that emotional processes encode in a precise quantitative manner the basic parameters of financial decision theory, indicating a reorientation of emotional and cognitive contributions to risky choice.

  7. Computer programs to make a Chart of the nuclides for WWW

    Nakagawa, Tsuneo; Katakura, Jun-ichi; Horiguchi, Takayoshi

    1999-06-01

    Computer programs to make a chart of the nuclides for the World Wide Web (WWW) have been developed. The programs make a data file for the WWW chart of the nuclides from a data file containing nuclide information in a format similar to ENSDF, filling in unknown half-lives with calculated ones. The WWW chart of the nuclides, in gif format, is then created from this data file. Programs to make html files and image map files, to select a chart of selected nuclides, and to show various information about nuclides are included in the system. All the programs are written in C. This report describes the file formats, the programs, and the 1998 issue of the Chart of the Nuclides made by means of the present programs. (author)

  8. A novel computer based expert decision making model for prostate cancer disease management.

    Richman, Martin B; Forman, Ernest H; Bayazit, Yildirim; Einstein, Douglas B; Resnick, Martin I; Stovsky, Mark D

    2005-12-01

    We propose a strategic, computer based, prostate cancer decision making model based on the analytic hierarchy process. We developed a model that improves physician-patient joint decision making and enhances the treatment selection process by making this critical decision rational and evidence based. Two groups (patient and physician-expert) completed a clinical study comparing an initial disease management choice with the highest ranked option generated by the computer model. Participants made pairwise comparisons to derive priorities for the objectives and subobjectives related to the disease management decision. The weighted comparisons were then applied to treatment options to yield prioritized rank lists that reflect the likelihood that a given alternative will achieve the participant treatment goal. Aggregate data were evaluated by inconsistency ratio analysis and sensitivity analysis, which assessed the influence of individual objectives and subobjectives on the final rank list of treatment options. Inconsistency ratios less than 0.05 were reliably generated, indicating that judgments made within the model were mathematically rational. The aggregate prioritized list of treatment options was tabulated for the patient and physician groups with similar outcomes for the 2 groups. Analysis of the major defining objectives in the treatment selection decision demonstrated the same rank order for the patient and physician groups with cure, survival and quality of life being more important than controlling cancer, preventing major complications of treatment, preventing blood transfusion complications and limiting treatment cost. Analysis of subobjectives, including quality of life and sexual dysfunction, produced similar priority rankings for the patient and physician groups. 
Concordance between initial treatment choice and the highest weighted model option differed between the groups with the patient group having 59% concordance and the physician group having only 42
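    The analytic hierarchy process underlying the model turns pairwise comparisons into priority weights via the principal eigenvector of the comparison matrix, and the inconsistency ratio reported above measures how far the judgments deviate from perfect transitivity. A minimal sketch (toy 3-criterion matrix, not the study's data):

```python
import numpy as np

# Saaty's random consistency index for matrix sizes 1..5
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}

def ahp_priorities(pairwise):
    """Derive AHP priority weights and a consistency ratio from a pairwise
    comparison matrix, via the principal eigenvector (standard AHP)."""
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                    # priority weights, sum to 1
    ci = (eigvals[k].real - n) / (n - 1)   # consistency index
    cr = ci / RI[n] if RI[n] else 0.0      # (in)consistency ratio
    return w, cr

# Hypothetical comparisons: cure vs. quality of life vs. treatment cost
A = [[1, 2, 4],
     [1/2, 1, 2],
     [1/4, 1/2, 1]]
weights, cr = ahp_priorities(A)
```

    Judgments are considered mathematically rational when the ratio stays below a small threshold (0.1 in Saaty's original formulation; the study above applied the stricter 0.05). The toy matrix is perfectly transitive, so its ratio is zero and the weights come out 4/7, 2/7, 1/7.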

  9. "Laisse-moi rire! Fais-moi parler!" ("Let Me Laugh! Make Me Speak!").

    Borgomano, Laure

    1983-01-01

    A discussion of the use of cartoons and humorous vignettes in French instruction is presented. It is suggested that there exists a wealth of such material but little guidance in using it. Problems of cultural context, understanding the use of pictures, and potential for classroom discussion are considered. (MSE)

  10. The Impact of Computed Tomography on Decision Making in Tibial Plateau Fractures.

    Castiglia, Marcello Teixeira; Nogueira-Barbosa, Marcello Henrique; Messias, Andre Marcio Vieira; Salim, Rodrigo; Fogagnolo, Fabricio; Schatzker, Joseph; Kfuri, Mauricio

    2018-02-14

    Schatzker introduced one of the most used classification systems for tibial plateau fractures, based on plain radiographs. Computed tomography brought to attention the importance of coronal plane-oriented fractures. The goal of our study was to determine if the addition of computed tomography would affect the decision making of surgeons who usually use the Schatzker classification to assess tibial plateau fractures. Image studies of 70 patients who sustained tibial plateau fractures were uploaded to a dedicated homepage. Every patient was linked to a folder which contained two radiographic projections (anteroposterior and lateral), three interactive videos of computed tomography (axial, sagittal, and coronal), and eight pictures depicting tridimensional reconstructions of the tibial plateau. Ten attending orthopaedic surgeons, who were blinded to the cases, were granted access to the homepage and assessed each set of images in two rounds separated by an interval of 2 weeks. Each case was evaluated in three steps, in which surgeons had access, respectively, to radiographs, two-dimensional videos of computed tomography, and three-dimensional reconstruction images. After every step, surgeons were asked to indicate how they would classify the case using the Schatzker system and which surgical approaches would be appropriate. We evaluated the inter- and intraobserver reliability of the Schatzker classification using the Kappa concordance coefficient, as well as the impact of computed tomography on the decision making regarding the surgical approach for each case, using the chi-square test and likelihood ratio. The interobserver concordance kappa coefficients after each assessment step were, respectively, 0.58, 0.62, and 0.64. For the intraobserver analysis, the coefficients were, respectively, 0.76, 0.75, and 0.78. Computed tomography changed the surgical approach selection for types II, V, and VI of Schatzker (p < 0.01). The addition of
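    The kappa coefficients reported above correct raw inter-rater agreement for the agreement expected by chance alone. A minimal two-rater sketch of Cohen's kappa (toy labels, not the study's ratings):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement and p_e the agreement expected by chance from each rater's
    marginal label frequencies."""
    n = len(rater1)
    p_o = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    p_e = sum(c1[label] * c2[label] for label in c1) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Two raters classifying four hypothetical fractures by Schatzker type
kappa = cohens_kappa(["II", "II", "V", "V"], ["II", "II", "V", "VI"])
```

    Here observed agreement is 0.75 and chance agreement 0.375, giving kappa = 0.6; interobserver values near 0.6, as in the study above, are conventionally read as moderate agreement.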

  11. Building bridges between perceptual and economic decision-making: neural and computational mechanisms

    Christopher eSummerfield

    2012-05-01

    Investigation into the neural and computational bases of decision-making has proceeded in two parallel but distinct streams. Perceptual decision making (PDM) is concerned with how observers detect, discriminate and categorise noisy sensory information. Economic decision making (EDM) explores how options are selected on the basis of their reinforcement history. Traditionally, the subfields of PDM and EDM have employed different paradigms, proposed different mechanistic models, explored different brain regions, and disagreed about whether decisions approach optimality. Nevertheless, we argue that there is a common framework for understanding decisions made in both domains, under which an agent has to combine sensory information (what is the stimulus?) with value information (what is it worth?). We review computational models of the decision process typically used in PDM, based around the idea that decisions involve a serial integration of evidence, and assess their applicability to decisions between goods and gambles. Subsequently, we consider the contribution of three key brain regions – the parietal cortex, the basal ganglia, and the orbitofrontal cortex – to perceptual and economic decision-making, with a focus on the mechanisms by which sensory and reward information are integrated during choice. We find that although the parietal cortex is often implicated in the integration of sensory evidence, there is evidence for its role in encoding the expected value of a decision. Similarly, although much research has emphasised the role of the striatum and orbitofrontal cortex in value-guided choices, they may play an important role in categorisation of perceptual information. In conclusion, we consider how findings from the two fields might be brought together, in order to move towards a general framework for understanding decision-making in humans and other primates.

  12. Decision Accuracy in Computer-Mediated versus Face-to-Face Decision-Making Teams.

    Hedlund; Ilgen; Hollenbeck

    1998-10-01

    Changes in the way organizations are structured and advances in communication technologies are two factors that have altered the conditions under which group decisions are made. Decisions are increasingly made by teams that have a hierarchical structure and whose members have different areas of expertise. In addition, many decisions are no longer made via strictly face-to-face interaction. The present study examines the effects of two modes of communication (face-to-face [FtF] or computer-mediated [CM]) on the accuracy of teams' decisions. The teams are characterized by a hierarchical structure and their members differ in expertise, consistent with the framework outlined in the Multilevel Theory of team decision making presented by Hollenbeck, Ilgen, Sego, Hedlund, Major, and Phillips (1995). Sixty-four four-person teams worked for 3 h on a computer simulation interacting either face-to-face or over a computer network. The communication mode had mixed effects on team processes, in that members of FtF teams were better informed and made recommendations that were more predictive of the correct team decision, but leaders of CM teams were better able to differentiate staff members on the quality of their decisions. Controlling for the negative impact of FtF communication on staff member differentiation increased the beneficial effect of the FtF mode on overall decision making accuracy. Copyright 1998 Academic Press.

  13. A comparative analysis of multi-level computer-assisted decision making systems for traumatic injuries

    Huynh Toan

    2009-01-01

    Background: This paper focuses on the creation of a predictive computer-assisted decision making system for traumatic injury using machine learning algorithms. Trauma experts must make several difficult decisions based on a large number of patient attributes, usually in a short period of time. The aim is to compare the existing machine learning methods available for medical informatics, and to develop reliable, rule-based computer-assisted decision-making systems that provide recommendations for the course of treatment for new patients, based on previously seen cases in trauma databases. Datasets of traumatic brain injury (TBI) patients are used to train and test the decision making algorithm. The work is also applicable to patients with traumatic pelvic injuries. Methods: Decision-making rules are created by processing patterns discovered in the datasets, using machine learning techniques. More specifically, CART and C4.5 are used, as they provide grammatical expressions of knowledge extracted by applying logical operations to the available features. The resulting rule sets are tested against other machine learning methods, including AdaBoost and SVM. The rule creation algorithm is applied to multiple datasets, both with and without prior filtering to discover significant variables. This filtering is performed via logistic regression prior to the rule discovery process. Results: For survival prediction using all variables, CART outperformed the other machine learning methods. When using only significant variables, neural networks performed best. A reliable rule-base was generated using combined C4.5/CART. The average predictive rule performance was 82% when using all variables, and approximately 84% when using significant variables only. 
The average performance of the combined C4.5 and CART system using significant variables was 89.7% in predicting the exact outcome (home or rehabilitation), and 93.1% in predicting the ICU length of stay for
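    Both C4.5 and CART grow their rule trees greedily, choosing at each node the attribute whose split best purifies the outcome labels. A minimal sketch of C4.5's entropy/information-gain criterion (hypothetical patient fields, not the study's data; CART uses Gini impurity in the same role):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a class-label distribution, in bits."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(records, labels, attribute):
    """Entropy reduction from splitting `records` on `attribute`
    (C4.5's core split criterion)."""
    groups = {}
    for rec, lab in zip(records, labels):
        groups.setdefault(rec[attribute], []).append(lab)
    n = len(labels)
    remainder = sum(len(g) / n * entropy(g) for g in groups.values())
    return entropy(labels) - remainder

# Hypothetical records: the attribute that separates outcomes should win
records = [{"gcs": "low", "age": "old"}, {"gcs": "low", "age": "young"},
           {"gcs": "high", "age": "old"}, {"gcs": "high", "age": "young"}]
labels = ["icu", "icu", "home", "home"]
```

    Splitting on the hypothetical "gcs" field yields gain 1.0 (perfect separation of the toy labels) while "age" yields 0.0, so a C4.5-style learner would branch on "gcs" first and emit rules of the form "if gcs is low then icu".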

  14. Make

    Frauenfelder, Mark

    2012-01-01

    The first magazine devoted entirely to do-it-yourself technology projects presents its 29th quarterly edition for people who like to tweak, disassemble, recreate, and invent cool new uses for technology. MAKE Volume 29 takes bio-hacking to a new level. Get introduced to DIY tracking devices before they hit the consumer electronics marketplace. Learn how to build an EKG machine to study your heartbeat, and put together a DIY bio lab to study athletic motion using consumer grade hardware.

  15. Decision Support Systems and the Conflict Model of Decision Making: A Stimulus for New Computer-Assisted Careers Guidance Systems.

    Ballantine, R. Malcolm

    Decision Support Systems (DSSs) are computer-based decision aids to use when making decisions which are partially amenable to rational decision-making procedures but contain elements where intuitive judgment is an essential component. In such situations, DSSs are used to improve the quality of decision-making. The DSS approach is based on Simon's…

  16. Computer models used to support cleanup decision-making at hazardous and radioactive waste sites

    Moskowitz, P.D.; Pardi, R.; DePhillips, M.P.; Meinhold, A.F.

    1992-07-01

    Massive efforts are underway to clean up hazardous and radioactive waste sites located throughout the US. To help determine cleanup priorities, computer models are being used to characterize the source, transport, fate and effects of hazardous chemicals and radioactive materials found at these sites. Although the US Environmental Protection Agency (EPA), the US Department of Energy (DOE), and the US Nuclear Regulatory Commission (NRC) have provided preliminary guidance to promote the use of computer models for remediation purposes, no agency has produced directed guidance on the models that must be used in these efforts. To identify which models are actually being used to support decision-making at hazardous and radioactive waste sites, a project jointly funded by EPA, DOE and NRC was initiated. The purpose of this project was to: (1) identify models being used for hazardous and radioactive waste site assessment purposes; and (2) describe and classify these models. This report presents the results of this study.

  18. Computer programing for geosciences: Teach your students how to make tools

    Grapenthin, Ronni

    2011-12-01

    When I announced my intention to pursue a Ph.D. in geophysics, some people gave me confused looks, because I was working on a master's degree in computer science at the time. My friends, like many incoming geoscience graduate students, have trouble linking these two fields. From my perspective, it is pretty straightforward: Much of geoscience revolves around novel analyses of large data sets that require custom tools—computer programs—to minimize the drudgery of manual data handling; other disciplines share this characteristic. While most faculty adapted to the need for tool development quite naturally, as they grew up around computer terminal interfaces, incoming graduate students lack intuitive understanding of programing concepts such as generalization and automation. I believe the major cause is the intuitive graphical user interfaces of modern operating systems and applications, which isolate the user from all technical details. Generally, current curricula do not recognize this gap between user and machine. For students to operate effectively, they require specialized courses teaching them the skills they need to make tools that operate on particular data sets and solve their specific problems. Courses in computer science departments are aimed at a different audience and are of limited help.

  19. The impact of natural aging on computational and neural indices of perceptual decision making: A review.

    Dully, Jessica; McGovern, David P; O'Connell, Redmond G

    2018-02-10

    It is well established that natural aging negatively impacts on a wide variety of cognitive functions and research has sought to identify core neural mechanisms that may account for these disparate changes. A central feature of any cognitive task is the requirement to translate sensory information into an appropriate action - a process commonly known as perceptual decision making. While computational, psychophysical, and neurophysiological research has made substantial progress in establishing the key computations and neural mechanisms underpinning decision making, it is only relatively recently that this knowledge has begun to be applied to research on aging. The purpose of this review is to provide an overview of this work which is beginning to offer new insights into the core psychological processes that mediate age-related cognitive decline in adults aged 65 years and over. Mathematical modelling studies have consistently reported that older adults display longer non-decisional processing times and implement more conservative decision policies than their younger counterparts. However, there are limits on what we can learn from behavioural modeling alone and neurophysiological analyses can play an essential role in empirically validating model predictions and in pinpointing the precise neural mechanisms that are impacted by aging. Although few studies to date have explicitly examined correspondences between computational models and neural data with respect to cognitive aging, neurophysiological studies have already highlighted age-related changes at multiple levels of the sensorimotor hierarchy that are likely to be consequential for decision making behaviour. Here, we provide an overview of this literature and suggest some future directions for the field. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
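    The two age effects highlighted above (longer non-decisional processing times and more conservative decision policies) map onto two parameters of the drift-diffusion model that dominates this literature: the non-decision time added to every response, and the boundary the accumulated evidence must reach. A minimal simulation sketch with arbitrary, illustrative parameter values:

```python
import random

def simulate_ddm_trial(drift, boundary, non_decision, dt=0.001, noise=1.0,
                       rng=random):
    """One drift-diffusion trial: noisy evidence accumulates from 0 toward
    +/-boundary; response time = accumulation time + non-decision time
    (sensory encoding and motor execution)."""
    x, t = 0.0, 0.0
    sd = noise * dt ** 0.5
    while abs(x) < boundary:
        x += drift * dt + rng.gauss(0.0, sd)
        t += dt
    return ("correct" if x > 0 else "error", t + non_decision)

def mean_rt(drift, boundary, non_decision, trials, rng):
    """Mean response time over a block of simulated trials."""
    rts = [simulate_ddm_trial(drift, boundary, non_decision, rng=rng)[1]
           for _ in range(trials)]
    return sum(rts) / len(rts)
```

    Widening the boundary and lengthening the non-decision time, as modelling studies consistently report for older adults, raises mean response time while making responses more cautious and more accurate.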

  20. Meaning Making Through Minimal Linguistic Forms in Computer-Mediated Communication

    Muhammad Shaban Rafi

    2014-05-01

    The purpose of this study was to investigate the linguistic forms which commonly constitute meanings in the digital environment. The data were sampled from 200 Bachelor of Science (BS) students (who had Urdu as their primary language of communication and English as one of their academic languages or their most prestigious second language) at five universities situated in Lahore, Pakistan. The procedure for analysis was conceived within much related theoretical work on text analysis. The study reveals that cyber-language is organized through patterns of use, which can be broadly classified into minimal linguistic forms constituting a meaning-making resource. In addition, the expression of syntactic mood and the discourse roles the participants technically assume tend to contribute to the theory of meaning in the digital environment. It is hoped that the study will make some contribution to the growing literature on multilingual computer-mediated communication (CMC).

  1. Causal Inference for Cross-Modal Action Selection: A Computational Study in a Decision Making Framework.

    Daemi, Mehdi; Harris, Laurence R; Crawford, J Douglas

    2016-01-01

    Animals try to make sense of sensory information from multiple modalities by categorizing it into perceptions of individual or multiple external objects or internal concepts. For example, the brain constructs sensory, spatial representations of the locations of visual and auditory stimuli in the visual and auditory cortices based on retinal and cochlear stimulations. Currently, it is not known how the brain compares the temporal and spatial features of these sensory representations to decide whether they originate from the same or separate sources in space. Here, we propose a computational model of how the brain might solve such a task. We reduce the visual and auditory information to time-varying, finite-dimensional signals. We introduce controlled, leaky integrators as working memory that retains the sensory information for the limited time-course of task implementation. We propose our model within an evidence-based, decision-making framework, where the alternative plan units are saliency maps of space. A spatiotemporal similarity measure, computed directly from the unimodal signals, is suggested as the criterion to infer common or separate causes. We provide simulations that (1) validate our model against behavioral experimental results in tasks where the participants were asked to report common or separate causes for cross-modal stimuli presented with arbitrary spatial and temporal disparities; (2) predict the behavior in novel experiments where stimuli have different combinations of spatial, temporal, and reliability features; and (3) illustrate the dynamics of the proposed internal system. These results confirm our spatiotemporal similarity measure as a viable criterion for causal inference, and our decision-making framework as a viable mechanism for target selection, which may be used by the brain in cross-modal situations. 
Further, we suggest that a similar approach can be extended to other cognitive problems where working memory is a limiting factor, such
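    The controlled, leaky integrator used above as working memory can be written in a few lines: the state follows dx/dt = -leak * x + input, so the trace rises while a signal is present and decays once it stops, giving a time-limited record of the input. A sketch with illustrative parameters, not the authors' implementation:

```python
def leaky_integrate(signal, leak=5.0, dt=0.01):
    """Euler-integrate dx/dt = -leak * x + u(t). With leak > 0 the stored
    trace decays over time, so the integrator behaves as a working memory
    that retains the input only for a limited time-course."""
    x, trace = 0.0, []
    for u in signal:
        x += dt * (-leak * x + u)
        trace.append(x)
    return trace

# A brief sensory pulse followed by silence: the trace peaks, then fades
trace = leaky_integrate([1.0] * 100 + [0.0] * 300)
```

    The leak rate sets the memory horizon: a larger leak forgets faster, which is the "controlled" aspect that lets such integrators be tuned to the duration of the task.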

  2. Assessment of Two Desk-Top Computer Simulations Used to Train Tactical Decision Making (TDM) of Small Unit Infantry Leaders

    Beal, Scott A

    2007-01-01

    Fifty-two leaders in the Basic Non-Commissioned Officer Course (BNCOC) at Fort Benning, Georgia, participated in an assessment of two desk-top computer simulations used to train tactical decision making...

  3. Piloting a new approach: making use of technology to present a distance learning computer science course

    Tina Wilson

    1996-12-01

    Teaching projects which make use of new technology are becoming of interest to all academic institutions in the UK due to economic pressure to increase student numbers. CMC (Computer-Mediated Communication), such as computer conferencing, appears an attractive solution to higher education's 'numbers' problem, with the added benefit that it is free from time and place constraints. Researchers have discussed CMC from a number of different perspectives: for example, Mason and Kaye (1989) describe a CMC system as a system of interactivity between tutors, students, resources and organizational structure. Steeples et al (1993) consider CMC in terms of group cohesion, modes of discourse and intervention strategies to stimulate and structure participation. Goodyear et al (1994) discuss the Just in Time (JIT)-Based Open Learning (JITOL) model in terms of a set of educational beliefs, role definitions, working methods and learning resources, together with a definition of infrastructure requirements for CMC. Shedletsky (1993) suggests that CMC should be viewed in terms of an 'intrapersonal communication' model, while Mayes et al (1994) identify three types of learning which are mediated by telematics, that is, learning by conceptualization, construction and dialogue. Other researchers, such as Velayo (1994), describe the teacher as 'an active agent', and present a model for computer conferencing which neglects the social aspect of CMC, while Berge (1995) mentions the importance of social activity between students and the importance of the role of the moderator. From these accounts, there appear to be a number of dimensions which can be used to evaluate CMC. Not all researchers emphasize the same dimensions; however, this paper proposes that computer conferencing systems should be designed to encourage students to participate in all three of the following dimensions. 
These can be summarized as: (a) a knowledge dimension (includes domain and meta knowledge); (b) a social

  4. Laughter Differs in Children with Autism: An Acoustic Analysis of Laughs Produced by Children with and without the Disorder

    Hudenko, William J.; Stone, Wendy; Bachorowski, Jo-Anne

    2009-01-01

    Few studies have examined vocal expressions of emotion in children with autism. We tested the hypothesis that during social interactions, children diagnosed with autism would exhibit less extreme laugh acoustics than their nonautistic peers. Laughter was recorded during a series of playful interactions with an examiner. Results showed that…

  5. Making nuclear power plant operational decisions using probabilistic safety assessment information and personal computers. Working material

    1991-01-01

    PRISIM, described in this case study, makes a PSA useful to decision makers such as plant managers, operational personnel or safety assessors, because it provides rapid access to specific information and the ability to generate updated PSA results that reflect the plant status at a particular time. From the capabilities of PRISIM one can conclude that a user-friendly update of the system model on the PC, or of the data files on the computer, has not been realized to date. The calculation of averaged probabilities instead of time-dependent instantaneous probabilities is also something of a restriction and will be changed in the future. 5 refs, 34 figs, 3 tabs

  6. In the Clouds: The Implications of Cloud Computing for Higher Education Information Technology Governance and Decision Making

    Dulaney, Malik H.

    2013-01-01

    Emerging technologies challenge the management of information technology in organizations. Paradigm changing technologies, such as cloud computing, have the ability to reverse the norms in organizational management, decision making, and information technology governance. This study explores the effects of cloud computing on information technology…

  7. Coronary Computed Tomographic Angiography-Derived Fractional Flow Reserve for Therapeutic Decision Making.

    Tesche, Christian; Vliegenthart, Rozemarijn; Duguay, Taylor M; De Cecco, Carlo N; Albrecht, Moritz H; De Santis, Domenico; Langenbach, Marcel C; Varga-Szemes, Akos; Jacobs, Brian E; Jochheim, David; Baquet, Moritz; Bayer, Richard R; Litwin, Sheldon E; Hoffmann, Ellen; Steinberg, Daniel H; Schoepf, U Joseph

    2017-12-15

    This study investigated the performance of coronary computed tomography angiography (cCTA) with cCTA-derived fractional flow reserve (CT-FFR) compared with invasive coronary angiography (ICA) with fractional flow reserve (FFR) for therapeutic decision making in patients with suspected coronary artery disease (CAD). Seventy-four patients (62 ± 11 years, 62% men) with at least 1 coronary stenosis of ≥50% on clinically indicated dual-source cCTA, who had subsequently undergone ICA with FFR measurement, were retrospectively evaluated. CT-FFR values were computed using an on-site machine-learning algorithm to assess the functional significance of CAD. The therapeutic strategy (optimal medical therapy alone vs revascularization) and the appropriate revascularization procedure (percutaneous coronary intervention vs coronary artery bypass grafting) were selected using cCTA-CT-FFR. Thirty-six patients (49%) had a functionally significant CAD based on ICA-FFR. cCTA-CT-FFR correctly identified a functionally significant CAD and the need of revascularization in 35 of 36 patients (97%). When revascularization was deemed indicated, the same revascularization procedure (32 percutaneous coronary interventions and 3 coronary artery bypass grafting) was chosen in 35 of 35 patients (100%). Overall, identical management strategies were selected in 73 of the 74 patients (99%). cCTA-CT-FFR shows excellent performance to identify patients with and without the need for revascularization and to select the appropriate revascularization strategy. cCTA-CT-FFR as a noninvasive "one-stop shop" has the potential to change diagnostic workflows and to directly inform therapeutic decision making in patients with suspected CAD. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Making computers noble. An experiment in automatic analysis of medieval texts

    Andrea Colli

    2016-02-01

    Computer analysis of texts and the creation of databases, hypertexts and digital editions are no longer the final frontier of research; quite the contrary, for many years they have represented a significant contribution to medieval studies. The aim, therefore, is not to make the computer able to grasp the meaning of human language and penetrate its secrets, but rather to improve its tools, so that it becomes an even more efficient collaborator in research activities. This paper is conceived as a sort of technical report whose task is to verify whether the automatic identification of certain word associations within a selected group of medieval writings produces suggestions about the subject of the processed texts that can be used in theoretical inquiry.

  9. Spatial and temporal variation in lead and cadmium in the Laughing Gull, Larus atricilla

    Reid, M; Hacker, C S

    1982-11-01

    Lead and cadmium concentrations were measured in eggs and in bone, kidney, liver and stomach contents of downy young, prefledgling, and adult Laughing Gulls collected from Matagorda Bay and Galveston Bay, Texas. Matagorda Bay drains a rural, moderately industrialized region while the Galveston Bay area is heavily urbanized and industrialized. Lead levels were lower in birds from Matagorda Bay and decreased in birds from Galveston Bay between 1977 and 1980. Cadmium levels were also lower in birds from Matagorda Bay but increased over the three-year period in those from Galveston Bay. The temporal decrease in lead may be associated with such environmental control efforts as reduced point source emissions and substitution of unleaded gasoline.

  10. Computer Simulation as a Tool for Assessing Decision-Making in Pandemic Influenza Response Training

    James M Leaming

    2013-05-01

    Full Text Available Introduction: We sought to develop and test a computer-based, interactive simulation of a hypothetical pandemic influenza outbreak. Fidelity was enhanced with integrated video and branching decision trees built upon the 2007 federal planning assumptions. We conducted a before-and-after study to assess the simulation's ability to gauge participants' beliefs regarding their own hospitals' mass casualty incident preparedness. Methods: Development: Using a Delphi process, we finalized a simulation that presents more than 50 key decisions to 6 role-players on networked laptops in a conference area. The simulation played out an 8-week scenario, beginning with pre-incident decisions. Testing: Role-players and trainees (N=155) were facilitated to make decisions during the pandemic. Because decision responses vary, the simulation plays out differently, and a casualty counter quantifies hypothetical losses. The facilitator reviews and critiques key factors for casualty control, including effective communications, working with external organizations, development of internal policies and procedures, maintaining supplies and services, technical infrastructure support, public relations, and training. Pre- and post-survey data from trainees were compared. Results: Post-simulation, trainees indicated a greater likelihood of needing to improve their organization in terms of communications, mass casualty incident planning, public information, and training. Participants also recognized which key factors required immediate attention at their own home facilities. Conclusion: The computer simulation was effective in providing a facilitated environment for determining the perception of preparedness, evaluating general preparedness concepts, and introducing participants to critical decisions involved in handling a regional pandemic influenza surge. [West J Emerg Med. 2013;14(3):236–242.]
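    The branching-decision structure with a casualty counter can be sketched as a lookup of casualty weights per decision. The decisions, branch labels, and weights below are invented for illustration; they are not taken from the actual simulation.

```python
# Hypothetical branching scenario: each decision maps choices to a
# casualty penalty, and a counter accumulates losses over the scenario.
SCENARIO = {
    "stockpile_ppe": {"yes": 0, "no": 40},            # pre-incident decision
    "activate_surge_plan": {"early": 10, "late": 60},
    "coordinate_with_public_health": {"yes": 5, "no": 30},
}

def run_scenario(choices):
    """Accumulate hypothetical casualties for a sequence of decisions."""
    casualties = 0
    for decision, options in SCENARIO.items():
        casualties += options[choices[decision]]
    return casualties

good = run_scenario({"stockpile_ppe": "yes",
                     "activate_surge_plan": "early",
                     "coordinate_with_public_health": "yes"})
poor = run_scenario({"stockpile_ppe": "no",
                     "activate_surge_plan": "late",
                     "coordinate_with_public_health": "no"})
print(good, poor)  # → 15 130
```

    Because responses vary, different decision sequences yield different casualty totals, which is what gives the facilitator a quantitative hook for the debrief.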

  11. Eye Contact and Fear of Being Laughed at in a Gaze Discrimination Task

    Jorge Torres-Marín

    2017-11-01

    Full Text Available Current approaches conceptualize gelotophobia as a personality trait characterized by a disproportionate fear of being laughed at by others. Consistent with this perspective, gelotophobes are also described as neurotic and introverted and as having a paranoid tendency to anticipate derision and mockery situations. Although research on gelotophobia has significantly progressed over the past two decades, no evidence exists concerning the potential effects of gelotophobia on reactions to eye contact. Previous research has pointed to difficulties in discriminating gaze direction as the basis of possible misinterpretations of others’ intentions or mental states. The aim of the present research was to examine whether gelotophobia predisposition modulates the effects of eye contact (i.e., gaze discrimination) when processing faces portraying several emotional expressions. In two experiments, participants performed a gaze discrimination task in which they responded, as quickly and accurately as possible, to the eyes’ directions on faces displaying a happy, angry, fearful, neutral, or sad emotional expression. In particular, we expected trait gelotophobia to modulate the eye contact effect, showing specific group differences in the happiness condition. The results of Study 1 (N = 40) indicated that gelotophobes made more errors than non-gelotophobes did in the gaze discrimination task. In contrast to our initial hypothesis, the happiness expression did not have any special role in the observed differences between individuals with high vs. low trait gelotophobia. In Study 2 (N = 40), we replicated the pattern of data concerning gaze discrimination ability, even after controlling for individuals’ scores on social anxiety. Furthermore, in our second experiment, we found that gelotophobes did not exhibit any problem with identifying others’ emotions, or a general incorrect attribution of affective features, such as valence

  12. Factors Influencing the Adoption of Cloud Computing by Decision Making Managers

    Ross, Virginia Watson

    2010-01-01

    Cloud computing is a growing field, addressing the market need for access to computing resources to meet organizational computing requirements. The purpose of this research is to evaluate the factors that influence an organization in their decision whether to adopt cloud computing as a part of their strategic information technology planning.…

  13. Foundations for Reasoning in Cognition-Based Computational Representations of Human Decision Making

    SENGLAUB, MICHAEL E.; HARRIS, DAVID L.; RAYBOURN, ELAINE M.

    2001-01-01

    In exploring the question of how humans reason in ambiguous situations or in the absence of complete information, we stumbled onto a body of knowledge that addresses issues beyond the original scope of our effort. We have begun to understand the importance that philosophy, in particular the work of C. S. Peirce, plays in developing models of human cognition and of information theory in general. We have a foundation that can serve as a basis for further studies in cognition and decision making. Peircean philosophy provides a foundation for understanding human reasoning and capturing behavioral characteristics of decision makers due to cultural, physiological, and psychological effects. The present paper describes this philosophical approach to understanding the underpinnings of human reasoning. We present the work of C. S. Peirce, and define sets of fundamental reasoning behavior that would be captured in the mathematical constructs of these newer technologies and would be able to interact in an agent type framework. Further, we propose the adoption of a hybrid reasoning model based on his work for future computational representations or emulations of human cognition

  14. Computer simulation of leadership, consensus decision making and collective behaviour in humans.

    Song Wu

    Full Text Available The aim of this study is to evaluate the reliability of a crowd simulation model developed by the authors by reproducing Dyer et al.'s experiments (published in Philosophical Transactions in 2009) on human leadership and consensus decision making in a computer-based environment. The theoretical crowd model of the simulation environment is presented, and its results are compared and analysed against Dyer et al.'s original experiments. It is concluded that the simulation results are largely consistent with the experiments, which demonstrates the reliability of the crowd model. Furthermore, the simulation data also reveal several additional new findings, namely: (1) the phenomenon of sacrificing accuracy to reach a quicker consensus decision, found in ant colonies, was also observed in the simulation; (2) the ability to reach consensus in groups has a direct impact on the time and accuracy of arriving at the target position; (3) the positions of the informed individuals or leaders in the crowd can have a significant impact on the overall crowd movement; and (4) the simulation also confirmed Dyer et al.'s anecdotal evidence of the proportion of leadership in large crowds and its effect on crowd movement. The potential applications of these findings are highlighted in the final discussion of this paper.
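    A one-dimensional toy version of informed-leader consensus, loosely in the spirit of the Dyer et al.-style experiments this crowd model reproduces, can be sketched as follows. The update rule, weights, and parameters are illustrative assumptions, not the authors' model.

```python
def step(positions, informed, target, w=0.5):
    """One update: every agent moves halfway toward the group mean;
    informed agents feel an extra pull of weight w toward the target."""
    mean = sum(positions) / len(positions)
    new = []
    for i, x in enumerate(positions):
        pull = mean - x
        if i in informed:
            pull += w * (target - x)
        new.append(x + 0.5 * pull)
    return new

positions = [0.0, 0.0, 0.0, 0.0, 0.0]
informed = {0}          # a single informed agent out of five
target = 10.0
for _ in range(50):
    positions = step(positions, informed, target)

mean_final = sum(positions) / len(positions)
print(round(mean_final, 2))
```

    Even with one informed agent in five, the group mean drifts toward the target over repeated steps, echoing the finding that the positions and proportion of informed individuals shape overall crowd movement.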

  15. Incentive motivation deficits in schizophrenia reflect effort computation impairments during cost-benefit decision-making.

    Fervaha, Gagan; Graff-Guerrero, Ariel; Zakzanis, Konstantine K; Foussias, George; Agid, Ofer; Remington, Gary

    2013-11-01

    Motivational impairments are a core feature of schizophrenia and although there are numerous reports studying this feature using clinical rating scales, objective behavioural assessments are lacking. Here, we use a translational paradigm to measure incentive motivation in individuals with schizophrenia. Sixteen stable outpatients with schizophrenia and sixteen matched healthy controls completed a modified version of the Effort Expenditure for Rewards Task that accounts for differences in motoric ability. Briefly, subjects were presented with a series of trials where they may choose to expend a greater amount of effort for a larger monetary reward versus less effort for a smaller reward. Additionally, the probability of receiving money for a given trial was varied at 12%, 50% and 88%. Clinical and other reward-related variables were also evaluated. Patients opted to expend greater effort significantly less than controls for trials of high, but uncertain (i.e. 50% and 88% probability) incentive value, which was related to amotivation and neurocognitive deficits. Other abnormalities were also noted but were related to different clinical variables such as impulsivity (low reward and 12% probability). These motivational deficits were not due to group differences in reward learning, reward valuation or hedonic capacity. Our findings offer novel support for incentive motivation deficits in schizophrenia. Clinical amotivation is associated with impairments in the computation of effort during cost-benefit decision-making. This objective translational paradigm may guide future investigations of the neural circuitry underlying these motivational impairments. Copyright © 2013 Elsevier Ltd. All rights reserved.
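    The cost-benefit choice at the heart of an EEfRT-style trial can be sketched as an expected-value comparison: take the hard option when its probability-weighted payoff, net of an effort cost, beats the easy option. The effort-cost term and reward magnitudes below are illustrative assumptions, not parameters from the study.

```python
def choose_hard(p_win, hard_reward, easy_reward, effort_cost=0.5):
    """True if the hard option's expected value exceeds the easy option's."""
    ev_hard = p_win * hard_reward - effort_cost
    ev_easy = p_win * easy_reward
    return ev_hard > ev_easy

# Probability levels used in the task: 12%, 50%, 88%
for p in (0.12, 0.50, 0.88):
    print(p, choose_hard(p, hard_reward=4.00, easy_reward=1.00))
```

    Under these toy numbers the hard option only pays off at the 50% and 88% levels, i.e., exactly the high-but-uncertain trial types where the patient-control difference emerged.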

  16. Making alliances work -- Using a computer-based management system to integrate the supply chain

    Johnson, J.B.; Randolph, S.

    1995-01-01

    Traditionally, price has been king in the selection of suppliers and service companies in the upstream oil and gas market. Three years ago, Amoco began to question this selection practice and embarked on an extensive benchmarking effort that has led the company to a proven strategy for goods and services procurement called supply-chain management (SCM). However, the company found that managing compact, integrated supply chains is not always easy. Several implementation issues need to be reconciled for alliances to achieve their full bottom-line potential benefits consistently. Issues that must be resolved, whether they are called alliances, supply chains, or integrated services, are (1) whether these new working relationships are profitable for all the entities involved, from suppliers through to end users; (2) how to assess and improve risk management; (3) how to reduce total system costs; and (4) how to improve performance for each of the alliance members and for the alliance as a whole. This brief describes one possible solution to the complex issues involved in making alliances work: a computer-facilitated management system designed to integrate the work processes of different organizations. In the case described, the Drilling Management System (DMS) was developed and used by the Amoco (U.K.) Well Dept. The system uses off-the-shelf commercial software to improve the performance of the company's drilling operations by integrating the activities of the company and its suppliers.

  17. The Influence of a Game-Making Project on Male and Female Learners' Attitudes to Computing

    Robertson, Judy

    2013-01-01

    There is a pressing need for gender inclusive approaches to engage young people in computer science. A recent popular approach has been to harness learners' enthusiasm for computer games to motivate them to learn computer science concepts through game authoring. This article describes a study in which 992 learners across 13 schools took part in a…

  18. Environmentally acquired lead, cadmium, and manganese in the cattle egret, Bubulcus ibis, and the laughing gull, Larus atricilla

    Hulse, M; Mahoney, J S; Schroder, G D; Hacker, C S; Pier, S M

    1980-01-01

    Concentrations of lead, cadmium, and manganese in the tissues of cattle egrets and laughing gulls gathered from the Galveston Bay region of Texas were compared to determine if different patterns of accumulation exist. Lead, cadmium, and manganese levels in these species were within the range reported for other bird species. Lead levels in bones were comparable, but gulls had more lead in brain, liver, and kidney tissues than egrets had, which suggested a higher rate of accumulation or exposure. Because of their high abundance and comparable positions in the estuarine and terrestrial food webs, cattle egrets and laughing gulls may serve as convenient biological indicators to monitor potentially toxic substances in these ecosystems. (29 references, 7 tables)

  19. Nonaneurysmal "Pseudo-Subarachnoid Hemorrhage" Computed Tomography Patterns: Challenges in an Acute Decision-Making Heuristics.

    Hasan, Tasneem F; Duarte, Walter; Akinduro, Oluwaseun O; Goldstein, Eric D; Hurst, Rebecca; Haranhalli, Neil; Miller, David A; Wharen, Robert E; Tawk, Rabih G; Freeman, William D

    2018-06-05

    Acute aneurysmal subarachnoid hemorrhage (SAH) is a medical and neurosurgical emergency from ruptured brain aneurysm. Aneurysmal SAH is identified on brain computed tomography (CT) as increased density of basal cisterns and subarachnoid spaces from acute blood products. An aneurysmal SAH-like pattern on CT appears as an optical illusion effect of hypodense brain parenchyma and/or hyperdense surrounding cerebral cisterns and blood vessels, termed "pseudo-subarachnoid hemorrhage" (pseudo-SAH). We reviewed clinical, laboratory, and radiographic data of all SAH diagnoses between January 2013 and January 2018, and found subsets of nonaneurysmal SAH, originally suspected to be aneurysmal in origin. We performed a National Library of Medicine search methodology using the terms "subarachnoid hemorrhage," "pseudo," and "non-aneurysmal subarachnoid hemorrhage" singly and in combination to understand the sensitivity, specificity, and precision of pseudo-SAH. Over 5 years, 230 SAH cases were referred to our tertiary academic center and only 7 (3%) met the definition of pseudo-SAH. Searching the National Library of Medicine using subarachnoid hemorrhage yielded 27,402 results. When subarachnoid hemorrhage and pseudo were combined, this yielded 70 results and sensitivity was 50% (n = 35). Similarly, search precision was relatively low (26%) as only 18 results fit the clinical description similar to the 7 cases discussed in our series. The aneurysmal SAH pattern on CT is distinct from nonaneurysmal and pseudo-SAH patterns. The origin of pseudo-SAH terminology appears mostly tied to comatose cardiac arrest patients with diffuse dark brain Hounsfield units and cerebral edema, and is a potential imaging pitfall in acute medical decision-making. Copyright © 2018 National Stroke Association. Published by Elsevier Inc. All rights reserved.
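    The quoted literature-search statistics follow from simple proportions over the 70 results of the combined query; the counts below are taken directly from the abstract.

```python
# Literature-search statistics for the combined "subarachnoid hemorrhage"
# + "pseudo" query: 70 results, 35 judged relevant, 18 matching the
# clinical description of the case series.
results = 70
relevant = 35
matching_description = 18

sensitivity = round(100 * relevant / results)
precision = round(100 * matching_description / results)
print(sensitivity, precision)  # → 50 26
```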

  20. Breathing, Laughing, Sneezing, Coughing: Model and Control of an Anatomically Inspired, Physically-Based Human Torso Simulation

    DiLorenzo, Paul Carmen

    2008-01-01

    Breathing, laughing, sneezing and coughing are all important human behaviors that are generated in the torso. Yet, when these behaviors are animated, the movement of the human torso is often simplified and stylized. Recent work aiming to depict the movement of the torso has focused on pure data-driven approaches such as a skin capture of an actor using a motion capture system. Although this generates impressive results to recreate the captured motion, it does not provide control to an animato...

  1. What the hyena's laugh tells: Sex, age, dominance and individual signature in the giggling call of Crocuta crocuta

    Weldele Mary

    2010-03-01

    Full Text Available Background: Among mammals living in social groups, individuals form communication networks where they signal their identity and social status, facilitating social interaction. In spite of its importance for understanding mammalian societies, the coding of individual-related information in the vocal signals of non-primate mammals has been relatively neglected. The present study focuses on the spotted hyena Crocuta crocuta, a social carnivore known for its complex female-dominated society. We investigate if and how the well-known hyena's laugh, also known as the giggle call, encodes information about the emitter. Results: By analyzing acoustic structure in both the temporal and frequency domains, we show that the hyena's laugh can encode information about age, individual identity and dominant/subordinate status, providing cues to receivers that could enable assessment of the social position of an emitting individual. Conclusions: The range of messages encoded in the hyena's laugh is likely to play a role during social interactions. This call, together with other vocalizations and other sensory channels, should ensure an array of communication signals that support the complex social system of the spotted hyena. Experimental studies are now needed to decipher precisely the communication network of this species.

  2. The use of computer decision-making support systems to justify targeted rehabilitation of the Semipalatinsk test site area

    Viktoria V. Zaets; Alexey V. Panov

    2011-01-01

    The paper describes the development of a range of optimal protective measures for remediation of the territory of the Semipalatinsk Test Site. The computer system for decision-making support, ReSCA, was employed for the estimations. Costs and radiological effectiveness of countermeasures were evaluated.

  4. Preservice Teacher Sense-Making as They Learn to Teach Reading as Seen through Computer-Mediated Discourse

    Stefanski, Angela J.; Leitze, Amy; Fife-Demski, Veronica M.

    2018-01-01

    This collective case study used methods of discourse analysis to consider what computer-mediated collaboration might reveal about preservice teachers' sense-making in a field-based practicum as they learn to teach reading to children identified as struggling readers. Researchers agree that field-based experiences coupled with time for reflection…

  5. A computer-tailored intervention to promote informed decision making for prostate cancer screening among African American men.

    Allen, Jennifer D; Mohllajee, Anshu P; Shelton, Rachel C; Drake, Bettina F; Mars, Dana R

    2009-12-01

    African American men experience a disproportionate burden of prostate cancer (CaP) morbidity and mortality. National screening guidelines advise men to make individualized screening decisions through a process termed informed decision making (IDM). In this pilot study, a computer-tailored decision-aid designed to promote IDM was evaluated using a pre-/posttest design. African American men aged 40 years and older were recruited from a variety of community settings (n = 108). At pretest, 43% of men reported having made a screening decision; at posttest 47% reported this to be the case (p = .39). Significant improvements were observed between pre- and posttest on scores of knowledge, decision self-efficacy, and decisional conflict. Men were also more likely to want an active role in decision making after using the tool. These results suggest that use of a computer-tailored decision aid is a promising strategy to promote IDM for CaP screening among African American men.

  6. Soft computing based on hierarchical evaluation approach and criteria interdependencies for energy decision-making problems: A case study

    Gitinavard, Hossein; Mousavi, S. Meysam; Vahdani, Behnam

    2017-01-01

    In numerous real-world energy decision problems, decision makers often encounter complex environments in which imprecise data and uncertain information make it difficult to reach an appropriate decision. In this paper, a new soft computing group decision-making approach is introduced based on a novel compromise ranking method and interval-valued hesitant fuzzy sets (IVHFSs) for energy decision-making problems under multiple criteria. In the proposed approach, the assessment information is provided by energy experts or decision makers in the form of interval-valued hesitant fuzzy elements under incomplete criteria weights. A new ranking index based on an interval-valued hesitant fuzzy Hamming distance measure is presented to prioritize energy candidates, and criteria weights are computed with an extended maximizing deviation method that considers the experts' judgments about the relative importance of each criterion. Also, a decision making trial and evaluation laboratory (DEMATEL) method is extended to an IVHF environment to compute the interdependencies between and within the selected criteria in the hierarchical structure. Accordingly, to demonstrate the applicability of the presented approach, a case study and a practical example are provided regarding the hierarchical structure and criteria-interdependency relations for renewable energy and energy policy selection problems. The computational results obtained are compared with a fuzzy decision-making method from the recent literature on several comparison parameters to show the advantages and constraints of the proposed approach. Finally, a sensitivity analysis is conducted to indicate the effects of different criteria weights on the ranking results, demonstrating the robustness or sensitiveness of the proposed soft computing approach with respect to the relative importance of criteria. - Highlights: • Introducing a novel interval-valued hesitant fuzzy compromise ranking method. • Presenting
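    The abstract does not reproduce the paper's exact ranking index, but a commonly used interval-valued hesitant fuzzy Hamming distance, of the kind the index is said to build on, can be sketched as follows. The equal-length, paired-interval representation of each element is an assumption for illustration.

```python
def ivhf_hamming(a, b):
    """Interval-valued hesitant fuzzy Hamming distance (one common form):
    average absolute difference of interval endpoints across two elements
    given as equal-length lists of (lower, upper) membership intervals."""
    assert len(a) == len(b), "elements must carry the same number of intervals"
    total = 0.0
    for (al, au), (bl, bu) in zip(a, b):
        total += abs(al - bl) + abs(au - bu)
    return total / (2 * len(a))

# Two hypothetical assessments of one candidate under one criterion.
x = [(0.2, 0.4), (0.5, 0.7)]
y = [(0.3, 0.6), (0.5, 0.8)]
print(ivhf_hamming(x, y))
```

    In a full compromise-ranking method, distances like this to an ideal solution would be aggregated across criteria using the computed criteria weights.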

  7. Chimpanzees (Pan troglodytes) Produce the Same Types of ‘Laugh Faces’ when They Emit Laughter and when They Are Silent

    Davila-Ross, Marina; Jesus, Goncalo; Osborne, Jade; Bard, Kim A.

    2015-01-01

    The ability to flexibly produce facial expressions and vocalizations has a strong impact on the way humans communicate, as it promotes more explicit and versatile forms of communication. Whereas facial expressions and vocalizations are unarguably closely linked in primates, the extent to which these expressions can be produced independently in nonhuman primates is unknown. The present work, thus, examined whether chimpanzees produce the same types of facial expressions with and without accompanying vocalizations, as humans do. Forty-six chimpanzees (Pan troglodytes) were video-recorded during spontaneous play with conspecifics at the Chimfunshi Wildlife Orphanage. ChimpFACS, a standardized coding system for measuring chimpanzee facial movements based on the FACS developed for humans, was applied. The data showed that the chimpanzees produced the same 14 configurations of open-mouth faces when laugh sounds were present and when they were absent. Chimpanzees, thus, produce these facial expressions flexibly without being morphologically constrained by the accompanying vocalizations. Furthermore, the data indicated that the facial expression plus vocalization and the facial expression alone were used differently in social play, i.e., when in physical contact with the playmates and when matching the playmates’ open-mouth faces. These findings provide empirical evidence that chimpanzees produce distinctive facial expressions independently from a vocalization, and that their multimodal use affects communicative meaning, important traits for a more explicit and versatile way of communication. As it is still uncertain how human laugh faces evolved, the ChimpFACS data were also used to empirically examine the evolutionary relation between the open-mouth faces with laugh sounds of chimpanzees and the laugh faces of humans. The ChimpFACS results revealed that laugh faces of humans must have gradually emerged from laughing open-mouth faces of ancestral apes. This work examines the main evolutionary

  8. Making Water Pollution a Problem in the Classroom Through Computer Assisted Instruction.

    Flowers, John D.

    Alternative means for dealing with water pollution control are presented for students and teachers. One computer oriented program is described in terms of teaching wastewater treatment and pollution concepts to middle and secondary school students. Suggestions are given to help teachers use a computer simulation program in their classrooms.…

  9. Data needs and computational requirements for ST decision making. Internal deliverable ID6.2.1

    Clement, Rémy; Tournebise, Pascal; Perkin, Samuel

    The objective of this deliverable is to present the requirements for adapting available tools/models and identifying data needs for probabilistic reliability analysis and optimal decision-making in the short-term decision-making process. It will serve as a basis for the next tasks of GARPUR work ...

  10. The Computational Complexity of Valuation and Motivational Forces in Decision-Making Processes.

    Redish, A David; Schultheiss, Nathan W; Carter, Evan C

    2016-01-01

    The concept of value is fundamental to most theories of motivation and decision making. However, value has to be measured experimentally. Different methods of measuring value produce incompatible valuation hierarchies. Taking the agent's perspective (rather than the experimenter's), we interpret the different valuation measurement methods as accessing different decision-making systems and show how these different systems depend on different information processing algorithms. This identifies the translation from these multiple decision-making systems into a single action taken by a given agent as one of the most important open questions in decision making today. We conclude by looking at how these different valuation measures accessing different decision-making systems can be used to understand and treat decision dysfunction such as in addiction.

  11. The Laugh Model: Reframing and Rebranding Public Health Through Social Media.

    Lister, Cameron; Royne, Marla; Payne, Hannah E; Cannon, Ben; Hanson, Carl; Barnes, Michael

    2015-11-01

    We examined the use of low-cost social media platforms in communicating public health messages and outline the laugh model, a framework through which public health organizations can reach and engage communities. In August 2014, we developed an online campaign (Web site and social media) to help promote healthy family meals in Utah in conjunction with the state and local health departments. By the end of September 2014, a total of 3641 individuals had visited the Utahfamilymeals.org Web site. Facebook ads reached a total of 29 078 people, and 56 900 people were reached through Twitter ads. The per-person price of the campaign was 0.2 cents, and the total estimated target population reach was between 10% and 12%. There are 3 key takeaways from our campaign: use of empowering and engaging techniques may be more effective than use of educational techniques; use of social media Web sites and online marketing tactics can enhance collaboration, interdisciplinary strategies, and campaign effectiveness; and use of social media as a communication platform is often preferable to use of mass media in terms of cost-effectiveness, more precise evaluations of campaign success, and increased sustainability.

  12. Nitrous oxide (laughing gas) inhalation as an alternative to electroconvulsive therapy.

    Milne, Brian

    2010-05-01

    Electroconvulsive therapy (ECT) is used widely in the treatment of psychiatric conditions; however, its use is not without controversy, with some recommending a moratorium on its clinical use. Complications and side effects of ECT include memory loss, injury, problems originating from sympathetic stimulation such as arrhythmias and myocardial ischemia, and the risks of general anesthesia. Nitrous oxide (laughing gas) could potentially substitute for ECT as it shares some similar effects, has potentially beneficial properties for these psychiatric patients, and is relatively safe and easy to administer. Nitrous oxide induces laughter, which has been described as nature's epileptoid catharsis and which one might surmise would be beneficial for depression. It also produces a central sympathetic stimulation similar to ECT and causes release of endogenous opioid peptides, which are potential candidates for the development of antidepressant drugs. Nitrous oxide is also associated with seizure-like activity itself. Administration of nitrous oxide as a substitute for ECT is eminently feasible and could be given in a series of treatments similar to ECT therapy.

  14. Laugh syncope as a rare sub-type of the situational syncopes: a case report

    Nishida Katsufumi

    2008-06-01

    Full Text Available Introduction: Laughter is a good medicine; it enhances cardiovascular health and the immune system. What happens, however, if a person laughs too much or the laughter becomes out of control? Laughter-induced syncope is rare and likely goes unrecognized by many health care providers. It is thought to be another form of Valsalva-induced syncope. Case presentation: We report the case of a 56-year-old, moderately obese (body mass index of 35) man with a past medical history of sleep apnea, hypertension and hyperlipidemia who suffered from syncope secondary to intense laughter. The patient also had a history of syncope in the distant past, when he collapsed on the floor for several seconds. Treadmill stress testing after the incident revealed no arrhythmia or ischemic disease, although he complained of dizziness after the test and a sudden drop in blood pressure was noted. Conclusion: Laughter-induced or gelastic syncope is extremely rare. It is thought to be a sub-type of the situational syncopes.

  15. Oral health promotion and education messages in Live.Learn.Laugh. projects.

    Horn, Virginie; Phantumvanit, Prathip

    2014-10-01

    The FDI-Unilever Live.Learn.Laugh. phase 2 partnership involved dissemination of the key oral health message of encouraging 'twice-daily toothbrushing with fluoride toothpaste' and education of people worldwide by FDI, National Dental Associations, the Unilever Oral Care global team and local brands. The dissemination and education process used different methodologies, each targeting specific groups, namely: mother and child (Project option A); schoolchildren (Project option B); dentists and patients (Project option C); and specific communities (Project option D). Altogether, the partnership implemented 29 projects in 27 countries. These consisted of educational interventions, evaluations including (in some cases) clinical assessment, together with communication activities at both global and local levels, to increase the reach of the message to a broader population worldwide. The phase 2 experience reveals the strength of such a public-private partnership approach in tackling global oral health issues by creating synergies between partners and optimising the promotion and education process. © 2014 FDI World Dental Federation.

  16. Computer simulation, rhetoric, and the scientific imagination how virtual evidence shapes science in the making and in the news

    Roundtree, Aimee Kendall

    2013-01-01

Computer simulations help advance climatology, astrophysics, and other scientific disciplines. They are also at the crux of several high-profile cases of science in the news. How do simulation scientists, with little or no direct observations, make decisions about what to represent? What is the nature of simulated evidence, and how do we evaluate its strength? Aimee Kendall Roundtree suggests answers in Computer Simulation, Rhetoric, and the Scientific Imagination. She interprets simulations in the sciences by uncovering the argumentative strategies that underpin the production and dissemination of simulations.

  17. On the usage of ultrasound computational models for decision making under ambiguity

    Dib, Gerges; Sexton, Samuel; Prowant, Matthew; Crawford, Susan; Diaz, Aaron

    2018-04-01

Computer modeling and simulation is becoming pervasive within the non-destructive evaluation (NDE) industry as a convenient tool for designing and assessing inspection techniques. This raises a pressing need for developing quantitative techniques for demonstrating the validity and applicability of the computational models. Computational models provide deterministic results based on deterministic and well-defined input, or stochastic results based on inputs defined by probability distributions. However, computational models cannot account for the effects of personnel, procedures, and equipment, resulting in ambiguity about the efficacy of inspections based on guidance from computational models only. In addition, ambiguity arises when model inputs, such as the representation of realistic cracks, cannot be defined deterministically, probabilistically, or by intervals. In this work, Pacific Northwest National Laboratory demonstrates the ability of computational models to represent field measurements under known variabilities, and quantifies the differences using maximum amplitude and power spectral density metrics. Sensitivity studies are also conducted to quantify the effects of different input parameters on the simulation results.
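The two agreement metrics named in this abstract, maximum amplitude and power spectral density, can be sketched in a few lines. This is an illustrative sketch, not the laboratory's actual implementation; the toy signal and the function names are invented for the example.

```python
import numpy as np

def max_amplitude_error(measured, simulated):
    """Relative difference in peak amplitude between a measured A-scan
    and its simulated counterpart."""
    peak_m = np.max(np.abs(measured))
    peak_s = np.max(np.abs(simulated))
    return abs(peak_s - peak_m) / peak_m

def power_spectral_density(signal, fs):
    """One-sided periodogram estimate of the power spectral density."""
    spectrum = np.fft.rfft(signal)
    return (np.abs(spectrum) ** 2) / (fs * len(signal))

# Toy comparison: a simulated echo 10% stronger than the measurement.
fs = 1.0e6                                  # 1 MHz sampling rate (assumed)
t = np.arange(0, 1e-3, 1 / fs)
measured = np.sin(2 * np.pi * 5e4 * t) * np.exp(-t / 2e-4)
simulated = 1.1 * measured
print(max_amplitude_error(measured, simulated))  # ~0.1 (10% amplitude mismatch)
```

Comparing the two PSD curves (for instance, by the ratio of their integrated power) then quantifies spectral agreement in the same spirit as the amplitude metric.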

  18. Two adults with multiple disabilities use a computer-aided telephone system to make phone calls independently.

    Lancioni, Giulio E; O'Reilly, Mark F; Singh, Nirbhay N; Sigafoos, Jeff; Oliva, Doretta; Alberti, Gloria; Lang, Russell

    2011-01-01

This study extended the assessment of a newly developed computer-aided telephone system with two participants (adults) who presented with blindness or severe visual impairment and motor or motor and intellectual disabilities. For each participant, the study was carried out according to an ABAB design, in which the A represented baseline phases and the B represented intervention phases, during which the special telephone system was available. The system involved, among other components, a netbook computer provided with specific software, a global system for mobile communication modem, and a microswitch. Both participants learned to use the system very rapidly and managed to make phone calls independently to a variety of partners such as family members, friends and staff personnel. The results were discussed in terms of the technology under investigation (its advantages, drawbacks, and need of improvement) and the social-communication impact it can make for persons with multiple disabilities. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. A Social Cognition Perspective on Human–Computer Trust: The Effect of Perceived Warmth and Competence on Trust in Decision-Making With Computers

    Philipp Kulms

    2018-06-01

Full Text Available Trust is a crucial guide in interpersonal interactions, helping people to navigate through social decision-making problems and cooperate with others. In human–computer interaction (HCI), trustworthy computer agents foster appropriate trust by supporting a match between their perceived and actual characteristics. As computers are increasingly endowed with capabilities for cooperation and intelligent problem-solving, it is critical to ask under which conditions people discern and distinguish trustworthy from untrustworthy technology. We present an interactive cooperation game framework allowing us to capture human social attributions that indicate trust in continued and interdependent human–agent cooperation. Within this framework, we experimentally examine the impact of two key dimensions of social cognition, warmth and competence, as antecedents of behavioral trust and self-reported trustworthiness attributions of intelligent computers. Our findings suggest that, first, people infer warmth attributions from unselfish vs. selfish behavior and competence attributions from competent vs. incompetent problem-solving. Second, warmth statistically mediates the relation between unselfishness and behavioral trust as well as between unselfishness and perceived trustworthiness. We discuss the possible role of human social cognition for human–computer trust.

  20. Using the calculational simulating complexes when making the computer process control systems for NPP

    Zimakov, V.N.; Chernykh, V.P.

    1998-01-01

The problems of creating calculational-simulating complexes (CSC) and of their application in developing the program and program-technical means for computer-aided process control systems at NPPs are considered. The above complex is based on an all-mode real-time mathematical model functioning on a special complex of computerized means

  1. Diagnosis and decision making in endodontics with the use of cone beam computed tomography

    Metska, M.E.

    2014-01-01

In the current thesis the use of cone beam computed tomography (CBCT) in endodontics has been evaluated within the framework of ex vivo and in vivo studies. The first objective of the thesis was to examine whether CBCT scans can be used for the detection of vertical root fractures in endodontically treated teeth.

  2. Practice Makes Perfect: Using a Computer-Based Business Simulation in Entrepreneurship Education

    Armer, Gina R. M.

    2011-01-01

    This article explains the use of a specific computer-based simulation program as a successful experiential learning model and as a way to increase student motivation while augmenting conventional methods of business instruction. This model is based on established adult learning principles.

  3. Cost analysis of hash collisions : will quantum computers make SHARCS obsolete?

    Bernstein, D.J.

    2009-01-01

Current proposals for special-purpose factorization hardware will become obsolete if large quantum computers are built: the number-field sieve scales much more poorly than Shor's quantum algorithm for factorization. Will all special-purpose cryptanalytic hardware become obsolete in a post-quantum world?

  4. Teaching Neuroanatomy Using Computer-Aided Learning: What Makes for Successful Outcomes?

    Svirko, Elena; Mellanby, Jane

    2017-01-01

    Computer-aided learning (CAL) is an integral part of many medical courses. The neuroscience course at Oxford University for medical students includes CAL course of neuroanatomy. CAL is particularly suited to this since neuroanatomy requires much detailed three-dimensional visualization, which can be presented on screen. The CAL course was…

  5. Computer-Supported Aids to Making Sense of Scientific Articles: Cognitive, Motivational, and Attitudinal Effects

    Gegner, Julie A.; Mackay, Donald H. J.; Mayer, Richard E.

    2009-01-01

    High school students can access original scientific research articles on the Internet, but may have trouble understanding them. To address this problem of online literacy, the authors developed a computer-based prototype for guiding students' comprehension of scientific articles. High school students were asked to read an original scientific…

  6. Impaired Flexible Reward-Based Decision-Making in Binge Eating Disorder: Evidence from Computational Modeling and Functional Neuroimaging.

    Reiter, Andrea M F; Heinze, Hans-Jochen; Schlagenhauf, Florian; Deserno, Lorenz

    2017-02-01

Despite its clinical relevance and the recent recognition as a diagnostic category in the DSM-5, binge eating disorder (BED) has rarely been investigated from a cognitive neuroscientific perspective targeting a more precise neurocognitive profiling of the disorder. BED patients suffer from a lack of behavioral control during recurrent binge eating episodes and thus fail to adapt their behavior in the face of negative consequences, e.g., high risk for obesity. To examine impairments in flexible reward-based decision-making, we exposed BED patients (n=22) and matched healthy individuals (n=22) to a reward-guided decision-making task during functional magnetic resonance imaging (fMRI). Performing fMRI analysis informed via computational modeling of choice behavior, we were able to identify specific signatures of altered decision-making in BED. On the behavioral level, we observed impaired behavioral adaptation in BED, which was due to enhanced switching behavior, a putative deficit in striking a balance between exploration and exploitation appropriately. This was accompanied by diminished activation related to exploratory decisions in the anterior insula/ventro-lateral prefrontal cortex. Moreover, although so-called model-free reward prediction errors remained intact, representation of ventro-medial prefrontal learning signatures, incorporating inference on unchosen options, was reduced in BED, which was associated with successful decision-making in the task. On the basis of a computational psychiatry account, the presented findings contribute to defining a neurocognitive phenotype of BED.
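The "model-free reward prediction errors" this abstract refers to are the delta-rule updates of standard reinforcement-learning models of choice. A minimal sketch of such a learner follows; the two-armed bandit, the parameter values, and all names here are illustrative assumptions, not the authors' task or fitted model.

```python
import math
import random

def softmax_choice(q, beta, rng):
    """Softmax action selection over two options; higher beta = more exploitation."""
    weights = [math.exp(beta * v) for v in q]
    r = rng.random() * sum(weights)
    return 0 if r < weights[0] else 1

def run_agent(p_reward, alpha=0.3, beta=3.0, trials=200, seed=0):
    """Model-free learner on a two-armed bandit. Q-values are updated by
    reward prediction errors: delta = reward - Q[choice]."""
    rng = random.Random(seed)
    q = [0.0, 0.0]
    for _ in range(trials):
        choice = softmax_choice(q, beta, rng)
        reward = 1.0 if rng.random() < p_reward[choice] else 0.0
        q[choice] += alpha * (reward - q[choice])  # delta-rule update
    return q

q = run_agent(p_reward=[0.8, 0.2])
print(q)  # Q-values approach the true reward probabilities
```

Fitting the learning rate alpha and inverse temperature beta to each participant's choices is what makes such a model "informative" for the fMRI analysis described above.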

  7. Feasibility of neuro-morphic computing to emulate error-conflict based decision making.

    Branch, Darren W.

    2009-09-01

A key aspect of decision making is determining when errors or conflicts exist in information and knowing whether to continue or terminate an action. Understanding error-conflict processing is crucial in order to emulate higher brain functions in hardware and software systems. Specific brain regions, most notably the anterior cingulate cortex (ACC), are known to respond to the presence of conflicts in information by assigning a value to an action. Essentially, this conflict signal triggers strategic adjustments in cognitive control, which serve to prevent further conflict. The most probable mechanism is that the ACC reports and discriminates different types of feedback, both positive and negative, that relate to different adaptations. Unique cells called spindle neurons that are primarily found in the ACC (layer Vb) are known to be responsible for cognitive dissonance (disambiguation between alternatives). Thus, the ACC, through a specific set of cells, likely plays a central role in the ability of humans to make difficult decisions and solve challenging problems in the midst of conflicting information. In addition to dealing with cognitive dissonance, decision making in high-consequence scenarios also relies on the integration of multiple sets of information (sensory, reward, emotion, etc.). Thus, a second area of interest for this proposal lies in the corticostriatal networks that serve as an integration region for multiple cognitive inputs. In order to engineer neurological decision-making processes in silicon devices, we will determine the key cells, inputs, and outputs of conflict/error detection in the ACC region. The second goal is to understand in vitro models of corticostriatal networks and the impact of physical deficits on decision making, specifically in stressful scenarios with conflicting streams of data from multiple inputs. We will elucidate the mechanisms of cognitive data integration in order to implement a future corticostriatal-like network in silicon

  8. Gene expression, glutathione status and indicators of hepatic oxidative stress in laughing gull (Larus atricilla) hatchlings exposed to methylmercury

    Jenko, Kathryn; Karouna-Renier, Natalie K.; Hoffman, David J.

    2012-01-01

Despite extensive studies of methylmercury (MeHg) toxicity in birds, molecular effects on birds are poorly characterized. To improve our understanding of toxicity pathways and identify novel indicators of avian exposure to Hg, the authors investigated genomic changes, glutathione status, and oxidative status indicators in liver from laughing gull (Larus atricilla) hatchlings that were exposed in ovo to MeHg (0.05–1.6 µg/g). Genes involved in the transsulfuration pathway, iron transport and storage, thyroid-hormone related processes, and cellular respiration were identified by suppression subtractive hybridization as differentially expressed. Quantitative polymerase chain reaction (qPCR) identified statistically significant effects of Hg on cytochrome C oxidase subunits I and II, transferrin, and methionine adenosyltransferase RNA expression. Glutathione-S-transferase activity and protein-bound sulfhydryl levels decreased, whereas glucose-6-phosphate dehydrogenase activity increased dose-dependently. Total sulfhydryl concentrations were significantly lower at 0.4 µg/g Hg than in controls. Together, these endpoints provided some evidence of compensatory effects, but little indication of oxidative damage at the tested doses, and suggest that sequestration of Hg through various pathways may be important for minimizing toxicity in laughing gulls. This is the first study to describe the genomic response of an avian species to Hg. Laughing gulls are among the less sensitive avian species with regard to Hg toxicity, and their ability to prevent hepatic oxidative stress may be important for surviving levels of MeHg exposure at which other species succumb.

  9. Improved Targeting Through Collaborative Decision-Making and Brain Computer Interfaces

    Stoica, Adrian; Barrero, David F.; McDonald-Maier, Klaus

    2013-01-01

This paper reports a first step toward a brain-computer interface (BCI) for collaborative targeting. Specifically, we explore, from a broad perspective, how the collaboration of a group of people can increase the performance on a simple target identification task. To this end, we requested a group of people to identify the location and color of a sequence of targets appearing on the screen and measured the time and accuracy of the response. The individual results are compared to a collective identification result determined by simple majority voting, with a random choice in the case of a draw. The results are promising, as the identification becomes significantly more reliable even with this simple voting and a small number of people (either an odd or even number) involved in the decision. In addition, the paper briefly analyzes the role of brain-computer interfaces in collaborative targeting, extending the targeting task by using a BCI instead of a mechanical response.
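The gain from simple majority voting can be reproduced with a short Monte Carlo sketch. The numbers here are invented for illustration (the individual accuracy is not taken from the study), and ties are broken by a coin flip as in the abstract's random-choice rule.

```python
import random

def group_accuracy(n_people, p_correct, trials=20_000, seed=1):
    """Monte Carlo estimate of majority-vote accuracy on a binary
    target-identification task; drawn votes are resolved at random."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        correct_votes = sum(rng.random() < p_correct for _ in range(n_people))
        if 2 * correct_votes > n_people:
            hits += 1
        elif 2 * correct_votes == n_people and rng.random() < 0.5:
            hits += 1  # tie resolved by coin flip
    return hits / trials

# One responder vs. a group of five, each 75% accurate individually.
print(group_accuracy(1, 0.75), group_accuracy(5, 0.75))
```

This is the Condorcet jury-theorem effect: as long as independent voters are individually better than chance, collective accuracy rises with group size.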

  10. Do Social Computing Make You Happy? A Case Study of Nomadic Children in Mixed Environments

    Christensen, Bent Guldbjerg

    2005-01-01

In this paper I describe a perspective on ambient, ubiquitous, and pervasive computing called the happiness perspective. By using the happiness perspective, the application domain and how the technology is used and experienced become a central and integral part of perceiving ambient technology. We will use the perspective in a case study on field test experiments with nomadic children in mixed environments using the eBag system.

  11. "Did You Climax or Are You Just Laughing at Me?" Rare Phenomena Associated With Orgasm.

    Reinert, Anna E; Simon, James A

    2017-07-01

The study of the human orgasm has shown a core set of physiologic and psychological symptoms experienced by most individuals. The study of the normal sheds light on the abnormal and has spotlighted rare physical and psychological symptoms experienced by some individuals in association with orgasm. These phenomena are rare and, as is typical of rare phenomena, their documentation in the medical literature is largely confined to case studies. To identify peri-orgasmic phenomena, defined as unusual physical or psychological symptoms subjectively experienced by some individuals as part of the orgasm response, distinct from the usual or normal orgasm response. A list of peri-orgasmic phenomena was made with help from sexual health colleagues and, using this list as a foundation, a literature search was performed of articles published in English. Publications included in this review report on physical or psychological phenomena at the time of orgasm that are distinct from the psychological, whole-body, and genito-pelvic sensations commonly experienced at the time of orgasm. Cases of physical symptoms related to the physiology of sexual intercourse and not specifically to orgasm were excluded. Case studies of peri-orgasmic phenomena were reviewed, including cases describing cataplexy (weakness), crying, dysorgasmia, dysphoria, facial and/or ear pain, foot pain, headache, pruritus, laughter, panic attack, post-orgasm illness syndrome, seizures, and sneezing. The literature review confirms the existence of diverse and frequently replicated peri-orgasmic phenomena. The value of case studies is in the collection and recording of observations so that hypotheses can be formed about the observed phenomena. Accordingly, this review could inspire further research on the neurophysiologic mechanisms of orgasm. Reinert AE, Simon JA. "Did You Climax or Are You Just Laughing at Me?" Rare Phenomena Associated With Orgasm. Sex Med Rev 2017;5:275-281. Copyright © 2017 International Society for Sexual Medicine.

  12. A Computational Approach to Characterizing the Impact of Social Influence on Individuals’ Vaccination Decision Making

    Xia, Shang; Liu, Jiming

    2013-01-01

In modeling individuals' vaccination decision making, existing studies have typically used the payoff-based (e.g., game-theoretical) approaches that evaluate the risks and benefits of vaccination. In reality, whether an individual takes vaccine or not is also influenced by the decisions of others, i.e., due to the impact of social influence. In this regard, we present a dual-perspective view on individuals' decision making that incorporates both the cost analysis of vaccination and the impact of social influence. In doing so, we consider a group of individuals making their vaccination decisions by both minimizing the associated costs and evaluating the decisions of others. We apply social impact theory (SIT) to characterize the impact of social influence with respect to individuals' interaction relationships. By doing so, we propose a novel modeling framework that integrates an extended SIT-based characterization of social influence with a game-theoretical analysis of cost minimization. We consider the scenario of voluntary vaccination against an influenza-like disease through a series of simulations. We investigate the steady state of individuals' decision making, and thus, assess the impact of social influence by evaluating the coverage of vaccination for infectious diseases control. Our simulation results suggest that individuals' high conformity to social influence will increase the vaccination coverage if the cost of vaccination is low, and conversely, will decrease it if the cost is high. Interestingly, if individuals are social followers, the resulting vaccination coverage would converge to a certain level, depending on individuals' initial level of vaccination willingness rather than the associated costs. We conclude that social influence will have an impact on the control of an infectious disease as it can affect the vaccination coverage. In this respect, our work can provide a means for modeling the impact of social influence as well as for estimating

  13. A computational approach to characterizing the impact of social influence on individuals' vaccination decision making.

    Xia, Shang; Liu, Jiming

    2013-01-01

In modeling individuals' vaccination decision making, existing studies have typically used the payoff-based (e.g., game-theoretical) approaches that evaluate the risks and benefits of vaccination. In reality, whether an individual takes vaccine or not is also influenced by the decisions of others, i.e., due to the impact of social influence. In this regard, we present a dual-perspective view on individuals' decision making that incorporates both the cost analysis of vaccination and the impact of social influence. In doing so, we consider a group of individuals making their vaccination decisions by both minimizing the associated costs and evaluating the decisions of others. We apply social impact theory (SIT) to characterize the impact of social influence with respect to individuals' interaction relationships. By doing so, we propose a novel modeling framework that integrates an extended SIT-based characterization of social influence with a game-theoretical analysis of cost minimization. We consider the scenario of voluntary vaccination against an influenza-like disease through a series of simulations. We investigate the steady state of individuals' decision making, and thus, assess the impact of social influence by evaluating the coverage of vaccination for infectious diseases control. Our simulation results suggest that individuals' high conformity to social influence will increase the vaccination coverage if the cost of vaccination is low, and conversely, will decrease it if the cost is high. Interestingly, if individuals are social followers, the resulting vaccination coverage would converge to a certain level, depending on individuals' initial level of vaccination willingness rather than the associated costs. We conclude that social influence will have an impact on the control of an infectious disease as it can affect the vaccination coverage. In this respect, our work can provide a means for modeling the impact of social influence as well as for estimating the
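A toy version of the cost-plus-social-influence update described in this abstract can illustrate the framework. This sketch is ours, not the authors' model: the function names, the ring network, and the linear blend of payoff preference with peer vaccination share are all assumptions made for the example.

```python
import random

def update_decisions(decisions, neighbors, cost_vacc, cost_inf, risk,
                     conformity, rng):
    """One synchronous round: each individual blends a payoff-based
    preference with the vaccinating fraction of neighbors (social impact)."""
    # payoff-based preference: vaccinate if expected infection cost > vaccine cost
    payoff_pref = 1.0 if risk * cost_inf > cost_vacc else 0.0
    new = []
    for i, _ in enumerate(decisions):
        peer_share = sum(decisions[j] for j in neighbors[i]) / len(neighbors[i])
        p_vacc = (1 - conformity) * payoff_pref + conformity * peer_share
        new.append(1 if rng.random() < p_vacc else 0)
    return new

# Ring network of 50 individuals, moderate conformity, cheap vaccine.
rng = random.Random(42)
n = 50
neighbors = [[(i - 1) % n, (i + 1) % n] for i in range(n)]
decisions = [int(rng.random() < 0.5) for _ in range(n)]
for _ in range(20):
    decisions = update_decisions(decisions, neighbors, cost_vacc=1.0,
                                 cost_inf=10.0, risk=0.5, conformity=0.7,
                                 rng=rng)
print(sum(decisions) / n)  # final vaccination coverage
```

Sweeping `conformity` and `cost_vacc` in such a loop is one way to reproduce the abstract's qualitative finding that conformity raises coverage when vaccination is cheap and lowers it when vaccination is expensive.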

  14. Making of attenuation-correcting computation table for RIs and emitted gamma ray table using MS-Excel

    Miura, Shigeyuki; Takahashi, Mitsuyuki; Sato, Isamu

    1995-01-01

At the technical workshop of the National Institute for Fusion Science last year, a report was made on the making of an attenuation-correcting computation table for RIs using the software Lotus 1-2-3 on MS-DOS. It was decided to use this table under Windows and, further, to add some functions to it. Excel 5.0 was chosen as the software, since Excel appears to be the mainstream spreadsheet for Windows. It was also decided to newly create a γ-ray data table linked to the radioactivity data in the RI attenuation-correcting computation table. The first task was to convert the RI attenuation-correcting computation table, made as a Lotus 1-2-3 file, into an Excel 5.0 file for Windows, which is very simple. As a result of the file conversion, it was found that the data file became compact. The next task was the addition of functions to this table. The function added this time judges, from the activity values calculated by the attenuation correction, whether the RIs are those stipulated in the laws or not. The concrete method of this addition of function is explained. The data table on γ-rays for the respective nuclides was made, and the present state of databases on radiation was investigated. (K.I.)
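The core of any such attenuation (decay) correction is the exponential decay law A = A0 · exp(−ln 2 · t / T½). A minimal sketch, in Python rather than a spreadsheet: the half-life values are standard reference figures, and the function name and table layout are ours, not the report's.

```python
import math

# Illustrative half-lives in days for a few common nuclides.
HALF_LIFE_DAYS = {"Co-60": 1925.3, "Cs-137": 11018.3, "I-131": 8.02}

def decay_corrected_activity(a0, nuclide, elapsed_days):
    """Activity after decay: A = A0 * exp(-ln(2) * t / T_half)."""
    t_half = HALF_LIFE_DAYS[nuclide]
    return a0 * math.exp(-math.log(2) * elapsed_days / t_half)

# One I-131 half-life: 100 units of activity decay to 50.
print(decay_corrected_activity(100.0, "I-131", 8.02))  # -> 50.0
```

A regulatory-threshold check like the one the abstract describes would then simply compare the corrected activity against the exemption value stipulated for each nuclide.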

  15. Human performance across decision making, selective attention, and working memory tasks: Experimental data and computer simulations

    Andrea Stocco

    2018-04-01

Full Text Available This article describes the data analyzed in the paper “Individual differences in the Simon effect are underpinned by differences in the competitive dynamics in the basal ganglia: An experimental verification and a computational model” (Stocco et al., 2017) [1]. The data includes behavioral results from participants performing three cognitive tasks (Probabilistic Stimulus Selection (Frank et al., 2004) [2], Simon task (Craft and Simon, 1970) [3], and Automated Operation Span (Unsworth et al., 2005) [4]), as well as simulated traces generated by a computational neurocognitive model that accounts for individual variations in human performance across the tasks. The experimental data encompasses individual data files (in both preprocessed and native output format) as well as group-level summary files. The simulation data includes the entire model code, the results of a full-grid search of the model's parameter space, and the code used to partition the model space and parallelize the simulations. Finally, the repository includes the R scripts used to carry out the statistical analyses reported in the original paper.

  16. Human performance across decision making, selective attention, and working memory tasks: Experimental data and computer simulations.

    Stocco, Andrea; Yamasaki, Brianna L; Prat, Chantel S

    2018-04-01

This article describes the data analyzed in the paper "Individual differences in the Simon effect are underpinned by differences in the competitive dynamics in the basal ganglia: An experimental verification and a computational model" (Stocco et al., 2017) [1]. The data includes behavioral results from participants performing three cognitive tasks (Probabilistic Stimulus Selection (Frank et al., 2004) [2], Simon task (Craft and Simon, 1970) [3], and Automated Operation Span (Unsworth et al., 2005) [4]), as well as simulated traces generated by a computational neurocognitive model that accounts for individual variations in human performance across the tasks. The experimental data encompasses individual data files (in both preprocessed and native output format) as well as group-level summary files. The simulation data includes the entire model code, the results of a full-grid search of the model's parameter space, and the code used to partition the model space and parallelize the simulations. Finally, the repository includes the R scripts used to carry out the statistical analyses reported in the original paper.

  17. Computer-Based Support of Decision Making Processes during Biological Incidents

    Karel Antos

    2010-04-01

Full Text Available The paper describes a contextual analysis of a general system that should provide computerized support of decision-making processes related to response operations in case of a biological incident. This analysis is focused on the information systems and information resources perspective and their integration using appropriate tools and technology. In the contextual design, the basic modules of the BioDSS system are suggested and further elaborated. The modules deal with incident description, scenario development, and recommendation of appropriate countermeasures. Proposals for further research are also included.

  18. A New Decision-Making Method for Stock Portfolio Selection Based on Computing with Linguistic Assessment

    Chen-Tung Chen

    2009-01-01

Full Text Available The purpose of stock portfolio selection is to allocate capital across a large number of stocks in order to bring the most profitable return for investors. In most of the past literature, experts considered the portfolio selection problem based only on past crisp or quantitative data. However, many qualitative and quantitative factors influence stock portfolio selection in real investment situations. It is very important for experts or decision-makers to use their experience or knowledge to predict the performance of each stock and make a stock portfolio. Because the knowledge, experience, and background of each expert are different and vague, different types of 2-tuple linguistic variables are suitable for expressing experts' opinions on the performance evaluation of each stock with respect to the criteria. According to the linguistic evaluations of experts, the linguistic TOPSIS and linguistic ELECTRE methods are combined to present a new decision-making method for dealing with stock selection problems in this paper. Once the investment set has been determined, the risk preferences of the investor are considered to calculate the investment ratio of each stock in the investment set. Finally, an example is implemented to demonstrate the practicability of the proposed method.
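The crisp core of the TOPSIS step, before the 2-tuple linguistic extension the paper uses, can be sketched as follows. The stock scores, weights, and criteria here are invented for illustration.

```python
import math

def topsis_rank(matrix, weights, benefit):
    """Crisp TOPSIS: closeness coefficient of each alternative to the ideal
    solution. matrix[i][j] is the score of alternative i on criterion j;
    benefit[j] marks criterion j as benefit (True) or cost (False)."""
    m, n = len(matrix), len(matrix[0])
    # vector-normalize each criterion column, then apply the weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
         for i in range(m)]
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col)
            for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)   # distance to the ideal solution
        d_neg = math.dist(row, anti)    # distance to the anti-ideal solution
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Three hypothetical stocks scored on return (benefit) and risk (cost).
scores = topsis_rank([[0.9, 0.2],
                      [0.6, 0.1],
                      [0.3, 0.4]],
                     weights=[0.5, 0.5],
                     benefit=[True, False])
print(scores)  # the first stock obtains the highest closeness coefficient
```

In the paper's method the inputs are 2-tuple linguistic assessments rather than crisp numbers, and an ELECTRE-style outranking step is combined with this closeness ranking.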

  19. Intersubjective decision-making for computer-aided forging technology design

    Kanyukov, S. I.; Konovalov, A. V.; Muizemnek, O. Yu.

    2017-12-01

    We propose a concept of intersubjective decision-making for problems of open-die forging technology design. The intersubjective decisions are chosen from a set of feasible decisions using the fundamentals of the decision-making theory in fuzzy environment according to the Bellman-Zadeh scheme. We consider the formalization of subjective goals and the choice of membership functions for the decisions depending on subjective goals. We study the arrangement of these functions into an intersubjective membership function. The function is constructed for a resulting decision, which is chosen from a set of feasible decisions. The choice of the final intersubjective decision is discussed. All the issues are exemplified by a specific technological problem. The considered concept of solving technological problems under conditions of fuzzy goals allows one to choose the most efficient decisions from a set of feasible ones. These decisions correspond to the stated goals. The concept allows one to reduce human participation in automated design. This concept can be used to develop algorithms and design programs for forging numerous types of forged parts.

  20. Global analysis of dynamical decision-making models through local computation around the hidden saddle.

    Laura Trotta

Full Text Available Bistable dynamical switches are frequently encountered in mathematical modeling of biological systems because binary decisions are at the core of many cellular processes. Bistable switches present two stable steady-states, each of them corresponding to a distinct decision. In response to a transient signal, the system can flip back and forth between these two stable steady-states, switching between both decisions. Understanding which parameters and states affect this switch between stable states may shed light on the mechanisms underlying the decision-making process. Yet, answering such a question involves analyzing the global dynamical (i.e., transient) behavior of a nonlinear, possibly high-dimensional model. In this paper, we show how a local analysis at a particular equilibrium point of bistable systems is highly relevant to understanding the global properties of the switching system. The local analysis is performed at the saddle point, an often disregarded equilibrium point of bistable models but one which is shown to be a key ruler of the decision-making process. Results are illustrated on three previously published models of biological switches: two models of apoptosis (programmed cell death) and one model of long-term potentiation, a phenomenon underlying synaptic plasticity.
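The role of the unstable equilibrium as the divider between the two decisions can already be seen in a one-dimensional toy switch. This is our illustration, not one of the cited apoptosis or LTP models; in one dimension the "saddle" reduces to a simple unstable fixed point.

```python
def simulate(x0, u=0.0, dt=0.01, steps=5000):
    """Euler integration of dx/dt = x - x**3 + u, a canonical bistable
    system: stable steady-states near x = +1 and x = -1, and an unstable
    equilibrium at x = 0 (for u = 0) separating their basins."""
    x = x0
    for _ in range(steps):
        x += dt * (x - x ** 3 + u)
    return x

# Starting on either side of the unstable equilibrium commits the system
# to the corresponding decision (stable steady-state).
print(simulate(0.1), simulate(-0.1))  # converges near +1 and near -1
```

In higher-dimensional models the analogous separating object is the stable manifold of the saddle point, which is why the paper's local analysis at the saddle informs the global switching behavior.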

  1. Computational modeling for eco engineering: Making the connections between engineering and ecology (Invited)

    Bowles, C.

    2013-12-01

    Ecological engineering, or eco engineering, is an emerging field in the study of integrating ecology and engineering, concerned with the design, monitoring, and construction of ecosystems. According to Mitsch (1996) 'the design of sustainable ecosystems intends to integrate human society with its natural environment for the benefit of both'. Eco engineering emerged as a new idea in the early 1960s, and the concept has seen refinement since then. As a commonly practiced field of engineering it is relatively novel. Howard Odum (1963) and others first introduced it as 'utilizing natural energy sources as the predominant input to manipulate and control environmental systems'. Mitsch and Jorgensen (1989) were the first to define eco engineering, to provide eco engineering principles and conceptual eco engineering models. Later they refined the definition and increased the number of principles. They suggested that the goals of eco engineering are: a) the restoration of ecosystems that have been substantially disturbed by human activities such as environmental pollution or land disturbance, and b) the development of new sustainable ecosystems that have both human and ecological values. Here a more detailed overview of eco engineering is provided, particularly with regard to how engineers and ecologists are utilizing multi-dimensional computational models to link ecology and engineering, resulting in increasingly successful project implementation. Descriptions are provided pertaining to 1-, 2- and 3-dimensional hydrodynamic models and their use at small- and large-scale applications. A range of conceptual models that have been developed to aid in the creation of linkages between ecology and engineering are discussed. Finally, several case studies that link ecology and engineering via computational modeling are provided.
These studies include localized stream rehabilitation, spawning gravel enhancement on a large river system, and watershed-wide floodplain modeling of

  2. Women planning to major in computer science: Who are they and what makes them unique?

    Lehman, Kathleen J.; Sax, Linda J.; Zimmerman, Hilary B.

    2016-12-01

    Despite the current growing popularity of the computer science (CS) major, women remain sorely underrepresented in the field, continuing to earn only 18% of bachelor's degrees. Understanding women's low rates of participation in CS is important given that the demand for individuals with CS training has grown sharply in recent years. Attracting and retaining more women to high-paying fields like CS may also help narrow the gender pay gap. Further, it is important that women participate in developing new technology so that technology advances serve the needs of both women and men. This paper explores the background characteristics, career aspirations, and self-perceptions of 1636 female first-year college students in the United States who intend to major in CS and compares them with 4402 male CS aspirants as well as with 26,642 women planning to major in other STEM sub-fields. The findings reveal a unique profile of women who pursue the CS major and note many significant differences between men and women in CS and between women in CS and those in other STEM fields. For instance, women in CS tend to earn lower high school grades than women in other STEM fields, but earn higher SAT verbal scores. They also rate themselves higher than men in CS and women in other STEM fields on measures of their artistic ability, but rate themselves lower on other self-ratings, including academic and leadership ability. Further, women in CS are more likely to be undecided in their career plans than men in CS and women in other STEM fields. Understanding the unique characteristics of women in CS will help inform policies and recruitment programs designed to address the gender gap in computing.

  3. INTEGRATED ON-BOARD COMPUTING SYSTEMS: PRESENT SITUATION REVIEW AND DEVELOPMENT PROSPECTS ANALYSIS IN THE AVIATION INSTRUMENT-MAKING INDUSTRY

    P. P. Paramonov

    2013-03-01

    The article reviews the present situation and analyses the development prospects of integrated on-board computing systems used in the aviation instrument-making industry. The main attention is paid to projects carried out in the framework of integrated modular avionics. Hierarchical design levels of modules, crates (onboard systems), and aviation complexes are considered in detail. Examples of existing products from our country and from abroad are given with brief technical characteristics, along with an extensive bibliography on the subject matter.

  4. Decision-making in stimulant and opiate addicts in protracted abstinence: evidence from computational modeling with pure users.

    Ahn, Woo-Young; Vasilev, Georgi; Lee, Sung-Ha; Busemeyer, Jerome R; Kruschke, John K; Bechara, Antoine; Vassileva, Jasmin

    2014-01-01

    Substance dependent individuals (SDI) often exhibit decision-making deficits; however, it remains unclear whether the nature of the underlying decision-making processes is the same in users of different classes of drugs and whether these deficits persist after discontinuation of drug use. We used computational modeling to address these questions in a unique sample of relatively "pure" amphetamine-dependent (N = 38) and heroin-dependent individuals (N = 43) who were currently in protracted abstinence, and in 48 healthy controls (HC). A Bayesian model comparison technique, a simulation method, and parameter recovery tests were used to compare three cognitive models: (1) Prospect Valence Learning with decay reinforcement learning rule (PVL-DecayRI), (2) PVL with delta learning rule (PVL-Delta), and (3) Value-Plus-Perseverance (VPP) model based on Win-Stay-Lose-Switch (WSLS) strategy. The model comparison results indicated that the VPP model, a hybrid model of reinforcement learning (RL) and a heuristic strategy of perseverance had the best post-hoc model fit, but the two PVL models showed better simulation and parameter recovery performance. Computational modeling results suggested that overall all three groups relied more on RL than on a WSLS strategy. Heroin users displayed reduced loss aversion relative to HC across all three models, which suggests that their decision-making deficits are longstanding (or pre-existing) and may be driven by reduced sensitivity to loss. In contrast, amphetamine users showed comparable cognitive functions to HC with the VPP model, whereas the second best-fitting model with relatively good simulation performance (PVL-DecayRI) revealed increased reward sensitivity relative to HC. These results suggest that some decision-making deficits persist in protracted abstinence and may be mediated by different mechanisms in opiate and stimulant users.
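
The PVL-Delta model named in this record combines a prospect-style utility function, a delta-rule expectancy update, and softmax action selection. The sketch below follows common published descriptions of that model family; the parameter values and function names are illustrative assumptions, not the authors' code or fitted parameters.

```python
import math
import random

# Hedged sketch of a PVL-Delta-style agent. alpha shapes the utility curve,
# lam > 1 encodes loss aversion (losses loom larger than gains), lr is the
# delta-rule learning rate, theta the softmax sensitivity.

def utility(x, alpha=0.5, lam=2.0):
    # Prospect-style valuation of a payoff x
    return x**alpha if x >= 0 else -lam * (abs(x) ** alpha)

def update(ev, choice, payoff, lr=0.1, alpha=0.5, lam=2.0):
    # Delta rule: move the chosen option's expectancy toward the utility
    ev[choice] += lr * (utility(payoff, alpha, lam) - ev[choice])
    return ev

def softmax_choice(ev, theta=1.0, rng=random):
    # Sample an option with probability proportional to exp(theta * EV)
    weights = [math.exp(theta * v) for v in ev]
    r = rng.random() * sum(weights)
    for i, w in enumerate(weights):
        r -= w
        if r <= 0:
            return i
    return len(ev) - 1

ev = [0.0, 0.0]
ev = update(ev, 0, 100)   # win 100 on option 0: utility = 10, ev[0] -> 1.0
ev = update(ev, 1, -100)  # lose 100 on option 1: utility = -20, ev[1] -> -2.0
```

Reduced loss aversion, as reported for the heroin group, corresponds to a smaller `lam`: losses are discounted less heavily, so loss-laden options retain higher expectancies.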

  5. Decision-making in stimulant and opiate addicts in protracted abstinence: evidence from computational modeling with pure users

    Woo-Young eAhn

    2014-08-01

    Substance dependent individuals (SDI) often exhibit decision-making deficits; however, it remains unclear whether the nature of the underlying decision-making processes is the same in users of different classes of drugs and whether these deficits persist after discontinuation of drug use. We used computational modeling to address these questions in a unique sample of relatively pure amphetamine-dependent (N=38) and heroin-dependent individuals (N=43) who were currently in protracted abstinence, and in 48 healthy controls. A Bayesian model comparison technique, a simulation method, and parameter recovery tests were used to compare three cognitive models: (1) Prospect Valence Learning with decay reinforcement learning rule (PVL-DecayRI), (2) PVL with delta learning rule (PVL-Delta), and (3) Value-Plus-Perseverance (VPP) models based on Win-Stay-Lose-Switch (WSLS) strategy. The model comparison results indicated that the VPP model, a hybrid model of reinforcement learning (RL) and a heuristic strategy of perseverance, had the best post hoc model fit, but the two PVL models showed better simulation performance. Computational modeling results suggested that overall all three groups relied more on RL than on a WSLS strategy. Heroin users displayed reduced loss aversion relative to healthy controls across all three models, which suggests that their decision-making deficits are longstanding (or pre-existing) and may be driven by reduced sensitivity to loss. In contrast, amphetamine users showed comparable cognitive functions to healthy controls with the VPP model, whereas the second best-fitting model with relatively good simulation performance (PVL-DecayRI) revealed increased reward sensitivity relative to healthy controls. These results suggest that some decision-making deficits persist in protracted abstinence and may be mediated by different mechanisms in opiate and stimulant users.

  6. A new computational account of cognitive control over reinforcement-based decision-making: Modeling of a probabilistic learning task.

    Zendehrouh, Sareh

    2015-11-01

    Recent work in the decision-making field offers an account of dual-system theory for the decision-making process. This theory holds that the process is conducted by two main controllers: a goal-directed system and a habitual system. In the reinforcement learning (RL) domain, habitual behaviors are connected with model-free methods, in which appropriate actions are learned through trial-and-error experience, whereas goal-directed behaviors are associated with model-based methods of RL, in which actions are selected using a model of the environment. Studies on cognitive control also suggest that during processes like decision-making, some cortical and subcortical structures work in concert to monitor the consequences of decisions and to adjust control according to current task demands. Here a computational model is presented based on dual-system theory and the cognitive control perspective of decision-making. The proposed model is used to simulate human performance on a variant of a probabilistic learning task. The basic proposal is that the brain implements a dual controller, while an accompanying monitoring system detects several kinds of conflict, including a hypothetical cost conflict. The simulation results address existing theories about two event-related potentials, namely error-related negativity (ERN) and feedback-related negativity (FRN), and explore the best account of them. Based on the results, some testable predictions are also presented. Copyright © 2015 Elsevier Ltd. All rights reserved.
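
The model-free/model-based distinction in this record can be illustrated with a minimal dual controller. This is a generic textbook-style sketch, not the paper's model: the class name, the mixing weight `w` (standing in for cognitive control), and the simple mean-reward world model are all assumptions made for illustration.

```python
from collections import defaultdict

# Illustrative dual controller: a model-free Q-value learned by a delta rule,
# a model-based value read out from a learned reward model, and a weight w
# that arbitrates between the two systems.

class DualController:
    def __init__(self, actions, lr=0.1, w=0.5):
        self.q = defaultdict(float)      # model-free values Q[(state, action)]
        self.model = defaultdict(list)   # learned model: (state, action) -> rewards seen
        self.actions, self.lr, self.w = actions, lr, w

    def value(self, s, a):
        obs = self.model[(s, a)]
        mb = sum(obs) / len(obs) if obs else 0.0   # model-based: mean predicted reward
        return self.w * mb + (1 - self.w) * self.q[(s, a)]

    def learn(self, s, a, r):
        self.q[(s, a)] += self.lr * (r - self.q[(s, a)])  # habitual (model-free) update
        self.model[(s, a)].append(r)                       # goal-directed model update

agent = DualController(actions=[0, 1])
for _ in range(10):
    agent.learn("s0", 0, 1.0)   # action 0 is rewarded
    agent.learn("s0", 1, 0.0)   # action 1 is not
best = max(agent.actions, key=lambda a: agent.value("s0", a))  # -> 0
```

Shifting `w` toward 1 models control being allocated to the goal-directed system; a conflict-monitoring signal of the kind the paper proposes would adjust such a weight according to task demands.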

  7. The fear of being laughed at as additional diagnostic criterion in social anxiety disorder and avoidant personality disorder?

    Havranek, Michael M; Volkart, Fleur; Bolliger, Bianca; Roos, Sophie; Buschner, Maximilian; Mansour, Ramin; Chmielewski, Thomas; Gaudlitz, Katharina; Hättenschwiler, Josef; Seifritz, Erich; Ruch, Willibald

    2017-01-01

    Social anxiety disorder (SAD) is the most common anxiety disorder and has considerable negative impact on social functioning, quality of life, and career progression of those affected. Gelotophobia (the fear of being laughed at) shares many similarities and has therefore been proposed as a subtype of SAD. This hypothesis has, however, never been tested in a clinical sample. Thus, the relationship between gelotophobia, SAD and avoidant personality disorder (APD) was investigated by examining a sample of 133 participants (64 psychiatric patients and 69 healthy controls matched for age and sex) using the Structured Clinical Interview for the Diagnostic and Statistical Manual of Mental Disorders (4th edition) and an established rating instrument for gelotophobia (GELOPH). As expected, gelotophobia scores and the number of gelotophobic individuals were significantly higher among patients with SAD (n = 22) and APD (n = 12) compared to healthy controls and other psychiatric patients. Furthermore, gelotophobia scores were highest in patients suffering from both SAD and APD. In fact, all patients suffering from both disorders were also suffering from gelotophobia. As explained in the discussion, the observed data did not suggest that gelotophobia is a subtype of SAD. The findings rather imply that the fear of being laughed at is a symptom characteristic for both SAD and APD. Based on that, gelotophobia may prove to be a valuable additional diagnostic criterion for SAD and APD and the present results also contribute to the ongoing debate on the relationship between SAD and APD.

  8. The fear of being laughed at as additional diagnostic criterion in social anxiety disorder and avoidant personality disorder?

    Michael M Havranek

    Social anxiety disorder (SAD) is the most common anxiety disorder and has considerable negative impact on social functioning, quality of life, and career progression of those affected. Gelotophobia (the fear of being laughed at) shares many similarities and has therefore been proposed as a subtype of SAD. This hypothesis has, however, never been tested in a clinical sample. Thus, the relationship between gelotophobia, SAD and avoidant personality disorder (APD) was investigated by examining a sample of 133 participants (64 psychiatric patients and 69 healthy controls matched for age and sex) using the Structured Clinical Interview for the Diagnostic and Statistical Manual of Mental Disorders (4th edition) and an established rating instrument for gelotophobia (GELOPH). As expected, gelotophobia scores and the number of gelotophobic individuals were significantly higher among patients with SAD (n = 22) and APD (n = 12) compared to healthy controls and other psychiatric patients. Furthermore, gelotophobia scores were highest in patients suffering from both SAD and APD. In fact, all patients suffering from both disorders were also suffering from gelotophobia. As explained in the discussion, the observed data did not suggest that gelotophobia is a subtype of SAD. The findings rather imply that the fear of being laughed at is a symptom characteristic for both SAD and APD. Based on that, gelotophobia may prove to be a valuable additional diagnostic criterion for SAD and APD and the present results also contribute to the ongoing debate on the relationship between SAD and APD.

  9. Diets of nesting laughing gulls (Larus atricilla) at the Virginia Coast Reserve: observations from stable isotope analysis

    Knoff, A.J.; Macko, S.A.; Erwin, R.M.

    2001-01-01

    Food web studies often ignore details of temporal, spatial, and intrapopulation dietary variation in top-level consumers. In this study, intrapopulation dietary variation of a dominant carnivore, the Laughing Gull (Larus atricilla), was examined using carbon, nitrogen, and sulfur isotope analysis of gull tissues as well as their prey (fish, invertebrates, and insects) from the Virginia Coast Reserve estuarine system. As earlier traditional diet studies found evidence of individual dietary specialization within gull populations, this study used stable isotope analysis to assess specialization in a coastal Laughing Gull population. Specifically, blood, muscle, and feather isotope values indicated significant intrapopulation dietary specialization. Some gulls relied more heavily on estuarine prey (mean blood δ13C = -17.5, δ15N = 12.6, and δ34S = 9.3), whereas others appeared to consume more foods of marine origin (mean blood δ13C = -19.4, δ15N = 14.8, and δ34S = 10.4). It is important to account for such dietary variability when assessing trophic linkages in dynamic estuarine systems.

  10. COMPUTING

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  11. COMPUTING

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  12. Clean air, clear market. Making emissions trading work: The role of a computer-assisted auction

    Bartels, C.W.; Marron, D.B.; Lipsky, M.I.

    1993-01-01

    Creating a new commodity presents the chance to develop new markets in which to trade it. In many cases, existing markets can be adapted easily; in other cases it proves worthwhile to develop new forms that reflect special characteristics of the commodity and those who trade it. In the case of the sulfur dioxide (SO2) emission allowances created by the Clean Air Act Amendments of 1990, a number of standard market forms already have been adopted. While these will prove useful for handling some transactions, a new Market Clearing Auction (MCA) offers buyers and sellers a centralized marketplace for trading SO2 emission allowances. The MCA, which was developed by the brokerage firm Cantor Fitzgerald, is a computer-assisted "smart" auction designed to replicate the outcome of an efficient market in emission allowances, and accepts bids and offers for any possible combination of allowances. Orders can be submitted for streams of allowances covering more than one year. The auction then determines the combination of bids and offers that maximizes the gains from trades in the market, and establishes uniform market clearing prices for each allowance issue (1995, 1996, and so on). Once executed, trades are settled on a cash-forward basis; that is, allowances are delivered and payments are made at future dates
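
The idea of a uniform market-clearing price can be sketched with a generic double-auction clearing rule. This is not Cantor Fitzgerald's actual MCA algorithm (which also handled multi-year streams and combinations); it only illustrates how a single clearing price maximises matched trades for one allowance issue.

```python
# Generic uniform-price clearing sketch (illustrative, not the MCA itself).
# Sort bids high-to-low and offers low-to-high; trades are feasible while the
# best remaining bid still meets the best remaining offer. All matched trades
# settle at one uniform price.

def clear_market(bids, offers):
    bids = sorted(bids, reverse=True)
    offers = sorted(offers)
    matched = 0
    while (matched < min(len(bids), len(offers))
           and bids[matched] >= offers[matched]):
        matched += 1
    if matched == 0:
        return 0, None
    # One common convention: price the whole market at the midpoint of the
    # marginal (last matched) bid/offer pair.
    price = (bids[matched - 1] + offers[matched - 1]) / 2
    return matched, price

trades, price = clear_market(bids=[150, 140, 120, 100],
                             offers=[90, 110, 130, 160])
# -> 2 trades at a uniform price of 125.0
```

Every matched buyer bid at least the clearing price and every matched seller asked at most that price, so the uniform price extracts the available gains from trade, which is the efficiency property the MCA was designed to replicate.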

  13. Dual Coding Theory Explains Biphasic Collective Computation in Neural Decision-Making

    Bryan C. Daniels

    2017-06-01

    A central question in cognitive neuroscience is how unitary, coherent decisions at the whole organism level can arise from the distributed behavior of a large population of neurons with only partially overlapping information. We address this issue by studying neural spiking behavior recorded from a multielectrode array with 169 channels during a visual motion direction discrimination task. It is well known that in this task there are two distinct phases in neural spiking behavior. Here we show Phase I is a distributed or incompressible phase in which uncertainty about the decision is substantially reduced by pooling information from many cells. Phase II is a redundant or compressible phase in which numerous single cells contain all the information present at the population level in Phase I, such that the firing behavior of a single cell is enough to predict the subject's decision. Using an empirically grounded dynamical modeling framework, we show that in Phase I large cell populations with low redundancy produce a slow timescale of information aggregation through critical slowing down near a symmetry-breaking transition. Our model indicates that increasing collective amplification in Phase II leads naturally to a faster timescale of information pooling and consensus formation. Based on our results and others in the literature, we propose that a general feature of collective computation is a “coding duality” in which there are accumulation and consensus formation processes distinguished by different timescales.

  14. Dual Coding Theory Explains Biphasic Collective Computation in Neural Decision-Making.

    Daniels, Bryan C; Flack, Jessica C; Krakauer, David C

    2017-01-01

    A central question in cognitive neuroscience is how unitary, coherent decisions at the whole organism level can arise from the distributed behavior of a large population of neurons with only partially overlapping information. We address this issue by studying neural spiking behavior recorded from a multielectrode array with 169 channels during a visual motion direction discrimination task. It is well known that in this task there are two distinct phases in neural spiking behavior. Here we show Phase I is a distributed or incompressible phase in which uncertainty about the decision is substantially reduced by pooling information from many cells. Phase II is a redundant or compressible phase in which numerous single cells contain all the information present at the population level in Phase I, such that the firing behavior of a single cell is enough to predict the subject's decision. Using an empirically grounded dynamical modeling framework, we show that in Phase I large cell populations with low redundancy produce a slow timescale of information aggregation through critical slowing down near a symmetry-breaking transition. Our model indicates that increasing collective amplification in Phase II leads naturally to a faster timescale of information pooling and consensus formation. Based on our results and others in the literature, we propose that a general feature of collective computation is a "coding duality" in which there are accumulation and consensus formation processes distinguished by different timescales.

  15. Monkeys Wait to Begin a Computer Task when Waiting Makes Their Responses More Effective

    Theodore A. Evans

    2014-02-01

    Rhesus monkeys (Macaca mulatta) and capuchin monkeys (Cebus apella) performed a computerized inhibitory control task modeled after an “escalating interest task” from a recent human study (Young, Webb, & Jacobs, 2011). In the original study, which utilized a first-person shooter game, human participants learned to inhibit firing their simulated weapon long enough for the weapon's damage potential to grow in effectiveness (up to 10 seconds in duration). In the present study, monkeys earned food pellets for eliminating arrays of target objects using a digital eraser. We assessed whether monkeys could suppress trial-initiating joystick movements long enough for the eraser to grow in size and speed, thereby making their eventual responses more effective. Monkeys of both species learned to inhibit moving the eraser for as long as 10 seconds, and they allowed the eraser to grow larger for successively larger target arrays. This study demonstrates an interesting parallel in behavioral inhibition between human and nonhuman participants and provides a method for future comparative testing of human and nonhuman test groups.

  16. Hypothetical decision making in schizophrenia: The role of expected value computation and “irrational” biases

    Brown, Jaime K.; Waltz, James A.; Strauss, Gregory P.; McMahon, Robert P.; Frank, Michael J.; Gold, James M.

    2013-01-01

    The aim of the present study was to examine the contributions to decision making (DM) deficits in schizophrenia (SZ) patients of expected value (EV) estimation and loss aversion. Patients diagnosed with SZ (n=46) and healthy controls (n=34) completed two gambling tasks. In one task, participants chose between two options with the same EV across two conditions: Loss frames and Keep frames. A second task involved accepting or rejecting gambles, in which gain and loss amounts varied, determining the EV of each trial. SZ patients showed a reduced “framing effect” relative to controls, as they did not show an increased tendency to gamble when faced with a certain loss. SZ patients also showed a reduced tendency to modify behavior as a function of EV. The degree to which choices tracked EV correlated significantly with several cognitive measures in both patients and controls. SZ patients show distinct deviations from normal behavior under risk when their decisions are based on prospective outcomes. These deviations are two-fold: cognitive deficits prevent value-based DM in more-impaired patients, and in less-impaired patients there is a lack of influence from well-established subjective biases found in healthy people. These abnormalities likely affect everyday DM strategies in schizophrenia patients. PMID:23664664
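
The interplay of EV computation and loss aversion in such accept/reject gamble tasks is often modelled with a prospect-theory-style valuation. The sketch below is a generic illustration of that idea, not the study's fitted model; the 50/50 probabilities, the weight `lam`, and the numbers are assumptions.

```python
# Toy valuation of a 50/50 gamble: a loss-aversion weight lam > 1 makes a
# decision-maker reject gambles whose plain expected value is positive,
# while lam close to 1 models reduced influence of the loss-aversion bias.

def subjective_value(gain, loss, lam):
    # Gamble: win `gain` with p = 0.5, lose `loss` with p = 0.5
    return 0.5 * gain - 0.5 * lam * loss

# A gamble with plain EV = +2.5 (win 25 / lose 20):
accept_typical = subjective_value(25, 20, lam=2.0) > 0  # rejected (value -7.5)
accept_reduced = subjective_value(25, 20, lam=1.0) > 0  # accepted (value +2.5)
```

Under this reading, "lack of influence from well-established subjective biases" corresponds to choices tracking the unweighted EV (`lam` near 1) rather than the loss-averse subjective value typical of healthy controls.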

  17. Advertising, patient decision making, and self-referral for computed tomographic and magnetic resonance imaging.

    Illes, Judy; Kann, Dylan; Karetsky, Kim; Letourneau, Phillip; Raffin, Thomas A; Schraedley-Desmond, Pamela; Koenig, Barbara A; Atlas, Scott W

    Self-referred imaging is one of the latest health care services to be marketed directly to consumers. Most aspects of these services are unregulated, and little is known about the messages in advertising used to attract potential consumers. We conducted a detailed analysis of print advertisements and informational brochures for self-referred imaging with respect to themes, content, accuracy, and emotional valence. Forty print advertisements from US newspapers around the country and 20 informational brochures were analyzed by 2 independent raters according to 7 major themes: health care technology; emotion, empowerment, and assurance; incentives; limited supporting evidence; popular appeal; statistics; and images. The Fisher exact test was used to identify significant differences in information content. Both the advertisements and the brochures emphasized health care and technology information and provided assurances of good health and incentives to self-refer. These materials also encouraged consumers to seek further information from company resources; virtually none referred to noncomplying sources of information or to the risks of having a scan. Images of people commonly portrayed European Americans. We found statistical differences between newspaper advertisements and mailed brochures for references to "prevalence of disease", which appeared in 15 of the 40 advertisements and in 25% of the brochures (n = 5). Direct-to-consumer marketing of self-referred imaging services, in both print advertisements and informational brochures, fails to provide prospective consumers with comprehensive balanced information vital to informed autonomous decision making. Professional guidelines and oversight for advertising and promotion of these services are needed.

  18. Hypothetical decision making in schizophrenia: the role of expected value computation and "irrational" biases.

    Brown, Jaime K; Waltz, James A; Strauss, Gregory P; McMahon, Robert P; Frank, Michael J; Gold, James M

    2013-09-30

    The aim of the present study was to examine the contributions to decision making (DM) deficits in schizophrenia (SZ) patients of expected value (EV) estimation and loss aversion. Patients diagnosed with SZ (n=46) and healthy controls (n=34) completed two gambling tasks. In one task, participants chose between two options with the same EV across two conditions: Loss frames and Keep frames. A second task involved accepting or rejecting gambles, in which gain and loss amounts varied, determining the EV of each trial. SZ patients showed a reduced "framing effect" relative to controls, as they did not show an increased tendency to gamble when faced with a certain loss. SZ patients also showed a reduced tendency to modify behavior as a function of EV. The degree to which choices tracked EV correlated significantly with several cognitive measures in both patients and controls. SZ patients show distinct deviations from normal behavior under risk when their decisions are based on prospective outcomes. These deviations are two-fold: cognitive deficits prevent value-based DM in more-impaired patients, and in less-impaired patients there is a lack of influence from well-established subjective biases found in healthy people. These abnormalities likely affect everyday DM strategies in schizophrenia patients. © 2013 Elsevier Ireland Ltd. All rights reserved.

  19. Why do we laugh at misfortunes? An electrophysiological exploration of comic situation processing.

    Manfredi, Mirella; Adorni, Roberta; Proverbio, Alice Mado

    2014-08-01

    The goal of the present study was to shed some light on a particular kind of humour, called slapstick, by measuring brain bioelectrical activity during the perception of funny vs. non-funny pictures involving misfortunate circumstances. According to our hypothesis, the element mostly providing a comic feature in a misfortunate situation is the facial expression of the victims: the observer's reaction will usually be laughter only if the victims show a funny bewildered face and not a painful or angry expression. Several coloured photographs depicting people involved in misfortunate situations were presented to 30 Italian healthy volunteers, while their EEG was recorded. Three different situations were considered: people showing a painful or an angry expression (Affective); people showing a bewildered expression and thus a comic look (Comic); people engaged in similar misfortunate situations but with no face visible (No Face). Results showed that the mean amplitude of both the posterior N170 and anterior N220 components was much larger in response to comic pictures than to the other stimuli. This early response could be considered the first identification of a comic element and evidence of the compelling, automatic response that usually characterizes people's amused reaction to a misfortune. In addition, we observed a larger P300 amplitude in response to comic than affective pictures, probably reflecting a more conscious processing of the comic element. Finally, no face pictures elicited an anteriorly distributed N400, which might reflect the effort to comprehend the nature of the situation displayed without any affective facial information, and a late positivity, possibly indexing a re-analysis processing of the unintelligible misfortunate situation (comic or unhappy) depicted in the No Face stimuli. These data support the hypothesis that the facial expression of the victims acts as a specific trigger for the amused feeling that observers usually

  20. A computer-tailored intervention to promote informed decision making for prostate cancer screening among African-American men

    Allen, Jennifer D.; Mohllajee, Anshu P.; Shelton, Rachel C.; Drake, Bettina F.; Mars, Dana R.

    2010-01-01

    African-American men experience a disproportionate burden of prostate cancer (CaP) morbidity and mortality. National screening guidelines advise men to make individualized screening decisions through a process termed “informed decision making” (IDM). In this pilot study, a computer-tailored decision-aid designed to promote IDM was evaluated using a pre/post test design. African-American men aged 40+ were recruited from a variety of community settings (n=108). At pre-test, 43% of men reported having made a screening decision; at post-test 47% reported this to be the case (p=0.39). Significant improvements were observed on scores (0–100%) of knowledge (54% vs 72%). Men were also more likely to want an active role in decision-making after using the tool (67% vs 75%; p=0.03). These results suggest that use of a computer-tailored decision-aid is a promising strategy to promote IDM for CaP screening among African-American men. PMID:19477736

  1. How to make a start on greenhouse gas reduction and laugh all the way to the bank

    Rogers, G.; Javan, A.

    2000-01-01

    Greenhouse gas emission reduction is mostly directed at CO2, a byproduct of the combustion of fossil fuels. While Australia is in a disadvantageous position relative to many countries because of its heavy reliance on coal, compared to, say, natural gas, wind, nuclear or hydro sources, Australian technology is helping the world's fossil-fuel-fired power stations and many industrial boilers minimise total CO2 emissions by increasing the efficiency of combustion through control of excess air. CSIRO developed the world's leading zirconia sensor technology for flue-gas oxygen measurement. An Australian company commercialised this technology and now dominates the Australian power station market with it, because it is for the first time reliable enough for closed-loop automatic air-trimming control, especially on gas-fired plant. This paper will explain the relationships between fossil fuel stoichiometry, excess air, CO2 emissions, and zirconia-sensor flue-gas oxygen measurement and control. By carefully controlling excess air, reductions in CO2 emissions of 3-5% are readily achievable, along with reductions in fuel costs of the same order. Side benefits can include reduced NOx and increased safety of the operation. This is expected to be a low-cost and increasingly popular measure for industrial enterprises to attack the greenhouse problem with an easy win-win result, compared to many other greenhouse emission reduction strategies available.
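    The relationship between measured flue-gas oxygen and excess air that the paper promises to explain can be illustrated with the common rule-of-thumb conversion, where 20.9% is the oxygen content of ambient air. A minimal sketch; the function name and validity bounds are illustrative, not taken from the paper:

    ```python
    def excess_air_percent(flue_o2_percent: float) -> float:
        """Approximate excess combustion air from dry flue-gas O2 (%).

        Rule of thumb: excess air % ~ 100 * O2 / (20.9 - O2),
        reasonable for typical fossil fuels at modest excess-air levels.
        """
        if not 0 <= flue_o2_percent < 20.9:
            raise ValueError("flue-gas O2 must be between 0 and 20.9 %")
        return 100.0 * flue_o2_percent / (20.9 - flue_o2_percent)

    # Trimming flue-gas O2 from 4 % down to 2 % roughly halves the excess air
    # carried (and heated) needlessly through the boiler:
    print(round(excess_air_percent(4.0), 1))  # ~23.7 % excess air
    print(round(excess_air_percent(2.0), 1))  # ~10.6 % excess air
    ```

    This is why a reliable closed-loop oxygen trim matters: each percentage point of flue-gas O2 removed cuts the mass of inert air the burner must heat, which is where the 3-5% fuel and CO2 savings come from.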

  2. Guide to making time-lapse graphics using the facilities of the National Magnetic Fusion Energy Computing Center

    Munro, J.K. Jr.

    1980-05-01

    The advent of large, fast computers has opened the way to modeling more complex physical processes and to handling very large quantities of experimental data. The amount of information that can be processed in a short period of time is so great that use of graphical displays assumes greater importance as a means of displaying this information. Information from dynamical processes can be displayed conveniently by use of animated graphics. This guide presents the basic techniques for generating black and white animated graphics, with consideration of aesthetic, mechanical, and computational problems. The guide is intended for use by someone who wants to make movies on the National Magnetic Fusion Energy Computing Center (NMFECC) CDC-7600. Problems encountered by a geographically remote user are given particular attention. Detailed information is given that will allow a remote user to do some file checking and diagnosis before giving graphics files to the system for processing into film in order to spot problems without having to wait for film to be delivered. Source listings of some useful software are given in appendices along with descriptions of how to use it. 3 figures, 5 tables

  3. Combining Human Computing and Machine Learning to Make Sense of Big (Aerial) Data for Disaster Response.

    Ofli, Ferda; Meier, Patrick; Imran, Muhammad; Castillo, Carlos; Tuia, Devis; Rey, Nicolas; Briant, Julien; Millet, Pauline; Reinhard, Friedrich; Parkan, Matthew; Joost, Stéphane

    2016-03-01

    results suggest that the platform we have developed to combine crowdsourcing and machine learning to make sense of large volumes of aerial images can be used for disaster response.

  4. Make It Intuitive: An Evaluation Practice Emergent From The Plans And Scripted Behavior Of The Computer-community Of Practice

    Pat Lehane

    2012-01-01

    The catch phrase today for system designers is to “make it intuitive,” which begs the question, what is intuitive? The action research discussed in this article was the final stage of the application of grounded theory to user data that provided survey categories (criteria for system acceptance. A theoretical rationale from the discipline of human–computer interaction was proposed to provide a consistent and repeatable interpretation of the users’ responses to the survey and directly align the responses to software design considerations. To put this work into context, I discuss in this article a case study on the use of the survey to monitor the user experience during the upgrade of an enterprise system and the subsequent implications and outcomes of applying the theoretical paradigm in practice. As such it may provide food for thought on survey design for elicitation of user requirements for information and communication technology systems.

  5. Computational modelling and analysis of hippocampal-prefrontal information coding during a spatial decision-making task

    Thomas eJahans-Price

    2014-03-01

    We introduce a computational model describing rat behaviour and the interactions of neural populations processing spatial and mnemonic information during a maze-based, decision-making task. The model integrates sensory input and implements a working memory to inform decisions at a choice point, reproducing rat behavioural data and predicting the occurrence of turn- and memory-dependent activity in neuronal networks supporting task performance. We tested these model predictions using a new software toolbox (Maze Query Language, MQL) to analyse activity of medial prefrontal cortical (mPFC) and dorsal hippocampal (dCA1) neurons recorded from 6 adult rats during task performance. The firing rates of dCA1 neurons discriminated context (i.e. the direction of the previous turn), whilst a subset of mPFC neurons was selective for current turn direction or context, with some conjunctively encoding both. mPFC turn-selective neurons displayed a ramping of activity on approach to the decision turn, and turn-selectivity in mPFC was significantly reduced during error trials. These analyses complement data from neurophysiological recordings in non-human primates indicating that firing rates of cortical neurons correlate with the integration of sensory evidence used to inform decision-making.
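    The kind of choice-point computation described, sensory evidence integrated under a working-memory bias, can be caricatured as a drift-diffusion decision. This is an illustrative sketch under my own assumptions (an alternation task, Gaussian noise, a fixed decision threshold), not the authors' published model:

    ```python
    import random

    def choice_point_decision(prev_turn: str, drift: float = 0.15,
                              noise: float = 1.0, threshold: float = 10.0,
                              seed: int = 0) -> str:
        """Toy evidence-accumulation decision at a maze choice point.

        A working-memory trace of the previous turn biases the drift
        toward the opposite arm (alternation); noisy evidence accumulates
        until it crosses +/- threshold, yielding the chosen turn.
        """
        rng = random.Random(seed)
        # memory of the previous turn pushes the decision the other way
        bias = drift if prev_turn == "left" else -drift
        evidence = 0.0
        while abs(evidence) < threshold:
            evidence += bias + rng.gauss(0.0, noise)
        return "right" if evidence > 0 else "left"
    ```

    With a strong drift the model alternates reliably; shrinking the drift-to-noise ratio produces the occasional error trials in which turn-selectivity would be expected to weaken.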

  6. Cynthia J. Miller and A. Bowdoin Van Riper (eds.), The Laughing Dead: The Horror-Comedy Film from Bride of Frank

    Mullen, Elizabeth

    2017-01-01

    The Laughing Dead is as hybrid as its subject, covering American and British film and television in a broad manner. Most of the essays here do not delve deeply into film aesthetics or theory, but they do provide a different perspective on both commonly analyzed and lesser-known films. The essays dealing with suburbia and gender are the strongest of the book. This collection of sixteen articles explores ways in which comedy and horror subvert generic norms, shattering expectations and forcing ...

  7. Putting Making into High School Computer Science Classrooms: Promoting Equity in Teaching and Learning with Electronic Textiles in "Exploring Computer Science"

    Fields, Deborah Ann; Kafai, Yasmin; Nakajima, Tomoko; Goode, Joanna; Margolis, Jane

    2018-01-01

    Recent discussions of making have focused on developing out-of-school makerspaces and activities to provide more equitable and enriching learning opportunities for youth. Yet school classrooms present a unique opportunity to help broaden access, diversify representation, and deepen participation in making. In turning to classrooms, we want to…

  8. COMPUTING

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  9. COMPUTING

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  10. COMPUTING

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies ramping to 50 M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  11. COMPUTING

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and reinforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  12. COMPUTING

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  13. Learning to Laugh at Ourselves: Humor, Self-Transcendence, and the Cultivation of Moral Virtues

    Gordon, Mordechai

    2010-01-01

    In this essay Mordechai Gordon begins to address the neglect of humor among philosophers of education by focusing on some interesting connections between humor, self-transcendence, and the development of moral virtues. More specifically, he explores the kind of humor that makes fun of oneself and how it can affect educational encounters. Gordon…

  14. Making eco-friendly transportation safer: developing computer-based simulations to assess the impacts of bicycle accident prevention interventions on healthcare utilization.

    Juhra, Christian; Borycki, Elizabeth M; Kushniruk, Andre W; Anderson, Jim; Anderson, Marilyn

    2011-01-01

    Computer-based modeling and simulations are increasingly being used for applications in health and safety. In this paper we describe a multi-phase project aimed at modeling bicycle accidents in Münster, Germany. The work involved a first phase of collecting empirical data on accident rates and severity. In the second phase a computer-based simulation model of bicycle accidents was created, using data from phase one to identify relevant parameters of the model. Finally, initial results from running the model are described that will be used to inform decision making regarding safety initiatives.
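    The second phase's idea, a simulation whose parameters come from empirical accident data, can be sketched minimally as a Monte Carlo model of yearly accident counts. Everything here (the Poisson assumption, the intervention-effect parameter, the names) is an illustrative assumption, not the paper's actual model:

    ```python
    import random

    def simulate_accidents(rate_per_year: float, reduction: float,
                           years: int, seed: int = 42) -> list[int]:
        """Monte Carlo sketch of yearly bicycle-accident counts.

        Accidents per year are drawn from a Poisson distribution whose
        rate, taken from empirical data, is scaled down by an assumed
        intervention effect (e.g. reduction=0.2 for a 20 % cut).
        """
        rng = random.Random(seed)
        lam = rate_per_year * (1.0 - reduction)
        counts = []
        for _ in range(years):
            # Poisson draw via Knuth's multiplication-of-uniforms method
            limit, k, p = pow(2.718281828459045, -lam), 0, 1.0
            while True:
                p *= rng.random()
                if p <= limit:
                    break
                k += 1
            counts.append(k)
        return counts
    ```

    Running many simulated years for each candidate intervention then lets the expected change in accident counts, and hence in healthcare utilization, be compared before committing real resources.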

  15. Hybrid Human-Computing Distributed Sense-Making: Extending the SOA Paradigm for Dynamic Adjudication and Optimization of Human and Computer Roles

    Rimland, Jeffrey C.

    2013-01-01

    In many evolving systems, inputs can be derived from both human observations and physical sensors. Additionally, many computation and analysis tasks can be performed by either human beings or artificial intelligence (AI) applications. For example, weather prediction, emergency event response, assistive technology for various human sensory and…

  16. Intelligent Decision-Making System with Green Pervasive Computing for Renewable Energy Business in Electricity Markets on Smart Grid

    Park JongHyuk

    2009-01-01

    This paper is about an intelligent decision-making system for the smart-grid-based electricity market, which requires distributed decision making in a competitive environment composed of many players and components. It is very important to consider renewable energy and the emission problem, which are expected to be monitored by wireless communication networks. It is very difficult to predict renewable energy outputs and emission prices over a time horizon, so it could be helpful to capture those data on a real-time basis using many different kinds of communication infrastructure. Against this background, this paper provides an algorithm for making an optimal decision considering the above factors.
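    The abstract does not give the algorithm itself, but the decision it describes, committing generation under uncertain renewable output and emission prices, can be sketched as expected-cost minimisation over sampled scenarios. Every name, parameter, and the imbalance-penalty assumption here is a hypothetical illustration, not the paper's method:

    ```python
    def best_dispatch(demand_mw: float, conv_cost: float,
                      emis_price_samples: list[float],
                      wind_samples: list[float],
                      emis_per_mwh: float = 0.9) -> float:
        """Choose the conventional-generation level minimising expected cost.

        Wind output and emission prices are uncertain, so each candidate
        dispatch is costed against sampled real-time scenarios; shortfalls
        are penalised at an assumed fixed imbalance price.
        """
        penalty = 10.0 * conv_cost  # assumed imbalance penalty
        candidates = [demand_mw * f / 10.0 for f in range(11)]

        def expected_cost(g: float) -> float:
            total = 0.0
            for wind, ep in zip(wind_samples, emis_price_samples):
                shortfall = max(0.0, demand_mw - g - wind)
                total += g * (conv_cost + emis_per_mwh * ep) + shortfall * penalty
            return total / len(wind_samples)

        return min(candidates, key=expected_cost)
    ```

    The real-time measurements the paper argues for would feed the scenario samples, so the dispatch decision tracks actual rather than forecast conditions.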

  17. Data needs and computational requirements for asset management decision making. Internal deliverable ID5.2.1

    Catrinu-Renstrom, Maria; Clement, Rémy; Tournebise, Pascal

    The objective of this deliverable is to present the requirements for adapting available tools/models and identifying data needs for reliability analysis and optimal decision-making in the asset management process, addressing the requirements of the RMAC criterion developed in work package 2. It will serve as a basis for the next tasks of GARPUR work package 5. The report has been written by several partners, three of them being European TSOs and the other three being academic partners. Special attention has been paid to addressing every topic in the asset management decision-making process, as described in work package 2. Some advanced models exist in the scientific literature to characterize the spatio-temporal variation and correlations of relevant factors. Some of these models have been proposed in academia, and offer improved representation with respect to those...

  18. COMPUTING

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much large at roughly 11 MB per event of RAW. The central collisions are more complex and...

  19. COMPUTING

    M. Kasemann P. McBride Edited by M-C. Sawley with contributions from: P. Kreuzer D. Bonacorsi S. Belforte F. Wuerthwein L. Bauerdick K. Lassila-Perini M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  20. Investigating the neuroanatomical substrate of pathological laughing and crying in amyotrophic lateral sclerosis with multimodal neuroimaging techniques.

    Christidi, Foteini; Karavasilis, Efstratios; Ferentinos, Panagiotis; Xirou, Sophia; Velonakis, Georgios; Rentzos, Michalis; Zouvelou, Vasiliki; Zalonis, Ioannis; Efstathopoulos, Efstathios; Kelekis, Nikolaos; Evdokimidis, Ioannis

    2018-02-01

    Pathological laughing and crying (PLC) is common in several neurological and psychiatric diseases and is associated with a distributed network involving the frontal cortex, the brainstem and cortico-pontine-cerebellar circuits. By applying a multimodal neuroimaging approach, we examined the neuroanatomical substrate of PLC in a sample of patients with amyotrophic lateral sclerosis (ALS). We studied 56 non-demented ALS patients and 25 healthy controls (HC). PLC was measured in ALS using the Center for Neurologic Study Lability Scale (CNS-LS; cutoff score: 13). All participants underwent 3D-T1-weighted and 30-directional diffusion-weighted imaging at 3T. Voxel-based morphometry and tract-based spatial-statistics analyses were used to examine gray matter (GM) and white matter (WM) differences between ALS patients with and without PLC (ALS-PLC and ALS-nonPLC, respectively). Comparisons were restricted to regions with detected differences between ALS and HC, controlling for age, gender, total intracranial volume and depressive symptoms. In regions with significant differences between ALS and HC, ALS-PLC patients showed decreased GM volume in the left orbitofrontal cortex, frontal operculum, and putamen and in bilateral frontal poles, compared to ALS-nonPLC. They also had decreased fractional anisotropy in the left cingulum bundle and posterior corona radiata. WM abnormalities were additionally detected in WM associative and ponto-cerebellar tracts (using a more liberal threshold). PLC in ALS is driven by both GM and WM abnormalities, which highlights the role of circuits rather than isolated centers in the emergence of this condition. ALS is suggested as a useful natural experimental model to study PLC.

  1. COMPUTING

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  2. COMPUTING

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave good indication about the readiness of the WLCG infrastructure with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated to be fully capable for performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  3. COMPUTING

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  4. COMPUTING

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier1’s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  5. COMPUTING

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  6. Development of a Computer-Based Air Force Installation Restoration Workstation for Contaminant Modeling and Decision-Making

    1995-03-01

    IS EXPECTED. K: Site located in karst topography, or is underlain by cavernous limestone. M: Mounding of the water table beneath a contamination site... The developers have endeavored to design the System, as far as possible, to run on any brand of personal computer that operates under

  7. Decision-Making Processes of SME in Cloud Computing Adoption to Create Disruptive Innovation: Mediating Effect of Collaboration

    Sonthiprasat, Rattanawadee

    2014-01-01

    THE PROBLEM. The purpose of this quantitative correlation study was to assess the relationship between different Cloud service levels of effective business innovation for SMEs. In addition, the new knowledge gained from the benefits of Cloud adoption with knowledge sharing would enhance the decision making process for businesses to consider the…

  8. Effects of Computer-Assisted Instruction in Using Formal Decision-Making Strategies to Choose a College Major.

    Mau, Wei-Cheng; Jepsen, David A.

    1992-01-01

    Compared decision-making strategies and college major choice among 113 first-year students assigned to Elimination by Aspects Strategy (EBA), Subjective Expected Utility Strategy (SEU), and control groups. "Rational" EBA students scored significantly higher on choice certainty; lower on choice anxiety and career indecision than "rational"…

  9. A concept for a visual computer interface to make error taxonomies useful at the point of primary care

    Ranjit Singh

    2008-01-01

    The approach is designed to capture and disseminate patient safety information in an unambiguous format that is useful to all members of the healthcare team (including the patient) at the point of care as well as at the policy-making level.

  10. COMPUTING

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  11. COMPUTING

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  12. COMPUTING

    M. Kasemann

    Introduction A large fraction of the effort during the last period was focused on the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads, and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  13. COMPUTING

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns lead by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  14. COMPUTING

    P. MacBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format and the samples were run through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  15. COMPUTING

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  16. COMPUTING

    I. Fisk

    2012-01-01

      Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, hopefully only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently.  Operations Office Figure 2: Number of events per month, for 2012 Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  17. BRAND AWARENESS ANALYSIS TOWARD PURCHASE DECISION MAKING IN GAMING COMPUTER INDUSTRY (A Case Study of Alienware-Dell Gaming Laptop)

    Giffels, Mario Dwikirama; Hidayat, Nila Krishnawati; Pasasa, Linus .

    2013-01-01

    The first purpose of this research is to analyze the level of Brand Awareness of Alienware consumers toward Purchasing Decision Making. The second purpose is to analyze the most influential factor in Purchase Decision Factor, and the last purpose is to analyze the correlation between Brand Awareness and its Purchasing Decision. To support this research, the author interviewed the General Manager of one of the Alienware retail stores in South Jakarta and handed 100 questionnaires to targe...

  18. Computational prediction of multidisciplinary team decision-making for adjuvant breast cancer drug therapies: a machine learning approach.

    Lin, Frank P Y; Pokorny, Adrian; Teng, Christina; Dear, Rachel; Epstein, Richard J

    2016-12-01

    Multidisciplinary team (MDT) meetings are used to optimise expert decision-making about treatment options, but such expertise is not digitally transferable between centres. To help standardise medical decision-making, we developed a machine learning model designed to predict MDT decisions about adjuvant breast cancer treatments. We analysed MDT decisions regarding adjuvant systemic therapy for 1065 breast cancer cases over eight years. Machine learning classifiers with and without bootstrap aggregation were correlated with MDT decisions (recommended, not recommended, or discussable) regarding adjuvant cytotoxic, endocrine and biologic/targeted therapies, then tested for predictability using stratified ten-fold cross-validations. The predictions so derived were then compared with those based on published (ESMO and NCCN) cancer guidelines. Machine learning more accurately predicted adjuvant chemotherapy MDT decisions than did simple application of guidelines. No differences were found between MDT- vs. ESMO/NCCN- based decisions to prescribe either adjuvant endocrine (97%, p = 0.44/0.74) or biologic/targeted therapies (98%, p = 0.82/0.59). In contrast, significant discrepancies were evident between MDT- and guideline-based decisions to prescribe chemotherapy (87%, p machine learning models. A machine learning approach based on clinicopathologic characteristics can predict MDT decisions about adjuvant breast cancer drug therapies. The discrepancy between MDT- and guideline-based decisions regarding adjuvant chemotherapy implies that certain non-clinicopathologic criteria, such as patient preference and resource availability, are factored into clinical decision-making by local experts but not captured by guidelines.
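The evaluation recipe described here, classifiers with bootstrap aggregation scored by stratified ten-fold cross-validation, can be sketched in miniature. The toy one-feature cases, the decision-stump base learner, and all names below are illustrative stand-ins, not the authors' model:

```python
import random
from collections import Counter, defaultdict

def stratified_folds(labels, k, seed=0):
    """Split sample indices into k folds while preserving the
    class proportions in every fold."""
    rng = random.Random(seed)
    by_class = defaultdict(list)
    for i, y in enumerate(labels):
        by_class[y].append(i)
    folds = [[] for _ in range(k)]
    for idxs in by_class.values():
        rng.shuffle(idxs)
        for j, i in enumerate(idxs):
            folds[j % k].append(i)
    return folds

class Stump:
    """Illustrative base learner: one threshold on one feature."""
    def fit(self, X, y):
        best = (-1, None, None)
        for t in sorted({x[0] for x in X}):
            for lo, hi in ((0, 1), (1, 0)):
                preds = [hi if x[0] > t else lo for x in X]
                acc = sum(p == yi for p, yi in zip(preds, y))
                if acc > best[0]:
                    best = (acc, t, (lo, hi))
        _, self.t, (self.lo, self.hi) = best
        return self

    def predict(self, X):
        return [self.hi if x[0] > self.t else self.lo for x in X]

class Bagged:
    """Bootstrap aggregation: train each stump on a resample of the
    training cases, then take a majority vote."""
    def __init__(self, n_models=25, seed=0):
        self.n_models, self.rng = n_models, random.Random(seed)

    def fit(self, X, y):
        self.models = []
        for _ in range(self.n_models):
            idx = [self.rng.randrange(len(X)) for _ in X]
            self.models.append(Stump().fit([X[i] for i in idx],
                                           [y[i] for i in idx]))
        return self

    def predict(self, X):
        votes = [m.predict(X) for m in self.models]
        return [Counter(col).most_common(1)[0][0] for col in zip(*votes)]

# Toy cases: one clinicopathologic feature; label 1 = "recommended".
X = [[float(v)] for v in range(20)]
y = [0] * 10 + [1] * 10
accs = []
for test in stratified_folds(y, 10):
    train = [i for i in range(len(X)) if i not in test]
    model = Bagged().fit([X[i] for i in train], [y[i] for i in train])
    preds = model.predict([X[i] for i in test])
    accs.append(sum(p == y[i] for p, i in zip(preds, test)) / len(test))
print(round(sum(accs) / len(accs), 2))
```

The per-fold accuracies would, in the study's setting, be replaced by agreement with the recorded MDT decision for each held-out case.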

  19. COMPUTING

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. A GlideInWMS and components installation is now deployed at CERN, adding to the GlideInWMS factory in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  20. The Einstein formula: E0=mc2. 'Isn't the Lord laughing?'

    Okun, L B

    2008-01-01

    The article traces the way Einstein formulated the relation between energy and mass in his work from 1905 to 1955. Einstein emphasized quite often that the mass m of a body is equivalent to its rest energy E0. At the same time, he frequently resorted to the less clear-cut statement of the equivalence of energy and mass. As a result, Einstein's formula E0 = mc2 still remains much less known than its popular form, E = mc2, in which E is the total energy, equal to the sum of the rest energy and the kinetic energy of a freely moving body. One consequence of this is the widespread fallacy that the mass of a body increases when its velocity increases, and even that this is an experimental fact. As the playwright A N Ostrovsky wrote, 'Something must exist for people, something so austere, so lofty, so sacrosanct that it would make profaning it unthinkable.' (from the history of physics)
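The distinction the article keeps drawing can be written out explicitly. These are the standard special-relativity relations, stated here for context rather than quoted from the article:

```latex
E_0 = mc^2, \qquad
E = \gamma m c^2 = \frac{m c^2}{\sqrt{1 - v^2/c^2}}, \qquad
T = E - E_0 = (\gamma - 1)\, m c^2 .
```

Here m is the invariant, velocity-independent mass, E the total energy and T the kinetic energy of a freely moving body; the "relativistic mass" γm is what the fallacy mistakes for a mass that grows with velocity.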

  1. COMPUTING

    M. Kasemann

    CMS relies on a well functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning, increasing the number of sites available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a 4 times increase in throughput with respect to the LCG Resource Broker is observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...

  2. Intelligent computer aided training systems in the real world: Making the technology accessible to the educational mainstream

    Kovarik, Madeline

    1993-01-01

    Intelligent computer aided training systems hold great promise for the application of this technology to mainstream education and training. Yet, this technology, which holds such a vast potential impact for the future of education and training, has had little impact beyond the enclaves of government research labs. This is largely due to the inaccessibility of the technology to those individuals in whose hands it can have the greatest impact, teachers and educators. Simply throwing technology at an educator and expecting them to use it as an effective tool is not the answer. This paper provides a background into the use of technology as a training tool. MindLink, developed by HyperTech Systems, provides trainers with a powerful rule-based tool that can be integrated directly into a Windows application. By embedding expert systems technology it becomes more accessible and easier to master.

  3. Case: Making students grasp the concept of differential equations in a broader and more flexible way by using computers

    Andresen, Mette

    2003-01-01

    Abstract: Teaching mathematics for more than 12 years, mainly at upper secondary school level, I have often wondered why the ability to handle structures seemingly is insufficiently developed in many students. It causes them... their abilities in this area. Organizing mathematical objects by building structures, as well as by handling the interaction between the component parts and the structures in full, is one activity in learning mathematics at all levels in school. I have concentrated on setting up and solving differential equations, because this area offers a suitable structure complexity at various levels. Furthermore, one point of interest for me is to find out how to use CAS (Computer Algebraic Systems) constructively in this context. I feel that research in this area could prove valuable, provided that its outcome in the form... through specific directions for subsequently implementing in teaching practice.

  4. Taking a lot of Pictures of Real Things and Making them into a Single Picture you can Move on a Computer

    Linneman, C.; Hults, C.

    2017-12-01

    This summer I spent my time in the largest state of all the states, with the people who take care of the most important parks, owned by all of us. My job was to take a lot of pictures of real things, small and large, and to make them into one piece on a computer, into pictures that can be moved and turned and can be easily shared across the world at any time. My job had three different classes: very small, pretty big, and very big. For the small things: Using a table that turns, I took many still pictures of old animals turned into rocks as well as things thrown away by people who are now dead. The pieces of rock and old things are important and exciting, but they can break quite easily, so only a few people are allowed to touch them. With the pictures you can move, many more people can learn about, "touch", and see them, but they use a computer instead of their hands. For a pretty big block of ice moving down a long area of land, I took many pictures of the end of it, while at the same time knowing just where I was on the face of the world. Using a computer, I again put all the pictures together into one picture that could be turned and moved. One person with a computer could look at any part of the piece of ice without having to actually visit it. Finally, for the very big things, I was part of a team that would fly slowly over the areas we were interested in, taking pictures about every half of a second. After taking tens of hundreds of pictures, the computer joined all the pictures together into a single picture that showed each and every little up and down of the land that we had flown over, getting very few wrong. This way of making pictures you can move doesn't take as much money as other means, and it can be used on things of very different areas, from something as small as a finger to something as large as a huge field of ice moving slowly over time.
The people who care for the parks that we all own don't have as much money as some, and in the biggest state

  5. Using computer-extracted image features for modeling of error-making patterns in detection of mammographic masses among radiology residents.

    Zhang, Jing; Lo, Joseph Y; Kuzmiak, Cherie M; Ghate, Sujata V; Yoon, Sora C; Mazurowski, Maciej A

    2014-09-01

    Mammography is the most widely accepted and utilized screening modality for early breast cancer detection. Providing high quality mammography education to radiology trainees is essential, since excellent interpretation skills are needed to ensure the highest benefit of screening mammography for patients. The authors have previously proposed a computer-aided education system based on trainee models. Those models relate human-assessed image characteristics to trainee error. In this study, the authors propose to build trainee models that utilize features automatically extracted from images using computer vision algorithms to predict likelihood of missing each mass by the trainee. This computer vision-based approach to trainee modeling will allow for automatically searching large databases of mammograms in order to identify challenging cases for each trainee. The authors' algorithm for predicting the likelihood of missing a mass consists of three steps. First, a mammogram is segmented into air, pectoral muscle, fatty tissue, dense tissue, and mass using automated segmentation algorithms. Second, 43 features are extracted using computer vision algorithms for each abnormality identified by experts. Third, error-making models (classifiers) are applied to predict the likelihood of trainees missing the abnormality based on the extracted features. The models are developed individually for each trainee using his/her previous reading data. The authors evaluated the predictive performance of the proposed algorithm using data from a reader study in which 10 subjects (7 residents and 3 novices) and 3 experts read 100 mammographic cases. Receiver operating characteristic (ROC) methodology was applied for the evaluation. The average area under the ROC curve (AUC) of the error-making models for the task of predicting which masses will be detected and which will be missed was 0.607 (95% CI, 0.564-0.650). This value was statistically significantly different from 0.5 (perror-making
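The AUC reported by this kind of ROC evaluation reduces to a rank statistic over (missed, detected) pairs. A minimal sketch with made-up labels and scores, not the study's reader data:

```python
def auc(labels, scores):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a randomly chosen positive case scores
    higher than a randomly chosen negative one (ties count half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy example: 1 = mass missed by the trainee, 0 = detected;
# scores are a model's predicted likelihood of a miss.
labels = [0, 0, 1, 1]
scores = [0.1, 0.4, 0.35, 0.8]
print(auc(labels, scores))  # → 0.75
```

An AUC of 0.5 is chance level, which is why the paper tests its 0.607 against that baseline.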

  6. Using computer-extracted image features for modeling of error-making patterns in detection of mammographic masses among radiology residents

    Zhang, Jing; Ghate, Sujata V.; Yoon, Sora C.; Lo, Joseph Y.; Kuzmiak, Cherie M.; Mazurowski, Maciej A.

    2014-01-01

    Purpose: Mammography is the most widely accepted and utilized screening modality for early breast cancer detection. Providing high quality mammography education to radiology trainees is essential, since excellent interpretation skills are needed to ensure the highest benefit of screening mammography for patients. The authors have previously proposed a computer-aided education system based on trainee models. Those models relate human-assessed image characteristics to trainee error. In this study, the authors propose to build trainee models that utilize features automatically extracted from images using computer vision algorithms to predict likelihood of missing each mass by the trainee. This computer vision-based approach to trainee modeling will allow for automatically searching large databases of mammograms in order to identify challenging cases for each trainee. Methods: The authors’ algorithm for predicting the likelihood of missing a mass consists of three steps. First, a mammogram is segmented into air, pectoral muscle, fatty tissue, dense tissue, and mass using automated segmentation algorithms. Second, 43 features are extracted using computer vision algorithms for each abnormality identified by experts. Third, error-making models (classifiers) are applied to predict the likelihood of trainees missing the abnormality based on the extracted features. The models are developed individually for each trainee using his/her previous reading data. The authors evaluated the predictive performance of the proposed algorithm using data from a reader study in which 10 subjects (7 residents and 3 novices) and 3 experts read 100 mammographic cases. Receiver operating characteristic (ROC) methodology was applied for the evaluation. Results: The average area under the ROC curve (AUC) of the error-making models for the task of predicting which masses will be detected and which will be missed was 0.607 (95% CI, 0.564-0.650).
This value was statistically significantly different

  7. Using computer-extracted image features for modeling of error-making patterns in detection of mammographic masses among radiology residents

    Zhang, Jing, E-mail: jing.zhang2@duke.edu; Ghate, Sujata V.; Yoon, Sora C. [Department of Radiology, Duke University School of Medicine, Durham, North Carolina 27705 (United States); Lo, Joseph Y. [Department of Radiology, Duke University School of Medicine, Durham, North Carolina 27705 (United States); Duke Cancer Institute, Durham, North Carolina 27710 (United States); Departments of Biomedical Engineering and Electrical and Computer Engineering, Duke University, Durham, North Carolina 27705 (United States); Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States); Kuzmiak, Cherie M. [Department of Radiology, University of North Carolina at Chapel Hill School of Medicine, Chapel Hill, North Carolina 27599 (United States); Mazurowski, Maciej A. [Department of Radiology, Duke University School of Medicine, Durham, North Carolina 27705 (United States); Duke Cancer Institute, Durham, North Carolina 27710 (United States); Medical Physics Graduate Program, Duke University, Durham, North Carolina 27705 (United States)

    2014-09-15

    Purpose: Mammography is the most widely accepted and utilized screening modality for early breast cancer detection. Providing high quality mammography education to radiology trainees is essential, since excellent interpretation skills are needed to ensure the highest benefit of screening mammography for patients. The authors have previously proposed a computer-aided education system based on trainee models. Those models relate human-assessed image characteristics to trainee error. In this study, the authors propose to build trainee models that utilize features automatically extracted from images using computer vision algorithms to predict likelihood of missing each mass by the trainee. This computer vision-based approach to trainee modeling will allow for automatically searching large databases of mammograms in order to identify challenging cases for each trainee. Methods: The authors’ algorithm for predicting the likelihood of missing a mass consists of three steps. First, a mammogram is segmented into air, pectoral muscle, fatty tissue, dense tissue, and mass using automated segmentation algorithms. Second, 43 features are extracted using computer vision algorithms for each abnormality identified by experts. Third, error-making models (classifiers) are applied to predict the likelihood of trainees missing the abnormality based on the extracted features. The models are developed individually for each trainee using his/her previous reading data. The authors evaluated the predictive performance of the proposed algorithm using data from a reader study in which 10 subjects (7 residents and 3 novices) and 3 experts read 100 mammographic cases. Receiver operating characteristic (ROC) methodology was applied for the evaluation. Results: The average area under the ROC curve (AUC) of the error-making models for the task of predicting which masses will be detected and which will be missed was 0.607 (95% CI, 0.564-0.650).
This value was statistically significantly different

  8. Laughing for Real?

    Karlsen, Mads Peter; Villadsen, Kaspar

    2015-01-01

    Management and humour are becoming more closely interlinked in contemporary organizational life. Whereas humour was conventionally viewed as a deleterious, alien element at the workplace, it is now increasingly viewed as a valuable management tool. This development raises the question of whether humour can still be regarded as having critical or subversive potential. This article discusses three research approaches to management and humour: the instrumental, the ideological critical, and contemporary critical organization studies, giving particular emphasis to extending the last tradition. Hence, the article situates itself in the critical debate on the function of humour in the workplace and on ‘cynical reasoning’ recently initiated in organization studies. It seeks to contribute to this debate by defining the features of a critical humoristic practice in a post-authoritarian management context...

  9. Condoms No Laughing Matter

    Adam Jiwa

    2011-11-01

    Full Text Available This multimedia project was framed as a beer commercial, a lad's night in, an environment that can be used to target the relevant audience. The goal was to deploy a familiar atmosphere and recognisable characters whilst delivering a serious message with humour in a very short space of time.

  10. Laughing All the Way.

    Young Children, 1988

    1988-01-01

    Points out that a key to discipline is the developing of a fun-filled friendship between adults and children in the child care environment. Suggests that "me vs. you" situations can be avoided by distracting the child with something interesting and fun, such as directions given in the form of a jingle or song. (RWB)

  11. The role of additional computed tomography in the decision-making process on the secondary prevention in patients after systemic cerebral thrombolysis

    Sobolewski P

    2015-12-01

    Full Text Available Piotr Sobolewski,1 Grzegorz Kozera,2 Wiktor Szczuchniak,1 Walenty M Nyka2 1Department of Neurology and Stroke, Unit of Holy Spirit Specialist Hospital in Sandomierz, Sandomierz, Poland; 2Department of Neurology, Medical University of Gdańsk, Gdańsk, Poland Introduction: Patients with ischemic stroke undergoing intravenous (iv) thrombolysis are routinely controlled with computed tomography on the second day to assess stroke evolution and hemorrhagic transformation (HT). However, the benefits of an additional computed tomography (aCT) performed over the next days after iv-thrombolysis have not been determined. Methods: We retrospectively screened 287 Caucasian patients with ischemic stroke who were consecutively treated with iv-thrombolysis from 2008 to 2012. The results of computed tomography performed on the second (control computed tomography) and seventh (aCT) day after iv-thrombolysis were compared in 274 patients (95.5%); 13 subjects (4.5%), who died before the seventh day from admission, were excluded from the analysis. Results: aCTs revealed a higher incidence of HT than control computed tomographies (14.2% vs 6.6%; P=0.003). Patients with HT in aCT showed a higher median National Institutes of Health Stroke Scale score on admission than those without HT (13.0 vs 10.0; P=0.01) and a higher presence of ischemic changes >1/3 middle cerebral artery territory (66.7% vs 35.2%; P<0.01). Correlations existed between the presence of HT in aCT and the National Institutes of Health Stroke Scale score on admission (rpbi 0.15; P<0.01) and the ischemic changes >1/3 middle cerebral artery (phi=0.03), and the presence of HT in aCT was associated with 3-month mortality (phi=0.03). Conclusion: aCT after iv-thrombolysis enables higher detection of HT, which is related to higher 3-month mortality. Thus, patients with severe middle cerebral artery infarction may benefit from aCT in the decision-making process on the secondary prophylaxis. Keywords: ischemic stroke, iv
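The phi coefficients reported here are plain 2x2-table correlations, which makes them easy to recompute. A sketch with invented counts, not the study's data:

```python
import math

def phi(a, b, c, d):
    """Phi coefficient for a 2x2 contingency table
        [[a, b],
         [c, d]]
    e.g. rows = HT on aCT (yes/no), columns = 3-month mortality (yes/no)."""
    num = a * d - b * c
    den = math.sqrt((a + b) * (c + d) * (a + c) * (b + d))
    return num / den if den else 0.0

# Hypothetical counts: 10 HT+/died, 5 HT+/survived, 5 HT-/died, 30 HT-/survived.
print(round(phi(10, 5, 5, 30), 3))
```

A value near 0, as in the abstract's phi=0.03, indicates only a weak association between the two binary variables.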

  12. Predicting Motivation: Computational Models of PFC Can Explain Neural Coding of Motivation and Effort-based Decision-making in Health and Disease.

    Vassena, Eliana; Deraeve, James; Alexander, William H

    2017-10-01

    Human behavior is strongly driven by the pursuit of rewards. In daily life, however, benefits mostly come at a cost, often requiring that effort be exerted to obtain potential benefits. Medial PFC (MPFC) and dorsolateral PFC (DLPFC) are frequently implicated in the expectation of effortful control, showing increased activity as a function of predicted task difficulty. Such activity partially overlaps with expectation of reward and has been observed both during decision-making and during task preparation. Recently, novel computational frameworks have been developed to explain activity in these regions during cognitive control, based on the principle of prediction and prediction error (predicted response-outcome [PRO] model [Alexander, W. H., & Brown, J. W. Medial prefrontal cortex as an action-outcome predictor. Nature Neuroscience, 14, 1338-1344, 2011], hierarchical error representation [HER] model [Alexander, W. H., & Brown, J. W. Hierarchical error representation: A computational model of anterior cingulate and dorsolateral prefrontal cortex. Neural Computation, 27, 2354-2410, 2015]). Despite the broad explanatory power of these models, it is not clear whether they can also accommodate effects related to the expectation of effort observed in MPFC and DLPFC. Here, we propose a translation of these computational frameworks to the domain of effort-based behavior. First, we discuss how the PRO model, based on prediction error, can explain effort-related activity in MPFC, by reframing effort-based behavior in a predictive context. We propose that MPFC activity reflects monitoring of motivationally relevant variables (such as effort and reward), by coding expectations and discrepancies from such expectations. Moreover, we derive behavioral and neural model-based predictions for healthy controls and clinical populations with impairments of motivation. Second, we illustrate the possible translation to effort-based behavior of the HER model, an extended version of PRO
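The shared core of the PRO and HER frameworks, predictions updated by prediction errors, can be illustrated with a bare delta-rule learner. This is a deliberate simplification for illustration; the published models track multiple response-outcome conjunctions and hierarchical error signals:

```python
def delta_rule(outcomes, alpha=0.2):
    """Track a predicted outcome value V; on each trial the prediction
    error (outcome - V) both drives learning and is the quantity that
    PRO-style models map onto MPFC activity."""
    v, errors = 0.0, []
    for r in outcomes:
        delta = r - v          # prediction error
        errors.append(delta)
        v += alpha * delta     # update the prediction
    return v, errors

# Reward delivered on every trial: the prediction rises toward 1 and
# the prediction error shrinks as the outcome becomes fully expected.
v, errors = delta_rule([1.0] * 30)
print(round(v, 3), round(errors[-1], 3))  # → 0.999 0.002
```

In an effort-based reading, the same update could track expected effort cost rather than reward, which is the translation the article proposes.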

  13. Testing of an oral dosing technique for double-crested cormorants, Phalacrocorax auritus, laughing gulls, Leucophaeus atricilla, homing pigeons, Columba livia, and western sandpipers, Calidris mauri, with artificially weathered MC252 oil.

    Dean, K M; Cacela, D; Carney, M W; Cunningham, F L; Ellis, C; Gerson, A R; Guglielmo, C G; Hanson-Dorr, K C; Harr, K E; Healy, K A; Horak, K E; Isanhart, J P; Kennedy, L V; Link, J E; Lipton, I; McFadden, A K; Moye, J K; Perez, C R; Pritsos, C A; Pritsos, K L; Muthumalage, T; Shriner, S A; Bursian, S J

    2017-12-01

    Scoping studies were designed to determine if double-crested cormorants (Phalacrocorax auritus), laughing gulls (Leucophaeus atricilla), homing pigeons (Columba livia) and western sandpipers (Calidris mauri) that were gavaged with a mixture of artificially weathered MC252 oil and food for either a single day or 4-5 consecutive days showed signs of oil toxicity. Where volume allowed, samples were collected for hematology, plasma protein electrophoresis, clinical chemistry and electrolytes, oxidative stress and organ weight changes. Double-crested cormorants, laughing gulls and western sandpipers all excreted oil within 30 min of dosing, while pigeons regurgitated within less than one hour of dosing. There were species differences in the effectiveness of the dosing technique, with double-crested cormorants having the greatest number of responsive endpoints at the completion of the trial. Statistically significant changes in packed cell volume, white cell counts, alkaline phosphatase, alanine aminotransferase, creatine phosphokinase, gamma glutamyl transferase, uric acid, chloride, sodium, potassium, calcium, total glutathione, glutathione disulfide, reduced glutathione, spleen and liver weights were measured in double-crested cormorants. Homing pigeons had statistically significant changes in creatine phosphokinase, total glutathione, glutathione disulfide, reduced glutathione and Trolox equivalents. Laughing gulls exhibited statistically significant decreases in spleen and kidney weight, and no changes were observed in any measurement endpoints tested in western sandpipers. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Recurrent anterior shoulder instability: accuracy of estimations of glenoid bone loss with computed tomography is insufficient for therapeutic decision-making

    Huijsmans, Polydoor Emile [Haga Hospital, Department of Orthopedic Surgery, The Hague (Netherlands); Witte, Pieter Bas de [Leiden University Medical Center, Department of Orthopedic Surgery, Leiden (Netherlands); Villiers, Richard V.P. de; Kruger, Niel Ruben [Van Wageningen and Partners, Radiology Department, Somerset West (South Africa); Wolterbeek, Derk Willem; Warmerdam, Piet [Haga Hospital, Department of Radiology, The Hague (Netherlands); Beer, Joe F. de [Cape Shoulder Institute, Department of Orthopedic Surgery, Cape Town (South Africa)

    2011-10-15

    To evaluate the reliability of glenoid bone loss estimations based on either axial computed tomography (CT) series or single sagittal ("en face" to glenoid) CT reconstructions, and to assess their accuracy by comparing with actual CT-based bone loss measurements, in patients with anterior glenohumeral instability. In two separate series of patients diagnosed with recurrent anterior glenohumeral instability, glenoid bone loss was estimated on axial CT series and on the most lateral sagittal (en face) glenoid view by two blinded radiologists. Additionally, in the second series of patients, glenoid defects were measured on sagittal CT reconstructions by an independent observer. In both series, larger defects were estimated when based on sagittal CT images compared to axial views. In the second series, mean measured bone loss was 11.5% (SD = 6.0) of the total original glenoid area, with estimations of 9.6% (SD = 7.2) and 7.8% (SD = 4.2) for sagittal and axial views, respectively. Correlations of defect estimations with actual measurements were fair to poor; glenoid defects tended to be underestimated, especially when based on axial views. CT-based estimations of glenoid bone defects are inaccurate. Especially for axial views, there is a high chance of glenoid defect underestimation. When using glenoid bone loss quantification in therapeutic decision-making, measuring the defect instead of estimating is strongly advised. (orig.)

  15. Green Computing

    K. Shalini

    2013-01-01

    Full Text Available Green computing is all about using computers in a smarter and eco-friendly way. It is the environmentally responsible use of computers and related resources, which includes the implementation of energy-efficient central processing units, servers and peripherals as well as reduced resource consumption and proper disposal of electronic waste. Computers certainly make up a large part of many people's lives and traditionally are extremely damaging to the environment. Manufacturers of computers and their parts have been espousing the green cause to help protect the environment from computers and electronic waste in any way. Research continues into key areas such as making the use of computers as energy-efficient as possible, and designing algorithms and systems for efficiency-related computer technologies.

  16. Why not make a PC cluster of your own? 5. AppleSeed: A Parallel Macintosh Cluster for Scientific Computing

    Decyk, Viktor K.; Dauger, Dean E.

    We have constructed a parallel cluster consisting of Apple Macintosh G4 computers running both Classic Mac OS as well as the Unix-based Mac OS X, and have achieved very good performance on numerically intensive, parallel plasma particle-in-cell simulations. Unlike other Unix-based clusters, no special expertise in operating systems is required to build and run the cluster. This enables us to move parallel computing from the realm of experts to the mainstream of computing.

  17. The Benefits of Making Data from the EPA National Center for Computational Toxicology available for reuse (ACS Fall meeting 3 of 12)

    Researchers at EPA’s National Center for Computational Toxicology (NCCT) integrate advances in biology, chemistry, exposure and computer science to help prioritize chemicals for further research based on potential human health risks. The goal of this research is to quickly evalua...

  18. Computer-Based Video Instruction to Teach Students with Intellectual Disabilities to Verbally Respond to Questions and Make Purchases in Fast Food Restaurants

    Mechling, Linda C.; Pridgen, Leslie S.; Cronin, Beth A.

    2005-01-01

    Computer-based video instruction (CBVI) was used to teach verbal responses to questions presented by cashiers and purchasing skills in fast food restaurants. A multiple probe design across participants was used to evaluate the effectiveness of CBVI. Instruction occurred through simulations of three fast food restaurants on the computer using video…

  19. No Special Equipment Required: The Accessibility Features Built into the Windows and Macintosh Operating Systems make Computers Accessible for Students with Special Needs

    Kimball, Walter H.; Cohen, Libby G.; Dimmick, Deb; Mills, Rick

    2003-01-01

    The proliferation of computers and other electronic learning devices has made knowledge and communication accessible to people with a wide range of abilities. Both Windows and Macintosh computers have accessibility options to help with many different special needs. This document discusses solutions for: (1) visual impairments; (2) hearing…

  20. Business resilience system (BRS) driven through Boolean, fuzzy logics and cloud computation real and near real time analysis and decision making system

    Zohuri, Bahman

    2017-01-01

    This book provides a technical approach to a Business Resilience System with its Risk Atom and Processing Data Point based on fuzzy logic and cloud computation in real time. Its purpose and objectives define a clear set of expectations for organizations and enterprises so their network system and supply chain are totally resilient and protected against cyber-attacks, man-made threats, and natural disasters. These enterprises include financial, organizational, homeland security, and supply chain operations with multi-point manufacturing across the world. Market shares and marketing advantages are expected to result from the implementation of the system. The collected information and defined objectives form the basis for monitoring and analyzing the data through cloud computation, and will guarantee their survivability against any unexpected threats. This book will be useful for advanced undergraduate and graduate students in the field of computer engineering, engineers that work for manufacturing com...

  1. Quantum Computing

    In the first part of this article, we had looked at how quantum physics can be harnessed to make the building blocks of a quantum computer. In this concluding part, we look at algorithms which can exploit the power of this computational device, and some practical difficulties in building such a device. Quantum Algorithms.

  2. Social Robots vs. Computer Display: Does the Way Social Stories Are Delivered Make a Difference for Their Effectiveness on ASD Children?

    Pop, Cristina A.; Simut, Ramona E.; Pintea, Sebastian; Saldien, Jelle; Rusu, Alina S.; Vanderfaeillie, Johan; David, Daniel O.; Lefeber, Dirk; Vanderborght, Bram

    2013-01-01

    Background and Objectives: The aim of this exploratory study is to test whether social stories presented by a social robot have a greater effect than ones presented on a computer display in increasing the independency in expressing social abilities of children with autism spectrum disorders (ASD). Although much progress has been made in developing…

  3. Computer interfacing

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  4. No Laughing Matter: Presence, Consumption Trends, Drug Awareness, and Perceptions of “Hippy Crack” (Nitrous Oxide) among Young Adults in England

    Esther M. Ehirim

    2018-01-01

    Full Text Available In clinical settings, nitrous oxide gas is a safe anesthetic used during childbirth, in dentistry, and to relieve anxiety in emergencies. Colloquially known as “hippy crack” or “laughing gas,” it is increasingly taken recreationally for its euphoric and relaxing effects and hallucinogenic properties. Using a self-reported survey, we gathered quantitative and qualitative information on users and non-users of hippy crack among a young population regarding: consumption patterns, knowledge, risk awareness and intentions toward future abuse. Quantitative responses from a total of 140 participants were analyzed for frequencies and relationships, whereas qualitative data were evaluated by identifying recurring themes. Overall, 77.1% (n = 108) had heard of hippy crack and 27.9% (n = 39) admitted to past-year use. Prior users mostly indicated intended future use, had on average a low number of past-year uses (though some reported > 20 occasions), had a varied number of inhalations per occasion (often 1–10), with an effect lasting up to 5 min, and a majority preferred social rather than lone use. For non-users, 79.2% said they would take hippy crack, with the vast majority (94%) preferring a social setting. The results show a concerning gap between available evidence and awareness of side effects. Despite serious reported side effects, including psychosis and myeloneuropathy (especially on the young developing brain), only a minority (29.3%) was aware of any side effects. In a hypothetical scenario depicting a first social encounter with hippy crack, qualitative responses revealed that participants would try it (n = 30) or not try it (n = 25), or would feel under pressure to try it (n = 6), with only 11 opting to exit the situation. In summary, this first report of trends and perceptions of the use of hippy crack among young adults in England highlights a lack of concern with

  5. Cloud Computing: The Future of Computing

    Aggarwal, Kanika

    2013-01-01

    Cloud computing has recently emerged as a new paradigm for hosting and delivering services over the Internet. Cloud computing is attractive to business owners as it eliminates the requirement for users to plan ahead for provisioning, and allows enterprises to start small and increase resources only when there is a rise in service demand. The basic principle of cloud computing is to distribute computation across a great number of distributed computers, rather than local computer ...

  6. Making Friends in Dark Shadows: An Examination of the Use of Social Computing Strategy Within the United States Intelligence Community Since 9/11

    Andrew Chomik

    2011-01-01

    Full Text Available The tragic events of 9/11/2001 in the United States highlighted failures in communication and cooperation in the U.S. intelligence community. Agencies within the community failed to “connect the dots” by not collaborating in intelligence gathering efforts, which resulted in severe gaps in data sharing that eventually contributed to the terrorist attack on American soil. Since then, and under the recommendation made by the 9/11 Commission Report, the United States intelligence community has made organizational and operational changes to intelligence gathering and sharing, primarily with the creation of the Office of the Director of National Intelligence (ODNI). The ODNI has since introduced a series of web-based social computing tools to be used by all members of the intelligence community, primarily with its closed-access wiki entitled “Intellipedia” and their social networking service called “A-Space”. This paper argues that, while the introduction of these and other social computing tools have been adopted successfully into the intelligence workplace, they have reached a plateau in their use and serve only as complementary tools to otherwise pre-existing information sharing processes. Agencies continue to ‘stove-pipe’ their respective data, a chronic challenge that plagues the community due to bureaucratic policy, technology use and workplace culture. This paper identifies and analyzes these challenges, and recommends improvements in the use of these tools, both in the business processes behind them and the technology itself. These recommendations aim to provide possible solutions for using these social computing tools as part of a more trusted, collaborative information sharing process.

  7. Water-equivalent oral contrast agents in dual-modality PET/computed tomography scanning: does a little barium make the difference?

    Kinner, Sonja; Veit-Haibach, Patrick; Lauenstein, Thomas C; Bockisch, Andreas; Antoch, Gerald

    2009-03-01

    To retrospectively evaluate the performance of two water-equivalent oral contrast agents [locust bean gum (LBG)-mannitol and VoLumen] concerning their potential to distend the bowel while avoiding contrast-associated artifacts in PET/computed tomography. PET/computed tomography examinations of 30 patients with two different oral contrast agents were reviewed. Bowel distension, intraluminal density, and potential contrast-associated artifacts were assessed for stomach, jejunum, and ileum. Statistical significance was tested by Student's t-test. Distension was slightly better in the stomach with VoLumen as compared with LBG-mannitol whereas LBG-mannitol was found to slightly better distend the small bowel. This difference proved to be statistically significant for the jejunum. A statistically significant difference was detected for intraluminal density with higher densities for VoLumen. This difference, however, did not result in a higher incidence of PET artifacts with VoLumen. LBG-mannitol provides excellent bowel distension, thereby avoiding contrast-associated PET artifacts. If this solution is not available, VoLumen provides a satisfactory alternative for bowel distension without relevant PET artifacts.

  8. Decision Making

    Pier Luigi Baldi

    2006-06-01

    Full Text Available This article points out some conditions which significantly influence decisions, and compares decision making and problem solving as interconnected processes. Some strategies of decision making are also examined.

  9. The Influence of Future Command, Control, Communications, and Computers (C4) on Doctrine and the Operational Commander's Decision-Making Process

    Mayer, Michael G.

    1996-01-01

    Future C4 systems will alter the traditional balance between force and information, having a profound influence on doctrine and the operational commander's decision making process. The Joint Staff's future vision of C4 is conceptualized in 'C4I for the Warrior', which envisions a joint C4I architecture providing timely sensor-to-shooter information directly to the warfighter. C4 systems must manage and filter an overwhelming amount of information; deal with interoperability issues; overcome technological limitations; meet emerging security requirements; and protect against 'Information Warfare.' Severe budget constraints necessitate unified control of C4 systems under singular leadership for the common good of all the services. In addition, acquisition policy and procedures must be revamped to allow new technologies to be fielded quickly; and the commercial marketplace will become the preferred starting point for modernization. Flatter command structures are recommended in this environment where information is available instantaneously. New responsibilities for decision making at lower levels are created. Commanders will have to strike a balance between exerting greater control and allowing subordinates enough flexibility to maintain initiative. Clearly, the commander's intent remains the most important tool in striking this balance.

  10. Computer says 2.5 litres--how best to incorporate intelligent software into clinical decision making in the intensive care unit?

    Lane, Katie; Boyd, Owen

    2009-01-01

    What will be the role of the intensivist when computer-assisted decision support reaches maturity? Celi's group reports that Bayesian theory can predict a patient's fluid requirement on day 2 in 78% of cases, based on data collected on day 1 and the known associations between those data, based on observations in previous patients in their unit. There are both advantages and limitations to the Bayesian approach, and this test study identifies areas for improvement in future models. Although such models have the potential to improve diagnostic and therapeutic accuracy, they must be introduced judiciously and locally to maximize their effect on patient outcome. Efficacy is thus far undetermined, and these novel approaches to patient management raise new challenges, not least medicolegal ones.
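
    The underlying idea, updating a prior over fluid-requirement categories with day-1 observations, can be sketched as a naive Bayes calculation. All numbers, category labels, and finding names below are hypothetical illustrations, not values from Celi's model:

```python
# Hypothetical prior over a patient's day-2 fluid requirement (litres).
prior = {"<1.5 L": 0.3, "1.5-2.5 L": 0.5, ">2.5 L": 0.2}

# Hypothetical likelihoods P(day-1 finding | category).
likelihood = {
    "low_urine_output": {"<1.5 L": 0.2, "1.5-2.5 L": 0.5, ">2.5 L": 0.8},
    "on_vasopressors":  {"<1.5 L": 0.1, "1.5-2.5 L": 0.4, ">2.5 L": 0.7},
}

def posterior(observations):
    """Bayes update, assuming conditionally independent observations."""
    post = dict(prior)
    for obs in observations:
        for cat in post:
            post[cat] *= likelihood[obs][cat]
    total = sum(post.values())
    return {cat: p / total for cat, p in post.items()}

p = posterior(["low_urine_output", "on_vasopressors"])
best = max(p, key=p.get)  # the predicted fluid-requirement category
```

In a real unit the likelihoods would be estimated from previously observed patients, which is exactly why such models must be introduced locally.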

  11. Organic Computing

    Würtz, Rolf P

    2008-01-01

    Organic Computing is a research field emerging around the conviction that problems of organization in complex systems in computer science, telecommunications, neurobiology, molecular biology, ethology, and possibly even sociology can be tackled scientifically in a unified way. From the computer science point of view, the apparent ease in which living systems solve computationally difficult problems makes it inevitable to adopt strategies observed in nature for creating information processing machinery. In this book, the major ideas behind Organic Computing are delineated, together with a sparse sample of computational projects undertaken in this new field. Biological metaphors include evolution, neural networks, gene-regulatory networks, networks of brain modules, hormone system, insect swarms, and ant colonies. Applications are as diverse as system design, optimization, artificial growth, task allocation, clustering, routing, face recognition, and sign language understanding.

  12. Strategic decision making

    Stokman, Frans N.; Assen, Marcel A.L.M. van; Knoop, Jelle van der; Oosten, Reinier C.H. van

    2000-01-01

    This paper introduces a methodology for strategic intervention in collective decision making. The methodology is based on (1) a decomposition of the problem into a few main controversial issues, (2) systematic interviews of subject area specialists to obtain a specification of the decision setting, consisting of a list of stakeholders with their capabilities, positions, and salience on each of the issues; (3) computer simulation. The computer simulation models incorporate only the main processe...
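
    The core calculation in simulation models of this kind is often an expected outcome per issue, a mean of stakeholder positions weighted by capability and salience. A minimal sketch, with hypothetical stakeholders and numbers not taken from the paper:

```python
def expected_outcome(stakeholders):
    """Expected issue outcome: mean position weighted by capability * salience."""
    weights = [s["capability"] * s["salience"] for s in stakeholders]
    positions = [s["position"] for s in stakeholders]
    return sum(w * x for w, x in zip(weights, positions)) / sum(weights)

# Hypothetical stakeholders on one controversial issue (positions on a 0-100 scale).
issue = [
    {"name": "ministry", "capability": 0.8, "salience": 0.9, "position": 70.0},
    {"name": "industry", "capability": 0.6, "salience": 0.5, "position": 20.0},
    {"name": "ngo",      "capability": 0.3, "salience": 1.0, "position": 90.0},
]

print(round(expected_outcome(issue), 1))  # 63.2
```

A strategic intervention then amounts to asking how this expected outcome shifts when a stakeholder changes position or salience.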

  13. Technology makes life better

    李红

    2015-01-01

    There are many theories about the relationship between technology and society. With the development of the world economy, technology has made great progress, and many changes have taken place in our daily life, especially with the appearance of the computer. Sending emails, chatting with others online, searching for the information we need to learn, and many other demands of people's daily life: computers make all of it possible.

  14. Science Is A Laughing Matter

    Weissman, P. R.

    2017-12-01

    Humor can be a powerful tool in communicating science to a professional or lay audience. Humor relaxes the audience and encourages them to pay better attention, lest they miss the next funny comment or slide (and be sure that you provide it for them). Humor sends the message that the speaker is so confident in his/her material that the speaker can joke about it; this tends to deter spurious or trivial questions after the talk. But humor is not for the faint of heart. It requires planning, practice, and especially, good timing. Good humorists are always on the lookout for new material that they can use in a talk, be it a funny image, a cartoon, or a quip from a movie or from a professional comedian. But the humorist must also be a strict self-censor. Politically incorrect material can be extremely dangerous and can backfire on the speaker. Don't ever use material that insults some faction in the audience, even if that faction is not present at the moment or too stupid to notice. Don't include so much humor that the science in your talk gets lost in the laughter. Lastly, speakers who are not funny should never attempt humor. There is nothing so damaging to a talk as poor humor that falls flat on its face. But if you have a good sense of humor, go for it. Life should be fun and so should science.

  15. Antinuclear laugh, but ecologists cry

    Nifenecker, H.

    2002-01-01

    Facing the German Government's decision to renounce nuclear power before 2001 and replace it with fossil energy, the author compares the disadvantages of nuclear power with those of fossil fuels, which produce greenhouse gases. (A.L.B.)

  16. The Role of Implicit Motives in Strategic Decision-Making: Computational Models of Motivated Learning and the Evolution of Motivated Agents

    Kathryn Merrick

    2015-11-01

    Full Text Available Individual behavioral differences in humans have been linked to measurable differences in their mental activities, including differences in their implicit motives. In humans, individual differences in the strength of motives such as power, achievement and affiliation have been shown to have a significant impact on behavior in social dilemma games and during other kinds of strategic interactions. This paper presents agent-based computational models of power-, achievement- and affiliation-motivated individuals engaged in game-play. The first model captures learning by motivated agents during strategic interactions. The second model captures the evolution of a society of motivated agents. It is demonstrated that misperception, when it is a result of motivation, causes agents with different motives to play a given game differently. When motivated agents who misperceive a game are present in a population, higher explicit payoff can result for the population as a whole. The implications of these results are discussed, both for modeling human behavior and for designing artificial agents with certain salient behavioral characteristics.
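
    A common way to model motive-driven misperception is to let each agent transform the objective payoff matrix before choosing an action. The sketch below (an illustrative affiliation motive in a prisoner's dilemma, not the paper's actual agent model) shows how the same game is played differently by agents with different motives:

```python
# Objective prisoner's-dilemma payoffs (mine, theirs), indexed by
# (my_action, opponent_action); C = cooperate, D = defect.
PAYOFFS = {
    ("C", "C"): (3, 3), ("C", "D"): (0, 5),
    ("D", "C"): (5, 0), ("D", "D"): (1, 1),
}

def perceived_value(mine, theirs, affiliation=0.0):
    """An affiliation-motivated agent also values the opponent's payoff."""
    return mine + affiliation * theirs

def best_response(opponent_action, affiliation=0.0):
    return max(("C", "D"),
               key=lambda a: perceived_value(*PAYOFFS[(a, opponent_action)],
                                             affiliation=affiliation))

# An unmotivated agent defects against a cooperator; a strongly
# affiliation-motivated agent perceives mutual cooperation as best.
print(best_response("C", affiliation=0.0), best_response("C", affiliation=1.0))
```

The misperception here is exactly the paper's mechanism in miniature: the motive changes the perceived game, so it changes the chosen strategy even though the explicit payoffs are unchanged.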

  17. Model : making

    Bottle, Neil

    2013-01-01

    The Model : making exhibition was curated by Brian Kennedy in collaboration with Allies & Morrison in September 2013. For the London Design Festival, the Model : making exhibition looked at the increased use of new technologies by both craft-makers and architectural model makers. In both practices traditional ways of making by hand are increasingly being combined with the latest technologies of digital imaging, laser cutting, CNC machining and 3D printing. This exhibition focussed on ...

  18. Computational physics

    Anon.

    1987-01-15

    Computers have for many years played a vital role in the acquisition and treatment of experimental data, but they have more recently taken up a much more extended role in physics research. The numerical and algebraic calculations now performed on modern computers make it possible to explore consequences of basic theories in a way which goes beyond the limits of both analytic insight and experimental investigation. This was brought out clearly at the Conference on Perspectives in Computational Physics, held at the International Centre for Theoretical Physics, Trieste, Italy, from 29-31 October.

  20. Motivational Interviewing with computer assistance as an intervention to empower women to make contraceptive choices while incarcerated: study protocol for randomized controlled trial

    Clarke Jennifer

    2012-07-01

    Full Text Available Abstract Background Unplanned pregnancies and sexually transmitted infections (STIs) are important and costly public health problems in the United States resulting from unprotected sexual intercourse. Risk factors for unplanned pregnancies and STIs (poverty, low educational attainment, homelessness, substance abuse, lack of health insurance, history of an abusive environment, and practice of commercial sex work) are especially high among women with a history of incarceration. Project CARE (Contraceptive Awareness and Reproductive Education) is designed to evaluate an innovative intervention, Motivational Interviewing with Computer Assistance (MICA), aimed at enhancing contraceptive initiation and maintenance among incarcerated women who do not want a pregnancy within the next year and who are anticipated to be released back to the community. This study aims to: (1) increase the initiation of highly effective contraceptives while incarcerated; (2) increase the continuation of highly effective contraceptive use at 3, 6, 9, and 12 months after release; and (3) decrease unsafe sexual activity. Methods/Design This randomized controlled trial will recruit 400 women from the Rhode Island Department of Corrections (RI DOC) women's jail at risk for an unplanned pregnancy (that is, sexually active with men and not planning/wanting to become pregnant in the next year). They will be randomized to two interventions: a control group who receive two educational videos (on contraception, STIs, and pre-conception counseling) or a treatment group who receive two sessions of personalized MICA. MICA is based on the principles of the Transtheoretical Model (TTM) and on Motivational Interviewing (MI), an empirically supported counseling technique designed to enhance readiness to change targeted behaviors. Women will be followed at 3, 6, 9, and 12 months post release and assessed for STIs, pregnancy, and reported condom use. Discussion Results from this study are expected

  1. What makes Ras an efficient molecular switch: a computational, biophysical, and structural study of Ras-GDP interactions with mutants of Raf.

    Filchtinski, Daniel; Sharabi, Oz; Rüppel, Alma; Vetter, Ingrid R; Herrmann, Christian; Shifman, Julia M

    2010-06-11

    Ras is a small GTP-binding protein that is an essential molecular switch for a wide variety of signaling pathways including the control of cell proliferation, cell cycle progression and apoptosis. In the GTP-bound state, Ras can interact with its effectors, triggering various signaling cascades in the cell. In the GDP-bound state, Ras loses its ability to bind to known effectors. The interaction of the GTP-bound Ras (Ras(GTP)) with its effectors has been studied intensively. However, very little is known about the much weaker interaction between the GDP-bound Ras (Ras(GDP)) and Ras effectors. We investigated the factors underlying the nucleotide-dependent differences in Ras interactions with one of its effectors, Raf kinase. Using computational protein design, we generated mutants of the Ras-binding domain of Raf kinase (Raf) that stabilize the complex with Ras(GDP). Most of our designed mutations narrow the gap between the affinity of Raf for Ras(GTP) and Ras(GDP), producing the desired shift in binding specificity towards Ras(GDP). A combination of our best designed mutation, N71R, with another mutation, A85K, yielded a Raf mutant with a 100-fold improvement in affinity towards Ras(GDP). The Raf A85K and Raf N71R/A85K mutants were used to obtain the first high-resolution structures of Ras(GDP) bound to its effector. Surprisingly, these structures reveal that the loop on Ras previously termed the switch I region is found, in the Ras(GDP).Raf mutant complex, in a conformation similar to that of Ras(GTP) and not Ras(GDP). Moreover, the structures indicate an increased mobility of the switch I region. This greater flexibility compared to the same loop in Ras(GTP) is likely to explain the natural low affinity of Raf and other Ras effectors to Ras(GDP). Our findings demonstrate that an accurate balance between a rigid, high-affinity conformation and conformational flexibility is required to create an efficient and stringent molecular switch.

  2. Steel making

    Chakrabarti, A K

    2014-01-01

    "Steel Making" is designed to give students a strong grounding in the theory and state-of-the-art practice of production of steels. This book is primarily focused to meet the needs of undergraduate metallurgical students and candidates for associate membership examinations of professional bodies (AMIIM, AMIE). Besides, for all engineering professionals working in steel plants who need to understand the basic principles of steel making, the text provides a sound introduction to the subject.Beginning with a brief introduction to the historical perspective and current status of steel making together with the reasons for obsolescence of Bessemer converter and open hearth processes, the book moves on to: elaborate the physiochemical principles involved in steel making; explain the operational principles and practices of the modern processes of primary steel making (LD converter, Q-BOP process, and electric furnace process); provide a summary of the developments in secondary refining of steels; discuss principles a...

  3. Make Sense?

    Gyrd-Jones, Richard; Törmälä, Minna

    Purpose: An important part of how we sense a brand is how we make sense of a brand. Sense-making is naturally strongly connected to how we cognize about the brand. But sense-making is concerned with multiple forms of knowledge that arise from our interpretation of brand-related stimuli: declarative, episodic, procedural and sensory. Knowledge is given meaning through mental association (Keller, 1993) and/or symbolic interaction (Blumer, 1969). These meanings are centrally related to individuals' sense of identity or "identity needs" (Wallpach & Woodside, 2009). The way individuals make sense of brands is related to who people think they are in their context, and this shapes what they enact and how they interpret the brand (Currie & Brown, 2003; Weick, Sutcliffe, & Obstfeld, 2005; Weick, 1993). Our subject of interest in this paper is how stakeholders interpret and ascribe meaning

  4. Making Waves

    benefits of information and communication technologies to rural villages in India. “From my long experience in agriculture, I find ... Computers, the Internet, mobile phones, interactive ... learn how to use ICTs if given a fair chance. They based ...

  5. Decision making.

    Chambers, David W

    2011-01-01

    A decision is a commitment of resources under conditions of risk in expectation of the best future outcome. The smart decision is always the strategy with the best overall expected value: the best combination of facts and values. Some of the special circumstances involved in decision making are discussed, including decisions where there are multiple goals, those where more than one person is involved in making the decision, using trigger points, framing decisions correctly, commitments to lost causes, and expert decision makers. A complex example of deciding about removal of asymptomatic third molars, with and without an EBD search, is discussed.
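
    The "best overall expected value" criterion can be made concrete: weight each outcome's value by its probability and pick the strategy with the highest sum. A minimal sketch with hypothetical probabilities and utilities (not figures from the third-molar example):

```python
def expected_value(strategy):
    """Probability-weighted sum of outcome values for one strategy."""
    return sum(p * v for p, v in strategy)

# Each strategy maps to (probability, value) pairs over its possible outcomes.
strategies = {
    "extract now":      [(0.90, 80), (0.10, -50)],    # usually fine, rare complications
    "watchful waiting": [(0.70, 100), (0.30, -120)],  # best only if nothing goes wrong
}

best = max(strategies, key=lambda name: expected_value(strategies[name]))
```

Here "watchful waiting" has the single best outcome but the lower expected value, which is exactly the distinction between hoping for the best future outcome and choosing the best strategy.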

  6. Making extreme computations possible with virtual machines

    Reuter, J.; Chokoufe Nejad, B.

    2016-02-01

    State-of-the-art algorithms generate scattering amplitudes for high-energy physics at leading order for high-multiplicity processes as compiled code (in Fortran, C or C++). For complicated processes the size of these libraries can become tremendous (many GiB). We show that amplitudes can be translated to byte-code instructions, which even reduce the size by one order of magnitude. The byte-code is interpreted by a Virtual Machine with runtimes comparable to compiled code and a better scaling with additional legs. We study the properties of this algorithm, as an extension of the Optimizing Matrix Element Generator (O'Mega). The bytecode matrix elements are available as alternative input for the event generator WHIZARD. The bytecode interpreter can be implemented very compactly, which will help with a future implementation on massively parallel GPUs.
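
    The idea of replacing huge compiled libraries with compact byte-code plus an interpreter can be illustrated with a toy stack machine for arithmetic expressions. This is a sketch of the general technique only; the opcodes are invented and are not O'Mega's actual instruction set:

```python
def run(bytecode, env):
    """Interpret a tiny stack-based byte-code program."""
    stack = []
    for op, *args in bytecode:
        if op == "load":        # push a runtime input (e.g. a momentum component)
            stack.append(env[args[0]])
        elif op == "push":      # push a literal constant
            stack.append(args[0])
        elif op == "add":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "mul":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError(f"unknown opcode {op!r}")
    return stack.pop()

# Byte-code for (a + b) * c; the same compact program is re-interpreted for
# every new set of inputs instead of being recompiled per process.
code = [("load", "a"), ("load", "b"), ("add",), ("load", "c"), ("mul",)]
print(run(code, {"a": 2.0, "b": 3.0, "c": 4.0}))  # 20.0
```

The size advantage comes from the fact that each instruction is a few bytes of data rather than machine code, while the interpreter itself stays small and fixed.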

  7. Computational Cognition and Robust Decision Making

    2013-03-06

    Fragmentary briefing slides; recoverable content: work toward neuromorphic chips much more powerful than the current state of the art (L. Chua); the DARPA Systems of Neuromorphic Adaptive Plastic Scalable Electronics (SyNAPSE) program (Gill Pratt); IARPA cognition research (Brad Minnery); four projects at SNU and KAIST co-funded with AOARD (2012); and design, fabrication, and demonstration of neuromorphic hardware under the DARPA SyNAPSE program. Distribution Statement A: unclassified, unlimited distribution.

  8. Computer-supported collaborative decision-making

    Filip, Florin Gheorghe; Ciurea, Cristian

    2017-01-01

    This is a book about how management and control decisions are made by persons who collaborate and possibly use the support of an information system. The decision is the result of human conscious activities aiming at choosing a course of action for attaining a certain objective (or a set of objectives). The act of collaboration implies that several entities work together and share responsibilities to jointly plan, implement and evaluate a program of activities to achieve the common goals. The book is intended to present a balanced view of the domain, including both well-established concepts and a selection of new results in the domains of methods and key technologies. It is meant to answer several questions, such as: a) “How are business models evolving towards ever more collaborative schemes?”; b) “What is the role of the decision-maker in the new context?”; c) “What are the basic attributes and trends in the domain of decision-supporting information systems?”; d) “Which are the basic...

  9. Making Connections

    Pien, Cheng Lu; Dongsheng, Zhao

    2011-01-01

    Effective teaching includes enabling learners to make connections within mathematics. It is easy to accord with this statement, but how often is it a reality in the mathematics classroom? This article describes an approach to connecting equivalent fractions and whole number operations. The authors illustrate how a teacher can combine a common…

  10. Reconfigurable Computing

    Cardoso, Joao MP

    2011-01-01

    As the complexity of modern embedded systems increases, it becomes less practical to design monolithic processing platforms. As a result, reconfigurable computing is being adopted widely for more flexible design. Reconfigurable Computers offer the spatial parallelism and fine-grained customizability of application-specific circuits with the postfabrication programmability of software. To make the most of this unique combination of performance and flexibility, designers need to be aware of both hardware and software issues. FPGA users must think not only about the gates needed to perform a comp

  11. Making EDM Electrodes By Stereolithography

    Barlas, Philip A.

    1988-01-01

    Stereolithography is a computer-aided manufacturing technique. Used to make models and molds of electrodes for electrical-discharge machining (EDM). Eliminates intermediate steps in fabrication of the plastic model of an object used in making the EDM electrode to manufacture the object or a mold for the object.

  12. Making Yugoslavs

    Nielsen, Christian Axboe

    By the time Aleksandar was killed by an assassin’s bullet five years later, he not only had failed to create a unified Yugoslav nation but his dictatorship had also contributed to an increase in interethnic tensions. In Making Yugoslavs, Christian Axboe Nielsen uses extensive archival research to explain the failure of the dictatorship’s program of forced nationalization. Focusing on how ordinary Yugoslavs responded to Aleksandar’s nationalization project, the book illuminates an often-ignored era of Yugoslav history whose lessons remain relevant not just for the study of Balkan history but for many...

  13. THE MAKING OF DECISION MAKING

    Leonardo Yuji Tamura

    2016-04-01

    Full Text Available Quantum Electronics was a Brazilian startup in the 1990s that was acquired by an American equity fund in 2012. They are currently the largest manufacturer of vehicle tracking and infotainment systems. The company was founded by three college friends, who are currently executives at the company: Camilo Santos, Pedro Barbosa and Luana Correa. Edward Hutter was sent by the equity fund to take over the company’s finances, but is having trouble making organizational decisions with his colleagues. As a consultant, I was called in to help them improve their decision making process and project prioritization. I adapted and deployed our firm's methodology, but, in the end, its adequacy is shown to be very much in question. The author of this case study intends to explore how actual organizational decisions rely on different decision models and their assumptions, as well as demonstrate that a decision model is neither absolutely good nor bad, as its quality is context dependent.

  14. Roadmap to greener computing

    Nguemaleu, Raoul-Abelin Choumin

    2014-01-01

    A concise and accessible introduction to green computing and green IT, this book addresses how computer science and the computer infrastructure affect the environment and presents the main challenges in making computing more environmentally friendly. The authors review the methodologies, designs, frameworks, and software development tools that can be used in computer science to reduce energy consumption and still compute efficiently. They also focus on Computer Aided Design (CAD) and describe what design engineers and CAD software applications can do to support new streamlined business directi

  15. Medical Computational Thinking

    Musaeus, Peter; Tatar, Deborah Gail; Rosen, Michael A.

    2017-01-01

    Computational thinking (CT) in medicine means deliberating when to pursue computer-mediated solutions to medical problems and evaluating when such solutions are worth pursuing in order to assist in medical decision making. Teaching computational thinking (CT) at medical school should be aligned...

  16. Simulation of human decision making

    Forsythe, J Chris [Sandia Park, NM; Speed, Ann E [Albuquerque, NM; Jordan, Sabina E [Albuquerque, NM; Xavier, Patrick G [Albuquerque, NM

    2008-05-06

    A method for computer emulation of human decision making defines a plurality of concepts related to a domain and a plurality of situations related to the domain, where each situation is a combination of at least two of the concepts. Each concept and situation is represented in the computer as an oscillator output, and each situation and concept oscillator output is distinguishable from all other oscillator outputs. Information is input to the computer representative of detected concepts, and the computer compares the detected concepts with the stored situations to determine if a situation has occurred.
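    The patent abstract above encodes each concept and situation as a distinguishable oscillator output and declares that a situation has occurred when all of its component concepts are detected. A minimal sketch of just that matching step follows; the concept names, frequencies, and situations are invented for illustration and the oscillator dynamics themselves are not modelled.

```python
# Illustrative sketch of the situation-matching step described in the abstract:
# each concept is tagged with a distinguishable oscillator frequency, and a
# stored situation "occurs" when all of its member concepts are detected.
# All names and values here are hypothetical, not taken from the patent.

CONCEPT_BY_FREQ_HZ = {3.0: "engine_noise", 5.0: "low_fuel", 7.0: "warning_light"}

SITUATIONS = {  # each situation combines at least two concepts
    "refuel_soon": frozenset({"low_fuel", "warning_light"}),
    "engine_fault": frozenset({"engine_noise", "warning_light"}),
}

def occurred_situations(observed_freqs_hz):
    """Map observed oscillator frequencies to concepts, then return the
    stored situations whose member concepts were all detected."""
    detected = {CONCEPT_BY_FREQ_HZ[f] for f in observed_freqs_hz
                if f in CONCEPT_BY_FREQ_HZ}
    return [name for name, members in SITUATIONS.items() if members <= detected]

print(occurred_situations([5.0, 7.0]))  # → ['refuel_soon']
```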

  17. Man and computer

    Fischbach, K.F.

    1981-01-01

    The discussion of cultural and sociological consequences of computer evolution is hindered by human prejudice. For example, the sentence 'a computer is at best as intelligent as its programmer' veils actual developments. The theoretical limits of computer intelligence are the limits of intelligence in general. Modern computer systems replace not only human labour, but also human decision making and thereby human responsibility. The historical situation is unique: human head-work is being automated and man is losing function. (orig.) [de

  18. Review of quantum computation

    Lloyd, S.

    1992-01-01

    Digital computers are machines that can be programmed to perform logical and arithmetical operations. Contemporary digital computers are ''universal,'' in the sense that a program that runs on one computer can, if properly compiled, run on any other computer that has access to enough memory space and time. Any one universal computer can simulate the operation of any other; and the set of tasks that any such machine can perform is common to all universal machines. Since Bennett's discovery that computation can be carried out in a non-dissipative fashion, a number of Hamiltonian quantum-mechanical systems have been proposed whose time-evolutions over discrete intervals are equivalent to those of specific universal computers. The first quantum-mechanical treatment of computers was given by Benioff, who exhibited a Hamiltonian system with a basis whose members corresponded to the logical states of a Turing machine. In order to make the Hamiltonian local, in the sense that its structure depended only on the part of the computation being performed at that time, Benioff found it necessary to make the Hamiltonian time-dependent. Feynman discovered a way to make the computational Hamiltonian both local and time-independent by incorporating the direction of computation in the initial condition. In Feynman's quantum computer, the program is a carefully prepared wave packet that propagates through different computational states. Deutsch presented a quantum computer that exploits the possibility of existing in a superposition of computational states to perform tasks that a classical computer cannot, such as generating purely random numbers, and carrying out superpositions of computations as a method of parallel processing. In this paper, we show that such computers, by virtue of their common function, possess a common form for their quantum dynamics

  19. Computers for imagemaking

    Clark, D

    1981-01-01

    Computers for Image-Making tells the computer non-expert all he needs to know about Computer Animation. In the hands of expert computer engineers, computer picture-drawing systems have, since the earliest days of computing, produced interesting and useful images. As a result of major technological developments since then, it no longer requires the expert's skill to draw pictures; anyone can do it, provided they know how to use the appropriate machinery. This collection of specially commissioned articles reflects the diversity of user applications in this expanding field

  20. Paraconsistent Computational Logic

    Jensen, Andreas Schmidt; Villadsen, Jørgen

    2012-01-01

    In classical logic everything follows from inconsistency and this makes classical logic problematic in areas of computer science where contradictions seem unavoidable. We describe a many-valued paraconsistent logic, discuss the truth tables and include a small case study....
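    The abstract's central point, that a paraconsistent logic stops a contradiction from entailing everything, can be sketched with the well-known three-valued logic LP; this is a stand-in for illustration, and the paper's own many-valued system may differ.

```python
# Three-valued paraconsistent logic LP (Priest): truth values are T (true),
# B (both true and false), F (false). A formula is "accepted" when its value
# is designated (T or B). This illustrates the general idea of a many-valued
# paraconsistent logic; it is not necessarily the authors' exact system.

ORDER = {"F": 0, "B": 1, "T": 2}
DESIGNATED = {"T", "B"}

def neg(v):
    return {"T": "F", "B": "B", "F": "T"}[v]

def conj(a, b):
    return min(a, b, key=ORDER.get)

def disj(a, b):
    return max(a, b, key=ORDER.get)

# In classical logic, p AND NOT p entails any q (explosion). In LP, give p
# the "glut" value B: then p AND NOT p is designated, yet an unrelated q
# valued F is not -- so explosion fails.
p, q = "B", "F"
contradiction = conj(p, neg(p))
print(contradiction)                                         # → B
print(contradiction in DESIGNATED and q not in DESIGNATED)   # → True
```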

  1. TH-AB-209-01: Making Benchtop X-Ray Fluorescence Computed Tomography (XFCT) Practical for in Vivo Imaging by Integration of a Dedicated High-Performance X-Ray Source in Conjunction with Micro-CT Functionality

    Manohar, N; Cho, S; Reynoso, F

    2016-01-01

    Purpose: To make benchtop x-ray fluorescence computed tomography (XFCT) practical for routine preclinical imaging tasks with gold nanoparticles (GNPs) by deploying, integrating, and characterizing a dedicated high-performance x-ray source and addition of simultaneous micro-CT functionality. Methods: Considerable research effort is currently under way to develop a polychromatic benchtop cone-beam XFCT system capable of imaging GNPs by stimulation and detection of gold K-shell x-ray fluorescence (XRF) photons. Recently, an ad hoc high-power x-ray source was incorporated and used to image the biodistribution of GNPs within a mouse, postmortem. In the current work, a dedicated x-ray source system featuring a liquid-cooled tungsten-target x-ray tube (max 160 kVp, ∼3 kW power) was deployed. The source was operated at 125 kVp, 24 mA. The tube’s compact dimensions allowed greater flexibility for optimizing both the irradiation and detection geometries. Incident x-rays were shaped by a conical collimator and filtered by 2 mm of tin. A compact “OEM” cadmium-telluride x-ray detector was implemented for detecting XRF/scatter spectra. Additionally, a flat panel detector was installed to allow simultaneous transmission CT imaging. The performance of the system was characterized by determining the detection limit (10-second acquisition time) for inserts filled with water/GNPs at various concentrations (0 and 0.010–1.0 wt%) and embedded in a small-animal-sized phantom. The phantom was loaded with 0.5, 0.3, and 0.1 wt% inserts and imaged using XFCT and simultaneous micro-CT. Results: An unprecedented detection limit of 0.030 wt% was experimentally demonstrated, with a 33% reduction in acquisition time. The reconstructed XFCT image accurately localized the imaging inserts. Micro-CT imaging did not provide enough contrast to distinguish imaging inserts from the phantom under the current conditions. Conclusion: The system is immediately capable of in vivo preclinical XFCT

  2. TH-AB-209-01: Making Benchtop X-Ray Fluorescence Computed Tomography (XFCT) Practical for in Vivo Imaging by Integration of a Dedicated High-Performance X-Ray Source in Conjunction with Micro-CT Functionality

    Manohar, N; Cho, S [UT MD Anderson Cancer Center, Houston, TX (United States); Reynoso, F [UT MD Anderson Cancer Center, Houston, TX (United States); Washington University School of Medicine, St. Louis, MO (United States)

    2016-06-15

    Purpose: To make benchtop x-ray fluorescence computed tomography (XFCT) practical for routine preclinical imaging tasks with gold nanoparticles (GNPs) by deploying, integrating, and characterizing a dedicated high-performance x-ray source and addition of simultaneous micro-CT functionality. Methods: Considerable research effort is currently under way to develop a polychromatic benchtop cone-beam XFCT system capable of imaging GNPs by stimulation and detection of gold K-shell x-ray fluorescence (XRF) photons. Recently, an ad hoc high-power x-ray source was incorporated and used to image the biodistribution of GNPs within a mouse, postmortem. In the current work, a dedicated x-ray source system featuring a liquid-cooled tungsten-target x-ray tube (max 160 kVp, ∼3 kW power) was deployed. The source was operated at 125 kVp, 24 mA. The tube’s compact dimensions allowed greater flexibility for optimizing both the irradiation and detection geometries. Incident x-rays were shaped by a conical collimator and filtered by 2 mm of tin. A compact “OEM” cadmium-telluride x-ray detector was implemented for detecting XRF/scatter spectra. Additionally, a flat panel detector was installed to allow simultaneous transmission CT imaging. The performance of the system was characterized by determining the detection limit (10-second acquisition time) for inserts filled with water/GNPs at various concentrations (0 and 0.010–1.0 wt%) and embedded in a small-animal-sized phantom. The phantom was loaded with 0.5, 0.3, and 0.1 wt% inserts and imaged using XFCT and simultaneous micro-CT. Results: An unprecedented detection limit of 0.030 wt% was experimentally demonstrated, with a 33% reduction in acquisition time. The reconstructed XFCT image accurately localized the imaging inserts. Micro-CT imaging did not provide enough contrast to distinguish imaging inserts from the phantom under the current conditions. Conclusion: The system is immediately capable of in vivo preclinical XFCT

  3. Cloud Computing Bible

    Sosinsky, Barrie

    2010-01-01

    The complete reference guide to the hot technology of cloud computing. Its potential for lowering IT costs makes cloud computing a major force for both IT vendors and users; it is expected to gain momentum rapidly with the launch of Office Web Apps later this year. Because cloud computing involves various technologies, protocols, platforms, and infrastructure elements, this comprehensive reference is just what you need if you'll be using or implementing cloud computing. Cloud computing offers significant cost savings by eliminating upfront expenses for hardware and software; its growing popularit

  4. Computer Network Operations Methodology

    2004-03-01

    means of their computer information systems. Disrupt - This type of attack focuses on disrupting as “attackers might surreptitiously reprogram enemy...by reprogramming the computers that control distribution within the power grid. A disruption attack introduces disorder and inhibits the effective...between commanders. The use of methodologies is widespread and done subconsciously to assist individuals in decision making. The processes that

  5. Computer Software Reviews.

    Hawaii State Dept. of Education, Honolulu. Office of Instructional Services.

    Intended to provide guidance in the selection of the best computer software available to support instruction and to make optimal use of schools' financial resources, this publication provides a listing of computer software programs that have been evaluated according to their currency, relevance, and value to Hawaii's educational programs. The…

  6. Emission computed tomography

    Budinger, T.F.; Gullberg, G.T.; Huesman, R.H.

    1979-01-01

    This chapter is devoted to the methods of computer assisted tomography for determination of the three-dimensional distribution of gamma-emitting radionuclides in the human body. The major applications of emission computed tomography are in biological research and medical diagnostic procedures. The objectives of these procedures are to make quantitative measurements of in vivo biochemical and hemodynamic functions

  7. Skills and the appreciation of computer art

    Boden, Margaret A.

    2016-04-01

    The appreciation of art normally includes recognition of the artist's skills in making it. Most people cannot appreciate computer art in that way, because they know little or nothing about coding. Various suggestions are made about how computer artists and/or curators might design and present computer art in such a way as to make the relevant making-skills more intelligible.

  8. Computed tomography in hypothalamic hamartoma

    Mori, Koreaki; Takeuchi, Juji; Hanakita, Junya; Handa, Hajime; Nakano, Yoshihisa.

    1981-01-01

    Two cases of hypothalamic hamartoma were reported. Hypothalamic hamartoma is a rare tumor. The onset of symptoms is in infancy and early childhood. Clinical symptoms are composed of convulsive seizures, laughing spells and precocious puberty. The CT finding of hypothalamic hamartoma is a mass in the suprasellar and interpeduncular cisterns which has the same density as the surrounding normal brain. The mass is not enhanced by injection of contrast material and is easily differentiated from other masses in the suprasellar region. (author)

  9. A physicist's model of computation

    Fredkin, E.

    1991-01-01

    An attempt is presented to make a statement about what a computer is and how it works from the perspective of physics. The single observation that computation can be a reversible process allows for the same kind of insight into computing as was obtained by Carnot's discovery that heat engines could be modelled as reversible processes. It allows us to bring computation into the realm of physics, where the power of physics allows us to ask and answer questions that seemed intractable from the viewpoint of computer science. Strangely enough, this effort makes it clear why computers get cheaper every year. (author) 14 refs., 4 figs
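    Fredkin's own controlled-swap gate is the standard small illustration of the reversibility observation in this abstract: it is universal for Boolean logic, yet applying it twice restores the original inputs, so no information is erased.

```python
# The Fredkin (controlled-swap) gate, named after the author above, is a
# universal, reversible logic gate: it swaps a and b exactly when the
# control bit c is 1, and it is its own inverse.

from itertools import product

def fredkin(c, a, b):
    """Controlled swap: returns (c, b, a) if c == 1, else (c, a, b)."""
    return (c, b, a) if c == 1 else (c, a, b)

# Reversibility: applying the gate twice restores every possible input.
for bits in product((0, 1), repeat=3):
    assert fredkin(*fredkin(*bits)) == bits

print(fredkin(1, 0, 1))  # → (1, 1, 0)
```

Because every output maps back to a unique input, such gates can in principle compute without dissipating the energy that irreversible bit-erasure requires, which is exactly the thermodynamic analogy drawn in the abstract.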

  10. Quantum Computing: Pro and Con

    Preskill, John

    1997-01-01

    I assess the potential of quantum computation. Broad and important applications must be found to justify construction of a quantum computer; I review some of the known quantum algorithms and consider the prospects for finding new ones. Quantum computers are notoriously susceptible to making errors; I discuss recently developed fault-tolerant procedures that enable a quantum computer with noisy gates to perform reliably. Quantum computing hardware is still in its infancy; I comment on the spec...

  11. Portable computers - portable operating systems

    Wiegandt, D.

    1985-01-01

    Hardware development has made rapid progress over the past decade. Computers used to have attributes like "general purpose" or "universal"; nowadays they are labelled "personal" and "portable". Recently, a major manufacturing company started marketing a portable version of their personal computer. But even for these small computers the old truth still holds that the biggest disadvantage of a computer is that it must be programmed; hardware by itself does not make a computer. (orig.)

  12. Make or buy strategy decision making in supply quality chain

    Seyed Mohammad Seyedhosseini

    2012-04-01

    Full Text Available Minimizing total cost is the goal of every supply chain, and it is pursued most of the time. In this regard, quality-related costs, which play a significant role, are sometimes neglected. Selecting suppliers that provide relatively high-quality raw materials at a lower cost is a strategic decision. The Make or Buy decision is also part of the supplier selection process. In this paper, the supply strategy: Make or Buy decision (SS: MOB) is studied in order to find which strategy (Make or Buy) should be chosen to minimize the total costs of the supply chain. Therefore, two separate models are generated for each strategy and several examples are solved for the respective models. Computational experiments show the efficiency of the proposed models for making decisions about selecting the best strategy.
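    As a toy illustration of the kind of comparison such models formalize, each strategy can be priced as its production or purchase cost plus the expected quality-related (rework) cost, and the cheaper one chosen; every figure and cost component below is invented, not taken from the paper's models.

```python
# Toy sketch of a make-or-buy comparison that folds quality-related costs
# into the total, in the spirit of the two models described above.
# All cost components and figures are hypothetical.

def total_cost(unit_cost, defect_rate, rework_cost, fixed_cost, volume):
    """Fixed cost plus per-unit cost plus the expected cost of reworking defects."""
    return fixed_cost + volume * (unit_cost + defect_rate * rework_cost)

volume = 10_000
make = total_cost(unit_cost=4.0, defect_rate=0.01, rework_cost=20.0,
                  fixed_cost=15_000, volume=volume)   # in-house production line
buy = total_cost(unit_cost=5.0, defect_rate=0.03, rework_cost=20.0,
                 fixed_cost=0, volume=volume)         # external supplier

print("make" if make < buy else "buy")  # → buy (for these illustrative numbers)
```

The actual models in the paper optimize over many such terms at once; the point of the sketch is only that a higher defect rate can erase an apparently lower unit price.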

  13. Reversible computing fundamentals, quantum computing, and applications

    De Vos, Alexis

    2010-01-01

    Written by one of the few top internationally recognized experts in the field, this book concentrates on those topics that will remain fundamental, such as low power computing, reversible programming languages, and applications in thermodynamics. It describes reversible computing from various points of view: Boolean algebra, group theory, logic circuits, low-power electronics, communication, software, quantum computing. It is this multidisciplinary approach that makes it unique. Backed by numerous examples, this is useful for all levels of the scientific and academic community, from undergr

  14. Design of Computer Experiments

    Dehlendorff, Christian

    The main topic of this thesis is design and analysis of computer and simulation experiments and is dealt with in six papers and a summary report. Simulation and computer models have in recent years received increasingly more attention due to their increasing complexity and usability. Software packages make the development of rather complicated computer models using predefined building blocks possible. This implies that the range of phenomena that are analyzed by means of a computer model has expanded significantly. As the complexity grows, so does the need for efficient experimental designs and analysis methods, since complex computer models are often expensive to use in terms of computer time. The choice of performance parameter is an important part of the analysis of computer and simulation models, and Paper A introduces a new statistic for waiting times in health care units. The statistic...
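    A common example of the efficient experimental designs the thesis alludes to is Latin hypercube sampling, which spreads a limited budget of expensive model runs evenly across each input dimension. The sketch below is a generic illustration, not the specific designs developed in the thesis.

```python
# Latin hypercube sampling: a space-filling design often used when each run
# of a computer model is expensive. Each of the d input dimensions is cut
# into n equal strata, and every stratum is sampled exactly once.

import random

def latin_hypercube(n, d, seed=0):
    """Return n points in [0, 1)^d with one point per stratum per dimension."""
    rng = random.Random(seed)
    columns = []
    for _ in range(d):
        strata = list(range(n))
        rng.shuffle(strata)                       # random stratum order per dimension
        columns.append([(s + rng.random()) / n for s in strata])
    return list(zip(*columns))                    # n tuples of d coordinates

design = latin_hypercube(n=5, d=2)
# Projected onto either dimension, the 5 points occupy 5 distinct fifths of [0, 1):
for dim in range(2):
    assert sorted(int(p[dim] * 5) for p in design) == [0, 1, 2, 3, 4]
print(len(design))  # → 5
```

Each row of `design` then supplies the input settings for one run of the expensive model, so n runs probe every marginal stratum of every input.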

  15. Computer in radiology

    Kuesters, H.

    1985-01-01

    With this publication, the author presents the requirements that user-specific software should fulfill to achieve effective practice rationalisation through computer usage, and the hardware configuration necessary as basic equipment. This should make it more difficult in the future for sales representatives to sell radiologists unusable computer systems. Furthermore, questions shall be answered that were asked by computer-interested radiologists during the system presentation. On the one hand, there still exists a prejudice against programmes of standard texts; on the other hand, there are undefined fears that handling a computer is too difficult and that one has to learn a computer language first to be able to work with computers. Finally, it is pointed out that real competitive advantages can be obtained through computer usage. (orig.) [de

  16. Why the potent greenhouse gas laughing gas is formed in agriculture and forestry; Varfoer den starka vaexthusgasen lustgas bildas vid odling i jord- och skogsbruk

    2009-12-15

    available nitrogen can determine the nitrous oxide emission. This is one explanation of why forest soils often show lower emissions than agricultural land. Another explanation of the lower emission from forests is that, in Sweden as in the rest of the world, the most fertile soils are cultivated, while the remaining forests stand in stony, less fertile areas with nitrogen deficits. As long as the forest is growing and absorbing the nitrogen, the risk of nitrous oxide emission is low, but after clear-cutting the risk increases. There are, however, forests where the nitrous oxide emission is high most of the time, such as fertile soils like drained fens, typically carrying birch, raspberry and nettles. Biomass for energy use is sometimes described as carbon dioxide neutral, since the same amount of carbon dioxide is taken up by photosynthesis as is released in combustion or decomposition. But harvesting and manufacturing need energy, often fossil, which adds carbon dioxide. Moreover, the cropping results in emission of nitrous oxide, which is a strong greenhouse gas with a long lifetime in the atmosphere. In the debate it has been claimed that, for climate reasons, the emission of nitrous oxide makes the exchange of oil for bioenergy meaningless. It can be concluded that biofuels almost always carry a 'cost' of nitrous oxide and that there is no climate-neutral biofuel, but there are better and worse ones. In agriculture and forestry alike, nitrous oxide production is influenced by management in both the short and the long run. As an example, addition of large amounts of nitrogen fertilisers or manure increases the N{sub 2}O emission when the available nitrogen exceeds the crop's uptake capacity. But there are also cropping systems where a low nitrous oxide emission has been measured in spite of an expected high emission. To keep nitrous oxide to a minimum, there needs to be a tight connection between nitrogen liberation and plant uptake, where minimal amounts are left to

  17. Aula de ciências em laboratório de informática: uma construção discursiva do monopólio participativo = Computer collaborative work in the elementary science classroom: the making of monopolization

    Jayme, B.; Reis, G.; Eijck, van M.W.; Roth, W.-M.

    2012-01-01

    In the present study, we articulate how the nature of students’ interactions during computer collaborative work (CCW) is not only mediated by their physical arrangement within the group, but it also contributes to the emergence of monopolization. Using a socio-cultural theoretical approach to

  18. Surfing for Data: A Gathering Trend in Data Storage Is the Use of Web-Based Applications that Make It Easy for Authorized Users to Access Hosted Server Content with Just a Computing Device and Browser

    Technology & Learning, 2005

    2005-01-01

    In recent years, the widespread availability of networks and the flexibility of Web browsers have shifted the industry from a client-server model to a Web-based one. In the client-server model of computing, clients run applications locally, with the servers managing storage, printing functions, and network traffic. Because every client is…

  19. Optical Computing

    Woods, Damien; Naughton, Thomas J.

    2008-01-01

    We consider optical computers that encode data using images and compute by transforming such images. We give an overview of a number of such optical computing architectures, including descriptions of the type of hardware commonly used in optical computing, as well as some of the computational efficiencies of optical devices. We go on to discuss optical computing from the point of view of computational complexity theory, with the aim of putting some old, and some very recent, re...

  20. Get set for computer science

    Edwards, Alistair

    2006-01-01

    This book is aimed at students who are thinking of studying Computer Science or a related topic at university. Part One is a brief introduction to the topics that make up Computer Science, some of which you would expect to find as course modules in a Computer Science programme. These descriptions should help you to tell the difference between Computer Science as taught in different departments and so help you to choose a course that best suits you. Part Two builds on what you have learned about the nature of Computer Science by giving you guidance in choosing universities and making your appli

  1. Demonstration of blind quantum computing.

    Barz, Stefanie; Kashefi, Elham; Broadbent, Anne; Fitzsimons, Joseph F; Zeilinger, Anton; Walther, Philip

    2012-01-20

    Quantum computers, besides offering substantial computational speedups, are also expected to preserve the privacy of a computation. We present an experimental demonstration of blind quantum computing in which the input, computation, and output all remain unknown to the computer. We exploit the conceptual framework of measurement-based quantum computation that enables a client to delegate a computation to a quantum server. Various blind delegated computations, including one- and two-qubit gates and the Deutsch and Grover quantum algorithms, are demonstrated. The client only needs to be able to prepare and transmit individual photonic qubits. Our demonstration is crucial for unconditionally secure quantum cloud computing and might become a key ingredient for real-life applications, especially when considering the challenges of making powerful quantum computers widely available.

  2. An ABC for decision making

    Garcia, Luiz Henrique Costa, E-mail: luiz_mogi@yahoo.com.br [Associacao de Medicina Intensiva Brasileira (AMIB), Sao Paulo, SP (Brazil); Irmandade da Santa Casa de Misericordia de Sao Paulo, SP (Brazil); Ferreira, Bruna Cortez [Hospital de Base de Sao Jose do Rio Preto, SP (Brazil)

    2015-03-15

    The present study was aimed at proposing a systematic evaluation of cranial computed tomography, identifying the main aspects to be analyzed in order to facilitate the decision making process regarding diagnosis and management in emergency settings. The present descriptive study comprised a literature review at the following databases: Access Medicine and Access Emergency Medicine (McGraw-Hill Education); British Medical Journal Evidence Center; UptoDate; Bireme; PubMed; Lilacs; SciELO; ProQuest; Micromedex (Thomson Reuters); Embase. Once the literature review was completed, the authors identified the main diseases with tomographic repercussions and proposed the present system to evaluate cranial computed tomography images. An easy-to-memorize ABC system will facilitate the decision making in emergency settings, as it covers the main diseases encountered by intensivists and emergency physicians, and provides a sequential guidance about anatomical structures to be investigated as well as their respective alterations. (author)

  3. An ABC for decision making

    Luiz Henrique Costa Garcia

    2015-04-01

    Full Text Available The present study was aimed at proposing a systematic evaluation of cranial computed tomography, identifying the main aspects to be analyzed in order to facilitate the decision making process regarding diagnosis and management in emergency settings. The present descriptive study comprised a literature review at the following databases: Access Medicine and Access Emergency Medicine (McGraw-Hill Education); British Medical Journal Evidence Center; UptoDate; Bireme; PubMed; Lilacs; SciELO; ProQuest; Micromedex (Thomson Reuters); Embase. Once the literature review was completed, the authors identified the main diseases with tomographic repercussions and proposed the present system to evaluate cranial computed tomography images. An easy-to-memorize ABC system will facilitate the decision making in emergency settings, as it covers the main diseases encountered by intensivists and emergency physicians, and provides a sequential guidance about anatomical structures to be investigated as well as their respective alterations.

  4. An ABC for decision making

    Garcia, Luiz Henrique Costa; Ferreira, Bruna Cortez

    2015-01-01

    The present study was aimed at proposing a systematic evaluation of cranial computed tomography, identifying the main aspects to be analyzed in order to facilitate the decision making process regarding diagnosis and management in emergency settings. The present descriptive study comprised a literature review at the following databases: Access Medicine and Access Emergency Medicine (McGraw-Hill Education); British Medical Journal Evidence Center; UptoDate; Bireme; PubMed; Lilacs; SciELO; ProQuest; Micromedex (Thomson Reuters); Embase. Once the literature review was completed, the authors identified the main diseases with tomographic repercussions and proposed the present system to evaluate cranial computed tomography images. An easy-to-memorize ABC system will facilitate the decision making in emergency settings, as it covers the main diseases encountered by intensivists and emergency physicians, and provides a sequential guidance about anatomical structures to be investigated as well as their respective alterations. (author)

  5. Why Make the World Move?

    Keith Evan Green

    2017-12-01

    Full Text Available The next horizons of human-computer interaction promise a whirling world of digital bytes, physical bits, and their hybrids. Are human beings prepared to inhabit such cyber-physical, adaptive environments? Assuming an optimistic view, this chapter offers a reply, drawing from art and art history, environmental design, literature, psychology, and evolutionary anthropology, to identify wide-ranging motivations for the design of such “new places” of human-computer interaction. Moreover, the author makes a plea to researchers focused in the domain of adaptive environments to pause and take a longer, more comprehensive, more self-reflective view to see what we’re doing, to recognize where we are, and to possibly find ourselves and others within our designed artifacts and systems that make the world move.

  6. Computation as Medium

    Jochum, Elizabeth Ann; Putnam, Lance

    2017-01-01

Artists increasingly utilize computational tools to generate art works. Computational approaches to art making open up new ways of thinking about agency in interactive art because they invite participation and allow for unpredictable outcomes. Computational art is closely linked to the participatory turn in visual art, wherein spectators physically participate in visual art works. Unlike purely physical methods of interaction, computer assisted interactivity affords artists and spectators more nuanced control of artistic outcomes. Interactive art brings together human bodies, computer code, and nonliving objects to create emergent art works. Computation is more than just a tool for artists, it is a medium for investigating new aesthetic possibilities for choreography and composition. We illustrate this potential through two artistic projects: an improvisational dance performance between a human...

  7. Introduction to morphogenetic computing

    Resconi, Germano; Xu, Guanglin

    2017-01-01

This book offers a concise introduction to morphogenetic computing, showing that its use makes global and local relations, defects in crystal non-Euclidean geometry databases with source and sink, genetic algorithms, and neural networks more stable and efficient. It also presents applications to databases, language, nanotechnology with defects, biological genetic structure, electrical circuits, and big data structure. In Turing machines, input and output states form a system – when the system is in one state, the input is transformed into output. This computation is always deterministic and without any possible contradiction or defects. In natural computation there are defects and contradictions that have to be solved to give a coherent and effective computation. The new computation generates the morphology of the system, which assumes different forms in time. The genetic process is the prototype of morphogenetic computing. At the Boolean logic truth value, we substitute a set of truth values (active sets) with...

  8. Decision-making under risk and uncertainty

    Gatev, G.I.

    2006-01-01

    Fuzzy sets and interval analysis tools to make computations and solve optimisation problems are presented. Fuzzy and interval extensions of Decision Theory criteria for decision-making under parametric uncertainty of prior information (probabilities, payoffs) are developed. An interval probability approach to the mean-value criterion is proposed. (author)
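To give a concrete flavour of the interval approach described above, here is a minimal sketch of ours (not code from the paper; the midpoint ranking rule, the action names, and the numbers are all illustrative assumptions) of a mean-value criterion when payoffs are known only as [low, high] intervals:

```python
# Illustrative sketch: mean-value criterion under interval payoff uncertainty.
# All names and numbers are hypothetical, not taken from the cited paper.

def interval_expected_value(probs, payoff_intervals):
    """Return the interval (E_low, E_high) of the expected payoff."""
    e_low = sum(p * lo for p, (lo, hi) in zip(probs, payoff_intervals))
    e_high = sum(p * hi for p, (lo, hi) in zip(probs, payoff_intervals))
    return (e_low, e_high)

def choose_by_midpoint(actions):
    """Pick the action whose interval expected value has the highest midpoint.

    `actions` maps an action name to (probs, payoff_intervals). Ranking
    intervals by midpoint is one simple rule; the paper's criteria may differ.
    """
    def midpoint(interval):
        lo, hi = interval
        return (lo + hi) / 2
    return max(actions,
               key=lambda a: midpoint(interval_expected_value(*actions[a])))

actions = {
    "invest": ([0.6, 0.4], [(50, 80), (-30, -10)]),  # payoff ranges per outcome
    "hold":   ([0.6, 0.4], [(10, 20), (0, 5)]),
}
print(choose_by_midpoint(actions))  # -> invest
```

The interval result makes the residual uncertainty explicit: "invest" has expected value somewhere in [18, 44], rather than a single point estimate.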

  9. Making Market Decisions in the Classroom.

    Rose, Stephen A.

    1986-01-01

Computer software that will help intermediate and secondary social studies students learn to make rational decisions about personal and societal concerns is described. The courseware places students in the roles of business managers who make decisions about operating their firms. (RM)

  10. Opportunity for Realizing Ideal Computing System using Cloud Computing Model

    Sreeramana Aithal; Vaikunth Pai T

    2017-01-01

An ideal computing system is a computing system with ideal characteristics. The major components and their performance characteristics of such a hypothetical system can be studied as a model with predicted input, output, system, and environmental characteristics using the identified objectives of computing, which can be used on any platform, in any type of computing system, and for application automation, without making modifications in the form of structure, hardware, and software coding by an exte...

  11. Computer group

    Bauer, H.; Black, I.; Heusler, A.; Hoeptner, G.; Krafft, F.; Lang, R.; Moellenkamp, R.; Mueller, W.; Mueller, W.F.; Schati, C.; Schmidt, A.; Schwind, D.; Weber, G.

    1983-01-01

The computer group has been reorganized to take charge of the general purpose computers DEC10 and VAX and the computer network (Dataswitch, DECnet, IBM connections to GSI and IPP, preparation for Datex-P). (orig.)

  12. Computer Engineers.

    Moncarz, Roger

    2000-01-01

    Looks at computer engineers and describes their job, employment outlook, earnings, and training and qualifications. Provides a list of resources related to computer engineering careers and the computer industry. (JOW)

  13. Do kV really make everything look grey? A comparative study of 600 kV computed tomography systems; Macht kV wirklich grau? Eine Vergleichsstudie zu 600 kV Computertomografiesystemen

    Krumm, Michael; Sauerwein, Christoph; Haemmerle, Volker; Knupe, Gunnar [RayScan Technologies GmbH, Meersburg (Germany)

    2013-07-01

    In the automotive industry there is a manufacture of cast aluminium parts of considerable thickness. Growing importance is being attached to methods of nondestructive testing and full metrological characterisation of these parts. A major contribution to this end has been made by the latest generation of CT systems, whose 600 kV X-ray tubes and line detectors make for excellent image quality. The purpose of the present study was to make an objective assessment of the performance of these CT systems by examining the advantages and drawbacks of each. This included comparisons of various parameters such as tube voltage and exposure time as well as different tube and detector technologies and CT measurement methods.

  14. Computer Music

    Cook, Perry R.

    This chapter covers algorithms, technologies, computer languages, and systems for computer music. Computer music involves the application of computers and other digital/electronic technologies to music composition, performance, theory, history, and the study of perception. The field combines digital signal processing, computational algorithms, computer languages, hardware and software systems, acoustics, psychoacoustics (low-level perception of sounds from the raw acoustic signal), and music cognition (higher-level perception of musical style, form, emotion, etc.).

  15. Laugh of Art. Deformation of the Face

    Jhon Felipe Benavides Narváes

    2011-05-01

Full Text Available Reflections that recreate the relationship between art and humor, based on a review of the plastic work of Adrián Montenegro, offering a panorama of the contemporary art of Pasto. A writing exercise that seeks to accentuate the deformation of the face, and laughter, in artistic work and action.

  16. Scholars See Comics as No Laughing Matter

    Viadero, Debra

    2009-01-01

    Once fuel for mass book burnings, comic books are gaining a foothold in the nation's schools, with teachers seeing them as a learning tool and scholars viewing them as a promising subject for educational research. Evidence of the rising credibility of Spiderman, Batman, and Archie came last month when Fordham University's graduate school of…

  17. CMS computing model evolution

    Grandi, C; Bonacorsi, D; Colling, D; Fisk, I; Girone, M

    2014-01-01

The CMS Computing Model was developed and documented in 2004. Since then the model has evolved to be more flexible and to take advantage of new techniques, but many of the original concepts remain and are in active use. In this presentation we will discuss the changes planned for the restart of the LHC program in 2015. We will discuss the changes planned in the use and definition of the computing tiers that were defined with the MONARC project. We will present how we intend to use new services and infrastructure to provide more efficient and transparent access to the data. We will discuss the computing plans to make better use of the computing capacity by scheduling more of the processor nodes, making better use of the disk storage, and more intelligent use of the networking.

  18. Shared decision making

... page: //medlineplus.gov/ency/patientinstructions/000877.htm Shared decision making ... treatment you both support. When to use Shared Decision Making: Shared decision making is often used when you ...

  19. Parallel quantum computing in a single ensemble quantum computer

    Long Guilu; Xiao, L.

    2004-01-01

We propose a parallel quantum computing mode for the ensemble quantum computer. In this mode, some qubits are in pure states while other qubits are in mixed states. It enables a single ensemble quantum computer to perform 'single-instruction, multiple-data' (SIMD) parallel computation. Parallel quantum computing can provide additional speedup in Grover's algorithm and Shor's algorithm. In addition, it also makes fuller use of qubit resources in an ensemble quantum computer. As a result, some qubits discarded in the preparation of an effective pure state in the Schulman-Vazirani and the Cleve-DiVincenzo algorithms can be reutilized

  20. QUALITY ASSURANCE FOR CLOUD COMPUTING

    Sumaira Aslam; Hina Shahid

    2016-01-01

Cloud computing is the greatest and latest thing. Marketers for lots of big companies are all using cloud computing terms in their marketing campaigns to make themselves seem impressive so that they can get clients and customers. Cloud computing is overall a philosophy and design concept, and it is much more complicated and yet much simpler. The basic underlying thing that cloud computing does is to separate the applications from operating systems from the software from the hardware that runs eve...

  1. THE COMPUTER AND SMALL BUSINESS.

The place of the computer in small business is investigated with respect to what type of problems it can solve for small business and how the small...firm can acquire time on one. The decision-making process and the importance of information are discussed in relation to small business. Several...applications of computers are examined to show how the firm can use the computer in day-to-day business operations. The capabilities of a digital computer

  2. Computing platform to aid in decision making on energy management projects of the ELETROBRAS; Plataforma computacional para auxilio na tomada de decisao em projetos de gestao energetica da ELETROBRAS

    Assis, T.B.; Rosa, R.B.V.; Pinto, D.P.; Casagrande, C.G. [Universidade Federal de Juiz de Fora, MG (Brazil). Lab. de Eficiencia Energetica], Emails: tbassis@yahoo.com.br, tatobrasil@yahoo.com.br, casagrandejf@yahoo.com.br, danilo.pinto@ufjf.edu.br; Martins, C.C.; Cantarino, M. [Centrais Eletricas Brasileiras S.A. (ELETROBRAS), Rio de Janeiro, RJ (Brazil). Div. de Eficiencia Energetica em Edificacoes], Emails: cmartin@eletrobras.com, marcelo.cantarino@eletrobras.com

    2009-07-01

A new tool developed by the Laboratory of Energy Efficiency (LEENER) of the Federal University of Juiz de Fora (UFJF) is presented: the SP³ platform - Planning System of the Public Buildings. This platform, when completed, will help Centrais Eletricas Brasileiras S.A. (ELETROBRAS) meet the demand for energy efficiency projects for public buildings, standardizing data in order to accelerate the approval process and the monitoring of a larger number of projects. This article discusses the stages of the platform's development, the management methodology used, and the goals and outcomes examined with the members of PROCEL working on this project.

  3. Granular computing: perspectives and challenges.

    Yao, JingTao; Vasilakos, Athanasios V; Pedrycz, Witold

    2013-12-01

    Granular computing, as a new and rapidly growing paradigm of information processing, has attracted many researchers and practitioners. Granular computing is an umbrella term to cover any theories, methodologies, techniques, and tools that make use of information granules in complex problem solving. The aim of this paper is to review foundations and schools of research and to elaborate on current developments in granular computing research. We first review some basic notions of granular computing. Classification and descriptions of various schools of research in granular computing are given. We also present and identify some research directions in granular computing.

  4. Making Astronomy Accessible

    Grice, Noreen A.

    2011-05-01

A new semester begins, and your students enter the classroom for the first time. You notice a student sitting in a wheelchair or walking with assistance from a cane. Maybe you see a student with a guide dog or carrying a Braille computer. Another student gestures "hello" but then continues hand motions, and you realize the person is actually signing. You wonder why another student is using an electronic device to speak. Think this can't happen in your class? According to the U.S. Census, one out of every five Americans has a disability. And some disabilities, such as autism, dyslexia and arthritis, are considered "invisible" disabilities. This means there is a high probability that one of your students will have a disability. As an astronomy instructor, you have the opportunity to reach a wide variety of learners by using creative teaching strategies. I will share some suggestions on how to make astronomy and your part of the universe more accessible for everyone.

  5. Decision making and imperfection

    Karny, Miroslav; Wolpert, David

    2013-01-01

Decision making (DM) is ubiquitous in both natural and artificial systems. The decisions made often differ from those recommended by the axiomatically well-grounded normative Bayesian decision theory, in large part due to the limited cognitive and computational resources of decision makers (either artificial units or humans). This state of affairs is often described by saying that decision makers are imperfect and exhibit bounded rationality. The neglected influence of emotional state and personality traits is an additional reason why normative theory fails to model the human DM process.   The book is a joint effort of top researchers from different disciplines to identify sources of imperfection and ways to decrease discrepancies between the prescriptive theory and real-life DM. The contributions consider:   ·          how a crowd of imperfect decision makers outperforms experts' decisions;   ·          how to decrease decision makers' imperfection by reducing knowledge available;   ...

  6. Making training decisions proactively

    Hartman, R.F.

    1988-01-01

The challenge of making training decisions with a high degree of confidence as to the results of those decisions faces every DOD, Federal, State, and City agency. Training has historically been a very labor and paper intensive system with limited automation support. This paper outlines how one DOD component, the Air Force, is approaching that challenge. The Training Decision System (TDS) will provide the Air Force with an automated decision aid to help plan and estimate the consequences of various mixes of resident training, On-The-Job Training (OJT), and field training within a specialty such as security. The system described provides training from enlistment to separation and responds to hundreds of related security task needs. This system identifies what the tasks are, who should provide the training, what training setting should be used, and what proficiency should be achieved, and through computer modeling it provides an assessment of training effectiveness options and estimates the impact of implementing those options. With current budgetary constraints and with the possibility of further reductions in the future, the most cost effective training mix must be found to sustain required capabilities

  7. Analog computing

    Ulmann, Bernd

    2013-01-01

    This book is a comprehensive introduction to analog computing. As most textbooks about this powerful computing paradigm date back to the 1960s and 1970s, it fills a void and forges a bridge from the early days of analog computing to future applications. The idea of analog computing is not new. In fact, this computing paradigm is nearly forgotten, although it offers a path to both high-speed and low-power computing, which are in even more demand now than they were back in the heyday of electronic analog computers.

  8. Computational composites

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer’s point of view, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture....

  9. Medical decision making

    Stiggelbout, A.M.; Vries, M. de; Scherer, L.; Keren, G.; Wu, G.

    2016-01-01

    This chapter presents an overview of the field of medical decision making. It distinguishes the levels of decision making seen in health-care practice and shows how research in judgment and decision making support or improve decision making. Most of the research has been done at the micro level,

  10. Quantum Computing

    Scarani, Valerio

    1998-01-01

The aim of this thesis was to explain what quantum computing is. The information for the thesis was gathered from books, scientific publications, and news articles. The analysis of the information revealed that quantum computing can be broken down into three areas: theories behind quantum computing explaining the structure of a quantum computer, known quantum algorithms, and the actual physical realizations of a quantum computer. The thesis reveals that moving from classical memor...

  11. Computer tomography in otolaryngology

    Gradzki, J.

    1981-01-01

The principles of design and operation of computer tomography, which has also been applied to the diagnosis of nose, ear and throat diseases, are discussed. Computer tomography makes possible the visualization of the structures of the nose, nasal sinuses and facial skeleton in transverse and coronal planes. The method enables an accurate evaluation of the position and size of neoplasms in these regions and the differentiation of inflammatory exudates from malignant masses. In otology, computer tomography is used particularly in the diagnosis of pontocerebellar angle tumours and otogenic brain abscesses. Computer tomography of the larynx and pharynx provides new diagnostic data owing to the possibility of obtaining transverse sections and visualizing cartilage. Computer tomograms of some cases are presented. (author)

  12. Decision Making and Cancer

    Reyna, Valerie F.; Nelson, Wendy L.; Han, Paul K.; Pignone, Michael P.

    2015-01-01

    We review decision-making along the cancer continuum in the contemporary context of informed and shared decision making, in which patients are encouraged to take a more active role in their health care. We discuss challenges to achieving informed and shared decision making, including cognitive limitations and emotional factors, but argue that understanding the mechanisms of decision making offers hope for improving decision support. Theoretical approaches to decision making that explain cogni...

  13. Computer Profiling Based Model for Investigation

    Neeraj Choudhary; Nikhil Kumar Singh; Parmalik Singh

    2011-01-01

Computer profiling is used for computer forensic analysis; this paper proposes and elaborates on a novel model for use in computer profiling, the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. These together provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a comp...
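A minimal sketch of what an object model in this spirit might look like (the class names, attributes, and example data below are hypothetical illustrations, not taken from the cited model):

```python
# Hypothetical sketch of a computer-profiling object model: a computer is
# represented as objects with attributes and inter-relationships that an
# investigator or reasoning engine can query. Names are illustrative only.

from dataclasses import dataclass, field

@dataclass
class Application:
    name: str
    last_used: str  # ISO date string, simplified for the sketch

@dataclass
class UserAccount:
    username: str
    applications: list = field(default_factory=list)

@dataclass
class ComputerProfile:
    hostname: str
    users: list = field(default_factory=list)

    def applications_used(self):
        """Aggregate every application reachable through user accounts."""
        return sorted({app.name for user in self.users
                       for app in user.applications})

profile = ComputerProfile(
    hostname="seized-laptop-01",
    users=[UserAccount("alice", [Application("browser", "2011-03-01"),
                                 Application("mailer", "2011-02-14")])],
)
print(profile.applications_used())  # -> ['browser', 'mailer']
```

Queries over such inter-related objects are what would let an automated reasoning engine form judgments about probable usage.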

  14. Hospice decision making: diagnosis makes a difference.

    Waldrop, Deborah P; Meeker, Mary Ann

    2012-10-01

    This study explored the process of decision making about hospice enrollment and identified factors that influence the timing of that decision. This study employed an exploratory, descriptive, cross-sectional design and was conducted using qualitative methods. In-depth in-person semistructured interviews were conducted with 36 hospice patients and 55 caregivers after 2 weeks of hospice care. The study was guided by Janis and Mann's conflict theory model (CTM) of decision making. Qualitative data analysis involved a directed content analysis using concepts from the CTM. A model of hospice enrollment decision making is presented. Concepts from the CTM (appraisal, surveying and weighing the alternatives, deliberations, adherence) were used as an organizing framework to illustrate the dynamics. Distinct differences were found by diagnosis (cancer vs. other chronic illness, e.g., heart and lung diseases) during the pre-encounter phase or before the hospice referral but no differences emerged during the post-encounter phase. Differences in decision making by diagnosis suggest the need for research about effective means for tailored communication in end-of-life decision making by type of illness. Recognition that decision making about hospice admission varies is important for clinicians who aim to provide person-centered and family-focused care.

  15. Introduction of making of Micom robot

    Park, Sang Beom

    1991-01-01

This book introduces the micro robot, covering: what is a micro robot?; the mouse and the cat; writing a build plan, and tools for making; the micom cat and the mechanical cat; making the mechanical cat; the sensor of CAT-3; the software of CAT-3; the motor and drive circuit of CAT-3; the computer mouse as a general system; the world of the micro mouse; an introduction to MICHI; the sensor of MICHI; development of software such as the monitor function and communication program; things related to MICHI; advice for making MICHI; and the arrangement of parts and programs.

  16. Making Riverscapes Real (Invited)

    Marcus, A.; Carbonneau, P.; Fonstad, M. A.; Walther, S. C.

    2009-12-01

    The structure and function of rivers have long been characterized either by: (1) qualitative models such as the River Continuum Concept or Serial Discontinuity Concept which paint broad descriptive portraits of how river habitats and communities vary, or (2) quantitative models, such as Downstream Hydraulic Geometry, which rely on a limited number of measurements spread widely throughout a river basin. In contrast, Fausch et al. (2002) proposed applying landscape ecology methods to rivers to create “riverscapes.” Application of the riverscape concept requires information on the spatial distribution of organism-scale habitats throughout entire river systems. In practical terms, this means that researchers must replicate maps of local habitat continuously throughout entire rivers to document and predict total habitat availability, structure, and function. Likewise, information on time-dependent variations in these river habitats is necessary. Given these requirements, it is not surprising that the riverscape approach has largely remained a conceptual framework with limited practical application. Recent advances in remote sensing and desktop computing, however, make the riverscape concept more achievable from a mapping perspective. Remote sensing methods now enable sub-meter measurements of depth, water surface slope, grain size, biotypes, algae, and plants, as well as estimation of derived parameters such as velocity and stream power. Although significant obstacles remain to basin-extent sub-meter mapping of physical habitat, recent advances are overcoming these obstacles and allowing the riverscape concept to be put into use by different agencies - at least from a physical habitat perspective. More problematic to the riverscape approach, however, are two major issues that cannot be solved with technical solutions. First is the difficulty in acquiring maps of fauna, whether they be macroinvertebrates, fish, or microorganisms, at scales and spatial extents

  17. Discrete mathematics using a computer

    Hall, Cordelia

    2000-01-01

Several areas of mathematics find application throughout computer science, and all students of computer science need a practical working understanding of them. These core subjects are centred on logic, sets, recursion, induction, relations and functions. The material is often called discrete mathematics, to distinguish it from the traditional topics of continuous mathematics such as integration and differential equations. The central theme of this book is the connection between computing and discrete mathematics. This connection is useful in both directions: • Mathematics is used in many branches of computer science, in applications including program specification, data structures, design and analysis of algorithms, database systems, hardware design, reasoning about the correctness of implementations, and much more; • Computers can help to make the mathematics easier to learn and use, by making mathematical terms executable, making abstract concepts more concrete, and through the use of software tools su...
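In that spirit of making mathematical terms executable, a small example of ours (not from the book) turns the textbook definition of a transitive relation into runnable code:

```python
# Executable discrete mathematics: a relation R is transitive iff
# (a, b) in R and (b, c) in R together imply (a, c) in R.
# Writing the definition as code makes it directly testable on examples.

def is_transitive(relation):
    """Check transitivity of a relation given as a set of pairs."""
    return all((a, d) in relation
               for (a, b) in relation
               for (c, d) in relation
               if b == c)

less_than = {(1, 2), (2, 3), (1, 3)}
print(is_transitive(less_than))            # True
print(is_transitive({(1, 2), (2, 3)}))     # False: (1, 3) is missing
```

Running the definition on concrete relations, including edge cases like the empty relation (vacuously transitive), is exactly the kind of concreteness the abstract mentions.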

  18. Computational Medicine

    Nygaard, Jens Vinge

    2017-01-01

The Health Technology Program at Aarhus University applies computational biology to investigate the heterogeneity of tumours.

  19. Grid Computing

A computing grid interconnects resources such as high performance computers, scientific databases, and computer-controlled scientific instruments of cooperating organizations, each of which is autonomous. It precedes and is quite different from cloud computing, which provides computing resources by vendors to customers ...

  20. Make Better Food Choices

10 Tips Nutrition Education Series: Make Better Food Choices - 10 tips for women's health. Fruits, Grains, Dairy, Vegetables, Protein. Make yourself a priority and take time to care for yourself. ChooseMyPlate.gov ...

  1. Categorization = Decision Making + Generalization

    Seger, Carol A; Peterson, Erik J.

    2013-01-01

    We rarely, if ever, repeatedly encounter exactly the same situation. This makes generalization crucial for real world decision making. We argue that categorization, the study of generalizable representations, is a type of decision making, and that categorization learning research would benefit from approaches developed to study the neuroscience of decision making. Similarly, methods developed to examine generalization and learning within the field of categorization may enhance decision making research. We first discuss perceptual information processing and integration, with an emphasis on accumulator models. We then examine learning the value of different decision making choices via experience, emphasizing reinforcement learning modeling approaches. Next we discuss how value is combined with other factors in decision making, emphasizing the effects of uncertainty. Finally, we describe how a final decision is selected via thresholding processes implemented by the basal ganglia and related regions. We also consider how memory related functions in the hippocampus may be integrated with decision making mechanisms and contribute to categorization. PMID:23548891
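As a toy illustration of the accumulator models the abstract emphasizes (a generic race accumulator of our own; the parameters and the thresholding rule are assumptions, not the specific models in the article):

```python
# Toy race accumulator for a two-alternative categorization decision:
# two accumulators collect noisy evidence, and the first to reach a
# threshold determines the choice. Illustrative sketch only.

import random

def race_decision(drift=0.2, threshold=5.0, max_steps=10_000, rng=None):
    """Run one decision; returns (choice, number_of_steps).

    Accumulator A receives a mean evidence advantage `drift`, so with
    drift > 0 the "correct" answer is A; B accumulates pure noise.
    """
    rng = rng or random.Random(42)
    a = b = 0.0
    for step in range(1, max_steps + 1):
        a += drift + rng.gauss(0, 1)  # noisy evidence favoring A
        b += rng.gauss(0, 1)          # pure noise
        if a >= threshold or b >= threshold:
            return ("A" if a >= b else "B", step)
    return ("undecided", max_steps)

choice, steps = race_decision(drift=1.0)
print(choice, steps)
```

Raising the threshold trades speed for accuracy, which is the basic mechanism such models use to explain decision dynamics under uncertainty.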

  2. Making Games in the Classroom: Benefits and Gender Concerns

    Robertson, Judy

    2012-01-01

    This paper argues that making computer games as part of a classroom project can develop a range of new media storytelling, visual design and audience awareness skills. This claim is supported by data from the evaluation of a six week game making project in a state funded primary school in which 11-12 year old learners made their own computer games…

  3. Computer Technology for Industry

    1979-01-01

In this age of the computer, more and more business firms are automating their operations for increased efficiency in a great variety of jobs, from simple accounting to managing inventories, from precise machining to analyzing complex structures. In the interest of national productivity, NASA is providing assistance both to longtime computer users and newcomers to automated operations. Through a special technology utilization service, NASA saves industry time and money by making available already developed computer programs which have secondary utility. A computer program is essentially a set of instructions which tells the computer how to produce desired information or effect by drawing upon its stored input. Developing a new program from scratch can be costly and time-consuming. Very often, however, a program developed for one purpose can readily be adapted to a totally different application. To help industry take advantage of existing computer technology, NASA operates the Computer Software Management and Information Center (COSMIC®), located at the University of Georgia. COSMIC maintains a large library of computer programs developed for NASA, the Department of Defense, the Department of Energy and other technology-generating agencies of the government. The Center gets a continual flow of software packages, screens them for adaptability to private sector usage, stores them and informs potential customers of their availability.

  4. Physicist or computer specialist?

    Clifton, J S [University College Hospital, London (United Kingdom)

    1966-06-15

Since to most clinicians physical and computer science are two of the great mysteries of the world, the physicist in a hospital is expected by clinicians to be fully conversant with, and competent to make profound pronouncements on, all methods of computing, specific computing problems, and the suitability of computing machinery ranging from desk calculators to Atlas. This is not surprising since the proportion of the syllabus devoted to physics and mathematics in an M.B. degree is indeed meagre, and the word 'computer' has been surrounded with an aura of mysticism which suggests that it is some fantastic piece of electronic gadgetry comprehensible only to a veritable genius. The clinician consequently turns to the only scientific colleague with whom he has direct contact - the medical physicist - and expects him to be an authority. The physicist is thus thrust, however unwillingly, into the forefront of the advance of computer assistance to scientific medicine. It is therefore essential for him to acquire sufficient knowledge of computing science to enable him to provide satisfactory answers to clinicians' queries, to proffer more detailed advice on programming, and to convince clinicians that the computer is really a 'simpleton' which can only add and subtract, and even that only under instruction.

  5. Handbook on Decision Making Vol 2 Risk Management in Decision Making

    Lu, Jie; Zhang, Guangquan

    2012-01-01

This book presents innovative theories, methodologies, and techniques in the field of risk management and decision making. It introduces new research developments and provides a comprehensive image of their potential applications to readers interested in the area. The collection includes: computational intelligence applications in decision making, multi-criteria decision making under risk, risk modelling, forecasting and evaluation, public security and community safety, risk management in supply chain and other business decision making, political risk management and disaster response systems. The book is directed to academic and applied researchers working on risk management, decision making, and management information systems.

  6. Quantum computers and quantum computations

    Valiev, Kamil' A

    2005-01-01

    This review outlines the principles of operation of quantum computers and their elements. The theory of ideal computers that do not interact with the environment and are immune to quantum decohering processes is presented. Decohering processes in quantum computers are investigated. The review considers methods for correcting quantum computing errors arising from the decoherence of the state of the quantum computer, as well as possible methods for the suppression of the decohering processes. A brief enumeration of proposed quantum computer realizations concludes the review. (reviews of topical problems)

  7. Quantum Computing for Computer Architects

    Metodi, Tzvetan

    2011-01-01

    Quantum computers can (in theory) solve certain problems far faster than a classical computer running any known classical algorithm. While existing technologies for building quantum computers are in their infancy, it is not too early to consider their scalability and reliability in the context of the design of large-scale quantum computers. To architect such systems, one must understand what it takes to design and model a balanced, fault-tolerant quantum computer architecture. The goal of this lecture is to provide architectural abstractions for the design of a quantum computer and to explore

  8. Pervasive Computing

    Silvis-Cividjian, N.

    This book provides a concise introduction to Pervasive Computing, otherwise known as the Internet of Things (IoT) and Ubiquitous Computing (Ubicomp), which addresses the seamless integration of computing systems within everyday objects. By introducing the core topics and exploring assistive pervasive

  9. Computational vision

    Wechsler, Harry

    1990-01-01

    The book is suitable for advanced courses in computer vision and image processing. In addition to providing an overall view of computational vision, it contains extensive material on topics that are not usually covered in computer vision texts (including parallel distributed processing and neural networks) and considers many real applications.

  10. Spatial Computation

    2003-12-01

    Computation and today’s microprocessors with the approach to operating system architecture, and the controversy between microkernels and monolithic kernels... Both Spatial Computation and microkernels break away a relatively monolithic architecture into individual lightweight pieces, well specialized... for their particular functionality. Spatial Computation removes global signals and control, in the same way microkernels remove the global address

  11. Teachers' Grading Decision Making

    Isnawati, Ida; Saukah, Ali

    2017-01-01

    This study investigated teachers' grading decision making, focusing on their beliefs underlying their grading decision making, their grading practices and assessment types, and factors they considered in grading decision making. Two teachers from two junior high schools applying different curriculum policies in grade reporting in Indonesian…

  12. I: Making Art

    Rosenfeld, Malke; Johnson, Marquetta; Plemons, Anna; Makol, Suzanne; Zanskas, Meghan; Dzula, Mark; Mahoney, Meg Robson

    2014-01-01

    Writing about the teaching artist practice should mean writing about art making. As both teacher and artist, the authors are required to be cognizant of their own art-making processes, both how it works and why it is important to them, in order to make this process visible to their students. They also need the same skills to write about how and…

  13. Elements of Making

    Rodriguez, Shelly; Harron, Jason; Fletcher, Steven; Spock, Hannah

    2018-01-01

    While there is no official definition, making is generally thought of as turning ideas into products through design, invention, and building. Support is growing for integrating making into science, technology, engineering, and mathematics (STEM) education. Making can help high school students explore science concepts and phenomena, yet, lacking…

  14. Computational intelligence in biomedical imaging

    2014-01-01

    This book provides a comprehensive overview of state-of-the-art computational intelligence research and technologies in biomedical images, with emphasis on biomedical decision making. Biomedical imaging offers useful information on patients’ medical conditions and clues to the causes of their symptoms and diseases. Biomedical imaging, however, produces large numbers of images that physicians must interpret. Computer aids are therefore in demand and have become indispensable in physicians’ decision making. This book discusses major technical advancements and research findings in the field of computational intelligence in biomedical imaging, for example, computational intelligence in computer-aided diagnosis for breast cancer, prostate cancer, and brain disease, in lung function analysis, and in radiation therapy. The book examines technologies and studies that have reached the practical level, and those technologies that are rapidly becoming available in clinical practice in hospitals, such as computational inte...

  15. Parallel computations

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed.Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn

  16. Human Computation

    CERN. Geneva

    2008-01-01

    What if people could play computer games and accomplish work without even realizing it? What if billions of people collaborated to solve important problems for humanity or generate training data for computers? My work aims at a general paradigm for doing exactly that: utilizing human processing power to solve computational problems in a distributed manner. In particular, I focus on harnessing human time and energy for addressing problems that computers cannot yet solve. Although computers have advanced dramatically in many respects over the last 50 years, they still do not possess the basic conceptual intelligence or perceptual capabilities...

  17. Quantum computation

    Deutsch, D.

    1992-01-01

    As computers become ever more complex, they inevitably become smaller. This leads to a need for components which are fabricated and operate on increasingly smaller size scales. Quantum theory is already taken into account in microelectronics design. This article explores how quantum theory will need to be incorporated into the design of future computers in order for their components to function. Computation tasks which depend on quantum effects will become possible. Physicists may have to reconsider their perspective on computation in the light of understanding developed in connection with universal quantum computers. (UK)

  18. Focused Science Delivery makes science make sense.

    Rachel W. Scheuering; Jamie. Barbour

    2004-01-01

    Science does not exist in a vacuum, but reading scientific publications might make you think it does. Although the policy and management implications of their findings could often touch a much wider audience, many scientists write only for the few people in the world who share their area of expertise. In addition, most scientific publications provide information that...

  19. Making detailed predictions makes (some) predictions worse

    Kelly, Theresa F.

    In this paper, we investigate whether making detailed predictions about an event makes other predictions worse. Across 19 experiments, 10,895 participants, and 415,960 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes information that is relatively useless for predicting the winning team more readily accessible in memory and therefore incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of games will and will not be susceptible to the negative effect of making detailed predictions.

  20. Computer software.

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of types, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  1. Computer sciences

    Smith, Paul H.

    1988-01-01

    The Computer Science Program provides advanced concepts, techniques, system architectures, algorithms, and software for both space and aeronautics information sciences and computer systems. The overall goal is to provide the technical foundation within NASA for the advancement of computing technology in aerospace applications. The research program is improving the state of knowledge of fundamental aerospace computing principles and advancing computing technology in space applications such as software engineering and information extraction from data collected by scientific instruments in space. The program includes the development of special algorithms and techniques to exploit the computing power provided by high performance parallel processors and special purpose architectures. Research is being conducted in the fundamentals of data base logic and improvement techniques for producing reliable computing systems.

  2. Making and Changing Wills

    Cheryl Tilse

    2016-02-01

    Wills are important social, economic, and legal documents. Yet little is known about current will-making practices and intentions. A comprehensive national database on the prevalence of will making in Australia was developed to identify who is or is not most likely to draw up a will and the triggers for making and changing wills. A national survey of 2,405 adults aged above 18 years was administered by telephone in August and September 2012. Fifty-nine percent of the Australian adult population has a valid will, and the likelihood of will making increases with age and estate value. Efforts to get organized, especially in combination with life-stage and asset changes, trigger will making; procrastination, rather than strong resistance, appears to explain not making a will. Understanding will making is timely in the context of predicted significant intergenerational transfers of wealth, changing demographics, and a renewed emphasis on retirement planning.

  3. Introduction to computer networking

    Robertazzi, Thomas G

    2017-01-01

    This book gives a broad look at both fundamental networking technology and new areas that support it and use it. It is a concise introduction to the most prominent, recent technological topics in computer networking. Topics include network technology such as wired and wireless networks, enabling technologies such as data centers, software defined networking, cloud and grid computing, and applications such as networks on chips, space networking and network security. The accessible writing style and non-mathematical treatment make this a useful book for the student, network and communications engineer, computer scientist and IT professional. • Features a concise, accessible treatment of computer networking, focusing on new technological topics; • Provides a non-mathematical introduction to networks in their most common forms today; • Includes new developments in switching, optical networks, WiFi, Bluetooth, LTE, 5G, and quantum cryptography.

  4. Computational neurology and psychiatry

    Bhattacharya, Basabdatta; Cochran, Amy

    2017-01-01

    This book presents the latest research in computational methods for modeling and simulating brain disorders. In particular, it shows how mathematical models can be used to study the relationship between a given disorder and the specific brain structure associated with that disorder. It also describes the emerging field of computational psychiatry, including the study of pathological behavior due to impaired functional connectivity, pathophysiological activity, and/or aberrant decision-making. Further, it discusses the data analysis techniques that will be required to analyze the increasing amount of data being generated about the brain. Lastly, the book offers some tips on the application of computational models in the field of quantitative systems pharmacology. Mainly written for computational scientists eager to discover new application fields for their model, this book also benefits neurologists and psychiatrists wanting to learn about new methods.

  5. Studi Perbandingan Layanan Cloud Computing

    Afdhal, Afdhal

    2013-01-01

    In the past few years, cloud computing has become a dominant topic in the IT area. Cloud computing offers hardware, infrastructure, platform and applications without requiring end-users' knowledge of the physical location and the configuration of the providers who deliver the services. It has been a good solution for increasing reliability, reducing computing cost, and creating opportunities for IT industries to gain more advantages. The purpose of this article is to present a better understanding of cloud d...

  6. Cloud Computing: Architecture and Services

    Ms. Ravneet Kaur

    2018-01-01

    Cloud computing is Internet-based computing, whereby shared resources, software, and information are provided to computers and other devices on demand, like the electricity grid. It is a method for delivering information technology (IT) services where resources are retrieved from the Internet through web-based tools and applications, as opposed to a direct connection to a server. Rather than keeping files on a proprietary hard drive or local storage device, cloud-based storage makes it possib...

  7. From computer to brain foundations of computational neuroscience

    Lytton, William W

    2002-01-01

    Biology undergraduates, medical students and life-science graduate students often have limited mathematical skills. Similarly, physics, math and engineering students have little patience for the detailed facts that make up much of biological knowledge. Teaching computational neuroscience as an integrated discipline requires that both groups be brought forward onto common ground. This book does this by making ancillary material available in an appendix and providing basic explanations without becoming bogged down in unnecessary details. The book will be suitable for undergraduates and beginning graduate students taking a computational neuroscience course and also to anyone with an interest in the uses of the computer in modeling the nervous system.

  8. Business making decisions

    Enrique Benjamín Franklin Fincowsky

    2011-06-01

    People and organizations do well or go wrong as a consequence of the decisions they make. Sometimes decision making is just a trial-and-error process; at other times decisions are good and the results profitable, with few mistakes, usually because the decision maker has experience and command of a specific field, or simply good intentions. In fact, every kind of decision brings learning; what matters are the intention, the attitude, and the values considered in the process. People in different settings face many facts and circumstances, almost always beyond their control, that affect the decision-making process. There is no single way of making decisions that suits all companies in all settings. Whoever makes a decision should first identify the problem and then solve it by weighing alternatives and solutions. Even so, following all the steps is not as easy as it seems. Regarding the conditions surrounding decisions, we can mention the following: uncertainty, risk, and certainty. When people can identify the circumstances and facts, as well as their effects on a possible situation, they make decisions with certainty. As the information decreases and becomes ambiguous, risk becomes an important factor in the decision-making process, because decisions are then tied to objectives that are either clear or subjective (opinion, judgment, or intuition). Finally, uncertainty involves making a decision with little or no information about the circumstances or criteria on which to base it.

  9. Computer programming and computer systems

    Hassitt, Anthony

    1966-01-01

    Computer Programming and Computer Systems imparts a "reading knowledge? of computer systems.This book describes the aspects of machine-language programming, monitor systems, computer hardware, and advanced programming that every thorough programmer should be acquainted with. This text discusses the automatic electronic digital computers, symbolic language, Reverse Polish Notation, and Fortran into assembly language. The routine for reading blocked tapes, dimension statements in subroutines, general-purpose input routine, and efficient use of memory are also elaborated.This publication is inten

  10. Decision Making Under Uncertainty

    2010-11-01

    A sound approach to rational decision making requires a decision maker to establish decision objectives, identify alternatives, and evaluate those... often violate the axioms of rationality when making decisions under uncertainty. The systematic description of such observations may lead to the... which leads to “anchoring” on the initial value. The fact that individuals have been shown to deviate from rationality when making decisions

  11. Computational biomechanics

    Ethier, C.R.

    2004-01-01

    Computational biomechanics is a fast-growing field that integrates modern biological techniques and computer modelling to solve problems of medical and biological interest. Modelling of blood flow in the large arteries is the best-known application of computational biomechanics, but there are many others. Described here is work being carried out in the laboratory on the modelling of blood flow in the coronary arteries and on the transport of viral particles in the eye. (author)

  12. A large-scale computer facility for computational aerodynamics

    Bailey, F.R.; Balhaus, W.F.

    1985-01-01

    The combination of computer system technology and numerical modeling has advanced to the point that computational aerodynamics has emerged as an essential element in aerospace vehicle design methodology. To provide for further advances in the modeling of aerodynamic flow fields, NASA has initiated at the Ames Research Center the Numerical Aerodynamic Simulation (NAS) Program. The objective of the Program is to develop a leading-edge, large-scale computer facility and make it available to NASA, DoD, other Government agencies, industry and universities as a necessary element in ensuring continuing leadership in computational aerodynamics and related disciplines. The Program will establish an initial operational capability in 1986 and systematically enhance that capability by incorporating evolving improvements in state-of-the-art computer system technologies as required to maintain a leadership role. This paper briefly reviews the present and future requirements for computational aerodynamics and discusses the Numerical Aerodynamic Simulation Program objectives, computational goals, and implementation plans.

  13. On teaching computer ethics within a computer science department.

    Quinn, Michael J

    2006-04-01

    The author has surveyed a quarter of the accredited undergraduate computer science programs in the United States. More than half of these programs offer a 'social and ethical implications of computing' course taught by a computer science faculty member, and there appears to be a trend toward teaching ethics classes within computer science departments. Although the decision to create an 'in house' computer ethics course may sometimes be a pragmatic response to pressure from the accreditation agency, this paper argues that teaching ethics within a computer science department can provide students and faculty members with numerous benefits. The paper lists topics that can be covered in a computer ethics course and offers some practical suggestions for making the course successful.

  14. The Research of the Parallel Computing Development from the Angle of Cloud Computing

    Peng, Zhensheng; Gong, Qingge; Duan, Yanyu; Wang, Yun

    2017-10-01

    Cloud computing is the development of parallel computing, distributed computing and grid computing. The development of cloud computing has brought parallel computing into people’s lives. Firstly, this paper expounds the concept of cloud computing and introduces several traditional parallel programming models. Secondly, it analyzes and studies the principles, advantages and disadvantages of OpenMP, MPI and MapReduce respectively. Finally, it compares the MPI and OpenMP models with MapReduce from the angle of cloud computing. The results of this paper are intended to provide a reference for the development of parallel computing.
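
    The map and reduce phases this comparison refers to can be sketched in a few lines of plain Python. This toy word count is an illustration only, not code from the paper; the names `map_phase` and `reduce_phase` are assumptions:

    ```python
    from collections import defaultdict
    from itertools import chain

    def map_phase(chunk):
        # map: emit a (word, 1) pair for each word in a text chunk
        return [(word, 1) for word in chunk.split()]

    def reduce_phase(pairs):
        # reduce: sum the counts for each key
        totals = defaultdict(int)
        for word, count in pairs:
            totals[word] += count
        return dict(totals)

    chunks = ["cloud computing", "parallel computing models", "cloud models"]
    mapped = chain.from_iterable(map_phase(c) for c in chunks)  # each chunk could be mapped on a separate node
    counts = reduce_phase(mapped)
    print(counts["computing"])  # 2
    ```

    In a real framework such as Hadoop, each `map_phase` call would run on a different node and the emitted pairs would be shuffled by key before reduction; the runtime, not the programmer, handles that distribution.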

  15. Computational Composites

    Vallgårda, Anna K. A.

    to understand the computer as a material like any other material we would use for design, like wood, aluminum, or plastic. That as soon as the computer forms a composition with other materials it becomes just as approachable and inspiring as other smart materials. I present a series of investigations of what... Computational Composite, and Telltale). Through the investigations, I show how the computer can be understood as a material and how it partakes in a new strand of materials whose expressions come to be in context. I uncover some of their essential material properties and potential expressions. I develop a way...

  16. Simply computing for seniors

    Clark, Linda

    2011-01-01

    Step-by-step instructions for seniors to get up and running on a home PC Answering the call for an up-to-date, straightforward computer guide targeted specifically for seniors, this helpful book includes easy-to-follow tutorials that escort you through the basics and shows you how to get the most out of your PC. Boasting an elegant, full-color interior with a clean, sophisticated look and feel, the layout makes it easy for you to find the information you need quickly. Author Linda Clark has earned her highly respected reputation through years of teaching computers at both the beginnin

  17. Optimal usage of computing grid network in the fields of nuclear fusion computing task

    Tenev, D.

    2006-01-01

    Nowadays nuclear power is becoming a main source of energy. To make its use more efficient, scientists have created complicated simulation models, which require powerful computers. Grid computing is the answer to the need for powerful and accessible computing resources. The article examines and estimates the optimal configuration of the grid environment for complicated nuclear fusion computing tasks. (author)

  18. Modelling decision-making by pilots

    Patrick, Nicholas J. M.

    1993-01-01

    Our scientific goal is to understand the process of human decision-making. Specifically, a model of human decision-making in piloting modern commercial aircraft which prescribes optimal behavior, and against which we can measure human sub-optimality is sought. This model should help us understand such diverse aspects of piloting as strategic decision-making, and the implicit decisions involved in attention allocation. Our engineering goal is to provide design specifications for (1) better computer-based decision-aids, and (2) better training programs for the human pilot (or human decision-maker, DM).

  19. Nickel: makes stainless steel strong

    Boland, Maeve A.

    2012-01-01

    Nickel is a silvery-white metal that is used mainly to make stainless steel and other alloys stronger and better able to withstand extreme temperatures and corrosive environments. Nickel was first identified as a unique element in 1751 by Baron Axel Fredrik Cronstedt, a Swedish mineralogist and chemist. He originally called the element kupfernickel because it was found in rock that looked like copper (kupfer) ore and because miners thought that "bad spirits" (nickel) in the rock were making it difficult for them to extract copper from it. Approximately 80 percent of the primary (not recycled) nickel consumed in the United States in 2011 was used in alloys, such as stainless steel and superalloys. Because nickel increases an alloy's resistance to corrosion and its ability to withstand extreme temperatures, equipment and parts made of nickel-bearing alloys are often used in harsh environments, such as those in chemical plants, petroleum refineries, jet engines, power generation facilities, and offshore installations. Medical equipment, cookware, and cutlery are often made of stainless steel because it is easy to clean and sterilize. All U.S. circulating coins except the penny are made of alloys that contain nickel. Nickel alloys are increasingly being used in making rechargeable batteries for portable computers, power tools, and hybrid and electric vehicles. Nickel is also plated onto such items as bathroom fixtures to reduce corrosion and provide an attractive finish.

  20. Visualizing a silicon quantum computer

    Sanders, Barry C; Hollenberg, Lloyd C L; Edmundson, Darran; Edmundson, Andrew

    2008-01-01

    Quantum computation is a fast-growing, multi-disciplinary research field. The purpose of a quantum computer is to execute quantum algorithms that efficiently solve computational problems intractable within the existing paradigm of 'classical' computing built on bits and Boolean gates. While collaboration between computer scientists, physicists, chemists, engineers, mathematicians and others is essential to the project's success, traditional disciplinary boundaries can hinder progress and make communicating the aims of quantum computing and future technologies difficult. We have developed a four minute animation as a tool for representing, understanding and communicating a silicon-based solid-state quantum computer to a variety of audiences, either as a stand-alone animation to be used by expert presenters or embedded into a longer movie as short animated sequences. The paper includes a generally applicable recipe for successful scientific animation production.

  3. Decision-making based on emotional images.

    Katahira, Kentaro; Fujimura, Tomomi; Okanoya, Kazuo; Okada, Masato

    2011-01-01

    The emotional outcome of a choice affects subsequent decision making. While the relationship between decision making and emotion has attracted attention, studies of emotion and of decision making have developed largely independently. In this study, we investigated how the emotional valence of pictures, which was stochastically contingent on participants' choices, influenced subsequent decision making. In contrast to traditional value-based decision-making studies that used money or food as a reward, the "reward value" of the decision outcome, which guided the update of value for each choice, is unknown beforehand. To estimate the reward value of emotional pictures from participants' choice data, we used reinforcement learning models that have been applied successfully in previous studies of value-based decision making. We found that the estimated reward value was asymmetric between positive and negative pictures: the negative reward value of negative pictures (relative to neutral pictures) was larger in magnitude than the positive reward value of positive pictures. This asymmetry was not observed in the valence ratings of individual pictures, in which participants reported the emotion experienced upon viewing each one. These results suggest that there may be a difference between experienced emotion and the effect of that emotion on subsequent behavior. Our experimental and computational paradigm provides a novel way of quantifying how, and what aspects of, emotional events affect human behavior. The present study is a first step toward bringing the large body of knowledge in emotion science together with computational approaches to value-based decision making.
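
    Reinforcement learning models of this kind are typically delta-rule learners with softmax choice. A minimal, self-contained sketch follows; it is a generic model assumed for illustration, not the authors' exact one, and the asymmetric `reward_pos`/`reward_neg` values are hypothetical:

    ```python
    import math
    import random

    def softmax(values, beta=3.0):
        # convert action values into choice probabilities
        exps = [math.exp(beta * v) for v in values]
        total = sum(exps)
        return [e / total for e in exps]

    def simulate(reward_pos=0.5, reward_neg=-1.0, alpha=0.2, trials=2000, seed=0):
        # two options: 0 yields positive pictures, 1 yields negative ones;
        # the negative reward value is larger in magnitude, mirroring the asymmetry
        rng = random.Random(seed)
        q = [0.0, 0.0]            # learned values for each option
        choices = []
        for _ in range(trials):
            probs = softmax(q)
            choice = 0 if rng.random() < probs[0] else 1
            reward = reward_pos if choice == 0 else reward_neg
            q[choice] += alpha * (reward - q[choice])  # delta-rule update
            choices.append(choice)
        return q, choices

    q, choices = simulate()
    print(q, sum(choices[-100:]))  # learned values; count of negative-option picks late on
    ```

    Fitting such a model to real choice data would treat `reward_pos`, `reward_neg`, `alpha`, and `beta` as free parameters estimated by maximum likelihood; here the simulation just shows how asymmetric reward values steer later choices away from the negative option.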

  4. Making Team Differences Work

    Strathman, Beth

    2015-01-01

    Most district and school leaders understand that recruiting group members who have differing backgrounds, perspectives, talents, and personalities makes for good decision-making. Unfortunately, simply assembling a variety of top-notch individuals does not necessarily mean their talents and perspectives will be fully considered. Beth Strathman…

  5. Making Smart Food Choices

    Feature: Healthy Aging, Past Issues / Winter 2015 (NIH, www.nia.nih.gov/Go4Life). To maintain a healthy weight, balance the calories ...

  6. It Makes You Think

    Harden, Helen

    2009-01-01

    This article provides an overview of the "It Makes You Think" resource. The lessons provided by this resource show how students can learn about the global dimension through science. The "It Makes You Think" resource contains ten topics: (1) Metals in jewellery worldwide; (2) Global food market; (3) The worldwide travels of…

  7. Variation in decision making

    Dall, Sasha R. X.; Gosling, Samuel; Brown, Gordon D. A.; Dingemanse, Niels; Erev, Ido; Kocher, Martin; Schulz, Laura; Todd, Peter M.; Weissing, Franz; Wolf, Max; Hammerstein, Peter; Stevens, Jeffrey R.

    2012-01-01

    Variation in how organisms allocate their behavior over their lifetimes is key to determining Darwinian fitness, and thus to the evolution of human and nonhuman decision making. This chapter explores how decision making varies across biologically and societally significant scales and what role such

  8. Making Healthy Choices Easier

    Guldborg Hansen, Pelle; Skov, Laurits Rohden; Lund Skov, Katrine

    2016-01-01

    ... However, integration and testing of the nudge approach as part of more comprehensive public health strategies aimed at making healthy choices easier is being threatened by inadequate understanding of its scientific character, its relationship with regulation, and its ethical implications. This article reviews ... working with or incorporating the nudge approach into programs or policies aimed at making healthy choices easier ...

  9. [Decision making in cariology]

    Verdonschot, E.H.A.M.; Liem, S.L.; Palenstein Helderman, W.H. van

    2003-01-01

    During oral examinations, radiographic examinations, and treatment planning, dentists make numerous decisions. A dentist will be required to make these decisions explicit. Decision trees and decision analyses may play an important role. In a decision analysis, the
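    The decision analysis the abstract mentions can be sketched as an expected-utility calculation over a small decision tree. The options, probabilities, and utilities below are purely hypothetical numbers for illustration, not clinical data:

```python
def expected_utility(branches):
    """Expected utility of one decision option: sum of p * utility over its chance branches."""
    total_p = sum(p for p, _ in branches)
    assert abs(total_p - 1.0) < 1e-9, "branch probabilities must sum to 1"
    return sum(p * u for p, u in branches)

# Hypothetical two-option tree: restore a carious lesion now vs. monitor and remineralize.
# Each branch is a (probability, utility) pair for a possible clinical outcome.
options = {
    "restore now": [(0.85, 0.90), (0.15, 0.40)],
    "monitor":     [(0.60, 1.00), (0.40, 0.55)],
}
best = max(options, key=lambda name: expected_utility(options[name]))
for name, branches in options.items():
    print(name, round(expected_utility(branches), 3))
print("choose:", best)
```

Making the probabilities and utilities explicit, as a decision tree forces the dentist to do, is exactly what turns an implicit judgement into an analyzable one.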

  10. Culinary Decision Making.

    Curtis, Rob

    1987-01-01

    Advises directors of ways to include day care workers in the decision-making process. Enumerates benefits of using staff to help focus and direct changes in the day care center and discusses possible pitfalls in implementation of a collective decision-making approach to management. (NH)

  11. Preventive maintenance for computer systems - concepts & issues ...

    Performing preventive maintenance activities for the computer is not optional. The computer is a sensitive and delicate device that needs adequate time and attention to make it work properly. In this paper, the concept and issues on how to prolong the life span of the system, that is, the way to make the system last long and ...

  12. Libre courseware for Bayesian decision making

    Suzdaleva, Evgenia

    2005-01-01

    Roč. 1, č. 2 (2005), s. 1-3 ISSN 1860-7470 Grant - others:Commission EU(XE) 110330-CP-1-2003-1-ES-MINERVA-M Institutional research plan: CEZ:AV0Z10750506 Keywords : software tools * education * decision making Subject RIV: JC - Computer Hardware ; Software

  13. Assessing Professional Decision-Making Abilities.

    McNergney, Robert; Hinson, Stephanie

    1985-01-01

    Describes Teacher Development Decision Exercises, a computer-based method of diagnosing abilities of elementary and secondary school supervisors (principals, staff developers, curriculum coordinators) to make professional preactive or planning decisions. This approach simulates assessment of supervisors' abilities to use professional knowledge to…

  14. GPGPU COMPUTING

    BOGDAN OANCEA

    2012-05-01

    Since the first idea of using GPUs for general-purpose computing, things have evolved over the years, and there are now several approaches to GPU programming. GPU computing practically began with the introduction of CUDA (Compute Unified Device Architecture) by NVIDIA and Stream by AMD. These are APIs designed by the GPU vendors to be used together with the hardware that they provide. A new emerging standard, OpenCL (Open Computing Language), tries to unify different GPU general computing API implementations and provides a framework for writing programs executed across heterogeneous platforms consisting of both CPUs and GPUs. OpenCL provides parallel computing using task-based and data-based parallelism. In this paper we focus on the CUDA parallel computing architecture and programming model introduced by NVIDIA. We present the benefits of the CUDA programming model, and we compare the two main approaches, CUDA and AMD APP (Stream), with the new framework, OpenCL, that tries to unify the GPGPU computing models.
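    Real CUDA or OpenCL code needs a GPU toolchain, but the data-parallel "kernel" idea at the heart of these models can be sketched in plain Python: the same function body runs once per global index, exactly as each CUDA thread computes one element of the output. The names `saxpy_kernel` and `launch` are hypothetical stand-ins for a kernel and a grid launch, and the sequential loop here only mimics what a GPU would run concurrently:

```python
def saxpy_kernel(global_id, a, x, y, out):
    """One 'thread': computes a single element, as a CUDA kernel body would
    (out[i] = a * x[i] + y[i] for thread index i)."""
    out[global_id] = a * x[global_id] + y[global_id]

def launch(kernel, n, *args):
    """Stand-in for a grid launch: run the kernel once per global index.
    On a GPU these executions would run concurrently across many threads."""
    for gid in range(n):
        kernel(gid, *args)

n = 5
x = [float(i) for i in range(n)]   # [0.0, 1.0, 2.0, 3.0, 4.0]
y = [1.0] * n
out = [0.0] * n
launch(saxpy_kernel, n, 2.0, x, y, out)
print(out)  # → [1.0, 3.0, 5.0, 7.0, 9.0]
```

The "data-based parallelism" the abstract attributes to OpenCL is precisely this: no loop-carried dependence between elements, so the iterations can be distributed freely over hardware threads.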

  15. Quantum Computing

    Quantum Computing - Building Blocks of a Quantum Computer. C. S. Vijay and Vishal Gupta. General Article, Resonance – Journal of Science Education, Volume 5, Issue 9, September 2000, pp. 69-81.

  16. Platform computing

    2002-01-01

    "Platform Computing releases first grid-enabled workload management solution for IBM eServer Intel and UNIX high performance computing clusters. This Out-of-the-box solution maximizes the performance and capability of applications on IBM HPC clusters" (1/2 page) .

  17. Quantum computing

    Burba, M.; Lapitskaya, T.

    2017-01-01

    This article gives an elementary introduction to quantum computing. It is a draft for a book chapter of the "Handbook of Nature-Inspired and Innovative Computing", Eds. A. Zomaya, G.J. Milburn, J. Dongarra, D. Bader, R. Brent, M. Eshaghian-Wilner, F. Seredynski (Springer, Berlin Heidelberg New York, 2006).

  18. Computational Pathology

    Louis, David N.; Feldman, Michael; Carter, Alexis B.; Dighe, Anand S.; Pfeifer, John D.; Bry, Lynn; Almeida, Jonas S.; Saltz, Joel; Braun, Jonathan; Tomaszewski, John E.; Gilbertson, John R.; Sinard, John H.; Gerber, Georg K.; Galli, Stephen J.; Golden, Jeffrey A.; Becich, Michael J.

    2016-01-01

    Context We define the scope and needs within the new discipline of computational pathology, a discipline critical to the future of both the practice of pathology and, more broadly, medical practice in general. Objective To define the scope and needs of computational pathology. Data Sources A meeting was convened in Boston, Massachusetts, in July 2014 prior to the annual Association of Pathology Chairs meeting, and it was attended by a variety of pathologists, including individuals highly invested in pathology informatics as well as chairs of pathology departments. Conclusions The meeting made recommendations to promote computational pathology, including clearly defining the field and articulating its value propositions; asserting that the value propositions for health care systems must include means to incorporate robust computational approaches to implement data-driven methods that aid in guiding individual and population health care; leveraging computational pathology as a center for data interpretation in modern health care systems; stating that realizing the value proposition will require working with institutional administrations, other departments, and pathology colleagues; declaring that a robust pipeline should be fostered that trains and develops future computational pathologists, for those with both pathology and non-pathology backgrounds; and deciding that computational pathology should serve as a hub for data-related research in health care systems. The dissemination of these recommendations to pathology and bioinformatics departments should help facilitate the development of computational pathology. PMID:26098131

  19. Decision making uncertainty, imperfection, deliberation and scalability

    Kárný, Miroslav; Wolpert, David

    2015-01-01

    This volume focuses on uncovering the fundamental forces underlying dynamic decision making among multiple interacting, imperfect and selfish decision makers. The chapters are written by leading experts from different disciplines, all considering the many sources of imperfection in decision making, and always with an eye to decreasing the myriad discrepancies between theory and real-world human decision making. Topics addressed include uncertainty, deliberation cost and the complexity arising from the inherent large computational scale of decision making in these systems. In particular, analyses and experiments are presented which concern: • task allocation to maximize “the wisdom of the crowd”; • design of a society of “edutainment” robots who account for one another's emotional states; • recognizing and counteracting seemingly non-rational human decision making; • coping with extreme scale when learning causality in networks; • efficiently incorporating expert knowledge in personalized...

  20. Cloud Computing

    Krogh, Simon

    2013-01-01

    with technological changes, the paradigmatic pendulum has swung between increased centralization on one side and a focus on distributed computing that pushes IT power out to end users on the other. With the introduction of outsourcing and cloud computing, centralization in large data centers is again dominating...... the IT scene. In line with the views presented by Nicolas Carr in 2003 (Carr, 2003), it is a popular assumption that cloud computing will be the next utility (like water, electricity and gas) (Buyya, Yeo, Venugopal, Broberg, & Brandic, 2009). However, this assumption disregards the fact that most IT production......), for instance, in establishing and maintaining trust between the involved parties (Sabherwal, 1999). So far, research in cloud computing has neglected this perspective and focused entirely on aspects relating to technology, economy, security and legal questions. While the core technologies of cloud computing (e...

  1. Computability theory

    Weber, Rebecca

    2012-01-01

    What can we compute--even with unlimited resources? Is everything within reach? Or are computations necessarily drastically limited, not just in practice, but theoretically? These questions are at the heart of computability theory. The goal of this book is to give the reader a firm grounding in the fundamentals of computability theory and an overview of currently active areas of research, such as reverse mathematics and algorithmic randomness. Turing machines and partial recursive functions are explored in detail, and vital tools and concepts including coding, uniformity, and diagonalization are described explicitly. From there the material continues with universal machines, the halting problem, parametrization and the recursion theorem, and thence to computability for sets, enumerability, and Turing reduction and degrees. A few more advanced topics round out the book before the chapter on areas of research. The text is designed to be self-contained, with an entire chapter of preliminary material including re...
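    The halting problem mentioned in the abstract can be sketched directly: assume some decider `halts(program)` existed, then diagonalize by building a program that does the opposite of whatever the decider predicts about it. The names below are illustrative, and the `while True` branch is of course never actually entered in this demonstration:

```python
def paradox(halts):
    """Diagonalization: given a claimed halting decider, build a program it must get wrong."""
    def troublemaker():
        # Do the opposite of whatever the decider predicts about us.
        if halts(troublemaker):
            while True:      # decider said 'halts' -> loop forever
                pass
        # decider said 'loops' -> halt immediately
    return troublemaker

# Any concrete decider is refuted. One that always answers 'halts':
always_yes = lambda prog: True
t = paradox(always_yes)
print(always_yes(t), "- yet t would loop forever if run")

# And one that always answers 'loops':
always_no = lambda prog: False
t2 = paradox(always_no)
t2()  # halts immediately, refuting always_no
print(always_no(t2), "- yet t2 just halted")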

  2. Computational Streetscapes

    Paul M. Torrens

    2016-09-01

    Streetscapes have presented a long-standing interest in many fields. Recently, there has been a resurgence of attention on streetscape issues, catalyzed in large part by computing. Because of computing, there is more understanding, vistas, data, and analysis of and on streetscape phenomena than ever before. This diversity of lenses trained on streetscapes permits us to address long-standing questions, such as how people use information while mobile, how interactions with people and things occur on streets, how we might safeguard crowds, how we can design services to assist pedestrians, and how we could better support special populations as they traverse cities. Amid each of these avenues of inquiry, computing is facilitating new ways of posing these questions, particularly by expanding the scope of what-if exploration that is possible. With assistance from computing, consideration of streetscapes now reaches across scales, from the neurological interactions that form among place cells in the brain up to informatics that afford real-time views of activity over whole urban spaces. For some streetscape phenomena, computing allows us to build realistic but synthetic facsimiles in computation, which can function as artificial laboratories for testing ideas. In this paper, I review the domain science for studying streetscapes from vantages in physics, urban studies, animation and the visual arts, psychology, biology, and behavioral geography. I also review the computational developments shaping streetscape science, with particular emphasis on modeling and simulation as informed by data acquisition and generation, data models, path-planning heuristics, artificial intelligence for navigation and way-finding, timing, synthetic vision, steering routines, kinematics, and geometrical treatment of collision detection and avoidance. I also discuss the implications that the advances in computing streetscapes might have on emerging developments in cyber

  3. Computer methods in general relativity: algebraic computing

    Araujo, M E; Skea, J E F; Koutras, A; Krasinski, A; Hobill, D; McLenaghan, R G; Christensen, S M

    1993-01-01

    Karlhede & MacCallum [1] gave a procedure for determining the Lie algebra of the isometry group of an arbitrary pseudo-Riemannian manifold, which they intended to implement using the symbolic manipulation package SHEEP but never did. We have recently finished making this procedure explicit by giving an algorithm suitable for implementation on a computer [2]. Specifically, we have written an algorithm for determining the isometry group of a spacetime (in four dimensions), and partially implemented this algorithm using the symbolic manipulation package CLASSI, which is an extension of SHEEP.

  4. Computed Tomography Status

    Hansche, B. D.

    1983-01-01

    Computed tomography (CT) is a relatively new radiographic technique which has become widely used in the medical field, where it is better known as computerized axial tomographic (CAT) scanning. This technique is also being adopted by the industrial radiographic community, although the greater range of densities, the variation in sample sizes, and the possible requirement for finer resolution make it difficult to duplicate the excellent results that the medical scanners have achieved.

  5. Microcomputers and computer networks

    Owens, J.L.

    1976-01-01

    Computers, for all their speed and efficiency, have their foibles and failings. Until the advent of minicomputers, users often had to supervise their programs personally to make sure they executed correctly. Minicomputers could take over some of these chores, but they were too expensive to be dedicated to any but the most vital services. Inexpensive, easily programmed microcomputers are easing this limitation, and permitting a flood of new applications. 3 figures

  6. WHAT MAKES CHEMISTRY DIFFICULT?

    IICBA01

    School of Natural and Computational Science, Dire Dawa University, Ethiopia. ... lack of teaching aids and the difficulty of the language of chemistry. ... lab every other week consisting of concept pretests on the web, hand-written homework, ...

  7. COMPUTATIONAL THINKING

    Evgeniy K. Khenner

    2016-01-01

    Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed over the last decade in the foreign scientific and educational literature, and to substantiate its importance, practical utility and right to a place in Russian education. Methods. The research is based on an analysis of foreign studies of the phenomenon of computational thinking and the ways it is formed in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the point of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved along with the development of computer hardware and software. The practice-oriented interpretation of computational thinking that dominates among educators is described, along with some ways of forming it. It is shown that computational thinking is both a metasubject result of general education and one of its tools. In the author's view, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning, and describes the dynamics of the development of this concept, a process connected with the evolution of computer and information technologies and the growing number of tasks whose effective solution requires computational thinking. The author substantiates the claim that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  8. Decision making and cancer.

    Reyna, Valerie F; Nelson, Wendy L; Han, Paul K; Pignone, Michael P

    2015-01-01

    We review decision making along the cancer continuum in the contemporary context of informed and shared decision making in which patients are encouraged to take a more active role in their health care. We discuss challenges to achieving informed and shared decision making, including cognitive limitations and emotional factors, but argue that understanding the mechanisms of decision making offers hope for improving decision support. Theoretical approaches to decision making that explain cognition, emotion, and their interaction are described, including classical psychophysical approaches, dual-process approaches that focus on conflicts between emotion versus cognition (or reason), and modern integrative approaches such as fuzzy-trace theory. In contrast to the earlier emphasis on rote use of numerical detail, modern approaches emphasize understanding the bottom-line gist of options (which encompasses emotion and other influences on meaning) and retrieving relevant social and moral values to apply to those gist representations. Finally, research on interventions to support better decision making in clinical settings is reviewed, drawing out implications for future research on decision making and cancer. PsycINFO Database Record (c) 2015 APA, all rights reserved.

  9. Decision making in water resource planning: Models and computer graphics

    Fedra, K; Carlsen, A J [ed.

    1987-01-01

    This paper describes some basic concepts of simulation-based decision support systems for water resources management and the role of symbolic, graphics-based user interfaces. Designed to allow direct and easy access to advanced methods of analysis and decision support for a broad and heterogeneous group of users, these systems combine data base management, system simulation, operations research techniques such as optimization, interactive data analysis, elements of advanced decision technology, and artificial intelligence, with a friendly and conversational, symbolic display oriented user interface. Important features of the interface are the use of several parallel or alternative styles of interaction and display, including colour graphics and natural language. Combining quantitative numerical methods with qualitative and heuristic approaches, and giving the user direct and interactive control over the system's function, human knowledge, experience and judgement are integrated with formal approaches into a tightly coupled man-machine system through an intelligent and easily accessible user interface. 4 drawings, 42 references.

  10. A Computer Simulation of Organizational Decision-Making.

    1979-12-01

    into detail on the representation of the linkages between the variables. The next variable to be considered is the organizational outcome. Like ... D.A., Organizational Learning: A Theory of Action Perspective; Addison-Wesley, Reading, Mass., 1978. Cohen, M.D., and March, J.G., Leadership and

  11. Can Computers Make the Grade in Writing Exams?

    Hadi-Tabassum, Samina

    2014-01-01

    Schools are scrambling to prepare students for the writing assessments aligned to the Common Core State Standards. In some states, writing has not been assessed for over a decade. Yet, with the use of computerized grading of the student's writing, many teachers are wondering how to best prepare students for the writing assessments that will…

  12. Introduction to computer ethics

    Ćorić Dragana M.

    2015-01-01

    Ethics is becoming one of the most often used but also most misinterpreted words. It is often taken as an additional, corrective parameter to the policies and strategies that have to be adopted in the areas of politics, the environment, business, and medicine. Computer ethics is thus the latest ethical discipline in the scientific sky. But its roots, as was the case with environmental ethics, go back decades; only the discussion and use of the term, as well as debates on the postulates of computer ethics, are the results of rapid IT development in the last decade or two. In this paper, in keeping with the title, an introduction to computer ethics will be presented: its basis, its most important representatives, and the most important lines of succession.

  13. Computer aided surface representation

    Barnhill, R.E.

    1990-02-19

    The central research problem of this project is the effective representation, computation, and display of surfaces interpolating to information in three or more dimensions. If the given information is located on another surface, then the problem is to construct a "surface defined on a surface". Sometimes properties of an already defined surface are desired, which is "geometry processing". Visualization of multivariate surfaces is possible by means of contouring higher dimensional surfaces. These problems and more are discussed below. The broad sweep from constructive mathematics through computational algorithms to computer graphics illustrations is utilized in this research. The breadth and depth of this research activity makes this research project unique.

  14. Computer Security Day

    CERN Bulletin

    2010-01-01

    Viruses, phishing, malware and cyber-criminals can all threaten your computer and your data, even at CERN! Experts will share their experience with you and offer solutions to keep your computer secure. Thursday, 10 June 2010, 9.30, Council Chamber. Make a note in your diary! Presentations in French and English: How do hackers break into your computer? Quels sont les enjeux et conséquences des attaques informatiques contre le CERN ? How do criminals steal your money on the Internet? Comment utiliser votre ordinateur de manière sécurisée ? and a quiz: test your knowledge and win one of the many prizes that will be on offer! For more information and to follow the day's events via a live webcast go to: http://cern.ch/SecDay.

  15. Computational physics

    Newman, Mark

    2013-01-01

    A complete introduction to the field of computational physics, with examples and exercises in the Python programming language. Computers play a central role in virtually every major physics discovery today, from astrophysics and particle physics to biophysics and condensed matter. This book explains the fundamentals of computational physics and describes in simple terms the techniques that every physicist should know, such as finite difference methods, numerical quadrature, and the fast Fourier transform. The book offers a complete introduction to the topic at the undergraduate level, and is also suitable for the advanced student or researcher who wants to learn the foundational elements of this important field.
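    One of the techniques the book names, finite difference methods, can be shown in a few lines. This is a generic central-difference sketch (not an excerpt from the book): it approximates d/dx sin(x) = cos(x), and the error shrinks roughly with h², the method's second-order accuracy.

```python
import math

def central_difference(f, x, h):
    """Second-order central-difference approximation to f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

x = 1.0
exact = math.cos(x)  # true derivative of sin at x
for h in (0.1, 0.01, 0.001):
    approx = central_difference(math.sin, x, h)
    # error drops ~100x for each 10x reduction in h (O(h^2) truncation error)
    print(f"h={h:<6} error={abs(approx - exact):.2e}")
```

Checking the error against a known analytic derivative like this is the standard sanity test before applying the same stencil to problems without closed-form answers.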

  16. Cloud Computing

    Baun, Christian; Nimis, Jens; Tai, Stefan

    2011-01-01

    Cloud computing is a buzz-word in today's information technology (IT) that nobody can escape. But what is really behind it? There are many interpretations of this term, but no standardized or even uniform definition. Instead, as a result of the multi-faceted viewpoints and the diverse interests expressed by the various stakeholders, cloud computing is perceived as a rather fuzzy concept. With this book, the authors deliver an overview of cloud computing architecture, services, and applications. Their aim is to bring readers up to date on this technology and thus to provide a common basis for d

  17. Computational Viscoelasticity

    Marques, Severino P C

    2012-01-01

    This text is a guide to solving problems in which viscoelasticity is present using existing commercial computational codes. The book gives information on the codes' structure and use, data preparation, and output interpretation and verification. The first part of the book introduces the reader to the subject and provides the models, equations and notation to be used in the computational applications. The second part shows the most important computational techniques: finite element formulation and boundary element formulation, and presents the solutions of viscoelastic problems with Abaqus.

  18. Optical computing.

    Stroke, G. W.

    1972-01-01

    Applications of the optical computer include an approach for increasing the sharpness of images obtained from the most powerful electron microscopes and fingerprint/credit card identification. The information-handling capability of the various optical computing processes is very great. Modern synthetic-aperture radars scan upward of 100,000 resolvable elements per second. Fields which have assumed major importance on the basis of optical computing principles are optical image deblurring, coherent side-looking synthetic-aperture radar, and correlative pattern recognition. Some examples of the most dramatic image deblurring results are shown.

  19. Phenomenological Computation?

    Brier, Søren

    2014-01-01

    Open peer commentary on the article “Info-computational Constructivism and Cognition” by Gordana Dodig-Crnkovic. Upshot: The main problems with info-computationalism are: (1) its basic concept of natural computing has neither been defined theoretically nor implemented practically; (2) it cannot encompass human concepts of subjective experience and intersubjective meaningful communication, which prevents it from being genuinely transdisciplinary; (3) philosophically, it does not sufficiently accept the deep ontological differences between various paradigms such as von Foerster’s second-order...

  20. Sharing experience and knowledge with wearable computers

    Nilsson, Marcus; Drugge, Mikael; Parnes, Peter

    2004-01-01

    Wearable computers have mostly been studied when used in isolation. But a wearable computer with an Internet connection is a good tool for communication and for sharing knowledge and experience with other people. The unobtrusiveness of this type of equipment makes it easy to communicate in most types of locations and contexts. The wearable computer makes it easy to be a mediator of other people's knowledge and to become a knowledgeable user. This paper describes the experience gained from testing...

  1. Embracing the quantum limit in silicon computing.

    Morton, John J L; McCamey, Dane R; Eriksson, Mark A; Lyon, Stephen A

    2011-11-16

    Quantum computers hold the promise of massive performance enhancements across a range of applications, from cryptography and databases to revolutionary scientific simulation tools. Such computers would make use of the same quantum mechanical phenomena that pose limitations on the continued shrinking of conventional information processing devices. Many of the key requirements for quantum computing differ markedly from those of conventional computers. However, silicon, which plays a central part in conventional information processing, has many properties that make it a superb platform around which to build a quantum computer. © 2011 Macmillan Publishers Limited. All rights reserved
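    The quantum mechanical phenomena the abstract refers to can be made concrete with a toy one-qubit state-vector simulation (a generic illustration, not specific to silicon spin qubits): a Hadamard gate puts |0⟩ into an equal superposition, measurement probabilities are squared amplitude magnitudes, and applying the gate again recovers |0⟩ because the gate is its own inverse.

```python
import math

# One-qubit state as a pair of complex amplitudes: alpha|0> + beta|1>.
ket0 = (1 + 0j, 0 + 0j)

def apply_gate(gate, state):
    """Apply a 2x2 unitary matrix to a single-qubit state vector."""
    (a, b), (c, d) = gate
    alpha, beta = state
    return (a * alpha + b * beta, c * alpha + d * beta)

s = 1 / math.sqrt(2)
H = ((s, s), (s, -s))  # Hadamard gate

plus = apply_gate(H, ket0)                 # equal superposition of |0> and |1>
probs = [abs(amp) ** 2 for amp in plus]    # Born rule: measurement probabilities
print([round(p, 3) for p in probs])        # → [0.5, 0.5]

back = apply_gate(H, plus)                 # H is its own inverse: back to |0>
print(round(abs(back[0]) ** 2, 3))         # → 1.0
```

A classical simulation like this needs 2^n amplitudes for n qubits, which is precisely why a physical quantum computer, in silicon or otherwise, promises an advantage.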

  2. Savannah River Site computing architecture

    1991-03-29

    A computing architecture is a framework for making decisions about the implementation of computer technology and the supporting infrastructure. Because of the size, diversity, and amount of resources dedicated to computing at the Savannah River Site (SRS), there must be an overall strategic plan that can be followed by the thousands of site personnel who make decisions daily that directly affect the SRS computing environment and impact the site's production and business systems. This plan must address the following requirements: There must be SRS-wide standards for procurement or development of computing systems (hardware and software). The site computing organizations must develop systems that end users find easy to use. Systems must be put in place to support the primary function of site information workers. The developers of computer systems must be given tools that automate and speed up the development of information systems and applications based on computer technology. This document describes a proposal for a site-wide computing architecture that addresses the above requirements. In summary, this architecture is standards-based data-driven, and workstation-oriented with larger systems being utilized for the delivery of needed information to users in a client-server relationship.

  4. Preparing Future Secondary Computer Science Educators

    Ajwa, Iyad

    2007-01-01

    Although nearly every college offers a major in computer science, many computer science teachers at the secondary level have received little formal training. This paper presents details of a project that could make a significant contribution to national efforts to improve computer science education by combining teacher education and professional…

  5. Computer models for economic and silvicultural decisions

    Rosalie J. Ingram

    1989-01-01

    Computer systems can help simplify decision making to manage forest ecosystems. We now have computer models to help make forest management decisions by predicting changes associated with a particular management action. Models also help you evaluate alternatives. To be effective, the computer models must be reliable and appropriate for your situation.

  6. Computers, Nanotechnology and Mind

    Ekdahl, Bertil

    2008-10-01

    In 1958, two years after the Dartmouth conference, where the term artificial intelligence was coined, Herbert Simon and Allen Newell asserted the existence of "machines that think, that learn and create." They further prophesied that the machines' capacity would increase and be on par with the human mind. Now, 50 years later, computers perform many more tasks than one could imagine in the 1950s but, virtually, no computer can do more than could the first digital computer, developed by John von Neumann in the 1940s. Computers still follow algorithms; they do not create them. However, the development of nanotechnology seems to have given rise to new hopes. With nanotechnology two things are supposed to happen. Firstly, due to the small scale, it will be possible to construct huge computer memories, which are supposed to be the precondition for building an artificial brain; secondly, nanotechnology will make it possible to scan the brain, which in turn will make reverse engineering possible: the mind will be decoded by studying the brain. The consequence of such a belief is that the brain is no more than a calculator, i.e., all that the mind can do is in principle the result of arithmetical operations. Computers are equivalent to formal systems, which in turn were an answer to an idea by Hilbert that proofs should contain ideal statements to which operations cannot be applied in a contentual way. The advocates of artificial intelligence will place content in a machine that is developed not only to be free of content but also cannot contain content. In this paper I argue that the hope for artificial intelligence is in vain.

  7. Making people be healthy.

    Wilkinson, Timothy Martin

    2009-09-01

    How are we supposed to decide the rights and wrongs of banning smoking in bars, restricting adverts for junk food, nagging people into being screened for cancers, or banning the sale of party pills? The aim of this paper is to think through the political ethics of trying to make people healthier through influencing or restricting their choices. This paper covers: (1) Paternalism. What it is, what it assumes. (2) The place of health in well-being, and how this makes paternalism problematic. (3) The mistakes people make in acting in their own interests, and the implications for pro-health paternalism. (4) Autonomy objections to paternalism. The paper (5) finishes on a note of hope, by commending the currently fashionable libertarian paternalism: trying to have one's carrot cake and eat it too. A persistent theme is that thinking sensibly about making people healthier needs subtlety, not broad, ringing declarations.

  8. Making Ceramic Cameras

    Squibb, Matt

    2009-01-01

    This article describes how to make a clay camera. This idea of creating functional cameras from clay allows students to experience ceramics, photography, and painting all in one unit. (Contains 1 resource and 3 online resources.)

  9. Interactive Strategy-Making

    Andersen, Torben Juul

    2015-01-01

    This article outlines an interactive strategy-making model that combines central reasoning with ongoing learning from decentralised responses. The management literature often presents strategy as implementing an optimal plan identified through rational analysis and ascribes potential shortcomings to failed communication and execution of the planned actions. However, effective strategy-making comprises both central reasoning from forward-looking planning considerations and decentralised responses to emerging events as interacting elements in a dynamic adaptive system. The interaction between…

  10. Organizational decision making

    Grandori, Anna

    2015-01-01

    Approved for public release; distribution is unlimited. This thesis develops a heuristic approach to organizational decision-making by synthesizing the classical, neo-classical and contingency approaches to organization theory. The conceptual framework developed also integrates the rational and cybernetic approaches with cognitive processes underlying the decision-making process. The components of the approach address the role of environment in organizational decision-maki...

  11. Making PMT halftone prints

    Corey, J.D.

    1977-05-01

    In the printing process for technical reports presently used at Bendix Kansas City Division, photographs are reproduced by pasting up PMT halftone prints on the artwork originals. These originals are used to make positive-working plastic plates for offset lithography. Instructions for making good-quality halftone prints using Eastman Kodak's PMT materials and processes are given in this report. 14 figures.

  12. Essentials of cloud computing

    Chandrasekaran, K

    2014-01-01

    Foreword. Preface. Computing Paradigms: Learning Objectives; Preamble; High-Performance Computing; Parallel Computing; Distributed Computing; Cluster Computing; Grid Computing; Cloud Computing; Biocomputing; Mobile Computing; Quantum Computing; Optical Computing; Nanocomputing; Network Computing; Summary; Review Points; Review Questions; Further Reading. Cloud Computing Fundamentals: Learning Objectives; Preamble; Motivation for Cloud Computing; The Need for Cloud Computing; Defining Cloud Computing; NIST Definition of Cloud Computing; Cloud Computing Is a Service; Cloud Computing Is a Platform; 5-4-3 Principles of Cloud Computing; Five Essential Charact…

  13. Personal Computers.

    Toong, Hoo-min D.; Gupta, Amar

    1982-01-01

    Describes the hardware, software, applications, and current proliferation of personal computers (microcomputers). Includes discussions of microprocessors, memory, output (including printers), application programs, the microcomputer industry, and major microcomputer manufacturers (Apple, Radio Shack, Commodore, and IBM). (JN)

  14. Computational Literacy

    Chongtay, Rocio; Robering, Klaus

    2016-01-01

    In recent years, there has been a growing interest in and recognition of the importance of Computational Literacy, a skill generally considered to be necessary for success in the 21st century. While much research has concentrated on requirements, tools, and teaching methodologies for the acquisition of Computational Literacy at basic educational levels, focus on higher levels of education has been much less prominent. The present paper considers the case of courses for higher education programs within the Humanities. A model is proposed which conceives of Computational Literacy as a layered…

  15. Computing Religion

    Nielbo, Kristoffer Laigaard; Braxton, Donald M.; Upal, Afzal

    2012-01-01

    The computational approach has become an invaluable tool in many fields that are directly relevant to research in religious phenomena. Yet the use of computational tools is almost absent in the study of religion. Given that religion is a cluster of interrelated phenomena and that research concerning these phenomena should strive for multilevel analysis, this article argues that the computational approach offers new methodological and theoretical opportunities to the study of religion. We argue that the computational approach offers 1.) an intermediary step between any theoretical construct and its targeted empirical space and 2.) a new kind of data which allows the researcher to observe abstract constructs, estimate likely outcomes, and optimize empirical designs. Because sophisticated multilevel research is a collaborative project, we also seek to introduce to scholars of religion some…

  16. Computational Controversy

    Timmermans, Benjamin; Kuhn, Tobias; Beelen, Kaspar; Aroyo, Lora

    2017-01-01

    Climate change, vaccination, abortion, Trump: Many topics are surrounded by fierce controversies. The nature of such heated debates and their elements have been studied extensively in the social science literature. More recently, various computational approaches to controversy analysis have

  17. Grid Computing

    IAS Admin

    emergence of supercomputers led to the use of computer simulation as an… Scientific and engineering applications (e.g., TeraGrid secure gateway). Collaborative… Encryption, privacy, protection from malicious software. Physical Layer.

  18. Making Deformable Template Models Operational

    Fisker, Rune

    2000-01-01

    Deformable template models are a very popular and powerful tool within the field of image processing and computer vision. This thesis treats this type of models extensively, with special focus on handling their common difficulties, i.e. model parameter selection, initialization and optimization. A proper handling of the common difficulties is essential for making the models operational by a non-expert user, which is a requirement for intensifying and commercializing the use of deformable template models. One contribution is a method for estimation of the model parameters, which applies a combination of a maximum likelihood and a minimum distance criterion. Another contribution is a very fast search-based initialization algorithm using a filter interpretation of the likelihood model. These two methods can be applied to most deformable template models. The thesis is organized as a collection of the most important articles, which has been…

  19. Distributed Decision Making and Control

    Rantzer, Anders

    2012-01-01

    Distributed Decision Making and Control is a mathematical treatment of relevant problems in distributed control, decision and multiagent systems, The research reported was prompted by the recent rapid development in large-scale networked and embedded systems and communications. One of the main reasons for the growing complexity in such systems is the dynamics introduced by computation and communication delays. Reliability, predictability, and efficient utilization of processing power and network resources are central issues and the new theory and design methods presented here are needed to analyze and optimize the complex interactions that arise between controllers, plants and networks. The text also helps to meet requirements arising from industrial practice for a more systematic approach to the design of distributed control structures and corresponding information interfaces Theory for coordination of many different control units is closely related to economics and game theory network uses being dictated by...

  20. Computer tomographs

    Niedzwiedzki, M.

    1982-01-01

    Physical foundations and the developments in transmission and emission computer tomography are presented. On the basis of the available literature and private communications, a comparison is made of the various transmission tomographs. A new technique of emission computer tomography (ECT), unknown in Poland, is described. An evaluation of two methods of ECT, namely positron and single-photon emission tomography, is made. (author)

  1. Computational sustainability

    Kersting, Kristian; Morik, Katharina

    2016-01-01

    The book at hand gives an overview of the state of the art research in Computational Sustainability as well as case studies of different application scenarios. This covers topics such as renewable energy supply, energy storage and e-mobility, efficiency in data centers and networks, sustainable food and water supply, sustainable health, industrial production and quality, etc. The book describes computational methods and possible application scenarios.

  2. Computing farms

    Yeh, G.P.

    2000-01-01

    High-energy physics, nuclear physics, space sciences, and many other fields have large challenges in computing. In recent years, PCs have achieved performance comparable to the high-end UNIX workstations, at a small fraction of the price. We review the development and broad applications of commodity PCs as the solution to CPU needs, and look forward to the important and exciting future of large-scale PC computing

  3. Computational chemistry

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials, with and without the presence of gases. Computational chemistry has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  4. Exercises in molecular computing.

    Stojanovic, Milan N; Stefanovic, Darko; Rudchenko, Sergei

    2014-06-17

    modified such that a stem-loop closes onto the substrate recognition region, making it unavailable for the substrate and thus rendering the deoxyribozyme inactive. But a conformational change can then be induced by an input oligonucleotide, complementary to the loop, to open the stem, allow the substrate to bind, and allow its cleavage to proceed, which is eventually reported via fluorescence. In this Account, several designs of this form are reviewed, along with their application in the construction of large circuits that exhibited complex logical and temporal relationships between the inputs and the outputs. Intelligent (in the sense of being capable of nontrivial information processing) theranostic (therapy + diagnostic) applications have always been the ultimate motivation for developing computing (i.e., decision-making) circuits, and we review our experiments with logic-gate elements bound to cell surfaces that evaluate the proximal presence of multiple markers on lymphocytes.
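    The stem-loop mechanism described above behaves like a YES logic gate: the substrate is cleaved (and fluorescence reported) only when the input oligonucleotide opens the stem. As an illustration of the logic only, not the chemistry, a hypothetical sketch (all names invented):

```python
# Hypothetical model of deoxyribozyme logic gates: a gate is active only when
# an input oligonucleotide opens its stem-loop, freeing the substrate
# recognition region so cleavage (the fluorescent output) can proceed.

def yes_gate(input_oligo_present: bool) -> bool:
    """True (fluorescence) iff the input oligonucleotide opened the stem-loop."""
    stem_open = input_oligo_present   # input hybridizes with the loop
    substrate_bound = stem_open       # recognition region becomes available
    return substrate_bound            # cleavage proceeds -> fluorescent output

def and_gate(input_a: bool, input_b: bool) -> bool:
    """Two-input AND: both stem-loops must be opened for the gate to act."""
    return yes_gate(input_a) and yes_gate(input_b)

print(yes_gate(True))          # True  (fluorescence observed)
print(and_gate(True, False))   # False (one loop remains closed)
```

Larger circuits of the kind reviewed in the Account compose such gates so that outputs depend on complex logical combinations of the inputs.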

  5. Decision Making in Action

    Orasanu, Judith; Statler, Irving C. (Technical Monitor)

    1994-01-01

    The importance of decision-making to safety in complex, dynamic environments like mission control centers and offshore installations has been well established. NASA-ARC has a program of research dedicated to fostering safe and effective decision-making in the manned spaceflight environment. Because access to spaceflight is limited, environments with similar characteristics, including aviation and nuclear power plants, serve as analogs from which space-relevant data can be gathered and theories developed. Analyses of aviation accidents cite crew judgement and decision making as causes or contributing factors in over half of all accidents. A similar observation has been made in nuclear power plants. Yet laboratory research on decision making has not proven especially helpful in improving the quality of decisions in these kinds of environments. One reason is that the traditional, analytic decision models are inappropriate to multidimensional, high-risk environments, and do not accurately describe what expert human decision makers do when they make decisions that have consequences. A new model of dynamic, naturalistic decision making is offered that may prove useful for improving decision making in complex, isolated, confined and high-risk environments. Based on analyses of crew performance in full-mission simulators and accident reports, features that define effective decision strategies in abnormal or emergency situations have been identified. These include accurate situation assessment (including time and risk assessment), appreciation of the complexity of the problem, sensitivity to constraints on the decision, timeliness of the response, and use of adequate information. More effective crews also manage their workload to provide themselves with time and resources to make good decisions. In brief, good decisions are appropriate to the demands of the situation. Effective crew decision making and overall performance are mediated by crew communication. Communication

  6. The challenge of computer mathematics.

    Barendregt, Henk; Wiedijk, Freek

    2005-10-15

    Progress in the foundations of mathematics has made it possible to formulate all thinkable mathematical concepts, algorithms and proofs in one language and in an impeccable way. This is not in spite of, but partially based on the famous results of Gödel and Turing. In this way statements are about mathematical objects and algorithms, proofs show the correctness of statements and computations, and computations are dealing with objects and proofs. Interactive computer systems for a full integration of defining, computing and proving are based on this. The human defines concepts, constructs algorithms and provides proofs, while the machine checks that the definitions are well formed and the proofs and computations are correct. Results formalized so far demonstrate the feasibility of this 'computer mathematics'. Also there are very good applications. The challenge is to make the systems more mathematician-friendly, by building libraries and tools. The eventual goal is to help humans to learn, develop, communicate, referee and apply mathematics.
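    The division of labour described here, where the human supplies definitions and proofs and the machine checks them, can be seen in any modern proof assistant. A minimal illustration in Lean 4 syntax (a system chosen here for illustration, not one discussed in the abstract):

```lean
-- The human states the theorem and supplies a proof term; the proof
-- assistant checks that the statement is well formed and that the proof
-- actually establishes the claim.
theorem add_comm_example (m n : Nat) : m + n = n + m :=
  Nat.add_comm m n
```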

  7. Studi Perbandingan Layanan Cloud Computing

    Afdhal Afdhal

    2014-03-01

    Full Text Available In the past few years, cloud computing has become a dominant topic in the IT area. Cloud computing offers hardware, infrastructure, platform and applications without requiring end-users' knowledge of the physical location and configuration of the providers who deliver the services. It has been a good solution to increase reliability, reduce computing cost, and create opportunities for IT industries to gain more advantages. The purpose of this article is to present a better understanding of cloud delivery services, their correlation and inter-dependency. This article compares and contrasts the different levels of delivery services and the development models, identifies issues, and suggests future directions in cloud computing. End-users' comprehension of the cloud computing delivery service classification will equip them with the knowledge to determine and decide which business model to choose and adopt securely and comfortably. The last part of this article provides several recommendations for cloud computing service providers and end-users.

  8. Computational neuroscience a first course

    Mallot, Hanspeter A

    2013-01-01

    Computational Neuroscience - A First Course provides an essential introduction to computational neuroscience and  equips readers with a fundamental understanding of modeling the nervous system at the membrane, cellular, and network level. The book, which grew out of a lecture series held regularly for more than ten years to graduate students in neuroscience with backgrounds in biology, psychology and medicine, takes its readers on a journey through three fundamental domains of computational neuroscience: membrane biophysics, systems theory and artificial neural networks. The required mathematical concepts are kept as intuitive and simple as possible throughout the book, making it fully accessible to readers who are less familiar with mathematics. Overall, Computational Neuroscience - A First Course represents an essential reference guide for all neuroscientists who use computational methods in their daily work, as well as for any theoretical scientist approaching the field of computational neuroscience.

  9. Making Media Studies

    David Gauntlett

    2015-12-01

    Full Text Available This podcast is a recording of a research seminar that took place on December 3, 2015, at the University of Westminster's Communication and Media Research Institute (CAMRI). In this contribution, David Gauntlett discusses his new book, Making Media Studies, and other new work. In Making Media Studies (Peter Lang, 2015), Gauntlett proposes a vision of media studies based around doing and making – not about the acquisition of skills, as such, but an experience of building knowledge and understanding through creative hands-on engagement with all kinds of media. Gauntlett suggests that media studies scholars have failed to recognise the significance of everyday creativity – the vital drive of people to make, exchange, and learn together, supported by online networks. He argues that we should think about media in terms of conversations, inspirations, and making things happen. Media studies can be about genuine social change, he suggests, if we recognise the significance of everyday creativity, work to transform our tools, and learn to use them wisely. David Gauntlett is a Professor in the School of Media, Arts and Design at the University of Westminster, where he is also the School's Co-Director of Research. He is the author of several books, including: Creative Explorations (2007), Media, Gender and Identity: An Introduction (2nd edition, 2008), Making is Connecting (2011), and Making Media Studies (2015). He has made a number of popular online resources, videos and playthings, and has pioneered creative research and workshop methods. He is external examiner for Information Experience Design at the Royal College of Art, London.

  10. Computational creativity

    López de Mántaras Badia, Ramon

    2013-12-01

    Full Text Available New technologies, and in particular artificial intelligence, are drastically changing the nature of creative processes. Computers are playing very significant roles in creative activities such as music, architecture, fine arts, and science. Indeed, the computer is already a canvas, a brush, a musical instrument, and so on. However, we believe that we must aim at more ambitious relations between computers and creativity. Rather than just seeing the computer as a tool to help human creators, we could see it as a creative entity in its own right. This view has triggered a new subfield of Artificial Intelligence called Computational Creativity. This article addresses the question of the possibility of achieving computational creativity through some examples of computer programs capable of replicating some aspects of creative behavior in the fields of music and science.

  11. Making Our Food Safe

    Madsen, Michael

    2013-01-01

    Full text: As civilization has progressed, societies have strived to make food safer; from using fire to cook our food to boiling our water to make it safe to drink, advances in technology have helped kill microorganisms that can make food unsafe. The FAO/IAEA Joint Division helps provide technical assistance to Member States that want to implement irradiation technology in making their food safer. Food- and waterborne diarrhoeal diseases are estimated to kill roughly 2.2 million people annually, of which 1.9 million are children. Irradiating some of the foods we eat can save many of these lives by reducing the risk of food poisoning and killing the organisms that cause disease. Irradiation works by treating food with a small dose of ionizing radiation; this radiation disrupts the bacteria's DNA and cell membrane structure, stopping the organism from reproducing or functioning, but does not make the food radioactive. It can be applied to a variety of foods, from spices and seasonings to fruits and vegetables, and is similar to pasteurization, but without the need for high temperatures that might impair food quality. (author)

  12. What Makes an Object Memorable?

    Dubey, Rachit

    2016-02-19

    Recent studies on image memorability have shed light on what distinguishes the memorability of different images and the intrinsic and extrinsic properties that make those images memorable. However, a clear understanding of the memorability of specific objects inside an image remains elusive. In this paper, we provide the first attempt to answer the question: what exactly is remembered about an image? We augment both the images and object segmentations from the PASCAL-S dataset with ground truth memorability scores and shed light on the various factors and properties that make an object memorable (or forgettable) to humans. We analyze various visual factors that may influence object memorability (e.g. color, visual saliency, and object categories). We also study the correlation between object and image memorability and find that image memorability is greatly affected by the memorability of its most memorable object. Lastly, we explore the effectiveness of deep learning and other computational approaches in predicting object memorability in images. Our efforts offer a deeper understanding of memorability in general thereby opening up avenues for a wide variety of applications. © 2015 IEEE.
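    The reported dependence of image memorability on the image's most memorable object can be illustrated with a toy correlation check; the scores below are invented for illustration, not taken from the PASCAL-S annotations:

```python
# Toy illustration (invented scores): correlate each image's memorability
# with the score of its single most memorable object.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Each entry: (image memorability, [memorability of each object in the image])
images = [(0.82, [0.90, 0.40]), (0.55, [0.60, 0.20]),
          (0.30, [0.35, 0.10]), (0.75, [0.80, 0.50])]

image_scores = [img for img, _ in images]
top_object_scores = [max(objs) for _, objs in images]

r = pearson(image_scores, top_object_scores)
print(round(r, 3))  # strong positive correlation on this toy data
```

On a real dataset the same comparison would use the ground-truth memorability scores the paper adds to PASCAL-S.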

  13. What Makes an Object Memorable?

    Dubey, Rachit; Peterson, Joshua; Khosla, Aditya; Yang, Ming-Hsuan; Ghanem, Bernard

    2016-01-01

    Recent studies on image memorability have shed light on what distinguishes the memorability of different images and the intrinsic and extrinsic properties that make those images memorable. However, a clear understanding of the memorability of specific objects inside an image remains elusive. In this paper, we provide the first attempt to answer the question: what exactly is remembered about an image? We augment both the images and object segmentations from the PASCAL-S dataset with ground truth memorability scores and shed light on the various factors and properties that make an object memorable (or forgettable) to humans. We analyze various visual factors that may influence object memorability (e.g. color, visual saliency, and object categories). We also study the correlation between object and image memorability and find that image memorability is greatly affected by the memorability of its most memorable object. Lastly, we explore the effectiveness of deep learning and other computational approaches in predicting object memorability in images. Our efforts offer a deeper understanding of memorability in general thereby opening up avenues for a wide variety of applications. © 2015 IEEE.

  14. What makes virtual agents believable?

    Bogdanovych, Anton; Trescak, Tomas; Simoff, Simeon

    2016-01-01

    In this paper we investigate the concept of believability and make an attempt to isolate individual characteristics (features) that contribute to making virtual characters believable. As the result of this investigation we have produced a formalisation of believability and, based on this formalisation, built a computational framework focused on simulation of believable virtual agents that possess the identified features. In order to test whether the identified features are, in fact, responsible for agents being perceived as more believable, we have conducted a user study. In this study we tested user reactions towards the virtual characters that were created for a simulation of aboriginal inhabitants of a particular area of Sydney, Australia in 1770 A.D. The participants of our user study were exposed to short simulated scenes, in which virtual agents performed some behaviour in two different ways (while possessing a certain aspect of believability vs. not possessing it). The results of the study indicate that virtual agents that appear resource-bounded; are aware of their environment, their own interaction capabilities, and their state in the world; can adapt to changes in the environment; and exist in a correct social context are those perceived as more believable. Further in the paper we discuss these and other believability features and provide a quantitative analysis of the level of contribution of each such feature to the overall perceived believability of a virtual agent.

  15. Computational Physics' Greatest Hits

    Bug, Amy

    2011-03-01

    The digital computer has worked its way so effectively into our profession that now, roughly 65 years after its invention, it is virtually impossible to find a field of experimental or theoretical physics unaided by computational innovation. It is tough to think of another device about which one can make that claim. In the session "What is computational physics?" speakers will distinguish computation within the field of computational physics from this ubiquitous importance across all subfields of physics. This talk will recap the invited session "Great Advances...Past, Present and Future" in which five dramatic areas of discovery (five of our "greatest hits") are chronicled: the physics of many-boson systems via Path Integral Monte Carlo, the thermodynamic behavior of a huge number of diverse systems via Monte Carlo methods, the discovery of new pharmaceutical agents via molecular dynamics, predictive simulations of global climate change via detailed, cross-disciplinary earth system models, and an understanding of the formation of the first structures in our universe via galaxy formation simulations. The talk will also identify "greatest hits" in our field from the teaching and research perspectives of other members of DCOMP, including its Executive Committee.

  16. Scientific computer simulation review

    Kaizer, Joshua S.; Heller, A. Kevin; Oberkampf, William L.

    2015-01-01

    Before the results of a scientific computer simulation are used for any purpose, it should be determined if those results can be trusted. Answering that question of trust is the domain of scientific computer simulation review. There is limited literature that focuses on simulation review, and most is specific to the review of a particular type of simulation. This work is intended to provide a foundation for a common understanding of simulation review. This is accomplished through three contributions. First, scientific computer simulation review is formally defined. This definition identifies the scope of simulation review and provides the boundaries of the review process. Second, maturity assessment theory is developed. This development clarifies the concepts of maturity criteria, maturity assessment sets, and maturity assessment frameworks, which are essential for performing simulation review. Finally, simulation review is described as the application of a maturity assessment framework. This is illustrated through evaluating a simulation review performed by the U.S. Nuclear Regulatory Commission. In making these contributions, this work provides a means for a more objective assessment of a simulation’s trustworthiness and takes the next step in establishing scientific computer simulation review as its own field. - Highlights: • We define scientific computer simulation review. • We develop maturity assessment theory. • We formally define a maturity assessment framework. • We describe simulation review as the application of a maturity framework. • We provide an example of a simulation review using a maturity framework
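    The maturity-assessment vocabulary above (criteria, assessment sets, frameworks) lends itself to a small data-structure sketch. The criteria and level names below are hypothetical, chosen only to show the shape of such a framework, not those of the cited NRC review:

```python
# Hypothetical sketch of a maturity assessment framework: each criterion
# defines an ordered set of maturity levels, and a review maps one
# simulation's evidence onto a level for every criterion.
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str
    levels: list[str]  # ordered, lowest to highest maturity

    def assess(self, level: str) -> int:
        """Return the numeric maturity index of an assessed level."""
        return self.levels.index(level)

# A maturity assessment set: the criteria reviewed together (names invented).
framework = [
    Criterion("code verification", ["ad hoc", "benchmarked", "order-verified"]),
    Criterion("validation", ["none", "separate effects", "integral tests"]),
    Criterion("uncertainty quantification", ["none", "sensitivity", "full UQ"]),
]

# A review of one simulation assigns each criterion an assessed level.
review = {"code verification": "benchmarked",
          "validation": "integral tests",
          "uncertainty quantification": "sensitivity"}

# Trustworthiness is judged from the per-criterion maturity profile,
# not from a single aggregate number.
scores = {c.name: c.assess(review[c.name]) for c in framework}
print(scores)
```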

  17. Computer Vision for Timber Harvesting

    Dahl, Anders Lindbjerg

    The goal of this thesis is to investigate computer vision methods for timber harvesting operations. The background for developing computer vision for timber harvesting is to document origin of timber and to collect qualitative and quantitative parameters concerning the timber for efficient harvest...... segments. The purpose of image segmentation is to make the basis for more advanced computer vision methods like object recognition and classification. Our second method concerns image classification and we present a method where we classify small timber samples to tree species based on Active Appearance...... to the development of the logTracker system the described methods have a general applicability making them useful for many other computer vision problems....

  18. Computer tomographic diagnosis of echinococcosis

    Haertel, M.; Fretz, C.; Fuchs, W.A.

    1980-08-01

    The computer tomographic appearances and differential diagnosis in 22 patients with echinococcosis are described; of these, twelve were of the cystic and ten of the alveolar type. The computer tomographic appearances are characterised by the presence of daughter cysts (66%) within the sharply demarcated parasitic cyst of water density. In the absence of daughter cysts, a definite aetiological diagnosis cannot be made, although there is a tendency to calcification of the occasionally multiple echinococcus cysts. The computer tomographic appearances of advanced alveolar echinococcosis are characterised by partial colliquative necrosis, with calcification around the necrotic areas (90%). The absence of CT evidence of partial necrosis and calcification of the pseudotumour makes it difficult to establish a specific diagnosis. The conclusive and non-invasive character of the procedure and its reproducibility make computer tomography the method of choice for the diagnosis and follow-up of echinococcosis.

  19. Ethical Decision Making

    Lauesen, Linne Marie

    2012-01-01

    of the interaction between a corporation and its stakeholders. Methodology/approach: This paper offers a theoretical 'Organic Stakeholder Model' based on decision making theory, risk assessment and adaption to a rapidly changing world combined with appropriate stakeholder theory for ethical purposes in decision...... applicable): The Model is based on case studies, but the limited scope of the length of the paper did not leave room to show the empirical evidence, but only the theoretical study. Originality / value of a paper: The model offers a new way of combining risk management with ethical decision-making processes...... by the inclusion of multiple stakeholders. The conceptualization of the model enhances business ethics in decision making by managing and balancing stakeholder concerns with the same concerns as the traditional risk management models does – for the sake of the wider social responsibilities of the businesses...

  20. Making fictions sound real

    Langkjær, Birger

    2010-01-01

    This article examines the role that sound plays in making fictions perceptually real to film audiences, whether these fictions are realist or non-realist in content and narrative form. I will argue that some aspects of film sound practices and the kind of experiences they trigger are related...... to basic rules of human perception, whereas others are more properly explained in relation to how aesthetic devices, including sound, are used to characterise the fiction and thereby make it perceptually real to its audience. Finally, I will argue that not all genres can be defined by a simple taxonomy...... of sounds. Apart from an account of the kinds of sounds that typically appear in a specific genre, a genre analysis of sound may also benefit from a functionalist approach that focuses on how sounds can make both realist and non-realist aspects of genres sound real to audiences....

  1. Making Science Work.

    Thomas, Lewis

    1981-01-01

    Presents a viewpoint concerning the impact of recent scientific advances on society. Discusses biological discoveries, space exploration, computer technology, development of new astronomical theories, the behavioral sciences, and basic research. Challenges to keeping science current with technological advancement are also discussed. (DS)

  2. Event Prediction for Modeling Mental Simulation in Naturalistic Decision Making

    Kunde, Dietmar

    2005-01-01

    ... and increasingly important asymmetric warfare scenarios. Although improvements in computer technology support more and more detailed representations, human decision making is still far from being automated in a realistic way...

  3. Single versus Multiple Objective(s) Decision Making: An Application ...

    Rahel

    rather than exception in many real life decision-making circumstances. For example ...... stakeholders' relative importance of various attributes in the utility function. (Steuer 1986). ..... Multiple Criteria Optimization: Theory, Computation and.

  4. Intersubjective meaning making

    Davidsen, Jacob

    of single-touch screen interaction among 8-9 year-old children presented here, shows that while the constraints of single-touch screens do not support equality of interaction at the verbal and the physical level, there seems to be an intersubjective learning outcome. More precisely, the constraints...... of single-touch screens offer support for intersubjective meaning making in their ability to constrain the interaction. By presenting a short embodied interaction analysis of 22 seconds of collaboration, I illustrate how an embodied interaction perspective on intersubjective meaning making can tell...... a different story about touch-screen supported collaborative learning....

  5. Emotion and decision making.

    Lerner, Jennifer S; Li, Ye; Valdesolo, Piercarlo; Kassam, Karim S

    2015-01-03

    A revolution in the science of emotion has emerged in recent decades, with the potential to create a paradigm shift in decision theories. The research reveals that emotions constitute potent, pervasive, predictable, sometimes harmful and sometimes beneficial drivers of decision making. Across different domains, important regularities appear in the mechanisms through which emotions influence judgments and choices. We organize and analyze what has been learned from the past 35 years of work on emotion and decision making. In so doing, we propose the emotion-imbued choice model, which accounts for inputs from traditional rational choice theory and from newer emotion research, synthesizing scientific models.

  6. Quantum computing

    Steane, Andrew

    1998-01-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. 
The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from
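    The review grounds quantum information in classical information theory, beginning with Shannon's theorem. As a small aside not drawn from the review itself, the central quantity there, the Shannon entropy of a probability distribution, is straightforward to compute (the function name below is illustrative):

    ```python
    import math

    def shannon_entropy(probs):
        """H(X) = -sum(p * log2(p)): the average number of bits
        needed to encode outcomes drawn from this distribution."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # A fair coin carries exactly one bit of information per toss.
    print(shannon_entropy([0.5, 0.5]))  # 1.0
    # A biased coin is more predictable, hence carries less.
    print(shannon_entropy([0.9, 0.1]))  # about 0.47
    ```

    Quantum information theory generalises this quantity to the von Neumann entropy of a density matrix, which is where the review's discussion of entanglement as a resource begins.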

  7. Quantum computing

    Steane, Andrew [Department of Atomic and Laser Physics, University of Oxford, Clarendon Laboratory, Oxford (United Kingdom)

    1998-02-01

    The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This review aims to summarize not just quantum computing, but the whole subject of quantum information theory. Information can be identified as the most general thing which must propagate from a cause to an effect. It therefore has a fundamentally important role in the science of physics. However, the mathematical treatment of information, especially information processing, is quite recent, dating from the mid-20th century. This has meant that the full significance of information as a basic concept in physics is only now being discovered. This is especially true in quantum mechanics. The theory of quantum information and computing puts this significance on a firm footing, and has led to some profound and exciting new insights into the natural world. Among these are the use of quantum states to permit the secure transmission of classical information (quantum cryptography), the use of quantum entanglement to permit reliable transmission of quantum states (teleportation), the possibility of preserving quantum coherence in the presence of irreversible noise processes (quantum error correction), and the use of controlled quantum evolution for efficient computation (quantum computation). The common theme of all these insights is the use of quantum entanglement as a computational resource. It turns out that information theory and quantum mechanics fit together very well. In order to explain their relationship, this review begins with an introduction to classical information theory and computer science, including Shannon's theorem, error correcting codes, Turing machines and computational complexity. The principles of quantum mechanics are then outlined, and the Einstein, Podolsky and Rosen (EPR) experiment described. 
The EPR-Bell correlations, and quantum entanglement in general, form the essential new ingredient which distinguishes quantum from

  8. Challenges and Security in Cloud Computing

    Chang, Hyokyung; Choi, Euiin

    People want problems solved as soon as they arise. Ubiquitous computing is an IT technology intended to make such situations easier to handle, and cloud computing is a technology that makes it even better and more powerful. Cloud computing, however, is still at an early stage of implementation and use, and it faces many technical challenges and security issues. This paper looks at cloud computing security.

  9. Multiparty Computations

    Dziembowski, Stefan

    here and discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and the proofs of their security are given. [1]Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority......In this thesis we study a problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel is available. The adversary is active, adaptive and has an unbounded computing power. The thesis is based on two...... to a polynomial time black-box reduction, the complexity of adaptively secure VSS is the same as that of ordinary secret sharing (SS), where security is only required against a passive, static adversary. Previously, such a connection was only known for linear secret sharing and VSS schemes. We then show...
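    The thesis concerns verifiable secret sharing and multiparty computation. As a toy illustration of the simpler, non-verifiable primitive these build on (this sketch is invented for illustration and omits verifiability, broadcast, and the adversary models the thesis studies), additive secret sharing splits a value into n shares so that all n are needed to reconstruct it, while any n-1 shares reveal nothing:

    ```python
    import random

    MODULUS = 2**31 - 1  # a prime; all arithmetic is done modulo it

    def share(secret, n):
        """Split `secret` into n additive shares; any n-1 look uniformly random."""
        shares = [random.randrange(MODULUS) for _ in range(n - 1)]
        # Last share is chosen so that all shares sum to the secret mod MODULUS.
        shares.append((secret - sum(shares)) % MODULUS)
        return shares

    def reconstruct(shares):
        """Recover the secret by summing all shares mod MODULUS."""
        return sum(shares) % MODULUS

    print(reconstruct(share(42, 5)))  # 42
    ```

    Verifiable schemes add commitments so that players can detect incorrect shares, which is precisely the gap between ordinary SS and VSS that the thesis analyses.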

  10. Scientific computing

    Trangenstein, John A

    2017-01-01

    This is the third of three volumes providing a comprehensive presentation of the fundamentals of scientific computing. This volume discusses topics that depend more on calculus than linear algebra, in order to prepare the reader for solving differential equations. This book and its companions show how to determine the quality of computational results, and how to measure the relative efficiency of competing methods. Readers learn how to determine the maximum attainable accuracy of algorithms, and how to select the best method for computing problems. This book also discusses programming in several languages, including C++, Fortran and MATLAB. There are 90 examples, 200 exercises, 36 algorithms, 40 interactive JavaScript programs, 91 references to software programs and 1 case study. Topics are introduced with goals, literature references and links to public software. There are descriptions of the current algorithms in GSLIB and MATLAB. This book could be used for a second course in numerical methods, for either ...
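    One of the book's stated goals is teaching readers to determine the maximum attainable accuracy of algorithms. A small, standard illustration of that idea (not taken from the book) is computing double-precision machine epsilon, the unit roundoff that bounds relative error in floating-point arithmetic:

    ```python
    def machine_epsilon():
        """Find the smallest power of two eps such that 1.0 + eps != 1.0
        in double precision, by repeated halving."""
        eps = 1.0
        while 1.0 + eps / 2 != 1.0:
            eps /= 2
        return eps

    print(machine_epsilon())  # 2.220446049250313e-16, i.e. 2**-52
    ```

    No algorithm operating on doubles can reliably deliver relative accuracy better than this, which is why the book's error analyses are phrased in terms of it.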

  11. Computational Psychiatry

    Wang, Xiao-Jing; Krystal, John H.

    2014-01-01

    Psychiatric disorders such as autism and schizophrenia arise from abnormalities in brain systems that underlie cognitive, emotional and social functions. The brain is enormously complex and its abundant feedback loops on multiple scales preclude intuitive explication of circuit functions. In close interplay with experiments, theory and computational modeling are essential for understanding how, precisely, neural circuits generate flexible behaviors and their impairments give rise to psychiatric symptoms. This Perspective highlights recent progress in applying computational neuroscience to the study of mental disorders. We outline basic approaches, including identification of core deficits that cut across disease categories, biologically realistic modeling bridging cellular and synaptic mechanisms with behavior, and model-aided diagnosis. The need for new research strategies in psychiatry is urgent. Computational psychiatry potentially provides powerful tools for elucidating pathophysiology that may inform both diagnosis and treatment. To achieve this promise will require investment in cross-disciplinary training and research in this nascent field. PMID:25442941

  12. CHPS IN CLOUD COMPUTING ENVIRONMENT

    K.L.Giridas; A.Shajin Nargunam

    2012-01-01

    Workflows have been utilized to characterize various forms of applications with high processing and storage-space demands. So, to make the cloud computing environment more eco-friendly, our research project aimed at reducing the e-waste accumulated by computers. In a hybrid cloud, the user has the flexibility offered by public cloud resources, which can be combined with the private resource pool as required. Our previous work described the process of combining the low range and mid range proce...

  13. Dynamic decision making without expected utility

    Nielsen, Thomas Dyhre; Jaffray, Jean-Yves

    2006-01-01

    Non-expected utility theories, such as rank dependent utility (RDU) theory, have been proposed as alternative models to EU theory in decision making under risk. These models do not share the separability property of expected utility theory. This implies that, in a decision tree, if the reduction...... maker’s discordant goals at the different decision nodes. Relative to the computations involved in the standard expected utility evaluation of a decision problem, the main computational increase is due to the identification of non-dominated strategies by linear programming. A simulation, using the rank...

  14. Computational artifacts

    Schmidt, Kjeld; Bansler, Jørgen P.

    2016-01-01

    The key concern of CSCW research is that of understanding computing technologies in the social context of their use, that is, as integral features of our practices and our lives, and to think of their design and implementation under that perspective. However, the question of the nature...... of that which is actually integrated in our practices is often discussed in confusing ways, if at all. The article aims to try to clarify the issue and in doing so revisits and reconsiders the notion of ‘computational artifact’....

  15. Computer security

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  16. Cloud Computing

    Antonopoulos, Nick

    2010-01-01

    Cloud computing has recently emerged as a subject of substantial industrial and academic interest, though its meaning and scope is hotly debated. For some researchers, clouds are a natural evolution towards the full commercialisation of grid systems, while others dismiss the term as a mere re-branding of existing pay-per-use technologies. From either perspective, 'cloud' is now the label of choice for accountable pay-per-use access to third party applications and computational resources on a massive scale. Clouds support patterns of less predictable resource use for applications and services a

  17. Computational Logistics

    Pacino, Dario; Voss, Stefan; Jensen, Rune Møller

    2013-01-01

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in to...... in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.......This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized...

  18. Computational Logistics

    This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized in to...... in topical sections named: maritime shipping, road transport, vehicle routing problems, aviation applications, and logistics and supply chain management.......This book constitutes the refereed proceedings of the 4th International Conference on Computational Logistics, ICCL 2013, held in Copenhagen, Denmark, in September 2013. The 19 papers presented in this volume were carefully reviewed and selected for inclusion in the book. They are organized...

  19. Computational engineering

    2014-01-01

    The book presents state-of-the-art works in computational engineering. Focus is on mathematical modeling, numerical simulation, experimental validation and visualization in engineering sciences. In particular, the following topics are presented: constitutive models and their implementation into finite element codes, numerical models in nonlinear elasto-dynamics including seismic excitations, multiphase models in structural engineering and multiscale models of materials systems, sensitivity and reliability analysis of engineering structures, the application of scientific computing in urban water management and hydraulic engineering, and the application of genetic algorithms for the registration of laser scanner point clouds.

  20. Computer busses

    Buchanan, William

    2000-01-01

    As more and more equipment is interface or 'bus' driven, either by the use of controllers or directly from PCs, the question of which bus to use is becoming increasingly important both in industry and in the office. 'Computer Busses' has been designed to help choose the best type of bus for the particular application. There are several books which cover individual busses, but none which provide a complete guide to computer busses. The author provides a basic theory of busses and draws examples and applications from real bus case studies. Busses are analysed using a top-down approach, helpin

  1. A queueing model of pilot decision making in a multi-task flight management situation

    Walden, R. S.; Rouse, W. B.

    1977-01-01

    Allocation of decision making responsibility between pilot and computer is considered and a flight management task, designed for the study of pilot-computer interaction, is discussed. A queueing theory model of pilot decision making in this multi-task, control and monitoring situation is presented. An experimental investigation of pilot decision making and the resulting model parameters are discussed.
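    The record does not give the model's details, but the flavour of a queueing-theoretic description of operator workload can be conveyed with the standard M/M/1 formulas (a simplifying sketch under textbook assumptions; the paper's actual multi-task model may differ):

    ```python
    def mm1_metrics(arrival_rate, service_rate):
        """Steady-state metrics of an M/M/1 queue:
        Poisson task arrivals, exponentially distributed service times,
        a single server (here, the pilot)."""
        rho = arrival_rate / service_rate  # utilisation: fraction of time busy
        assert rho < 1, "queue is unstable: tasks arrive faster than they are served"
        L = rho / (1 - rho)                      # mean number of tasks in system
        W = 1 / (service_rate - arrival_rate)    # mean time a task spends in system
        return rho, L, W

    # Tasks arriving at 2/min, handled at 3/min: the pilot is busy 2/3 of the time,
    # with 2 tasks pending on average, each waiting 1 minute in total.
    print(mm1_metrics(2.0, 3.0))
    ```

    Offloading tasks to the computer lowers the effective arrival rate at the pilot, which is exactly the allocation trade-off the study investigates.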

  2. Making Invisible Histories Visible

    Hanssen, Ana Maria

    2012-01-01

    This article features Omaha Public Schools' "Making Invisible Histories Visible" program, or MIHV. Omaha's schools have a low failure rate among 8th graders but a high one among high school freshmen. MIHV was created to help at-risk students "adjust to the increased demands of high school." By working alongside teachers and…

  3. In the making

    2005-01-01

    disciplines and includes other research areas with common interest in how people shape and make sense of things in an increasingly man-made world. The conference directs its interest towards the diversity, challenges, emerging practices and understanding of design. Rather than searching for common definitions...

  4. Strategic decision making

    Stokman, Frans N.; Assen, Marcel A.L.M. van; Knoop, Jelle van der; Oosten, Reinier C.H. van

    2000-01-01

    This paper introduces a methodology for strategic intervention in collective decision making.The methodology is based on (1) a decomposition of the problem into a few main controversial issues, (2) systematic interviews of subject area specialists to obtain a specification of the decision

  5. Making Room for Ethics

    Douglas-Jones, Rachel

    2017-01-01

    This article examines the work that goes in to ‘making room’ for ethics, literally and figuratively. It follows the activities of a capacity building Asia-Pacific NGO in training and recognising ethics review committees, using multi-sited field materials collected over 12 months between 2009...

  6. Making Deferred Taxes Relevant

    Brouwer, Arjan; Naarding, Ewout

    2018-01-01

    We analyse the conceptual problems in current accounting for deferred taxes and provide solutions derived from the literature in order to make International Financial Reporting Standards (IFRS) deferred tax numbers value-relevant. In our view, the empirical results concerning the value relevance of

  7. Time in the Making

    Dirckinck-Holmfeld, Katrine Remmen

    ? These are research questions Katrine Dirckinck-Holmfeld explores in the artistic research project Time in the Making: Rehearsing Reparative Critical Practices. Through the development of video installations Leap into Colour (2012 - 2015) and movement (2012) and in dialogue with the work of artists Rania & Raed...

  8. Repeated Causal Decision Making

    Hagmayer, York; Meder, Bjorn

    2013-01-01

    Many of our decisions refer to actions that have a causal impact on the external environment. Such actions may not only allow for the mere learning of expected values or utilities but also for acquiring knowledge about the causal structure of our world. We used a repeated decision-making paradigm to examine what kind of knowledge people acquire in…

  9. What Makes Organization?

    Boll, Karen

    This article investigates a segmentation model used by the Danish Tax and Customs Administration to classify businesses’ motivational postures. The article uses two different conceptualizations of performativity to analyze what the model’s segmentations do; Hacking’s idea of making up people...

  10. Making Choices, Setting Goals

    Skinner, Timothy

    2013-01-01

    Diabetes management and education is very important. The way information is provided influences people's behaviours and thus outcomes. The way information is presented can increase or reduce the individual's ability to make informed decisions about their treatment and influences whether they acti...

  11. What Makes Clusters Decline?

    Østergaard, Christian Richter; Park, Eun Kyung

    2015-01-01

    Most studies on regional clusters focus on identifying factors and processes that make clusters grow. However, sometimes technologies and market conditions suddenly shift, and clusters decline. This paper analyses the process of decline of the wireless communication cluster in Denmark. The longit...... but being quick to withdraw in times of crisis....

  12. Making cocoa origin traceable

    Acierno, Valentina; Alewijn, Martin; Zomer, Paul; Ruth, van Saskia M.

    2018-01-01

    More and more attention is paid to sustainability in the cocoa production. Tools that assist in making sustainable cocoa traceable are therefore welcome. In the present study, the applicability of Flow Infusion-Electrospray Ionization- Mass Spectrometry (FI-ESI-MS) to assess the geographical origin

  13. Judgment and decision making.

    Fischhoff, Baruch

    2010-09-01

    The study of judgment and decision making entails three interrelated forms of research: (1) normative analysis, identifying the best courses of action, given decision makers' values; (2) descriptive studies, examining actual behavior in terms comparable to the normative analyses; and (3) prescriptive interventions, helping individuals to make better choices, bridging the gap between the normative ideal and the descriptive reality. The research is grounded in analytical foundations shared by economics, psychology, philosophy, and management science. Those foundations provide a framework for accommodating affective and social factors that shape and complement the cognitive processes of decision making. The decision sciences have grown through applications requiring collaboration with subject matter experts, familiar with the substance of the choices and the opportunities for interventions. Over the past half century, the field has shifted its emphasis from predicting choices, which can be successful without theoretical insight, to understanding the processes shaping them. Those processes are often revealed through biases that suggest non-normative processes. The practical importance of these biases depends on the sensitivity of specific decisions and the support that individuals have in making them. As a result, the field offers no simple summary of individuals' competence as decision makers, but a suite of theories and methods suited to capturing these sensitivities. Copyright © 2010 John Wiley & Sons, Ltd.

  14. Making a Quit Plan

    Quitting starts now. Make a plan. A quit plan boosts your chances of success. Build a quit plan to get ready and find out what to ...

  15. Making media public

    Mollerup, Nina Grønlykke; Gaber, Sherief

    2015-01-01

    This article focuses on two related street screening initiatives, Tahrir Cinema and Kazeboon, which took place in Egypt mainly between 2011 and 2013. Based on long-term ethnographic studies and activist work, we explore street screenings as place-making and describe how participants at street scr...

  16. What makes workers happy?

    van der Meer, P.H.; Wielers, R.J.J.

    2013-01-01

    This article answers the question what makes workers happy? It does so by combining insights from micro-economics, sociology and psychology. Basis is the standard utility function of a worker that includes income and hours of work and is elaborated with job characteristics. In this way it is

  17. MULTICRITERIA DECISION-MAKING

    HENDRIKS, MMWB; DEBOER, JH; SMILDE, AK; DOORNBOS, DA

    1992-01-01

    Interest is growing in multicriteria decision making (MCDM) techniques and a large number of these techniques are now available. The purpose of this tutorial is to give a theoretical description of some of the MCDM techniques. Besides this we will give an overview of the differences and similarities
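    The tutorial surveys many MCDM techniques; the simplest of them, the weighted-sum model, can be sketched in a few lines (the alternative names, criteria, and weights below are invented for illustration, not drawn from the tutorial):

    ```python
    def weighted_sum(alternatives, weights):
        """Score each alternative by the weighted sum of its criteria values.
        Criteria are assumed pre-normalised so that higher is always better."""
        return {name: sum(w * v for w, v in zip(weights, values))
                for name, values in alternatives.items()}

    # Hypothetical criteria: efficacy, cost (rescaled so higher = cheaper), safety.
    alts = {"option_a": [0.9, 0.4, 0.7],
            "option_b": [0.6, 0.9, 0.8]}
    scores = weighted_sum(alts, [0.5, 0.2, 0.3])
    print(max(scores, key=scores.get))  # the preferred alternative under these weights
    ```

    More elaborate techniques covered in such tutorials (outranking, reference-point methods) differ mainly in how they aggregate the criteria, but all start from a decision matrix like `alts` above.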

  18. Making students' frames explicit

    Nielsen, Louise Møller; Hansen, Poul Henrik Kyvsgaard

    2016-01-01

    Framing is a vital part of the design and innovation process. Frames are cognitive shortcuts (i.e. metaphors) that enable designers to connect insights about i.e. market opportunities and users needs with a set of solution principles and to test if this connection makes sense. Until now, framing...

  19. Making the Connection

    Perna, Mark C.

    2006-01-01

    Enrollment marketing is not just about enrollment; it is about creating relationships and serving one's community or target audience for many years. In this article, the author states that the first step in building such relationships is making a connection, and that is what effective marketing is all about. Administrators, teachers and critical…

  20. Designing for Decision Making

    Jonassen, David H.

    2012-01-01

    Decision making is the most common kind of problem solving. It is also an important component skill in other more ill-structured and complex kinds of problem solving, including policy problems and design problems. There are different kinds of decisions, including choices, acceptances, evaluations, and constructions. After describing the centrality…

  1. Making Cities Green.

    Goldstein, Neil B.; Engel, Jane

    1981-01-01

    Describes several examples of urban parks and the renewal of city open spaces. Community groups interested in getting funding from government or private sources must cope with budget restrictions by making effective, innovative use of available money. Government agencies with funds allocated for urban improvements are mentioned. (AM)

  2. Make time to move

    ... or after work. Schedule your exercise. Make getting exercise just as important as your other appointments. Set aside time in ...

  3. Making Lists, Enlisting Scientists

    Jensen, Casper Bruun

    2011-01-01

    was the indicator conceptualised? How were notions of scientific knowledge and collaboration inscribed and challenged in the process? The analysis shows a two-sided process in which scientists become engaged in making lists but which is simultaneously a way for research policy to enlist scientists. In conclusion...

  4. Making Images That Move

    Rennie, Richard

    2015-01-01

    The history of the moving image (the cinema) is well documented in books and on the Internet. This article offers a number of activities that can easily be carried out in a science class. They make use of the phenomenon of "Persistence of Vision." The activities presented herein demonstrate the functionality of the phenakistoscope, the…

  5. WHAT MAKES THINGS GO.

    Mobilization for Youth, Inc., New York, NY.

    The initial question in the title is answered through simple experiments for culturally disadvantaged children in elementary school. Muscles, running, water, wind, steam, fast burning and electricity are found to "make things go." Using these basic discoveries, vocabulary is built up by working with different words relating to the…

  6. Computational intelligence and neuromorphic computing potential for cybersecurity applications

    Pino, Robinson E.; Shevenell, Michael J.; Cam, Hasan; Mouallem, Pierre; Shumaker, Justin L.; Edwards, Arthur H.

    2013-05-01

    In today's highly mobile, networked, and interconnected internet world, the flow and volume of information is overwhelming and continuously increasing. It is therefore believed that the next frontier in technological evolution and development will rely on our ability to develop intelligent systems that can help us process, analyze, and make sense of information autonomously, just as a well-trained and educated human expert would. Within computational intelligence, neuromorphic computing promises to allow for the development of computing systems able to imitate natural neurobiological processes, and to form the foundation for intelligent system architectures.

  7. Dialogue with computers

    Filippazzi, F.

    1991-03-01

    As to whether it would be possible to make a computer maintain a dialogue with its operator and give plausible statements without actually 'understanding' what is being spoken about, the answer is, within certain limits, yes. An idea of this was given about 25 years ago by the ELIZA program of J. Weizenbaum at MIT, named after the Cockney flower-seller in G. B. Shaw's Pygmalion who learned to talk like a duchess. The operating mechanism by which a computer can do likewise must satisfy three prerequisites: the language must be natural; the speech must be coherent; and the answers should be consistent for any given question, even when that question is asked in a slightly different form. To make this possible, the dialogue must take place within a limited context (in fact, the ELIZA experiment involved a simulated doctor/patient in-studio conversation). This article presents a portion of that conversation, in which the doctor, i.e. the computer, evasively answers the patient's questions without ever actually coming to grips with the issue, to illustrate how such a man-machine interface mechanism works.
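    The keyword-and-template mechanism described above can be sketched in a few lines. This is a minimal illustrative responder, not Weizenbaum's actual ELIZA script; the rules, reflections, and fallback reply are all assumptions for the sake of the example.

    ```python
    import re

    # Minimal ELIZA-style responder (illustrative, not Weizenbaum's rules).
    # Each rule pairs a keyword pattern with a reply template; pronouns in the
    # captured fragment are "reflected" so "my job" becomes "your job".
    REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

    RULES = [
        (re.compile(r"i feel (.*)", re.I), "Why do you feel {0}?"),
        (re.compile(r"i am (.*)", re.I), "How long have you been {0}?"),
        (re.compile(r"my (.*)", re.I), "Tell me more about your {0}."),
    ]
    FALLBACK = "Please go on."  # evasive default when no keyword matches

    def reflect(fragment):
        return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

    def respond(utterance):
        for pattern, template in RULES:
            m = pattern.search(utterance)
            if m:
                return template.format(reflect(m.group(1)))
        return FALLBACK

    print(respond("I feel ignored by my family"))  # Why do you feel ignored by your family?
    print(respond("The weather is nice"))          # Please go on.
    ```

    The fallback reply is what makes the "doctor" evasive: whenever no keyword fires, the program deflects rather than engage with the content.
    
    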

  8. Riemannian computing in computer vision

    Srivastava, Anuj

    2016-01-01

    This book presents a comprehensive treatise on Riemannian geometric computations and related statistical inferences in several computer vision problems. This edited volume includes chapter contributions from leading figures in the field of computer vision who are applying Riemannian geometric approaches to problems such as face recognition, activity recognition, object detection, biomedical image analysis, and structure-from-motion. Some of the mathematical entities that necessitate a geometric analysis include rotation matrices (e.g. in modeling camera motion), stick figures (e.g. for activity recognition), subspace comparisons (e.g. in face recognition), symmetric positive-definite matrices (e.g. in diffusion tensor imaging), and function-spaces (e.g. in studying shapes of closed contours). Illustrates Riemannian computing theory on applications in computer vision, machine learning, and robotics; emphasis on algorithmic advances that will allow re-application in other...

  9. Statistical Computing

    inference and finite population sampling. Sudhakar Kunte. Elements of statistical computing are discussed in this series. ... which captain gets an option to decide whether to field first or bat first ... may of course not be fair, in the sense that the team which wins ... describe two methods of drawing a random number between 0.

  10. Computational biology

    Hartmann, Lars Røeboe; Jones, Neil; Simonsen, Jakob Grue

    2011-01-01

    Computation via biological devices has been the subject of close scrutiny since von Neumann’s early work some 60 years ago. In spite of the many relevant works in this field, the notion of programming biological devices seems to be, at best, ill-defined. While many devices are claimed or proved t...

  11. Computing News

    McCubbin, N

    2001-01-01

    We are still five years from the first LHC data, so we have plenty of time to get the computing into shape, don't we? Well, yes and no: there is time, but there's an awful lot to do! The recently completed CERN Review of LHC Computing gives the flavour of the LHC computing challenge. The hardware scale for each of the LHC experiments is millions of 'SpecInt95' (SI95) units of CPU power and tens of petabytes of data storage. PCs today are about 20-30 SI95, and expected to be about 100 SI95 by 2005, so it's a lot of PCs. This hardware will be distributed across several 'Regional Centres' of various sizes, connected by high-speed networks. How to realise this in an orderly and timely fashion is now being discussed in earnest by CERN, Funding Agencies, and the LHC experiments. Mixed in with this is, of course, the GRID concept...but that's a topic for another day! Of course hardware, networks and the GRID constitute just one part of the computing. Most of the ATLAS effort is spent on software development. What we ...
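    The scale claim above can be checked back-of-envelope. The total of 2 million SI95 per experiment used here is an assumed round figure standing in for the text's "millions of SI95"; the per-PC numbers come from the text.

    ```python
    # Back-of-envelope PC count for the LHC hardware scale described above.
    # required_si95 is an assumed round figure for "millions of SI95".
    required_si95 = 2_000_000   # per-experiment CPU requirement (assumed)
    pc_si95_today = 25          # a PC "today": about 20-30 SI95
    pc_si95_2005 = 100          # expected per-PC power by 2005

    pcs_today = required_si95 / pc_si95_today   # 80,000 PCs at today's power
    pcs_2005 = required_si95 / pc_si95_2005     # still 20,000 PCs in 2005
    print(f"{pcs_today:,.0f} PCs today vs {pcs_2005:,.0f} PCs in 2005")
    ```

    Even with the expected factor-of-four improvement per PC, the count stays in the tens of thousands, which is why the hardware has to be spread across Regional Centres rather than one site.
    
    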

  12. Quantum Computation

    Home; Journals; Resonance – Journal of Science Education; Volume 16; Issue 9. Quantum Computation - Particle and Wave Aspects of Algorithms. Apoorva Patel. General Article Volume 16 Issue 9 September 2011 pp 821-835. Fulltext. Click here to view fulltext PDF. Permanent link:

  13. Cloud computing.

    Wink, Diane M

    2012-01-01

    In this bimonthly series, the author examines how nurse educators can use Internet and Web-based technologies such as search, communication, and collaborative writing tools; social networking and social bookmarking sites; virtual worlds; and Web-based teaching and learning programs. This article describes how cloud computing can be used in nursing education.

  14. Computer Recreations.

    Dewdney, A. K.

    1988-01-01

    Describes the creation of the computer program "BOUNCE," designed to simulate a weighted piston coming into equilibrium with a cloud of bouncing balls. The model follows the ideal gas law. Utilizes the critical event technique to create the model. Discusses another program, "BOOM," which simulates a chain reaction. (CW)
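    The "critical event technique" mentioned above advances a simulation directly from one collision to the next instead of stepping time in small fixed increments. A one-ball sketch of the idea (the details here are assumed for illustration, not Dewdney's actual program):

    ```python
    import math

    # Event-driven ("critical event") simulation of one ball bouncing
    # elastically on a floor under gravity: solve for the exact moment of
    # the next floor impact and jump straight to it, rather than stepping
    # time uniformly. (Illustrative sketch, not Dewdney's BOUNCE code.)
    G = 9.81  # gravitational acceleration, m/s^2

    def next_impact_time(y, v):
        """Time until a ball at height y with velocity v (up positive) hits y = 0."""
        # Solve y + v*t - 0.5*G*t^2 = 0 for the positive root.
        disc = v * v + 2.0 * G * y
        return (v + math.sqrt(disc)) / G

    def simulate(y=10.0, v=0.0, n_bounces=3):
        t = 0.0
        speeds = []  # rebound speed after each impact
        for _ in range(n_bounces):
            dt = next_impact_time(y, v)
            t += dt
            v_impact = v - G * dt   # velocity just before impact (downward)
            v = -v_impact           # elastic bounce: reverse the velocity
            y = 0.0
            speeds.append(v)
        return t, speeds

    t, speeds = simulate()  # elastic bounces: rebound speed stays constant
    ```

    In the full BOUNCE model the same idea applies with many balls and a moving piston: the next "critical event" is whichever ball-floor or ball-piston collision occurs soonest.
    
    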

  15. [Grid computing

    Wolinsky, H

    2003-01-01

    "Turn on a water spigot, and it's like tapping a bottomless barrel of water. Ditto for electricity: Flip the switch, and the supply is endless. But computing is another matter. Even with the Internet revolution enabling us to connect in new ways, we are still limited to self-contained systems running locally stored software, limited by corporate, institutional and geographic boundaries" (1 page).

  16. Computational Finance

    Rasmussen, Lykke

    One of the major challenges in todays post-crisis finance environment is calculating the sensitivities of complex products for hedging and risk management. Historically, these derivatives have been determined using bump-and-revalue, but due to the increasing magnitude of these computations does...

  17. Optical Computing

    Optical computing technology is, in general, developing in two directions. One approach is ... current support in many places, with private companies as well as governments in several countries encouraging such research work. For example, much ... which enables more information to be carried and data to be processed.

  18. Exploring the challenges of making data physical

    Alexander, Jason; Jansen, Yvonne; Hornbæk, Kasper

    2015-01-01

    Physical representations of data have existed for thousands of years. However, it is only now that advances in digital fabrication, actuated tangible interfaces, and shape-changing displays can support the emerging area of 'Data Physicalization': the study of computer-supported, physical representations of data and their support for cognition, communication, learning, problem solving and decision making. As physical artifacts, data physicalizations can tap more deeply into our perceptual exploration skills than classical computer setups, while their dynamic physicality alleviates some of the main... The primary goal of this workshop is to bring together practitioners from a diverse range of communities to explore the challenges in 'making data physical' and to set a research roadmap for the next years.

  19. Decision making based on emotional images

    Kentaro eKatahira

    2011-10-01

    The emotional outcome of a choice affects subsequent decision making. While the relationship between decision making and emotion has attracted attention, studies on emotion and on decision making have developed independently. In this study, we investigated how the emotional valence of pictures, which was stochastically contingent on participants' choices, influenced subsequent decision making. In contrast to traditional value-based decision-making studies that used money or food as a reward, the reward value of the decision outcome, which guided the update of value for each choice, was unknown beforehand. To estimate the reward value of emotional pictures from participants' choice data, we used reinforcement learning models that have successfully been used in previous studies for modeling value-based decision making. We found that the estimated reward value was asymmetric between positive and negative pictures: the negative reward value of negative pictures (relative to neutral pictures) was larger in magnitude than the positive reward value of positive pictures. This asymmetry was not observed in the valence ratings for individual pictures, in which participants reported the emotion experienced upon viewing each picture. These results suggest that there may be a difference between experienced emotion and the effect of that emotion on subsequent behavior. Our experimental and computational paradigm provides a novel way of quantifying how, and what aspects of, emotional events affect human behavior. The present study is a first step toward relating the large body of knowledge in emotion science to computational approaches to value-based decision making.
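    The kind of reinforcement learning model referred to above is typically a prediction-error (Rescorla-Wagner style) value update combined with a softmax choice rule. The sketch below simulates, rather than fits, such a model; the learning rate, temperature, and the asymmetric picture "reward" values are illustrative assumptions, not the estimates reported in the study.

    ```python
    import math
    import random

    # Rescorla-Wagner value update + softmax choice, the standard building
    # blocks of value-based decision-making models. All parameter values and
    # the asymmetric picture "rewards" below are illustrative assumptions.
    ALPHA = 0.3   # learning rate
    BETA = 3.0    # softmax inverse temperature
    REWARD = {"positive": 0.5, "neutral": 0.0, "negative": -1.0}  # |neg| > |pos|

    def softmax_choice(q, rng):
        """Choose 'A' with probability given by the softmax of the value difference."""
        p_a = 1.0 / (1.0 + math.exp(-BETA * (q["A"] - q["B"])))
        return "A" if rng.random() < p_a else "B"

    def simulate(n_trials=500, seed=0):
        rng = random.Random(seed)
        q = {"A": 0.0, "B": 0.0}
        outcome = {"A": "positive", "B": "negative"}  # A shows positive pictures
        for _ in range(n_trials):
            choice = softmax_choice(q, rng)
            r = REWARD[outcome[choice]]
            q[choice] += ALPHA * (r - q[choice])  # prediction-error update
        return q

    q = simulate()  # q["A"] drifts positive, q["B"] negative
    ```

    Fitting ALPHA, BETA, and the per-valence reward values to participants' actual choice sequences (e.g. by maximum likelihood) is what lets such studies estimate the asymmetry between positive and negative outcomes.
    
    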

  20. Judgment and decision making.

    Mellers, B A; Schwartz, A; Cooke, A D

    1998-01-01

    For many decades, research in judgment and decision making has examined behavioral violations of rational choice theory. In that framework, rationality is expressed as a single correct decision, shared by experimenters and subjects, that satisfies internal coherence within a set of preferences and beliefs. Outside of psychology, social scientists are now debating the need to modify rational choice theory with behavioral assumptions. Within psychology, researchers are debating assumptions about errors for many different definitions of rationality, and alternative frameworks are being proposed. These frameworks view decisions as more reasonable and adaptive than previously thought. One example is "rule following": applying a rule or norm to a situation often minimizes effort and provides satisfying solutions that are "good enough," though not necessarily the best. When rules are ambiguous, people look for reasons to guide their decisions. They may also let their emotions take charge. This chapter presents recent research on judgment and decision making from traditional and alternative frameworks.