This book presents the latest research findings and reviews in the field of medical imaging technology, covering ultrasound-based diagnostic approaches for detecting osteoarthritis, breast carcinoma, and cardiovascular conditions; image-guided biopsy and segmentation techniques for detecting lung cancer; image fusion; and the simulation of fluid flows for cardiovascular applications. It offers a useful guide for students, lecturers, and professional researchers in the fields of biomedical engineering and image processing.
Hartshorn, W.R.; Johnson, A.L.
The Savannah River Site Computing Architecture states that the site computing environment will be standards-based, data-driven, and workstation-oriented. Larger server systems deliver needed information to users in a client-server relationship. Goals of the Architecture include utilizing computing resources effectively, maintaining a high level of data integrity, developing a robust infrastructure, and storing data in such a way as to promote accessibility and usability. This document describes the current storage environment at Savannah River Site (SRS) and presents some of the problems that will be faced and strategies that are planned over the next few years.
Jung, R.E.; Schneider, D.; Ganeles, J.; Wismeijer, D.; Zwahlen, M.; Hämmerle, C.H.F.; Tahmaseb, A.
Purpose: To assess the literature on the accuracy and clinical performance of computer technology applications in surgical implant dentistry. Materials and Methods: Electronic and manual literature searches were conducted to collect information about (1) the accuracy and (2) the clinical performance of these computer technology applications.
Aderholdt, Ferrol [Tennessee Technological University]; Caldwell, Blake A [ORNL]; Hicks, Susan Elaine [ORNL]; Koch, Scott M [ORNL]; Naughton, III, Thomas J [ORNL]; Pelfrey, Daniel S [ORNL]; Pogge, James R [Tennessee Technological University]; Scott, Stephen L [Tennessee Technological University]; Shipman, Galen M [ORNL]; Sorrillo, Lawrence [ORNL]
High-performance computing (HPC) environments serve a wide variety of workloads, ranging from simulation and data transformation and analysis to complex workflows. These systems may process data for many users, often requiring strong separation between job allocations. Establishing such secure enclaves within the shared infrastructure of HPC environments poses many challenges. The isolation mechanisms in the system software are the basic building blocks for enabling secure compute enclaves. A variety of approaches exist, and the focus of this report is to review the different virtualization technologies that facilitate the creation of secure compute enclaves. The report reviews current operating system (OS) protection mechanisms and modern virtualization technologies to better understand their performance and isolation properties. We also examine the feasibility of running "virtualized" computing resources as non-privileged users, and of providing controlled administrative permissions for standard users running within a virtualized context. Our examination includes technologies such as Linux containers (LXC, Docker) and full virtualization (KVM, Xen). We categorize these approaches into two broad groups: OS-level virtualization and system-level virtualization. OS-level virtualization uses containers to partition a single OS kernel into Virtual Environments (VEs), e.g., LXC; resources within the host's kernel are virtualized only in the sense of separate namespaces. In contrast, system-level virtualization uses hypervisors to manage multiple OS kernels and virtualizes the physical resources (hardware) to create Virtual Machines (VMs), e.g., Xen, KVM. This terminology of VE and VM, detailed in Section 2, is used throughout the report to distinguish between the two approaches to providing virtualized execution environments.
Tahmaseb, Ali; Wismeijer, Daniel; Coucke, Wim; Derksen, Wiebe
To assess the literature on the accuracy and clinical performance of static computer-assisted implant surgery in implant dentistry. Electronic and manual literature searches were applied to collect information about (1) the accuracy and (2) the clinical performance of static computer-assisted implant systems. Meta-regression analysis was performed to summarize the accuracy studies. Failure/complication rates were investigated using a generalized linear mixed model for binary outcomes and a logit link to model the implant failure rate. From 2,359 articles, 14 survival and 24 accuracy studies were included in this systematic review. Nine different static image guidance systems were involved. The meta-analysis of accuracy (24 clinical and preclinical studies) revealed a total mean error of 1.12 mm (maximum of 4.5 mm) at the entry point, measured in 1,530 implants, and 1.39 mm (maximum of 7.1 mm) at the apex, measured in 1,465 implants. For the 14 included survival studies (1,941 implants in total) using static computer-assisted implant dentistry, the mean failure rate was 2.7% (0% to 10%) after an observation period of at least 12 months. Intraoperative or prosthetic complications were reported in 36.4% of the treated cases, including template fractures during surgery, changes of plan because of factors such as limited primary implant stability, the need for additional grafting procedures, prosthetic screw loosening, prosthetic misfit, and prosthesis fracture. Different levels of quantity and quality of evidence were available for static computer-assisted implant placement, showing high implant survival rates after only 12 months of observation across different indications, with a variable level of accuracy. Future long-term clinical data are necessary to identify clinical indications; assess accuracy and risk; and justify the additional radiation doses, effort, and costs associated with computer-assisted implant surgery.
While computing is applied widely in the production segment of the petroleum industry, its effective application is the primary goal of computing management. Computing technology has changed significantly since the 1950s, when computers first began to influence petroleum technology. The ability to accomplish traditional tasks faster and more economically is probably the most important effect that computing has had on the industry. While speed and lower cost are important, are they enough? Can computing change the basic functions of the industry? When new computing technology is introduced improperly, it can clash with traditional petroleum technology. This paper examines the role of management in merging these technologies.
Lloyd, J. F., Sr.
Industrial radiography is a well-established, reliable means of providing nondestructive structural-integrity information. The majority of industrial radiographs are interpreted by trained human eyes using transmitted light and various visual aids. Hundreds of miles of radiographic information are evaluated, documented, and archived annually. In many instances, there are serious concerns about interpreter fatigue, subjectivity, and limited archival space. Quite often it is difficult to quickly retrieve radiographic information for further analysis or investigation. Methods of improving the quality and efficiency of the radiographic process are being explored, developed, and incorporated whenever feasible. High-resolution cameras, digital image processing, and mass digital data storage offer interesting possibilities for improving the industrial radiographic process. A review is presented of computer-aided radiographic interpretation technology in terms of how it could be used to enhance the interpretation process in evaluating radiographs of aluminum welds.
Dillon-Marable, Elizabeth; Valentine, Thomas
The purpose of this study was to better understand what optimal computer technology integration looks like in adult basic skills education (ABSE). One question guided the research: How is computer technology integration best conceptualized and measured? The study used the Delphi method to map the construct of computer technology integration and…
Computer technology and evolution: from artificial intelligence to artificial life. In: Advances in the philosophy of technology / ed. by Evandro Agazzi ... Newark, Del.: Soc. for Philosophy and Technology, 1999, pp. 105-119
A review is presented on the currently available technologies for nuclear reactor analyses by computer. The important distinction is made between traditional computer calculation and advanced computer simulation. Simulation needs are defined to support the design, operation, maintenance, and safety of isotope production reactors. Existing methods of computer analyses are categorized in accordance with the type of computer involved in their execution: micro, mini, mainframe, and supercomputers. Both general- and special-purpose computers are discussed. Major computer codes are described, with regard to their use in analyzing isotope production reactors. This review determined that conventional systems codes (TRAC, RELAP5, RETRAN, etc.) cannot meet four essential conditions for viable reactor simulation: simulation fidelity, on-line interactive operation with convenient graphics, high simulation speed, and low cost. These conditions can be met by special-purpose computers (such as the AD100 of ADI), which are specifically designed for high-speed simulation of complex systems. The greatest shortcoming of existing systems codes (TRAC, RELAP5) is the mismatch between their very high computational effort and low simulation fidelity. The drift-flux formulation (HIPA) is the viable alternative to the complicated two-fluid model. No existing computer code has the capability of accommodating all important processes in the core geometry of isotope production reactors. Experiments (heat transfer measurements) are needed to provide the necessary correlations. It is important for the nuclear community, in government, industry, and universities alike, to begin to take advantage of modern simulation technologies and equipment. 41 refs.
Çiftçi, Emrullah Yasin
Intercultural communication is now a crucial part of our globalizing lives; however, not everyone has an opportunity to engage in an intercultural interaction with people from different cultures. Computer-based technologies are promising in creating environments for people to communicate with people from diverse cultures. This qualitative…
Hales, H. Lee
As the cost of computing decreases, computer aids are becoming readily available for facility planning, construction, and operation; three important technologies are decision support systems, computer-aided design, and management information systems. This article discusses the applications and availability of these systems, hardware and software…
How well the computer site manager avoids future dangers and takes advantage of future opportunities depends to a considerable degree on how much anticipatory information he has available. People who rise in management are expected with each successive promotion to concern themselves with events further in the future. It is the function of technology projection to increase this stock of information about possible future developments in order to put planning and decision making on a more rational basis. Past efforts at computer technology projections have an accuracy that declines exponentially with time. Thus, precisely defined technology projections beyond about three years should be used with considerable caution. This paper reviews both subjective and objective methods of technology projection and gives examples of each. For an integrated view of future prospects in computer technology, a framework for technology projection is proposed.
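The objective, trend-based style of projection discussed above can be illustrated with a small sketch: fit an exponential growth curve to historical device metrics by log-linear least squares, then extrapolate. The data points and function names below are invented for illustration and are not taken from the paper; the caveat about extrapolating beyond about three years applies to any such fit.

```python
import math

def fit_exponential_trend(years, values):
    """Least-squares fit of values ~ a * exp(b * year) via log-linear regression."""
    logs = [math.log(v) for v in values]
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(logs) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, logs)) / \
        sum((x - mean_x) ** 2 for x in years)
    a = math.exp(mean_y - b * mean_x)
    return a, b

def project(a, b, year):
    """Extrapolate the fitted trend to a future year."""
    return a * math.exp(b * year)

# Hypothetical device-density figures doubling every two years (a Moore-style trend).
years = [0, 2, 4, 6]
density = [1000.0, 2000.0, 4000.0, 8000.0]
a, b = fit_exponential_trend(years, density)
projected = project(a, b, 8)  # two years beyond the last observation
```

For perfectly doubling data the fit recovers the trend exactly; real historical data scatter around the line, which is one source of the exponentially declining accuracy the paper notes.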
This review is published ten times a year to communicate, to a broad audience, Lawrence Livermore National Laboratory's scientific and technological accomplishments, particularly in the Laboratory's core mission areas: global security, energy and the environment, and bioscience and biotechnology. This review for the month of July 1996 discusses: Frontiers of Research in Advanced Computations; The Multibeam Fabry-Perot Velocimeter: Efficient Measurement of High Velocities; High-Tech Tools for the American Textile Industry; and Rock Mechanics: Can the Tuff Take the Stress?
Davies, T Claire; Mudge, Suzie; Ameratunga, Shanthi; Stott, N Susan
The purpose of this study was to systematically review published evidence on the development, use, and effectiveness of devices and technologies that enable or enhance self-directed computer access by individuals with cerebral palsy (CP). Nine electronic databases were searched using keywords 'computer', 'software', 'spastic', 'athetoid', and 'cerebral palsy'; the reference lists of articles thus identified were also searched. Thirty articles were selected for review, with 23 reports of development and usability testing of devices and seven evaluations of algorithms to increase computer recognition of input and cursor movements. Twenty-four studies had fewer than 10 participants with CP, with a wide age range of 5 to 77 years. Computer task performance was usually tested, but only three groups sought participant feedback on ease and comfort of use. International standards exist to evaluate effectiveness of non-keyboard devices, but only one group undertook this testing. None of the study designs were higher than American Academy for Cerebral Palsy and Developmental Medicine level IV. Access solutions for individuals with CP are in the early stages of development. Future work should include assessment of end-user comfort, effort, and performance as well as design features. Engaging users and therapists when designing and evaluating technologies to enhance computer access may increase acceptance and improve performance.
Quirk, W.J.; Bookless, W.A.
The Lawrence Livermore National Laboratory, operated by the University of California for the United States Department of Energy, was established in 1952 to do research on nuclear weapons and magnetic fusion energy. Since then, in response to new national needs, we have added other major programs, including technology transfer, laser science (fusion, isotope separation, materials processing), biology and biotechnology, environmental research and remediation, arms control and nonproliferation, advanced defense technology, and applied energy technology. These programs, in turn, require research in basic scientific disciplines, including chemistry and materials science, computing science and technology, engineering, and physics. The Laboratory also carries out a variety of projects for other federal agencies. Energy and Technology Review is published monthly to report on unclassified work in all our programs. This issue reviews work performed in the areas of modified retorting for waste treatment and underground stripping to remove contamination.
Noor, Ahmed K.; Venneri, Samuel L.
Derived from matrix methods of structural analysis and finite element methods developed over the last three decades, computational structures technology (CST) blends computer science, numerical analysis, and approximation theory into structural analysis and synthesis. Recent significant advances in CST include stochastic-based modeling, strategies for performing large-scale structural calculations on new computing systems, and the integration of CST with other disciplinary modules for multidisciplinary analysis and design. New methodologies have been developed at NASA for integrated fluid-thermal structural analysis and integrated aerodynamic-structure-control design. The need for multiple views of data for different modules also led to the development of a number of sophisticated data-base management systems. For CST to play a role in the future development of structures technology and in the multidisciplinary design of future flight vehicles, major advances and computational tools are needed in a number of key areas.
Please note this is a Short Discount publication. This year's edition of Computer Architecture Technology Trends analyses the trends taking place in the architecture of computing systems today. Given the sheer number of different applications to which computers are being applied, there seems no end to the different adaptations which proliferate. There are, however, some underlying trends which appear, and decision makers should be aware of them when specifying architectures, particularly for future applications. This report is fully revised and updated and provides insight into these trends.
Poggio, A.J. (ed.)
This issue of Energy and Technology Review contains: Neutron Penumbral Imaging of Laser-Fusion Targets--using our new penumbral-imaging diagnostic, we have obtained the first images that can be used to measure directly the deuterium-tritium burn region in laser-driven fusion targets; Computed Tomography for Nondestructive Evaluation--various computed tomography systems and computational techniques are used in nondestructive evaluation; Three-Dimensional Image Analysis for Studying Nuclear Chromatin Structure--we have developed an optic-electronic system for acquiring cross-sectional views of cell nuclei, and computer codes to analyze these images and reconstruct the three-dimensional structures they represent; Imaging in the Nuclear Test Program--advanced techniques produce images of unprecedented detail and resolution from Nevada Test Site data; and Computational X-Ray Holography--visible-light experiments and numerically simulated holograms test our ideas about an x-ray microscope for biological research.
Stephenson, Aoife; McDonough, Suzanne M; Murphy, Marie H; Nugent, Chris D; Mair, Jacqueline L
High levels of sedentary behaviour (SB) are associated with negative health consequences. Technology-enhanced solutions such as mobile applications, activity monitors, prompting software, texts, emails, and websites are being harnessed to reduce SB. The aim of this paper is to evaluate the effectiveness of such technology-enhanced interventions aimed at reducing SB in healthy adults and to examine the behaviour change techniques (BCTs) used. Five electronic databases were searched to identify randomised controlled trials (RCTs) published up to June 2016. Interventions using computer, mobile, or wearable technologies to facilitate a reduction in SB, using a measure of sedentary time as an outcome, were eligible for inclusion. Risk of bias was assessed using the Cochrane Collaboration's tool, and interventions were coded using the BCT Taxonomy (v1). Meta-analysis of 15/17 RCTs suggested that computer, mobile, and wearable technology tools resulted in a mean reduction of -41.28 minutes per day (min/day) of sitting time (95% CI -60.99, -21.58, I2 = 77%, n = 1402) in favour of the intervention group at end-point follow-up. The pooled effects showed mean reductions at short-term (≤ 3 months), medium-term (>3 to 6 months), and long-term follow-up (>6 months) of -42.42 min/day, -37.23 min/day, and -1.65 min/day, respectively. Overall, 16/17 studies were deemed as having a high or unclear risk of bias, and 1/17 was judged to be at a low risk of bias. A total of 46 BCTs (14 unique) were coded for the computer, mobile, and wearable components of the interventions. The most frequently coded were "prompts and cues", "self-monitoring of behaviour", "social support (unspecified)", and "goal setting (behaviour)". Interventions using computer, mobile, and wearable technologies can be effective in reducing SB. Effectiveness appeared most prominent in the short term and lessened over time. A range of BCTs have been implemented in these interventions. Future studies need to improve reporting.
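The pooled mean reductions in sitting time reported above come from a meta-analysis. A minimal sketch of the underlying inverse-variance pooling is given below; it uses a fixed-effect model for simplicity, whereas a review with I2 = 77% heterogeneity would add a between-study variance term (a random-effects model). The per-study effect sizes and variances are hypothetical, not the review's data.

```python
def pooled_mean_difference(effects, variances):
    """Fixed-effect inverse-variance pooling of per-study mean differences.

    Each study is weighted by the reciprocal of its variance, so more
    precise studies pull the pooled estimate harder.
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    se = (1.0 / sum(weights)) ** 0.5          # standard error of the pooled effect
    ci = (pooled - 1.96 * se, pooled + 1.96 * se)  # 95% confidence interval
    return pooled, ci

# Hypothetical per-study reductions in sitting time (min/day) and their variances.
effects = [-50.0, -35.0, -40.0]
variances = [100.0, 64.0, 81.0]
pooled, (lo, hi) = pooled_mean_difference(effects, variances)
```

With these invented inputs the pooled estimate lands near -40 min/day, in the same spirit as the -41.28 min/day figure the review reports from its 15 pooled trials.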
Cloud computing has been a tremendous innovation, through which applications became available online, accessible through an Internet connection from any computing device (computer, smartphone, or tablet). According to one of the most recent studies, conducted in 2012 by Everest Group and Cloud Connect, 57% of companies said they already use SaaS (Software as a Service) applications, and 38% reported using standard PaaS (Platform as a Service) tools. However, in most cases the users of these solutions highlighted that one of the main obstacles to the development of this technology is the fact that, in the cloud, an application is not available without an Internet connection. The new challenge for cloud systems is now offline use: specifically, accessing SaaS applications without being connected to the Internet. This topic is directly related to user productivity within companies, as productivity growth is one of the key promises of the transformation brought by cloud computing applications. The aim of this paper is to present some important aspects of offline cloud systems and related regulatory trends in the European Union (EU).
Kaizer, Joshua S.; Heller, A. Kevin; Oberkampf, William L.
Before the results of a scientific computer simulation are used for any purpose, it should be determined if those results can be trusted. Answering that question of trust is the domain of scientific computer simulation review. There is limited literature that focuses on simulation review, and most is specific to the review of a particular type of simulation. This work is intended to provide a foundation for a common understanding of simulation review. This is accomplished through three contributions. First, scientific computer simulation review is formally defined. This definition identifies the scope of simulation review and provides the boundaries of the review process. Second, maturity assessment theory is developed. This development clarifies the concepts of maturity criteria, maturity assessment sets, and maturity assessment frameworks, which are essential for performing simulation review. Finally, simulation review is described as the application of a maturity assessment framework. This is illustrated through evaluating a simulation review performed by the U.S. Nuclear Regulatory Commission. In making these contributions, this work provides a means for a more objective assessment of a simulation's trustworthiness and takes the next step in establishing scientific computer simulation review as its own field.
Highlights:
• We define scientific computer simulation review.
• We develop maturity assessment theory.
• We formally define a maturity assessment framework.
• We describe simulation review as the application of a maturity framework.
• We provide an example of a simulation review using a maturity framework.
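A maturity assessment framework of the kind described above can be sketched as a set of criteria, each with an ordered scale of levels, against which the evidence for a simulation is scored. The criterion names and level labels below are illustrative assumptions, not the paper's actual framework.

```python
# Hypothetical maturity criteria, each with an ordered scale of levels
# (index 0 = least mature). These names are illustrative only.
FRAMEWORK = {
    "verification": ["none", "unit tests", "code verification", "formal proof"],
    "validation":   ["none", "qualitative", "benchmark data", "blind prediction"],
    "uncertainty":  ["none", "deterministic", "sensitivity study", "full UQ"],
}

def assess(evidence):
    """Map each criterion's evidence to its maturity level (index on the scale)."""
    return {crit: FRAMEWORK[crit].index(evidence[crit]) for crit in FRAMEWORK}

def overall_maturity(scores):
    """A conservative aggregate: the weakest criterion bounds overall maturity."""
    return min(scores.values())

# Score a hypothetical simulation's evidence against the framework.
scores = assess({"verification": "code verification",
                 "validation": "benchmark data",
                 "uncertainty": "deterministic"})
```

The min-aggregation is one defensible design choice among several: it encodes the judgment that trust in a simulation cannot exceed its least-developed line of evidence.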
This monthly science and technology review features a report about Lawrence Livermore National Laboratory work on an awesome, inevitable, unpredictable, and potentially dangerous natural phenomenon, lightning. This feature article tells of the development of guidance by Laboratory engineers on how to deal with the effects of lightning on Department of Energy facilities, especially those where nuclear and high explosive materials are handled and stored. Other topics are Groundwater Modeling: More Cost Effective Cleanup by Design; Dual-Band Infrared Computed Tomography: Searching for Hidden Defects; and Plating Shop Moves to Finish Off Waste.
Vabishchevich, Petr N
This book discusses questions of numerical solutions of applied problems on parallel computing systems. Nowadays, engineering and scientific computations are carried out on parallel computing systems, which provide parallel data processing on several computing nodes. In constructing computational algorithms, mathematical problems are separated into relatively independent subproblems in order to solve each of them on a single computing node.
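The construction described above, separating a problem into relatively independent subproblems solved on separate computing nodes, can be sketched with a simple domain decomposition. The example below splits a midpoint-rule integration of f(x) = x^2 over [0, 1] across worker processes, which stand in for compute nodes; the function names are illustrative, not from the book.

```python
from multiprocessing import Pool

def solve_subproblem(subdomain):
    """Stand-in for an independent subproblem: midpoint-rule integration
    of f(x) = x^2 over one slice of the domain."""
    a, b, n = subdomain
    h = (b - a) / n
    return sum((a + (i + 0.5) * h) ** 2 * h for i in range(n))

def solve_parallel(a, b, parts, n_per_part):
    """Decompose [a, b] into `parts` independent slices, solve each on a
    worker process, then combine the partial results."""
    width = (b - a) / parts
    subdomains = [(a + k * width, a + (k + 1) * width, n_per_part)
                  for k in range(parts)]
    with Pool(parts) as pool:
        partials = pool.map(solve_subproblem, subdomains)
    return sum(partials)

if __name__ == "__main__":
    result = solve_parallel(0.0, 1.0, 4, 1000)  # converges toward 1/3
```

Real decompositions of partial differential equations need communication at subdomain boundaries; an embarrassingly parallel integral is chosen here precisely because its subproblems are fully independent, the ideal case the book's construction aims for.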
Vogt, Ramona L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Meissner, Caryn N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chinn, Ken B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
This is the September issue of the Lawrence Livermore National Laboratory's Science & Technology Review, which communicates, to a broad audience, the Laboratory's scientific and technological accomplishments in fulfilling its primary missions. This month, there are features on "Laboratory Investments Drive Computational Advances" and "Laying the Groundwork for Extreme-Scale Computing." Research highlights include "Nuclear Data Moves into the 21st Century", "Peering into the Future of Lick Observatory", and "Facility Drives Hydrogen Vehicle Innovations."
Borisov, Victor S; Grigoriev, Aleksander V; Kolesov, Alexandr E; Popov, Petr A; Sirditov, Ivan K; Vabishchevich, Petr N; Vasilieva, Maria V; Zakharov, Petr E
In this book we describe the basic elements of present-day computational technologies based on the C/C++ programming languages. The emphasis is on GNU compilers and libraries, and on free and open-source software (FOSS) for solving problems of computational mathematics and visualizing the obtained data. Many examples illustrate the basic features of these computational technologies.
Stone, H. S.
Advances in computing technology have been led by consistently improving semiconductor technology. The semiconductor industry has turned out ever faster, smaller, and less expensive devices since transistorized computers were first introduced 20 years ago. For the next decade, new advances appear possible, with the rate of introduction of improved devices at least equal to historic trends. The implication of these projections is that computers will enter new markets and will truly be pervasive in business, home, and factory as their cost diminishes and their computational power expands to new levels. The computer industry as we know it today will be greatly altered in the next decade, primarily because the raw computer system will give way to computer-based turn-key information and control systems.
Brockmeier, Lantry L.; Sermon, Janet M.; Hope, Warren C.
This investigation sought information about principals and their relationship with computer technology. Several questions were fundamental to the inquiry. Are principals prepared to facilitate the attainment of technology's promise through the integration of computer technology into the teaching and learning process? Are principals prepared to use…
The objective of this systematic review is to identify current computer-assisted technologies used for managing patients with a need to re-establish craniofacial appearance, subjective comfort, and stomatognathic function, and the extent of their clinical documentation. Electronic search strategies were used to locate clinical studies in MEDLINE through PubMed and in the Cochrane Library, and in the grey literature through searches on Google Scholar. The searches for commercial digital products for use in oral rehabilitation identified 225 products as of November 2016, used for patient diagnostics, communication, and therapy purposes, and for other computer-assisted applications in the context of oral rehabilitation. About one-third of these products were described in about 350 papers reporting on clinical human studies. The great majority of digital products for use in oral rehabilitation have no clinical documentation at all, while the products of a distinct minority of manufacturers have appeared frequently in more or less scientific reports. Moore's law applies to digital dentistry as well: it predicts that microprocessors will continue to become faster and cheaper per performance unit, and innovative software programs will harness these improvements in performance. The net effect is the noticeably short product life cycle of digital products developed for use in oral rehabilitation, often without supportive clinical documentation. Nonetheless, clinicians must request clinically meaningful information about new digital products to assess the net benefits for patients or dental professionals, and must not accept mere technological verbiage as a basis for product purchases. © 2017 John Wiley & Sons Ltd.
Ponder, Tim, Comp.; Ropog, Marty, Comp.; Keating, Joseph, Comp.
This document provides general information on computer viruses, how to help protect a computer network from them, measures to take if a computer becomes infected. Highlights include the origins of computer viruses; virus contraction; a description of some common virus types (File Virus, Boot Sector/Partition Table Viruses, Trojan Horses, and…
Fischer, James (Technical Monitor); Merkey, Phillip
This grant supported the effort to characterize the problem domain of the Earth Science Technology Office's Computational Technologies (CT) Project and to engage the Beowulf cluster computing community as well as the high-performance computing research community, so that we could predict the applicability of these technologies to the scientific community represented by the CT Project and formulate long-term strategies to provide the computational resources necessary to attain its anticipated scientific objectives. Specifically, the goal of the evaluation effort was to use the information gathered over the course of the Round-3 investigations to quantify the trends in scientific expectations, the algorithmic requirements, and the capability of high-performance computers to satisfy this anticipated need.
Choi, Yun Cheol; Han, Tack Don; Im, Sun Beom
This book consists of four parts. The first part covers IT technology and the information society: understanding computer systems, the composition of software and information systems, and software applications. The second part addresses computer networks, information and communication, and applications of Internet services. The third part covers multimedia applications, mobile computing, ubiquitous computing and ubiquitous environments, and computers in digital life. The last part explains information security and the ethics of the information-oriented society, the information industry and IT ventures, digital contents technology and industry, and the future development of the information-oriented society.
Hashim, Hajah Rugayah Hj.; Mustapha, Wan Narita
As we move further into the new millennium, the need to engage learners with new technology and adapt them to it has been the main aim of many institutions of higher learning in Malaysia. The involvement of the government in huge technology-based projects like the Multimedia Super Corridor Highway (MSC) and one of its flagships, the Smart Schools have…
Luis Fernando Nicolas-Alonso
A brain-computer interface (BCI) is a hardware and software communications system that permits cerebral activity alone to control computers or external devices. The immediate goal of BCI research is to provide communications capabilities to severely disabled people who are totally paralyzed or 'locked in' by neuromuscular disorders, such as amyotrophic lateral sclerosis, brain-stem stroke, or spinal cord injury. Here, we review the state of the art of BCIs, looking at the different steps that form a standard BCI: signal acquisition, preprocessing or signal enhancement, feature extraction, classification, and the control interface. We discuss their advantages, drawbacks, and latest advances, and we survey the numerous technologies reported in the scientific literature to design each step of a BCI. First, the review examines the neuroimaging modalities used in the signal acquisition step, each of which monitors a different form of functional brain activity, such as electrical, magnetic, or metabolic activity. Second, the review discusses the different electrophysiological control signals that determine user intentions, which can be detected in brain activity. Third, the review includes some techniques used in the signal enhancement step to deal with artifacts in the control signals and improve performance. Fourth, the review studies some mathematical algorithms used in the feature extraction and classification steps, which translate the information in the control signals into commands that operate a computer or other device. Finally, the review provides an overview of various BCI applications that control a range of devices.
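The standard BCI pipeline enumerated above (signal acquisition, feature extraction, classification, control interface) can be sketched end-to-end in a toy form. The signals below are synthetic, and signal variance stands in for the band-power features a real system would extract; the class labels and function names are illustrative assumptions.

```python
import random

def extract_features(signal):
    """Feature extraction step: mean power (variance) of the recorded signal."""
    mean = sum(signal) / len(signal)
    return sum((x - mean) ** 2 for x in signal) / len(signal)

def train_centroids(labelled_trials):
    """Classification step: learn one feature centroid per intended command."""
    sums, counts = {}, {}
    for label, trial in labelled_trials:
        f = extract_features(trial)
        sums[label] = sums.get(label, 0.0) + f
        counts[label] = counts.get(label, 0) + 1
    return {label: sums[label] / counts[label] for label in sums}

def classify(centroids, trial):
    """Control-interface step: map a new trial to the nearest class,
    i.e., the command the device should execute."""
    f = extract_features(trial)
    return min(centroids, key=lambda label: abs(centroids[label] - f))

# Signal acquisition step, simulated: low-amplitude "rest" trials and
# high-amplitude "move" trials.
random.seed(0)
rest = [("rest", [random.gauss(0, 1.0) for _ in range(256)]) for _ in range(20)]
move = [("move", [random.gauss(0, 3.0) for _ in range(256)]) for _ in range(20)]
centroids = train_centroids(rest + move)
```

A real BCI would replace each stage with far richer machinery (EEG amplifiers, spectral filtering, multi-dimensional classifiers), but the data flow between the stages is the same as in this sketch.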
Shell Oil Company used a COSMIC program called VISCEL to ensure the accuracy of the company's new computer code for analyzing polymers and chemical compounds. Shell reported that there were no other programs available that could provide the necessary calculations. Shell produces chemicals for plastic products used in the manufacture of automobiles, housewares, appliances, film, textiles, electronic equipment and furniture.
This article reviews the procedures which are used to review software written for computer-based instrumentation and control functions in nuclear facilities. The utilization of computer-based control systems is becoming much more prevalent in such installations, in addition to being retrofit into existing systems. Currently, the Nuclear Regulatory Commission uses Regulatory Guide 1.152, "Criteria for Programmable Digital Computer System Software in Safety-Related Systems of Nuclear Power Plants," and ANSI/IEEE-ANS-7-4.3.2-1982, "Application Criteria for Programmable Digital Computer Systems in Safety Systems of Nuclear Power Generating Stations," for guidance when performing reviews of digital systems. There is great concern about the process of verification and validation of these codes, so when inspections are done of such systems, inspectors examine very closely the processes which were followed in developing the codes, the errors which were detected, how they were found, and the analysis which went into tracing down the causes behind the errors to ensure such errors are not propagated again in the future.
The Lawrence Livermore National Laboratory publishes the Energy and Technology Review monthly. This periodical reviews progress made in selected programs at the Laboratory. This issue includes articles on in-situ coal gasification, on chromosomal aberrations in human sperm, on high-speed cell sorting, and on supercomputers.
Guise, Max Joseph; Wendt, Jeremy Daniel
We describe the current state-of-the-art in Trusted Computing Technologies - focusing mainly on Intel's Trusted Execution Technology (TXT). This document is based on existing documentation and tests of two existing TXT-based systems: Intel's Trusted Boot and Invisible Things Lab's Qubes OS. We describe what features are lacking in current implementations, describe what a mature system could provide, and present a list of developments to watch. Critical systems perform operation-critical computations on high importance data. In such systems, the inputs, computation steps, and outputs may be highly sensitive. Sensitive components must be protected from both unauthorized release and unauthorized alteration: unauthorized users should not access the sensitive input and output data, nor be able to alter them; the computation contains intermediate data with the same requirements, and executes algorithms that unauthorized users should not be able to know or alter. Due to various system requirements, such critical systems are frequently built from commercial hardware, employ commercial software, and require network access. These hardware, software, and network system components increase the risk that sensitive input data, computation, and output data may be compromised.
Palmer, D S
Presentation technology is available, and it does not have to be expensive. This article describes computer hardware and software concepts for graphics use, and recommends principles for making cost-effective buying decisions. Also included is a previously published technique for making custom computer graphic 35-mm slides at minimal expense. This information is vital to anyone lecturing without the support of a custom graphics laboratory.
Kóczy, László; Mesiar, Radko; Kacprzyk, Janusz
A broad spectrum of modern Information Technology (IT) tools, techniques, main developments and still open challenges is presented. Emphasis is on new research directions in various fields of science and technology that are related to data analysis, data mining, knowledge discovery, information retrieval, clustering and classification, decision making and decision support, control, computational mathematics and physics, to name a few. Applications in many relevant fields are presented, notably in telecommunication, social networks, recommender systems, fault detection, robotics, image analysis and recognition, electronics, etc. The methods used by the authors range from high level formal mathematical tools and techniques, through algorithmic and computational tools, to modern metaheuristics.
Bookless, W.A.; McElroy, L.; Wheatcraft, D.; Middleton, C.; Shang, S. (eds.)
Two articles are included: the industrial computing initiative, and artificial hip joints (applying weapons expertise to medical technology). Three research highlights (briefs) are included: KEN project (face recognition), modeling groundwater flow and chemical migration, and gas and oil national information infrastructure.
Full Text Available With the increasing development of computer and communications technology come growing needs for information systems security, and the problem of security must be approached with greater caution. Alongside computer and communication technologies, numerous tools have been developed to protect files and other information. A set of tools, procedures, policies and solutions to defend against attacks is collectively referred to as computer network security. It is necessary above all to define and learn about the concepts of attack, risk, threat, vulnerability and asset value. During the design and implementation of information systems, a set of measures should be taken into account to increase security and keep risk at an acceptable level. In any case, there is a need to know the risks in the information system. Sources of potential security problems are challenges and attacks, while risk relates to the probable outcome and its associated costs due to the occurrence of certain events. There are numerous techniques that help protect a computer: cryptography, authentication, verified software, licenses and certificates, valid authorization... This paper explains some of the procedures and potential threats used to break into networks and computers, as well as the programs involved. The guidance on and explanation of these programs is meant not to enable break-ins into someone else's computer, but to highlight how vulnerable computers can be.
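One of the protection techniques listed above, authentication, can be illustrated with a minimal password-verification sketch. This is a generic example, not from the paper; the function names are hypothetical, and PBKDF2-SHA256 with a per-user salt is assumed as the key-derivation scheme.

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a salted digest; store (salt, digest), never the plain password."""
    salt = salt or os.urandom(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Re-derive the digest and compare in constant time to resist timing attacks."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("correct horse")
print(verify_password("correct horse", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))    # False
```

The salt defeats precomputed-table attacks, and the high iteration count slows brute-force guessing; both are standard hardening measures against the break-in techniques the paper discusses.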
Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of the systematic review or each of its tasks. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated, while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128
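One of the review tasks named above, the meta-analysis calculation, is straightforward to automate. A minimal sketch follows, assuming the textbook fixed-effect inverse-variance method; the function name and trial data are illustrative, not from the survey.

```python
def fixed_effect_meta(effects, variances):
    """Pool per-trial effect sizes, weighting each trial by 1/variance."""
    weights = [1.0 / v for v in variances]
    pooled_effect = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_variance = 1.0 / sum(weights)  # variance of the pooled estimate
    return pooled_effect, pooled_variance

# Three hypothetical trials: effect sizes and the variances of those estimates.
effects, variances = [0.40, 0.25, 0.55], [0.04, 0.08, 0.02]
pooled, var = fixed_effect_meta(effects, variances)
print(round(pooled, 3), round(var, 3))  # 0.464 0.011
```

Precise trials (small variance) dominate the pooled estimate; a production system would add heterogeneity statistics and random-effects models on top of this core calculation.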
This is the first of two issues commemorating the 30th anniversary of the Lawrence Livermore National Laboratory. The early history of the laboratory is reviewed, including: the LLNL-Nevada organization; project Plowshare; the chemistry and materials science department; and development of computer systems. (GHT)
Abdeldayem, Hossin A.; Frazier, Donald O.; Penn, Benjamin; Paley, Mark S.; Witherow, William K.; Banks, Curtis; Hicks, Rosilen; Shields, Angela
The rapidly increasing demand for greater speed and efficiency on the information superhighway requires significant improvements over conventional electronic logic circuits. Optical interconnections and optical integrated circuits are strong candidates to overcome the extreme limitations that conventional electronic logic circuits impose on the growth of computational speed and complexity. The new optical technology has increased the demand for high-quality optical materials. NASA's recent involvement in processing optical materials in space has demonstrated that a new and unique class of high-quality optical materials is processible in a microgravity environment. Microgravity processing can induce improved order in these materials and could have a significant impact on the development of optical computers. We will discuss NASA's role in processing these materials and report on some of the associated nonlinear optical properties, which are quite useful for optical computer technology.
This document is the August, 1995 issue of the Science and Technology review, a Lawrence Berkeley Laboratory publication. It contains two major articles, one on Scanning Tunneling Microscopy - as applied to materials engineering studies, and one on risk assessment, in this case looking primarily at a health care problem. Separate articles will be indexed from this journal to the energy database.
Antonakos, James L
Covering a broad range of new topics in computer technology and programming, this volume discusses encryption techniques, SQL generation, Web 2.0 technologies, and visual sensor networks. It also examines reconfigurable computing, video streaming, animation techniques, and more. Readers will learn about an educational tool and game to help students learn computer programming. The book also explores a new medical technology paradigm centered on wireless technology and cloud computing designed to overcome the problems of increasing health technology costs.
Computer Based Training (CBT) offers great potential for revolutionizing the training environment. Tremendous advances in computer cost performance, instructional design science, and authoring systems have combined to put CBT within the reach of all. The ability of today's CBT systems to implement powerful training strategies, simulate complex processes and systems, and individualize and control the training process make it certain that CBT will now, at long last, live up to its potential. This paper reviews the major technologies and trends involved and offers some suggestions for getting started in CBT
Münchgesang, Wolfram; Meisner, Patrick; Yushin, Gleb
Commercial electrochemical capacitors (supercapacitors) are not limited to mobile electronics anymore, but have reached the field of large-scale applications, like smart grid, wind turbines, power for large scale ground, water and aerial transportation, energy-efficient industrial equipment and others. This review gives a short overview of the current state-of-the-art of electrochemical capacitors, their commercial applications and the impact of technological development on performance
Brown, P.S. (ed.)
Research activities at Lawrence Livermore National Laboratory are described in the Energy and Technology Review. This issue includes articles on measuring chromosome changes in people exposed to cigarette smoke, sloshing-ion experiments in the tandem mirror experiment, aluminum-air battery development, and a speech by Edward Teller on national defense. Abstracts of the first three have been prepared separately for the data base. (GHT)
Digital computers are machines that can be programmed to perform logical and arithmetical operations. Contemporary digital computers are ''universal,'' in the sense that a program that runs on one computer can, if properly compiled, run on any other computer that has access to enough memory space and time. Any one universal computer can simulate the operation of any other; and the set of tasks that any such machine can perform is common to all universal machines. Since Bennett's discovery that computation can be carried out in a non-dissipative fashion, a number of Hamiltonian quantum-mechanical systems have been proposed whose time-evolutions over discrete intervals are equivalent to those of specific universal computers. The first quantum-mechanical treatment of computers was given by Benioff, who exhibited a Hamiltonian system with a basis whose members corresponded to the logical states of a Turing machine. In order to make the Hamiltonian local, in the sense that its structure depended only on the part of the computation being performed at that time, Benioff found it necessary to make the Hamiltonian time-dependent. Feynman discovered a way to make the computational Hamiltonian both local and time-independent by incorporating the direction of computation in the initial condition. In Feynman's quantum computer, the program is a carefully prepared wave packet that propagates through different computational states. Deutsch presented a quantum computer that exploits the possibility of existing in a superposition of computational states to perform tasks that a classical computer cannot, such as generating purely random numbers, and carrying out superpositions of computations as a method of parallel processing. In this paper, we show that such computers, by virtue of their common function, possess a common form for their quantum dynamics
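For concreteness, Feynman's local, time-independent construction mentioned above is commonly written in the following form (this standard expression is not given in the abstract; the gate unitaries \(U_k\) and the clock states \(|k\rangle\) tracking the computational step are assumptions of this sketch):

```latex
H \;=\; \sum_{k=0}^{K-1} \Bigl( U_{k+1} \otimes |k+1\rangle\langle k|
      \;+\; U_{k+1}^{\dagger} \otimes |k\rangle\langle k+1| \Bigr)
```

Each term advances the computation by one gate while incrementing the clock register (or undoes it, in the Hermitian-conjugate term), so a wave packet prepared at clock state \(|0\rangle\) propagates through the sequence of computational states exactly as the abstract describes.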
Computational Topology by Herbert Edelsbrunner and John L. Harer. American Mathematical Society, 2010 - ISBN 978-0-8218-4925-5
Gowrisankaran, Sowjanya; Sheedy, James E
Computer vision syndrome (CVS) is a collection of symptoms related to prolonged work at a computer display. This article reviews the current knowledge about the symptoms, related factors and treatment modalities for CVS. Relevant literature on CVS published during the past 65 years was analyzed. Symptoms reported by computer users are classified into internal ocular symptoms (strain and ache), external ocular symptoms (dryness, irritation, burning), visual symptoms (blur, double vision) and musculoskeletal symptoms (neck and shoulder pain). The major factors associated with CVS are either environmental (improper lighting, display position and viewing distance) and/or dependent on the user's visual abilities (uncorrected refractive error, oculomotor disorders and tear film abnormalities). Although the factors associated with CVS have been identified the physiological mechanisms that underlie CVS are not completely understood. Additionally, advances in technology have led to the increased use of hand-held devices, which might impose somewhat different visual challenges compared to desktop displays. Further research is required to better understand the physiological mechanisms underlying CVS and symptoms associated with the use of hand-held and stereoscopic displays.
McCool, A C; Garand, M M
A survey research study profiled foodservices and foodservice managers in health care and educational institutions that applied computer technology to their operations. The survey also examined the extent to which computers were applied to management and client service functions. Both the size and the type of institution were found to be significantly related to computer usage. The larger the institution, the greater the extent of indicated usage. Educational institutions used computers more than all types of health care institutions. Mainframe systems (time shared internally or externally) were the predominant computers used. Internal mainframe systems and minicomputers were used significantly more by educational institutions than by health care institutions. The manager most likely to use computers was a man of any age with at least a bachelor's degree who was employed full-time within the institution. He had taken at least six business management courses and had at least some understanding of and ability to apply systems management concepts to his daily management practices. Applications were categorized into five functional areas: menu, purchasing/storage, production, client service, and managerial information. Managerial information applications were most frequently reported by all respondents, with large institutions and elementary/secondary schools reporting the greatest usage for those applications. Several purchase/storage and production applications were significantly related to type or to size or to both, with large institutions and college/university foodservices reporting the greatest usage. Menu precosting was the only significant menu function, and that was significant only relative to institutional type. No client service functions were significantly related to either type or size.
Bell, Christopher; Lachman, Roy
NASA programs for manned space flight are in their 27th year. Scientists and engineers who worked continuously on the development of aerospace technology during that period are approaching retirement. The resulting loss to the organization will be considerable. Although this problem is general to the NASA community, the problem was explored in terms of the institutional memory and technical expertise of a single individual in the Man-Systems division. The main domain of the expert was spacecraft lighting, which became the subject area for analysis in these studies. The report starts with an analysis of the cumulative expertise and institutional memory of technical employees of organizations such as NASA. A set of solutions to this problem are examined and found inadequate. Two solutions were investigated at length: hypertext and expert systems. Illustrative examples were provided of hypertext and expert system representation of spacecraft lighting. These computer technologies can be used to ameliorate the problem of the loss of invaluable personnel.
Three review articles are presented. The first describes the Lawrence Livermore Laboratory role in the research and development of oil-shale retorting technology through its studies of the relevant chemical and physical processes, mathematical models, and new retorting concepts. Second is a discussion of investigation of properties of dense molecular fluids at high pressures and temperatures to improve understanding of high-explosive behavior, giant-planet structure, and hydrodynamic shock interactions. Third, by totally computerizing the triple-quadrupole mass spectrometer system, the laboratory has produced a general-purpose instrument of unrivaled speed, selectivity, and adaptability for the analysis and identification of trace organic constituents in complex chemical mixtures. (GHT)
Post, Douglass E.
The primary purpose of `Computational Atomic Structure' is to give a potential user of the Multi-Configuration Hartree-Fock (MCHF) Atomic Structure Package an outline of the physics and computational methods in the package, guidance on how to use the package, and information on how to interpret and use the computational results. The book is successful in all three aspects. In addition, the book provides a good overview and review of the physics of atomic structure that would be useful to the plasma physicist interested in refreshing his knowledge of atomic structure and quantum mechanics. While most of the subjects are covered in greater detail in other sources, the book is reasonably self-contained, and, in most cases, the reader can understand the basic material without recourse to other sources. The MCHF package is the standard package for computing atomic structure and wavefunctions for single or multielectron ions and atoms. It is available from a number of ftp sites. When the code was originally written in FORTRAN 77, it could only be run on large mainframes. With the advances in computer technology, the suite of codes can now be compiled and run on present day workstations and personal computers and is thus available for use by any physicist, even those with extremely modest computing resources. Sample calculations in interactive mode are included in the book to illustrate the input needed for the code, what types of results and information the code can produce, and whether the user has installed the code correctly. The user can also specify the calculational level, from simple Hartree-Fock to multiconfiguration Hartree-Fock. The MCHF method begins by finding approximate wavefunctions for the bound states of an atomic system. This involves minimizing the energy of the bound state using a variational technique. Once the wavefunctions have been determined, other atomic properties, such as the transition rates, can be determined. The book begins with an
Johnson, K.C. (ed.)
This issue of Energy and Technology Review gives the annual review of the programs at Lawrence Livermore National Laboratory. This State of the Laboratory issue includes discussions of all major programs: Defense Systems; Laser Research; Magnetic Fusion Energy; Energy and Earth Sciences; Environmental Technology Program; Biomedical and Environmental Science; Engineering; Physics; Chemistry and Materials Science; Computations; and Administrative and Institutional Services. An index is also given of the 1991 achievements, with contact names and telephone numbers.
Quirk, W.J. (ed.)
The Lawrence Livermore National Laboratory was established in 1952 to do research on nuclear weapons and magnetic fusion energy. Since then, several other major programs have been added, including laser fusion, laser isotope separation, biomedical and environmental science, strategic defense and applied energy technology. These programs, in turn, require research in basic scientific disciplines, including chemistry and materials science, computer science and technology, engineering and physics. In this issue, Harold Brown, the Laboratory's third director and now counselor at the Center for Strategic and International Studies, reminisces about his years at Livermore and comments on the Laboratory's role in the future. Also presented is an article on visualizing dynamic systems in three dimensions. Researchers can use our interactive algorithms to translate massive quantities of numerical data into visual form and can assign the visual markers of their choice to represent three-dimensional phenomena in a two-dimensional setting, such as a monitor screen. Major work has been done in the visualization of climate modeling, but the algorithms can be used for visualizing virtually any phenomena.
The unifying theme of the Nuclear Technology Review 2002 (NTR-2002) is the importance of innovation. Innovation makes it possible to step beyond incremental evolutionary improvements constrained by diminishing returns. For crop production and public health, for example, the sterile insect technique created a whole new path for future improvements, distinctly different from applying ever larger amounts of pesticides. Nuclear techniques offer a new and safer approach to removing the world's estimated 60,000,000 abandoned land mines. New precision techniques create the potential for ever less intrusive and more effective radiation treatments for cancer. For nuclear power, continuing innovation will be a key factor in closing the projection gap between long term global energy scenarios in which nuclear power expands substantially and near term scenarios with only modest expansion or even decline. While the NTR-2002 presents a worldwide review of the state-of-the-art of nuclear science and technology, and not an annual report on IAEA activities, it notes areas where the Agency has a particularly important role to play. Part I of the NTR-2002, 'Fundamentals of Nuclear Development', reviews developments in the field of nuclear, atomic and molecular data. Research reactors remain essential to progress in nuclear science and technology. Part I reviews advances in radioisotope production, the use of accelerators and neutron activation analysis relevant to applications ranging from medicine, particularly the fight against cancer, to industry. Part I also reviews developments in nuclear instrumentation and nuclear fusion, particularly in connection with the International Thermonuclear Experimental Reactor. Part II begins with a summary of nuclear power production in 2001. At the end of 2001 there were 438 nuclear power plants (NPPs) in operation, corresponding to a total capacity of 353 GW(e), more than 10000 reactor-years of cumulative operating experience and about 16% of global
Johnson, Roger L.
Advanced laptop and small personal computer technology is presented in the form of viewgraphs. The following areas of hand-carried computer and mobile workstation technology are covered: background, applications, high-end products, technology trends, requirements for the Control Center application, and recommendations for the future.
(e) unit in Germany and one 148 MW(e) unit in Japan. Current expansion and growth prospects are centred in Asia. Eighteen of the 31 reactors under construction at the end of 2003 are located in China, India, Japan, the Republic of Korea and the Democratic People's Republic of Korea. Twenty-one of the last 30 reactors to have been connected to the grid are in the Far East and South Asia. This year's review also summarises the activities of the International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO), and the application of nuclear technology in environmental protection, water resources development, particularly desalination reactors, agriculture and human health
This issue of Science & Technology Review covers the following topics: (1) We Will Always Need Basic Science--Commentary by Tomas Diaz de la Rubia; (2) When Semiconductors Go Nano--experiments and computer simulations reveal some surprising behavior of semiconductors at the nanoscale; (3) Retinal Prosthesis Provides Hope for Restoring Sight--A microelectrode array is being developed for a retinal prosthesis; (4) Maglev on the Development Track for Urban Transportation--Inductrack, a Livermore concept to levitate train cars using permanent magnets, will be demonstrated on a 120-meter-long test track; and (5) Power Plant on a Chip Moves Closer to Reality--Laboratory-designed fuel processor gives power boost to dime-size fuel cell.
Satish Kumar*, Vishal Thakur, Payal Thakur, Ashok Kumar Kashyap
The cloud computing era is the most resourceful, elastic and scalable period for Internet technology, allowing computing resources to be used over the Internet successfully. Cloud computing provides not only speed, accuracy, storage capacity and efficiency for computing, but also promotes green computing and resource utilization. This research paper gives a brief description of cloud computing, cloud services and cloud security challenges. Also the literature review o...
Hasselbring, Ted S.; Glaser, Candyce H. Williams
Reviews the role of computer technology in promoting the education of children with special needs within regular classrooms, discussing: technologies for students with mild learning and behavioral disorders, speech and language disorders, hearing impairments, visual impairments, and severe physical disabilities. Examines barriers to effective…
Noting a recent increase in the number of cases of computer crime and computer piracy, this paper takes up the question, "How can understanding the social context of computing help us--as parents, educators, and members of government and industry--to educate young people to become morally responsible members of an electronic information…
Shemelis Nigatu Gebremariam
Full Text Available Biodiesel is a fuel with various benefits over conventional diesel fuel. It is derived from renewable resources, it has lower emissions to the environment, it is biodegradable and so has very limited toxicity, and above all its production can be decentralized, giving it potential to help rural economies. However, there are also some worth-mentioning challenges associated with the production of biodiesel. Among them, the most frequently mentioned are the cost of feedstock and the choice of a convenient technology for efficient production of the fuel from diverse feedstock types. There are four main routes by which raw vegetable oil and/or animal fat can be made suitable for use as a substitute fuel in diesel engines without modification. These are direct use or blending of oils, micro-emulsion, thermal cracking or pyrolysis, and transesterification reaction. Due to the quality of the fuel produced, the transesterification method is the most preferred way to produce biodiesel from diverse feedstock types. Through this method, oils and fats (triglycerides) are converted to their alkyl esters, with viscosity reduced to near diesel fuel levels. There are different techniques to carry out the transesterification reaction for biodiesel production. Each technique has its own advantages and disadvantages, as well as its own specifically convenient feedstock character. There are also some very important reaction conditions to be given due attention in each of these techniques for efficient production of biodiesel, such as the molar ratio of alcohol to oil, type and amount of catalyst, reaction temperature, reaction time, reaction medium, and type and relative amount of solvents, among others. This review investigates the main transesterification techniques for biodiesel production in terms of their choice of feedstock character as well as their required reaction conditions for efficient biodiesel production, so as to give an overview of their advantages
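The overall transesterification reaction described above can be summarized in its textbook form (not quoted from the abstract; R denotes the fatty-acid chains, and methanol is assumed as the alcohol):

```latex
\text{triglyceride} \;+\; 3\,\mathrm{CH_3OH}
\;\overset{\text{catalyst}}{\rightleftharpoons}\;
3\,\mathrm{RCOOCH_3} \;+\; \text{glycerol}
```

The stoichiometric alcohol-to-oil molar ratio of 3:1 follows directly from this equation; because the reaction is reversible, an excess of alcohol (ratios such as 6:1 are commonly cited) is typically used to drive the equilibrium toward the methyl esters.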
This issue features articles on multiprocessing computers, fission and fusion breeder reactors, and supracompression of high-explosive detonation products. Abstracts were prepared separately for each article
Njagi, K. O.; Havice, W. L.
Recent advances in the contemporary world, especially in the area of computer technology, have heralded the development and implementation of new and innovative teaching strategies, particularly with the Internet revolution. This study assessed students' attitude towards computer technology. Specifically, the study assessed differences in…
Keengwe, Jared; Anyanwu, Longy O.
The purpose of the study was to determine students' perception of instructional integration of computer technology to improve learning. Two key questions were investigated in this study: (a) What is the students' perception of faculty integration of computer technology into classroom instruction? (b) To what extent does the students' perception of…
The Journal of Applied Science, Engineering and Technology covers research activities and development in the field of Applied Sciences and Technology as it relates to Agricultural Engineering, Biotechnology, Computer Science and Engineering Computations, Civil Engineering, Food Science and ...
Cadarache, France. The IAEA's International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO) grew to 24 members, with the addition in 2005 of Ukraine and the United States of America. Current INPRO activities include completion of a user manual on the INPRO methodology, application of the methodology to assessing innovative nuclear energy systems (INSs) in national and multinational studies, analyses of the role and structure of INSs in meeting energy demands in a sustainable manner, and selection of the most suitable areas for collaborative development. Developments in accelerator-based techniques, production of radioisotopes and some novel uses of nanotechnology are also reported. Nuclear technologies continue to play key and often unique roles in food production and safety, in human and animal health, in water resource management and in the environment. Mutation breeding of crops, for example, has led to the use of previously unusable land in many countries for rice production. In human health, the use of stable isotopes is becoming an accepted tool for the development of nutrition programmes. Nuclear medicine is benefiting from technological advances in computing. Sustainable water management and desalination remain high on the international agenda. New developments in isotopic analysis of hydrological samples hold promise for increasing the use of isotopes in water resources management. Advances in sampling and analytical techniques have assisted in better understanding of the environment. Developments in all these areas are also reported
Quirk, W.J.; Canada, J.; de Vore, L.; Gleason, K.; Kirvel, R.D.; Kroopnick, H.; McElroy, L.
This issue highlights the Lawrence Livermore National Laboratory's 1993 accomplishments in our mission areas and core programs: economic competitiveness, national security, energy, the environment, lasers, biology and biotechnology, engineering, physics, chemistry, materials science, computers and computing, and science and math education. Secondary topics include: nonproliferation, arms control, international security, environmental remediation, and waste management.
Computing of the future will be affected by a number of fundamental technologies in development today, many of which are already on the way to becoming commercialized. In this series of lectures, we will discuss hardware and software development that will become mainstream in the timeframe of a few years and how they will shape or change the computing landscape - commercial and personal alike. Topics range from processor and memory aspects, programming models and the limits of artificial intelligence, up to end-user interaction with wearables or e-textiles. We discuss the impact of these technologies on the art of programming, the data centres of the future and daily life. On the second day of the Future Computing Technology series, we will talk about ubiquitous computing. From smart watches through mobile devices to virtual reality, computing devices surround us, and innovative new technologies are introduced every day. We will briefly explore how this propagation might continue, how computers can take ove...
Villegas, David; Rodero, Ivan; Fong, Liana; Bobroff, Norman; Liu, Yanbin; Parashar, Manish; Sadjadi, S. Masoud
The fields of Grid, Utility and Cloud Computing have a set of common objectives in harnessing shared resources to optimally meet a great variety of demands cost-effectively and in a timely manner. Since Grid Computing started its technological journey about a decade earlier than Cloud Computing, the Cloud can benefit from the technologies and experience of the Grid in building an infrastructure for distributed computing. Our comparison of Grid and Cloud starts with their basic characteristics and interaction models with clients, resource consumers and providers. Then the similarities and differences in architectural layers and key usage patterns are examined. This is followed by an in-depth look at the technologies and best practices that have applicability from Grid to Cloud computing, including scheduling, service orientation, security, data management, monitoring, interoperability, simulation and autonomic support. Finally, we offer insights on how these techniques will help solve the current challenges faced by Cloud computing.
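Scheduling is the first of the transferable techniques the survey lists. The kind of resource placement problem involved can be sketched with a toy first-fit scheduler (an invented Python illustration, not an algorithm from the paper):

```python
def first_fit(jobs, nodes):
    """Toy first-fit scheduler: place each job (name, CPU demand) on the
    first node with spare capacity. Invented illustration of the class
    of scheduling techniques shared by Grid and Cloud systems."""
    free = list(nodes)        # remaining capacity per node
    placement = {}
    for job, demand in jobs:
        for i, capacity in enumerate(free):
            if capacity >= demand:
                free[i] -= demand
                placement[job] = i
                break
    return placement

# Two 4-CPU nodes; job "b" overflows onto the second node.
plan = first_fit([("a", 2), ("b", 3), ("c", 2)], [4, 4])
```

Real Grid and Cloud schedulers add queues, priorities and backfilling on top of this basic bin-packing core.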
McMahon, D H
The October 2003 issue of Science & Technology Review consists of the following articles: (1) Award-Winning Technologies from Collaborative Efforts--Commentary by Hal Graboske; (2) BASIS Counters Airborne Bioterrorism--The Biological Aerosol Sentry and Information System is the first integrated biodefense system; (3) In the Chips for the Coming Decade--A new system is the first full-field lithography tool for use at extreme ultraviolet wavelengths; (4) Smoothing the Way to Print the Next Generation of Computer Chips--With ion-beam thin-film planarization, the reticles and projection optics made for extreme ultraviolet lithography are nearly defect-free; (5) Eyes Can See Clearly Now--The MEMS-based adaptive optics phoropter improves the process of measuring and correcting eyesight aberrations; (6) This Switch Takes the Heat--A thermally compensated Q-switch reduces the light leakage on high-average-power lasers; (7) Laser Process Forms Thick, Curved Metal Parts--A new process shapes parts to exact specifications, improving their resistance to fatigue and corrosion cracking; and (8) Characterizing Tiny Objects without Damaging Them--Livermore researchers are developing nondestructive techniques to probe the Lilliputian world of mesoscale objects.
Hatherleigh Co., Ltd., New York, NY.
Designed as a special continuing education program for rehabilitation professionals, this document is divided into five lessons. Lesson 1, "Technology Assessment: Determining the Needs" (Ricardo G. Cerna), includes discussion on technology assessment and determining the needs of the clients as well as different types of assistive…
Three articles and two briefs discuss ongoing research at Lawrence Livermore National Laboratory. Topics in this issue include: construction of human chromosome library (brief); dispersion of liquified gases (brief); magma evolution; energy flow diagrams; and computer simulation of particulate flow
Fisher, Charles, Ed.; Dwyer, David C., Ed.; Yocam, Keith, Ed.
This volume examines learning in the age of technology, describes changing practices in technology-rich classrooms, and proposes new ways to support teachers as they incorporate technology into their work. It commemorates the eleventh anniversary of the Apple Classrooms of Tomorrow (ACOT) Project, when Apple Computer, Inc., in partnership with a…
Buckler, Chris; Koperski, Kevin; Loveland, Thomas R.
Although technology education evolved over time, and pressure increased to infuse more engineering principles and increase links to STEM (science, technology, engineering, and mathematics) initiatives, there has never been an official alignment between technology and engineering education and computer science. There is movement at the federal level…
The OSART programme of the IAEA has become an effective vehicle for promoting international co-operation for the enhancement of plant operational safety. In order to maintain consistency in the OSART reviews, OSART Guidelines have been developed which are intended to ensure that the reviewing process is comprehensive. Computer technology is an area in which rapid development is taking place and new applications may be computerized to further enhance safety and the effectiveness of the plant. Supplementary guidance and reference material is needed to help attain comprehensiveness and consistency in OSART reviews. This document is devoted to the utilization of on-site and off-site computers in such a way that the safe operation of the plant is supported. In addition to the main text, there are several annexes illustrating adequate practices as found at various operating nuclear power plants. Refs, figs and tabs
Duoss, Eric B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kotta, Paul R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Meissner, Caryn N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chinn, Ken [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
This is the September 2017 edition of LLNL's Science & Technology Review. At Lawrence Livermore National Laboratory, we focus on science and technology research to ensure our nation's security. We also apply that expertise to solve other important national problems in energy, bioscience, and the environment. Science & Technology Review is published eight times a year to communicate, to a broad audience, the Laboratory's scientific and technological accomplishments in fulfilling its primary missions. The publication's goal is to help readers understand these accomplishments and appreciate their value to the individual citizen, the nation, and the world.
Research is described in three areas, high-technology design of unconventional, nonnuclear weapons, a model for analyzing special nuclear materials safeguards decisions, and a nuclear weapons accident exercise (NUWAX-81)
EKREN, Nazmi; DURSUN, Bahtiyar; AYKUT, Ercan
It is well known that the computer is a vital component of lighting technology for lighting designers. Lighting computer programs are preferred in preparing architectural projects involving lighting techniques, especially for lighting calculations. These programs, which arose with the aim of helping lighting designers, gain more interest day by day. Their most important property is the ability to simulate lighting projects without requiring any ...
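The core of the lighting calculations such programs automate is the point-by-point method, i.e. the inverse-square cosine law. A minimal sketch, assuming a single point source (the function name and parameters are our own, not from the article):

```python
import math

def point_source_illuminance(intensity_cd, distance_m, incidence_deg=0.0):
    """Illuminance (lux) on a surface from a point source:
    E = I * cos(theta) / d**2  (inverse-square cosine law)."""
    return intensity_cd * math.cos(math.radians(incidence_deg)) / distance_m ** 2

# A 1000 cd source 2 m away, light arriving perpendicular to the surface:
E = point_source_illuminance(1000.0, 2.0)  # 250.0 lux
```

Lighting programs evaluate this (plus interreflection models) over thousands of surface points, which is why designers rely on them for full projects.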
The year 2006 saw increasing activities in the field of nuclear power. Significant plans for expansion were announced in some countries and plans for introducing nuclear power in some others. The year began with announcements by both the Russian Federation and the United States of America of international fuel cycle proposals in anticipation of a substantial expansion of nuclear power worldwide. In January, Russian President Vladimir Putin outlined a proposal to create 'a system of international centres providing nuclear fuel cycle services, including enrichment, on a non-discriminatory basis and under the control of the IAEA'. In February, the USA proposed a Global Nuclear Energy Partnership to develop advanced recycling technologies that would not separate pure plutonium; international collaboration in supplying fuel for States which agree not to pursue enrichment and reprocessing; advanced reactors to consume recycled spent fuel while providing energy; and safe and secure small reactors suited to the needs of developing countries. New medium-term projections by the IAEA and the International Energy Agency present a picture with opportunities for substantial nuclear expansion, but still with notable uncertainty. A number of countries have announced plans for significant expansion: China, India, Japan, Pakistan, the Russian Federation and the Republic of Korea. Announcements of planned license applications by US companies and consortia mentioned approximately 25 new reactors. Two site preparation applications were submitted in Canada. A major energy review by the United Kingdom concluded that new nuclear power stations would make a significant contribution to meeting the UK's energy policy goals. Utilities from Estonia, Lithuania and Latvia launched a joint feasibility study of a new nuclear power plant to serve all three countries, and Belarus, Egypt, Indonesia, Nigeria and Turkey made announcements of steps they are taking toward their first nuclear power plants
The report delineates the various computer system components and extrapolates past trends in light of industry goals and physical limitations to predict what individual components and entire systems will look like in the second half of this decade. T...
Hardman, John; Tu, Eugene (Technical Monitor)
The Computing, Information and Communications Technology Program (CICT) was established in 2001 to ensure NASA's continuing leadership in emerging technologies. It is a coordinated, Agency-wide effort to develop and deploy key enabling technologies for a broad range of mission-critical tasks. The NASA CICT program is designed to address Agency-specific computing, information, and communications technology requirements beyond the projected capabilities of commercially available solutions. The areas of technical focus have been chosen for their impact on NASA's missions, their national importance, and the technical challenge they provide to the Program. In order to meet its objectives, the CICT Program is organized into the following four technology focused projects: 1) Computing, Networking and Information Systems (CNIS); 2) Intelligent Systems (IS); 3) Space Communications (SC); 4) Information Technology Strategic Research (ITSR).
Brief discussions are given on research work at the National CTR Computer Center at Livermore, high-field superconducting magnets for fusion reactors, and digital image processing for plasma diagnostics. Some developmental work on multifilamentary superconductors of Nb--Ti and Nb--Sn is described. Some complex mathematical techniques for enhancing and restoring digital images for plasma diagnostics are discussed
Noor, A. K.; Storaasli, O. O.; Fulton, R. E.
Advances in computer technology which may have an impact on computational mechanics and flight vehicle structures technology were reviewed. The characteristics of supersystems, highly parallel systems, and small systems are summarized. The interrelations of numerical algorithms and software with parallel architectures are discussed. A scenario for future hardware/software environment and engineering analysis systems is presented. Research areas with potential for improving the effectiveness of analysis methods in the new environment are identified.
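The interrelation of numerical algorithms with highly parallel architectures that the review discusses rests on data decomposition: split the work, compute partial results, then reduce. A toy Python sketch of that pattern (invented for illustration, not from the review):

```python
def chunked_sum(values, n_workers=4):
    """Illustrative decompose-and-reduce: split the data into roughly
    n_workers chunks, compute partial sums (in a real system, one per
    processor), then reduce the partial results."""
    chunk = max(1, len(values) // n_workers)
    partials = [sum(values[i:i + chunk]) for i in range(0, len(values), chunk)]
    return sum(partials)

total = chunked_sum(list(range(10)))  # same result as sum(range(10))
```

On actual parallel hardware the partial sums run concurrently; the algorithmic structure, which is the review's concern, is the same.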
Stowers, I.F.; Crawford, R.B.; Esser, M.A.; Lien, P.L.; O'Neal, E.; Van Dyke, P.
The state of the laboratory address by LLNL Director Roger Batzel is summarized, and a breakdown of the laboratory funding is given. The Livermore defense-related committment is described, including the design and development of advanced nuclear weapons as well as research in inertial confinement fusion, nonnuclear ordnance, and particle beam technology. LLNL is also applying its scientific and engineering resources to the dual challenge of meeting future energy needs without degrading the quality of the biosphere. Some representative examples are given of the supporting groups vital for providing the specialized expertise and new technologies required by the laboratory's major research programs
Carr, R.B.; McCleb, C.S.; Prono, J.K.
Brief discussions of research progress on the following topics are given: (1) lasers and laser applications, (2) advanced energy systems, (3) science and technology, and (4) national security. Some experiments on the in-flight laser irradiation of ammonia pellets are discussed
VanDalsem, William R.
The goal of the Computing, Information, and Communications Technology (CICT) program is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communications technologies. This viewgraph presentation includes diagrams of how the political guidance behind CICT is structured. The presentation profiles each part of the NASA Mission in detail, and relates the Mission to the activities of CICT. CICT's Integrated Capability Goal is illustrated, and hypothetical missions which could be enabled by CICT are profiled. CICT technology development is profiled.
This journal contains 7 articles pertaining to astrophysics. The first article is an overview of the other 6 articles and also a tribute to Jim Wilson and his work in the fields of general relativity and numerical astrophysics. The six articles are on the following subjects: (1) computer simulations of black hole accretion; (2) calculations on the collapse of the iron core of a massive star; (3) stellar-collapse models which reveal a possible site for nucleosynthesis of elements heavier than iron; (4) modeling sources for gravitational radiation; (5) the development of a computer program for finite-difference mesh calculations and its applications to astrophysics; (6) the existence of neutrinos with nonzero rest mass is used to explain the universe. Abstracts of each of the articles were prepared separately. (SC)
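The finite-difference mesh calculations mentioned in item (5) advance a field on a grid by replacing derivatives with neighbor differences. A minimal sketch for the 1-D diffusion (heat) equation, invented for illustration and unrelated to the actual LLNL code:

```python
def heat_step(u, alpha=0.1):
    """One explicit finite-difference step of du/dt = d2u/dx2 on a 1-D
    mesh, with alpha = dt/dx**2 and fixed (Dirichlet) boundary values."""
    return [u[0]] + [
        u[i] + alpha * (u[i - 1] - 2 * u[i] + u[i + 1])
        for i in range(1, len(u) - 1)
    ] + [u[-1]]

# An initial spike diffuses outward with each step.
u0 = [0.0, 0.0, 1.0, 0.0, 0.0]
u1 = heat_step(u0)
```

Astrophysical codes apply the same stencil idea to hydrodynamics in multiple dimensions, with far more elaborate physics per mesh point.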
Mandal, Jyotsna; Auluck, Nitin; Nagarajaram, H
This book highlights a collection of high-quality peer-reviewed research papers presented at the Ninth International Conference on Advanced Computing & Communication Technologies (ICACCT-2015) held at Asia Pacific Institute of Information Technology, Panipat, India during 27–29 November 2015. The book discusses a wide variety of industrial, engineering and scientific applications of the emerging techniques. Researchers from academia and industry present their original work and exchange ideas, information, techniques and applications in the field of Advanced Computing and Communication Technology.
Research programs at LLNL are reviewed. This issue discusses validation of the pulsed-power design for FXR, the NOVA plasma shutter, thermal control of the MFTF superconducting magnet, a low-energy x-ray spectrometer for pulsed-source diagnostics, micromachining, the electronics engineer's design station, and brazing with a laser microtorch
Carr, R.B.; Bathgate, M.B.; Crawford, R.B.; McCaleb, C.S.; Prono, J.K. (eds.)
The chief objective of LLL's biomedical and environmental research program is to enlarge mankind's understanding of the implications of energy-related chemical and radioactive effluents in the biosphere. The effluents are studied at their sources, during transport through the environment, and at impact on critical resources, important ecosystems, and man himself. We are pursuing several projects to acquire such knowledge in time to guide the development of energy technologies toward safe, reasonable, and optimal choices.
Three areas of research are discussed: microcomputer technology applied to inspecting machined parts to determine roundness in ultraprecision measurements; development of an electrolytic technique for preparing dinitrogen pentoxide as a potentially less expensive step in the large-scale synthesis of the explosive HMX; and the application of frequency conversion to short wavelengths in the Novette and Nova lasers to improve the performance of inertial-confinement fusion targets. (GHT)
The use of encapsulation technology to produce a compliant waste form is an outgrowth of existing polymer industry technology and applications. During the past 12 years, the Department of Energy (DOE) has been researching the use of this technology to treat mixed wastes (i.e., wastes containing both hazardous and radioactive components). The two primary encapsulation techniques are microencapsulation and macroencapsulation. Microencapsulation is the thorough mixing of a binding agent with a powdered waste, such as incinerator ash. Macroencapsulation coats the surface of bulk wastes, such as lead debris. Cement, modified cement, and polyethylene are the binding agents which have been researched the most. Cement and modified cement have been the most commonly used binding agents to date. However, recent research conducted by DOE laboratories has shown that polyethylene is more durable and cost effective than cements. Polyethylene significantly outperforms cement and modified cement in compressive strength, leachability, resistance to chemical degradation, and related measures. Because higher waste loads can be used with polyethylene encapsulant, the total cost of polyethylene encapsulation is significantly lower than that of cement treatment. The only research lacking in the assessment of polyethylene encapsulation treatment for mixed wastes is pilot and full-scale testing with actual waste materials. To date, only simulated wastes have been tested. The Rocky Flats Environmental Technology Site had planned to conduct pilot studies using actual wastes during 1996. This experiment should provide similar results to the previous tests that used simulated wastes. If this hypothesis is validated as anticipated, it will be clear that polyethylene encapsulation should be pursued by DOE to produce compliant waste forms
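The cost argument (higher waste loading means less final waste form, hence lower total cost) reduces to simple arithmetic. The numbers and function below are invented for illustration only; the abstract gives no actual figures.

```python
def waste_form_cost(waste_kg, waste_loading, cost_per_kg_form):
    """Invented illustration: total treatment cost falls as the waste
    loading fraction (waste mass / final form mass) rises, which is the
    abstract's case for polyethylene over cement."""
    form_mass = waste_kg / waste_loading   # total mass of the final waste form
    return form_mass * cost_per_kg_form

# Hypothetical: same per-kg form cost, polyethylene at higher loading
cement = waste_form_cost(100.0, 0.5, 2.0)
poly = waste_form_cost(100.0, 0.7, 2.0)   # cheaper purely from loading
```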
Stowers, I.F. (ed.)
Three research programs at Lawrence Livermore Laboratory are described. (1) The solid-state microscope, specifically designed for computer input, enables automated high-resolution population screening for blood-cell abnormalities or early signs of cancer. Nonmedical applications appear possible in powder metallurgy, geology, and semiconductor fabrication. (2) The studies of ion-atom collisions have led to improved atomic-structure measurements, new techniques for determining elemental composition, and better x-ray detector calibrations. (3) A new and promising source of high-power laser radiation has characteristics that may make it feasible for the production of fusion power on a commercial scale. (GHJ)
Seacord, C. L.; Vaughn, D.
A multi-year, multi-faceted program is underway to investigate and develop potential improvements in airframes, engines, and avionics for general aviation aircraft. The objective of this study was to assemble information that will allow the government to assess the trends in computer and computer/operator interface technology that may have application to general aviation in the 1980's and beyond. The current state of the art of computer hardware is assessed, technical developments in computer hardware are predicted, and nonaviation large volume users of computer hardware are identified.
An overview is given of research programs at a two-stage light-gas gun facility. Representative gas-gun experiments are described, and the impact of this research on other LLNL programs and on high-pressure physics work in general are discussed. Particular applications reported include: measurement of equations of state for various materials, synthesis and study of novel materials, and studies of high explosives. Specialized diagnostic techniques for gas-gun experiments are reviewed
Brief reviews are presented of research programs at Lawrence Livermore Laboratory. In one, fast and precise measurement techniques to meet the demanding specifications for microsphere targets used in inertial-confinement fusion experiments are described. Another program is described in which a Raman-spectroscopy microprobe is used to perform molecular-structure analyses on submicron-size particles. Finally, the first year of the controlled thermonuclear reactions program is described
Brey, Philip A.E.; Soraker, Johnny; Meijers, A.
Philosophy has been described as having taken a “computational turn,” referring to the ways in which computers and information technology throw new light upon traditional philosophical issues, provide new tools and concepts for philosophical reasoning, and pose theoretical and practical questions
Bauer, Janice P
This book presents leading-edge research from across the globe in the field of computer science research, technology and applications. Each contribution has been carefully selected for inclusion based on the significance of the research to this fast-moving and diverse field. Some topics included are: network topology; agile programming; virtualization; and reconfigurable computing.
Following a brief report on the production of pulsed beams of slow positrons (0.1-10 keV), articles are presented in three subject areas. The first, on earthquake safety of nuclear power plants, describes a method of earthquake risk assessment that addresses the interrelationships of a nuclear power plant and its emergency safety features. The second reports studies of the geology of the Yucca Flat area of the Nevada Test Site that should help minimize the risk of accidental venting and reduce the cost of drilling emplacement holes for nuclear tests. The third presents a new computer-based flow cytometric technique, developed at LLNL, that makes it possible to classify chromosomes according to both their shape and their DNA content
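Classifying chromosomes by two measured features, as in the flow cytometric technique above, is at heart a nearest-centroid decision. The sketch below is an invented toy, not the LLNL method; feature values and class names are hypothetical.

```python
def classify(point, centroids):
    """Toy nearest-centroid classifier over two features (e.g. a shape
    metric and DNA content). Invented illustration, not the actual
    flow-cytometric algorithm."""
    def dist2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    return min(centroids, key=lambda label: dist2(point, centroids[label]))

# Hypothetical class centroids in (shape, DNA-content) space
centroids = {"chr1": (0.0, 0.0), "chr2": (1.0, 1.0)}
label = classify((0.2, 0.1), centroids)
```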
Zhang, Shufen; Yan, Hongcan; Chen, Xuebin
With the development of multi-core processors, virtualization, distributed storage, broadband Internet and automatic management, a new computing mode named cloud computing has emerged. It distributes computation tasks across a resource pool consisting of massive numbers of computers, so that application systems can obtain computing power, storage space and software services according to demand. It concentrates all the computing resources and manages them automatically through software, without human intervention. This frees application providers from tedious details and lets them be more absorbed in their business, which is advantageous to innovation and reduces cost. The ultimate goal of cloud computing is to provide calculation, services and applications as a public facility, so that people can use computing resources just as they use water, electricity, gas and telephone. Currently, the understanding of cloud computing is still developing and changing constantly, and cloud computing has no unanimous definition. This paper describes the three main service forms of cloud computing - SaaS, PaaS and IaaS - compares the definitions of cloud computing given by Google, Amazon, IBM and other companies, summarizes the basic characteristics of cloud computing, and emphasizes key technologies such as data storage, data management, virtualization and the programming model.
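The difference between the three service forms comes down to how much of the stack the provider manages. A minimal sketch, using the common textbook layer split (the layer names and boundaries are a simplification, not taken from this paper):

```python
# Textbook simplification of the cloud stack, bottom to top.
STACK = ["hardware", "virtualization", "os", "middleware", "runtime", "app"]

# How many layers (from the bottom) the provider manages in each model;
# the exact split varies by vendor, this is a common approximation.
SERVICE_MODELS = {"IaaS": 2, "PaaS": 5, "SaaS": 6}

def provider_managed(model):
    """Layers the cloud provider manages under a given service model;
    the consumer manages everything above them."""
    return STACK[:SERVICE_MODELS[model]]
```

Under this split, an IaaS consumer still installs and patches the OS, a PaaS consumer deploys only the application, and a SaaS consumer just uses it.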
In a nutshell, the existing Internet provides us content in the form of videos, emails and information served up in web pages. With Cloud Computing, the next generation of the Internet will allow us to "buy" IT services from a web portal, drastically expanding the types of merchandise available beyond those on e-commerce sites such as eBay and Taobao. We would be able to rent from a virtual storefront the basic necessities to build a virtual data center - CPU, memory, storage - and add on top of that the necessary middleware: web application servers, databases, enterprise service bus, etc., as the platform(s) to support the applications we would like either to rent from an Independent Software Vendor (ISV) or to develop ourselves. Together this is what we call "IT as a Service," or ITaaS, bundled for us end users as a virtual data center.
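Composing a virtual data center from rented building blocks, as described above, can be sketched as a simple inventory with a metered bill. Resource names and prices below are invented purely for illustration.

```python
from dataclasses import dataclass

# Hypothetical "virtual storefront" items; names and prices are invented.
@dataclass
class VirtualResource:
    name: str
    monthly_cost: float

def monthly_bill(resources):
    """Total monthly cost of the rented virtual data center."""
    return sum(r.monthly_cost for r in resources)

vdc = [
    VirtualResource("vCPU x4", 40.0),
    VirtualResource("memory 16 GB", 20.0),
    VirtualResource("storage 500 GB", 10.0),
    VirtualResource("web app server", 25.0),
]
bill = monthly_bill(vdc)  # 95.0
```

The pay-per-resource model is exactly what distinguishes "buying IT services" from buying physical hardware outright.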
In 2011, nuclear energy continued to play an important role in global electricity production despite the accident at the Fukushima Daiichi nuclear power plant (NPP). Total generating nuclear power capacity was slightly lower than in previous years due to the permanent shutdown of 13 reactors in 2011, including 8 in Germany and 4 in Japan in the wake of the accident. However, there were 7 new grid connections compared to 5 in 2010, 2 in 2009 and none in 2008. Significant growth in the use of nuclear energy worldwide is still anticipated - between 35% and 100% by 2030 - although the Agency projections for 2030 are 7-8% lower than projections made in 2010. The factors that have contributed to an increased interest in nuclear power did not change: an increasing global demand for energy, concerns about climate change, energy security and uncertainty about fossil fuel supplies. Most of the growth is still expected in countries that already have operating NPPs, especially in Asia, with China and India remaining the main centres of expansion while the Russian Federation will also remain a centre of strong growth. The 7-8% drop in projected growth for 2030 reflects an accelerated phase-out of nuclear power in Germany, some immediate shutdowns and a government review of the planned expansion in Japan, as well as temporary delays in expansion in several other countries. Measures taken by countries as a result of the Fukushima Daiichi nuclear accident have been varied. A number of countries announced reviews of their programmes. Belgium, Germany and Switzerland took additional steps to phase out nuclear power entirely while others re-emphasized their expansion plans. Many Member States carried out national safety assessment reviews in 2011 (often called 'stress tests'), and commitments were made to complete any remaining assessments promptly and to implement the necessary corrective action. In countries considering the introduction of nuclear power, interest remained strong
Vogt, Ramona L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chinn, Ken B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kotta, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Meissner, Caryn N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
At Lawrence Livermore National Laboratory, we focus on science and technology research to ensure our nation’s security. We also apply that expertise to solve other important national problems in energy, bioscience, and the environment. Science & Technology Review is published eight times a year to communicate, to a broad audience, the Laboratory’s scientific and technological accomplishments in fulfilling its primary missions. The publication’s goal is to help readers understand these accomplishments and appreciate their value to the individual citizen, the nation, and the world.
, on a non-discriminatory basis and under the control of the IAEA. Nineteen countries signed a statement of principles of the Global Nuclear Energy Partnership, which aims at accelerating development and deployment of advanced fuel cycle technologies to foster development, improve the environment, and reduce the risk of nuclear proliferation. The NRC approved the release of most of the Big Rock Point nuclear power plant site and most of the Yankee Rowe nuclear power plant site for unrestricted public use. Thus, ten power plants around the world have been completely decommissioned with their sites released for unconditional use. Seventeen plants have been partially dismantled and safely enclosed. Thirty-two are being dismantled prior to eventual site release, and thirty-four reactors are undergoing minimum dismantling prior to long term enclosure. In September, the IAEA launched a new Network of Centres of Excellence for Decommissioning to improve the flow of knowledge and experience among those engaged in decommissioning and to encourage organizations in developed Member States to contribute to the activities of Member States requiring decommissioning assistance. Nuclear and isotopic techniques continue to make substantive contributions in agriculture, human health, the marine and terrestrial environments as well as in water resource management. In food and agriculture, plant mutation breeding is supporting the development of new varieties of crops that bring enhanced yields while also providing significant environmental benefits through reduced requirements for fertilizers and increased resistance to biotic and abiotic stresses. The genetic enhancement of biomass crops is useful in responding to increasing demands for biofuels. In addition to the continuing use of irradiation for sanitary purposes, the use of irradiation for phytosanitary applications, especially those applications related to quarantine measures, is increasing.
In 2009, construction started on 12 new nuclear power reactors, the largest number since 1985, and projections of future nuclear power growth were once again revised upwards. However, only two new reactors were connected to the grid, and, with three reactors retired during the year, the total nuclear power capacity around the world dropped slightly for the second year in a row. Current expansion, as well as near term and long term growth prospects, remain centred in Asia. Ten of the 12 construction starts were in Asia, as were both of the new grid connections. Although the global financial crisis that started in the second half of 2008 did not dampen overall projections for nuclear power, it was cited as a contributing factor in near-term delays or postponements affecting nuclear projects in some regions of the world. In some European countries where previously there were restrictions on the future use of nuclear power, there was a trend towards reconsidering these policies. Interest in starting new nuclear power programmes remained high. Over 60 Member States have expressed to the IAEA interest in considering the introduction of nuclear power, and, in 2009, the IAEA conducted its first Integrated Nuclear Infrastructure Review missions in Jordan, Indonesia and Vietnam. Estimates of identified conventional uranium resources (at less than $130/kg U) increased slightly, due mainly to increases reported by Australia, Canada and Namibia. Uranium spot prices declined, and final data for 2009 are expected to show a consequent decrease in uranium exploration and development. The Board of Governors has authorized the IAEA Director General to sign an agreement with the Russian Federation to establish an international reserve of low enriched uranium (LEU). It would contain 120 tonnes of LEU that could be made available to a country affected by a non-commercial interruption of its LEU supply. The agreement between the IAEA and the Russian Federation was signed in March 2010
The year 2008 was paradoxical for nuclear power. Projections of future growth were revised upwards, but no new reactors were connected to the grid. It was the first year since 1955 without at least one new reactor coming on-line. There were, however, ten construction starts, the most since 1985. At least until the global financial crisis, cost estimates reported for new nuclear reactors were often higher than those in previous years, particularly in regions with less recent experience in new construction. However, growth targets for nuclear power were raised in the Russian Federation, and similar considerations were under review in China. India negotiated a safeguards agreement with the Agency in August, and the Nuclear Suppliers Group subsequently exempted India from previous restrictions on nuclear trade, which should allow India to accelerate its planned expansion of nuclear power. In the USA, the Nuclear Regulatory Commission (NRC) received combined licence (COL) applications for 26 new reactors. The US Department of Energy (USDOE) received 19 'Part I applications' for Federal loan guarantees to build 21 new reactors. Nonetheless, current expansion, as well as near term and long term growth prospects, remain centred in Asia. Of the ten construction starts in 2008, eight were in Asia. Twenty-eight of the 44 reactors under construction at the end of the year were in Asia, as were 28 of the last 39 new reactors to have been connected to the grid. Armenia joined the Russian Federation and Kazakhstan as members of the International Uranium Enrichment Centre in Angarsk, Siberia. The Ukrainian Government announced that Ukraine would also join. AREVA and USEC applied to the USDOE for loan guarantees for the construction of AREVA's proposed Eagle Rock Enrichment Facility and USEC's American Centrifuge Plant. Construction of an underground repository for low and medium level radioactive waste began at the former Konrad iron mine in Germany. The USDOE submitted a formal
Computing of the future will be affected by a number of fundamental technologies in development today, many of which are already on the way to becoming commercialized. In this series of lectures, we will discuss hardware and software development that will become mainstream in the timeframe of a few years and how they will shape or change the computing landscape - commercial and personal alike. Topics range from processor and memory aspects, programming models and the limits of artificial intelligence, up to end-user interaction with wearables or e-textiles. We discuss the impact of these technologies on the art of programming, the data centres of the future and daily life. On the third day of the Future Computing Technology series, we will touch on societal aspects of the future of computing. Our perception of computers may at times seem passive, but in reality we are a vital link in the feedback loop. Human-computer interaction, innovative forms of computers, privacy, process automation, threats and medica...
Proteins are one of the most versatile modular assembling systems in nature. Experimentally, more than 110 000 protein structures have been identified and more are deposited every day in the Protein Data Bank. Such an enormous structural variety is to a first approximation controlled by the sequence of amino acids along the peptide chain of each protein. Understanding how the structural and functional properties of the target can be encoded in this sequence is the main objective of protein design. Unfortunately, rational protein design remains one of the major challenges across the disciplines of biology, physics and chemistry. The implications of solving this problem are enormous and branch into materials science, drug design, evolution and even cryptography. For instance, in the field of drug design an effective computational method to design protein-based ligands for biological targets such as viruses, bacteria or tumour cells, could give a significant boost to the development of new therapies with reduced side effects. In materials science, self-assembly is a highly desired property and soon artificial proteins could represent a new class of designable self-assembling materials. The scope of this review is to describe the state of the art in computational protein design methods and give the reader an outline of what developments could be expected in the near future. (topical review)
In this report a review of available codes is performed and some code intercomparisons are also discussed. The number of codes treating natural waters (groundwater, lake water, sea water) is large. Most geochemical computer codes treat equilibrium conditions, although some codes with kinetic capability are available. A geochemical equilibrium model consists of a computer code, solving a set of equations by some numerical method, and a data base consisting of the thermodynamic data required for the calculations. There are some codes which treat coupled geochemical and transport modeling. Some of these codes solve the equilibrium and transport equations simultaneously, while others solve the equations separately. The coupled codes require large computer capacity and have thus so far seen limited use. Three code intercomparisons have been found in the literature. It may be concluded that there are many codes available for geochemical calculations, but most of them require a user who is quite familiar with the code. The user also has to know the geochemical system in order to judge the reliability of the results. A high quality data base is necessary to obtain reliable results. The best results may be expected for the major species of natural waters. For more complicated problems, including trace elements, precipitation/dissolution, adsorption, etc., the results seem to be less reliable. (With 44 refs.) (author)
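To make the equilibrium-model idea concrete, here is a minimal sketch (not any of the reviewed codes) of the kind of calculation such a code performs: a numerical method (bisection on the charge balance) coupled to a tiny "thermodynamic data base" of assumed textbook constants, solving the carbonate speciation of a dilute CO2 solution.

```python
# Tiny thermodynamic "data base": water and carbonate equilibrium
# constants at 25 degC (assumed textbook values, activity effects ignored).
KW, KA1, KA2 = 1e-14, 10**-6.35, 10**-10.33
CT = 1e-3  # total dissolved inorganic carbon, mol/L

def charge_imbalance(ph):
    """Charge balance residual: [H+] - [OH-] - [HCO3-] - 2[CO3--]."""
    h = 10**-ph
    denom = h*h + KA1*h + KA1*KA2        # carbonate speciation denominator
    hco3 = CT * KA1*h / denom
    co3 = CT * KA1*KA2 / denom
    return h - KW/h - hco3 - 2*co3

def solve_ph(lo=2.0, hi=12.0, iters=80):
    """Bisection: the residual is positive at the acid end, negative at the
    basic end, so halving the bracket converges on the equilibrium pH."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if charge_imbalance(mid) > 0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

print(round(solve_ph(), 2))  # ~4.68 for a dilute CO2 solution
```

Real codes solve dozens of coupled mass-action and mass-balance equations, typically by Newton-Raphson iteration against databases of hundreds of species, but the structure (equations plus thermodynamic data) is the same.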
Martin, Hedley G
Mathematics for Engineering, Technology and Computing Science is a text on mathematics for courses in engineering, technology, and computing science. It covers linear algebra, ordinary differential equations, and vector analysis, together with line and multiple integrals. This book consists of eight chapters and begins with a discussion on determinants and linear equations, with emphasis on how the value of a determinant is defined and how it may be obtained. Solution of linear equations and the dependence between linear equations are also considered. The next chapter introduces the reader to
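As an illustration of that opening topic (how the value of a determinant is defined and how it may be obtained), here is a short sketch of cofactor (Laplace) expansion along the first row; the function name and example matrices are our own:

```python
def det(m):
    """Determinant by cofactor (Laplace) expansion along the first row."""
    n = len(m)
    if n == 1:
        return m[0][0]
    total = 0
    for j in range(n):
        # minor: delete row 0 and column j, then recurse
        minor = [row[:j] + row[j+1:] for row in m[1:]]
        total += (-1)**j * m[0][j] * det(minor)
    return total

print(det([[2, 1], [5, 3]]))   # 2*3 - 1*5 = 1
print(det([[1, 2, 3],
           [4, 5, 6],
           [7, 8, 10]]))       # -3
```

Cofactor expansion is O(n!) and is only practical for small matrices; for larger systems the determinant is obtained as a by-product of Gaussian elimination.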
GÜNBAYI, İlhan; CANTÜRK, Gökhan
The aim of the study is to determine the usage of computer technology in school administration, primary school administrators' attitudes towards computer technology, and administrators' and teachers' computer literacy levels. The study was modeled as a survey study. The population of the study consists of primary school principals and assistant principals in public primary schools in the center of Antalya. The data were collected from 161 (51%) administrator questionnaires in 68 of 129 public primary s...
Antonio Mariano Carlos Junior
Cloud computing represents a new paradigm that enables the utility computing model, in which computing resources are offered and consumed as a service on demand, pay as you go, in a model similar to that of electricity. The objective of this work is to identify threats and opportunities of the cloud computing model and its influence on the strategy of companies that consume Information Technology. To do so, a survey was conducted with leading IT executives from 64 medium and large companies in Brazil. We noticed the influence of cloud computing in shifting the focus of corporate IT departments to the management of services and contracts, resulting in demand for professionals with combined IT analyst and business profiles.
This edited book presents scientific results of the 4th International Conference on Applied Computing and Information Technology (ACIT 2016), which was held on December 12–14, 2016 in Las Vegas, USA. The aim of this conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the numerous fields of computer science and to share their experiences and exchange new ideas and information in a meaningful way. The conference also aimed to present research results on all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them. The conference organizers selected the best papers from those accepted for presentation at the conference. The papers were chosen based on review scores submitted by members of the Program Committee, and underwent further rigorous rounds of review. Th...
The DOE Office of ADP Management organized a group of scientists and computer professionals, mostly from their own national laboratories, to prepare an annually updated technology forecast to accompany the Department's five-year ADP Plan. The activities of the task force were originally reported in an informal presentation made at the ACM Conference in 1978. This presentation represents an update of that report. It also deals with the process of applying the results obtained at a particular computing center, Brookhaven National Laboratory. Computer technology forecasting is a difficult and hazardous endeavor, but it can yield considerable advantages. A forecast performed on an industry-wide basis can be applied to the particular needs of a given installation, and thus give installation managers considerable guidance in planning. A beneficial side effect of this process is that it forces installation managers, who might otherwise tend to preoccupy themselves with immediate problems, to focus on longer term goals and the means to achieve them
Raju, K; Mandal, Jyotsna; Bhateja, Vikrant
The book covers all aspects of computing, communication, general sciences and educational research presented at the Second International Conference on Computer & Communication Technologies, held during 24-26 July 2015 at Hyderabad and hosted by CMR Technical Campus in association with Division V (Education & Research) CSI, India. After a rigorous review, only quality papers were selected and included in this book. The entire book is divided into three volumes, which cover a variety of topics including medical imaging, networks, data mining, intelligent computing, software design, image processing, mobile computing, digital signals and speech processing, video surveillance and processing, web mining, wireless sensor networks, circuit analysis, fuzzy systems, antenna and communication systems, biomedical signal processing and applications, cloud computing, embedded systems applications and cyber security and digital forensics. The readers of these volumes will benefit greatly from the te...
Computing of the future will be affected by a number of fundamental technologies in development today, many of which are already on the way to becoming commercialized. In this series of lectures, we will discuss hardware and software development that will become mainstream in the timeframe of a few years and how they will shape or change the computing landscape - commercial and personal alike. Topics range from processor and memory aspects, programming models and the limits of artificial intelligence, up to end-user interaction with wearables or e-textiles. We discuss the impact of these technologies on the art of programming, the data centres of the future and daily life. Lecturer's short bio: Andrzej Nowak has 10 years of experience in computing technologies, primarily from CERN openlab and Intel. At CERN, he managed a research lab collaborating with Intel and was part of the openlab Chief Technology Office. Andrzej also worked closely and initiated projects with the private sector (e.g. HP and Go...
The year 2004 marked the 50th anniversary of civilian nuclear power generation. While the current outlook for nuclear energy remains mixed, there is clearly a sense of rising expectations. Both the OECD International Energy Agency and the IAEA adjusted their medium-term projections for nuclear power upwards. The IAEA now projects 423 - 592 GW(e) of nuclear power installed worldwide in 2030, compared to 366 GW(e) at the end of 2004. This is driven by nuclear power's performance record, by growing energy needs around the world coupled with rising oil and natural gas prices, by new environmental constraints including entry-into-force of the Kyoto Protocol, by concerns about energy supply security in a number of countries, and by ambitious expansion plans in several key countries. National research on advanced reactor designs continues on all reactor categories - water cooled, gas cooled, liquid metal cooled, and hybrid systems. Five members of the US-initiated Generation IV International Forum (GIF) signed a framework agreement on international collaboration in research and development on Generation IV nuclear energy systems in February 2005. The IAEA's International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO) grew to 23 members. It completed a series of case studies testing its assessment methodology and the final report on the updated INPRO methodology was published in December. The realization of the International Thermonuclear Experimental Reactor, ITER, came closer with the announcement on 28 June 2005 by the ITER parties. The aim of ITER is to demonstrate the scientific and technological feasibility of fusion energy by constructing a functional fusion power plant. Nuclear technology developments are rapid and cover many fields of application. Not all can be covered in this update review, but certain key areas and trends are covered where these are seen to be of significant interest to IAEA Member States, and which are of relevance to and have
Crespin, Timothy R; Austin, James T
This article reviews computer applications developed and utilized by industrial-organizational (I-O) psychologists, both in practice and in research. A primary emphasis is on applications developed for Internet usage, because this "network of networks" changes the way I-O psychologists work. The review focuses on traditional and emerging topics in I-O psychology. The first topic involves information technology applications in measurement, defined broadly across levels of analysis (persons, groups, organizations) and domains (abilities, personality, attitudes). Discussion then focuses on individual learning at work, both in formal training and in coping with continual automation of work. A section on job analysis follows, illustrating the role of computers and the Internet in studying jobs. Shifting focus to the group level of analysis, we briefly review how information technology is being used to understand and support cooperative work. Finally, special emphasis is given to the emerging "third discipline" in I-O psychology research-computational modeling of behavioral events in organizations. Throughout this review, themes of innovation and dissemination underlie a continuum between research and practice. The review concludes by setting a framework for I-O psychology in a computerized and networked world.
Small computing devices which rival the compact size of traditional personal digital assistants (PDA) have recently established a market niche. These computing devices are small enough to be considered unobtrusive for humans to wear. The computing devices are also powerful enough to run full multi-tasking general purpose operating systems. This paper will explore the wearable computer information system for dismounted applications recently fielded for ground-based US Air Force use. The environments that the information systems are used in will be reviewed, as well as a description of the net-centric, ground-based warrior. The paper will conclude with a discussion regarding the importance of intuitive, usable, and unobtrusive operator interfaces for dismounted operators.
Luis de la Fuente Valentin
This paper presents a desk research that analysed available recent studies in the field of Technology Enhanced Learning. The desk research is focused on work produced in the frame of the FP6 and FP7 European programmes in the area of Information and Communication Technologies. It concentrates on technologies that support existing forms of learning, and also on technologies that enhance new learning paradigms. This approach includes already adopted and successfully piloted technologies. The elaboration of the desk research had three main parts: firstly, the collection of documents from CORDIS and other institutions related to TEL research; secondly, the identification of relevant terms appearing in those documents and the elaboration of a thesaurus; and thirdly, a quantitative analysis of each term's occurrences. Many of the identified technologies belong to the fields of interactive multimedia and human-computer interaction, or are related to recommendation and learning analytics. This study constitutes a thorough review of the current state of these fields through the actual development of R&D European projects. This research will be used as a basis to better understand the evolution of the sector, and to focus future research efforts on these sectors and their application to education.
Wikler, David; Coussaert, Olivier; Schoovaerts, Frédéric; Joly, Anthony; Levivier, Marc
Stereotactic radiosurgery treatment principles and irradiation techniques have shown little evolution since their introduction in 1968. Conversely, technological progress linked to computers has had a major impact on the methods used for treatment planning and dose delivery. In order to fully comprehend modern radiosurgery approaches, one has to acquire good insight into the underlying technology, specifically computer software. In this chapter, we describe the evolution from X-ray films to high-resolution digital imaging, the shift from simple trigonometric calculation to highly complex algorithms, and new perspectives in patient follow-up. While these changes open new prospects, they also add complexity, which leads to new pitfalls and limits of the stereotactic radiosurgery method.
Yang, Kunde; Tian, Mengjun; Zhang, Hainan; Zhao, Yamei
This paper introduces one of the young, energetic and rapidly growing research fields in biomedical engineering: brain-computer interface (BCI) technology, which can provide augmentative communication and control capabilities to patients with severe motor disabilities. We summarize the first two international meetings on BCI and present the most representative research results. The problems in current studies and directions for future investigation are analyzed.
This study aims to describe the current state of cloud computing usage in the department of Computer Education and Instructional Technology (CEIT) amongst teacher trainees in the School of Necatibey Education, Balikesir University, Turkey. In this study, a questionnaire with open-ended questions was used. 17 CEIT teacher trainees…
Pressure for better measurement of stated learning outcomes has resulted in a demand for more frequent assessment. The resources available are seen to be static or dwindling, but Information and Communications Technology is seen to increase productivity by automating assessment tasks. This paper reviews computer-assisted assessment (CAA) and suggests future developments. A search was conducted of CAA-related literature from the past decade to trace the development of CAA from the beginnings of its large-scale use in higher education. Lack of resources, individual inertia and risk propensity are key barriers for individual academics, while proper resourcing and cultural factors outweigh technical barriers at the institutional level.
Lecca, G.; Petitdidier, M.; Hluchy, L.; Ivanovic, M.; Kussul, N.; Ray, N.; Thieron, V.
Advances in e-Infrastructure promise to revolutionize sensing systems and the way in which data are collected and assimilated, and complex water systems are simulated and visualized. According to the EU Infrastructure 2010 work-programme, data and compute infrastructures and their underlying technologies, whether oriented to tackling scientific challenges or to complex problem solving in engineering, are expected to converge into so-called knowledge infrastructures, leading to more effective research, education and innovation in the next decade and beyond. Grid technology is recognized as a fundamental component of e-Infrastructures. Nevertheless, this emerging paradigm highlights several topics, including data management, algorithm optimization, security, performance (speed, throughput, bandwidth, etc.), and scientific cooperation and collaboration issues, that require further examination to fully exploit it and to better inform future research policies. The paper illustrates the results of six different surface and subsurface hydrology applications that have been deployed on the Grid. All the applications aim to answer strong requirements from civil society at large, relative to natural and anthropogenic risks. Grid technology has been successfully tested to improve flood prediction, groundwater resources management and Black Sea hydrological survey by providing large computing resources. It is also shown that Grid technology facilitates e-cooperation among partners by means of services for authentication and authorization, seamless access to distributed data sources, data protection and access rights, and standardization.
Adumitroaie, Virgil; Hua, Hook; Lincoln, William; Block, Gary; Mrozinski, Joseph; Shelton, Kacie; Weisbin, Charles; Elfes, Alberto; Smith, Jeffrey
Strategic Assessment of Risk and Technology (START) is a user-friendly computer program that assists human managers in making decisions regarding research-and-development investment portfolios in the presence of uncertainties and of non-technological constraints, which include budgetary and time limits, restrictions related to infrastructure, and programmatic and institutional priorities. START facilitates quantitative analysis of technologies, capabilities, missions, scenarios and programs, and thereby enables the selection and scheduling of value-optimal development efforts. START incorporates features that, variously, perform or support a unique combination of functions, most of which are not systematically performed or supported by prior decision-support software. These functions include the following: optimal portfolio selection using an expected-utility-based assessment of capabilities and technologies; temporal investment recommendations; distinctions between enhancing and enabling capabilities; analysis of partial funding for enhancing capabilities; and sensitivity and uncertainty analysis. START can run on almost any computing hardware, within Linux and related operating systems (including Mac OS X versions 10.3 and later), and can run in Windows under the Cygwin environment. START can be distributed in binary code form. START calls, as external libraries, several open-source software packages. Output is in Excel (.xls) file format.
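To illustrate the kind of expected-utility-based portfolio selection START performs, here is a minimal hypothetical sketch; the technology names, costs, probabilities and values are invented for the example, and START's actual optimizer is far richer (scheduling, partial funding, sensitivity analysis):

```python
from itertools import combinations

# Hypothetical candidate investments: name -> (cost, p_success, value).
# Expected utility of a selection = sum of p_success * value.
techs = {
    "propulsion": (4, 0.6, 10),
    "avionics":   (3, 0.9, 5),
    "sensors":    (2, 0.8, 4),
    "materials":  (5, 0.5, 12),
}
BUDGET = 9

def expected_utility(names):
    return sum(p * v for _, p, v in (techs[n] for n in names))

def best_portfolio():
    """Brute-force search over all subsets that fit within the budget."""
    best, best_eu = (), 0.0
    for r in range(len(techs) + 1):
        for combo in combinations(techs, r):
            cost = sum(techs[n][0] for n in combo)
            if cost <= BUDGET and expected_utility(combo) > best_eu:
                best, best_eu = combo, expected_utility(combo)
    return set(best), best_eu

portfolio, eu = best_portfolio()
print(portfolio, round(eu, 1))
```

Brute-force enumeration is only feasible for a handful of candidates; a production tool would use integer programming or dynamic programming for the same knapsack-style selection.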
Boonkrong, Sirapat; Unger, Herwig
This proceedings book presents recent research work and results in the area of communication and information technologies. The chapters of this book contain the main, well-selected and reviewed contributions of scientists who met at the 12th International Conference on Computing and Information Technology (IC2IT), held during 7-8 July 2016 in Khon Kaen, Thailand. The book is divided into three parts: “User Centric Data Mining and Text Processing”, “Data Mining Algorithms and their Applications” and “Optimization of Complex Networks”.
Dennerlein, Jack T
Because mobile computing technologies, such as notebook computers, smart mobile phones, and tablet computers, afford users many different configurations through their intended mobility, there is concern about their effects on musculoskeletal pain and a need for usage recommendations. Therefore, the main goal of this paper is to determine which best practices surrounding the use of mobile computing devices can be gleaned from current field and laboratory studies. An expert review was completed. Field studies have documented the various user configurations, often including non-neutral postures, that users adopt when using mobile technology, along with some evidence suggesting that longer durations of use are associated with more discomfort. It is therefore prudent for users to take advantage of their mobility and not get stuck in any given posture for too long. The use of accessories such as appropriate cases or riser stands, as well as external keyboards and pointing devices, can also improve postures and comfort. Overall, the state of ergonomics for mobile technology is a work in progress and there are more research questions to be addressed.
We have entered the age of "terascale" scientific computing. Processors and system architecture both continue to evolve; hundred-teraFLOP computers are expected in the next few years, and petaFLOP computers toward the end of this decade are conceivable. This ever-increasing power to solve previously intractable numerical problems benefits almost every field of science and engineering and is revolutionizing some of them, notably including accelerator physics and technology. At existing accelerators, it will help us optimize performance, expand operational parameter envelopes, and increase reliability. Design decisions for next-generation machines will be informed by unprecedentedly comprehensive and accurate modeling, as well as computer-aided engineering; all this will increase the likelihood that even their most advanced subsystems can be commissioned on time, within budget, and up to specifications. Advanced computing is also vital to developing new means of acceleration and exploring the behavior of beams under extreme conditions. With continued progress it will someday become reasonable to speak of a complete numerical model of all phenomena important to a particular accelerator.
Poyneer, L A
This month's issue has the following articles: (1) A New Era in Climate System Analysis - Commentary by William H. Goldstein; (2) Seeking Clues to Climate Change - By comparing past climate records with results from computer simulations, Livermore scientists can better understand why Earth's climate has changed and how it might change in the future; (3) Finding and Fixing a Supercomputer's Faults - Livermore experts have developed innovative methods to detect hardware faults in supercomputers and help applications recover from errors that do occur; (4) Targeting Ignition - Enhancements to the cryogenic targets for National Ignition Facility experiments are furthering work to achieve fusion ignition with energy gain; (5) Neural Implants Come of Age - A new generation of fully implantable, biocompatible neural prosthetics offers hope to patients with neurological impairment; and (6) Incubator Busy Growing Energy Technologies - Six collaborations with industrial partners are using the Laboratory's high-performance computing resources to find solutions to urgent energy-related problems.
Wang, Jingyu; Zhang, Min; Gao, Zhongxue; Adhikari, Benu
Fresh foods are perishable, seasonal and regional in nature, and their storage, transportation, and preservation of freshness are quite challenging. Smart storage technologies can detect and monitor, online, changes in the quality parameters and storage environment of fresh foods during storage, so that operators can make timely adjustments to reduce losses. This article reviews smart storage technologies from two aspects: online detection technologies and smart monitoring technologies for fresh foods. Online detection technologies include the electronic nose, nuclear magnetic resonance (NMR), near infrared spectroscopy (NIRS), hyperspectral imaging and computer vision. Smart monitoring technologies mainly include intelligent indicators for monitoring changes in the storage environment. Smart storage technologies applied to fresh foods need to be highly efficient, nondestructive and competitively priced. In this work, we critically review the principles, applications, and development trends of smart storage technologies.
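The intelligent indicators described above essentially compare monitored parameters against acceptable bands. As a minimal sketch (not from the article; the function names and the temperature/humidity bands for a hypothetical cold room are assumptions), such a threshold check might look like:

```python
# Hypothetical sketch of a threshold-based storage monitor: readings outside
# the configured band are flagged, much as an intelligent indicator would.
from dataclasses import dataclass

@dataclass
class Band:
    low: float
    high: float

    def check(self, value: float) -> bool:
        """Return True when the reading is inside the acceptable band."""
        return self.low <= value <= self.high

# Assumed acceptable conditions for a hypothetical fresh-food cold room.
BANDS = {"temp_c": Band(0.0, 4.0), "humidity_pct": Band(85.0, 95.0)}

def alerts(reading: dict) -> list:
    """List the parameters that have drifted out of their band."""
    return [name for name, band in BANDS.items()
            if name in reading and not band.check(reading[name])]

print(alerts({"temp_c": 6.2, "humidity_pct": 90.0}))  # ['temp_c']
```

A real system would feed such checks from continuous sensor streams rather than single readings, but the band-comparison logic is the same.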
Majernik, Jaroslav; Pancerz, Krzysztof; Zaitseva, Elena
This book presents the latest results and selected applications of Computational Intelligence in Biomedical Technologies. Most of the contributions deal with problems of Biomedical and Medical Informatics, ranging from theoretical considerations to practical applications. Various aspects of the development of methods and algorithms in Biomedical and Medical Informatics, as well as algorithms for medical image processing and modeling methods, are discussed. Individual contributions also cover medical decision-making support, estimation of treatment risks, reliability of medical systems, problems of practical clinical applications and many other topics. This book is intended for scientists interested in problems of Biomedical Technologies, for researchers and academic staff, for all those dealing with Biomedical and Medical Informatics, and for PhD students. Useful information is also offered to IT companies, developers of equipment and/or software for medicine, and medical professionals.
Ümmü Gülsüm DURUKAN
Full Text Available. The purpose of this study is to determine the “technology” perception of prospective computer education and instructional technologies (CEIT) teachers through metaphors. For this purpose, the study was carried out with 53 first-year prospective teachers studying in the Department of CEIT at a public university in the fall term of the 2014-2015 academic year. Forms containing the statement “Technology is like …………… because ………”, to be completed several times, were used as the data collection tool. A phenomenographic design was used in the study, and the data were analyzed by content analysis. According to the study's findings, of the 118 valid metaphors developed by the prospective teachers, 103 fell into the positive category, 7 into the negative category and 8 into the neutral category.
Nwaogu, Ugochukwu Chibuzoh; Tiedje, Niels Skat
The importance of foundry coating in improving the surface quality of castings cannot be overemphasized. The application of mould and core washes creates a high thermal integrity barrier between the metal and the mould, reducing the thermal shock experienced by the sand system. These thermal shocks lead to a series of surface defects such as veining/finning, metal penetration, burn-on/in, scabbing, rat tails, erosion, etc. The use of coatings reduces the tendency for these defects to occur. However, an understanding of the coating, its components, characteristics and mechanism of action is important. In this review, a detailed description of these topics is given, with examples provided where necessary. A potential area of research in foundry coating development, using the sol-gel process, is suggested. The application of sol-gel technology to the development of foundry coatings is a novel approach.
Zachenhofer, Iris; Cejna, Manfred; Schuster, Antonius; Donat, Markus; Roessler, Karl
Computed tomography angiography (CTA) is a time- and cost-saving investigation for the postoperative evaluation of patients with clipped cerebral aneurysms. A retrospective study was conducted to analyse image quality and artefact generation due to implanted aneurysm clips using a new technology. Multislice CTA (MSCTA) was performed pre- and postoperatively using a Philips Brilliance 64-detector-row CT scanner. Altogether, 32 clipping sites were analysed in 27 patients (11 female and 16 male; mean age 52 years, range 24 to 72). The mean clip number per aneurysm was 2.3 (range 1 to 4); 54 clips were made of titanium alloy and 5 of cobalt alloy. Overall, image quality was rated 1.8 on average, on a scale from 1 (very good) to 5 (unserviceable), and clip artefacts were rated 2.4 on average, on a 5-point rating scale (1, no artefacts; 5, unserviceable due to artefacts). A significant loss of image quality and rise in artefacts was found when cobalt alloy clips were used (1.4 versus 4.2 and 2.1 versus 4.0). In 72% of all investigations, excellent image quality was found; excluding the cobalt clip group, 85% of scans showed excellent image quality. Artefacts were absent or minimal (grade 1 or 2) in 69% of all investigations and in 81% in the pure titanium clip group. In 64-row MSCTA of good image quality with low artefacts, it was possible to detect small aneurysm remnants of 2 mm in size in individual patients. With titanium alloy clips, up to 85% of postoperative CTA images in our study were of excellent quality, with absent or minimal artefacts in 81%, and seem adequate to detect small aneurysm remnants. Copyright 2010 Elsevier B.V. All rights reserved.
The tides are caused by gravitational attraction of the sun and the moon acting upon the world's oceans. This creates a clean renewable form of energy which can in principle be tapped for the benefit of mankind. This paper reviews the status of tidal energy, including the magnitude of the resource, the technology which is available for its extraction, the economics, possible environmental effects and non-technical barriers to its implementation. Although the total energy flux of the tides is large, at about 2 TW, in practice only a very small fraction of this total potential can be utilised in the foreseeable future. This is because the energy is spread diffusely over a wide area, requiring large and expensive plant for its collection, and is often available remote from centres of consumption. The best mechanism for exploiting tidal energy is to employ estuarine barrages at suitable sites with high tidal ranges. The technology is relatively mature and components are commercially available now. Also, many of the best sites for implementation have been identified. However, the pace and extent of commercial exploitation of tidal energy is likely to be significantly influenced, both by the treatment of environmental costs of competing fossil fuels, and by the availability of construction capital at modest real interest rates. The largest projects could require the involvement of national governments if they are to succeed. (author) 8 figs., 2 tabs., 19 refs
Kristiansen, M.; Schoenbach, K.M.; Schaefer, G.
A review of opening switch technology is given, together with a classification of the opening switches applied in pulsed power technology. The most familiar opening switches are fuses. It is shown that a strong oxidizer (H2O2 in water), especially in combination with Al wires, increases the maximum voltage. Thermally driven opening switches are the result of attempts to achieve the speed and economy of fuse opening switches with the added advantage of repetitive operation. The search for candidate materials for this type of opening switch is in its infancy, and it is difficult to predict how successful such a switch may be. Explosive opening switches offer the possibility of precise timing and permit the delay before explosion to be controlled independently of the current flowing through the switch. Plasma guns, the dense plasma focus and MHD switches are also considered. Diffuse discharge opening switches are attractive for repetitive operation. The plasma erosion switch operates on a very short time scale of 10 ns to 100 ns, with regard to both conduction and opening times.
Advanced Information Technology in Education
The volume includes a set of selected papers, extended and revised, from the 2011 International Conference on Computers and Advanced Technology in Education. With the development of computers and advanced technology, human social activities are changing fundamentally. Education, and especially education reform in different countries, has benefited greatly from computers and advanced technology. Generally speaking, education is a field that needs a great deal of information, and computers, advanced technology and the internet are good information providers. Also, with the aid of computers and advanced technology, education can be made more effective. Therefore, computers and advanced technology should be regarded as important media in modern education. The volume Advanced Information Technology in Education provides a forum for researchers, educators, engineers, and government officials involved in the general areas of computers and advanced technology in education to d...
Rajaa Mahdi Musawi
During dental education, dental students learn how to examine patients, make diagnoses, plan treatment and perform dental procedures perfectly and efficiently. However, progress in computer-based technologies, including virtual reality (VR) simulators, augmented reality (AR) and computer-aided design/computer-aided manufacturing (CAD/CAM) systems, has resulted in new modalities for the instruction and practice of dentistry. Virtual reality dental simulators enable repeated, objective and assessable practice in various controlled situations. The superimposition of three-dimensional (3D) virtual images on actual images in AR allows surgeons to simultaneously visualize the surgical site and superimpose informative 3D images of invisible regions on it to serve as a guide. The use of CAD/CAM systems for designing and manufacturing dental appliances and prostheses is well established. This article reviews computer-based technologies, their application in dentistry and their potential and limitations in promoting dental education, training and practice. Practitioners will be able to choose from a broader spectrum of options in their field of practice by becoming familiar with new modalities of training and practice. Keywords: Virtual Reality Exposure Therapy; Immersion; Computer-Aided Design; Dentistry; Education
In this study, the Oak Ridge National Laboratory (ORNL) is performing a technology review to assess the market for commercially available power electronic converters that can be used to connect microturbines to either the electric grid or local loads. The intent of the review is to facilitate an assessment of the present status of marketed power conversion technology to determine how versatile the designs are for potentially providing different services to the grid based on changes in market direction, new industry standards, and the critical needs of the local service provider. The project includes data gathering efforts and documentation of the state-of-the-art design approaches that are being used by microturbine manufacturers in their power conversion electronics development and refinement. This project task entails a review of power converters used in microturbines sized between 20 kW and 1 MW. The power converters permit microturbine generators, with their non-synchronous, high frequency output, to interface with the grid or local loads. The power converters produce 50- to 60-Hz power that can be used for local loads or, using interface electronics, synchronized for connection to the local feeder and/or microgrid. The power electronics enable operation in a stand-alone mode as a voltage source or in grid-connect mode as a current source. Some microturbines are designed to automatically switch between the two modes. The information obtained in this data gathering effort will provide a basis for determining how close the microturbine industry is to providing services such as voltage regulation, combined control of both voltage and current, fast/seamless mode transfers, enhanced reliability, reduced cost converters, reactive power supply, power quality, and other ancillary services. Some power quality improvements will require the addition of storage devices; therefore, the task should also determine what must be done to enable the power conversion circuits to
DOE Order 5637.1, "Classified Computer Security," requires regular reviews of the computer security activities for an ADP system and for a site. Based on experience gained in the Los Alamos computer security program through interactions with DOE facilities, we have developed a methodology to aid a site or security officer in performing a comprehensive computer security review. The methodology is designed to aid a reviewer in defining the goals of the review (e.g., preparation for inspection), determining security requirements based on DOE policies, determining threats and vulnerabilities based on DOE and local threat guidance, and identifying critical system components to be reviewed. Application of the methodology results in review procedures and checklists oriented to the review goals, the target system, and DOE policy requirements. The review methodology can be used to prepare for an audit or inspection and as a periodic self-check tool to determine the status of the computer security program for a site or a specific ADP system. 1 tab
Reynolds, David; Tinker, Mike
MSFC has a strong diverse portfolio of technology development projects, ranging from flight projects to very low Technology Readiness Level (TRL) laboratory projects. The 2015 Year in Review highlights the Center's technology projects and celebrates their accomplishments to raise awareness of technology development work that is integral to the success of future Agency flight programs.
Shirley, D. J.
Southwest Research Institute (SwRI) has developed and delivered spacecraft computers for a number of different near-Earth-orbit spacecraft including shuttle experiments and SDIO free-flyer experiments. We describe the evolution of the basic SwRI spacecraft computer design from those weighing in at 20 to 25 lb and using 20 to 30 W to newer models weighing less than 5 lb and using only about 5 W, yet delivering twice the processing throughput. Because of their reduced size, weight, and power, these newer designs are especially applicable to planetary instrument requirements. The basis of our design evolution has been the availability of more powerful processor chip sets and the development of higher density packaging technology, coupled with more aggressive design strategies in incorporating high-density FPGA technology and use of high-density memory chips. In addition to reductions in size, weight, and power, the newer designs also address the necessity of survival in the harsh radiation environment of space. Spurred by participation in such programs as MSTI, LACE, RME, Delta 181, Delta Star, and RADARSAT, our designs have evolved in response to program demands to be small, low-powered units, radiation tolerant enough to be suitable for both Earth-orbit microsats and for planetary instruments. Present designs already include MIL-STD-1750 and Multi-Chip Module (MCM) technology with near-term plans to include RISC processors and higher-density MCM's. Long term plans include development of whole-core processors on one or two MCM's.
The 2017 Building Technologies Office Peer Review Report summarizes the feedback submitted by reviewers for the 109 Building Technologies Office (BTO) projects presented at the 2017 BTO Peer Review. The report presents an overview of the goals and activities under each technology program area, a summary of project scores for each program, and a brief analysis of general evaluation trends within each program area or its constituent subprograms.
Hussam Alddin Shihab Ahmed
Cloud computing is an internet-based model that enables on-demand access, paid per use, to a shared pool of network resources. It is an innovation that fulfills a client's need for computing resources such as networks, storage, servers, services and applications. Securing data is considered one of the principal challenges and concerns for cloud computing, and this persistent problem is becoming more acute as cloud computing technology evolves. From the clients' perspective, cloud computing poses a security hazard, especially with respect to assurance and data security, which remain the most basic obstacles slowing the adoption of cloud computing services. This paper reviews and analyzes the essential issues of cloud computing and describes the data security and privacy-protection issues in the cloud.
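One concrete ingredient of the data-security concern discussed above is integrity assurance: a client can verify that data retrieved from an untrusted cloud store has not been altered. The sketch below is illustrative only (the in-memory `cloud` dictionary and the function names are assumptions, not anything from the paper):

```python
# Illustrative client-side integrity check: the client keeps a secret key and
# an HMAC tag, so data fetched back from an untrusted store can be verified.
import hmac, hashlib, secrets

KEY = secrets.token_bytes(32)          # stays with the client, never uploaded
cloud = {}                             # stands in for a remote object store

def upload(name: str, data: bytes) -> None:
    tag = hmac.new(KEY, data, hashlib.sha256).hexdigest()
    cloud[name] = (data, tag)

def download(name: str) -> bytes:
    data, tag = cloud[name]
    expected = hmac.new(KEY, data, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError(f"integrity check failed for {name!r}")
    return data

upload("report.txt", b"quarterly figures")
assert download("report.txt") == b"quarterly figures"
```

Note that this addresses integrity only; keeping the data confidential from the provider would additionally require client-side encryption.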
Computing is anticipated to have an increasingly expansive impact on the sciences overall, becoming the third, crucial component of a "golden triangle" that includes mathematics and experimental and theoretical science. However, even more true with computing than with math and science, we are not preparing our students for this new reality. It is…
Zhou, Shuigeng; Liao, Ruiqi; Guan, Jihong
In the past decades, with the rapid development of high-throughput technologies, biology research has generated an unprecedented amount of data. In order to store and process such a great amount of data, cloud computing and MapReduce were applied to many fields of bioinformatics. In this paper, we first introduce the basic concepts of cloud computing and MapReduce, and their applications in bioinformatics. We then highlight some problems challenging the applications of cloud computing and MapReduce to bioinformatics. Finally, we give a brief guideline for using cloud computing in biology research.
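The MapReduce model introduced above splits a job into a map phase, a shuffle that groups intermediate pairs by key, and a reduce phase. A single-process toy version for a bioinformatics-flavoured task, k-mer counting, can sketch the idea (the function names, and the use of plain Python rather than a framework such as Hadoop or Spark, are my assumptions):

```python
# Minimal, single-process illustration of the MapReduce pattern applied to
# counting k-mers in sequencing reads.
from collections import defaultdict
from itertools import chain

def map_phase(read: str, k: int = 3):
    """Emit (k-mer, 1) pairs for one sequencing read."""
    return [(read[i:i + k], 1) for i in range(len(read) - k + 1)]

def shuffle(pairs):
    """Group values by key, as the framework's shuffle stage would."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(key, values):
    """Combine the grouped counts for one k-mer."""
    return key, sum(values)

reads = ["ATGATG", "TGATGA"]
pairs = chain.from_iterable(map_phase(r) for r in reads)
counts = dict(reduce_phase(k, v) for k, v in shuffle(pairs).items())
print(counts["ATG"])  # 3
```

In a real deployment the map and reduce functions run in parallel across many machines, with the framework handling the shuffle, scheduling, and fault tolerance.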
Sheshadri, Holalu; Padma, M
PES College of Engineering is organizing an International Conference on Emerging Research in Electronics, Computer Science and Technology (ICERECT-12) in Mandya, merging the event with the Golden Jubilee of the Institute. The proceedings of the conference present high-quality, peer-reviewed articles from the fields of Electronics, Computer Science and Technology. The book is a compilation of research papers on cutting-edge technologies and is targeted at the scientific community actively involved in research activities.
Since the reform and opening up, science and technology in China have developed continuously, and more and more advanced technologies have emerged amid a trend toward diversification. Multimedia image technology, for example, has had a significant and positive impact on computer-aided manufacturing engineering in China. From the perspective of scientific and technological progress, multimedia image technology has a very positive influence on the application and development of computer-aided manufacturing engineering. Therefore, this paper starts from the concept of multimedia image technology and analyzes its application in computer-aided manufacturing engineering.
Islam, M. Sirajul; Grönlund, Åke
This paper is based on a systematic literature review relevant to classroom integration of computer technologies in schools. The purpose of this review is to gain an accumulated view of uses, impacts and implementations of 1:1 computing initiatives for school children. Unlike previous reviews this study is not limited to certain countries or…
Fulton, R. E.; Voigt, S. J.
A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.
Leite, Bruno Barros; Ribeiro, Nuno Carrilho
Computed Tomography (CT) has been available since the 1970s and has undergone a dramatic technical evolution. Multi-detector technology is the current standard, offering capabilities unthinkable only a decade ago. Yet we must not forget the ionizing nature of CT's scanning energy (X-rays). CT represents the most important source of medically associated radiation exposure to the general public, with a trend to increase. It is imperative to intervene with the objective of dose reduction, following ALARA policies. Currently, some technical advances allow dose reduction without sacrificing diagnostic image quality. However, human intervention is also essential. We must keep investing in education so that CT exams are performed only when they are really useful for clinical decision-making. Alternative techniques should also be considered. Image quality must not be pursued while disregarding the biological effects of radiation. Generally, it is possible to obtain clinically acceptable images with lower-dose protocols. (author)
Calamia, J R
Although the development of computer-aided design (CAD) and computer-aided manufacture (CAM) technology and the benefits of increased productivity became obvious in the automobile and aerospace industries in the 1970s, investigations of this technology's application in the field of dentistry did not begin until the 1980s. Only now are we beginning to see the fruits of this work with the commercial availability of some systems; the potential for this technology seems boundless. This article reviews the recent literature, with emphasis on the period from June 1992 to May 1993. This review should familiarize the reader with some of the latest developments in this technology, including a brief description of some systems currently available and the clinical and economic rationale for their acceptance into the dental mainstream. This article concentrates on a particular system, the Cerec (Siemens/Pelton and Crane, Charlotte, NC) system, for three reasons. First, this system has been available since 1985 and, as a result, has a track record of almost 7 years of data. Most of the data have only recently been released and, consequently, much of this year's literature on CAD-CAM is monopolized by studies using this system. Second, this system was developed as a mobile, affordable, direct chairside CAD-CAM restorative method. As such, it is of special interest to the patient, providing a one-visit restoration. Third, the author is currently engaged in research using this particular system and has a working knowledge of its capabilities.
Simulation modelling has proved to be one of the most powerful tools available to the operations research analyst. The development of micro-computer technology has reached a state of maturity where the micro-computer can provide the necessary computing power, and consequently various powerful and inexpensive simulation languages for micro-computers have become available. This paper attempts to provide an introduction to the general philosophy and characteristics of some of the available micro-computer simulation languages. The emphasis is on the characteristics of the specific micro-computer implementation rather than on a comparison of the modelling features of the various languages. Such comparisons may be found elsewhere.
Pesaran, A. A.
This paper overviews applications of desiccant technology for dehumidifying commercial and institutional buildings. Because of various market, policy, and regulatory factors, this technology is especially attractive for dehumidification applications in the 1990s.
This project aims to strengthen the capacity of the Mozambique Ministry of Science and Technology (MOST) to govern the country's science, technology and innovation (STI) system, and of researchers and policymakers to conduct systematic reviews of STI policy implementation. It will do so by supporting a review of the ...
National Bureau of Standards (DOC), Washington, DC. Inst. for Computer Sciences and Technology.
This bibliography lists publications of the Institute for Computer Sciences and Technology of the National Bureau of Standards. Publications are listed by subject in the areas of computer security, computer networking, and automation technology. Sections list publications of: (1) current Federal Information Processing Standards; (2) computer…
The article explores contradictions in teachers' perceptions regarding the place of computer technologies in education. The research population included 47 teachers who have incorporated computers in the classroom for several years. The teachers expressed positive attitudes regarding the decisive importance of computer technologies in furthering…
Analyzes concepts, technologies and challenges related to mobile computing and networking. Defines basic concepts of cellular systems. Describes the evolution of wireless technologies that constitute the foundations of mobile computing and ubiquitous networking. Presents characterization and issues of mobile computing. Analyzes economical and…
VanDalsem, William R.
The Computing, Information and Communications Technology (CICT) Program's goal is to enable NASA's Scientific Research, Space Exploration, and Aerospace Technology Missions with greater mission assurance, for less cost, with increased science return through the development and use of advanced computing, information and communication technologies
Soft Computing in Information Communication Technology
This is a collection of the accepted papers concerning soft computing in information communication technology. All accepted papers were subjected to strict peer review by two expert referees. The resulting dissemination of the latest research results, and the exchange of views concerning future research directions in this field, make the work of immense value to all those with an interest in the topics covered. The present book represents a cooperative effort to seek out the best strategies for improving the quality and reliability of Neural Networks, Swarm Intelligence, Evolutionary Computing, Image Processing, Internet Security, Data Security, Data Mining, Network Security and Protection of Data and Cyber Laws. Our sincere appreciation and thanks go to the authors for their contributions to this conference. I hope you can gain much useful information from the book.
In recent years, intense use of computing has been the main strategy of investigation in several scientific research projects. Progress in computing technology has opened unprecedented opportunities for the systematic collection of experimental data, and for associated analyses that were considered impossible only a few years ago. This paper focuses on the strategies in use: it reviews the various components that are necessary for an effective solution that ensures the storage, long-term preservation, and worldwide distribution of the large quantities of data produced by a large scientific research project. The paper also mentions several examples of data management solutions used in High Energy Physics for the CERN Large Hadron Collider (LHC) experiments in Geneva, Switzerland, which generate more than 30,000 terabytes of data every year that need to be preserved, analyzed, and made available to a community of several tens of thousands of scientists worldwide.
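One basic building block of the long-term preservation component described above is fixity checking: a checksum recorded when data are ingested lets any replica site verify, possibly years later, that its copy is bit-for-bit intact. A hedged sketch follows (the catalog structure and names are assumptions for illustration, not the LHC experiments' actual tooling):

```python
# Fixity checking: record a digest at ingest, re-verify replicas against it.
import hashlib

def checksum(data: bytes) -> str:
    """SHA-256 digest, computed in chunks as one would for multi-GB files."""
    h = hashlib.sha256()
    for i in range(0, len(data), 1 << 20):   # 1 MiB chunks
        h.update(data[i:i + (1 << 20)])
    return h.hexdigest()

catalog = {}                                  # file name -> digest at ingest

def ingest(name: str, data: bytes) -> None:
    catalog[name] = checksum(data)

def verify(name: str, data: bytes) -> bool:
    """True if the replica still matches the digest recorded at ingest."""
    return catalog.get(name) == checksum(data)

ingest("run2024.raw", b"\x00\x01\x02" * 1000)
print(verify("run2024.raw", b"\x00\x01\x02" * 1000))  # True
print(verify("run2024.raw", b"corrupted"))            # False
```

Production data management systems layer replication, cataloguing, and scheduled re-verification on top of this simple idea.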
Hosam Farouk El-Sofany
Cloud computing is a new computing model in which grid computing, distributed computing, parallel computing and virtualization technologies together define the shape of a new technology. It is the core technology of the next generation of network computing platforms and, especially in the field of education, cloud computing is the basic environment and platform of future e-learning. It provides secure data storage, convenient internet services and strong computing power. This article focuses on research into the application of cloud computing in e-learning environments. The study shows that a cloud platform is valuable for both students and instructors in achieving course objectives. The paper presents the nature, benefits and services of cloud computing as a platform for e-learning environments.
Building Technologies Office
The 2016 Building Technologies Office Peer Review Report summarizes the feedback submitted by reviewers of the 67 BTO projects presented at the 2016 BTO Peer Review. The report presents an overview of the goals and activities under each technology program area, a summary of project scores for each program, and a brief analysis of general evaluation trends within each program area or its constituent subprograms.
Callan, Judith A; Wright, Jesse; Siegle, Greg J; Howland, Robert H; Kepler, Britney B
Major depression (MDD) is a common and disabling disorder. Research has shown that most people with MDD receive either no treatment or inadequate treatment. Computer and mobile technologies may offer solutions for the delivery of therapies to untreated or inadequately treated individuals with MDD. The authors review currently available technologies and research aimed at relieving symptoms of MDD. These technologies include computer-assisted cognitive-behavior therapy (CCBT), web-based self-help, Internet self-help support groups, mobile psychotherapeutic interventions (i.e., mobile applications or apps), technology enhanced exercise, and biosensing technology. Copyright © 2017 Elsevier Inc. All rights reserved.
This book discusses neural computation, which concerns networks or circuits of biological neurons, and, relatedly, particle accelerators, scientific instruments that accelerate charged particles such as protons, electrons and deuterons. Accelerators have a very broad range of applications in many industrial fields, from high energy physics to medical isotope production. Nuclear technology is one of the fields discussed in this book. The development reached by particle accelerators in energy and particle intensity has opened the possibility of a wide number of new applications in nuclear technology. This book reviews the applications in the nuclear energy field, and the design features of high power neutron sources are explained. Surface treatments of niobium flat samples and superconducting radio frequency cavities by a new technique called gas cluster ion beam are also studied in detail, as well as the process of electropolishing. Furthermore, magnetic devices such as solenoids, dipoles and undulators, which ...
Yang Yanming; Dai Guiling
All previous National Conferences on computer applications in the field of nuclear science and technology sponsored by the Society of Nuclear Electronics and Detection Technology are reviewed. Surveys are given on the basic situations and technique levels of computer applications for each time period. Some points concerning possible developments of computer techniques are given as well.
Iankoulova, Iliana; Daneva, Maia; Rolland, C; Castro, J.; Pastor, O
Many publications have dealt with various types of security requirements in cloud computing but not all types have been explored in sufficient depth. It is also hard to understand which types of requirements have been under-researched and which are most investigated. This paper's goal is to provide a comprehensive and structured overview of cloud computing security requirements and solutions. We carried out a systematic review and identified security requirements from previous publications th...
This report describes the activities of the DEFACTO project, a Design Environment for Adaptive Computing Technology funded under the DARPA Adaptive Computing Systems and Just-In-Time-Hardware programs...
This volume (Volume II) of the Railroad Classification Yard Technology Manual documents the railroad classification yard computer systems methodology. The subjects covered are: functional description of process control and inventory computer systems,...
Lawler, James P.; Joseph, Anthony
Education in entrepreneurship is becoming a critical area of curricula for computer science students. Few schools of computer science have a concentration in entrepreneurship in the computing curricula. The paper presents Technology Entrepreneurship in the curricula at a leading school of computer science and information systems, in which students…
Xu, Linli; Tian, Wenya
The 2012 International Conference on Emerging Computation and Information teChnologies for Education (ECICE 2012) was held on Jan. 15-16, 2012, in Hangzhou, China. The main results of the conference are presented in this proceedings book of carefully reviewed and accepted papers addressing the hottest issues in emerging computation and information technologies used for education. The volume covers a wide range of topics in the area, including Computer-Assisted Education, Educational Information Systems, Web-based Learning, etc.
Alberta Education, 2013
Cloud computing is Internet-based computing in which shared resources, software and information are delivered as a service that computers or mobile devices can access on demand. Cloud computing is already used extensively in education. Free or low-cost cloud-based services are used daily by learners and educators to support learning, social…
Haekkinen, R.J.; Hirsch, C.; Krause, E.; Kytoemaa, H.K. [eds.
The report is a mid-term evaluation of the Computational Fluid Dynamics (CFD) Technology Programme started by the Technology Development Centre Finland (TEKES) in 1995 as a five-year initiative to be concluded in 1999. The main goals of the programme are to increase the know-how and application of CFD in Finnish industry, to coordinate and thus provide a better basis for co-operation between national CFD activities, and to encourage research laboratories and industry to establish co-operation with the international CFD community. The projects of the programme focus on the following areas: (1) studies of modeling the physics and dynamics of the behaviour of fluid material, (2) expressing the physical models in numerical form and developing computer codes, (3) evaluating and testing current physical models and developing new ones, (4) developing new numerical algorithms, solvers, and pre- and post-processing software, and (5) applying the new computational tools to problems relevant to their ultimate industrial use. The report consists of two sections. The first considers issues concerning the whole programme and the second reviews each project.
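As a minimal illustration of area (2) above, expressing a physical model in numerical form as computer code, the sketch below discretizes the 1D heat equation u_t = alpha * u_xx with the classic explicit FTCS finite-difference scheme. The function name heat_step and the toy parameters are illustrative and are not part of the TEKES programme itself.

```python
def heat_step(u, r):
    """One explicit finite-difference (FTCS) step for u_t = alpha * u_xx
    on a uniform grid, with r = alpha * dt / dx**2; the scheme is stable
    for r <= 0.5.  Endpoints are held fixed (Dirichlet boundaries)."""
    return [u[0]] + [
        u[i] + r * (u[i + 1] - 2 * u[i] + u[i - 1])
        for i in range(1, len(u) - 1)
    ] + [u[-1]]

# Diffuse a unit hot spot in the middle of an 11-point cold rod.
u = [0.0] * 5 + [1.0] + [0.0] * 5
for _ in range(50):
    u = heat_step(u, 0.25)  # r = 0.25 satisfies the stability limit
```

Because r <= 0.5, every updated value is a convex combination of its neighbours, so the solution stays bounded between the initial minimum and maximum; exceeding the limit makes the iteration blow up.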
Manhas, Melissa; Kuo, Mu-Hsing
This systematic review examines a total of eighteen studies on the use of health information technologies to improve public health. Health information technologies are tools that allow for the management of health information in computerized systems. Health information technology, including electronic health records, computers/emails, social media, and cellphones/text messaging are becoming widespread and readily accessible to populations around the globe. In this review, the use of these technologies and interventions are discussed and evaluated for their potential to improve public health. This review found some good-quality evidence on the use of electronic health records and little good-quality evidence on the use of email, social media, cell phones and text messaging to improve healthcare, illustrating the need for further study in these areas.
Bhateja, Vikrant; Raju, K; Janakiramaiah, B
The book is a compilation of high-quality scientific papers presented at the 3rd International Conference on Computer & Communication Technologies (IC3T 2016). The individual papers address cutting-edge technologies and applications of soft computing, artificial intelligence and communication. In addition, a variety of further topics are discussed, which include data mining, machine intelligence, fuzzy computing, sensor networks, signal and image processing, human-computer interaction, web intelligence, etc. As such, it offers readers a valuable and unique resource.
Nuclear engineering has undergone extensive progress over the years. In the past century, colossal developments have been made and with specific reference to the mathematical theory and computational science underlying this discipline, advances in areas such as high-order discretization methods, Krylov Methods and Iteration Acceleration have steadily grown. Nuclear Computational Science: A Century in Review addresses these topics and many more; topics which hold special ties to the first half of the century, and topics focused around the unique combination of nuclear engineering, computational
Cupps, Kimberly C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
The High Performance Computing Operations Review (HPCOR) meeting—requested by the ASC and ASCR program headquarters at DOE—was held November 5 and 6, 2013, at the Marriott Hotel in San Francisco, CA. The purpose of the review was to discuss the processes and practices for HPC integration and its related software and facilities. Experiences and lessons learned from the most recent systems deployed were covered in order to benefit the deployment of new systems.
Celik, Vehbi; Yesilyurt, Etem
There is a large body of research regarding computer supported education, perceptions of computer self-efficacy, computer anxiety and the technological attitudes of teachers and teacher candidates. However, no study has been conducted on the correlation between and effect of computer supported education, perceived computer self-efficacy, computer…
Federal Laboratory Consortium — SERVICES PROVIDED BY THE COMPUTER CORE FACILITY: Evaluation, purchase, set up, and maintenance of the computer hardware and network for the 170 users in the research...
This document summarizes the recommendations and evaluations provided by an independent external panel of experts at the 2013 U.S. Department of Energy Bioenergy Technologies Office's Peer Review meeting.
Devell, L.; Aggeryd, I.; Hultgren, Aa.; Lundell, B.; Pedersen, T.
A summary review of the development of new nuclear reactor technology is presented in this report. Fuel cycle strategies and waste handling developments are also discussed. Different plans for dismantling nuclear weapons are presented. 18 refs
Carter, Launor F.
Two aspects of educational technology are considered. The first involves the development of educational technology highly dependent on computer equipment, and the development of a computer assisted instruction language called Programmed Language for Interactive Teaching (PLANIT). The second aspect involves the development of educational technology…
Laufenberg, Lawrence; Tu, Eugene (Technical Monitor)
The CICT Program is part of the NASA Aerospace Technology Enterprise's fundamental technology thrust to develop tools, processes, and technologies that enable new aerospace system capabilities and missions. The CICT Program's four key objectives are: (1) provide seamless access to NASA resources, including ground-, air-, and space-based distributed information technology resources, so that NASA scientists and engineers can more easily control missions, make new scientific discoveries, and design next-generation space vehicles; (2) provide high-rate data delivery from these assets directly to users for missions; (3) develop goal-oriented, human-centered systems; and (4) research, develop and evaluate revolutionary technology.
... technology is a Canadian product, the "Bionic Energy Harvester", which is a knee brace that harvests energy from the motion of the knee joint.
Griffith, R.V.; Anderson, K.J.
The Annual Technology Review covers the period from October 1983 to September 1984. Topics reviewed include Nuclear Criticality Information System, nuclear dosimetry, personnel dosimetry, laser chemistry, electric filters and neutron spectrometry. Individual papers are indexed and abstracted for the data base. (DT)
The study investigated the rationales for the adoption of cloud computing technology for library services in the NOUN Library. Issues examined include the existing computer networks available in the NOUN library, such as LAN and WAN, and rationales for the adoption of cloud computing in the NOUN library, such as the need to disclose their collections ...
The CERN Computer Centre is reviewing strategies for optimizing the use of the existing infrastructure in the future, and in the likely scenario that any extension will be remote from CERN, and in the light of the way other large facilities are today being operated. Over the past six months, CERN has been investigating modern and widely-used tools and procedures used for virtualisation, clouds and fabric management in order to reduce operational effort, increase agility and support unattended remote computer centres. This presentation will give the details on the project’s motivations, current status and areas for future investigation.
This paper presents a review of different behaviours of fuel elements in relation to fission gases. The influence of the fission gas level and its location with respect to different fuel structures is discussed. The gas release process at high burnup is described and some topics that need further investigation are suggested. (author)
Bearinger, J P
This month's issue has the following articles: (1) Remembering the Laboratory's First Director - Commentary by Harold Brown; (2) Herbert F. York (1921-2009): A Life of Firsts, an Ambassador for Peace - The Laboratory's first director, who died on May 19, 2009, used his expertise in science and technology to advance arms control and prevent nuclear war; (3) Searching for Life in Extreme Environments - DNA will help researchers discover new marine species and prepare to search for life on other planets; (4) Energy Goes with the Flow - Lawrence Livermore is one of the few organizations that distills the big picture about energy resources and use into a concise diagram; and (5) The Radiant Side of Sound - An experimental method that converts sound waves into light may lead to new technologies for scientific and industrial applications.
The following two abstracts are for the 2 feature stories in this issue of "Science and Technology Review". (1) "Forewarnings of Coming Hazards"--The Atmospheric Release Advisory Capability (ARAC) at Lawrence Livermore is an emergency response organization chartered to aid Department of Energy and Department of Defense sites when radioactive or toxic material is released into the atmosphere. Developed from studies beginning in the 1960s, it became a funded operational program in the late 1970s. Using an emergency response modeling system now in its third generation, ARAC scientists predict how atmospheric releases that could affect public health and safety will disperse. The ARAC system has evolved through experience gained during regular training exercises and in over 160 alerts and emergency responses to date. The work of ARAC scientists described in the article demonstrates the different modeling challenges they encounter in preparing for and responding to a variety of atmospheric emergencies. (2) "Unraveling the Mystery of Detonation"--Laboratory experts in the detonation of high explosives are putting the computational power of the Accelerated Strategic Computing Initiative (ASCI) to the test. Their research centers on insensitive explosives, whose behavior during detonation is slower and more complex than that of sensitive explosives. The article features three research projects, which are exploring detonation from different angles: the initiation phase, the molecules produced during detonation, and further development of CHEETAH, a thermochemical detonation code. All research teams are using ASCI supercomputers, which have increased their ability to simulate the detonation process by a factor of 100,000.
This research aims to answer the question, "How has the use of computer technology benefited the compulsory education system, focusing on Design and Technology?" To answer this question, it was necessary to focus on interactive whiteboards, e-portfolios and digital projectors as the main technology formats. An initial literature…
Chiariello, Andrea Gaetano [Ass. EURATOM/ENEA/CREATE, Dipartimento di Ingegneria Industriale e dell’Informazione, Seconda Università di Napoli, Via Roma 29, Aversa (CE) (Italy); Formisano, Alessandro, E-mail: Alessandro.Formisano@unina2.it [Ass. EURATOM/ENEA/CREATE, Dipartimento di Ingegneria Industriale e dell’Informazione, Seconda Università di Napoli, Via Roma 29, Aversa (CE) (Italy); Martone, Raffaele [Ass. EURATOM/ENEA/CREATE, Dipartimento di Ingegneria Industriale e dell’Informazione, Seconda Università di Napoli, Via Roma 29, Aversa (CE) (Italy)
Highlights: ► The paper deals with high-accuracy numerical simulations of high-field magnets. ► Porting existing codes to High Performance Computing architectures yielded a relevant speedup without reducing computational accuracy. ► Some examples of applications, referred to ITER-like magnets, are reported. -- Abstract: One of the main issues in the simulation of Tokamak operation is the reliable and accurate computation of actual field maps in the plasma chamber. In this paper a tool able to accurately compute magnetic field maps produced by active coils of any 3D shape, wound with a high number of conductors, is presented. Under a linearity assumption, the coil winding is modeled by means of "sticks" following each conductor's shape, and the contribution of each stick is computed using high-speed Graphics Processing Units (GPUs). Relevant speed enhancements with respect to standard parallel computing environments are achieved in this way.
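The per-stick contribution mentioned above has a standard closed form: the magnetic field of a straight segment carrying current I from point a to point b, usually written in terms of r1 = p - a and r2 = p - b. The plain-Python sketch below uses that textbook segment formula; it is an illustrative reconstruction, not the authors' GPU code, and the helper names (stick_field, etc.) are ours.

```python
from math import sqrt, pi

MU0 = 4e-7 * pi  # vacuum permeability [T*m/A]

def sub(u, v): return (u[0] - v[0], u[1] - v[1], u[2] - v[2])
def dot(u, v): return u[0] * v[0] + u[1] * v[1] + u[2] * v[2]
def cross(u, v):
    return (u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0])

def stick_field(p, a, b, current):
    """Field at p from a straight 'stick' carrying `current` from a to b:
    B = (mu0*I/4pi) * (|r1|+|r2|) / (|r1||r2|(|r1||r2| + r1.r2)) * (r1 x r2)."""
    r1, r2 = sub(p, a), sub(p, b)
    m1, m2 = sqrt(dot(r1, r1)), sqrt(dot(r2, r2))
    k = MU0 * current / (4 * pi) * (m1 + m2) / (m1 * m2 * (m1 * m2 + dot(r1, r2)))
    c = cross(r1, r2)
    return (k * c[0], k * c[1], k * c[2])

# Sanity check: a long segment along z approximates an infinite straight
# wire, whose field magnitude at distance d is mu0*I/(2*pi*d).
d, I, L = 0.1, 1000.0, 1000.0
bx, by, bz = stick_field((d, 0.0, 0.0), (0.0, 0.0, -L), (0.0, 0.0, L), I)
```

A coil of arbitrary 3D shape is then just the vector sum of stick_field over all of its segments, which is why the computation is embarrassingly parallel and a natural fit for GPUs.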
Nihal Esam Abuzinadah; Areej Abbas Malibari; Paul Krause
Studies have shown that deaf and hearing-impaired students have many difficulties in learning applied disciplines such as Medicine, Engineering, and Computer Programming. This study aims to investigate the readiness of deaf students to pursue higher education in applied sciences, more specifically in computer science. This involves investigating their capabilities in computer skills and applications. Computer programming is an integral component in the technological field that can facilitate ...
Ruthberg, Zella G.
This is a collection of consensus reports, each produced at a session of an invitational workshop sponsored by the National Bureau of Standards. The purpose of the workshop was to explore the state-of-the-art and define appropriate subjects for future research in the audit and evaluation of computer security. Leading experts in the audit and…
Full Text Available. Linda Volonino, Reynaldo Anzaldua, and Jana Godwin (2007). Computer Forensics: Principles and Practices. Pearson/Prentice Hall. 534 pages, ISBN: 0-13-154727-5 (paper), US$85.33. Reviewed by Jigang Liu (Jigang.Liu@metrostate.edu), Department of Information and Computer Sciences, College of Arts and Sciences, Metropolitan State University, St. Paul, MN 55106. "Computer Forensics: Principles and Practices" by Linda Volonino, Reynaldo Anzaldua, and Jana Godwin, published by Pearson/Prentice Hall in 2007, is one of the newest computer forensics textbooks on the market. The goal of the book, as the authors put it, is to teach "students who want to learn about electronic evidence - including what types exist and where it may be found - and the computer forensics methods to investigate it" so that they will be prepared for "a career in information security, criminal justice, accounting, law enforcement, and federal investigations - as well as computer forensics." Linda, Reynaldo, and Jana are not only experienced college professors but also seasoned industry professionals. All of them have substantial working experience with law firms or law enforcement in dealing with both civil and criminal cases. They are all Certified Information Systems Security Professionals (CISSP). Their teaching experience at the college level and their working experience on real cases make this book a must-read for a college professor. (See PDF for full review.)
Radousky, H B
This month's issue has the following articles: (1) Shaking the Foundations of Solar-System Science--Commentary by William H. Goldstein; (2) Stardust Results Challenge Astronomical Convention--The first samples retrieved from a comet are a treasure trove of surprises to Laboratory researchers; (3) Fire in the Hole--Underground coal gasification may help to meet future energy supply challenges with a production process from the past; (4) Big Physics in Small Spaces--A newly developed computer model successfully simulates particle-laden fluids flowing through complex microfluidic systems; (5) A New Block on the Periodic Table--Livermore and Russian scientists add a new block to the periodic table with the creation of element 118; and (6) A Search for Patterns and Connections--Throughout his career, Edward Teller searched for mathematical solutions to explain the physical world.
Bearinger, J P
This month's issue has the following articles: (1) A Safer and Even More Effective TATB - Commentary by Bruce T. Goodwin; (2) Dissolving Molecules to Improve Their Performance - Computer scientists and chemists have teamed to develop a green method for recycling a valuable high explosive that is no longer manufactured; (3) Exceptional People Producing Great Science - Postdoctoral researchers lend their expertise to projects that support the Laboratory's missions; (4) Revealing the Identities and Functions of Microbes - A new imaging technique illuminates bacterial metabolic pathways and complex relationships; and (5) A Laser Look inside Planets - Laser-driven ramp compression may one day reveal the interior structure of Earth-like planets in other solar systems.
Szuch, John R.
The Internal Fluid Mechanics Division of the NASA Lewis Research Center is combining the key elements of computational fluid dynamics, aerothermodynamic experiments, and advanced computational technology to bring internal computational fluid dynamics (ICFM) to a state of practical application for aerospace propulsion system design. This paper presents an overview of efforts underway at NASA Lewis to advance and apply computational technology to ICFM. These efforts include the use of modern, software engineering principles for code development, the development of an AI-based user-interface for large codes, the establishment of a high-performance, data communications network to link ICFM researchers and facilities, and the application of parallel processing to speed up computationally intensive and/or time-critical ICFM problems. A multistage compressor flow physics program is cited as an example of efforts to use advanced computational technology to enhance a current NASA Lewis ICFM research program.
Gunjan, Vinit Kumar; Venkatesh, C; Amarnath, M
This book highlights the experimental investigations that have been carried out on magnetic resonance imaging and computed tomography (MRI & CT) images using state-of-the-art Computational Image processing techniques, and tabulates the statistical values wherever necessary. In a very simple and straightforward way, it explains how image processing methods are used to improve the quality of medical images and facilitate analysis. It offers a valuable resource for researchers, engineers, medical doctors and bioinformatics experts alike.
Quirk, W.J.; Canada, J.; de Vore, L.; Gleason, K.; Kirvel, R.D.; Kroopnick, H.; McElroy, L.; Van Dyke, P. [eds.
This monthly report of research activities at Lawrence Livermore Laboratory highlights three different research programs. First, the Forensic Science Center supports a broad range of analytical techniques that focus on detecting and analyzing chemical, biological, and nuclear species. Analyses are useful in the areas of nonproliferation, counterterrorism, and law enforcement. Second, starting in 1977, the laboratory initiated a series of studies to understand a high incidence of melanoma among employees. Continued study shows that mortality from this disease has decreased from the levels seen in the 1980s. Third, to help coordinate the laboratory's diverse research projects that can provide better healthcare tools to the public, the lab is creating the new Center for Healthcare Technologies.
Chinn, D J
This month's issue has the following articles: (1) Homeland Security Begins Abroad--Commentary by John C. Doesburg; (2) Out of Harm's Way--New physical protection and accountability systems, together with a focus on security, safeguard nuclear materials in the Russian Federation; (3) A Calculated Journey to the Center of the Earth--Determining the permeability of partially melted metals in a mineral matrix unlocks secrets about the formation of Earth's core; (4) Wireless That Works--Communication technologies using ultrawideband radar are improving national security; and (5) Power to the People--Edward Teller envisioned safe and plentiful nuclear power for peaceful applications.
Bearinger, J P
This month's issue has the following articles: (1) Countering the Growing Chem-Bio Threat -- Commentary by Penrose (Parney) C. Albright; (2) Responding to a Terrorist Attack Involving Chemical Warfare Agents -- Livermore scientists are helping the nation strengthen plans to swiftly respond to an incident involving chemical warfare agents; (3) Revealing the Secrets of a Deadly Disease -- A Livermore-developed system helps scientists better understand how plague bacteria infect healthy host cells; (4) A New Application for a Weapons Code -- Simulations reveal for the first time how blast waves cause traumatic brain injuries; (5) Testing Valuable National Assets for X-Ray Damage -- Experiments at the National Ignition Facility are measuring the effects of radiation on critical systems; and (6) An Efficient Way to Harness the Sun's Power -- New solar thermal technology is designed to supply residential electric power at nearly half of the current retail price.
Brigham, Tara J
Affective computing technologies are designed to sense and respond based on human emotions. This technology allows a computer system to process the information gathered from various sensors to assess the emotional state of an individual. The system then offers a distinct response based on what it "felt." While this is completely unlike how most people interact with electronics today, this technology is likely to trickle into future everyday life. This column will explain what affective computing is, some of its benefits, and concerns with its adoption. It will also provide an overview of its implication in the library setting and offer selected examples of how and where it is currently being used.
Unger, Herwig; Boonkrong, Sirapat; IC2IT2013
This volume contains the papers of the 9th International Conference on Computing and Information Technology (IC2IT 2013) held at King Mongkut's University of Technology North Bangkok (KMUTNB), Bangkok, Thailand, on May 9th-10th, 2013. Traditionally, the conference is organized in conjunction with the National Conference on Computing and Information Technology, one of the leading Thai national events in the area of Computer Science and Engineering. The conference as well as this volume is structured into 3 main tracks on Data Networks/Communication, Data Mining/Machine Learning, and Human Interfaces/Image processing.
Geothermal Technologies Office
Geothermal Technologies Office conducted its annual program peer review in April of 2013. The review provided an independent, expert evaluation of the technical progress and merit of GTO-funded projects. Further, the review was a forum for feedback and recommendations on future GTO strategic planning. During the course of the peer review, DOE-funded projects were evaluated for 1) their contribution to the mission and goals of the GTO and 2) their progress against stated project objectives. Principal Investigators (PIs) came together in sessions organized by topic “tracks” to disseminate information, progress, and results to a panel of independent experts as well as attendees.
Hollett, Douglas [Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States); Stillman, Greg [Energy Efficiency and Renewable Energy (EERE), Washington, DC (United States)
On June 6-10, 2011, the U.S. Department of Energy (DOE), Office of Energy Efficiency and Renewable Energy (EERE), Geothermal Technologies Program (GTP or the Program) conducted its annual program peer review in Bethesda, Maryland. In accordance with the EERE Peer Review Guide, the review provides an independent, expert evaluation of the strategic goals and direction of the program and is a forum for feedback and recommendations on future program planning. The purpose of the review was to evaluate DOE-funded projects for their contribution to the mission and goals of the Program and to assess progress made against stated objectives.
Full Text Available. With increasing consumer demand and the tightening of food and environmental regulations, traditional food-processing techniques have lost their optimum performance, which gave rise to new and powerful technologies. Ultrasound is a fast, versatile, emerging, and promising non-destructive green technology that has been used in the food industry over the last few years. Ultrasound is being applied in various areas of food technology, namely crystallization, freezing, bleaching, degassing, extraction, drying, filtration, emulsification, sterilization, cutting, etc. It is being applied as an effective preservation tool in many food-processing fields, viz. vegetables and fruits, cereal products, honey, gels, proteins, enzymes, microbial inactivation, cereal technology, water treatment, dairy technology, etc. This review summarizes the latest knowledge on the impact and application of ultrasound in food technology.
Xhafa, Fatos [Polytechnic Univ. of Catalonia, Barcelona (Spain). Dept. of Languages and Informatics Systems; Caballe, Santi; Daradoumis, Thanasis [Open Univ. of Catalonia, Barcelona (Spain). Dept. of Computer Sciences Multimedia and Telecommunications; Abraham, Ajith [Machine Intelligence Research Labs (MIR Labs), Auburn, WA (United States). Scientific Network for Innovation and Research Excellence; Juan Perez, Angel Alejandro (eds.) [Open Univ. of Catalonia, Barcelona (Spain). Dept. of Information Sciences
E-Learning has become one of the most widespread ways of distance teaching and learning. Technologies such as Web, Grid, and Mobile and Wireless networks are pushing teaching and learning communities to find new and intelligent ways of using these technologies to enhance teaching and learning activities. Indeed, these new technologies can play an important role in increasing support to teachers and learners and in shortening learning and teaching time; yet, it is necessary to use intelligent techniques to take advantage of these new technologies to achieve the desired support to teachers and learners and to enhance learners' performance in distributed learning environments. The chapters of this volume present advances in using intelligent techniques for technology-enhanced learning, as well as the development of e-Learning applications based on such techniques and supported by technology. Such intelligent techniques include clustering and classification for personalization of learning, intelligent context-aware techniques, adaptive learning, data mining techniques and ontologies in e-Learning systems, among others. Academics, scientists, software developers, teachers and tutors, and students interested in e-Learning will find this book useful for their academic, research and practice activity. (orig.)
Applied Computing and Information Technology
This book presents the selected results of the 1st International Symposium on Applied Computers and Information Technology (ACIT 2013) held on August 31 – September 4, 2013 in Matsue City, Japan, which brought together researchers, scientists, engineers, industry practitioners, and students to discuss all aspects of Applied Computers & Information Technology, and its practical challenges. This book includes the best 12 papers presented at the conference, which were chosen based on review scores submitted by members of the program committee and underwent further rigorous rounds of review.
John, Lisha J
Laboratory-based practical classes have been the cornerstone of undergraduate pharmacology learning. Ethical issues with the use of animals and the rapid development of information technology have led to newer trends in teaching and learning, such as computer-assisted learning. Computer-assisted learning (CAL) software includes computer-based packages focusing on interactive instruction in a specific subject area and collections of animal experiments that encourage students to understand concepts in pharmacology. CAL offers a number of advantages to both students and teachers, the most important being meeting the learning objectives. A few disadvantages and pitfalls to implementation in medical schools are also associated with CAL sessions. This article reviews the trend of CAL in pharmacology and its advantages, disadvantages, and pitfalls to implementation.
Alena, Richard; Liu, Yuan-Kwei; Goforth, Andre; Fernquist, Alan R.
Attention is given to current activities in the Embedded Data Processor and Portable Computer Technology testbed configurations that are part of the Advanced Data Systems Architectures Testbed at the Information Sciences Division at NASA Ames Research Center. The Embedded Data Processor Testbed evaluates advanced microprocessors for potential use in mission and payload applications within the Space Station Freedom Program. The Portable Computer Technology (PCT) Testbed integrates and demonstrates advanced portable computing devices and data system architectures. The PCT Testbed uses both commercial and custom-developed devices to demonstrate the feasibility of functional expansion and networking for portable computers in flight missions.
Ciftcioglu, O.; Durmisevic, S.; Sariyildiz, S.
Over the last decade, civil engineering has shown a rapidly growing interest in the application of neurally inspired computing techniques. The motive for this interest was the promise of certain information processing characteristics, which are similar, to some extent, to those of the human brain. The
Under a Space Act Agreement, NASA partnered with Seattle-based Amazon Web Services to make the agency's climate and Earth science satellite data publicly available on the company's servers. Users can access the data for free, but they can also pay to use Amazon's computing services to analyze and visualize information using the same software available to NASA researchers.
Contents: Disciplinary Foundations and Global Impact: Evolving Discipline of Information Systems (Heikki Topi); Discipline of Information Technology (Barry M. Lunt and Han Reichgelt); Information Systems as a Practical Discipline (Juhani Iivari); Information Technology (Han Reichgelt, Joseph J. Ekstrom, Art Gowan, and Barry M. Lunt); Sociotechnical Approaches to the Study of Information Systems (Steve Sawyer and Mohammad Hossein Jarrahi); IT and Global Development (Erkki Sutinen); Using ICT for Development, Societal Transformation, and Beyond (Sherif Kamel). Technical Foundations of Data and Database Management: Data Models (Avi Silber
Smith, Rachel Charlotte
Emerging technologies are providing a new field for design anthropological inquiry that unites experiences, imaginaries and materialities in complex ways and demands new approaches to developing sustainable computational futures....
Plotnikov, A.V.; Prilutskij, D.A.; Selishchev, S.V.
The paper outlines one of the promising standards for transmitting images in medicine, in radiology in particular. The essence of the DICOM standard is disclosed, along with the promise of its introduction into computer-aided medical technologies.
Global networks, which are the primary pillars of the modern manufacturing industry and supply chains, can only cope with the new challenges, requirements and demands when supported by new computing and Internet-based technologies. Cloud Manufacturing: Distributed Computing Technologies for Global and Sustainable Manufacturing introduces a new paradigm for scalable, service-oriented, sustainable and globally distributed manufacturing systems. The eleven chapters in this book provide an updated overview of the latest technological developments and applications in relevant research areas. Following an introduction to the essential features of Cloud Computing, chapters cover a range of methods and applications, such as the factors that actually affect adoption of Cloud Computing technology in manufacturing companies and a new geometrical simplification method to stream 3-dimensional design and manufacturing data via the Internet. This is further supported by case studies and real-life data for Waste Electrical ...
Mcmahon, E. M.
This paper describes a security guard device that is currently being developed by Computer Sciences Corporation (CSC). The methods used to provide assurance that the system meets its security requirements include the system architecture, a system security evaluation, and the application of formal and informal verification techniques. The combination of state-of-the-art technology and the incorporation of new verification procedures results in a demonstration of the feasibility of computer security technology for operational applications.
Kristopher M. Day, MD
Conclusion: Modern 3D technology allows the surgeon to better analyze complex craniofacial deformities, precisely plan surgical correction with computer simulation of results, customize osteotomies, plan distractions, and print 3DPCI, as needed. Advanced 3D computer technology can be applied safely and may improve aesthetic and functional outcomes after complex craniofacial reconstruction. These techniques warrant further study and may be reproducible in various centers of care.
Unger, Herwig; Meesad, Phayung
Computer and Information Technology (CIT) are now involved in governmental, industrial, and business domains more than ever. Thus, it is important for CIT personnel to continue academic research to improve technology and its adoption in modern applications. Up-to-date research and technologies must be distributed to researchers and the CIT community continuously to aid future development. The 10th International Conference on Computing and Information Technology (IC2IT 2014), organized by King Mongkut's University of Technology North Bangkok (KMUTNB) and partners, provides an exchange of the state of the art and future developments in the two key areas of this process: Computer Networking and Data Mining. Against the background of the foundation of ASEAN, it becomes clear that efficient languages, business principles and communication methods need to be adapted, unified and especially optimized to gain a maximum benefit for the users and customers of future IT systems.
Today, computer technology is within the reach of practically any industrial corporation regardless of product size. This manual highlights a few of the many applications of computers in the process industry and provides the technical reader with a basic understanding of computer technology, terminology, and the interactions among the various elements of a process computer system. The manual has been organized to separate process applications and economics from computer technology. Chapter 1 introduces the present status of process computer technology and describes the four major applications - monitoring, analysis, control, and optimization. The basic components of a process computer system also are defined. Energy-saving applications in the four major categories defined in Chapter 1 are discussed in Chapter 2. The economics of process computer systems is the topic of Chapter 3, where the historical trend of process computer system costs is presented. Evaluating a process for the possible implementation of a computer system requires a basic understanding of computer technology as well as familiarity with the potential applications; Chapter 4 provides enough technical information for an evaluation. Computer and associated peripheral costs and the logical sequence of steps in the development of a microprocessor-based process control system are covered in Chapter 5.
Vogt, R. L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Meissner, C. N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kotta, P. R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
At Lawrence Livermore National Laboratory, we focus on science and technology research to ensure our nation’s security. We also apply that expertise to solve other important national problems in energy, bioscience, and the environment. Science & Technology Review is published eight times a year to communicate, to a broad audience, the Laboratory’s scientific and technological accomplishments in fulfilling its primary missions. The publication’s goal is to help readers understand these accomplishments and appreciate their value to the individual citizen, the nation, and the world.
Orme, C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Meissner, C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kotta, P. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
At Lawrence Livermore National Laboratory, we focus on science and technology research to ensure our nation’s security. We also apply that expertise to solve other important national problems in energy, bioscience, and the environment. Science & Technology Review is published eight times a year to communicate, to a broad audience, the Laboratory’s scientific and technological accomplishments in fulfilling its primary missions. The publication’s goal is to help readers understand these accomplishments and appreciate their value to the individual citizen, the nation, and the world.
Orme, C. A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Meissner, C. N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kotta, P. R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
Kristensen, Jannie Friis
In my thesis work I am investigating how the design of pervasive/ubiquitous computing technology relates to the flexible and individual work practice of nomadic workers. Through empirical studies and with an experimental systems development approach, the work is focused on: a) supporting interpretation and inclusion of implicit and invisible as well as explicit and visible characteristics of artifacts, users and use practices; b) identifying breakdowns in human-computer interaction situations, with particular emphasis on the computation that happens "behind the scenes" in the pervasive computing environment, and how that computational process is made intelligible, visible, accountable and negotiable to the human participant at a sufficient level.
Victor Wiley
Computer vision has been studied from many perspectives. It expands from raw data recording into techniques and ideas combining digital image processing, pattern recognition, machine learning and computer graphics. Its wide usage has attracted many scholars to integrate it with many disciplines and fields. This paper provides a survey of the recent technologies and theoretical concepts explaining the development of computer vision, especially related to image processing, across different areas of application. Computer vision helps scholars to analyze images and video to obtain necessary information, understand events or descriptions, and recognize scenic patterns. It uses methods spanning multiple application domains with massive data analysis. This paper contributes a review of recent developments in computer vision, image processing, and related studies. We categorize the computer vision mainstream into groups such as image processing, object recognition, and machine learning, and also provide a brief explanation of up-to-date information about the techniques and their performance.
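A minimal sketch of one basic image-processing operation within the survey's scope: convolving a grayscale image with a Sobel kernel to detect vertical edges. The toy image and pure-Python implementation are illustrative, not drawn from the paper.

```python
# Valid-mode 2-D convolution (implemented as correlation, as most
# computer-vision libraries do) with a Sobel kernel for edge detection.
# Pure Python on a toy grayscale "image" (list of lists), no libraries.

SOBEL_X = [[-1, 0, 1],
           [-2, 0, 2],
           [-1, 0, 1]]

def convolve2d(img, kernel):
    """Slide the kernel over the image and sum elementwise products."""
    kh, kw = len(kernel), len(kernel[0])
    h, w = len(img), len(img[0])
    out = []
    for i in range(h - kh + 1):
        row = []
        for j in range(w - kw + 1):
            acc = 0
            for di in range(kh):
                for dj in range(kw):
                    acc += kernel[di][dj] * img[i + di][j + dj]
            row.append(acc)
        out.append(row)
    return out

# A vertical edge: dark left half, bright right half.
image = [[0, 0, 255, 255]] * 4
edges = convolve2d(image, SOBEL_X)
print(edges)  # strong positive response along the dark-to-bright boundary
```

Object recognition and machine-learning pipelines typically build on exactly this kind of low-level filtering before extracting higher-level features.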
Burwell, Sasha; Sample, Matthew; Racine, Eric
Brain-Computer Interface (BCI) is a set of technologies that are of increasing interest to researchers. BCI has been proposed as assistive technology for individuals who are non-communicative or paralyzed, such as those with amyotrophic lateral sclerosis or spinal cord injury. The technology has also been suggested for enhancement and entertainment uses, and there are companies currently marketing BCI devices for those purposes (e.g., gaming) as well as health-related purposes (e.g., communication). The unprecedented direct connection created by BCI between human brains and computer hardware raises various ethical, social, and legal challenges that merit further examination and discussion. To identify and characterize the key issues associated with BCI use, we performed a scoping review of biomedical ethics literature, analyzing the ethics concerns cited across multiple disciplines, including philosophy and medicine. Based on this investigation, we report that BCI research and its potential translation to therapeutic intervention generate significant ethical, legal, and social concerns, notably with regards to personhood, stigma, autonomy, privacy, research ethics, safety, responsibility, and justice. Our review of the literature determined, furthermore, that while these issues have been enumerated extensively, few concrete recommendations have been expressed. We conclude that future research should focus on remedying a lack of practical solutions to the ethical challenges of BCI, alongside the collection of empirical data on the perspectives of the public, BCI users, and BCI researchers.
Héctor Alejandro Galvis
This theoretical review addresses the construct of beliefs in education and English as a foreign language, and their impact when integrating technology. A thorough definition and categorization of teachers’ beliefs is provided. In addition, studies conducted in various educational settings examining the effects of teachers’ beliefs and the use of technology are reviewed. Additional information on models attempting to explain human behavior and the use of computers is presented as well, in order to discuss these research results in light of local efforts to close the gap in integrating technology through the Computadores para Educar Program in Colombian public schools.
Ko, Ping-Ru T; Kientz, Julie A; Choe, Eun Kyoung; Kay, Matthew; Landis, Carol A; Watson, Nathaniel F
To review sleep related consumer technologies, including mobile electronic device "apps," wearable devices, and other technologies. Validation and methodological transparency, the effect on clinical sleep medicine, and various social, legal, and ethical issues are discussed. We reviewed publications from the digital libraries of the Association for Computing Machinery, Institute of Electrical and Electronics Engineers, and PubMed; publications from consumer technology websites; and mobile device app marketplaces. Search terms included "sleep technology," "sleep app," and "sleep monitoring." Consumer sleep technologies are categorized by delivery platform including mobile device apps (integrated with a mobile operating system and utilizing mobile device functions such as the camera or microphone), wearable devices (on the body or attached to clothing), embedded devices (integrated into furniture or other fixtures in the native sleep environment), accessory appliances, and conventional desktop/website resources. Their primary goals include facilitation of sleep induction or wakening, self-guided sleep assessment, entertainment, social connection, information sharing, and sleep education. Consumer sleep technologies are changing the landscape of sleep health and clinical sleep medicine. These technologies have the potential to both improve and impair collective and individual sleep health depending on method of implementation. © 2015 American Academy of Sleep Medicine.
Chamberlain, Alan; Bødker, Mads; Hazzard, Adrian
Audio-based mobile technology is opening up a range of new interactive possibilities. This paper brings some of those possibilities to light by offering a range of perspectives based in this area. It is not only the technical systems that are developing; novel approaches to the design and understanding of audio-based mobile systems are also evolving, offering new perspectives on interaction and design and supporting the application of such systems in areas such as the humanities.
Adoption of new technology involves complicating components in both the selection process and the decision-making criteria. Although new technology such as cloud computing provides great benefits, especially to developing countries, it has challenges that may complicate the selection decision and subsequent adoption process. This study…
Lee, Jieun; Husman, Jenefer; Scott, Kimberly A.; Eggum-Wilkens, Natalie D.
The COMPUGIRLS: Culturally relevant technology program for adolescent girls was developed to promote underrepresented girls' future possible selves and career pathways in computer-related technology fields. We hypothesized that COMPUGIRLS would promote academic possible selves and self-regulation to achieve these possible selves. We compared…
Blum, Debra E.
Since 1985, the University of Missouri at Columbia's School of Journalism has been developing a high-technology environment for student work, including word processing, electronic imaging, networked personal computers, and telecommunications. Some faculty worry that the emphasis on technology may overshadow the concepts, principles, and substance…
This paper reports on an action research study that investigated factors influencing TESOL (teaching English to speakers of other languages) teacher candidates' (TCs) selection and use of technology in the English as a second language (ESL) classroom and the influence of explicit training in context in the use of computer technology for second…
Book review. Information and Communication Technologies for Development in Africa: Volume 2. The Experience with Community Telecentres, by Florence Etta and Sheila Parvyn-Wamahiu (2003). Reviewed by Kibet A. Ngetich. Africa Development Vol. XXX (1&2) 2005: 254-256.
The technology applied for the design and construction of containment vent filters is compiled and reviewed. The national positions leading to the selection of venting or the method of filtration are extracted from position papers. Several areas of further information needs are identified.
On May 7-10, 2012, the U.S. Department of Energy, Office of Energy Efficiency and Renewable Energy, Geothermal Technologies Office conducted its annual program peer review in Westminster, CO. In accordance with the EERE Peer Review Guide, the review provides an independent, expert evaluation of the strategic goals and direction of the office and is a forum for feedback and recommendations on future office planning. The purpose of the review was to evaluate DOE-funded projects for their contribution to the mission and goals of the office and to assess progress made against stated objectives. Project scoring results, expert reviewer comments, and key findings and recommendations are included in this report.
Garrett, L. B.
The interactive Design and Evaluation of Advanced Spacecraft (IDEAS) system is described, together with planned capability increases in the IDEAS system. The system's disciplines consist of interactive graphics and interactive computing. A single user at an interactive terminal can create, design, analyze, and conduct parametric studies of earth-orbiting satellites, which represents a timely and cost-effective method during the conceptual design phase where various missions and spacecraft options require evaluation. Spacecraft concepts evaluated include microwave radiometer satellites, communication satellite systems, solar-powered lasers, power platforms, and orbiting space stations.
Gnoffo, Peter A.
Several current challenges in computational fluid dynamics and aerothermodynamics for hypersonic vehicle applications are discussed. Example simulations are presented from code validation and code benchmarking efforts to illustrate capabilities and limitations. Opportunities to advance the state of the art in algorithms, grid generation and adaptation, and code validation are identified. Highlights of diverse efforts to address these challenges are then discussed. One such effort, to re-engineer and synthesize the existing analysis capability in LAURA, VULCAN, and FUN3D, will provide context for these discussions. The critical (and evolving) role of agile software engineering practice in the capability enhancement process is also noted.
Paluzzi, Peter; Miller, Rosalind; Kurihara, West; Eskey, Megan
Over the past several months, major industry vendors have made a business case for the network computer as a win-win solution toward lowering total cost of ownership. This report provides results from Phase I of the Ames Research Center network computer evaluation project. It identifies factors to be considered for determining cost of ownership; further, it examines where, when, and how network computer technology might fit in NASA's desktop computing architecture.
Sweeney, W. T.
Technology transfer is becoming more popular and is proving to be an economical way for companies of all sizes to take advantage of the tremendous amount of new technology available from sources all over the world. The introduction of computers and terminals into the international technology transfer process is proving to be a successful way for companies to take part in this beneficial approach to new business opportunities.
Tikhonchev, M.Yu.; Shimansky, G.A.; Lebedeva, E.E.; Lichadeev, V. V.; Ryazanov, D.K.; Tellin, A.I.
In the report the role and purposes of computer simulation in the development of nuclear technologies are discussed. The authors consider such applications of computer simulation as nuclear safety research, optimization of technical and economic parameters of operating nuclear plants, planning and support of reactor experiments, research and design of new devices and technologies, and design and development of 'simulators' for operating personnel training. Among the marked applications, the following aspects of computer simulation are discussed in the report: neutron-physical, thermal and hydrodynamic models, simulation of isotope composition change and damage dose accumulation for materials under irradiation, and simulation of reactor control structures. (authors)
Chinese-English Automation and Computer Technology Dictionary, Vol. 2. Air Intelligence Group (7602nd), APO San Francisco 96263. Approved for public release; distribution unlimited.
The Department of Energy (DOE), the National Aeronautics and Space Administration (NASA) and the Jet Propulsion Laboratory (JPL) established a DOE lead management team and an Advanced Conversion Technology Review Panel. The panel was tasked with providing the management team with an assessment and ranking of the three advanced conversion technologies. The three advanced conversion technologies were alkali metal thermal to electric converter (AMTEC), Stirling engine converter (SEC), and thermophotovoltaic (TPV). To rate and rank these three technologies, five criteria were developed: (1) Performance, (2) Development and Cost/Production and Cost/Schedule Risk, (3) Spacecraft Interface and Operations, (4) Ability to Scale Conversion, and (5) Safety. Discussed are the relative importance of each of these criteria and the rankings of the three advanced conversion technologies. It was the conclusion of the panel that the technology decision should be based on the risk that DOE and NASA are willing to accept. SEC is the most mature technology and would provide the lowest risk option. However, if more risk is acceptable, AMTEC not only provides benefits in the spacecraft interface but is also predicted to outperform the SEC. It was proposed that if AMTEC were selected, funding should be provided at a reasonable level to support back-up technology to be developed in a parallel fashion until AMTEC has proven its capability. The panel report and conclusion were provided to DOE in February 1997
This book explores near-threshold computing (NTC), a design space using techniques to run digital chips (processors) near the lowest possible voltage. Readers will be equipped with specific techniques to design chips that are extremely robust: tolerating variability and resilient against errors. Variability-aware voltage and frequency allocation schemes are presented that provide performance guarantees when moving toward near-threshold manycore chips. The book: provides an introduction to near-threshold computing, equipping the reader with a variety of tools to face the challenges of the power/utilization wall; demonstrates how to design efficient voltage regulation, so that each region of the chip can operate at the most efficient voltage and frequency point; and investigates how performance guarantees can be ensured when moving towards NTC manycores through variability-aware voltage and frequency allocation schemes.
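The trade-off that motivates NTC can be sketched with a first-order energy model: dynamic energy per operation falls as V², but the clock slows near the threshold voltage, so leakage energy per operation grows, and total energy has a minimum slightly above threshold. All constants below are illustrative assumptions, not calibrated to any real process or taken from the book.

```python
# First-order model of energy per operation versus supply voltage,
# showing the near-threshold minimum-energy point. Constants are
# illustrative; leakage is exaggerated so the trade-off is visible.
C = 1.0e-9     # switched capacitance per operation (F), illustrative
I_LEAK = 0.01  # leakage current (A), exaggerated for illustration
VTH = 0.3      # threshold voltage (V)

def freq(v):
    """Alpha-power delay model: f ~ (V - Vth)^alpha / V, alpha = 1.3."""
    return 1e9 * (v - VTH) ** 1.3 / v

def energy_per_op(v):
    e_dyn = C * v * v              # dynamic switching energy: C * V^2
    e_leak = I_LEAK * v / freq(v)  # leakage power integrated over a cycle
    return e_dyn + e_leak

volts = [0.35 + 0.01 * i for i in range(86)]  # sweep 0.35 V .. 1.20 V
best = min(volts, key=energy_per_op)
print(f"minimum-energy voltage is about {best:.2f} V, near VTH = {VTH} V")
```

Variability-aware allocation schemes of the kind the book describes must then place each core's operating point near this minimum while still meeting its frequency guarantee.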
Lamos, Joseph P.
This review of the evolution of programmed instruction from Pressey and Skinner to the present suggests that current computer technology will be able to free the learner from the limitations of time and place as Pressey originally proposed. It is noted that Skinner provided the necessary foundation for treating the learning process on an…
Dixit, R. K.
Articles in this issue of "Global Journal of Computer Science and Technology" include: (1) Input Data Processing Techniques in Intrusion Detection Systems--Short Review (Suhair H. Amer and John A. Hamilton, Jr.); (2) Semantic Annotation of Stock Photography for CBIR Using MPEG-7 standards (R. Balasubramani and V. Kannan); (3) An Experimental Study…
Mobile computing is a computing system in which a computer and all necessary accessories, like files and software, are taken out to the field. It is a system of computing through which one is able to use a computing device even while mobile and therefore changing location. Portability is one of the important aspects of mobile computing. Mobile phones are being used to gather scientific data from remote and isolated places that would not be accessible by other means. Scientists are beginning to use mobile devices and web-based applications to systematically explore interesting scientific aspects of their surroundings, ranging from climate change and environmental pollution to earthquake monitoring. This mobile revolution enables new ideas and innovations to spread out more quickly and efficiently. Here we discuss in brief the mobile computing technology, its sensing, challenges and applications. (author)
Vogt, Ramona L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Meissner, Caryn N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Chinn, Ken B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
At Lawrence Livermore National Laboratory, we focus on science and technology research to ensure our nation’s security. We also apply that expertise to solve other important national problems in energy, bioscience, and the environment. Science & Technology Review is published eight times a year to communicate, to a broad audience, the Laboratory’s scientific and technological accomplishments in fulfilling its primary missions. The publication’s goal is to help readers understand these accomplishments and appreciate their value to the individual citizen, the nation, and the world. In this issue for the months of July and August 2016, there are two features: one on Science and Technology in Support of Nuclear Nonproliferation, and another on Seeking Out Hidden Radioactive Materials. There are also highlights of three research projects--on optics, plasma science, and the nature of neutrinos--along with a news section and patents and awards.
Sefcik, J A [ed.
This issue of Energy and Technology Review focuses on cooperative research and development agreements (CRADAs)-one of the Laboratory's most effective means of technology transfer. The first article chronicles the legislative evolution of these agreements. The second article examines the potential beneficial effects of CRADAs on the national economy and discusses their role in the development and marketing of Laboratory technologies. The third article provides information on how to initiate and develop CRADAs at LLNL, and the fourth and fifth articles describe the Laboratory's two most prominent technology transfer projects. One is a 30-month CRADA with General Motors to develop advanced lasers for cutting, welding, and heat-treating operations. The cover photograph shows this laser cutting through a piece of steel 1/16 of an inch thick. The other project is a three-year CRADA with Amoco, Chevron-Conoco, and Unocal to refine our oil shale retorting process.
Nikolic, R J
This month's issue has the following articles: (1) High-Performance Computing for Energy Innovation - Commentary by Tomas Diaz de la Rubia; (2) Simulating the Next Generation of Energy Technologies - Projects using high-performance computing demonstrate Livermore's computational horsepower and improve the quality of energy solutions and the speed of deployment; (3) ARC Comes into Focus - The Advanced Radiographic Capability, a petawatt-class laser, can penetrate dense objects to reveal material dynamics during National Ignition Facility experiments; (4) A New Method to Track Viral Evolution - A sensitive technique developed at the Laboratory can identify virus mutations that may jump from host to host; and (5) Data for Defense: New Software Finds It Fast - Department of Defense warfighters and planners are using Livermore software systems to extract pertinent information from massive amounts of data.
Carroll, M.; McCracken, J.; Shope, T.
Nearly all industrial operations generate unwanted dust, particulate matter, and/or liquid wastes. Waste dust and particulates can be readily tracked to other work locations, and airborne particulates can be spread through ventilation systems to all locations within a building, and even vented outside the building - a serious concern for processes involving hazardous, radioactive, or nuclear materials. Several varieties of vacuum systems have been proposed and/or are commercially available for clean up of both solid and liquid hazardous and nuclear materials. A review of current technologies highlights both the advantages and disadvantages of the various systems, and demonstrates the need for a system designed to address issues specific to hazardous and nuclear material cleanup. A review of previous and current hazardous/nuclear material cleanup technologies is presented. From simple conventional vacuums modified for use in industrial operations, to systems specifically engineered for such purposes, the advantages and disadvantages are examined in light of the following criteria: minimal worker exposure; minimal secondary waste generation; reduced equipment maintenance and consumable parts; simplicity of design, yet fully compatible with all waste types; and ease of use. The work effort reviews past, existing and proposed technologies in light of such considerations. Accomplishments of selected systems are presented, including identified areas where technological improvements could be suggested
Anisenkov, Alexey; The ATLAS collaboration; Alandes Pradillo, Maria
AGIS is the information system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing (ADC) applications and services. In this note, we describe the evolution and recent developments of AGIS functionalities related to the integration of new technologies that have recently become widely used in ATLAS Computing: flexible utilization of opportunistic cloud and HPC resources, integration of ObjectStore services for the Distributed Data Management (Rucio) and ATLAS workload management (PanDA) systems, unified storage protocol declarations required for PanDA Pilot site movers, and others.
The Center for Computational Science and E-systems of the Japan Atomic Energy Agency (CCSE/JAEA) has carried out R and D of grid computing technology. Since 1995, R and D to realize computational assistance for researchers, called Seamless Thinking Aid (STA), and then to share intellectual resources, called Information Technology Based Laboratory (ITBL), have been conducted, leading to the construction of an intelligent infrastructure for atomic energy research called Atomic Energy Grid InfraStructure (AEGIS) under the Japanese national project 'Development and Applications of Advanced High-Performance Supercomputer'. It aims to enable synchronization of three themes: 1) Computer-Aided Research and Development (CARD) to realize an environment for STA, 2) Computer-Aided Engineering (CAEN) to establish Multi Experimental Tools (MEXT), and 3) Computer-Aided Science (CASC) to promote Atomic Energy Research and Investigation (AERI). This article reviews the achievements in R and D of grid computing technology obtained so far. (T. Tanaka)
Sanathanan, L.P.; Reilly, C.A.; Marshall, S.A.; Wilzbach, K.E.
This document contains annotated synopses of available information pertinent to health impacts of synthetic fuel technologies under development, and identifies needs for further information. The report focuses on carcinogenesis, which appears to be a special problem with coal conversion technologies. This review is intended to serve as a reference for the NEPA Affairs Division of DOE in its evaluation of the overall synthetic fuel program and specific projects in the program. Updated versions of this document are expected to be prepared annually or semiannually as new information becomes available.
Highway management systems are used to improve safety and driving comfort on highways by applying control strategies and providing information and warnings to drivers. These strategies range from speed and lane management, through incident detection and warning systems, ramp metering and weather information, to, for example, informing drivers about alternative roads. This paper provides a review of the existing approaches to highway management systems, particularly speed harmonization and ramp metering. It focuses on modern and advanced approaches, such as soft computing, multi-agent methods and their interconnection. Its objective is to provide guidance in the wide field of highway management and to point out the most relevant recent activities, which demonstrate that development in the field of highway management is still important and that the existing research exhibits potential for further enhancement.
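Among the ramp-metering strategies such reviews survey, the classic local feedback law ALINEA is simple enough to sketch. The gain, target occupancy and rate bounds below are illustrative assumptions, not values from the paper:

```python
# ALINEA adjusts the metered on-ramp flow so that mainline occupancy
# tracks a target value: r(k) = r(k-1) + K_R * (o_target - o(k)).
# K_R, o_target and the rate bounds are illustrative, not from the paper.

def alinea_step(r_prev, occupancy, o_target=0.18, K_R=70.0,
                r_min=200.0, r_max=1800.0):
    """Return the next metering rate in veh/h, clamped to [r_min, r_max]."""
    r = r_prev + K_R * (o_target - occupancy)
    return max(r_min, min(r_max, r))

# Rising mainline occupancy drives the admitted ramp flow down.
rate = 1200.0
for o in (0.15, 0.20, 0.25):
    rate = alinea_step(rate, o)
```

Because the law is purely local (one detector station, one ramp), it is often the baseline that the soft-computing and multi-agent approaches discussed in the paper are compared against.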
Joseph, Anito; Mehrotra, Anuj; Trick, Michael
Computer Science and Operations Research continue to have a synergistic relationship and this book represents the results of cross-fertilization between OR/MS and CS/AI. It is this interface of OR/CS that makes possible advances that could not have been achieved in isolation. Taken collectively, these articles are indicative of the state-of-the-art in the interface between OR/MS and CS/AI and of the high caliber of research being conducted by members of the INFORMS Computing Society. EXTENDING THE HORIZONS: Advances in Computing, Optimization, and Decision Technologies is a volume that presents the latest, leading research in the design and analysis of algorithms, computational optimization, heuristic search and learning, modeling languages, parallel and distributed computing, simulation, computational logic and visualization. This volume also emphasizes a variety of novel applications in the interface of CS, AI, and OR/MS.
In our digitally connected world, the law is arguably behind the technological developments of the Internet age. While this causes many issues for law enforcement, it is of particular concern in the area of child pornography in the United States. With the wide availability of technologies such as digital cameras, peer-to-peer file sharing, strong encryption, Internet anonymizers and cloud computing, the creation and distribution of child pornography has become more widespread. Simultaneously, fighting the growth of this crime has become more difficult. This paper explores the development of both the legal and technological environments surrounding digital child pornography. In doing so, we cover the complications that court decisions have given law enforcement who are trying to investigate and prosecute child pornographers. We then provide a review of the technologies used in this crime and the forensic challenges that cloud computing creates for law enforcement. We note that both legal and technological developments since the 1990s seem to be working to the advantage of users and sellers of child pornography. Before concluding, we provide a discussion and offer observations regarding this subject.
Failor, B.; Stull, S.; Wheatcraft, D. [eds.]
This review is published ten times a year to communicate, to a broad audience, Lawrence Livermore National Laboratory's scientific and technological accomplishments, particularly in the Laboratory's core mission areas - global security, energy and the environment, and bioscience and biotechnology. Topics discussed in this August 1996 issue are: Keeping the nuclear stockpile safe, secure, and reliable; Molten salt takes the bang out of high explosives; Security clearances meet the electronic age; and Exploring oil fields with crosshole electromagnetic induction.
Rouse, William B.; Morris, Nancy M.
Technology-driven efforts to implement computer technology often encounter problems due to lack of acceptance, or begrudging acceptance, by the personnel involved. It is argued that individuals' acceptance of automation, in terms of either computerization or computer aiding, is heavily influenced by their perceptions of the impact of the automation on their discretion in performing their jobs. It is suggested that desired levels of discretion reflect needs to feel in control and achieve self-satisfaction in task performance, as well as perceptions of inadequacies of computer technology. Discussion of these factors leads to a structured set of considerations for performing front-end analysis, deciding what to automate, and implementing the resulting changes.
Barr, Neil G; Randall, Glen E; Archer, Norman P; Musson, David M
The use of Internet-enabled technology (information and communication technology such as smartphone applications) may enrich information exchange among providers and, consequently, improve health care delivery. The purpose of this systematic review was to gain a greater understanding of the role that Internet-enabled technology plays in enhancing communication among physicians. Studies were identified through a search in three electronic platforms: the Association for Computing Machinery Digital Library, ProQuest, and Web of Science. The search identified 5140 articles; of these, 21 met all inclusion criteria. In general, physicians were satisfied with Internet-enabled technology, but consensus was lacking regarding whether Internet-enabled technology improved efficiency or made a difference to clinical decision-making. Internet-enabled technology can play an important role in enhancing communication among physicians, but the extent of that benefit is influenced by (1) the impact of Internet-enabled technology on existing work practices, (2) the availability of adequate resources, and (3) the nature of institutional elements, such as privacy legislation.
Strenge, D.L.; Watson, E.C.; Droppo, J.G.
The development of technological bases for siting nuclear fuel cycle facilities requires calculational models and computer codes for the evaluation of risks and the assessment of environmental impact of radioactive effluents. A literature search and review of available computer programs revealed that no one program was capable of performing all of the great variety of calculations (i.e., external dose, internal dose, population dose, chronic release, accidental release, etc.). Available literature on existing computer programs has been reviewed and a description of each program reviewed is given.
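As a concrete illustration of the simplest calculation such codes chain together, a chronic-release inhalation dose follows from the release rate, an atmospheric dilution factor (chi/Q), breathing rate, exposure time and a dose conversion factor. Every parameter value below is an illustrative placeholder, not taken from the programs reviewed:

```python
# Sketch of a chronic-release inhalation dose estimate:
#   dose [Sv] = Q [Bq/s] * (chi/Q) [s/m^3] * BR [m^3/s] * T [s] * DCF [Sv/Bq]
# All defaults are illustrative assumptions, not from the reviewed codes.

def chronic_inhalation_dose(release_bq_per_s, chi_over_q_s_per_m3,
                            breathing_m3_per_s=2.66e-4,
                            exposure_s=3.15e7,      # roughly one year
                            dcf_sv_per_bq=1.0e-8):
    air_conc = release_bq_per_s * chi_over_q_s_per_m3     # Bq/m^3 at receptor
    intake = air_conc * breathing_m3_per_s * exposure_s   # Bq inhaled
    return intake * dcf_sv_per_bq                         # committed dose, Sv

dose = chronic_inhalation_dose(1.0e6, 1.0e-6)  # 1 Bq/m^3 at the receptor
```

A full code of the kind surveyed would wrap many such pathway terms (external, ingestion, population-weighted) around site-specific dispersion modelling; this sketch shows only the skeleton of one pathway.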
Möller, M.; Vuik, C.
Quantum computing technologies have become a hot topic in academia and industry, receiving much attention and financial support from all sides. Building a quantum computer that can be used practically is in itself an outstanding challenge that has become the ‘new race to the moon’. Next to…
Castillo, Michael; McGuire, Kenyon; Sorgi, Alan
This project report focuses on: (1) Design and development of two Advanced Portable Workstation 2 (APW 2) units. These units incorporate advanced technology features such as a low power Pentium processor, a high resolution color display, National Television Standards Committee (NTSC) video handling capabilities, a Personal Computer Memory Card International Association (PCMCIA) interface, and Small Computer System Interface (SCSI) and ethernet interfaces. (2) Use of these units to integrate and demonstrate advanced wireless network and portable video capabilities. (3) Qualification of the APW 2 systems for use in specific experiments aboard the Mir Space Station. A major objective of the PCT Phase 2 program was to help guide future choices in computing platforms and techniques for meeting National Aeronautics and Space Administration (NASA) mission objectives, with a focus on the development of optimal configurations of computing hardware, software applications, and network technologies for use on NASA missions.
The 2012 DOE Hydrogen Program and Vehicle Technologies Program Annual Merit Review and Peer Evaluation Meeting was held May 14-18, 2012 in Crystal City, Virginia. The review encompassed all of the work done by the Hydrogen Program and the Vehicle Technologies Program: a total of 309 individual activities were reviewed for Vehicle Technologies, by a total of 189 reviewers. A total of 1,473 individual review responses were received for the technical reviews.
Luk, Wayne; Pocek, Ken
Field-Programmable Custom Computing Technology: Architectures, Tools, and Applications brings together in one place important contributions and up-to-date research results in this fast-moving area. In seven selected chapters, the book describes the latest advances in architectures, design methods, and applications of field-programmable devices for high-performance reconfigurable systems. The contributors to this work were selected from the leading researchers and practitioners in the field. It will be valuable to anyone working or researching in the field of custom computing technology. It serves as an excellent reference, providing insight into some of the most challenging issues being examined today.
Labaugh, R. J.
A study was conducted to determine how major improvements in spacecraft computer systems can be obtained from recent advances in hardware and software technology. Investigations into integrated circuit technology indicated that the CMOS/SOS chip set being developed for the Air Force Avionics Laboratory at Wright Patterson had the best potential for improving the performance of spaceborne computer systems. An integral part of the chip set is the bit slice arithmetic and logic unit. The flexibility allowed by microprogramming, combined with the software investigations, led to the specification of a baseline architecture and instruction set.
The main objective of CSAIT 2013 is to provide a forum for researchers, educators, engineers and government officials involved in the general areas of Computational Sciences and Information Technology to disseminate their latest research results and exchange views on the future research directions of these fields. A forum like this gives academics and industry professionals an opportunity to exchange ideas, integrate the practice of computer science with academic thinking, and deepen academic work. The in-depth discussions on the subject provide an international communication platform for educational technology and scientific research for universities, engineering field experts, professionals and business executives around the world.
Nam, Chang Soo; Kim, Sung-Phil; Krusienkki, Dean; Nijholt, Antinus
This report, commissioned by the Korean American Scientists and Engineers Association (KSEA) and written with the support of the Korea Federation of Science and Technology Societies (KOFST), surveys research and development trends in the area of brain-computer interface (Brain-Computer Interfaces, BCI)
Thompson, A; Maskery, I; Leach, R K
In this review, the use of x-ray computed tomography (XCT) is examined, identifying the requirement for volumetric dimensional measurements in industrial verification of additively manufactured (AM) parts. The XCT technology and AM processes are summarised, and their historical use is documented. The use of XCT and AM as tools for medical reverse engineering is discussed, and the transition of XCT from a tool used solely for imaging to a vital metrological instrument is documented. The current states of the combined technologies are then examined in detail, separated into porosity measurements and general dimensional measurements. In the conclusions of this review, the limitation of resolution on improvement of porosity measurements and the lack of research regarding the measurement of surface texture are identified as the primary barriers to ongoing adoption of XCT in AM. The limitations of both AM and XCT regarding slow speeds and high costs, when compared to other manufacturing and measurement techniques, are also noted as general barriers to continued adoption of XCT and AM. (topical review)
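The porosity measurements identified above as a key XCT application reduce, at their simplest, to thresholding a reconstructed voxel grid and reporting the void fraction. A minimal sketch follows; the grey-value threshold of 0.5 is an assumption, whereas real metrological workflows use calibrated surface-determination methods:

```python
# Void-fraction (porosity) of a reconstructed XCT volume, given as a
# nested list of normalized grey values. Threshold is illustrative.

def porosity(voxels, threshold=0.5):
    """Fraction of voxels whose grey value falls below the material
    threshold, i.e. the pore fraction of the scanned volume."""
    total = solid = 0
    for plane in voxels:
        for row in plane:
            for value in row:
                total += 1
                solid += value >= threshold
    return 1.0 - solid / total

# Synthetic part: a 20x20x20 solid block containing a 5x5x5 cubic pore.
vol = [[[1.0] * 20 for _ in range(20)] for _ in range(20)]
for z in range(5, 10):
    for y in range(5, 10):
        vol[z][y][5:10] = [0.0] * 5
print(porosity(vol))   # 125 void voxels out of 8000 -> 0.015625
```

The review's point about resolution limits shows up directly here: pores smaller than a few voxels are smeared across the threshold and simply vanish from the count.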
Bearinger, J P
This month's issue has the following articles: (1) Advanced Materials for Our Past, Present, and Future - Commentary by Tomas Diaz de la Rubia; (2) A Defensive 'Coat' for Materials under Attack - Amorphous metal coatings provide the strength and corrosion resistance needed to protect military vessels and spent nuclear fuel containers; (3) Too Close for Comfort - Laboratory scientists are analyzing the feasibility of using nuclear explosives to disrupt or divert asteroids on a collision course with Earth; (4) Hyperion: A Titan of High-Performance Computing Systems - Livermore is collaborating with 10 computing industry leaders to create a test bed for Linux cluster hardware and software technologies; (5) Isolating Pathogens for Speedy Identification - An automated miniature device separates viruses, bacteria, genetic material, and proteins from nasal swabs and blood and urine samples for speedy identification.
T. A. Todd; J. D. Law; R. S. Herbst
Integral to the Advanced Fuel Cycle Initiative (AFCI) Program’s proposed closed nuclear fuel cycle, the fission products cesium and strontium in the dissolved spent nuclear fuel stream are to be separated and managed separately. A comprehensive literature survey is presented to identify cesium and strontium separation technologies that have the highest potential and to focus research and development efforts on these technologies. Removal of these high-heat-emitting fission products reduces the radiation fields in subsequent fuel cycle reprocessing streams and provides a significant short-term (100 yr) heat source reduction in the repository. This, along with separation of actinides, may provide a substantial future improvement in the amount of fuel that could be stored in a geologic repository. The survey and review of the candidate cesium and strontium separation technologies are presented herein. Because the AFCI program intends to manage cesium and strontium together, technologies that simultaneously separate both elements are of the greatest interest, relative to technologies that separate only one of the two elements.
Olson, J. R.
The feasibility of using low-cost, portable computer technology to help a helicopter pilot optimize flight parameters to minimize fuel consumption and takeoff and landing noise was demonstrated. Eight separate computer programs were developed for use in the helicopter cockpit using a hand-held computer. The programs provide the helicopter pilot with the ability to calculate power required, minimum fuel consumption for both range and endurance, maximum speed and a minimum noise profile for both takeoff and landing. Each program is defined by a maximum of two magnetic cards. The helicopter pilot is required to key in the proper input parameter such as gross weight, outside air temperature or pressure altitude.
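The heart of those cockpit programs is a power-required curve: minimum power gives the best-endurance speed (least fuel per hour), while minimum power per unit distance gives the best-range speed. A toy sketch under an assumed model P(V) = A/V + B*V^3, with invented constants rather than the report's data:

```python
# Power required vs airspeed for a simplified helicopter model:
# induced power falls off roughly as 1/V, parasite power grows as V^3.
# A and B are invented constants, not values from the report.

def power_required(v, A=2.0e5, B=0.05):
    return A / v + B * v**3          # watts, for airspeed v in m/s

speeds = range(20, 90)               # candidate airspeeds, m/s

# Best endurance: minimize fuel per hour, i.e. power itself.
v_endurance = min(speeds, key=power_required)
# Best range: minimize fuel per metre, i.e. power per unit speed.
v_range = min(speeds, key=lambda v: power_required(v) / v)
```

Under this model the best-range speed always exceeds the best-endurance speed, which matches the distinct "minimum fuel for range" and "minimum fuel for endurance" programs the report describes.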
This volume discusses how the novel technologies of semantic web and workflow have been integrated into the grid and grid services. It focuses on sharing resources, data replication, data management, fault tolerance, scheduling, broadcasting, and load balancing algorithms. The book discusses emerging developments in grid computing, including cloud computing, and explores large-scale computing in high energy physics, weather forecasting, and more. The contributors often use simulations to evaluate the performance of models and algorithms. In the appendices, they present two types of easy-to-use open source software written in Java
Bilgin, Mehmet Selim; Baytaroğlu, Ebru Nur; Erdem, Ali; Dilber, Erhan
The aim of this review was to investigate usage of computer-aided design/computer-aided manufacture (CAD/CAM) such as milling and rapid prototyping (RP) technologies for removable denture fabrication. An electronic search was conducted in the PubMed/MEDLINE, ScienceDirect, Google Scholar, and Web of Science databases. Databases were searched from 1987 to 2014. The search was performed using a variety of keywords including CAD/CAM, complete/partial dentures, RP, rapid manufacturing, digitally designed, milled, computerized, and machined. The identified developments (in chronological order), techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication are summarized. Using these keywords, 78 publications were initially retrieved. The abstracts of these 78 articles were screened, and 52 publications were selected for detailed reading. The full text of these articles was obtained and examined in detail. In total, 40 articles that discussed the techniques, advantages, and disadvantages of CAD/CAM and RP for removable denture fabrication were incorporated in this review, and 16 of the papers are summarized in the table. Following review of all relevant publications, it can be concluded that current innovations and technological developments in CAD/CAM and RP allow the digital planning and manufacturing of removable dentures from start to finish. According to the literature review, CAD/CAM techniques and supportive maxillomandibular relationship transfer devices are advancing rapidly. In the near future, fabricating removable dentures may become a matter of medical informatics rather than of technical staff and manual procedures. However, the methods still have several limitations. PMID:27095912
Failor, B.; Stull, S. [eds.]
There are two main feature articles in this publication. The first tells how, using off-the-shelf computers, state-of-the-art CCDs, and a network of collaborators, scientists at Lawrence Livermore National Laboratory explore the composition of dark matter. Indications are that MACHOs (MAssive Compact Halo Objects) make up the bulk of dark matter in the universe. The second article discusses a new breed of Livermore-developed, flywheel-based energy storage systems that use new materials, new technologies, and new thinking to develop a new electromechanical battery. Patents and research highlights are also listed in this publication.
Meesad, Phayung; Boonkrong, Sirapat
This book presents recent research work and results in the area of communication and information technologies. The book includes the main results of the 11th International Conference on Computing and Information Technology (IC2IT) held during July 2nd-3rd, 2015 in Bangkok, Thailand. The book is divided into the two main parts, Data Mining and Machine Learning, and Data Network and Communications. New algorithms and methods of data mining are discussed, as well as innovative applications and state-of-the-art technologies in data mining, machine learning and data networking.
Park, Young C.
The development of digital information transfer, storage and communication methods has a significant effect on education. The assimilation of pervasive computing and communication technologies marks another great step forward, with Ubiquitous Learning (U-learning) emerging for next generation learners. In the evolutionary view the 5G (or…
What's New in Computers? MMX Technology for Multimedia PCs. S Balakrishnan. Feature Article, Resonance – Journal of Science Education, Volume 2, Issue 9, September 1997, pp. 48-57.
Jonaitis, Leigh A.
Through an examination of literature in the fields of Basic Writing and developmental education, this essay provides some historical perspective and examines the prevalent discourses on the use of computer-mediated technologies in the basic writing classroom. The author uses Bertram Bruce's (1997) framework of various "stances" on…
Housner, Jerry M.; Pinson, Larry D.
The effect of NASA's Computational Structures Technology (CST) research on aerospace vehicle design and operation is discussed. The application of this research to a proposed version of a high-speed civil transport, to composite structures in aerospace, to the study of crack growth, and to resolving field problems is addressed.
Our times are characterized by strong changes in technology that have become reality in many areas of society. Compared to production, transport, services, and other sectors, education as a rule opens slowly to new technologies. However, children at home and outside school live in a technologically rich environment, and they expect education to change in accordance with the imperatives of education for the twenty-first century. In this sense, systems for automated data processing, multimedia systems, distance learning, virtual schools and other technologies are being introduced into education. They lead to an increase in students' activities, quality evaluation of their knowledge and, finally, their progress, all in accordance with individual abilities and knowledge. Mathematics and computers often appear together in the teaching process. In the teaching of mathematics, computers and software packages have a significant role. The program requirements are not dominant; the emphasis is on mathematical content and the method of presentation. Computers are especially used in solving various mathematical tasks and in self-learning of mathematics. Still, many problems that require solutions appear in the process: how to organise lectures, practice, textbooks, collected mathematical problems, written exams, and how to assign and check homework. The answers to these questions are not simple and will probably be sought continuously, with increasing use of computers in the teaching process. In this paper I have tried to answer some of these questions.
Nielsen, Kirsten Mølgaard; Nielsen, Jens Frederik Dalsgaard
This paper describes the integration of IT in the education of electronic and computer technology engineers at the Institute of Electronic Systems, Aalborg University, Denmark. At the Institute, information technology is an important tool in many aspects of the education as well as for communication...
Guide to Cloud Computing for Business and Technology Managers: From Distributed Computing to Cloudware Applications unravels the mystery of cloud computing and explains how it can transform the operating contexts of business enterprises. It provides a clear understanding of what cloud computing really means, what it can do, and when it is practical to use. Addressing the primary management and operation concerns of cloudware, including performance, measurement, monitoring, and security, this pragmatic book: Introduces the enterprise applications integration (EAI) solutions that were a first step…
Mulfari, Davide; Celesti, Antonio; Villari, Massimo; Puliafito, Antonio
Users with disabilities interact with Personal Computers (PCs) using Assistive Technology (AT) software solutions. Such applications run on the PC that a person with a disability commonly uses. However, the configuration of AT applications is not trivial at all, especially whenever the user needs to work on a PC that does not allow him/her to rely on his/her AT tools (e.g., at work, at university, in an Internet point). In this paper, we discuss how cloud computing provides a valid technological solution to enhance such a scenario. With the emergence of cloud computing, many applications are executed on top of virtual machines (VMs). Virtualization allows us to achieve a software implementation of a real computer able to execute a standard operating system and any kind of application. In this paper we propose to build personalized VMs running AT programs and settings. By using remote desktop technology, our solution enables users to control their customized virtual desktop environment by means of an HTML5-based web interface running on any computer equipped with a browser, wherever they are.
Anisenkov, Alexey; Di Girolamo, Alessandro; Alandes Pradillo, Maria
The variety of the ATLAS Distributed Computing infrastructure requires a central information system to define the topology of computing resources and to store the different parameters and configuration data which are needed by various ATLAS software components. The ATLAS Grid Information System (AGIS) is the system designed to integrate configuration and status information about resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services. Being an intermediate middleware system between clients and external information sources (such as the central BDII, GOCDB, and MyOSG), AGIS defines the relations between experiment-specific resources and physical distributed computing capabilities. Having been in production during LHC Run 1, AGIS became the central information system for Distributed Computing in ATLAS and is continuously evolving to fulfil new user requests, enable enhanced operations and follow the extension of the ATLAS Computing model. The ATLAS Computing model and the data structures used by Distributed Computing applications and services are continuously evolving and tend to fit newer requirements from the ADC community. In this note, we describe the evolution and the recent developments of AGIS functionalities related to the integration of new technologies that have recently become widely used in ATLAS Computing, such as flexible utilization of opportunistic Cloud and HPC resources, ObjectStore services integration for the Distributed Data Management (Rucio) and ATLAS workload management (PanDA) systems, and the unified storage protocol declarations required by PanDA Pilot site movers. The improvements of the information model and general updates are also shown; in particular, we explain how collaborations outside ATLAS could benefit from the system as a computing resources information catalogue. AGIS is evolving towards a common information system, not coupled to a specific experiment.
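To make the role of such a resource catalogue concrete, here is a toy in-memory analogue of the kind of lookups clients perform: which sites expose a given queue, and which storage protocol a site declares for an operation. The schema, site names and protocol labels are invented for illustration and do not reflect AGIS's actual data model:

```python
# Toy site/queue/protocol catalogue; names and schema are invented.
catalogue = {
    "SITE-A": {"queues": ["analysis", "production"],
               "protocols": {"read": "root", "write": "davs"}},
    "SITE-B": {"queues": ["production"],
               "protocols": {"read": "davs", "write": "davs"}},
}

def sites_with_queue(cat, queue):
    """All sites that declare the given queue type."""
    return sorted(site for site, info in cat.items()
                  if queue in info["queues"])

def declared_protocol(cat, site, operation):
    """Storage protocol a site declares for 'read' or 'write'."""
    return cat[site]["protocols"][operation]
```

The value of centralizing this, as the note argues, is that workload and data-management clients query one consistent catalogue instead of each maintaining its own view of the topology.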
Blackburn, J. F.
The emergence of the personal computer, the growing use of distributed systems, and the increasing demand for supercomputers and mini-supercomputers are causing a profound impact on the European computer market. An equally profound development in telecommunications is the integration of voice, data, and images in the public network systems - the Integrated Service Digital Network (ISDN). The programs being mounted in Europe to meet the challenges of these technologies are described. The Europe-wide trends and actions with respect to computers, telecommunications, and microelectronics are discussed, and the major European collaborative programs in these fields are described. Specific attention is given to the European Strategic Programme for Research and Development in Information (ESPRIT); Research in Advanced Communications for Europe (RACE); European Research Coordination Agency (Eureka) programs; Joint European Submicron Silicon Initiative (JESSI); and the recently combined programs Basic Research Industrial Technologies in Europe/European Research in Advanced Materials (BRITE/EURAM).
Annan, Nana Kofi
The emergence of mobile computing and communication technologies has brought with it an unprecedented transformation, digitalising every aspect of human activities. This transformation has produced a high degree of mobility in the way knowledge is constructed, processed, stored... and disseminated through the use of portable information and communication technologies (ICTs) such as smart phones, tablets, personal computers and laptop computers. These mobile devices use mobile communication infrastructure to promote the mobility affordances for human activities anywhere and anytime. Although... these ubiquitous ICTs can be used to facilitate teaching and learning, based on a conceptual framework which uses a mobile learning platform to supplement the traditional face-to-face method of education. The study uses both qualitative and quantitative data, with action research as the strategy of inquiry. The study...
As mobile computing technologies have become more powerful and inclusive in people's daily life, the issue of mobile assisted language learning (MALL) has also been widely explored in CALL research. Much research on MALL considers that the emerging mobile technologies have considerable potential for effective language learning. This review study…
Bîldea, Costin Sorin; Pătruţ, Cătălin; Jørgensen, Sten Bay
Process intensification in distillation systems has received much attention during the past decades, with the aim of increasing both energy and separation efficiency. Various techniques, such as internal heat-integrated distillation, membrane distillation, rotating packed bed, dividing-wall columns... and reactive distillation were studied and reported in literature. All these techniques employ the conventional continuous counter-current contact of vapor and liquid phases. Cyclic distillation technology is based on an alternative operating mode using separate phase movement, which leads to key practical... advantages in both chemical and biochemical processes. This article provides a mini-review of cyclic distillation technology. The topics covered include the working principle, design and control methods, main benefits and limitations as well as current industrial applications. Cyclic distillation can...
Failor, B.; Upadhye, R.; Wheatcraft, D. [eds.]
This review is published ten times a year to communicate, to a broad audience, Lawrence Livermore National Laboratory's scientific and technological accomplishments in fulfilling its primary missions. The feature articles are 'Taking Lasers beyond the National Ignition Facility' and 'Jumpin' Jupiter! Metallic Hydrogen'. The first article describes the ultimate goal of laser fusion as the production of electricity by inertial confinement fusion; advances in diode-laser technology promise to take another step closer to that goal. The latter article discusses a Laboratory team's efforts to provide evidence for the metallization of hydrogen based on the team's expertise in shock compression. A commentary on 'The Next Frontiers of Advanced Lasers Research' is provided, and a research highlight is given on 'Modeling Human Joints and Prosthetic Implants'.
Hules, J. [ed.]
National Energy Research Scientific Computing Center (NERSC) provides researchers with high-performance computing tools to tackle science's biggest and most challenging problems. Founded in 1974 by DOE/ER, the Controlled Thermonuclear Research Computer Center was the first unclassified supercomputer center and was the model for those that followed. Over the years the center's name was changed to the National Magnetic Fusion Energy Computer Center and then to NERSC; it was relocated to LBNL. NERSC, one of the largest unclassified scientific computing resources in the world, is the principal provider of general-purpose computing services to DOE/ER programs: Magnetic Fusion Energy, High Energy and Nuclear Physics, Basic Energy Sciences, Health and Environmental Research, and the Office of Computational and Technology Research. NERSC users are a diverse community located throughout the US and in several foreign countries. This brochure describes: the NERSC advantage, its computational resources and services, future technologies, scientific resources, and computational science of scale (interdisciplinary research over a decade or longer; examples: combustion in engines, waste management chemistry, global climate change modeling).
Robinson, M.T.; Hou, M.
In 1986, H. H. Andersen reviewed attempts to understand sputtering by computer simulation and identified several areas where further research was needed: potential energy functions for molecular dynamics (MD) modelling; the role of inelastic effects on sputtering, especially near the target surface; the modelling of surface binding in models based on the binary collision approximation (BCA); aspects of cluster emission in MD models; and angular distributions of sputtered particles. To these may be added kinetic energy distributions of sputtered particles and the relationships between MD and BCA models, as well as the development of intermediate models. Many of these topics are discussed. Recent advances in BCA modelling include the explicit evaluation of the time in strict BCA codes and the development of intermediate codes able to simulate certain many-particle problems realistically. Developments in MD modelling include the wide-spread use of many-body potentials in sputtering calculations, inclusion of realistic electron excitation and electron-phonon interactions, and several studies of cluster ion impacts on solid surfaces
This study examines the impact of technology on the USSR's social system from the perspective of Soviet ideological development. The analysis of information and computer technologies within this framework de-emphasizes both modernization theories and those that assume unchallenged Communist Party control over technological development. Previous studies have examined the level of Soviet technological achievements and the gap between this level and those in the West, many referring to ideological boundaries of Soviet technological development without, however, systematically analyzing the resulting implications for the Soviet ideology of Marxism-Leninism. This study develops a framework for analyzing the impact of new technologies in the USSR in the fields of technology, ideology, and the scientific and technological revolution. On the basis of this framework, examination turns to the relevant Soviet theoretical and technical literature and debates among Soviet elites, concluding that the introduction of information and computer technologies and the organization of computer networks has exacerbated tensions in Soviet Marxism-Leninism.
Jewelry production works with precious raw materials and must keep processing losses low. The traditional manual mode cannot meet enterprises' real-world needs, while the involvement of computer technology can solve this practical problem. At present, the main obstacle to applying computers in jewelry production is the lack of a production model that serves the whole industry chain with the computer at the core of production. This paper designs a "synchronous and diversified" production model with computer-aided design technology and rapid prototyping technology at its core, tests it on actual production cases, and achieves certain results, which are forward-looking and advanced.
Morgan, Paul; Rost, Daniel; Price, Daniel; Corcoran, Noel; Satake, Masaki; Hu, Peter; Peng, Danping; Yonenaga, Dean; Tolani, Vikram; Wolf, Yulian; Shah, Pinkesh
As optical lithography continues to extend into the sub-0.35 k1 regime, mask defect inspection and subsequent review has become tremendously challenging, and is indeed the largest component of mask manufacturing cost. The routine use of various resolution enhancement techniques (RET) has resulted in complex mask patterns, which, together with the need to detect even smaller defects due to higher MEEFs, now requires an inspection engineer to use a combination of inspection modes. This is achieved in 193-nm Aera™ mask inspection systems, wherein masks are inspected not only at their scanner-equivalent aerial exposure conditions but also at higher numerical-aperture resolution and in special reflected-light and single-die contamination modes, providing better coverage over all available patterns and defect types. Once the required defects are detected by the inspection system, comprehensively reviewing and dispositioning each defect becomes the Achilles heel of the overall mask inspection process. Traditionally, defects have been reviewed manually by an operator, which makes the process error-prone, especially given the low contrast in the convoluted aerial images. Such manual review also limits the quality and quantity of classifications in terms of the different types of characterization and the number of defects that can practically be reviewed by a person. In some ways, such manual classification limits the capability of the inspection tool itself from being set up to detect smaller defects, since it often results in many more defects that must then be manually reviewed. Paper 8681-109 at SPIE Advanced Lithography 2013 discussed an innovative approach to actinic mask defect review using computational technology, focused on Die-to-Die transmitted aerial and high-resolution inspections. In this approach, every defect is characterized in two different ways, viz., quantitatively in terms of its print impact on wafer, and qualitatively in terms of its nature and origin in
To overcome difficulties associated with conventional techniques, impressions with IOS (intraoral scanner) and CAD/CAM (computer-aided design and manufacturing) technologies were developed for dental practice. The last decade has seen an increasing number of optical IOS devices, based on different technologies, the choice of which may impact clinical use. To allow an informed choice before purchasing or renewing an IOS, this article first summarizes the technologies currently used (light projection, distance object determination, and reconstruction). In the second section, the clinical considerations of each strategy, such as handling, learning curve, powdering, scanning paths, tracking, and mesh quality, are discussed. The last section is dedicated to the accuracy of files and of the intermaxillary relationship registered with IOS, as the rendering of files in the graphical user interface is often misleading. This overview leads to the conclusion that current IOS devices are suitable for routine practice, although differences exist between the technologies employed. An important aspect highlighted in this review is the reduction in the volume of hardware, which has led to an increase in the importance of software-based technologies.
The main function of traditional proppants is to provide and maintain conductive fractures during well production, where proppants should meet closure-stress requirements and show resistance to diagenesis under downhole conditions. Many different proppants have been developed in the oil & gas industry, with various types, sizes, shapes, and applications. While most proppants are simply made of silica or ceramics, advanced proppants such as ultra-lightweight proppants are also desirable since they reduce proppant settling and require low-viscosity fluids for transport. Additionally, multifunctional proppants may be used as a crude way to detect hydraulic fracture geometry or as matrices to slowly release downhole chemical additives, besides their basic function of maintaining conductive hydraulic fractures. Different from the conventional approach, where proppant is pumped downhole in frac fluids, a revolutionary way to generate in-situ spherical proppants has been reported recently. This paper presents a comprehensive review of over 100 papers published in the past several decades on the subject. The objectives of this review study are to provide an overview of current proppant technologies, including different types, compositions, and shapes of proppants; new technologies to pump and organize proppants downhole, such as channel fracturing; and also in-situ proppant generation. Finally, the paper sheds light on the current challenges and emphasizes the need for new proppant development for unconventional resources.
To meet the challenges of a radically new and technologically demanding century, the Federal Computing, Information, and Communications (CIC) programs are investing in long-term research and development (R and D) to advance computing, information, and communications in the United States. CIC R and D programs help Federal departments and agencies to fulfill their evolving missions, assure the long-term national security, better understand and manage the physical environment, improve health care, help improve the teaching of children, provide tools for lifelong training and distance learning to the workforce, and sustain critical US economic competitiveness. One of the nine committees of the National Science and Technology Council (NSTC), the Committee on Computing, Information, and Communications (CCIC)--through its CIC R and D Subcommittee--coordinates R and D programs conducted by twelve Federal departments and agencies in cooperation with US academia and industry. These R and D programs are organized into five Program Component Areas: (1) HECC--High End Computing and Computation; (2) LSN--Large Scale Networking, including the Next Generation Internet Initiative; (3) HCS--High Confidence Systems; (4) HuCS--Human Centered Systems; and (5) ETHR--Education, Training, and Human Resources. A brief synopsis of FY 1997 accomplishments and FY 1998 goals by PCA is presented. This report, which supplements the President's Fiscal Year 1998 Budget, describes the interagency CIC programs.
Xie, Bo; Fan, Xiang; Li, Sijian
Navigation based on computer vision technology, which is highly autonomous, offers high precision, and is not susceptible to electrical interference, has attracted more and more attention in the field of UAV navigation research. Early navigation projects based on computer vision technology were mainly applied to autonomous ground robots. In recent years, visual navigation systems have been widely applied to unmanned aircraft, deep-space probes, and underwater robots, which has further stimulated research on integrated navigation algorithms based on computer vision technology. In China, with the development of many types of UAVs and the third phase of the lunar exploration project under way, there has been significant progress in the study of visual navigation. The paper surveys the development of vision-based navigation in UAV research and concludes that visual navigation is mainly applied to three areas. (1) Acquisition of UAV navigation parameters. Parameters including UAV attitude, position, and velocity can be obtained from the relationship between sensor images and the carrier's attitude, between instant matching images and reference images, and between the carrier's velocity and features of sequential images. (2) Autonomous obstacle avoidance. There are many ways to achieve obstacle avoidance in UAV navigation; the methods based on computer vision technology, including feature matching, template matching, and image frames, are mainly introduced. (3) Target tracking and positioning. Using the acquired images, UAV position is calculated by the optical flow method, the MeanShift and CamShift algorithms, Kalman filtering, and particle filter algorithms. The paper also describes three kinds of mainstream visual systems. (1) High-speed visual systems, which use a parallel structure, with which image detection and processing are
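The tracking techniques named above (optical flow, MeanShift/CamShift, Kalman and particle filtering) share a common state-estimation core. As a hedged illustration, here is a minimal one-dimensional constant-velocity Kalman filter in pure Python; the noise parameters `q` and `r` and the initialization are illustrative assumptions following the standard textbook form, not any specific system from the review.

```python
# Minimal 1-D constant-velocity Kalman filter: state is [position, velocity],
# measurements are noisy positions. Noise parameters are illustrative.

def kalman_track(measurements, dt=1.0, q=1e-3, r=0.5):
    """Filter a sequence of position measurements; return filtered positions."""
    x = [measurements[0], 0.0]          # state estimate: [position, velocity]
    P = [[1.0, 0.0], [0.0, 1.0]]        # state covariance
    out = []
    for z in measurements:
        # Predict: x <- F x, P <- F P F^T + Q, with F = [[1, dt], [0, 1]]
        x = [x[0] + dt * x[1], x[1]]
        P = [[P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1],
              P[1][1] + q]]
        # Update with measurement z (H = [1, 0], measurement variance r)
        S = P[0][0] + r                 # innovation variance
        K = [P[0][0] / S, P[1][0] / S]  # Kalman gain
        y = z - x[0]                    # innovation (measurement residual)
        x = [x[0] + K[0] * y, x[1] + K[1] * y]
        P = [[(1 - K[0]) * P[0][0], (1 - K[0]) * P[0][1]],
             [P[1][0] - K[1] * P[0][0], P[1][1] - K[1] * P[0][1]]]
        out.append(x[0])
    return out
```

Because the constant-velocity model matches a linear target trajectory exactly, the filtered estimates converge to the true positions; CamShift or particle filters would replace this linear core when the motion or measurement model is nonlinear.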
Available treatment technologies for methyl tertiary butyl ether (MTBE) contamination in soil, groundwater, and recovered groundwater are reviewed and assessed. MTBE contamination is becoming an important issue due to the increasing prevalence and regulation of this gasoline additive. In addition, MTBE is more soluble and more mobile in groundwater than most hydrocarbons, so it is usually the first gasoline constituent to reach sensitive receptors. Treatment of MTBE is complicated by its Henry's constant, which is lower than most other gasoline constituents. Furthermore, evidence of biodegradability of MTBE is mixed, and MTBE does not degrade rapidly abiotically. Groundwater pumping is usually employed to contain and collect MTBE-contaminated groundwater, often successfully because of its high aqueous solubility. Air sparging/soil vapor extraction is also successfully employed to treat MTBE, but its effectiveness is reduced by the low Henry's constant of MTBE. Sparging and other aerobic bioremediation approaches are hampered by the poor biodegradability of MTBE. Oxidation technologies, such as ozone injection, hold promise for rapid in situ remediation of MTBE. Treatment of recovered groundwater contaminated with MTBE is also problematic. MTBE adsorbs poorly to granular activated carbon; advanced oxidation processes are effective on MTBE, but entail high capital and operating costs; bioreactors are of questionable effectiveness on MTBE. Air stripping is usually the most cost-effective treatment technology for MTBE so long as the off gas from the air stripper can be discharged without treatment. However, off gas treatment is expensive, so groundwater is sometimes heated to reduce the requirement for stripping air
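The role of Henry's constant in air stripping can be made concrete with the stripping factor S = H · (G/L), the dimensionless Henry's constant times the air-to-water flow ratio. The sketch below uses rough, order-of-magnitude H values as illustrative assumptions, not design data from the review:

```python
# Stripping factor S = H * (G/L): roughly, S > 1 is needed for effective
# air stripping. H values below are illustrative order-of-magnitude
# dimensionless Henry's constants, not design data.

def stripping_factor(H_dimensionless, air_to_water_ratio):
    """Return the stripping factor for a given compound and flow ratio."""
    return H_dimensionless * air_to_water_ratio

def min_air_to_water(H_dimensionless):
    """Air:water ratio at which S = 1 (a crude feasibility threshold)."""
    return 1.0 / H_dimensionless

# A compound like benzene (H ~ 0.2) reaches S = 1 at a modest air:water
# ratio; MTBE's much lower H (~0.02) requires roughly ten times more air,
# which is why the abstract calls its treatment problematic.
for name, H in [("benzene", 0.22), ("MTBE", 0.02)]:
    print(name, round(min_air_to_water(H), 1))
```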
Vogt, R. L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Meissner, C. N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kotta, P. R. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
At Lawrence Livermore National Laboratory, we focus on science and technology research to ensure our nation’s security. We also apply that expertise to solve other important national problems in energy, bioscience, and the environment. Science & Technology Review is published eight times a year to communicate, to a broad audience, the Laboratory’s scientific and technological accomplishments in fulfilling its primary missions. The publication’s goal is to help readers understand these accomplishments and appreciate their value to the individual citizen, the nation, and the world. The Laboratory is operated by Lawrence Livermore National Security, LLC (LLNS), for the Department of Energy’s National Nuclear Security Administration. LLNS is a partnership involving Bechtel National, University of California, Babcock & Wilcox, Washington Division of URS Corporation, and Battelle in affiliation with Texas A&M University. More information about LLNS is available online at www.llnsllc.com. Please address any correspondence (including name and address changes) to S&TR, Mail Stop L-664, Lawrence Livermore National Laboratory, P.O. Box 808, Livermore, California 94551, or telephone (925) 423-3893. Our e-mail address is firstname.lastname@example.org. S&TR is available on the Web at str.llnl.gov.
Lundstrom, Stephen F.; Larsen, Ronald L.
Aircraft have become more and more dependent on computers (information processing) for improved performance and safety. It is clear that this activity will grow, since information processing technology has advanced by a factor of 10 every 5 years for the past 35 years and will continue to do so. Breakthroughs in device technology, from vacuum tubes through transistors to integrated circuits, contribute to this rapid pace. This progress is nearly matched by similar, though not as dramatic, advances in numerical software and algorithms. Progress has not been easy. Many technical and nontechnical challenges were surmounted. The outlook is for continued growth in capability but will require surmounting new challenges. The technology forecast presented in this report has been developed by extrapolating current trends and assessing the possibilities of several high-risk research topics. In the process, critical problem areas that require research and development emphasis have been identified. The outlook assumes a positive perspective; the projected capabilities are possible by the year 2000, and adequate resources will be made available to achieve them. Computer and information technology forecasts and the potential impacts of this technology on aeronautics are identified. Critical issues and technical challenges underlying the achievement of forecasted performance and benefits are addressed.
Quirk, W.A.; Canada, J.; de Vore, L.; Gleason, K.; Kirvel, R.; Kroopnick, H.; McElroy, L.; Sanford, N.M.; Van Dyke, P.T. [eds.]
The Lawrence Livermore National Laboratory was established in 1952 to do research on nuclear weapons and magnetic fusion energy. Since then other major programs have been added, including laser fusion and laser isotope separation, biomedical and environmental science, strategic defense, and applied energy technology. These programs require basic research in chemistry, materials science, computer science, engineering and physics. This bulletin is published on a monthly basis to report on unclassified work in all of the programs. There are two articles in this issue. Herbert F. York reminisces about the early days in Livermore, emphasizing the legacy of E.O. Lawrence, and comments on the role of the Laboratory in the future. COG, a new, high-resolution code for modeling radiation transport is described. The code is a new Monte Carlo neutron/photon transport code that solves complex radiation shielding and nuclear criticality problems. It is now available for high-speed desktop workstations as well as mainframes.
Bearinger, J P
This month's issue has the following articles: (1) Innovation Is Key to Prosperity and Security--Commentary by Erik J. Stenehjem; (2) Taking Ultrafast Snapshots of Material Changes--The dynamic transmission electron microscope captures images a million times faster than conventional instruments; (3) Automated Technology for Laser Fusion Systems--The first completely computer-controlled system for aligning laser beams is helping make fusion research possible; (4) Protecting the Nation through Secure Cargo--A new device tracks and monitors cargo containers during transit to improve national security; (5) Atom by Atom, Layer by Layer--Extremely thin sandwiches of materials called nanolaminates exhibit remarkable, highly useful properties; and (6) Predicting the Bizarre Properties of Plutonium--A supercomputing 'grand challenge' team has made highly precise predictions of the behavior of plutonium's most important solid phase.
Bearinger, J P
This month's issue has the following articles: (1) Innovative Materials Rise to the Radiation Challenge - Commentary by Bruce Warner; (2) The Hunt for Better Radiation Detection - New materials will help radiation detectors pick up weak signals and accurately identify illicit radioactive sources; (3) Time-Critical Technology Identifies Deadly Bloodborne Pathogens - A portable device can simultaneously distinguish up to five bloodborne pathogens in just minutes; (4) Defending Computer Networks against Attack - A Laboratory effort takes a new approach to detecting increasingly sophisticated cyber attacks; and (5) Imaging Cargo's Inner Secrets - Livermore-University of California collaborators are modeling a new radiographic technique for identifying nuclear materials concealed inside cargo containers.
The use of computational modelling in all areas of science and engineering has in recent years escalated to the point where it underpins much of current research. However, the distinction must be made between computer systems in which no knowledge of the underlying computer technology or computational theory is required and those areas of research where the mastery of computational techniques is of great value, almost essential, for final year undergraduates or masters students planning to pursue a career in research. Such a field of research in the latter category is continuum mechanics, and in particular non-linear material behaviour, which is the core topic of this book. The focus of the book on computational plasticity embodies techniques of relevance not only to academic researchers, but also of interest to industrialists engaged in the production of components using bulk or sheet forming processes. Of particular interest is the guidance on how to create modules for use with the commercial system Abaqus for specific types of material behaviour. The book is in two parts, the first of which contains six chapters, starting with microplasticity, but predominantly on continuum plasticity. The first chapter on microplasticity gives a brief description of the grain structure of metals and the existence of slip systems within the grains. This provides an introduction to the concept of incompressibility during plastic deformation, the nature of plastic yield and the importance of the critically resolved shear stress on the slip planes (Schmid's law). Some knowledge of the notation commonly used to describe slip systems is assumed, which will be familiar to students of metallurgy, but anyone with a more general engineering background may need to undertake additional reading to understand the various descriptions. Any lack of knowledge in this area, however, is of no disadvantage as it serves only as an introduction and the book moves on quickly to continuum plasticity
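Schmid's law, mentioned in the microplasticity chapter, can be stated compactly: the resolved shear stress on a slip system is τ = σ cos φ cos λ, where φ is the angle between the load axis and the slip-plane normal and λ the angle between the load axis and the slip direction. A small worked example (the numerical values are arbitrary, for illustration only):

```python
# Schmid's law: resolved shear stress tau = sigma * cos(phi) * cos(lambda).
# Slip initiates when tau reaches the critically resolved shear stress.
import math

def resolved_shear_stress(sigma, phi_deg, lambda_deg):
    """Applied uniaxial stress sigma resolved onto a slip system (same units)."""
    return (sigma
            * math.cos(math.radians(phi_deg))
            * math.cos(math.radians(lambda_deg)))

# The Schmid factor cos(phi)*cos(lambda) peaks at 0.5 when phi = lambda = 45
# degrees, so a 100 MPa tensile load resolves to about 50 MPa of shear:
tau = resolved_shear_stress(100.0, 45.0, 45.0)
print(round(tau, 3))
```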
Sarkar, Chandan Kumar
Responding to recent developments and a growing VLSI circuit manufacturing market, Technology Computer Aided Design: Simulation for VLSI MOSFET examines advanced MOSFET processes and devices through TCAD numerical simulations. The book provides a balanced summary of TCAD and MOSFET basic concepts, equations, physics, and new technologies related to TCAD and MOSFET. A firm grasp of these concepts allows for the design of better models, thus streamlining the design process, saving time and money. This book places emphasis on the importance of modeling and simulations of VLSI MOS transistors and
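As a rough illustration of the kind of compact I-V description that TCAD device simulations are calibrated against, here is the classical long-channel square-law MOSFET model; the parameter values are illustrative assumptions, not taken from the book:

```python
# Long-channel square-law MOSFET model (subthreshold and channel-length
# modulation ignored). Parameter values are illustrative assumptions.

def drain_current(vgs, vds, vt=0.7, k=2e-4):
    """Drain current in amperes; k = mu*Cox*W/L in A/V^2, vt in volts."""
    vov = vgs - vt                          # overdrive voltage
    if vov <= 0:
        return 0.0                          # cutoff
    if vds < vov:
        return k * (vov - vds / 2.0) * vds  # triode (linear) region
    return 0.5 * k * vov * vov              # saturation region

# With vt = 0.7 V and k = 2e-4 A/V^2, vgs = 1.7 V and vds = 2.0 V put the
# device in saturation, giving 0.5 * 2e-4 * 1.0^2 = 1e-4 A:
print(drain_current(1.7, 2.0))
```

In a TCAD flow, numerically simulated device characteristics are fitted to compact models of this kind before circuit-level design; modern short-channel transistors need far more elaborate models, which is part of the book's motivation.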
Noblin, Alice; Cortelyou-Ward, Kendall; Servan, Rosa M
Cloud computing technology has the potential to transform medical practices and improve patient engagement and quality of care. However, issues such as privacy and security and "fit" can make incorporation of the cloud an intimidating decision for many physicians. This article summarizes the four most common types of clouds and discusses their ideal uses, how they engage patients, and how they improve the quality of care offered. This technology also can be used to meet Meaningful Use requirements 1 and 2; and, if speculation is correct, the cloud will provide the necessary support needed for Meaningful Use 3 as well.
Millán, J. d. R.; Rupp, R.; Müller-Putz, G. R.; Murray-Smith, R.; Giugliemma, C.; Tangermann, M.; Vidaurre, C.; Cincotti, F.; Kübler, A.; Leeb, R.; Neuper, C.; Müller, K.-R.; Mattia, D.
In recent years, new research has brought the field of electroencephalogram (EEG)-based brain–computer interfacing (BCI) out of its infancy and into a phase of relative maturity through many demonstrated prototypes such as brain-controlled wheelchairs, keyboards, and computer games. With this proof-of-concept phase in the past, the time is now ripe to focus on the development of practical BCI technologies that can be brought out of the lab and into real-world applications. In particular, we focus on the prospect of improving the lives of countless disabled individuals through a combination of BCI technology with existing assistive technologies (AT). In pursuit of more practical BCIs for use outside of the lab, in this paper, we identify four application areas where disabled individuals could greatly benefit from advancements in BCI technology, namely, “Communication and Control”, “Motor Substitution”, “Entertainment”, and “Motor Recovery”. We review the current state of the art and possible future developments, while discussing the main research issues in these four areas. In particular, we expect the most progress in the development of technologies such as hybrid BCI architectures, user–machine adaptation algorithms, the exploitation of users’ mental states for BCI reliability and confidence measures, the incorporation of principles in human–computer interaction (HCI) to improve BCI usability, and the development of novel BCI technology including better EEG devices. PMID:20877434
Sameena; Mohd Inayatullah
Computers are probably one of the biggest scientific inventions of the modern era, and they have since become an integral part of our lives. The increased usage of computers has led to a variety of ocular symptoms, which include eye strain, tired eyes, irritation, redness, blurred vision, and diplopia, collectively referred to as Computer Vision Syndrome (CVS). CVS may have a significant impact not only on visual comfort but also on occupational productivit...
The rapid development of computer networks and systems brings both convenience and security threats to users. Security threats include network security and data security. Network security refers to the reliability, confidentiality, integrity and availability of the information in the system. The main objective of network security is to maintain the authenticity, integrity, confidentiality, and availability of the network. This paper introduces the details of the technologies used in...
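One concrete mechanism behind the integrity and authenticity goals listed above is a message authentication code. A minimal sketch using Python's standard `hmac` module (the key and message are illustrative):

```python
# HMAC-SHA256 message authentication: a shared-key tag lets the receiver
# detect any tampering with the message (integrity) and confirm it came
# from a key holder (authenticity). Key and message are illustrative.
import hashlib
import hmac

def tag(key: bytes, message: bytes) -> bytes:
    """Compute an HMAC-SHA256 authentication tag for the message."""
    return hmac.new(key, message, hashlib.sha256).digest()

def verify(key: bytes, message: bytes, mac: bytes) -> bool:
    """Check a received (message, tag) pair; constant-time comparison."""
    return hmac.compare_digest(tag(key, message), mac)

key = b"shared-secret"
msg = b"transfer 10 units to alice"
mac = tag(key, msg)
print(verify(key, msg, mac))                              # authentic message
print(verify(key, b"transfer 99 units to mallory", mac))  # tampered message
```

Note that `hmac.compare_digest` rather than `==` is used for the comparison, avoiding a timing side channel, one of the attack classes such surveys catalogue.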
Contemporary music education started late in China; built on Western teaching theories, it has formed its own unique system, which has a great influence on present-day computer music technology. This paper analyzes the advantages and disadvantages of contemporary music education's influence on the development of Chinese classroom music, proposes solutions to the existing problems, and summarizes the practical lessons of contemporary music education's impact.
As human beings, we trust our five senses, which allow us to experience the world and communicate. From birth, the amount of data we can acquire every day is impressive, and such richness reflects the complexity of humankind in arts, technology, etc. The advent of computers and the consequent progress in Data Science and Artificial Intelligence showed how large amounts of data can contain some sort of “intelligence” themselves. Machines learn and create a superimposed layer of reali...
Dragt, Alex J.
Dr. Dragt of the University of Maryland is one of the Institutional Principal Investigators for the SciDAC Accelerator Modeling Project Advanced Computing for 21st Century Accelerator Science and Technology whose principal investigators are Dr. Kwok Ko (Stanford Linear Accelerator Center) and Dr. Robert Ryne (Lawrence Berkeley National Laboratory). This report covers the activities of Dr. Dragt while at Berkeley during spring 2002 and at Maryland during fall 2003
Garth, Christoph; Deines, Eduard; Joy, Kenneth I.; Bethel, E. Wes; Childs, Hank; Weber, Gunther; Ahern, Sean; Pugmire, Dave; Sanderson, Allen; Johnson, Chris
State-of-the-art computational science simulations generate large-scale vector field data sets. Visualization and analysis is a key aspect of obtaining insight into these data sets and represents an important challenge. This article discusses possibilities and challenges of modern vector field visualization and focuses on methods and techniques developed in the SciDAC Visualization and Analytics Center for Enabling Technologies (VACET) and deployed in the open-source visualization tool, VisIt.
Science research in general and magnetic fusion research in particular continue to grow in size and complexity resulting in a concurrent growth in collaborations between experimental sites and laboratories worldwide. The simultaneous increase in wide area network speeds has made it practical to envision distributed working environments that are as productive as traditionally collocated work. In computing power, it has become reasonable to decouple production and consumption resulting in the ability to construct computing grids in a similar manner as the electrical power grid. Grid computing, the secure integration of computer systems over high speed networks to provide on-demand access to data analysis capabilities and related functions, is being deployed as an alternative to traditional resource sharing among institutions. For human interaction, advanced collaborative environments are being researched and deployed to have distributed group work that is as productive as traditional meetings. The DOE Scientific Discovery through Advanced Computing Program initiative has sponsored several collaboratory projects, including the National Fusion Collaboratory Project, to utilize recent advances in grid computing and advanced collaborative environments to further research in several specific scientific domains. For fusion, the collaborative technology being deployed is being used in present day research and is also scalable to future research, in particular, to the International Thermonuclear Experimental Reactor experiment that will require extensive collaboration capability worldwide. This paper briefly reviews the concepts of grid computing and advanced collaborative environments and gives specific examples of how these technologies are being used in fusion research today
Diaz Tovar, Carlos Axel
increase along with growing interest in biofuels, the oleochemical industry faces major challenges in the coming years in terms of the design and development of better products and more sustainable processes to make them. Computer-aided methods and tools for process synthesis, modeling and simulation ... are widely used for the design, analysis, and optimization of processes in the chemical and petrochemical industries. These computer-aided tools have helped the chemical industry evolve beyond commodities toward specialty chemicals and ‘consumer-oriented chemical-based products’. Unfortunately ... to develop systematic computer-aided methods (property models) and tools (database) related to the prediction of the physical properties needed for the design and analysis of processes employing lipid technologies. The methods and tools include the development of a lipid database (CAPEC...
Toth, L.M.; Gat, U.; Del Cul, G.D.; Dai, S.; Williams, D.F.
The current status of molten salt reactor development is discussed with reference to the experience from the Oak Ridge Molten Salt Reactor Experiment (MSRE). An assessment of the future of this reactor system is reviewed, with consideration of both advantages and disadvantages. Application of this concept to ADTT (accelerator-driven transmutation technology) needs appears to be feasible by drawing on the MSRE experience. Key chemical considerations remain: solubility, redox behavior, and chemical activity; their importance to ADTT planning is briefly explained. Priorities in the future development of molten salts for these applications are listed, with the foremost being acceptance of the 2LiF-BeF2 solvent system. 8 refs, 2 figs
The area of technology classified as heat pumps generally refers to refrigerators, heat pumps and heat engines. This review is restricted to the literature on magnetic refrigerators and magnetic heat pumps which are referred to interchangeably. Significant progress has been made on the development of engineering prototypes of cryogenic, nonregenerative magnetic refrigerators utilizing conductive heat transfer in the 0.1 K to 20 K temperature range. Advances have also been made in analysis of regenerative magnetic refrigerators and heat pumps utilizing the active magnetic regeneration (AMR) concept. Units based on AMR are being modeled, designed and/or built to operate in various temperature ranges including 1.8-4.5 K, 4-15 K, 15-85 K, and 270-320 K. The near room temperature units have been scaled to 50 kW as both refrigerators and heat pumps. The progress of magnetic refrigeration over the last three years is summarized and discussed
Verma, Neelam; Bhardwaj, Atul
Pesticides, owing to their profitable outcomes, are widely used in agriculture to enhance crop production. Because of their pest-removal properties, pesticides of various classes have been designed to persist in the environment for a long time after application so as to achieve maximum effectiveness. Apart from their recalcitrant structure and agricultural benefits, pesticides also impose acute toxicological effects on various other life forms. Their accumulation in living systems can prove detrimental if established at higher concentrations, so their prompt and accurate analysis is a crucial matter of concern. Conventional techniques used for pesticide detection, such as chromatographic techniques (HPLC, GC, etc.), are associated with various limitations: low sensitivity and efficiency, time consumption, labor intensity, the requirement for expensive equipment and highly trained technicians, and many more. There is thus a need for methods that can detect these neurotoxic compounds sensitively, selectively, rapidly, and easily in the field. The present work is a brief review of pesticide effects, the current usage scenario, permissible limits in various foodstuffs, and 21st-century advancements in biosensor technology for pesticide detection. Owing to their exceptional performance capabilities, ease of operation, and on-site applicability, numerous biosensors have been developed throughout the globe for biomonitoring of various environmental samples for pesticide evaluation. To date, based on the sensing element (enzyme-based, antibody-based, etc.) and the type of detection method used (electrochemical, optical, piezoelectric, etc.), a number of biosensors have been developed for pesticide detection. In the present communication, the authors summarize 21st-century approaches of biosensor technology for pesticide detection such as enzyme-based biosensors, immunosensors, aptamers, molecularly imprinted polymers, and
There is a Significant Relationship Between Computer Attitudes and Library Anxiety Among African American Graduate Students. A review of: Jiao, Qun G., and Anthony J. Onwuegbuzie. “The Impact of Information Technology on Library Anxiety: The Role of Computer Attitudes.” Information Technology & Libraries 23.4 (Dec. 2004): 138-44.
Objective – To investigate whether African American students’ computer attitudes predict levels of library anxiety. Design – A user study in which two instruments were administered to a group of graduate students to measure computer attitudes and library anxiety. Setting – The College of Education at a historically black college and university in the United States of America. Subjects – Ninety-four, predominantly female, African American graduate students, ranging in age from 22 to 62 years and enrolled in either a statistics or a measurement course. Methods – Two instruments, the Computer Attitude Scale (CAS) and the Library Anxiety Scale (LAS), were administered to all the study participants. The CAS contains forty Likert-type items that assess individuals’ attitudes toward computers and their use. It includes four subscales, which can be administered separately: 1. Anxiety or fear of computers; 2. Confidence in the ability to use computers; 3. Liking or enjoying working with computers; 4. Computer usefulness. The LAS contains forty-three 5-point Likert-format items that assess levels of library anxiety experienced by college students. It has five subscales: 1. Barriers with staff; 2. Affective barriers; 3. Comfort with the library; 4. Knowledge of the library; 5. Mechanical barriers. Main results – There were twenty correlations between the library anxiety subscale scores and the computer attitude subscale scores. Four of these correlations were statistically significant. Liking or enjoying working with computers was statistically significantly linked to affective barriers, comfort with the library, and knowledge of the library. There was also a statistically significant association between an attitude of computer usefulness and knowledge of the library. Conclusion – These findings suggest that in this group of students there is a medium to strong relationship between computer
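The study's main results rest on Pearson correlations between subscale scores. As an illustrative sketch (the scores below are invented for the example, not the study's data), the statistic can be computed in pure Python:

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical subscale totals for five respondents (not the study's data):
liking = [18, 25, 30, 22, 27]      # CAS "liking" subscale
knowledge = [12, 17, 20, 13, 19]   # LAS "knowledge of the library" subscale
r = pearson_r(liking, knowledge)   # r near +1 would indicate a strong positive link
```

A statistically significant positive r between subscales such as these two is the kind of association the study reports.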
The Solid Oxide Fuel Cell (SOFC) is a type of high-temperature fuel cell that appears to be one of the most promising technologies for efficient and clean energy production across a wide range of applications, from small units to large-scale power plants. This paper reviews the current status of and related research on SOFC technologies. In detail, research trends in the development of SOFC components (i.e., anode, electrolyte, cathode, and interconnect) are presented. The current important SOFC designs (i.e., the seal-less tubular design, segmented-cell-in-series design, monolithic design, and flat-plate design) are then presented as examples. In addition, the possible operating modes of SOFCs (i.e., external reforming, indirect internal reforming, and direct internal reforming) are discussed. Lastly, research studies on applications of SOFCs with co-generation, i.e., SOFC with combined heat and power (SOFC-CHP), SOFC with gas turbine (SOFC-GT), and SOFC with chemical production, are given.
Brownsell, Simon; Bradley, David; Blackburn, Steve; Cardinaux, Fabien; Hawley, Mark S
The evidence base for lifestyle monitoring is relatively weak, even though there are significant numbers of commercial installations around the world. We conducted a literature review to summarize the current position with regard to lifestyle monitoring based on sensors in the home. In total, 74 papers met the inclusion criteria. Only four papers reported trials involving 20 or more subjects, with a further 21 papers reporting trials involving one or more subjects. Most papers (n = 49) were concerned with technology development. Motion detection was the most common of the technologies employed, followed by door and electrical appliance usage. The predominant monitoring strategy was that of detecting changes in activity. However, little attention has been given to determining when or how changes in the profile of activity should be used to raise a call for assistance from a health or care professional. Lifestyle monitoring remains a relatively immature research area in which there is little detailed understanding of how to provide comprehensive and effective systems.
Wang, Wei-Cheng [National Renewable Energy Lab. (NREL), Golden, CO (United States); Tao, Ling [National Renewable Energy Lab. (NREL), Golden, CO (United States); Markham, Jennifer [National Renewable Energy Lab. (NREL), Golden, CO (United States); Zhang, Yanan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Tan, Eric [National Renewable Energy Lab. (NREL), Golden, CO (United States); Batan, Liaw [National Renewable Energy Lab. (NREL), Golden, CO (United States); Warner, Ethan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Biddy, Mary [National Renewable Energy Lab. (NREL), Golden, CO (United States)
Biomass-derived jet (biojet) fuel has become a key element in the aviation industry’s strategy to reduce operating costs and environmental impacts. Researchers from the oil-refining industry, the aviation industry, government, biofuel companies, agricultural organizations, and academia are working toward developing commercially viable and sustainable processes that produce long-lasting renewable jet fuels with low production costs and low greenhouse gas emissions. Additionally, jet fuels must meet ASTM International specifications and potentially be a 100% drop-in replacement for the current petroleum jet fuel. The combustion characteristics and engine tests demonstrate the benefits of running the aviation gas turbine with biojet fuels. In this study, the current technologies for producing renewable jet fuels, categorized by alcohols-to-jet, oil-to-jet, syngas-to-jet, and sugar-to-jet pathways, are reviewed. The main challenges for each technology pathway, including feedstock availability, conceptual process design, process economics, life-cycle assessment of greenhouse gas emissions, and commercial readiness, are discussed. Although the feedstock price and availability and energy intensity of the process are significant barriers, biomass-derived jet fuel has the potential to replace a significant portion of conventional jet fuel required to meet commercial and military demand.
Green, Kenneth C.
This report presents findings of a June 1998 survey of computing officials at 1,623 two- and four-year U.S. colleges and universities concerning the use of computer technology. The survey found that computing and information technology (IT) are now core components of the campus environment and classroom experience. However, key aspects of IT…
Cohen, B.I.; Cohen, R.H.; Byers, J.A.
The LLNL MFE Theory and Computations Program supports computational efforts in the following areas: (1) Magnetohydrodynamic equilibrium and stability; (2) Fluid and kinetic edge plasma simulation and modeling; (3) Kinetic and fluid core turbulent transport simulation; (4) Comprehensive tokamak modeling (CORSICA Project) - transport, MHD equilibrium and stability, edge physics, heating, turbulent transport, etc.; and (5) Other: ECRH ray tracing, reflectometry, plasma processing. This report discusses algorithms and codes pertaining to these areas.
Advanced scientific computational methods and their applications to nuclear technologies. (1) Overview of scientific computational methods, introduction of continuum simulation methods and their applications (1)
Oka, Yoshiaki; Okuda, Hiroshi
Scientific computational methods have advanced remarkably with the progress of nuclear development. They have played the role of a weft connecting the various realms of nuclear engineering, and an introductory course on advanced scientific computational methods and their applications to nuclear technologies was therefore prepared in serial form. This first issue gives an overview and introduces continuum simulation methods. The finite element method, as one of their applications, is also reviewed. (T. Tanaka)
V. Madhu Viswanatham
The expansion of networking infrastructure has provided a novel way to store and access resources through a reliable, convenient, and affordable technology called the cloud. The cloud has become very popular, established its dominance in many recent innovations, and strongly influenced trends in business process management through the advantage of shared resources. Its disaster tolerance, on-demand scalability, flexible deployment, and cost effectiveness have led emerging technologies such as the Internet of Things to adopt the cloud as their data and processing center. However, along with the implementation of cloud-based technologies, we must also address the issues involved in their realization. This paper reviews the advancements, scope, and issues involved in realizing secure cloud-powered environments.
Silver, A.; Kleinsasser, A.; Kerber, G.; Herr, Q.; Dorojevets, M.; Bunyk, P.; Abelson, L.
This paper describes our programme to develop and demonstrate ultra-high-performance single flux quantum (SFQ) VLSI technology that will enable superconducting digital processors for petaFLOPS-scale computing. In the hybrid-technology, multi-threaded architecture, the computational engine to power a petaFLOPS machine at affordable power will consist of 4096 SFQ multi-chip processors, with 50 to 100 GHz clock frequency, and associated cryogenic RAM. We present the superconducting technology requirements, progress to date and our plan to meet these requirements. We improved SFQ Nb VLSI by two generations, to an 8 kA cm-2, 1.25 µm junction process, incorporated new CAD tools into our methodology, and demonstrated methods for recycling the bias current and for data communication at speeds up to 60 Gb s-1, both on and between chips, through passive transmission lines. FLUX-1 is the most ambitious project implemented in SFQ technology to date, a prototype general-purpose 8 bit microprocessor chip. We are testing the FLUX-1 chip (5K gates, 20 GHz clock) and designing a 32 bit floating-point SFQ multiplier with vector-register memory. We report correct operation of the complete stripline-connected gate library with large bias margins, as well as of several larger functional units used in FLUX-1. The next stage will be an SFQ multi-processor machine. Important challenges include further reducing chip supply current and on-chip power dissipation, developing at least 64 kbit, sub-nanosecond cryogenic RAM chips, developing thermally and electrically efficient high-data-rate cryogenic-to-ambient input/output technology, and improving Nb VLSI to increase gate density.
Ploog, Bertram O; Scharf, Alexa; Nelson, DeShawn; Brooks, Patricia J
Major advances in multimedia computer technology over the past decades have made sophisticated computer games readily available to the public. This, combined with the observation that most children, including those with autism spectrum disorders (ASD), show an affinity to computers, has led researchers to recognize the potential of computer technology as an effective and efficient tool in research and treatment. This paper reviews the use of computer-assisted technology (CAT), excluding strictly internet-based approaches, to enhance social, communicative, and language development in individuals with ASD by dividing the vast literature into four main areas: language, emotion recognition, theory of mind, and social skills. Although many studies illustrate the tremendous promise of CAT to enhance skills of individuals with ASD, most lack rigorous, scientific assessment of efficacy relative to non-CAT approaches.
Trevino, Luis C.; Olcmen, Semih; Polites, Michael
The problem addressed in this paper is how Soft Computing Technologies (SCT) could be employed to further improve overall engine system reliability and performance. Specifically, this is presented by enhancing rocket engine control and engine health management (EHM) using SCT coupled with conventional control technologies and with the sound software engineering practices used in Marshall's Flight Software Group. The principal goals are to improve software management, software development time and maintenance, processor execution, fault tolerance and mitigation, and nonlinear control in power-level transitions. The intent is not to discuss any shortcomings of existing engine control and EHM methodologies, but to provide alternative design choices for control, EHM, implementation, performance, and sustaining engineering. The approaches outlined in this paper will require knowledge in the fields of rocket engine propulsion, software engineering for embedded systems, and soft computing technologies (i.e., neural networks, fuzzy logic, and Bayesian belief networks), much of which is presented in this paper. The first targeted demonstration rocket engine platform is the MC-1 (formerly the FASTRAC engine), which is simulated with hardware and software in the Marshall Avionics & Software Testbed laboratory that
Kim, Jung Taek; Park, Won Man; Kim, Jung Soo; Seong, Soeng Hwan; Hur, Sub; Cho, Jae Hwan; Jung, Hyung Gue
The overall goal of this joint research project was to develop and demonstrate advanced sensors and computational technology for continuous monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). This project included investigating and adapting several advanced sensor technologies from Korean and US national laboratory research communities, some of which were developed and applied in non-nuclear industries. The project team investigated and developed sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms. The researchers installed sensors and conducted condition monitoring tests on two test loops, a check valve (an active component) and a piping elbow (a passive component), to demonstrate the feasibility of using advanced sensors and computational technology to achieve the project goal. Acoustic emission (AE) devices, optical fiber sensors, accelerometers, and ultrasonic transducers (UTs) were used to detect mechanical vibratory response of check valve and piping elbow in normal and degraded configurations. Chemical sensors were also installed to monitor the water chemistry in the piping elbow test loop. Analysis results of processed sensor data indicate that it is feasible to differentiate between the normal and degraded (with selected degradation mechanisms) configurations of these two components from the acquired sensor signals, but it is questionable that these methods can reliably identify the level and type of degradation. Additional research and development efforts are needed to refine the differentiation techniques and to reduce the level of uncertainties
The start-up of the LHC is foreseen to take place in the autumn and we will be in the public spotlight again. This increases the necessity to be vigilant with respect to computer security, and the defacement of an experiment’s Web page in September last year shows that we should be particularly attentive. Attackers are permanently probing CERN, so we must all do our utmost to reduce future risks. Security is a hierarchical responsibility and requires balancing the allocation of resources between making systems work and making them secure. Thus all of us, whether users, developers, system experts, administrators, or managers, are responsible for securing our computing assets. These include computers, software applications, documents, accounts and passwords. There is no "silver bullet" for securing systems, which can only be achieved by a painstaking search for all possible vulnerabilities followed by their mitigation. Additional advice on particular topics can be obtained from the relevant I...
Khalid A.O. Arafa
Objective: To assess the level of evidence supporting the quality of fit of removable partial dentures (RPDs) fabricated by computer-aided design/computer-aided manufacturing (CAD/CAM) and rapid prototyping (RP) technology. Methods: An electronic search was performed in the Google Scholar, PubMed, and Cochrane Library search engines, using Boolean operators. All articles published in English between 1950 and April 2017 were eligible for inclusion in this review. Articles containing the search terms in any part (titles, abstracts, or texts) were screened, which resulted in 214 articles. After exclusion of irrelevant and duplicated articles, 12 papers were included in this systematic review. Results: All the included studies were case reports, except one, a case series that recruited 10 participants. Visual and tactile examination, on the cast or clinically in the patient’s mouth, was the most-used method for assessing the fit of RPDs. Of all the included studies, only one assessed the internal fit between RPDs and oral tissues using silicone registration material. The vast majority of included studies found that the fit of RPDs ranged from satisfactory to excellent. Conclusion: Despite the lack of clinical trials providing strong evidence, the available evidence supports the claim of good fit for RPDs fabricated by new technologies using CAD/CAM.
This book offers a state-of-the-art collection covering themes related to Advanced Intelligent Computational Technologies and Decision Support Systems, which can be applied to fields like healthcare to assist humans in solving problems. The book brings forward a wealth of ideas, algorithms and case studies on themes such as: intelligent predictive diagnosis; intelligent analysis of medical images; a new format for coding single and sequences of medical images; medical decision support systems; diagnosis of Down’s syndrome; computational perspectives on electronic fetal monitoring; efficient compression of CT images; adaptive interpolation and halftoning for medical images; applications of artificial neural networks to real-life problem solving; the present state and perspectives of electronic healthcare record systems; adaptive approaches for noise reduction in sequences of CT images, etc.
Malony, Allen D. [Univ. of Oregon, Eugene, OR (United States); Shende, Sameer [Univ. of Oregon, Eugene, OR (United States)
Our accomplishments over the last three years of the DOE project Application-Specific Performance Technology for Productive Parallel Computing (DOE Agreement: DE-FG02-05ER25680) are described below. The project will have met all of its objectives by the time of its completion at the end of September 2008. Two extensive yearly progress reports were produced, in March 2006 and March 2007, and were previously submitted to the DOE Office of Advanced Scientific Computing Research (OASCR). Following an overview of the objectives of the project, we summarize for each project area the achievements of the first two years and then describe in more detail the project accomplishments of this past year. At the end, we discuss the relationship of the proposed renewal application to the work done on the current project.
Background. Studies indicate that computed radiography (CR) can lead to increased radiation dose to patients. It is therefore important to relate the exposure indicators provided by CR manufacturers to the delivered radiation dose, so that the dose delivered to patients can be assessed directly from the exposure indicators.
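The exposure-indicator-to-dose mapping is vendor-specific and is not given in this abstract. As a hedged sketch, two approximate relations commonly cited in the CR literature are the Kodak/Carestream exposure index, EI ≈ 1000·log10(E) + 2000 with detector exposure E in mR, and the Fuji S number, S ≈ 200/E:

```python
def kodak_ei_to_mr(ei):
    """Approximate detector exposure in mR from a Kodak/Carestream exposure
    index, inverting the commonly cited relation EI = 1000*log10(E) + 2000."""
    return 10 ** ((ei - 2000) / 1000)

def fuji_s_to_mr(s):
    """Approximate detector exposure in mR from a Fuji S number, using S = 200/E."""
    return 200 / s

# Under these approximations, EI = 2000 and S = 200 both correspond to ~1 mR,
# and a +300 change in EI corresponds to roughly a doubling of exposure.
```

Both relations are approximations drawn from general CR literature rather than from this study; actual calibrations depend on beam quality and vendor firmware.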
Kegel, Roeland Hendrik,Pieter; Barth, Susanne; Klaassen, Randy; Wieringa, Roelf J.
Although there have been many attempts to define the concept of ‘computer literacy’, no consensus has been reached: many variations of the concept exist within the literature. The majority of papers do not explicitly define the concept at all, instead using an unjustified subset of elements related to
Dec 5, 2011 ... Computational fluid dynamics is a tool that has been used in recent years to develop numerical models that improve our understanding of the interaction of variables that make up the climate inside greenhouses. In the past five years, more realistic studies have appeared due mainly to the development of ...
Iankoulova, Iliana; Daneva, Maia; Rolland, C; Castro, J.; Pastor, O
Many publications have dealt with various types of security requirements in cloud computing but not all types have been explored in sufficient depth. It is also hard to understand which types of requirements have been under-researched and which are most investigated. This paper's goal is to provide
Patel, S.; Durack, C.; Abella, F.; Shemesh, H.; Roig, M.; Lemberg, K.
Cone beam computed tomography (CBCT) produces undistorted three-dimensional information of the maxillofacial skeleton, including the teeth and their surrounding tissues with a lower effective radiation dose than computed tomography. The aim of this paper is to: (i) review the current literature on
Eckert, Marcel; Meyer, Dominik; Haase, Jan; Klauer, Bernd
One of the key future challenges for reconfigurable computing is to enable higher design productivity and an easier way to use reconfigurable computing systems for users who are unfamiliar with the underlying concepts. One way of doing this is to provide standardization and abstraction, usually supported and enforced by an operating system. This article gives a historical review and a summary of ideas and key concepts for including reconfigurable computing aspects in operating systems. The arti...
Meng, Weizhi; Tischhauser, Elmar Wolfgang; Wang, Qingju
With the purpose of identifying cyber threats and possible incidents, intrusion detection systems (IDSs) are widely deployed in various computer networks. To enhance the detection capability of a single IDS, collaborative intrusion detection networks (or collaborative IDSs) have been developed, which allow IDS nodes to exchange data with each other. However, data and trust management remain two challenges for current detection architectures, which may degrade the effectiveness of such detection systems. In recent years, blockchain technology has shown its adaptability in many fields such as supply chain management, international payment, interbanking, and so on. As blockchain can protect the integrity of data storage and ensure process transparency, it has the potential to be applied to the intrusion detection domain. Motivated by this, this work provides a review regarding
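The integrity property highlighted above can be illustrated with a minimal hash chain (a sketch of the general idea, not any system described in the reviewed literature): each stored IDS alert embeds the hash of its predecessor, so altering any record invalidates every later hash.

```python
import hashlib
import json

GENESIS = "0" * 64  # placeholder hash for the first record's predecessor

def chain_alerts(alerts):
    """Link IDS alert records into a hash chain: each record's hash covers both
    the alert and the previous record's hash."""
    chain, prev = [], GENESIS
    for alert in alerts:
        digest = hashlib.sha256(
            json.dumps({"alert": alert, "prev": prev}, sort_keys=True).encode()
        ).hexdigest()
        chain.append({"alert": alert, "prev": prev, "hash": digest})
        prev = digest
    return chain

def verify(chain):
    """Recompute every hash in order; returns False if any record was altered."""
    prev = GENESIS
    for record in chain:
        expected = hashlib.sha256(
            json.dumps({"alert": record["alert"], "prev": prev}, sort_keys=True).encode()
        ).hexdigest()
        if record["prev"] != prev or record["hash"] != expected:
            return False
        prev = record["hash"]
    return True
```

A real blockchain adds distributed consensus on top of this chaining, which is what lets collaborating IDS nodes trust each other's shared data.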
Vrabel, Debra A.
The media environment of the future may be dramatically different from what exists today. As new computing and communications technologies evolve and synthesize to form a global, integrated communications system of networks, public-domain hardware and software, and consumer products, it will be possible for citizens to fulfill most information needs at any time and from any place, to obtain desired information easily and quickly, to obtain information in a variety of forms, and to experience and interact with information in a variety of ways. This system will transform almost every institution, every profession, and every aspect of human life, including the creation, packaging, and distribution of news and information by media organizations. This paper presents one vision of a 21st-century global information system and how it might be used by citizens. It surveys some of the technologies now on the market that are paving the way for the new media environment.
A symposium on the technology and applications of computational fluid dynamics (CFD) was held in Pretoria from 21-23 Nov 1988. The following aspects were covered: multilevel adaptive methods and multigrid solvers in CFD, a symbolic processing approach to CFD, interplay between CFD and analytical approximations, CFD on a transfer array, the application of CFD in high speed aerodynamics, numerical simulation of laminar blood flow, two-phase flow modelling in nuclear accident analysis, and the finite difference scheme for the numerical solution of fluid flow
This book is a collection of the accepted papers concerning soft computing in information and communication technology. The resulting dissemination of the latest research results, and the exchange of views concerning future research directions in this field, make the work of immense value to all those with an interest in the topics covered. The book represents a cooperative effort to seek out the best strategies for improving quality and reliability in Fuzzy Logic, Machine Learning, Cryptography, Pattern Recognition, Bioinformatics, Biomedical Engineering, and Advancements in ICT.
Gray, W.H.; Neal, R.E.; Cobb, C.K.
Addressing a technical plan developed in consideration with major US manufacturers, software and hardware providers, and government representatives, the Technologies Enabling Agile Manufacturing (TEAM) program is leveraging the expertise and resources of industry, universities, and federal agencies to develop, integrate, and deploy leap-ahead manufacturing technologies. One of the TEAM program's goals is to transition products from design to production faster, more efficiently, and at less cost. TEAM's technology development strategy also provides all participants with early experience in establishing and working within an electronic enterprise that includes access to high-speed networks and high-performance computing and storage systems. The TEAM program uses the cross-cutting tools it collects, develops, and integrates to demonstrate and deploy agile manufacturing capabilities for three high-priority processes identified by industry: material removal, sheet metal forming, and electro-mechanical assembly. This paper reviews the current status of the TEAM program with emphasis on TEAM's information infrastructure.
Srivastava, HM; Venturino, Ezio; Resch, Michael; Gupta, Vijay
The book discusses important results in modern mathematical models and high performance computing, such as applied operations research, simulation of operations, statistical modeling and applications, invisibility regions and regular meta-materials, unmanned vehicles, modern radar techniques/SAR imaging, satellite remote sensing, coding, and robotic systems. Furthermore, it is valuable as a reference work and as a basis for further study and research. All contributing authors are respected academicians, scientists and researchers from around the globe. All the papers were presented at the international conference on Modern Mathematical Methods and High Performance Computing in Science & Technology (M3HPCST 2015), held at Raj Kumar Goel Institute of Technology, Ghaziabad, India, from 27–29 December 2015, and peer-reviewed by international experts. The conference provided an exceptional platform for leading researchers, academicians, developers, engineers and technocrats from a broad range of disciplines ...
Rakhimova, Alina E.; Yashina, Marianna E.; Mukhamadiarova, Albina F.; Sharipova, Astrid V.
The article describes the process of developing sociocultural knowledge and competences using computer technologies. On the whole, the development of modern computer technologies allows teachers to broaden trainees' sociocultural outlook and trace their progress online. Observation of modern computer technologies and estimation…
Baranovski, A.; Loebel-Carpenter, L.; Garzoglio, G.; Herber, R.; Illingworth, R.; Kennedy, R.; Kreymer, A.; Kumar, A.; Lueking, L.; Lyon, A.; Merritt, W.; Terekhov, I.; Trumbo, J.; Veseli, S.; White, S.; St. Denis, R.; Jain, S.; Nishandar, A.
SAMGrid is a globally distributed system for data handling and job management, developed at Fermilab for the D0 and CDF experiments in Run II. The Condor system is being developed at the University of Wisconsin for the management of distributed resources, computational and otherwise. We briefly review the SAMGrid architecture and its interaction with Condor, which was presented earlier. We then present our experiences using the system in production, which have two distinct aspects. At the global level, we deployed Condor-G, the Grid-extended Condor, for the resource brokering and global scheduling of our jobs; at the heart of the system is Condor's Matchmaking Service. In more recent work at the computing element level, we have been benefiting from the large computing cluster at the University of Wisconsin campus. The architecture of the computing facility and the philosophy of Condor's resource management have prompted us to improve the application infrastructure for D0 and CDF, in aspects such as removing reliance on a shared file system or on dedicated resources. As a result, we have increased productivity and made our applications more portable and Grid-ready. Our fruitful collaboration with the Condor team has been made possible by the Particle Physics Data Grid
Cai, Guoqiang; Liu, Weibin; Xing, Weiwei
Proceedings of the 2012 International Conference on Information Technology and Software Engineering presents selected articles from this major event, which was held in Beijing, December 8-10, 2012. This book presents the latest research trends, methods and experimental results in the fields of information technology and software engineering, covering various state-of-the-art research theories and approaches. The subjects range from intelligent computing to information processing, software engineering, Web, unified modeling language (UML), multimedia, communication technologies, system identification, graphics and visualizing, etc. The proceedings provide a major interdisciplinary forum for researchers and engineers to present the most innovative studies and advances, which can serve as an excellent reference work for researchers and graduate students working on information technology and software engineering. Prof. Wei Lu, Dr. Guoqiang Cai, Prof. Weibin Liu and Dr. Weiwei Xing all work at Beijing Jiaotong Uni...
The inspection process within the Department of Energy (DOE) serves the function of analyzing and reporting on the performance of security measures and controls in specific areas at sites throughout DOE. Three aspects of this process are discussed based on experience in computer security: Policy basis of performance inspections; Role and form of standards and criteria in inspections; and Conducting an inspection using the standards and criteria. Inspections are based on DOE and other applicable policy in each area. These policy statements have a compliance orientation in which the paper trail is often more clearly discernible than the security intention. The relationship of policy to performance inspections is discussed. To facilitate bridging the gap between the paper trail and the security intention defined by policy, standards and criteria were developed in each area. The consensus process and structure of the resulting product for computer security are discussed. Standards and criteria are inspection tools that support the site in preparing for an inspection and the inspector in conducting one. They form a systematic approach that facilitates consistency in the analysis and reporting of inspection results. Experience using the computer security standards and criteria is discussed
Liou, Hsien-Chin; Peng, Zhong-Yan
The interactive functions of weblogs facilitate computer-mediated peer reviews for collaborative writing. As limited research has been conducted examining the training effects of peer reviews on students' peer comments, their revision quality, and their perceptions when composing in weblogs, the present case study aims to fill this gap. Thirteen…
For Canada to compete effectively in the digital world, beginning teachers need to play an important role in integrating computer technology into the curriculum. Equipment and connectivity do not guarantee successful or productive use of computers in the classroom, but the combination of teaching style and technology use has the potential to change education. In this research, the computer self-efficacy beliefs of 210 preservice teachers after their first practice teaching placements were examined. First, the quantitative component of the study involved the use of the Computer User Self-Efficacy (CUSE) scale, where students' previous undergraduate degree, licensure area, experience and familiarity with software packages were found to have statistically significant effects on computer self-efficacy. Second, the qualitative data indicated that society and school were the most positive factors influencing preservice teachers' attitudes towards computers, while the family had the highest percentage of negative influence. Findings reveal that although preservice teachers had completed only two months of the program, those with higher CUSE scores were more ready to integrate computers into their lessons than those with lower scores.
Santiago Jaime Reyes
The experience of developing modern digital programs with highly qualified professors, each with several years of experience teaching postgraduate biological sciences, is described. A small group of selected professors with minimal knowledge or only a basic command of computer software were invited to develop digital programs on topics of their interest; the purpose is to establish the basis for building an accessible digital library. The products developed are a series of CD-ROMs with source material in HTML format. The didactic strategy is based on personal tutorship: a step-by-step workshop in which each participant builds their own project (without programming languages). The workshop begins by generating confidence through very simple activities. It is designed so that participants learn by building and advance by evaluating their progress. It meets the need to bring up to date the materials regularly used to teach classes (video, slides, pictures, articles, examples, etc.). Information and computing technologies (ICT) are an indispensable tool for disseminating knowledge on the topics of one's speciality to a broader and more diverse public. The products obtained are eight CD-ROMs with didactic programs designed on scientific and technological bases.
Bailey, David H.; Lefton, Lew
On one hand, the field of high-performance scientific computing is thriving beyond measure. Performance of leading-edge systems on scientific calculations, as measured say by the Top500 list, has increased by an astounding factor of 8000 during the 15-year period from 1993 to 2008, which is slightly faster even than Moore's Law. Even more importantly, remarkable advances in numerical algorithms, numerical libraries and parallel programming environments have led to improvements in the scope of what can be computed that are entirely on a par with the advances in computing hardware. And these successes have spread far beyond the confines of large government-operated laboratories; many universities, modest-sized research institutes and private firms now operate clusters that differ only in scale from the behemoth systems at the large-scale facilities. In the wake of these recent successes, researchers from fields that heretofore have not been part of the scientific computing world have been drawn into the arena. For example, at the recent SC07 conference, the exhibit hall, which has long hosted displays from leading computer systems vendors and government laboratories, featured some 70 exhibitors who had not previously participated. In spite of all these exciting developments, and in spite of the clear need to present these concepts to a much broader technical audience, there is a perplexing dearth of training material and textbooks in the field, particularly at the introductory level. Only a handful of universities offer coursework in the specific area of highly parallel scientific computing, and instructors of such courses typically rely on custom-assembled material. For example, the present reviewer and Robert F. Lucas relied on materials assembled in a somewhat ad hoc fashion from colleagues and personal resources when presenting a course on parallel scientific computing at the University of California, Berkeley, a few years ago. Thus it is indeed refreshing
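The quoted growth figure can be sanity-checked with a few lines of arithmetic; taking Moore's Law as a doubling every 24 months is an assumption made here purely for comparison:

```python
import math

# Top500 aggregate performance growth cited in the review
growth_factor = 8000
period_months = 15 * 12  # 1993-2008

# Implied performance doubling time in months
doubling_time = period_months / math.log2(growth_factor)
print(f"implied doubling time: {doubling_time:.1f} months")  # ~13.9 months

# Moore's Law baseline, assumed here as a 24-month doubling
moore_factor = 2 ** (period_months / 24)
print(f"Moore's Law factor over the same period: {moore_factor:.0f}x")  # ~181x
```

The gap between the two factors reflects that Top500 growth combines per-chip improvement with ever-larger parallel machines, not transistor scaling alone.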
Advanced scientific computational methods and their applications to nuclear technologies. (4) Overview of scientific computational methods, introduction of continuum simulation methods and their applications (4)
Sekimura, Naoto; Okita, Taira
Scientific computational methods have advanced remarkably with the progress of nuclear development. They have played the role of a weft connecting the various realms of nuclear engineering, and an introductory course on advanced scientific computational methods and their applications to nuclear technologies was prepared in serial form. This is the fourth issue, presenting an overview of scientific computational methods with an introduction to continuum simulation methods and their applications. Simulation methods for physical radiation effects on materials are reviewed, covering processes such as binary collision approximation, molecular dynamics, the kinetic Monte Carlo method, the reaction rate method and dislocation dynamics. (T. Tanaka)
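Of the simulation methods listed, kinetic Monte Carlo is compact enough to sketch. The residence-time (Gillespie-style) loop below uses arbitrary illustrative rates, not values from the course:

```python
import math
import random

def kinetic_monte_carlo(rates, t_end, seed=0):
    """Minimal residence-time KMC loop over a fixed event catalogue.

    rates: list of event rates (1/s). Returns how often each event fired
    before the simulated clock passed t_end.
    """
    rng = random.Random(seed)
    total = sum(rates)
    counts = [0] * len(rates)
    t = 0.0
    while True:
        # Advance time by an exponentially distributed waiting time
        t += -math.log(1.0 - rng.random()) / total
        if t > t_end:
            return counts
        # Pick an event with probability proportional to its rate
        r = rng.random() * total
        acc = 0.0
        for i, rate in enumerate(rates):
            acc += rate
            if r < acc:
                counts[i] += 1
                break

# E.g. a fast vacancy hop vs. a slow recombination event (illustrative rates)
counts = kinetic_monte_carlo([10.0, 1.0], t_end=100.0)
```

With rates 10 and 1 over 100 time units, the fast event fires roughly ten times as often, which is the essential behavior the method captures.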
Qureshi, M. Zubair Akbar; Ashraf, Muhammad
Nanofluids and heat transfer enhancement in real systems continue to be a widely researched area of nanotechnology. An effort has been made to give a comprehensive review of the time-wise development of different aspects of nanofluids. The exceptional features of nanofluids, for example, dispersion of the nanoparticle volume fraction, the thermophoresis phenomenon, Brownian motion, improvement in thermal conductivity, and especially heat transfer enhancement, have been addressed from a mathematical perspective. The influence of important parameters like particle loading, material, size and shape factor, base fluid type, temperature, additives, clustering and pH value has been considered. In addition, a summary chart is presented for a better understanding of the mathematical structure of Newtonian as well as non-Newtonian nanofluids. Some important results have been discussed for future work. This review article will be helpful for scientists and researchers.
The 2013 Building Technologies Office Program Peer Review Report summarizes the results of the 2013 Building Technologies Office (BTO) peer review, which was held in Washington, D.C., on April 2–4, 2013. The review was attended by over 300 participants and included presentations on 59 BTO-funded projects: 29 from BTO’s Emerging Technologies Program, 20 from the Commercial Buildings Integration Program, 6 from the Residential Buildings Integration Program, and 4 from the Building Energy Codes Program. This report summarizes the scores and comments provided by the independent reviewers for each project.
Quirk, W.J.; Canada, J.; de Vore, L.; Gleason, K.; Kirvel, R.D.; Kroopnick, H.; McElroy, L. [eds.
For the 40-plus years of the Cold War, both the United States and the Soviet Union built up nuclear stockpiles of tens of thousands of weapons. Now, as the Cold War has ended and tensions between the superpowers have subsided, the US faces the task of significantly reducing its nuclear arsenal. Many thousands of nuclear weapons are being removed from the stockpile as a result of recent treaties and unilateral decisions. This issue of Energy and Technology Review describes the Laboratory's role in the nation's effort to dismantle these weapons safely and rapidly. The dismantlement of the United States' nuclear weapons takes place at the Department of Energy's Pantex facility near Amarillo, Texas. The first article in this issue summarizes the Laboratory's involvement in dismantling Livermore-designed nuclear weapons. LLNL (like Los Alamos) has responsibility for the weapons it designed, from design concept to retirement. In the past, these responsibilities ended when the weapon was retired from the stockpile. Now, however, the role has been extended to include dismantlement. The second article reports on an incident that occurred in November 1992, in which the pit of a W48 warhead cracked during dismantlement. The Laboratory was called upon to handle the pit safely and determine the causes of the cracking. The third article explores a variety of methods proposed for reusing the high explosives after they are removed from the weapon. In the past, Laboratory work on nuclear weapons focused primarily on design and development. However, as the size and composition of the US stockpile changes with evolving international conditions, they will be called upon with increasing frequency to provide the scientific and technical expertise needed to dismantle the nation's retired nuclear weapons safely and efficiently.
Cloud computing is one of the rising technologies that takes network users to the next level. Cloud is a technology where resources are paid for per usage rather than owned. One of the major challenges in this technology is security. Biometric systems provide an answer, ensuring that the rendered services are accessed only by a legitimate, authorized user and no one else. Biometric systems recognize users based on behavioral or physiological characteristics. The advantages of such systems over traditional validation methods such as passwords and IDs are well known, and hence biometric systems are progressively gaining ground in terms of usage. This paper proposes a new model of a security system in which users have to offer multiple biometric fingerprints during enrollment for a service. These templates are stored at the cloud provider's side. Users are authenticated against these fingerprint templates, which have to be provided in an order determined by random numbers generated afresh on each occasion. Both the fingerprint templates and images are encrypted each time they are provided, for enhanced security.
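The ordering idea in this scheme can be sketched in simplified form. Real biometric matching is fuzzy and the paper's actual protocol includes encryption, so reducing matching to exact digest comparison (and every function name below) is purely illustrative:

```python
import hashlib
import random

def enroll(fingerprints):
    """Cloud side: store digests of several fingerprint templates."""
    return [hashlib.sha256(fp).hexdigest() for fp in fingerprints]

def issue_challenge(n_templates, rng):
    """Server picks a fresh random presentation order for each login."""
    order = list(range(n_templates))
    rng.shuffle(order)
    return order

def authenticate(stored, presented, order):
    """Presented prints must match the stored templates in the challenged order."""
    if len(presented) != len(order):
        return False
    return all(
        hashlib.sha256(fp).hexdigest() == stored[i]
        for fp, i in zip(presented, order)
    )

rng = random.Random(42)
prints = [b"thumb", b"index", b"middle"]  # stand-ins for template data
db = enroll(prints)
order = issue_challenge(len(prints), rng)
ok = authenticate(db, [prints[i] for i in order], order)   # correct order
bad = authenticate(db, list(reversed([prints[i] for i in order])), order)
```

Because the required order changes per session, replaying a previously observed presentation sequence fails, which is the property the abstract emphasizes.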
Bookless, W.A.; Stull, S.; Cassady, C.; Kaiper, G.; Ledbetter, G.; McElroy, L.; Parker, A. [eds.
This issue of Energy and Technology Review highlights the Laboratory's 1994 accomplishments in its mission areas and core programs--economic competitiveness, national security, lasers, energy, the environment, biology and biotechnology, engineering, physics and space science, chemistry and materials science, computations, and science and math education. LLNL is a major national resource of science and technology expertise, and it is committed to applying this expertise to meet vital national needs.
In response to requests by Member States, the Secretariat produces a comprehensive Nuclear Technology Review each year. Attached is this year's report, which highlights notable developments principally in 2012. The Nuclear Technology Review 2013 covers the following areas: power applications, atomic and nuclear data, accelerators and research reactors, and nuclear sciences and applications. Additional documentation associated with the Nuclear Technology Review 2013 is available on the Agency's website in English on nuclear hydrogen production technology and preliminary lessons learned from the Fukushima Daiichi accident for advanced nuclear power plant technology development. Information on the IAEA's activities related to nuclear science and technology can also be found in the IAEA's Annual Report 2012 (GC(57)/3), in particular the Technology section, and the Technical Cooperation Report for 2012 (GC(57)/INF/4). The document has been modified to take account, to the extent possible, of specific comments by the Board of Governors and other comments received from Member States. (author)
Satake, Shin-ichi; Kunugi, Tomoaki
Scientific computational methods have advanced remarkably with the progress of nuclear development. They have played the role of a weft connecting the various realms of nuclear engineering, and an introductory course on advanced scientific computational methods and their applications to nuclear technologies was prepared in serial form. This is the third issue, presenting an introduction to continuum simulation methods and their applications. Spectral methods and multi-interface calculation methods in fluid dynamics are reviewed. (T. Tanaka)
Green, Kenneth C.
This report presents the findings of a June, 1996, survey of computing officials at 660 two- and four-year colleges and universities across the United States concerning the use of computer technology on college campuses. The survey found that instructional integration and user support emerged as the two most important information technology (IT)…
Neshati, Ramin; Watt, Russell; Eastham, James
Developing new products, services, systems, and processes has become an imperative for any firm expecting to thrive in today's fast-paced and hyper-competitive environment. This volume integrates academic and practical insights to present fresh perspectives on new product development and innovation, showcasing lessons learned on the technological frontier. The first part emphasizes decision making. The second part focuses on technology evaluation, including cost-benefit analysis, material selection, and scenarios. The third part features in-depth case studies presenting innovation management tools, such as customer needs identification, technology standardization, and risk management. The fourth part highlights important international trends, such as globalization and outsourcing. Finally, the fifth part explores social and political aspects.
is secrecy: reviewer anonymity and reviewee non-anonymity (Altura, 1990; Clayson, 1995; Gresty, 1995; Neetens, 1995). If honest and frank viewpoints...personal gain. BIBLIOGRAPHY: Altura, B.T. 1990. Is Anonymous Peer-Review The Best Way To Review And Accept Manuscripts? In: Magnesium And Trace Elements, 9
O'Hara, J.; Brown, W.; Granda, T.; Baker, C.
Advanced control rooms (ACRs) for future nuclear power plants are being designed utilizing computer-based technologies. The US Nuclear Regulatory Commission reviews the human engineering aspects of such control rooms to ensure that they are designed to good human factors engineering principles and that operator performance and reliability are appropriately supported in order to protect public health and safety. This paper describes the rationale, general approach, and initial development of an NRC Advanced Control Room Design Review Guideline. 20 refs., 1 fig
Baca, Arnold; Dabnichki, Peter; Heller, Mario; Kornfeind, Philipp
Ubiquitous (pervasive) computing is a term for the synergetic use of sensing, communication and computing. Pervasive use of computing has increased rapidly in the current decade, and this development has propagated into applied sport science and everyday life. This work presents a survey of recent developments in sport and leisure with emphasis on technology and computational techniques. A detailed analysis of new technological developments is performed. Sensors for position and motion detection, as well as sensors for equipment and physiological monitoring, are discussed. Aspects of novel trends in communication technologies and data processing are outlined. Computational advancements have started a new trend: the development of smart and intelligent systems for a wide range of applications, from model-based posture recognition to context-awareness algorithms for nutrition monitoring. Examples particular to coaching and training are discussed. Selected tools for monitoring compliance with rules and for automatic decision-making are outlined. Finally, applications in leisure and entertainment are presented, from systems supporting physical activity to systems providing motivation. It is concluded that the emphasis will in future shift from technologies to intelligent systems that allow for enhanced social interaction, as efforts need to be made to improve user-friendliness and standardisation of measurement and transmission protocols.
The 2015 U.S. Department of Energy (DOE) Fuel Cell Technologies Office (FCTO) and Vehicle Technologies Office (VTO) Annual Merit Review and Peer Evaluation Meeting (AMR) was held June 8-12, 2015, in Arlington, Virginia. The review encompassed all of the work done by the FCTO and the VTO: 258 individual activities were reviewed for VTO, by 170 reviewers. A total of 1,095 individual review responses were received for the VTO technical reviews. The objective of the meeting was to review the accomplishments and plans for VTO over the previous 12 months, and provide an opportunity for industry, government, and academia to give inputs to DOE on the Office with a structured and formal methodology. The meeting also provided attendees with a forum for interaction and technology information transfer.
The 2014 U.S. Department of Energy (DOE) Fuel Cell Technologies Office (FCTO) and Vehicle Technologies Office (VTO) Annual Merit Review and Peer Evaluation Meeting (AMR) was held June 16-20, 2014, in Washington, DC. The review encompassed all of the work done by the FCTO and the VTO: a total of 295 individual activities were reviewed for VTO, by a total of 179 reviewers. A total of 1,354 individual review responses were received for the VTO technical reviews. The objective of the meeting was to review the accomplishments and plans for VTO over the previous 12 months, and provide an opportunity for industry, government, and academia to give inputs to DOE on the Office with a structured and formal methodology. The meeting also provided attendees with a forum for interaction and technology information transfer.
The 2013 U.S. Department of Energy (DOE) Fuel Cell Technologies Office (FCTO) and Vehicle Technologies Office (VTO) Annual Merit Review and Peer Evaluation Meeting (AMR) was held May 13-17, 2013, in Crystal City, Virginia. The review encompassed all of the work done by the FCTO and the VTO: a total of 287 individual activities were reviewed for VTO, by a total of 187 reviewers. A total of 1,382 individual review responses were received for the VTO technical reviews. The objective of the meeting was to review the accomplishments and plans for VTO over the previous 12 months, and provide an opportunity for industry, government, and academia to give inputs to DOE on the Office with a structured and formal methodology. The meeting also provided attendees with a forum for interaction and technology information transfer.
Clough, Bonnie A; Casey, Leanne M
Although there are several reviews of technology in psychology, none to date has focused on technological adjuncts for improving traditional face-to-face therapy. However, examination of response, adherence, and dropout rates suggests there is considerable scope for improving traditional face-to-face services. The purpose of this paper was to examine technological adjuncts used to enhance psychotherapy practice. This review focused only on those technologies designed to supplement or enhance traditional therapy methods. Adjuncts designed to reduce direct therapist contact or change the medium of communication were not included. The adjuncts reviewed were mobile phones, personal digital assistants, biofeedback and virtual reality. Limitations in the current literature and directions for future research were identified and discussed. This review provides a comprehensive examination of the ways in which adjunctive technologies may be incorporated into face-to-face therapy. Copyright © 2011 Elsevier Ltd. All rights reserved.
Rabinkin, Daniel V.; Rutledge, Edward; Monticciolo, Paul
Interceptor missiles process IR images to locate an intended target and guide the interceptor towards it. Signal processing requirements have increased as sensor bandwidth increases and interceptors operate against more sophisticated targets. A typical interceptor signal processing chain is comprised of two parts. Front-end video processing operates on all pixels of the image and performs such operations as non-uniformity correction (NUC), image stabilization, frame integration and detection. Back-end target processing, which tracks and classifies targets detected in the image, performs such algorithms as Kalman tracking, spectral feature extraction and target discrimination. In the past, video processing was implemented using ASIC components or FPGAs because computation requirements exceeded the throughput of general-purpose processors. Target processing was performed using hybrid architectures that included ASICs, DSPs and general-purpose processors. The resulting systems tended to be function-specific and required custom software development. They were developed using non-integrated toolsets, and test equipment was developed along with the processor platform. The lifespan of a system utilizing the signal processing platform often spans decades, while the specialized nature of processor hardware and software makes it difficult and costly to upgrade. As a result, the signal processing systems often run on outdated technology, algorithms are difficult to update, and system effectiveness is impaired by the inability to rapidly respond to new threats. A new design approach is made possible by three developments: Moore's Law-driven improvement in computational throughput; a newly introduced vector computing capability in general-purpose processors; and a modern set of open interface software standards. Today's multiprocessor commercial-off-the-shelf (COTS) platforms have sufficient throughput to support interceptor signal processing requirements. This application
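Of the back-end algorithms named, Kalman tracking is the most standard. A 1-D constant-velocity sketch follows; the process and measurement noise levels q and r are illustrative assumptions, not values from the paper:

```python
def kalman_step(x, P, z, dt=1.0, q=1e-3, r=0.25):
    """One predict/update cycle of a 1-D constant-velocity Kalman filter.

    x: [position, velocity] estimate; P: 2x2 covariance (list of lists);
    z: scalar position measurement. q (process) and r (measurement) are
    illustrative noise levels; q is added on the diagonal for simplicity.
    """
    # Predict: x' = F x with F = [[1, dt], [0, 1]], P' = F P F^T + Q
    xp = [x[0] + dt * x[1], x[1]]
    Pp = [
        [P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q,
         P[0][1] + dt * P[1][1]],
        [P[1][0] + dt * P[1][1], P[1][1] + q],
    ]
    # Update with measurement model H = [1, 0]
    S = Pp[0][0] + r                       # innovation covariance
    K = [Pp[0][0] / S, Pp[1][0] / S]       # Kalman gain
    y = z - xp[0]                          # innovation
    xn = [xp[0] + K[0] * y, xp[1] + K[1] * y]
    Pn = [
        [(1 - K[0]) * Pp[0][0], (1 - K[0]) * Pp[0][1]],
        [Pp[1][0] - K[1] * Pp[0][0], Pp[1][1] - K[1] * Pp[0][1]],
    ]
    return xn, Pn

# Track a target moving at 1 unit/frame using exact position measurements
x, P = [0.0, 0.0], [[1.0, 0.0], [0.0, 1.0]]
for t in range(1, 30):
    x, P = kalman_step(x, P, z=float(t))
```

Fed measurements consistent with constant velocity, the filter's state estimate converges to the true position and velocity, which is what makes it usable for the track maintenance step described.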
Schultz, Jan R...; Hodges, Harry N.
Fiber optic technology offers many advantages for upgrading nuclear survivability in systems such as the Airborne Command Post EC-135 aircraft, including weight and cost savings, EMI and EMC immunity, and high data rates. The greatest advantage seen for nuclear survivable systems, however, is that a system's EMP hardness can be maintained more easily with fiber optics than with shielded cables or other protective methods. TRW recently completed a study to determine the feasibility of using fiber optic technology in an EC-135 aircraft environment. Since this study was conducted for a USAF Logistics Command agency, a feasible system had to be one that could be realistically priced by an integrating contractor. Thus, any fiber optic approach would have to be well developed before it could be considered feasible. During the course of the study, problem areas were encountered that are associated with the readiness of the technology for use rather than with the technology itself. These included connectors, standards, fiber radiation resistance, busing, maintenance, and logistics. Because these problem areas have not been resolved, it was concluded that fiber optic technology, despite its advantages, is not ready for directed procurement (i.e., inclusion as a requirement in a prime mission equipment specification). However, offers by a manufacturer to use fiber optic technology in lieu of conventional technology should be considered. This paper treats these problems in more detail, addresses the areas which need further development, and discusses the hardness maintenance advantages of using fiber optic technology.
Shojaei, Abouzar; Motallebzadeh, Khalil
This is a very helpful book offering information and knowledge about using technology in language learning and teaching. It gives detailed consideration to articulatory and auditory language learning as well as to the practicalities of English language learning, and discusses the relationship between English language learning and technology.
Markov, E.M.; Voronezhtsev, Yu.I.; Gol'dade, V.A.
Publications on the laser technologies of ceramic coating production, ceramics treatment and ceramics manufacture are analyzed for the past 5 years. Features of production processes utilizing the interaction of laser radiation with ceramics and other substances which form the ceramics as a result of such interaction are considered. Possible ways of improving laser technologies of ceramics treatment are outlined
Noar, Seth M; Black, Hulda G; Pierce, Larson B
To conduct a meta-analysis of computer technology-based HIV prevention behavioral interventions aimed at increasing condom use among a variety of at-risk populations. Systematic review and meta-analysis of existing published and unpublished studies testing computer-based interventions. Meta-analytic techniques were used to compute and aggregate effect sizes for 12 randomized controlled trials that met inclusion criteria. Variables that had the potential to moderate intervention efficacy were also tested. The overall mean weighted effect size for condom use was d = 0.259 (95% confidence interval = 0.201, 0.317; Z = 8.74, P < .001), with significant effects also observed for sexual partners and incident sexually transmitted diseases. In addition, interventions were significantly more efficacious when they were directed at men or women (versus mixed-sex groups), utilized individualized tailoring, used a Stages of Change model, and had more intervention sessions. Computer technology-based HIV prevention interventions have similar efficacy to more traditional human-delivered interventions. Given their low cost to deliver, ability to customize intervention content, and flexible dissemination channels, they hold much promise for the future of HIV prevention.
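The aggregation step ("meta-analytic techniques were used to compute and aggregate effect sizes") is, in the common fixed-effect case, an inverse-variance weighted mean. The per-trial numbers below are hypothetical stand-ins, not the review's 12 trials:

```python
import math

def fixed_effect_meta(effects, variances):
    """Inverse-variance weighted mean effect size with a 95% CI (fixed-effect model)."""
    weights = [1.0 / v for v in variances]           # precision weights
    d_bar = sum(w * d for w, d in zip(weights, effects)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))               # standard error of the pooled d
    z = d_bar / se                                   # test of d_bar against zero
    return d_bar, (d_bar - 1.96 * se, d_bar + 1.96 * se), z

# Hypothetical per-trial Cohen's d values and variances, NOT the review's data
d, ci, z = fixed_effect_meta([0.2, 0.3, 0.25], [0.01, 0.02, 0.015])
```

Reporting the pooled d with its confidence interval and Z statistic, as the abstract does, falls directly out of this computation.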
Ahmad Nabil Abdul Rahim; Alfred, S.L.; Phongsakorn, P.
Malaysia is at the stage of conducting a Preliminary Technical Feasibility Study for the Deployment of Small Modular Reactors (SMR). There are different types of SMR: some are already under construction in Argentina (CAREM) and China (HTR-PM), based on light water reactor and high temperature reactor technologies; others, such as SMART in South Korea, ACP100 in China, and mPower and NuScale in the US, are candidates for near-term deployment; and still others, such as liquid-metal-cooled reactor technologies, have longer-term deployment prospects. The study was mainly intended to provide an overview of the technology available on the market. The SMR ranking in the study was produced by listing the most deployable technologies on the market according to their types. For a newcomer country, proven technology with an excellent operating history will usually be the main consideration. (author)
Patel, S; Durack, C; Abella, F; Shemesh, H; Roig, M; Lemberg, K
Cone beam computed tomography (CBCT) produces undistorted three-dimensional information of the maxillofacial skeleton, including the teeth and their surrounding tissues with a lower effective radiation dose than computed tomography. The aim of this paper is to: (i) review the current literature on the applications and limitations of CBCT; (ii) make recommendations for the use of CBCT in Endodontics; (iii) highlight areas of further research of CBCT in Endodontics. © 2014 International Endodontic Journal. Published by John Wiley & Sons Ltd.
Brandin, D.; Wieder, H.; Spicer, W.; Nevins, J.; Oxender, D.
The series studies Japanese research and development in four high-technology areas - computer science, opto and microelectronics, mechatronics (a term created by the Japanese to describe the union of mechanical and electronic engineering to produce the next generation of machines, robots, and the like), and biotechnology. The evaluations were conducted by panels of U.S. scientists - chosen from academia, government, and industry - actively involved in research in areas of expertise. The studies were prepared for the purpose of aiding the U.S. response to Japan's technological challenge. The main focus of the assessments is on the current status and long-term direction and emphasis of Japanese research and development. Other aspects covered include evolution of the state of the art; identification of Japanese researchers, R and D organizations, and resources; and comparative U.S. efforts. The general time frame of the studies corresponds to future industrial applications and potential commercial impacts spanning approximately the next two decades.
Balduccelli, C.; Bologna, S.; Di Costanzo, G.; Vicoli, G.
This document provides an overview of problems related to the engineering of computer-based systems for industrial risk prevention and emergency management. Such systems are rather complex and subject to precise reliability and safety requirements. With the evolution of information technologies, such systems are becoming the means for building protective barriers that reduce the risk associated with plant operations. To give this document more generality, and to avoid concentrating on a single specific plant, emergency management systems are treated in more detail than accident prevention systems. The document is organized in six chapters. Chapter one introduces the problem and its state of the art, with particular emphasis on the definition of safety requirements. Chapter two introduces the problems related to emergency management and to the training of the operators in charge of this task. Chapter three covers Training Support Systems in detail, in particular the MUSTER (multi-user system for training and evaluation of environmental emergency response) system. Chapter four covers decision support systems in detail, in particular the ISEM (information technology support for emergency management) system. Chapter five illustrates an application supporting the operators of the Civil Protection Department in managing emergencies in the industrial chemical field. Chapter six synthesizes the state of the art and future possibilities, identifying the research and development activities most promising for the future.
Rodríguez, Iyubanit; Herskovic, Valeria; Gerea, Carmen; Fuentes, Carolina; Rossel, Pedro O; Marques, Maíra; Campos, Mauricio
Monitoring of patients may decrease treatment costs and improve quality of care. Pain is the most common health problem that people seek help for in hospitals. Therefore, monitoring patients with pain may have significant impact in improving treatment. Several studies have studied factors affecting pain; however, no previous study has reviewed the contextual information that a monitoring system may capture to characterize a patient's situation. The objective of this study was to conduct a systematic review to (1) determine what types of technologies have been used to monitor adults with pain, and (2) construct a model of the context information that may be used to implement apps and devices aimed at monitoring adults with pain. A literature search (2005-2015) was conducted in electronic databases pertaining to medical and computer science literature (PubMed, Science Direct, ACM Digital Library, and IEEE Xplore) using a defined search string. Article selection was done through a process of removing duplicates, analyzing title and abstract, and then reviewing the full text of the article. In the final analysis, 87 articles were included and 53 of them (61%) used technologies to collect contextual information. A total of 49 types of context information were found and a five-dimension (activity, identity, wellness, environment, physiological) model of context information to monitor adults with pain was proposed, expanding on a previous model. Most technological interfaces for pain monitoring were wearable, possibly because they can be used in more realistic contexts. Few studies focused on older adults, creating a relevant avenue of research on how to create devices for users that may have impaired cognitive skills or low digital literacy. The design of monitoring devices and interfaces for adults with pain must deal with the challenge of selecting relevant contextual information to understand the user's situation, and not overburdening or inconveniencing users with
... system, and of researchers and policymakers to conduct systematic reviews of STI policy implementation. It will do so by supporting a review of the implementation of the current strategy (2006-2016) and a study to assess the feasibility and likely structure of a multidisciplinary scientific and industrial research organization ...
Colandrea, John Louis
Because computer technology represents a major financial outlay for school districts and is an efficient method of preparing and delivering lessons, studying the process of teacher adoption of computer use is beneficial and adds to the current body of knowledge. Because the teacher is the ultimate user of computer technology for lesson preparation…
Gilakjani, Abbas Pourhosein
Computer technology has changed the ways we work, learn, interact and spend our leisure time. Computer technology has changed every aspect of our daily life--how and where we get our news, how we order goods and services, and how we communicate. This study investigates some of the significant issues concerning the use of computer technology…
Gilakjani, Abbas Pourhosein
There are many factors influencing teachers' use of computer technology in their classrooms. The goal of this study is to identify some of the important factors contributing to teachers' use of computer technology. The first goal of this paper is to discuss computer self-efficacy. The second goal is to explain teaching experience. The third goal is to…
Featured articles in this issue cover progress in these areas: Advancing Technologies and Applications in Nondestructive Evaluation; Atomic Engineering with Multilayers; Marrying Astrophysics with the Earth; Continuing Work in Breast Cancer Detection Technologies. Furthermore, this issue lists patents issued to and/or the awards received by Laboratory employees. It also includes an index of the contents of all the issues published in calendar year 1997.
Iryna Yu. Slipchuk
The main problem addressed in this article is the use of computer technologies in the process of biology education. The article contains the results of explorations of computer technologies and of the influence of these technologies on the quality of knowledge and the level of progress in secondary school.
Warnick, Bryan R.
This essay is an attempt to understand how technological metaphors, particularly computer metaphors, are relevant to moral education. After discussing various types of technological metaphors, it is argued that technological metaphors enter moral thought through their "functional descriptions." The computer metaphor is then explored by turning to…
Jeong, Hye In; Kim, Yeolib
This study investigated kindergarten teachers' decision-making process regarding the acceptance of computer technology. We incorporated the Technology Acceptance Model framework, in addition to computer self-efficacy, subjective norm, and personal innovativeness in education technology as external variables. The data were obtained from 160…
Ewell, Robert N.
The U.S. Space Foundation displayed its prototype Space Technology Hall of Fame exhibit design at the Technology 2003 conference in Anaheim, CA, December 7-9, 1993. In order to sample public opinion on space technology in general and the exhibit in particular, a computer-based survey was set up as a part of the display. The data collected was analyzed.
Surgical and prosthodontic implant complications are often the result of improper diagnosis, planning, and placement. These complications pose a significant challenge in implant dentistry. This article describes a new technique in which an advanced software program, together with a newly developed CAD/CAM technology called rapid prototyping, is used. This technology permits graphic and complex 3D implant simulation, followed by fabrication of computer-generated surgical templates. The optimal positions of the implants are planned first by taking into account the density of the bone encasing each implant and the existing occlusion. In this paper, the evolution of Computer-Guided Implantology and the many benefits achieved from this sophisticated technology are described as a literature review.
The 2017 U.S. Department of Energy (DOE) Hydrogen and Fuel Cells Program and Vehicle Technologies Office (VTO) Annual Merit Review and Peer Evaluation Meeting (AMR) was held June 5-9, 2017, in Washington, DC. The review encompassed work done by the Hydrogen and Fuel Cells Program and VTO: 263 individual activities were reviewed for VTO by 191 reviewers. Exactly 1,241 individual review responses were received for the VTO technical reviews. The objective of the meeting was to review the accomplishments and plans for VTO over the previous 12 months, and provide an opportunity for industry, government, and academia to give inputs to DOE with a structured and formal methodology. The meeting also provided attendees with a forum for interaction and technology information transfer.
The 2016 U.S. Department of Energy (DOE) Hydrogen and Fuel Cells Program and Vehicle Technologies Office (VTO) Annual Merit Review and Peer Evaluation Meeting (AMR) was held June 6-9, 2016, in Washington, DC. The review encompassed work done by the Hydrogen and Fuel Cells Program and VTO: 226 individual activities were reviewed for VTO, by 171 reviewers. A total of 1,044 individual review responses were received for the VTO technical reviews. The objective of the meeting was to review the accomplishments and plans for VTO over the previous 12 months, and provide an opportunity for industry, government, and academia to give inputs to DOE with a structured and formal methodology. The meeting also provided attendees with a forum for interaction and technology information transfer.
This report describes practical issues for federal agencies to consider if they choose program peer review for internal purposes and/ or to contribute to satisfying the requirements of the Government...
Sanberg, Paul R; Vindrola-Padros, Cecilia; Eve, David J; Federoff, Howard J
The following commentary provides a discussion of the articles published in Technology and Innovation in 2010 and where possible places them into context with those reported in Cell Transplantation. These articles can be divided into the following topics: a) models for innovation and technological commercialization, b) the ethical and legal consequences of the emergence of new technologies, c) research on novel technologies and methods, and d) the difficulties involved in peer review and scientific assessment. The articles shed light on the effects of technological innovation and commercialization on scientific ethical regulation, the establishment of legal standards for the protection of intellectual property, and the development of financial models.
Purpose: The purpose of this study was to review the social scientific literature associated with medical imaging technology. Methods: An extensive search of published studies in nursing, psychology and anthropology was undertaken to support the radiography specific literature. Results: Following a broad definition of technology and its profound influence on society, an analysis of imaging literature revealed a complex relationship between technology and human interactions. Examples are cited for CT, MRI and ultrasound. Conclusion: It is suggested that any attempt to understand imaging technology must place at its centre the perspectives of patients and radiographers. Scientific descriptors must be balanced with equal deliberation given to 'soft technology'
Reed, Daniel [University of Iowa; Berzins, Martin [University of Utah; Pennington, Robert; Sarkar, Vivek [Rice University; Taylor, Valerie [Texas A& M University
On November 19, 2014, the Advanced Scientific Computing Advisory Committee (ASCAC) was charged with reviewing the Department of Energy’s conceptual design for the Exascale Computing Initiative (ECI). In particular, this included assessing whether there are significant gaps in the ECI plan or areas that need to be given priority or extra management attention. Given the breadth and depth of previous reviews of the technical challenges inherent in exascale system design and deployment, the subcommittee focused its assessment on organizational and management issues, considering technical issues only as they informed organizational or management priorities and structures. This report presents the observations and recommendations of the subcommittee.
Gowen, Emma; Hamilton, Antonia
Altered motor behaviour is commonly reported in Autism Spectrum Disorder, but the aetiology remains unclear. Here, we have taken a computational approach in order to break down motor control into different components and review the functioning of each process. Our findings suggest abnormalities in two areas--poor integration of information for…
Alberta Education, 2006
This literature review is intended to provide practical information; lessons learned and promising practices which have been drawn from recent Kindergarten to Grade 12 (K-12) one-to-one mobile computing research reports and related articles. The information is presented in the form of answers to the following questions: (1) How is one-to-one…
The articles in this month's issue are entitled Site 300's New Contained Firing Facility, Computational Electromagnetics: Codes and Capabilities, Ergonomics Research: Impact on Injuries, and The Linear Electric Motor: Instability at 1,000 g's.
Slides are reproduced that describe the importance of having high performance number crunching and graphics capability. They also indicate the types of research and development underway at Ames Research Center to ensure that, in the near term, Ames is a smart buyer and user, and in the long-term that Ames knows the best possible solutions for number crunching and graphics needs. The drivers for this research are real computational physics applications of interest to Ames and NASA. They are concerned with how to map the applications, and how to maximize the physics learned from the results of the calculations. The computer graphics activities are aimed at getting maximum information from the three-dimensional calculations by using the real time manipulation of three-dimensional data on the Silicon Graphics workstation. Work is underway on new algorithms that will permit the display of experimental results that are sparse and random, the same way that the dense and regular computed results are displayed.
Christoffersen, Ralph E.; McSwiggen, James; Konings, Danielle
Ribozymes are RNA molecules that act enzymatically to cleave other RNA molecules. The cleavage reaction requires the binding of ribozyme to specific sites on the target RNA through (mostly) Watson-Crick base-pairing interactions. Association of ribozyme with target completes a three-dimensional ribozyme/target complex which results in cleavage of the target RNA. We are employing both computational and experimental approaches to identify sites on target RNA molecules that are open to ribozyme attack and to determine which ribozymes are most active against those sites. Two types of computational technologies are available for aiding in the identification of target sites and design of active ribozymes. First, DNA/RNA sequence analysis software is employed to identify sequence motifs necessary for ribozyme cleavage and to look for sequence conservation between different sources of the target organism so that ribozymes with the broadest possible target range can be designed. Second, RNA folding algorithms are employed to predict the secondary structure of both ribozyme and target RNA in an attempt to identify combinations of ribozyme and target site that will successfully associate prior to ribozyme cleavage. The RNA folding algorithms utilize a set of thermodynamic parameters obtained from measurements on short RNA duplexes; while these rules give reasonable predictions of secondary structure for a small set of highly structured RNAs, they remain largely untested for predicting the structure of messenger RNAs. This paper outlines the current status of designing ribozymes that fold correctly and of locating target sites that are sufficiently unfolded to allow ribozyme cleavage.
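The Watson-Crick pairing rule that governs ribozyme/target association can be illustrated with a toy check (a generic sketch, not the authors' sequence analysis software; RNA pairs A-U and G-C):

```python
# RNA Watson-Crick complements (A-U, G-C).
COMPLEMENT = {"A": "U", "U": "A", "G": "C", "C": "G"}

def reverse_complement(rna):
    """Reverse complement of an RNA sequence, 5'->3'."""
    return "".join(COMPLEMENT[base] for base in reversed(rna))

def can_pair(ribozyme_arm, target_site):
    """True if the ribozyme binding arm pairs perfectly with the target
    site, i.e. it is the target's reverse complement."""
    return ribozyme_arm == reverse_complement(target_site)

print(reverse_complement("GUCAA"))   # UUGAC
print(can_pair("UUGAC", "GUCAA"))    # True
```

Real target-site selection, as the abstract notes, additionally requires the site to be accessible (unfolded) in the target's predicted secondary structure.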
Shelton, Brett E; Uz, Cigdem
Technologies that provide immersive experiences continue to become more ubiquitous across all age groups. This paper presents a review of the literature to provide a snapshot of the current state of research involving the use of immersive technologies and the elderly. A narrative literature review was conducted using the ScienceDirect, EBSCOhost, Springerlink and ERIC databases to summarize primary studies from which conclusions were drawn into a holistic interpretation. The majority of the studies examined the effect of immersive technologies on elder peoples' age-related declines, including sensory and motor changes (vision, hearing, motor skills), cognitive changes and social changes. Various immersive technologies have been described and tested to address these age-related changes, and have been categorized as 'games and simulations', 'robotics' and 'social technologies'. In most cases, promising results were found for immersive technologies to challenge age-related declines, especially through the increase of morale. © 2014 S. Karger AG, Basel.
Wretman, Christopher J.; Macy, Rebecca J.
Given the growing prevalence of technology-based instruction, social work faculty need a clear understanding of the strengths and limitations of these methods. We systematically examined the evidence for technology-based instruction in social work education. Using comprehensive and rigorous methods, 38 articles were included in the review. Of…
Networking and Information Technology Research and Development, Executive Office of the President — The Federal High Performance Computing and Communications HPCC Program was created to accelerate the development of future generations of high performance computers...
Intelligent Vehicle-Highway Systems (IVHS) technologies include a range of communications and control technologies. The U.S. Department of Transportation has applied IVHS technologies, such as electronic payment media, automatic vehicle locator syste...
Ahmad Jonidi Jafari
Bioaerosols are air pollutants that affect human health through various routes. They are characteristically diverse, including bacteria, viruses and fungi, each with different characteristics and effects, and various solutions and technologies have been studied or applied for their removal and inactivation. Given the lack of specific, integrated publications covering both the different air quality guidelines for bioaerosols and the methods and technologies for meeting those standards, the purpose of this study was to develop this topic. The significance of bioaerosols in breathing air, the related standards and guidelines, and control technologies such as filtration, ultraviolet (UV) radiation, photocatalysis, temperature treatment and electrostatic precipitators were surveyed using the scientific literature. The results indicate that UV irradiation and photocatalytic methods are ineffective against allergens, while filtration cannot inactivate bioaerosols, which may therefore aerosolize again. Hence, these technologies individually cannot meet the air quality standards established for sensitive settings such as operating rooms. Accordingly, methods that collect and inactivate bioaerosols simultaneously, such as electrostatic precipitators, could be more effective in such environments.
... costly construction modifications, and use of substandard borrow material, environmental damage to the site, post construction remedial work, and even failure of a structure and subsequent litigation. Trenchless technology can be defined as the use of construction methods to install and repair underground infrastructure ...
Nowadays, the technological advancement in mobile devices has made possible the development of hypermedia applications that exploit their features. A potential application domain for mobile devices is multimedia educational applications and modules. Such modules may be shared, commented and further reused under other circumstances through the…
It was estimated that there would be over 55 million end-user programmers in 2012, in many different fields such as engineering, insurance and banking, and the numbers are not expected to have dwindled since. Consequently, the technological advancement of spreadsheets is of great interest to a wide...
Mickan, Sharon; Tilson, Julie K; Atherton, Helen; Roberts, Nia Wyn; Heneghan, Carl
Handheld computers and mobile devices provide instant access to vast amounts and types of useful information for health care professionals. Their reduced size and increased processing speed has led to rapid adoption in health care. Thus, it is important to identify whether handheld computers are actually effective in clinical practice. A scoping review of systematic reviews was designed to provide a quick overview of the documented evidence of effectiveness for health care professionals using handheld computers in their clinical work. A detailed search, sensitive for systematic reviews was applied for Cochrane, Medline, EMBASE, PsycINFO, Allied and Complementary Medicine Database (AMED), Global Health, and Cumulative Index to Nursing and Allied Health Literature (CINAHL) databases. All outcomes that demonstrated effectiveness in clinical practice were included. Classroom learning and patient use of handheld computers were excluded. Quality was assessed using the Assessment of Multiple Systematic Reviews (AMSTAR) tool. A previously published conceptual framework was used as the basis for dual data extraction. Reported outcomes were summarized according to the primary function of the handheld computer. Five systematic reviews met the inclusion and quality criteria. Together, they reviewed 138 unique primary studies. Most reviewed descriptive intervention studies, where physicians, pharmacists, or medical students used personal digital assistants. Effectiveness was demonstrated across four distinct functions of handheld computers: patient documentation, patient care, information seeking, and professional work patterns. Within each of these functions, a range of positive outcomes were reported using both objective and self-report measures. The use of handheld computers improved patient documentation through more complete recording, fewer documentation errors, and increased efficiency. Handheld computers provided easy access to clinical decision support systems and
Networking and Information Technology Research and Development, Executive Office of the President — As the 21st century approaches, the rapid convergence of computing, communications, and information technology promises unprecedented opportunities for scientific...
Voulodimos, Athanasios; Doulamis, Nikolaos; Doulamis, Anastasios; Protopapadakis, Eftychios
Over the last years deep learning methods have been shown to outperform previous state-of-the-art machine learning techniques in several fields, with computer vision being one of the most prominent cases. This review paper provides a brief overview of some of the most significant deep learning schemes used in computer vision problems, that is, Convolutional Neural Networks, Deep Boltzmann Machines and Deep Belief Networks, and Stacked Denoising Autoencoders. A brief account of their history, structure, advantages, and limitations is given, followed by a description of their applications in various computer vision tasks, such as object detection, face recognition, action and activity recognition, and human pose estimation. Finally, a brief overview is given of future directions in designing deep learning schemes for computer vision problems and the challenges involved therein.
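The core operation of the Convolutional Neural Networks surveyed above can be illustrated with a minimal NumPy sketch (illustrative only; real CNN layers add padding, strides, multiple channels and learned filters):

```python
import numpy as np

def conv2d(image, kernel):
    """2D convolution (cross-correlation form), valid padding, stride 1."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A hand-crafted vertical-edge detector applied to a tiny two-tone image;
# in a trained CNN such filters are learned from data.
img = np.array([[0., 0., 1., 1.],
                [0., 0., 1., 1.],
                [0., 0., 1., 1.],
                [0., 0., 1., 1.]])
edge = np.array([[-1., 1.],
                 [-1., 1.]])
print(conv2d(img, edge))  # responds only at the column where 0 meets 1
```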
One of the key future challenges for reconfigurable computing is to enable higher design productivity and an easier way to use reconfigurable computing systems for users who are unfamiliar with the underlying concepts. One way of doing this is to provide standardization and abstraction, usually supported and enforced by an operating system. This article gives a historical review and a summary of the ideas and key concepts for including reconfigurable computing aspects in operating systems. It also presents an overview of published and available operating systems targeting the area of reconfigurable computing. The purpose of this article is to identify and summarize common patterns among those systems that can be seen as de facto standards. Furthermore, open problems not covered by these already-available systems are identified.
Electromyographic (EMG) signals are bio-signals collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. Hybrid soft computing systems (HSCS), the integration of these different techniques, aim to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combinations of these techniques, and their other applications in EMG analysis. PMID:24490979
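As a hedged illustration of the kind of inputs such hybrid classifiers typically consume (generic EMG practice, not taken from the paper), two classic time-domain features can be computed in a few lines:

```python
import math

def rms(signal):
    """Root mean square amplitude of an EMG window, a standard
    measure of muscle activation level."""
    return math.sqrt(sum(x * x for x in signal) / len(signal))

def zero_crossings(signal):
    """Count sign changes between consecutive samples, a rough
    proxy for the signal's frequency content."""
    return sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0)

# A made-up six-sample EMG window for illustration.
window = [0.1, -0.2, 0.3, -0.1, 0.2, -0.3]
print(round(rms(window), 3))   # 0.216
print(zero_crossings(window))  # 5
```

Feature vectors like (RMS, zero crossings, ...) per window would then be fed to a neural network, SVM, or fuzzy classifier as the abstract describes.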
Annual Merit Review and Peer Evaluation Meeting to review the FY2008 accomplishments and FY2009 plans for the Vehicle Technologies Program, and provide an opportunity for industry, government, and academic to give inputs to DOE on the Program with a structured and formal methodology.
Distance learning presents great potential for mitigating field problems in pesticide application technology. Thus, given the lack of teaching material about pesticide spraying technology in the Portuguese language and the increasing availability of distance learning, this study developed and evaluated a computer program for distance learning covering the theory of pesticide spraying technology, using information technology tools. The modules comprising the course, named Pulverizar, were: (1) Basic concepts, (2) Factors that affect application, (3) Equipment, (4) Spraying nozzles, (5) Sprayer calibration, (6) Aerial application, (7) Chemigation, (8) Physical-chemical properties, (9) Formulations, (10) Adjuvants, (11) Water quality, and (12) Adequate use of pesticides. The program was made available to the public on July 1st, 2008, hosted at the website www.pulverizar.iciag.ufu.br, and proved simple, robust and practical as a complement to traditional teaching for the education of professionals in the Agricultural Sciences. Mastering pesticide spraying technology by people involved in agricultural production can be facilitated by the program Pulverizar, which was well accepted in its initial evaluation.
Akhtar, Humza; Kemao, Qian; Kakarala, Ramakrishna
A touch panel is an input device for human–computer interaction. It consists of a network of sensors, a sampling circuit, and a microcontroller for detecting and locating a touch input. Touch input can come from either a finger or a stylus, depending on the type of touch technology. These touch panels provide an intuitive and collaborative workspace in which people can perform various tasks with their fingers instead of traditional input devices such as the keyboard and mouse. Touch sensing technology is not new: at the time of this writing, various technologies are available in the market, and this paper reviews the most common ones. We review traditional designs and sensing algorithms for touch technology. We also observe that, due to its various strengths, capacitive touch will dominate the large-scale touch panel industry in the years to come. In the end, we discuss the motivation for doing academic research on large-scale panels.
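The detect-and-locate step the abstract describes can be illustrated with a minimal sketch. This toy model (the grid values, threshold, and function name are illustrative assumptions, not from the reviewed paper) reports a touch as the signal-weighted centroid of per-node capacitance deltas:

```python
import numpy as np

def locate_touch(readings, threshold=5.0):
    """Estimate a touch position on a capacitive sensor grid.

    `readings` is a 2D array of per-node capacitance deltas sampled by the
    controller. A touch is reported as the signal-weighted centroid of the
    above-threshold signal, or None when no node clears the threshold.
    (Toy model; real controllers add filtering and baseline tracking.)
    """
    readings = np.asarray(readings, dtype=float)
    if readings.max() < threshold:
        return None  # no touch present
    # suppress background noise, keep only strong signal for the centroid
    w = np.clip(readings - threshold / 2, 0.0, None)
    ys, xs = np.mgrid[0:readings.shape[0], 0:readings.shape[1]]
    return (float((w * ys).sum() / w.sum()),
            float((w * xs).sum() / w.sum()))

# a finger pressing between nodes (1, 1) and (1, 2)
grid = np.array([[0, 1, 1, 0],
                 [1, 8, 8, 1],
                 [0, 1, 1, 0]], dtype=float)
pos = locate_touch(grid)  # sub-node resolution: centroid lands at x = 1.5
```

The centroid is what gives capacitive panels sub-node resolution: the touch position is interpolated between sensor nodes rather than snapped to the strongest one.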
Genetic use restriction technologies (GURTs), developed to secure return on investment through the protection of plant varieties, are among the most controversial and opposed genetic engineering biotechnologies, as they are perceived as a tool to force farmers to depend on multinational corporations' seed monopolies. In this work, the currently proposed strategies are described and compared with some of the principal techniques implemented for preventing transgene flow and/or seed saving. The future perspectives of GURTs are analyzed in parallel, taking into account potential benefits, possible impacts on farmers and local plant genetic resources (PGR), hypothetical negative environmental issues, and the ethical concerns related to intellectual property that have led to the ban of this technology. © 2014 Society for Experimental Biology, Association of Applied Biologists and John Wiley & Sons Ltd.
This month's issue contains articles entitled Livermore Science and Technology Garner Seven 1997 R&D 100 Awards; New Interferometer Measures to Atomic Dimensions; Compact, More Powerful Chips from Virtually Defect-free Thin-Film Systems; A New Precision Cutting Tool: The Femtosecond Laser; MELD: A CAD Tool for Photonics Systems; The Tiltmeter: Tilting at Great Depths to Find Oil; Smaller Insulators Handle Higher Voltage; and Compact Storage Management Software: The Next Generation.
Bookless, W.A.; Wheatcraft, D.
This journal contains two feature articles. The first article reports on the background, design, and capabilities of the Portable Tritium Processing System currently being used to clean up and decontaminate the Laboratory's Tritium Facility. The second article discusses the development of x-ray lasers as a probe for obtaining high-resolution images of the high-density plasmas produced at the Nova laser facility. Finally, two research programs are highlighted: silicon microcomponents and modern technology for advanced military training.
Nover, Adam B; Jagtap, Shami; Anjum, Waqas; Yegingil, Hakki; Shih, Wan Y; Shih, Wei-Heng; Brooks, Ari D
Breast cancer is a serious threat worldwide and is the number two killer of women in the United States. The key to successful management is screening and early detection. What follows is a description of the state of the art in screening and detection for breast cancer as well as a discussion of new and emerging technologies. This paper aims to serve as a starting point for those who are not acquainted with this growing field.
Omaki, Elise; Rizzutti, Nicholas; Shields, Wendy; Zhu, Jeffrey; McDonald, Eileen; Stevens, Martha W; Gielen, Andrea
The aims of this literature review are to (1) summarise how computer and mobile technology-based health behaviour change applications have been evaluated in unintentional injury prevention, (2) describe how these successes can be applied to injury-prevention programmes in the future and (3) identify research gaps. Studies included in this systematic review were education and behaviour change intervention trials and programme evaluations in which the intervention was delivered by either a computer or mobile technology and addressed an unintentional injury prevention topic. Articles were limited to those published in English and after 1990. Among the 44 technology-based injury-prevention studies included in this review, 16 studies evaluated locally hosted software programmes, 4 studies offered kiosk-based programmes, 11 evaluated remotely hosted internet programmes, 2 studies used mobile technology or portable devices and 11 studies evaluated virtual-reality interventions. Locally hosted software programmes and remotely hosted internet programmes consistently increased knowledge and behaviours. Kiosk programmes showed evidence of modest knowledge and behaviour gains. Both programmes using mobile technology improved behaviours. Virtual-reality programmes consistently improved behaviours, but there was little gain in knowledge. No studies evaluated text-messaging programmes dedicated to injury prevention. There is much potential for computer-based programmes to be used for injury-prevention behaviour change. The reviewed studies provide evidence that computer-based communication is effective in conveying information and influencing how participants think about an injury topic and adopt safety behaviours.
Anatoliy I. Bedov
This paper considers the areas of application of higher-strength concrete and reinforcement in structural elements of a monolithic reinforced concrete frame. Analytic dependencies, criteria, and boundary conditions are proposed that numerically describe the relationship between increasing the strength of concrete and reducing the consumption of reinforcing steel for bent and compressed-bent elements. Computational-analytical models of the deformation state of the floor slabs of a monolithic reinforced concrete multi-storey frame have been developed on the basis of multifactor numerical studies carried out for various slab thicknesses, spans, operating loads, and classes of concrete and reinforcement. The design parameters of the slabs that determine their bearing capacity are identified. Using computer technology, the optimum section of a reinforced concrete element is modeled according to the criterion of reducing material consumption while rationally combining classes of concrete and reinforcement.
Chao, Han-Chieh; Deng, Der-Jiunn; Park, James; HumanCom and EMC 2013
The theme of HumanCom is focused on the various aspects of human-centric computing for advances in computer science and its applications, providing an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of human-centric computing. The theme of EMC (Advances in Embedded and Multimedia Computing) is focused on the various aspects of embedded systems, smart grid, cloud, and multimedia computing, providing an opportunity for academic and industry professionals to discuss the latest issues and progress in the area of embedded and multimedia computing. This book therefore includes various theories and practical applications in human-centric computing and embedded and multimedia computing.
Velasco, Carlos; Obrist, Marianna; Petit, Olivia; Spence, Charles
There is growing interest in the development of new technologies that capitalize on our emerging understanding of the multisensory influences on flavor perception in order to enhance human-food interaction design. This review focuses on the role of (extrinsic) visual, auditory, and haptic/tactile elements in modulating flavor perception and more generally, our food and drink experiences. We review some of the most exciting examples of recent multisensory technologies for augmenting such experiences. Here, we discuss applications for these technologies, for example, in the field of food experience design, in the support of healthy eating, and in the rapidly growing world of sensory marketing. However, as the review makes clear, while there are many opportunities for novel human-food interaction design, there are also a number of challenges that will need to be tackled before new technologies can be meaningfully integrated into our everyday food and drink experiences.
Malone, R.D. [ed.]
The Fuels Technology Contractors Review Meeting was held November 16-18, 1993, at the Morgantown Energy Technology Center (METC) in Morgantown, West Virginia. This meeting was sponsored and hosted by METC, the Office of Fossil Energy, U.S. Department of Energy (DOE). METC periodically provides an opportunity to bring together all of the R&D participants in a DOE-sponsored contractors review meeting to present key results of their research and to provide technology transfer to the active research community and to the interested public. This meeting was previously called the Natural Gas Technology Contractors Review Meeting. This year it was expanded to include DOE-sponsored research on oil shale and tar sands and so was retitled the Fuels Technology Contractors Review Meeting. Current research activities include efforts in both natural gas and liquid fuels. The natural gas portion of the meeting included discussions of results summarizing work being conducted in fracture systems, both natural and induced; drilling, completion, and stimulation research; resource characterization; delivery and storage; gas to liquids research; and environmental issues. The meeting also included project and technology summaries on research in oil shale, tar sands, and mild coal gasification, and summaries of work in natural-gas fuel cells and natural-gas turbines. The format included oral and poster session presentations. Individual papers have been processed separately for inclusion in the Energy Science and Technology database.
Óscar Javier Espitia Mendoza
Computed tomography is a noninvasive scanning technique widely applied in areas such as medicine, industry, and geology. The technique allows three-dimensional reconstruction of the internal structure of an object that is illuminated with an X-ray source; the reconstruction is formed from two-dimensional cross-sectional images of the object. Each cross-section is obtained from measurements of physical phenomena, such as attenuation, dispersion, and diffraction of X-rays, resulting from their interaction with the object. In general, measurement acquisition is performed with methods based on any of these phenomena and according to various architectures classified into generations. Furthermore, in response to the need to simulate acquisition systems for CT, software dedicated to this task has been developed. The objective of this research is to determine the current state of CT techniques; to this end, a review of methods, of the different architectures used for acquisition, and of some of the technique's applications is presented, together with simulation results. The main contributions of this work are the detailed description of acquisition methods and the presentation of possible trends of the technique.
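The attenuation-based acquisition and reconstruction pipeline the abstract describes can be sketched in a few lines of numpy. This is a toy parallel-beam model of our own construction (phantom, angle set, and nearest-neighbor rotation are illustrative assumptions, not taken from the reviewed work): projections are simulated by rotating the object and summing along rays, and a blurred image is recovered by unfiltered backprojection.

```python
import numpy as np

def rotate_nn(img, theta):
    """Rotate a square image by theta (radians) about its center,
    nearest-neighbor sampling, zero fill outside the source image."""
    n = img.shape[0]
    c = (n - 1) / 2.0
    ys, xs = np.mgrid[0:n, 0:n]
    # inverse mapping: for each output pixel, find its source coordinate
    x = (xs - c) * np.cos(theta) + (ys - c) * np.sin(theta) + c
    y = -(xs - c) * np.sin(theta) + (ys - c) * np.cos(theta) + c
    xi, yi = np.round(x).astype(int), np.round(y).astype(int)
    ok = (xi >= 0) & (xi < n) & (yi >= 0) & (yi < n)
    out = np.zeros_like(img)
    out[ys[ok], xs[ok]] = img[yi[ok], xi[ok]]
    return out

def sinogram(img, thetas):
    """One parallel-beam attenuation profile per angle:
    rotate the object, then sum along the ray direction."""
    return np.array([rotate_nn(img, t).sum(axis=0) for t in thetas])

def backproject(sino, thetas):
    """Unfiltered backprojection: smear each profile back across the
    image plane at its acquisition angle and average over angles."""
    n = sino.shape[1]
    recon = np.zeros((n, n))
    for p, t in zip(sino, thetas):
        recon += rotate_nn(np.tile(p, (n, 1)), -t)
    return recon / len(thetas)

# demo: a square "phantom" of elevated attenuation
n = 64
phantom = np.zeros((n, n))
phantom[24:40, 24:40] = 1.0
thetas = np.linspace(0.0, np.pi, 36, endpoint=False)
recon = backproject(sinogram(phantom, thetas), thetas)
peak = np.unravel_index(np.argmax(recon), recon.shape)
```

A production reconstruction would apply a ramp filter to each profile before backprojection (filtered backprojection) or use iterative methods; the unfiltered version shown here recovers the object's location but leaves the characteristic 1/r blur.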
De Pruneda, J.H.
The contents of this Lawrence Livermore National Laboratory newsletter include the following: (1) The Laboratory in the News; (2) Commentary by George Miller--Reaping Unexpected Benefits from the Petawatt Laser Breakthrough; (3) The Amazing Power of the Petawatt--For three years, the Petawatt laser was the most powerful laser in the world, pushing electrons toward the speed of light and accomplishing some remarkable science in the process; (4) Building a Virtual Time Machine--A powerful new computer code simulates geologic changes eons into the future at Yucca Mountain, a potential underground nuclear waste repository; (5) Research Highlight: Dead Sea Explosions Trigger International Cooperation; (6) Patents and Awards; and (7) Abstracts.
Quirk, W.J.; Sefcik, J.A. [eds.]
Four articles are included in this issue. The first expounds upon the unique properties of organic aerogels and their applications. The second article describes computer algorithms that help detect and identify small features in complex biomedical images. The third reports efforts by scientists from Lawrence Livermore National Laboratory to assess pollution sources and establish priorities for pollution prevention in the southern Ural mountains, where large quantities of high-level radioactive waste were discharged into the Techa river between 1949 and 1976. Last, the output power of copper vapor lasers used for uranium atomic vapor isotope separation is increased with the use of an internal septum that reduces peak gas temperature.
Green, Kenneth C.
The 2001 Campus Computing Survey, the 12th such survey, is the largest continuing study of the role of computing and information technology in U.S. higher education today. The survey results in this report summarize data from 590 two- and four-year, public and private colleges across the United States, representing a 38.4% response rate. The focus…
Vancouver Community Coll., British Columbia.
After examining the impact of changing technology on postsecondary instruction and on the tools needed for instruction, this report analyzes the status and offers recommendations concerning the future of instructional computing at Vancouver Community College (VCC) in British Columbia. Section I focuses on the use of computers in community college…
Howard, B.Y.; McClain, W.J.; Landay, M. (comps.)
The Council on Computers (CC) of the Society of Nuclear Medicine (SNM) annually publishes the Proceedings of its Symposium on the Sharing of Computer Programs and Technology in Nuclear Medicine. This is the seventh such volume and has been organized by topic, with the exception of the invited papers and the discussion following them. An index arranged by author and by subject is included.
The following titles are in the review: The Laboratory in the News; Commentary about the year's R&D 100 Honors; The Optical Modulator and Switch: Light on the Move; From Dinosaur Bones to Software, Gamma Rays Protect Property; High-Power Green Lasers Open up Precision Machining; Breakthrough Design for Accelerators; New Deposition System for the Microchip Revolution; and PEREGRINE™ Takes Aim at Cancer Tumors.
This volume presents a balanced blend of methodological and applied contributions, supplementing the first three volumes of the series and revealing results of current research in computational chemistry. It also reviews the topographical features of several molecular scalar fields: a brief discussion of topographical concepts is followed by examples of their application to several branches of chemistry. The size of the basis set applied in a calculation determines the amount of computer resources necessary for a particular task. The details of a common strategy - the ab initio model potential
Li, Jing; Zhou, Liang; Yang, Fei
With the development of computer technology, computer graphics has come into wide use; in particular, the success of object-oriented and multimedia technologies has promoted the development of graphics technology in computer software systems. Computer graphics theory and applications have therefore become an important topic in the computing field, and graphics technology is being applied ever more extensively. In recent years, with the development of the social economy and especially the rapid development of information technology, traditional approaches to communication resource management can no longer effectively meet the needs of resource management. Current communication resource management still relies on the original tools and methods for managing and maintaining equipment, which has caused many problems: it is very difficult for non-professionals to understand the equipment and its status in communication resource management, resource utilization is relatively low, and managers cannot quickly and accurately assess resource conditions. To address these problems, this paper proposes introducing computer graphics technology into communication resource management. Doing so not only makes communication resource management more vivid, but also reduces the cost of resource management and improves work efficiency.
Barry, Amanda [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bioenergy Technologies Office, Washington, DC (United States)]; Wolfe, Alexis [Oak Ridge Inst. for Science and Education (ORISE), Oak Ridge, TN (United States)]; English, Christine [National Renewable Energy Lab. (NREL), Golden, CO (United States); Bioenergy Technologies Office, Washington, DC (United States)]; Ruddick, Colleen [BCS, Incorporated, Washington, DC (United States)]; Lambert, Devinn [Bioenergy Technologies Office, Washington, DC (United States)]
The Bioenergy Technologies Office (BETO) of the U.S. Department of Energy (DOE), Office of Energy Efficiency and Renewable Energy, is committed to advancing the vision of a viable, sustainable domestic biomass industry that produces renewable biofuels, bioproducts, and biopower; enhances U.S. energy security; reduces our dependence on fossil fuels; provides environmental benefits; and creates economic opportunities across the nation. BETO’s goals are driven by various federal policies and laws, including the Energy Independence and Security Act of 2007 (EISA). To accomplish its goals, BETO has undertaken a diverse portfolio of research, development, and demonstration (RD&D) activities, in partnership with national laboratories, academia, and industry.
Chinn, D J
This month's issue has the following articles: (1) Biomedical Technology Has a Home at Livermore--Commentary by Cherry A. Murray; (2) Shaping the Future of Aneurysm Treatments--Livermore foam devices may offer significant advantages for treating some forms of aneurysms; (3) Ring around a Stellar Shell: A Tale of Scientific Serendipity--Using a three-dimensional model, Livermore scientists have solved a long-standing puzzle of stellar evolution; and (4) On Assignment in Washington, DC--Livermore personnel in Washington, DC, support federal sponsors and become valuable assets to Laboratory programs.
Reviewed by Mubin KIYICI
and time, using alternative media resources when students and instructors have difficulty establishing face-to-face communication. In distance education, instruction delivery between tutors and students is done using different delivery systems such as computer-mediated communication systems, video tapes, printed material, cassettes, and instructional television. With the developments in the Internet and the global network system, universities immediately took advantage of the World Wide Web to deliver instruction to almost any node in the world, regardless of physical distance and time. The main aims that should be sought by almost all institutions offering distance education are how effective the given program is and whether it is a sufficient replacement for traditional face-to-face education. These aims were discussed during IETC-2005 by presenters, panels, and keynote speakers. The first, second, and fourth International Educational Technology Conferences (IETC) were held by Sakarya University in Turkey, the third at the Eastern Mediterranean University in the Turkish Republic of Northern Cyprus, and the fifth again at Sakarya University in Turkey. Without the authors and receivers, IETC 2005 would, of course, have been impossible. We would like to sincerely thank all of you for coming, presenting, and joining in the academic activities. We would also like to thank all of those who contributed to the reviewing process of the IETC 2005 conference papers, which will also be published in TOJET. And finally, we would like to thank Sakarya University (Turkey), Eastern Mediterranean University (TRNC), Louisiana State University (USA), Ohio University (USA), Governors State University (USA), and The Turkish Online Journal of Educational Technology (TOJET) for successfully organizing and hosting IETC 2005 in Sakarya, Turkey. Finally, I would like to wish you all a pleasant stay in Sakarya, Turkey, and a safe return back home. I hope that IETC 2005 will be a
Jwayyed, Sharhabeel; Stiffler, Kirk A; Wilber, Scott T; Southern, Alison; Weigand, John; Bare, Rudd; Gerson, Lowell W
Studies on computer-aided instruction and web-based learning have left many questions unanswered about the most effective use of technology-assisted education in graduate medical education. We conducted a review of the current medical literature to report the techniques, methods, frequency and effectiveness of technology-assisted education in graduate medical education. A structured review of MEDLINE articles dealing with "Computer-Assisted Instruction," "Internet or World Wide Web," "Education" and "Medical" limited to articles published between 2002-2007 in the English language was performed. The two literature searches returned 679 articles; 184 met our inclusion and exclusion criteria. In 87 articles, effectiveness was measured primarily using self-reported results from a survey of subjects. Technology-assisted education was superior to traditional methods in 42 of the 64 direct comparison articles (66%, 95% CI 53-77%). Traditional teaching methods were superior to technology-assisted education in only 3/64 (5%, 95% CI 1-13%). The remaining 19 direct comparison articles showed no difference. A detailed review of the 64 comparative studies (technology-assisted education versus traditional teaching methods) also failed to identify a best method or best uses for technology-assisted education. Technology-assisted education is used in graduate medical education across a variety of content areas and participant types. Knowledge gain was the predominant outcome measured. The majority of studies that directly compared knowledge gains in technology-assisted education to traditional teaching methods found technology-assisted education equal or superior to traditional teaching methods, though no "best methods" or "best use" was found within those studies. Only three articles were specific to Emergency Medicine, suggesting further research in our specialty is warranted.
Ruppel, Clemens C W
Today, acoustic filters are the filter technology that meets the performance requirements dictated by the cellular phone standards within the required form factor. Around two billion cellular phones are sold every year, and smartphones account for a very high percentage of them, approximately two-thirds. Smartphones require a very high number of filter functions, ranging from the low double digits up to almost triple-digit numbers in the near future. In the frequency range up to 1 GHz, surface acoustic wave (SAW) filters are almost exclusively employed, while in the higher frequency range, bulk acoustic wave (BAW) and SAW filters are competing for their shares. Prerequisites for the success of acoustic filters were the availability of high-quality substrates, advanced and highly reproducible fabrication technologies, optimum filter techniques, precise simulation software, and advanced design tools that allow fast and efficient design according to customer specifications. This paper focuses on innovations leading to high-volume applications of intermediate frequency (IF) and radio frequency (RF) acoustic filters, e.g., TV IF filters, IF filters for cellular phones, and SAW/BAW RF filters for the RF front-end of cellular phones.
Slack, Charles W.
It is no accident that the first use of computers in school systems was to arrange schedules for students and teachers. The proper use of the computer in the classroom is as a replacement for the clock and its strict temporal schedule. By conveying information through self-instructional content, the computer can schedule work for pupils in…
In a time of budget cuts and limited funding, purchasing and installing the latest software on classroom computers can be prohibitive for schools. Many educators are unaware that a variety of free software options exist, and some of them do not actually require installing software on the user's computer. One such option is cloud computing. This…
Shao, Kun; Maher, Peter
Purpose: Teaching advanced technical concepts in a computer science program to students of different technical backgrounds presents many challenges. The purpose of this paper is to present a detailed experimental pedagogy in teaching advanced computer science topics, such as computer networking, telecommunications and data structures using…
Musial, W. [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Lawson, M. [National Renewable Energy Lab. (NREL), Golden, CO (United States)]; Rooney, S. [National Renewable Energy Lab. (NREL), Golden, CO (United States)]
The Marine and Hydrokinetic Technology (MHK) Instrumentation, Measurement, and Computer Modeling Workshop was hosted by the National Renewable Energy Laboratory (NREL) in Broomfield, Colorado, July 9–10, 2012. The workshop brought together over 60 experts in marine energy technologies to disseminate technical information to the marine energy community, and to collect information to help identify ways in which the development of a commercially viable marine energy industry can be accelerated. The workshop was comprised of plenary sessions that reviewed the state of the marine energy industry and technical sessions that covered specific topics of relevance. Each session consisted of presentations, followed by facilitated discussions. During the facilitated discussions, the session chairs posed several prepared questions to the presenters and audience to encourage communication and the exchange of ideas between technical experts. Following the workshop, attendees were asked to provide written feedback on their takeaways from the workshop and their best ideas on how to accelerate the pace of marine energy technology development. The first four sections of this document give a general overview of the workshop format, provide presentation abstracts, supply discussion session notes, and list responses to the post-workshop questions. The final section presents key findings and conclusions from the workshop that suggest what the most pressing MHK technology needs are and how the U.S. Department of Energy (DOE) and national laboratory resources can be utilized to assist the marine energy industry in the most effective manner.
This report contains reviews of operating experiences, selected accident events, and industrial safety performance indicators that document the performance of the major US DOE magnetic fusion experiments and particle accelerators. These data are useful to form a basis for the occupational safety level at matured research facilities with known sets of safety rules and regulations. Some of the issues discussed are radiation safety, electromagnetic energy exposure events, and some of the more widespread issues of working at height, equipment fires, confined space work, electrical work, and other industrial hazards. Nuclear power plant industrial safety data are also included for comparison.
A. Sood (Ashish); S. Stremersch (Stefan)
Understanding technological change is of critical importance to marketers, as it bears new markets, new brands, new customers, and new market leaders. This paper examines the deviation among reviews of a technology's performance and its consequences for inferences on technology evolution.
Blobaum, K M
This month's issue has the following articles: (1) Fifty Years of Stellar Laser Research - Commentary by Edward I. Moses; (2) A Stellar Performance - By combining computational models with test shot data, scientists at the National Ignition Facility have demonstrated that the laser is spot-on for ignition; (3) Extracting More Power from the Wind - Researchers are investigating how atmospheric turbulence affects power production from wind turbines; (4) Date for a Heart Cell - Carbon-14 dating reveals that a significant number of heart muscle cells are regenerated over the course of our lives; and (5) Unique Marriage of Biology and Semiconductors - A new device featuring a layer of fat surrounding a thin silicon wire takes advantage of the communication properties of both biomolecules and semiconductors.
Bearinger, J P
This issue has the following articles: (1) Answering Scientists' Most Audacious Questions--Commentary by Dona Crawford; (2) Testing the Accuracy of the Supernova Yardstick--High-resolution simulations are advancing understanding of Type Ia supernovae to help uncover the mysteries of dark energy; (3) Developing New Drugs and Personalized Medical Treatment--Accelerator mass spectrometry is emerging as an essential tool for assessing the effects of drugs in humans; (4) Triage in a Patch--A painless skin patch and accompanying detector can quickly indicate human exposure to biological pathogens, chemicals, explosives, or radiation; and (5) Smoothing Out Defects for Extreme Ultraviolet Lithography--A process for smoothing mask defects helps move extreme ultraviolet lithography one step closer to creating smaller, more powerful computer chips.
Boyack, B.E.; Jenks, R.P.
A structured, independent computer code peer-review process has been developed to assist the US Nuclear Regulatory Commission (NRC) and the US Department of Energy in their nuclear safety missions. This paper describes a structured process of independent code peer review, the benefits associated with a code-independent peer review, and the authors' recent peer-review experience. The NRC adheres to the principle that the safety of plant design, construction, and operation is the responsibility of the licensee. Nevertheless, NRC staff must have the ability to independently assess plant designs and safety analyses submitted by license applicants. According to Ref. 1, "this requires that a sound understanding be obtained of the important physical phenomena that may occur during transients in operating power plants." The NRC concluded that computer codes are the principal products to "understand and predict plant response to deviations from normal operating conditions" and has developed several codes for that purpose. However, codes cannot be used blindly; they must be assessed and found adequate for the purposes for which they are intended. A key part of the qualification process can be accomplished through code peer reviews; this approach has been adopted by the NRC.
This is the eleventh and final part of the proceedings of the 2007 CISBAT conference on Renewables in a changing climate, held in Lausanne, Switzerland. On the subject Information technologies and software the following oral contributions are summarised: 'A comparison study of the likely performance of an advanced naturally ventilated building: the relationship between computer simulation analysis and findings from a monitoring exercise', 'PhotonSim: Developing and testing a Monte Carlo ray tracing software for the simulation of planar solar concentrators' and 'Calibration of multiple energy simulations: case study on a Norwegian school building'. Posters summarised include 'Implementing energy building simulation into design studio: lessons learned in Brazil', 'MaterialsDB.org: a Tool for facilitating information exchange between building material providers and building physics softwares', 'Urban built environment climate modification: a modelling approach' and 'An assessment of the Simple Building Energy Model'. Further, the following Software that was presented on stands is summarised: 'Polysun 4: Simulation of solar thermal systems with complex hydraulics', 'Solangles: an internet online service to draw sun rays in plans and sections', 'Daylight 1-2-3: A text guide and a software as integrated tools for initial daylight/energy design', 'Meteonorm Version 5.0' and 'Transol 2.0 - Software for the design of solar thermal systems'. Further groups of presentations at the conference are reported on in separate database records. An index of authors completes the proceedings.
Ana Célia Caetano de Souza
Objective: To investigate the educational technologies developed for promoting cardiovascular health in adults. Method: Integrative review carried out in the PubMed, SciELO and LILACS databases, with 15 articles selected. Results: Over half (60%) of the studies were randomized clinical trials. The educational technologies developed were programs involving three strategies with a duration of one year, playful technologies using storytelling, computer programs or software for smartphones, and an electronic brochure. These technologies resulted in reductions in blood pressure, weight, waist circumference and hospitalizations, and in increased years of life. Conclusion: The studies with the greatest impact on the cardiovascular health of adults were those that delivered the technology as a year-long program.
Della Palma, Paolo; Moresco, Luca; Giorgi Rossi, Paolo
; from an economic point of view, the automated computer-assisted Pap test can be cost-effective only with conventional smears, only if the screening centre has a volume of more than 49,000 slides/year, and only if cytologist productivity increases about threefold. It must be highlighted that adopting the automated Pap test is not by itself sufficient to reach such an increase in productivity; the laboratory must be organised or re-organised to optimise the use of the review stations and of staff time. In the case of liquid-based cytology, the adoption of the automated computer-assisted Pap test can only increase costs: liquid-based cytology increases the cost of consumable materials but reduces interpretation time even in manual screening, so the reduction in human costs is smaller in the case of computer-assisted screening. Liquid-based cytology has other implications and advantages not linked to the use of the computer-assisted Pap test that should be taken into account but are beyond the scope of this Report. Given that the computer-assisted Pap test reduces human costs, it may be more advantageous where the cost of cytologists is higher. Given the relatively small volume of activity of screening centres in Italy, the computer-assisted Pap test may be reasonable for a network using a single central scanner and several remote review stations; the use of the automated computer-assisted Pap test only for quality control in a single centre is not economically sustainable. In this case as well, several centres, for example at the regional level, may form a consortium to reach the number of slides needed to achieve the break-even point. Regarding the use of a machine rather than human intelligence to interpret the slides, some ethical issues were initially raised, but both the scientific community and healthcare professionals have accepted this technology.
The identification of fields of interest by the machine is highly reproducible, reducing subjectivity in the diagnostic
Tong, Man; Yuan, Songhu
Highlights: ► HCB contamination is still a serious environmental problem. ► Physiochemical technologies for HCB remediation and disposal are reviewed. ► Perspectives for most remediation technologies are proposed. ► Pilot and large-scale remediation and disposal are presented. - Abstract: Hexachlorobenzene (HCB) is one of the 12 persistent organic pollutants (POPs) listed in the “Stockholm Convention”. It is hydrophobic, toxic and persistent in the environment. Due to extensive use in the past, HCB contamination is still a serious environmental problem, and strong adsorption on solid particles makes remediation difficult. This paper presents an overview of the physiochemical technologies for HCB remediation and disposal. The adsorption/desorption behavior of HCB is described first because it forms the foundation for most remediation technologies. The physiochemical technologies of greatest concern for HCB remediation and disposal, i.e., chemically enhanced washing, electrokinetic remediation, reductive dechlorination and thermal decomposition, are reviewed in terms of fundamentals, state of the art and perspectives. Other physiochemical technologies, including chemical oxidation, radiation-induced catalytic dechlorination, ultrasound-assisted treatment and mechanochemical dechlorination, are also reviewed. Pilot- and large-scale tests on HCB remediation and disposal are summarized at the end. This review aims to provide useful information to researchers and practitioners regarding HCB remediation and disposal.
Nikolic, R J
This month's issue has the following articles: (1) Honoring a Legacy of Service to the Nation - The nation pays tribute to George Miller, who retired in December 2011 as the Laboratory's tenth director; (2) Life-Extension Programs Encompass All Our Expertise - Commentary by Bruce T. Goodwin; (3) Extending the Life of an Aging Weapon - Stockpile stewards have begun work on a multiyear effort to extend the service life of the aging W78 warhead by 30 years; (4) Materials by Design - Material microstructures go three-dimensional with improved additive manufacturing techniques developed at Livermore; (5) Friendly Microbes Power Energy-Producing Devices - Livermore researchers are demonstrating how electrogenic bacteria and microbial fuel cell technologies can produce clean, renewable energy and purify water; and (6) Chemical Sensor Is All Wires, No Batteries - Livermore's 'batteryless' nanowire sensor could benefit applications in diverse fields such as homeland security and medicine.
Maciej J Mrowinski
With the volume of manuscripts submitted for publication growing every year, the deficiencies of peer review (e.g. long review times) are becoming more apparent. Editorial strategies, sets of guidelines designed to speed up the process and reduce editors' workloads, are treated as trade secrets by publishing houses and are not shared publicly. To improve the effectiveness of their strategies, editors in small publishing groups are faced with undertaking an iterative trial-and-error approach. We show that Cartesian Genetic Programming, a nature-inspired evolutionary algorithm, can dramatically improve editorial strategies. The artificially evolved strategy reduced the duration of the peer review process by 30% without increasing the pool of reviewers (in comparison to a typical human-developed strategy). Evolutionary computation has typically been used in technological processes or biological ecosystems. Our results demonstrate that genetic programs can improve real-world social systems, which are usually much harder to understand and control than physical systems.
Papadopoulos, Pantelis M.; Lagkas, Thomas D.; Demetriadis, Stavros N.
This study analyses the impact of self and peer feedback in technology-enhanced peer review settings. The impact of receiving peer comments (“receiver” perspective) is compared to that of reaching own insights by reviewing others’ work (“giver” perspective). In this study, 38 sophomore students were randomly assigned in two conditions and engaged in peer review activity facilitated by a web-based learning environment asking them to provide multiple reviews. In the Peer Reviewed (PR) condition students both reviewed peer work and received peer comments for their own work. By contrast… …systems that aim to flexibly support more efficient peer review schemes.
Mahesh S. Raisinghani
The word 'quantum' comes from the Latin word quantus, meaning 'how much'. Quantum computing is a fundamentally new mode of information processing that can be performed only by harnessing physical phenomena unique to quantum mechanics (especially quantum interference). Paul Benioff of the Argonne National Laboratory first applied quantum theory to computers in 1981, and David Deutsch of Oxford proposed quantum parallel computers in 1985, years before the realization of qubits in 1995. However, it may be well into the 21st century before we see quantum computing used at a commercial level, for a variety of reasons discussed in this paper. The subject of quantum computing brings together ideas from classical information theory, computer science, and quantum physics. This paper discusses some of the current advances, applications, and challenges of quantum computing as well as its impact on corporate computing and implications for management. It shows how quantum computing can be utilized to process and store information, as well as impact cryptography for perfectly secure communication, algorithmic searching, factorizing large numbers very rapidly, and simulating quantum-mechanical systems efficiently. A broad interdisciplinary effort will be needed if quantum computers are to fulfill their destiny as the world's fastest computing devices.
Mackey, Tim K; Nayyar, Gaurvika
The globalization of the pharmaceutical supply chain has introduced new challenges, chief among them, fighting the international criminal trade in fake medicines. As the manufacture, supply, and distribution of drugs becomes more complex, so does the need for innovative technology-based solutions to protect patients globally. Areas covered: We conducted a multidisciplinary review of the science/health, information technology, computer science, and general academic literature with the aim of identifying cutting-edge existing and emerging 'digital' solutions to combat fake medicines. Our review identified five distinct categories of technology including mobile, radio frequency identification, advanced computational methods, online verification, and blockchain technology. Expert opinion: Digital fake medicine solutions are unifying platforms that integrate different types of anti-counterfeiting technologies as complementary solutions, improve information sharing and data collection, and are designed to overcome existing barriers of adoption and implementation. Investment in this next generation technology is essential to ensure the future security and integrity of the global drug supply chain.
AUTHOR|(INSPIRE)INSPIRE-00291854; The ATLAS collaboration; Di Girolamo, Alessandro; Alandes Pradillo, Maria
The variety of the ATLAS Distributed Computing infrastructure requires a central information system to define the topology of computing resources and to store the various parameters and configuration data needed by ATLAS software components. The ATLAS Grid Information System (AGIS) is the system designed to integrate configuration and status information about the resources, services and topology of the computing infrastructure used by ATLAS Distributed Computing applications and services. As an intermediate middleware system between clients and external information sources (such as the central BDII, GOCDB and MyOSG), AGIS defines the relations between experiment-specific resources and physical distributed computing capabilities. Having been in production during LHC Run 1, AGIS became the central information system for Distributed Computing in ATLAS and is continuously evolving to fulfil new user requests, enable enhanced operations and follow the extension of the ATLAS Computing model. The ATLAS Computin...
Franco-Martín, Manuel A; Muñoz-Sánchez, Juan Luis; Sainz-de-Abajo, Beatriz; Castillo-Sánchez, Gema; Hamrioui, Sofiane; de la Torre-Díez, Isabel
Suicide is the second leading cause of death in young people, and in many cases it can be prevented. Technology can help people at risk of suicide and their families: it facilitates the detection of individuals at risk, allowing early and effective intervention, and as technology continues to evolve it could help avert situations of suicide risk. This work is a systematic review of research papers published in the last ten years on technology for suicide prevention. In September 2017, the consultation was carried out in the scientific databases PubMed, ScienceDirect, PsycINFO, The Cochrane Library and Google Scholar. A general search was conducted with the terms "prevention" AND "suicide" AND "technology". More specific searches included technologies such as "Web", "mobile", "social networks", and other terms related to technologies. The number of articles found following the proposed methodology was 90, but only 30 focus on the objective of this work. Most of them concerned Web technologies (51.61%), followed by mobile solutions (22.58%), social networks (12.90%), machine learning (3.23%) and other technologies (9.68%). According to the results obtained, although there are technological solutions that help the prevention of suicide, much remains to be done in this field. Collaboration among technologists, psychiatrists, patients, and family members is key to advancing the development of new technology-based solutions that can help save lives.
Ruiz, Jorge G; Cook, David A; Levinson, Anthony J
Animations can depict dynamic changes over time and location, and illustrate phenomena and concepts that might otherwise be difficult to visualise. However, animations may not always be effective, and educators who use animations must understand the principles that govern their use. This review aims to illustrate potential applications of animations in medical education, to identify evidence-based principles for their design and use, and to propose an agenda for future research. We searched MEDLINE, PsycINFO and EMBASE for articles describing the use of computer animations in medical education. We reviewed and summarised all identified original research studies comparing animations with an alternative computer-based or non-computer-based format. We also selectively reviewed non-medical education research on the use of computer animations. Medical educators have used animations in a variety of computer-assisted learning applications, but few comparative studies have been published and the evidence is inconclusive. Research outside medical education shows conflicting results for studies comparing animations with static images. This may reflect differences in cognitive load induced by animation, or differences in the type of motion being illustrated. The benefits of animations may also vary according to learner characteristics such as prior knowledge and spatial ability. Features of animation that appear to facilitate learning include permitting learner control over the animation's pace, allowing learners to interact with animations and splitting the animation activity into small chunks (segmenting). Existing medical education research does little to inform the use of animations. Research is needed to confirm and extend non-medicine research to ascertain when to use animations and how to use them effectively.
S. Khanagha (Saeed)
Abstract: The advancement of information and communication technologies has brought a digital age, where massive computing power, high speed and ubiquitous access to internet and more recently Cloud Computing Technology are expected to transform a wide range of organizations,
Grgurovic, Maja; Chapelle, Carol A.; Shelley, Mack C.
With the aim of summarizing years of research comparing pedagogies for second/foreign language teaching supported with computer technology and pedagogy not-supported by computer technology, a meta-analysis was conducted of empirical research investigating language outcomes. Thirty-seven studies yielding 52 effect sizes were included, following a…
Iyadat, Waleed; Iyadat, Yousef; Ashour, Rateb; Khasawneh, Samer
The primary purpose of this study was to determine the level of students' awareness about computer technology ethics at the Hashemite University in Jordan. A total of 180 university students participated in the study by completing the questionnaire designed by the researchers, named the Computer Technology Ethics Questionnaire (CTEQ). Results…
This study investigated preschool teachers' beliefs and practices regarding the use of computer technology in teaching reading and writing in Jordan. The researcher developed a questionnaire consisting of two scales--Teachers' Beliefs Scale (TB Scale) and Teachers' Practices Scale (TP Scale)--to examine the role of computer technology in teaching…
The article deals with computer technology training, highlights the current state of computerization of the educational process in teacher training colleges, and reveals specific techniques for the professional preparation of teachers of fine arts to use computer technology in their teaching careers. Key words: methods of professional training, professional activities, computer technology training of future teachers of Fine Arts, the subject of research.