WorldWideScience

Sample records for previously developed computational

  1. The Importance of Business Model Factors for Cloud Computing Adoption: Role of Previous Experiences

    Directory of Open Access Journals (Sweden)

    Bogataj Habjan Kristina

    2017-08-01

    Background and Purpose: By bringing several opportunities for more effective and efficient IT governance and service exploitation, cloud computing is expected to have a significant impact on the European and global economies. Market data show that, despite its many advantages and promised benefits, the adoption of cloud computing is not as fast and widespread as foreseen. This situation shows the need to further explore the potential of cloud computing and its uptake in the market. The purpose of this research was to identify the individual business model factors with the highest impact on cloud computing adoption. In addition, the aim was to identify differences in opinion regarding the importance of business model factors for cloud computing adoption according to companies' previous experience with cloud computing services.

  2. Value of computed tomography pelvimetry in patients with a previous cesarean section

    International Nuclear Information System (INIS)

    Yamani, Tarik Y.; Rouzi, Abdulrahim A.

    1998-01-01

    A case-control study was conducted at the Department of Obstetrics and Gynaecology, King Abdulaziz University Hospital, Jeddah, Saudi Arabia, to determine the value of computed tomography (CT) pelvimetry in patients with a previous cesarean section. Between January 1993 and December 1995, 219 pregnant women with one previous cesarean section were studied: 100 had antenatal CT pelvimetry for assessment of the pelvis, and 119 did not and served as controls. Fifty-one women (51%) in the CT pelvimetry group were delivered by cesarean section: 23 (23%) underwent elective cesarean section for contracted pelvis based upon the findings of CT pelvimetry, and 28 (28%) underwent emergency cesarean section after a trial of labor. In the group without CT pelvimetry, 26 women (21.8%) underwent emergency cesarean section. This was a statistically significant difference (P=0.02). There were no statistically significant differences in birthweight or Apgar scores between the groups, and there was no perinatal or maternal mortality in this study. Computed tomography pelvimetry increased the rate of cesarean delivery without any benefit in the immediate delivery outcomes. Therefore, the practice of documenting the adequacy of the pelvis by CT pelvimetry before vaginal birth after cesarean should be abandoned. (author)
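    A between-group comparison like the one reported is typically a Pearson chi-square test on a 2x2 table. The sketch below (stdlib only, not the authors' actual analysis) computes it; the illustrative counts assume 77 trials of labour in the CT group (100 minus the 23 elective cesareans), which is an inference from the abstract, not a stated figure.

    ```python
    import math

    def chi2_2x2(a, b, c, d):
        """Pearson chi-square for the 2x2 table [[a, b], [c, d]] and its
        p-value (1 degree of freedom, no continuity correction)."""
        n = a + b + c + d
        chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
        p = math.erfc(math.sqrt(chi2 / 2.0))  # upper tail of chi-square, 1 df
        return chi2, p

    # Illustration: emergency cesareans after trial of labour, 28 of an
    # assumed 77 in the CT group vs 26 of 119 in the control group
    chi2, p = chi2_2x2(28, 49, 26, 93)
    ```

    With these counts the statistic exceeds the 5% critical value of 3.84, consistent in magnitude with the significance the abstract reports.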

  3. Developing and validating an instrument for measuring mobile computing self-efficacy.

    Science.gov (United States)

    Wang, Yi-Shun; Wang, Hsiu-Yuan

    2008-08-01

    IT-related self-efficacy has been found to have a critical influence on system use. However, traditional measures of computer self-efficacy and Internet-related self-efficacy are perceived to be inapplicable in the context of mobile computing and commerce because they are targeted primarily at either desktop computer or wire-based technology contexts. Based on previous research, this study develops and validates a multidimensional instrument for measuring mobile computing self-efficacy (MCSE). This empirically validated instrument will be useful to researchers in developing and testing the theories of mobile user behavior, and to practitioners in assessing the mobile computing self-efficacy of users and promoting the use of mobile commerce systems.
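    Validating a multi-item instrument of this kind usually includes an internal-consistency check. A minimal sketch of Cronbach's alpha (hypothetical data, stdlib only; not the authors' actual procedure):

    ```python
    from statistics import pvariance

    def cronbach_alpha(items):
        """Cronbach's alpha from item-score lists (one list per item,
        respondents in the same order); population variances keep the
        textbook formula exact."""
        k = len(items)
        totals = [sum(scores) for scores in zip(*items)]      # per-respondent sums
        item_var = sum(pvariance(item) for item in items)
        return k / (k - 1) * (1.0 - item_var / pvariance(totals))

    # Two perfectly correlated items -> maximal internal consistency
    alpha = cronbach_alpha([[1, 2, 3, 4], [1, 2, 3, 4]])
    ```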

  4. Low-dose computed tomography image restoration using previous normal-dose scan

    International Nuclear Information System (INIS)

    Ma, Jianhua; Huang, Jing; Feng, Qianjin; Zhang, Hua; Lu, Hongbing; Liang, Zhengrong; Chen, Wufan

    2011-01-01

    Purpose: In current computed tomography (CT) examinations, the associated x-ray radiation dose is a significant concern to patients and operators. A simple and cost-effective way to lower the dose is to reduce the milliampere-seconds (mAs) or kVp parameter (i.e., deliver less x-ray energy to the body) as far as reasonably achievable in data acquisition. However, lowering the mAs parameter unavoidably increases data noise, and the noise propagates into the CT image if no adequate noise control is applied during image reconstruction. Since a previously scanned normal-dose, high-quality diagnostic CT image may be available in some clinical applications, such as CT perfusion imaging and CT angiography (CTA), this paper presents an innovative way to utilize the normal-dose scan as a priori information to guide signal restoration of the current low-dose CT image series. Methods: Unlike conventional local operations on neighboring image voxels, the nonlocal means (NLM) algorithm utilizes the redundancy of information across the whole image. This paper adapts the NLM algorithm to exploit the redundancy of information in the previous normal-dose scan and further explores ways to optimize the nonlocal weights for low-dose image restoration in the NLM framework. The resulting algorithm is called the previous normal-dose scan induced nonlocal means (ndiNLM). Because of the optimized calculation of the nonlocal weights, the ndiNLM algorithm does not depend heavily on image registration between the current low-dose and the previous normal-dose CT scans. Furthermore, the smoothing parameter of the ndiNLM algorithm can be adaptively estimated from the noise relationship between the current low-dose and the previous normal-dose scanning protocols. Results: Qualitative and quantitative evaluations were carried out on a physical phantom as well as clinical abdominal and brain perfusion CT scans in terms of accuracy and resolution properties.
The gain by the use
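    The prior-induced NLM idea can be sketched in one dimension: weights come from patch similarity between the low-dose signal and the normal-dose prior, and the restored value is a weighted average of prior samples. This toy (patch size, search window and smoothing parameter h are arbitrary choices, not the paper's) omits the adaptive parameter estimation:

    ```python
    import math

    def ndi_nlm_1d(low, prior, patch=1, search=3, h=0.5):
        """Toy 1-D previous-scan-induced non-local means: each restored
        sample averages prior-scan samples, weighted by similarity between
        the low-dose patch at i and prior patches in a search window."""
        n = len(low)

        def patch_at(signal, i):
            # clamp indices at the borders
            return [signal[min(max(j, 0), n - 1)] for j in range(i - patch, i + patch + 1)]

        restored = []
        for i in range(n):
            ref = patch_at(low, i)
            acc, wsum = 0.0, 0.0
            for j in range(max(0, i - search), min(n, i + search + 1)):
                d2 = sum((a - b) ** 2 for a, b in zip(ref, patch_at(prior, j))) / len(ref)
                w = math.exp(-d2 / (h * h))
                acc += w * prior[j]
                wsum += w
            restored.append(acc / wsum)
        return restored

    # With a clean (constant) prior, the toy estimate is drawn entirely from it
    restored = ndi_nlm_1d([1.0, 1.3, 0.8, 1.1, 1.0, 0.9], [1.0] * 6)
    ```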

  5. Approximator: Predicting Interruptibility in Software Development with Commodity Computers

    DEFF Research Database (Denmark)

    Tell, Paolo; Jalaliniya, Shahram; Andersen, Kristian S. M.

    2015-01-01

    Assessing the presence and availability of a remote colleague is key in coordination in global software development but is not easily done using existing computer-mediated channels. Previous research has shown that automated estimation of interruptibility is feasible and can achieve a precision....... These early but promising results represent a starting point for designing tools with support for interruptibility capable of improving distributed awareness and cooperation to be used in global software development....

  6. Experimental and computational development of a natural breast phantom for dosimetry studies

    International Nuclear Information System (INIS)

    Nogueira, Luciana B.; Campos, Tarcisio P.R.

    2013-01-01

    This paper describes the experimental and computational development of a natural breast phantom, anthropomorphic and anthropometric, for dosimetry studies in breast brachytherapy and teletherapy. The phantom corresponds to the fibroadipose breasts of women aged 30 to 50 years, presenting medium radiographic density. The experimental breast phantom consists of three tissue equivalents (TEs): a glandular TE, an adipose TE, and a skin TE. These TEs were formulated according to the chemical composition of the human breast and present a realistic radiological response to exposure. Once constructed, the experimental breast phantom was mounted on a thorax phantom previously developed by the NRI/UFMG research group. A computational breast phantom was then constructed by acquiring axial computed tomography (CT) slices of the thorax phantom. From the CT images, a voxel model of the thorax phantom was built with the SISCODES program, the computational breast phantom being represented by the same TEs as the experimental one. The CT images also allowed the radiological equivalence of the tissues to be evaluated. The breast phantom is being used in experimental dosimetry studies in both breast brachytherapy and teletherapy. Dosimetry studies with the MCNP-5 code using the computational breast phantom are in progress. (author)
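    The voxel-model construction step — assigning each CT voxel to one of the phantom's tissue equivalents — can be sketched as CT-number thresholding. The Hounsfield ranges below are illustrative assumptions, not SISCODES's actual values:

    ```python
    def classify_voxel(hu):
        """Map a CT number (Hounsfield units) to a tissue-equivalent label.
        Threshold ranges are illustrative, not the SISCODES values."""
        if hu < -250:
            return "air"
        if hu < -20:
            return "adipose TE"
        if hu < 80:
            return "glandular TE"
        return "skin TE"

    # A tiny 2x2 "slice" of CT numbers mapped to tissue labels
    slice_hu = [[-500, -90], [35, 120]]
    labels = [[classify_voxel(v) for v in row] for row in slice_hu]
    ```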

  7. Predictive factors for the development of diabetes in women with previous gestational diabetes mellitus

    DEFF Research Database (Denmark)

    Damm, P.; Kühl, C.; Bertelsen, Aksel

    1992-01-01

    OBJECTIVES: The purpose of this study was to determine the incidence of diabetes in women with previous dietary-treated gestational diabetes mellitus and to identify predictive factors for development of diabetes. STUDY DESIGN: Two to 11 years post partum, glucose tolerance was investigated in 241...... women with previous dietary-treated gestational diabetes mellitus and 57 women without previous gestational diabetes mellitus (control group). RESULTS: Diabetes developed in 42 (17.4%) women with previous gestational diabetes mellitus (3.7% insulin-dependent diabetes mellitus and 13.7% non...... of previous patients with gestational diabetes mellitus in whom plasma insulin was measured during an oral glucose tolerance test in late pregnancy a low insulin response at diagnosis was found to be an independent predictive factor for diabetes development. CONCLUSIONS: Women with previous dietary...

  8. The Research of the Parallel Computing Development from the Angle of Cloud Computing

    Science.gov (United States)

    Peng, Zhensheng; Gong, Qingge; Duan, Yanyu; Wang, Yun

    2017-10-01

    Cloud computing is the development of parallel computing, distributed computing and grid computing, and its growth has brought parallel computing into people's lives. Firstly, this paper expounds the concept of cloud computing and introduces several traditional parallel programming models. Secondly, it analyzes the principles, advantages and disadvantages of OpenMP, MPI and MapReduce respectively. Finally, it compares the MPI and OpenMP models with MapReduce from the angle of cloud computing. The results of this paper are intended to provide a reference for the development of parallel computing.
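    The MapReduce model compared above can be sketched as three explicit phases. This sequential Python toy (no framework assumed) shows the structure a word count follows; a real runtime would execute the map and reduce calls in parallel across machines:

    ```python
    from collections import defaultdict

    def map_phase(documents):
        """Map: emit (word, 1) pairs from each document independently
        (the part a framework would run in parallel)."""
        return [(w, 1) for doc in documents for w in doc.split()]

    def shuffle_phase(pairs):
        """Shuffle: group all emitted values by key."""
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(groups):
        """Reduce: aggregate each key's values."""
        return {key: sum(values) for key, values in groups.items()}

    counts = reduce_phase(shuffle_phase(map_phase(["a b a", "b c"])))
    ```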

  9. A randomised clinical trial of intrapartum fetal monitoring with computer analysis and alerts versus previously available monitoring

    Directory of Open Access Journals (Sweden)

    Santos Cristina

    2010-10-01

    Abstract Background: Intrapartum fetal hypoxia remains an important cause of death and permanent handicap, and in a significant proportion of cases there is evidence of suboptimal care related to fetal surveillance. Cardiotocographic (CTG) monitoring remains the basis of intrapartum surveillance, but its interpretation by healthcare professionals lacks reproducibility, and the technology has not been shown to improve clinically important outcomes. The addition of fetal electrocardiogram analysis has increased the potential to avoid adverse outcomes, but CTG interpretation remains its main weakness. A program for computerised analysis of intrapartum fetal signals, incorporating real-time alerts for healthcare professionals, has recently been developed. There is a need to determine whether this technology can result in better perinatal outcomes. Methods/design: This is a multicentre randomised clinical trial. Inclusion criteria are: women aged ≥ 16 years, able to provide written informed consent, singleton pregnancies ≥ 36 weeks, cephalic presentation, no known major fetal malformations, in labour but excluding active second stage, planned for continuous CTG monitoring, and no known contra-indication for vaginal delivery. Eligible women will be randomised using a computer-generated randomisation sequence to one of two arms: continuous computer analysis of fetal monitoring signals with real-time alerts (intervention arm) or continuous CTG monitoring as previously performed (control arm). Electrocardiographic monitoring and fetal scalp blood sampling will be available in both arms. The primary outcome measure is the incidence of fetal metabolic acidosis (umbilical artery pH < 7.05, BDecf > 12 mmol/L). Secondary outcome measures are: caesarean section and instrumental vaginal delivery rates, use of fetal blood sampling, and 5-minute Apgar score < 7. Discussion: This study will provide evidence of the impact of intrapartum monitoring with computer analysis and real
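    A computer-generated randomisation sequence for a two-arm trial like this one is often built from permuted blocks, which keep the arms balanced as recruitment proceeds. The sketch below is illustrative only, not the trial's actual procedure; the block size and seed are assumptions:

    ```python
    import random

    def permuted_blocks(n, block_size=4, seed=0):
        """Allocate n participants to two arms in randomly permuted blocks,
        so group sizes are equal after every complete block."""
        rng = random.Random(seed)  # fixed seed -> reproducible sequence
        arms = []
        while len(arms) < n:
            block = (["intervention"] * (block_size // 2)
                     + ["control"] * (block_size // 2))
            rng.shuffle(block)
            arms.extend(block)
        return arms[:n]

    allocation = permuted_blocks(8)
    ```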

  10. Development of multimedia computer-based training for VXI integrated fuel monitors

    International Nuclear Information System (INIS)

    Keeffe, R.; Ellacott, T.; Truong, Q.S.

    1999-01-01

    The Canadian Safeguards Support Program has developed the VXI Integrated Fuel Monitor (VFIM) which is based on the international VXI instrument bus standard. This equipment is a generic radiation monitor which can be used in an integrated mode where several detection systems can be connected to a common system where information is collected, displayed, and analyzed via a virtual control panel with the aid of computers, trackball and computer monitor. The equipment can also be used in an autonomous mode as a portable radiation monitor with a very low power consumption. The equipment has been described at previous international symposia. Integration of several monitoring systems (bundle counter, core discharge monitor, and yes/no monitor) has been carried out at Wolsong 2. Performance results from one of the monitoring systems which was installed at CANDU nuclear stations are discussed in a companion paper at this symposium. This paper describes the development of an effective multimedia computer-based training package for the primary users of the equipment; namely IAEA inspectors and technicians. (author)

  11. Emphysema and bronchiectasis in COPD patients with previous pulmonary tuberculosis: computed tomography features and clinical implications

    Directory of Open Access Journals (Sweden)

    Jin J

    2018-01-01

    Jianmin Jin,1 Shuling Li,2 Wenling Yu,2 Xiaofang Liu,1 Yongchang Sun1,3 1Department of Respiratory and Critical Care Medicine, Beijing Tongren Hospital, Capital Medical University, Beijing, 2Department of Radiology, Beijing Tongren Hospital, Capital Medical University, Beijing, 3Department of Respiratory and Critical Care Medicine, Peking University Third Hospital, Beijing, China. Background: Pulmonary tuberculosis (PTB) is a risk factor for COPD, but the clinical characteristics and the chest imaging features (emphysema and bronchiectasis) of COPD with previous PTB have not been studied well. Methods: The presence, distribution, and severity of emphysema and bronchiectasis in COPD patients with and without previous PTB were evaluated by high-resolution computed tomography (HRCT) and compared. Demographic data, respiratory symptoms, lung function, and sputum culture of Pseudomonas aeruginosa were also compared between patients with and without previous PTB. Results: A total of 231 COPD patients (82.2% ex- or current smokers, 67.5% male) were consecutively enrolled. Patients with previous PTB (45.0%) had more severe (p=0.045) and longer history (p=0.008) of dyspnea, more exacerbations in the previous year (p=0.011), and more positive cultures of P. aeruginosa (p=0.001), compared with those without PTB. Patients with previous PTB showed a higher prevalence of bronchiectasis (p<0.001), which was more significant in lungs with tuberculosis (TB) lesions, and a higher percentage of more severe bronchiectasis (Bhalla score ≥2, p=0.031), compared with those without previous PTB. The overall prevalence of emphysema was not different between patients with and without previous PTB, but in those with previous PTB, a higher number of subjects with middle (p=0.001) and lower (p=0.019) lobe emphysema, higher severity score (p=0.028), higher prevalence of panlobular emphysema (p=0.013), and more extensive centrilobular emphysema (p=0.039) were observed. Notably, in patients with

  12. Computer code development plan for SMART design

    International Nuclear Information System (INIS)

    Bae, Kyoo Hwan; Choi, S.; Cho, B.H.; Kim, K.K.; Lee, J.C.; Kim, J.P.; Kim, J.H.; Chung, M.; Kang, D.J.; Chang, M.H.

    1999-03-01

    In accordance with the localization plan for nuclear reactor design pursued since the mid-1980s, various computer codes have been transferred to the Korean nuclear industry through technical transfer programs with the world's major pressurized water reactor suppliers or through international code development programs. These computer codes have been successfully utilized in reactor and reload core design work, and design-related technologies have been satisfactorily accumulated as a result. However, activities to develop native codes to substitute for some important computer codes, whose usage is limited by the original owners of the techniques, have been carried out rather poorly. Thus, it is of the highest priority to secure native expertise in the computer code package and analysis methodology in order to establish the capability required for the independent design of our own reactor model. Moreover, unlike large-capacity loop-type commercial reactors, the SMART (SYSTEM-integrated Modular Advanced ReacTor) design adopts a single reactor pressure vessel containing the major primary components and has peculiar design characteristics such as a self-controlled gas pressurizer, a helical steam generator, and a passive residual heat removal system. Considering these peculiar design characteristics, part of the SMART design can be performed with the computer codes used for loop-type commercial reactor design. However, most of those codes are not directly applicable to the design of an integral reactor such as SMART, and they should be modified to deal with its peculiar design characteristics. In addition to these modification efforts, various codes should be developed in several design areas. Furthermore, the reliability of modified or newly developed codes should be verified through benchmarking or tests for the target design. Thus, it is necessary to proceed with the design according to the

  14. Development of computer-aided diagnosis systems in radiology

    International Nuclear Information System (INIS)

    Higashida, Yoshiharu; Arimura, Hidetaka; Kumazawa, Seiji; Morishita, Junji; Sakai, Shuji

    2006-01-01

    Computer-aided diagnosis (CAD) refers to diagnosis made by physicians who use computer image analysis as a second opinion, and CAD studies have been adopted as government projects. CAD is already in routine use for cancers of the breast on mammography, of the lung on plain radiographs and CT images, and of the large bowel on CT colonography. This paper describes four examples of the authors' CAD investigations. First, temporal subtraction image analysis detects abnormality in the chest by comparing radiographs taken at different times; examples are shown for interstitial pneumonia and lung cancer among 34 patients with diffuse lung disease. Second, a CAD system was developed for detection of aneurysms on brain MR angiography (MRA). Third, CAD detection of fiber bundles in the cerebral white matter on diffusion tensor MRI can assist surgery for brain tumors. Finally, an automated patient-recognition scheme, based on an image-matching technique using previous chest radiographs in picture archiving and communication systems, treats the radiograph as a biological fingerprint of the patient. CAD will be applied in a wider field of medical care, not only in imaging technology. (T.I)
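    The temporal subtraction technique named above — register the previous radiograph to the current one, then subtract so new findings stand out — can be sketched in one dimension. The brute-force integer-shift registration here is a deliberate simplification of the nonlinear warping used in practice:

    ```python
    def temporal_subtraction(current, previous, max_shift=3):
        """Find the integer shift that best aligns `previous` to `current`
        (minimum mean squared difference over the overlap), then subtract."""
        n = len(current)

        def ssd(shift):
            pairs = [(current[i], previous[i - shift])
                     for i in range(n) if 0 <= i - shift < n]
            return sum((a - b) ** 2 for a, b in pairs) / len(pairs)

        best = min(range(-max_shift, max_shift + 1), key=ssd)
        return [current[i] - previous[i - best] if 0 <= i - best < n else 0.0
                for i in range(n)]

    # A new "lesion" (the 9) stands out once the shifted prior is subtracted
    diff = temporal_subtraction([0, 1, 2, 9, 2, 1, 0], [1, 2, 3, 2, 1, 0, 0])
    ```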

  15. COMPUTATIONAL MODELS FOR SUSTAINABLE DEVELOPMENT

    OpenAIRE

    Monendra Grover; Rajesh Kumar; Tapan Kumar Mondal; S. Rajkumar

    2011-01-01

    Genetic erosion is a serious problem and computational models have been developed to prevent it. The computational modeling in this field not only includes (terrestrial) reserve design, but also decision modeling for related problems such as habitat restoration, marine reserve design, and nonreserve approaches to conservation management. Models have been formulated for evaluating tradeoffs between socioeconomic, biophysical, and spatial criteria in establishing marine reserves. The percolatio...

  16. Implementing and developing cloud computing applications

    CERN Document Server

    Sarna, David E Y

    2010-01-01

    From small start-ups to major corporations, companies of all sizes have embraced cloud computing for the scalability, reliability, and cost benefits it can provide. It has even been said that cloud computing may have a greater effect on our lives than the PC and dot-com revolutions combined. Filled with comparative charts and decision trees, Implementing and Developing Cloud Computing Applications explains exactly what it takes to build robust and highly scalable cloud computing applications in any organization. Covering the major commercial offerings available, it provides authoritative guidan

  17. Wide-angle display developments by computer graphics

    Science.gov (United States)

    Fetter, William A.

    1989-01-01

    Computer graphics can now expand its new subset, wide-angle projection, to be as significant a generic capability as computer graphics itself. Some prior work in computer graphics is presented which leads to an attractive further subset of wide-angle projection, called hemispheric projection, to be a major communication medium. Hemispheric film systems have long been present, and such computer graphics systems are in use in simulators. This is the leading edge of capabilities which should ultimately be as ubiquitous as CRTs (cathode-ray tubes). These assertions derive not from degrees in science, nor solely from a degree in graphic design, but from a history of computer graphics innovations that laid groundwork by demonstration. The author believes that it is timely to look at several development strategies, since hemispheric projection is now at a point comparable to the early stages of computer graphics, requiring similar patterns of development again.

  18. Developing the next generation of diverse computer scientists: the need for enhanced, intersectional computing identity theory

    Science.gov (United States)

    Rodriguez, Sarah L.; Lehman, Kathleen

    2017-10-01

    This theoretical paper explores the need for enhanced, intersectional computing identity theory for the purpose of developing a diverse group of computer scientists for the future. Greater theoretical understanding of the identity formation process specifically for computing is needed in order to understand how students come to understand themselves as computer scientists. To ensure that the next generation of computer scientists is diverse, this paper presents a case for examining identity development intersectionally, understanding the ways in which women and underrepresented students may have difficulty identifying as computer scientists and be systematically oppressed in their pursuit of computer science careers. Through a review of the available scholarship, this paper suggests that creating greater theoretical understanding of the computing identity development process will inform the way in which educational stakeholders consider computer science practices and policies.

  19. Computing in Qualitative Analysis: A Healthy Development?

    Science.gov (United States)

    Richards, Lyn; Richards, Tom

    1991-01-01

    Discusses the potential impact of computers in qualitative health research. Describes the original goals, design, and implementation of NUDIST, a qualitative computing software. Argues for evaluation of the impact of computer techniques and for an opening of debate among program developers and users to address the purposes and power of computing…

  20. Development of a computer-aided digital reactivity computer system for PWRs

    International Nuclear Information System (INIS)

    Chung, S.-K.; Sung, K.-Y.; Kim, D.; Cho, D.-Y.

    1993-01-01

    Reactor physics tests at initial startup and after reloading are performed to verify the nuclear design and to ensure safe operation. Two kinds of reactivity computers, analog and digital, have been widely used in pressurized water reactor (PWR) core physics tests. The test data of both reactivity computers are displayed only on a strip chart recorder, and these data are managed by hand, so the accuracy of the test results depends on operator expertise and experience. This paper describes the development of the computer-aided digital reactivity computer system (DRCS), which is enhanced by system management software and an improved system for the application of the PWR core physics test
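    A digital reactivity computer of this kind typically derives reactivity from the measured power signal via inverse point kinetics. A minimal sketch follows; the six-group delayed-neutron constants are illustrative textbook-style values, not the DRCS's actual parameters:

    ```python
    def inverse_kinetics(power, dt, beta_i, lam_i, Lambda):
        """Reactivity history from a sampled power signal via inverse point
        kinetics; precursor concentrations are integrated implicitly."""
        beta = sum(beta_i)
        # precursors start at equilibrium with the first power sample
        C = [b * power[0] / (Lambda * l) for b, l in zip(beta_i, lam_i)]
        rho = []
        for k in range(1, len(power)):
            P = power[k]
            dPdt = (power[k] - power[k - 1]) / dt
            # implicit Euler update of each precursor group
            C = [(c + dt * b * P / Lambda) / (1.0 + dt * l)
                 for c, b, l in zip(C, beta_i, lam_i)]
            decay = sum(l * c for l, c in zip(lam_i, C))
            rho.append(beta + Lambda * dPdt / P - Lambda * decay / P)
        return rho

    # Illustrative six-group constants; steady power must read ~zero reactivity
    beta_i = [0.00021, 0.00142, 0.00127, 0.00257, 0.00075, 0.00027]
    lam_i = [0.0124, 0.0305, 0.111, 0.301, 1.14, 3.01]
    rho = inverse_kinetics([1.0] * 50, 0.1, beta_i, lam_i, 2.0e-5)
    ```

    Initializing the precursors at equilibrium makes constant power map to zero reactivity, which is the usual sanity check for such a system.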

  1. Computer Graphics for Multimedia and Hypermedia Development.

    Science.gov (United States)

    Mohler, James L.

    1998-01-01

    Discusses several theoretical and technical aspects of computer-graphics development that are useful for creating hypermedia and multimedia materials. Topics addressed include primary bitmap attributes in computer graphics, the jigsaw principle, and raster layering. (MSE)

  2. Human-Computer Interfaces for Wearable Computers: A Systematic Approach to Development and Evaluation

    OpenAIRE

    Witt, Hendrik

    2007-01-01

    The research presented in this thesis examines user interfaces for wearable computers. Wearable computers are a special kind of mobile computers that can be worn on the body. Furthermore, they integrate themselves even more seamlessly into different activities than a mobile phone or a personal digital assistant can. The thesis investigates the development and evaluation of user interfaces for wearable computers. In particular, it presents fundamental research results as well as supporting softw...

  3. Development of Computer Science Disciplines - A Social Network Analysis Approach

    OpenAIRE

    Pham, Manh Cuong; Klamma, Ralf; Jarke, Matthias

    2011-01-01

    In contrast to many other scientific disciplines, computer science attaches great importance to conference publications. Conferences have the advantage of providing fast publication of papers and of bringing researchers together to present and discuss the paper with peers. Previous work on knowledge mapping focused on the map of all sciences or of a particular domain based on the ISI-published JCR (Journal Citation Report). Although this data covers most of the important journals, it lacks computer science conference and ...

  4. Editorial for special section of grid computing journal on “Cloud Computing and Services Science”

    NARCIS (Netherlands)

    van Sinderen, Marten J.; Ivanov, Ivan I.

    This editorial briefly discusses characteristics, technology developments and challenges of cloud computing. It then introduces the papers included in the special issue on "Cloud Computing and Services Science" and positions the work reported in these papers with respect to the previously mentioned

  5. Computer-assisted Particle-in-Cell code development

    International Nuclear Information System (INIS)

    Kawata, S.; Boonmee, C.; Teramoto, T.; Drska, L.; Limpouch, J.; Liska, R.; Sinor, M.

    1997-12-01

    This report presents a new approach to electromagnetic Particle-in-Cell (PIC) code development by computer: in general, PIC codes have a common structure, consisting of a particle pusher, a field solver, charge and current density collection, and field interpolation. Because of this common structure, the main part of a PIC code can be generated mechanically by a computer. In this report we use the FIDE and GENTRAN packages of the REDUCE computer algebra system for the discretization of the field equations and the particle equation, and for automatic generation of Fortran code. The proposed approach is successfully applied to the development of a 1.5-dimensional PIC code. Using the generated PIC code, the Weibel instability in a plasma is simulated; the obtained growth rate agrees well with the theoretical value. (author)
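    Of the common PIC building blocks listed above, the particle pusher and field interpolation are the most mechanical. The sketch below is hand-written for illustration (not REDUCE-generated output) and shows a leapfrog push with linear (cloud-in-cell) interpolation on a periodic 1-D grid:

    ```python
    def push_particles(x, v, E_grid, dx, dt, qm, L):
        """One leapfrog step: interpolate E to each particle with linear
        (CIC) weights on a periodic 1-D grid, kick velocity, drift position."""
        nx = len(E_grid)
        new_x, new_v = [], []
        for xi, vi in zip(x, v):
            s = xi / dx
            j = int(s) % nx
            frac = s - int(s)
            # linear interpolation between the two surrounding grid points
            E = (1.0 - frac) * E_grid[j] + frac * E_grid[(j + 1) % nx]
            vi = vi + qm * E * dt          # kick
            xi = (xi + vi * dt) % L        # drift in a periodic box
            new_x.append(xi)
            new_v.append(vi)
        return new_x, new_v

    # One particle, zero field: velocity unchanged, position drifts by v*dt
    x, v = push_particles([0.5], [1.0], [0.0, 0.0, 0.0, 0.0], 0.25, 0.1, -1.0, 1.0)
    ```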

  6. Development of emission computed tomography in Japan

    International Nuclear Information System (INIS)

    Tanaka, E.

    1984-01-01

    Two positron emission computed tomography (PCT) devices developed in Japan are described: one for the head and the other for the whole body. The devices produce fairly quantitative images with slight modifications of existing algorithms because they were developed based on filtered back-projection. The PCT device seems to be better than single photon emission computed tomography (SPECT), since it provides adequate compensation for photon attenuation in patients. (M.A.C.) [pt

  7. Computer aided training system development

    International Nuclear Information System (INIS)

    Midkiff, G.N.

    1987-01-01

    The first three phases of Training System Development (TSD) -- job and task analysis, curriculum design, and training material development -- are time consuming and labor intensive. The use of personal computers with a combination of commercial and custom-designed software resulted in a significant reduction in the man-hours required to complete these phases for a Health Physics Technician Training Program at a nuclear power station. This paper reports that each step in the training program project involved the use of personal computers: job survey data were compiled with a statistical package, task analysis was performed with custom software designed to interface with a commercial database management program. Job Performance Measures (tests) were generated by a custom program from data in the task analysis database, and training materials were drafted, edited, and produced using commercial word processing software

  8. Development of a mechanistically based computer simulation of nitrogen oxide absorption in packed towers

    International Nuclear Information System (INIS)

    Counce, R.M.

    1981-01-01

    A computer simulation for nitrogen oxide (NOx) scrubbing in packed towers was developed for use in process design and process control. This simulation implements a mechanistically based mathematical model, which was formulated from (1) an exhaustive literature review; (2) previous NOx scrubbing experience with sieve-plate towers; and (3) comparisons of sequential sets of experiments. Nitrogen oxide scrubbing is characterized by simultaneous absorption and desorption phenomena: the model development is based on experiments designed to feature these two phenomena. The model was then successfully tested in experiments designed to put it in jeopardy
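    The absorption behaviour such a model captures can be caricatured by a first-order approach to gas-liquid equilibrium integrated along the packed height. This stdlib-Python sketch, with invented coefficients, is only an illustration of that kind of balance, not the report's mechanistic NOx model (which also treats desorption and multiple species).

```python
def scrub_profile(c_in, c_eq, k, height, steps):
    """Explicit Euler march of dC/dz = -k * (C - C_eq) along the tower height.
    c_in: inlet gas-phase concentration, c_eq: equilibrium concentration,
    k: lumped mass-transfer coefficient per unit height (all hypothetical units).
    Returns the concentration profile from inlet to outlet."""
    dz = height / steps
    c = c_in
    profile = [c]
    for _ in range(steps):
        c += -k * (c - c_eq) * dz
        profile.append(c)
    return profile
```

    With a desorption term of opposite sign added for a second species, the same marching scheme would exhibit the simultaneous absorption/desorption character the abstract describes.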

  9. Cloud Computing-An Ultimate Technique to Minimize Computing cost for Developing Countries

    OpenAIRE

    Narendra Kumar; Shikha Jain

    2012-01-01

    This paper deals with how remotely managed computing and IT resources can benefit developing countries such as India and the countries of the Asian subcontinent. It not only defines the architectures and functionalities of cloud computing but also strongly indicates the current demand for cloud computing to provide organizational and personal IT support at very low cost with a high degree of flexibility. The power of the cloud can be used to reduce the cost of IT - r...

  10. Development of computational science in JAEA. R and D of simulation

    International Nuclear Information System (INIS)

    Nakajima, Norihiro; Araya, Fumimasa; Hirayama, Toshio

    2006-01-01

    R and D of computational science at JAEA (Japan Atomic Energy Agency) is described. The computing environment, the R and D system at CCSE (Center for Computational Science and e-Systems), joint computational science research in Japan and worldwide, development of computer technologies, examples of simulation research, the 3-dimensional image vibrational platform system, simulation research on FBR cycle techniques, simulation of large-scale thermal stress for steam generator development, simulation research on fusion energy techniques, development of grid computing technology, simulation research on quantum beam techniques, and biological molecule simulation research are explained. The organization of JAEA, the development of computational science at JAEA, the JAEA network, international collaboration in computational science, and the environment of the ITBL (Information-Technology Based Laboratory) project are illustrated. (S.Y.)

  11. Development of a small-scale computer cluster

    Science.gov (United States)

    Wilhelm, Jay; Smith, Justin T.; Smith, James E.

    2008-04-01

    An increase in demand for computing power in academia has created a need for high-performance machines. The computing power of a single processor has been steadily increasing, but lags behind the demand for fast simulations. Since a single processor has hard limits to its performance, a cluster of computers can, with the proper software, multiply the performance of a single computer. Cluster computing has therefore become a much sought after technology. Typical desktop computers could be used for cluster computing, but are not intended for constant full-speed operation and take up more space than rack mount servers. Specialty computers that are designed to be used in clusters meet high availability and space requirements, but can be costly. A market segment exists where custom-built desktop computers can be arranged in a rack mount configuration, gaining the space savings of traditional rack mount computers while remaining cost effective. To explore these possibilities, an experiment was performed to develop a computing cluster using desktop components for the purpose of decreasing the computation time of advanced simulations. This study indicates that a small-scale cluster can be built from off-the-shelf components that multiplies the performance of a single desktop machine, while minimizing occupied space and still remaining cost effective.
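    The software side of such a cluster boils down to partitioning a problem across nodes and reducing the partial results. This hedged stdlib-Python sketch shows only that split/compute/merge logic in miniature (here mapped sequentially; a real cluster would dispatch the chunks over MPI or similar middleware, and the per-node workload is a stand-in).

```python
def split_domain(n_cells, n_nodes):
    """Partition a 1-D domain of n_cells into contiguous chunks, one per node."""
    base, extra = divmod(n_cells, n_nodes)
    chunks, start = [], 0
    for rank in range(n_nodes):
        size = base + (1 if rank < extra else 0)
        chunks.append(range(start, start + size))
        start += size
    return chunks

def node_work(chunk):
    """Stand-in for the per-node computation: sum of squares over local cells."""
    return sum(i * i for i in chunk)

def run_cluster(n_cells, n_nodes):
    """Map chunks to nodes and reduce the partial results into a global answer."""
    return sum(node_work(chunk) for chunk in split_domain(n_cells, n_nodes))
```

    Because the chunks are disjoint and cover the domain, the clustered result matches the serial one regardless of the node count — the property that lets more nodes trade directly for wall-clock time.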

  12. Development of integrated platform for computational material design

    Energy Technology Data Exchange (ETDEWEB)

    Kiyoshi, Matsubara; Kumi, Itai; Nobutaka, Nishikawa; Akifumi, Kato [Center for Computational Science and Engineering, Fuji Research Institute Corporation (Japan); Hideaki, Koike [Advance Soft Corporation (Japan)

    2003-07-01

    The goal of our project is to design and develop a problem-solving environment (PSE) that will help computational scientists and engineers develop large complicated application software and simulate complex phenomena by using networking and parallel computing. The integrated platform, which is designed for PSE in the Japanese national project of Frontier Simulation Software for Industrial Science, is defined by supporting the entire range of problem-solving activity from program formulation and data setup to numerical simulation, data management, and visualization. A special feature of our integrated platform is a new architecture called TASK FLOW. It integrates computational resources such as hardware and software on the network and supports complex and large-scale simulation. This concept is applied to computational material design and to the project 'comprehensive research for modeling, analysis, control, and design of large-scale complex system considering properties of human being'. Moreover, this system will provide the best solution for developing large and complicated software and simulating complex and large-scale phenomena in computational science and engineering. A prototype has already been developed, and validation and verification of the integrated platform are scheduled for 2003 using the prototype. In the validation and verification, a fluid-structure coupling analysis system for designing an industrial machine will be developed on the integrated platform. As further examples of validation and verification, integrated platforms for quantum chemistry and bio-mechanical systems are planned.

  13. Development of integrated platform for computational material design

    International Nuclear Information System (INIS)

    Kiyoshi, Matsubara; Kumi, Itai; Nobutaka, Nishikawa; Akifumi, Kato; Hideaki, Koike

    2003-01-01

    The goal of our project is to design and develop a problem-solving environment (PSE) that will help computational scientists and engineers develop large complicated application software and simulate complex phenomena by using networking and parallel computing. The integrated platform, which is designed for PSE in the Japanese national project of Frontier Simulation Software for Industrial Science, is defined by supporting the entire range of problem-solving activity from program formulation and data setup to numerical simulation, data management, and visualization. A special feature of our integrated platform is a new architecture called TASK FLOW. It integrates computational resources such as hardware and software on the network and supports complex and large-scale simulation. This concept is applied to computational material design and to the project 'comprehensive research for modeling, analysis, control, and design of large-scale complex system considering properties of human being'. Moreover, this system will provide the best solution for developing large and complicated software and simulating complex and large-scale phenomena in computational science and engineering. A prototype has already been developed, and validation and verification of the integrated platform are scheduled for 2003 using the prototype. In the validation and verification, a fluid-structure coupling analysis system for designing an industrial machine will be developed on the integrated platform. As further examples of validation and verification, integrated platforms for quantum chemistry and bio-mechanical systems are planned

  14. Development of a computational database for probabilistic safety assessment of nuclear research reactors

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, Vagner S.; Oliveira, Patricia S. Pagetti de; Andrade, Delvonei Alves de, E-mail: vagner.macedo@usp.br, E-mail: patricia@ipen.br, E-mail: delvonei@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    The objective of this work is to describe the database being developed at IPEN - CNEN / SP for application in the Probabilistic Safety Assessment of nuclear research reactors. The database can be accessed by means of a computational program installed in the corporate computer network, named IPEN Intranet, and access is allowed only to previously registered professionals. Data updating, editing and searching tasks are controlled by a system administrator according to IPEN Intranet security rules. The logical model and the physical structure of the database can be represented by an Entity Relationship Model, which is based on the operational routines performed by IPEN - CNEN / SP users. The web application designed for the management of the database is named PSADB. It is being developed with the MySQL database software and the PHP programming language. Data stored in this database are divided into modules that refer to technical specifications, operating history, maintenance history and failure events associated with the main components of the nuclear facilities. (author)
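    The module layout the abstract describes (components linked to specifications and event histories) maps naturally onto a relational schema. The sketch below is a hypothetical miniature in stdlib Python with SQLite — the actual PSADB uses MySQL and PHP, and these table and column names are invented for illustration only.

```python
import sqlite3

# Hypothetical schema: one component table referenced by per-module tables,
# echoing the "technical specifications" and "failure events" modules.
SCHEMA = """
CREATE TABLE component (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL,
    facility TEXT NOT NULL
);
CREATE TABLE technical_spec (
    component_id INTEGER REFERENCES component(id),
    parameter TEXT NOT NULL,
    value TEXT NOT NULL
);
CREATE TABLE failure_event (
    component_id INTEGER REFERENCES component(id),
    occurred_on TEXT NOT NULL,   -- ISO date string
    description TEXT NOT NULL
);
"""

def build_db():
    """Create an in-memory database with the sketch schema."""
    conn = sqlite3.connect(":memory:")
    conn.executescript(SCHEMA)
    return conn

def failure_count(conn, component_name):
    """Number of recorded failure events for a named component,
    the kind of query a PSA reliability analysis would issue."""
    row = conn.execute(
        "SELECT COUNT(*) FROM failure_event e "
        "JOIN component c ON c.id = e.component_id "
        "WHERE c.name = ?", (component_name,)).fetchone()
    return row[0]
```

    Keeping each module in its own table, joined through the component id, is one way to realize the Entity Relationship Model the abstract mentions.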

  15. Development of a computational database for probabilistic safety assessment of nuclear research reactors

    International Nuclear Information System (INIS)

    Macedo, Vagner S.; Oliveira, Patricia S. Pagetti de; Andrade, Delvonei Alves de

    2015-01-01

    The objective of this work is to describe the database being developed at IPEN - CNEN / SP for application in the Probabilistic Safety Assessment of nuclear research reactors. The database can be accessed by means of a computational program installed in the corporate computer network, named IPEN Intranet, and access is allowed only to previously registered professionals. Data updating, editing and searching tasks are controlled by a system administrator according to IPEN Intranet security rules. The logical model and the physical structure of the database can be represented by an Entity Relationship Model, which is based on the operational routines performed by IPEN - CNEN / SP users. The web application designed for the management of the database is named PSADB. It is being developed with the MySQL database software and the PHP programming language. Data stored in this database are divided into modules that refer to technical specifications, operating history, maintenance history and failure events associated with the main components of the nuclear facilities. (author)

  16. Development Of A Navier-Stokes Computer Code

    Science.gov (United States)

    Yoon, Seokkwan; Kwak, Dochan

    1993-01-01

    Report discusses aspects of development of CENS3D computer code, solving three-dimensional Navier-Stokes equations of compressible, viscous, unsteady flow. Implements implicit finite-difference or finite-volume numerical-integration scheme, called "lower-upper symmetric-Gauss-Seidel" (LU-SGS), offering potential for very low computer time per iteration and for fast convergence.
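    The "lower-upper symmetric-Gauss-Seidel" idea — a forward sweep through the lower-triangular part followed by a backward sweep through the upper part — can be shown on a tiny dense linear system. This stdlib-Python sketch is a loose illustration of the symmetric sweep pattern only; the actual LU-SGS scheme in CENS3D is an approximate-factorization implicit update of the discretized Navier-Stokes equations, not a dense solver.

```python
def sym_gauss_seidel(A, b, iters=50):
    """Symmetric Gauss-Seidel for A x = b on a small dense system:
    each iteration does a forward (lower) sweep then a backward (upper) sweep."""
    n = len(b)
    x = [0.0] * n
    for _ in range(iters):
        for i in range(n):                      # forward sweep
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - s) / A[i][i]
        for i in reversed(range(n)):            # backward sweep
            s = sum(A[i][j] * x[j] for j in range(n) if j != i)
            x[i] = (b[i] - s) / A[i][i]
    return x
```

    The per-iteration cost is just two triangular sweeps with no matrix factorization, which is the source of the low cost per iteration the report highlights.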

  17. Implementation of an electronic medical record system in previously computer-naïve primary care centres: a pilot study from Cyprus.

    Science.gov (United States)

    Samoutis, George; Soteriades, Elpidoforos S; Kounalakis, Dimitris K; Zachariadou, Theodora; Philalithis, Anastasios; Lionis, Christos

    2007-01-01

    The computer-based electronic medical record (EMR) is an essential new technology in health care, contributing to high-quality patient care and efficient patient management. The majority of southern European countries, however, have not yet implemented universal EMR systems and many efforts are still ongoing. We describe the development of an EMR system and its pilot implementation and evaluation in two previously computer-naïve public primary care centres in Cyprus. One urban and one rural primary care centre along with their personnel (physicians and nurses) were selected to participate. Both qualitative and quantitative evaluation tools were used during the implementation phase. Qualitative data analysis was based on the framework approach, whereas quantitative assessment was based on a nine-item questionnaire and EMR usage parameters. Two public primary care centres participated, and a total of ten health professionals served as EMR system evaluators. Physicians and nurses rated EMR relatively highly, while patients were the most enthusiastic supporters of the new information system. Major implementation impediments were the physicians' perceptions that EMR usage negatively affected their workflow, physicians' legal concerns, lack of incentives, system breakdowns, software design problems, transition difficulties and lack of familiarity with electronic equipment. The importance of combining qualitative and quantitative evaluation tools is highlighted. More efforts are needed for the universal adoption and routine use of EMR in the primary care system of Cyprus as several barriers to adoption exist; however, none is insurmountable. Computerised systems could improve efficiency and quality of care in Cyprus, benefiting the entire population.

  18. A Brief Analysis of Development Situations and Trend of Cloud Computing

    Science.gov (United States)

    Yang, Wenyan

    2017-12-01

    In recent years, the rapid development of Internet technology has radically changed people's work, learning and lifestyles. More and more activities are completed by means of computers and networks. The amount of information and data generated grows day by day, and people rely increasingly on computers, so that the computing power of a single computer fails to meet demands for accuracy and speed. Cloud computing technology has experienced fast development and is widely applied in the computer industry as a result of its advantages of high precision, fast computing and ease of use; moreover, it has become a focus of information research at present. In this paper, the development situation and trend of cloud computing are analyzed and researched.

  19. Southampton uni's computer whizzes develop "mini" grid

    CERN Multimedia

    Sherriff, Lucy

    2006-01-01

    "In a bid to help its students explore the potential of grid computing, the University of Southampton's Computer Science department has developed what it calls a "lightweight grid". The system has been designed to allow students to experiment with grid technology without the complexity and inherent security concerns of the real thing. (1 page)

  20. Fluid dynamics parallel computer development at NASA Langley Research Center

    Science.gov (United States)

    Townsend, James C.; Zang, Thomas A.; Dwoyer, Douglas L.

    1987-01-01

    To accomplish more detailed simulations of highly complex flows, such as the transition to turbulence, fluid dynamics research requires computers much more powerful than any available today. Only parallel processing on multiple-processor computers offers hope for achieving the required effective speeds. Looking ahead to the use of these machines, the fluid dynamicist faces three issues: algorithm development for near-term parallel computers, architecture development for future computer power increases, and assessment of possible advantages of special purpose designs. Two projects at NASA Langley address these issues. Software development and algorithm exploration is being done on the FLEX/32 Parallel Processing Research Computer. New architecture features are being explored in the special purpose hardware design of the Navier-Stokes Computer. These projects are complementary and are producing promising results.

  1. Development of a computer design system for HVAC

    International Nuclear Information System (INIS)

    Miyazaki, Y.; Yotsuya, M.; Hasegawa, M.

    1993-01-01

    The development of a computer design system for HVAC (Heating, Ventilating and Air Conditioning) is presented in this paper. It supports the air conditioning design for a nuclear power plant and a reprocessing plant. This system integrates various computer design systems which were developed separately for the various design phases of HVAC. The purposes include centralizing the HVAC data, optimizing the design, and reducing the design time. The centralized HVAC data are managed by a DBMS (Data Base Management System). The DBMS separates the computer design system into a calculation module and the data. The design system can thus be expanded easily in the future. 2 figs

  2. Development of real-time visualization system for Computational Fluid Dynamics on parallel computers

    International Nuclear Information System (INIS)

    Muramatsu, Kazuhiro; Otani, Takayuki; Matsumoto, Hideki; Takei, Toshifumi; Doi, Shun

    1998-03-01

    A real-time visualization system for computational fluid dynamics was developed on a network connecting a parallel computing server and a client terminal. Using the system, a user can visualize the results of a CFD (Computational Fluid Dynamics) simulation on the client terminal while the computation is actually running on the parallel server. Using a GUI (Graphical User Interface) on the client terminal, the user is also able to change parameters of the analysis and visualization in real time during the calculation. The system carries out both the CFD simulation and the generation of pixel image data on the parallel computer, and compresses the data; the amount of data sent from the parallel computer to the client is therefore so small compared with the uncompressed case that the user can enjoy swift image display. Parallelization of image data generation is based on the Owner Computation Rule. The GUI on the client is built as a Java applet, so real-time visualization is possible on a client PC provided only that a Web browser is installed. (author)
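    The server-side compression step that keeps the image stream small can be sketched with any lossless codec. This stdlib-Python sketch uses zlib on an 8-bit grayscale frame purely as an illustration of the compress-before-send idea; the actual system's image format and codec are not specified in the abstract.

```python
import zlib

def compress_frame(pixels):
    """Pack an 8-bit grayscale frame (list of ints 0-255) and deflate it,
    as a server would before sending the frame over the network."""
    raw = bytes(pixels)
    return zlib.compress(raw, 6)

def decompress_frame(blob):
    """Client-side inverse of compress_frame: recover the pixel list."""
    return list(zlib.decompress(blob))
```

    Rendered CFD frames typically contain large uniform regions, so lossless compression shrinks them substantially — which is why the compressed stream lets the client keep up with the server in real time.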

  3. Development of a Computer Program for the Integrated Control of the Fuel Homogeneity Measurement System

    International Nuclear Information System (INIS)

    Shin, H. S.; Jang, J. W.; Lee, Y. H.; Oh, S. J.; Park, H. D.; Kim, C. K.

    2005-11-01

    The computer program was developed based on Visual C++ and is equipped with a user-friendly input/output (I/O) interface and a display function for the measuring conditions. The program consists of three parts: port communication, PLC (Programmable Logic Controller) control, and MCA (Multi Channel Analyzer) control. The communication between the CPU of the PLC module box and the computer uses the RS-232 asynchronous protocol, and the thread method is adopted in the first part of the program. The PLC-related part was developed so that data communication between the PLC CPU and the computer is harmonized with the commands already defined in the PLC. The measuring space and time intervals, the start and end ROI (region of interest) values, and the allowable error limit are input for each measurement. Finally, the MCA control part was developed using Canberra's programming library, which contains several files, including header files in which C++ variables and functions are declared according to the MCA functions. A performance test was carried out by applying the developed program to the homogeneity measurement system: the gamma counts at 28 measuring points along a fuel rod of 700 mm in length were measured for 50 sec at each point. The results were better than the previous ones in respect of measurement accuracy, and a saving in measurement time was achieved. It was concluded that the gamma measurement system can be improved by equipping it with the developed control program
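    The "thread method" for asynchronous port communication — a dedicated thread draining a command queue and posting responses — can be sketched with stdlib queues. This Python sketch is a hypothetical stand-in for the RS-232 link and PLC command set (the command strings and handler are invented), showing only the threading pattern.

```python
import queue
import threading

def serial_worker(cmd_q, resp_q, handler):
    """Worker thread: drain commands from cmd_q and post handler(cmd)
    responses on resp_q. A None command shuts the worker down.
    (handler stands in for the actual serial write/read round trip.)"""
    while True:
        cmd = cmd_q.get()
        if cmd is None:
            break
        resp_q.put(handler(cmd))

def send_commands(cmds, handler):
    """Queue a batch of commands, wait for the worker, collect responses in order."""
    cmd_q, resp_q = queue.Queue(), queue.Queue()
    t = threading.Thread(target=serial_worker, args=(cmd_q, resp_q, handler))
    t.start()
    for c in cmds:
        cmd_q.put(c)
    cmd_q.put(None)          # sentinel: no more commands
    t.join()
    return [resp_q.get() for _ in cmds]
```

    Keeping the slow serial I/O on its own thread is what lets the GUI stay responsive while a measurement sequence runs.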

  4. Preliminary development of a global 3-D magnetohydrodynamic computational model for solar wind-cometary and planetary interactions

    International Nuclear Information System (INIS)

    Stahara, S.S.

    1986-05-01

    This is the final summary report by Resource Management Associates, Inc., of the first year's work under Contract No. NASW-4011 to the National Aeronautics and Space Administration. The work under this initial phase of the contract relates to the preliminary development of a global, 3-D magnetohydrodynamic computational model to quantitatively describe the detailed continuum field and plasma interaction process of the solar wind with cometary and planetary bodies throughout the solar system. The work extends a highly-successful, observationally-verified computational model previously developed by the author, and is appropriate for the global determination of supersonic, super-Alfvenic solar wind flows past planetary obstacles. This report provides a concise description of the problems studied, a summary of all the important research results, and copies of the publications

  5. Reactor safety computer code development at INEL

    International Nuclear Information System (INIS)

    Johnsen, G.W.

    1985-01-01

    This report provides a brief overview of the computer code development programs being conducted at EG&G Idaho, Inc. on behalf of the US Nuclear Regulatory Commission and the Department of Energy, Idaho Operations Office. Included are descriptions of the codes being developed, their development status as of the date of this report, and resident code development expertise

  6. Developments in Remote Collaboration and Computation

    International Nuclear Information System (INIS)

    Burruss, J.R.; Abla, G.; Flanagan, S.; Keahey, K.; Leggett, T.; Ludesche, C.; McCune, D.; Papka, M.E.; Peng, Q.; Randerson, L.; Schissel, D.P.

    2005-01-01

    The National Fusion Collaboratory (NFC) is creating and deploying collaborative software tools to unite magnetic fusion research in the United States. In particular, the NFC is developing and deploying a national FES 'Grid' (FusionGrid) for secure sharing of computation, visualization, and data resources over the Internet. The goal of FusionGrid is to allow scientists at remote sites to participate as fully in experiments, machine design, and computational activities as if they were working on site thereby creating a unified virtual organization of the geographically dispersed U.S. fusion community

  7. Recent computational chemistry

    Energy Technology Data Exchange (ETDEWEB)

    Onishi, Taku [Department of Chemistry for Materials, and The Center of Ultimate Technology on nano-Electronics, Mie University (Japan); Center for Theoretical and Computational Chemistry, Department of Chemistry, University of Oslo (Norway)

    2015-12-31

    Owing to previous developments in quantum theory and calculation methods, we can now investigate quantum phenomena in real materials and molecules, and can design functional materials by computation. As limits and problems still exist in theory, cooperation between theory and computation is becoming more important for clarifying unknown quantum mechanisms and discovering more efficient functional materials; it is likely to become the next-generation standard. Finally, our theoretical methodology for boundary solids is introduced.

  8. Recent computational chemistry

    International Nuclear Information System (INIS)

    Onishi, Taku

    2015-01-01

    Owing to previous developments in quantum theory and calculation methods, we can now investigate quantum phenomena in real materials and molecules, and can design functional materials by computation. As limits and problems still exist in theory, cooperation between theory and computation is becoming more important for clarifying unknown quantum mechanisms and discovering more efficient functional materials; it is likely to become the next-generation standard. Finally, our theoretical methodology for boundary solids is introduced

  9. Practical methods to improve the development of computational software

    International Nuclear Information System (INIS)

    Osborne, A. G.; Harding, D. W.; Deinert, M. R.

    2013-01-01

    The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960s, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best-practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)
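    One such easily adopted practice — input validation plus unit tests with known-answer cases — can be shown in miniature. The routine below is a generic example chosen for illustration, not one from the paper.

```python
def trapezoid(f, a, b, n):
    """Composite trapezoidal rule for the integral of f over [a, b] with n
    subintervals. The input check turns a silent misuse into a loud error,
    one of the defensive habits professional codebases rely on."""
    if n < 1:
        raise ValueError("need at least one subinterval")
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

def test_trapezoid():
    # known-answer case: the rule is exact for linear integrands
    assert abs(trapezoid(lambda x: 2.0 * x, 0.0, 1.0, 4) - 1.0) < 1e-12
    # convergence case: a smooth integrand approaches the analytic value
    assert abs(trapezoid(lambda x: x * x, 0.0, 1.0, 1000) - 1.0 / 3.0) < 1e-6
```

    Tests like these run in milliseconds, so they can guard every change to a research code at essentially no cost.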

  10. Computer Aided Design System for Developing Musical Fountain Programs

    Institute of Scientific and Technical Information of China (English)

    刘丹; 张乃尧; 朱汉城

    2003-01-01

    A computer aided design system for developing musical fountain programs was developed with multiple functions such as intelligent design, 3-D animation, manual modification and synchronized motion to make the development process more efficient. The system first analyzes the musical form and sentiment using many basic features of the music to select a basic fountain program. This program is then simulated with 3-D animation and modified manually to achieve the desired results. Finally, the program is transformed into a computer control program to control the musical fountain in time with the music. A prototype system for the musical fountain was also developed. It was tested with many styles of music and users were quite satisfied with its performance. By integrating various functions, the proposed computer aided design system greatly simplifies the design of musical fountain programs.

  11. Mentoring to develop research self-efficacy, with particular reference to previously disadvantaged individuals

    OpenAIRE

    S. Schulze

    2010-01-01

    The development of inexperienced researchers is crucial. In response to the lack of research self-efficacy of many previously disadvantaged individuals, the article examines how mentoring can enhance the research self-efficacy of mentees. The study is grounded in the self-efficacy theory (SET) – an aspect of the social cognitive theory (SCT). Insights were gained from an in-depth study of SCT, SET and mentoring, and from a completed mentoring project. This led to the formulation of three basi...

  12. Children, computer exposure and musculoskeletal outcomes: the development of pathway models for school and home computer-related musculoskeletal outcomes.

    Science.gov (United States)

    Harris, Courtenay; Straker, Leon; Pollock, Clare; Smith, Anne

    2015-01-01

    Children's computer use is rapidly growing, together with reports of related musculoskeletal outcomes. Models and theories of adult-related risk factors demonstrate multivariate risk factors associated with computer use. Children's use of computers is different from adults' computer use at work. This study developed and tested a child-specific model demonstrating multivariate relationships between musculoskeletal outcomes, computer exposure and child factors. Using pathway modelling, factors such as gender, age, television exposure, computer anxiety, sustained attention (flow), socio-economic status and somatic complaints (headache and stomach pain) were found to have effects on children's reports of musculoskeletal symptoms. The potential for children's computer exposure to follow a dose-response relationship was also evident. Developing a child-related model can assist in understanding risk factors for children's computer use and support the development of recommendations to encourage children to use this valuable resource in educational, recreational and communication environments in a safe and productive manner. Computer use is an important part of children's school and home life. Application of this developed model, which encapsulates related risk factors, enables practitioners, researchers, teachers and parents to develop strategies that assist young people to use information technology for school, home and leisure in a safe and productive manner.

  13. Charged-particle thermonuclear reaction rates: IV. Comparison to previous work

    International Nuclear Information System (INIS)

    Iliadis, C.; Longland, R.; Champagne, A.E.; Coc, A.

    2010-01-01

    We compare our Monte Carlo reaction rates (see Paper II of this issue) to previous results that were obtained by using the classical method of computing thermonuclear reaction rates. For each reaction, the comparison is presented using two types of graphs: the first shows the change in reaction rate uncertainties, while the second displays our new results normalized to the previously recommended reaction rate. We find that the rates have changed significantly for almost all reactions considered here. The changes are caused by (i) our new Monte Carlo method of computing reaction rates (see Paper I of this issue), and (ii) newly available nuclear physics information (see Paper III of this issue).
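    The Monte Carlo method being compared against classical rates summarizes a sampled rate distribution by percentiles (a low, recommended, and high rate). The stdlib-Python sketch below illustrates that summary for an assumed lognormally distributed rate; the distribution parameters are invented for illustration and carry none of the underlying nuclear physics.

```python
import math
import random

def monte_carlo_rate(mu, sigma, n_samples, seed=1):
    """Sample a lognormal reaction rate exp(N(mu, sigma)) and report the
    16th, 50th and 84th percentiles: the low, recommended (median) and
    high rates used in Monte Carlo rate compilations."""
    rng = random.Random(seed)
    samples = sorted(math.exp(rng.gauss(mu, sigma)) for _ in range(n_samples))
    def pct(p):
        return samples[min(int(p * n_samples), n_samples - 1)]
    return pct(0.16), pct(0.50), pct(0.84)
```

    Dividing the new recommended (median) rate by a previously recommended classical rate, as in the comparison graphs described above, then shows directly how the recommendation has shifted.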

  14. New developments in the CREAM Computing Element

    International Nuclear Information System (INIS)

    Andreetto, Paolo; Bertocco, Sara; Dorigo, Alvise; Capannini, Fabio; Cecchi, Marco; Zangrando, Luigi

    2012-01-01

    The EU-funded project EMI aims at providing unified, standardized, easy-to-install software for distributed computing infrastructures. CREAM is one of the middleware products in the EMI middleware distribution: it implements a Grid job management service which allows the submission, management and monitoring of computational jobs on local resource management systems. In this paper we discuss some new features being implemented in the CREAM Computing Element. The implementation of the EMI Execution Service (EMI-ES) specification (an agreement in the EMI consortium on interfaces and protocols to be used in order to enable computational job submission and management across technologies) is one of the new functions being implemented. New developments also focus on the High Availability (HA) area, to improve performance, scalability, availability and fault tolerance.

  15. Recent development in computational actinide chemistry

    International Nuclear Information System (INIS)

    Li Jun

    2008-01-01

    Ever since the Manhattan project in World War II, actinide chemistry has been essential for nuclear science and technology. Yet scientists still seek the ability to interpret and predict chemical and physical properties of actinide compounds and materials using first-principles theory and computational modeling. Actinide compounds are challenging for computational chemistry because of their complicated electron correlation effects and relativistic effects, including spin-orbit coupling effects. There have been significant developments in theoretical studies on actinide compounds in the past several years. The theoretical capabilities, coupled with new experimental characterization techniques, now offer a powerful combination for unraveling the complexities of actinide chemistry. In this talk, we will provide an overview of our own research in this field, with particular emphasis on applications of relativistic density functional and ab initio quantum chemical methods to the geometries, electronic structures, spectroscopy and excited-state properties of small actinide molecules such as CUO and UO2 and some large actinide compounds relevant to separation and environmental science. The performance of various density functional approaches and wavefunction theory-based electron correlation methods will be compared. The results of computational modeling of the vibrational, electronic, and NMR spectra of actinide compounds will be briefly discussed as well [1-4]. We will show that progress in relativistic quantum chemistry, computer hardware and computational chemistry software has enabled computational actinide chemistry to emerge as a powerful and predictive tool for research in actinide chemistry. (authors)

  16. High performance parallel computers for science: New developments at the Fermilab advanced computer program

    International Nuclear Information System (INIS)

    Nash, T.; Areti, H.; Atac, R.

    1988-08-01

    Fermilab's Advanced Computer Program (ACP) has been developing highly cost effective, yet practical, parallel computers for high energy physics since 1984. The ACP's latest developments are proceeding in two directions. A Second Generation ACP Multiprocessor System for experiments will include $3500 RISC processors each with performance over 15 VAX MIPS. To support such high performance, the new system allows parallel I/O, parallel interprocess communication, and parallel host processes. The ACP Multi-Array Processor has been developed for theoretical physics. Each $4000 node is a FORTRAN or C programmable pipelined 20 MFlops (peak), 10 MByte single board computer. These are plugged into a 16 port crossbar switch crate which handles both inter- and intra-crate communication. The crates are connected in a hypercube. Site-oriented applications like lattice gauge theory are supported by system software called CANOPY, which makes the hardware virtually transparent to users. A 256 node, 5 GFlop, system is under construction. 10 refs., 7 figs

  17. The role of computer simulation in nuclear technologies development

    International Nuclear Information System (INIS)

    Tikhonchev, M.Yu.; Shimansky, G.A.; Lebedeva, E.E.; Lichadeev, V. V.; Ryazanov, D.K.; Tellin, A.I.

    2001-01-01

    In this report, the role and purposes of computer simulation in nuclear technology development are discussed. The authors consider such applications of computer simulation as nuclear safety research, optimization of technical and economic parameters of operating nuclear plants, planning and support of reactor experiments, research and design of new devices and technologies, and the design and development of 'simulators' for operating personnel training. For these applications, the following aspects of computer simulation are discussed in the report: neutron-physical, thermal and hydrodynamics models; simulation of isotope structure change and damage dose accumulation for materials under irradiation; and simulation of reactor control structures. (authors)

  18. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  19. Optoelectronic Computer Architecture Development for Image Reconstruction

    National Research Council Canada - National Science Library

    Forber, Richard

    1996-01-01

    .... Specifically, we collaborated with UCSD and ERIM on the development of an optically augmented electronic computer for high speed inverse transform calculations to enable real time image reconstruction...

  20. Architecture independent environment for developing engineering software on MIMD computers

    Science.gov (United States)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.

  1. Computer codes for RF cavity design

    International Nuclear Information System (INIS)

    Ko, K.

    1992-08-01

    In RF cavity design, numerical modeling is assuming an increasingly important role with the help of sophisticated computer codes and powerful yet affordable computers. A description of the cavity codes in use in the accelerator community has been given previously. The present paper will address the latest developments and discuss their applications to cavity tuning and matching problems

  2. Development of the computer network of IFIN-HH

    International Nuclear Information System (INIS)

    Danet, A.; Mirica, M.; Constantinescu, S.

    1998-01-01

    The general computer network of Horia Hulubei National Institute for Physics and Nuclear Engineering (IFIN-HH), as part of RNC (Romanian National Computer Network for scientific research and technological development), offers the Romanian physics research community an efficient and cost-effective infrastructure to communicate and collaborate with fellow researchers abroad, and to collect and exchange the most up-to-date information in their research area. RNC is the national project co-ordinated and established by the Ministry of Research and Technology, targeting the following main objectives: - setting up a technical and organizational infrastructure meant to provide national and international electronic services for the Romanian scientific research community; - providing a rapid and competitive tool for the exchange of information within the R and D community; - using the scientific and technical databases available in the country and offered by the national networks of other countries through international networks; - providing support for information, documentation, and scientific and technical co-operation. The guiding principle in elaborating the project of the general computer network of IFIN-HH was to implement an open system based on OSI standards, without technical barriers in communication between different communities using different computing hardware and software. The major objectives achieved in 1997 in developing the general computer network of IFIN-HH (over 250 computers connected) were: - connecting all the existing and newly installed computer equipment and providing adequate connectivity; - providing the usual Internet services: e-mail, ftp, telnet, finger, gopher; - providing access to the World Wide Web resources; - providing on-line statistics of IP traffic (input and output) for each node of the domain computer network; - improving the performance of the connection with the central node RNC. (authors)

  3. The role of computer simulation in nuclear technology development

    International Nuclear Information System (INIS)

    Tikhonchev, M.Yu.; Shimansky, G.A.; Lebedeva, E.E.; Lichadeev, VV.; Ryazanov, D.K.; Tellin, A.I.

    2000-01-01

    In the report, the role and purpose of computer simulation in nuclear technology development is discussed. The authors consider such applications of computer simulation as: (a) Nuclear safety research; (b) Optimization of technical and economic parameters of operating nuclear plants; (c) Planning and support of reactor experiments; (d) Research and design of new devices and technologies; (e) Design and development of 'simulators' for operating personnel training. For these applications, the following aspects of computer simulation are discussed in the report: (f) Neutron-physical, thermal and hydrodynamics models; (g) Simulation of isotope structure change and damage dose accumulation for materials under irradiation; (h) Simulation of reactor control structures. (authors)

  4. Development of an X-ray Computed Tomography System for Non-Invasive Imaging of Industrial Materials

    International Nuclear Information System (INIS)

    Abdullah, J.; Sipaun, S. M.; Mustapha, I.; Zain, R. M.; Rahman, M. F. A.; Mustapha, M.; Shaari, M. R.; Hassan, H.; Said, M. K. M.; Mohamad, G. H. P.; Ibrahim, M. M.

    2008-01-01

    X-ray computed tomography is a powerful non-invasive imaging technique for viewing an object's inner structures in two-dimensional cross-section images without the need to physically section it. The invention of CT techniques revolutionised the field of medical diagnostic imaging because it provided more detailed and useful information than any previous non-invasive imaging technique. The method is increasingly being used in industry, aerospace, geosciences and archaeology. This paper describes the development of an X-ray computed tomography system for imaging of industrial materials. The theoretical aspects of the CT scanner, the system configuration and the adopted algorithm for image reconstruction are discussed. The penetrating rays from a 160 kV industrial X-ray machine were used to investigate structures that manifest in a manufactured component or product. Some results are presented in this paper
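
    The back-projection step at the heart of CT image reconstruction can be sketched briefly. This is an illustrative, unfiltered parallel-beam back-projection, not the algorithm adopted by the system described above; the image size and angle set are assumptions.

```python
import numpy as np

def backproject(sinogram, angles, size):
    """Smear each 1-D projection back across the image plane and sum."""
    recon = np.zeros((size, size))
    ys, xs = np.mgrid[:size, :size] - (size - 1) / 2.0
    for proj, theta in zip(sinogram, angles):
        t = xs * np.cos(theta) + ys * np.sin(theta)  # detector coordinate
        idx = np.clip(np.round(t).astype(int) + size // 2, 0, size - 1)
        recon += proj[idx]
    return recon / len(angles)

# A point response at the detector centre for every angle reconstructs to a
# bright spot at the image centre:
angles = np.linspace(0.0, np.pi, 36, endpoint=False)
sinogram = np.zeros((36, 65))
sinogram[:, 32] = 1.0
recon = backproject(sinogram, angles, 65)
print(recon[32, 32])  # -> 1.0
```

    Practical CT reconstruction uses filtered back-projection: each projection is convolved with a ramp filter before this smearing step, which removes the characteristic blur of the plain version above.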

  5. Standardized computer-based organized reporting of EEG

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C.

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE are used to report the features of clinical relevance, extracted while assessing the EEGs. In the end, the diagnostic significance is scored, using a standardized list of terms. SCORE has specific modules for scoring seizures (including seizure semiology and ictal EEG patterns), neonatal recordings (including features specific for this age group), and for Critical Care EEG Terminology.

  6. Standardized computer-based organized reporting of EEG

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C.

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE are used to report the features of clinical relevance, extracted while assessing the EEGs. Selection of the terms is context sensitive: initial choices determine the subsequently presented sets of additional choices. This process automatically generates a report and feeds these features into a database...

  7. Development of Onboard Computer Complex for Russian Segment of ISS

    Science.gov (United States)

    Branets, V.; Brand, G.; Vlasov, R.; Graf, I.; Clubb, J.; Mikrin, E.; Samitov, R.

    1998-01-01

    This report presents a description of the Onboard Computer Complex (CC) that was developed during the period 1994-1998 for the Russian Segment of ISS. The system was developed in co-operation with NASA and ESA. ESA developed a new computation system under the RSC Energia Technical Assignment, called DMS-R. The CC also includes elements developed by Russian experts and organizations. A general architecture of the computer system and the characteristics of the primary elements of this system are described. The system was integrated at RSC Energia with the participation of American and European specialists. The report contains information on software simulators and verification and de-bugging facilities which were developed for both stand-alone and integrated tests and verification. This CC serves as the basis for the Russian Segment Onboard Control Complex on ISS.

  8. An integrated computer design environment for the development of micro-computer critical software

    International Nuclear Information System (INIS)

    De Agostino, E.; Massari, V.

    1986-01-01

    The paper deals with the development of micro-computer software for nuclear safety systems. More specifically, it describes experimental work in the field of software development methodologies to be used for the implementation of micro-computer based safety systems. An investigation of the technological improvements provided by state-of-the-art integrated packages for micro-based systems development has been carried out. The work has aimed to assess a suitable automated tools environment for the whole software life-cycle. The main safety functions of a nuclear power reactor, such as DNBR and kW/ft, have been implemented in a host-target approach. A prototype test-bed microsystem has been implemented to run the safety functions in order to derive a concrete evaluation of the feasibility of critical software according to the new technological trends of ''Software Factories''. (author)

  9. One Head Start Classroom's Experience: Computers and Young Children's Development.

    Science.gov (United States)

    Fischer, Melissa Anne; Gillespie, Catherine Wilson

    2003-01-01

    Contends that early childhood educators need to understand how exposure to computers and constructive computer programs affects the development of children. Specifically examines: (1) research on children's technology experiences; (2) determining best practices; and (3) addressing educators' concerns about computers replacing other developmentally…

  10. A Model of Computation for Bit-Level Concurrent Computing and Programming: APEC

    Science.gov (United States)

    Ajiro, Takashi; Tsuchida, Kensei

    A concurrent model of computation and a language based on the model for bit-level operation are useful for developing asynchronous and concurrent programs compositionally, which frequently use bit-level operations. Some examples are programs for video games, hardware emulation (including virtual machines), and signal processing. However, few models and languages are optimized and oriented to bit-level concurrent computation. We previously developed a visual programming language called A-BITS for bit-level concurrent programming. The language is based on a dataflow-like model that computes using processes that provide serial bit-level operations and FIFO buffers connected to them. It can express bit-level computation naturally and develop compositionally. We then devised a concurrent computation model called APEC (Asynchronous Program Elements Connection) for bit-level concurrent computation. This model enables precise and formal expression of the process of computation, and a notion of primitive program elements for controlling and operating can be expressed synthetically. Specifically, the model is based on a notion of uniform primitive processes, called primitives, that have three terminals and four ordered rules at most, as well as on bidirectional communication using vehicles called carriers. A new notion is that a carrier moving between two terminals can briefly express some kinds of computation such as synchronization and bidirectional communication. The model's properties make it most applicable to bit-level computation compositionally, since the uniform computation elements are enough to develop components that have practical functionality. Through future application of the model, our research may enable further research on a base model of fine-grain parallel computer architecture, since the model is suitable for expressing massive concurrency by a network of primitives.
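
    The dataflow-like idea underlying A-BITS (serial bit-level processes connected by FIFO buffers) can be illustrated with a toy pipeline. Python generators stand in for the processes here; this is a loose analogy for illustration, not the APEC formalism with its primitives, terminals, and carriers.

```python
def invert(bits):
    """A serial bit-level process: negate each incoming bit."""
    for b in bits:
        yield 1 - b

def xor(bits_a, bits_b):
    """A two-input process: combine two bit streams pairwise."""
    for a, b in zip(bits_a, bits_b):
        yield a ^ b

# Bits flow one at a time through the connected processes, as through FIFOs:
stream = [1, 0, 1, 1]
out = list(xor(invert(stream), iter([1, 1, 0, 0])))
print(out)  # -> [1, 0, 0, 0]
```

    Composing larger pipelines is just nesting more generators, which mirrors the compositional development style the abstract emphasizes.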

  11. Computer-aided detection system performance on current and previous digital mammograms in patients with contralateral metachronous breast cancer

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Moon, Woo Kyung; Cho, Nariya; Chang, Jung Min

    2012-01-01

    Background: The computer-aided detection (CAD) system is widely used for screening mammography. The performance of the CAD system for contralateral breast cancer has not been reported for women with a history of breast cancer. Purpose: To retrospectively evaluate the performance of a CAD system on current and previous mammograms in patients with contralateral metachronous breast cancer. Material and Methods: During a 3-year period, 4945 postoperative patients had follow-up examinations, from whom we selected 55 women with contralateral breast cancers. Among them, 38 had visible malignant signs on the current mammograms. We analyzed the sensitivity and false-positive marks of the system on the current and previous mammograms according to lesion type and breast density. Results: The total visible lesion components on the current mammograms included 27 masses and 14 calcifications in 38 patients. The case-based sensitivity for all lesion types was 63.2% (24/38) with false-positive marks of 0.71 per patient. The lesion-based sensitivity for masses and calcifications was 59.3% (16/27) and 71.4% (10/14), respectively. The lesion-based sensitivity for masses in fatty and dense breasts was 68.8% (11/16) and 45.5% (5/11), respectively. The lesion-based sensitivity for calcifications in fatty and dense breasts was 100.0% (3/3) and 63.6% (7/11), respectively. The total visible lesion components on the previous mammograms included 13 masses and three calcifications in 16 patients, and the sensitivity for all lesion types was 31.3% (5/16) with false-positive marks of 0.81 per patient. On these mammograms, the sensitivity for masses and calcifications was 30.8% (4/13) and 33.3% (1/3), respectively. The sensitivity in fatty and dense breasts was 28.6% (2/7) and 33.3% (3/9), respectively. Conclusion: In the women with a history of breast cancer, the sensitivity of the CAD system in visible contralateral breast cancer was lower than in most previous reports using the same CAD
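
    The sensitivities quoted above are simple detection ratios; a minimal sketch, using the counts reported in the abstract:

```python
def sensitivity(detected, total):
    """Fraction of lesions (or cases) marked by the CAD system."""
    return detected / total

# Current mammograms, from the counts in the abstract:
print(f"{sensitivity(24, 38):.1%}")  # case-based, all lesion types -> 63.2%
print(f"{sensitivity(16, 27):.1%}")  # lesion-based, masses -> 59.3%
print(f"{sensitivity(10, 14):.1%}")  # lesion-based, calcifications -> 71.4%
```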

  12. A Generic Software Development Process Refined from Best Practices for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Soojin Park

    2015-04-01

    Cloud computing has emerged as more than just a piece of technology; it is rather a new IT paradigm. The philosophy behind cloud computing shares its view with green computing, where computing environments and resources are not subjects to own but subjects of sustained use. However, converting currently used IT services to Software as a Service (SaaS) cloud computing environments introduces several new risks. To mitigate such risks, existing software development processes must undergo significant remodeling. This study analyzes actual cases of SaaS cloud computing environment adoption as a way to derive four new best practices for software development and incorporates the identified best practices into currently-in-use processes. Furthermore, this study presents a design for generic software development processes that implement the proposed best practices. The design for the generic process has been applied to reinforce the weak points found in SaaS cloud service development practices used by eight enterprises currently developing or operating actual SaaS cloud computing services. Lastly, this study evaluates the applicability of the proposed SaaS cloud oriented development process through analyzing the feedback data collected from actual application to the development of a SaaS cloud service, Astation.

  13. Extended precision data types for the development of the original computer aided engineering applications

    Science.gov (United States)

    Pescaru, A.; Oanta, E.; Axinte, T.; Dascalescu, A.-D.

    2015-11-01

    Computer aided engineering is based on models of phenomena which are expressed as algorithms. The implementations of the algorithms are usually software applications processing a large volume of numerical data, regardless of the size of the input data. In this way, finite element method applications used to have an input data generator which created the entire volume of geometrical data, starting from the initial geometrical information and the parameters stored in the input data file. Moreover, there were several data processing stages, such as: renumbering of the nodes, meant to minimize the bandwidth of the system of equations to be solved; computation of the equivalent nodal forces; computation of the element stiffness matrix; assembly of the system of equations; solving the system of equations; and computation of the secondary variables. Modern software applications use pre-processing and post-processing programs to handle the information easily. Beyond this example, CAE applications use various stages of complex computation, the accuracy of the final results being of particular interest. Over time, the development of CAE applications has been a constant concern of the authors, and the accuracy of the results a very important target. The paper presents the various computing techniques which were devised and implemented in the resulting applications: finite element method programs, finite difference element method programs, applied general numerical methods applications, data generators, graphical applications, and experimental data reduction programs. In this context, the use of extended precision data types was one of the solutions, the limitations being imposed by the size of the memory which may be allocated. To avoid the memory-related problems the data was stored in files. To minimize the execution time, part of the file was accessed using the dynamic memory allocation facilities. One of the most important consequences of the
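
    The motivation for extended-precision data types can be seen in a small sketch: accumulating many terms in ordinary binary floating point drifts, while a higher-precision representation does not. Python's decimal module stands in here for the extended-precision types discussed in the paper; the paper's own implementation differs.

```python
from decimal import Decimal, getcontext

getcontext().prec = 50  # 50 significant digits, far beyond double precision

n = 100_000
float_sum = sum(0.1 for _ in range(n))           # binary double precision
dec_sum = sum(Decimal("0.1") for _ in range(n))  # extended decimal precision

print(float_sum)  # drifts slightly away from 10000.0
print(dec_sum)    # exactly 10000.0
```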

  14. Cloud Computing: Key to IT Development in West Africa | Nwabuonu ...

    African Journals Online (AJOL)

    It has been established that Information Technology (IT) Development in West Africa has faced lots of challenges ranging from Cyber Threat to inadequate IT Infrastructure. Cloud Computing is a Revolution. It is creating a fundamental change in Computer Architecture, Software and Tools Development in the way we Store, ...

  15. Development of application program and building database to increase facilities for using the radiation effect assessment computer codes

    International Nuclear Information System (INIS)

    Hyun Seok Ko; Young Min Kim; Suk-Hoon Kim; Dong Hoon Shin; Chang-Sun Kang

    2005-01-01

    The current radiation effect assessment system requires skillful technique in applying the various codes and a high level of specialized knowledge in each field. As a matter of fact, it is therefore very difficult for radiation users who do not have enough specialized knowledge to assess or recognize radiation effects properly. For this reason, we have already developed five Windows-based computer codes, constituting a radiation effect assessment system, for radiation utilization fields including nuclear power generation. A computer program is needed so that non-specialists can easily use the five computer codes already developed. We therefore implemented an AI-based expert system that can infer the appropriate assessment approach by itself, according to the characteristics of a given problem. The expert program can guide users, search data, and contact the administrator directly. Conceptually, considering the circumstances a user applying the five computer codes may actually encounter, we addressed the following aspects. First, the accessibility of the necessary concepts and data must be improved. Second, acquiring the underlying theory and using the corresponding computer code must be easy. Third, a Q and A function is needed to resolve user questions not considered previously. Finally, the database must be updated continuously. To meet these requirements, we developed a client program to organize the reference data, built an access methodology (query) for the organized data, and implemented visualization of the retrieved data. An instruction method (an effective procedure and methodology for acquiring the theory) was implemented for learning the theory behind the five computer codes. A data-access program (DBMS) was developed to update the data easily and continuously. For the Q and A function, a Q and A board was implemented within the client program so that users can search the content of questions and answers. (authors)

  16. Air Space Proportion in Pterosaur Limb Bones Using Computed Tomography and Its Implications for Previous Estimates of Pneumaticity

    Science.gov (United States)

    Martin, Elizabeth G.; Palmer, Colin

    2014-01-01

    Air Space Proportion (ASP) is a measure of how much air is present within a bone, which allows for a quantifiable comparison of pneumaticity between specimens and species. Measured from zero to one, higher ASP means more air and less bone. Conventionally, it is estimated from measurements of the internal and external bone diameter, or by analyzing cross-sections. To date, the only pterosaur ASP study has been carried out by visual inspection of sectioned bones within matrix. Here, computed tomography (CT) scans are used to calculate ASP in a small sample of pterosaur wing bones (mainly phalanges) and to assess how the values change throughout the bone. These results show higher ASPs than previous pterosaur pneumaticity studies, and more significantly, higher ASP values in the heads of wing bones than the shaft. This suggests that pneumaticity has been underestimated previously in pterosaurs, birds, and other archosaurs when shaft cross-sections are used to estimate ASP. Furthermore, ASP in pterosaurs is higher than those found in birds and most sauropod dinosaurs, giving them among the highest ASP values of animals studied so far, supporting the view that pterosaurs were some of the most pneumatized animals to have lived. The high degree of pneumaticity found in pterosaurs is proposed to be a response to the wing bone bending stiffness requirements of flight rather than a means to reduce mass, as is often suggested. Mass reduction may be a secondary result of pneumaticity that subsequently aids flight. PMID:24817312
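
    For the conventional diameter-based estimate mentioned above, the ASP of an idealized tubular cross-section is the ratio of lumen area to total area. A minimal sketch (the example diameters are illustrative, not measured data):

```python
def asp_from_diameters(d_inner, d_outer):
    """Air Space Proportion of a circular cross-section:
    lumen area over total area, i.e. (d_inner / d_outer) ** 2."""
    if not 0 <= d_inner <= d_outer or d_outer <= 0:
        raise ValueError("require 0 <= d_inner <= d_outer and d_outer > 0")
    return (d_inner / d_outer) ** 2

# A thin-walled shaft: a 9 mm lumen inside a 10 mm bone is mostly air
print(round(asp_from_diameters(9.0, 10.0), 3))  # -> 0.81
```

    CT-based estimates such as those in the paper instead measure air and bone areas over many slices, which is why they can capture the higher ASP in the expanded heads of wing bones that a single shaft measurement misses.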

  17. HOME COMPUTER USE AND THE DEVELOPMENT OF HUMAN CAPITAL*

    Science.gov (United States)

    Malamud, Ofer; Pop-Eleches, Cristian

    2012-01-01

    This paper uses a regression discontinuity design to estimate the effect of home computers on child and adolescent outcomes by exploiting a voucher program in Romania. Our main results indicate that home computers have both positive and negative effects on the development of human capital. Children who won a voucher to purchase a computer had significantly lower school grades but show improved computer skills. There is also some evidence that winning a voucher increased cognitive skills, as measured by Raven’s Progressive Matrices. We do not find much evidence for an effect on non-cognitive outcomes. Parental rules regarding homework and computer use attenuate the effects of computer ownership, suggesting that parental monitoring and supervision may be important mediating factors. PMID:22719135
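
    The regression discontinuity idea behind the study, comparing fitted outcomes on either side of an eligibility cutoff, can be sketched on simulated data. Every number below is invented for illustration; the paper's cutoff, variables, and estimator details differ.

```python
import numpy as np

rng = np.random.default_rng(1)
income = rng.uniform(-50.0, 50.0, 2000)  # running variable, cutoff at 0
treated = (income < 0).astype(float)     # below the cutoff wins a voucher
outcome = 2.0 + 0.03 * income - 0.5 * treated + rng.normal(0.0, 0.3, 2000)

def fitted_value_at_cutoff(x, y):
    """Linear fit on one side of the cutoff, evaluated at the cutoff x = 0."""
    slope, intercept = np.polyfit(x, y, 1)
    return intercept

below = income < 0
effect = (fitted_value_at_cutoff(income[below], outcome[below])
          - fitted_value_at_cutoff(income[~below], outcome[~below]))
print(round(effect, 2))  # recovers roughly the simulated effect of -0.5
```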

  18. Development of computer aided engineering system for TRAC applications

    International Nuclear Information System (INIS)

    Arai, Kenji; Itoya, Seihiro; Uematsu, Hitoshi; Tsunoyama, Shigeaki

    1990-01-01

    An advanced best-estimate computer program for nuclear reactor transient analysis, TRAC, has been extensively used to carry out various thermal hydraulic calculations in the nuclear engineering field because of its versatility. To perform a wide variety of TRAC calculations efficiently, efficient utilization of computers and a convenient environment for input and output processing are necessary. We have applied a computer network comprising a super-computer, engineering workstations and personal computers to TRAC calculations and have assigned the appropriate functions to each computer. We have also been developing an interactive graphics system for input and output processing on an EWS. This hardware and software environment can improve the effectiveness of TRAC utilization for various thermal hydraulic calculations. (author)

  19. Computer users' risk factors for developing shoulder, elbow and back symptoms

    DEFF Research Database (Denmark)

    Juul-Kristensen, Birgit; Søgaard, Karen; Strøyer, Jesper

    2004-01-01

    OBJECTIVES: This prospective study concentrated on determining factors of computer work that predict musculoskeletal symptoms in the shoulder, elbow, and low-back regions. METHODS: A questionnaire on ergonomics, work pauses, work techniques, and psychosocial and work factors was delivered to 5033 office workers at baseline in early 1999 (response rate 69%) and to 3361 respondents at the time of the follow-up in late 2000 (response rate 77%). An increased frequency or intensity of symptoms was the outcome variable, including only nonsymptomatic respondents from the baseline questionnaire. Previous symptoms were a significant predictor for symptoms in all regions. Computer worktime and psychosocial dimensions were not significant predictors. CONCLUSIONS: Influence on work pauses, reduction of glare or reflection, and screen height are important factors in the design of future computer...

  20. Computer codes for RF cavity design

    International Nuclear Information System (INIS)

    Ko, K.

    1992-01-01

    In RF cavity design, numerical modeling is assuming an increasingly important role with the help of sophisticated computer codes and powerful yet affordable computers. A description of the cavity codes in use in the accelerator community has been given previously. The present paper will address the latest developments and discuss their applications to cavity tuning and matching problems. (Author) 8 refs., 10 figs

  1. Development of computer-aided auto-ranging technique for a computed radiography system

    International Nuclear Information System (INIS)

    Ishida, M.; Shimura, K.; Nakajima, N.; Kato, H.

    1988-01-01

For a computed radiography system, the authors developed a computer-aided auto-ranging technique in which the clinically useful image data are automatically mapped to the available display range. The preread image data are inspected to determine the location of collimation. A histogram of the pixels inside the collimation is evaluated for characteristic values such as maxima and minima, and the optimal density and contrast are then derived for the display image. The effect of the auto-ranging technique was investigated at several hospitals in Japan. The average rate of films lost due to undesirable density or contrast was about 0.5%.
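The histogram-based mapping step can be sketched as follows; this is an illustrative sketch only, and the percentile clipping and 8-bit display range are assumptions, not details from the paper:

```python
# Hypothetical sketch of histogram-based auto-ranging: pixel values between
# robust minimum and maximum are mapped linearly onto the display range.

def auto_range(pixels, display_min=0, display_max=255, clip=0.01):
    """Map the clinically useful data range onto the available display range."""
    ordered = sorted(pixels)
    n = len(ordered)
    # Robust minima/maxima: ignore the extreme `clip` fraction on each side.
    lo = ordered[int(clip * (n - 1))]
    hi = ordered[int((1 - clip) * (n - 1))]
    if hi == lo:                        # flat image: mid-gray everywhere
        return [(display_min + display_max) // 2] * n
    scale = (display_max - display_min) / (hi - lo)
    return [min(display_max, max(display_min,
            round(display_min + (p - lo) * scale))) for p in pixels]

mapped = auto_range([100, 110, 120, 130, 140])
print(mapped[0], mapped[-1])   # extremes land at the display bounds
```

A real system would derive the characteristic values from the collimated region only, as the abstract describes; the clipping fraction here stands in for that selection.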

  2. Developing a Distributed Computing Architecture at Arizona State University.

    Science.gov (United States)

    Armann, Neil; And Others

    1994-01-01

    Development of Arizona State University's computing architecture, designed to ensure that all new distributed computing pieces will work together, is described. Aspects discussed include the business rationale, the general architectural approach, characteristics and objectives of the architecture, specific services, and impact on the university…

  3. ANS main control complex three-dimensional computer model development

    International Nuclear Information System (INIS)

    Cleaves, J.E.; Fletcher, W.M.

    1993-01-01

A three-dimensional (3-D) computer model of the Advanced Neutron Source (ANS) main control complex is being developed. The main control complex includes the main control room, the technical support center, the materials irradiation control room, computer equipment rooms, communications equipment rooms, cable-spreading rooms, and some support offices and breakroom facilities. The model will be used to provide facility designers and operations personnel with capabilities for fit-up/interference analysis, visual "walk-throughs" for optimizing maintainability, and human factors and operability analyses. It will be used to determine performance design characteristics, to generate construction drawings, and to integrate control room layout, equipment mounting, grounding equipment, electrical cabling, and utility services into ANS building designs. This paper describes the development of the initial phase of the 3-D computer model for the ANS main control complex and plans for its development and use.

  4. Present status of computational tools for maglev development

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Z.; Chen, S.S.; Rote, D.M.

    1991-10-01

    High-speed vehicles that employ magnetic levitation (maglev) have received great attention worldwide as a means of relieving both highway and air-traffic congestion. At this time, Japan and Germany are leading the development of maglev. After fifteen years of inactivity that is attributed to technical policy decisions, the federal government of the United States has reconsidered the possibility of using maglev in the United States. The National Maglev Initiative (NMI) was established in May 1990 to assess the potential of maglev in the United States. One of the tasks of the NMI, which is also the objective of this report, is to determine the status of existing computer software that can be applied to maglev-related problems. The computational problems involved in maglev assessment, research, and development can be classified into two categories: electromagnetic and mechanical. Because most maglev problems are complicated and difficult to solve analytically, proper numerical methods are needed to find solutions. To determine the status of maglev-related software, developers and users of computer codes were surveyed. The results of the survey are described in this report. 25 refs.

  5. Development of a UNIX network compatible reactivity computer

    International Nuclear Information System (INIS)

    Sanchez, R.F.; Edwards, R.M.

    1996-01-01

A state-of-the-art UNIX network compatible controller and a UNIX host workstation with MATLAB/SIMULINK software were used to develop, implement, and validate a digital reactivity calculation. An objective of the development was to determine why the reactivity output of a Macintosh-based reactivity computer drifted intolerably.

  6. Is Cup Positioning Challenged in Hips Previously Treated With Periacetabular Osteotomy?

    DEFF Research Database (Denmark)

    Hartig-Andreasen, Charlotte; Stilling, Maiken; Søballe, Kjeld

    2014-01-01

After periacetabular osteotomy (PAO), some patients develop osteoarthritis with need of a total hip arthroplasty (THA). We evaluated the outcome of THA following PAO and explored factors associated with inferior cup position and increased polyethylene wear. Follow-up was performed 4 to 10 years after THA in 34 patients (38 hips) with previous PAO. Computer analysis evaluated cup position and wear rates. No patient had dislocations or revision surgery. Median scores were: Harris hip 96, Oxford hip 38, and WOMAC 78. Mean cup anteversion and abduction angles were 22° (range 7°-43°) and 45° (range 28°-65°). Outliers of cup abduction were associated with persisting dysplasia (CE…

  7. Trends in computer hardware and software.

    Science.gov (United States)

    Frankenfeld, F M

    1993-04-01

    Previously identified and current trends in the development of computer systems and in the use of computers for health care applications are reviewed. Trends identified in a 1982 article were increasing miniaturization and archival ability, increasing software costs, increasing software independence, user empowerment through new software technologies, shorter computer-system life cycles, and more rapid development and support of pharmaceutical services. Most of these trends continue today. Current trends in hardware and software include the increasing use of reduced instruction-set computing, migration to the UNIX operating system, the development of large software libraries, microprocessor-based smart terminals that allow remote validation of data, speech synthesis and recognition, application generators, fourth-generation languages, computer-aided software engineering, object-oriented technologies, and artificial intelligence. Current trends specific to pharmacy and hospitals are the withdrawal of vendors of hospital information systems from the pharmacy market, improved linkage of information systems within hospitals, and increased regulation by government. The computer industry and its products continue to undergo dynamic change. Software development continues to lag behind hardware, and its high cost is offsetting the savings provided by hardware.

  8. High-Throughput Computational Assessment of Previously Synthesized Semiconductors for Photovoltaic and Photoelectrochemical Devices

    DEFF Research Database (Denmark)

    Kuhar, Korina; Pandey, Mohnish; Thygesen, Kristian Sommer

    2018-01-01

    Using computational screening we identify materials with potential use as light absorbers in photovoltaic or photoelectrochemical devices. The screening focuses on compounds of up to three different chemical elements which are abundant and nontoxic. A prescreening is carried out based on informat...

  9. Computing in research and development in Africa benefits, trends, challenges and solutions

    CERN Document Server

    2015-01-01

    This book describes the trends, challenges and solutions in computing use for scientific research and development within different domains in Africa, such as health, agriculture, environment, economy, energy, education and engineering. The benefits expected are discussed by a number of recognized, domain-specific experts, with a common theme being computing as solution enabler. This book is the first document providing such a representative up-to-date view on this topic at the continent level.   • Discusses computing for scientific research and development on the African continent, addressing domains such as engineering, health, agriculture, environment, economy, energy, and education; • Describes the state-of-the-art in usage of computing to address problems in developing countries pertaining to health, productivity, economic growth, and renewable energy; • Offers insights applicable to all developing countries on the use of computing technologies to address a variety of societal issues.

  10. Development of computed tomography system and image reconstruction algorithm

    International Nuclear Information System (INIS)

    Khairiah Yazid; Mohd Ashhar Khalid; Azaman Ahmad; Khairul Anuar Mohd Salleh; Ab Razak Hamzah

    2006-01-01

Computed tomography is one of the most advanced and powerful nondestructive inspection techniques and is currently used in many different industries. In several CT systems, detection has been by a combination of an X-ray image intensifier and a charge-coupled device (CCD) camera, or by using a line-array detector. The recent development of the X-ray flat panel detector has made fast CT imaging feasible and practical. This paper therefore explains the arrangement of a new detection system, which uses the existing high-resolution (127 μm pixel size) flat panel detector in MINT, and the image reconstruction technique developed. The aim of the project is to develop a prototype flat-panel-detector-based CT imaging system for NDE. The prototype consisted of an X-ray tube, a flat panel detector system, a rotation table and a computer system to control the sample motion and image acquisition. Hence this project is divided into two major tasks: first, to develop the image reconstruction algorithm, and second, to integrate the X-ray imaging components into one CT system. The image reconstruction algorithm, using the filtered back-projection method, is developed and compared to other techniques. MATLAB is the tool used for the simulations and computations in this project. (Author)
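The filtered back-projection method mentioned above can be sketched for parallel-beam geometry. This is a minimal illustrative reconstruction of a single-point phantom, not the MINT system's code; the grid size, angle count, Ram-Lak kernel, and nearest-neighbour interpolation are simplifying assumptions:

```python
# Minimal filtered back-projection (FBP) sketch for parallel-beam geometry.
import math

N = 15                                     # image is N x N
ANGLES = [k * math.pi / 18 for k in range(18)]
C = N // 2                                 # rotation centre
NB = 2 * N                                 # detector bins (generous)

def project(image, theta):
    """Line integrals of `image` along direction theta (nearest-neighbour)."""
    sino = [0.0] * NB
    for y in range(N):
        for x in range(N):
            t = (x - C) * math.cos(theta) + (y - C) * math.sin(theta)
            sino[round(t) + NB // 2] += image[y][x]
    return sino

def ramp_filter(row):
    """Convolve one sinogram row with the discrete Ram-Lak ramp kernel."""
    def h(n):
        if n == 0:
            return 0.25
        return 0.0 if n % 2 == 0 else -1.0 / (math.pi * n) ** 2
    return [sum(row[j] * h(i - j) for j in range(NB)) for i in range(NB)]

def fbp(sinograms):
    """Back-project the filtered rows over all angles."""
    image = [[0.0] * N for _ in range(N)]
    for theta, row in zip(ANGLES, sinograms):
        frow = ramp_filter(row)
        for y in range(N):
            for x in range(N):
                t = (x - C) * math.cos(theta) + (y - C) * math.sin(theta)
                image[y][x] += frow[round(t) + NB // 2]
    return image

phantom = [[0.0] * N for _ in range(N)]
phantom[4][10] = 1.0                       # single hot pixel
recon = fbp([project(phantom, a) for a in ANGLES])
peak = max((recon[y][x], (y, x)) for y in range(N) for x in range(N))[1]
print(peak)                                # brightest pixel of the reconstruction
```

A practical implementation would apply the ramp filter in the frequency domain (e.g. via FFT) and use interpolation rather than nearest-neighbour binning, but the structure (project, filter, back-project) is the same.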

  11. Beyond computer literacy: supporting youth's positive development through technology.

    Science.gov (United States)

    Bers, Marina Umaschi

    2010-01-01

    In a digital era in which technology plays a role in most aspects of a child's life, having the competence and confidence to use computers might be a necessary step, but not a goal in itself. Developing character traits that will serve children to use technology in a safe way to communicate and connect with others, and providing opportunities for children to make a better world through the use of their computational skills, is just as important. The Positive Technological Development framework (PTD), a natural extension of the computer literacy and the technological fluency movements that have influenced the world of educational technology, adds psychosocial, civic, and ethical components to the cognitive ones. PTD examines the developmental tasks of a child growing up in our digital era and provides a model for developing and evaluating technology-rich youth programs. The explicit goal of PTD programs is to support children in the positive uses of technology to lead more fulfilling lives and make the world a better place. This article introduces the concept of PTD and presents examples of the Zora virtual world program for young people that the author developed following this framework.

  12. Continuous development of schemes for parallel computing of the electrostatics in biological systems: implementation in DelPhi.

    Science.gov (United States)

    Li, Chuan; Petukh, Marharyta; Li, Lin; Alexov, Emil

    2013-08-15

Due to the enormous importance of electrostatics in molecular biology, calculating the electrostatic potential and corresponding energies has become a standard computational approach for the study of biomolecules and nano-objects immersed in water and salt phase or other media. However, the electrostatics of large macromolecules and macromolecular complexes, including nano-objects, may not be obtainable via explicit methods, and even the standard continuum electrostatics methods may not be applicable due to high computational time and memory requirements. Here, we report further development of the parallelization scheme reported in our previous work (Li, et al., J. Comput. Chem. 2012, 33, 1960) to include parallelization of the molecular surface and energy calculation components of the algorithm. The parallelization scheme utilizes different approaches, such as space-domain parallelization, algorithmic parallelization, multithreading, and task scheduling, depending on the quantity being calculated. This allows for efficient use of the computing resources of the corresponding computer cluster. The parallelization scheme is implemented in the popular software DelPhi and results in a severalfold speedup. As a demonstration of the efficiency and capability of this methodology, the electrostatic potential and electric field distributions are calculated for the bovine mitochondrial supercomplex, illustrating their complex topology, which cannot be obtained by modeling the supercomplex components alone. Copyright © 2013 Wiley Periodicals, Inc.
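The space-domain parallelization idea can be illustrated with a toy potential-grid calculation. This is a multithreaded sketch, not DelPhi's actual scheme; the charges, grid size, and worker count are invented:

```python
# Illustrative space-domain decomposition: the potential grid is split into
# slabs of rows and each worker evaluates its slab independently.
import math
from concurrent.futures import ThreadPoolExecutor

CHARGES = [((2.0, 2.0), +1.0), ((7.0, 7.0), -1.0)]   # (position, charge)

def coulomb(p):
    """Coulomb potential at grid point p (toy units, k = 1)."""
    v = 0.0
    for (cx, cy), q in CHARGES:
        r = math.hypot(p[0] - cx, p[1] - cy)
        v += q / r if r > 1e-9 else 0.0
    return v

def potential_slab(rows, ncols):
    """Serial evaluation of one slab of grid rows."""
    return [[coulomb((x, y)) for x in range(ncols)] for y in rows]

def potential_parallel(nrows, ncols, nworkers=4):
    """Space-domain decomposition: one strided slab of rows per worker."""
    slabs = [range(i, nrows, nworkers) for i in range(nworkers)]
    with ThreadPoolExecutor(max_workers=nworkers) as pool:
        parts = list(pool.map(lambda s: (s, potential_slab(s, ncols)), slabs))
    grid = [None] * nrows
    for rows, values in parts:
        for r, row_vals in zip(rows, values):
            grid[r] = row_vals
    return grid

grid = potential_parallel(10, 10)
serial = potential_slab(range(10), 10)
print(grid == serial)    # parallel result matches the serial reference
```

Because the slabs are independent, no synchronization is needed during the computation, which is what makes this kind of decomposition attractive for grid-based electrostatics solvers.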

  13. Computer-Based Procedures for Field Workers in Nuclear Power Plants: Development of a Model of Procedure Usage and Identification of Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Katya Le Blanc; Johanna Oxstrand

    2012-04-01

The nuclear industry is constantly trying to find ways to decrease the human error rate, especially for human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less studied application for computer-based procedures: field procedures, i.e., procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field workers. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how best to design the computer-based procedures to do so. This paper describes the development of a Model of Procedure Use and the qualitative study on which the model is based. The study was conducted in collaboration with four nuclear utilities and five research institutes. During the qualitative study and the model development, requirements for computer-based procedures were identified.

  14. Development and validation of Monte Carlo dose computations for contrast-enhanced stereotactic synchrotron radiation therapy

    International Nuclear Information System (INIS)

    Vautrin, M.

    2011-01-01

Contrast-enhanced stereotactic synchrotron radiation therapy (SSRT) is an innovative technique based on localized dose-enhancement effects obtained by reinforced photoelectric absorption in the tumor. Medium-energy monochromatic X-rays (50-100 keV) are used for irradiating tumors previously loaded with a high-Z element. Clinical trials of SSRT are being prepared at the European Synchrotron Radiation Facility (ESRF), where an iodinated contrast agent will be used. In order to compute the energy deposited in the patient (dose), a dedicated treatment planning system (TPS) has been developed for the clinical trials, based on the ISOgray TPS. This work focuses on the SSRT-specific modifications of the TPS, especially to the PENELOPE-based Monte Carlo dose engine. The TPS uses a dedicated Monte Carlo simulation of medium-energy polarized photons to compute the deposited energy in the patient. Simulations are performed considering the synchrotron source, the modeled beamline geometry and finally the patient. Specific materials were also implemented in the voxelized geometry of the patient to account for iodine concentrations in the tumor. The computation process has been optimized and parallelized. Finally, a specific computation of absolute doses and associated irradiation times (instead of monitor units) was implemented. The dedicated TPS was validated with depth dose curves, dose profiles and absolute dose measurements performed at the ESRF in a water tank and solid water phantoms with or without bone slabs. (author) [fr
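The dose-enhancement principle (stronger absorption in the high-Z-loaded tumor) can be illustrated with a toy one-dimensional Monte Carlo. The attenuation coefficients and slab depth below are invented values, and this is far simpler than the PENELOPE-based engine described:

```python
# Toy Monte Carlo of photon absorption in a 1-D slab: free path lengths are
# sampled from an exponential distribution with attenuation coefficient mu.
import random

def absorbed_fraction(mu, depth, n=100_000, seed=1):
    """Fraction of photons absorbed within `depth` cm for attenuation mu."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n) if rng.expovariate(mu) < depth)
    return hits / n

f_water = absorbed_fraction(mu=0.2, depth=2.0)    # water-like tumor
f_iodine = absorbed_fraction(mu=0.5, depth=2.0)   # iodine-loaded tumor
print(f_iodine > f_water)    # enhanced absorption in the loaded tumor
```

The Monte Carlo estimates converge to the analytic values 1 - exp(-mu * depth), so the iodine-loaded case absorbs roughly twice the fraction of incident photons in this toy setup.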

  15. NWChem Meeting on Science Driven Petascale Computing and Capability Development at EMSL

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, Wibe A.

    2007-02-19

    On January 25, and 26, 2007, an NWChem meeting was held that was attended by 65 scientists from 29 institutions including 22 universities and 5 national laboratories. The goals of the meeting were to look at major scientific challenges that could be addressed by computational modeling in environmental molecular sciences, and to identify the associated capability development needs. In addition, insights were sought into petascale computing developments in computational chemistry. During the meeting common themes were identified that will drive the need for the development of new or improved capabilities in NWChem. Crucial areas of development that the developer's team will be focusing on are (1) modeling of dynamics and kinetics in chemical transformations, (2) modeling of chemistry at interfaces and in the condensed phase, and (3) spanning longer time scales in biological processes modeled with molecular dynamics. Various computational chemistry methodologies were discussed during the meeting, which will provide the basis for the capability developments in the near or long term future of NWChem.

  16. An Application Development Platform for Neuromorphic Computing

    Energy Technology Data Exchange (ETDEWEB)

    Dean, Mark [University of Tennessee (UT); Chan, Jason [University of Tennessee (UT); Daffron, Christopher [University of Tennessee (UT); Disney, Adam [University of Tennessee (UT); Reynolds, John [University of Tennessee (UT); Rose, Garrett [University of Tennessee (UT); Plank, James [University of Tennessee (UT); Birdwell, John Douglas [University of Tennessee (UT); Schuman, Catherine D [ORNL

    2016-01-01

Dynamic Adaptive Neural Network Arrays (DANNAs) are neuromorphic computing systems developed as a hardware-based approach to the implementation of neural networks. They feature highly adaptive and programmable structural elements, which model artificial neural networks with spiking behavior. We design them to solve problems using evolutionary optimization. In this paper, we highlight the current hardware and software implementations of DANNA, including their features, functionalities and performance. We then describe the development of an Application Development Platform (ADP) to support efficient application implementation and testing of DANNA-based solutions. We conclude with future directions.
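A leaky integrate-and-fire neuron gives a minimal picture of the spiking behavior such arrays model. This sketch is illustrative only; the threshold and leak values are invented and unrelated to DANNA's actual element design:

```python
# Minimal leaky integrate-and-fire neuron: the membrane potential decays by
# a leak factor each step, accumulates input, and spikes/resets at threshold.
def lif_run(inputs, threshold=1.0, leak=0.9):
    """Return the spike train (0/1 per time step) for a sequence of inputs."""
    v, spikes = 0.0, []
    for x in inputs:
        v = v * leak + x
        if v >= threshold:
            spikes.append(1)
            v = 0.0          # reset after firing
        else:
            spikes.append(0)
    return spikes

print(lif_run([0.5, 0.5, 0.5, 0.0, 0.9, 0.4]))
```

Networks of such elements communicate only through spike events, which is what makes hardware implementations like the one described attractive for low-power operation.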

  17. ATLAS computing activities and developments in the Italian Grid cloud

    International Nuclear Information System (INIS)

    Rinaldi, L; Ciocca, C; K, M; Annovi, A; Antonelli, M; Martini, A; Barberis, D; Brunengo, A; Corosu, M; Barberis, S; Carminati, L; Campana, S; Di, A; Capone, V; Carlino, G; Doria, A; Esposito, R; Merola, L; De, A; Luminari, L

    2012-01-01

    The large amount of data produced by the ATLAS experiment needs new computing paradigms for data processing and analysis, which involve many computing centres spread around the world. The computing workload is managed by regional federations, called “clouds”. The Italian cloud consists of a main (Tier-1) center, located in Bologna, four secondary (Tier-2) centers, and a few smaller (Tier-3) sites. In this contribution we describe the Italian cloud facilities and the activities of data processing, analysis, simulation and software development performed within the cloud, and we discuss the tests of the new computing technologies contributing to evolution of the ATLAS Computing Model.

  18. DEVELOPMENT OF COMPUTER AIDED DESIGN OF CHAIN COUPLING

    Directory of Open Access Journals (Sweden)

    Sergey Aleksandrovich Sergeev

    2015-12-01

The present paper describes the development stages of computer-aided design of chain couplings. The first stage is the automation of traditional design techniques (intermediate automation). The second is integrated automation, with the development of automated equipment and production technology, including on the basis of flexible manufacturing systems (a high level of automation).

  19. An Introduction to Quantum Computing, Without the Physics

    OpenAIRE

    Nannicini, Giacomo

    2017-01-01

    This paper is a gentle but rigorous introduction to quantum computing intended for discrete mathematicians. Starting from a small set of assumptions on the behavior of quantum computing devices, we analyze their main characteristics, stressing the differences with classical computers, and finally describe two well-known algorithms (Simon's algorithm and Grover's algorithm) using the formalism developed in previous sections. This paper does not touch on the physics of the devices, and therefor...
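Grover's algorithm, one of the two algorithms the paper covers, can be simulated classically by tracking the statevector. In this sketch (the qubit count and marked item are chosen arbitrarily), the oracle flips the sign of the marked amplitude and the diffusion step reflects all amplitudes about their mean:

```python
# Classical statevector simulation of Grover's search algorithm.
import math

def grover(n_qubits, marked):
    n = 2 ** n_qubits
    amps = [1.0 / math.sqrt(n)] * n            # uniform superposition
    iterations = round(math.pi / 4 * math.sqrt(n))
    for _ in range(iterations):
        amps[marked] = -amps[marked]           # oracle: phase flip
        mean = sum(amps) / n
        amps = [2 * mean - a for a in amps]    # diffusion: invert about mean
    return amps

amps = grover(3, marked=5)
probs = [a * a for a in amps]
print(max(range(8), key=probs.__getitem__))    # the marked item
```

After round(pi/4 * sqrt(8)) = 2 iterations the marked item carries probability 7.5625/8, roughly 0.945, versus 1/8 for a single classical guess, which is the quadratic speedup the paper analyzes.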

  20. Accelerating Development of EV Batteries Through Computer-Aided Engineering (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Pesaran, A.; Kim, G. H.; Smith, K.; Santhanagopalan, S.

    2012-12-01

The Department of Energy's Vehicle Technology Program has launched the Computer-Aided Engineering for Automotive Batteries (CAEBAT) project to work with national labs, industry and software vendors to develop sophisticated software. As coordinator, NREL has teamed with a number of companies to help improve and accelerate battery design and production. This presentation provides an overview of CAEBAT, including its predictive computer simulation of Li-ion batteries known as the Multi-Scale Multi-Dimensional (MSMD) model framework. MSMD's modular, flexible architecture connects the physics of battery charge/discharge processes, thermal control, safety and reliability in a computationally efficient manner. This allows independent development of submodels at the cell and pack levels.

  1. Computing for Lattice QCD: new developments from the APE experiment

    Energy Technology Data Exchange (ETDEWEB)

    Ammendola, R [INFN, Sezione di Roma Tor Vergata, Roma (Italy); Biagioni, A; De Luca, S [INFN, Sezione di Roma, Roma (Italy)

    2008-06-15

As Lattice QCD develops improved techniques to shed light on new physics, it demands increasing computing power. The aim of the current APE (Array Processor Experiment) project is to provide the reference computing platform to the Lattice QCD community for the period 2009-2011. We present the project proposal for a petaflops-range supercomputing center with high performance and low maintenance costs, to be delivered starting from 2010.

  2. Computing for Lattice QCD: new developments from the APE experiment

    International Nuclear Information System (INIS)

    Ammendola, R.; Biagioni, A.; De Luca, S.

    2008-01-01

As Lattice QCD develops improved techniques to shed light on new physics, it demands increasing computing power. The aim of the current APE (Array Processor Experiment) project is to provide the reference computing platform to the Lattice QCD community for the period 2009-2011. We present the project proposal for a petaflops-range supercomputing center with high performance and low maintenance costs, to be delivered starting from 2010.

  3. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    Science.gov (United States)

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. Computer models are flash and NetLogo environments to make simultaneously available three domains in chemistry: macroscopic, submicroscopic, and symbolic. Students interact with computer models to answer assessment…

  4. Portable Computer Technology (PCT) Research and Development Program Phase 2

    Science.gov (United States)

    Castillo, Michael; McGuire, Kenyon; Sorgi, Alan

    1995-01-01

This project report focused on: (1) design and development of two Advanced Portable Workstation 2 (APW 2) units, which incorporate advanced technology features such as a low-power Pentium processor, a high-resolution color display, National Television Standards Committee (NTSC) video handling capabilities, a Personal Computer Memory Card International Association (PCMCIA) interface, and Small Computer System Interface (SCSI) and Ethernet interfaces; (2) use of these units to integrate and demonstrate advanced wireless network and portable video capabilities; and (3) qualification of the APW 2 systems for use in specific experiments aboard the Mir Space Station. A major objective of the PCT Phase 2 program was to help guide future choices in computing platforms and techniques for meeting National Aeronautics and Space Administration (NASA) mission objectives, with a focus on the development of optimal configurations of computing hardware, software applications, and network technologies for use on NASA missions.

  5. Computer-Aided Sensor Development Focused on Security Issues.

    Science.gov (United States)

    Bialas, Andrzej

    2016-05-26

    The paper examines intelligent sensor and sensor system development according to the Common Criteria methodology, which is the basic security assurance methodology for IT products and systems. The paper presents how the development process can be supported by software tools, design patterns and knowledge engineering. The automation of this process brings cost-, quality-, and time-related advantages, because the most difficult and most laborious activities are software-supported and the design reusability is growing. The paper includes a short introduction to the Common Criteria methodology and its sensor-related applications. In the experimental section the computer-supported and patterns-based IT security development process is presented using the example of an intelligent methane detection sensor. This process is supported by an ontology-based tool for security modeling and analyses. The verified and justified models are transferred straight to the security target specification representing security requirements for the IT product. The novelty of the paper is to provide a patterns-based and computer-aided methodology for the sensors development with a view to achieving their IT security assurance. The paper summarizes the validation experiment focused on this methodology adapted for the sensors system development, and presents directions of future research.

  6. A survey of computational physics introductory computational science

    CERN Document Server

    Landau, Rubin H; Bordeianu, Cristian C

    2008-01-01

    Computational physics is a rapidly growing subfield of computational science, in large part because computers can solve previously intractable problems or simulate natural processes that do not have analytic solutions. The next step beyond Landau's First Course in Scientific Computing and a follow-up to Landau and Páez's Computational Physics, this text presents a broad survey of key topics in computational physics for advanced undergraduates and beginning graduate students, including new discussions of visualization tools, wavelet analysis, molecular dynamics, and computational fluid dynamics

  7. Recent Developments in Computed Tomography

    International Nuclear Information System (INIS)

    Braunstein, D.; Dafni, E.; Levene, S.; Malamud, G.; Shapiro, O.; Shechter, G.; Zahavi, O.

    1999-01-01

Computerized tomography has become, during the past few years, one of the most widely used apparatus in X-ray diagnosis. Its clinical applications have penetrated various fields, such as operational guidance, cardiac imaging, computer-aided surgery, etc. The first second-generation CT scanners consisted of a rotate-rotate system of a detector array and an X-ray tube. These scanners were capable of acquiring individual single slices, the duration of each being several seconds. The slow scanning rate, and the then-limited computer power, restricted the application range of these scanners to relatively stable organs and short body coverage at given resolutions. Further drawbacks of these machines were weak X-ray sources and low-efficiency gas detectors. In the late 1980s the first helical scanners were introduced by Siemens. Based on a continuous patient-couch movement during gantry rotation, much faster scans could be obtained, significantly increasing the volume coverage in a given time. In 1992 the first dual-slice scanners, equipped with high-efficiency solid-state detectors, were introduced by Elscint. The acquisition of data simultaneously from two detector arrays doubled the efficiency of the scan. Faster computers and stronger X-ray sources further improved the performance, allowing for a new range of clinical applications. Yet the need for even faster machines and bigger volume coverage led to further R and D efforts by the leading CT manufacturers. In order to accomplish the most demanding clinical needs, innovative two-dimensional 4-row solid-state detector arrays were developed, together with faster-rotating machines and bigger X-ray tubes, all demanding extremely accurate and robust mechanical constructions. In parallel, multi-processor custom computers were made in order to allow the on-line reconstruction of the growing amounts of raw data. Four-slice helical scanners, rotating at 0.5 sec per cycle, are being tested nowadays in several clinics all over the world. This talk…

  8. Development and application of methods and computer codes of fuel management and nuclear design of reload cycles in PWR

    International Nuclear Information System (INIS)

    Ahnert, C.; Aragones, J.M.; Corella, M.R.; Esteban, A.; Martinez-Val, J.M.; Minguez, E.; Perlado, J.M.; Pena, J.; Matias, E. de; Llorente, A.; Navascues, J.; Serrano, J.

    1976-01-01

Description of methods and computer codes for Fuel Management and Nuclear Design of Reload Cycles in PWR, developed at JEN by adaptation of previous codes (LEOPARD, NUTRIX, CITATION, FUELCOST) and implementation of original codes (TEMP, SOTHIS, CICLON, NUDO, MELON, ROLLO, LIBRA, PENELOPE), and their application to the project of Management and Design of Reload Cycles of a 510 MWt PWR, including comparison with results of experimental operation and other calculations for validation of the methods. (author) [es

  9. Process-Based Development of Competence Models to Computer Science Education

    Science.gov (United States)

    Zendler, Andreas; Seitz, Cornelia; Klaudt, Dieter

    2016-01-01

    A process model ("cpm.4.CSE") is introduced that allows the development of competence models in computer science education related to curricular requirements. It includes eight subprocesses: (a) determine competence concept, (b) determine competence areas, (c) identify computer science concepts, (d) assign competence dimensions to…

  10. A computer literacy scale for newly enrolled nursing college students: development and validation.

    Science.gov (United States)

    Lin, Tung-Cheng

    2011-12-01

    Increasing application and use of information systems and mobile technologies in the healthcare industry requires increasing nurse competency in computer use. Computer literacy is defined as basic computer skills, whereas computer competency is defined as the computer skills necessary to accomplish job tasks. Inadequate attention has been paid to the validity of computer literacy and computer competency scales. This study developed a computer literacy scale with good reliability and validity and investigated the current computer literacy of newly enrolled students, in order to develop computer courses appropriate to students' skill levels and needs. The study followed Hinkin's process for scale development. Participants were newly enrolled first-year undergraduate students, with nursing or nursing-related backgrounds, attending a course entitled Information Literacy and Internet Applications. The researchers examined reliability and validity using confirmatory factor analysis. The final version of the scale comprised six constructs (software, hardware, multimedia, networks, information ethics, and information security) and 22 measurement items. Confirmatory factor analysis showed that the scale possessed good content validity, reliability, convergent validity, and discriminant validity. The study also found that participants earned the highest scores in the network domain and the lowest in the hardware domain. With the increasing use of information technology applications, coverage of hardware topics should be increased to improve nurses' problem-solving abilities. The study recommends that the emphasis on word processing and network-related topics be reduced in favor of an increased emphasis on databases, statistical software, hospital information systems, and information ethics.
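Scale reliability of the kind reported above is commonly summarized by an internal-consistency statistic. As a generic illustration (the paper itself used confirmatory factor analysis; the data in the test are made up), Cronbach's alpha for a multi-item scale can be computed as:

```python
def cronbach_alpha(items):
    """Internal-consistency reliability of a multi-item scale.

    items: one list of respondent scores per scale item (equal lengths).
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))
    """
    k = len(items)
    n = len(items[0])

    def sample_var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Each respondent's total score across all items.
    totals = [sum(item[r] for item in items) for r in range(n)]
    return k / (k - 1) * (1 - sum(map(sample_var, items)) / sample_var(totals))
```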

  11. Development of Graphical Solution for Computer-Assisted Fault Diagnosis: Preliminary Study

    International Nuclear Information System (INIS)

    Yoon, Han Bean; Yun, Seung Man; Han, Jong Chul

    2009-01-01

    We have developed software for converting volumetric voxel data obtained from X-ray computed tomography (CT) into computer-aided design (CAD) data. The developed software can be used for non-destructive testing and evaluation, reverse engineering, rapid prototyping, etc. The main algorithms employed in the software are image reconstruction, volume rendering, segmentation, and mesh data generation. The feasibility of the developed software is demonstrated with CT data of human maxilla and mandible bones.

  12. Research and development of grid computing technology in center for computational science and e-systems of Japan Atomic Energy Agency

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    The Center for Computational Science and E-systems of the Japan Atomic Energy Agency (CCSE/JAEA) has carried out R and D of grid computing technology. Since 1995, R and D to realize computational assistance for researchers, called Seamless Thinking Aid (STA), and then to share intellectual resources, called Information Technology Based Laboratory (ITBL), has been conducted, leading to the construction of an intelligent infrastructure for atomic energy research called Atomic Energy Grid InfraStructure (AEGIS) under the Japanese national project 'Development and Applications of Advanced High-Performance Supercomputer'. It aims to enable the synchronization of three themes: 1) Computer-Aided Research and Development (CARD), to realize an environment for STA; 2) Computer-Aided Engineering (CAEN), to establish Multi Experimental Tools (MEXT); and 3) Computer Aided Science (CASC), to promote Atomic Energy Research and Investigation (AERI). This article reviews the achievements in R and D of grid computing technology obtained so far. (T. Tanaka)

  13. The role of computers in developing countries with reference to East Africa

    International Nuclear Information System (INIS)

    Shayo, L.K.

    1984-01-01

    The role of computers in economic and technological development is examined with particular reference to developing countries. It is stressed that these countries must exploit the potential of computers in their striving to catch up in the development race. The shortage of qualified EDP personnel is singled out as one of the most critical factors behind any unsatisfactory state of computer applications. A computerization policy driven by the information demands of an increasingly sophisticated development process, and supported by a sufficient core of qualified local manpower, is recommended. The situation in East Africa is discussed, and recommendations for training and for the production of telematics equipment are made. (author)

  14. Cloud Computing and Agile Organization Development

    Directory of Open Access Journals (Sweden)

    Bogdan GHILIC-MICU

    2014-01-01

    In the 3rd-millennium economy, defined by globalization and the continuous reduction of natural resources, the economic organization becomes the main actor in the phenomenon of transformation and adaptation to new conditions. Even more, the economic environment, which is closely related to the social environment, undergoes complex metamorphoses, especially in the management area. In this dynamic and complex social and environmental context, the economic organization must possess the ability to adapt, becoming a flexible and agile answer to new market opportunities. Considering the spectacular evolution of information and communications technology, one of the solutions to ensure organization agility is cloud computing. Just as the development of any science requires adaptation of theories and instruments specific to other fields, a cloud computing paradigm for the agile organization must appeal to models from management, cybernetics, mathematics, structuralism and information theory (or information systems theory).

  15. A new incision for unilateral cleft lip repair developed using animated simulation of repair on computer

    Directory of Open Access Journals (Sweden)

    Sahay A

    2007-01-01

    Background: Unilateral cleft lip repair continues to leave behind some amount of dissatisfaction, as scope for further improvement is always felt. Most surgeons do not like to deviate from the standard Millard's/triangular techniques, or their minor modifications, as no one likes to experiment on the face for fear of unfavourable outcomes. The computer can be utilized as a useful tool in the analysis and planning of surgery, and new methods can be developed and attempted subsequently with greater confidence. Aim: We decided to see if an improved lip repair could be developed with the use of computers. Materials and Methods: Analysis of previous lip repairs was done to determine where improvement was required. Movement of tissues was studied in animation sequences by simulating an ideal repair on digital images of cleft lip, using image-warping software. A repair which could reproduce these movements was planned. A new incision emerged, which combined the principles of the Millard and Tennison-Randall repairs, with additional features. The new method was performed on 30 cases. Conclusions: The results were encouraging, as the shortcomings of these methods were minimized and the advantages maximized.

  16. Ethics in computer software design and development

    Science.gov (United States)

    Alan J. Thomson; Daniel L. Schmoldt

    2001-01-01

    Over the past 20 years, computer software has become integral and commonplace for operational and management tasks throughout agricultural and natural resource disciplines. During this software infusion, however, little thought has been given to human impacts, both good and bad. This paper examines current ethical issues of software system design and development in...

  17. Laboratory Works Designed for Developing Student Motivation in Computer Architecture

    Directory of Open Access Journals (Sweden)

    Petre Ogrutan

    2017-02-01

    In light of the current difficulties in maintaining students' interest and stimulating their motivation for learning, the authors have developed a range of new laboratory exercises intended for first-year students in Computer Science, as well as for engineering students who have completed at least one course in computers. The educational goal of the proposed laboratory exercises is to enhance the students' motivation and creative thinking by organizing a relaxed yet competitive learning environment. The authors have developed a device including LEDs and switches, which is connected to a computer. Using assembly language, commands can be issued to flash several LEDs and read the states of the switches. The effectiveness of this idea was confirmed by a statistical study.

  18. Computed Tomography Technology: Development and Applications for Defence

    International Nuclear Information System (INIS)

    Baheti, G. L.; Saxena, Nisheet; Tripathi, D. K.; Songara, K. C.; Meghwal, L. R.; Meena, V. L.

    2008-01-01

    Computed tomography (CT) has revolutionized the field of non-destructive testing and evaluation (NDT and E). Tomography for industrial applications warrants the design and development of customized solutions catering to specific visualization requirements. The present paper highlights tomography technology solutions implemented at the Defence Laboratory, Jodhpur (DLJ). Details of the technological developments carried out and their utilization for various defence applications are covered.

  19. Concept of development of integrated computer - based control system for 'Ukryttia' object

    International Nuclear Information System (INIS)

    Buyal'skij, V.M.; Maslov, V.P.

    2003-01-01

    The structural concept for the development of an integrated computer-based control system for the Chernobyl NPP 'Ukryttia' Object is presented, on the basis of the general concept of the integrated computer-based control system (CCS) design process for organizational and technical management subjects. The concept is aimed at applying state-of-the-art architectural design techniques and allows the use of modern computer-aided facilities for developing the functional model, the information (logical and physical) models, and the system object model under design.

  20. Development of a Computational Steering Framework for High Performance Computing Environments on Blue Gene/P Systems

    KAUST Repository

    Danani, Bob K.

    2012-07-01

    Computational steering has revolutionized the traditional workflow in high performance computing (HPC) applications. The standard workflow that consists of preparation of an application’s input, running of a simulation, and visualization of simulation results in a post-processing step is now transformed into a real-time interactive workflow that significantly reduces development and testing time. Computational steering provides the capability to direct or re-direct the progress of a simulation application at run-time. It allows modification of application-defined control parameters at run-time using various user-steering applications. In this project, we propose a computational steering framework for HPC environments that provides an innovative solution and easy-to-use platform, which allows users to connect and interact with running application(s) in real-time. This framework uses RealityGrid as the underlying steering library and adds several enhancements to the library to enable steering support for Blue Gene systems. Included in the scope of this project is the development of a scalable and efficient steering relay server that supports many-to-many connectivity between multiple steered applications and multiple steering clients. Steered applications can range from intermediate simulation and physical modeling applications to complex computational fluid dynamics (CFD) applications or advanced visualization applications. The Blue Gene supercomputer presents special challenges for remote access because the compute nodes reside on private networks. This thesis presents an implemented solution and demonstrates it on representative applications. Thorough implementation details and application enablement steps are also presented in this thesis to encourage direct usage of this framework.

  1. Computer-Aided Template for Model Reuse, Development and Maintenance

    DEFF Research Database (Denmark)

    Fedorova, Marina; Sin, Gürkan; Gani, Rafiqul

    2014-01-01

    A template-based approach for model development is presented in this work. Based on a model decomposition technique, the computer-aided template concept has been developed. This concept is implemented as a software tool, which provides a user-friendly interface for following the workflow steps...

  2. Computer-aided modeling framework for efficient model development, analysis and identification

    DEFF Research Database (Denmark)

    Heitzig, Martina; Sin, Gürkan; Sales Cruz, Mauricio

    2011-01-01

    Model-based computer-aided product-process engineering has attained increased importance in a number of industries, including pharmaceuticals, petrochemicals, fine chemicals, polymers, biotechnology, food, energy, and water. This trend is set to continue due to the substantial benefits computer-aided methods introduce. The key prerequisite of computer-aided product-process engineering is, however, the availability of models of different types, forms, and application modes. The development of the models required for the systems under investigation tends to be a challenging and time-consuming task. ... The methodology has been implemented into a computer-aided modeling framework, which combines the expert skills, tools, and database connections required for the different steps of the model development workflow, with the goal of increasing the efficiency of the modeling process. The framework has two main...

  3. Computer-Aided Sensor Development Focused on Security Issues

    Directory of Open Access Journals (Sweden)

    Andrzej Bialas

    2016-05-01

    The paper examines intelligent sensor and sensor system development according to the Common Criteria methodology, which is the basic security assurance methodology for IT products and systems. The paper presents how the development process can be supported by software tools, design patterns and knowledge engineering. The automation of this process brings cost-, quality-, and time-related advantages, because the most difficult and most laborious activities are software-supported and the design reusability is growing. The paper includes a short introduction to the Common Criteria methodology and its sensor-related applications. In the experimental section the computer-supported and patterns-based IT security development process is presented using the example of an intelligent methane detection sensor. This process is supported by an ontology-based tool for security modeling and analyses. The verified and justified models are transferred straight to the security target specification representing security requirements for the IT product. The novelty of the paper is to provide a patterns-based and computer-aided methodology for the sensors development with a view to achieving their IT security assurance. The paper summarizes the validation experiment focused on this methodology adapted for the sensors system development, and presents directions of future research.

  4. Development of point Kernel radiation shielding analysis computer program implementing recent nuclear data and graphic user interfaces

    International Nuclear Information System (INIS)

    Kang, S.; Lee, S.; Chung, C.

    2002-01-01

    As the number of nuclear and conventional facilities using radiation or radioisotopes rises, there is an increasing demand for the safe and efficient use of radiation and radioactive work activity, along with the shielding analysis this requires. Most Korean industries and research institutes, including Korea Power Engineering Company (KOPEC), have been using foreign computer programs for radiation shielding analysis. Korean nuclear regulations have introduced new laws regarding dose limits and radiological guides as prescribed in ICRP 60, so radiation facilities should be designed and operated to comply with these new regulations. In addition, previous point kernel shielding computer codes utilize antiquated nuclear data (mass attenuation coefficients, buildup factors, etc.) developed in the 1950s and 1960s, whereas these nuclear data have been updated during the past few decades. KOPEC's strategic directive is to become a self-sufficient and independent nuclear design technology company, so KOPEC decided to develop a new radiation shielding computer program incorporating the latest regulatory requirements and updated nuclear data. This new code was designed by KOPEC in cooperation with the Department of Nuclear Engineering at Hanyang University. VisualShield is designed with a graphical user interface that allows even users unfamiliar with radiation shielding theory to proficiently prepare input data sets and analyze output results
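The point kernel method named above combines geometric spreading, exponential attenuation, and a buildup correction. A minimal sketch of the standard formula (not VisualShield's actual implementation; the attenuation coefficient and buildup factor would come from the updated nuclear data tables the abstract mentions):

```python
import math

def point_kernel_flux(source_strength, r_cm, mu_cm, t_cm, buildup=1.0):
    """Point-kernel photon flux at distance r from an isotropic point source
    behind a shield of thickness t:

        phi = S * B * exp(-mu * t) / (4 * pi * r**2)

    mu is the linear attenuation coefficient and B the buildup factor,
    both user-supplied here (in practice, interpolated from data tables).
    """
    geometric = source_strength / (4.0 * math.pi * r_cm ** 2)
    return geometric * buildup * math.exp(-mu_cm * t_cm)

# Illustrative numbers only: a 1e6 photon/s source, 100 cm away, behind
# 10 cm of shielding with mu = 0.2 /cm and an assumed buildup factor of 2.
shielded = point_kernel_flux(1e6, 100.0, 0.2, 10.0, buildup=2.0)
unshielded = point_kernel_flux(1e6, 100.0, 0.2, 0.0)
```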

  5. Standardized computer-based organized reporting of EEG SCORE - Second version

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG was previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further and to reach international consensus. This work resulted in the second, revised version of SCORE (Standardized Computer-based Organized Reporting of EEG), which is presented in this paper. The revised terminology was implemented in a software package (SCORE EEG), which was tested in clinical practice on 12,160 EEG recordings. Standardized terms implemented in SCORE... In the end, the diagnostic significance is scored using a standardized list of terms. SCORE has specific modules for scoring seizures (including seizure semiology and ictal EEG patterns), neonatal recordings (including features specific to this age group), and Critical Care EEG Terminology. SCORE...

  6. The contribution of high-performance computing and modelling for industrial development

    CSIR Research Space (South Africa)

    Sithole, Happy

    2017-10-01

    Presented by Dr Happy Sithole and Dr Onno Ubbink. Strategic context: high-performance computing (HPC) combined with machine learning and artificial intelligence presents opportunities to non...

  7. Development of the Tensoral Computer Language

    Science.gov (United States)

    Ferziger, Joel; Dresselhaus, Eliot

    1996-01-01

    The research scientist or engineer wishing to perform large-scale simulations or to extract useful information from existing databases is required to have expertise in the details of the particular database, the numerical methods, and the computer architecture to be used. This poses a significant practical barrier to the use of simulation data. The goal of this research was to develop a high-level computer language called Tensoral, designed to remove this barrier. The Tensoral language provides a framework in which efficient generic data manipulations can be easily coded and implemented. First of all, Tensoral is general. The fundamental objects in Tensoral represent tensor fields and the operators that act on them. The numerical implementation of these tensors and operators is completely and flexibly programmable. New mathematical constructs and operators can be easily added to the Tensoral system. Tensoral is compatible with existing languages. Tensoral tensor operations co-exist in a natural way with a host language, which may be any sufficiently powerful computer language such as Fortran, C, or Vectoral. Tensoral is very high-level. Tensor operations in Tensoral typically act on entire databases (i.e., arrays) at one time and may, therefore, correspond to many lines of code in a conventional language. Tensoral is efficient. Tensoral is a compiled language. Database manipulations are simplified, optimized, and scheduled by the compiler, eventually resulting in efficient machine code to implement them.
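The abstract does not show Tensoral syntax, but the style of "operations acting on entire databases at one time" resembles array programming. As a loose analogy only (NumPy, not Tensoral), a whole-field divergence written as one high-level expression stands in for the nested loops a conventional implementation would need:

```python
import numpy as np

def divergence(u, v, dx, dy):
    """Divergence of a 2-D vector field sampled on a regular grid.

    Each np.gradient call differentiates the entire array at once, so the
    whole operation is a few lines rather than explicit loops over points.
    """
    dudx = np.gradient(u, dx, axis=0)
    dvdy = np.gradient(v, dy, axis=1)
    return dudx + dvdy
```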

  8. Bioconductor: open software development for computational biology and bioinformatics

    DEFF Research Database (Denmark)

    Gentleman, R.C.; Carey, V.J.; Bates, D.M.

    2004-01-01

    The Bioconductor project is an initiative for the collaborative creation of extensible software for computational biology and bioinformatics. The goals of the project include: fostering collaborative development and widespread use of innovative software, reducing barriers to entry into interdisciplinary scientific research, and promoting the achievement of remote reproducibility of research results. We describe details of our aims and methods, identify current challenges, compare Bioconductor to other open bioinformatics projects, and provide working examples.

  9. A way forward for the development of an exposure computational model to computed tomography dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Ferreira, C.C., E-mail: cassio.c.ferreira@gmail.co [Nucleo de Fisica, Universidade Federal de Sergipe, Itabaiana-SE, CEP 49500-000 (Brazil); Galvao, L.A., E-mail: lailagalmeida@gmail.co [Departamento de Fisica, Universidade Federal de Sergipe, Sao Cristovao-SE, CEP 49100-000 (Brazil); Vieira, J.W., E-mail: jose.wilson59@uol.com.b [Instituto Federal de Educacao, Ciencia e Tecnologia de Pernambuco, Recife-PE, CEP 50740-540 (Brazil); Escola Politecnica de Pernambuco, Universidade de Pernambuco, Recife-PE, CEP 50720-001 (Brazil); Maia, A.F., E-mail: afmaia@ufs.b [Departamento de Fisica, Universidade Federal de Sergipe, Sao Cristovao-SE, CEP 49100-000 (Brazil)

    2011-04-15

    A way forward for the development of an exposure computational model for computed tomography dosimetry is presented. An exposure computational model (ECM) for computed tomography (CT) dosimetry has been developed and validated through comparison with experimental results. For the development of the ECM, X-ray spectra generator codes were evaluated, and the head bow tie filter was modelled through a mathematical equation. EGS4 and EGSnrc were used by the ECM to simulate radiation transport. Geometrical phantoms commonly used in CT dosimetry were modelled with the IDN software. MAX06 was also used to simulate an adult male patient submitted to CT examinations. The evaluation of the X-ray spectra generator codes in CT dosimetry showed a dependence on tube filtration (or HVL value). More generally, with the increment of total filtration (or HVL value), X-raytbc becomes the best X-ray spectra generator code for CT dosimetry. The EGSnrc/X-raytbc combination calculated C_100,c in better concordance with the C_100,c measured in two different CT scanners. For a Toshiba CT scanner, the average percentage difference between calculated and measured C_100,c values was 8.2%, whilst for a GE CT scanner the average percentage difference was 10.4%. From measurements of air kerma through a prototype head bow tie filter, a third-order exponential decay equation was found. C_100,c and C_100,p values calculated by the ECM are in good agreement with values measured on a specific CT scanner. A maximum percentage difference of 2% was found for the PMMA CT head phantoms, demonstrating effective modelling of the head bow tie filter by the equation. The absorbed and effective doses calculated by the ECM developed in this work have been compared to those calculated by the ECM of Jones and Shrimpton for an adult male patient.
For a head examination the absorbed dose values calculated by the
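A "third-order exponential decay equation" of the kind fitted to the bow tie filter measurements is conventionally a sum of three exponentials. A sketch of that functional form follows; the amplitudes and decay constants here are illustrative placeholders, not the fitted values from the paper:

```python
import math

def bowtie_attenuation(theta_deg, amps=(0.5, 0.3, 0.2), taus=(5.0, 15.0, 40.0)):
    """Third-order exponential decay model of the form

        f(theta) = A1*exp(-theta/t1) + A2*exp(-theta/t2) + A3*exp(-theta/t3)

    describing relative air-kerma transmission versus fan angle through a
    head bow tie filter. Coefficients are placeholders chosen so f(0) = 1.
    """
    return sum(a * math.exp(-theta_deg / t) for a, t in zip(amps, taus))
```

In a fit, the six coefficients would be regressed against the measured air-kerma profile; the model then feeds the filter description into the transport simulation.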

  10. The impact of home computer use on children's activities and development.

    Science.gov (United States)

    Subrahmanyam, K; Kraut, R E; Greenfield, P M; Gross, E F

    2000-01-01

    The increasing amount of time children are spending on computers at home and school has raised questions about how the use of computer technology may make a difference in their lives--from helping with homework to causing depression to encouraging violent behavior. This article provides an overview of the limited research on the effects of home computer use on children's physical, cognitive, and social development. Initial research suggests, for example, that access to computers increases the total amount of time children spend in front of a television or computer screen at the expense of other activities, thereby putting them at risk for obesity. At the same time, cognitive research suggests that playing computer games can be an important building block to computer literacy because it enhances children's ability to read and visualize images in three-dimensional space and track multiple images simultaneously. The limited evidence available also indicates that home computer use is linked to slightly better academic performance. The research findings are more mixed, however, regarding the effects on children's social development. Although little evidence indicates that the moderate use of computers to play games has a negative impact on children's friendships and family relationships, recent survey data show that increased use of the Internet may be linked to increases in loneliness and depression. Of most concern are the findings that playing violent computer games may increase aggressiveness and desensitize a child to suffering, and that the use of computers may blur a child's ability to distinguish real life from simulation. The authors conclude that more systematic research is needed in these areas to help parents and policymakers maximize the positive effects and to minimize the negative effects of home computers in children's lives.

  11. The development of AR book for computer learning

    Science.gov (United States)

    Phadung, Muneeroh; Wani, Najela; Tongmnee, Nur-aiynee

    2017-08-01

    Educators need to provide alternative educational tools to foster students' learning outcomes. Using AR technology to create exciting edutainment experiences, this paper presents how augmented reality (AR) can be applied in education. This study aims to develop an AR book for tenth-grade students (age 15-16) and to evaluate its quality. The AR book was developed following the ADDIE framework processes to support computer learning on software knowledge. The content accorded with the current Thai education curriculum. The AR book had 10 pages in three topics (the first was "Introduction," the second was "System Software" and the third was "Application Software"). Each page contained markers that placed virtual objects (2D animations and video clips). The obtained data were analyzed in terms of average and standard deviation. The validity of the multimedia design of the AR book was assessed by three experts in multimedia design. A five-point Likert scale was used, and the values were X̄ = 4.84, S.D. = 1.27, which corresponds to "very high." Moreover, three content experts, who specialize in computer teaching, evaluated the AR book's validity. The values determined by these experts were X̄ = 4.69, S.D. = 0.29, which also corresponds to "very high." Implications for future study and education are discussed.

  12. Development and Verification of Body Armor Target Geometry Created Using Computed Tomography Scans

    Science.gov (United States)

    2017-07-13

    By Autumn R Kulaga, Kathryn L Loftis, and Eric Murray. Approved for public release; distribution is unlimited. Army Research Laboratory.

  13. Developments of the general computer network of NIPNE-HH

    International Nuclear Information System (INIS)

    Mirica, M.; Constantinescu, S.; Danet, A.

    1997-01-01

    Since 1991 the general computer network of NIPNE-HH has been developed and connected to the RNCN (Romanian National Computer Network) for research and development. It offers the Romanian physics research community an efficient and cost-effective infrastructure to communicate and collaborate with fellow researchers abroad, and to collect and exchange the most up-to-date information in their research areas. RNCN is targeted at the following main objectives: - Setting up a technical and organizational infrastructure meant to provide national and international electronic services for the Romanian scientific research community; - Providing a rapid and competitive tool for the exchange of information in the framework of the Research and Development (R-D) community; - Using the scientific and technical databases available in the country and those offered by the national networks of other countries through international networks; - Providing support for information and for scientific and technical co-operation. RNCN has two international links: to EBONE via ACONET (64 kbps) and to EuropaNET via Hungarnet (64 kbps). The guiding principle in designing the general computer network of NIPNE-HH, as part of RNCN, was to implement an open system based on OSI standards, taking into account the following criteria: - development of a flexible solution, according to OSI specifications; - a reliable gateway to the existing network already in use, allowing access to worldwide networks; - use of the TCP/IP transport protocol for each Local Area Network (LAN) and for the connection to RNCN; - integration of different and heterogeneous software and hardware platforms (DOS, Windows, UNIX, VMS, Linux, etc.) through specific interfaces. The major objectives achieved in developing the general computer network of NIPNE-HH are: - linking all the existing and newly installed computer equipment and providing adequate connectivity.
LANs from departments

  14. Computer mapping software and geographic data base development: Oak Ridge National Laboratory user experience

    International Nuclear Information System (INIS)

    Honea, B.; Johnson, P.

    1978-01-01

    As users of computer display tools, we believe that the researcher's needs should guide and direct the computer scientist's development of mapping software and data bases. Computer graphic techniques developed for the sake of the computer graphics community tend to be esoteric and rarely suitable for user problems. Two types of users exist for computer graphic tools: the researcher, who is generally satisfied with abstract but accurate displays for analysis purposes, and the decision maker, who requires synoptic and easily comprehended displays relevant to the issues he or she must address. Computer mapping software and data bases should be developed for the user in a generalized and standardized format, for ease of transfer and to facilitate linking or merging with larger analysis systems. Maximum utility of computer mapping tools is accomplished when they are linked to geographic information and analysis systems. Computer graphic techniques have varying degrees of utility depending upon whether they are used for data validation, analysis procedures, or presenting research results

  15. Development of a computer writing system based on EOG

    OpenAIRE

    López, A.; Ferrero, F.; Yangüela, D.; Álvarez, C.; Postolache, O.

    2017-01-01

    WOS:000407517600044 (Web of Science Accession Number) The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical i...

  16. A Generic Software Development Process Refined from Best Practices for Cloud Computing

    OpenAIRE

    Soojin Park; Mansoo Hwang; Sangeun Lee; Young B. Park

    2015-01-01

    Cloud computing has emerged as more than just a piece of technology; it is a new IT paradigm. The philosophy behind cloud computing shares its view with green computing, where computing environments and resources are treated not as subjects to own but as subjects of sustained use. However, converting currently used IT services to Software as a Service (SaaS) cloud computing environments introduces several new risks. To mitigate such risks, existing software development processes must undergo si...

  17. Computational fluid mechanics

    Science.gov (United States)

    Hassan, H. A.

    1993-01-01

    Two papers are included in this progress report. In the first, the compressible Navier-Stokes equations have been used to compute leading edge receptivity of boundary layers over parabolic cylinders. Natural receptivity at the leading edge was simulated and Tollmien-Schlichting waves were observed to develop in response to an acoustic disturbance, applied through the farfield boundary conditions. To facilitate comparison with previous work, all computations were carried out at a free stream Mach number of 0.3. The spatial and temporal behavior of the flowfields is calculated through the use of finite volume algorithms and Runge-Kutta integration. The results are dominated by strong decay of the Tollmien-Schlichting wave due to the presence of the mean flow favorable pressure gradient. The effects of numerical dissipation, forcing frequency, and nose radius are studied. The Strouhal number is shown to have the greatest effect on the unsteady results. In the second paper, a transition model for low-speed flows, previously developed by Young et al., which incorporates first-mode (Tollmien-Schlichting) disturbance information from linear stability theory, has been extended to high-speed flow by incorporating the effects of second-mode disturbances. The transition model is incorporated into a Reynolds-averaged Navier-Stokes solver with a one-equation turbulence model. Results using a variable turbulent Prandtl number approach demonstrate that the current model accurately reproduces available experimental data for first- and second-mode dominated transitional flows. The performance of the present model shows significant improvement over previous transition modeling attempts.

  18. Development of small scale cluster computer for numerical analysis

    Science.gov (United States)

    Zulkifli, N. H. N.; Sapit, A.; Mohammed, A. N.

    2017-09-01

    In this study, two personal computers were successfully networked together to form a small-scale cluster. Each processor involved is a quad-core multicore processor, so the cluster has eight cores in total. The cluster runs the Ubuntu 14.04 Linux environment with an MPI implementation (MPICH2). Two main tests were conducted on the cluster: a communication test and a performance test. The communication test verified that the computers were able to pass the required information without any problem, and was done using a simple MPI "Hello" program written in C. Additionally, a performance test was done to show that the cluster's computing performance is much better than that of a single-CPU computer. In this performance test, the same code was run four times: on a single node, and with 2, 4, and 8 processors. The results show that with additional processors, the time required to solve the problem decreases; the calculation time is roughly halved when the number of processors is doubled. To conclude, we successfully developed a small-scale cluster computer from common hardware that is capable of higher computing power than a single-CPU machine, which can be beneficial for research that requires high computing power, especially numerical analysis such as finite element analysis, computational fluid dynamics, and computational physics.
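    The halving of runtime with each doubling of processors reported above is the ideal case. As a hedged sketch (not part of the study; the 5% serial fraction is an invented example), the snippet below contrasts ideal scaling with Amdahl's law, where any serial fraction of the work caps the achievable speedup:

```python
def ideal_speedup(p):
    """Perfectly parallel work: doubling the processor count halves the runtime."""
    return p

def amdahl_speedup(p, serial_fraction):
    """Amdahl's law: the serial fraction of the work limits achievable speedup."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / p)

# Example: with a (hypothetical) 5% serial fraction, 8 cores give about 5.9x,
# not the ideal 8x.
speedup_8 = amdahl_speedup(8, 0.05)
```

This is why measured cluster speedups typically fall somewhat short of the ideal curve as the processor count grows.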

  19. Development of a Computer Writing System Based on EOG.

    Science.gov (United States)

    López, Alberto; Ferrero, Francisco; Yangüela, David; Álvarez, Constantina; Postolache, Octavian

    2017-06-26

    The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical interface that allows the user to write text easily on the computer screen using eye movements only. This work analyzes these three subsystems and proposes innovative and low cost solutions for each one of them. This computer writing system was tested with 20 users and its efficiency was compared to a traditional virtual keyboard. The results have shown an important reduction in the time spent on writing, which can be very useful, especially for people with severe motor disorders.
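    As an illustration of subsystem (2), the sketch below smooths a noisy EOG-like trace with a moving average and classifies each sample by amplitude. This is an assumption-laden toy, not the authors' algorithm: the function names, the threshold value, and the three-class labels are all invented for illustration.

```python
def moving_average(signal, window=5):
    """Simple noise reduction: average each sample with its neighbours."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def classify(sample, threshold=0.5):
    """Classify a smoothed sample as a rightward gaze, leftward gaze, or rest."""
    if sample > threshold:
        return "RIGHT"
    if sample < -threshold:
        return "LEFT"
    return "REST"

# Synthetic trace: rest, a rightward deflection, then a leftward deflection.
raw = [0.05, -0.02, 0.9, 1.1, 0.95, 0.0, -1.0, -1.2, -0.9, 0.02]
smoothed = moving_average(raw, window=3)
labels = [classify(s) for s in smoothed]
```

A real system would replace the threshold rule with a trained classifier, but the smooth-then-classify pipeline is the same shape.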

  20. Development of a Computer Writing System Based on EOG

    Directory of Open Access Journals (Sweden)

    Alberto López

    2017-06-01

    The development of a novel computer writing system based on eye movements is introduced herein. A system of these characteristics requires the consideration of three subsystems: (1) A hardware device for the acquisition and transmission of the signals generated by eye movement to the computer; (2) A software application that allows, among other functions, data processing in order to minimize noise and classify signals; and (3) A graphical interface that allows the user to write text easily on the computer screen using eye movements only. This work analyzes these three subsystems and proposes innovative and low cost solutions for each one of them. This computer writing system was tested with 20 users and its efficiency was compared to a traditional virtual keyboard. The results have shown an important reduction in the time spent on writing, which can be very useful, especially for people with severe motor disorders.

  1. Development of 3-D Radiosurgery Planning System Using IBM Personal Computer

    International Nuclear Information System (INIS)

    Suh, Tae Suk; Park, Charn Il; Ha, Sung Whan; Kang, Wee Saing; Suh, Doug Young; Park, Sung Hun

    1993-01-01

    Recently, stereotactic radiosurgery planning has come to require 3-D image and dose-distribution information. A project to develop LINAC-based stereotactic radiosurgery has been under way since April 1991. The purpose of this research is to develop a 3-D radiosurgery planning system using a personal computer. The research proceeded in two steps. The first step was to develop a 3-D localization system, which inputs the patient's image information, the coordinate transformation, the position and shape of the target, and the patient contour into the computer system using CT images and a stereotactic frame. The second step was to develop a 3-D dose-planning system, which computes the dose distribution on the image plane, displays the isodose distribution and the patient image simultaneously on a high-resolution monitor, and provides a menu-driven planning system. This prototype radiosurgery planning system was recently applied to several clinical cases. It was shown that our planning system is fast, accurate and efficient, while making it possible to handle various kinds of image modalities such as angiography, CT and MRI. It also makes it possible to develop a general 3-D planning system using beam's-eye view or CT simulation in radiation therapy in future
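    The coordinate-transformation step of localization can be sketched as a simple scale-and-offset mapping between CT pixel indices and frame millimetres. This is a hypothetical model for illustration only: a real stereotactic system also handles rotations and the frame's fiducial markers, and the spacing and origin values below are invented.

```python
def pixel_to_frame(px, py, spacing=(0.8, 0.8), origin=(-100.0, -100.0)):
    """Map CT pixel indices to frame coordinates in millimetres."""
    return (origin[0] + px * spacing[0], origin[1] + py * spacing[1])

def frame_to_pixel(fx, fy, spacing=(0.8, 0.8), origin=(-100.0, -100.0)):
    """Inverse mapping: frame millimetres back to pixel indices."""
    return ((fx - origin[0]) / spacing[0], (fy - origin[1]) / spacing[1])
```

Because the two functions are exact inverses, a target position can be moved freely between image space (for display) and frame space (for treatment set-up).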

  2. Mentoring to develop research self-efficacy, with particular reference to previously disadvantaged individuals

    Directory of Open Access Journals (Sweden)

    S. Schulze

    2010-07-01

    The development of inexperienced researchers is crucial. In response to the lack of research self-efficacy of many previously disadvantaged individuals, the article examines how mentoring can enhance the research self-efficacy of mentees. The study is grounded in self-efficacy theory (SET), an aspect of social cognitive theory (SCT). Insights were gained from an in-depth study of SCT, SET and mentoring, and from a completed mentoring project. This led to the formulation of three basic principles. Firstly, institutions need to provide supportive environmental conditions that facilitate research self-efficacy. This implies a supportive and efficient collective system. The possible effects of performance ratings and reward systems at the institution also need to be considered. Secondly, mentoring needs to create opportunities for young researchers to experience successful learning as a result of appropriate action. To this end, mentees need to be involved in actual research projects in small groups. At the same time, the mentor needs to facilitate skills development by coaching and encouragement. Thirdly, mentors need to encourage mentees to believe in their ability to successfully complete research projects. This implies encouraging positive emotional states, stimulating self-reflection and self-comparison with others in the group, giving positive evaluative feedback and being an intentional role model.

  3. COMPUTER MODELING IN THE DEVELOPMENT OF ARTIFICIAL VENTRICLES OF HEART

    Directory of Open Access Journals (Sweden)

    L. V. Belyaev

    2011-01-01

    This article describes current research on the development of artificial heart ventricles. The advantages of applying computer-aided (CAD/CAE) technologies to the development of artificial heart ventricles are shown, and the systems developed with these technologies are presented.

  4. Discovery and Development of ATP-Competitive mTOR Inhibitors Using Computational Approaches.

    Science.gov (United States)

    Luo, Yao; Wang, Ling

    2017-11-16

    The mammalian target of rapamycin (mTOR) is a central controller of cell growth, proliferation, metabolism, and angiogenesis. This protein is an attractive target for new anticancer drug development. Significant progress has been made in hit discovery, lead optimization, drug candidate development and determination of the three-dimensional (3D) structure of mTOR. Computational methods have been applied to accelerate the discovery and development of mTOR inhibitors, helping to model the structure of mTOR, screen compound databases, uncover structure-activity relationships (SAR), optimize the hits, mine privileged fragments and design focused libraries. Computational approaches have also been applied to study protein-ligand interaction mechanisms and in natural-product-driven drug discovery. Herein, we survey the most recent progress in the application of computational approaches to advance the discovery and development of compounds targeting mTOR. Future directions in the discovery of new mTOR inhibitors using computational methods are also discussed.

  5. Trends and developments in computational geometry

    NARCIS (Netherlands)

    Berg, de M.

    1997-01-01

    This paper discusses some trends and achievements in computational geometry during the past five years, with emphasis on problems related to computer graphics. Furthermore, a direction of research in computational geometry is discussed that could help in bringing the fields of computational geometry

  6. Developing Decision-Making Skill: Experiential Learning in Computer Games

    OpenAIRE

    Kurt A. April; Katja M. J. Goebel; Eddie Blass; Jonathan Foster-Pedley

    2012-01-01

    This paper explores the value that computer and video games bring to learning and leadership, and examines how games work as learning environments and the impact they have on personal development. The study looks at decisiveness, decision-making ability and styles, and at how this leadership-related skill is learnt through different paradigms. The paper compares the learning from a lecture to the learning from a designed computer game, both of which have the same content through the use of a s...

  7. Developing a personal computer based expert system for radionuclide identification

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Hakulinen, T.T.

    1990-01-01

    Several expert system development tools are available for personal computers today. We have used one of the LISP-based high-end tools for nearly two years in developing an expert system for the identification of gamma sources. The system contains a radionuclide database of 2055 nuclides and 48000 gamma transitions, with a knowledge base of about sixty rules. This application combines a LISP-based inference engine with database management and relatively heavy numerical calculations performed in the C language. The most important feature needed has been the ability to use LISP and C together with the more advanced object-oriented features of the development tool. The main difficulties have been long response times and the large amount (10-16 MB) of computer memory required
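    The rule-based identification idea can be sketched in a few lines: match measured gamma peak energies against a nuclide library within an energy tolerance. The energies below are well-known gamma lines, but the matching function, the tolerance, and the tiny library are illustrative assumptions, not the LISP/C implementation described above.

```python
# Minimal library of characteristic gamma lines (keV) for three nuclides.
LIBRARY = {
    "Co-60":  [1173.2, 1332.5],
    "Cs-137": [661.7],
    "K-40":   [1460.8],
}

def identify(peaks, tolerance=1.0):
    """Return nuclides for which every library line matches a measured peak."""
    hits = []
    for nuclide, lines in LIBRARY.items():
        if all(any(abs(p - e) <= tolerance for p in peaks) for e in lines):
            hits.append(nuclide)
    return sorted(hits)
```

A production system layers many more rules on top of this core match (peak intensities, decay chains, detector resolution), which is where the sixty-rule knowledge base comes in.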

  8. The Experiment Method for Manufacturing Grid Development on Single Computer

    Institute of Scientific and Technical Information of China (English)

    XIAO Youan; ZHOU Zude

    2006-01-01

    In this paper, an experiment method for Manufacturing Grid application system development in a single-personal-computer environment is proposed. The characteristic of the proposed method is that it constructs a full prototype Manufacturing Grid application system hosted on a single personal computer using virtual machine technology. Firstly, it builds all the Manufacturing Grid physical resource nodes on an abstraction layer of a single personal computer with virtual machine technology. Secondly, all the virtual Manufacturing Grid resource nodes are connected by a virtual network and the application software is deployed on each Manufacturing Grid node. We then obtain a prototype Manufacturing Grid application system running on a single personal computer, on which experiments can be carried out. Compared with known experiment methods for Manufacturing Grid application system development, the proposed method retains their advantages, such as low cost and simple operation, while yielding credible experimental results easily. The Manufacturing Grid application system constructed with the proposed method has high scalability, stability and reliability, and can be migrated to the real application environment rapidly.

  9. Impact of computer use on children's vision.

    Science.gov (United States)

    Kozeis, N

    2009-10-01

    Today, millions of children use computers on a daily basis. Extensive viewing of the computer screen can lead to eye discomfort, fatigue, blurred vision and headaches, dry eyes and other symptoms of eyestrain. These symptoms may be caused by poor lighting, glare, an improper work station set-up, vision problems of which the person was not previously aware, or a combination of these factors. Children can experience many of the same symptoms related to computer use as adults. However, some unique aspects of how children use computers may make them more susceptible than adults to the development of these problems. In this study, the most common eye symptoms related to computer use in childhood, the possible causes and ways to avoid them are reviewed.

  10. Standardized computer-based organized reporting of EEG SCORE - Second version

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Aurlien, Harald; Brøgger, Jan C

    2017-01-01

    Standardized terminology for computer-based assessment and reporting of EEG has been previously developed in Europe. The International Federation of Clinical Neurophysiology established a taskforce in 2013 to develop this further, and to reach international consensus. This work resulted in the se...

  11. Importance of Computer Model Validation in Pyroprocessing Technology Development

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Y. E.; Li, Hui; Yim, M. S. [Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2014-05-15

    In this research, we developed a plan for experimental validation of one of the computer models developed for ER process modeling, i.e., the ERAD code. Several candidate surrogate materials were selected for the experiment considering their chemical and physical properties. Molten-salt-based pyroprocessing technology is being examined internationally as an alternative to aqueous technology for treating spent nuclear fuel. The central process in pyroprocessing is electrorefining (ER), which separates uranium from the transuranic elements and fission products present in spent nuclear fuel. ER is a widely used process in the minerals industry for purifying impure metals. Studies of ER using actual spent nuclear fuel materials are problematic for both technical and political reasons. Therefore, the initial effort in ER process optimization is made using computer models. A number of models have been developed for this purpose. But as validation of these models is incomplete and often problematic, the simulation results from these models are inherently uncertain.

  12. Computing in Hydraulic Engineering Education

    Science.gov (United States)

    Duan, J. G.

    2011-12-01

    Civil engineers, pioneers of our civilization, are rarely perceived as leaders and innovators in modern society because of slow technological innovation in the field. This crisis has resulted in a decline in the prestige of the civil engineering profession, reduced federal funding for deteriorating infrastructure, and problems with attracting the most talented high-school students. Infusing cutting-edge computer technology and stimulating creativity and innovation are therefore critical challenges for civil engineering education. To better prepare our graduates to innovate, this paper discusses the adoption of a problem-based collaborative learning technique and the integration of civil engineering computing into a traditional civil engineering curriculum. Three interconnected courses: Open Channel Flow, Computational Hydraulics, and Sedimentation Engineering, were developed with an emphasis on computational simulation. In Open Channel Flow, the focus is on the principles of free-surface flow and the application of computational models. This prepares students for the second course, Computational Hydraulics, which introduces the fundamental principles of computational hydraulics, including finite difference and finite element methods. This course complements the Open Channel Flow class to provide students with an in-depth understanding of computational methods. The third course, Sedimentation Engineering, covers the fundamentals of sediment transport and river engineering, so students can apply the knowledge and programming skills gained from the previous courses to develop computational models for simulating sediment transport. These courses effectively equipped students with important skills and knowledge for completing thesis and dissertation research.

  13. The graphics future in scientific applications-trends and developments in computer graphics

    CERN Document Server

    Enderle, G

    1982-01-01

    Computer graphics methods and tools are being used to a great extent in scientific research. The future development in this area will be influenced both by new hardware developments and by software advances. On the hardware sector, the development of the raster technology will lead to the increased use of colour workstations with more local processing power. Colour hardcopy devices for creating plots, slides, or movies will be available at a lower price than today. The first real 3D-workstations will appear on the marketplace. One of the main activities on the software sector is the standardization of computer graphics systems, graphical files, and device interfaces. This will lead to more portable graphical application programs and to a common base for computer graphics education.

  14. Brain-muscle-computer interface: mobile-phone prototype development and testing.

    Science.gov (United States)

    Vernon, Scott; Joshi, Sanjay S

    2011-07-01

    We report prototype development and testing of a new mobile-phone-based brain-muscle-computer interface for severely paralyzed persons, based on previous results from our group showing that humans may actively create specified power levels in two separate frequency bands of a single surface electromyography (sEMG) signal. EMG activity on the surface of a single face muscle site (auricularis superior) is recorded with a standard electrode. This analog electrical signal is imported into an Android-based mobile phone and digitized via an internal A/D converter. The digital signal is split, and then simultaneously filtered with two band-pass filters to extract total power within two separate frequency bands. The user-modulated power in each frequency band serves as two separate control channels for machine control. After signal processing, the Android phone sends commands to external devices via a Bluetooth interface. Users are trained to use the device via visually based operant conditioning, with simple cursor-to-target activities on the phone screen. The mobile-phone prototype interface is formally evaluated on a single advanced Spinal Muscular Atrophy subject, who has successfully used the interface in his home in evaluation trials and for remote control of a television. Development of this new device will not only guide future interface design for community use, but will also serve as an information technology bridge for in situ data collection to quantify human sEMG manipulation abilities for a relevant population.
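    The core signal-processing idea, extracting user-modulated power from two frequency bands of a single signal, can be sketched with a direct DFT in place of the device's band-pass filters. The sampling rate, band edges, and test-tone frequencies below are assumptions chosen for illustration, not the prototype's actual parameters.

```python
import math

def band_power(signal, fs, f_lo, f_hi):
    """Sum DFT power over bins whose frequency lies in [f_lo, f_hi] Hz."""
    n = len(signal)
    power = 0.0
    for k in range(n // 2 + 1):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
            im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / (n * n)
    return power

fs = 256
t = [i / fs for i in range(fs)]
# One second of signal: a strong 20 Hz component plus a weak 80 Hz component,
# standing in for a muscle signal modulated differently in two bands.
sig = [1.0 * math.sin(2 * math.pi * 20 * x) + 0.2 * math.sin(2 * math.pi * 80 * x)
       for x in t]
low = band_power(sig, fs, 10, 30)    # "control channel 1"
high = band_power(sig, fs, 70, 90)   # "control channel 2"
```

The two returned power values play the role of the interface's two independent control channels; a real implementation would use streaming band-pass filters rather than a block DFT.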

  15. Development of computer program for safety of nuclear power plant against tsunami

    International Nuclear Information System (INIS)

    Jin, S. B.; Choi, K. R.; Lee, S. K.; Cho, Y. S.

    2001-01-01

    The main objective of this study is the development of a computer program to check the safety of nuclear power plants along the coastline of the Korean Peninsula. The computer program describes the propagation and associated run-up process of tsunamis by solving the linear and nonlinear shallow-water equations with finite difference methods. The program has been applied to several idealized and simplified problems, and the numerical solutions obtained are compared to existing, available solutions and measurements. Very good agreement between the numerical solutions and existing measurements is observed. The computer program developed in this study can be used to check the safety of nuclear power plants against tsunamis. It can also be used to study the propagation of tsunamis over long distances, and the associated run-up and run-down processes along a shoreline. Furthermore, the computer program can be used to provide proper design criteria for coastal facilities and structures
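    The numerical core can be illustrated with one explicit, centred-difference step of the 1-D linear shallow-water equations on a periodic grid. This is a hedged sketch only: the actual safety-analysis program also treats the nonlinear terms, run-up, and real bathymetry, and every parameter value below is invented for illustration.

```python
import math

g, H = 9.81, 4000.0      # gravity (m/s^2), ocean depth (m)
n, dx = 100, 10000.0     # grid points, grid spacing (m)
dt = 0.5 * dx / math.sqrt(g * H)   # time step satisfying the CFL condition

# Initial condition: a Gaussian free-surface hump, fluid at rest.
eta = [math.exp(-((i - n // 2) * dx / 5e4) ** 2) for i in range(n)]
u = [0.0] * n

def step(eta, u):
    """Advance (eta, u) one time step: eta_t = -H u_x, u_t = -g eta_x."""
    n = len(eta)
    new_eta = [eta[i] - dt * H * (u[(i + 1) % n] - u[i - 1]) / (2 * dx)
               for i in range(n)]
    new_u = [u[i] - dt * g * (eta[(i + 1) % n] - eta[i - 1]) / (2 * dx)
             for i in range(n)]
    return new_eta, new_u

eta1, u1 = step(eta, u)
```

With centred differences on a periodic domain the total water volume (the sum of eta) is conserved exactly by each step, which is a useful sanity check for any implementation.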

  16. A Knowledge Engineering Approach to Developing Educational Computer Games for Improving Students' Differentiating Knowledge

    Science.gov (United States)

    Hwang, Gwo-Jen; Sung, Han-Yu; Hung, Chun-Ming; Yang, Li-Hsueh; Huang, Iwen

    2013-01-01

    Educational computer games have been recognized as being a promising approach for motivating students to learn. Nevertheless, previous studies have shown that without proper learning strategies or supportive models, the learning achievement of students might not be as good as expected. In this study, a knowledge engineering approach is proposed…

  17. Developing a New Computer Game Attitude Scale for Taiwanese Early Adolescents

    Science.gov (United States)

    Liu, Eric Zhi-Feng; Lee, Chun-Yi; Chen, Jen-Huang

    2013-01-01

    With ever increasing exposure to computer games, gaining an understanding of the attitudes held by young adolescents toward such activities is crucial; however, few studies have provided scales with which to accomplish this. This study revisited the Computer Game Attitude Scale developed by Chappell and Taylor in 1997, reworking the overall…

  18. Cloud computing development in Armenia

    Directory of Open Access Journals (Sweden)

    Vazgen Ghazaryan

    2014-10-01

    Purpose – The purpose of the research is to clarify the benefits and risks, with regard to data protection and cost, that a business can gain by using these new technologies for the implementation and management of an organization's information systems. Design/methodology/approach – A qualitative case study of the results obtained via interviews. Three research questions were raised. Q1: How can a company benefit from using Cloud Computing compared to other solutions? Q2: What are possible issues that occur with Cloud Computing? Q3: How would Cloud Computing change an organization's IT infrastructure? Findings – The calculations provided in the interview section prove the financial advantages, even though the precise degree of flexibility and performance has not been assessed. Cloud Computing offers great scalability. Another benefit that Cloud Computing offers, in addition to better performance and flexibility, is reliable and simple backup data storage, physically distributed and so almost invulnerable to damage. Although the advantages of Cloud Computing more than compensate for the difficulties associated with it, the latter must be carefully considered. Since the cloud architecture is relatively new, so far the best guarantee against all the risks it entails, from a single company's perspective, is a well-formulated service-level agreement, where the terms of service and the shared responsibility and security roles between the client and the provider are defined. Research limitations/implications – The study was carried out on the basis of two companies, which gives a deeper view, but for more widely applicable results a wider analysis is necessary. Originality/value – The novelty of the research lies in the fact that existing approaches to this problem mainly focus on the technical side of computing. Research type: case study

  19. Computer codes developed in FRG to analyse hypothetical meltdown accidents

    International Nuclear Information System (INIS)

    Hassmann, K.; Hosemann, J.P.; Koerber, H.; Reineke, H.

    1978-01-01

    The purpose of this paper is to give the status of all significant computer codes developed in the core meltdown project, which is incorporated in the light water reactor safety research program of the Federal Ministry of Research and Technology. For standard pressurized water reactors, results from some of the computer codes are presented, describing the course and duration of the hypothetical core meltdown accident. (author)

  20. Automated Development of Accurate Algorithms and Efficient Codes for Computational Aeroacoustics

    Science.gov (United States)

    Goodrich, John W.; Dyson, Rodger W.

    1999-01-01

    The simulation of sound generation and propagation in three space dimensions with realistic aircraft components is a very large time-dependent computation with fine details. Simulations in open domains with embedded objects require accurate and robust algorithms for propagation, for artificial inflow and outflow boundaries, and for the definition of geometrically complex objects. The development, implementation, and validation of methods for solving these demanding problems are being carried out to support the NASA pillar goals for reducing aircraft noise levels. Our goal is to provide algorithms which are sufficiently accurate and efficient to produce usable results rapidly enough to allow design engineers to study the effects on sound levels of design changes in propulsion systems, and in the integration of propulsion systems with airframes. There is a lack of design tools for these purposes at this time. Our technical approach to this problem combines the development of new algorithms with the use of Mathematica and Unix utilities to automate the algorithm development, code implementation, and validation. We use explicit methods to ensure effective implementation by domain decomposition for SPMD parallel computing. There are several orders of magnitude difference in the computational efficiencies of the algorithms which we have considered. We currently have new artificial inflow and outflow boundary conditions that are stable, accurate, and unobtrusive, with implementations that match the accuracy and efficiency of the propagation methods. The artificial numerical boundary treatments have been proven to have solutions which converge to the full open domain problems, so that the error from the boundary treatments can be driven as low as is required. The purpose of this paper is to briefly present a method for developing highly accurate algorithms for computational aeroacoustics, the use of computer automation in this process, and a brief survey of the algorithms that

  1. Integrated computer-aided design in automotive development: development processes, geometric fundamentals, methods of CAD, knowledge-based engineering data management

    CERN Document Server

    Hirz, Mario; Gfrerrer, Anton; Lang, Johann

    2013-01-01

    The automotive industry faces constant pressure to reduce development costs and time while still increasing vehicle quality. To meet this challenge, engineers and researchers in both science and industry are developing effective strategies and flexible tools by enhancing and further integrating powerful, computer-aided design technology. This book provides a valuable overview of the development tools and methods of today and tomorrow. It is targeted not only towards professional project and design engineers, but also to students and to anyone who is interested in state-of-the-art computer-aided development. The book begins with an overview of automotive development processes and the principles of virtual product development. Focusing on computer-aided design, a comprehensive outline of the fundamentals of geometry representation provides a deeper insight into the mathematical techniques used to describe and model geometrical elements. The book then explores the link between the demands of integrated design pr...

  2. Contributions to computational stereology and parallel programming

    DEFF Research Database (Denmark)

    Rasmusson, Allan

    rotator, even without the need for isotropic sections. To meet the need for computational power to perform image restoration of virtual tissue sections, parallel programming on GPUs has also been part of the project. This has led to a significant change in paradigm for a previously developed surgical...

  3. Latest developments for a computer aided thermohydraulic network

    International Nuclear Information System (INIS)

    Alemberti, A.; Graziosi, G.; Mini, G.; Susco, M.

    1999-01-01

    Thermohydraulic networks are 1-D systems characterized by a small number of basic components (pumps, valves, heat exchangers, etc.) connected by pipes and limited spatially by a defined number of boundary conditions (tanks, atmosphere, etc.). The network system is simulated by the well known computer program RELAP5/mod3. Information concerning the network geometry, component behaviour, and initial and boundary conditions is usually supplied to the RELAP5 code in an ASCII input file by means of 'input cards'. CATNET (Computer Aided Thermalhydraulic NETwork) is a graphical user interface that, under specific user guidelines which completely define its range of applicability, permits a very high level of standardization and simplification of the RELAP5/mod3 input deck development process as well as of the output processing. The characteristics of the components (pipes, valves, pumps, etc.) defining the network system can be entered through CATNET. The CATNET interface provides special functions to compute form losses in the most typical bending and branching configurations. When the input of all system components is ready, CATNET is able to generate the RELAP5/mod3 input file. Finally, by means of CATNET, the RELAP5/mod3 code can be run and its output results can be transformed into an intuitive display form. The paper presents an example of application of the CATNET interface as well as the latest developments, which greatly simplified the work of the users and reduced the possibility of input errors. (authors)

  4. Using Animation to Support the Teaching of Computer Game Development Techniques

    Science.gov (United States)

    Taylor, Mark John; Pountney, David C.; Baskett, M.

    2008-01-01

    In this paper, we examine the potential use of animation for supporting the teaching of some of the mathematical concepts that underlie computer games development activities, such as vector and matrix algebra. An experiment was conducted with a group of UK undergraduate computing students to compare the perceived usefulness of animated and static…

  5. Development and evaluation of a computer-based medical work assessment programme

    Directory of Open Access Journals (Sweden)

    Spallek Michael

    2008-12-01

    Full Text Available Abstract Background There are several ways to conduct a job task analysis in medical work environments, including pencil-and-paper observations, interviews and questionnaires. However, these methods entail bias problems such as high inter-individual deviations and risks of misjudgement. Computer-based observation helps to reduce these problems. The aim of this paper is to give an overview of the development process of a computer-based job task analysis instrument for real-time observations to quantify the job tasks performed by physicians working in different medical settings. In addition, reliability and validity data of this instrument will be demonstrated. Methods This instrument was developed in consecutive steps. First, lists comprising tasks performed by physicians in different care settings were classified. Afterwards, the content validity of the task lists was verified. After establishing the final task categories, computer software was programmed and implemented on a mobile personal computer. Finally, inter-observer reliability was evaluated: two trained observers simultaneously recorded the tasks of the same physician. Results Content validity of the task lists was confirmed by observations and by experienced specialists of each medical area. The development process of the job task analysis instrument was completed successfully. Simultaneous records showed adequate interrater reliability. Conclusion Initial results of this analysis supported the validity and reliability of the developed method for assessing physicians' working routines as well as organizational context factors. Based on results obtained with this method, possible improvements for health professionals' work organisation can be identified.

  6. SAR: A fast computer for Camac data acquisition

    International Nuclear Information System (INIS)

    Bricaud, B.; Faivre, J.C.; Pain, J.

    1979-01-01

    This paper describes a special data acquisition and processing facility developed for intermediate-energy Nuclear Physics experiments installed at SATURNE (France) and at CERN (Geneva, Switzerland). Previously, we used a PDP 11/45 computer which was connected to the experiments through a Camac branch highway. In a typical experiment (340 words per event), the computer limited the data acquisition rate to 4 μsec for each 16-bit transfer and the on-line data reduction to only 20 events per second. The initial goal of this project was to improve both of these figures. Previously known acquisition processors were limited by the memory capacity these systems could support; most of the time the data reduction was done on the host minicomputer. A larger memory can be designed with new fast RAM (Intel 2147), and the data processing can now take place on the front-end processor

  7. Lattice QCD Application Development within the US DOE Exascale Computing Project

    Energy Technology Data Exchange (ETDEWEB)

    Brower, Richard [Boston U.; Christ, Norman [Columbia U.; DeTar, Carleton [Utah U.; Edwards, Robert [Jefferson Lab; Mackenzie, Paul [Fermilab

    2017-10-30

    In October 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020s. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community with significant collaborators abroad are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.

  8. Lattice QCD Application Development within the US DOE Exascale Computing Project

    Science.gov (United States)

    Brower, Richard; Christ, Norman; DeTar, Carleton; Edwards, Robert; Mackenzie, Paul

    2018-03-01

    In October 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020s. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community with significant collaborators abroad are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.

  9. Lattice QCD Application Development within the US DOE Exascale Computing Project

    Directory of Open Access Journals (Sweden)

    Brower Richard

    2018-01-01

    Full Text Available In October 2016, the US Department of Energy launched the Exascale Computing Project, which aims to deploy exascale computing resources for science and engineering in the early 2020s. The project brings together application teams, software developers, and hardware vendors in order to realize this goal. Lattice QCD is one of the applications. Members of the US lattice gauge theory community with significant collaborators abroad are developing algorithms and software for exascale lattice QCD calculations. We give a short description of the project, our activities, and our plans.

  10. Evidence-based guidelines for the wise use of computers by children: physical development guidelines.

    Science.gov (United States)

    Straker, L; Maslen, B; Burgess-Limerick, R; Johnson, P; Dennerlein, J

    2010-04-01

    Computer use by children is common and there is concern over the potential impact of this exposure on child physical development. Recently, principles for child-specific, evidence-based guidelines for the wise use of computers were published, including one concerning the facilitation of appropriate physical development. This paper reviews the evidence and presents detailed guidelines for this principle. The guidelines include encouraging a mix of sedentary and whole-body movement tasks; encouraging reasonable postures during computing tasks through workstation, chair, desk, display and input device selection and adjustment; and special issues regarding notebook computer use and carriage, computing skills and responding to discomfort. The limitations of the evidence highlight opportunities for future research. The guidelines themselves can inform parents and teachers, equipment designers and suppliers, and form the basis of content for teaching children the wise use of computers. STATEMENT OF RELEVANCE: Many children use computers and computer-use habits formed in childhood may track into adulthood. Therefore child-computer interaction needs to be carefully managed. These guidelines inform those responsible for children in order to assist in the wise use of computers.

  11. Guidelines for development of NASA (National Aeronautics and Space Administration) computer security training programs

    Science.gov (United States)

    Tompkins, F. G.

    1983-01-01

    The report presents guidance for the NASA Computer Security Program Manager and the NASA Center Computer Security Officials as they develop training requirements and implement computer security training programs. NASA audiences are categorized based on the computer security knowledge required to accomplish identified job functions. Training requirements, in terms of training subject areas, are presented for both computer security program management personnel and computer resource providers and users. Sources of computer security training are identified.

  12. Ethical Issues in Brain-Computer Interface Research, Development, and Dissemination

    NARCIS (Netherlands)

    Vlek, Rutger; Steines, David; Szibbo, Dyana; Kübler, Andrea; Schneider, Mary-Jane; Haselager, Pim; Nijboer, Femke

    The steadily growing field of brain-computer interfacing (BCI) may develop useful technologies, with a potential impact not only on individuals, but also on society as a whole. At the same time, the development of BCI presents significant ethical and legal challenges. In a workshop during the 4th

  13. Ethical Issues in Brain-Computer Interface Research, Development, and Dissemination

    NARCIS (Netherlands)

    Vlek, R.J.; Steines, D.; Szibbo, D.; Kübler, A.; Schneider, M.J.; Haselager, W.F.G.; Nijboer, F.

    2012-01-01

    The steadily growing field of brain–computer interfacing (BCI) may develop useful technologies, with a potential impact not only on individuals, but also on society as a whole. At the same time, the development of BCI presents significant ethical and legal challenges. In a workshop during the 4th

  14. Efficient conjugate gradient algorithms for computation of the manipulator forward dynamics

    Science.gov (United States)

    Fijany, Amir; Scheid, Robert E.

    1989-01-01

    The applicability of conjugate gradient algorithms for computation of the manipulator forward dynamics is investigated. The redundancies in the previously proposed conjugate gradient algorithm are analyzed. A new version is developed which, by avoiding these redundancies, achieves a significantly greater efficiency. A preconditioned conjugate gradient algorithm is also presented. A diagonal matrix whose elements are the diagonal elements of the inertia matrix is proposed as the preconditioner. In order to increase the computational efficiency, an algorithm is developed which exploits the synergism between the computation of the diagonal elements of the inertia matrix and that required by the conjugate gradient algorithm.
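The diagonal (Jacobi) preconditioner described in this record can be sketched generically; the following is an illustrative Python/NumPy implementation, not the authors' manipulator-dynamics code, and the small random symmetric positive definite matrix merely stands in for an inertia matrix.

```python
import numpy as np

def preconditioned_cg(A, b, M_inv_diag, tol=1e-10, max_iter=100):
    """Jacobi-preconditioned conjugate gradient for A x = b.

    A is assumed symmetric positive definite (as a manipulator
    inertia matrix is); M_inv_diag holds 1 / diag(A), i.e. the
    inverse of the diagonal preconditioner proposed in the abstract.
    """
    x = np.zeros_like(b)
    r = b - A @ x                     # initial residual
    z = M_inv_diag * r                # apply preconditioner: M^{-1} r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:   # converged
            break
        z = M_inv_diag * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Small random SPD system standing in for an inertia matrix.
rng = np.random.default_rng(0)
Q = rng.standard_normal((6, 6))
A = Q @ Q.T + 6 * np.eye(6)
b = rng.standard_normal(6)
x = preconditioned_cg(A, b, 1.0 / np.diag(A))
```

Because the preconditioner is diagonal, applying it costs only an elementwise multiply per iteration, which is what makes the inertia-matrix diagonal attractive in the abstract's setting.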

  15. A Computational Model of Spatial Development

    Science.gov (United States)

    Hiraki, Kazuo; Sashima, Akio; Phillips, Steven

    Psychological experiments on children's development of spatial knowledge suggest experience at self-locomotion with visual tracking as important factors. Yet, the mechanism underlying development is unknown. We propose a robot that learns to mentally track a target object (i.e., maintaining a representation of an object's position when outside the field-of-view) as a model for spatial development. Mental tracking is considered as prediction of an object's position given the previous environmental state and motor commands, and the current environment state resulting from movement. Following Jordan & Rumelhart's (1992) forward modeling architecture, the system consists of two components: an inverse model of sensory input to desired motor commands; and a forward model of motor commands to desired sensory input (goals). The robot was tested on the `three cups' paradigm (where children are required to select the cup containing the hidden object under various movement conditions). Consistent with child development, without the capacity for self-locomotion the robot's errors are egocentric (self-centered); when given the ability of self-locomotion, the robot responds allocentrically.
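The forward-model idea in this record can be illustrated with a toy linear example: predict an object's next apparent position from its current position and the motor command. The dynamics, data sizes, and least-squares fit here are assumptions for illustration, not the paper's robot architecture.

```python
import numpy as np

rng = np.random.default_rng(1)

def true_next(pos, u):
    # Hypothetical ground truth: a self-locomotion command u shifts
    # the object's apparent (retinal) position by -u.
    return pos - u

# Training data: (position, motor command) -> next position
pos = rng.uniform(-1.0, 1.0, size=500)
u = rng.uniform(-0.5, 0.5, size=500)
X = np.column_stack([pos, u])
y = true_next(pos, u)

# Fit a linear forward model by least squares.
w, *_ = np.linalg.lstsq(X, y, rcond=None)

# "Mental tracking": predict where the hidden object is after a
# self-locomotion command, without seeing it.
predicted = np.array([0.3, 0.2]) @ w    # pos = 0.3, command u = 0.2
```

The fitted weights recover the ego-motion compensation (+1 for the previous position, -1 for the command), so the hidden object's position can be maintained while it is out of view.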

  16. The Effects of the Previous Outcome on Probabilistic Choice in Rats

    Science.gov (United States)

    Marshall, Andrew T.; Kirkpatrick, Kimberly

    2014-01-01

    This study examined the effects of previous outcomes on subsequent choices in a probabilistic-choice task. Twenty-four rats were trained to choose between a certain outcome (1 or 3 pellets) versus an uncertain outcome (3 or 9 pellets), delivered with a probability of .1, .33, .67, and .9 in different phases. Uncertain outcome choices increased with the probability of uncertain food. Additionally, uncertain choices increased with the probability of uncertain food following both certain-choice outcomes and unrewarded uncertain choices. However, following uncertain-choice food outcomes, there was a tendency to choose the uncertain outcome in all cases, indicating that the rats continued to “gamble” after successful uncertain choices, regardless of the overall probability or magnitude of food. A subsequent manipulation, in which the probability of uncertain food varied within each session as a function of the previous uncertain outcome, examined how the previous outcome and probability of uncertain food affected choice in a dynamic environment. Uncertain-choice behavior increased with the probability of uncertain food. The rats exhibited increased sensitivity to probability changes and a greater degree of win–stay/lose–shift behavior than in the static phase. Simulations of two sequential choice models were performed to explore the possible mechanisms of reward value computations. The simulation results supported an exponentially decaying value function that updated as a function of trial (rather than time). These results emphasize the importance of analyzing global and local factors in choice behavior and suggest avenues for the future development of sequential-choice models. PMID:23205915
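The trial-based exponentially decaying value function supported by the simulations can be sketched as a delta-rule update inside a softmax choice loop; the learning rate, inverse temperature, and payoff schedule below are illustrative assumptions, not the fitted model.

```python
import numpy as np

def update_value(v, outcome, alpha=0.2):
    """Delta-rule update: the stored value moves toward the latest
    outcome, so older outcomes decay by (1 - alpha) per trial --
    a trial-indexed exponentially decaying value function.
    alpha is an illustrative learning rate."""
    return v + alpha * (outcome - v)

# Simulate choices between a certain option (3 pellets) and an
# uncertain option (9 pellets with p = .33) with a softmax rule.
rng = np.random.default_rng(2)
v_certain, v_uncertain = 0.0, 0.0
beta = 1.0                                  # softmax inverse temperature
uncertain_choices = 0
n_trials = 2000
for _ in range(n_trials):
    p_unc = 1.0 / (1.0 + np.exp(-beta * (v_uncertain - v_certain)))
    if rng.random() < p_unc:
        uncertain_choices += 1
        outcome = 9.0 if rng.random() < 0.33 else 0.0
        v_uncertain = update_value(v_uncertain, outcome)
    else:
        v_certain = update_value(v_certain, 3.0)
```

Because the update is indexed by trial rather than by elapsed time, the value estimate after a run of wins overweights those recent wins, which is one way such a model can reproduce the post-win "gambling" tendency described above.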

  17. Recent development of computational resources for new antibiotics discovery

    DEFF Research Database (Denmark)

    Kim, Hyun Uk; Blin, Kai; Lee, Sang Yup

    2017-01-01

    Understanding a complex working mechanism of biosynthetic gene clusters (BGCs) encoding secondary metabolites is a key to discovery of new antibiotics. Computational resources continue to be developed in order to better process increasing volumes of genome and chemistry data, and thereby better...

  18. Young Children's Computer Skills Development from Kindergarten to Third Grade

    Science.gov (United States)

    Sackes, Mesut; Trundle, Kathy Cabe; Bell, Randy L.

    2011-01-01

    This investigation explores young children's computer skills development from kindergarten to third grade using the Early Childhood Longitudinal Study-Kindergarten (ECLS-K) dataset. The sample size of the study was 8642 children. Latent growth curve modeling analysis was used as an analytical tool to examine the development of children's computer…

  19. Development of industrial x-ray computed tomography and its application to refractories

    International Nuclear Information System (INIS)

    Aiba, Yoshiro; Oki, Kazuo; Nakamura, Shigeo; Fujii, Masashi.

    1985-01-01

    An industrial X-ray computed tomography system was developed under the influence of the rapid spread of the X-ray CT scanner in the medical field and improvements of the equipment. Current nondestructive testing machines for refractories use the ultrasonic inspection method or the X-ray fluoroscopic method, but these cannot produce a tomogram or carry out quantitative evaluation. Using an industrial X-ray computed tomography system, submerged nozzles for continuous casting of steel were analyzed with interesting results. The features of the industrial X-ray computed tomography applied to refractory nozzles are as follows: (1) It promptly detects interior defects. (2) It can measure dimensions and shapes. (3) It can numerically express the distribution of density. Accordingly, it is expected that industrial X-ray computed tomography will be widely used in the development and quality control of refractories and advanced ceramic materials. (author)

  20. The Development of Educational and/or Training Computer Games for Students with Disabilities

    Science.gov (United States)

    Kwon, Jungmin

    2012-01-01

    Computer and video games have much in common with the strategies used in special education. Free resources for game development are becoming more widely available, so lay computer users, such as teachers and other practitioners, now have the capacity to develop games using a low budget and a little self-teaching. This article provides a guideline…

  1. Development of a Computational Steering Framework for High Performance Computing Environments on Blue Gene/P Systems

    KAUST Repository

    Danani, Bob K.

    2012-01-01

    of simulation results in a post-processing step is now transformed into a real-time interactive workflow that significantly reduces development and testing time. Computational steering provides the capability to direct or re-direct the progress of a simulation

  2. CONTRIBUTIONS FOR DEVELOPING OF A COMPUTER AIDED LEARNING ENVIRONMENT OF DESCRIPTIVE GEOMETRY

    Directory of Open Access Journals (Sweden)

    Antonescu Ion

    2009-07-01

    Full Text Available The paper presents the authors’ contributions for developing a computer code for teaching of descriptive geometry using the computer aided learning techniques. The program was implemented using the programming interface and the 3D modeling capabilities of the AutoCAD system.

  3. [Personal computer-based computer monitoring system of the anesthesiologist (2-year experience in development and use)].

    Science.gov (United States)

    Buniatian, A A; Sablin, I N; Flerov, E V; Mierbekov, E M; Broĭtman, O G; Shevchenko, V V; Shitikov, I I

    1995-01-01

    Creation of computer monitoring systems (CMS) for operating rooms is one of the most important spheres of personal computer employment in anesthesiology. The authors developed a PC RS/AT-based CMS and have used it effectively for more than 2 years. This system permits comprehensive monitoring in cardiosurgical operations by real-time processing of the values of arterial and central venous pressure, pressure in the pulmonary artery, bioelectrical activity of the brain, and two temperature values. Use of this CMS helped appreciably improve patients' safety during surgery. The possibility of assessing brain function by computer monitoring of the EEG simultaneously with central hemodynamics and body temperature permits the anesthesiologist to objectively assess the depth of anesthesia and to diagnose cerebral hypoxia. The automated anesthesiological chart issued by the CMS after surgery reliably reflects the patient's status and the measures taken by the anesthesiologist.

  4. The development of a computer technique for the investigation of reactor lattice parameters

    International Nuclear Information System (INIS)

    Joubert, W.R.

    1982-01-01

    An integrated computer technique was developed whereby all the computer programmes needed to calculate reactor lattice parameters from basic neutron data, could be combined in one system. The theory of the computer programmes is explained in detail. Results are given and compared with experimental values as well as those calculated with a standard system

  5. Developing Activities for Teaching Cloud Computing and Virtualization

    Directory of Open Access Journals (Sweden)

    E. Erturk

    2014-10-01

    Full Text Available Cloud computing and virtualization are new but indispensable components of computer engineering and information systems curricula for universities and higher education institutions. Learning about these topics is important for students preparing to work in the IT industry. In many companies, information technology operates under tight financial constraints. Virtualization (for example storage, desktop, and server virtualization) reduces overall IT costs through the consolidation of systems. It also results in reduced loads and energy savings in terms of the power and cooling infrastructure. Therefore it is important to investigate the practical aspects of this topic both for industry practice and for teaching purposes. This paper demonstrates some activities undertaken recently by students at the Eastern Institute of Technology New Zealand and concludes with general recommendations for IT educators, software developers, and other IT professionals

  6. Using Computer-Assisted Argumentation Mapping to develop effective argumentation skills in high school advanced placement physics

    Science.gov (United States)

    Heglund, Brian

    Educators recognize the importance of reasoning ability for the development of critical thinking skills, conceptual change, metacognition, and participation in 21st century society. There is a recognized need for students to improve their skills of argumentation; however, argumentation is not explicitly taught outside logic and philosophy---subjects that are not part of the K-12 curriculum. One potential way of supporting the development of argumentation skills in the K-12 context is through incorporating Computer-Assisted Argument Mapping to evaluate arguments. This quasi-experimental study tested the effects of such argument mapping software and was informed by the following two research questions: 1. To what extent does the collaborative use of Computer-Assisted Argumentation Mapping to evaluate competing theories influence the critical thinking skill of argument evaluation, metacognitive awareness, and conceptual knowledge acquisition in high school Advanced Placement physics, compared to the more traditional method of text tables that does not employ Computer-Assisted Argumentation Mapping? 2. What are the student perceptions of the pros and cons of argument evaluation in the high school Advanced Placement physics environment? This study examined changes in critical thinking skills, including argumentation evaluation skills, as well as metacognitive awareness and conceptual knowledge, in two groups: a treatment group using Computer-Assisted Argumentation Mapping to evaluate physics arguments, and a comparison group using text tables to evaluate physics arguments. Quantitative and qualitative methods for collecting and analyzing data were used to answer the research questions. Quantitative data indicated no significant difference between the experimental groups, and qualitative data suggested students perceived pros and cons of argument evaluation in the high school Advanced Placement physics environment, such as a self-reported sense of improvement in argument

  7. Computer Game-based Learning: Applied Game Development Made Simpler

    NARCIS (Netherlands)

    Nyamsuren, Enkhbold

    2018-01-01

    The RAGE project (Realising an Applied Gaming Ecosystem, http://rageproject.eu/) is an ongoing initiative that aims to offer an ecosystem to support serious games’ development and use. Its two main objectives are to provide technologies for computer game-based pedagogy and learning and to establish

  8. Recent developments and applications in mathematics and computer science

    International Nuclear Information System (INIS)

    Churchhouse, R.F.; Tahir Shah, K.; Zanella, P.

    1991-01-01

    The book contains 8 invited lectures and 4 short seminars presented at the College on Recent Developments and Applications in Mathematics and Computer Science held in Trieste from 7 May to 1 June 1990. A separate abstract was prepared for each paper. Refs, figs and tabs

  9. Towards playful learning and computational thinking — Developing the educational robot BRICKO

    DEFF Research Database (Denmark)

    Pedersen, B. K. M. K.; Andersen, K. E.; Jørgensen, A.

    2018-01-01

    Educational Robotics has proven a feasible way of supporting and exemplifying Computational Thinking. With this paper, we describe the user-centered iterative and incremental development of a new educational robotic system, BRICKO, to support tangible, social and playful interaction while educating...... children in 1st–3rd grade in Computational Thinking. We develop the system through seven main iterations including a total of 108 participant pupils and their teachers. The methodology is a mixture of observation and interviews using Wizard of OZ testing with the early pilot prototypes as well as usability...... categories of command-bricks. We discuss the methodologies used for assuring a playful and social educational robotic system and conclude that we achieved a useful prototype for supporting education in Computational Thinking....

  10. Shielding Benchmark Computational Analysis

    International Nuclear Information System (INIS)

    Hunter, H.T.; Slater, C.O.; Holland, L.B.; Tracz, G.; Marshall, W.J.; Parsons, J.L.

    2000-01-01

    Over the past several decades, nuclear science has relied on experimental research to verify and validate information about shielding nuclear radiation for a variety of applications. These benchmarks are compared with results from computer code models and are useful for the development of more accurate cross-section libraries, computer code development of radiation transport modeling, and building accurate tests for miniature shielding mockups of new nuclear facilities. When documenting measurements, one must describe many parts of the experimental results to allow a complete computational analysis. Both old and new benchmark experiments, by any definition, must provide a sound basis for modeling more complex geometries required for quality assurance and cost savings in nuclear project development. Benchmarks may involve one or many materials and thicknesses, types of sources, and measurement techniques. In this paper the benchmark experiments of varying complexity are chosen to study the transport properties of some popular materials and thicknesses. These were analyzed using three-dimensional (3-D) models and continuous energy libraries of MCNP4B2, a Monte Carlo code developed at Los Alamos National Laboratory, New Mexico. A shielding benchmark library provided the experimental data and allowed a wide range of choices for source, geometry, and measurement data. The experimental data had often been used in previous analyses by reputable groups such as the Cross Section Evaluation Working Group (CSEWG) and the Organization for Economic Cooperation and Development/Nuclear Energy Agency Nuclear Science Committee (OECD/NEANSC)

  11. Methods for the development of large computer codes under LTSS

    International Nuclear Information System (INIS)

    Sicilian, J.M.

    1977-06-01

    TRAC is a large computer code being developed by Group Q-6 for the analysis of the transient thermal hydraulic behavior of light-water nuclear reactors. A system designed to assist the development of TRAC is described. The system consists of a central HYDRA dataset, R6LIB, containing files used in the development of TRAC, and a file maintenance program, HORSE, which facilitates the use of this dataset

  12. Measuring the impact of computer resource quality on the software development process and product

    Science.gov (United States)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process was speculated to have measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data was extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product as exemplified by the subject NASA data was examined. Based upon the results, a number of computer resource-related implications are provided.

  13. The Computational Development of Reinforcement Learning during Adolescence.

    Directory of Open Access Journals (Sweden)

    Stefano Palminteri

    2016-06-01

    Full Text Available Adolescence is a period of life characterised by changes in learning and decision-making. Learning and decision-making do not rely on a unitary system, but instead require the coordination of different cognitive processes that can be mathematically formalised as dissociable computational modules. Here, we aimed to trace the developmental time-course of the computational modules responsible for learning from reward or punishment, and learning from counterfactual feedback. Adolescents and adults carried out a novel reinforcement learning paradigm in which participants learned the association between cues and probabilistic outcomes, where the outcomes differed in valence (reward versus punishment) and feedback was either partial or complete (either the outcome of the chosen option only, or the outcomes of both the chosen and unchosen options, were displayed). Computational strategies changed during development: whereas adolescents' behaviour was better explained by a basic reinforcement learning algorithm, adults' behaviour integrated increasingly complex computational features, namely a counterfactual learning module (enabling enhanced performance in the presence of complete feedback) and a value contextualisation module (enabling symmetrical reward and punishment learning). Unlike adults, adolescent performance did not benefit from counterfactual (complete) feedback. In addition, while adults learned symmetrically from both reward and punishment, adolescents learned from reward but were less likely to learn from punishment. This tendency to rely on rewards and not to consider alternative consequences of actions might contribute to our understanding of decision-making in adolescence.
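The contrast between a basic reinforcement-learning update and an added counterfactual-learning module can be sketched as follows; the delta-rule form and the learning-rate values are illustrative assumptions, not the fitted models from the study.

```python
import numpy as np

def rl_step(q, choice, outcomes, alpha=0.3, alpha_cf=0.3,
            complete_feedback=True):
    """One update of a simple two-option reinforcement-learning
    model: a factual delta-rule update for the chosen option,
    plus a counterfactual update for the unchosen option when
    complete feedback is available. Learning rates are
    illustrative, not fitted values."""
    q = q.copy()
    other = 1 - choice
    q[choice] += alpha * (outcomes[choice] - q[choice])
    if complete_feedback:
        q[other] += alpha_cf * (outcomes[other] - q[other])
    return q

# With complete feedback both option values are updated; with
# partial feedback only the chosen option's value changes.
q0 = np.zeros(2)
q_complete = rl_step(q0, choice=0, outcomes=(1.0, -1.0))
q_partial = rl_step(q0, choice=0, outcomes=(1.0, -1.0),
                    complete_feedback=False)
```

Setting `alpha_cf = 0` reduces the model to the basic algorithm that better explained adolescents' behaviour, which is one way such module-wise comparisons are made.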

  14. Development of a computer-based pulsed NMR thermometer

    International Nuclear Information System (INIS)

    Hobeika, Alexandre; Haard, T.M.; Hoskinson, E.M.; Packard, R.E.

    2003-01-01

    We have designed a fully computer-controlled pulsed NMR system, using the National Instruments PCI-6115 data acquisition board. We use it for millikelvin thermometry and have developed a special control program, written in LabVIEW, for this purpose. It can perform measurements of temperature via the susceptibility or via the τ1 dependence. This system requires little hardware, which makes it very versatile, easily reproducible and customizable

  15. Development of computational algorithms for quantification of pulmonary structures

    International Nuclear Information System (INIS)

    Oliveira, Marcela de; Alvarez, Matheus; Alves, Allan F.F.; Miranda, Jose R.A.; Pina, Diana R.

    2012-01-01

    High-resolution computed tomography (HRCT) has become the imaging diagnostic exam most commonly used for evaluating the sequelae of paracoccidioidomycosis. Subjective evaluation of the radiological abnormalities found on HRCT images does not provide an accurate quantification. Computer-aided diagnosis systems produce a more objective assessment of the abnormal patterns found in HRCT images. Thus, this research proposes the development of algorithms, in the MATLAB® computing environment, that can semi-automatically quantify pathologies such as pulmonary fibrosis and emphysema. The algorithm consists in selecting a region of interest (ROI) and, by the use of masks, density filters and morphological operators, obtaining a quantification of the injured area relative to the area of a healthy lung. The proposed method was tested on ten HRCT scans of patients with confirmed PCM. The results of the semi-automatic measurements were compared with subjective evaluations performed by a specialist in radiology, reaching an agreement of 80% for emphysema and 58% for fibrosis. (author)
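
    As a rough illustration of the density-filter idea, the sketch below computes an emphysema fraction inside an ROI by thresholding Hounsfield units. The -950 HU cutoff is a commonly used literature value, and the array, mask, and threshold are illustrative assumptions rather than the paper's actual parameters:

    ```python
    import numpy as np

    def emphysema_fraction(hu, roi_mask, threshold=-950):
        """Fraction of ROI voxels below an attenuation threshold (density mask)."""
        roi = hu[roi_mask]
        if roi.size == 0:
            return 0.0
        return float(np.count_nonzero(roi < threshold)) / roi.size

    # Toy 2-D "slice": two of the four ROI pixels fall below -950 HU.
    hu = np.array([[-980.0, -700.0], [-960.0, -400.0]])
    roi = np.ones_like(hu, dtype=bool)
    print(emphysema_fraction(hu, roi))  # 0.5
    ```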

  16. Development of a scanning proton microprobe - computer-control, elemental mapping and applications

    International Nuclear Information System (INIS)

    Loevestam, Goeran.

    1989-08-01

    A scanning proton microprobe set-up has been developed at the Pelletron accelerator in Lund. A magnetic beam scanning system and a computer-control system for beam scanning and data acquisition are described. The computer system consists of a VMEbus front-end computer and a μVax-II host computer, interfaced by means of a high-speed data link. The VMEbus computer controls data acquisition, beam charge monitoring and beam scanning, while the more sophisticated work of elemental mapping and spectrum evaluation is left to the μVax-II. The calibration of the set-up is described, as well as several applications. Elemental micro-patterns in tree rings and bark have been investigated by means of elemental mapping and quantitative analysis. Large variations of elemental concentrations have been found for several elements within a single tree ring. An external beam set-up has been developed in addition to the proton microprobe set-up. The external beam has been used for the analysis of antique papyrus documents. Using a scanning sample procedure and particle induced X-ray emission (PIXE) analysis, damaged and missing characters of the text could be made visible by means of multivariate statistical data evaluation and elemental mapping. Aspects of elemental mapping by means of scanning μPIXE analysis are also discussed. Spectrum background, target thickness variations and pile-up are shown to influence the structure of elemental maps considerably. In addition, a semi-quantification procedure has been developed. (author)

  17. Factors Influencing Cloud-Computing Technology Adoption in Developing Countries

    Science.gov (United States)

    Hailu, Alemayehu

    2012-01-01

    Adoption of new technology has complicating components both from the selection, as well as decision-making criteria and process. Although new technology such as cloud computing provides great benefits especially to the developing countries, it has challenges that may complicate the selection decision and subsequent adoption process. This study…

  18. Development of computational models for the simulation of isodose curves on dosimetry films generated by iodine-125 brachytherapy seeds

    International Nuclear Information System (INIS)

    Santos, Adriano M.; Meira-Belo, Luiz C.; Reis, Sergio C.; Grynberg, Suely E.

    2011-01-01

    Interstitial brachytherapy is a modality of radiotherapy in which radioactive sources are placed directly in the region to be treated or close to it. The seeds used in the treatment of prostate cancer are generally cylindrical radioactive sources, consisting of a ceramic or metal matrix, which acts as the carrier of the radionuclide and as the X-ray marker, encapsulated in a sealed titanium tube. This study aimed to develop a computational model that reproduces the film-seed geometry, in order to obtain the spatial regions of the isodose curves produced by the seed when it is placed over the film surface. The seed modeled in this work was the OncoSeed 6711, a sealed source of iodine-125, whose isodose curves were obtained experimentally in a previous work with the use of dosimetric films. For the film modeling, the compositions and densities of two types of dosimetric films were used: the Agfa Personal Monitoring 2/10 photographic film, manufactured by Agfa-Gevaert, and the EBT radiochromic film, by International Specialty Products. The film-seed models were coupled to the Monte Carlo code MCNP5. The results obtained by simulation were in good agreement with the experimental results of the previous work. This indicates that the computational model can be used in future studies of other seed models. (author)

  19. Development of computational models for the simulation of isodose curves on dosimetry films generated by iodine-125 brachytherapy seeds

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Adriano M.; Meira-Belo, Luiz C.; Reis, Sergio C.; Grynberg, Suely E., E-mail: amsantos@cdtn.b [Center for Development of Nuclear Technology (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2011-07-01

    Interstitial brachytherapy is a modality of radiotherapy in which radioactive sources are placed directly in the region to be treated or close to it. The seeds used in the treatment of prostate cancer are generally cylindrical radioactive sources, consisting of a ceramic or metal matrix, which acts as the carrier of the radionuclide and as the X-ray marker, encapsulated in a sealed titanium tube. This study aimed to develop a computational model that reproduces the film-seed geometry, in order to obtain the spatial regions of the isodose curves produced by the seed when it is placed over the film surface. The seed modeled in this work was the OncoSeed 6711, a sealed source of iodine-125, whose isodose curves were obtained experimentally in a previous work with the use of dosimetric films. For the film modeling, the compositions and densities of two types of dosimetric films were used: the Agfa Personal Monitoring 2/10 photographic film, manufactured by Agfa-Gevaert, and the EBT radiochromic film, by International Specialty Products. The film-seed models were coupled to the Monte Carlo code MCNP5. The results obtained by simulation were in good agreement with the experimental results of the previous work. This indicates that the computational model can be used in future studies of other seed models. (author)

  20. International Developments in Computer Science.

    Science.gov (United States)

    1982-06-01

    background on China’s scientific research and on their computer science before 1978. A useful companion to the directory is another publication of the... bimonthly publication in Portuguese; occasional translation of foreign articles into Portuguese. Data News: a bimonthly industry newsletter. Sistemas... computer-related topics; Spanish. Delta: publication of local users group; Spanish. Sistemas: publication of System Engineers of Colombia; Spanish. CUBA

  1. Design, Development, and Evaluation of a Mobile Learning Application for Computing Education

    Science.gov (United States)

    Oyelere, Solomon Sunday; Suhonen, Jarkko; Wajiga, Greg M.; Sutinen, Erkki

    2018-01-01

    The study focused on the application of the design science research approach in the course of developing a mobile learning application, MobileEdu, for computing education in the Nigerian higher education context. MobileEdu facilitates the learning of computer science courses on mobile devices. The application supports ubiquitous, collaborative,…

  2. Enabling Customization through Web Development: An Iterative Study of the Dell Computer Corporation Website

    Science.gov (United States)

    Liu, Chang; Mackie, Brian G.

    2008-01-01

    Throughout the last decade, companies have increased their investment in electronic commerce (EC) by developing and implementing Web-based applications on the Internet. This paper describes a class project to develop a customized computer website which is similar to Dell Computer Corporation's (Dell) website. The objective of this project is to…

  3. Development of a proton Computed Tomography Detector System

    Energy Technology Data Exchange (ETDEWEB)

    Naimuddin, Md. [Delhi U.; Coutrakon, G. [Northern Illinois U.; Blazey, G. [Northern Illinois U.; Boi, S. [Northern Illinois U.; Dyshkant, A. [Northern Illinois U.; Erdelyi, B. [Northern Illinois U.; Hedin, D. [Northern Illinois U.; Johnson, E. [Northern Illinois U.; Krider, J. [Northern Illinois U.; Rukalin, V. [Northern Illinois U.; Uzunyan, S. A. [Northern Illinois U.; Zutshi, V. [Northern Illinois U.; Fordt, R. [Fermilab; Sellberg, G. [Fermilab; Rauch, J. E. [Fermilab; Roman, M. [Fermilab; Rubinov, P. [Fermilab; Wilson, P. [Fermilab

    2016-02-04

    Computed tomography is one of the most promising new methods to image abnormal tissues inside the human body. Tomography is also used to position the patient accurately before radiation therapy. Hadron therapy for treating cancer has become one of the most advantageous and safe options. In order to fully utilize the advantages of hadron therapy, it is necessary to perform radiography with hadrons as well. In this paper we present the development of a proton computed tomography system. Our second-generation proton tomography system consists of two upstream and two downstream trackers made up of fibers as active material and a range detector consisting of plastic scintillators. We present details of the detector system, readout electronics, and data acquisition system, as well as the commissioning of the entire system. We also present preliminary results from the test beam of the range detector.

  4. Etiological factors for developing carpal tunnel syndrome in people who work with computers

    Directory of Open Access Journals (Sweden)

    Magdalena Lewańska

    2013-02-01

    Full Text Available Background: Carpal tunnel syndrome (CTS) is the most frequent mononeuropathy of the upper extremities. Since the early 1990s it has been suggested that intensive work with computers can result in CTS development; however, this relationship has not yet been proved. The aim of the study was to evaluate occupational and non-occupational risk factors for developing CTS in the population of computer users. Material and Methods: The study group comprised 60 patients (58 women and 2 men; mean age: 53.8±6.35 years) working with computers and suspected of occupational CTS. A survey as well as both median and ulnar nerve conduction examination (NCS) were performed in all the subjects. Results: The patients worked with computers for 6.43±1.71 h per day. The mean latency between the beginning of employment and the occurrence of first CTS symptoms was 12.09±5.94 years. All patients met the clinical and electrophysiological diagnostic criteria of CTS. In the majority of patients the etiological factors for developing CTS were non-occupational: obesity, hypothyroidism, oophorectomy, past hysterectomy, hormonal replacement therapy or oral contraceptives, recent menopause, diabetes, tendovaginitis. In 7 computer users no etiological factors were identified. Conclusion: The results of our study show that CTS is usually generated by different causes not related to using computers at work. Med Pr 2013;64(1):37–45

  5. Development of the two Korean adult tomographic computational phantoms for organ dosimetry

    International Nuclear Information System (INIS)

    Lee, Choonsik; Lee, Choonik; Park, Sang-Hyun; Lee, Jai-Ki

    2006-01-01

    Following the previously developed Korean tomographic phantom, KORMAN, two additional whole-body tomographic phantoms of Korean adult males were developed from magnetic resonance (MR) and computed tomography (CT) images, respectively. Two healthy male volunteers, whose body dimensions were fairly representative of the average Korean adult male, were recruited and scanned for phantom development. Contiguous whole-body MR images were obtained from one subject, exclusive of the arms, while whole-body CT images were acquired from the second individual. A total of 29 organs and tissues and 19 skeletal sites were segmented via image manipulation techniques such as gray-level thresholding, region growing, and manual drawing, in which each segmented image slice was subsequently reviewed by an experienced radiologist for anatomical accuracy. The resulting phantoms, the MR-based KTMAN-1 (Korean Typical MAN-1) and the CT-based KTMAN-2 (Korean Typical MAN-2), consist of 300x150x344 voxels with a voxel resolution of 2x2x5 mm³ for both phantoms. Masses of segmented organs and tissues were calculated as the product of a nominal reference density, the per-voxel volume, and the cumulative number of voxels defining each organ or tissue. These organ masses were then compared with those of both the Asian and the ICRP reference adult male. Organ masses within both KTMAN-1 and KTMAN-2 showed differences within 40% of Asian and ICRP reference values, with the exception of the skin, gall bladder, and pancreas, which displayed larger differences. The resulting three-dimensional binary file was ported to the Monte Carlo code MCNPX2.4 to calculate organ doses following external irradiation for illustrative purposes. Colon, lung, liver, and stomach absorbed doses, as well as the effective dose, for idealized photon irradiation geometries (anterior-posterior and right lateral) were determined, and then compared with data from two other tomographic phantoms (Asian and Caucasian), and
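
    The mass computation described above (density × per-voxel volume × voxel count) is simple enough to sketch directly; the voxel counts and density value below are illustrative assumptions, not values from the phantoms:

    ```python
    def organ_mass_g(n_voxels, density_g_per_cm3, voxel_mm=(2, 2, 5)):
        """Organ mass as nominal density x per-voxel volume x voxel count.

        The default voxel_mm matches the 2 x 2 x 5 mm resolution of
        KTMAN-1 and KTMAN-2.
        """
        dx, dy, dz = voxel_mm
        voxel_cm3 = (dx * dy * dz) / 1000.0  # mm^3 -> cm^3 (here 0.02 cm^3)
        return density_g_per_cm3 * voxel_cm3 * n_voxels

    # e.g. 75,000 segmented voxels of soft tissue at a nominal 1.05 g/cm^3:
    print(organ_mass_g(75000, 1.05))  # 1575.0 g
    ```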

  6. Development of algorithm for continuous generation of a computer game in terms of usability and optimization of developed code in computer science

    Directory of Open Access Journals (Sweden)

    Tibor Skala

    2018-03-01

    Full Text Available As both hardware and software have become increasingly available and are constantly developed, they contribute globally to improvements in every field of technology and the arts. Digital tools for the creation and processing of graphical content are highly developed and are designed to shorten the time required for content creation, which is, in this case, animation. Since contemporary animation has experienced a surge in visual styles and visualization methods, programming is built into everything currently in use. There is no doubt that a variety of algorithms and software are the brain and the moving force behind any idea created for a specific purpose and applicability in society. Art and technology combined make a direct and oriented medium for publishing and marketing in every industry, including those not closely related to fields that rely heavily on the visual aspect of work. Additionally, the quality and consistency of an algorithm will depend on its proper integration into the system it powers, as well as on the way the algorithm is designed. The development of an endless algorithm and its effective use are demonstrated in a computer game. In order to present the effect of various parameters, in the final phase of the computer game development the endless algorithm was tested with a varying number of key input parameters (time achieved, score reached, pace of the game).

  7. Scalable Computational Chemistry: New Developments and Applications

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri [Iowa State Univ., Ames, IA (United States)

    2002-01-01

    The computational part of the thesis is the investigation of titanium(II) chloride as a potential catalyst for the bis-silylation reaction of ethylene with hexachlorodisilane at different levels of theory. Bis-silylation is an important reaction for producing bis(silyl) compounds and new C-Si bonds, which can serve as monomers for silicon-containing polymers and silicon carbides. Ab initio calculations on the steps involved in a proposed mechanism are presented. This choice of reactants allows the reaction to be studied at reliable levels of theory without compromising accuracy. The calculations indicate that this is a highly exothermic barrierless reaction. The TiCl2 catalyst removes a 50 kcal/mol activation energy barrier required for the reaction without the catalyst. The first step is interaction of TiCl2 with ethylene to form an intermediate that is 60 kcal/mol below the energy of the reactants. This is the driving force for the entire reaction. Dynamic correlation plays a significant role because RHF calculations indicate that the net barrier for the catalyzed reaction is 50 kcal/mol. They conclude that divalent Ti has the potential to become an important industrial catalyst for silylation reactions. In the programming part of the thesis, the parallelization of different quantum chemistry methods is presented. The parallelization of code is becoming an important aspect of quantum chemistry code development. Two trends contribute to it: the overall desire to study large chemical systems and the desire to employ highly correlated methods, which are usually computationally and memory expensive. In the presented distributed-data algorithms, computation is parallelized and the largest arrays are evenly distributed among CPUs. First, the parallelization of the Hartree-Fock self-consistent field (SCF) method is considered. The SCF method is the most common starting point for more accurate calculations. The Fock build (a sub-step of SCF) from AO integrals is
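
    The distributed-data idea (the largest arrays split as evenly as possible among CPUs) can be illustrated with a row-block partition. The helper below is a generic sketch of such a decomposition, not code from the thesis:

    ```python
    def row_block(n_rows, n_procs, rank):
        """Contiguous row range [start, stop) owned by `rank` when n_rows
        are divided as evenly as possible among n_procs processes."""
        base, extra = divmod(n_rows, n_procs)
        # The first `extra` ranks each own one additional row.
        start = rank * base + min(rank, extra)
        stop = start + base + (1 if rank < extra else 0)
        return start, stop

    # 10 matrix rows over 4 processes -> block sizes 3, 3, 2, 2.
    print([row_block(10, 4, r) for r in range(4)])  # [(0, 3), (3, 6), (6, 8), (8, 10)]
    ```

    Each process then builds only its own block of, say, the Fock matrix, so no single CPU needs to hold the full array in memory.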

  8. Using NCLab-karel to improve computational thinking skill of junior high school students

    Science.gov (United States)

    Kusnendar, J.; Prabawa, H. W.

    2018-05-01

    Increasing human interaction with technology and the increasingly complex development of the digital technology world make the theme of computer science education interesting to study. Previous studies on Computer Literacy and Competency reveal that Indonesian teachers in general have fairly high computational skill, but their skill utilization is limited to some applications. This engenders limited and minimal computer-related learning for the students. On the other hand, computer science education is considered unrelated to real-world solutions. This paper attempts to address the utilization of NCLab-Karel in shaping computational thinking in students. This computational thinking is believed to help students learn about technology. Implementation of Karel provides information that Karel is able to increase student interest in studying computational material, especially algorithms. Observations made during the learning process also indicate the growth and development of a computing mindset in students.

  9. Developing Oral and Written Communication Skills in Undergraduate Computer Science and Information Systems Curriculum

    Science.gov (United States)

    Kortsarts, Yana; Fischbach, Adam; Rufinus, Jeff; Utell, Janine M.; Yoon, Suk-Chung

    2010-01-01

    Developing and applying oral and written communication skills in the undergraduate computer science and computer information systems curriculum--one of the ABET accreditation requirements--is a very challenging and, at the same time, a rewarding task that provides various opportunities to enrich the undergraduate computer science and computer…

  10. Development of the computer code system for the analyses of PWR core

    International Nuclear Information System (INIS)

    Tsujimoto, Iwao; Naito, Yoshitaka.

    1992-11-01

    This report is one of the materials for the work titled 'Development of the computer code system for the analyses of PWR core phenomena', which is performed under contracts between Shikoku Electric Power Company and JAERI. In this report, the numerical methods adopted in our computer code system are described, that is, 'The basic course and the summary of the analysing method', 'Numerical method for solving the Boltzmann equation', 'Numerical method for solving the thermo-hydraulic equations' and 'Description of the computer code system'. (author)

  11. Development of Computational Models for Pyrochemical Electrorefiners of Nuclear Waste Transmutation Systems

    International Nuclear Information System (INIS)

    Kim, K. R.; Lee, H. S.; Hwang, I. S.

    2010-12-01

    The objective of this project is to develop multi-dimensional computational models in order to improve the operation of uranium electrorefiners currently used in pyroprocessing technology. These 2-D (US) and 3-D (ROK) mathematical models are based on the fundamental physical and chemical properties of the electrorefiner processes. The models, validated against compiled and evaluated experimental data, could provide better information for developing advanced electrorefiners for uranium recovery. The research results in this period are as follows: - Successfully assessed a common computational platform for the modeling work and identified spatial characterization requirements. - Successfully developed a 3-D electro-fluid dynamic electrorefiner model. - Successfully validated and benchmarked the two multi-dimensional models with compiled experimental data sets

  12. Computational techniques in gamma-ray skyshine analysis

    International Nuclear Information System (INIS)

    George, D.L.

    1988-12-01

    Two computer codes were developed to analyze gamma-ray skyshine, the scattering of gamma photons by air molecules. A review of previous gamma-ray skyshine studies discusses several Monte Carlo codes, programs using a single-scatter model, and the MicroSkyshine program for microcomputers. A benchmark gamma-ray skyshine experiment performed at Kansas State University is also described. A single-scatter numerical model was presented which traces photons from the source to their first scatter, then applies a buildup factor along a direct path from the scattering point to a detector. The FORTRAN code SKY, developed with this model before the present study, was modified to use Gauss quadrature, recent photon attenuation data and a more accurate buildup approximation. The resulting code, SILOGP, computes response from a point photon source on the axis of a silo, with and without concrete shielding over the opening. Another program, WALLGP, was developed using the same model to compute response from a point gamma source behind a perfectly absorbing wall, with and without shielding overhead. 29 refs., 48 figs., 13 tabs
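
    The single-scatter model described above can be sketched numerically: the contribution of one scatter point combines geometric spreading, attenuation along both legs of the path, and a buildup factor on the scatter-to-detector leg. The linear buildup form and all parameters below are illustrative assumptions, not the coefficients used in SILOGP or WALLGP:

    ```python
    import math

    def single_scatter_contribution(r1_m, r2_m, mu_per_m, scatter_prob, a=1.0):
        """Relative detector response from one first-scatter point.

        r1_m         : source-to-scatter distance (m)
        r2_m         : scatter-to-detector distance (m)
        mu_per_m     : linear attenuation coefficient of air (1/m)
        scatter_prob : probability of scattering toward the detector (illustrative)
        a            : coefficient of an assumed linear buildup factor B = 1 + a*mu*d
        """
        # Uncollided transport from the source to the scatter point.
        to_scatter = math.exp(-mu_per_m * r1_m) / (4 * math.pi * r1_m**2)
        # Attenuated, buildup-corrected transport from scatter point to detector.
        buildup = 1.0 + a * mu_per_m * r2_m
        to_detector = buildup * math.exp(-mu_per_m * r2_m) / (4 * math.pi * r2_m**2)
        return to_scatter * scatter_prob * to_detector
    ```

    Summing such contributions over all scatter points in the air volume gives the total skyshine response at the detector.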

  13. Development of Desktop Computing Applications and Engineering Tools on GPUs

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik Brandenborg; Glimberg, Stefan Lemvig; Hansen, Toke Jansen

    (GPUs) for high-performance computing applications and software tools in science and engineering, inverse problems, visualization, imaging, dynamic optimization. The goals are to contribute to the development of new state-of-the-art mathematical models and algorithms for maximum throughout performance...

  14. Development of computational small animal models and their applications in preclinical imaging and therapy research.

    Science.gov (United States)

    Xie, Tianwu; Zaidi, Habib

    2016-01-01

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of the categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.

  15. Development of computational small animal models and their applications in preclinical imaging and therapy research

    Energy Technology Data Exchange (ETDEWEB)

    Xie, Tianwu [Division of Nuclear Medicine and Molecular Imaging, Geneva University Hospital, Geneva 4 CH-1211 (Switzerland); Zaidi, Habib, E-mail: habib.zaidi@hcuge.ch [Division of Nuclear Medicine and Molecular Imaging, Geneva University Hospital, Geneva 4 CH-1211 (Switzerland); Geneva Neuroscience Center, Geneva University, Geneva CH-1205 (Switzerland); Department of Nuclear Medicine and Molecular Imaging, University of Groningen, University Medical Center Groningen, Groningen 9700 RB (Netherlands)

    2016-01-15

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of the categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.

  16. Development of computational small animal models and their applications in preclinical imaging and therapy research

    International Nuclear Information System (INIS)

    Xie, Tianwu; Zaidi, Habib

    2016-01-01

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal models have been reported in the literature and used as surrogates to characterize the anatomy of actual animals for the simulation of preclinical studies involving the use of bioluminescence tomography, fluorescence molecular tomography, positron emission tomography, single-photon emission computed tomography, microcomputed tomography, magnetic resonance imaging, and optical imaging. Other applications include electromagnetic field simulation, ionizing and nonionizing radiation dosimetry, and the development and evaluation of new methodologies for multimodality image coregistration, segmentation, and reconstruction of small animal images. This paper provides a comprehensive review of the history and fundamental technologies used for the development of computational small animal models with a particular focus on their application in preclinical imaging as well as nonionizing and ionizing radiation dosimetry calculations. An overview is given of the overall process involved in the design of these models, including the fundamental elements used for the construction of different types of computational models, the identification of original anatomical data, the simulation tools used for solving various computational problems, and the applications of computational animal models in preclinical research. The authors also analyze the characteristics of the categories of computational models (stylized, voxel-based, and boundary representation) and discuss the technical challenges faced at the present time as well as research needs in the future.

  17. Development of the Shimadzu computed tomographic scanner SCT-200N

    International Nuclear Information System (INIS)

    Ishihara, Hiroshi; Yamaoka, Nobuyuki; Saito, Masahiro

    1982-01-01

    The Shimadzu Computed Tomographic Scanner SCT-200N has been developed as an ideal CT scanner for diagnosing the head and spine. Due to the large aperture, moderate scan time and the Zoom Scan Mode, any part of the body can be scanned. High-quality images can be obtained by adopting the precisely stabilized X-ray unit and a densely packed array of 64 detectors. As for its operation, the capability of computed radiography (CR) prior to patient positioning and real-time reconstruction ensure efficient patient throughput. Details of the SCT-200N are described in this paper. (author)

  18. Dissemination of Information in Developing Countries: The Personal Computer and beyond

    Science.gov (United States)

    Wong, Wai-Man

    2005-01-01

    With the blooming of information in digital format, dissemination of information is becoming a big challenge for developing countries. It is not only due to the limited provision of personal computers--in addition, the technological infrastructure and the ability to access information are also becoming major concerns in developing countries. This…

  19. Development of a Computer Application to Simulate Porous Structures

    Directory of Open Access Journals (Sweden)

    S.C. Reis

    2002-09-01

    Full Text Available Geometric modeling is an important tool to evaluate structural parameters as well as to follow the application of stereological relationships. The generation, visualization and analysis of volumetric images of the structure of materials, using computational geometric modeling, facilitate the determination of structural parameters of difficult experimental access, such as topological and morphological parameters. In this work, we developed a geometrical model, implemented in computer software, that simulates random pore structures. The number of nodes, the number of branches (connections between nodes) and the number of isolated parts are obtained, and the connectivity (C) is also obtained from this application. Using a list of elements, nodes and branches generated by the software in AutoCAD® command-line format, the obtained structure can be viewed and analyzed.
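
    For a pore network given as node and branch lists, connectivity is commonly taken as the first Betti number of the graph, C = b - n + p (branches minus nodes plus isolated parts). The helper below sketches that computation under the assumption that the structure is supplied as a node count and an edge list; it is an illustration, not the paper's software:

    ```python
    def connectivity(n_nodes, branches):
        """C = b - n + p for a graph given as a node count and a branch list."""
        # Union-find to count connected (possibly isolated) parts.
        parent = list(range(n_nodes))

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]  # path halving
                x = parent[x]
            return x

        for u, v in branches:
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv

        parts = len({find(x) for x in range(n_nodes)})
        return len(branches) - n_nodes + parts

    # A single loop of 3 nodes has one independent cycle: C = 3 - 3 + 1 = 1.
    print(connectivity(3, [(0, 1), (1, 2), (2, 0)]))  # 1
    ```

    A tree-like (loop-free) pore network gives C = 0, so C counts the independent loops that make the pore space multiply connected.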

  20. Radiologic total lung capacity measurement. Development and evaluation of a computer-based system

    Energy Technology Data Exchange (ETDEWEB)

    Seeley, G.W.; Mazzeo, J.; Borgstrom, M.; Hunter, T.B.; Newell, J.D.; Bjelland, J.C.

    1986-11-01

    The development of a computer-based radiologic total lung capacity (TLC) measurement system designed to be used by non-physician personnel is detailed. Four operators tested the reliability and validity of the system by measuring inspiratory PA and lateral pediatric chest radiographs with a Graf spark pen interfaced to a DEC VAX 11/780 computer. First results suggest that the ultimate goal of developing an accurate and easy-to-use TLC measurement system for non-physician personnel is attainable.

  1. Developing an Educational Computer Game for Migratory Bird Identification Based on a Two-Tier Test Approach

    Science.gov (United States)

    Chu, Hui-Chun; Chang, Shao-Chen

    2014-01-01

    Although educational computer games have been recognized as being a promising approach, previous studies have indicated that, without supportive models, students might only show temporary interest during the game-based learning process, and their learning performance is often not as good as expected. Therefore, in this paper, a two-tier test…

  2. Programming Unconventional Computers: Dynamics, Development, Self-Reference

    Directory of Open Access Journals (Sweden)

    Susan Stepney

    2012-10-01

    Full Text Available Classical computing has well-established formalisms for specifying, refining, composing, proving, and otherwise reasoning about computations. These formalisms have matured over the past 70 years or so. Unconventional Computing includes the use of novel kinds of substrates–from black holes and quantum effects, through to chemicals, biomolecules, even slime moulds–to perform computations that do not conform to the classical model. Although many of these unconventional substrates can be coerced into performing classical computation, this is not how they “naturally” compute. Our ability to exploit unconventional computing is partly hampered by a lack of corresponding programming formalisms: we need models for building, composing, and reasoning about programs that execute in these substrates. What might, say, a slime mould programming language look like? Here I outline some of the issues and properties of these unconventional substrates that need to be addressed to find “natural” approaches to programming them. Important concepts include embodied real values, processes and dynamical systems, generative systems and their meta-dynamics, and embodied self-reference.

  3. Development of computational small animal models and their applications in preclinical imaging and therapy research

    NARCIS (Netherlands)

    Xie, Tianwu; Zaidi, Habib

    The development of multimodality preclinical imaging techniques and the rapid growth of realistic computer simulation tools have promoted the construction and application of computational laboratory animal models in preclinical research. Since the early 1990s, over 120 realistic computational animal

  4. Natural Computing in Computational Finance Volume 4

    CERN Document Server

    O’Neill, Michael; Maringer, Dietmar

    2012-01-01

    This book follows on from Natural Computing in Computational Finance Volumes I, II and III. As in the previous volumes of this series, the book consists of a series of chapters, each of which was selected following a rigorous, peer-reviewed selection process. The chapters illustrate the application of a range of cutting-edge natural computing and agent-based methodologies in computational finance and economics. The applications explored include option model calibration, financial trend reversal detection, enhanced indexation, algorithmic trading, corporate payout determination and agent-based modeling of liquidity costs, and trade strategy adaptation. While describing cutting-edge applications, the chapters are written so that they are accessible to a wide audience. Hence, they should be of interest to academics, students and practitioners in the fields of computational finance and economics.

  5. Developing Human-Computer Interface Models and Representation Techniques(Dialogue Management as an Integral Part of Software Engineering)

    OpenAIRE

    Hartson, H. Rex; Hix, Deborah; Kraly, Thomas M.

    1987-01-01

    The Dialogue Management Project at Virginia Tech is studying the poorly understood problem of human-computer dialogue development. This problem often leads to low usability in human-computer dialogues. The Dialogue Management Project approaches solutions to low usability in interfaces by addressing human-computer dialogue development as an integral and equal part of the total system development process. This project consists of two rather distinct, but dependent, parts. One is development of ...

  6. A computational clonal analysis of the developing mouse limb bud.

    Directory of Open Access Journals (Sweden)

    Luciano Marcon

    Full Text Available A comprehensive spatio-temporal description of the tissue movements underlying organogenesis would be an extremely useful resource for developmental biology. Clonal analysis and fate mapping are popular experiments for studying tissue movement during morphogenesis. Such experiments allow cell populations to be labeled at an early stage of development and their spatial evolution to be followed over time. However, disentangling the cumulative effects of the multiple events responsible for the expansion of the labeled cell population is not always straightforward. To overcome this problem, we developed a novel computational method that combines accurate quantification of 2D limb bud morphologies with growth modeling to analyze mouse clonal data of early limb development. First, we explore various tissue movements that match experimental limb bud shape changes. Second, by comparing computational clones with newly generated mouse clonal data, we are able to choose and characterize the tissue movement map that best matches the experimental data. Our computational analysis produces for the first time a two-dimensional model of limb growth based on experimental data that can be used to better characterize limb tissue movement in space and time. The model shows that the distribution and shapes of clones can be described as a combination of anisotropic growth and isotropic cell mixing, without the need for lineage compartmentalization along the AP and PD axes. Lastly, we show that this comprehensive description can be used to reassess spatio-temporal gene regulation taking tissue movement into account and to investigate PD patterning hypotheses.

  7. CADRIGS--computer aided design reliability interactive graphics system

    International Nuclear Information System (INIS)

    Kwik, R.J.; Polizzi, L.M.; Sticco, S.; Gerrard, P.B.; Yeater, M.L.; Hockenbury, R.W.; Phillips, M.A.

    1982-01-01

    An integrated reliability analysis program combining graphic representation of fault trees, automated database loading and referencing, and automated construction of reliability-code input files was developed. The functional specifications for CADRIGS, the computer-aided design reliability interactive graphics system, are presented. Previously developed fault tree segments used in auxiliary feedwater system safety analysis were constructed on CADRIGS and, when combined, yielded results identical to those obtained from manual input to the same reliability codes

  8. Agile Development of Various Computational Power Adaptive Web-Based Mobile-Learning Software Using Mobile Cloud Computing

    Science.gov (United States)

    Zadahmad, Manouchehr; Yousefzadehfard, Parisa

    2016-01-01

    Mobile Cloud Computing (MCC) aims to improve all mobile applications such as m-learning systems. This study presents an innovative method to use web technology and software engineering's best practices to provide m-learning functionalities hosted in a MCC-learning system as service. Components hosted by MCC are used to empower developers to create…

  9. Development of Point Kernel Shielding Analysis Computer Program Implementing Recent Nuclear Data and Graphic User Interfaces

    International Nuclear Information System (INIS)

    Kang, Sang Ho; Lee, Seung Gi; Chung, Chan Young; Lee, Choon Sik; Lee, Jai Ki

    2001-01-01

    In order to comply with revised national regulations on radiological protection and to implement recent nuclear data and dose conversion factors, KOPEC developed a new point kernel gamma and beta ray shielding analysis computer program. This new code, named VisualShield, adopted mass attenuation coefficients and buildup factors from recent ANSI/ANS standards and flux-to-dose conversion factors from International Commission on Radiological Protection (ICRP) Publication 74 for estimating the effective/equivalent dose recommended in ICRP 60. VisualShield utilizes graphical user interfaces and 3-D visualization of the geometric configuration for preparing input data sets and analyzing results, leading users to error-free processing with visual effects. Code validation and data analysis were performed by comparing the results of various calculations to the data outputs of previous programs such as MCNP 4B, ISOSHLD-II, QAD-CGGP, etc
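
The point kernel method described here combines geometric attenuation, material attenuation, a buildup factor, and a flux-to-dose conversion. A minimal sketch of the kernel for a single isotropic point source behind a slab shield (a generic textbook form, not VisualShield's actual implementation; all names and numbers are ours):

```python
import math

def point_kernel_dose_rate(S, mu, t, r, buildup, flux_to_dose):
    """Dose rate from one isotropic point source behind a slab shield:
        D = k * B * S * exp(-mu * t) / (4 * pi * r**2)
    S: source strength (photons/s), mu: linear attenuation coefficient (1/cm),
    t: shield thickness along the ray (cm), r: source-detector distance (cm),
    buildup: buildup factor B (dimensionless), flux_to_dose: conversion k."""
    attenuated_flux = S * math.exp(-mu * t) / (4.0 * math.pi * r ** 2)
    return flux_to_dose * buildup * attenuated_flux

# Illustrative numbers only: 1e9 photons/s, 10 cm of a mu = 0.5/cm shield,
# detector 100 cm away, buildup factor 3.0, unit flux-to-dose conversion.
dose = point_kernel_dose_rate(1e9, 0.5, 10.0, 100.0, 3.0, 1.0)
```

A full code sums this kernel over many source points and energy groups; the single-kernel form above is just the innermost step.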

  10. Automatic Grading of 3D Computer Animation Laboratory Assignments

    Science.gov (United States)

    Lamberti, Fabrizio; Sanna, Andrea; Paravati, Gianluca; Carlevaris, Gilles

    2014-01-01

    Assessment is a delicate task in the overall teaching process because it may require significant time and may be prone to subjectivity. Subjectivity is especially true for disciplines in which perceptual factors play a key role in the evaluation. In previous decades, computer-based assessment techniques were developed to address the…

  11. A Multidisciplinary Model for Development of Intelligent Computer-Assisted Instruction.

    Science.gov (United States)

    Park, Ok-choon; Seidel, Robert J.

    1989-01-01

    Proposes a schematic multidisciplinary model to help developers of intelligent computer-assisted instruction (ICAI) identify the types of required expertise and integrate them into a system. Highlights include domain types and expertise; knowledge acquisition; task analysis; knowledge representation; student modeling; diagnosis of learning needs;…

  12. Caltech computer scientists develop FAST protocol to speed up Internet

    CERN Multimedia

    2003-01-01

    "Caltech computer scientists have developed a new data transfer protocol for the Internet fast enough to download a full-length DVD movie in less than five seconds. The protocol is called FAST, standing for Fast Active queue management Scalable Transmission Control Protocol" (1 page).

  13. Parallel, distributed and GPU computing technologies in single-particle electron microscopy.

    Science.gov (United States)

    Schmeisser, Martin; Heisen, Burkhard C; Luettich, Mario; Busche, Boris; Hauer, Florian; Koske, Tobias; Knauber, Karl-Heinz; Stark, Holger

    2009-07-01

    Most known methods for the determination of the structure of macromolecular complexes are limited or at least restricted at some point by their computational demands. Recent developments in information technology such as multicore, parallel and GPU processing can be used to overcome these limitations. In particular, graphics processing units (GPUs), which were originally developed for rendering real-time effects in computer games, are now ubiquitous and provide unprecedented computational power for scientific applications. Each parallel-processing paradigm alone can improve overall performance; the increased computational performance obtained by combining all paradigms, unleashing the full power of today's technology, makes certain applications feasible that were previously virtually impossible. In this article, state-of-the-art paradigms are introduced, the tools and infrastructure needed to apply these paradigms are presented and a state-of-the-art infrastructure and solution strategy for moving scientific applications to the next generation of computer hardware is outlined.

  14. Software Development Processes Applied to Computational Icing Simulation

    Science.gov (United States)

    Levinson, Laurie H.; Potapezuk, Mark G.; Mellor, Pamela A.

    1999-01-01

    The development of computational icing simulation methods is making the transition from research to commonplace use in design and certification efforts. As such, standards of code management, design validation, and documentation must be adjusted to accommodate the increased expectations of the user community with respect to accuracy, reliability, capability, and usability. This paper discusses these concepts with regard to current and future icing simulation code development efforts as implemented by the Icing Branch of the NASA Lewis Research Center in collaboration with the NASA Lewis Engineering Design and Analysis Division. With the application of the techniques outlined in this paper, the LEWICE ice accretion code has become a more stable and reliable software product.

  15. Developing an evidence-based curriculum designed to help psychiatric nurses learn to use computers and the Internet.

    Science.gov (United States)

    Koivunen, Marita; Välimäki, Maritta; Jakobsson, Tiina; Pitkänen, Anneli

    2008-01-01

    This article describes the systematic process in which an evidence-based approach was used to develop a curriculum designed to support the computer and Internet skills of nurses in psychiatric hospitals in Finland. The pressure on organizations to have skilled and motivated nurses who use modern information and communication technology in health care organizations has increased due to rapid technology development at the international and national levels. However, less frequently has the development of those computer education curricula been based on evidence-based knowledge. First, we identified psychiatric nurses' learning experiences and barriers to computer use by examining written essays. Second, nurses' computer skills were surveyed. Last, evidence from the literature was scrutinized to find effective methods that can be used to teach and learn computer use in health care. This information was integrated and used for the development process of an education curriculum designed to support nurses' computer and Internet skills.

  16. GSTARS computer models and their applications, part I: theoretical development

    Science.gov (United States)

    Yang, C.T.; Simoes, F.J.M.

    2008-01-01

    GSTARS is a series of computer models developed by the U.S. Bureau of Reclamation for alluvial river and reservoir sedimentation studies while the authors were employed by that agency. The first version of GSTARS was released in 1986 using Fortran IV for mainframe computers. GSTARS 2.0 was released in 1998 for personal computer application with most of the code in the original GSTARS revised, improved, and expanded using Fortran IV/77. GSTARS 2.1 is an improved and revised GSTARS 2.0 with graphical user interface. The unique features of all GSTARS models are the conjunctive use of the stream tube concept and of the minimum stream power theory. The application of minimum stream power theory allows the determination of optimum channel geometry with variable channel width and cross-sectional shape. The use of the stream tube concept enables the simulation of river hydraulics using one-dimensional numerical solutions to obtain a semi-two-dimensional presentation of the hydraulic conditions along and across an alluvial channel. According to the stream tube concept, no water or sediment particles can cross the walls of stream tubes, which is valid for many natural rivers. At and near sharp bends, however, sediment particles may cross the boundaries of stream tubes. GSTARS3, based on FORTRAN 90/95, addresses this phenomenon and further expands the capabilities of GSTARS 2.1 for cohesive and non-cohesive sediment transport in rivers and reservoirs. This paper presents the concepts, methods, and techniques used to develop the GSTARS series of computer models, especially GSTARS3. © 2008 International Research and Training Centre on Erosion and Sedimentation and the World Association for Sedimentation and Erosion Research.

  17. Development of a Computer Code for the Estimation of Fuel Rod Failure

    Energy Technology Data Exchange (ETDEWEB)

    Rhee, I.H.; Ahn, H.J. [Korea Electric Power Research Institute, Daejeon (Korea, Republic of)

    1997-12-31

    Much research has already been performed over the last few decades to obtain information on the degree of fuel rod failure from the primary coolant activities of operating PWRs. The computer codes currently in use for domestic nuclear power plants, such as the CADE code and the ABB-CE codes developed by Westinghouse and ABB-CE, respectively, still give significant overall errors in estimating failed fuel rods. In addition, with the CADE code it is difficult to predict the degree of fuel rod failure during the transient period of nuclear reactor operation, whereas the ABB-CE codes are relatively more difficult for end-users to use. In particular, the rapid progress made recently in computer hardware and software systems demands that computer programs be more versatile and user-friendly. While the MS Windows system, centered on the graphical user interface and multitasking, is now in widespread use, the computer codes currently employed at the nuclear power plants, such as the CADE and ABB-CE codes, can only be run on the DOS system. Moreover, it is desirable to have a computer code for fuel rod failure estimation that can directly use the radioactivity data obtained from the on-line monitoring system of the primary coolant activity. The main purpose of this study is, therefore, to develop a Windows computer code that can predict the location, the number of failed fuel rods, and the degree of failure using the radioactivity data obtained from the primary coolant activity for PWRs. Another objective is to combine this computer code with the on-line monitoring system of the primary coolant radioactivity at the Kori 3 and 4 operating nuclear power plants and enable their combined use for on-line evaluation of the number and degree of fuel rod failures. (author). 49 refs., 85 figs., 30 tabs.

  18. Development of a computer control system for the RCNP ring cyclotron

    International Nuclear Information System (INIS)

    Ogata, H.; Yamazaki, T.; Ando, A.; Hosono, K.; Itahashi, T.; Katayama, I.; Kibayashi, M.; Kinjo, S.; Kondo, M.; Miura, I.; Nagayama, K.; Noro, T.; Saito, T.; Shimizu, A.; Uraki, M.; Maruyama, M.; Aoki, K.; Yamada, S.; Kodaira, K.

    1990-01-01

    A hierarchically distributed computer control system for the RCNP ring cyclotron is being developed. The control system consists of a central computer and four subcomputers which are linked together by an Ethernet, universal device controllers which control component devices, man-machine interfaces including an operator console and interlock systems. The universal device controller is a standard single-board computer with an 8344 microcontroller and parallel interfaces, and is usually integrated into a component device and connected to a subcomputer by means of an optical-fiber cable to achieve high-speed data transfer. Control sequences for subsystems are easily produced and improved by using an interpreter language named OPELA (OPEration Language for Accelerators). The control system will be installed in March 1990. (orig.)

  19. Computer science teacher professional development in the United States: a review of studies published between 2004 and 2014

    Science.gov (United States)

    Menekse, Muhsin

    2015-10-01

    While there has been a remarkable interest to make computer science a core K-12 academic subject in the United States, there is a shortage of K-12 computer science teachers to successfully implement computer sciences courses in schools. In order to enhance computer science teacher capacity, training programs have been offered through teacher professional development. In this study, the main goal was to systematically review the studies regarding computer science professional development to understand the scope, context, and effectiveness of these programs in the past decade (2004-2014). Based on 21 journal articles and conference proceedings, this study explored: (1) Type of professional development organization and source of funding, (2) professional development structure and participants, (3) goal of professional development and type of evaluation used, (4) specific computer science concepts and training tools used, (5) and their effectiveness to improve teacher practice and student learning.

  20. Heat Transfer Computations of Internal Duct Flows With Combined Hydraulic and Thermal Developing Length

    Science.gov (United States)

    Wang, C. R.; Towne, C. E.; Hippensteele, S. A.; Poinsatte, P. E.

    1997-01-01

    This study investigated Navier-Stokes computations of the surface heat transfer coefficients of a transition duct flow. A transition duct, from an axisymmetric cross section to a non-axisymmetric cross section, is usually used to connect the turbine exit to the nozzle. As the gas turbine inlet temperature increases, the transition duct is subjected to the high temperature at the gas turbine exit. The transition duct flow undergoes combined development of hydraulic and thermal entry lengths. The design of the transition duct requires accurate surface heat transfer coefficients. The Navier-Stokes computational method can be used to predict the surface heat transfer coefficients of a transition duct flow. The Proteus three-dimensional Navier-Stokes numerical computational code was used in this study. The code was first studied for computations of the turbulent developing flow properties within a circular duct and a square duct. The code was then used to compute the turbulent flow properties of a transition duct flow. The computational results for the surface pressure, the skin friction factor, and the surface heat transfer coefficient were described and compared with values obtained from theoretical analyses or experiments. The comparison showed that the Navier-Stokes computation can approximately predict the surface heat transfer coefficients of a transition duct flow.

  1. Developing a computationally efficient dynamic multilevel hybrid optimization scheme using multifidelity model interactions.

    Energy Technology Data Exchange (ETDEWEB)

    Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Castro, Joseph Pete Jr. (; .); Giunta, Anthony Andrew

    2006-01-01

    Many engineering application problems use optimization algorithms in conjunction with numerical simulators to search for solutions. The formulation of relevant objective functions and constraints dictates the possible optimization algorithms. Often, a gradient-based approach is not possible, since objective functions and constraints can be nonlinear, nonconvex, non-differentiable, or even discontinuous, and the simulations involved can be computationally expensive. Moreover, computational efficiency and accuracy are desirable and also influence the choice of solution method. With the advent and increasing availability of massively parallel computers, computational speed has increased tremendously. Unfortunately, the numerical and model complexities of many problems still demand significant computational resources. Moreover, in optimization, these expenses can be a limiting factor, since obtaining solutions often requires the completion of numerous computationally intensive simulations. Therefore, we propose a multifidelity optimization (MFO) algorithm designed to improve the computational efficiency of an optimization method for a wide range of applications. In developing the MFO algorithm, we take advantage of the interactions between multifidelity models to develop a dynamic, computational-time-saving optimization algorithm. First, a direct search method is applied to the high-fidelity model over a reduced design space. In conjunction with this search, a specialized oracle is employed to map the design space of this high-fidelity model to that of a computationally cheaper low-fidelity model using space mapping techniques. Then, in the low-fidelity space, an optimum is obtained using gradient-based or non-gradient-based optimization, and it is mapped back to the high-fidelity space. In this paper, we describe the theory and implementation details of our MFO algorithm.
We also demonstrate our MFO method on some example problems and on two applications: earth penetrators and
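
The high/low-fidelity loop described above can be sketched on a 1-D toy problem: a brute-force search stands in for the direct search, and a simple input shift plays the role of the space-mapping oracle (all functions and constants below are illustrative, not from the paper):

```python
import math

def hi(x):
    """'Expensive' high-fidelity objective (illustrative stand-in)."""
    return (x - 2.0) ** 2 + 0.1 * math.sin(5.0 * x)

def lo(x):
    """Cheap low-fidelity surrogate with a systematic offset."""
    return (x - 1.7) ** 2

def space_map_shift(samples):
    """Space-mapping 'oracle': choose an input shift p so that
    lo(x + p) tracks hi(x) at a handful of sampled points
    (a brute-force 1-D search stands in for the real oracle)."""
    candidates = [i * 0.01 - 1.0 for i in range(201)]
    def mismatch(p):
        return sum((lo(x + p) - hi(x)) ** 2 for x in samples)
    return min(candidates, key=mismatch)

def optimize_low_then_map_back(p):
    """Optimize the cheap model (grid search here; a gradient method
    would be typical), then map the optimum back to the high-fidelity
    design space via x = z - p."""
    zs = [i * 0.001 for i in range(4000)]
    z_star = min(zs, key=lo)
    return z_star - p

samples = [1.5, 1.8, 2.1, 2.4]          # a few expensive hi() evaluations
p = space_map_shift(samples)
x_star = optimize_low_then_map_back(p)  # approximate minimizer of hi()
```

In the real MFO algorithm this exchange repeats, with the direct search on the high-fidelity model refining the region in which the mapping is trusted.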

  2. Chinese Herbal Medicine Meets Biological Networks of Complex Diseases: A Computational Perspective

    Directory of Open Access Journals (Sweden)

    Shuo Gu

    2017-01-01

    Full Text Available With the rapid development of cheminformatics, computational biology, and systems biology, great progress has been made recently in the computational research of Chinese herbal medicine, with in-depth understanding towards pharmacognosy. This paper summarizes these studies in the aspects of computational methods, traditional Chinese medicine (TCM) compound databases, and TCM network pharmacology. Furthermore, we chose the arachidonic acid metabolic network as a case study to demonstrate the regulatory function of herbal medicine in the treatment of inflammation at the network level. Finally, a computational workflow for network-based TCM studies, derived from our previous successful applications, is proposed.

  3. Chinese Herbal Medicine Meets Biological Networks of Complex Diseases: A Computational Perspective.

    Science.gov (United States)

    Gu, Shuo; Pei, Jianfeng

    2017-01-01

    With the rapid development of cheminformatics, computational biology, and systems biology, great progress has been made recently in the computational research of Chinese herbal medicine, with in-depth understanding towards pharmacognosy. This paper summarizes these studies in the aspects of computational methods, traditional Chinese medicine (TCM) compound databases, and TCM network pharmacology. Furthermore, we chose the arachidonic acid metabolic network as a case study to demonstrate the regulatory function of herbal medicine in the treatment of inflammation at the network level. Finally, a computational workflow for network-based TCM studies, derived from our previous successful applications, is proposed.

  4. Automating Commercial Video Game Development using Computational Intelligence

    OpenAIRE

    Tse G. Tan; Jason Teo; Patricia Anthony

    2011-01-01

    Problem statement: The retail sales of computer and video games have grown enormously during the last few years, not just in the United States (US) but all over the world. This is why many game developers and academic researchers have focused on game-related technologies, such as graphics, audio, physics and Artificial Intelligence (AI), with the goal of creating newer and more fun games. In recent years, there has been an increasing interest in game AI for pro...

  5. Some computational challenges of developing efficient parallel algorithms for data-dependent computations in thermal-hydraulics supercomputer applications

    International Nuclear Information System (INIS)

    Woodruff, S.B.

    1992-01-01

    The Transient Reactor Analysis Code (TRAC), which features a two-fluid treatment of thermal-hydraulics, is designed to model transients in water reactors and related facilities. One of the major computational costs associated with TRAC and similar codes is calculating constitutive coefficients. Although the formulations for these coefficients are local, the costs are flow-regime- or data-dependent; i.e., the computations needed for a given spatial node often vary widely as a function of time. Consequently, poor load balancing will degrade efficiency on either vector or data-parallel architectures when the data are organized according to spatial location. Unfortunately, a general automatic solution to the load-balancing problem associated with data-dependent computations is not yet available for massively parallel architectures. This document discusses why developers should consider algorithms, such as a neural net representation, that do not exhibit load-balancing problems

  6. An Overview of Recent Developments in Cognitive Diagnostic Computer Adaptive Assessments

    Directory of Open Access Journals (Sweden)

    Alan Huebner

    2010-01-01

    Full Text Available Cognitive diagnostic modeling has become an exciting new field of psychometric research. These models aim to diagnose examinees' mastery status of a group of discretely defined skills, or attributes, thereby providing them with detailed information regarding their specific strengths and weaknesses. Combining cognitive diagnosis with computer adaptive assessments has emerged as an important part of this new field. This article aims to provide practitioners and researchers with an introduction to and overview of recent developments in cognitive diagnostic computer adaptive assessments.
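
As an illustration of what such models compute, consider the DINA model, one widely used cognitive diagnosis model (the abstract does not name a specific model, so take this purely as an illustrative example): an item's response probability depends on whether the examinee has mastered every attribute the item requires.

```python
def dina_prob_correct(mastered, required, slip, guess):
    """DINA model item-response probability.
    mastered: examinee's 0/1 attribute-mastery vector,
    required: the item's 0/1 attribute-requirement vector (Q-matrix row),
    slip, guess: the item's slip and guess parameters.
    An examinee who has mastered every required attribute (eta = 1)
    answers correctly with probability 1 - slip; otherwise with guess."""
    eta = all(m == 1 for m, r in zip(mastered, required) if r == 1)
    return (1.0 - slip) if eta else guess

# Item requires attributes 0 and 2; this examinee has mastered all three,
# so the response probability is 1 - slip = 0.9.
p = dina_prob_correct([1, 1, 1], [1, 0, 1], slip=0.1, guess=0.2)
```

An adaptive assessment selects the next item to sharpen the posterior over each examinee's mastery vector, using probabilities of this form.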

  7. Computer and machine vision theory, algorithms, practicalities

    CERN Document Server

    Davies, E R

    2012-01-01

    Computer and Machine Vision: Theory, Algorithms, Practicalities (previously entitled Machine Vision) clearly and systematically presents the basic methodology of computer and machine vision, covering the essential elements of the theory while emphasizing algorithmic and practical design constraints. This fully revised fourth edition has brought in more of the concepts and applications of computer vision, making it a very comprehensive and up-to-date tutorial text suitable for graduate students, researchers and R&D engineers working in this vibrant subject. Key features include: Practical examples and case studies give the 'ins and outs' of developing real-world vision systems, giving engineers the realities of implementing the principles in practice New chapters containing case studies on surveillance and driver assistance systems give practical methods on these cutting-edge applications in computer vision Necessary mathematics and essential theory are made approachable by careful explanations and well-il...

  8. Individual Stochastic Screening for the Development of Computer Graphics

    Directory of Open Access Journals (Sweden)

    Maja Turčić

    2012-12-01

    Full Text Available With the emergence of new tools and media, art and design have developed into digital, computer-generated works. This article presents a sequence for creating art graphics, because their original authors did not publish the procedures. The goal is to discover the mathematics of an image and its programming libretto, with the purpose of organizing a structural base of computer graphics. We elaborate the procedures used to produce graphics known throughout the history of art that are nowadays also found in design and security graphics. The results are closely related graphics obtained by changing the parameters that initiate them. The aim is to control the graphics, i.e. to use controlled stochastics to achieve the desired solutions. Since the artists of the past never published the procedures of their screening methods, their ideas have remained “only” works of art. In this article we present the development of an algorithm that, more or less successfully, simulates those screening solutions. It has been shown that mathematically defined graphical elements can serve as screening elements. New technological and mathematical solutions with individual screening elements are introduced into reproduction for use in printing.
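
First-order stochastic (FM) screening of the kind discussed here can be sketched as thresholding each pixel's tone against a seeded pseudo-random value; fixing the seed is one simple way to get the "controlled stochastic" repeatability the article aims for (the sketch below is ours, not the article's algorithm):

```python
import random

def stochastic_screen(gray, width, height, seed=42):
    """First-order FM screening: put a printed dot (1) wherever the
    pixel tone (0.0..1.0) exceeds a pseudo-random threshold. Seeding
    the generator makes the stochastic screen exactly reproducible."""
    rng = random.Random(seed)
    return [[1 if gray[y][x] > rng.random() else 0
             for x in range(width)]
            for y in range(height)]

# A flat 50% tone screened twice with the same seed gives identical dots:
tone = [[0.5] * 8 for _ in range(8)]
assert stochastic_screen(tone, 8, 8) == stochastic_screen(tone, 8, 8)
```

Varying the generator, its seed, or the threshold distribution is what turns one screening procedure into a family of closely related graphics.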

  9. HEDPIN: a computer program to estimate pinwise power density

    International Nuclear Information System (INIS)

    Cappiello, M.W.

    1976-05-01

    A description is given of the digital computer program HEDPIN. This program, modeled after a previously developed program, POWPIN, provides a means of estimating the pinwise power density distribution in fast reactor triangular pitched pin bundles. The capability also exists for computing any reaction rate of interest at the respective pin positions within an assembly. HEDPIN was developed in support of FTR fuel and test management as well as fast reactor core design and core characterization planning and analysis. The results of a test devised to check out HEDPIN's computational method are given, and the realm of application is discussed. Nearly all programming is in FORTRAN IV. Variable dimensioning is employed to make efficient use of core memory and maintain short running time for small problems. Input instructions, a sample problem, and a program listing are also given.

  10. Web Program for Development of GUIs for Cluster Computers

    Science.gov (United States)

    Czikmantory, Akos; Cwik, Thomas; Klimeck, Gerhard; Hua, Hook; Oyafuso, Fabiano; Vinyard, Edward

    2003-01-01

    WIGLAF (a Web Interface Generator and Legacy Application Facade) is a computer program that provides a Web-based, distributed, graphical-user-interface (GUI) framework that can be adapted to any of a broad range of application programs, written in any programming language, that are executed remotely on any cluster computer system. WIGLAF enables the rapid development of a GUI for controlling and monitoring a specific application program running on the cluster and for transferring data to and from the application program. The only prerequisite for the execution of WIGLAF is a Web-browser program on a user's personal computer connected with the cluster via the Internet. WIGLAF has a client/server architecture: The server component is executed on the cluster system, where it controls the application program and serves data to the client component. The client component is an applet that runs in the Web browser. WIGLAF utilizes the Extensible Markup Language to hold all data associated with the application software, Java to enable platform-independent execution on the cluster system and the display of a GUI generator through the browser, and the Java Remote Method Invocation software package to provide simple, effective client/server networking.

  11. Computational geometry algorithms and applications

    CERN Document Server

    de Berg, Mark; Overmars, Mark; Schwarzkopf, Otfried

    1997-01-01

    Computational geometry emerged from the field of algorithms design and analysis in the late 1970s. It has grown into a recognized discipline with its own journals, conferences, and a large community of active researchers. The success of the field as a research discipline can on the one hand be explained by the beauty of the problems studied and the solutions obtained, and, on the other hand, by the many application domains (computer graphics, geographic information systems (GIS), robotics, and others) in which geometric algorithms play a fundamental role. For many geometric problems the early algorithmic solutions were either slow or difficult to understand and implement. In recent years a number of new algorithmic techniques have been developed that improved and simplified many of the previous approaches. In this textbook we have tried to make these modern algorithmic solutions accessible to a large audience. The book has been written as a textbook for a course in computational geometry, but it can ...
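The kind of simplified modern technique the authors allude to can be illustrated with Andrew's monotone-chain convex hull, a standard O(n log n) algorithm; this sketch illustrates the field in general and is not an excerpt from the book.

```python
def convex_hull(points):
    """Andrew's monotone chain: returns hull vertices in CCW order,
    starting from the lexicographically smallest point."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); > 0 means a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                      # build lower hull left to right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):            # build upper hull right to left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # last point of each half is the first point of the other
    return lower[:-1] + upper[:-1]
```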

  12. Bringing Mohamed to the Mountain: Situated Professional Development in a Ubiquitous Computing Classroom

    Science.gov (United States)

    Swan, Karen; Kratcoski, Annette; Mazzer, Pat; Schenker, Jason

    2005-01-01

    This article describes an ongoing situated professional development program in which teachers bring their intact classes for an extended stay in a ubiquitous computing environment equipped with a variety of state-of-the-art computing devices. The experience is unique in that it not only situates teacher learning about technology integration in…

  13. The Development of Computer-Aided Design for Electrical Equipment Selection and Arrangement of 10 Kv Switchgear

    Directory of Open Access Journals (Sweden)

    Chernaya Anastassiya

    2015-01-01

    Full Text Available The paper gives an overview of a computer-aided design program application. The research comprises two main parts: the development of a computer-aided design tool for appropriate switchgear selection, and the arrangement of the selected switchgear in an indoor switchgear layout. The Matlab environment was used to develop the computer-aided design system; its use considerably simplifies the selection and arrangement of 10 kV switchgear.

  14. Teachers' Support in Using Computers for Developing Students' Listening and Speaking Skills in Pre-Sessional English Courses

    Science.gov (United States)

    Zou, Bin

    2013-01-01

    Many computer-assisted language learning (CALL) studies have found that teacher direction can help learners develop language skills at their own pace on computers. However, many teachers still do not know how to provide support for students to use computers to reinforce the development of their language skills. Hence, more examples of CALL…

  15. Computer based workstation for development of software for high energy physics experiments

    International Nuclear Information System (INIS)

    Ivanchenko, I.M.; Sedykh, Yu.V.

    1987-01-01

    Methodological principles and results of a successful attempt to create, on the basis of the IBM PC/AT personal computer, effective means for developing programs for high-energy physics experiments are analysed. The results obtained make it possible to combine the best properties and the positive, materialized experience accumulated on existing time-sharing collective systems with the high quality of data representation, reliability and convenience of personal computer applications

  16. The cognitive dynamics of computer science cost-effective large scale software development

    CERN Document Server

    De Gyurky, Szabolcs Michael; John Wiley & Sons

    2006-01-01

    This book has three major objectives: to propose an ontology for computer software; to provide a methodology for the development of large software systems to cost and schedule that is based on the ontology; and to offer an alternative vision regarding the development of truly autonomous systems.

  17. Aortic pseudoaneurysm detected on external jugular venous distention following a Bentall procedure 10 years previously.

    Science.gov (United States)

    Fukunaga, Naoto; Shomura, Yu; Nasu, Michihiro; Okada, Yukikatsu

    2010-11-01

    An asymptomatic 49-year-old woman was admitted for surgical treatment of an aortic pseudoaneurysm. She had Marfan syndrome and had undergone an emergency Bentall procedure 10 years previously. About six months before admission, she had noticed distended bilateral external jugular veins, which became distended only in the supine position and without any other symptoms. Enhanced computed tomography revealed an aortic pseudoaneurysm originating from the previous distal anastomosis site. During induction of general anesthesia in the supine position, bilateral external jugular venous distention was remarkable. Immediately after a successful operation, the distention completely resolved. The present case emphasizes the importance of physical examination leading to the diagnosis of asymptomatic life-threatening disease in patients with a history of previous aortic surgery.

  18. Spatial Computing and Spatial Practices

    DEFF Research Database (Denmark)

    Brodersen, Anders; Büsher, Monika; Christensen, Michael

    2007-01-01

    The gathering momentum behind the research agendas of pervasive, ubiquitous and ambient computing, set in motion by Mark Weiser (1991), offers dramatic opportunities for information systems design. They raise the possibility of "putting computation where it belongs" by exploding computing power out … the "disappearing computer" we have, therefore, carried over from previous research an interdisciplinary perspective, and a focus on the sociality of action (Suchman 1987) …

  19. Developing a project-based computational physics course grounded in expert practice

    Science.gov (United States)

    Burke, Christopher J.; Atherton, Timothy J.

    2017-04-01

    We describe a project-based computational physics course developed using a backwards course design approach. From an initial competency-based model of problem solving in computational physics, we interviewed faculty who use these tools in their own research to determine indicators of expert practice. From these, a rubric was formulated that enabled us to design a course intended to allow students to learn these skills. We also report an initial implementation of the course and, by having the interviewees regrade student work, show that students acquired many of the expert practices identified.

  20. Erlotinib-induced rash spares previously irradiated skin

    International Nuclear Information System (INIS)

    Lips, Irene M.; Vonk, Ernest J.A.; Koster, Mariska E.Y.; Houwing, Ronald H.

    2011-01-01

    Erlotinib is an epidermal growth factor receptor inhibitor prescribed to patients with locally advanced or metastasized non-small cell lung carcinoma after failure of at least one earlier chemotherapy treatment. Approximately 75% of the patients treated with erlotinib develop acneiform skin rashes. A patient treated with erlotinib 3 months after finishing concomitant chemotherapy and radiotherapy for non-small cell lung cancer is presented. Unexpectedly, the part of the skin that had been included in his previous radiotherapy field was completely spared from the erlotinib-induced acneiform skin rash. The exact mechanism of erlotinib-induced rash sparing in previously irradiated skin is unclear. The underlying mechanism of this phenomenon needs to be explored further, because the number of patients being treated with a combination of both therapeutic modalities is increasing. The therapeutic effect of erlotinib in the area of the previously irradiated lesion should be assessed. (orig.)

  1. Large-scale computation in solid state physics - Recent developments and prospects

    International Nuclear Information System (INIS)

    DeVreese, J.T.

    1985-01-01

    During the past few years an increasing interest in large-scale computation has been developing. Several initiatives were taken to evaluate and exploit the potential of "supercomputers" like the CRAY-1 (or X-MP) or the CYBER-205. In the U.S.A., the Lax report appeared first, in 1982, and subsequently (1984) the National Science Foundation announced a program to promote large-scale computation at the universities. In Europe, too, several CRAY and CYBER-205 systems have been installed. Although the presently available mainframes are the result of a continuous growth in speed and memory, they might have induced a discontinuous transition in the evolution of the scientific method: between theory and experiment a third methodology, "computational science", has become or is becoming operational

  2. Computable Frames in Computable Banach Spaces

    Directory of Open Access Journals (Sweden)

    S.K. Kaushik

    2016-06-01

    Full Text Available We develop parts of frame theory in Banach spaces from the point of view of computable analysis. We define a computable M-basis and use it to construct a computable Banach space of scalar-valued sequences. Computable Xd frames and computable Banach frames are also defined, and computable versions of sufficient conditions for their existence are obtained.

  3. Parallel, distributed and GPU computing technologies in single-particle electron microscopy

    International Nuclear Information System (INIS)

    Schmeisser, Martin; Heisen, Burkhard C.; Luettich, Mario; Busche, Boris; Hauer, Florian; Koske, Tobias; Knauber, Karl-Heinz; Stark, Holger

    2009-01-01

    An introduction to the current paradigm shift towards concurrency in software. Most known methods for the determination of the structure of macromolecular complexes are limited or at least restricted at some point by their computational demands. Recent developments in information technology such as multicore, parallel and GPU processing can be used to overcome these limitations. In particular, graphics processing units (GPUs), which were originally developed for rendering real-time effects in computer games, are now ubiquitous and provide unprecedented computational power for scientific applications. Each parallel-processing paradigm alone can improve overall performance; the increased computational performance obtained by combining all paradigms, unleashing the full power of today’s technology, makes certain applications feasible that were previously virtually impossible. In this article, state-of-the-art paradigms are introduced, the tools and infrastructure needed to apply these paradigms are presented and a state-of-the-art infrastructure and solution strategy for moving scientific applications to the next generation of computer hardware is outlined

  4. Design and development of a diversified real time computer for future FBRs

    International Nuclear Information System (INIS)

    Sujith, K.R.; Bhattacharyya, Anindya; Behera, R.P.; Murali, N.

    2014-01-01

    The current safety-related computer system of the Prototype Fast Breeder Reactor (PFBR) under construction in Kalpakkam consists of two redundant Versa Module Europa (VME) bus based real-time computer systems with a Switch Over Logic Circuit (SOLC). Since both VME systems are identical, the dual redundant system is prone to common cause failure (CCF). The probability of CCF can be reduced by adopting diversity. Design diversity has long been used to protect redundant systems against common-mode failures. The conventional notion of diversity relies on 'independent' generation of 'different' implementations. This paper discusses the design and development of a diversified real-time computer which will replace one of the computer systems in the dual redundant architecture. Compact PCI (cPCI) bus systems are widely used in safety-critical applications such as avionics, railways and defence, and use diverse electrical signaling and logical specifications; hence cPCI was chosen for development of the diversified system. Towards the initial development, a CPU card based on an ARM-9 processor, a 16-channel Relay Output (RO) card and a 30-channel Analog Input (AI) card were developed. All the cards mentioned support hot-swap and geographic addressing capability. In order to mitigate the component obsolescence problem, the 32-bit PCI target controller and associated glue logic for the slave I/O cards were indigenously developed using VHDL. U-Boot was selected as the boot loader and ARM Linux 2.6 as the preliminary operating system for the CPU card. Board-specific initialization code for the CPU card was written in ARM assembly language and serial port initialization was written in C. The boot loader, along with the Linux 2.6 kernel and a jffs2 file system, was flashed into the CPU card. Test applications written in C were used to test the various peripherals of the CPU card. Device drivers for the AI and RO cards were developed as Linux kernel modules and an application library was also

  5. Development and Evaluation of Computer-Based Laboratory Practical Learning Tool

    Science.gov (United States)

    Gandole, Y. B.

    2006-01-01

    Effective evaluation of educational software is a key issue for the successful introduction of advanced tools in the curriculum. This paper details the development and evaluation of a tool for computer-assisted learning in science laboratory courses. The process was based on the generic instructional system design model. Various categories of educational…

  6. Obstetrical ultrasound data-base management system by using personal computer

    International Nuclear Information System (INIS)

    Jeon, Hae Jeong; Park, Jeong Hee; Kim, Soo Nyung

    1993-01-01

    A computer program which performs obstetric calculations, written in the Clipper language using data from ultrasonography, was developed for the personal computer. It was designed for fast assessment of fetal development and prediction of gestational age and weight from ultrasonographic measurements, which included biparietal diameter, femur length, gestational sac, occipito-frontal diameter, abdominal diameter, etc. The Obstetrical-Ultrasound Data-Base Management System was tested for its performance and proved very useful in patient management, with its convenient data filing, easy retrieval of previous reports, prompt but accurate estimation of fetal growth and skeletal anomaly, and production of equations and growth curves for pregnant women
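The abstract does not give the regression equations the program used; purely to illustrate the filing-plus-estimation pattern it describes, the sketch below stores a measurement record and applies a linear gestational-age mapping with entirely hypothetical coefficients. Real systems use published obstetric formulas (e.g. Hadlock's), not these placeholder numbers.

```python
from dataclasses import dataclass, field

@dataclass
class UltrasoundRecord:
    patient_id: str
    bpd_mm: float   # biparietal diameter
    fl_mm: float    # femur length

def estimate_ga_weeks(rec, a=0.25, b=8.0):
    """Hypothetical linear BPD-to-gestational-age mapping GA = a*BPD + b.
    Coefficients are illustrative placeholders, NOT clinical values."""
    return a * rec.bpd_mm + b

records = {}  # patient_id -> list of filed records, for easy retrieval

def file_record(rec):
    records.setdefault(rec.patient_id, []).append(rec)

rec = UltrasoundRecord("P001", bpd_mm=80.0, fl_mm=60.0)
file_record(rec)
```

The point of the sketch is the data-base shape (file once, retrieve all previous reports per patient), not the numbers.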

  7. Rapid mental computation system as a tool for algorithmic thinking of elementary school students development

    OpenAIRE

    Ziatdinov, Rushan; Musa, Sajid

    2013-01-01

    In this paper, we describe the possibilities of using a rapid mental computation system in elementary education. The system consists of a number of readily memorized operations that allow one to perform arithmetic computations very quickly. These operations are in fact simple algorithms which can develop or improve the algorithmic thinking of pupils. Using a rapid mental computation system forms a basis for the study of computer science in secondary school.
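One classic operation of the kind such systems teach is squaring a number ending in 5: for n = 10a + 5, n² = 100·a·(a+1) + 25. The paper does not identify its specific operations, so this is a generic example of a "readily memorized" algorithm:

```python
def square_ending_in_5(n):
    """Square an integer ending in 5 via the mental shortcut
    (10a + 5)^2 = 100*a*(a+1) + 25: multiply the leading digits
    by one more than themselves, then append 25."""
    assert n % 10 == 5, "shortcut only applies to numbers ending in 5"
    a = n // 10
    return 100 * a * (a + 1) + 25
```

For example, 35² becomes 3·4 = 12 followed by 25, i.e. 1225, which a pupil can do entirely in their head.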

  8. DIII-D tokamak control and neutral beam computer system upgrades

    International Nuclear Information System (INIS)

    Penaflor, B.G.; McHarg, B.B.; Piglowski, D.A.; Pham, D.; Phillips, J.C.

    2004-01-01

    This paper covers recent computer system upgrades made to the DIII-D tokamak control and neutral beam computer systems. The systems responsible for monitoring and controlling the DIII-D tokamak and injecting neutral beam power have recently come online with new computing hardware and software. The new hardware and software have provided a number of significant improvements over the previous Modcomp AEG VME and Accessware based systems. These improvements include the incorporation of faster, less expensive, and more readily available computing hardware, which has provided performance increases of up to a factor of 20 over the prior systems. A more modern graphical user interface with advanced plotting capabilities has improved feedback to users on the operating status of the tokamak and neutral beam systems. The elimination of aging and unsupportable hardware and software has increased overall maintainability. The distinguishing characteristics of the new system include: (1) a PC-based computer platform running the Red Hat version of the Linux operating system; (2) a custom PCI CAMAC software driver developed by General Atomics for the Kinetic Systems 2115 serial highway card; and (3) a custom-developed supervisory control and data acquisition (SCADA) software package based on Kylix, an inexpensive interactive development environment (IDE) tool from Borland Corporation. This paper provides specific details of the upgraded computer systems

  9. Novel opportunities for computational biology and sociology in drug discovery☆

    Science.gov (United States)

    Yao, Lixia; Evans, James A.; Rzhetsky, Andrey

    2013-01-01

    Current drug discovery is impossible without sophisticated modeling and computation. In this review we outline previous advances in computational biology and, by tracing the steps involved in pharmaceutical development, explore a range of novel, high-value opportunities for computational innovation in modeling the biological process of disease and the social process of drug discovery. These opportunities include text mining for new drug leads, modeling molecular pathways and predicting the efficacy of drug cocktails, analyzing genetic overlap between diseases and predicting alternative drug use. Computation can also be used to model research teams and innovative regions and to estimate the value of academy–industry links for scientific and human benefit. Attention to these opportunities could promise punctuated advance and will complement the well-established computational work on which drug discovery currently relies. PMID:20349528

  10. Novel opportunities for computational biology and sociology in drug discovery

    Science.gov (United States)

    Yao, Lixia

    2009-01-01

    Drug discovery today is impossible without sophisticated modeling and computation. In this review we touch on previous advances in computational biology and by tracing the steps involved in pharmaceutical development, we explore a range of novel, high value opportunities for computational innovation in modeling the biological process of disease and the social process of drug discovery. These opportunities include text mining for new drug leads, modeling molecular pathways and predicting the efficacy of drug cocktails, analyzing genetic overlap between diseases and predicting alternative drug use. Computation can also be used to model research teams and innovative regions and to estimate the value of academy-industry ties for scientific and human benefit. Attention to these opportunities could promise punctuated advance, and will complement the well-established computational work on which drug discovery currently relies. PMID:19674801

  11. Evolution of facility layout requirements and CAD [computer-aided design] system development

    International Nuclear Information System (INIS)

    Jones, M.

    1990-06-01

    The overall configuration of the Superconducting Super Collider (SSC) including the infrastructure and land boundary requirements were developed using a computer-aided design (CAD) system. The evolution of the facility layout requirements and the use of the CAD system are discussed. The emphasis has been on minimizing the amount of input required and maximizing the speed by which the output may be obtained. The computer system used to store the data is also described

  12. Development of Computer Program for Analysis of Irregular Non Homogenous Radiation Shielding

    International Nuclear Information System (INIS)

    Bang Rozali; Nina Kusumah; Hendro Tjahjono; Darlis

    2003-01-01

    A computer program for radiation shielding analysis has been developed to calculate radiation attenuation in non-homogeneous radiation shielding and irregular geometries. Given the radiation source strength, the geometrical shape of the radiation source, and the location, dimensions and geometrical shape of the radiation shielding, the radiation level at a point at a certain position from the radiation source can be calculated. Using the computer program, radiation distribution analysis results can be obtained for several analytical points simultaneously. (author)
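For the simplest case such a program must handle (a point source behind a homogeneous slab), the attenuation follows the standard uncollided-flux law φ = S·e^(−μt)/(4πr²). The program in the paper treats irregular, non-homogeneous shields; the sketch below covers only this minimal homogeneous case and ignores build-up factors:

```python
import math

def flux_at_point(source_strength, distance_m, mu_per_m=0.0, thickness_m=0.0):
    """Uncollided flux from an isotropic point source behind a homogeneous
    slab: inverse-square geometric spreading times exponential attenuation.
    Scattering build-up is ignored in this sketch."""
    geometric = source_strength / (4.0 * math.pi * distance_m ** 2)
    return geometric * math.exp(-mu_per_m * thickness_m)
```

For a layered (non-homogeneous) shield the exponent generalizes to the sum of μᵢ·tᵢ over the layers crossed along the source-to-point ray, which is where the irregular-geometry bookkeeping of the paper's program comes in.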

  13. Development and validation of the computer technology literacy self-assessment scale for Taiwanese elementary school students.

    Science.gov (United States)

    Chang, Chiung-Sui

    2008-01-01

    The purpose of this study was to describe the development and validation of an instrument to identify various dimensions of the computer technology literacy self-assessment scale (CTLS) for elementary school students. The instrument included five CTLS dimensions (subscales): technology operation skills, computer usage concepts, attitudes toward computer technology, learning with technology, and Internet operation skills. Participants were 1,539 elementary school students in Taiwan. Data analysis indicated that the instrument developed in the study had satisfactory validity and reliability. Correlation analysis supported the legitimacy of using multiple dimensions in representing students' computer technology literacy. Significant differences were found between male and female students, and between grades, on some CTLS dimensions. Suggestions are made for use of the instrument to examine the complicated interplay between students' computer behaviors and their computer technology literacy.

  14. Development of personnel exposure management system with personal computer

    International Nuclear Information System (INIS)

    Yamato, Ichiro; Yamamoto, Toshiki

    1992-01-01

    In nuclear power plants, large-scale personnel exposure management systems have been developed and established by the utilities. Though common at the base, the implementations are plant-specific. Contractors must control their workers' exposures with their own methods and systems. To comply with the utilities' parent systems, contractors' systems tend to differ from plant to plant, which makes it difficult for contractors to design a standard system common to all relevant plants. Under these circumstances, however, we have developed a system which is applicable to various customer utilities with minimal variations, using personal computers with database management and data communication software, at relatively low cost. We hope that this system will develop into the standard model for all Japanese contractors' personnel exposure management systems. (author)

  15. A methodology to develop computational phantoms with adjustable posture for WBC calibration

    International Nuclear Information System (INIS)

    Fonseca, T C Ferreira; Vanhavere, F; Bogaerts, R; Hunt, John

    2014-01-01

    A Whole Body Counter (WBC) is a facility to routinely assess the internal contamination of exposed workers, especially in the case of radiation release accidents. The calibration of the counting device is usually done by using anthropomorphic physical phantoms representing the human body. Due to such a challenge of constructing representative physical phantoms a virtual calibration has been introduced. The use of computational phantoms and the Monte Carlo method to simulate radiation transport have been demonstrated to be a worthy alternative. In this study we introduce a methodology developed for the creation of realistic computational voxel phantoms with adjustable posture for WBC calibration. The methodology makes use of different software packages to enable the creation and modification of computational voxel phantoms. This allows voxel phantoms to be developed on demand for the calibration of different WBC configurations. This in turn helps to study the major source of uncertainty associated with the in vivo measurement routine which is the difference between the calibration phantoms and the real persons being counted. The use of realistic computational phantoms also helps the optimization of the counting measurement. Open source codes such as MakeHuman and Blender software packages have been used for the creation and modelling of 3D humanoid characters based on polygonal mesh surfaces. Also, a home-made software was developed whose goal is to convert the binary 3D voxel grid into a MCNPX input file. This paper summarizes the development of a library of phantoms of the human body that uses two basic phantoms called MaMP and FeMP (Male and Female Mesh Phantoms) to create a set of male and female phantoms that vary both in height and in weight. Two sets of MaMP and FeMP phantoms were developed and used for efficiency calibration of two different WBC set-ups: the Doel NPP WBC laboratory and AGM laboratory of SCK-CEN in Mol, Belgium. (paper)
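The conversion step from a binary voxel grid to a Monte Carlo input deck typically compresses each row of material indices by run-length encoding before writing the lattice. The sketch below shows only that generic compression step; the actual MCNPX lattice syntax emitted by the authors' home-made converter is not reproduced here.

```python
def run_length_encode(row):
    """Compress a row of voxel material IDs into (material, count) runs,
    the usual precursor to writing a repeated-structure lattice line."""
    runs = []
    for m in row:
        if runs and runs[-1][0] == m:
            runs[-1] = (m, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((m, 1))              # start a new run
    return runs
```

Applied row by row over the voxel grid, this keeps the input deck small even for fine phantom resolutions, since tissue regions are long constant runs.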

  16. A methodology to develop computational phantoms with adjustable posture for WBC calibration

    Science.gov (United States)

    Ferreira Fonseca, T. C.; Bogaerts, R.; Hunt, John; Vanhavere, F.

    2014-11-01

    A Whole Body Counter (WBC) is a facility to routinely assess the internal contamination of exposed workers, especially in the case of radiation release accidents. The calibration of the counting device is usually done by using anthropomorphic physical phantoms representing the human body. Due to such a challenge of constructing representative physical phantoms a virtual calibration has been introduced. The use of computational phantoms and the Monte Carlo method to simulate radiation transport have been demonstrated to be a worthy alternative. In this study we introduce a methodology developed for the creation of realistic computational voxel phantoms with adjustable posture for WBC calibration. The methodology makes use of different software packages to enable the creation and modification of computational voxel phantoms. This allows voxel phantoms to be developed on demand for the calibration of different WBC configurations. This in turn helps to study the major source of uncertainty associated with the in vivo measurement routine which is the difference between the calibration phantoms and the real persons being counted. The use of realistic computational phantoms also helps the optimization of the counting measurement. Open source codes such as MakeHuman and Blender software packages have been used for the creation and modelling of 3D humanoid characters based on polygonal mesh surfaces. Also, a home-made software was developed whose goal is to convert the binary 3D voxel grid into a MCNPX input file. This paper summarizes the development of a library of phantoms of the human body that uses two basic phantoms called MaMP and FeMP (Male and Female Mesh Phantoms) to create a set of male and female phantoms that vary both in height and in weight. Two sets of MaMP and FeMP phantoms were developed and used for efficiency calibration of two different WBC set-ups: the Doel NPP WBC laboratory and AGM laboratory of SCK-CEN in Mol, Belgium.

  17. Recent Development in Rigorous Computational Methods in Dynamical Systems

    OpenAIRE

    Arai, Zin; Kokubu, Hiroshi; Pilarczyk, Paweł

    2009-01-01

    We highlight selected results of recent development in the area of rigorous computations which use interval arithmetic to analyse dynamical systems. We describe general ideas and selected details of different ways of approach and we provide specific sample applications to illustrate the effectiveness of these methods. The emphasis is put on a topological approach, which combined with rigorous calculations provides a broad range of new methods that yield mathematically rel...
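The rigorous computations surveyed rest on interval arithmetic, in which every operation returns an interval guaranteed to enclose the true result. A minimal sketch follows; real implementations additionally use directed (outward) rounding, which is omitted here:

```python
class Interval:
    """Closed interval [lo, hi] with enclosure-preserving arithmetic."""
    def __init__(self, lo, hi):
        assert lo <= hi
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        # sum of intervals: add endpoints
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # product: extremes occur at endpoint products
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"
```

Because every result interval contains the exact value, a property verified on the intervals (e.g. an image set avoiding a region) holds mathematically, which is what makes such computations usable in proofs about dynamical systems.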

  18. An Algebra-Based Introductory Computational Neuroscience Course with Lab.

    Science.gov (United States)

    Fink, Christian G

    2017-01-01

    A course in computational neuroscience has been developed at Ohio Wesleyan University which requires no previous experience with calculus or computer programming, and which exposes students to theoretical models of neural information processing and techniques for analyzing neural data. The exploration of theoretical models of neural processes is conducted in the classroom portion of the course, while data analysis techniques are covered in lab. Students learn to program in MATLAB and are offered the opportunity to conclude the course with a final project in which they explore a topic of their choice within computational neuroscience. Results from a questionnaire administered at the beginning and end of the course indicate significant gains in student facility with core concepts in computational neuroscience, as well as with analysis techniques applied to neural data.
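A leaky integrate-and-fire neuron is the kind of theoretical model of neural information processing such a course explores, and it needs nothing beyond algebra: the membrane voltage decays toward rest and resets after crossing threshold. A MATLAB-style sketch in Python (parameter values are illustrative, not taken from the course):

```python
def lif_simulate(i_input, dt=0.1, tau=10.0, v_rest=0.0, v_thresh=1.0, steps=1000):
    """Forward-Euler integration of the leaky integrate-and-fire model
    dV/dt = (-(V - v_rest) + I) / tau, resetting V to v_rest whenever
    it crosses v_thresh. Returns the number of spikes fired."""
    v, spikes = v_rest, 0
    for _ in range(steps):
        v += dt * (-(v - v_rest) + i_input) / tau
        if v >= v_thresh:
            spikes += 1
            v = v_rest
    return spikes
```

With constant input I, the voltage relaxes toward v_rest + I, so the neuron fires only when I exceeds the threshold gap; sweeping i_input reproduces the model's characteristic input-rate curve.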

  19. The design development and commissioning of two distributed computer based boiler control systems

    International Nuclear Information System (INIS)

    Collier, D.; Johnstone, L.R.; Pringle, S.T.; Walker, R.W.

    1980-01-01

    The CEGB N.E. Region has recently commissioned two major boiler control schemes using distributed computer control systems. Both systems have considerable development potential to allow modifications to meet changing operational requirements. The distributed approach to control was chosen in both instances to achieve high control-system availability and as a method of easing the commissioning programmes. The experience gained with these two projects has reinforced the view that distributed computer systems show advantages over centralised single computers, especially if the software is designed for the distributed system. (auth)

  20. Developing Computer-Assisted Instruction Multimedia For Educational Technology Course of Coastal Area Students

    Science.gov (United States)

    Idris, Husni; Nurhayati, Nurhayati; Satriani, Satriani

    2018-05-01

    This research aims to a) identify instructional software (interactive multimedia CDs) by developing Computer-Assisted Instruction (CAI) multimedia that is eligible to be used in the instruction of the Educational Technology course; b) analyze the role of instructional software (interactive multimedia CDs) in the Educational Technology course through the development of Computer-Assisted Instruction (CAI) multimedia to improve the quality of education and instructional activities. This is Research and Development (R&D). It employed the descriptive procedural model of development, which outlines the steps to be taken to develop a product, which is instructional multimedia. The number of subjects of the research trial or respondents for each stage was 20 people. To maintain development quality, an expert in materials outside the materials under study, an expert in materials who is also an Educational Technology lecturer, a small group of 3 students, a medium-sized group of 10 students, and 20 students participating in the field testing took part in this research. Then, data collection instruments were developed in two stages, namely: a) developing the instruments; and b) trying out the instruments. Data on students’ responses were collected using questionnaires and analyzed using descriptive statistics with percentage and categorization techniques. Based on data analysis results, it is revealed that the Computer-Assisted Instruction (CAI) multimedia developed and tried out among students during the preliminary field testing falls into the “Good” category, with the aspects of instruction, materials, and media falling into the “Good” category. Subsequently, results of the main field testing among students also suggest that it falls into the “Good” category, with the aspects of instruction, materials, and media falling into the “Good” category. Similarly, results of the operational field testing among students also suggest that it falls into the

  1. Some computational challenges of developing efficient parallel algorithms for data-dependent computations in thermal-hydraulics supercomputer applications

    International Nuclear Information System (INIS)

    Woodruff, S.B.

    1994-01-01

    The Transient Reactor Analysis Code (TRAC), which features a two-fluid treatment of thermal-hydraulics, is designed to model transients in water reactors and related facilities. One of the major computational costs associated with TRAC and similar codes is calculating constitutive coefficients. Although the formulations for these coefficients are local, the costs are flow-regime- or data-dependent; i.e., the computations needed for a given spatial node often vary widely as a function of time. Consequently, a fixed, uniform assignment of nodes to parallel processors will result in degraded computational efficiency due to poor load balancing. A standard method for treating data-dependent models on vector architectures has been to use gather operations (or indirect addressing) to sort the nodes into subsets that (temporarily) share a common computational model. However, this method is not effective on distributed-memory data-parallel architectures, where indirect addressing involves expensive communication overhead. Another serious problem with this method involves software engineering challenges in the areas of maintainability and extensibility. For example, an implementation that was hand-tuned to achieve good computational efficiency would have to be rewritten whenever the decision tree governing the sorting was modified. Using an example based on the calculation of the wall-to-liquid and wall-to-vapor heat-transfer coefficients for three nonboiling flow regimes, we describe how the use of the Fortran 90 WHERE construct and automatic inlining of functions can ameliorate this problem while improving both efficiency and software engineering. Unfortunately, a general automatic solution to the load-balancing problem associated with data-dependent computations is not yet available for massively parallel architectures. We discuss why developers should either wait for such solutions or consider alternative numerical algorithms, such as a neural network
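
    The masked-assignment idea the abstract attributes to Fortran 90's WHERE construct can be sketched in NumPy. The regime formulas below are invented placeholders, not TRAC's actual heat-transfer correlations; the point is only that every node is evaluated data-parallel, with no gather/sort or indirect addressing:

    ```python
    import numpy as np

    def heat_transfer_coeff(regime, velocity, dt_wall):
        """Regime-dependent coefficient per node, WHERE-style.

        regime: integer regime code per node (0, 1, or 2).
        velocity, dt_wall: per-node arrays used by the (made-up) formulas.
        """
        h = np.zeros_like(velocity)
        # Each boolean mask selects the nodes currently in one flow regime;
        # all branches are applied uniformly across the array, so no
        # sorting of nodes into regime subsets is needed.
        h = np.where(regime == 0, 100.0 + 5.0 * velocity, h)
        h = np.where(regime == 1, 250.0 * np.sqrt(np.abs(velocity)), h)
        h = np.where(regime == 2, 50.0 + 2.0 * dt_wall, h)
        return h

    regime = np.array([0, 1, 2, 1])
    velocity = np.array([1.0, 4.0, 2.0, 9.0])
    dt_wall = np.array([10.0, 20.0, 30.0, 40.0])
    print(heat_transfer_coeff(regime, velocity, dt_wall))
    ```

    If the regime decision tree changes, only the mask expressions change; no hand-tuned sorting code has to be rewritten, which is the maintainability point the abstract makes.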

  2. Computing nucleon EDM on a lattice

    Science.gov (United States)

    Abramczyk, Michael; Aoki, Sinya; Blum, Tom; Izubuchi, Taku; Ohki, Hiroshi; Syritsyn, Sergey

    2018-03-01

    I will discuss briefly recent changes in the methodology of computing the baryon EDM on a lattice. The associated correction substantially reduces presently existing lattice values for the proton and neutron theta-induced EDMs, so that even the most precise previous lattice results become consistent with zero. On one hand, this change removes previous disagreements between these lattice results and the phenomenological estimates of the nucleon EDM. On the other hand, the nucleon EDM becomes much harder to compute on a lattice. In addition, I will review the progress in computing quark chromo-EDM-induced nucleon EDM using chiral quark action.

  3. Computing nucleon EDM on a lattice

    Energy Technology Data Exchange (ETDEWEB)

    Abramczyk, Michael; Izubuchi, Taku

    2017-06-18

    I will discuss briefly recent changes in the methodology of computing the baryon EDM on a lattice. The associated correction substantially reduces presently existing lattice values for the proton and neutron theta-induced EDMs, so that even the most precise previous lattice results become consistent with zero. On one hand, this change removes previous disagreements between these lattice results and the phenomenological estimates of the nucleon EDM. On the other hand, the nucleon EDM becomes much harder to compute on a lattice. In addition, I will review the progress in computing quark chromo-EDM-induced nucleon EDM using chiral quark action.

  4. Development of a system of computer codes for severe accident analyses and its applications

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Soon Hong; Cheon, Moon Heon; Cho, Nam jin; No, Hui Cheon; Chang, Hyeon Seop; Moon, Sang Kee; Park, Seok Jeong; Chung, Jee Hwan [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1991-12-15

    The objective of this study is to develop a system of computer codes for postulated severe accident analyses in nuclear power plants. This system of codes is necessary to conduct individual plant examinations for domestic nuclear power plants. As a result of this study, one can conduct severe accident assessments more easily, extract the plant-specific vulnerabilities for severe accidents, and at the same time obtain ideas for enhancing overall accident resistance. The scope and contents of this study are as follows: development of a system of computer codes for severe accident analyses, and development of a severe accident management strategy.

  5. Development of a system of computer codes for severe accident analyses and its applications

    International Nuclear Information System (INIS)

    Chang, Soon Hong; Cheon, Moon Heon; Cho, Nam jin; No, Hui Cheon; Chang, Hyeon Seop; Moon, Sang Kee; Park, Seok Jeong; Chung, Jee Hwan

    1991-12-01

    The objective of this study is to develop a system of computer codes for postulated severe accident analyses in nuclear power plants. This system of codes is necessary to conduct individual plant examinations for domestic nuclear power plants. As a result of this study, one can conduct severe accident assessments more easily, extract the plant-specific vulnerabilities for severe accidents, and at the same time obtain ideas for enhancing overall accident resistance. The scope and contents of this study are as follows: development of a system of computer codes for severe accident analyses, and development of a severe accident management strategy.

  6. Systematic Methods and Tools for Computer Aided Modelling

    DEFF Research Database (Denmark)

    Fedorova, Marina

    and processes can be faster, cheaper and very efficient. The developed modelling framework involves five main elements: 1) a modelling tool, that includes algorithms for model generation; 2) a template library, which provides building blocks for the templates (generic models previously developed); 3) computer......-format and COM-objects, are incorporated to allow the export and import of mathematical models; 5) a user interface that provides the work-flow and data-flow to guide the user through the different modelling tasks....

  7. Articles on Practical Cybernetics. Computer-Developed Computers; Heuristics and Modern Sciences; Linguistics and Practice; Cybernetics and Moral-Ethical Considerations; and Men and Machines at the Chessboard.

    Science.gov (United States)

    Berg, A. I.; And Others

    Five articles which were selected from a Russian language book on cybernetics and then translated are presented here. They deal with the topics of: computer-developed computers, heuristics and modern sciences, linguistics and practice, cybernetics and moral-ethical considerations, and computer chess programs. (Author/JY)

  8. Conceptual aspects: analyses law, ethical, human, technical, social factors of development ICT, e-learning and intercultural development in different countries setting out the previous new theoretical model and preliminary findings

    NARCIS (Netherlands)

    Kommers, Petrus A.M.; Smyrnova-Trybulska, Eugenia; Morze, Natalia; Issa, Tomayess; Issa, Theodora

    2015-01-01

    This paper, prepared by an international team of authors focuses on the conceptual aspects: analyses law, ethical, human, technical, social factors of ICT development, e-learning and intercultural development in different countries, setting out the previous and new theoretical model and preliminary

  9. Development of computer code in PNC, 8

    International Nuclear Information System (INIS)

    Ohhira, Mitsuru

    1990-01-01

    Private buildings applying base isolation systems are now at the practical stage. So, under the Construction and Maintenance Management Office, we are conducting an application study of base isolation systems for nuclear fuel facilities. In the course of this study, we have developed the Dynamic Analysis Program-Base Isolation System (DAP-BS), which is able to run on a 32-bit personal computer. Using this program, we can analyze a 3-dimensional structure and evaluate the various properties of base isolation parts that are divided into a maximum of 16 blocks. From the results of some simulation analyses, we judged that DAP-BS had good reliability and marketability, so we have put DAP-BS on the market. (author)

  10. The development of computer ethics: contributions from business ethics and medical ethics.

    Science.gov (United States)

    Wong, K; Steinke, G

    2000-04-01

    In this essay, we demonstrate that the field of computer ethics shares many core similarities with two other areas of applied ethics. Academicians writing and teaching in the area of computer ethics, along with practitioners, must address ethical issues that are qualitatively similar in nature to those raised in medicine and business. In addition, as academic disciplines, these three fields also share some similar concerns. For example, all face the difficult challenge of maintaining a credible dialogue with diverse constituents such as academicians of various disciplines, professionals, policymakers, and the general public. Given these similarities, the fields of bioethics and business ethics can serve as useful models for the development of computer ethics.

  11. Development of a Computer-Based Measure of Listening Comprehension of Science Talk

    Science.gov (United States)

    Lin, Sheau-Wen; Liu, Yu; Chen, Shin-Feng; Wang, Jing-Ru; Kao, Huey-Lien

    2015-01-01

    The purpose of this study was to develop a computer-based assessment for elementary school students' listening comprehension of science talk within an inquiry-oriented environment. The development procedure had 3 steps: a literature review to define the framework of the test, collecting and identifying key constructs of science talk, and…

  12. Development of Probabilistic Internal Dosimetry Computer Code

    Energy Technology Data Exchange (ETDEWEB)

    Noh, Siwan [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kwon, Tae-Eun [Korea Institute of Radiological and Medical Sciences, Seoul (Korea, Republic of); Lee, Jai-Ki [Korean Association for Radiation Protection, Seoul (Korea, Republic of)

    2017-02-15

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs the Bayesian and Monte Carlo methods was designed. Based on this system, we developed a probabilistic internal-dose-assessment code using MATLAB so as to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g., the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose, in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various

  13. Development of Probabilistic Internal Dosimetry Computer Code

    International Nuclear Information System (INIS)

    Noh, Siwan; Kwon, Tae-Eun; Lee, Jai-Ki

    2017-01-01

    Internal radiation dose assessment involves biokinetic models, the corresponding parameters, measured data, and many assumptions. Every component considered in the internal dose assessment has its own uncertainty, which is propagated into the intake activity and internal dose estimates. For research or scientific purposes, and for retrospective dose reconstruction for accident scenarios occurring in workplaces having a large quantity of unsealed radionuclides, such as nuclear power plants, nuclear fuel cycle facilities, and facilities in which nuclear medicine is practiced, a quantitative uncertainty assessment of the internal dose is often required. However, no calculation tools or computer codes that incorporate all the relevant processes and their corresponding uncertainties, i.e., from the measured data to the committed dose, are available. Thus, the objective of the present study is to develop an integrated probabilistic internal-dose-assessment computer code. First, the uncertainty components in internal dosimetry are identified, and quantitative uncertainty data are collected. Then, an uncertainty database is established for each component. In order to propagate these uncertainties in an internal dose assessment, a probabilistic internal-dose-assessment system that employs the Bayesian and Monte Carlo methods was designed. Based on this system, we developed a probabilistic internal-dose-assessment code using MATLAB so as to estimate the dose distributions from the measured data with uncertainty. Using the developed code, we calculated the internal dose distribution and statistical values (e.g., the 2.5th, 5th, median, 95th, and 97.5th percentiles) for three sample scenarios. On the basis of the distributions, we performed a sensitivity analysis to determine the influence of each component on the resulting dose, in order to identify the major component of the uncertainty in a bioassay. The results of this study can be applied to various situations. In cases
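
    The core Monte Carlo step described in entries 12 and 13 can be sketched compactly. This is a hedged illustration, not the authors' MATLAB code: the toy lognormal/normal input distributions, the single intake-to-measurement fraction, and the dose coefficient value below are all invented stand-ins for the paper's biokinetic models and uncertainty database.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000  # number of Monte Carlo samples

    # Assumed toy inputs: an uncertain bioassay measurement (Bq), an uncertain
    # fraction relating intake to measurement, and an uncertain dose
    # coefficient (Sv per Bq of intake). All distributions are illustrative.
    measured_bq = rng.lognormal(mean=np.log(50.0), sigma=0.2, size=n)
    intake_fraction = rng.normal(loc=0.05, scale=0.005, size=n)
    dose_coeff = rng.lognormal(mean=np.log(1e-7), sigma=0.3, size=n)

    intake_bq = measured_bq / intake_fraction   # back-calculated intake samples
    dose_sv = intake_bq * dose_coeff            # committed-dose samples

    # Report the same statistics the abstract names.
    pcts = np.percentile(dose_sv, [2.5, 5, 50, 95, 97.5])
    for p, v in zip([2.5, 5, 50, 95, 97.5], pcts):
        print(f"{p:5.1f}th percentile: {v:.3e} Sv")
    ```

    A sensitivity analysis like the one described would then vary one input distribution at a time and compare the spread of the resulting dose distributions.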

  14. A computationally efficient fuzzy control s

    Directory of Open Access Journals (Sweden)

    Abdel Badie Sharkawy

    2013-12-01

    This paper develops a decentralized fuzzy control scheme for MIMO nonlinear second-order systems, with application to robot manipulators, via a combination of genetic algorithms (GAs) and fuzzy systems. The controller for each degree of freedom (DOF) consists of a feedforward fuzzy torque-computing system and a feedback fuzzy PD system. The feedforward fuzzy system is trained and optimized off-line using GAs, whereby not only the parameters but also the structure of the fuzzy system is optimized. The feedback fuzzy PD system, on the other hand, is used to keep the closed loop stable. The rule base consists of only four rules per DOF. Furthermore, the fuzzy feedback system is decentralized and simplified, leading to a computationally efficient control scheme. The proposed control scheme has the following advantages: (1) it needs no exact dynamics of the system, and the computation is time-saving because of the simple structure of the fuzzy systems; and (2) the controller is robust against various parameter and payload uncertainties. The computational complexity of the proposed control scheme has been analyzed and compared with previous works. Computer simulations show that this controller is effective in achieving the control goals.
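
    A four-rule fuzzy PD feedback term for one DOF, as described above, can be sketched generically. The paper's actual membership functions, gains, and rule consequents are not given in the abstract, so every number here is an illustrative assumption:

    ```python
    import math

    def mu_pos(x, width=1.0):
        """Sigmoid membership degree for 'x is positive'.

        The degree for 'x is negative' is taken as the complement 1 - mu_pos(x).
        """
        return 1.0 / (1.0 + math.exp(-4.0 * x / width))

    def fuzzy_pd(error, d_error, u_max=10.0):
        """Four-rule fuzzy PD torque for one degree of freedom (sketch)."""
        ep, en = mu_pos(error), 1.0 - mu_pos(error)
        dp, dn = mu_pos(d_error), 1.0 - mu_pos(d_error)
        # Four rules (product t-norm) with crisp consequent torques:
        rules = [
            (ep * dp,  u_max),         # error positive and growing -> push hard
            (ep * dn,  0.3 * u_max),   # error positive but shrinking -> push gently
            (en * dp, -0.3 * u_max),   # error negative but shrinking -> pull gently
            (en * dn, -u_max),         # error negative and growing -> pull hard
        ]
        # Weighted-average (centroid-style) defuzzification.
        num = sum(w * u for w, u in rules)
        den = sum(w for w, _ in rules)
        return num / den

    # At the setpoint (zero error, zero error-rate) the rules cancel exactly.
    print(fuzzy_pd(0.0, 0.0))
    ```

    The symmetry of the rule table is what keeps the output zero at the setpoint; GA tuning, as in the paper, would adjust the membership widths and consequents per DOF.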

  15. Computers as components principles of embedded computing system design

    CERN Document Server

    Wolf, Marilyn

    2012-01-01

    Computers as Components: Principles of Embedded Computing System Design, 3e, presents essential knowledge on embedded systems technology and techniques. Updated for today's embedded systems design methods, this edition features new examples including digital signal processing, multimedia, and cyber-physical systems. Author Marilyn Wolf covers the latest processors from Texas Instruments, ARM, and Microchip Technology plus software, operating systems, networks, consumer devices, and more. Like the previous editions, this textbook: Uses real processors to demonstrate both technology and tec

  16. Look back and look forward to the future of computer applications in the field of nuclear science and technology

    International Nuclear Information System (INIS)

    Yang Yanming; Dai Guiling

    1988-01-01

    All previous National Conferences on computer applications in the field of nuclear science and technology, sponsored by the Society of Nuclear Electronics and Detection Technology, are reviewed. Surveys are given of the basic situation and technical level of computer applications in each period. Some points concerning possible developments of computer techniques are given as well

  17. Patterns of students' computer use and relations to their computer and information literacy

    DEFF Research Database (Denmark)

    Bundsgaard, Jeppe; Gerick, Julia

    2017-01-01

    Background: Previous studies have shown that there is a complex relationship between students’ computer and information literacy (CIL) and their use of information and communication technologies (ICT) for both recreational and school use. Methods: This study seeks to dig deeper into these complex relations by identifying different patterns of students’ school-related and recreational computer use in the 21 countries participating in the International Computer and Information Literacy Study (ICILS 2013). Results: Latent class analysis (LCA) of the student questionnaire and performance data from …, raising important questions about differences in contexts. Keywords: ICILS, Computer use, Latent class analysis (LCA), Computer and information literacy.

  18. Cloud Computing: The Level of Awareness Amongst Small & Medium-sized Enterprises (SMEs) in Developing Economies

    DEFF Research Database (Denmark)

    Yeboah-Boateng, Ezer Osei; Essandoh, Kofi Asare

    2013-01-01

    Cloud computing services are being touted as a major enabler for small businesses lately. This new paradigm is seen to offer unique opportunities to small and medium enterprises (SMEs) worldwide, and developing economies are no exception. It presents SMEs access to similar technologies available to their larger counterparts and those in the developed world, which inherently creates innovativeness, increases competitive advantage and impacts their operations and processes. This paper seeks to determine the level of awareness and familiarity with this emerging computing paradigm. The results of the study indicated that a slight majority of the respondents were familiar with cloud computing on the individual level, but the level of awareness amongst the larger SME industry was low to medium. The finding therefore recommends education and sensitization on cloud computing in order to increase awareness.

  19. Computationally Developed Sham Stimulation Protocol for Multichannel Desynchronizing Stimulation

    Directory of Open Access Journals (Sweden)

    Magteld Zeitler

    2018-05-01

    A characteristic pattern of abnormal brain activity is abnormally strong neuronal synchronization, as found in several brain disorders, such as tinnitus, Parkinson's disease, and epilepsy. As observed in several diseases, different therapeutic interventions may induce a placebo effect that may be strong and hinder reliable clinical evaluations. Hence, to distinguish between specific, neuromodulation-induced effects and unspecific, placebo effects, it is important to mimic the therapeutic procedure as precisely as possible, thereby providing controls that actually lack specific effects. Coordinated Reset (CR) stimulation has been developed to specifically counteract abnormally strong synchronization by desynchronization. CR is a spatio-temporally patterned multichannel stimulation which reduces the extent of coincident neuronal activity and aims at an anti-kindling, i.e., an unlearning of both synaptic connectivity and neuronal synchrony. Apart from acute desynchronizing effects, CR may cause sustained, long-lasting desynchronizing effects, as already demonstrated in pre-clinical and clinical proof-of-concept studies. In this computational study, we set out to computationally develop a sham stimulation protocol for multichannel desynchronizing stimulation. To this end, we compare acute effects and long-lasting effects of six different spatio-temporally patterned stimulation protocols, including three variants of CR, using a no-stimulation condition as additional control. This is to provide an inventory of different stimulation algorithms with similar fundamental stimulation parameters (e.g., mean stimulation rates) but qualitatively different acute and/or long-lasting effects. Stimulation protocols sharing basic parameters, but nevertheless inducing completely different or even no acute effects and/or after-effects, might serve as controls to validate the specific effects of particular desynchronizing protocols such as CR.
In particular, based on
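
    The scheduling idea behind CR stimulation can be sketched as follows. This is a generic illustration, not the paper's specific protocol variants or neuronal model: in each stimulation cycle, every channel delivers one burst at evenly spaced phases within the cycle, in a freshly shuffled channel order, so that the stimulated subpopulations are reset out of phase with one another.

    ```python
    import random

    def cr_schedule(n_channels, n_cycles, cycle_period, seed=0):
        """Return a list of (onset_time, channel) stimulus events.

        Each cycle of length cycle_period contains one burst per channel,
        spaced cycle_period / n_channels apart, in a random spatial order
        that is reshuffled every cycle (the CR-style varying pattern).
        """
        rng = random.Random(seed)
        slot = cycle_period / n_channels
        events = []
        for c in range(n_cycles):
            order = list(range(n_channels))
            rng.shuffle(order)  # new spatial pattern each cycle
            for k, ch in enumerate(order):
                events.append((c * cycle_period + k * slot, ch))
        return events

    for t, ch in cr_schedule(n_channels=4, n_cycles=2, cycle_period=1.0):
        print(f"t = {t:.2f} s -> channel {ch}")
    ```

    A sham variant in the spirit of the study would keep these basic parameters (mean stimulation rate, number of channels) while changing the spatio-temporal pattern so that the desynchronizing effect is lost.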

  20. Integrated Graphics Operations and Analysis Lab Development of Advanced Computer Graphics Algorithms

    Science.gov (United States)

    Wheaton, Ira M.

    2011-01-01

    The focus of this project is to aid the IGOAL in researching and implementing algorithms for advanced computer graphics. First, this project focused on porting the current International Space Station (ISS) Xbox experience to the web. Previously, the ISS interior fly-around education and outreach experience ran only on an Xbox 360. One of the desires was to take this experience and make it into something that can be put on NASA's educational site for anyone to access. The current code works in the Unity game engine, which does have cross-platform capability but is not 100% compatible. The tasks for an intern to complete this portion consisted of gaining familiarity with Unity and the current ISS Xbox code, porting the Xbox code to the web as is, and modifying the code to work well as a web application. In addition, a procedurally generated cloud algorithm will be developed. Currently, the clouds used in AGEA animations and the Xbox experiences are a texture map. The desire is to create a procedurally generated cloud algorithm to provide dynamically generated clouds for both AGEA animations and the Xbox experiences. This task consists of gaining familiarity with AGEA and the plug-in interface, developing the algorithm, creating an AGEA plug-in to implement the algorithm inside AGEA, and creating a Unity script to implement the algorithm for the Xbox. This portion of the project could not be completed in the time frame of the internship; however, the IGOAL will continue to work on it in the future.

  1. A new approach in development of data flow control and investigation system for computer networks

    International Nuclear Information System (INIS)

    Frolov, I.; Vaguine, A.; Silin, A.

    1992-01-01

    This paper describes a new approach in the development of a data-flow control and investigation system for computer networks. This approach was developed and applied at the Moscow Radiotechnical Institute for control and investigation of the Institute's computer network. It has allowed us to solve our network's current problems successfully. A description of our approach is presented below, along with the most interesting results of our work. (author)

  2. Development of a Very Dense Liquid Cooled Compute Platform

    Energy Technology Data Exchange (ETDEWEB)

    Hughes, Phillip N.; Lipp, Robert J.

    2013-12-10

    The objective of this project was to design and develop a prototype very energy efficient high density compute platform with 100% pumped refrigerant liquid cooling using commodity components and high volume manufacturing techniques. Testing at SLAC has indicated that we achieved a DCIE of 0.93 against our original goal of 0.85. This number includes both cooling and power supply and was achieved employing some of the highest wattage processors available.
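
    DCIE (Data Center Infrastructure Efficiency) is the ratio of IT equipment power to total facility power, i.e. the reciprocal of PUE. The project's measured power breakdown is not given in the abstract, so the inputs below are hypothetical numbers chosen only to reproduce the reported ratio of 0.93:

    ```python
    def dcie(it_power_kw, total_power_kw):
        """DCIE = IT equipment power / total facility power (dimensionless)."""
        return it_power_kw / total_power_kw

    def pue(it_power_kw, total_power_kw):
        """PUE = total facility power / IT equipment power (reciprocal of DCIE)."""
        return total_power_kw / it_power_kw

    it_kw, total_kw = 93.0, 100.0  # hypothetical measurements
    print(f"DCIE = {dcie(it_kw, total_kw):.2f}")  # matches the reported 0.93
    print(f"PUE  = {pue(it_kw, total_kw):.3f}")
    ```

    A DCIE of 0.93 means only about 7% of facility power goes to cooling and power-conversion overhead, which is why pumped-refrigerant liquid cooling is notable here.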

  3. The influence of playing computer games on pupil's development

    OpenAIRE

    Pospíšilová, Lenka

    2008-01-01

    This thesis is about the effects of playing computer games on pupils' and students' behavior. It is divided into a theoretical and an investigative part. The theoretical part is dedicated to the historical development of technologies and principles of game systems in relation to technical progress. It adverts to the psychological, social and biological effects of long-term, intensive playing of games. It shows positive and negative effects of this activity. The work analyses typical pathological eve...

  4. On the Tengiz petroleum deposit previous study

    International Nuclear Information System (INIS)

    Nysangaliev, A.N.; Kuspangaliev, T.K.

    1997-01-01

    A previous study of the Tengiz petroleum deposit is described. Some considerations about the structure of the productive formation and the specific characteristic properties of the petroleum-bearing collectors are presented. Recommendations are given on their detailed study and on using experience from the exploration and development of petroleum deposits that are analogous in the most important geological and industrial parameters. (author)

  5. Development and application of a computer model for large-scale flame acceleration experiments

    International Nuclear Information System (INIS)

    Marx, K.D.

    1987-07-01

    A new computational model for large-scale premixed flames is developed and applied to the simulation of flame acceleration experiments. The primary objective is to circumvent the necessity for resolving turbulent flame fronts; this is imperative because of the relatively coarse computational grids which must be used in engineering calculations. The essence of the model is to artificially thicken the flame by increasing the appropriate diffusivities and decreasing the combustion rate, but to do this in such a way that the burn velocity varies with pressure, temperature, and turbulence intensity according to prespecified phenomenological characteristics. The model is particularly aimed at implementation in computer codes which simulate compressible flows. To this end, it is applied to the two-dimensional simulation of hydrogen-air flame acceleration experiments in which the flame speeds and gas flow velocities attain or exceed the speed of sound in the gas. It is shown that many of the features of the flame trajectories and pressure histories in the experiments are simulated quite well by the model. Using the comparison of experimental and computational results as a guide, some insight is developed into the processes which occur in such experiments. 34 refs., 25 figs., 4 tabs
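
    The scaling argument behind artificially thickening a flame while preserving its burn velocity can be made concrete. This is a standard laminar-flame estimate, not the paper's full phenomenological model: burn velocity scales as s_L ~ sqrt(D * w) and flame thickness as delta ~ D / s_L, where D is a diffusivity and w a reaction rate, so multiplying D by a factor F and dividing w by F leaves s_L unchanged while thickening the flame by F:

    ```python
    import math

    def flame_props(D, w):
        """Laminar flame estimates (up to O(1) constants).

        D: effective diffusivity [m^2/s]; w: reaction rate [1/s].
        Returns (burn velocity s_L, flame thickness delta).
        """
        s_L = math.sqrt(D * w)   # s_L ~ sqrt(D * w)
        delta = D / s_L          # delta ~ D / s_L
        return s_L, delta

    D, w, F = 2.0e-5, 4.0e5, 10.0   # illustrative values; F is the thickening factor
    s0, d0 = flame_props(D, w)
    s1, d1 = flame_props(F * D, w / F)   # thicken: D -> F*D, w -> w/F
    print(f"speed ratio = {s1 / s0:.3f}, thickness ratio = {d1 / d0:.1f}")
    ```

    The thickened flame can then be resolved on the coarse grids the abstract mentions, while the pressure-, temperature-, and turbulence-dependence of the burn velocity is imposed phenomenologically on top of this scaling.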

  6. Runway exit designs for capacity improvement demonstrations. Phase 2: Computer model development

    Science.gov (United States)

    Trani, A. A.; Hobeika, A. G.; Kim, B. J.; Nunna, V.; Zhong, C.

    1992-01-01

    The development is described of a computer simulation/optimization model to: (1) estimate the optimal locations of existing and proposed runway turnoffs; and (2) estimate the geometric design requirements associated with newly developed high speed turnoffs. The model described, named REDIM 2.0, represents a stand alone application to be used by airport planners, designers, and researchers alike to estimate optimal turnoff locations. The main procedures are described in detail which are implemented in the software package and possible applications are illustrated when using 6 major runway scenarios. The main output of the computer program is the estimation of the weighted average runway occupancy time for a user defined aircraft population. Also, the location and geometric characteristics of each turnoff are provided to the user.
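
    The model's headline output, the weighted average runway occupancy time (ROT) for a user-defined aircraft population, can be sketched as a simple weighted mean (REDIM's internal simulation/optimization is not reproduced; the fleet mix and ROT values below are illustrative assumptions):

    ```python
    def weighted_rot(population):
        """Weighted average runway occupancy time.

        population: list of (share_of_operations, rot_seconds) per aircraft class.
        """
        total_share = sum(share for share, _ in population)
        return sum(share * rot for share, rot in population) / total_share

    # Hypothetical aircraft population for one runway scenario.
    fleet = [
        (0.30, 45.0),   # small aircraft: short ROT, early exit
        (0.50, 52.0),   # narrow-body
        (0.20, 60.0),   # wide-body: longest ROT
    ]
    print(f"weighted average ROT = {weighted_rot(fleet):.1f} s")
    ```

    In the actual model, turnoff locations are chosen to minimize this weighted average, since lower average ROT translates directly into higher runway capacity.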

  7. BCILAB: a platform for brain-computer interface development

    Science.gov (United States)

    Kothe, Christian Andreas; Makeig, Scott

    2013-10-01

    Objective. The past two decades have seen dramatic progress in our ability to model brain signals recorded by electroencephalography, functional near-infrared spectroscopy, etc., and to derive real-time estimates of user cognitive state, response, or intent for a variety of purposes: to restore communication by the severely disabled, to effect brain-actuated control and, more recently, to augment human-computer interaction. Continuing these advances, largely achieved through increases in computational power and methods, requires software tools to streamline the creation, testing, evaluation and deployment of new data analysis methods. Approach. Here we present BCILAB, an open-source MATLAB-based toolbox built to address the need for the development and testing of brain-computer interface (BCI) methods by providing an organized collection of over 100 pre-implemented methods and method variants, an easily extensible framework for the rapid prototyping of new methods, and a highly automated framework for systematic testing and evaluation of new implementations. Main results. To validate and illustrate the use of the framework, we present two sample analyses of publicly available data sets from recent BCI competitions and from a rapid serial visual presentation task. We demonstrate the straightforward use of BCILAB to obtain results compatible with the current BCI literature. Significance. The aim of the BCILAB toolbox is to provide the BCI community a powerful toolkit for methods research and evaluation, thereby helping to accelerate the pace of innovation in the field, while complementing the existing spectrum of tools for real-time BCI experimentation, deployment and use.

  8. Dissociation in decision bias mechanism between probabilistic information and previous decision

    Directory of Open Access Journals (Sweden)

    Yoshiyuki eKaneko

    2015-05-01

    Target detection performance is known to be influenced by events in the previous trials. It has not been clear, however, whether this bias effect is due to the previous sensory stimulus, motor response, or decision. Also it remains open whether or not the previous trial effect emerges via the same mechanism as the effect of knowledge about the target probability. In the present study, we asked normal human subjects to make a decision about the presence or absence of a visual target. We presented a pre-cue indicating the target probability before the stimulus, and also a decision-response mapping cue after the stimulus so as to tease apart the effect of decision from that of motor response. We found that the target detection performance was significantly affected by the probability cue in the current trial and also by the decision in the previous trial. While the information about the target probability modulated the decision criteria, the previous decision modulated the sensitivity to target-relevant sensory signals (d-prime). Using functional magnetic resonance imaging, we also found that activation in the left intraparietal sulcus was decreased when the probability cue indicated a high probability of the target. By contrast, activation in the right inferior frontal gyrus was increased when the subjects made a target-present decision in the previous trial, but this change was observed specifically when the target was present in the current trial. Activation in these regions was associated with individual differences in the decision computation parameters. We argue that the previous decision biases the target detection performance by modulating the processing of target-selective information, and this mechanism is distinct from modulation of decision criteria due to expectation of a target.

  9. Dissociation in decision bias mechanism between probabilistic information and previous decision

    Science.gov (United States)

    Kaneko, Yoshiyuki; Sakai, Katsuyuki

    2015-01-01

    Target detection performance is known to be influenced by events in the previous trials. It has not been clear, however, whether this bias effect is due to the previous sensory stimulus, motor response, or decision. Also it remains open whether or not the previous trial effect emerges via the same mechanism as the effect of knowledge about the target probability. In the present study, we asked normal human subjects to make a decision about the presence or absence of a visual target. We presented a pre-cue indicating the target probability before the stimulus, and also a decision-response mapping cue after the stimulus so as to tease apart the effect of decision from that of motor response. We found that the target detection performance was significantly affected by the probability cue in the current trial and also by the decision in the previous trial. While the information about the target probability modulated the decision criteria, the previous decision modulated the sensitivity to target-relevant sensory signals (d-prime). Using functional magnetic resonance imaging (fMRI), we also found that activation in the left intraparietal sulcus (IPS) was decreased when the probability cue indicated a high probability of the target. By contrast, activation in the right inferior frontal gyrus (IFG) was increased when the subjects made a target-present decision in the previous trial, but this change was observed specifically when the target was present in the current trial. Activation in these regions was associated with individual differences in the decision computation parameters. We argue that the previous decision biases the target detection performance by modulating the processing of target-selective information, and this mechanism is distinct from modulation of decision criteria due to expectation of a target. PMID:25999844
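    The behavioral quantities the study dissociates, sensitivity (d-prime) and decision criterion, are standard signal detection theory measures; a generic sketch (not the authors' analysis code) using the usual z-transform of hit and false-alarm rates:

```python
# Generic signal-detection-theory computation of sensitivity (d') and
# decision criterion (c) from hit and false-alarm rates. The example rates
# are made up; this is not the paper's analysis pipeline.
from statistics import NormalDist

def d_prime_and_criterion(hit_rate, fa_rate):
    z = NormalDist().inv_cdf          # inverse standard-normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

dp, c = d_prime_and_criterion(hit_rate=0.84, fa_rate=0.16)  # symmetric case
```

    In these terms, the paper's finding is that the probability cue shifts c while the previous decision shifts d-prime.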

  10. Computable general equilibrium model fiscal year 2013 capability development report

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, Brian Keith [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rivera, Michael Kelly [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Boero, Riccardo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-17

    This report documents progress made on continued development of the National Infrastructure Simulation and Analysis Center (NISAC) Computable General Equilibrium Model (NCGEM), developed in fiscal year 2012. In fiscal year 2013, NISAC refined the treatment of the labor market and performed tests with the model to examine the properties of the solutions it computes. To examine these properties, developers conducted a series of 20 simulations for 20 U.S. States. Each of these simulations compared an economic baseline simulation with an alternative simulation that assumed a 20-percent reduction in overall factor productivity in the manufacturing industries of each State. Differences in the simulation results between the baseline and alternative simulations capture the economic impact of the reduction in factor productivity. While not every State is affected in precisely the same way, the reduction in manufacturing industry productivity negatively affects the manufacturing industries in each State to an extent proportional to the reduction in overall factor productivity. Moreover, overall economic activity decreases when manufacturing sector productivity is reduced. Developers ran two additional simulations: (1) a version of the model for the State of Michigan, with manufacturing divided into two sub-industries (automobile and other vehicle manufacturing as one sub-industry and the rest of manufacturing as the other sub-industry); and (2) a version of the model for the United States, divided into 30 industries. NISAC conducted these simulations to illustrate the flexibility of industry definitions in NCGEM and to examine the simulation properties of the model in more detail.

  11. A computer-aided framework for development, identification andmanagement of physiologically-based pharmacokinetic models

    DEFF Research Database (Denmark)

    Heitzig, Martina; Linninger, Andreas; Sin, Gürkan

    2014-01-01

    The objective of this work is the development of a generic computer-aided modelling framework to support the development of physiologically-based pharmacokinetic models thereby increasing the efficiency and quality of the modelling process. In particular, the framework systematizes the modelling...

  12. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with "direct" and "adjoint" sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs
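    The core idea behind a "computer calculus" compiler such as GRESS, propagating derivatives through a computation alongside values, can be sketched in miniature with forward-mode automatic differentiation (an illustration of the principle only, not the GRESS implementation):

```python
# Forward-mode automatic differentiation via dual numbers: each quantity
# carries (value, derivative), and arithmetic updates both via the chain rule.
class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def _coerce(self, other):
        return other if isinstance(other, Dual) else Dual(other)

    def __add__(self, other):
        other = self._coerce(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__

    def __mul__(self, other):
        other = self._coerce(other)
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)
    __rmul__ = __mul__

# Sensitivity of f(x) = 3x^2 + 2x at x = 4: f'(4) = 6*4 + 2 = 26.
x = Dual(4.0, 1.0)          # seed dx/dx = 1
f = 3 * x * x + 2 * x
```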

  13. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1986-01-01

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies

  14. Development and Assessment of a Chemistry-Based Computer Video Game as a Learning Tool

    Science.gov (United States)

    Martinez-Hernandez, Kermin Joel

    2010-01-01

    The chemistry-based computer video game is a multidisciplinary collaboration between chemistry and computer graphics and technology fields developed to explore the use of video games as a possible learning tool. This innovative approach aims to integrate elements of commercial video game and authentic chemistry context environments into a learning…

  15. Real-Time Thevenin Impedance Computation

    DEFF Research Database (Denmark)

    Sommer, Stefan Horst; Jóhannsson, Hjörtur

    2013-01-01

    operating state, and strict time constraints are difficult to adhere to as the complexity of the grid increases. Several suggested approaches for real-time stability assessment require Thevenin impedances to be determined for the observed system conditions. By combining matrix factorization, graph reduction......, and parallelization, we develop an algorithm for computing Thevenin impedances an order of magnitude faster than previous approaches. We test the factor-and-solve algorithm with data from several power grids of varying complexity, and we show how the algorithm allows real-time stability assessment of complex power...
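    One common way to obtain Thevenin impedances from a network model (sketched below as an assumption; the paper's factor-and-solve algorithm adds graph reduction and parallelization) is to note that the Thevenin impedance at bus k is the k-th diagonal entry of Z = Y^-1:

```python
# Thevenin impedance at every bus from the nodal admittance matrix Y:
# factor Y once and back-substitute unit current injections. Real-valued
# admittances are used for simplicity; the network values are made up.
import numpy as np

def thevenin_impedances(Y):
    """Diagonal of Z = Y^-1, i.e. the Thevenin impedance seen at each bus."""
    Z = np.linalg.solve(Y, np.eye(Y.shape[0]))  # one factorization, n solves
    return np.diag(Z)

# Two buses: 1.0 S from each bus to ground, 0.5 S between the buses.
Y = np.array([[1.5, -0.5],
              [-0.5, 1.5]])
zth = thevenin_impedances(Y)
```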

  16. Development of a Traditional/Computer-aided Graphics Course for Engineering Technology.

    Science.gov (United States)

    Anand, Vera B.

    1985-01-01

    Describes a two-semester-hour freshman course in engineering graphics which uses both traditional and computerized instruction. Includes course description, computer graphics topics, and recommendations. Indicates that combining interactive graphics software with development of simple programs gave students a better foundation for upper-division…

  17. Business model elements impacting cloud computing adoption

    DEFF Research Database (Denmark)

    Bogataj, Kristina; Pucihar, Andreja; Sudzina, Frantisek

    The paper presents a proposed research framework for identification of business model elements impacting Cloud Computing Adoption. We provide a definition of main Cloud Computing characteristics, discuss previous findings on factors impacting Cloud Computing Adoption, and investigate technology a...

  18. The Development Of A Computer Program For Thermohydraulic Analysis Of Kartini Reactor

    International Nuclear Information System (INIS)

    Ischaq, Ma'sum; Syarip; BS, Edi Trijono; Suyamto

    1996-01-01

    A computer program that can calculate the fuel and reactor-core temperatures has been developed. The temperature inside the fuel was calculated by an explicit method. First, the partial differential equations were arranged with the temperature as a function of fuel radius and time, T = f(r,t), and then the equations were transformed into finite-difference equations that could be solved numerically by the computer. The convection heat transfer coefficient between the fuel and the coolant was calculated from the free-convection phenomena following the relation Nu = f(Gr, Pr). With this computer program, the fuel and core temperatures under given conditions of reactor power and coolant flow can be predicted
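    An explicit scheme of the kind described can be illustrated on a simplified 1-D grid (a slab rather than the report's cylindrical fuel geometry; all values below are invented):

```python
# One explicit finite-difference time step of the heat equation
# dT/dt = alpha * d2T/dr2 on a 1-D grid with fixed-temperature boundaries.
def explicit_step(T, alpha, dr, dt):
    r = alpha * dt / dr**2
    assert r <= 0.5, "explicit scheme is unstable for alpha*dt/dr^2 > 0.5"
    interior = [T[i] + r * (T[i + 1] - 2.0 * T[i] + T[i - 1])
                for i in range(1, len(T) - 1)]
    return [T[0]] + interior + [T[-1]]

# Hot fuel interior, cooled surface node (made-up values).
T0 = [100.0, 100.0, 100.0, 100.0, 0.0]
T1 = explicit_step(T0, alpha=1e-5, dr=0.01, dt=2.0)  # r = 0.2, stable
```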

  19. High-resolution computational algorithms for simulating offshore wind turbines and farms: Model development and validation

    Energy Technology Data Exchange (ETDEWEB)

    Calderer, Antoni [Univ. of Minnesota, Minneapolis, MN (United States); Yang, Xiaolei [Stony Brook Univ., NY (United States); Angelidis, Dionysios [Univ. of Minnesota, Minneapolis, MN (United States); Feist, Chris [Univ. of Minnesota, Minneapolis, MN (United States); Guala, Michele [Univ. of Minnesota, Minneapolis, MN (United States); Ruehl, Kelley [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Guo, Xin [Univ. of Minnesota, Minneapolis, MN (United States); Boomsma, Aaron [Univ. of Minnesota, Minneapolis, MN (United States); Shen, Lian [Univ. of Minnesota, Minneapolis, MN (United States); Sotiropoulos, Fotis [Stony Brook Univ., NY (United States)

    2015-10-30

    The present project involves the development of modeling and analysis design tools for assessing offshore wind turbine technologies. The computational tools developed herein are able to resolve the effects of the coupled interaction of atmospheric turbulence and ocean waves on aerodynamic performance and structural stability and reliability of offshore wind turbines and farms. Laboratory scale experiments have been carried out to derive data sets for validating the computational models.

  20. Associated computational plasticity schemes for nonassociated frictional materials

    DEFF Research Database (Denmark)

    Krabbenhoft, K.; Karim, M. R.; Lyamin, A. V.

    2012-01-01

    A new methodology for computational plasticity of nonassociated frictional materials is presented. The new approach is inspired by the micromechanical origins of friction and results in a set of governing equations similar to those of standard associated plasticity. As such, procedures previously...... developed for associated plasticity are applicable with minor modification. This is illustrated by adaptation of the standard implicit scheme. Moreover, the governing equations can be cast in terms of a variational principle, which after discretization is solved by means of a newly developed second...

  1. The Unlock Project: a Python-based framework for practical brain-computer interface communication "app" development.

    Science.gov (United States)

    Brumberg, Jonathan S; Lorenz, Sean D; Galbraith, Byron V; Guenther, Frank H

    2012-01-01

    In this paper we present a framework for reducing the development time needed for creating applications for use in non-invasive brain-computer interfaces (BCI). Our framework is primarily focused on facilitating rapid software "app" development akin to current efforts in consumer portable computing (e.g. smart phones and tablets). This is accomplished by handling intermodule communication without direct user or developer implementation, instead relying on a core subsystem for communication of standard, internal data formats. We also provide a library of hardware interfaces for common mobile EEG platforms for immediate use in BCI applications. A use-case example is described in which a user with amyotrophic lateral sclerosis participated in an electroencephalography-based BCI protocol developed using the proposed framework. We show that our software environment is capable of running in real-time with updates occurring 50-60 times per second with limited computational overhead (5 ms system lag) while providing accurate data acquisition and signal analysis.

  2. Extending and Applying Spartan to Perform Temporal Sensitivity Analyses for Predicting Changes in Influential Biological Pathways in Computational Models.

    Science.gov (United States)

    Alden, Kieran; Timmis, Jon; Andrews, Paul S; Veiga-Fernandes, Henrique; Coles, Mark

    2017-01-01

    Through integrating real time imaging, computational modelling, and statistical analysis approaches, previous work has suggested that the induction of and response to cell adhesion factors is the key initiating pathway in early lymphoid tissue development, in contrast to the previously accepted view that the process is triggered by chemokine mediated cell recruitment. These model derived hypotheses were developed using spartan, an open-source sensitivity analysis toolkit designed to establish and understand the relationship between a computational model and the biological system that model captures. Here, we extend the functionality available in spartan to permit the production of statistical analyses that contrast the behavior exhibited by a computational model at various simulated time-points, enabling a temporal analysis that could suggest whether the influence of biological mechanisms changes over time. We exemplify this extended functionality by using the computational model of lymphoid tissue development as a time-lapse tool. By generating results at twelve-hour intervals, we show how the extensions to spartan have been used to suggest that lymphoid tissue development could be biphasic, and predict the time-point when a switch in the influence of biological mechanisms might occur.
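    Contrasting model behavior at two simulated time-points rests on an effect-size statistic; the Vargha-Delaney A measure associated with spartan can be sketched as follows (a generic implementation with invented data, not spartan's own code):

```python
# Vargha-Delaney A statistic: probability that a value drawn from sample1
# exceeds one drawn from sample2 (ties count half). A = 0.5 means no effect;
# values far from 0.5 suggest the mechanism's influence differs between the
# two simulated time-points. The sample data are invented.
def vargha_delaney_a(sample1, sample2):
    greater = sum(1 for x in sample1 for y in sample2 if x > y)
    ties = sum(1 for x in sample1 for y in sample2 if x == y)
    return (greater + 0.5 * ties) / (len(sample1) * len(sample2))

hour12 = [4.1, 3.8, 4.4, 4.0]   # e.g. a simulated cell count at 12 h
hour24 = [6.9, 7.3, 6.5, 7.0]   # the same measure at 24 h
a = vargha_delaney_a(hour24, hour12)
```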

  3. Development of a 3-D flow analysis computer program for integral reactor

    International Nuclear Information System (INIS)

    Youn, H. Y.; Lee, K. H.; Kim, H. K.; Whang, Y. D.; Kim, H. C.

    2003-01-01

    A 3-D computational fluid dynamics program, TASS-3D, is being developed for flow analysis of the primary coolant system of integral reactors such as SMART, which consists of complex geometries. A pre/post processor is also being developed to reduce pre/post-processing work such as computational grid generation, set-up of the analysis conditions, and analysis of the calculated results. The TASS-3D solver employs a non-orthogonal coordinate system and the FVM based on a non-staggered grid system. The program includes various models to simulate the physical phenomena expected to occur in the integral reactor and will be coupled with the core dynamics code, core T/H code, and secondary system code modules. Currently, the application of TASS-3D is limited to the single phase of liquid, but the code will be further developed in the next stage to include the two-phase phenomena expected during normal operation and the various transients of the integral reactor

  4. Intruder dose pathway analysis for the onsite disposal of radioactive wastes: The ONSITE/MAXI1 computer program

    International Nuclear Information System (INIS)

    Kennedy, W.E. Jr.; Peloquin, R.A.; Napier, B.A.; Neuder, S.M.

    1987-02-01

    This document summarizes initial efforts to develop human-intrusion scenarios and a modified version of the MAXI computer program for potential use by the NRC in reviewing applications for onsite radioactive waste disposal. Supplement 1 of NUREG/CR-3620 (1986) summarized modifications and improvements to the ONSITE/MAXI1 software package. This document summarizes a modified version of the ONSITE/MAXI1 computer program. This modified version of the computer program operates on a personal computer and permits the user to optionally select radiation dose conversion factors published by the International Commission on Radiological Protection (ICRP) in their Publication No. 30 (ICRP 1979-1982) in place of those published by the ICRP in their Publication No. 2 (ICRP 1959) (as implemented in the previous versions of the ONSITE/MAXI1 computer program). The pathway-to-human models used in the computer program have not been changed from those described previously. Computer listings of the ONSITE/MAXI1 computer program and supporting data bases are included in the appendices of this document

  5. [Development of computer aided forming techniques in manufacturing scaffolds for bone tissue engineering].

    Science.gov (United States)

    Wei, Xuelei; Dong, Fuhui

    2011-12-01

    To review recent advances in the research and application of computer aided forming techniques for constructing bone tissue engineering scaffolds. The literature concerning computer aided forming techniques for constructing bone tissue engineering scaffolds in recent years was reviewed extensively and summarized. Several studies over the last decade have focused on computer aided forming techniques for bone scaffold construction using various scaffold materials, based on computer aided design (CAD) and bone scaffold rapid prototyping (RP). CAD approaches include medical CAD, STL, and reverse design. Reverse design can fully simulate normal bone tissue and can be very useful for CAD. RP techniques include fused deposition modeling, three-dimensional printing, selective laser sintering, three-dimensional bioplotting, and low-temperature deposition manufacturing. These techniques provide a new way to construct bone tissue engineering scaffolds with complex internal structures. With the rapid development of molding and forming techniques, computer aided forming techniques are expected to provide ideal bone tissue engineering scaffolds.

  6. Exploring the Effects of Web-Mediated Computational Thinking on Developing Students' Computing Skills in a Ubiquitous Learning Environment

    Science.gov (United States)

    Tsai, Chia-Wen; Shen, Pei-Di; Tsai, Meng-Chuan; Chen, Wen-Yu

    2017-01-01

    Much application software education in Taiwan can hardly be regarded as practical. The researchers in this study provided a flexible means of ubiquitous learning (u-learning) with a mobile app for students to access the learning material. In addition, the authors also adopted computational thinking (CT) to help students develop practical computing…

  7. Computer games: a double-edged sword?

    Science.gov (United States)

    Sun, De-Lin; Ma, Ning; Bao, Min; Chen, Xang-Chuan; Zhang, Da-Ren

    2008-10-01

    Excessive computer game playing (ECGP) has already become a serious social problem. However, limited data from experimental lab studies are available about the negative consequences of ECGP on players' cognitive characteristics. In the present study, we compared three groups of participants (current ECGP participants, previous ECGP participants, and control participants) on a Multiple Object Tracking (MOT) task. The previous ECGP participants performed significantly better than the control participants, which suggested a facilitation effect of computer games on visuospatial abilities. Moreover, the current ECGP participants performed significantly worse than the previous ECGP participants. This more important finding indicates that ECGP may be related to cognitive deficits. Implications of this study are discussed.

  8. Scientific Grand Challenges: Crosscutting Technologies for Computing at the Exascale - February 2-4, 2010, Washington, D.C.

    Energy Technology Data Exchange (ETDEWEB)

    Khaleel, Mohammad A.

    2011-02-06

    The goal of the "Scientific Grand Challenges - Crosscutting Technologies for Computing at the Exascale" workshop in February 2010, jointly sponsored by the U.S. Department of Energy’s Office of Advanced Scientific Computing Research and the National Nuclear Security Administration, was to identify the elements of a research and development agenda that will address these challenges and create a comprehensive exascale computing environment. This exascale computing environment will enable the science applications identified in the eight previously held Scientific Grand Challenges Workshop Series.

  9. Summaries of research and development activities by using JAERI computer system in FY2003. April 1, 2003 - March 31, 2004

    International Nuclear Information System (INIS)

    2005-03-01

    Center for Promotion of Computational Science and Engineering (CCSE) of Japan Atomic Energy Research Institute (JAERI) installed large computer system included super-computers in order to support research and development activities in JAERI. CCSE operates and manages the computer system and network system. This report presents usage records of the JAERI computer system and big user's research and development activities by using the computer system in FY2003 (April 1, 2003 - March 31, 2004). (author)

  10. Summaries of research and development activities by using JAEA computer system in FY2005. April 1, 2005 - March, 31, 2006

    International Nuclear Information System (INIS)

    2006-10-01

    Center for Promotion of Computational Science and Engineering (CCSE) of Japan Atomic Energy Agency (JAEA) installed large computer systems including super-computers in order to support research and development activities in JAEA. CCSE operates and manages the computer system and network system. This report presents usage records of the JAEA computer system and the big users' research and development activities by using the computer system in FY2005 (April 1, 2005 - March 31, 2006). (author)

  11. Summaries of research and development activities by using JAEA computer system in FY2006. April 1, 2006 - March 31, 2007

    International Nuclear Information System (INIS)

    2008-02-01

    Center for Promotion of Computational Science and Engineering (CCSE) of Japan Atomic Energy Agency (JAEA) installed large computer systems including super-computers in order to support research and development activities in JAEA. CCSE operates and manages the computer system and network system. This report presents usage records of the JAEA computer system and the big users' research and development activities by using the computer system in FY2006 (April 1, 2006 - March 31, 2007). (author)

  12. Integrating computation into the undergraduate curriculum: A vision and guidelines for future developments

    Science.gov (United States)

    Chonacky, Norman; Winch, David

    2008-04-01

    There is substantial evidence of a need to make computation an integral part of the undergraduate physics curriculum. This need is consistent with data from surveys in both the academy and the workplace, and has been reinforced by two years of exploratory efforts by a group of physics faculty for whom computation is a special interest. We have examined past and current efforts at reform and a variety of strategic, organizational, and institutional issues involved in any attempt to broadly transform existing practice. We propose a set of guidelines for development based on this past work and discuss our vision of computationally integrated physics.

  13. Development of computer program ENAUDIBL for computation of the sensation levels of multiple, complex, intrusive sounds in the presence of residual environmental masking noise

    Energy Technology Data Exchange (ETDEWEB)

    Liebich, R. E.; Chang, Y.-S.; Chun, K. C.

    2000-03-31

    The relative audibility of multiple sounds occurs in separate, independent channels (frequency bands) termed critical bands or equivalent rectangular (filter-response) bandwidths (ERBs) of frequency. The true nature of human hearing is a function of a complex combination of subjective factors, both auditory and nonauditory. Assessment of the probability of individual annoyance, community-complaint reaction levels, speech intelligibility, and the most cost-effective mitigation actions requires sensation-level data; these data are one of the most important auditory factors. However, sensation levels cannot be calculated by using single-number, A-weighted sound level values. This paper describes specific steps to compute sensation levels. A unique, newly developed procedure is used, which simplifies and improves the accuracy of such computations by the use of maximum sensation levels that occur, for each intrusive-sound spectrum, within each ERB. The newly developed program ENAUDIBL makes use of ERB sensation-level values generated with some computational subroutines developed for the formerly documented program SPECTRAN.
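    The per-band arithmetic underlying this procedure can be sketched with the widely used Glasberg-Moore ERB parameterization (an assumption here; the abstract does not give ENAUDIBL's exact coefficients), with the sensation level within a band being the band level minus the masked threshold:

```python
# Equivalent rectangular bandwidth (Hz) at a centre frequency, per the
# Glasberg-Moore parameterization (assumed, not necessarily ENAUDIBL's
# formula), plus the per-band sensation-level arithmetic. Levels are made up.
def erb_hz(fc_hz):
    """ERB in Hz at centre frequency fc_hz."""
    return 24.7 * (4.37 * fc_hz / 1000.0 + 1.0)

def sensation_level(band_level_db, masked_threshold_db):
    """Sensation level (dB) of an intrusive sound within one band."""
    return band_level_db - masked_threshold_db

bw = erb_hz(1000.0)               # bandwidth of the band centred at 1 kHz
sl = sensation_level(62.0, 48.5)  # intrusive band level vs. masked threshold
```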

  14. Computational Enhancements for Direct Numerical Simulations of Statistically Stationary Turbulent Premixed Flames

    KAUST Repository

    Mukhadiyev, Nurzhan

    2017-05-01

    Combustion at extreme conditions, such as a turbulent flame at high Karlovitz and Reynolds numbers, is still a vast and uncertain field for researchers. Direct numerical simulation of a turbulent flame is a superior tool to unravel detailed information that is not accessible to the most sophisticated state-of-the-art experiments. However, the computational cost of such simulations remains a challenge even for modern supercomputers, as the physical size, the level of turbulence intensity, and the chemical complexity of the problems continue to increase. As a result, there is a strong demand for computational cost reduction methods as well as for acceleration of existing methods. The main scope of this work was the development of computational and numerical tools for high-fidelity direct numerical simulations of premixed planar flames interacting with turbulence. The first part of this work was the development of the KAUST Adaptive Reacting Flow Solver (KARFS). KARFS is a high-order compressible reacting flow solver using detailed chemical kinetics mechanisms; it is capable of running on various types of heterogeneous computational architectures. In this work, it was shown that KARFS is capable of running efficiently on both CPUs and GPUs. The second part of this work was numerical tools for direct numerical simulations of planar premixed flames, such as linear turbulence forcing and dynamic inlet control. Previous DNS of premixed turbulent flames injected velocity fluctuations at an inlet. Turbulence injected at the inlet decayed significantly before reaching the flame, which created a necessity to inject higher-than-needed fluctuations. A solution to this issue was to maintain turbulence strength on the way to the flame using turbulence forcing. Therefore, linear turbulence forcing was implemented into KARFS to enhance turbulence intensity. Linear turbulence forcing developed previously by other groups was corrected with a net added momentum removal mechanism to prevent mean
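    The linear forcing idea, a source term proportional to the velocity fluctuation with the mean removed so that no net momentum is added, can be sketched in simplified form (an assumed illustration, not the KARFS implementation):

```python
# Linear turbulence forcing on a 1-D velocity sample: f_i = A * (u_i - <u>).
# Subtracting the mean removes net added momentum. All values are invented.
def linear_forcing(u, A):
    mean_u = sum(u) / len(u)
    return [A * (ui - mean_u) for ui in u]

f = linear_forcing([1.0, 3.0, 2.0], A=0.5)  # forcing coefficient A is made up
```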

  15. Multilink manipulator computer control: experience in development and commissioning

    International Nuclear Information System (INIS)

    Holt, J.E.

    1988-11-01

    This report describes development which has been carried out on the multilink manipulator computer control system. The system allows the manipulator to be driven using only two joysticks. The leading link is controlled and the other links follow its path into the reactor, thus avoiding any potential obstacles. The system has been fully commissioned and used with the Sizewell 'A' reactor 2 Multilink T.V. manipulator. Experience of the use of the system is presented, together with recommendations for future improvements. (author)

  16. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    International Nuclear Information System (INIS)

    Manimaran, M.; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-01-01

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of the software requirements and coding phases are elaborated. • Compliance with safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during the software design, analysis and testing phases is explained. - Abstract: Prototype Fast Breeder Reactor (PFBR) is a sodium-cooled reactor in an advanced stage of construction at Kalpakkam, India. Versa Module Europa bus based Real Time Computer (RTC) systems are deployed for Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well defined software development methodology is adopted for RTC systems, starting from the requirement capture phase through the final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements documents and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development in the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance, and carrying out static and dynamic analysis. Various software metrics such as cyclomatic complexity, nesting depth, and comment-to-code ratio are checked. Test cases are generated using equivalence class partitioning, boundary value analysis, and cause-and-effect graphing techniques. System integration testing is carried out wherein the functional and performance requirements of the system are monitored.
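    The cyclomatic complexity metric checked above is McCabe's M = E - N + 2P over a control-flow graph with E edges, N nodes, and P connected components; a generic sketch, not the project's CASE tooling:

```python
# McCabe cyclomatic complexity from control-flow-graph counts.
def cyclomatic_complexity(edges, nodes, components=1):
    return edges - nodes + 2 * components

# A function whose control-flow graph has one if/else and one while loop:
# 8 edges and 7 nodes give complexity 3 (two decision points + 1).
m = cyclomatic_complexity(edges=8, nodes=7)
```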

  17. Software development methodology for computer based I&C systems of prototype fast breeder reactor

    Energy Technology Data Exchange (ETDEWEB)

    Manimaran, M., E-mail: maran@igcar.gov.in; Shanmugam, A.; Parimalam, P.; Murali, N.; Satya Murty, S.A.V.

    2015-10-15

    Highlights: • Software development methodology adopted for computer based I&C systems of PFBR is detailed. • Constraints imposed as part of the software requirements and coding phases are elaborated. • Compliance with safety and security requirements is described. • Usage of CASE (Computer Aided Software Engineering) tools during the software design, analysis and testing phases is explained. - Abstract: The Prototype Fast Breeder Reactor (PFBR) is a sodium-cooled reactor in an advanced stage of construction at Kalpakkam, India. Versa Module Europa bus based Real Time Computer (RTC) systems are deployed for the Instrumentation & Control of PFBR. RTC systems have to perform safety functions within the stipulated time, which calls for highly dependable software. Hence, a well-defined software development methodology is adopted for RTC systems, from the requirement capture phase through final validation of the software product. The V-model is used for software development. The IEC 60880 standard and the AERB SG D-25 guideline are followed at each phase of software development. Requirements and design documents are prepared as per IEEE standards. Defensive programming strategies are followed for software development in the C language. Verification and validation (V&V) of documents and software are carried out at each phase by an independent V&V committee. Computer aided software engineering tools are used for software modelling, checking for MISRA C compliance, and carrying out static and dynamic analysis. Various software metrics, such as cyclomatic complexity, nesting depth and comment-to-code ratio, are checked. Test cases are generated using equivalence class partitioning, boundary value analysis, and cause-and-effect graphing techniques. System integration testing is carried out wherein the functional and performance requirements of the system are monitored.

  18. Summaries of research and development activities by using JAEA computer system in FY2007. April 1, 2007 - March 31, 2008

    International Nuclear Information System (INIS)

    2008-11-01

    Center for Computational Science and e-Systems (CCSE) of the Japan Atomic Energy Agency (JAEA) installed large computer systems, including supercomputers, to support research and development activities in JAEA. This report presents usage records of the JAEA computer system and the major users' research and development activities using the computer system in FY2007 (April 1, 2007 - March 31, 2008). (author)

  19. Summaries of research and development activities by using JAEA computer system in FY2009. April 1, 2009 - March 31, 2010

    International Nuclear Information System (INIS)

    2010-11-01

    Center for Computational Science and e-Systems (CCSE) of the Japan Atomic Energy Agency (JAEA) installed large computer systems, including supercomputers, to support research and development activities in JAEA. This report presents usage records of the JAEA computer system and the major users' research and development activities using the computer system in FY2009 (April 1, 2009 - March 31, 2010). (author)

  20. Summaries of research and development activities by using JAERI computer system in FY2004 (April 1, 2004 - March 31, 2005)

    International Nuclear Information System (INIS)

    2005-08-01

    Center for Promotion of Computational Science and Engineering (CCSE) of the Japan Atomic Energy Research Institute (JAERI) installed large computer systems, including supercomputers, to support research and development activities in JAERI. CCSE operates and manages the computer system and network system. This report presents usage records of the JAERI computer system and the major users' research and development activities using the computer system in FY2004 (April 1, 2004 - March 31, 2005). (author)

  1. The development and application of a coincidence measurement apparatus with micro-computer system

    International Nuclear Information System (INIS)

    Du Hongshan; Zhou Youpu; Gao Junlin; Qin Deming; Cao Yunzheng; Zhao Shiping

    1987-01-01

    A coincidence measurement apparatus with a microcomputer system has been developed. Automatic data acquisition and processing are achieved. Results of its application to radioactivity measurement are satisfactory.

  2. Development of superconductor electronics technology for high-end computing

    Energy Technology Data Exchange (ETDEWEB)

    Silver, A [Jet Propulsion Laboratory, 4800 Oak Grove Drive, Pasadena, CA 91109-8099 (United States); Kleinsasser, A [Jet Propulsion Laboratory, 4800 Oak Grove Drive, Pasadena, CA 91109-8099 (United States); Kerber, G [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States); Herr, Q [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States); Dorojevets, M [Department of Electrical and Computer Engineering, SUNY-Stony Brook, NY 11794-2350 (United States); Bunyk, P [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States); Abelson, L [Northrop Grumman Space Technology, One Space Park, Redondo Beach, CA 90278 (United States)

    2003-12-01

    This paper describes our programme to develop and demonstrate ultra-high performance single flux quantum (SFQ) VLSI technology that will enable superconducting digital processors for petaFLOPS-scale computing. In the hybrid technology, multi-threaded architecture, the computational engine to power a petaFLOPS machine at affordable power will consist of 4096 SFQ multi-chip processors, with 50 to 100 GHz clock frequency and associated cryogenic RAM. We present the superconducting technology requirements, progress to date and our plan to meet these requirements. We improved SFQ Nb VLSI by two generations, to a 8 kA cm{sup -2}, 1.25 {mu}m junction process, incorporated new CAD tools into our methodology, demonstrated methods for recycling the bias current and data communication at speeds up to 60 Gb s{sup -1}, both on and between chips through passive transmission lines. FLUX-1 is the most ambitious project implemented in SFQ technology to date, a prototype general-purpose 8 bit microprocessor chip. We are testing the FLUX-1 chip (5K gates, 20 GHz clock) and designing a 32 bit floating-point SFQ multiplier with vector-register memory. We report correct operation of the complete stripline-connected gate library with large bias margins, as well as several larger functional units used in FLUX-1. The next stage will be an SFQ multi-processor machine. Important challenges include further reducing chip supply current and on-chip power dissipation, developing at least 64 kbit, sub-nanosecond cryogenic RAM chips, developing thermally and electrically efficient high data rate cryogenic-to-ambient input/output technology and improving Nb VLSI to increase gate density.

  3. Development of superconductor electronics technology for high-end computing

    International Nuclear Information System (INIS)

    Silver, A; Kleinsasser, A; Kerber, G; Herr, Q; Dorojevets, M; Bunyk, P; Abelson, L

    2003-01-01

    This paper describes our programme to develop and demonstrate ultra-high performance single flux quantum (SFQ) VLSI technology that will enable superconducting digital processors for petaFLOPS-scale computing. In the hybrid technology, multi-threaded architecture, the computational engine to power a petaFLOPS machine at affordable power will consist of 4096 SFQ multi-chip processors, with 50 to 100 GHz clock frequency and associated cryogenic RAM. We present the superconducting technology requirements, progress to date and our plan to meet these requirements. We improved SFQ Nb VLSI by two generations, to a 8 kA cm -2 , 1.25 μm junction process, incorporated new CAD tools into our methodology, demonstrated methods for recycling the bias current and data communication at speeds up to 60 Gb s -1 , both on and between chips through passive transmission lines. FLUX-1 is the most ambitious project implemented in SFQ technology to date, a prototype general-purpose 8 bit microprocessor chip. We are testing the FLUX-1 chip (5K gates, 20 GHz clock) and designing a 32 bit floating-point SFQ multiplier with vector-register memory. We report correct operation of the complete stripline-connected gate library with large bias margins, as well as several larger functional units used in FLUX-1. The next stage will be an SFQ multi-processor machine. Important challenges include further reducing chip supply current and on-chip power dissipation, developing at least 64 kbit, sub-nanosecond cryogenic RAM chips, developing thermally and electrically efficient high data rate cryogenic-to-ambient input/output technology and improving Nb VLSI to increase gate density

  4. Integrated Computational Materials Engineering (ICME) for Third Generation Advanced High-Strength Steel Development

    Energy Technology Data Exchange (ETDEWEB)

    Savic, Vesna; Hector, Louis G.; Ezzat, Hesham; Sachdev, Anil K.; Quinn, James; Krupitzer, Ronald; Sun, Xin

    2015-06-01

    This paper presents an overview of a four-year project focused on development of an integrated computational materials engineering (ICME) toolset for third generation advanced high-strength steels (3GAHSS). Following a brief look at ICME as an emerging discipline within the Materials Genome Initiative, technical tasks in the ICME project will be discussed. Specific aims of the individual tasks are multi-scale, microstructure-based material model development using state-of-the-art computational and experimental techniques, forming, toolset assembly, design optimization, integration and technical cost modeling. The integrated approach is initially illustrated using a 980 grade transformation induced plasticity (TRIP) steel, subject to a two-step quenching and partitioning (Q&P) heat treatment, as an example.

  5. Development of the computer-aided process planning (CAPP) system for polymer injection mold manufacturing

    Directory of Open Access Journals (Sweden)

    J. Tepić

    2011-10-01

    Full Text Available The beginning of production and sale of polymer products largely depends on mold manufacturing. The costs of mold manufacturing have a significant share in the final price of a product. The best way to improve and rationalize the production process for polymer injection molds is to automate mold design and manufacturing process planning. This paper reviews the development of a dedicated process planning system for manufacturing injection molds, which integrates computer-aided design (CAD), computer-aided process planning (CAPP) and computer-aided manufacturing (CAM) technologies.

  6. Report on nuclear industry quality assurance procedures for safety analysis computer code development and use

    International Nuclear Information System (INIS)

    Sheron, B.W.; Rosztoczy, Z.R.

    1980-08-01

    As a result of a request from Commissioner V. Gilinsky to investigate in detail the causes of an error discovered in a vendor Emergency Core Cooling System (ECCS) computer code in March, 1978, the staff undertook an extensive investigation of the vendor quality assurance practices applied to safety analysis computer code development and use. This investigation included inspections of code development and use practices of the four major Light Water Reactor Nuclear Steam Supply System vendors and a major reload fuel supplier. The conclusion reached by the staff as a result of the investigation is that vendor practices for code development and use are basically sound. A number of areas were identified, however, where improvements to existing vendor procedures should be made. In addition, the investigation also addressed the quality assurance (QA) review and inspection process for computer codes and identified areas for improvement

  7. Further computer appreciation

    CERN Document Server

    Fry, T F

    2014-01-01

    Further Computer Appreciation is a comprehensive cover of the principles and aspects in computer appreciation. The book starts by describing the development of computers from the first to the third computer generations, to the development of processors and storage systems, up to the present position of computers and future trends. The text tackles the basic elements, concepts and functions of digital computers, computer arithmetic, input media and devices, and computer output. The basic central processor functions, data storage and the organization of data by classification of computer files,

  8. Computer Forensic Function Testing: Media Preparation, Write Protection And Verification

    Directory of Open Access Journals (Sweden)

    Yinghua (David) Guo

    2010-06-01

    Full Text Available The growth in the computer forensic field has created a demand for new software (or increased functionality in existing software) and a means to verify that this software is truly forensic, i.e. capable of meeting the requirements of the trier of fact. In this work, we review our previous work, a function-oriented testing framework for validation and verification of computer forensic tools. This framework consists of three parts: function mapping, requirements specification and reference set development. Through function mapping, we give a scientific and systematic description of the fundamentals of the computer forensic discipline, i.e. what functions are needed in the computer forensic investigation process. We focus this paper on the functions of media preparation, write protection and verification. Specifically, we complete the function mapping of these functions and specify their requirements. Based on this work, future work can be conducted to develop corresponding reference sets to test any tools that possess these functions.
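    In practice, the write-protection verification discussed in this record is usually grounded in cryptographic hashing: a media image acquired behind a write blocker should hash identically before and after examination. A minimal generic sketch (not part of the authors' framework):

    ```python
    import hashlib

    def sha256_of(data: bytes) -> str:
        """Hex digest of a byte string (a forensic media image in practice)."""
        return hashlib.sha256(data).hexdigest()

    def verify_unchanged(before: bytes, after: bytes) -> bool:
        """Write protection is verified if the media image hashes identically
        before and after the examination."""
        return sha256_of(before) == sha256_of(after)

    image = b"\x00" * 512            # stand-in for a 512-byte sector image
    tampered = image[:-1] + b"\x01"  # the same image with one flipped byte
    ```

    A reference set for this function would pair known images with their expected digests, so any tool claiming write-blocking capability can be tested against them.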

  9. A Faster Algorithm for Computing Motorcycle Graphs

    KAUST Repository

    Vigneron, Antoine E.; Yan, Lie

    2014-01-01

    We present a new algorithm for computing motorcycle graphs that runs in (Formula presented.) time for any (Formula presented.), improving on all previously known algorithms. The main application of this result is to computing the straight skeleton of a polygon. It allows us to compute the straight skeleton of a non-degenerate polygon with (Formula presented.) holes in (Formula presented.) expected time. If all input coordinates are (Formula presented.)-bit rational numbers, we can compute the straight skeleton of a (possibly degenerate) polygon with (Formula presented.) holes in (Formula presented.) expected time. In particular, it means that we can compute the straight skeleton of a simple polygon in (Formula presented.) expected time if all input coordinates are (Formula presented.)-bit rationals, while all previously known algorithms have worst-case running time (Formula presented.). © 2014 Springer Science+Business Media New York.

  10. A Faster Algorithm for Computing Motorcycle Graphs

    KAUST Repository

    Vigneron, Antoine E.

    2014-08-29

    We present a new algorithm for computing motorcycle graphs that runs in (Formula presented.) time for any (Formula presented.), improving on all previously known algorithms. The main application of this result is to computing the straight skeleton of a polygon. It allows us to compute the straight skeleton of a non-degenerate polygon with (Formula presented.) holes in (Formula presented.) expected time. If all input coordinates are (Formula presented.)-bit rational numbers, we can compute the straight skeleton of a (possibly degenerate) polygon with (Formula presented.) holes in (Formula presented.) expected time. In particular, it means that we can compute the straight skeleton of a simple polygon in (Formula presented.) expected time if all input coordinates are (Formula presented.)-bit rationals, while all previously known algorithms have worst-case running time (Formula presented.). © 2014 Springer Science+Business Media New York.
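    For readers unfamiliar with the model: each "motorcycle" starts at a point, moves along a ray, leaves a track behind, and crashes when it reaches a track laid down earlier by another. The naive quadratic baseline that these algorithms improve on can be sketched as below; this simplified version treats every track as growing forever, so it is only guaranteed to locate the first crash:

    ```python
    def ray_intersection_times(p1, v1, p2, v2):
        """Solve p1 + t1*v1 == p2 + t2*v2 for (t1, t2); None if parallel."""
        det = v2[0] * v1[1] - v1[0] * v2[1]
        if det == 0:
            return None
        dx, dy = p2[0] - p1[0], p2[1] - p1[1]
        t1 = (v2[0] * dy - v2[1] * dx) / det
        t2 = (v1[0] * dy - v1[1] * dx) / det
        return t1, t2

    def first_crash(riders):
        """Naive O(n^2) scan over rider pairs; riders = [(position, velocity)].
        Returns (time, index) of the earliest crash, under the simplifying
        assumption that no track has been cut short yet (true for the first
        crash, since no one has crashed before it)."""
        best = None
        for i, (pi, vi) in enumerate(riders):
            for j, (pj, vj) in enumerate(riders):
                if i == j:
                    continue
                ts = ray_intersection_times(pi, vi, pj, vj)
                if ts is None:
                    continue
                ti, tj = ts
                if 0 < ti < tj:  # rider i laid its track there first: j crashes
                    if best is None or tj < best[0]:
                        best = (tj, j)
        return best
    ```

    Repeating this scan after each crash (and clipping the crashed rider's track) yields the full graph in roughly cubic time, which is what the faster algorithms in these papers avoid.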

  11. Parallel computers and three-dimensional computational electromagnetics

    International Nuclear Information System (INIS)

    Madsen, N.K.

    1994-01-01

    The authors have continued to enhance their ability to use new massively parallel processing computers to solve time-domain electromagnetic problems. New vectorization techniques have improved the performance of their code DSI3D by factors of 5 to 15, depending on the computer used. New radiation boundary conditions and far-field transformations now allow the computation of radar cross-section values for complex objects. A new parallel-data extraction code has been developed that allows the extraction of data subsets from large problems, which have been run on parallel computers, for subsequent post-processing on workstations with enhanced graphics capabilities. A new charged-particle-pushing version of DSI3D is under development. Finally, DSI3D has become a focal point for several new Cooperative Research and Development Agreement activities with industrial companies such as Lockheed Advanced Development Company, Varian, Hughes Electron Dynamics Division, General Atomic, and Cray
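    Time-domain electromagnetic codes like DSI3D march Maxwell's equations forward with explicit leapfrog updates of staggered electric and magnetic fields. A bare-bones 1-D Yee-scheme sketch, purely illustrative (normalized units with Courant number 1, a line grid rather than DSI3D's 3-D unstructured grids):

    ```python
    import math

    def fdtd_1d(steps, n=200):
        """Minimal 1-D Yee leapfrog: E and H staggered in space and time.
        Normalized units; the time step sits exactly at the Courant limit."""
        ez = [0.0] * n
        hy = [0.0] * n
        for t in range(steps):
            for i in range(1, n):
                ez[i] += hy[i] - hy[i - 1]          # E update from curl of H
            ez[50] += math.exp(-((t - 30.0) ** 2) / 100.0)  # soft Gaussian source
            for i in range(n - 1):
                hy[i] += ez[i + 1] - ez[i]          # H update from curl of E
        return ez

    fields = fdtd_1d(100)
    ```

    The parallelization the abstract describes comes from the locality of these updates: each cell touches only its neighbors, so the grid can be partitioned across processors with a thin halo exchange per step.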

  12. Development of a locally mass flux conservative computer code for calculating 3-D viscous flow in turbomachines

    Science.gov (United States)

    Walitt, L.

    1982-01-01

    The VANS successive approximation numerical method was extended to the computation of three-dimensional, viscous, transonic flows in turbomachines. A cross-sectional computer code, which conserves mass flux at each point of the cross-sectional surface of computation, was developed. In the VANS numerical method, the cross-sectional computation follows a blade-to-blade calculation. Numerical calculations were made for an axial annular turbine cascade and a transonic centrifugal impeller with splitter vanes. The subsonic turbine cascade computation was generated in a blade-to-blade surface to evaluate the accuracy of the blade-to-blade mode of marching. Calculated blade pressures at the hub, mid, and tip radii of the cascade agreed with corresponding measurements. The transonic impeller computation was conducted to test the newly developed locally mass-flux-conservative cross-sectional computer code. Both blade-to-blade and cross-sectional modes of calculation were implemented for this problem. A triple-point shock structure was computed in the inducer region of the impeller. In addition, time-averaged shroud static pressures generally agreed with measured shroud pressures. It is concluded that the blade-to-blade computation produces a useful engineering flow field in regions of subsonic relative flow, and that cross-sectional computation, with a locally mass-flux-conservative continuity equation, is required to compute the shock waves in regions of supersonic relative flow.

  13. COMPUTING

    CERN Multimedia

    M. Kasemann

    Overview During the past three months activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created for supporting the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting the user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009/2011 was developed at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity for discussing the impact of, and addressing issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  14. High-performance computing in accelerating structure design and analysis

    International Nuclear Information System (INIS)

    Li Zenghai; Folwell, Nathan; Ge Lixin; Guetz, Adam; Ivanov, Valentin; Kowalski, Marc; Lee, Lie-Quan; Ng, Cho-Kuen; Schussman, Greg; Stingelin, Lukas; Uplenchwar, Ravindra; Wolf, Michael; Xiao, Liling; Ko, Kwok

    2006-01-01

    Future high-energy accelerators such as the Next Linear Collider (NLC) will accelerate multi-bunch beams of high current and low emittance to obtain high luminosity, which put stringent requirements on the accelerating structures for efficiency and beam stability. While numerical modeling has been quite standard in accelerator R and D, designing the NLC accelerating structure required a new simulation capability because of the geometric complexity and level of accuracy involved. Under the US DOE Advanced Computing initiatives (first the Grand Challenge and now SciDAC), SLAC has developed a suite of electromagnetic codes based on unstructured grids and utilizing high-performance computing to provide an advanced tool for modeling structures at accuracies and scales previously not possible. This paper will discuss the code development and computational science research (e.g. domain decomposition, scalable eigensolvers, adaptive mesh refinement) that have enabled the large-scale simulations needed for meeting the computational challenges posed by the NLC as well as projects such as the PEP-II and RIA. Numerical results will be presented to show how high-performance computing has made a qualitative improvement in accelerator structure modeling for these accelerators, either at the component level (single cell optimization), or on the scale of an entire structure (beam heating and long-range wakefields)
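    Of the computational-science ingredients listed (domain decomposition, scalable eigensolvers, adaptive mesh refinement), the simplest eigensolver idea is power iteration: repeatedly apply the matrix and rescale, converging to the dominant eigenpair. A toy sketch on a small symmetric matrix standing in for an assembled cavity system matrix (hypothetical data; production eigensolvers for cavity modes use far more sophisticated methods such as shift-invert Lanczos):

    ```python
    def power_iteration(a, iters=200):
        """Dominant eigenvalue/eigenvector of a small symmetric matrix by
        repeatedly applying the matrix and rescaling (infinity-norm)."""
        n = len(a)
        x = [1.0] + [0.0] * (n - 1)   # arbitrary nonzero starting vector
        lam = 0.0
        for _ in range(iters):
            y = [sum(a[i][j] * x[j] for j in range(n)) for i in range(n)]
            lam = max(abs(v) for v in y)   # converges to |dominant eigenvalue|
            x = [v / lam for v in y]
        return lam, x

    # Hypothetical 2x2 matrix: eigenvalues 5 and 3, dominant eigenvector (1, 1).
    lam, vec = power_iteration([[4.0, 1.0], [1.0, 4.0]])
    ```

    Convergence is geometric in the ratio of the two largest eigenvalue magnitudes (here 3/5 per iteration), which is why realistic solvers accelerate it with shifts and Krylov subspaces.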

  15. Development of a new generation solid rocket motor ignition computer code

    Science.gov (United States)

    Foster, Winfred A., Jr.; Jenkins, Rhonald M.; Ciucci, Alessandro; Johnson, Shelby D.

    1994-01-01

    This report presents the results of experimental and numerical investigations of the flow field in the head-end star grain slots of the Space Shuttle Solid Rocket Motor. This work provided the basis for the development of an improved solid rocket motor ignition transient code which is also described in this report. The correlation between the experimental and numerical results is excellent and provides a firm basis for the development of a fully three-dimensional solid rocket motor ignition transient computer code.

  16. Specialized computer architectures for computational aerodynamics

    Science.gov (United States)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relative high cost of performing these computations on commercially available general purpose computers, a cost high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  17. The quantum computer game: citizen science

    Science.gov (United States)

    Damgaard, Sidse; Mølmer, Klaus; Sherson, Jacob

    2013-05-01

    Progress in the field of quantum computation is hampered by daunting technical challenges. Here we present an alternative approach to solving these by enlisting the aid of computer players around the world. We have previously examined a quantum computation architecture involving ultracold atoms in optical lattices and strongly focused tweezers of light. In The Quantum Computer Game (see http://www.scienceathome.org/), we have encapsulated the time-dependent Schrödinger equation for the problem in a graphical user interface allowing for easy user input. Players can then search the parameter space with real-time graphical feedback in a game context with a global high score that rewards short gate times and robustness to experimental errors. The game, which is still in a demo version, has so far been tried by several hundred players. Extensions of the approach to other models such as Gross-Pitaevskii and Bose-Hubbard are currently under development. The game has also been incorporated into science education at high-school and university level as an alternative method for teaching quantum mechanics. Initial quantitative evaluation results are very positive. AU Ideas Center for Community Driven Research, CODER.

  18. Postpartum IGF-I and IGFBP-2 levels are prospectively associated with the development of type 2 diabetes in women with previous gestational diabetes mellitus.

    Science.gov (United States)

    Lappas, M; Jinks, D; Shub, A; Willcox, J C; Georgiou, H M; Permezel, M

    2016-12-01

    Women with previous gestational diabetes mellitus (GDM) are at greater risk of developing type 2 diabetes. In the general population, the insulin-like growth factor (IGF) system has been implicated in the development of type 2 diabetes. The aim of this study was to determine if circulating IGF-I, IGF-II, IGFBP-1 and IGFBP-2 levels 12 weeks following a GDM pregnancy are associated with an increased risk of developing type 2 diabetes. IGF-I, IGF-II, IGFBP-1 and IGFBP-2 levels were measured in 98 normal glucose tolerant women, 12 weeks following an index GDM pregnancy, using enzyme immunoassay. Women were assessed for up to 10 years for the development of overt type 2 diabetes. Among the 98 women with previous GDM, 21 (21%) developed diabetes during the median follow-up period of 8.5 years. After adjusting for age and BMI, IGF-I and IGFBP-2 were significantly associated with the development of type 2 diabetes. In a clinical model of prediction of type 2 diabetes that included age, BMI, pregnancy fasting glucose and postnatal fasting glucose, the addition of IGF-I and IGFBP-2 resulted in an improvement in the net reclassification index of 17.8%. High postpartum IGF-I and low postpartum IGFBP-2 levels are a significant risk factor for the development of type 2 diabetes in women with a previous history of GDM. This is the first report that identifies IGF-I and IGFBP-2 as potential biomarkers for the prediction of type 2 diabetes in women with a history of GDM. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
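    The categorical net reclassification index (NRI) quoted in this abstract has a simple arithmetic definition: the net fraction of events reclassified to a higher risk category plus the net fraction of non-events reclassified to a lower one. A sketch with hypothetical reclassification data (not the study's data):

    ```python
    def net_reclassification_index(events, nonevents):
        """Categorical NRI. Each list holds (old_category, new_category) pairs,
        with categories as ordered risk levels (higher = riskier)."""
        def frac_moved(pairs, moved_up):
            if moved_up:
                hits = sum(1 for old, new in pairs if new > old)
            else:
                hits = sum(1 for old, new in pairs if new < old)
            return hits / len(pairs)

        nri_events = frac_moved(events, True) - frac_moved(events, False)
        nri_nonevents = frac_moved(nonevents, False) - frac_moved(nonevents, True)
        return nri_events + nri_nonevents

    # Hypothetical data: category 0 = low risk, 1 = high risk.
    events = [(0, 1), (0, 1), (1, 1), (1, 0)]      # women who developed diabetes
    nonevents = [(1, 0), (1, 0), (0, 0), (0, 1)]   # women who did not
    nri = net_reclassification_index(events, nonevents)
    ```

    A positive NRI means the new markers move cases toward higher predicted risk and non-cases toward lower predicted risk more often than the reverse, which is the sense in which IGF-I and IGFBP-2 improved the clinical model here.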

  19. An Instructional Design Model for Developing a Computer Curriculum To Increase Employee Productivity in a Pharmaceutical Company.

    Science.gov (United States)

    Stumpf, Mark R.

    This report presents an instructional design model that was developed for use by the End-Users Computing department of a large pharmaceutical company in developing effective--but not lengthy--microcomputer training seminars to train office workers and executives in the proper use of computers and thus increase their productivity. The 14 steps of…

  20. Development of a research prototype computer 'Wearables' that one can wear on his or her body; Minitsukeru computer 'Wearables' kenkyuyo shisakuki wo kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-02-01

    Development has been made on a prototype of a wearable computer, 'Wearables', that makes the present notebook PC still smaller in size, can be worn on the human body for use at any time and from anywhere, and aims at realizing a social infrastructure. Using the company's portable PC, Libretto, as the base, the keyboard and the liquid crystal display panel were removed. To replace these functions, a voice input microphone and various types of head-mounted displays (glasses type), worn on the head to view images, are connected. Provided as the means for information communication between the prototype computer and outside environments are an infrared-ray interface and a data communication function using wireless (radio wave) communications. The wireless desk area network (DAN) technology, which can dynamically structure a network between multiple computers, has realized smooth communications with external environments. The voice recognition technology, which works efficiently against noise, has realized keyboard-free operation that places no neural stress on users. The 'wearable computer' aims not only at users simply wearing and utilizing it, but also at providing a new perception ability for what could not have been seen or heard directly to date, that is, realizing digital sensation. With the computer, a society will be structured in which people can live comfortably and safely, maintaining conversations between users and computers, and interactions between the surrounding environment and the social infrastructures, with protection of individual privacy and information security taken into consideration. The company is working with the Massachusetts Institute of Technology (MIT) on research and development of the 'wearable computer', as to how it can be utilized and the basic technologies that will be required in the future. (translated by NEDO)

  1. Development of the operator training system using computer graphics. Pt. 1. Defining the system configuration and developing basic techniques

    International Nuclear Information System (INIS)

    Takana, Kenchi; Sasou, Kunihide; Sano, Toshiaki; Suzuki, Koichi; Noji, Kunio

    2001-01-01

    Efficient and concurrent operator training seems to be crucial in the near future because of an increase in operators to be trained due to generation alternation. The previously developed Man-Machine Simulator (MMS) has several merits: (1) operators' cognitive and behavioral activities within a team in an emergency can be simulated based on a concurrent mental model; (2) simulated scenarios can be expanded to multiple-malfunction events, for which procedures cannot all be stipulated in advance; (3) standard behavior in coping with anomalies, including communication and operations, can be presented. This paper describes the development of an operator training system applying this MMS. Three-dimensional computer graphics (3D-CG) was adopted to improve training effects and attract operators' interest by visually presenting realistic operating team behavior in the main control room. Toward the completion of the operator training system, the system configuration was defined and several basic techniques were developed, as follows: (1) Envisioning the utilization of the operator training system, the functions to be equipped and the system configuration for realizing those functions were determined, and three scenarios were chosen to demonstrate the merits of the MMS and to raise training effects. (2) A knowledge base was completed to execute simulations, and the operator team model was connected to the plant simulator, a second-generation simulator of the BTC-4, to obtain simulation results (time-sequential log data of plant dynamics and operating team behavior). (3) Operators' actions seen in VCR tapes of real training were classified into eighteen fundamental categories, and those fundamental actions were modeled in 3D-CG using the People Shop software. The 3D-CG of the main control panel was prepared using Multi Gen software. (author)

  2. 24 CFR 1710.552 - Previously accepted state filings.

    Science.gov (United States)

    2010-04-01

    ... of Substantially Equivalent State Law § 1710.552 Previously accepted state filings. (a) Materials... and contracts or agreements contain notice of purchaser's revocation rights. In addition see § 1715.15..., unless the developer is obligated to do so in the contract. (b) If any such filing becomes inactive or...

  3. International Conference of Intelligence Computation and Evolutionary Computation ICEC 2012

    CERN Document Server

    Intelligence Computation and Evolutionary Computation

    2013-01-01

    The 2012 International Conference of Intelligence Computation and Evolutionary Computation (ICEC 2012) was held on July 7, 2012 in Wuhan, China. The conference was sponsored by the Information Technology & Industrial Engineering Research Center. ICEC 2012 is a forum for the presentation of new research results in intelligent computation and evolutionary computation. Cross-fertilization of intelligent computation, evolutionary computation, evolvable hardware and newly emerging technologies is strongly encouraged. The forum aims to bring together researchers, developers, and users from around the world, in both industry and academia, to share state-of-the-art results, to explore new areas of research and development, and to discuss emerging issues facing intelligent computation and evolutionary computation.

  4. Development of a computational model for astronaut reorientation.

    Science.gov (United States)

    Stirling, Leia; Willcox, Karen; Newman, Dava

    2010-08-26

    The ability to model astronaut reorientations computationally provides a simple way to develop and study human motion control strategies. Since the cost of experimenting in microgravity is high, and underwater training can lead to motions inappropriate for microgravity, these techniques allow for motions to be developed and well-understood prior to any microgravity exposure. By including a model of the current space suit, we have the ability to study both intravehicular and extravehicular activities. We present several techniques for rotating about the axes of the body and show that motions performed by the legs create a greater net rotation than those performed by the arms. Adding a space suit to the motions was seen to increase the resistance torque and limit the available range of motion. While rotations about the body axes can be performed in the current space suit, the resulting motions generated a reduced rotation when compared to the unsuited configuration.

  5. Tangential scanning of hardwood logs: developing an industrial computer tomography scanner

    Science.gov (United States)

    Nand K. Gupta; Daniel L. Schmoldt; Bruce Isaacson

    1999-01-01

    It is generally believed that noninvasive scanning of hardwood logs such as computer tomography (CT) scanning prior to initial breakdown will greatly improve the processing of logs into lumber. This belief, however, has not translated into rapid development and widespread installation of industrial CT scanners for log processing. The roadblock has been more operational...

  6. Development of a graphical interface computer code for reactor fuel reloading optimization

    International Nuclear Information System (INIS)

    Do Quang Binh; Nguyen Phuoc Lan; Bui Xuan Huy

    2007-01-01

    This report presents the results of a project performed in 2007. The aim of the project was to develop a graphical interface computer code that allows refueling engineers to design fuel reloading patterns for a research reactor using a simulated graphical model of the reactor core. In addition, the code can perform refueling optimization calculations based on genetic algorithms as well as simulated annealing. The computer code was verified on a sample problem that relies on operational and experimental data of the Dalat research reactor. This code can play a significant role in in-core fuel management practice at nuclear research reactor centers and in training. (author)
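    The abstract names two optimization techniques, genetic algorithms and simulated annealing. The following sketch illustrates only the simulated-annealing half on a toy stand-in for a loading pattern; the real neutronics objective, pattern encoding, and cooling schedule are not given in the abstract, so everything below the `anneal` function is a hypothetical illustration.

```python
import math
import random

def anneal(initial, energy, neighbor, t0=1.0, cooling=0.95, steps=2000, seed=0):
    """Generic simulated annealing: minimize `energy` over candidate patterns."""
    rng = random.Random(seed)
    state, e = initial, energy(initial)
    best, best_e = state, e
    t = t0
    for _ in range(steps):
        cand = neighbor(state, rng)
        ce = energy(cand)
        # Accept downhill moves always; uphill moves with Boltzmann probability.
        if ce < e or rng.random() < math.exp((e - ce) / max(t, 1e-12)):
            state, e = cand, ce
            if e < best_e:
                best, best_e = state, e
        t *= cooling
    return best, best_e

# Toy stand-in for a loading pattern: a permutation of fuel-assembly burnups.
# The (invented) objective rewards placing high-burnup assemblies centrally.
burnups = [5, 1, 4, 2, 8, 3, 7, 6]

def energy(pattern):
    centre = (len(pattern) - 1) / 2
    return sum(b * abs(i - centre) for i, b in enumerate(pattern))

def neighbor(pattern, rng):
    """Swap two assembly positions, the usual reloading move."""
    i, j = rng.sample(range(len(pattern)), 2)
    p = list(pattern)
    p[i], p[j] = p[j], p[i]
    return p

best, best_e = anneal(burnups, energy, neighbor)
print(best_e)
```

A genetic-algorithm variant would replace the single-state loop with a population and a crossover operator over permutations, but the acceptance-of-worse-moves idea above is what distinguishes annealing from plain hill climbing.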

  7. Dynamic time-dependent analysis and static three-dimensional imaging procedures for computer-assisted CNS studies

    International Nuclear Information System (INIS)

    Budinger, T.F.; DeLand, F.H.; Duggan, H.E.; Bouz, J.J.; Hoop, B. Jr.; McLaughlin, W.T.; Weber, P.M.

    1975-01-01

    Two-dimensional computer image-processing techniques have not proved to be of importance in diagnostic nuclear medicine primarily because the radionuclide distribution represents a three-dimensional problem. More recent developments in three-dimensional reconstruction from multiple views or multiple detectors promise to overcome the major limitations in previous work with digital computers. These techniques are now in clinical use for static imaging; however, speed limitations have prevented application to dynamic imaging. The future development of these methods will require innovations in patient positioning and multiple-view devices for either single-gamma or positron annihilation detection

  8. Scaling predictive modeling in drug development with cloud computing.

    Science.gov (United States)

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets with increased time for analysis is hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compare with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.
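    The pattern the study relies on, embarrassingly parallel model building over data partitions, can be sketched with the standard library. On the Amazon Elastic Cloud each job would run on its own instance; here a local thread pool stands in, and a least-squares line fit stands in for ligand-based model training (the paper's actual descriptors and learners are not shown in the abstract).

```python
from concurrent.futures import ThreadPoolExecutor

def fit_partition(points):
    """Least-squares slope of one data partition (stand-in for one model-building job)."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    return (n * sxy - sx * sy) / (n * sxx - sx * sx)

# Four partitions of a synthetic data set lying exactly on y = 2x + 1.
partitions = [[(x, 2.0 * x + 1.0) for x in range(i, i + 10)]
              for i in range(0, 40, 10)]

# Dispatch the independent jobs to a worker pool and gather the results.
with ThreadPoolExecutor(max_workers=4) as pool:
    slopes = list(pool.map(fit_partition, partitions))

print(slopes)  # every partition recovers slope 2.0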

  9. The Development of University Computing in Sweden 1965-1985

    Science.gov (United States)

    Dahlstrand, Ingemar

    In 1965-70 the government agency, Statskontoret, set up five university computing centers, as service bureaux financed by grants earmarked for computer use. The centers were well equipped and staffed and caused a surge in computer use. When the yearly flow of grant money stagnated at 25 million Swedish crowns, the centers had to find external income to survive and acquire time-sharing. But the charging system led to the computers not being fully used. The computer scientists lacked equipment for laboratory use. The centers were decentralized and the earmarking abolished. Eventually they got new tasks like running computers owned by the departments, and serving the university administration.

  10. Computer-aided design and computer science technology

    Science.gov (United States)

    Fulton, R. E.; Voigt, S. J.

    1976-01-01

    A description is presented of computer-aided design requirements and the resulting computer science advances needed to support aerospace design. The aerospace design environment is examined, taking into account problems of data handling and aspects of computer hardware and software. The interactive terminal is normally the primary interface between the computer system and the engineering designer. Attention is given to user aids, interactive design, interactive computations, the characteristics of design information, data management requirements, hardware advancements, and computer science developments.

  11. [Fatal amnioinfusion with previous choriocarcinoma in a parturient woman].

    Science.gov (United States)

    Hrgović, Z; Bukovic, D; Mrcela, M; Hrgović, I; Siebzehnrübl, E; Karelovic, D

    2004-04-01

    The case is described of a 36-year-old tertipara who had developed choriocarcinoma in a previous pregnancy. During her first term labor the patient suffered cardiac arrest; she was resuscitated and a cesarean section was performed. A male newborn was delivered in good condition, but despite intensive therapy and resuscitation the mother died with the picture of disseminated intravascular coagulation (DIC). Autopsy and histology showed no sign of malignant disease, so the previous choriocarcinoma could not be linked to the amniotic fluid embolism. Possibly the site of the choriocarcinoma was a "locus minoris resistentiae" that later resulted in a failure of placentation, although this was hard to prove. Autopsy revealed pulmonary embolism with microthrombosis of the terminal circulation and punctiform mucosal bleeding, consistent with DIC.

  12. Development of a totally computer-controlled triple quadrupole mass spectrometer system

    International Nuclear Information System (INIS)

    Wong, C.M.; Crawford, R.W.; Barton, V.C.; Brand, H.R.; Neufeld, K.W.; Bowman, J.E.

    1983-01-01

    A totally computer-controlled triple quadrupole mass spectrometer (TQMS) is described. It has a number of unique features not available on current commercial instruments, including: complete computer control of source and all ion axial potentials; use of dual computers for data acquisition and data processing; and capability for self-adaptive control of experiments. Furthermore, it has been possible to produce this instrument at a cost significantly below that of commercial instruments. This triple quadrupole mass spectrometer has been constructed using components commercially available from several different manufacturers. The source is a standard Hewlett-Packard 5985B GC/MS source. The two quadrupole analyzers and the quadrupole CAD region contain Balzers QMA 150 rods with Balzers QMG 511 rf controllers for the analyzers and a Balzers QHS-511 controller for the CAD region. The pulsed-positive-ion-negative-ion-chemical ionization (PPINICI) detector is made by Finnigan Corporation. The mechanical and electronics design were developed at LLNL for linking these diverse elements into a functional TQMS as described. The computer design for total control of the system is unique in that two separate LSI-11/23 minicomputers and assorted I/O peripherals and interfaces from several manufacturers are used. The evolution of this design concept from totally computer-controlled instrumentation into future self-adaptive or ''expert'' systems for instrumental analysis is described. Operational characteristics of the instrument and initial results from experiments involving the analysis of the high explosive HMX (1,3,5,7-Tetranitro-1,3,5,7-Tetrazacyclooctane) are presented

  13. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C{sup 3}P), a five year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C{sup 3}P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C{sup 3}P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  14. Computing programme SPEGTAR and user's guide

    International Nuclear Information System (INIS)

    Altiparmakov, D.; Bosevski, T.

    1974-01-01

    Computer code SPEGTAR is a one-dimensional multigroup code for calculating neutron transport in multi-zone cylindrical geometry. The neutron flux distribution is calculated by solving the integral transport equation by the method of first collision probability, previously developed for both the one-group and multigroup cases. From the spatial multigroup distribution of the neutron flux, integral values of the nuclear constants are determined by numerical integration for all material zones and all energy groups. These results are used to determine the parameters of the homogenized reactor cell and to condense the energy groups into a form suitable for the overall reactor calculation
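    In the collision-probability method described above, the integral transport equation reduces, zone by zone, to the linear system phi = P (S phi + q), i.e. (I - P S) phi = P q, per energy group. A one-group sketch follows; the collision-probability matrix P below is invented for illustration (a real code such as SPEGTAR computes it from the zone geometry and cross sections).

```python
def solve_cp(P, S, q):
    """Solve (I - P S) phi = P q by Gaussian elimination with partial pivoting."""
    n = len(q)
    # A = I - P S (S acts as a diagonal matrix of per-zone scattering ratios).
    A = [[(1.0 if i == j else 0.0) - P[i][j] * S[j] for j in range(n)] for i in range(n)]
    b = [sum(P[i][j] * q[j] for j in range(n)) for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    phi = [0.0] * n
    for r in range(n - 1, -1, -1):
        phi[r] = (b[r] - sum(A[r][c] * phi[c] for c in range(r + 1, n))) / A[r][r]
    return phi

# Two-zone toy problem: rows of P sum to less than 1 (the remainder leaks out),
# S is the scattering probability per collision in each zone, q a flat source.
P = [[0.6, 0.3],
     [0.3, 0.5]]
S = [0.4, 0.2]
q = [1.0, 0.5]
phi = solve_cp(P, S, q)
print(phi)
```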

  15. Computational Modeling Develops Ultra-Hard Steel

    Science.gov (United States)

    2007-01-01

    Glenn Research Center's Mechanical Components Branch developed a spiral bevel or face gear test rig for testing thermal behavior, surface fatigue, strain, vibration, and noise; a full-scale, 500-horsepower helicopter main-rotor transmission testing stand; a gear rig that allows fundamental studies of the dynamic behavior of gear systems and gear noise; and a high-speed helical gear test for analyzing thermal behavior for rotorcraft. The test rig provides accelerated fatigue life testing for standard spur gears at speeds of up to 10,000 rotations per minute. The test rig enables engineers to investigate the effects of materials, heat treat, shot peen, lubricants, and other factors on the gear's performance. QuesTek Innovations LLC, based in Evanston, Illinois, recently developed a carburized, martensitic gear steel with an ultra-hard case using its computational design methodology, but needed to verify surface fatigue, lifecycle performance, and overall reliability. The Battelle Memorial Institute introduced the company to researchers at Glenn's Mechanical Components Branch and facilitated a partnership allowing researchers at the NASA Center to conduct spur gear fatigue testing for the company. Testing revealed that QuesTek's gear steel outperforms the current state-of-the-art alloys used for aviation gears in contact fatigue by almost 300 percent. With the confidence and credibility provided by the NASA testing, QuesTek is commercializing two new steel alloys. Uses for this new class of steel are limitless in areas that demand exceptional strength for high throughput applications.

  16. The early development of medial coronoid disease in growing Labrador retrievers: Radiographic, computed tomographic, necropsy and micro-computed tomographic findings

    NARCIS (Netherlands)

    Lau, S.F.; Wolschrijn, C.F.; Hazewinkel, H.A.W.; Siebelt, M; Voorhout, G.

    2013-01-01

    Medial coronoid disease (MCD) encompasses lesions of the entire medial coronoid process (MCP), both of the articular cartilage and the subchondral bone. To detect the earliest signs of MCD, radiography and computed tomography were used to monitor the development of MCD in 14 Labrador

  17. Integrated Computational Materials Engineering Development of Advanced High Strength Steel for Lightweight Vehicles

    Energy Technology Data Exchange (ETDEWEB)

    Hector, Jr., Louis G. [General Motors, Warren, MI (United States); McCarty, Eric D. [United States Automotive Materials Partnership LLC (USAMP), Southfield, MI (United States)

    2017-07-31

    The goal of the ICME 3GAHSS project was to successfully demonstrate the applicability of Integrated Computational Materials Engineering (ICME) for the development and deployment of third generation advanced high strength steels (3GAHSS) for immediate weight reduction in passenger vehicles. The ICME approach integrated results from well-established computational and experimental methodologies to develop a suite of material constitutive models (deformation and failure), manufacturing process and performance simulation modules, a properties database, as well as the computational environment linking them together for both performance prediction and material optimization. This is the Final Report for the ICME 3GAHSS project, which achieved the following objectives: 1) Developed a 3GAHSS ICME model, which includes atomistic, crystal plasticity, state variable and forming models. The 3GAHSS model was implemented in commercially available LS-DYNA and a user guide was developed to facilitate use of the model. 2) Developed and produced two 3GAHSS alloys using two different chemistries and manufacturing processes, for use in calibrating and validating the 3GAHSS ICME Model. 3) Optimized the design of an automotive subassembly by substituting 3GAHSS for AHSS yielding a design that met or exceeded all baseline performance requirements with a 30% mass savings. A technical cost model was also developed to estimate the cost per pound of weight saved when substituting 3GAHSS for AHSS. The project demonstrated the potential for 3GAHSS to achieve up to 30% weight savings in an automotive structure at a cost penalty of up to $0.32 to $1.26 per pound of weight saved. The 3GAHSS ICME Model enables the user to design 3GAHSS to desired mechanical properties in terms of strength and ductility.

  18. Development of tools and models for computational fracture assessment

    International Nuclear Information System (INIS)

    Talja, H.; Santaoja, K.

    1998-01-01

    The aim of the work presented in this paper has been to develop and test new computational tools and theoretically more sound methods for fracture mechanical analysis. The applicability of the engineering integrity assessment system MASI for evaluation of piping components has been extended. The most important motivation for the theoretical development have been the well-known fundamental limitations in the validity of J-integral, which limits its applicability in many important practical safety assessment cases. Examples are extensive plastic deformation, multimaterial structures and ascending loading paths (especially warm prestress, WPS). Further, the micromechanical Gurson model has been applied to several reactor pressure vessel materials. Special attention is paid to the transferability of Gurson model parameters from tensile test results to prediction of ductile failure behaviour of cracked structures. (author)

  19. Large-scale computer-mediated training for management teachers

    Directory of Open Access Journals (Sweden)

    Gilly Salmon

    1997-01-01

    In 1995/6 the Open University Business School (OUBS) trained 187 tutors in the UK and continental Western Europe in Computer Mediated Conferencing (CMC) for management education. The medium chosen for the training was FirstClassTM. In 1996/7 the OUBS trained a further 106 tutors in FirstClassTM using an improved version of the previous year's training. The online training was based on a previously developed model of learning online. The model was tested both through the structure of the training programme and through the improvements made. The training programme was evaluated and revised for the second cohort, and a comparison was made between the two training programmes.

  20. HCI^2 Workbench: A Development Tool for Multimodal Human-Computer Interaction Systems

    NARCIS (Netherlands)

    Shen, Jie; Wenzhe, Shi; Pantic, Maja

    In this paper, we present a novel software tool designed and implemented to simplify the development process of Multimodal Human-Computer Interaction (MHCI) systems. This tool, which is called the HCI^2 Workbench, exploits a Publish / Subscribe (P/S) architecture [13] [14] to facilitate efficient

  1. The Watts-Strogatz network model developed by including degree distribution: theory and computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Y W [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China); Zhang, L F [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China); Huang, J P [Surface Physics Laboratory and Department of Physics, Fudan University, Shanghai 200433 (China)

    2007-07-20

    By using theoretical analysis and computer simulations, we develop the Watts-Strogatz network model by including degree distribution, in an attempt to improve the comparison between characteristic path lengths and clustering coefficients predicted by the original Watts-Strogatz network model and those of the real networks with the small-world property. Good agreement between the predictions of the theoretical analysis and those of the computer simulations has been shown. It is found that the developed Watts-Strogatz network model can fit the real small-world networks more satisfactorily. Some other interesting results are also reported by adjusting the parameters in a model degree-distribution function. The developed Watts-Strogatz network model is expected to help in the future analysis of various social problems as well as financial markets with the small-world property.

  2. The Watts-Strogatz network model developed by including degree distribution: theory and computer simulation

    International Nuclear Information System (INIS)

    Chen, Y W; Zhang, L F; Huang, J P

    2007-01-01

    By using theoretical analysis and computer simulations, we develop the Watts-Strogatz network model by including degree distribution, in an attempt to improve the comparison between characteristic path lengths and clustering coefficients predicted by the original Watts-Strogatz network model and those of the real networks with the small-world property. Good agreement between the predictions of the theoretical analysis and those of the computer simulations has been shown. It is found that the developed Watts-Strogatz network model can fit the real small-world networks more satisfactorily. Some other interesting results are also reported by adjusting the parameters in a model degree-distribution function. The developed Watts-Strogatz network model is expected to help in the future analysis of various social problems as well as financial markets with the small-world property
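    The baseline the two records above extend is the classic Watts-Strogatz construction: a ring lattice whose edges are rewired with probability p, giving high clustering with short path lengths. The sketch below implements only that baseline; the degree-distribution extension proposed in the paper modifies the construction and is not reproduced here.

```python
import random

def watts_strogatz(n, k, p, seed=0):
    """Ring lattice of n nodes, each joined to its k nearest neighbours
    (k even); each lattice edge is then rewired with probability p."""
    rng = random.Random(seed)
    adj = {v: set() for v in range(n)}
    for v in range(n):
        for d in range(1, k // 2 + 1):
            adj[v].add((v + d) % n)
            adj[(v + d) % n].add(v)
    for v in range(n):
        for d in range(1, k // 2 + 1):
            w = (v + d) % n
            # Rewire edge (v, w), if it still exists, to a uniform non-neighbour.
            if rng.random() < p and w in adj[v]:
                candidates = [u for u in range(n) if u != v and u not in adj[v]]
                if candidates:
                    new = rng.choice(candidates)
                    adj[v].discard(w); adj[w].discard(v)
                    adj[v].add(new); adj[new].add(v)
    return adj

def clustering(adj):
    """Average local clustering coefficient over all nodes."""
    total = 0.0
    for v, nbrs in adj.items():
        deg = len(nbrs)
        if deg < 2:
            continue
        links = sum(1 for a in nbrs for b in nbrs if a < b and b in adj[a])
        total += 2.0 * links / (deg * (deg - 1))
    return total / len(adj)

lattice = watts_strogatz(60, 6, 0.0)   # pure ring lattice: clustering = 0.6
rewired = watts_strogatz(60, 6, 0.1)   # small-world regime
print(clustering(lattice), clustering(rewired))
```

For a k = 6 ring lattice the clustering coefficient is exactly 3(k-2)/(4(k-1)) = 0.6; rewiring a small fraction of edges shortens paths while only mildly reducing this value, which is the small-world property the paper's extension aims to match more closely on real networks.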

  3. Computer security

    CERN Document Server

    Gollmann, Dieter

    2011-01-01

    A completely up-to-date resource on computer security Assuming no previous experience in the field of computer security, this must-have book walks you through the many essential aspects of this vast topic, from the newest advances in software and technology to the most recent information on Web applications security. This new edition includes sections on Windows NT, CORBA, and Java and discusses cross-site scripting and JavaScript hacking as well as SQL injection. Serving as a helpful introduction, this self-study guide is a wonderful starting point for examining the variety of competing sec

  4. Dynamical Models for Computer Viruses Propagation

    Directory of Open Access Journals (Sweden)

    José R. C. Piqueira

    2008-01-01

    Nowadays, digital computer systems and networks are the main engineering tools, used in the planning, design, operation, and control of buildings, transportation, machinery, businesses, and life-sustaining devices of all sizes. Consequently, computer viruses have become one of the most important sources of uncertainty, reducing the reliability of vital activities. Many antivirus programs have been developed, but they are limited to detecting and removing infections based on previous knowledge of the virus code. In spite of good adaptation capability, these programs work like vaccines against diseases and are not able to prevent new infections based on the network state. Here, an attempt to model the propagation dynamics of computer viruses relates them to other notable events occurring in the network, permitting preventive policies to be established for network management. Data on three different viruses were collected on the Internet, and two identification techniques, autoregressive and Fourier analyses, were applied, showing that it is possible to forecast the dynamics of a new virus propagation by using data collected from other viruses that formerly infected the network.
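    The autoregressive identification step mentioned above can be sketched as a least-squares AR(2) fit to one observed propagation series, from which the coefficients are read off and used for forecasting. The series here is synthetic (generated by a known AR(2) recurrence); the paper fitted real infection counts collected from the Internet.

```python
def fit_ar2(series):
    """Least-squares AR(2) fit via the 2x2 normal equations (Cramer's rule)."""
    rows = [(series[t - 1], series[t - 2]) for t in range(2, len(series))]
    y = series[2:]
    a11 = sum(r0 * r0 for r0, _ in rows)
    a12 = sum(r0 * r1 for r0, r1 in rows)
    a22 = sum(r1 * r1 for _, r1 in rows)
    b1 = sum(r0 * yt for (r0, _), yt in zip(rows, y))
    b2 = sum(r1 * yt for (_, r1), yt in zip(rows, y))
    det = a11 * a22 - a12 * a12
    return ((b1 * a22 - b2 * a12) / det, (a11 * b2 - a12 * b1) / det)

# Synthetic "infection count" series: x_t = 1.2 x_{t-1} - 0.4 x_{t-2}.
series = [1.0, 2.0]
for _ in range(40):
    series.append(1.2 * series[-1] - 0.4 * series[-2])

coeffs = fit_ar2(series)
print(coeffs)  # recovers approximately (1.2, -0.4)
```

Once the coefficients are identified from one virus, the same recurrence can be rolled forward to forecast the early dynamics of a new outbreak, which is the transfer idea the abstract describes.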

  5. Issues on the Development and Application of Computer Tools to Support Product Structuring and Configuring

    DEFF Research Database (Denmark)

    Hansen, Claus Thorp; Riitahuhta, A.

    2001-01-01

    The aim of this article is to take stock of the results and challenges in the efforts to develop computer tools that support product structuring and configuring in product development projects. The balance is drawn in two dimensions, a design science dimension and an industrial dimension. The design ...... that there are large positive effects to be gained for industrial companies by consciously implementing computer tools based on the results of design science. The positive effects will be measured by e.g. predictable product quality, reduced lead time, and reuse of design solutions....

  6. Development of a portable computed tomographic scanner for on-line imaging of industrial piping systems

    International Nuclear Information System (INIS)

    Jaafar Abdullah; Mohd Arif Hamzah; Mohd Soyapi Mohd Yusof; Mohd Fitri Abdul Rahman; Fadil IsmaiI; Rasif Mohd Zain

    2003-01-01

    Computed tomography (CT) technology is being increasingly developed for industrial application. This paper presents the development of a portable computed tomographic scanner for on-line imaging of industrial piping systems. The theoretical approach, the system hardware, the data acquisition system and the adopted algorithm for image reconstruction are discussed. The scanner has great potential for determining the extent of corrosion under insulation (CUI), detecting blockages, measuring the thickness of deposits/materials built up on pipe walls and improving the understanding of material flow in pipelines. (Author)
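    The abstract does not specify the adopted reconstruction algorithm, but every CT reconstruction rests on the projection/backprojection pair, which can be shown in miniature. The sketch below uses only two projection angles and unfiltered backprojection, far cruder than a production algorithm such as filtered backprojection, yet it already localizes a dense deposit on a pipe cross-section.

```python
def project(image):
    """Parallel-beam projections at 0 and 90 degrees (row sums and column sums)."""
    rows = [sum(row) for row in image]
    cols = [sum(col) for col in zip(*image)]
    return rows, cols

def backproject(rows, cols):
    """Each pixel accumulates the projections of the rays passing through it."""
    n = len(rows)
    return [[(rows[i] + cols[j]) / 2.0 for j in range(n)] for i in range(n)]

# A 5x5 "pipe cross-section" with one dense deposit at row 2, column 3.
image = [[0.0] * 5 for _ in range(5)]
image[2][3] = 4.0
rows, cols = project(image)
recon = backproject(rows, cols)

# The brightest reconstructed pixel coincides with the deposit.
peak = max((v, i, j) for i, r in enumerate(recon) for j, v in enumerate(r))
print(peak[1], peak[2])  # → 2 3
```

With only two angles the reconstruction also contains streak artifacts along the two ray directions; adding more angles and a ramp filter is what turns this toy into filtered backprojection.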

  7. The development of Sonic Pi and its use in educational partnerships: Co-creating pedagogies for learning computer programming

    OpenAIRE

    Aaron, S; Blackwell, Alan Frank; Burnard, Pamela Anne

    2017-01-01

    Sonic Pi is a new open source software tool and platform originally developed for the Raspberry Pi computer, designed to enable school children to learn programming by creating music. In this article we share insights from a scoping study on the development of Sonic Pi and its use in educational partnerships. Our findings draw attention to the importance of collaborative relationships between teacher and computer scientist and the value of creative pedagogies for learning computer programming...

  8. Positron emission computed tomography

    International Nuclear Information System (INIS)

    Grover, M.; Schelbert, H.R.

    1985-01-01

    Regional myocardial blood flow and substrate metabolism can be non-invasively evaluated and quantified with positron emission computed tomography (Positron-CT). Tracers of exogenous glucose utilization and fatty acid metabolism are available and have been extensively tested. Specific tracer kinetic models have been developed or are being tested so that glucose and fatty acid metabolism can be measured quantitatively by Positron-CT. Tracers of amino acid and oxygen metabolism are utilized in Positron-CT studies of the brain, and the development of such tracers for cardiac studies is in progress. Methods to quantify regional myocardial blood flow are also being developed. Previous studies have demonstrated the ability of Positron-CT to document myocardial infarction. Experimental and clinical studies have begun to identify metabolic markers of reversibly ischemic myocardium. The potential of Positron-CT to reliably detect potentially salvageable myocardium and, hence, to identify appropriate therapeutic interventions is one of the most exciting applications of the technique

  9. Report on the FY17 Development of Computer Program for ASME Section III, Division 5, Subsection HB, Subpart B Rules

    Energy Technology Data Exchange (ETDEWEB)

    Swindeman, M. J. [Argonne National Lab. (ANL), Argonne, IL (United States); Jetter, R. I. [Argonne National Lab. (ANL), Argonne, IL (United States); Sham, T. -L. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-01-01

    One of the objectives of the high temperature design methodology activities is to develop and validate both improvements and the basic features of ASME Boiler and Pressure Vessel Code, Section III, Rules for Construction of Nuclear Facility Components, Division 5, High Temperature Reactors, Subsection HB, Subpart B (HBB). The overall scope of this task is to develop a computer program to aid in the assessment of components under specified loading conditions in accordance with the elevated temperature design requirements for Division 5 Class A components. There are many features and alternative paths of varying complexity in HBB. The initial focus of this computer program is a basic path through the various options for a single reference material, 316H stainless steel. However, the computer program is being structured for the eventual incorporation of all the features and permitted materials of HBB. This report first provides a description of the overall computer program, the particular challenges in developing numerical procedures for the assessment, and the overall approach to program development. This is followed by a more comprehensive appendix, which is the draft computer program manual. The strain limits rules have been implemented in the computer program. The evaluation of creep-fatigue damage will be implemented in future work.

  10. Revolutionary development of computer education : A success story

    OpenAIRE

    Nandasara, S. T.; Samaranayake, V. K.; Mikami, Yoshiki

    2006-01-01

    The University of Colombo, Sri Lanka has been in the forefront of the “Computer Revolution” in Sri Lanka. It has introduced the teaching of computer programming and applications as early as in 1967, more than a decade before other educational institutions, thereby producing, over the years, a large number of pioneer computer scientists and IT graduates out of students entering the university from a variety of disciplines. They are presently employed as researchers, educators, data processing ...

  11. Computer-Aided Software Engineering - An approach to real-time software development

    Science.gov (United States)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.

  12. Biased ART: a neural architecture that shifts attention toward previously disregarded features following an incorrect prediction.

    Science.gov (United States)

    Carpenter, Gail A; Gaddam, Sai Chaitanya

    2010-04-01

    Memories in Adaptive Resonance Theory (ART) networks are based on matched patterns that focus attention on those portions of bottom-up inputs that match active top-down expectations. While this learning strategy has proved successful for both brain models and applications, computational examples show that attention to early critical features may later distort memory representations during online fast learning. For supervised learning, biased ARTMAP (bARTMAP) solves the problem of over-emphasis on early critical features by directing attention away from previously attended features after the system makes a predictive error. Small-scale, hand-computed analog and binary examples illustrate key model dynamics. Two-dimensional simulation examples demonstrate the evolution of bARTMAP memories as they are learned online. Benchmark simulations show that featural biasing also improves performance on large-scale examples. One example, which predicts movie genres and is based, in part, on the Netflix Prize database, was developed for this project. Both first principles and consistent performance improvements on all simulation studies suggest that featural biasing should be incorporated by default in all ARTMAP systems. Benchmark datasets and bARTMAP code are available from the CNS Technology Lab Website: http://techlab.bu.edu/bART/.

  13. Design, development and integration of a large scale multiple source X-ray computed tomography system

    International Nuclear Information System (INIS)

    Malcolm, Andrew A.; Liu, Tong; Ng, Ivan Kee Beng; Teng, Wei Yuen; Yap, Tsi Tung; Wan, Siew Ping; Kong, Chun Jeng

    2013-01-01

    X-ray Computed Tomography (CT) allows visualisation of the physical structures in the interior of an object without physically opening or cutting it. This technology supports a wide range of applications in the non-destructive testing, failure analysis or performance evaluation of industrial products and components. Of the numerous factors that influence the performance characteristics of an X-ray CT system, the energy level in the X-ray spectrum to be used is one of the most significant. The ability of the X-ray beam to penetrate a given thickness of a specific material is directly related to the maximum available energy level in the beam. Higher energy levels allow penetration of thicker components made of denser materials. In response to local industry demand, and in support of on-going research activity in the area of 3D X-ray imaging for industrial inspection, the Singapore Institute of Manufacturing Technology (SIMTech) engaged in the design, development and integration of a large-scale multiple-source X-ray computed tomography system based on X-ray sources operating at higher energies than previously available in the Institute. The system consists of a large-area direct digital X-ray detector (410 x 410 mm), a multiple-axis manipulator system, a 225 kV open-tube microfocus X-ray source and a 450 kV closed-tube millifocus X-ray source. The 225 kV X-ray source can be operated in either transmission or reflection mode. The body of the 6-axis manipulator system is fabricated from heavy-duty steel onto which high-precision linear and rotary motors have been mounted in order to achieve high accuracy, stability and repeatability. A source-detector distance of up to 2.5 m can be achieved. The system is controlled by a proprietary X-ray CT operating system developed by SIMTech. The system can currently accommodate samples up to 0.5 x 0.5 x 0.5 m in size with weight up to 50 kg. These specifications will be increased to 1.0 x 1.0 x 1.0 m and 100 kg in future
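
    The pay-off of the long 2.5 m source-detector distance follows the standard cone-beam magnification relation M = SDD/SOD. The sketch below assumes an illustrative 0.2 mm detector pixel pitch; that value is not a quoted specification of the SIMTech system.

```python
# Standard cone-beam geometry relation (the 0.2 mm detector pitch below is
# an assumed illustrative value, not a specification of the SIMTech system).

def magnification(source_detector_mm, source_object_mm):
    """Geometric magnification M = SDD / SOD."""
    return source_detector_mm / source_object_mm

def effective_pixel_mm(detector_pitch_mm, m):
    """Object-side sampling: detector pixel pitch divided by magnification."""
    return detector_pitch_mm / m

m = magnification(2500.0, 500.0)   # 2.5 m SDD, 0.5 m SOD -> M = 5.0
vox = effective_pixel_mm(0.2, m)   # 0.2 mm pitch resolves 0.04 mm at the object
```

    Moving the object closer to the source raises M and improves object-side sampling, at the cost of a smaller field of view.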

  14. Development of a rapid multi-line detector for industrial computed tomography

    International Nuclear Information System (INIS)

    Nachtrab, Frank; Firsching, Markus; Hofmann, Thomas; Uhlmann, Norman; Neubauer, Harald; Nowak, Arne

    2015-01-01

    In this paper we present the development of a rapid multi-row detector optimized for industrial computed tomography. With a high frame rate, high spatial resolution and the ability to operate at up to 450 kVp, it is particularly suitable for applications such as fast acquisition of large objects, inline CT or time-resolved 4D CT. (Contains PowerPoint slides). [de]

  15. Development of a 3-dimensional seismic isolation floor for computer systems

    International Nuclear Information System (INIS)

    Kurihara, M.; Shigeta, M.; Nino, T.; Matsuki, T.

    1991-01-01

    In this paper, we investigated the applicability of a seismic isolation floor as a method for protecting computer systems, such as those in nuclear power plants, from strong earthquakes. Assuming that the computer system is guaranteed for 250 cm/s² of input acceleration in the horizontal and vertical directions as its seismic performance, the basic design specification of the seismic isolation floor is as follows. Against S1-level earthquakes, the maximum acceleration response of the seismic isolation floor in the horizontal and vertical directions is kept below 250 cm/s² to maintain continuous computer operation. Against S2-level earthquakes, the isolation floor allows large horizontal movement and large displacement of the isolation devices to reduce the acceleration response, although the response is not guaranteed to remain below 250 cm/s². By reducing the acceleration response, however, serious damage to the computer systems is reduced, so that they can be restarted after an earthquake. Usually, seismic isolation floor systems provide only 2-dimensional (horizontal) isolation. However, in the case of earthquakes occurring directly beneath the site, which have large vertical components, the vertical acceleration response of such a system is amplified by the lateral vibration of the frame of the isolation floor. Therefore, in this study a 3-dimensional seismic isolation floor, including vertical isolation, was developed. This paper describes 1) the experimental results on the response characteristics of the 3-dimensional seismic isolation floor built as a trial, obtained using a 3-dimensional shaking table, and 2) a comparison of a 2-dimensional analytical model, for motion in one horizontal direction and the vertical direction, with the experimental results. (J.P.N.)
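
    The acceleration-reduction argument can be illustrated with a single-degree-of-freedom sketch: a softly mounted floor transmits far less of a base motion than a stiff mounting does. The frequencies, damping ratios and sinusoidal excitation below are invented illustrative values, not the parameters of the floor described above.

```python
import math

def peak_abs_accel(freq_hz, zeta, ground_accel, dt=0.001, steps=5000):
    """Peak absolute acceleration of a 1-DOF system under base excitation.
    Relative coordinate x obeys x'' + 2*zeta*w*x' + w**2*x = -a_g(t); the
    absolute acceleration is a_g + x'' = -(2*zeta*w*x' + w**2*x)."""
    w = 2.0 * math.pi * freq_hz
    x = v = peak = 0.0
    for i in range(steps):
        ag = ground_accel(i * dt)
        a_rel = -ag - 2.0 * zeta * w * v - w * w * x
        peak = max(peak, abs(ag + a_rel))
        v += a_rel * dt                 # semi-implicit Euler step
        x += v * dt
    return peak

ground = lambda t: 2.5 * math.sin(2.0 * math.pi * 3.0 * t)  # 3 Hz, 2.5 m/s^2
stiff = peak_abs_accel(10.0, 0.05, ground)  # rigidly mounted equipment
soft = peak_abs_accel(0.5, 0.2, ground)     # softly isolated floor
```

    With its natural frequency well below the excitation frequency, the isolated floor's peak absolute acceleration is a small fraction of the rigidly mounted response.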

  16. Development of a computer code for thermohydraulic analysis of a heated channel in transients

    International Nuclear Information System (INIS)

    Jafari, J.; Kazeminejad, H.; Davilu, H.

    2004-01-01

    This paper discusses the thermohydraulic analysis of a heated channel of a nuclear reactor in transients by a computer code developed by the authors. The considered geometry is a channel of a nuclear reactor with cylindrical or planar fuel rods. The coolant is water and flows along the outer surface of the fuel rod. To model the heat transfer in the fuel rod, the two-dimensional time-dependent conduction equation has been solved by a combination of numerical methods: the orthogonal collocation method in the radial direction and the finite difference method in the axial direction. For coolant modelling, the single-phase time-dependent energy equation has been used and solved by the finite difference method. The combination of the first module, which solves the conduction in the fuel rod, and a second one, which solves the energy balance in the coolant region, constitutes the computer code (Thyc-1) for the thermohydraulic analysis of a heated channel in transients. The orthogonal collocation method maintains the accuracy and computing time of conventional finite difference methods, while the computer storage is reduced by a factor of two. The same problem has been modelled by the RELAP5/M3 system code to assess the validity of the Thyc-1 code. The good agreement of the results qualifies the developed code
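
    As a minimal illustration of the finite-difference half of such a scheme (the paper's orthogonal-collocation treatment of the radial direction is not reproduced here, and all material values below are invented), one explicit time step of 1-D transient conduction looks like:

```python
# Explicit finite-difference step for dT/dt = alpha * d2T/dx2 with
# fixed-temperature ends. Illustrative sketch only; not the Thyc-1 scheme.

def step_conduction(T, alpha, dx, dt, T_left, T_right):
    """Advance the temperature profile by one explicit time step."""
    r = alpha * dt / dx ** 2            # stability requires r <= 0.5
    return [T_left] + [
        T[i] + r * (T[i - 1] - 2.0 * T[i] + T[i + 1])
        for i in range(1, len(T) - 1)
    ] + [T_right]

T = [20.0] * 11                         # rod initially at 20 deg C
for _ in range(200):                    # heat soaks in from the hot end
    T = step_conduction(T, alpha=1e-5, dx=0.01, dt=1.0,
                        T_left=100.0, T_right=20.0)
```

    After a few hundred steps the profile relaxes monotonically from the hot boundary toward the cold one, the behaviour a transient channel code must capture at every axial node.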

  17. Development of distributed computer systems for future nuclear power plants

    International Nuclear Information System (INIS)

    Yan, G.; L'Archeveque, J.V.R.

    1978-01-01

    Dual computers have been used for direct digital control in CANDU power reactors since 1963. However, as reactor plants have grown in size and complexity, some drawbacks of centralized control have appeared, such as the surprisingly large amount of cabling required for information transmission. Dramatic changes in component costs and a desire to improve system performance have stimulated a broad-based research and development effort in distributed systems. This paper outlines work in this area

  18. Flow around an oscillating cylinder: computational issues

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Fengjian; Gallardo, José P; Pettersen, Bjørnar [Department of Marine Technology, Norwegian University of Science and Technology, NO-7491 Trondheim (Norway); Andersson, Helge I, E-mail: fengjian.jiang@ntnu.no [Department of Energy and Process Engineering, Norwegian University of Science and Technology, NO-7491 Trondheim (Norway)

    2017-10-15

    We consider different computational issues related to the three-dimensionalities of the flow around an oscillating circular cylinder. The full time-dependent Navier–Stokes equations are directly solved in a moving reference frame by introducing a forcing term. The choice of quantitative validation criteria is discussed and discrepancies of previously published results are addressed. The development of Honji vortices shows that short simulation times may lead to incorrect quasi-stable vortex patterns. The viscous decay of already established Honji vortices is also examined. (paper)

  19. De novo adamantinomatous craniopharyngioma presenting anew in an elderly patient with previous normal CT and MRI studies: A case report and implications on pathogenesis

    Directory of Open Access Journals (Sweden)

    Amy Walker, B.S.

    2015-09-01

    Adamantinomatous craniopharyngiomas are histologically benign epithelial tumors which arise from embryonic remnants of the craniopharyngeal duct and Rathke’s pouch. They are thought to have a congenital origin and are histologically unique from papillary craniopharyngioma. We describe the case of an elderly male who presented with symptoms related to a large craniopharyngioma with previously normal brain magnetic resonance and computed tomography imaging studies. These findings dispute the embryogenic theory that craniopharyngiomas observed in adults develop from the persistent slow growth of embryonic remnants.

  20. Starpc: a library for communication among tools on a parallel computer cluster. User's and developer's guide to Starpc

    International Nuclear Information System (INIS)

    Takemiya, Hiroshi; Yamagishi, Nobuhiro

    2000-02-01

    We report on an RPC (Remote Procedure Call)-based communication library, Starpc, for a parallel computer cluster. Starpc supports communication between Java Applets and C programs as well as between C programs. Starpc has the following three features. (1) It enables communication between Java Applets and C programs on an arbitrary computer without security violation, even though Java Applets are normally restricted, for security reasons, to communicating only with programs on a specific computer (the Web server). (2) Diverse network communication protocols are available in Starpc, because it uses the Nexus communication library developed at Argonne National Laboratory. (3) It works on many kinds of computers, including eight parallel computers and four workstation servers. In this report, the usage of Starpc and the development of applications using Starpc are described. (author)
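
    The abstract does not show Starpc's own API. As a generic illustration of the RPC pattern it implements, where a remote procedure is invoked like a local function, a minimal round trip with Python's standard xmlrpc module looks like:

```python
# Minimal RPC round trip with Python's stdlib xmlrpc; this illustrates the
# RPC programming model only and is unrelated to Starpc's actual interface.

from threading import Thread
from xmlrpc.client import ServerProxy
from xmlrpc.server import SimpleXMLRPCServer

# Server side: register a procedure that remote clients may call.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(lambda a, b: a + b, "add")
port = server.server_address[1]
Thread(target=server.serve_forever, daemon=True).start()

# Client side: the remote procedure is invoked like a local function call.
proxy = ServerProxy("http://127.0.0.1:%d" % port)
result = proxy.add(2, 3)
server.shutdown()
```

    The marshalling, transport and dispatch are hidden behind the proxy object, which is the property Starpc provides across the Java Applet / C boundary.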

  1. Development of the method of aggregation to determine the current storage area using computer vision and radiofrequency identification

    Science.gov (United States)

    Astafiev, A.; Orlov, A.; Privezencev, D.

    2018-01-01

    The article is devoted to the development of technology and software for the construction of positioning and control systems in industrial plants, based on aggregation to determine the current storage area using computer vision and radiofrequency identification. It describes the hardware design of an industrial-product positioning system covering the territory of a plant on the basis of a radio-frequency grid, and the hardware design of a positioning system within the plant on the basis of computer vision methods. It then describes the development of the method of aggregation that determines the current storage area by combining computer vision and radiofrequency identification. Experimental studies in laboratory and production conditions have been conducted and are described in the article.
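
    One plausible shape for such an aggregation, sketched below with entirely invented names and coordinates (the article's actual data model is not given), is to let the RFID grid narrow a product down to a storage zone and accept the finer computer-vision fix only when it is consistent with that zone:

```python
# Toy aggregation of the two sensing channels: an RFID read constrains the
# product to a zone; a vision estimate is kept only if it lies in that zone.
# Zone table and coordinates are invented for illustration.

zones = {"tag-042": ((0.0, 0.0), (10.0, 5.0))}   # bounding box per RFID tag

def aggregate(tag_id, vision_xy):
    """Fuse an RFID zone constraint with a computer-vision position fix."""
    (x0, y0), (x1, y1) = zones[tag_id]
    x, y = vision_xy
    if x0 <= x <= x1 and y0 <= y <= y1:
        return vision_xy                          # channels agree: precise fix
    return ((x0 + x1) / 2.0, (y0 + y1) / 2.0)     # fall back to zone centre

pos = aggregate("tag-042", (3.5, 2.0))   # inside the zone: vision fix is kept
```

    A disagreement between the channels then degrades gracefully to the coarse but reliable RFID zone rather than reporting an inconsistent position.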

  2. Design and development of a run-time monitor for multi-core architectures in cloud computing.

    Science.gov (United States)

    Kang, Mikyung; Kang, Dong-In; Crago, Stephen P; Park, Gyung-Leen; Lee, Junghoon

    2011-01-01

    Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet, as well as the infrastructure they run on. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. Large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM), system software that monitors application behavior at run-time, analyzes the collected information, and optimizes cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation, and the underlying hardware through performance counters, optimizing its computing configuration based on the analyzed data.
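
    The monitor/analyze/adapt shape of such a system can be sketched generically. The paper's RTM instruments libraries and reads hardware performance counters; the sketch below only samples the process CPU time the Python standard library exposes, and every name in it is invented for illustration.

```python
import time

# Generic sketch of a run-time monitoring loop (illustrative only; not the
# paper's RTM, which uses library instrumentation and hardware counters).

def monitor(workload, samples=5):
    """Run `workload` repeatedly, recording CPU time consumed per run."""
    history = []
    for _ in range(samples):
        c0 = time.process_time()
        workload()
        history.append(time.process_time() - c0)
    return history

busy = lambda: sum(i * i for i in range(100_000))
usage = monitor(busy)
# a real monitor would analyze `usage` and adapt the service configuration,
# e.g. migrating the workload or changing its core allocation
```
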

  3. Design and Development of a Run-Time Monitor for Multi-Core Architectures in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Junghoon Lee

    2011-03-01

    Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet, as well as the infrastructure they run on. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. Large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM), system software that monitors application behavior at run-time, analyzes the collected information, and optimizes cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation, and the underlying hardware through performance counters, optimizing its computing configuration based on the analyzed data.

  4. Development of a tracer transport option for the NAPSAC fracture network computer code

    International Nuclear Information System (INIS)

    Herbert, A.W.

    1990-06-01

    The NAPSAC computer code predicts groundwater flow through fractured rock using a direct fracture network approach. This paper describes the development of a tracer transport algorithm for the NAPSAC code. A very efficient particle-following approach is used, enabling tracer transport to be predicted through large fracture networks. The new algorithm is tested against three examples. These demonstrations confirm the accuracy of the code for simple networks, where there is an analytical solution to the transport problem, and illustrate the use of the computer code on a more realistic problem. (author)
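
    The particle-following idea can be sketched on a toy network. The graph structure, flow rates and complete-mixing rule below are invented for illustration; the abstract does not describe NAPSAC's actual data model or intersection rule.

```python
import random

# Toy particle-following transport through a fracture network. Each node
# maps to outgoing (next_node, flow_rate, travel_time) edges; at every
# intersection a particle picks an outgoing fracture with probability
# proportional to its flow rate (a complete-mixing assumption).

network = {
    "inlet": [("a", 2.0, 1.0), ("b", 1.0, 3.0)],
    "a": [("outlet", 2.0, 2.0)],
    "b": [("outlet", 1.0, 1.0)],
}

def track(net, start, end, rng):
    """Follow one particle from start to end, returning its travel time."""
    node, t = start, 0.0
    while node != end:
        edges = net[node]
        weights = [q for _, q, _ in edges]
        node, _, dt = rng.choices(edges, weights=weights)[0]
        t += dt
    return t

rng = random.Random(1)
times = [track(network, "inlet", "outlet", rng) for _ in range(1000)]
# the travel-time histogram approximates the tracer breakthrough curve
```

    Because each particle is independent, very large networks can be handled with minimal memory, which is the efficiency argument made in the abstract.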

  5. A Randomized Field Trial of the Fast ForWord Language Computer-Based Training Program

    Science.gov (United States)

    Borman, Geoffrey D.; Benson, James G.; Overman, Laura

    2009-01-01

    This article describes an independent assessment of the Fast ForWord Language computer-based training program developed by Scientific Learning Corporation. Previous laboratory research involving children with language-based learning impairments showed strong effects on their abilities to recognize brief and fast sequences of nonspeech and speech…

  6. Development of the JFT-2M data analysis software system on the mainframe computer

    International Nuclear Information System (INIS)

    Matsuda, Toshiaki; Amagai, Akira; Suda, Shuji; Maemura, Katsumi; Hata, Ken-ichiro.

    1990-11-01

    We developed a software system on the FACOM mainframe computer to analyze JFT-2M experimental data archived by the JFT-2M data acquisition system, which allows us to reduce and distribute the CPU load of the data acquisition system. We can then analyze JFT-2M experimental data on the mainframe using complicated computational codes operating on the raw data, such as equilibrium calculation and transport analysis, and useful software packages like the SAS statistics package. (author)

  7. DCE. Future IHEP's computing environment

    International Nuclear Information System (INIS)

    Zheng Guorui; Liu Xiaoling

    1995-01-01

    IHEP's computing environment consists of several different computing environments established on IHEP computer networks, of which the BES environment, which supports HEP computing, is the main part. In connection with the improvement and extension of the BES environment, the authors outline the development of these computing environments from the viewpoint of establishing a high energy physics (HEP) environment. The direction of development toward distributed computing for the IHEP computing environment, based on present trends in distributed computing, is presented

  8. Semiannual Report, April 1, 1989 through September 30, 1989 (Institute for Computer Applications in Science and Engineering)

    Science.gov (United States)

    1990-02-01

    Tobias B. Orloff: Work began on developing a high-quality rendering algorithm based on the radiosity method. The algorithm is similar to previous progressive radiosity algorithms except for the following improvements: 1. At each iteration vertex radiosities are computed using a modified scan-line approach, thus eliminating the quadratic cost associated with a ray-tracing computation of vertex radiosities. 2. At each iteration the scene is
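
    The progressive-radiosity iteration being improved here can be sketched in a few lines. The form factors F are taken as given (computing the per-vertex quantities cheaply is exactly what the scan-line modification addresses); the two-patch scene and all values are invented for illustration.

```python
# Minimal progressive ("shooting") radiosity sketch: repeatedly distribute
# the unshot energy of the brightest patch to all others via form factors F.

def progressive_radiosity(emission, reflect, F, iterations=50):
    """Return per-patch radiosities after shooting-style iterations."""
    n = len(emission)
    B = emission[:]          # current radiosity per patch
    unshot = emission[:]     # energy not yet distributed
    for _ in range(iterations):
        i = max(range(n), key=lambda k: unshot[k])   # brightest shooter
        for j in range(n):
            if j != i:
                dB = reflect[j] * F[i][j] * unshot[i]
                B[j] += dB
                unshot[j] += dB
        unshot[i] = 0.0
    return B

# two facing patches: patch 0 emits light, patch 1 only reflects
F = [[0.0, 0.5], [0.5, 0.0]]
B = progressive_radiosity([1.0, 0.0], [0.0, 0.8], F)
```

    Each pass improves the whole image at once, which is why progressive radiosity yields a usable picture long before full convergence.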

  9. Proposing Hybrid Architecture to Implement Cloud Computing in Higher Education Institutions Using a Meta-synthesis Approach

    Directory of Open Access Journals (Sweden)

    hamid reza bazi

    2017-12-01

    Cloud computing is a new technology that considerably helps Higher Education Institutions (HEIs) develop and create competitive advantage, with inherent characteristics such as flexibility, scalability, accessibility, reliability, fault tolerance and economic efficiency. Due to the numerous advantages of cloud computing, and in order to take advantage of cloud computing infrastructure, the services of universities and HEIs need to migrate to the cloud. However, this transition involves many challenges, one of which is the lack or shortage of appropriate architectures for migration to the technology. Using a reliable architecture for migration assures managers that risks in the cloud computing technology are mitigated. Therefore, organizations always search for suitable cloud computing architectures. In previous studies, these important features have received less attention and have not been addressed in a comprehensive way. The aim of this study is to use a meta-synthesis method for the first time to analyze the previously published studies and to suggest an appropriate hybrid cloud migration architecture (IUHEC). We reviewed many papers from relevant journals and conference proceedings. The concepts extracted from these papers were classified into related categories and sub-categories. Then, we developed our proposed hybrid architecture based on these concepts and categories. The proposed architecture was validated by a panel of experts, and Lawshe’s model was used to determine the content validity. Due to its innovative yet user-friendly nature, comprehensiveness, and high security, this architecture can help HEIs migrate effectively to a cloud computing environment.

  10. Using the Eclipse Parallel Tools Platform to Assist Earth Science Model Development and Optimization on High Performance Computers

    Science.gov (United States)

    Alameda, J. C.

    2011-12-01

    Development and optimization of computational science models, particularly on high performance computers, and with the advent of ubiquitous multicore processor systems on practically every system, has been accomplished with basic software tools: typically, command-line based compilers, debuggers, and performance tools that have not changed substantially from the days of serial and early vector computers. However, model complexity, including the complexity added by modern message passing libraries such as MPI, and the need for hybrid code models (such as OpenMP and MPI) to take full advantage of high performance computers with an increasing core count per shared memory node, have made development and optimization of such codes an increasingly arduous task. Additional architectural developments, such as many-core processors, only complicate the situation further. In this paper, we describe how our NSF-funded project, "SI2-SSI: A Productive and Accessible Development Workbench for HPC Applications Using the Eclipse Parallel Tools Platform" (WHPC), seeks to improve the Eclipse Parallel Tools Platform, an environment designed to support scientific code development targeted at a diverse set of high performance computing systems. Our WHPC project takes an application-centric view to improving PTP. We are using a set of scientific applications, each with a variety of challenges, both to drive further improvements to each scientific application and to understand shortcomings in Eclipse PTP from an application developer perspective, which drives the list of improvements we seek to make. We are also partnering with performance tool providers to drive higher quality performance tool integration. We have partnered with the Cactus group at Louisiana State University to improve Eclipse's ability to work with computational frameworks and extremely complex build systems, as well as to develop educational materials to incorporate into

  11. Development of computer models for fuel element behaviour in water reactors

    International Nuclear Information System (INIS)

    Gittus, J.H.

    1987-03-01

    Description of fuel behaviour during normal operation, transients and accident conditions has always represented a most challenging and important problem. Reliable predictions constitute a basic demand for safety-based calculations, for design purposes and for fuel performance. Therefore, computer codes based on deterministic and probabilistic models were developed. A fully comprehensive description of the phenomena is precluded in view of the great number of individual processes, involving physical, chemical, thermohydraulic and mechanical parameters, to be considered in a wide range of situations. In the case of fast thermal transients, predictive capability is limited by the kinetics of evolution of the system and its eventual dynamic behaviour. Evidently, probabilistic approaches are also limited by the sparsity and limited breadth of the empirical data base. Code predictions have to be evaluated against power reactor data and results from simulation experiments and, if possible, include cross-validation of different codes and validation of sub-models. Progress on this subject is reviewed in this report, which completes the co-ordinated research programme on 'Development of Computer Models for Fuel Element Behaviour in Water Reactors' (D-COM), initiated under the auspices of the IAEA in 1981

  12. Development of computer systems for planning and management of reactor decommissioning

    International Nuclear Information System (INIS)

    Yanagihara, Satoshi; Sukegawa, Takenori; Shiraishi, Kunio

    2001-01-01

    The computer systems for planning and management of reactor decommissioning were developed for effective implementation of a decommissioning project. The systems are intended to be applied to the construction of work breakdown structures and the estimation of manpower needs, worker doses, etc. based on unit productivity and work difficulty factors, which were developed by analyzing the actual data on the JPDR dismantling activities. In addition, information necessary for project planning can be effectively integrated in graphical form on a computer screen by transferring the data produced by subprograms, such as radioactive inventory and dose rate calculation routines, among the systems. Expert systems were adopted for modeling a new decommissioning project, using production rules to reconstruct work breakdown structures and work specifications. As a result, the systems are characterized by effective modeling of a decommissioning project, project management data estimation based on feedback of past experience, and information integration through the graphical user interface. Furthermore, the systems were validated by comparing the calculated results with the actual manpower needs of the JPDR dismantling activities; it is expected that the systems will be applicable to planning and evaluation of other decommissioning projects. (author)

  13. Development of high performance scientific components for interoperability of computing packages

    Energy Technology Data Exchange (ETDEWEB)

    Gulabani, Teena Pratap [Iowa State Univ., Ames, IA (United States)

    2008-01-01

    Three major high performance quantum chemistry computational packages, NWChem, GAMESS and MPQC, have been developed by different research efforts following different design patterns. The goal is to achieve interoperability among these packages by overcoming the challenges caused by the different communication patterns and software designs of each of these packages. Chemistry algorithms are hard and time-consuming to develop; integration of large quantum chemistry packages will allow resource sharing and thus avoid reinvention of the wheel. Creating connections between these incompatible packages is the major motivation of the proposed work. This interoperability is achieved by bringing the benefits of Component Based Software Engineering through a plug-and-play component framework called the Common Component Architecture (CCA). In this thesis, I present a strategy and process used for interfacing two widely used and important computational chemistry methodologies: Quantum Mechanics and Molecular Mechanics. To show the feasibility of the proposed approach, the Tuning and Analysis Utility (TAU) has been coupled with the NWChem code and its CCA components. Results show that the overhead is negligible when compared to the ease and potential of organizing and coping with large-scale software applications.

  14. Development of an international matrix-solver prediction system on a French-Japanese international grid computing environment

    International Nuclear Information System (INIS)

    Suzuki, Yoshio; Kushida, Noriyuki; Tatekawa, Takayuki; Teshima, Naoya; Caniou, Yves; Guivarch, Ronan; Dayde, Michel; Ramet, Pierre

    2010-01-01

    The 'Research and Development of International Matrix-Solver Prediction System (REDIMPS)' project aimed at improving the TLSE sparse linear algebra expert website by establishing an international grid computing environment between Japan and France. To help users identify the best solver or sparse linear algebra tool for their problems, we have developed an interoperable environment between the French and Japanese grid infrastructures (managed by DIET and AEGIS, respectively). Two main issues were considered. The first is how to submit a job from DIET to AEGIS. The second is how to bridge the difference in security between DIET and AEGIS. To overcome these issues, we developed APIs to communicate between the different grid infrastructures by improving the client API of AEGIS. By developing a server daemon program (SeD) for DIET which behaves like an AEGIS user, DIET can call functions in AEGIS: authentication, file transfer, job submission, and so on. To strengthen security, we also developed functionalities to authenticate DIET sites and DIET users in order to access AEGIS computing resources. Through this study, the set of software and computers available within TLSE to find an appropriate solver is enlarged over France (DIET) and Japan (AEGIS). (author)

  15. Iodine-131 induced hepatotoxicity in previously healthy patients with Grave's disease.

    Science.gov (United States)

    Jhummon, Navina Priya; Tohooloo, Bhavna; Qu, Shen

    2013-01-01

    To describe the association of the rare and serious complication of liver toxicity in previously healthy Grave's disease (GD) patients after treatment with radioactive iodine (131)I (RAI), we report the clinical, laboratory and pathologic findings of 2 cases of severe liver toxicity associated with RAI treatment in previously healthy patients with GD. Clinical examination and laboratory investigations excluded viral hepatitis, autoimmune hepatitis, granulomatous disease, primary biliary disease, extrahepatic biliary obstruction, and heart failure. Case 1: A previously healthy 52-year-old man presented with typical GD but, following RAI treatment, concomitantly developed severe liver toxicity that required 1 week of treatment in hospital. Case 2: A previously healthy 34-year-old woman presented with typical GD but developed jaundice following RAI treatment that required several weeks of in-hospital treatment in the hepato-biliary department. In both cases, the liver dysfunction resolved after intensive treatment with hepato-protective agents. In this report the therapeutic considerations as well as the pathogenetic possibilities are reviewed. To the best of our knowledge, this is the first description of the association observed, which is rare but may be severe and should be considered in any case of thyrotoxicosis where liver dysfunction develops after treatment with radioactive iodine (131)I.

  16. On the Development of a Computing Infrastructure that Facilitates IPPD from a Decision-Based Design Perspective

    Science.gov (United States)

    Hale, Mark A.; Craig, James I.; Mistree, Farrokh; Schrage, Daniel P.

    1995-01-01

    Integrated Product and Process Development (IPPD) embodies the simultaneous application of both system and quality engineering methods throughout an iterative design process. The use of IPPD results in the time-conscious, cost-saving development of engineering systems. Georgia Tech has proposed the development of an Integrated Design Engineering Simulator that will merge Integrated Product and Process Development with interdisciplinary analysis techniques and state-of-the-art computational technologies. To implement IPPD, a Decision-Based Design perspective is encapsulated in an approach that focuses on the role of the human designer in product development. The approach has two parts and is outlined in this paper. First, an architecture, called DREAMS, is being developed that facilitates design from a decision-based perspective. Second, a supporting computing infrastructure, called IMAGE, is being designed. The current status of development is given and future directions are outlined.

  17. Magnetic fusion energy and computers: the role of computing in magnetic fusion energy research and development

    International Nuclear Information System (INIS)

    1979-10-01

    This report examines the role of computing in the Department of Energy magnetic confinement fusion program. The present status of the MFECC and its associated network is described. The third part of this report examines the role of computer models in the main elements of the fusion program and discusses their dependence on the most advanced scientific computers. A review of requirements at the National MFE Computer Center was conducted in the spring of 1976. The results of this review led to the procurement of the CRAY-1, the most advanced scientific computer available, in the spring of 1978. The utilization of this computer in the MFE program has been very successful, and is also described in the third part of the report. A new study of computer requirements for the MFE program was conducted during the spring of 1979, and the results of this analysis are presented in the fourth part of this report

  18. Development of computed tomography instrument for college teaching

    International Nuclear Information System (INIS)

    Liu Fenglin; Lu Yanping; Wang Jue

    2006-01-01

    Computed tomography (CT), which uses penetrating radiation from many directions to reconstruct cross-sectional or 3D images of an object, has been widely applied in medical diagnosis and treatment and in industrial NDT and NDE. It is therefore valuable for college students to understand the fundamentals of CT. The authors describe the CD-50BG CT instrument developed for experimental teaching at colleges. With a 50 mm field-of-view and a translation-rotation scanning mode, the system uses a single plastic scintillator + photomultiplier detector and a ¹³⁷Cs radioactive source with 0.74 GBq activity, which is housed in a tungsten alloy shield. Image processing software has also been developed to process the acquired data, so that cross-sectional and 3D images can be reconstructed. High quality images with 1 lp·mm⁻¹ spatial resolution and 1% contrast sensitivity are obtained. So far in China, more than ten institutions including Tsinghua University and Peking University have already applied the system to elementary teaching. (authors)

  19. Computational chemistry

    Science.gov (United States)

    Arnold, J. O.

    1987-01-01

    With the advent of supercomputers and modern computational chemistry algorithms and codes, a powerful tool was created to help fill NASA's continuing need for information on the properties of matter in hostile or unusual environments. Computational resources provided under the National Aerodynamics Simulator (NAS) program were a cornerstone for recent advancements in this field. Properties of gases, materials, and their interactions can be determined from solutions of the governing equations. In the case of gases, for example, radiative transition probabilities per particle, bond-dissociation energies, and rates of simple chemical reactions can be determined computationally as reliably as from experiment. The data are proving to be quite valuable in providing inputs to real-gas flow simulation codes used to compute aerothermodynamic loads on NASA's aeroassist orbital transfer vehicles and to a host of problems related to the National Aerospace Plane Program. Although more approximate, similar solutions can be obtained for ensembles of atoms simulating small particles of materials, with and without the presence of gases. Computational chemistry also has applications in studying catalysis and the properties of polymers, all of interest to various NASA missions, including those previously mentioned. In addition to discussing these applications of computational chemistry within NASA, the governing equations and the need for supercomputers for their solution are outlined.

  20. Patient perspective: choosing or developing instruments.

    Science.gov (United States)

    Kirwan, John R; Fries, James F; Hewlett, Sarah; Osborne, Richard H

    2011-08-01

    Previous Outcome Measures in Rheumatology (OMERACT) meetings recognized that patients view outcomes of intervention from a different perspective. This preconference position paper briefly sets out 2 patient-reported outcome (PRO) instrument approaches, the PROMIS computer adaptive testing (CAT) system and the development of a rheumatoid arthritis-specific questionnaire to measure fatigue; a tentative proposal for a PRO instrument development pathway is also made.

  1. A dynamical-systems approach for computing ice-affected streamflow

    Science.gov (United States)

    Holtschlag, David J.

    1996-01-01

    A dynamical-systems approach was developed and evaluated for computing ice-affected streamflow. The approach provides for dynamic simulation and parameter estimation of site-specific equations relating ice effects to routinely measured environmental variables. Comparison indicates that results from the dynamical-systems approach ranked higher than results from 11 analytical methods previously investigated on the basis of accuracy and feasibility criteria. Additional research will likely lead to further improvements in the approach.

  2. Interpreting "Personality" Taxonomies: Why Previous Models Cannot Capture Individual-Specific Experiencing, Behaviour, Functioning and Development. Major Taxonomic Tasks Still Lay Ahead.

    Science.gov (United States)

    Uher, Jana

    2015-12-01

    As science seeks to make generalisations, a science of individual peculiarities encounters intricate challenges. This article explores these challenges by applying the Transdisciplinary Philosophy-of-Science Paradigm for Research on Individuals (TPS-Paradigm) and by exploring taxonomic "personality" research as an example. Analyses of researchers' interpretations of the taxonomic "personality" models, constructs and data that have been generated in the field reveal widespread erroneous assumptions about the abilities of previous methodologies to appropriately represent individual-specificity in the targeted phenomena. These assumptions, rooted in everyday thinking, fail to consider that individual-specificity and others' minds cannot be directly perceived, that abstract descriptions cannot serve as causal explanations, that between-individual structures cannot be isomorphic to within-individual structures, and that knowledge of compositional structures cannot explain the process structures of their functioning and development. These erroneous assumptions and serious methodological deficiencies in widely used standardised questionnaires have effectively prevented psychologists from establishing taxonomies that can comprehensively model individual-specificity in most of the kinds of phenomena explored as "personality", especially in experiencing and behaviour and in individuals' functioning and development. Contrary to previous assumptions, it is not universal models but rather different kinds of taxonomic models that are required for each of the different kinds of phenomena, variations and structures that are commonly conceived of as "personality". Consequently, to comprehensively explore individual-specificity, researchers have to apply a portfolio of complementary methodologies and develop different kinds of taxonomies, most of which have yet to be developed. In closing, the article derives some meta-desiderata for future research on individuals' "personality".

  3. Development of a Wireless Computer Vision Instrument to Detect Biotic Stress in Wheat

    Directory of Open Access Journals (Sweden)

    Joaquin J. Casanova

    2014-09-01

    Full Text Available Knowledge of crop abiotic and biotic stress is important for optimal irrigation management. While spectral reflectance and infrared thermometry provide a means to quantify crop stress remotely, these measurements can be cumbersome. Computer vision offers an inexpensive way to remotely detect crop stress independent of vegetation cover. This paper presents a technique using computer vision to detect disease stress in wheat. Digital images of differentially stressed wheat were segmented into soil and vegetation pixels using expectation maximization (EM). In the first season, the algorithm to segment vegetation from soil and distinguish between healthy and stressed wheat was developed and tested using digital images taken in the field and later processed on a desktop computer. In the second season, a wireless camera with near real-time computer vision capabilities was tested in conjunction with the conventional camera and desktop computer. For wheat irrigated at different levels and inoculated with wheat streak mosaic virus (WSMV), vegetation hue determined by the EM algorithm showed significant effects from irrigation level and infection. Unstressed wheat had a higher hue (118.32) than stressed wheat (111.34). In the second season, the hue and cover measured by the wireless computer vision sensor showed significant effects from infection (p = 0.0014), as did the conventional camera (p < 0.0001). Vegetation hue obtained through a wireless computer vision system in this study is a viable option for determining biotic crop stress in irrigation scheduling. Such a low-cost system could be suitable for use in the field in automated irrigation scheduling applications.
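    The segmentation step in the record above fits healthy and stressed hue populations with expectation maximization. The abstract does not give the paper's actual implementation, so the following two-component 1-D Gaussian mixture EM (plain Python; the initialization scheme and the hue values in the usage example are illustrative, not the paper's data) is only a hedged sketch of the idea:

```python
import math

def em_two_gaussians(data, iters=100):
    """Fit a two-component 1-D Gaussian mixture to scalar data by EM.

    Returns (means, stds, weights). A pixel's hue can then be assigned
    to the component whose responsibility for it is larger.
    """
    # Crude but serviceable initialization: anchor the components at the extremes.
    mu = [min(data), max(data)]
    spread = (max(data) - min(data)) / 4.0 or 1.0
    sigma = [spread, spread]
    pi = [0.5, 0.5]

    def pdf(x, m, s):
        return math.exp(-(x - m) ** 2 / (2.0 * s * s)) / (s * math.sqrt(2.0 * math.pi))

    for _ in range(iters):
        # E-step: responsibility of each component for each data point
        resp = []
        for x in data:
            p = [pi[k] * pdf(x, mu[k], sigma[k]) for k in (0, 1)]
            z = sum(p) or 1e-300
            resp.append([pk / z for pk in p])
        # M-step: re-estimate means, variances and mixing weights
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            sigma[k] = max(math.sqrt(var), 1e-3)  # floor to avoid collapse
            pi[k] = nk / len(data)
    return mu, sigma, pi
```

    With well-separated hue clusters, such as the roughly 111 versus 118 values reported in the abstract, the two fitted means land near the cluster averages.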

  4. Mathematical Language Development and Talk Types in Computer Supported Collaborative Learning Environments

    Science.gov (United States)

    Symons, Duncan; Pierce, Robyn

    2015-01-01

    In this study we examine the use of cumulative and exploratory talk types in a year 5 computer supported collaborative learning environment. The focus for students in this environment was to participate in mathematical problem solving, with the intention of developing the proficiencies of problem solving and reasoning. Findings suggest that…

  5. Multiparty Computations

    DEFF Research Database (Denmark)

    Dziembowski, Stefan

    In this thesis we study a problem of doing Verifiable Secret Sharing (VSS) and Multiparty Computations in a model where private channels between the players and a broadcast channel are available. The adversary is active, adaptive and has an unbounded computing power. The thesis is based on two ... Up to a polynomial time black-box reduction, the complexity of adaptively secure VSS is the same as that of ordinary secret sharing (SS), where security is only required against a passive, static adversary. Previously, such a connection was only known for linear secret sharing and VSS schemes. We then show ... here and discuss other problems caused by the adaptiveness. All protocols in the thesis are formally specified and the proofs of their security are given. [1] Ronald Cramer, Ivan Damgård, Stefan Dziembowski, Martin Hirt, and Tal Rabin. Efficient multiparty computations with dishonest minority.

  6. Plasma geometric optics analysis and computation

    International Nuclear Information System (INIS)

    Smith, T.M.

    1983-01-01

    Important practical applications in the generation, manipulation, and diagnosis of laboratory thermonuclear plasmas have created a need for elaborate computational capabilities in the study of high frequency wave propagation in plasmas. A reduced description of such waves suitable for digital computation is provided by the theory of plasma geometric optics. The existing theory is beset by a variety of special cases in which the straightforward analytical approach fails, and has been formulated with little attention to problems of numerical implementation of that analysis. The standard field equations are derived for the first time from kinetic theory. A discussion of certain terms previously, and erroneously, omitted from the expansion of the plasma constitutive relation is given. A powerful but little known computational prescription for determining the geometric optics field in the neighborhood of caustic singularities is rigorously developed, and a boundary layer analysis for the asymptotic matching of the plasma geometric optics field across caustic singularities is performed for the first time with considerable generality. A proper treatment of birefringence is detailed, wherein a breakdown of the fundamental perturbation theory is identified and circumvented. A general ray tracing computer code suitable for applications to radiation heating and diagnostic problems is presented and described
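    The ray tracing machinery described in this record follows from the Hamiltonian form of geometric optics. As a generic illustration (not the code from the report), a 1-D ray in an unmagnetized plasma with dispersion function D = c²k² + ωp²(x) − ω² obeys dx/dt = ∂D/∂k = 2c²k and dk/dt = −∂D/∂x = −dωp²/dx, which can be integrated directly:

```python
def trace_ray(x0, k0, dwp2_dx, c=1.0, dt=1e-3, steps=3000):
    """1-D geometric-optics ray in an unmagnetized plasma.

    Dispersion function D = c^2 k^2 + wp2(x) - w^2 gives the ray
    equations dx/dt = 2 c^2 k and dk/dt = -d(wp2)/dx. The ray turns
    around at the cutoff, where the plasma frequency reaches the
    wave frequency.
    """
    x, k = x0, k0
    xs = [x]
    for _ in range(steps):
        # semi-implicit Euler step: update momentum first, then position
        k -= dt * dwp2_dx(x)
        x += dt * 2.0 * c * c * k
        xs.append(x)
    return xs
```

    With a linearly increasing plasma-frequency profile, the wavenumber falls steadily along the ray until the cutoff, where the ray reflects — the turning-point behaviour that, in the full theory, gives rise to the caustic singularities the report analyzes.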

  7. Computational intelligence and neuromorphic computing potential for cybersecurity applications

    Science.gov (United States)

    Pino, Robinson E.; Shevenell, Michael J.; Cam, Hasan; Mouallem, Pierre; Shumaker, Justin L.; Edwards, Arthur H.

    2013-05-01

    In today's highly mobile, networked, and interconnected internet world, the flow and volume of information is overwhelming and continuously increasing. It is therefore believed that the next frontier in technological evolution and development will rely on our ability to develop intelligent systems that can help us process, analyze, and make sense of information autonomously, just as a well-trained and educated human expert does. In computational intelligence, neuromorphic computing promises to allow for the development of computing systems able to imitate natural neurobiological processes and form the foundation for intelligent system architectures.

  8. Computational and experimental investigation of local stress fiber orientation in uniaxially and biaxially constrained microtissues

    NARCIS (Netherlands)

    Obbink - Huizer, C.; Foolen, J.; Oomens, C.W.J.; Borochin, M.A.; Chen, C.S.; Bouten, C.V.C.; Baaijens, F.P.T.

    2014-01-01

    The orientation of cells and associated F-actin stress fibers is essential for proper tissue functioning. We have previously developed a computational model that qualitatively describes stress fiber orientation in response to a range of mechanical stimuli. In this paper, the aim is to quantitatively

  9. The study of Kruskal's and Prim's algorithms on the Multiple Instruction and Single Data stream computer system

    Directory of Open Access Journals (Sweden)

    A. Yu. Popov

    2015-01-01

    Full Text Available Bauman Moscow State Technical University is implementing a project to develop the operating principles of a computer system having a radically new architecture. A working model of the system allowed us to evaluate the efficiency of the developed hardware and software. The experimental results presented in previous studies, as well as the analysis of the operating principles of the new computer system, permit us to draw conclusions regarding its efficiency in solving discrete optimization problems related to the processing of sets. The new architecture is based on direct hardware support of the operations of discrete mathematics, which is reflected in the use of special facilities for processing sets and data structures. Within the framework of the project a special device was designed, a structure processor (SP), which improved performance without limiting the scope of applications of the computer system. Previous works presented the basic principles of the organization of the computational process in the MISD (Multiple Instructions, Single Data) system and showed the structure and features of the structure processor and the general principles of solving discrete optimization problems on graphs. This paper examines two search algorithms for the minimum spanning tree, namely Kruskal's and Prim's algorithms. It studies implementations of the algorithms for two SP operation modes: coprocessor mode and MISD mode. The paper presents results of an experimental comparison of the MISD system's performance in coprocessor mode with mainframes.
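    As a plain software baseline for the record above, Kruskal's algorithm in its conventional single-processor form is a few lines with a union-find (disjoint-set) structure. This is the textbook algorithm, not the structure-processor implementation studied in the paper:

```python
def kruskal(n, edges):
    """Minimum spanning tree via Kruskal's algorithm.

    n: number of vertices, labeled 0..n-1
    edges: list of (weight, u, v) tuples
    Returns (total_weight, list of chosen (u, v, weight) edges).
    """
    parent = list(range(n))

    def find(x):
        # Find the set representative, with path halving for speed.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total, chosen = 0, []
    for w, u, v in sorted(edges):          # consider edges by increasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                       # edge joins two distinct components
            parent[ru] = rv
            total += w
            chosen.append((u, v, w))
    return total, chosen
```

    Prim's algorithm reaches the same tree by growing a single component with a priority queue instead of sorting all edges up front; the paper's interest is in how both map onto hardware set operations.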

  10. Computational Science: Ensuring America's Competitiveness

    National Research Council Canada - National Science Library

    Reed, Daniel A; Bajcsy, Ruzena; Fernandez, Manuel A; Griffiths, Jose-Marie; Mott, Randall D; Dongarra, J. J; Johnson, Chris R; Inouye, Alan S; Miner, William; Matzke, Martha K; Ponick, Terry L

    2005-01-01

    ... previously deemed intractable. Yet, despite the great opportunities and needs, universities and the Federal government have not effectively recognized the strategic significance of computational science in either...

  11. Digital optical computers at the optoelectronic computing systems center

    Science.gov (United States)

    Jordan, Harry F.

    1991-01-01

    The Digital Optical Computing Program within the National Science Foundation Engineering Research Center for Opto-electronic Computing Systems has as its specific goal research on optical computing architectures suitable for use at the highest possible speeds. The program can be targeted toward exploiting the time domain because other programs in the Center are pursuing research on parallel optical systems, exploiting optical interconnection and optical devices and materials. Using a general purpose computing architecture as the focus, we are developing design techniques, tools and architecture for operation at the speed of light limit. Experimental work is being done with the somewhat low speed components currently available but with architectures which will scale up in speed as faster devices are developed. The design algorithms and tools developed for a general purpose, stored program computer are being applied to other systems such as optimally controlled optical communication networks.

  12. DEVELOPMENT OF QUARRY SOLUTION VERSION 1.0 FOR QUICK COMPUTATION OF DRILLING AND BLASTING PARAMETERS

    Directory of Open Access Journals (Sweden)

    B. ADEBAYO

    2014-10-01

    Full Text Available Computation of drilling cost, quantity of explosives and blasting cost are routine procedures in quarries, and all these parameters are estimated manually in most of the quarries in Nigeria. This paper deals with the development of the application package QUARRY SOLUTION Version 1.0 for quarries using Visual Basic 6.0. To achieve this, data on drilling and blasting activities were obtained from the quarry. Also, empirical formulae developed by different researchers were used for computation of the required parameters, viz: practical burden, spacing, length of hole, cost of drilling consumables, drilling cost, powder factor, quantity of column charge, total quantity of explosives, volume of blast and blasting cost. The output obtained from the software QUARRY SOLUTION Version 1.0 for length of drilling, drilling cost, total quantity of explosives, volume of blast and blasting cost was compared with the manually computed results for these routine parameters estimated during drilling and blasting operations in the quarry and was found to follow the same trend. The computation from the application package revealed that 611 blast-holes require 3326.71 kg of high explosives (166 cartons of explosives) and 20147.2 kg of low explosives (806 bags of explosives). The total cost was computed to be N 5,133,999.50 ($32,087.49). Moreover, the output showed that these routine drilling and blasting parameters could be computed within a short time frame using QUARRY SOLUTION, thereby improving productivity and efficiency. This application package is recommended for use in open pits and quarries when all necessary inputs are supplied.
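    The abstract lists the routine quantities the package computes (burden, spacing, total explosives, volume of blast, powder factor) but not its empirical formulae. The sketch below shows only how such quantities typically relate to one another; the coefficients (a burden of 25 hole diameters, spacing at 1.15 times burden) are illustrative placeholders, not values from QUARRY SOLUTION Version 1.0:

```python
def blast_parameters(hole_diameter_mm, bench_height_m, hole_count,
                     charge_per_hole_kg, k_burden=0.025, spacing_ratio=1.15):
    """Illustrative drilling-and-blasting quantities for a quarry bench.

    k_burden and spacing_ratio are placeholder rule-of-thumb values,
    not the empirical formulae used by the package in the record.
    """
    burden = k_burden * hole_diameter_mm                     # m
    spacing = spacing_ratio * burden                         # m
    volume = burden * spacing * bench_height_m * hole_count  # m^3 of blasted rock
    explosives = charge_per_hole_kg * hole_count             # kg
    powder_factor = explosives / volume                      # kg per m^3
    return {"burden_m": burden, "spacing_m": spacing,
            "volume_m3": volume, "explosives_kg": explosives,
            "powder_factor_kg_per_m3": powder_factor}
```

    For example, 10 holes of 100 mm diameter on a 10 m bench with 50 kg of charge per hole give a powder factor of about 0.7 kg/m³ under these placeholder coefficients.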

  13. New computational paradigms changing conceptions of what is computable

    CERN Document Server

    Cooper, SB; Sorbi, Andrea

    2007-01-01

    This superb exposition of a complex subject examines new developments in the theory and practice of computation from a mathematical perspective. It covers topics ranging from classical computability to complexity, from biocomputing to quantum computing.

  14. Developing Face-to-Face Argumentation Skills: Does Arguing on the Computer Help?

    Science.gov (United States)

    Iordanou, Kalypso

    2013-01-01

    Arguing on the computer was used as a method to promote development of face-to-face argumentation skills in middle schoolers. In the study presented, sixth graders engaged in electronic dialogues with peers on a controversial topic and in some reflective activities based on transcriptions of the dialogues. Although participants initially exhibited…

  15. Debunking the Computer Science Digital Library: Lessons Learned in Collection Development at Seneca College of Applied Arts & Technology

    Science.gov (United States)

    Buczynski, James Andrew

    2005-01-01

    Developing a library collection to support the curriculum of Canada's largest computer studies school has debunked many myths about collecting computer science and technology information resources. Computer science students are among the heaviest print book and e-book users in the library. Circulation statistics indicate that the demand for print…

  16. Computational Models for Calcium-Mediated Astrocyte Functions

    Directory of Open Access Journals (Sweden)

    Tiina Manninen

    2018-04-01

    Full Text Available The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro, but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. 
Furthermore, only a few of the models are available online, which makes it difficult to reproduce the simulation results and further develop the models.

  17. Computational Models for Calcium-Mediated Astrocyte Functions.

    Science.gov (United States)

    Manninen, Tiina; Havela, Riikka; Linne, Marja-Leena

    2018-01-01

    The computational neuroscience field has heavily concentrated on the modeling of neuronal functions, largely ignoring other brain cells, including one type of glial cell, the astrocytes. Despite the short history of modeling astrocytic functions, we were delighted about the hundreds of models developed so far to study the role of astrocytes, most often in calcium dynamics, synchronization, information transfer, and plasticity in vitro, but also in vascular events, hyperexcitability, and homeostasis. Our goal here is to present the state-of-the-art in computational modeling of astrocytes in order to facilitate better understanding of the functions and dynamics of astrocytes in the brain. Due to the large number of models, we concentrated on a hundred models that include biophysical descriptions for calcium signaling and dynamics in astrocytes. We categorized the models into four groups: single astrocyte models, astrocyte network models, neuron-astrocyte synapse models, and neuron-astrocyte network models to ease their use in future modeling projects. We characterized the models based on which earlier models were used for building the models and which type of biological entities were described in the astrocyte models. Features of the models were compared and contrasted so that similarities and differences were more readily apparent. We discovered that most of the models were basically generated from a small set of previously published models with small variations. However, neither citations to all the previous models with similar core structure nor explanations of what was built on top of the previous models were provided, which made it possible, in some cases, to have the same models published several times without an explicit intention to make new predictions about the roles of astrocytes in brain functions. Furthermore, only a few of the models are available online, which makes it difficult to reproduce the simulation results and further develop the models. Thus

  18. Cloud Computing in Higher Education Sector for Sustainable Development

    Science.gov (United States)

    Duan, Yuchao

    2016-01-01

    Cloud computing is considered a new frontier in the field of computing, as this technology comprises three major entities namely: software, hardware and network. The collective nature of all these entities is known as the Cloud. This research aims to examine the impacts of various aspects namely: cloud computing, sustainability, performance…

  19. MoPCoM Methodology: Focus on Models of Computation

    Science.gov (United States)

    Koudri, Ali; Champeau, Joël; Le Lann, Jean-Christophe; Leilde, Vincent

    Today, developments of Real Time Embedded Systems have to face new challenges. On the one hand, Time-To-Market constraints require a reliable development process allowing quick design space exploration. On the other hand, rapidly developing technology, as stated by Moore's law, requires techniques to handle the resulting productivity gap. In a previous paper, we presented our Model Based Engineering methodology addressing those issues. In this paper, we focus on the design and analysis of Models of Computation. We illustrate our approach on a Cognitive Radio System development implemented on an FPGA. This work is part of the MoPCoM research project gathering academic and industrial organizations (http://www.mopcom.fr).

  20. The development of remote teaching laboratory access software for multi-slice computed optical tomography for use in undergraduate nuclear education

    International Nuclear Information System (INIS)

    Price, T.J.; Nichita, E.

    2013-01-01

    Internet-based laboratory exercises were developed for a course on biomedical imaging at the University of Ontario Institute of Technology. These exercises used a multi-slice computed optical tomography machine named DeskCAT to instruct students on the principles of computed tomography. User management software was developed which enabled course instructors to quickly set up a computer to accept a series of scheduled remote user connections for a classroom. Laboratory exercises using the DeskCAT machine were developed to be conducted remotely. (author)

  1. The development of remote teaching laboratory access software for multi-slice computed optical tomography for use in undergraduate nuclear education

    Energy Technology Data Exchange (ETDEWEB)

    Price, T.J.; Nichita, E., E-mail: Terry.Price@gmail.com [Univ. of Ontario Inst. of Technology, Oshawa, Ontario (Canada)

    2013-07-01

    Internet-based laboratory exercises were developed for a course on biomedical imaging at the University of Ontario Institute of Technology. These exercises used a multi-slice computed optical tomography machine named DeskCAT to instruct students on the principles of computed tomography. User management software was developed which enabled course instructors to quickly set up a computer to accept a series of scheduled remote user connections for a classroom. Laboratory exercises using the DeskCAT machine were developed to be conducted remotely. (author)

  2. Computer software.

    Science.gov (United States)

    Rosenthal, L E

    1986-10-01

    Software is the component in a computer system that permits the hardware to perform the various functions that a computer system is capable of doing. The history of software and its development can be traced to the early nineteenth century. All computer systems are designed to utilize the "stored program concept" as first developed by Charles Babbage in the 1850s. The concept was lost until the mid-1940s, when modern computers made their appearance. Today, because of the complex and myriad tasks that a computer system can perform, there has been a differentiation of types of software. There is software designed to perform specific business applications. There is software that controls the overall operation of a computer system. And there is software that is designed to carry out specialized tasks. Regardless of type, software is the most critical component of any computer system. Without it, all one has is a collection of circuits, transistors, and silicon chips.

  3. Reasoning with Previous Decisions: Beyond the Doctrine of Precedent

    DEFF Research Database (Denmark)

    Komárek, Jan

    2013-01-01

    ... law method’, but they are no less rational and intellectually sophisticated. The reason for the rather conceited attitude of some comparatists is in the dominance of the common law paradigm of precedent and the accompanying ‘case law method’. If we want to understand how courts and lawyers in different jurisdictions use previous judicial decisions in their argument, we need to move beyond the concept of precedent to a wider notion, which would embrace practices and theories in legal systems outside the Common law tradition. This article presents the concept of ‘reasoning with previous decisions’ as such an alternative and develops its basic models. The article first points out several shortcomings inherent in limiting the inquiry into reasoning with previous decisions by the common law paradigm (1). On the basis of numerous examples provided in section (1), I will present two basic models of reasoning ...

  4. Consolidation of cloud computing in ATLAS

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00224309; The ATLAS collaboration; Cordeiro, Cristovao; Hover, John; Kouba, Tomas; Love, Peter; Mcnab, Andrew; Schovancova, Jaroslava; Sobie, Randall; Giordano, Domenico

    2017-01-01

    Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in resp...

  5. Framework for Computation Offloading in Mobile Cloud Computing

    Directory of Open Access Journals (Sweden)

    Dejan Kovachev

    2012-12-01

    Full Text Available The inherently limited processing power and battery lifetime of mobile phones hinder the execution of computationally intensive applications like content-based video analysis or 3D modeling. Offloading computationally intensive application parts from the mobile platform into a remote cloud infrastructure or nearby idle computers addresses this problem. This paper presents our Mobile Augmentation Cloud Services (MACS) middleware, which enables adaptive extension of Android application execution from a mobile client into the cloud. Applications are developed using the standard Android development pattern. The middleware does the heavy lifting of adaptive application partitioning, resource monitoring and computation offloading. These elastic mobile applications can run as usual mobile applications, but they can also use remote computing resources transparently. Two prototype applications using the MACS middleware demonstrate the benefits of the approach. The evaluation shows that applications which involve costly computations can benefit from offloading, with around 95% energy savings and significant performance gains compared to local execution only.
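    The energy savings reported in the record come from a trade-off that a first-order model makes explicit: offloading pays off when the energy spent transferring the data costs less than the energy of computing locally. The decision rule below is a hedged sketch of that trade-off, not the MACS middleware's actual partitioning policy (which the abstract does not detail); all parameter values are illustrative:

```python
def should_offload(cycles, data_bytes,
                   local_speed_hz, local_power_w,
                   bandwidth_bps, radio_power_w):
    """First-order offloading decision for one application part.

    Offload iff the estimated energy to transfer the input/output data
    is lower than the energy of executing the part locally.
    Returns (decision, local_energy_J, offload_energy_J).
    """
    e_local = (cycles / local_speed_hz) * local_power_w    # J to compute on the phone
    transfer_time = (data_bytes * 8) / bandwidth_bps       # s on the radio link
    e_offload = transfer_time * radio_power_w              # J to ship the data
    return e_offload < e_local, e_local, e_offload
```

    Under this rule a compute-heavy task with a small payload offloads, while a trivial task with a large payload runs locally — matching the intuition that video analysis and 3D modeling are good offloading candidates.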

  6. A Development Architecture for Serious Games Using BCI (Brain Computer Interface Sensors

    Directory of Open Access Journals (Sweden)

    Kyhyun Um

    2012-11-01

    Full Text Available Games that use brainwaves, via brain–computer interface (BCI) devices, to improve brain functions are known as BCI serious games. Due to the difficulty of developing BCI serious games, various BCI engines and authoring tools are required, and these reduce the development time and cost. However, it is desirable to reduce the amount of technical knowledge of brain functions and BCI devices needed by game developers. Moreover, a systematic BCI serious game development process is required. In this paper, we present a methodology for the development of BCI serious games. We describe an architecture, authoring tools, and development process of the proposed methodology, and apply it to a game development approach for patients with mild cognitive impairment as an example. This application demonstrates that BCI serious games can be developed on the basis of expert-verified theories.

  7. A development architecture for serious games using BCI (brain computer interface) sensors.

    Science.gov (United States)

    Sung, Yunsick; Cho, Kyungeun; Um, Kyhyun

    2012-11-12

    Games that use brainwaves via brain-computer interface (BCI) devices, to improve brain functions are known as BCI serious games. Due to the difficulty of developing BCI serious games, various BCI engines and authoring tools are required, and these reduce the development time and cost. However, it is desirable to reduce the amount of technical knowledge of brain functions and BCI devices needed by game developers. Moreover, a systematic BCI serious game development process is required. In this paper, we present a methodology for the development of BCI serious games. We describe an architecture, authoring tools, and development process of the proposed methodology, and apply it to a game development approach for patients with mild cognitive impairment as an example. This application demonstrates that BCI serious games can be developed on the basis of expert-verified theories.

  8. A Development Architecture for Serious Games Using BCI (Brain Computer Interface) Sensors

    Science.gov (United States)

    Sung, Yunsick; Cho, Kyungeun; Um, Kyhyun

    2012-01-01

    Games that use brainwaves via brain–computer interface (BCI) devices, to improve brain functions are known as BCI serious games. Due to the difficulty of developing BCI serious games, various BCI engines and authoring tools are required, and these reduce the development time and cost. However, it is desirable to reduce the amount of technical knowledge of brain functions and BCI devices needed by game developers. Moreover, a systematic BCI serious game development process is required. In this paper, we present a methodology for the development of BCI serious games. We describe an architecture, authoring tools, and development process of the proposed methodology, and apply it to a game development approach for patients with mild cognitive impairment as an example. This application demonstrates that BCI serious games can be developed on the basis of expert-verified theories. PMID:23202227

  9. Development of Computer-Based Resources for Textile Education.

    Science.gov (United States)

    Hopkins, Teresa; Thomas, Andrew; Bailey, Mike

    1998-01-01

    Describes the production of computer-based resources for students of textiles and engineering in the United Kingdom. Highlights include funding by the Teaching and Learning Technology Programme (TLTP), courseware author/subject expert interaction, usage test and evaluation, authoring software, graphics, computer-aided design simulation, self-test…

  10. New Developments and Geoscience Applications of Synchrotron Computed Microtomography (Invited)

    Science.gov (United States)

    Rivers, M. L.; Wang, Y.; Newville, M.; Sutton, S. R.; Yu, T.; Lanzirotti, A.

    2013-12-01

    Computed microtomography is the extension to micron spatial resolution of the CAT scanning technique developed for medical imaging. Synchrotron sources are ideal for the method, since they provide a monochromatic, parallel beam with high intensity. High-energy storage rings such as the Advanced Photon Source at Argonne National Laboratory produce x-rays with high energy, high brilliance, and high coherence. All of these factors combine to produce an extremely powerful imaging tool for earth science research. Techniques that have been developed include:
    - Absorption and phase-contrast computed tomography with spatial resolution below one micron.
    - Differential-contrast computed tomography, imaging above and below the absorption edge of a particular element.
    - High-pressure tomography, imaging inside a pressure cell at pressures above 10 GPa.
    - High-speed radiography and tomography, with 100 microsecond temporal resolution.
    - Fluorescence tomography, imaging the 3-D distribution of elements present at ppm concentrations.
    - Radiographic strain measurements during deformation at high confining pressure, combined with precise x-ray diffraction measurements to determine stress.
    These techniques have been applied to important problems in earth and environmental sciences, including:
    - The 3-D distribution of aqueous and organic liquids in porous media, with applications in contaminated groundwater and petroleum recovery.
    - The kinetics of bubble formation in magma chambers, which control explosive volcanism.
    - Studies of the evolution of the early solar system from 3-D textures in meteorites.
    - Accurate crystal size distributions in volcanic systems, important for understanding the evolution of magma chambers.
    - The equation of state of amorphous materials at high pressure, using both direct measurements of volume as a function of pressure and measurements of the change in x-ray absorption coefficient as a function of pressure.
    - The location and chemical speciation of toxic

  11. CMS Distributed Computing Integration in the LHC sustained operations era

    International Nuclear Information System (INIS)

    Grandi, C; Bonacorsi, D; Bockelman, B; Fisk, I

    2011-01-01

    After many years of preparation the CMS computing system has reached a situation where stability in operations limits the possibility to introduce innovative features. Nevertheless, it is this same need for stability and smooth operations that requires the introduction of features that were considered non-strategic in the previous phases. Examples are: adequate authorization to control and prioritize the access to storage and computing resources; improved monitoring to investigate problems and identify bottlenecks on the infrastructure; increased automation to reduce the manpower needed for operations; and an effective process to deploy new releases of the software tools in production. We present the work of the CMS Distributed Computing Integration Activity, which is responsible for providing a liaison between the CMS distributed computing infrastructure and the software providers, both internal and external to CMS. In particular, we describe the introduction of new middleware features during the last 18 months as well as the requirements for Grid and Cloud software developers for the future.

  12. Computational techniques used in the development of coprocessing flowsheets

    International Nuclear Information System (INIS)

    Groenier, W.S.; Mitchell, A.D.; Jubin, R.T.

    1979-01-01

    The computer program SEPHIS, developed to aid in determining optimum solvent extraction conditions for the reprocessing of nuclear power reactor fuels by the Purex method, is described. The program employs a combination of approximate mathematical equilibrium expressions and a transient, stagewise-process calculational method to allow stage and product-stream concentrations to be predicted with accuracy and reliability. The possible applications to inventory control for nuclear material safeguards, nuclear criticality analysis, and process analysis and control are of special interest. The method is also applicable to other countercurrent liquid-liquid solvent extraction processes that have known chemical kinetics, may involve multiple solutes, and are performed in conventional contacting equipment.

  13. Image analysis and modeling in medical image computing. Recent developments and advances.

    Science.gov (United States)

    Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T

    2012-01-01

    Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice e.g. to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the grade of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements in clinical routine. In the focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models in the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body

  14. Applications and issues in automotive computational aeroacoustics

    International Nuclear Information System (INIS)

    Karbon, K.J.; Kumarasamy, S.; Singh, R.

    2002-01-01

    Automotive aeroacoustics is the noise generated due to the airflow around a moving vehicle. Previously regarded as a minor contributor, wind noise is now recognized as one of the dominant vehicle sound sources, since significant progress has been made in suppressing engine and tire noise. Currently, almost all aeroacoustic development work is performed experimentally on a full-scale vehicle in the wind tunnel. Any reduction in hardware models is recognized as one of the major enablers to quickly bring the vehicle to market. In addition, prediction of noise sources and characteristics at the early stages of vehicle design will help in reducing the costly fixes at the later stages. However, predictive methods such as Computational Fluid Dynamics (CFD) and Computational Aeroacoustics (CAA) are still under development and are not considered mainstream design tools. This paper presents some initial applications and findings of CFD and CAA analysis towards vehicle aeroacoustics. Transient Reynolds Averaged Navier Stokes (RANS) and Lighthill-Curle methods are used to model low frequency buffeting and high frequency wind rush noise. Benefits and limitations of the approaches are described. (author)

  15. Virtual photons in imaginary time: Computing exact Casimir forces via standard numerical electromagnetism techniques

    International Nuclear Information System (INIS)

    Rodriguez, Alejandro; Ibanescu, Mihai; Joannopoulos, J. D.; Johnson, Steven G.; Iannuzzi, Davide

    2007-01-01

    We describe a numerical method to compute Casimir forces in arbitrary geometries, for arbitrary dielectric and metallic materials, with arbitrary accuracy (given sufficient computational resources). Our approach, based on well-established integration of the mean stress tensor evaluated via the fluctuation-dissipation theorem, is designed to directly exploit fast methods developed for classical computational electromagnetism, since it only involves repeated evaluation of the Green's function for imaginary frequencies (equivalently, real frequencies in imaginary time). We develop the approach by systematically examining various formulations of Casimir forces from the previous decades and evaluating them according to their suitability for numerical computation. We illustrate our approach with a simple finite-difference frequency-domain implementation, test it for known geometries such as a cylinder and a plate, and apply it to new geometries. In particular, we show that a pistonlike geometry of two squares sliding between metal walls, in both two and three dimensions with both perfect and realistic metallic materials, exhibits a surprising nonmonotonic "lateral" force from the walls.

  16. Development of posture-specific computational phantoms using motion capture technology and application to radiation dose-reconstruction for the 1999 Tokai-Mura nuclear criticality accident

    International Nuclear Information System (INIS)

    Vazquez, Justin A; Caracappa, Peter F; Xu, X George

    2014-01-01

    The majority of existing computational phantoms are designed to represent workers in typical standing anatomical postures with fixed arm and leg positions. However, workers found in accident-related scenarios often assume varied postures. This paper describes the development and application of two phantoms with adjusted postures specified by data acquired from a motion capture system, used to simulate the unique human postures found in a 1999 criticality accident that took place at a JCO facility in Tokai-Mura, Japan. In the course of this accident, two workers were fatally exposed to extremely high levels of radiation. Implementation of the emergent techniques discussed produced more accurate and more detailed dose estimates for the two workers than were reported in previous studies. Total-body doses of 6.43 and 26.38 Gy were estimated for the two workers, who assumed a crouching and a standing posture, respectively. Additionally, organ-specific dose estimates were determined, including a 7.93 Gy dose to the thyroid and a 6.11 Gy dose to the stomach for the crouching worker, and a 41.71 Gy dose to the liver and a 37.26 Gy dose to the stomach for the standing worker. Implications for the medical prognosis of the workers are discussed, and the results of this study were found to correlate better with the patient outcomes than previous estimates, suggesting potential future applications of such methods for improved epidemiological studies involving next-generation computational phantom tools. (paper)

  17. Theory, Modeling, Software and Hardware Development for Analytical and Computational Materials Science

    Science.gov (United States)

    Young, Gerald W.; Clemons, Curtis B.

    2004-01-01

    The focus of this Cooperative Agreement between the Computational Materials Laboratory (CML) of the Processing Science and Technology Branch of the NASA Glenn Research Center (GRC) and the Department of Theoretical and Applied Mathematics at The University of Akron was in the areas of system development of the CML workstation environment, modeling of microgravity and earth-based material processing systems, and joint activities in laboratory projects. These efforts complement each other as the majority of the modeling work involves numerical computations to support laboratory investigations. Coordination and interaction between the modelers, system analysts, and laboratory personnel are essential toward providing the most effective simulations and communication of the simulation results. Toward these means, The University of Akron personnel involved in the agreement worked at the Applied Mathematics Research Laboratory (AMRL) in the Department of Theoretical and Applied Mathematics while maintaining a close relationship with the personnel of the Computational Materials Laboratory at GRC. Network communication between both sites has been established. A summary of the projects we undertook during the time period 9/1/03 - 6/30/04 is included.

  18. Development of tight-binding, chemical-reaction-dynamics simulator for combinatorial computational chemistry

    International Nuclear Information System (INIS)

    Kubo, Momoji; Ando, Minako; Sakahara, Satoshi; Jung, Changho; Seki, Kotaro; Kusagaya, Tomonori; Endou, Akira; Takami, Seiichi; Imamura, Akira; Miyamoto, Akira

    2004-01-01

    Recently, we have proposed a new concept called 'combinatorial computational chemistry' to realize theoretical, high-throughput screening of catalysts and materials. We have already applied our combinatorial computational-chemistry approach, mainly based on static first-principles calculations, to various catalyst and material systems, and its applicability to catalyst and materials design was strongly confirmed. In order to realize more effective and efficient combinatorial computational-chemistry screening, a high-speed chemical-reaction-dynamics simulator based on a quantum-chemical molecular-dynamics method is essential. However, to the best of our knowledge, there is no chemical-reaction-dynamics simulator fast enough to perform such high-throughput screening. In the present study, we have succeeded in developing a chemical-reaction-dynamics simulator based on our original tight-binding quantum-chemical molecular-dynamics method, which is more than 5000 times faster than a regular first-principles molecular-dynamics method. Moreover, its applicability and effectiveness for the atomistic clarification of methanol-synthesis dynamics at reaction temperature were demonstrated.

  19. Development of a multimaterial, two-dimensional, arbitrary Lagrangian-Eulerian mesh computer program

    International Nuclear Information System (INIS)

    Barton, R.T.

    1982-01-01

    We have developed a large, multimaterial, two-dimensional Arbitrary Lagrangian-Eulerian (ALE) computer program. The special feature of an ALE mesh is that it can be either an embedded Lagrangian mesh, a fixed Eulerian mesh, or a partially embedded, partially remapped mesh. Remapping is used to remove Lagrangian mesh distortion. This general purpose program has been used for astrophysical modeling, under the guidance of James R. Wilson. The rationale behind the development of this program will be used to highlight several important issues in program design

  20. Current algorithms used in reactor safety codes and the impact of future computer development on these algorithms

    International Nuclear Information System (INIS)

    Mahaffy, J.H.; Liles, D.R.; Woodruff, S.B.

    1985-01-01

    Computational methods and solution procedures used in the US Nuclear Regulatory Commission's reactor safety systems codes, Transient Reactor Analysis Code (TRAC) and Reactor Leak and Power Safety Excursion Code (RELAP), are reviewed. Methods used in TRAC-PF1/MOD1, including the stability-enhancing two-step (SETS) technique, which permits fast computations by allowing time steps larger than the material Courant stability limit, are described in detail, and the differences from RELAP5/MOD2 are noted. Developments in computing, including parallel and vector processing, and their applicability to nuclear reactor safety codes are described. These developments, coupled with appropriate numerical methods, make detailed faster-than-real-time reactor safety analysis a realistic near-term possibility
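
The material Courant limit the abstract mentions bounds an explicit scheme's time step by the time a fluid parcel needs to cross one mesh cell; semi-implicit techniques such as SETS relax that bound, which is what permits the larger time steps. A minimal sketch of the limit itself (the cell size and flow speed below are illustrative, not values from TRAC or RELAP):

```python
# Material Courant limit: the largest stable explicit time step is the
# cell-transit time dx/|u|. Numbers are illustrative assumptions only.

def courant_dt(dx_m, velocity_m_s):
    """Largest stable explicit time step for cell size dx and speed u."""
    return dx_m / abs(velocity_m_s)

dx = 0.1     # 10 cm mesh cells
u = 10.0     # 10 m/s coolant flow
print(courant_dt(dx, u))  # → 0.01
```

An explicit method on this mesh must step in at most 10 ms, whereas a SETS-like semi-implicit scheme may take substantially larger steps, which is the source of the speedup described above.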

  1. Computational Intelligence, Cyber Security and Computational Models

    CERN Document Server

    Anitha, R; Lekshmi, R; Kumar, M; Bonato, Anthony; Graña, Manuel

    2014-01-01

    This book contains cutting-edge research material presented by researchers, engineers, developers, and practitioners from academia and industry at the International Conference on Computational Intelligence, Cyber Security and Computational Models (ICC3) organized by PSG College of Technology, Coimbatore, India during December 19–21, 2013. The materials in the book include theory and applications for design, analysis, and modeling of computational intelligence and security. The book will be useful material for students, researchers, professionals, and academicians. It will help in understanding current research trends and findings and future scope of research in computational intelligence, cyber security, and computational models.

  2. On the impact of quantum computing technology on future developments in high-performance scientific computing

    OpenAIRE

    Möller, Matthias; Vuik, Cornelis

    2017-01-01

    Quantum computing technologies have become a hot topic in academia and industry receiving much attention and financial support from all sides. Building a quantum computer that can be used practically is in itself an outstanding challenge that has become the ‘new race to the moon’. Next to researchers and vendors of future computing technologies, national authorities are showing strong interest in maturing this technology due to its known potential to break many of today’s encryption technique...

  3. Development of Computer-Based Training to Supplement Lessons in Fundamentals of Electronics

    Directory of Open Access Journals (Sweden)

    Ian P. Benitez

    2016-05-01

    Full Text Available Teaching Fundamentals of Electronics allows students to familiarize themselves with basic electronics concepts, acquire skills in the use of the multi-meter test instrument, and develop mastery in testing basic electronic components. Actual teaching and observations during practical activities on component pin identification and testing showed that new students' lack of skill in testing components can lead to incorrect fault diagnosis and wrong pin connections during in-circuit replacement of defective parts. With the aim of reinforcing students with a concrete understanding of the concepts of components applied in actual test and measurement, a Computer-Based Training was developed. The proponent developed the learning modules (courseware) utilizing concept-mapping and storyboarding instructional design. Developing a courseware as simulated, activity-based, and interactive as possible was the primary goal, to resemble the real-world process. A local area network (LAN)-based learning management system was also developed for administering the learning modules. The Paired Sample T-Test, based on the pretest and post-test results, was used to determine whether the students achieved learning after taking the courseware. The result revealed a significant achievement by the students after studying the learning module. The e-learning content was validated by the instructors in terms of contents, activities, assessment, and format, with a grand weighted mean of 4.35, interpreted as Sufficient. Based on the evaluation result, supplementing lessons with the proposed Computer-Based Training can enhance the teaching-learning process in electronics fundamentals.

  4. Early Childhood Teacher Candidates\\' Attitudes towards Computer and Computer Assisted Instruction

    OpenAIRE

    Oğuz, Evrim; Ellez, A. Murat; Akamca, Güzin Özyılmaz; Kesercioğlu, Teoman İ.; Girgin, Günseli

    2011-01-01

    The aim of this research is to evaluate preschool teacher candidates' attitudes towards computers and attitudes towards the use of computer-assisted instruction. The sample of this study includes 481 early childhood education students who attended Dokuz Eylül University's department of Early Childhood Education. Data were collected by using the “Scale of Computer Assisted Instruction Attitudes” developed by Arslan (2006), the “Computer Attitudes Scale” developed by Çelik & Bindak (2005), and the “General Info...

  5. Teaching Web Application Development: A Case Study in a Computer Science Course

    Science.gov (United States)

    Del Fabro, Marcos Didonet; de Alimeda, Eduardo Cunha; Sluzarski, Fabiano

    2012-01-01

    Teaching web development in Computer Science undergraduate courses is a difficult task. Often, there is a gap between the students' experiences and the reality in the industry. As a consequence, the students are not always well-prepared once they get the degree. This gap is due to several reasons, such as the complexity of the assignments, the…

  6. Evolution of the Milieu Approach for Software Development for the Polymorphous Computing Architecture Program

    National Research Council Canada - National Science Library

    Dandass, Yoginder

    2004-01-01

    A key goal of the DARPA Polymorphous Computing Architectures (PCA) program is to develop reactive closed-loop systems that are capable of being dynamically reconfigured in order to respond to changing mission scenarios...

  7. Global Software Development with Cloud Platforms

    Science.gov (United States)

    Yara, Pavan; Ramachandran, Ramaseshan; Balasubramanian, Gayathri; Muthuswamy, Karthik; Chandrasekar, Divya

    Offshore and outsourced distributed software development models and processes are facing challenges, previously unknown, with respect to computing capacity, bandwidth, storage, security, complexity, reliability, and business uncertainty. Clouds promise to address these challenges by adopting recent advances in virtualization, parallel and distributed systems, utility computing, and software services. In this paper, we envision a cloud-based platform that addresses some of these core problems. We outline a generic cloud architecture, its design, and our first implementation results for three cloud forms - a compute cloud, a storage cloud, and a cloud-based software service - in the context of global distributed software development (GSD). Our "compute cloud" provides computational services such as continuous code integration and a compile server farm, the "storage cloud" offers storage (block or file-based) services with an on-line virtual storage service, whereas the on-line virtual labs represent a useful cloud service. We note some of the use cases for clouds in GSD and the lessons learned with our prototypes, and identify challenges that must be conquered before realizing the full business benefits. We believe that in the future, software practitioners will focus more on these cloud computing platforms and see clouds as a means to support an ecosystem of clients, developers, and other key stakeholders.

  8. DCA++: A case for science driven application development for leadership computing platforms

    International Nuclear Information System (INIS)

    Summers, Michael S; Alvarez, Gonzalo; Meredith, Jeremy; Maier, Thomas A; Schulthess, Thomas C

    2009-01-01

    The DCA++ code was one of the early science applications to run on Jaguar at the National Center for Computational Sciences, and the first application code to sustain a petaflop/s under production conditions on a general-purpose supercomputer. The code implements a quantum cluster method with a quantum Monte Carlo kernel to solve the 2D Hubbard model for high-temperature superconductivity. It is implemented in C++, making heavy use of the generic programming model. In this paper, we discuss how this code was developed, reaching scalability and high efficiency on the world's fastest supercomputer in only a few years. We show how the use of generic concepts combined with systematic refactoring of codes is a better strategy for computational sciences than a comprehensive upfront design.

  9. Development of computer model for radionuclide released from shallow-land disposal facility

    International Nuclear Information System (INIS)

    Suganda, D.; Sucipta; Sastrowardoyo, P.B.; Eriendi

    1998-01-01

    Development of a 1-dimensional computer model for radionuclide release from a shallow land disposal facility (SLDF) has been carried out. This computer model is used for the SLDF facility at PPTA Serpong. The SLDF facility is 1.8 metres above the groundwater and 150 metres from the Cisalak river. An implicit finite-difference numerical method is chosen to predict the migration of radionuclides at any concentration. The migration starts vertically from the bottom of the SLDF down to the groundwater layer, then proceeds horizontally in the groundwater to the critical population group. The radionuclide Cs-137 is chosen as a sample to trace its migration. The result of the assessment shows that the SLDF facility at PPTA Serpong satisfies the safety criteria with a high margin. (author)
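
An implicit finite-difference step for 1-D transport of the kind the abstract describes leads to a tridiagonal linear system per time step. The sketch below shows one backward-Euler diffusion step solved with the Thomas algorithm; the grid size, diffusivity, and unit Cs-137 source value are illustrative assumptions, not parameters of the actual PPTA Serpong model.

```python
# One implicit (backward-Euler) time step of 1-D diffusive transport.
# Grid, diffusivity, and source value are illustrative assumptions.

def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system: a = sub-, b = main, c = super-diagonal."""
    n = len(b)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def implicit_diffusion_step(conc, D, dx, dt):
    """Advance node concentrations one time step; boundary nodes held fixed."""
    n = len(conc)
    r = D * dt / dx ** 2
    a = [-r] * n; b = [1 + 2 * r] * n; c = [-r] * n
    a[0] = 0.0; b[0] = 1.0; c[0] = 0.0        # fixed-concentration source side
    a[-1] = 0.0; b[-1] = 1.0; c[-1] = 0.0     # fixed far boundary
    return thomas_solve(a, b, c, conc[:])

c0 = [1.0] + [0.0] * 9                         # unit source at the facility side
c1 = implicit_diffusion_step(c0, D=1e-9, dx=0.1, dt=3.15e7)  # ~1 year step
print(round(c1[1], 3))                         # concentration one node downstream
```

Because the scheme is implicit, it remains stable even for the large `r = D*dt/dx**2` used here, which is why such methods suit long-time migration predictions.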

  10. 3rd International Conference on Computational Mathematics and Computational Geometry

    CERN Document Server

    Ravindran, Anton

    2016-01-01

    This volume presents original research contributed to the 3rd Annual International Conference on Computational Mathematics and Computational Geometry (CMCGS 2014), organized and administered by the Global Science and Technology Forum (GSTF). Computational Mathematics and Computational Geometry are closely related subjects, but are often studied by separate communities and published in different venues. This volume is unique in its combination of these topics. After the conference, which took place in Singapore, selected contributions were chosen for this volume and peer-reviewed. The section on Computational Mathematics contains papers concerned with developing new and efficient numerical algorithms for the mathematical sciences or scientific computing. They also cover analysis of such algorithms to assess accuracy and reliability. The parts of this project that are related to Computational Geometry aim to develop effective and efficient algorithms for geometrical applications such as representation and computati...

  11. Development and initial testing of a computer-based patient decision aid to promote colorectal cancer screening for primary care practice

    Directory of Open Access Journals (Sweden)

    Fowler Beth

    2005-11-01

    Full Text Available Abstract Background Although colorectal cancer screening is recommended by major policy-making organizations, rates of screening remain low. Our aim was to develop a patient-directed, computer-based decision aid about colorectal cancer screening and investigate whether it could increase patient interest in screening. Methods We used content from evidence-based literature reviews and our previous decision aid research to develop a prototype. We performed two rounds of usability testing with representative patients to revise the content and format. The final decision aid consisted of an introductory segment, four test-specific segments, and information to allow comparison of the tests across several key parameters. We then conducted a before-after uncontrolled trial of 80 patients 50–75 years old recruited from an academic internal medicine practice. Results Mean viewing time was 19 minutes. The decision aid improved patients' intent to ask providers for screening from a mean score of 2.8 (1 = not at all likely to ask, 4 = very likely to ask) before viewing the decision aid to 3.2 afterwards (difference, 0.4; p …). Conclusion We conclude that a computer-based decision aid can increase patient intent to be screened and increase interest in screening. Practice Implications: This decision aid can be viewed by patients prior to provider appointments to increase motivation to be screened and to help them decide about which modality to use for screening. Further work is required to integrate the decision aid with other practice change strategies to raise screening rates to target levels.

  12. Development Of The Computer Code For Comparative Neutron Activation Analysis

    International Nuclear Information System (INIS)

    Purwadi, Mohammad Dhandhang

    2001-01-01

    Qualitative and quantitative chemical analysis with Neutron Activation Analysis (NAA) is an important utilization of a nuclear research reactor, and its application and development should be accelerated and promoted to raise the utilization of the reactor. The application of the comparative NAA technique in the GA Siwabessy Multi Purpose Reactor (RSG-GAS) needs special software (not yet commercially available) for analyzing the spectra of multiple elements in one analysis. Previously, the analysis was carried out using a single-spectrum software analyzer, comparing each result manually, which degrades the quality of the analysis significantly. To solve the problem, a computer code was designed and developed for comparative NAA. Spectrum analysis in the code is carried out using a non-linear fitting method. Before the spectrum is analyzed, it is passed through a numerical filter that improves the signal-to-noise ratio for the deconvolution operation. The software was developed in the G language and named PASAN-K. The developed software was benchmarked against the IAEA test spectrum and operated well, with less than 10% deviation
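The non-linear fitting step at the heart of such spectrum analysis can be illustrated with a minimal sketch. This is not PASAN-K itself; the photopeak location, width, and channel range below are invented for the example:

```python
import numpy as np
from scipy.optimize import curve_fit

def peak(x, amp, mu, sigma, b0, b1):
    """Gaussian photopeak on a linear background."""
    return amp * np.exp(-0.5 * ((x - mu) / sigma) ** 2) + b0 + b1 * x

# Synthetic channel data around a hypothetical 1332 keV photopeak.
ch = np.arange(1300.0, 1365.0)
rng = np.random.default_rng(1)
counts = peak(ch, 500.0, 1332.0, 2.5, 40.0, -0.01) + rng.normal(0, 5, ch.size)

# Non-linear least-squares fit recovers centroid and width.
p, _ = curve_fit(peak, ch, counts, p0=(400.0, 1330.0, 3.0, 30.0, 0.0))
print(f"centroid = {p[1]:.1f} keV, FWHM = {2.355 * abs(p[2]):.2f} keV")
```

The net peak area (and hence element activity) follows from the fitted amplitude and width once the background terms are subtracted.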

  13. A stand alone computer system to aid the development of mirror fusion test facility RF heating systems

    International Nuclear Information System (INIS)

    Thomas, R.A.

    1983-01-01

    The Mirror Fusion Test Facility (MFTF-B) control system architecture requires the Supervisory Control and Diagnostic System (SCDS) to communicate with an LSI-11 Local Control Computer (LCC) that in turn communicates via a fiber optic link to CAMAC-based control hardware located near the machine. In many cases, the control hardware is very complex and requires a sizable development effort prior to being integrated into the overall MFTF-B system. One such effort was the development of the Electron Cyclotron Resonance Heating (ECRH) system. It became clear that a stand-alone computer system was needed to simulate the functions of SCDS. This paper describes the hardware and software necessary to implement the SCDS Simulation Computer (SSC). It consists of a Digital Equipment Corporation (DEC) LSI-11 computer and a Winchester/floppy disk operating under the DEC RT-11 operating system. All application software for MFTF-B is programmed in PASCAL, which allowed us to adapt procedures originally written for SCDS to the SSC. This nearly identical software interface means that software written during equipment development will be useful to the SCDS programmers in the integration phase

  14. Developing Inventory Records Using the AppleWorks Data Base Subprogram and Apple IIe or GS Computers.

    Science.gov (United States)

    Schlenker, Richard M.

    This manual is a "how to" training device for developing inventory records in the AppleWorks program using an Apple IIe or Apple IIGS Computer with Duodisk or two disk drives and an 80-column card. The manual provides step-by-step directions, and includes 17 figures depicting the computer screen at the various stages of the inventory…

  15. Computable General Equilibrium Model Fiscal Year 2013 Capability Development Report - April 2014

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, Brian Keith [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). National Infrastructure Simulation and Analysis Center (NISAC); Rivera, Michael K. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). National Infrastructure Simulation and Analysis Center (NISAC); Boero, Riccardo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). National Infrastructure Simulation and Analysis Center (NISAC)

    2014-04-01

    This report documents progress made on continued development of the National Infrastructure Simulation and Analysis Center (NISAC) Computable General Equilibrium Model (NCGEM), developed in fiscal year 2012. In fiscal year 2013, NISAC extended the treatment of the labor market, and tests were performed with the model to examine the properties of the solutions it computes. To examine these, developers conducted a series of 20 simulations for 20 U.S. States. Each of these simulations compared an economic baseline simulation with an alternative simulation that assumed a 20-percent reduction in overall factor productivity in the manufacturing industries of each State. Differences in the simulation results between the baseline and alternative simulations capture the economic impact of the reduction in factor productivity. While not every State is affected in precisely the same way, the reduction in manufacturing industry productivity negatively affects the manufacturing industries in each State to an extent proportional to the reduction in overall factor productivity. Moreover, overall economic activity decreases when manufacturing sector productivity is reduced. Developers ran two additional simulations: (1) a version of the model for the State of Michigan, with manufacturing divided into two sub-industries (automobile and other vehicle manufacturing as one sub-industry and the rest of manufacturing as the other sub-industry); and (2) a version of the model for the United States, divided into 30 industries. NISAC conducted these simulations to illustrate the flexibility of industry definitions in NCGEM and to examine its simulation properties in more detail.

  16. COMPUTATIONAL THINKING

    Directory of Open Access Journals (Sweden)

    Evgeniy K. Khenner

    2016-01-01

    Full Text Available Abstract. The aim of the research is to draw the attention of the educational community to the phenomenon of computational thinking, which has been actively discussed in the foreign scientific and educational literature over the last decade, and to substantiate its importance, practical utility, and right to affirmation in Russian education. Methods. The research is based on an analysis of foreign studies of the phenomenon of computational thinking and the ways of forming it in the process of education, and on comparing the notion of «computational thinking» with related concepts used in the Russian scientific and pedagogical literature. Results. The concept of «computational thinking» is analyzed from the points of view of intuitive understanding and of scientific and applied aspects. It is shown how computational thinking has evolved with the development of computer hardware and software. The practice-oriented interpretation of computational thinking that is dominant among educators is described, along with some ways of forming it. It is shown that computational thinking is both a metasubject result of general education and one of its tools. From the point of view of the author, the purposeful development of computational thinking should be one of the tasks of Russian education. Scientific novelty. The author gives a theoretical justification of the role of computational thinking schemes as metasubject results of learning. The dynamics of the development of this concept is described; this process is connected with the evolution of computer and information technologies and with the growing number of tasks whose effective solution requires computational thinking. The author substantiates the affirmation that including «computational thinking» in the set of pedagogical concepts used in the national education system fills an existing gap. Practical significance. New metasubject result of education associated with

  17. Computability, complexity, and languages fundamentals of theoretical computer science

    CERN Document Server

    Davis, Martin D; Rheinboldt, Werner

    1983-01-01

    Computability, Complexity, and Languages: Fundamentals of Theoretical Computer Science provides an introduction to the various aspects of theoretical computer science. Theoretical computer science is the mathematical study of models of computation. This text is composed of five parts encompassing 17 chapters, and begins with an introduction to the use of proofs in mathematics and the development of computability theory in the context of an extremely simple abstract programming language. The succeeding parts demonstrate the performance of abstract programming language using a macro expa

  18. Present state of computer-aided diagnosis (CAD) development

    International Nuclear Information System (INIS)

    Fujita, Hiroshi

    2007-01-01

    Topics of computer-aided detection (CAD) are reviewed. Commercially available, Food and Drug Administration (FDA)-approved CAD systems exist for breast cancer (mammography), chest (flat X-ray and CT imaging), and colon (polyp detection) applications. In Japan, only mammography CAD is approved. The efficacy of CAD is controversial; a reliable database is important for evaluating it, and such databases are under construction in various medical fields. Digitized images are now widespread, which conceivably improves the cost-effectiveness of diagnosis with CAD. As an incentive, approval for health insurance coverage would help, as seen in the increased CAD sales by R2 Technology Co., and MHLW actually assists facilities in introducing the mammography reading-aid system by sharing half of its cost. There are two big projects for CAD study supported by MECSST, in which the author is involved. One is the development of diagnostic aids for multi-dimensional medical images, where a multi-organ, multi-disease CAD system is considered. The other involves CAD in brain MRI, breast ultrasound, and fundus images. It will not be long before patients and doctors can fully enjoy the benefits of CAD. (R.T.)

  19. Reaction Diffusion Voronoi Diagrams: From Sensors Data to Computing

    Directory of Open Access Journals (Sweden)

    Alejandro Vázquez-Otero

    2015-05-01

    Full Text Available In this paper, a new method to solve computational problems using reaction diffusion (RD systems is presented. The novelty relies on the use of a model configuration that tailors its spatiotemporal dynamics to develop Voronoi diagrams (VD as a part of the system’s natural evolution. The proposed framework is deployed in a solution of related robotic problems, where the generalized VD are used to identify topological places in a grid map of the environment that is created from sensor measurements. The ability of the RD-based computation to integrate external information, like a grid map representing the environment in the model computational grid, permits a direct integration of sensor data into the model dynamics. The experimental results indicate that this method exhibits significantly less sensitivity to noisy data than the standard algorithms for determining VD in a grid. In addition, previous drawbacks of the computational algorithms based on RD models, like the generation of volatile solutions by means of excitable waves, are now overcome by final stable states.
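For comparison with the reaction-diffusion approach described above, a Voronoi diagram on a grid map can also be computed by brute-force nearest-seed labeling. This minimal sketch is the standard reference computation, not the RD model from the paper; the grid size and seed positions are invented:

```python
import numpy as np

def grid_voronoi(shape, seeds):
    """Label each grid cell with the index of its nearest seed (Euclidean)."""
    ys, xs = np.indices(shape)
    # Squared distance from every cell to every seed: a (n_seeds, H, W) stack.
    d = [(ys - sy) ** 2 + (xs - sx) ** 2 for sy, sx in seeds]
    return np.argmin(np.stack(d), axis=0)

# Two hypothetical seeds on an 8x8 occupancy grid.
labels = grid_voronoi((8, 8), [(1, 1), (6, 6)])
print(labels)
```

Unlike this geometric baseline, the RD-based method lets obstacle and sensor information enter directly through the model dynamics, which is the paper's claimed advantage on noisy grid maps.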

  20. Development of selective photoionization spectroscopy technology - Development of a computer program to calculate selective ionization of atoms with multistep processes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Soon; Nam, Baek Il [Myongji University, Seoul (Korea, Republic of)

    1995-08-01

    We have developed computer programs to calculate 2- and 3-step selective resonant multiphoton ionization of atoms. Autoionization resonances in the final continuum can be taken into account via the B-spline basis set method. 8 refs., 5 figs. (author)
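Multistep selective ionization of this kind is commonly modeled with coupled rate equations for the level populations. The toy sketch below uses a hypothetical 2-step scheme (ground → excited → ion) with invented rate constants, and ignores the B-spline continuum treatment mentioned above:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Hypothetical rates (1/ns): R = laser excitation (and stimulated emission),
# A = spontaneous decay, G = photoionization out of the excited state.
R, A, G = 0.5, 0.1, 0.3

def rates(t, y):
    n_g, n_e, n_i = y  # ground, excited, and ion populations
    return [-R * n_g + (R + A) * n_e,
            R * n_g - (R + A + G) * n_e,
            G * n_e]

# Start with all atoms in the ground state; integrate for 50 ns.
sol = solve_ivp(rates, (0, 50), [1.0, 0.0, 0.0], rtol=1e-8)
print(f"ion fraction after 50 ns: {sol.y[2, -1]:.3f}")
```

Population is conserved by construction (the three rates sum to zero), and because ionization continuously drains the excited state, nearly all atoms end up ionized at long times.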

  1. Ground-glass opacity: High-resolution computed tomography and 64-multi-slice computed tomography findings comparison

    International Nuclear Information System (INIS)

    Sergiacomi, Gianluigi; Ciccio, Carmelo; Boi, Luca; Velari, Luca; Crusco, Sonia; Orlacchio, Antonio; Simonetti, Giovanni

    2010-01-01

    Objective: Comparative evaluation of ground-glass opacity using the conventional high-resolution computed tomography technique and volumetric computed tomography with a 64-row multi-slice scanner, verifying the advantages of the volumetric acquisition and post-processing techniques allowed by the 64-row CT scanner. Methods: Thirty-four patients, in whom a ground-glass opacity pattern had been assessed by previous high-resolution computed tomography during clinical-radiological follow-up for their lung disease, were studied by means of 64-row multi-slice computed tomography. A comparative evaluation of image quality was done for both CT modalities. Results: Good inter-observer agreement (k value 0.78-0.90) was reported in the detection of ground-glass opacity with both the high-resolution computed tomography technique and the volumetric computed tomography acquisition, with a moderate increase in intra-observer agreement (k value 0.46) using volumetric computed tomography compared with high-resolution computed tomography. Conclusions: In our experience, volumetric computed tomography with a 64-row scanner shows good accuracy in the detection of ground-glass opacity, providing better spatial and temporal resolution and more advanced post-processing techniques than high-resolution computed tomography.
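Observer agreement of the kind quoted above (k values) is commonly computed as Cohen's kappa, which corrects raw agreement for chance. A minimal sketch with made-up binary ratings (not the study's data):

```python
import numpy as np

def cohens_kappa(r1, r2):
    """Cohen's kappa for two raters' categorical labels."""
    r1, r2 = np.asarray(r1), np.asarray(r2)
    cats = np.union1d(r1, r2)
    n = len(r1)
    # Confusion matrix between the two raters.
    cm = np.zeros((len(cats), len(cats)))
    for a, b in zip(r1, r2):
        cm[np.searchsorted(cats, a), np.searchsorted(cats, b)] += 1
    po = np.trace(cm) / n                  # observed agreement
    pe = (cm.sum(0) @ cm.sum(1)) / n**2    # agreement expected by chance
    return (po - pe) / (1 - pe)

# Hypothetical ratings: 1 = ground-glass opacity present, 0 = absent.
rater_a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0]
rater_b = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]
print(round(cohens_kappa(rater_a, rater_b), 3))  # → 0.583
```

Here raw agreement is 0.8, but kappa is lower (≈0.58) because the raters would agree about half the time by chance alone.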

  2. Consolidation of cloud computing in ATLAS

    Science.gov (United States)

    Taylor, Ryan P.; Domingues Cordeiro, Cristovao Jose; Giordano, Domenico; Hover, John; Kouba, Tomas; Love, Peter; McNab, Andrew; Schovancova, Jaroslava; Sobie, Randall; ATLAS Collaboration

    2017-10-01

    Throughout the first half of LHC Run 2, ATLAS cloud computing has undergone a period of consolidation, characterized by building upon previously established systems, with the aim of reducing operational effort, improving robustness, and reaching higher scale. This paper describes the current state of ATLAS cloud computing. Cloud activities are converging on a common contextualization approach for virtual machines, and cloud resources are sharing monitoring and service discovery components. We describe the integration of Vacuum resources, streamlined usage of the Simulation at Point 1 cloud for offline processing, extreme scaling on Amazon compute resources, and procurement of commercial cloud capacity in Europe. Finally, building on the previously established monitoring infrastructure, we have deployed a real-time monitoring and alerting platform which coalesces data from multiple sources, provides flexible visualization via customizable dashboards, and issues alerts and carries out corrective actions in response to problems.

  3. Efficient Parallel Kernel Solvers for Computational Fluid Dynamics Applications

    Science.gov (United States)

    Sun, Xian-He

    1997-01-01

    Distributed-memory parallel computers dominate today's parallel computing arena. These machines, such as the Intel Paragon, IBM SP2, and Cray Origin2000, have successfully delivered high-performance computing power for solving some of the so-called "grand-challenge" problems. Despite initial success, parallel machines have not been widely accepted in production engineering environments due to the complexity of parallel programming. On a parallel computing system, a task has to be partitioned and distributed appropriately among processors to reduce communication cost and to attain load balance. More importantly, even with careful partitioning and mapping, the performance of an algorithm may still be unsatisfactory, since conventional sequential algorithms may be serial in nature and may not be implemented efficiently on parallel machines. In many cases, new algorithms have to be introduced to increase parallel performance. In order to achieve optimal performance, in addition to partitioning and mapping, a careful performance study should be conducted for a given application to find a good algorithm-machine combination. This process, however, is usually painful and elusive. The goal of this project is to design and develop efficient parallel algorithms for highly accurate Computational Fluid Dynamics (CFD) simulations and other engineering applications. The work plan is to 1) develop highly accurate parallel numerical algorithms, 2) conduct preliminary testing to verify the effectiveness and potential of these algorithms, and 3) incorporate newly developed algorithms into actual simulation packages. This work plan has been well achieved. Two highly accurate, efficient Poisson solvers have been developed and tested based on two different approaches: (1) adopting a mathematical geometry which has a better capacity to describe the fluid, and (2) using a compact scheme to gain high-order accuracy in the numerical discretization. 
The previously developed Parallel Diagonal Dominant (PDD) algorithm
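As a point of reference for the Poisson solvers mentioned above, here is a minimal serial sketch using plain Jacobi iteration on a uniform grid. This is not the PDD algorithm or the compact scheme from the project; grid size and iteration count are chosen only for illustration:

```python
import numpy as np

def jacobi_poisson(f, h, iters=2000):
    """Solve the 2D Poisson equation laplacian(u) = f on the unit square
    with u = 0 on the boundary, by Jacobi iteration with grid spacing h."""
    u = np.zeros_like(f)
    for _ in range(iters):
        u[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1]
                                + u[1:-1, :-2] + u[1:-1, 2:]
                                - h * h * f[1:-1, 1:-1])
    return u

n = 33
h = 1.0 / (n - 1)
x = np.linspace(0, 1, n)
X, Y = np.meshgrid(x, x, indexing="ij")
# Manufactured solution u = sin(pi x) sin(pi y), so f = -2 pi^2 u.
f = -2 * np.pi ** 2 * np.sin(np.pi * X) * np.sin(np.pi * Y)
u = jacobi_poisson(f, h)
err = np.max(np.abs(u - np.sin(np.pi * X) * np.sin(np.pi * Y)))
print(f"max error vs. exact solution: {err:.2e}")
```

Jacobi's slow convergence (thousands of sweeps even on this small grid) is exactly why faster, parallel-friendly solvers such as PDD were worth developing.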

  4. Recent developments and new directions in soft computing

    CERN Document Server

    Abbasov, Ali; Yager, Ronald; Shahbazova, Shahnaz; Reformat, Marek

    2014-01-01

    The book reports on the latest advances and challenges of soft computing. It  gathers original scientific contributions written by top scientists in the field and covering theories, methods and applications in a number of research areas related to soft-computing, such as decision-making, probabilistic reasoning, image processing, control, neural networks and data analysis.  

  5. Computing Educator Attitudes about Motivation

    OpenAIRE

    Settle, Amber; Sedlak, Brian

    2016-01-01

    While motivation is of great interest to computing educators, relatively little work has been done on understanding faculty attitudes toward student motivation. Two previous qualitative studies of instructor attitudes found results identical to those from other disciplines, but neither study considered whether instructors perceive student motivation to be more important in certain computing classes. In this work we present quantitative results about the perceived importance of student motivat...

  6. Correlation signatures of wet soils and snows. [algorithm development and computer programming

    Science.gov (United States)

    Phillips, M. R.

    1972-01-01

    Interpretation, analysis, and development of algorithms have provided the necessary computational programming tools for soil data processing, data handling, and analysis. The algorithms developed thus far are adequate and have proven successful for several preliminary and fundamental applications such as software interfacing capabilities, probability distributions, grey-level print plotting, contour plotting, isometric data displays, joint probability distributions, boundary mapping, channel registration, and ground-scene classification. A description of an Earth Resources Flight Data Processor (ERFDP), which handles and processes earth resources data under a user's control, is provided.

  7. Protein adsorption on nanoparticles: model development using computer simulation

    International Nuclear Information System (INIS)

    Shao, Qing; Hall, Carol K

    2016-01-01

    The adsorption of proteins on nanoparticles results in the formation of the protein corona, the composition of which determines how nanoparticles influence their biological surroundings. We seek to better understand corona formation by developing models that describe protein adsorption on nanoparticles using computer simulation results as data. Using a coarse-grained protein model, discontinuous molecular dynamics simulations are conducted to investigate the adsorption of two small proteins (Trp-cage and WW domain) on a model nanoparticle of diameter 10.0 nm at protein concentrations ranging from 0.5 to 5 mM. The resulting adsorption isotherms are well described by the Langmuir, Freundlich, Temkin and Kiselev models, but not by the Elovich, Fowler–Guggenheim and Hill–de Boer models. We also try to develop a generalized model that can describe protein adsorption equilibrium on nanoparticles of different diameters in terms of dimensionless size parameters. The simulation results for three proteins (Trp-cage, WW domain, and GB3) on four nanoparticles (diameter  =  5.0, 10.0, 15.0, and 20.0 nm) illustrate both the promise and the challenge associated with developing generalized models of protein adsorption on nanoparticles. (paper)
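Fitting simulated adsorption isotherms to a model such as the Langmuir equation, as described above, can be sketched as follows. The concentrations and parameter values below are invented for illustration, not taken from the paper:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(c, q_max, K):
    """Langmuir isotherm: adsorbed amount vs. bulk concentration."""
    return q_max * K * c / (1 + K * c)

# Hypothetical isotherm data: protein concentration (mM) vs. adsorbed amount.
c = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 5.0])
rng = np.random.default_rng(0)
q_obs = langmuir(c, 25.0, 0.8) + rng.normal(0, 0.1, c.size)

# Non-linear least squares recovers the saturation capacity and affinity.
(q_max, K), _ = curve_fit(langmuir, c, q_obs, p0=(20.0, 1.0))
print(f"q_max = {q_max:.1f}, K = {K:.2f}")
```

The same fitting workflow applies to the Freundlich, Temkin, and Kiselev models named in the abstract; only the model function changes.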

  8. Fetal organ dosimetry for the Techa River and Ozyorsk offspring cohorts. Pt. 1. A Urals-based series of fetal computational phantoms

    Energy Technology Data Exchange (ETDEWEB)

    Maynard, Matthew R.; Bolch, Wesley E. [University of Florida, Advanced Laboratory for Radiation Dosimetry Studies (ALRADS), J. Crayton Pruitt Family Department of Biomedical Engineering, Gainesville, FL (United States); Shagina, Natalia B.; Tolstykh, Evgenia I.; Degteva, Marina O. [Urals Research Center for Radiation Medicine, Chelyabinsk (Russian Federation); Fell, Tim P. [Public Health England, Centre for Radiation, Chemical and Environmental Health, Didcot, Chilton, Oxon (United Kingdom)

    2015-03-15

    The European Union's SOLO (Epidemiological Studies of Exposed Southern Urals Populations) project aims to improve understanding of cancer risks associated with chronic in utero radiation exposure. A comprehensive series of hybrid computational fetal phantoms was previously developed at the University of Florida in order to provide the SOLO project with the capability of computationally simulating and quantifying radiation exposures to individual fetal bones and soft tissue organs. To improve harmonization between the SOLO fetal biokinetic models and the computational phantoms, a subset of those phantoms was systematically modified to create a novel series of phantoms matching anatomical data representing Russian fetal biometry in the Southern Urals. Using previously established modeling techniques, eight computational Urals-based phantoms aged 8, 12, 18, 22, 26, 30, 34, and 38 weeks post-conception were constructed to match appropriate age-dependent femur lengths, biparietal diameters, individual bone masses and whole-body masses. Bone and soft tissue organ mass differences between the common ages of the subset of UF phantom series and the Urals-based phantom series illustrated the need for improved understanding of fetal bone densities as a critical parameter of computational phantom development. In anticipation for SOLO radiation dosimetry studies involving the developing fetus and pregnant female, the completed phantom series was successfully converted to a cuboidal voxel format easily interpreted by radiation transport software. (orig.)

  9. InfoMall: An Innovative Strategy for High-Performance Computing and Communications Applications Development.

    Science.gov (United States)

    Mills, Kim; Fox, Geoffrey

    1994-01-01

    Describes the InfoMall, a program led by the Northeast Parallel Architectures Center (NPAC) at Syracuse University (New York). The InfoMall features a partnership of approximately 24 organizations offering linked programs in High Performance Computing and Communications (HPCC) technology integration, software development, marketing, education and…

  10. FFTF fission gas monitor computer system

    International Nuclear Information System (INIS)

    Hubbard, J.A.

    1987-01-01

    The Fast Flux Test Facility (FFTF) is a liquid-metal-cooled test reactor located on the Hanford site. A dual computer system has been developed to monitor the reactor cover gas to detect and characterize any fuel or test pin fission gas releases. The system acquires gamma spectra data, identifies isotopes, calculates specific isotope and overall cover gas activity, presents control room alarms and displays, and records and prints data and analysis reports. The fission gas monitor system makes extensive use of commercially available hardware and software, providing a reliable and easily maintained system. The design provides extensive automation of previous manual operations, reducing the need for operator training and minimizing the potential for operator error. The dual nature of the system allows one monitor to be taken out of service for periodic tests or maintenance without interrupting the overall system functions. A built-in calibrated gamma source can be controlled by the computer, allowing the system to provide rapid system self tests and operational performance reports

  11. Computational composites

    DEFF Research Database (Denmark)

    Vallgårda, Anna K. A.; Redström, Johan

    2007-01-01

    Computational composite is introduced as a new type of composite material. Arguing that this is not just a metaphorical maneuver, we provide an analysis of computational technology as material in design, which shows how computers share important characteristics with other materials used in design and architecture. We argue that the notion of computational composites provides a precise understanding of the computer as material, and of how computations need to be combined with other materials to come to expression as material. Besides working as an analysis of computers from a designer's point of view, the notion of computational composites may also provide a link for computer science and human-computer interaction to an increasingly rapid development and use of new materials in design and architecture.

  12. A computational model predicting disruption of blood vessel development.

    Directory of Open Access Journals (Sweden)

    Nicole Kleinstreuer

    2013-04-01

    Full Text Available Vascular development is a complex process regulated by dynamic biological networks that vary in topology and state across different tissues and developmental stages. Signals regulating de novo blood vessel formation (vasculogenesis) and remodeling (angiogenesis) come from a variety of biological pathways linked to endothelial cell (EC) behavior, extracellular matrix (ECM) remodeling and the local generation of chemokines and growth factors. Simulating these interactions at a systems level requires sufficient biological detail about the relevant molecular pathways and associated cellular behaviors, and tractable computational models that offset mathematical and biological complexity. Here, we describe a novel multicellular agent-based model of vasculogenesis using the CompuCell3D (http://www.compucell3d.org/) modeling environment supplemented with semi-automatic knowledgebase creation. The model incorporates vascular endothelial growth factor signals, pro- and anti-angiogenic inflammatory chemokine signals, and the plasminogen activating system of enzymes and proteases linked to ECM interactions, to simulate nascent EC organization, growth and remodeling. The model was shown to recapitulate stereotypical capillary plexus formation and structural emergence of non-coded cellular behaviors, such as a heterologous bridging phenomenon linking endothelial tip cells together during formation of polygonal endothelial cords. Molecular targets in the computational model were mapped to signatures of vascular disruption derived from in vitro chemical profiling using the EPA's ToxCast high-throughput screening (HTS) dataset. Simulating the HTS data with the cell-agent based model of vascular development predicted adverse effects of a reference anti-angiogenic thalidomide analog, 5HPP-33, on in vitro angiogenesis with respect to both concentration-response and morphological consequences. 
These findings support the utility of cell agent-based models for simulating a

  13. Development of a computational methodology for internal dose calculations

    International Nuclear Information System (INIS)

    Yoriyaz, Helio

    2000-01-01

    A new approach for calculating internal dose estimates was developed through the use of a more realistic computational model of the human body and a more precise tool for radiation transport simulation. The present technique shows the capability to build a patient-specific phantom from tomography data (a voxel-based phantom) for the simulation of radiation transport and energy deposition using Monte Carlo methods such as the MCNP-4B code. In order to utilize the segmented human anatomy as a computational model for the simulation of radiation transport, an interface program, SCMS, was developed to build the geometric configurations for the phantom from tomographic images. This procedure allows the calculation not only of average dose values but also of the spatial distribution of dose in regions of interest. With the present methodology, absorbed fractions for photons and electrons in various organs of the Zubal segmented phantom were calculated and compared to those reported for the mathematical phantoms of Snyder and Cristy-Eckerman. Although the differences in organ geometry between the phantoms are quite evident, the results demonstrate small discrepancies; in some cases, however, considerable discrepancies were found, due to two major causes: differences in the organ masses between the phantoms and the occurrence of organ overlap in the Zubal segmented phantom, which is not considered in the mathematical phantoms. This effect was quite evident for organ cross-irradiation from electrons. With the determination of spatial dose distributions, the possibility of evaluating more detailed dose data than those obtained by conventional methods was demonstrated, which will give important information for clinical analysis in therapeutic procedures and in radiobiological studies of the human body. (author)

  14. Computer Assisted Instructional Design for Computer-Based Instruction. Final Report. Working Papers.

    Science.gov (United States)

    Russell, Daniel M.; Pirolli, Peter

    Recent advances in artificial intelligence and the cognitive sciences have made it possible to develop successful intelligent computer-aided instructional systems for technical and scientific training. In addition, computer-aided design (CAD) environments that support the rapid development of such computer-based instruction have also been recently…

  15. Computer literacy among first year medical students in a developing country: A cross sectional study

    Science.gov (United States)

    2012-01-01

    Background The use of computer assisted learning (CAL) has enhanced undergraduate medical education. CAL improves performance at examinations, develops problem solving skills and increases student satisfaction. The study evaluates computer literacy among first year medical students in Sri Lanka. Methods The study was conducted at Faculty of Medicine, University of Colombo, Sri Lanka between August-September 2008. First year medical students (n = 190) were invited for the study. Data on computer literacy and associated factors were collected by an expert-validated pre-tested self-administered questionnaire. Computer literacy was evaluated by testing knowledge on 6 domains; common software packages, operating systems, database management and the usage of internet and E-mail. A linear regression was conducted using total score for computer literacy as the continuous dependant variable and other independent covariates. Results Sample size-181 (Response rate-95.3%), 49.7% were Males. Majority of the students (77.3%) owned a computer (Males-74.4%, Females-80.2%). Students have gained their present computer knowledge by; a formal training programme (64.1%), self learning (63.0%) or by peer learning (49.2%). The students used computers for predominately; word processing (95.6%), entertainment (95.0%), web browsing (80.1%) and preparing presentations (76.8%). Majority of the students (75.7%) expressed their willingness for a formal computer training programme at the faculty. Mean score for the computer literacy questionnaire was 48.4 ± 20.3, with no significant gender difference (Males-47.8 ± 21.1, Females-48.9 ± 19.6). There were 47.9% students that had a score less than 50% for the computer literacy questionnaire. Students from Colombo district, Western Province and Student owning a computer had a significantly higher mean score in comparison to other students (p computer training was the strongest predictor of computer literacy (β = 13.034), followed by using
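The linear regression described in the Methods (total literacy score regressed on covariates such as formal training and computer ownership) can be sketched with ordinary least squares. The data below are simulated and the coefficient values are invented for illustration; they are not the study's estimates:

```python
import numpy as np

# Simulated predictors: formal training (0/1) and owning a computer (0/1).
rng = np.random.default_rng(7)
n = 181
training = rng.integers(0, 2, n)
owns_pc = rng.integers(0, 2, n)
# Hypothetical data-generating model for the total literacy score (0-100).
score = 35 + 13.0 * training + 6.0 * owns_pc + rng.normal(0, 8, n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), training, owns_pc])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)
print(f"intercept = {beta[0]:.1f}, training = {beta[1]:.1f}, owns_pc = {beta[2]:.1f}")
```

Each coefficient estimates the mean change in literacy score associated with that binary predictor, holding the other fixed, which mirrors how a β such as the paper's 13.034 for formal training is interpreted.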

  16. How Adverse Outcome Pathways Can Aid the Development and Use of Computational Prediction Models for Regulatory Toxicology

    Energy Technology Data Exchange (ETDEWEB)

    Wittwehr, Clemens; Aladjov, Hristo; Ankley, Gerald; Byrne, Hugh J.; de Knecht, Joop; Heinzle, Elmar; Klambauer, Günter; Landesmann, Brigitte; Luijten, Mirjam; MacKay, Cameron; Maxwell, Gavin; Meek, M. E. (Bette); Paini, Alicia; Perkins, Edward; Sobanski, Tomasz; Villeneuve, Dan; Waters, Katrina M.; Whelan, Maurice

    2016-12-19

    Efforts are underway to transform regulatory toxicology and chemical safety assessment from a largely empirical science, based on direct observation of apical toxicity outcomes in whole-organism toxicity tests, to a predictive one in which outcomes and risk are inferred from accumulated mechanistic understanding. The adverse outcome pathway (AOP) framework has emerged as a systematic approach for organizing knowledge that supports such inference. We argue that this systematic organization of knowledge can inform and help direct the design and development of computational prediction models that can further enhance the utility of mechanistic and in silico data for chemical safety assessment. Examples of AOP-informed model development and its application to the assessment of chemicals for skin sensitization and multiple modes of endocrine disruption are provided. The role of problem formulation, not only as a critical phase of risk assessment but also as a guide for both AOP and complementary model development, is described. Finally, a proposal for actively engaging the modeling community in AOP-informed computational model development is made. The contents serve as a vision for how AOPs can be leveraged to facilitate development of the computational prediction models needed to support the next generation of chemical safety assessment.

  17. System-level tools and reconfigurable computing for next-generation HWIL systems

    Science.gov (United States)

    Stark, Derek; McAulay, Derek; Cantle, Allan J.; Devlin, Malachy

    2001-08-01

    Previous work has been presented on the creation of computing architectures called DIME, which addressed the particular computing demands of hardware-in-the-loop systems. These demands include low latency, high data rates and interfacing. While it is essential to have a capable platform for handling and processing the data streams, the tools must complement this so that a systems engineer is able to construct the final system. This paper presents work on integrating system-level design tools, such as MATLAB and SIMULINK, with a reconfigurable computing platform. It demonstrates how algorithms can be implemented and simulated in a familiar rapid application development environment before being automatically transposed for downloading directly to the computing platform. This complements the established control tools, which handle the configuration and control of the processing systems, leading to a tool suite for system development and implementation. As the development tools have evolved, the core processing platform has also been enhanced. These improved platforms are based on dynamically reconfigurable computing, utilizing FPGA technologies, and parallel processing methods that more than double the performance and data bandwidth capabilities. This offers support for the processing of images in infrared scene projectors with 1024 × 1024 resolution at 400 Hz frame rates. The processing elements will use the latest generation of FPGAs, which implies that the presented systems will be rated in terms of tera (10^12) operations per second.
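The projector figures quoted above imply a substantial sustained data rate, which a quick back-of-the-envelope check makes concrete. The 16-bit pixel depth below is an assumption for illustration; the abstract does not state it.

```python
# Pixel throughput for the scene-projector figures quoted above:
# 1024 x 1024 pixels at a 400 Hz frame rate.
width, height, frame_rate = 1024, 1024, 400
bytes_per_pixel = 2  # assumed 16-bit pixels (not stated in the abstract)

pixels_per_second = width * height * frame_rate
bytes_per_second = pixels_per_second * bytes_per_pixel

print(f"{pixels_per_second / 1e6:.1f} Mpixel/s")  # ~419.4 Mpixel/s
print(f"{bytes_per_second / 1e6:.1f} MB/s")       # ~838.9 MB/s
```

A sustained stream of several hundred megabytes per second is exactly the regime where FPGA-based reconfigurable platforms outperform general-purpose processors, which motivates the architecture described.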

  18. Development, verification and validation of the fuel channel behaviour computer code FACTAR

    Energy Technology Data Exchange (ETDEWEB)

    Westbye, C J; Brito, A C; MacKinnon, J C; Sills, H E; Langman, V J [Ontario Hydro, Toronto, ON (Canada)

    1996-12-31

    FACTAR (Fuel And Channel Temperature And Response) is a computer code developed to simulate the transient thermal and mechanical behaviour of 37-element or 28-element fuel bundles within a single CANDU fuel channel under moderate loss-of-coolant accident conditions, including transition and large-break LOCAs (loss-of-coolant accidents) with emergency coolant injection assumed available. FACTAR's predictions of fuel temperature and sheath failure times are used in subsequent assessment of fission product releases and fuel string expansion. This paper discusses the origin and development history of FACTAR, presents the mathematical models and solution technique and the detailed quality assurance procedures that are followed during development, and reports the planned future development of the code. (author). 27 refs., 3 figs.

  19. Predicting the Pullout Capacity of Small Ground Anchors Using Nonlinear Integrated Computing Techniques

    Directory of Open Access Journals (Sweden)

    Mosbeh R. Kaloop

    2017-01-01

    This study investigates predicting the pullout capacity of small ground anchors using nonlinear computing techniques. Input-output prediction models based on the nonlinear Hammerstein-Wiener (NHW) approach and on an adaptive neuro-fuzzy inference system with delayed inputs (DANFIS) are developed and utilized to predict the pullout capacity. The results of the developed models are compared with previous studies that used artificial neural network and least-squares support vector machine techniques for the same case study. In situ data collection and statistical performance measures are used to evaluate the models' performance. Results show that the developed models enhance the precision of predicting the pullout capacity when compared with previous studies, and that the DANFIS model performs better than the other models in predicting the pullout capacity of ground anchors.
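A Hammerstein-Wiener structure of the kind named above chains a static input nonlinearity, a linear dynamic block, and a static output nonlinearity. The minimal sketch below shows that structure only; the polynomial and tanh nonlinearities and the first-order filter are arbitrary illustrations, not the models fitted in the study.

```python
import numpy as np

# Minimal Hammerstein-Wiener structure:
#   u -> f (static input nonlinearity) -> linear dynamics -> g (static
#   output nonlinearity) -> y
def f(u):
    """Input nonlinearity (Hammerstein part); illustrative polynomial."""
    return u + 0.5 * u**2

def g(x):
    """Output nonlinearity (Wiener part); illustrative saturation."""
    return np.tanh(x)

def linear_block(v, a=0.8):
    """First-order linear dynamics: x[t] = a*x[t-1] + v[t], x[0] = 0."""
    x = np.zeros_like(v)
    for t in range(1, len(v)):
        x[t] = a * x[t - 1] + v[t]
    return x

u = np.linspace(0.0, 1.0, 50)  # example input signal
y = g(linear_block(f(u)))      # Hammerstein-Wiener output
print(y.shape)
```

In system identification the parameters of all three blocks are fitted jointly to input-output data, which is what distinguishes this approach from a purely static regression such as an ordinary neural network on instantaneous inputs.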

  20. Laparoscopy After Previous Laparotomy

    Directory of Open Access Journals (Sweden)

    Zulfo Godinjak

    2006-11-01

    Following abdominal surgery, extensive adhesions often occur and can cause difficulties during laparoscopic operations. However, previous laparotomy is not considered a contraindication to laparoscopy. The aim of this study is to show that insertion of a Veress needle in the umbilical region is a safe method for creating a pneumoperitoneum for laparoscopic operations after previous laparotomy. In the last three years, we performed 144 laparoscopic operations in patients who had previously undergone one or two laparotomies. Pathology of the digestive system, pathology of the genital organs, Cesarean section and abdominal war injuries were the most common reasons for the previous laparotomy. During those operations, and during entry into the abdominal cavity, we experienced no complications, while in 7 patients we converted to laparotomy following diagnostic laparoscopy. In all patients the Veress needle and trocar were inserted in the umbilical region, i.e. the closed laparoscopy technique was used. In no patient were adhesions found in the umbilical region, and no abdominal organs were injured.