WorldWideScience

Sample records for applying computer-based procedures

  1. Applying computer-based procedures in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Mauro V. de; Carvalho, Paulo V.R. de; Santos, Isaac J.A.L. dos; Grecco, Claudio H.S. [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil). Div. de Instrumentacao e Confiabilidade Humana], e-mail: mvitor@ien.gov.br, e-mail: paulov@ien.gov.br, e-mail: luquetti@ien.gov.br, e-mail: grecco@ien.gov.br; Bruno, Diego S. [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Escola Politecnica. Curso de Engenharia de Controle e Automacao], e-mail: diegosalomonebruno@gmail.com

    2009-07-01

    Plant operation procedures are used to guide operators in coping with normal, abnormal or emergency situations in a process control system. Historically, plant procedures have been paper-based (PBP); with the digitalisation trend in these complex systems, computer-based procedures (CBPs) are being developed to support procedure use. This work briefly presents the research on CBPs at the Human-System Interface Laboratory (LABIHS). The emergency operation procedure EOP-0 of the LABIHS NPP simulator was implemented in the ImPRO CBP system. The ImPRO system was chosen for testing because it is available for download on the Internet. A preliminary operation test using the implemented procedure in the CBP system was carried out and the results were compared to operation through PBP use. (author)

  2. Situation awareness and trust in computer-based procedures in nuclear power plant operations

    Energy Technology Data Exchange (ETDEWEB)

    Throneburg, E. B.; Jones, J. M. [AREVA NP Inc., 7207 IBM Drive, Charlotte, NC 28262 (United States)]

    2006-07-01

    Situation awareness and trust are two issues that need to be addressed in the design of computer-based procedures for nuclear power plants. Situation awareness, in relation to computer-based procedures, concerns the operators' knowledge of the plant's state while following the procedures. Trust concerns the amount of faith that the operators put into the automated procedures, which can affect situation awareness. This paper first discusses the advantages and disadvantages of computer-based procedures. It then discusses the known aspects of situation awareness and trust as applied to computer-based procedures in nuclear power plants. An outline of a proposed experiment is then presented that includes methods of measuring situation awareness and trust so that these aspects can be analyzed for further study. (authors)

  3. Situation awareness and trust in computer-based procedures in nuclear power plant operations

    International Nuclear Information System (INIS)

    Throneburg, E. B.; Jones, J. M.

    2006-01-01

    Situation awareness and trust are two issues that need to be addressed in the design of computer-based procedures for nuclear power plants. Situation awareness, in relation to computer-based procedures, concerns the operators' knowledge of the plant's state while following the procedures. Trust concerns the amount of faith that the operators put into the automated procedures, which can affect situation awareness. This paper first discusses the advantages and disadvantages of computer-based procedures. It then discusses the known aspects of situation awareness and trust as applied to computer-based procedures in nuclear power plants. An outline of a proposed experiment is then presented that includes methods of measuring situation awareness and trust so that these aspects can be analyzed for further study. (authors)

  4. Model of Procedure Usage – Results from a Qualitative Study to Inform Design of Computer-Based Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Johanna H Oxstrand; Katya L Le Blanc

    2012-07-01

    The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less explored application for computer-based procedures: field procedures, i.e. procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory, the Institute for Energy Technology, and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field operators. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how best to design the computer-based procedures to do this. The underlying philosophy in the research effort is “Stop – Start – Continue”, i.e. what features from the use of paper-based procedures should we not incorporate (Stop), what should we keep (Continue), and what new features or work processes should be added (Start). One step in identifying the Stop – Start – Continue was to conduct a baseline study in which affordances related to the current usage of paper-based procedures were identified. The purpose of the study was to develop a model of paper-based procedure use which will help to identify desirable features for computer-based procedure prototypes. Affordances such as note taking, markups

  5. Computer-Based Procedures for Field Workers in Nuclear Power Plants: Development of a Model of Procedure Usage and Identification of Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Katya Le Blanc; Johanna Oxstrand

    2012-04-01

    The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less explored application for computer-based procedures: field procedures, i.e. procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field workers. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how best to design the computer-based procedures to do so. This paper describes the development of a Model of Procedure Use and the qualitative study on which the model is based. The study was conducted in collaboration with four nuclear utilities and five research institutes. During the qualitative study and the model development, requirements for computer-based procedures were identified.

  6. Standardized Procedure Content And Data Structure Based On Human Factors Requirements For Computer-Based Procedures

    International Nuclear Information System (INIS)

    Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L

    2015-01-01

    Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received either from users or from real-time data from plant status databases. Without the ability to perform logical operations, the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions for creating the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the

  7. Standardized Procedure Content And Data Structure Based On Human Factors Requirements For Computer-Based Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Bly, Aaron; Oxstrand, Johanna; Le Blanc, Katya L

    2015-02-01

    Most activities that involve human interaction with systems in a nuclear power plant are guided by procedures. Traditionally, the use of procedures has been a paper-based process that supports safe operation of the nuclear power industry. However, the nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. Advances in digital technology make computer-based procedures (CBPs) a valid option that provides further enhancement of safety by improving human performance related to procedure use. The transition from paper-based procedures (PBPs) to CBPs creates a need for a computer-based procedure system (CBPS). A CBPS needs to have the ability to perform logical operations in order to adjust to the inputs received either from users or from real-time data from plant status databases. Without the ability to perform logical operations, the procedure is just an electronic copy of the paper-based procedure. In order to provide the CBPS with the information it needs to display the procedure steps to the user, special care is needed in the format used to deliver all data and instructions for creating the steps. The procedure should be broken down into basic elements and formatted in a standard method for the CBPS. One way to build the underlying data architecture is to use an Extensible Markup Language (XML) schema, which utilizes basic elements to build each step in the smart procedure. The attributes of each step determine the type of functionality that the system will generate for that step. The CBPS will provide the context for the step to deliver referential information, request a decision, or accept input from the user. The XML schema needs to provide all data necessary for the system to accurately perform each step without the need for the procedure writer to reprogram the CBPS. The research team at the Idaho National Laboratory has developed a prototype CBPS for field workers as well as the
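    The two records above describe procedure steps as XML elements whose attributes determine the functionality the CBPS generates. The minimal sketch below illustrates that idea in Python; the element and attribute names are illustrative assumptions, not the actual INL schema.

```python
# Minimal sketch of a CBPS step interpreter driven by an XML step
# definition. Element and attribute names here are illustrative
# assumptions, not the actual INL schema.
import xml.etree.ElementTree as ET

PROCEDURE_XML = """
<procedure id="OP-123" title="Pump alignment">
  <step id="1" type="action">
    <instruction>Open valve V-101.</instruction>
  </step>
  <step id="2" type="verify" component="PI-204" relation="greater" value="350">
    <instruction>Verify discharge pressure is above 350 kPa.</instruction>
  </step>
  <step id="3" type="decision">
    <instruction>Is the standby pump available?</instruction>
    <branch answer="yes" goto="4"/>
    <branch answer="no" goto="7"/>
  </step>
</procedure>
"""

def render_step(step: ET.Element) -> None:
    """Print the interaction a CBPS would generate for one step."""
    kind = step.get("type")
    print(f"Step {step.get('id')} [{kind}]: {step.findtext('instruction')}")
    if kind == "verify":
        # A live system would compare plant data against the stated value.
        print(f"  -> check {step.get('component')} {step.get('relation')} {step.get('value')}")
    elif kind == "decision":
        for branch in step.findall("branch"):
            print(f"  -> answer '{branch.get('answer')}' jumps to step {branch.get('goto')}")

root = ET.fromstring(PROCEDURE_XML)
for step in root.findall("step"):
    render_step(step)
```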

  8. Usability test of the ImPRO, computer-based procedure system

    International Nuclear Information System (INIS)

    Jung, Y.; Lee, J.

    2006-01-01

    ImPRO is a computer-based procedure system that presents procedures both as flowcharts and as success logic trees. It was evaluated on the basis of computer-based procedure guidelines and satisfies most requirements regarding presentation and functionality. In addition, an SGTR scenario was performed with ImPRO to evaluate reading comprehension and situation awareness. ImPRO is a software engine which interprets a procedure script language, so it is reliable by nature and can be verified with formal methods. One bug, however, remained hidden for a year after release before it was found and fixed. Finally, backup paper procedures can be prepared in the same format as the VDU display in case of ImPRO failure. (authors)

  9. Design Guidance for Computer-Based Procedures for Field Workers

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States)]; Le Blanc, Katya [Idaho National Lab. (INL), Idaho Falls, ID (United States)]; Bly, Aaron [Idaho National Lab. (INL), Idaho Falls, ID (United States)]

    2016-09-01

    Nearly all activities that involve human interaction with nuclear power plant systems are guided by procedures, instructions, or checklists. Paper-based procedures (PBPs) currently used by most utilities have a demonstrated history of ensuring safety; however, improving procedure use could yield significant savings in increased efficiency, as well as improved safety through human performance gains. The nuclear industry is constantly trying to find ways to decrease human error rates, especially human error rates associated with procedure use. As a step toward the goal of improving field workers’ procedure use and adherence, and hence improving human performance and overall system reliability, the U.S. Department of Energy Light Water Reactor Sustainability (LWRS) Program researchers, together with the nuclear industry, have been investigating the possibility and feasibility of replacing current paper-based procedures with computer-based procedures (CBPs). PBPs have ensured safe operation of plants for decades, but limitations in paper-based systems do not allow them to reach the full potential for procedures to prevent human errors. The environment in a nuclear power plant is constantly changing, depending on current plant status and operating mode. PBPs, which are static by nature, are being applied to a constantly changing context. This constraint often results in PBPs that are written in a manner intended to cover many potential operating scenarios. Hence, the procedure layout forces the operator to search through a large amount of irrelevant information to locate the pieces of information relevant for the task and situation at hand, which can take up valuable time when operators must respond to the situation and can lead operators down an incorrect response path. Other challenges related to the use of PBPs are management of multiple procedures, place-keeping, finding the correct procedure for a task, and relying

  10. Benefits of computer screen-based simulation in learning cardiac arrest procedures.

    Science.gov (United States)

    Bonnetain, Elodie; Boucheix, Jean-Michel; Hamet, Maël; Freysz, Marc

    2010-07-01

    What is the best way to train medical students early so that they acquire basic skills in cardiopulmonary resuscitation as effectively as possible? Studies have shown the benefits of high-fidelity patient simulators, but have also demonstrated their limits. New computer screen-based multimedia simulators have fewer constraints than high-fidelity patient simulators. In this area, as yet, there has been no research on the effectiveness of transfer of learning from a computer screen-based simulator to more realistic situations such as those encountered with high-fidelity patient simulators. We tested the benefits of learning cardiac arrest procedures using a multimedia computer screen-based simulator in 28 Year 2 medical students. Just before the end of the traditional resuscitation course, we compared two groups. An experiment group (EG) was first asked to learn to perform the appropriate procedures in a cardiac arrest scenario (CA1) in the computer screen-based learning environment and was then tested on a high-fidelity patient simulator in another cardiac arrest simulation (CA2). While the EG was learning to perform CA1 procedures in the computer screen-based learning environment, a control group (CG) actively continued to learn cardiac arrest procedures using practical exercises in a traditional class environment. Both groups were given the same amount of practice, exercises and trials. The CG was then also tested on the high-fidelity patient simulator for CA2, after which it was asked to perform CA1 using the computer screen-based simulator. Performances with both simulators were scored on a precise 23-point scale. On the test on a high-fidelity patient simulator, the EG trained with a multimedia computer screen-based simulator performed significantly better than the CG trained with traditional exercises and practice (16.21 versus 11.13 of 23 possible points, respectively; p<0.001). Computer screen-based simulation appears to be effective in preparing learners to

  11. Implementing Computer-Based Procedures: Thinking Outside the Paper Margins

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna; Bly, Aaron

    2017-06-01

    In the past year there has been increased interest from the nuclear industry in adopting the use of electronic work packages and computer-based procedures (CBPs) in the field. The goal is to incorporate the use of technology in order to meet the Nuclear Promise requirements of reducing costs, improving efficiency, and decreasing human error rates in plant operations. Researchers, together with the nuclear industry, have been investigating the benefits an electronic work package system, and specifically CBPs, would have over current paper-based procedure practices. There are several classifications of CBPs, ranging from a straight copy of the paper-based procedure in PDF format to a more intelligent, dynamic CBP. A CBP system offers a vast variety of improvements, such as context-driven job aids, integrated human performance tools (e.g., placekeeping and correct component verification), and dynamic step presentation. The latter means that the CBP system displays only the steps relevant to the operating mode, plant status, and the task at hand. These improvements can reduce the worker’s workload and human error by allowing the worker to focus more on the task at hand. A team of human factors researchers at the Idaho National Laboratory studied and developed design concepts for CBPs for field workers between 2012 and 2016. The focus of the research was to present information in a procedure in a manner that leverages the dynamic and computational capabilities of a handheld device, allowing the worker to focus more on the task at hand than on the administrative processes currently applied when conducting work in the plant. As a part of the research, the team identified types of work, instructions, and scenarios where the transition to a dynamic CBP system might not be as beneficial as it would be for other types of work in the plant. In most cases the decision to use a dynamic CBP system and utilize the dynamic capabilities gained will be beneficial to the worker
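    As a rough illustration of the dynamic step presentation described above, the sketch below tags each step with an applicability predicate over the plant context, so the CBP shows only the steps whose predicate holds. All step texts and context fields are hypothetical.

```python
# Toy illustration of dynamic step presentation: each step carries an
# applicability predicate over the plant context, and the CBP displays
# only the steps whose predicate holds. All names are hypothetical.
from dataclasses import dataclass
from typing import Callable, Dict, List

Context = Dict[str, str]

@dataclass
class Step:
    text: str
    applies: Callable[[Context], bool]

PROCEDURE: List[Step] = [
    Step("Confirm current reactor mode.", lambda ctx: True),
    Step("Isolate heater drains (power operation only).",
         lambda ctx: ctx["mode"] == "power"),
    Step("Align shutdown cooling.", lambda ctx: ctx["mode"] == "shutdown"),
]

def relevant_steps(ctx: Context) -> List[str]:
    """Return only the step texts applicable in the given plant context."""
    return [s.text for s in PROCEDURE if s.applies(ctx)]

print(relevant_steps({"mode": "shutdown"}))
# ['Confirm current reactor mode.', 'Align shutdown cooling.']
```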

  12. Evaluation of Computer-Based Procedure System Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Johanna Oxstrand; Katya Le Blanc; Seth Hays

    2012-09-01

    This research effort is a part of the Light-Water Reactor Sustainability (LWRS) Program, a research and development (R&D) program sponsored by the Department of Energy (DOE) and performed in close collaboration with industry R&D programs, to provide the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants. The LWRS program serves to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. The introduction of advanced technology in existing nuclear power plants may help to manage the effects of aging systems, structures, and components. In addition, the incorporation of advanced technology in the existing LWR fleet may entice the future workforce, who will be familiar with advanced technology, to work for these utilities rather than for newly built nuclear power plants. Advantages are being sought by developing and deploying technologies that will increase safety and efficiency. One significant opportunity for existing plants to increase efficiency is to phase out the paper-based procedures (PBPs) currently used at most nuclear power plants and replace them, where feasible, with computer-based procedures (CBPs). PBPs have ensured safe operation of plants for decades, but limitations in paper-based systems do not allow them to reach the full potential for procedures to prevent human errors. The environment in a nuclear power plant is constantly changing depending on current plant status and operating mode. PBPs, which are static by nature, are being applied to a constantly changing context. This constraint often results in PBPs that are written in a manner intended to cover many potential operating scenarios. Hence, the procedure layout forces the operator to search through a large amount of irrelevant information to locate the pieces of information

  13. Preliminary Investigation of Time Remaining Display on the Computer-based Emergency Operating Procedure

    Science.gov (United States)

    Suryono, T. J.; Gofuku, A.

    2018-02-01

    One of the important things in the mitigation of nuclear power plant accidents is time management. The accidents should be resolved as soon as possible in order to prevent core melting and the release of radioactive material to the environment. In this case, operators should follow the emergency operating procedure related to the accident, in step-by-step order and within the allowable time. Nowadays, advanced main control rooms are equipped with computer-based procedures (CBPs), which make it easier for operators to do their tasks of monitoring and controlling the reactor. However, most CBPs do not include a time remaining display feature, which informs operators of the time available for them to execute procedure steps and warns them if they reach the time limit. Furthermore, such a feature would increase the awareness of operators about their current situation in the procedure. This paper investigates this issue. A simplified emergency operating procedure (EOP) for a steam generator tube rupture (SGTR) accident of a PWR plant is applied. In addition, the sequence of actions in each step of the procedure is modelled using multilevel flow modelling (MFM) and influence propagation rules. The prediction of the action time for each step is acquired based on similar accident cases and Support Vector Regression. The derived time is then processed and displayed on a CBP user interface.
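    The prediction step above pairs records from similar past accident runs with Support Vector Regression. A minimal sketch of that idea using scikit-learn follows; the features, numbers, and time limit are synthetic placeholders, not data from the paper.

```python
# Sketch of the time-prediction idea: fit Support Vector Regression on
# records from similar past accident runs, then predict the execution
# time of a procedure step under current conditions. All data, feature
# choices, and the time limit below are synthetic placeholders.
import numpy as np
from sklearn.svm import SVR

# Each row: [rupture size (cm^2), initial SG pressure (MPa)] from past runs.
X = np.array([[1.0, 6.0], [1.5, 6.2], [2.0, 5.8], [2.5, 5.5], [3.0, 5.1]])
# Observed time (s) to complete the "isolate ruptured SG" step in each run.
y = np.array([420.0, 390.0, 350.0, 330.0, 300.0])

model = SVR(kernel="rbf", C=100.0, epsilon=5.0).fit(X, y)

predicted = model.predict([[2.2, 5.7]])[0]  # current accident conditions
allowable = 360.0                           # step time limit (s)
print(f"predicted step time: {predicted:.0f} s; "
      f"remaining margin: {allowable - predicted:.0f} s")
```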

  14. 18 CFR 284.502 - Procedures for applying for market-based rates.

    Science.gov (United States)

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Procedures for applying for market-based rates. 284.502 Section 284.502 Conservation of Power and Water Resources FEDERAL... POLICY ACT OF 1978 AND RELATED AUTHORITIES Applications for Market-Based Rates for Storage § 284.502...

  15. Computer Game-based Learning: Applied Game Development Made Simpler

    NARCIS (Netherlands)

    Nyamsuren, Enkhbold

    2018-01-01

    The RAGE project (Realising an Applied Gaming Ecosystem, http://rageproject.eu/) is an ongoing initiative that aims to offer an ecosystem to support serious games’ development and use. Its two main objectives are to provide technologies for computer game-based pedagogy and learning and to establish

  16. System Requirements Analysis for a Computer-based Procedure in a Research Reactor Facility

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jaek Wan; Jang, Gwi Sook; Seo, Sang Moon; Shin, Sung Ki [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2014-10-15

    A computer-based procedure (CBP) can address many of the routine problems related to human error in the use of conventional, hard-copy operating procedures. An operation supporting system is also required in a research reactor: a well-made CBP can address the staffing issues of a research reactor and reduce human errors by minimizing the operator's routine tasks. However, a CBP for a research reactor has not been proposed yet. Moreover, CBPs developed for nuclear power plants have powerful and varied technical functions to cover complicated plant operation situations, and many of these functions may not be required for a research reactor. Thus, it is not reasonable to apply such a CBP to a research reactor directly, and customizing it is not cost-effective. Therefore, a compact CBP should be developed for a research reactor. This paper introduces the high-level requirements derived by the system requirements analysis activity as the first stage of system implementation. Operation support tools are under consideration for application to research reactors. In particular, with the full digitalization of the main control room, application of a computer-based procedure system has been required as a part of the man-machine interface system because it affects the operating staffing and human errors of a research reactor. To establish computer-based system requirements for a research reactor, this paper addressed international standards and previous practices at nuclear plants.

  17. A conceptual application for computer-based procedures for handheld devices

    Energy Technology Data Exchange (ETDEWEB)

    Lunde-Hanssen, Linda Sofie [Industrial Psychology, Institute for Energy Technology, Halden (Norway)]

    2014-08-15

    This paper describes the concepts and proposed design principles for an application for computer-based procedures (CBPs) for field operators in the nuclear domain (so-called handheld procedures). The concept is focused on the field operators' work with procedures and the communication and coordination between field operators and control room operators. The goal is to overcome challenges with shared situation awareness (SA) in a distributed team by providing effective and usable information design. An iterative design method and user-centred design is used for tailoring the concept to the context of field operations. The resulting concept supports the execution of procedures where close collaboration is needed between control room and field operations, e.g. where particular procedure steps are executed from remote control points and others from the control room. The resulting conceptual application for CBPs on handheld devices is developed for mitigating the SA challenges and designing for usability and ease of use.

  18. A conceptual application for computer-based procedures for handheld devices

    International Nuclear Information System (INIS)

    Lunde-Hanssen, Linda Sofie

    2014-01-01

    This paper describes the concepts and proposed design principles for an application for computer-based procedures (CBPs) for field operators in the nuclear domain (so-called handheld procedures). The concept is focused on the field operators' work with procedures and the communication and coordination between field operators and control room operators. The goal is to overcome challenges with shared situation awareness (SA) in a distributed team by providing effective and usable information design. An iterative design method and user-centred design is used for tailoring the concept to the context of field operations. The resulting concept supports the execution of procedures where close collaboration is needed between control room and field operations, e.g. where particular procedure steps are executed from remote control points and others from the control room. The resulting conceptual application for CBPs on handheld devices is developed for mitigating the SA challenges and designing for usability and ease of use

  19. An integrated computer-based procedure for teamwork in digital nuclear power plants.

    Science.gov (United States)

    Gao, Qin; Yu, Wenzhu; Jiang, Xiang; Song, Fei; Pan, Jiajie; Li, Zhizhong

    2015-01-01

    Computer-based procedures (CBPs) are expected to improve operator performance in nuclear power plants (NPPs), but they may reduce the openness of interaction between team members and consequently harm teamwork. To support teamwork in the main control room of an NPP, this study proposed a team-level integrated CBP that presents team members' operation status and execution histories to one another. Through a laboratory experiment, we compared the new integrated design and the existing individual CBP design. Sixty participants, randomly divided into twenty teams of three people each, were assigned to the two conditions to perform simulated emergency operating procedures. The results showed that compared with the existing CBP design, the integrated CBP reduced the effort of team communication and improved team transparency. The results suggest that this novel design is effective in optimizing the team process, but its impact on behavioural outcomes may be moderated by other factors, such as task duration. The study proposed and evaluated a team-level integrated computer-based procedure, which presents team members' operation status and execution histories to one another. The experimental results show that compared with the traditional procedure design, the integrated design reduces the effort of team communication and improves team transparency.

  20. Effects of Computer-Based Training on Procedural Modifications to Standard Functional Analyses

    Science.gov (United States)

    Schnell, Lauren K.; Sidener, Tina M.; DeBar, Ruth M.; Vladescu, Jason C.; Kahng, SungWoo

    2018-01-01

    Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to…

  1. Computer Based Procedures for Field Workers - FY16 Research Activities

    International Nuclear Information System (INIS)

    Oxstrand, Johanna; Bly, Aaron

    2016-01-01

    The Computer-Based Procedure (CBP) research effort is a part of the Light-Water Reactor Sustainability (LWRS) Program, which provides the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants. One of the primary missions of the LWRS program is to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. One area that could yield tremendous savings in increased efficiency and safety is improving procedure use. A CBP provides the opportunity to incorporate context-driven job aids, such as drawings, photos, and just-in-time training. The presentation of information in CBPs can be much more flexible and tailored to the task, actual plant condition, and operation mode. The dynamic presentation of the procedure will guide the user down the path of relevant steps, thus minimizing the time spent by the field worker evaluating plant conditions and decisions related to the applicability of each step. This dynamic presentation of the procedure also minimizes the risk of conducting steps out of order and/or of incorrectly assessing the applicability of steps. This report provides a summary of the main research activities conducted in the Computer-Based Procedures for Field Workers effort since 2012. The main focus of the report is on the research activities conducted in fiscal year 2016. The activities discussed are the Nuclear Electronic Work Packages - Enterprise Requirements initiative, the development of design guidance for CBPs (which compiles all insights gained through the years of CBP research), the facilitation of vendor studies at the Idaho National Laboratory (INL) Advanced Test Reactor (ATR), a pilot study on how to enhance the plant design modification work process, the collection of feedback from a field evaluation study at Plant Vogtle, and path forward to

  2. Computer Based Procedures for Field Workers - FY16 Research Activities

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States)]; Bly, Aaron [Idaho National Lab. (INL), Idaho Falls, ID (United States)]

    2016-09-01

    The Computer-Based Procedure (CBP) research effort is a part of the Light-Water Reactor Sustainability (LWRS) Program, which provides the technical foundations for licensing and managing the long-term, safe, and economical operation of current nuclear power plants. One of the primary missions of the LWRS program is to help the U.S. nuclear industry adopt new technologies and engineering solutions that facilitate the continued safe operation of the plants and extension of the current operating licenses. One area that could yield tremendous savings in increased efficiency and safety is improving procedure use. A CBP provides the opportunity to incorporate context-driven job aids, such as drawings, photos, and just-in-time training. The presentation of information in CBPs can be much more flexible and tailored to the task, actual plant condition, and operation mode. The dynamic presentation of the procedure will guide the user down the path of relevant steps, thus minimizing the time spent by the field worker evaluating plant conditions and decisions related to the applicability of each step. This dynamic presentation of the procedure also minimizes the risk of conducting steps out of order and/or of incorrectly assessing the applicability of steps. This report provides a summary of the main research activities conducted in the Computer-Based Procedures for Field Workers effort since 2012. The main focus of the report is on the research activities conducted in fiscal year 2016. The activities discussed are the Nuclear Electronic Work Packages – Enterprise Requirements initiative, the development of design guidance for CBPs (which compiles all insights gained through the years of CBP research), the facilitation of vendor studies at the Idaho National Laboratory (INL) Advanced Test Reactor (ATR), a pilot study on how to enhance the plant design modification work process, the collection of feedback from a field evaluation study at Plant Vogtle, and path forward to

  3. Supporting plant operation through computer-based procedures

    International Nuclear Information System (INIS)

    Martinez, Victor; Medrano, Javier; Mendez, Julio

    2014-01-01

    Digital systems are becoming more important in controlling and monitoring nuclear power plant operations. The capabilities of these systems provide additional functions as well as support operators in making decisions and avoiding errors. Regarding operation support systems, an important way of taking advantage of these capabilities is using computer-based procedure (CBP) tools that enhance plant operation. Integrating digital systems into the analogue controls of nuclear power plants already in operation is an extra challenge, in contrast to the integration of digital control systems in new nuclear power plants. Considering the potential advantages of using this technology, Tecnatom has designed and developed a CBP platform taking currently operating nuclear power plants as its design basis. The result is a powerful tool which combines the advantages of CBPs and conventional analogue control systems, minimizing negative effects during plant operation and integrating operation aid-systems to support operators. (authors)

  4. Computer assisted procedure maintenance

    International Nuclear Information System (INIS)

    Bisio, R.; Hulsund, J. E.; Nilsen, S.

    2004-04-01

    The maintenance of operating procedures in an NPP is a tedious and complicated task. Through their whole life cycle, the procedures are dynamic, 'living' documents. Several aspects of a procedure must be considered in a revision process, and its pertinent details and attributes must be checked. An organizational structure must be created and responsibilities allotted for drafting, revising, reviewing and publishing procedures. Powerful computer technology now available provides solutions for document management and the computerisation of procedures. These solutions can also support the maintenance of procedures, although not all parts of the procedure life cycle are equally amenable to computerized support. This report looks at the procedure life cycle in today's NPPs and discusses the possibilities associated with the introduction of computer technology to assist the maintenance of procedures. (Author)

  5. Computer software review procedures

    International Nuclear Information System (INIS)

    Mauck, J.L.

    1993-01-01

    This article reviews the procedures which are used to review software written for computer-based instrumentation and control functions in nuclear facilities. The utilization of computer-based control systems is becoming much more prevalent in such installations, in addition to being retrofit into existing systems. Currently, the Nuclear Regulatory Commission uses Regulatory Guide 1.152, “Criteria for Programmable Digital Computer System Software in Safety-Related Systems of Nuclear Power Plants”, and ANSI/IEEE-ANS-7-4.3.2-1982, “Application Criteria for Programmable Digital Computer Systems in Safety Systems of Nuclear Power Generating Stations”, for guidance when performing reviews of digital systems. There is great concern about the process of verification and validation of these codes, so when inspections are done of such systems, inspectors examine very closely the processes which were followed in developing the codes, the errors which were detected, how they were found, and the analysis which went into tracing down the causes behind the errors to ensure such errors are not propagated again in the future

  6. Light Water Reactor Sustainability Program: Computer-based procedure for field activities: results from three evaluations at nuclear power plants

    International Nuclear Information System (INIS)

    2014-01-01

    Nearly all activities that involve human interaction with the systems of a nuclear power plant are guided by procedures. The paper-based procedures (PBPs) currently used by industry have a demonstrated history of ensuring safety; however, improving procedure use could yield tremendous savings in increased efficiency and safety. One potential way to improve procedure-based activities is through the use of computer-based procedures (CBPs). Computer-based procedures provide the opportunity to incorporate context-driven job aids, such as drawings, photos, and just-in-time training, into the CBP system. One obvious advantage of this capability is reducing the time spent tracking down the applicable documentation. Additionally, human performance tools can be integrated in the CBP system in such a way that they help the worker focus on the task rather than on the tools. Some tools can be completely incorporated into the CBP system, such as pre-job briefs, placekeeping, correct component verification, and peer checks (see the sketch after this record). Other tools can be partly integrated in a fashion that reduces the time and labor required, such as concurrent and independent verification. Another benefit of CBPs compared to PBPs is dynamic procedure presentation. PBPs are static documents, which limits the degree to which the information presented can be tailored to the task and conditions when the procedure is executed. The CBP system can be configured to display only the relevant steps based on operating mode, plant status, and the task at hand. A dynamic presentation of the procedure (also known as a context-sensitive procedure) guides the user down the path of relevant steps based on the current conditions. This feature reduces the user's workload and inherently reduces the risk of incorrectly marking a step as not applicable and the risk of incorrectly performing a step that should be marked as not applicable. As part of the Department of Energy's (DOE) Light Water Reactors Sustainability Program
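    One of the human performance tools the record says can be fully incorporated into a CBP system is correct component verification. A minimal sketch of the check a CBP could run before unlocking a step; the component tags and messages are made up for illustration.

```python
# One human performance tool that can be built into a CBP system:
# correct component verification. Component tags are invented examples.
EXPECTED_COMPONENT = "1-CVC-V-0123"  # component this step acts on

def verify_component(scanned_tag: str, expected: str = EXPECTED_COMPONENT) -> bool:
    """Return True only when the scanned tag matches the expected component."""
    ok = scanned_tag.strip().upper() == expected
    print("component verified" if ok
          else f"WRONG COMPONENT: scanned {scanned_tag!r}, expected {expected!r}")
    return ok

# A CBP would run this check before enabling the step's completion control.
verify_component("1-cvc-v-0123")  # -> component verified
verify_component("1-CVC-V-0124")  # -> WRONG COMPONENT ...
```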

  7. Applied Parallel Computing Industrial Computation and Optimization

    DEFF Research Database (Denmark)

    Madsen, Kaj; Olesen, Dorte

    Proceedings of the Third International Workshop on Applied Parallel Computing in Industrial Problems and Optimization (PARA96).

  8. A New Human-Machine Interfaces of Computer-based Procedure System to Reduce the Team Errors in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kim, Sa Kil; Sim, Joo Hyun; Lee, Hyun Chul

    2016-01-01

    In this study, we identify the emerging types of team errors, especially in the digitalized control rooms of nuclear power plants such as the APR-1400 main control room in Korea. Most work in the nuclear industry is performed by teams of two or more persons. Even though individual errors can be detected and recovered from by qualified others and/or a well-trained team, errors made by the team are seldom easily detected and properly recovered by the team itself. Note that a team is defined as two or more people who interact appropriately with each other and form a dependent aggregate that accomplishes a valuable goal. Organizational errors sometimes increase the likelihood of operator errors through the active failure pathway and, at the same time, enhance the possibility of adverse outcomes through defensive weaknesses. We incorporate crew resource management as a representative approach to deal with the team factors of human errors. We suggest a set of crew resource management training procedures for unsafe environments where human errors can have devastating effects. We are developing alternative interfaces to counter team errors when a computer-based procedure system, a representative feature of the digitalized control room, is used in a digitalized main control room. In this study, we propose new interfaces for the computer-based procedure system to reduce feasible team errors. An effectiveness test is under way to validate whether the new interfaces can reduce team errors during operation with a computer-based procedure system in a digitalized control room

  9. A New Human-Machine Interfaces of Computer-based Procedure System to Reduce the Team Errors in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sa Kil; Sim, Joo Hyun; Lee, Hyun Chul [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2016-10-15

    In this study, we identify the emerging types of team errors, especially in the digitalized control rooms of nuclear power plants such as the APR-1400 main control room in Korea. Most work in the nuclear industry is performed by teams of two or more persons. Even though individual errors can be detected and recovered from by qualified others and/or a well-trained team, errors made by the team are seldom easily detected and properly recovered by the team itself. Note that a team is defined as two or more people who interact appropriately with each other and form a dependent aggregate that accomplishes a valuable goal. Organizational errors sometimes increase the likelihood of operator errors through the active failure pathway and, at the same time, enhance the possibility of adverse outcomes through defensive weaknesses. We incorporate crew resource management as a representative approach to deal with the team factors of human errors. We suggest a set of crew resource management training procedures for unsafe environments where human errors can have devastating effects. We are developing alternative interfaces to counter team errors when a computer-based procedure system, a representative feature of the digitalized control room, is used in a digitalized main control room. In this study, we propose new interfaces for the computer-based procedure system to reduce feasible team errors. An effectiveness test is under way to validate whether the new interfaces can reduce team errors during operation with a computer-based procedure system in a digitalized control room.

  10. A computational procedure for finding multiple solutions of convective heat transfer equations

    International Nuclear Information System (INIS)

    Mishra, S; DebRoy, T

    2005-01-01

    In recent years, numerical solutions of the convective heat transfer equations have provided significant insight into complex materials processing operations. However, these computational methods suffer from two major shortcomings. First, these procedures are designed to calculate temperature fields and cooling rates as output, and the unidirectional structure of these solutions precludes specification of these variables as input even when their desired values are known. Second, and more important, these procedures cannot determine multiple pathways, or multiple sets of input variables, to achieve a particular output from the convective heat transfer equations. Here we propose a new method that overcomes these shortcomings of the commonly used solutions of the convective heat transfer equations. The procedure combines conventional numerical solution methods with a real-number-based genetic algorithm (GA) to achieve bi-directionality, i.e. the ability to calculate the required input variables to achieve a specific output such as a temperature field or cooling rate. More important, the ability of the GA to find a population of solutions enables this procedure to search for and find multiple sets of input variables, all of which can lead to the desired specific output. The proposed computational procedure has been applied to convective heat transfer in a liquid layer locally heated on its free surface by an electric arc, where various sets of input variables are computed to achieve a specific fusion zone geometry defined by an equilibrium temperature. Good agreement is achieved between the model predictions and independent experimental results, indicating significant promise for the application of this procedure in finding multiple solutions of convective heat transfer equations
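    A minimal sketch of the bi-directional idea described above: a real-number genetic algorithm searching for input sets whose computed output matches a target value. The forward model below is a toy stand-in for the paper's convective heat transfer solver, and every number is invented.

```python
# Minimal real-number GA: search for input sets (here, arc power and
# radius) whose computed output matches a target value. The forward
# model is a toy surrogate, not the paper's heat transfer solver.
import numpy as np

rng = np.random.default_rng(0)

def forward_model(power, radius):
    """Toy surrogate returning a 'peak temperature' for two inputs."""
    return 300.0 + 0.8 * power / radius ** 1.5

TARGET = 1800.0  # desired peak temperature (K)

def fitness(pop):
    return -np.abs(forward_model(pop[:, 0], pop[:, 1]) - TARGET)

# Population: columns are power (W) and radius (mm).
pop = np.column_stack([rng.uniform(500, 3000, 60),
                       rng.uniform(0.5, 3.0, 60)])

for _ in range(200):
    fit = fitness(pop)
    i, j = rng.integers(0, len(pop), (2, len(pop)))
    parents = np.where((fit[i] > fit[j])[:, None], pop[i], pop[j])  # tournament
    alpha = rng.uniform(0, 1, (len(pop), 1))
    children = alpha * parents + (1 - alpha) * parents[::-1]        # blend crossover
    children += rng.normal(0.0, [10.0, 0.02], children.shape)       # mutation
    pop = np.clip(children, [500.0, 0.5], [3000.0, 3.0])

# Multiple distinct input sets can reach (nearly) the same target output.
good = pop[np.abs(forward_model(pop[:, 0], pop[:, 1]) - TARGET) < 5.0]
print(f"{len(good)} candidate input sets near {TARGET} K, e.g.:\n{good[:3]}")
```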

  11. An interactive simulation-based education system for BWR emergency procedure guidelines

    Energy Technology Data Exchange (ETDEWEB)

    Tanikawa, Naoshi; Shida, Touichi [Hitachi Ltd (Japan). Hitachi Works]; Ujita, Hiroshi; Yokota, Takeshi; Kato, Kanji [Hitachi Ltd (Japan). Energy Research Lab.]

    1994-12-31

    When applying EPGs (Emergency Procedure Guidelines), an operator decides on the operational procedure by predicting the change of parameters from the plant status, because EPGs are written in a symptom-based style for emergency conditions. Technical knowledge of plant behavior and its operation is necessary for operators to understand the EPGs. An interactive simulation-based education system, EPG-ICAI (Intelligent Computer Assisted Instruction), has been developed for BWR plant operators to acquire the knowledge of EPGs. EPG-ICAI is designed to realize effective education through step-by-step study using an interactive real-time simulator, and individualized education by applying an intelligent tutoring function. (orig.) (2 refs., 7 figs., 1 tab.).

  12. An interactive simulation-based education system for BWR emergency procedure guidelines

    International Nuclear Information System (INIS)

    Tanikawa, Naoshi; Shida, Touichi; Ujita, Hiroshi; Yokota, Takeshi; Kato, Kanji

    1994-01-01

    When applying EPGs (Emergency Procedure Guidelines), an operator decides on the operational procedure by predicting the change of parameters from the plant status, because EPGs are written in a symptom-based style for emergency conditions. Technical knowledge of plant behavior and its operation is necessary for operators to understand the EPGs. An interactive simulation-based education system, EPG-ICAI (Intelligent Computer Assisted Instruction), has been developed for BWR plant operators to acquire the knowledge of EPGs. EPG-ICAI is designed to realize effective education through step-by-step study using an interactive real-time simulator, and individualized education by applying an intelligent tutoring function. (orig.) (2 refs., 7 figs., 1 tab.)

  13. Unexpected effects of computer presented procedures

    International Nuclear Information System (INIS)

    Blackman, H.S.; Nelson, W.R.

    1988-01-01

    Results from experiments conducted at the Idaho National Engineering Laboratory have been presented regarding the computer presentation of procedural information. The results come from the experimental evaluation of an expert system which presented procedural instructions to be performed by a nuclear power plant operator. Lessons learned and implications from the study are discussed as well as design issues that should be considered to avoid some of the pitfalls in computer presented or selected procedures

  14. Unexpected effects of computer presented procedures

    International Nuclear Information System (INIS)

    Blackman, H.S.; Nelson, W.R.

    1988-01-01

    Results from experiments conducted at the Idaho National Engineering Laboratory will be presented regarding the computer presentation of procedural information. The results come from the experimental evaluation of an expert system which presented procedural instructions to be performed by a nuclear power plant operator. Lessons learned and implications from the study will be discussed as well as design issues that should be considered to avoid some of the pitfalls in computer presented or selected procedures. 1 ref., 1 fig

  15. SUPPORTING THE INDUSTRY BY DEVELOPING A DESIGN GUIDANCE FOR COMPUTER-BASED PROCEDURES FOR FIELD WORKERS

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna; LeBlanc, Katya

    2017-06-01

    The paper-based procedures currently used for nearly all activities in the commercial nuclear power industry have a long history of ensuring safe operation of the plants. However, there is potential to greatly increase efficiency and safety by improving how the human interacts with the procedures, which can be achieved through the use of computer-based procedures (CBPs). A CBP system offers a vast variety of improvements, such as context-driven job aids, integrated human performance tools, and dynamic step presentation. As a step toward the goal of improving procedure use performance, the U.S. Department of Energy Light Water Reactor Sustainability Program researchers, together with the nuclear industry, have been investigating the possibility and feasibility of replacing current paper-based procedures with CBPs. The main purpose of the CBP research conducted at the Idaho National Laboratory was to provide design guidance to the nuclear industry, to be used by both utilities and vendors. After studying existing design guidance for CBP systems, the researchers concluded that the majority of the existing guidance is intended for control room CBP systems and does not necessarily address the challenges of designing CBP systems for instructions carried out in the field. Further, the guidance is often presented at a high level, which leaves the designer to interpret what is meant by the guidance and how to specifically implement it. The authors therefore developed design guidance specifically tailored to instructions that are carried out in the field.

  16. Self-guaranteed measurement-based quantum computation

    Science.gov (United States)

    Hayashi, Masahito; Hajdušek, Michal

    2018-05-01

    In order to guarantee the output of a quantum computation, we usually assume that the component devices are trusted. However, when the total computation process is large, it is not easy to guarantee the whole system when we have scaling effects, unexpected noise, or unaccounted for correlations between several subsystems. If we do not trust the measurement basis or the prepared entangled state, we do need to be worried about such uncertainties. To this end, we propose a self-guaranteed protocol for verification of quantum computation under the scheme of measurement-based quantum computation where no prior-trusted devices (measurement basis or entangled state) are needed. The approach we present enables the implementation of verifiable quantum computation using the measurement-based model in the context of a particular instance of delegated quantum computation where the server prepares the initial computational resource and sends it to the client, who drives the computation by single-qubit measurements. Applying self-testing procedures, we are able to verify the initial resource as well as the operation of the quantum devices and hence the computation itself. The overhead of our protocol scales with the size of the initial resource state to the power of 4 times the natural logarithm of the initial state's size.

  17. Applied computing in medicine and health

    CERN Document Server

    Al-Jumeily, Dhiya; Mallucci, Conor; Oliver, Carol

    2015-01-01

    Applied Computing in Medicine and Health is a comprehensive presentation of on-going investigations into current applied computing challenges and advances, with a focus on a particular class of applications, primarily artificial intelligence methods and techniques in medicine and health. Applied computing is the use of practical computer science knowledge to enable use of the latest technology and techniques in a variety of different fields ranging from business to scientific research. One of the most important and relevant areas in applied computing is the use of artificial intelligence (AI) in health and medicine. Artificial intelligence in health and medicine (AIHM) is assuming the challenge of creating and distributing tools that can support medical doctors and specialists in new endeavors. The material included covers a wide variety of interdisciplinary perspectives concerning the theory and practice of applied computing in medicine, human biology, and health care. Particular attention is given to AI-bas...

  18. Special Issue on Entropy-Based Applied Cryptography and Enhanced Security for Ubiquitous Computing

    Directory of Open Access Journals (Sweden)

    James (Jong Hyuk) Park

    2016-09-01

    Entropy is a basic and important concept in information theory. It is also often used as a measure of the unpredictability of a cryptographic key in cryptography research areas. Ubiquitous computing (Ubi-comp) has emerged rapidly as an exciting new paradigm. In this special issue, we mainly selected and discussed papers related to core theories based on graph theory to solve computational problems in cryptography and security, as well as practical technologies, applications and services for Ubi-comp, including secure encryption techniques; identity and authentication; credential cloning attacks and countermeasures; switching generators with resistance against algebraic and side channel attacks; entropy-based network anomaly detection; applied cryptography using chaos functions, information hiding and watermarking, secret sharing, and message authentication; detection and modeling of cyber attacks with Petri Nets; and quantum flows for secret key distribution, etc.
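    Since the editorial leads with entropy as a measure of a key's unpredictability, a standard Shannon entropy computation over byte frequencies is sketched below; it is a generic illustration, not code from any paper in the issue.

```python
# Standard Shannon entropy over byte frequencies, the usual quick gauge
# of a key's unpredictability; a generic illustration only.
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Entropy in bits per byte: H = -sum(p_i * log2(p_i))."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(f"random 32-byte key: {shannon_entropy(os.urandom(32)):.2f} bits/byte")
print(f"repetitive text   : {shannon_entropy(b'passwordpasswordpassword'):.2f} bits/byte")
```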

  19. Computer managed emergency operating procedures

    International Nuclear Information System (INIS)

    Salamun, I.; Mavko, B.; Stritar, A.

    1994-01-01

    New computer technology is a very effective tool for developing a new design of nuclear power plant control room. It gives the designer the possibility to create a tool for managing a large database of power plant parameters, displaying them in different graphic forms, and automatically executing well-known tasks. The structure of Emergency Operating Procedures (EOP) is very suitable for programming and for creating an expert system. The Computerized Emergency Operating Procedures (CEOP) described in this paper can be considered an upgrade of the standard EOP approach. The main purpose of EmDiSY (Emergency Display System, the computer code name for the CEOP) is to supply the operator with necessary information, to document all operator actions, and to execute well-known tasks. It is a function-oriented CEOP that gives the operator guidance on how to verify the critical safety functions and how to restore and maintain these functions where they are degraded. All knowledge is coded and stored in database files. The knowledge base consists of the stepping order for verifying plant parameters, desired values of parameters, conditions for comparison, and links between procedures and actions. A graphical shell allows users to read the database, follow instructions, and find the correct task. The desired information is concentrated in one screen, allowing users to focus on a task. The user is supported in two ways: desired parameter values are displayed on the process picture, and automated monitoring of the critical safety function status trees is in progress at all times and available to the user. (author). 4 refs, 4 figs
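
    The knowledge base the abstract describes (stepping order, desired values, comparison conditions, and links between procedures and actions) maps naturally onto a small data structure. The sketch below is a hypothetical Python illustration; the step names, parameters, and limits are invented, not taken from EmDiSY.

        from dataclasses import dataclass
        from typing import Callable

        @dataclass
        class Step:
            parameter: str                   # plant parameter to verify
            check: Callable[[float], bool]   # condition for comparison
            on_pass: str                     # link when the check succeeds
            on_fail: str                     # link to a recovery action

        steps = {
            "verify_sg_level": Step("sg_level_pct",
                                    lambda v: 6.0 <= v <= 50.0,
                                    on_pass="verify_rcs_pressure",
                                    on_fail="restore_sg_feed"),
            "verify_rcs_pressure": Step("rcs_pressure_mpa",
                                        lambda v: v < 15.5,
                                        on_pass="done",
                                        on_fail="depressurize"),
        }

        def run(plant, start):
            """Walk the stepping order, documenting every check performed."""
            log, current = [], start
            while current in steps:
                s = steps[current]
                ok = s.check(plant[s.parameter])
                log.append((current, s.parameter, plant[s.parameter], ok))
                current = s.on_pass if ok else s.on_fail
            return log

        print(run({"sg_level_pct": 30.0, "rcs_pressure_mpa": 14.2},
                  "verify_sg_level"))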

  20. The numerical computation of seismic fragility of base-isolated Nuclear Power Plants buildings

    International Nuclear Information System (INIS)

    Perotti, Federico; Domaneschi, Marco; De Grandis, Silvia

    2013-01-01

    Highlights: • Seismic fragility of structural components in base-isolated NPPs is computed. • Dynamic integration, Response Surface, FORM and Monte Carlo Simulation are adopted. • A refined approach for modeling the non-linear behavior of isolators is proposed. • Beyond-design conditions are addressed. • The procedure is applied to the preliminary design of the isolated IRIS. -- Abstract: The research work described here is devoted to the development of a numerical procedure for the computation of seismic fragilities for equipment and structural components in Nuclear Power Plants; in particular, reference is made in the present paper to the case of isolated buildings. The proposed procedure for fragility computation makes use of the Response Surface Methodology to model the influence of the random variables on the dynamic response. To account for stochastic loading, the latter is computed by means of a simulation procedure. Given the Response Surface, the Monte Carlo method is used to compute the failure probability. The procedure is here applied to the preliminary design of the Nuclear Power Plant reactor building within the International Reactor Innovative and Secure international project; the building is equipped with a base isolation system based on the introduction of High Damping Rubber Bearing elements showing a markedly non-linear mechanical behavior. The fragility analysis is performed assuming that the isolation devices become the critical elements in terms of seismic risk and that, once base isolation is introduced, the dynamic behavior of the building can be captured by low-dimensional numerical models
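
    The abstract's recipe (fit a response surface to a limited number of dynamic analyses, then run Monte Carlo on the surface) can be sketched in a few lines of numpy. Everything below, including the stand-in "dynamic analysis" and the displacement capacity, is invented for illustration and is not the paper's model.

        import numpy as np

        rng = np.random.default_rng(1)

        def dynamic_analysis(pga, k_ratio):
            """Stand-in for a time-history run: peak isolator displacement [m]."""
            return 0.3 * pga / k_ratio + 0.05 * pga ** 2

        # Design of experiments over the random variables (PGA, stiffness).
        X = rng.uniform([0.1, 0.8], [1.0, 1.2], size=(30, 2))
        y = np.array([dynamic_analysis(p, k) for p, k in X])

        def features(X):
            """Quadratic response-surface basis."""
            p, k = X[:, 0], X[:, 1]
            return np.column_stack([np.ones_like(p), p, k, p * k,
                                    p ** 2, k ** 2])

        coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

        # Monte Carlo on the surface: P(displacement > capacity) at PGA = 0.9 g.
        N, capacity = 100_000, 0.35
        samples = np.column_stack([np.full(N, 0.9),
                                   rng.normal(1.0, 0.1, N)])
        pf = np.mean(features(samples) @ coef > capacity)
        print(f"Conditional failure probability at 0.9 g: {pf:.4f}")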

  1. Evaluation of Revised Computer-Based Procedure System Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Katya Le Blanc; Johanna Oxstrand; Cheradan Fikstad

    2013-01-01

    The nuclear power industry is very procedure driven, i.e., almost all activities that take place at a nuclear power plant are conducted by following procedures. The paper-based procedures (PBPs) currently used by the industry do a good job of keeping the industry safe. However, these procedures are most often paired with methods and tools put in place to anticipate, prevent, and catch errors related to hands-on work. These tools are commonly called human performance tools. The drawback of the current implementation of these tools is that the task of performing one procedure becomes time and labor intensive. For example, concurrent and independent verification of procedure steps is required at times, which essentially means that at least two people have to be actively involved in the task. Even though the current use of PBPs and human performance tools is keeping the industry safe, there is room for improvement. The industry could potentially increase its efficiency and safety by replacing its existing PBPs with computer-based procedures (CBPs). If implemented correctly, a CBP system could reduce the time and focus spent on using the human performance tools. Some of the tools can be completely incorporated in the CBP system in such a manner that the performer does not think about the fact that these tools are being used. Examples of these tools are procedure use and adherence, placekeeping, and peer checks. Other tools can be partly integrated in a fashion that reduces the time and labor they require, such as concurrent and independent verification. The incorporation of advanced technology, such as CBP systems, may help to manage the effects of aging systems, structures, and components. The introduction of advanced technology may also make the existing LWR fleet more attractive to the future workforce, which will be of importance when the future workforce chooses between the existing fleet and newly built nuclear power plants.
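
    A hypothetical sketch of how two of these tools, placekeeping and independent verification, might be folded directly into a CBP workflow; the class, step names, and rules are invented for illustration and are not the prototype described here.

        # A CBP session records completed steps automatically (placekeeping)
        # and refuses to close safety-significant steps without a second
        # signature (independent verification), so neither tool requires a
        # separate manual practice.

        class ProcedureSession:
            def __init__(self, steps):
                self.steps = steps          # list of (step_id, needs_verifier)
                self.completed = {}         # placekeeping record

            def next_step(self):
                for step_id, needs_verifier in self.steps:
                    if step_id not in self.completed:
                        return step_id, needs_verifier
                return None, False

            def complete(self, step_id, performer, verifier=None):
                expected, needs_verifier = self.next_step()
                if step_id != expected:
                    raise ValueError(f"out of order: expected {expected}")
                if needs_verifier and verifier is None:
                    raise ValueError(f"{step_id} needs independent verification")
                self.completed[step_id] = (performer, verifier)

        session = ProcedureSession([("align_valve_101", True),
                                    ("start_pump", False)])
        session.complete("align_valve_101", performer="op_A", verifier="op_B")
        session.complete("start_pump", performer="op_A")
        print(session.completed)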

  2. Breast tumor segmentation in high resolution x-ray phase contrast analyzer based computed tomography.

    Science.gov (United States)

    Brun, E; Grandl, S; Sztrókay-Gaul, A; Barbone, G; Mittone, A; Gasilov, S; Bravin, A; Coan, P

    2014-11-01

    Phase contrast computed tomography has emerged as an imaging method, which is able to outperform present day clinical mammography in breast tumor visualization while maintaining an equivalent average dose. To this day, no segmentation technique takes into account the specificity of the phase contrast signal. In this study, the authors propose a new mathematical framework for human-guided breast tumor segmentation. This method has been applied to high-resolution images of excised human organs, each of several gigabytes. The authors present a segmentation procedure based on the viscous watershed transform and demonstrate the efficacy of this method on analyzer based phase contrast images. The segmentation of tumors inside two full human breasts is then shown as an example of this procedure's possible applications. A correct and precise identification of the tumor boundaries was obtained and confirmed by manual contouring performed independently by four experienced radiologists. The authors demonstrate that applying the watershed viscous transform allows them to perform the segmentation of tumors in high-resolution x-ray analyzer based phase contrast breast computed tomography images. Combining the additional information provided by the segmentation procedure with the already high definition of morphological details and tissue boundaries offered by phase contrast imaging techniques, will represent a valuable multistep procedure to be used in future medical diagnostic applications.
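
    The viscous watershed transform the authors use is not available in common image libraries, so the sketch below runs the standard marker-controlled watershed from scikit-image (assumed available) on a synthetic slice, as a structural stand-in only: one human-supplied seed inside the "tumor", one in the background, flooding on the gradient image.

        import numpy as np
        from skimage.filters import sobel
        from skimage.segmentation import watershed

        # Synthetic "slice": a bright blob on a darker, noisy background.
        yy, xx = np.mgrid[:128, :128]
        img = np.exp(-((yy - 64) ** 2 + (xx - 64) ** 2) / 300.0)
        img += 0.05 * np.random.default_rng(0).standard_normal(img.shape)

        gradient = sobel(img)              # relief on which flooding runs

        # Human-guided markers: label 1 inside the tumor, label 2 outside.
        markers = np.zeros(img.shape, dtype=int)
        markers[64, 64] = 1
        markers[5, 5] = 2

        labels = watershed(gradient, markers)
        print(f"segmented tumor region: {int(np.sum(labels == 1))} pixels")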

  3. HuRECA: Human Reliability Evaluator for Computer-based Control Room Actions

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Lee, Seung Jun; Jang, Seung Cheol

    2011-01-01

    As computer-based design features such as computer-based procedures (CBPs), soft controls (SCs), and integrated information systems are being adopted in the main control rooms (MCRs) of nuclear power plants, a human reliability analysis (HRA) method capable of dealing with the effects of these design features on human reliability is needed. From observations of human factors engineering verification and validation experiments, we have identified some important characteristics of operator behaviors and design-related influencing factors (DIFs) from the perspective of human reliability. Firstly, there are new DIFs that should be considered in developing an HRA method for computer-based control rooms, especially CBPs and SCs. In the case of the computer-based procedure, as opposed to the paper-based procedure, the structural and managerial elements should be considered as important PSFs in addition to the procedural contents. In the case of the soft controllers, the so-called interface management tasks (or secondary tasks) should be reflected in the assessment of human error probability. Secondly, computer-based control rooms can provide more effective error recovery features than conventional control rooms. Major error recovery features for computer-based control rooms include the automatic logic-checking function of the computer-based procedure and the information-sharing feature of general computer-based designs

  4. Effects of Video-Based and Applied Problems on the Procedural Math Skills of Average- and Low-Achieving Adolescents.

    Science.gov (United States)

    Bottge, Brian A.; Heinrichs, Mary; Chan, Shih-Yi; Mehta, Zara Dee; Watson, Elizabeth

    2003-01-01

    This study examined effects of video-based, anchored instruction and applied problems on the ability of 11 low-achieving (LA) and 26 average-achieving (AA) eighth graders to solve computation and word problems. Performance for both groups was higher during anchored instruction than during baseline, but no differences were found between instruction…

  5. Applying Web-Based Co-Regulated Learning to Develop Students' Learning and Involvement in a Blended Computing Course

    Science.gov (United States)

    Tsai, Chia-Wen

    2015-01-01

    This research investigated, via quasi-experiments, the effects of web-based co-regulated learning (CRL) on developing students' computing skills. Two classes of 68 undergraduates in a one-semester course titled "Applied Information Technology: Data Processing" were chosen for this research. The first class (CRL group, n = 38) received…

  6. Nevada Applied Ecology Group procedures handbook for environmental transuranics

    International Nuclear Information System (INIS)

    White, M.G.; Dunaway, P.B.

    1976-10-01

    The activities of the Nevada Applied Ecology Group (NAEG) integrated research studies of environmental plutonium and other transuranics at the Nevada Test Site have required many standardized field and laboratory procedures. These include sampling techniques, collection and preparation, radiochemical and wet chemistry analysis, data bank storage and reporting, and statistical considerations for environmental samples of soil, vegetation, resuspended particles, animals, and other biological material. This document, printed in two volumes, includes most of the Nevada Applied Ecology Group standard procedures, with explanations as to the specific applications involved in the environmental studies. Where there is more than one document concerning a procedure, it has been included to indicate special studies or applications more complex than the routine standard sampling procedures utilized

  7. Nevada Applied Ecology Group procedures handbook for environmental transuranics

    International Nuclear Information System (INIS)

    White, M.G.; Dunaway, P.B.

    1976-10-01

    The activities of the Nevada Applied Ecology Group (NAEG) integrated research studies of environmental plutonium and other transuranics at the Nevada Test Site have required many standardized field and laboratory procedures. These include sampling techniques, collection and preparation, radiochemical and wet chemistry analysis, data bank storage and reporting, and statistical considerations for environmental samples of soil, vegetation, resuspended particles, animals, and others. This document, printed in two volumes, includes most of the Nevada Applied Ecology Group standard procedures, with explanations as to the specific applications involved in the environmental studies. Where there is more than one document concerning a procedure, it has been included to indicate special studies or applications perhaps more complex than the routine standard sampling procedures utilized

  8. Applied computational physics

    CERN Document Server

    Boudreau, Joseph F; Bianchi, Riccardo Maria

    2018-01-01

    Applied Computational Physics is a graduate-level text stressing three essential elements: advanced programming techniques, numerical analysis, and physics. The goal of the text is to provide students with essential computational skills that they will need in their careers, and to increase the confidence with which they write computer programs designed for their problem domain. The physics problems give them an opportunity to reinforce their programming skills, while the acquired programming skills augment their ability to solve physics problems. The C++ language is used throughout the text. Physics problems include Hamiltonian systems, chaotic systems, percolation, critical phenomena, few-body and multi-body quantum systems, quantum field theory, simulation of radiation transport, and data modeling. The book, the fruit of a collaboration between a theoretical physicist and an experimental physicist, covers a broad range of topics from both viewpoints. Examples, program libraries, and additional documentatio...

  9. Breast tumor segmentation in high resolution x-ray phase contrast analyzer based computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Brun, E., E-mail: emmanuel.brun@esrf.fr [European Synchrotron Radiation Facility (ESRF), Grenoble 380000, France and Department of Physics, Ludwig-Maximilians University, Garching 85748 (Germany); Grandl, S.; Sztrókay-Gaul, A.; Gasilov, S. [Institute for Clinical Radiology, Ludwig-Maximilians-University Hospital Munich, 81377 Munich (Germany); Barbone, G. [Department of Physics, Harvard University, Cambridge, Massachusetts 02138 (United States); Mittone, A.; Coan, P. [Department of Physics, Ludwig-Maximilians University, Garching 85748, Germany and Institute for Clinical Radiology, Ludwig-Maximilians-University Hospital Munich, 81377 Munich (Germany); Bravin, A. [European Synchrotron Radiation Facility (ESRF), Grenoble 380000 (France)

    2014-11-01

    Purpose: Phase contrast computed tomography has emerged as an imaging method, which is able to outperform present day clinical mammography in breast tumor visualization while maintaining an equivalent average dose. To this day, no segmentation technique takes into account the specificity of the phase contrast signal. In this study, the authors propose a new mathematical framework for human-guided breast tumor segmentation. This method has been applied to high-resolution images of excised human organs, each of several gigabytes. Methods: The authors present a segmentation procedure based on the viscous watershed transform and demonstrate the efficacy of this method on analyzer based phase contrast images. The segmentation of tumors inside two full human breasts is then shown as an example of this procedure’s possible applications. Results: A correct and precise identification of the tumor boundaries was obtained and confirmed by manual contouring performed independently by four experienced radiologists. Conclusions: The authors demonstrate that applying the watershed viscous transform allows them to perform the segmentation of tumors in high-resolution x-ray analyzer based phase contrast breast computed tomography images. Combining the additional information provided by the segmentation procedure with the already high definition of morphological details and tissue boundaries offered by phase contrast imaging techniques, will represent a valuable multistep procedure to be used in future medical diagnostic applications.

  10. Cloud computing technologies applied in the virtual education of civil servants

    Directory of Open Access Journals (Sweden)

    Teodora GHERMAN

    2016-03-01

    Full Text Available From the perspective of education, e-learning through the use of Cloud Computing technologies represents one of the most important directions of educational software development, because Cloud Computing is developing rapidly and applies to all areas of the Information Society, including education. Web-based virtual education systems (e-learning) require numerous hardware and software resources. The convenience of Internet learning and the creation of web-based learning environments have become strengths of virtual education research, including the application of Cloud Computing technologies to the virtual education of civil servants. The article presents Cloud Computing technologies as a platform for virtual education on web platforms, together with their advantages and disadvantages compared with other technologies.

  11. Applying a Global Sensitivity Analysis Workflow to Improve the Computational Efficiencies in Physiologically-Based Pharmacokinetic Modeling

    Directory of Open Access Journals (Sweden)

    Nan-Hung Hsieh

    2018-06-01

    Full Text Available Traditionally, the solution to reduce parameter dimensionality in a physiologically-based pharmacokinetic (PBPK model is through expert judgment. However, this approach may lead to bias in parameter estimates and model predictions if important parameters are fixed at uncertain or inappropriate values. The purpose of this study was to explore the application of global sensitivity analysis (GSA to ascertain which parameters in the PBPK model are non-influential, and therefore can be assigned fixed values in Bayesian parameter estimation with minimal bias. We compared the elementary effect-based Morris method and three variance-based Sobol indices in their ability to distinguish “influential” parameters to be estimated and “non-influential” parameters to be fixed. We illustrated this approach using a published human PBPK model for acetaminophen (APAP and its two primary metabolites APAP-glucuronide and APAP-sulfate. We first applied GSA to the original published model, comparing Bayesian model calibration results using all the 21 originally calibrated model parameters (OMP, determined by “expert judgment”-based approach vs. the subset of original influential parameters (OIP, determined by GSA from the OMP. We then applied GSA to all the PBPK parameters, including those fixed in the published model, comparing the model calibration results using this full set of 58 model parameters (FMP vs. the full set influential parameters (FIP, determined by GSA from FMP. We also examined the impact of different cut-off points to distinguish the influential and non-influential parameters. We found that Sobol indices calculated by eFAST provided the best combination of reliability (consistency with other variance-based methods and efficiency (lowest computational cost to achieve convergence in identifying influential parameters. We identified several originally calibrated parameters that were not influential, and could be fixed to improve computational
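
    A sketch of this screening workflow using the SALib library (assumed available), with an invented three-parameter toy function standing in for the PBPK ODE model; the parameter names, bounds, and the 0.05 cut-off are illustrative only.

        import numpy as np
        from SALib.sample import saltelli
        from SALib.analyze import sobol

        problem = {
            "num_vars": 3,
            "names": ["clearance", "partition_coef", "absorption_rate"],
            "bounds": [[0.1, 2.0], [0.5, 5.0], [0.01, 1.0]],
        }

        def toy_model(x):
            """Stand-in output (e.g., plasma AUC); a real PBPK model goes here."""
            return 1.0 / x[0] + 0.1 * x[1] + 0.01 * np.sin(x[2])

        X = saltelli.sample(problem, 1024)            # Sobol' design
        Y = np.apply_along_axis(toy_model, 1, X)
        Si = sobol.analyze(problem, Y)

        cutoff = 0.05                                 # influence threshold
        for name, st in zip(problem["names"], Si["ST"]):
            decision = "estimate" if st > cutoff else "fix"
            print(f"{name}: total-order index {st:.3f} -> {decision}")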

  12. Total reduction of distorted echelle spectrograms - An automatic procedure. [for computer controlled microdensitometer

    Science.gov (United States)

    Peterson, R. C.; Title, A. M.

    1975-01-01

    A total reduction procedure, notable for its use of a computer-controlled microdensitometer for semi-automatically tracing curved spectra, is applied to distorted high-dispersion echelle spectra recorded by an image tube. Microdensitometer specifications are presented and the FORTRAN programs TRACEN and SPOTS are outlined. The intensity spectrum of the photographic or electrographic plate is plotted on a graphic display. The time requirements are discussed in detail.

  13. Benchmarking gate-based quantum computers

    Science.gov (United States)

    Michielsen, Kristel; Nocon, Madita; Willsch, Dennis; Jin, Fengping; Lippert, Thomas; De Raedt, Hans

    2017-11-01

    With the advent of public access to small gate-based quantum processors, it becomes necessary to develop a benchmarking methodology such that independent researchers can validate the operation of these processors. We explore the usefulness of a number of simple quantum circuits as benchmarks for gate-based quantum computing devices and show that circuits performing identity operations are very simple, scalable and sensitive to gate errors and are therefore very well suited for this task. We illustrate the procedure by presenting benchmark results for the IBM Quantum Experience, a cloud-based platform for gate-based quantum computing.
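
    The principle behind identity-operation benchmarks is easy to reproduce with plain numpy: gate pairs that should compose to the identity are applied with a small, invented over-rotation error, and the survival probability of the initial state decays with circuit depth. This illustrates the idea only; it is not the paper's benchmark suite.

        import numpy as np

        rng = np.random.default_rng(0)

        def rx(theta):
            """Single-qubit rotation about the X axis."""
            c, s = np.cos(theta / 2), np.sin(theta / 2)
            return np.array([[c, -1j * s], [-1j * s, c]])

        def identity_benchmark(n_pairs, over_rotation=0.02):
            """Apply n_pairs of (X, X) gates, each with a small angle error."""
            state = np.array([1.0 + 0j, 0.0])              # |0>
            for _ in range(2 * n_pairs):
                state = rx(np.pi + rng.normal(0.0, over_rotation)) @ state
            return abs(state[0]) ** 2                      # P(measure |0>)

        for n in (0, 10, 50, 200):
            print(f"{n:4d} identity pairs -> P(|0>) = "
                  f"{identity_benchmark(n):.4f}")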

  14. Gis-based procedures for hydropower potential spotting

    Energy Technology Data Exchange (ETDEWEB)

    Larentis, Dante G.; Collischonn, Walter; Tucci, Carlos E.M. [Instituto de Pesquisas Hidraulicas da UFRGS, Av. Bento Goncalves, 9500, CEP 91501-970, Caixa Postal 15029, Porto Alegre, RS (Brazil); Olivera, Francisco (Texas A and M University, Zachry Department of Civil Engineering 3136 TAMU, College Station, TX 77843-3136, US)

    2010-10-15

    The increasing demand for energy, especially from renewable and sustainable sources, spurs the development of small hydropower plants and encourages investment in new survey studies. Preliminary hydropower survey studies usually carry huge uncertainties about the technical, economic and environmental feasibility of the undeveloped potential. This paper presents a methodology for large-scale surveying of potential hydropower sites to be applied in the inception phase of hydroelectric development planning. The sequence of procedures to identify hydropower sites is based on remote sensing and regional streamflow data and was automated within a GIS-based computational program: Hydrospot. The program allows spotting more potential sites along the drainage network than would be possible in a traditional survey study, providing different types of dam-powerhouse layouts and two types (operating modes) of projects: run-of-the-river and storage projects. Preliminary results from its application in a hydropower-developed basin in Brazil have shown Hydrospot's limitations and potentialities in giving support to the mid-to-long-term planning of the electricity sector. (author)
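
    At its core, every candidate site a tool like Hydrospot spots must be ranked by the power it could deliver, P = rho * g * Q * H. The sketch below is a back-of-envelope illustration; the site names, flows, heads, and efficiency figure are invented.

        RHO, G = 1000.0, 9.81    # water density [kg/m^3], gravity [m/s^2]

        def hydro_power_kw(flow_m3s, head_m, efficiency=0.85):
            """Installed-capacity estimate for one dam-powerhouse layout."""
            return RHO * G * flow_m3s * head_m * efficiency / 1000.0

        # Candidate sites from the drainage network: (reach, flow, head).
        candidates = [("reach_017", 12.0, 25.0),   # run-of-the-river
                      ("reach_042", 45.0, 8.0),    # low head, high flow
                      ("reach_101", 6.5, 60.0)]    # storage project

        for name, q, h in sorted(candidates,
                                 key=lambda s: -hydro_power_kw(s[1], s[2])):
            print(f"{name}: {hydro_power_kw(q, h):8.0f} kW")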

  15. Requirements for Computer Based-Procedures for Nuclear Power Plant Field Operators. Results from a Qualitative Study

    International Nuclear Information System (INIS)

    Le Blanc, Katya; Oxstrand, J.H.; Waicosky, T.

    2012-01-01

    Although computer-based procedures (CBPs) have been investigated as a way to enhance operator performance on procedural tasks in the nuclear industry for almost thirty years, they are not currently widely deployed at United States utilities. One of the barriers to the wide-scale deployment of CBPs is the lack of operational experience with CBPs that could serve as a sound basis for justifying their use for nuclear utilities. Utilities are hesitant to adopt CBPs because of concern over potential costs of implementation and concern over regulatory approval. Regulators require a sound technical basis for the use of any procedure at the utilities; without operating experience to support the use of CBPs, it is difficult to establish such a technical basis. In an effort to begin the process of developing a technical basis for CBPs, researchers at Idaho National Laboratory are partnering with industry to explore CBPs, with the objective of defining requirements for CBPs and developing an industry-wide vision and path forward for their use. This paper describes the results from a qualitative study aimed at defining requirements for CBPs to be used by field operators and maintenance technicians. (author)

  16. First International Symposium on Applied Computing and Information Technology (ACIT 2013)

    CERN Document Server

    Applied Computing and Information Technology

    2014-01-01

    This book presents the selected results of the 1st International Symposium on Applied Computers and Information Technology (ACIT 2013) held on August 31 – September 4, 2013 in Matsue City, Japan, which brought together researchers, scientists, engineers, industry practitioners, and students to discuss all aspects of Applied Computers & Information Technology and its practical challenges. This book includes the best 12 papers presented at the conference, which were chosen based on review scores submitted by members of the program committee and underwent further rigorous rounds of review.

  17. Requirements for Control Room Computer-Based Procedures for use in Hybrid Control Rooms

    Energy Technology Data Exchange (ETDEWEB)

    Le Blanc, Katya Lee [Idaho National Lab. (INL), Idaho Falls, ID (United States); Oxstrand, Johanna Helene [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-05-01

    Many plants in the U.S. are currently undergoing control room modernization. The main drivers for modernization are the aging and obsolescence of existing equipment, which typically results in a like-for-like replacement of analog equipment with digital systems. However, the modernization efforts present an opportunity to employ advanced technology that would not only extend the life, but also enhance the efficiency and cost competitiveness, of nuclear power. Computer-based procedures (CBPs) are one example of near-term advanced technology that may provide enhanced efficiencies above and beyond like-for-like replacements of analog systems. Researchers in the LWRS program are investigating the benefits of advanced technologies such as CBPs, with the goal of assisting utilities in decision making during modernization projects. This report will describe the existing research on CBPs, discuss the unique issues related to using CBPs in hybrid control rooms (i.e., partially modernized analog control rooms), and define the requirements of CBPs for hybrid control rooms.

  18. Computational representation of Alzheimer's disease evolution applied to a cooking activity.

    Science.gov (United States)

    Serna, Audrey; Rialle, Vincent; Pigot, Hélène

    2006-01-01

    This article presents a computational model and a simulation of the decline in performance of activities of daily living due to Alzheimer's disease. The disease evolution is simulated using the cognitive architecture ACT-R. Activities are represented through the retrieval of semantic units in declarative memory and the triggering of rules in procedural memory. The decline caused by Alzheimer's disease is simulated through the variation of subsymbolic parameters. The model is applied to a cooking activity. Simulation of 100 subjects shows results similar to those obtained in a standardized assessment with human subjects.
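
    The subsymbolic machinery the model varies can be stated concretely with the standard ACT-R equations; the sketch below uses invented parameter values purely to illustrate how raising the retrieval threshold degrades recall, in the spirit of the simulated disease progression.

        import math

        # Base-level activation: B = ln(sum_j t_j**(-d)) over past uses at
        # ages t_j; retrieval probability is logistic in (B - tau) / s.
        # Raising the threshold tau mimics the failing retrievals that
        # degrade activity performance as the disease progresses.

        def base_level_activation(ages, d=0.5):
            """Ages (s) of past uses of a memory chunk; d is the decay rate."""
            return math.log(sum(t ** (-d) for t in ages))

        def recall_probability(activation, tau=0.0, s=0.4):
            """P(activation exceeds the retrieval threshold tau)."""
            return 1.0 / (1.0 + math.exp(-(activation - tau) / s))

        b = base_level_activation([2.0, 10.0, 60.0])   # a recent recipe step
        for tau in (0.0, 0.5, 1.0):                    # progressive impairment
            print(f"threshold {tau:.1f}: P(recall) = "
                  f"{recall_probability(b, tau):.2f}")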

  19. A PC-based computer program for simulation of containment pressurization

    International Nuclear Information System (INIS)

    Seifaee, F.

    1990-01-01

    This paper reports that a PC-based computer program has been developed to simulate a pressurized water reactor (PWR) containment during various transients. This containment model is capable of determining the pressure and temperature history of a PWR containment in the event of a loss-of-coolant accident, as well as main steam line breaks inside the containment. Conservation of mass and energy equations are applied to the containment model. Development of the program is based on minimizing the user-specified input information and maximizing user friendliness. Calculation efficiency is maximized by superseding the traditional trial-and-error procedure for determining the state variables with an explicit solution for pressure. The program includes simplified models for active heat removal systems. The results of the present model are in close agreement with those of the CONTEMPT-MOD5 computer code for pressure and temperature inside the containment
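
    A minimal sketch of the "explicit solution for pressure" idea: treat the containment as a single ideal-gas volume and update pressure directly from the mass and energy balances, with no iteration. The single-gas (air) properties and all numbers below are invented simplifications, not the program's model.

        V = 50_000.0        # containment free volume [m^3]
        R = 287.0           # gas constant of air [J/(kg K)]
        CV = 718.0          # specific heat at constant volume [J/(kg K)]

        m = 60_000.0                       # initial atmosphere mass [kg]
        U = m * CV * 300.0                 # initial internal energy [J]

        def step(m, U, m_dot, h_in, q_removed, dt):
            """One explicit time step of the mass/energy balance."""
            m += m_dot * dt                          # mass conservation
            U += (m_dot * h_in - q_removed) * dt     # energy conservation
            T = U / (m * CV)                         # temperature from energy
            P = m * R * T / V                        # explicit pressure
            return m, U, P

        for t in range(5):                 # 5 s of a 100 kg/s blowdown
            m, U, P = step(m, U, m_dot=100.0, h_in=2.8e6,
                           q_removed=0.0, dt=1.0)
            print(f"t = {t + 1} s: pressure = {P / 1000:.1f} kPa")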

  20. 34 CFR 370.43 - What requirement applies to the use of mediation procedures?

    Science.gov (United States)

    2010-07-01

    ... 34 Education 2 2010-07-01 2010-07-01 false What requirement applies to the use of mediation... applies to the use of mediation procedures? (a) Each designated agency shall implement procedures designed to ensure that, to the maximum extent possible, good faith negotiations and mediation procedures are...

  1. Applied Mathematics, Modelling and Computational Science

    CERN Document Server

    Kotsireas, Ilias; Makarov, Roman; Melnik, Roderick; Shodiev, Hasan

    2015-01-01

    The Applied Mathematics, Modelling, and Computational Science (AMMCS) conference aims to promote interdisciplinary research and collaboration. The contributions in this volume cover the latest research in mathematical and computational sciences, modeling, and simulation as well as their applications in natural and social sciences, engineering and technology, industry, and finance. The 2013 conference, the second in a series of AMMCS meetings, was held August 26–30 and organized in cooperation with AIMS and SIAM, with support from the Fields Institute in Toronto and Wilfrid Laurier University. There were many young scientists at AMMCS-2013, both as presenters and as organizers. These proceedings contain refereed papers contributed by the participants of AMMCS-2013 after the conference. This volume is suitable for researchers and graduate students, mathematicians and engineers, industrialists, and anyone who would like to delve into the interdisciplinary research of applied and computational mathematics ...

  2. Development of Computational Procedure for Assessment of Patient Dose in Multi-Detector Computed Tomography

    International Nuclear Information System (INIS)

    Park, Dong Wook

    2007-02-01

    Technological developments to improve the quality and speed with which images are obtained have fostered growth in the frequency and collective effective dose of CT examinations. In particular, the high-dose x-ray techniques used in CT have heightened concern about patient dose. However, the CTDI and DLP used in CT dosimetry leave something to be desired for evaluating patient dose, and even though an evaluation of effective dose in CT practice is required for comparison with other radiographic examinations, these descriptors are not sufficient for such an estimate because they were not defined for that purpose. Therefore, the calculation of effective dose in CT procedures is needed. However, modelling uncertainties arise from insufficient information on manufacturing tolerances. The purpose of this work is therefore the development of a computational procedure for the assessment of patient dose, supported by experiments providing the essential information for MDCT. In order to obtain exact absorbed doses, normalization factors must be created to relate simulated dose values to CTDI air measurements. The normalization factors were applied to the calculation of CTDI100 using axial scanning and of organ effective dose using helical scanning. The helical-scanning calculation was compared with the experiment of Groves et al. (2004); the result agrees with the experiment to within about a factor of 2. This appears to be because automatic exposure control (AEC) was not simulated. In several studies, applying AEC to a CT examination reduced dose by approximately 20-30%. Therefore, AEC simulation should be added and the procedure modified accordingly
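
    The normalization step the abstract describes reduces to one ratio: a measured CTDI100 in air ties the per-particle Monte Carlo doses to absolute dose. The numbers below are invented examples, not the thesis data.

        # Measured free-in-air CTDI100 (ion chamber) and the same quantity
        # from the Monte Carlo model, per simulated source particle.
        ctdi100_air_measured = 18.0e-3    # Gy per 100 mAs
        ctdi100_air_simulated = 4.5e-14   # Gy per particle

        # Normalization factor: simulated particles represented per 100 mAs.
        norm = ctdi100_air_measured / ctdi100_air_simulated

        # Apply the factor to any organ tally from the helical-scan run.
        organ_dose_simulated = 1.2e-14    # Gy per particle
        organ_dose = organ_dose_simulated * norm
        print(f"normalized organ dose: {organ_dose * 1e3:.2f} mGy per 100 mAs")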

  3. 20 CFR 668.860 - What cash management procedures apply to INA grant funds?

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false What cash management procedures apply to INA... Administrative Requirements § 668.860 What cash management procedures apply to INA grant funds? INA grantees must... implement the Cash Management Improvement Act, found at 31 CFR part 205, apply by law to most recipients of...

  4. Evaluation of computer-based ultrasonic inservice inspection systems

    International Nuclear Information System (INIS)

    Harris, R.V. Jr.; Angel, L.J.; Doctor, S.R.; Park, W.R.; Schuster, G.J.; Taylor, T.T.

    1994-03-01

    This report presents the principles, practices, terminology, and technology of computer-based ultrasonic testing for inservice inspection (UT/ISI) of nuclear power plants, with extensive use of drawings, diagrams, and UT images. The presentation is technical but assumes limited specific knowledge of ultrasonics or computers. The report is divided into 9 sections covering conventional UT, computer-based UT, and evaluation methodology. Conventional UT topics include coordinate axes, scanning, instrument operation, RF and video signals, and A-, B-, and C-scans. Computer-based topics include sampling, digitization, signal analysis, image presentation, SAFT, ultrasonic holography, transducer arrays, and data interpretation. An evaluation methodology for computer-based UT/ISI systems is presented, including questions, detailed procedures, and test block designs. Brief evaluations of several computer-based UT/ISI systems are given; supplementary volumes will provide detailed evaluations of selected systems

  5. IDEA system - a new computer-based expert system for incorporation monitoring

    International Nuclear Information System (INIS)

    Doerfel, H.

    2007-01-01

    Recently, at the Karlsruhe Research Centre, a computer-based expert system, the Internal Dose Equivalent Assessment System (IDEA System), has been developed for assisting dosimetrists in applying the relevant recommendations and guidelines for internal dosimetry. The expert system gives guidance to the user with respect to: (a) planning of monitoring, (b) performing routine and special monitoring, and (c) evaluation of primary monitoring results. The evaluation is done according to the IDEA System guidelines (Doerfel, H. et al., General guidelines for the estimation of committed effective dose from incorporation monitoring data. Research Report FZKA 7243, Research Center Karlsruhe, Karlsruhe (2006). ISSN 0947-8260.) in a three-stage procedure according to the expected level of exposure. At the first level the evaluation is performed with default or site-specific parameter values, at the second level case-specific parameter values are applied, and at the third level a special evaluation is performed with individual adjustment of model parameter values. With these well-defined procedures the expert system ensures that all recommendations and guidelines are applied properly and that the results, in terms of committed effective and organ doses, are close to the best estimate. (author)

  6. Applied Computational Mathematics in Social Sciences

    CERN Document Server

    Damaceanu, Romulus-Catalin

    2010-01-01

    Applied Computational Mathematics in Social Sciences adopts a modern scientific approach that combines knowledge from mathematical modeling with various aspects of social science. Special algorithms can be created to simulate an artificial society and a detailed analysis can subsequently be used to project social realities. This Ebook specifically deals with computations using the NetLogo platform, and is intended for researchers interested in advanced human geography and mathematical modeling studies.

  7. The Next Step in Deployment of Computer Based Procedures For Field Workers: Insights And Results From Field Evaluations at Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna; Le Blanc, Katya L.; Bly, Aaron

    2015-02-01

    The paper-based procedures currently used for nearly all activities in the commercial nuclear power industry have a long history of ensuring safe operation of the plants. However, there is potential to greatly increase efficiency and safety by improving how the human operator interacts with the procedures. One way to achieve these improvements is through the use of computer-based procedures (CBPs). A CBP system offers a vast variety of improvements, such as context-driven job aids, integrated human performance tools (e.g., placekeeping, correct component verification, etc.), and dynamic step presentation. The latter means that the CBP system displays only the steps relevant to the current operating mode, plant status, and task at hand. A dynamic presentation of the procedure (also known as context-sensitive procedures) will guide the operator down the path of relevant steps based on the current conditions. This feature will reduce the operator's workload and inherently reduce the risk of incorrectly marking a step as not applicable, as well as the risk of incorrectly performing a step that should be marked as not applicable. The research team at the Idaho National Laboratory has developed a prototype CBP system for field workers, which has been evaluated from a human factors and usability perspective in four laboratory studies. Based on the results from each study, revisions were made to the CBP system. However, a crucial step in gaining the end users' (e.g., auxiliary operators, maintenance technicians, etc.) acceptance is to put the system in their hands and let them use it as a part of their everyday work activities. In the spring of 2014 the first field evaluation of the INL CBP system was conducted at a nuclear power plant. Auxiliary operators conduct a functional test of one out of three backup air compressors each week. During the field evaluation activity, one auxiliary operator conducted the test with the paper-based procedure while a second auxiliary operator
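
    A hypothetical sketch of the context-sensitive step presentation described above: each step declares the plant conditions under which it applies, and the CBP shows only the applicable steps, so the operator never has to mark a step "N/A" by hand. The plant state keys, step texts, and conditions are all invented.

        PLANT_STATE = {"mode": "shutdown", "compressor": "B", "temp_f": 95}

        PROCEDURE = [
            # (step text, applicability predicate over the plant state)
            ("Verify compressor B discharge valve open",
             lambda s: s["compressor"] == "B"),
            ("Verify compressor C discharge valve open",
             lambda s: s["compressor"] == "C"),
            ("Open cooling water crosstie (hot weather lineup)",
             lambda s: s["temp_f"] >= 90),
            ("Place unit in standby (power operation only)",
             lambda s: s["mode"] == "power"),
        ]

        for text, applies in PROCEDURE:
            if applies(PLANT_STATE):
                print(f"[ ] {text}")     # presented to the field worker
            # non-applicable steps are simply not displayed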

  8. On-line computer system applied in a nuclear chemistry laboratory

    International Nuclear Information System (INIS)

    Banasik, Z.; Kierzek, J.; Parus, J.; Zoltowski, T.; Zalewski, J.

    1980-01-01

    A PDP-11/45-based computer system used in a radioanalytical chemistry laboratory is described. It is mainly concerned with spectrometry of ionizing radiation and remote measurement of physico-chemical properties. The objectives in mind when constructing the hardware interconnections and developing the software of the system were to minimize the work of the electronics and computer personnel and to provide maximum flexibility for the users. For the hardware interfacing, three categories of equipment are used: - LPS-11 Laboratory Peripheral System - CAMAC system with CA11F-P controller - interfaces from instrument manufacturers. Flexible operation has been achieved by using a three-level programming structure: - data transfer by assembly language programs - data formatting using bit operations in FORTRAN - data evaluation by procedures written in FORTRAN. (Auth.)

  9. Computed neutron coincidence counting applied to passive waste assay

    Energy Technology Data Exchange (ETDEWEB)

    Bruggeman, M.; Baeten, P.; De Boeck, W.; Carchon, R. [Nuclear Research Centre, Mol (Belgium)

    1997-11-01

    Neutron coincidence counting applied for the passive assay of fissile material is generally realised with dedicated electronic circuits. This paper presents a software based neutron coincidence counting method with data acquisition via a commercial PC-based Time Interval Analyser (TIA). The TIA is used to measure and record all time intervals between successive pulses in the pulse train up to count-rates of 2 Mpulses/s. Software modules are then used to compute the coincidence count-rates and multiplicity related data. This computed neutron coincidence counting (CNCC) offers full access to all the time information contained in the pulse train. This paper will mainly concentrate on the application and advantages of CNCC for the non-destructive assay of waste. An advanced multiplicity selective Rossi-alpha method is presented and its implementation via CNCC demonstrated. 13 refs., 4 figs., 2 tabs.

  10. Computed neutron coincidence counting applied to passive waste assay

    International Nuclear Information System (INIS)

    Bruggeman, M.; Baeten, P.; De Boeck, W.; Carchon, R.

    1997-01-01

    Neutron coincidence counting applied for the passive assay of fissile material is generally realised with dedicated electronic circuits. This paper presents a software based neutron coincidence counting method with data acquisition via a commercial PC-based Time Interval Analyser (TIA). The TIA is used to measure and record all time intervals between successive pulses in the pulse train up to count-rates of 2 Mpulses/s. Software modules are then used to compute the coincidence count-rates and multiplicity related data. This computed neutron coincidence counting (CNCC) offers full access to all the time information contained in the pulse train. This paper will mainly concentrate on the application and advantages of CNCC for the non-destructive assay of waste. An advanced multiplicity selective Rossi-alpha method is presented and its implementation via CNCC demonstrated. 13 refs., 4 figs., 2 tabs
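
    Both records above describe extracting coincidence information in software from the recorded pulse train. The sketch below is a minimal numpy illustration of one such computation, a Rossi-alpha interval histogram; the Poisson pulse train, count rate, and window size are synthetic stand-ins for real TIA data.

        import numpy as np

        rng = np.random.default_rng(0)
        t = np.cumsum(rng.exponential(1 / 50_000.0, size=200_000))  # 50 kcps

        window = 51.2e-6                       # 51.2 us observation window
        bins = 100
        edges = np.linspace(0.0, window, bins + 1)
        hist = np.zeros(bins)

        for i in range(len(t)):                # intervals to later pulses
            j = i + 1
            while j < len(t) and t[j] - t[i] < window:
                hist[np.searchsorted(edges, t[j] - t[i]) - 1] += 1
                j += 1

        # A purely random source gives a flat histogram; correlated fission
        # chains add a decaying exponential whose amplitude carries the
        # coincidence (multiplicity) information.
        print("counts in first / last bin:", int(hist[0]), int(hist[-1]))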

  11. Applied Computational Intelligence for finance and economics

    OpenAIRE

    Isasi Viñuela, Pedro; Quintana Montero, David; Sáez Achaerandio, Yago; Mochón, Asunción

    2007-01-01

    This article introduces some relevant research works on computational intelligence applied to finance and economics. The objective is to offer an appropriate context and a starting point for those who are new to computational intelligence in finance and economics and to give an overview of the most recent works. A classification with five different main areas is presented. Those areas are related with different applications of the most modern computational intelligence techniques showing a ne...

  12. Beam-Based Procedures for RF Guns

    CERN Document Server

    Krasilnikov, Mikhail; Grabosch, H J; Hartrott, Michael; Hui Han, Jang; Miltchev, Velizar; Oppelt, Anne; Petrosyan, Bagrat; Staykov, Lazar; Stephan, Frank

    2005-01-01

    A wide range of rf photoinjector parameters has to be optimized in order to achieve the electron source performance required for linac-based high-gain FELs. Some of the machine parameters cannot be precisely controlled by direct measurements, whereas the tolerances on them are extremely tight; they must therefore be met with beam-based techniques. Procedures for beam-based alignment (BBA) of the laser on the photocathode as well as solenoid alignment have been developed. They were applied at the Photo Injector Test facility at DESY Zeuthen (PITZ) and at the photoinjector of the VUV-FEL at DESY Hamburg. The field balance of the accelerating mode in the 1 ½ cell gun cavity is one of the key beam dynamics issues of the rf gun. Since no direct field measurement in the half and full cells of the cavity is available for the PITZ gun, a beam-based technique to determine the field balance has been proposed. A beam-based rf phase monitoring procedure has been developed as well.

  13. A general digital computer procedure for synthesizing linear automatic control systems

    International Nuclear Information System (INIS)

    Cummins, J.D.

    1961-10-01

    The fundamental concepts required for synthesizing a linear automatic control system are considered. A generalized procedure for synthesizing automatic control systems is demonstrated. This procedure has been programmed for the Ferranti Mercury and the IBM 7090 computers. Details of the programmes are given. The procedure uses the linearized set of equations which describe the plant to be controlled as the starting point. Subsequent computations determine the transfer functions between any desired variables. The programmes also compute the root and phase loci for any linear (and some non-linear) configurations in the complex plane, the open loop and closed loop frequency responses of a system, the residues of a function of the complex variable 's' and the time response corresponding to these residues. With these general programmes available the design of 'one point' automatic control systems becomes a routine scientific procedure. Also dynamic assessments of plant may be carried out. Certain classes of multipoint automatic control problems may also be solved with these procedures. Autonomous systems, invariant systems and orthogonal systems may also be studied. (author)
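
    One of the routines described, computing root loci as a loop gain varies, reduces to repeatedly finding the roots of the closed-loop characteristic polynomial. A small numpy sketch with a textbook stand-in plant (not taken from the report):

        import numpy as np

        # Plant G(s) = 1 / (s (s+1) (s+2)); Routh's criterion gives
        # closed-loop stability for 0 < K < 6, which the loci confirm.
        den = np.array([1.0, 3.0, 2.0, 0.0])   # s^3 + 3 s^2 + 2 s
        num = np.array([1.0])                  # unity numerator

        for K in (0.5, 2.0, 6.0, 10.0):
            # Closed-loop characteristic equation: den(s) + K num(s) = 0.
            char_poly = den.copy()
            char_poly[-len(num):] += K * num
            poles = np.roots(char_poly)
            stable = bool(all(p.real < 0 for p in poles))
            print(f"K = {K:5.1f}: poles = {np.round(poles, 3)}, "
                  f"stable = {stable}")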

  14. Identity-Based Authentication for Cloud Computing

    Science.gov (United States)

    Li, Hongwei; Dai, Yuanshun; Tian, Ling; Yang, Haomiao

    Cloud computing is a recently developed new technology for complex systems with massive-scale services sharing among numerous users. Therefore, authentication of both users and services is a significant issue for the trust and security of cloud computing. The SSL Authentication Protocol (SAP), once applied in cloud computing, becomes so complicated that users face a heavy load in both computation and communication. This paper, based on the identity-based hierarchical model for cloud computing (IBHMCC) and its corresponding encryption and signature schemes, presents a new identity-based authentication protocol for cloud computing and services. Through simulation testing, it is shown that the authentication protocol is more lightweight and efficient than SAP, especially on the user side. This merit, together with its great scalability, makes the model well suited to the massive-scale cloud.

  15. Pretreatment procedures applied to samples to be analysed by neutron activation analysis at CDTN/CNEN

    International Nuclear Information System (INIS)

    Francisco, Dovenir; Menezes, Maria Angela de Barros Correia

    2009-01-01

    The neutron activation technique - using several methods - has been applied in 80% of the analytical demand of the Division for Reactor and Analytical Techniques at CDTN/CNEN, Belo Horizonte, Minas Gerais. This scenario emphasizes the responsibility of the Laboratory to provide and assure the quality of the measurements. The first step in assuring the quality of the results is the preparation of the samples. Therefore, this paper describes the experimental procedures adopted at CDTN/CNEN in order to standardize the conditions of analysis and to avoid contamination by elements present everywhere. Some of the procedures are based on methods described in the literature; others are based on many years of experience preparing samples from many kinds of matrices. The procedures described are related to geological material - soil, sediment, rock, gems, clay, archaeological ceramics and ore - biological materials - hair, fish, plants, food - water, etc. Analytical results for sediment samples are shown as an example, pointing out the efficiency of the experimental procedure. (author)

  16. Mixture-based gatekeeping procedures in adaptive clinical trials.

    Science.gov (United States)

    Kordzakhia, George; Dmitrienko, Alex; Ishida, Eiji

    2018-01-01

    Clinical trials with data-driven decision rules often pursue multiple clinical objectives such as the evaluation of several endpoints or several doses of an experimental treatment. These complex analysis strategies give rise to "multivariate" multiplicity problems with several components or sources of multiplicity. A general framework for defining gatekeeping procedures in clinical trials with adaptive multistage designs is proposed in this paper. The mixture method is applied to build a gatekeeping procedure at each stage and inferences at each decision point (interim or final analysis) are performed using the combination function approach. An advantage of utilizing the mixture method is that it enables powerful gatekeeping procedures applicable to a broad class of settings with complex logical relationships among the hypotheses of interest. Further, the combination function approach supports flexible data-driven decisions such as a decision to increase the sample size or remove a treatment arm. The paper concludes with a clinical trial example that illustrates the methodology by applying it to develop an adaptive two-stage design with a mixture-based gatekeeping procedure.
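
    The combination function approach mentioned above can be stated concretely: stage-wise p-values are merged with pre-specified weights via the inverse-normal method, which remains valid under data-driven adaptations such as a sample-size increase. The sketch below (weights, p-values, and significance level all invented) uses scipy.

        import math
        from scipy.stats import norm

        def inverse_normal_combination(p1, p2,
                                       w1=math.sqrt(0.5), w2=math.sqrt(0.5)):
            """Combined p-value; the weights satisfy w1**2 + w2**2 == 1."""
            z = w1 * norm.ppf(1 - p1) + w2 * norm.ppf(1 - p2)
            return 1 - norm.cdf(z)

        # A hypothesis is rejected only if its combined p-value clears the
        # significance level passed down by the gatekeeping procedure.
        p = inverse_normal_combination(0.04, 0.01)
        print(f"combined p = {p:.4f}; reject at 0.025: {p < 0.025}")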

  17. Efficient Reanalysis Procedures in Structural Topology Optimization

    DEFF Research Database (Denmark)

    Amir, Oded

    This thesis examines efficient solution procedures for the structural analysis problem within topology optimization. The research is motivated by the observation that when the nested approach to structural optimization is applied, most of the computational effort is invested in repeated solutions...... on approximate reanalysis. For cases where memory limitations require the utilization of iterative equation solvers, we suggest efficient procedures based on alternative termination criteria for such solvers. These approaches are tested on two- and three-dimensional topology optimization problems including...

  18. Computational anatomy based on whole body imaging basic principles of computer-assisted diagnosis and therapy

    CERN Document Server

    Masutani, Yoshitaka

    2017-01-01

    This book deals with computational anatomy, an emerging discipline recognized in medical science as a derivative of conventional anatomy. It is also a completely new research area on the boundaries of several sciences and technologies, such as medical imaging, computer vision, and applied mathematics. Computational Anatomy Based on Whole Body Imaging highlights the underlying principles, basic theories, and fundamental techniques in computational anatomy, which are derived from conventional anatomy, medical imaging, computer vision, and applied mathematics, in addition to various examples of applications in clinical data. The book will cover topics on the basics and applications of the new discipline. Drawing from areas in multidisciplinary fields, it provides comprehensive, integrated coverage of innovative approaches to computational anatomy. As well, Computational Anatomy Based on Whole Body Imaging serves as a valuable resource for researchers including graduate students in the field and a connection with ...

  19. Equivalent model construction for a non-linear dynamic system based on an element-wise stiffness evaluation procedure and reduced analysis of the equivalent system

    Science.gov (United States)

    Kim, Euiyoung; Cho, Maenghyo

    2017-11-01

    In most non-linear analyses, the construction of a system matrix uses a large amount of computation time, comparable to the computation time required by the solving process. If the process for computing non-linear internal force matrices is substituted with an effective equivalent model that enables the bypass of numerical integrations and assembly processes used in matrix construction, efficiency can be greatly enhanced. A stiffness evaluation procedure (STEP) establishes non-linear internal force models using polynomial formulations of displacements. To efficiently identify an equivalent model, the method has evolved such that it is based on a reduced-order system. The reduction process, however, makes the equivalent model difficult to parameterize, which significantly affects the efficiency of the optimization process. In this paper, therefore, a new STEP, E-STEP, is proposed. Based on the element-wise nature of the finite element model, the stiffness evaluation is carried out element-by-element in the full domain. Since the unit of computation for the stiffness evaluation is restricted by element size, and since the computation is independent, the equivalent model can be constructed efficiently in parallel, even in the full domain. Due to the element-wise nature of the construction procedure, the equivalent E-STEP model is easily characterized by design parameters. Various reduced-order modeling techniques can be applied to the equivalent system in a manner similar to how they are applied in the original system. The reduced-order model based on E-STEP is successfully demonstrated for the dynamic analyses of non-linear structural finite element systems under varying design parameters.

  20. Mathematical modeling and computational intelligence in engineering applications

    CERN Document Server

    Silva Neto, Antônio José da; Silva, Geraldo Nunes

    2016-01-01

    This book brings together a rich selection of studies in mathematical modeling and computational intelligence, with application in several fields of engineering, like automation, biomedical, chemical, civil, electrical, electronic, geophysical and mechanical engineering, on a multidisciplinary approach. Authors from five countries and 16 different research centers contribute with their expertise in both the fundamentals and real problems applications based upon their strong background on modeling and computational intelligence. The reader will find a wide variety of applications, mathematical and computational tools and original results, all presented with rigorous mathematical procedures. This work is intended for use in graduate courses of engineering, applied mathematics and applied computation where tools as mathematical and computational modeling, numerical methods and computational intelligence are applied to the solution of real problems.

  1. Study on computer-aided simulation procedure for multicomponent separating cascade

    International Nuclear Information System (INIS)

    Kinoshita, Masahiro

    1982-11-01

    The present report reviews the author's study on the computer-aided simulation procedure for a multicomponent separating cascade. As a conclusion, two very powerful simulation procedures have been developed for cascades composed of separating elements whose separation factors are very large. They are applicable in cases where interstage flow rates are input variables for the calculation and stage separation factors are given either as constants or as functions of compositions of the up and down streams. As an application of the new procedure, a computer-aided simulation study has been performed for hydrogen isotope separating cascades by porous membrane method. A cascade system configuration is developed and pertinent design specifications are determined in an example case of the feed conditions and separation requirements. (author)

  2. Fail-safe computer-based plant protection systems

    International Nuclear Information System (INIS)

    Keats, A.B.

    1983-01-01

    A fail-safe mode of operation for computers used in nuclear reactor protection systems was first evolved in the UK for application to a sodium cooled fast reactor. The fail-safe properties of both the hardware and the software were achieved by permanently connecting test signals to some of the multiplexed inputs. This results in an unambiguous data pattern, each time the inputs are sequentially scanned by the multiplexer. The ''test inputs'' simulate transient excursions beyond defined safe limits. The alternating response of the trip algorithms to the ''out-of-limits'' test signals and the normal plant measurements is recognised by hardwired pattern recognition logic external to the computer system. For more general application to plant protection systems, a ''Test Signal Generator'' (TSG) is used to compute and generate test signals derived from prevailing operational conditions. The TSG, from its knowledge of the sensitivity of the trip algorithm to each of the input variables, generates a ''test disturbance'' which is superimposed upon each variable in turn, to simulate a transient excursion beyond the safe limits. The ''tripped'' status yielded by the trip algorithm when using data from a ''disturbed'' input forms part of a pattern determined by the order in which the disturbances are applied to the multiplexer inputs. The data pattern formed by the interleaved test disturbances is again recognised by logic external to the protection system's computers. This fail-safe mode of operation of computer-based protection systems provides a powerful defence against common-mode failure. It also reduces the importance of software verification in the licensing procedure. (author)
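
    An illustrative sketch (not the UK system itself) of the fail-safe idea: known out-of-limits test signals are interleaved with plant inputs on the multiplexer, so a healthy trip algorithm must produce a fixed alternating trip/no-trip pattern that external hardwired logic can recognize. All names, limits, and the pattern rule are invented.

        SAFE_LIMIT = 100.0

        def trip_algorithm(value):
            """Trip when the measurement exceeds the defined safe limit."""
            return value > SAFE_LIMIT

        def scan(plant_inputs, test_signal=150.0):
            """Multiplexer scan: a test input follows each plant channel."""
            outputs = []
            for v in plant_inputs:
                outputs.append(trip_algorithm(v))            # plant channel
                outputs.append(trip_algorithm(test_signal))  # simulated excursion
            return outputs

        def pattern_ok(outputs):
            """Stand-in for the hardwired pattern-recognition logic: every
            second output must be 'tripped' while the plant is in limits."""
            return all(outputs[1::2]) and not any(outputs[0::2])

        healthy = scan([42.0, 73.0, 55.0])
        print("pattern recognized:", pattern_ok(healthy))

        stuck = [False] * len(healthy)       # failed computer: constant output
        print("stuck computer detected:", not pattern_ok(stuck))

    In this scheme any departure from the expected pattern, whether caused by a computer fault or by a genuine plant excursion, drives the system toward the safe (tripped) state, which is the intended fail-safe direction.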

  3. Navigation of guidewires and catheters in the body during intervention procedures : A review of computer-based models

    NARCIS (Netherlands)

    Sharei Amarghan, H.; Alderliesten, Tanja; van den Dobbelsteen, J.J.; Dankelman, J.

    2018-01-01

    Guidewires and catheters are used during minimally invasive interventional procedures to traverse the vascular system and access the desired position. Computer models are increasingly being used to predict the behavior of these instruments. This information can be used to choose the right

  4. 12 CFR 516.5 - Do the same procedures apply to all applications under this part?

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Do the same procedures apply to all applications under this part? 516.5 Section 516.5 Banks and Banking OFFICE OF THRIFT SUPERVISION, DEPARTMENT OF THE TREASURY APPLICATION PROCESSING PROCEDURES § 516.5 Do the same procedures apply to all...

  5. 4th International Conference on Applied Computing and Information Technology

    CERN Document Server

    2017-01-01

    This edited book presents scientific results of the 4th International Conference on Applied Computing and Information Technology (ACIT 2016), held on December 12–14, 2016 in Las Vegas, USA. The aim of the conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the numerous fields of computer science, to share experiences and exchange new ideas in a meaningful way, and to present research results on all aspects (theory, applications and tools) of computer and information science together with the practical challenges encountered along the way and the solutions adopted to solve them. The conference organizers selected the best papers from those accepted for presentation at the conference. The papers were chosen based on review scores submitted by members of the Program Committee and underwent further rigorous rounds of review. Th...

  6. Vision based flight procedure stereo display system

    Science.gov (United States)

    Shen, Xiaoyun; Wan, Di; Ma, Lan; He, Yuncheng

    2008-03-01

    A virtual reality flight procedure vision system is introduced in this paper. The digital flight map database is established based on a Geographic Information System (GIS) and high-definition satellite remote sensing photos. The flight approach area database is established through a computer 3D modeling system and GIS. The area texture is generated from remote sensing photos and aerial photographs at various levels of detail. According to the flight approach procedure, the flight navigation information is linked to the database, so the view of the flight approach area can be displayed dynamically following the designed flight procedure. The flight approach area images are rendered in two channels, one for left-eye images and the other for right-eye images. Through the polarized stereoscopic projection system, pilots and aircrew can get a vivid 3D view of the approach area at the flight destination. Using this system in the pilots' preflight preparation gives the aircrew more vivid information along the destination approach area. The system can improve the aviator's self-confidence before carrying out the flight mission and, accordingly, flight safety is improved. The system is also useful for validating visual flight procedure designs, and it thereby supports flight procedure design.

  7. A single-photon ecat reconstruction procedure based on a PSF model

    International Nuclear Information System (INIS)

    Ying-Lie, O.

    1984-01-01

    Emission Computed Axial Tomography (ECAT) has been applied in nuclear medicine for the past few years. Owing to attenuation and scatter along the ray path, adequate correction methods are required. In this thesis, a correction method for attenuation, detector response and Compton scatter has been proposed. The method developed is based on a PSF model, whose parameters were derived by fitting experimental and simulation data. Because of its flexibility, a Monte Carlo simulation method has been employed. Using the PSF models, it was found that the ECAT problem can be described by the resulting modified equation. Application of the reconstruction procedure to simulation data yielded satisfactory results. However, the algorithm tends to amplify noise and distortion in the data; therefore, the applicability of the method to patient studies remains to be seen. (Auth.)

  8. Research in Applied Mathematics, Fluid Mechanics and Computer Science

    Science.gov (United States)

    1999-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1998 through March 31, 1999.

  9. TEACHING AND LEARNING METHODOLOGIES SUPPORTED BY ICT APPLIED IN COMPUTER SCIENCE

    Directory of Open Access Journals (Sweden)

    Jose CAPACHO

    2016-04-01

    The main objective of this paper is to show a set of new methodologies, applied in the teaching of Computer Science, that use ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory, Genetic-Cognitive Psychology Theory and Dialectics Psychology. Based on this theoretical framework the following methodologies were developed: Game Theory, Constructivist Approach, Personalized Teaching, Problem Solving, Cooperative-Collaborative Learning, and learning projects using ICT. These methodologies were applied to the teaching-learning process during the Algorithms and Complexity (A&C) course, which belongs to the area of Computer Science. The course develops the concepts of Computers, Complexity and Intractability, Recurrence Equations, Divide and Conquer, Greedy Algorithms, Dynamic Programming, the Shortest Path Problem and Graph Theory. The main value of the research is the theoretical support of the methodologies and their application, supported by ICT, using learning objects. The aforementioned course was built on the Blackboard platform, on which the operation of the methodologies was evaluated. The results of the evaluation are presented for each of them, showing the learning outcomes achieved by the students, which verifies that the methodologies are functional.

  10. 2nd International Doctoral Symposium on Applied Computation and Security Systems

    CERN Document Server

    Cortesi, Agostino; Saeed, Khalid; Chaki, Nabendu

    2016-01-01

    The book contains the extended version of the works that have been presented and discussed in the Second International Doctoral Symposium on Applied Computation and Security Systems (ACSS 2015) held during May 23-25, 2015 in Kolkata, India. The symposium has been jointly organized by the AGH University of Science & Technology, Cracow, Poland; Ca’ Foscari University, Venice, Italy and University of Calcutta, India. The book is divided into volumes and presents dissertation works in the areas of Image Processing, Biometrics-based Authentication, Soft Computing, Data Mining, Next Generation Networking and Network Security, Remote Healthcare, Communications, Embedded Systems, Software Engineering and Service Engineering.

  11. Computing Gröbner fans

    DEFF Research Database (Denmark)

    Fukuda, K.; Jensen, Anders Nedergaard; Thomas, R.R.

    2005-01-01

    This paper presents algorithms for computing the Gröbner fan of an arbitrary polynomial ideal. The computation involves enumeration of all reduced Gröbner bases of the ideal. Our algorithms are based on a uniform definition of the Gröbner fan that applies to both homogeneous and non-homogeneous ideals and a proof that this object is a polyhedral complex. We show that the cells of a Gröbner fan can easily be oriented acyclically and with a unique sink, allowing their enumeration by the memory-less reverse search procedure. The significance of this follows from the fact that Gröbner fans are not always normal fans of polyhedra, in which case reverse search applies automatically. Computational results using our implementation of these algorithms in the software package Gfan are included.
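
    The reverse-search enumeration itself is implemented in the authors' Gfan package; as a toy illustration of why the fan has more than one cell, the sketch below (using sympy, not Gfan) computes the reduced Gröbner basis of one small, arbitrarily chosen ideal under three term orders and prints the distinct bases.

        # Toy illustration (not Gfan): the same ideal has different reduced
        # Groebner bases under different term orders; these bases are exactly
        # what the cells (maximal cones) of the Groebner fan index.
        from sympy import groebner, symbols

        x, y = symbols("x y")
        ideal = [x**2 - y, y**2 - x]

        for order in ("lex", "grlex", "grevlex"):
            gb = groebner(ideal, x, y, order=order)
            print(order, list(gb.exprs))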

  12. Report on nuclear industry quality assurance procedures for safety analysis computer code development and use

    International Nuclear Information System (INIS)

    Sheron, B.W.; Rosztoczy, Z.R.

    1980-08-01

    As a result of a request from Commissioner V. Gilinsky to investigate in detail the causes of an error discovered in a vendor Emergency Core Cooling System (ECCS) computer code in March, 1978, the staff undertook an extensive investigation of the vendor quality assurance practices applied to safety analysis computer code development and use. This investigation included inspections of code development and use practices of the four major Light Water Reactor Nuclear Steam Supply System vendors and a major reload fuel supplier. The conclusion reached by the staff as a result of the investigation is that vendor practices for code development and use are basically sound. A number of areas were identified, however, where improvements to existing vendor procedures should be made. In addition, the investigation also addressed the quality assurance (QA) review and inspection process for computer codes and identified areas for improvement

  13. CSNS computing environment Based on OpenStack

    Science.gov (United States)

    Li, Yakang; Qi, Fazhi; Chen, Gang; Wang, Yanming; Hong, Jianshu

    2017-10-01

    Cloud computing allows more flexible configuration of IT resources and optimized hardware utilization, and it can provide computing services according to real need. We are applying this computing mode to the China Spallation Neutron Source (CSNS) computing environment. Firstly, the CSNS experiment and its computing scenarios and requirements are introduced in this paper. Secondly, the design and practice of a cloud computing platform based on OpenStack are demonstrated from the aspects of the cloud computing system framework, network, storage and so on. Thirdly, some improvements we made to OpenStack are discussed further. Finally, the current status of the CSNS cloud computing environment is summarized at the end of this paper.
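
    As a hedged illustration of how such an OpenStack platform is driven programmatically, the sketch below boots a compute instance with the openstacksdk client; the cloud, image, flavor and network names are placeholders, not CSNS configuration.

        # Hypothetical sketch using openstacksdk; all names are placeholders.
        import openstack

        conn = openstack.connect(cloud="csns")          # credentials from clouds.yaml

        image = conn.compute.find_image("CentOS-7")     # assumed image name
        flavor = conn.compute.find_flavor("m1.medium")  # assumed flavor
        net = conn.network.find_network("physics-net")  # assumed tenant network

        server = conn.compute.create_server(
            name="csns-analysis-01",
            image_id=image.id,
            flavor_id=flavor.id,
            networks=[{"uuid": net.id}],
        )
        server = conn.compute.wait_for_server(server)   # block until ACTIVE
        print(server.status)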

  14. Research in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  15. Procedural Audio in Computer Games Using Motion Controllers: An Evaluation on the Effect and Perception

    Directory of Open Access Journals (Sweden)

    Niels Böttcher

    2013-01-01

    A study has been conducted into whether the use of procedural audio affects players in computer games using motion controllers. It was investigated whether (1) players perceive a difference between detailed, interactive procedural audio and prerecorded audio, (2) the use of procedural audio affects their motor behavior, and (3) procedural audio affects their perception of control. Three experimental surveys were devised, two consisting of game sessions and the third consisting of watching videos of gameplay. A skiing game controlled by a Nintendo Wii balance board and a sword-fighting game controlled by a Wii remote were implemented with two versions of sound, one sample-based and the other procedurally based. The procedural models were designed using a perceptual approach and by alternative combinations of well-known synthesis techniques. The experimental results showed that, whether actively playing or purely observing a video recording of a game, the majority of participants did not notice any difference in sound. Additionally, it was not possible to show that the use of procedural audio caused any consistent change in motor behavior. In the skiing experiment, a portion of players perceived the control of the procedural version as being more sensitive.
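
    The paper's synthesis models are not detailed in this abstract; the sketch below is a minimal example of the general idea of procedural audio: a sword "whoosh" generated at runtime from white noise, with loudness and brightness driven by a (here faked) controller swing speed. All constants are illustrative.

        # Minimal procedural-audio sketch (illustrative, not the study's models):
        # noise shaped by a speed envelope, low-pass cutoff tracks the motion.
        import math, random, wave, struct

        SR, DUR = 44100, 1.0
        n = int(SR * DUR)

        samples, lp = [], 0.0
        for i in range(n):
            t = i / SR
            speed = math.sin(math.pi * t / DUR)          # fake swing speed, 0..1
            cutoff = 200.0 + 4000.0 * speed              # brighter when faster
            a = math.exp(-2.0 * math.pi * cutoff / SR)   # one-pole low-pass coeff
            noise = random.uniform(-1.0, 1.0)
            lp = a * lp + (1.0 - a) * noise              # filtered noise
            samples.append(0.8 * speed * lp)             # loudness tracks speed

        with wave.open("whoosh.wav", "wb") as f:
            f.setnchannels(1); f.setsampwidth(2); f.setframerate(SR)
            f.writeframes(b"".join(struct.pack("<h", int(32767 * s))
                                   for s in samples))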

  16. GRUKON - A package of applied computer programs system input and operating procedures of functional modules

    International Nuclear Information System (INIS)

    Sinitsa, V.V.; Rineiskij, A.A.

    1993-04-01

    This manual describes a software package for the production of multigroup neutron cross-sections from evaluated nuclear data files. It presents the information necessary for implementing the program's modules in the framework of the execution of the program, including the operating procedures of the program, the data input, the macrocommand language, and the assignment of the system's procedures. The report also presents the methodology used in the coding of the individual modules: the rules, the syntax and the use of procedures. Finally, the report presents an example of the application of the data processing module. (author)

  17. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1984-05-01

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures
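
    The abstract names a "linguistic algebra" without giving its rules; the sketch below shows one plausible such algebra, assumed purely for illustration: qualitative ratings mapped to ordered levels, combined conjunctively with min for each threat-target pair and disjunctively with max across pairs.

        # Assumed illustration of combining qualitative scores; the report's
        # actual linguistic algebra is not specified in this abstract.
        LEVELS = ["negligible", "low", "moderate", "high", "critical"]
        RANK = {name: i for i, name in enumerate(LEVELS)}

        def combine(vulnerability: str, impact: str) -> str:
            """Risk of one threat-target pair: limited by the weaker factor
            (min), a conservative conjunctive rule."""
            return LEVELS[min(RANK[vulnerability], RANK[impact])]

        def overall(pairs) -> str:
            """Overall risk: worst case over all threat-target pairs (max)."""
            return LEVELS[max(RANK[combine(v, i)] for v, i in pairs)]

        pairs = [("high", "moderate"), ("low", "critical"), ("critical", "high")]
        print(overall(pairs))  # 'high'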

  18. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    Science.gov (United States)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The prediction of the behavior of such systems is carried out by means of computational models based on partial differential equations (PDEs) [1]. Due to the enormous size of the discretized versions of such PDEs, it is necessary to apply highly parallelized supercomputers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software, which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90% or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk at this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES [1]. Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An Axiomatic Approach", John Wiley, 243p., 2012. [2]. Herrera, I., de la Cruz, L.M. and Rosas-Medina, A., "Non-Overlapping Discretization Methods for Partial Differential Equations", NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3]. Herrera, I., and Contreras, Iván, "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity", Geofísica Internacional, 2015 (In press)

  19. Computed tomography after radical pancreaticoduodenectomy (Whipple's procedure)

    International Nuclear Information System (INIS)

    Smith, S.L.; Hampson, F.; Duxbury, M.; Rae, D.M.; Sinclair, M.T.

    2008-01-01

    Whipple's procedure (radical pancreaticoduodenectomy) is currently the only curative option for patients with periampullary malignancy. The surgery is highly complex and involves multiple anastomoses. Complications are common and can lead to significant postoperative morbidity. Early detection and treatment of complications is vital, and high-quality multidetector computed tomography (MDCT) is currently the best method of investigation. This review outlines the surgical technique and illustrates the range of normal postoperative appearances together with the common complications

  20. MOBILE CLOUD COMPUTING APPLIED TO HEALTHCARE APPROACH

    OpenAIRE

    Omar AlSheikSalem

    2016-01-01

    In the past few years it has become clear that mobile cloud computing was established by integrating mobile computing and cloud computing, gaining in both storage space and processing speed. Integrating healthcare applications and services is one of the vast data approaches that can be adapted to mobile cloud computing. This work proposes a framework for global healthcare computing that combines mobile computing and cloud computing. This approach leads to integrate all of ...

  1. A Neural Networks Based Operation Guidance System for Procedure Presentation and Validation

    International Nuclear Information System (INIS)

    Seung, Kun Mo; Lee, Seung Jun; Seong, Poong Hyun

    2006-01-01

    In this paper, a neural network based operator support system is proposed to reduce operators' errors in abnormal situations in nuclear power plants (NPPs). There are many complicated situations in which regular and suitable operations should be carried out by operators accordingly. In order to regulate and validate operators' actions, it is necessary to develop an operator support system which includes computer-based procedures with functions for operation validation. Many computerized procedure systems (CPSs) have recently been developed. Focusing on human-machine interface (HMI) design and the computerization of procedures, most CPSs used various methodologies to enhance the system's convenience, reliability and accessibility. Rather than only showing procedures, the proposed system integrates a simple CPS and an operation validation system (OVS), using an artificial neural network (ANN) for operational permission and quantitative evaluation

  2. NEPTUNIX 2: Operating on computers network - Catalogued procedures

    International Nuclear Information System (INIS)

    Roux, Pierre.

    1982-06-01

    NEPTUNIX 2 is a package which carries out the simulation of complex processes described by numerous non-linear algebro-differential equations. Its main features are: non-linear or time-dependent parameters, implicit form, stiff systems, and dynamic change of equations leading to discontinuities in some variables. The mathematical model is thus built with an equation set F(x, x', l, t) = 0, where t is the independent variable, x' the derivative of x and l an 'algebrized' logical variable. The NEPTUNIX 2 package is divided into two successive major steps: a non-numerical step and a numerical step. The numerical step, using results from a picture of the model translated into FORTRAN in a form fitted for the executing computer, carries out the simulations; in this way, the NEPTUNIX 2 numerical step is portable. By contrast, the non-numerical step must be executed on an IBM 370-series computer or on a compatible computer. The present manual describes NEPTUNIX 2 operating procedures when the two steps are executed on the same computer, and also when the numerical step is executed on another computer, connected or not to the same computing network

  3. Validation study of a computer-based open surgical trainer: SimPraxis(®) simulation platform.

    Science.gov (United States)

    Tran, Linh N; Gupta, Priyanka; Poniatowski, Lauren H; Alanee, Shaheen; Dall'era, Marc A; Sweet, Robert M

    2013-01-01

    Technological advances have dramatically changed medical education, particularly in the era of work-hour restrictions, which increasingly highlights a need for novel methods to teach surgical skills. The purpose of this study was to evaluate the validity of a novel, computer-based, interactive, cognitive simulator for training surgeons to perform pelvic lymph node dissection (PLND). Eight prostate cancer experts evaluated the content of the simulator. Contextual aspects of the simulator were rated on a five-point Likert scale. The experts and nine first-year residents completed a simulated PLND. Time and deviations were logged, and the results were compared between experts and novices using the Mann-Whitney test. Before training, 88% of the experts felt that a validated simulator would be useful for PLND training. After testing, 100% of the experts felt that it would be more useful than standard video training. Eighty-eight percent stated that they would like to see the simulator in the curriculum of residency programs, and 56% thought it would be useful for accreditation purposes. The experts felt that the simulator aided in overall understanding of the procedure, in teaching its indications, concepts and steps, in training how to use an assistant, and in enhancing knowledge of the anatomy. Median times taken by experts and interns to complete a PLND procedure on the simulator were 12.62 and 23.97 minutes, respectively. Median deviation from the incorporated procedure pathway was 24.5 for experts and 89 for novices. We describe an interactive, computer-based simulator designed to assist in mastery of the cognitive steps of an open surgical procedure. This platform is intuitive and flexible, and could be applied to any stepwise medical procedure. Overall, experts outperformed novices in their performance on the trainer. Experts agreed that the content was acceptable, accurate, and representative.

  4. Sagittal reconstruction computed tomography in metrizamide cisternography. Useful diagnostic procedure for malformations in craniovertebral junction and posterior fossa

    Energy Technology Data Exchange (ETDEWEB)

    Mochizuki, H.; Okita, N.; Fujii, T.; Yoshioka, M.; Saito, H. (Tohoku Univ., Sendai (Japan). School of Medicine)

    1982-08-01

    We studied the sagittal reconstruction technique in computed tomography with metrizamide. Ten ml of metrizamide, 170 mg iodine/ml in concentration, were injected by lumbar puncture. After diffusion of the injected metrizamide, axial computed tomograms were taken with a thin slice width (5 mm) using an overlapping technique. Electronic sagittal reconstruction was then carried out with optional software. Injection of metrizamide, a non-ionic water-soluble contrast medium, produced clear contrast among bone, brain parenchyma and cerebrospinal fluid on computed tomography. The sagittal reconstruction technique could reveal more precise details and more accurate anatomical relations than ordinary axial computed tomography. The technique was applied in 3 cases (Arnold-Chiari malformation, large cisterna magna and partial agenesis of the cerebellar vermis), demonstrating a useful diagnostic procedure for abnormalities of the craniovertebral junction and posterior fossa. The adverse reactions to metrizamide were negligible in our series.

  5. Problems and Issues in Using Computer- Based Support Tools to Enhance 'Soft' Systems Methodologies

    Directory of Open Access Journals (Sweden)

    Mark Stansfield

    2001-11-01

    This paper explores the issue of whether computer-based support tools can enhance the use of 'soft' systems methodologies as applied to real-world problem situations. Although work has been carried out by a number of researchers in applying computer-based technology to concepts and methodologies relating to 'soft' systems thinking, such as Soft Systems Methodology (SSM), such attempts appear to be still in their infancy and have not been applied widely to real-world problem situations. This paper highlights some of the problems that may be encountered in attempting to develop computer-based support tools for 'soft' systems methodologies. Particular attention is paid to an attempt by the author to develop a computer-based support tool for a particular 'soft' systems method of inquiry known as the Appreciative Inquiry Method, which is based upon Vickers' notion of 'appreciation' (Vickers, 1965) and Checkland's SSM (Checkland, 1981). The final part of the paper explores some of the lessons learnt from developing and applying the computer-based support tool to a real-world problem situation, as well as considering the feasibility of developing computer-based support tools for 'soft' systems methodologies. The paper puts forward the point that a mixture of manual and computer-based tools should be employed to allow a methodology to be used in an unconstrained manner, while the benefits provided by computer-based technology are utilised in supporting and enhancing the more mundane and structured tasks.

  6. [Research activities in applied mathematics, fluid mechanics, and computer science

    Science.gov (United States)

    1995-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period April 1, 1995 through September 30, 1995.

  7. RPM-WEBBSYS: A web-based computer system to apply the rational polynomial method for estimating static formation temperatures of petroleum and geothermal wells

    Science.gov (United States)

    Wong-Loya, J. A.; Santoyo, E.; Andaverde, J. A.; Quiroz-Ruiz, A.

    2015-12-01

    A Web-Based Computer System (RPM-WEBBSYS) has been developed for the application of the Rational Polynomial Method (RPM) to estimate static formation temperatures (SFT) of geothermal and petroleum wells. The system is also capable of reproducing the full thermal recovery processes that occur during well completion. RPM-WEBBSYS has been programmed using advances in information technology to perform computations of SFT more efficiently. RPM-WEBBSYS can be easily and rapidly executed using any computing device (e.g., personal computers and portable computing devices such as tablets or smartphones) with Internet access and a web browser. The computer system was validated using bottomhole temperature (BHT) measurements logged in a synthetic heat transfer experiment, where a good match between predicted and true SFT was achieved. RPM-WEBBSYS was finally applied to BHT logs collected from well drilling and shut-in operations, where the typical problems of under- and over-estimation of the SFT (exhibited by most of the existing analytical methods) were effectively corrected.

  8. A bidirectional coupling procedure applied to multiscale respiratory modeling

    Science.gov (United States)

    Kuprat, A. P.; Kabilan, S.; Carson, J. P.; Corley, R. A.; Einstein, D. R.

    2013-07-01

    pressure applied to the multiple sets of ODEs. In both the simplified geometry and in the imaging-based geometry, the performance of the method was comparable to that of monolithic schemes, in most cases requiring only a single CFD evaluation per time step. Thus, this new accelerator allows us to begin combining pulmonary CFD models with lower-dimensional models of pulmonary mechanics with little computational overhead. Moreover, because the CFD and lower-dimensional models are totally separate, this framework affords great flexibility in terms of the type and breadth of the adopted lower-dimensional model, allowing the biomedical researcher to appropriately focus on model design. Research funded by the National Heart and Blood Institute Award 1RO1HL073598.
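
    The authors' accelerator is not specified in this excerpt; the sketch below shows a generic form of such interface coupling: fixed-point iteration between an expensive "CFD-like" evaluation and a cheap lumped ODE model, accelerated by Aitken under-relaxation. Both model functions are simple stand-ins, not the paper's models.

        # Generic sketch of bidirectional interface coupling with Aitken
        # under-relaxation; the two models below are illustrative stand-ins.
        def cfd_flow(p_interface: float) -> float:
            """Stand-in for a costly CFD solve: flow for a given pressure."""
            return 2.0 - 0.5 * p_interface

        def ode_pressure(q_interface: float, dt: float = 0.01) -> float:
            """Stand-in for the lower-dimensional ODE model: pressure response."""
            compliance = 0.8
            return q_interface / compliance * (1.0 + dt)

        def coupled_step(p0: float = 1.0, tol: float = 1e-10, max_it: int = 50):
            p_old, r_old, omega = p0, None, 0.5
            for _ in range(max_it):
                q = cfd_flow(p_old)          # expensive 3D evaluation
                p_new = ode_pressure(q)      # cheap 0D/1D evaluation
                r = p_new - p_old            # interface residual
                if abs(r) < tol:
                    return p_old
                if r_old is not None:        # Aitken update of the relaxation
                    omega = -omega * r_old / (r - r_old)
                p_old, r_old = p_old + omega * r, r
            return p_old

        print(coupled_step())                # converged interface pressure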

  9. A bidirectional coupling procedure applied to multiscale respiratory modeling

    Energy Technology Data Exchange (ETDEWEB)

    Kuprat, A.P., E-mail: andrew.kuprat@pnnl.gov [Fundamental and Computational Sciences Directorate, Pacific Northwest National Laboratory, Richland, WA (United States); Kabilan, S., E-mail: senthil.kabilan@pnnl.gov [Fundamental and Computational Sciences Directorate, Pacific Northwest National Laboratory, Richland, WA (United States); Carson, J.P., E-mail: james.carson@pnnl.gov [Fundamental and Computational Sciences Directorate, Pacific Northwest National Laboratory, Richland, WA (United States); Corley, R.A., E-mail: rick.corley@pnnl.gov [Fundamental and Computational Sciences Directorate, Pacific Northwest National Laboratory, Richland, WA (United States); Einstein, D.R., E-mail: daniel.einstein@pnnl.gov [Fundamental and Computational Sciences Directorate, Pacific Northwest National Laboratory, Richland, WA (United States)

    2013-07-01

    pressure applied to the multiple sets of ODEs. In both the simplified geometry and in the imaging-based geometry, the performance of the method was comparable to that of monolithic schemes, in most cases requiring only a single CFD evaluation per time step. Thus, this new accelerator allows us to begin combining pulmonary CFD models with lower-dimensional models of pulmonary mechanics with little computational overhead. Moreover, because the CFD and lower-dimensional models are totally separate, this framework affords great flexibility in terms of the type and breadth of the adopted lower-dimensional model, allowing the biomedical researcher to appropriately focus on model design. Research funded by the National Heart and Blood Institute Award 1RO1HL073598.

  10. A bidirectional coupling procedure applied to multiscale respiratory modeling

    International Nuclear Information System (INIS)

    Kuprat, A.P.; Kabilan, S.; Carson, J.P.; Corley, R.A.; Einstein, D.R.

    2013-01-01

    pressure applied to the multiple sets of ODEs. In both the simplified geometry and in the imaging-based geometry, the performance of the method was comparable to that of monolithic schemes, in most cases requiring only a single CFD evaluation per time step. Thus, this new accelerator allows us to begin combining pulmonary CFD models with lower-dimensional models of pulmonary mechanics with little computational overhead. Moreover, because the CFD and lower-dimensional models are totally separate, this framework affords great flexibility in terms of the type and breadth of the adopted lower-dimensional model, allowing the biomedical researcher to appropriately focus on model design. Research funded by the National Heart and Blood Institute Award 1RO1HL073598

  11. OSL sensitivity changes during single aliquot procedures: Computer simulations

    DEFF Research Database (Denmark)

    McKeever, S.W.S.; Agersnap Larsen, N.; Bøtter-Jensen, L.

    1997-01-01

    We present computer simulations of sensitivity changes obtained during single-aliquot, regeneration procedures. The simulations indicate that the sensitivity changes are the combined result of shallow-trap and deep-trap effects. Four separate processes have been identified. Although procedures can be suggested to eliminate the shallow-trap effects, it appears that the deep-trap effects cannot be removed. The character of the sensitivity changes which result from these effects is seen to be dependent upon several external parameters, including the extent of bleaching of the OSL signal, the laboratory...

  12. A sampler of useful computational tools for applied geometry, computer graphics, and image processing foundations for computer graphics, vision, and image processing

    CERN Document Server

    Cohen-Or, Daniel; Ju, Tao; Mitra, Niloy J; Shamir, Ariel; Sorkine-Hornung, Olga; Zhang, Hao (Richard)

    2015-01-01

    A Sampler of Useful Computational Tools for Applied Geometry, Computer Graphics, and Image Processing shows how to use a collection of mathematical techniques to solve important problems in applied mathematics and computer science areas. The book discusses fundamental tools in analytical geometry and linear algebra. It covers a wide range of topics, from matrix decomposition to curvature analysis and principal component analysis to dimensionality reduction.Written by a team of highly respected professors, the book can be used in a one-semester, intermediate-level course in computer science. It

  13. Statistical near-real-time accountancy procedures applied to AGNS [Allied General Nuclear Services] minirun data using PROSA

    International Nuclear Information System (INIS)

    Beedgen, R.

    1988-03-01

    The computer program PROSA (PROgram for Statistical Analysis of near-real-time accountancy data) was developed as a tool to apply statistical test procedures to a sequence of materials balance results for detecting losses of material. First applications of PROSA to model facility data and real plant data showed that PROSA is also usable as a tool for process or measurement control. To deepen the experience for the application of PROSA to real data of bulk-handling facilities, we applied it to uranium data of the Allied General Nuclear Services miniruns, where accountancy data were collected on a near-real-time basis. Minirun 6 especially was considered, and the pulsed columns were chosen as materials balance area. The structure of the measurement models for flow sheet data and actual operation data are compared, and methods are studied to reduce the error for inventory measurements of the columns
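
    The specific PROSA test statistics are not given in this abstract; as a generic illustration of a near-real-time accountancy test for protracted loss, the sketch below applies a one-sided Page (CUSUM) test to a sequence of standardized materials balance (MUF) results. The balance values, sigma and the k and h settings are invented for the example.

        # Illustrative sketch (not the PROSA code): one-sided Page/CUSUM test
        # on a sequence of materials-balance results for loss detection.
        def cusum_alarm(muf, sigma, k=0.5, h=5.0):
            """k: allowance, h: decision threshold, both in units of sigma."""
            s, alarms = 0.0, []
            for period, x in enumerate(muf, start=1):
                z = x / sigma                 # standardized balance result
                s = max(0.0, s + z - k)       # accumulate evidence of loss
                alarms.append((period, s, s > h))
            return alarms

        balances = [0.1, -0.3, 0.8, 1.2, 0.9, 1.5, 1.1, 1.4]  # illustrative
        for period, s, alarm in cusum_alarm(balances, sigma=0.4):
            print(f"period {period}: CUSUM={s:.2f} alarm={alarm}")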

  14. Pre-analysis techniques applied to area-based correlation aiming Digital Terrain Model generation

    Directory of Open Access Journals (Sweden)

    Maurício Galo

    2005-12-01

    Area-based matching is a useful procedure in some photogrammetric processes, and its results are of crucial importance in applications such as relative orientation, phototriangulation and Digital Terrain Model generation. The successful determination of correspondence depends on radiometric and geometric factors. Considering these aspects, the use of procedures that estimate in advance the quality of the parameters to be computed is a relevant issue. This paper describes these procedures and shows that the quality prediction can be computed before performing matching by correlation, through the analysis of the reference window. This procedure can be incorporated in the correspondence process for Digital Terrain Model generation and phototriangulation. The proposed approach comprises the estimation of the variance matrix of the translations from the gray levels in the reference window and the reduction of the search space using knowledge of the epipolar geometry. As a consequence, the correlation process becomes more reliable, avoiding the application of matching procedures in doubtful areas. Some experiments with simulated and real data are presented, demonstrating the efficiency of the proposed strategy.
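
    One standard way (assumed here; the paper's exact formulation is not given in the abstract) to estimate the variance matrix of the translations from the reference window alone is to invert the gradient structure tensor of the window, which also flags flat or one-directional texture before any correlation is attempted.

        # Sketch of pre-analysis for area-based matching: predicted covariance
        # of the matching shift from the reference window's gray levels.
        import numpy as np

        def translation_covariance(window: np.ndarray, noise_var: float = 1.0):
            """noise_var times the inverse gradient structure tensor."""
            gy, gx = np.gradient(window.astype(float))
            N = np.array([[np.sum(gx * gx), np.sum(gx * gy)],
                          [np.sum(gx * gy), np.sum(gy * gy)]])
            w = np.linalg.eigvalsh(N)              # ascending eigenvalues
            if w[0] <= 1e-8 * max(w[1], 1.0):      # flat or edge-only texture
                return None                        # reject: matching unreliable
            return noise_var * np.linalg.inv(N)

        rng = np.random.default_rng(0)
        print(translation_covariance(rng.normal(size=(21, 21))))  # usable window
        print(translation_covariance(np.zeros((21, 21))))         # None: reject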

  15. Finite element design procedure for correcting the coining die profiles

    Science.gov (United States)

    Alexandrino, Paulo; Leitão, Paulo J.; Alves, Luis M.; Martins, Paulo A. F.

    2018-05-01

    This paper presents a new finite element based design procedure for correcting the coining die profiles in order to optimize the distribution of pressure and the alignment of the resultant vertical force at the end of the die stroke. The procedure avoids time consuming and costly try-outs, does not interfere with the creative process of the sculptors and extends the service life of the coining dies by significantly decreasing the applied pressure and bending moments. The numerical simulations were carried out in a computer program based on the finite element flow formulation that is currently being developed by the authors in collaboration with the Portuguese Mint. A new experimental procedure based on the stack compression test is also proposed for determining the stress-strain curve of the materials directly from the coin blanks.

  16. Computation of stress intensity factors for nozzle corner cracks by various finite element procedures

    International Nuclear Information System (INIS)

    Broekhoven, M.J.G.

    1975-01-01

    The present study aims at deriving accurate K-factors for a series of 5 elliptical nozzle corner cracks of increasing size by various finite element procedures, using a three-level recursive substructuring scheme to perform the computations in an economic way on an intermediate size computer (IBM 360/65 system). A nozzle on a flat plate has been selected for subsequent experimental verification, this configuration being considered an adequate simulation of a nozzle on a shallow shell. The computations have been performed with the ASKA finite element system using mainly HEXEC-27 (incomplete quartic) elements. The geometry has been subdivided into 5 subnets with a total of 3515 nodal points and 6250 unknowns, two main nets and one hyper net. Each crack front is described by 11 nodal points and all crack front nodes are inserted in the hyper net, which allows for the realization of the successive crack geometries by changing only a relatively small hyper net (615 to 725 unknowns). Output data have been interpreted in terms of K-factors by the global energy method, the displacement method and the stress method. Besides, a stiffness derivative procedure, recently developed at Brown University, which takes full advantage of the finite element formulation to calculate local K-factors, has been applied. Finally it has been investigated whether sufficiently accurate results can be obtained by analyzing a considerably smaller part than one half of the geometry (as strictly required by symmetry considerations), using fixed boundary conditions derived from a far cheaper analysis of the uncracked structure

  17. Computer-Aided Test Flow in Core-Based Design

    OpenAIRE

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper deals with test-pattern generation and fault coverage determination in core-based design. The basic core-test strategy that one has to apply in core-based design is stated in this work. A Computer-Aided Test (CAT) flow is proposed, resulting in accurate fault coverage of embedded cores. The CAT flow is then applied to a few cores within the Philips Core Test Pilot IC project

  18. A web-based procedure for liver segmentation in CT images

    Science.gov (United States)

    Yuan, Rong; Luo, Ming; Wang, Luyao; Xie, Qingguo

    2015-03-01

    Liver segmentation in CT images has been acknowledged as a basic and indispensable part of systems for computer-aided liver surgery, for operation design and risk evaluation. In this paper, we introduce and implement a web-based procedure for liver segmentation to help radiologists and surgeons obtain an accurate result efficiently and conveniently. Several clinical datasets are used to evaluate its accessibility and accuracy. The procedure seems a promising approach for the extraction of liver volumetry for livers of various shapes. Moreover, it is possible for users to access the segmentation wherever the Internet is available, without any specific machine.

  19. Computer- and Suggestion-based Cognitive Rehabilitation following Acquired Brain Injury

    DEFF Research Database (Denmark)

    Lindeløv, Jonas Kristoffer

    ... That is, training does not cause cognitive transfer and thus does not constitute "brain training" or "brain exercise" of any clinical relevance. A larger study found more promising results for a suggestion-based treatment in a hypnotic procedure. Patients improved to above population average in a matter of 4-8 hours, making this by far the most effective treatment compared to computer-based training, physical exercise, pharmaceuticals, meditation, and attention process training. The contrast between computer-based methods and the hypnotic suggestion treatment may reflect a more general discrepancy...

  20. Quality Control Procedures Applied to the CMS Muon Chambers Built at CIEMAT

    International Nuclear Information System (INIS)

    Fouz, M. C.; Puerta Pelayo, J.

    2004-01-01

    In this document the quality control procedures applied to the CMS muon drift chambers built at CIEMAT are described, including a description of the high-voltage and front-end electronics associated with the chambers. Every procedure is described in detail, and a list of the more common problems and possible solutions is given. This document can be considered a chamber test handbook for beginners. (Author) 3 refs

  1. Computer-based systems important to safety (COMPSIS) - Reporting guidelines

    International Nuclear Information System (INIS)

    1999-07-01

    The objective of this procedure is to help the user prepare a COMPSIS report on an event so that important lessons learned are transferred to the database most efficiently. The procedure focuses on the content of the information to be provided in the report rather than on its format. The established procedure follows to a large extent the procedure chosen by the IRS incident reporting system. However, this database is built for I&C equipment, the purpose of the event report database being to collect and disseminate information on events of significance involving computer-based systems important to safety in nuclear power plants, and to feed back conclusions and lessons learnt from such events. For events where human performance is dominant in the lessons to be drawn, more detailed guidance on the specific information that should be supplied is spelled out in the present procedure. This guidance differs somewhat from that for the provision of technical information, and takes into account that the engineering world is usually less familiar with human behavioural analysis than with technical analysis. The events to be reported to the COMPSIS database should be based on the national reporting criteria in the participating member countries. The aim is that all reports involving computer-based systems that meet each country's reporting criteria should be reported. The database should give a broad picture of events/incidents occurring in operation with computer control systems. As soon as an event has been identified, the insights and lessons learnt to be conveyed to the international nuclear community shall be clearly identified. On the basis of the description of the event, the event shall be analyzed in detail with respect to direct and potential impacts on plant safety functions. The first part should show the common involvement of operation and safety systems, and the second part should show the special aspects of I&C functions, hardware and software

  2. System of didactic procedures to drive the teaching-learning of Computation at the middle school level

    Directory of Open Access Journals (Sweden)

    Isabel Alonso-Berenguer

    2016-09-01

    A system of didactic procedures is presented to drive the dynamics of the teaching-learning process of Computation at the middle school level, based on an interdisciplinary logic built from cognitive nodes. This practical construct was designed using the systemic-structural-functional method and was structured in two procedures, one concerning the appropriation of a computational culture and the other concerning the development of computational thinking, which in turn are composed of a set of linked, logically structured actions that make possible the development of said dynamic. It also has evaluative criteria and patterns of achievement that allow evaluation of the results obtained. The feasibility and relevance of the system of procedures were validated by conducting a socialization workshop with specialists of the territory and through a survey of specialists from other provinces. Its application during the last two years has enabled its improvement.

  3. Computer-based control systems of nuclear power plants

    International Nuclear Information System (INIS)

    Kalashnikov, V.K.; Shugam, R.A.; Ol'shevsky, Yu.N.

    1975-01-01

    Computer-based control systems of nuclear power plants may be classified into those using computers for data acquisition only, those using computers for data acquisition and data processing, and those using computers for process control. In the present paper a brief review is given of the functions the above-mentioned systems perform, their applications in different nuclear power plants, and some of their characteristics. The trend towards hierarchical systems using control computers with backup reserves already becomes clear when considering the control systems applied in the Canadian nuclear power plants, which are among the first equipped with process computers. The control system now under development for the large Soviet reactors of the WWER type will also be based on the use of control computers. The part of the system concerned with controlling the reactor assembly is described in detail

  4. The Effect of Animation in Multimedia Computer-Based Learning and Learning Style to the Learning Results

    Directory of Open Access Journals (Sweden)

    Muhammad RUSLI

    2017-10-01

    The effectiveness of learning depends on four main elements: content, desired learning outcome, instructional method and delivery media. The integration of these four elements can be manifested in a learning module, which is called multimedia learning or learning by using multimedia. In the context of learning with computer-based multimedia, two main things need to be considered so that the learning process can run effectively: how the content is presented, and the learner's chosen way of accepting and processing the information into meaningful knowledge. The first is related to the way the content is visualized and how people learn; the second is related to the learning style of the learner. This research investigates the effect of the type of visualization (static vs. animated) in multimedia computer-based learning, and of learning style (visual vs. verbal), on students' ability to apply the concepts, procedures and principles of Java programming. Visualization type acts as the independent variable, and the learning style of the students acts as a moderator variable. The instructional strategies followed the Component Display Theory of Merrill, and the format of the multimedia presentation followed the Seven Principles of Multimedia Learning of Mayer and Moreno. Learning with the multimedia computer-based learning materials took place in the classroom. The subjects of this research were students of STMIK-STIKOM Bali in the odd semester of 2016-2017 who were taking the Java programming course. The experimental design used multivariate analysis of variance, MANOVA 2 x 2, with a sample of 138 students in 4 classes. Based on the results of the analysis, it can be concluded that animation in interactive multimedia learning has a positive effect on improving students' learning outcomes, particularly in applying the concepts, procedures, and principles of Java programming. The...

  5. IDEA-system - a new computer based expert system for incorporation monitoring

    International Nuclear Information System (INIS)

    Doerfel, H.

    2005-01-01

    There is an increasing number of national and international recommendations and guidelines for incorporation monitoring (ICRP Publications, IAEA Safety Reports, ISO Standards, etc.). These recommendations cover different phases of incorporation monitoring and provide general requirements for the measuring techniques, the monitoring procedures and the procedures for evaluating intakes and doses from the monitoring results. There is, however, still a strong need for giving guidance to dosimetrists on how to apply all the regulations properly. Thus, the EU project IDEAS was launched in order to provide general guidelines for the assessment of internal dose from incorporation monitoring data. These guidelines have recently been discussed in a virtual workshop on the internet (www.ideas-workshop.de) and are being considered by ICRP for possible adoption in the near future. Recently, in the Karlsruhe Research Centre, a computer-based expert system has been developed to assist dosimetrists in applying the relevant recommendations and guidelines for incorporation monitoring and internal dosimetry. The expert system gives guidance to the user with respect to: planning of monitoring (estimation of potential exposures, decision on the requirements of monitoring, definition of optimum measuring techniques and monitoring intervals); performing routine and special monitoring; and evaluation of primary monitoring results. The evaluation of primary monitoring results is done according to the IDEAS guidelines in a three-stage procedure according to the expected level of exposure (E = committed effective dose), from standard evaluation with default or site-specific parameter values (E < 1 mSv) … up to special evaluation (E > 6 mSv). With these well-defined procedures the expert system follows the aim that all recommendations and guidelines are applied properly and thus: internal exposures of more than 1 mSv are very likely to be detected in all situations; the results in terms of committed effective...

  6. Increasing the speed of computational fluid dynamics procedure for minimization the nitrogen oxide polution from the premixed atmospheric gas burner

    Directory of Open Access Journals (Sweden)

    Fotev Vasko G.

    2017-01-01

    Full Text Available This article presents innovative method for increasing the speed of procedure which includes complex computational fluid dynamic calculations for finding the distance between flame openings of atmospheric gas burner that lead to minimal NO pollution. The method is based on standard features included in commercial computational fluid dynamic software and shortens computer working time roughly seven times in this particular case.

  7. PC based temporary shielding administrative procedure (TSAP)

    International Nuclear Information System (INIS)

    Olsen, D.E.; Pederson, G.E.; Hamby, P.N.

    1995-01-01

    A completely new Administrative Procedure for temporary shielding was developed for use at Commonwealth Edison's six nuclear stations. This procedure promotes the use of shielding, and addresses industry requirements for the use and control of temporary shielding. The importance of an effective procedure has increased since more temporary shielding is being used as ALARA goals become more ambitious. To help implement the administrative procedure, a personal computer software program was written to incorporate the procedural requirements. This software incorporates the useability of a Windows graphical user interface with extensive help and database features. This combination of a comprehensive administrative procedure and user friendly software promotes the effective use and management of temporary shielding while ensuring that industry requirements are met

  8. PC based temporary shielding administrative procedure (TSAP)

    Energy Technology Data Exchange (ETDEWEB)

    Olsen, D.E.; Pederson, G.E. [Sargent & Lundy, Chicago, IL (United States); Hamby, P.N. [Commonwealth Edison Co., Downers Grove, IL (United States)

    1995-03-01

    A completely new Administrative Procedure for temporary shielding was developed for use at Commonwealth Edison's six nuclear stations. This procedure promotes the use of shielding, and addresses industry requirements for the use and control of temporary shielding. The importance of an effective procedure has increased since more temporary shielding is being used as ALARA goals become more ambitious. To help implement the administrative procedure, a personal computer software program was written to incorporate the procedural requirements. This software incorporates the useability of a Windows graphical user interface with extensive help and database features. This combination of a comprehensive administrative procedure and user friendly software promotes the effective use and management of temporary shielding while ensuring that industry requirements are met.

  9. Computer vision based room interior design

    Science.gov (United States)

    Ahmad, Nasir; Hussain, Saddam; Ahmad, Kashif; Conci, Nicola

    2015-12-01

    This paper introduces a new application of computer vision. To the best of the authors' knowledge, it is the first attempt to incorporate computer vision techniques into room interior design. Computer-vision-based interior design is achieved in two steps: object identification and color assignment. An image segmentation approach is used to identify the objects in the room, and different color schemes are used to assign colors to these objects. The proposed approach is applied to both simple and complex images from online sources. It not only accelerates the process of interior design but also makes it very efficient by offering multiple alternatives.
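
    The paper's segmentation and color-assignment algorithms are not specified in this abstract; the sketch below shows the two-step pipeline in its simplest assumed form: color clustering as a stand-in for segmentation, followed by re-coloring each segment from a chosen scheme. The image and palette are synthetic.

        # Toy sketch of the assumed two-step pipeline: segment by color
        # clustering, then re-color each segment from a colour scheme.
        import numpy as np

        def kmeans(pixels, k=4, iters=10, seed=0):
            rng = np.random.default_rng(seed)
            centers = pixels[rng.choice(len(pixels), k, replace=False)]
            for _ in range(iters):
                d = ((pixels[:, None] - centers) ** 2).sum(-1)
                labels = np.argmin(d, axis=1)       # nearest center
                for j in range(k):
                    if np.any(labels == j):
                        centers[j] = pixels[labels == j].mean(axis=0)
            return labels

        rng = np.random.default_rng(1)
        image = rng.integers(0, 256, size=(64, 64, 3)).astype(float)  # stand-in
        labels = kmeans(image.reshape(-1, 3), k=4)

        scheme = np.array([[200, 180, 150], [90, 110, 140],
                           [230, 230, 220], [60, 80, 60]], dtype=float)
        recolored = scheme[labels].reshape(image.shape)  # one color per segment
        print(recolored.shape)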

  10. Computer-based information management system for interventional radiology

    International Nuclear Information System (INIS)

    Forman, B.H.; Silverman, S.G.; Mueller, P.R.; Hahn, P.F.; Papanicolaou, N.; Tung, G.A.; Brink, J.A.; Ferrucci, J.T.

    1989-01-01

    The authors designed and implemented a computer-based information management system (CBIMS) for the integrated analysis of data from a variety of abdominal nonvascular interventional procedures. The CBIMS improved on their initial handwritten-card system (which listed only patient name, hospital number, and type of procedure) by capturing relevant patient data in an organized fashion and integrating the information for meaningful analysis. Advantages of CBIMS include enhanced compilation of the monthly census, easy access to a patient's interventional history, and a flexible querying capability that allows easy extraction of subsets of information from the patient database

  11. Comparative cost analysis -- computed tomography vs. alternative diagnostic procedures, 1977-1980

    International Nuclear Information System (INIS)

    Gempel, P.A.; Harris, G.H.; Evans, R.G.

    1977-12-01

    In comparing the total national cost of utilizing computed tomography (CT) for medically indicated diagnoses with that of conventional x-ray, ultrasonography, nuclear medicine, and exploratory surgery, this investigation concludes that there was little, if any, added net cost from CT use in 1977, nor will there be in 1980. Computed tomography, generally recognized as a reliable and useful diagnostic modality, has the potential to reduce net costs provided that an optimal number of units can be made available to physicians and patients to achieve projected reductions in alternative procedures. This study examines the actual cost impact of CT on both cranial and body diagnostic procedures. For abdominal and mediastinal disorders, CT scanning is just beginning to emerge as a diagnostic modality. As such, clinical experience is somewhat limited, and the authors assume that no significant reduction in conventional procedures took place in 1977. It is estimated that the approximately 375,000 CT body procedures performed in 1977 represent only a 5 percent cost increase over the use of other diagnostic modalities. It is projected that 2,400,000 CT body procedures will be performed in 1980 and, depending on the assumptions used, total body diagnostic costs will increase only slightly or be reduced. Thirty-one tables appear throughout the text presenting cost data broken down by type of diagnostic procedure used and projections by year. Appendixes present technical cost components for diagnostic procedures, the comparative efficacy of CT as revealed in abstracts of published literature, selected medical diagnoses, and references

  12. Development of an ICF-based eligibility procedure for education in Switzerland.

    Science.gov (United States)

    Hollenweger, Judith

    2011-05-31

    Starting in January 2011, Switzerland will implement a multidimensional, context-sensitive procedure to establish eligibility in education systems. This paper provides a brief overview of the different eligibility-related practices with a special focus on children with disabilities. The paper then outlines the philosophical and conceptual framework of the eligibility procedure based on the International Classification of Functioning, Disability and Health, and the UN Convention on the Rights of Persons with Disability. The different components and methodology applied to organise information in the process towards establishing eligibility are also presented. Finally, some observations are made regarding transparent and just applications of the eligibility procedure, and the implementation of this new eligibility procedure.

  13. The computer-based control system of the NAC accelerator

    International Nuclear Information System (INIS)

    Burdzik, G.F.; Bouckaert, R.F.A.; Cloete, I.; Du Toit, J.S.; Kohler, I.H.; Truter, J.N.J.; Visser, K.

    1982-01-01

    The National Accelerator Centre (NAC) of the CSIR is building a two-stage accelerator which will provide charged-particle beams for use in medical and research applications. The control system for this accelerator is based on three mini-computers and a CAMAC interfacing network. Closed-loop control is being delegated to the various subsystems of the accelerator, and the computers and CAMAC network will be used in the first instance for data transfer, monitoring and servicing of the control consoles. The processing power of the computers will be utilized for automating start-up and beam-change procedures, for providing flexible and convenient information at the control consoles, for fault diagnosis and for beam-optimizing procedures. Tasks of a localized or dedicated nature are being off-loaded onto microcomputers, which are being used either in front-end devices or as slaves to the mini-computers. On the control consoles only a few instruments for setting and monitoring variables are being provided, but these instruments are universally linkable to any appropriate machine variable

  14. Comparative analysis of lockout programs and procedures applied to industrial machines

    Energy Technology Data Exchange (ETDEWEB)

    Chinniah, Y.; Champoux, M.; Burlet-Vienney, D.; Daigle, R. [Institut de recherche Robert-Sauve en sante et en securite du travail, Montreal, PQ (Canada)

    2008-09-15

    In 2005, approximately 20 workers in Quebec were killed by dangerous machines. Approximately 13,000 accidents in the province were linked to the use of machines. The resulting cost associated with these accidents was estimated at $70 million to the Quebec Occupational Health and Safety Commission (CSST) in compensation and salary replacement. According to article 185 of the Quebec Occupational Health and Safety Regulation (RSST), workers intervening in hazardous zones of machines and processes during maintenance, repairs, and unjamming activities must apply lockout procedures. Lockout is defined as the placement of a lock or tag on an energy-isolating device in accordance with an established procedure, indicating that the energy-isolating device is not to be operated until removal of the lock or tag in accordance with an established procedure. This report presented a comparative analysis of lockout programs and procedures applied to industrial machines. The study attempted to answer several questions regarding the concept of lockout and its definition in the literature; the differences in legal lockout requirements among provinces and countries; different standards on lockout; the contents of lockout programs as described by different documents; and the compliance of lockout programs in a sample of Quebec industries with the Canadian standard on lockout, CSA Z460-05 (2005). The report discussed the research objectives, methodology, and results of the study. It was concluded that the concept of lockout has different meanings or definitions in the literature, especially in regulations. However, the definitions of lockout found in standards have certain similarities. 50 refs., 52 tabs., 2 appendices.

  15. Experimental design technique applied to the validation of an instrumental Neutron Activation Analysis procedure

    International Nuclear Information System (INIS)

    Santos, Uanda Paula de M. dos; Moreira, Edson Gonçalves

    2017-01-01

    In this study, optimization of procedures and standardization of an Instrumental Neutron Activation Analysis (INAA) method were carried out for the determination of the elements bromine, chlorine, magnesium, manganese, potassium, sodium and vanadium in biological matrix materials using short irradiations at a pneumatic system. 2^k experimental designs were applied to evaluate the individual contribution of selected variables of the analytical procedure to the final mass fraction result. The chosen experimental designs were the 2^3 and the 2^4, depending on the radionuclide half-life. Different certified reference materials and multi-element comparators were analyzed considering the following variables: sample decay time, irradiation time, counting time and sample distance to detector. Comparator concentration, sample mass and irradiation time were maintained constant in this procedure. By means of statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted in the validation procedure of INAA methods at the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN/CNEN-SP). Optimized conditions were estimated based on the results of z-score tests, main effects, interaction effects and the best irradiation conditions. (author)

  16. Experimental design technique applied to the validation of an instrumental Neutron Activation Analysis procedure

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Uanda Paula de M. dos; Moreira, Edson Gonçalves, E-mail: uandapaula@gmail.com, E-mail: emoreira@ipen.br [Instituto de Pesquisas Energéticas e Nucleares (IPEN/CNEN-SP), São Paulo, SP (Brazil)

    2017-07-01

    In this study, optimization of procedures and standardization of an Instrumental Neutron Activation Analysis (INAA) method were carried out for the determination of the elements bromine, chlorine, magnesium, manganese, potassium, sodium and vanadium in biological matrix materials using short irradiations at a pneumatic system. 2^k experimental designs were applied to evaluate the individual contribution of selected variables of the analytical procedure to the final mass fraction result. The chosen experimental designs were the 2^3 and the 2^4, depending on the radionuclide half-life. Different certified reference materials and multi-element comparators were analyzed considering the following variables: sample decay time, irradiation time, counting time and sample distance to detector. Comparator concentration, sample mass and irradiation time were maintained constant in this procedure. By means of statistical analysis and theoretical and experimental considerations, the optimized experimental conditions were determined for the analytical methods that will be adopted in the validation procedure of INAA methods at the Neutron Activation Analysis Laboratory (LAN) of the Research Reactor Center (CRPq) at the Nuclear and Energy Research Institute (IPEN/CNEN-SP). Optimized conditions were estimated based on the results of z-score tests, main effects, interaction effects and the best irradiation conditions. (author)
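
    The 2^k design logic lends itself to a short illustration. The sketch below, assuming NumPy and purely illustrative factor names and response values (the real study measured mass fractions in certified reference materials), builds a 2^3 full factorial design in coded units and estimates each main effect:

```python
import itertools
import numpy as np

# 2^3 full factorial design matrix in coded units (-1, +1) for three
# of the variables named in the abstract (names are illustrative only).
factors = ["decay_time", "counting_time", "detector_distance"]
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))

# Hypothetical responses: one mass-fraction result per experimental run.
y = np.array([10.2, 10.8, 9.9, 10.5, 10.1, 11.0, 9.8, 10.6])

# Main effect of a factor: mean response at its +1 level minus the
# mean response at its -1 level.
for j, name in enumerate(factors):
    effect = y[design[:, j] == 1].mean() - y[design[:, j] == -1].mean()
    print(f"main effect of {name}: {effect:+.3f}")
```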

  17. Applied Computational Intelligence in Engineering and Information Technology Revised and Selected Papers from the 6th IEEE International Symposium on Applied Computational Intelligence and Informatics SACI 2011

    CERN Document Server

    Precup, Radu-Emil; Preitl, Stefan

    2012-01-01

    This book highlights the potential benefits to be gained from various applications of computational intelligence techniques. The book is structured to include a set of selected and extended papers from the 6th IEEE International Symposium on Applied Computational Intelligence and Informatics, SACI 2011, held in Timisoara, Romania, from 19 to 21 May 2011. After a rigorous paper review performed by the Technical Program Committee, only 116 submissions were accepted, leading to a paper acceptance ratio of 65 %. A further refinement was made after the symposium, based also on the assessment of presentation quality. In conclusion, this book includes the extended and revised versions of the very best papers of SACI 2011 and a few invited papers authored by prominent specialists. Readers will benefit from gaining knowledge of computational intelligence and of the problems that can be solved in several areas; they will learn what kinds of approaches are advisable for solving these problems. A...

  18. "Transit data"-based MST computation

    Directory of Open Access Journals (Sweden)

    Thodoris Karatasos

    2017-10-01

    Full Text Available In this work, we present an innovative image recognition technique based on the exploitation of transit data in images or simple photographs of sites of interest. Our objective is to automatically transform real-world images into graphs and then compute Minimum Spanning Trees (MSTs) in them. We apply this framework and present an application which automatically computes efficient construction plans (for escalators or low-emission hot spots) for connecting all points of interest in cultural sites, i.e., archaeological sites, museums, galleries, etc., aiming to facilitate global physical access to cultural heritage and artistic work and make it accessible to all groups of the population.
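
    The MST step of the pipeline is easy to make concrete. The following sketch, a hypothetical stand-in for the paper's implementation, treats extracted points of interest as graph vertices, weights every edge by Euclidean distance, and runs Kruskal's algorithm with a union-find structure:

```python
import itertools
import math

def mst_kruskal(points):
    """Minimum spanning tree over 2D points (e.g., points of interest
    extracted from a site image), via Kruskal with union-find."""
    parent = list(range(len(points)))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    edges = sorted(
        (math.dist(points[i], points[j]), i, j)
        for i, j in itertools.combinations(range(len(points)), 2)
    )
    tree = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:            # adding this edge creates no cycle
            parent[ri] = rj
            tree.append((i, j, w))
    return tree

# Hypothetical points of interest in image coordinates.
pois = [(0, 0), (3, 0), (3, 4), (10, 4)]
print(mst_kruskal(pois))  # 3 edges connecting all 4 points
```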

  19. Paper-Based and Computer-Based Concept Mappings: The Effects on Computer Achievement, Computer Anxiety and Computer Attitude

    Science.gov (United States)

    Erdogan, Yavuz

    2009-01-01

    The purpose of this paper is to compare the effects of paper-based and computer-based concept mappings on the computer hardware achievement, computer anxiety and computer attitude of eighth-grade secondary school students. The students were randomly allocated to three groups and were given instruction on computer hardware. The teaching methods used…

  20. Reciprocity in computer-human interaction: source-based, norm-based, and affect-based explanations.

    Science.gov (United States)

    Lee, Seungcheol Austin; Liang, Yuhua Jake

    2015-04-01

    Individuals often apply social rules when they interact with computers, and this is known as the Computers Are Social Actors (CASA) effect. Following previous work, one approach to understanding the mechanism responsible for CASA is to utilize computer agents and have the agents attempt to gain human compliance (e.g., completing a pattern recognition task). The current study focuses on three key factors frequently cited to influence traditional notions of compliance: evaluations toward the source (competence and warmth), normative influence (reciprocity), and affective influence (mood). Structural equation modeling assessed the effects of these factors on human compliance with a computer agent's request. The final model shows that norm-based influence (reciprocity) increased the likelihood of compliance, while evaluations toward the computer agent did not significantly influence compliance.

  1. 3rd International Doctoral Symposium on Applied Computation and Security Systems

    CERN Document Server

    Saeed, Khalid; Cortesi, Agostino; Chaki, Nabendu

    2017-01-01

    This book presents extended versions of papers originally presented and discussed at the 3rd International Doctoral Symposium on Applied Computation and Security Systems (ACSS 2016) held from August 12 to 14, 2016 in Kolkata, India. The symposium was jointly organized by the AGH University of Science & Technology, Cracow, Poland; Ca’ Foscari University, Venice, Italy; and the University of Calcutta, India. The book is divided into two volumes, Volumes 3 and 4, and presents dissertation works in the areas of Image Processing, Biometrics-based Authentication, Soft Computing, Data Mining, Next-Generation Networking and Network Security, Remote Healthcare, Communications, Embedded Systems, Software Engineering and Service Engineering. The first two volumes of the book published the works presented at the ACSS 2015, which was held from May 23 to 25, 2015 in Kolkata, India.

  2. Recent progress and modern challenges in applied mathematics, modeling and computational science

    CERN Document Server

    Makarov, Roman; Belair, Jacques

    2017-01-01

    This volume is an excellent resource for professionals in various areas of applications of mathematics, modeling, and computational science. It focuses on recent progress and modern challenges in these areas. The volume provides a balance between fundamental theoretical and applied developments, emphasizing the interdisciplinary nature of modern trends and detailing state-of-the-art achievements in Applied Mathematics, Modeling, and Computational Science.  The chapters have been authored by international experts in their respective fields, making this book ideal for researchers in academia, practitioners, and graduate students. It can also serve as a reference in the diverse selected areas of applied mathematics, modelling, and computational sciences, and is ideal for interdisciplinary collaborations.

  3. Procedure for extraction of disparate data from maps into computerized data bases

    Science.gov (United States)

    Junkin, B. G.

    1979-01-01

    A procedure is presented for extracting disparate sources of data from geographic maps and for the conversion of these data into a suitable format for processing on a computer-oriented information system. Several graphic digitizing considerations are included and related to the NASA Earth Resources Laboratory's Digitizer System. Current operating procedures for the Digitizer System are given in a simplified and logical manner. The report serves as a guide to those organizations interested in converting map-based data by using a comparable map digitizing system.

  4. Computer-Aided Test Flow in Core-Based Design

    NARCIS (Netherlands)

    Zivkovic, V.; Tangelder, R.J.W.T.; Kerkhoff, Hans G.

    2000-01-01

    This paper deals with test-pattern generation and fault-coverage determination in core-based design. The basic core-test strategy that one has to apply in core-based design is stated in this work. A Computer-Aided Test (CAT) flow is proposed, resulting in accurate fault coverage of

  5. Uncertainty Analysis of A Flood Risk Mapping Procedure Applied In Urban Areas

    Science.gov (United States)

    Krause, J.; Uhrich, S.; Bormann, H.; Diekkrüger, B.

    In the framework of the IRMA-Sponge program, the presented study was part of the joint research project FRHYMAP (flood risk and hydrological mapping). A simple conceptual flooding model (FLOODMAP) has been developed to simulate flooded areas beside rivers within cities. FLOODMAP requires a minimum of input data (digital elevation model (DEM), river line, water level plain) and parameters, and calculates the flood extent as well as the spatial distribution of flood depths. Of course, the simulated model results are affected by errors and uncertainties. Possible sources of uncertainty are the model structure, model parameters and input data. Thus, after the model validation (comparison of the simulated flood extent to the observed extent taken from airborne pictures), the uncertainty of the essential input data set (the digital elevation model) was analysed. Monte Carlo simulations were performed to assess the effect of uncertainties concerning the statistics of DEM quality and to derive flooding probabilities from the set of simulations. The questions concerning the minimum DEM resolution required for flood simulation and the best aggregation procedure for a given DEM were answered by comparing the results obtained using all available standard GIS aggregation procedures. Seven different aggregation procedures were applied to high-resolution DEMs (1-2 m) in three cities (Bonn, Cologne, Luxembourg). Based on this analysis, the effect of 'uncertain' DEM data was estimated and compared with other sources of uncertainty. Especially socio-economic information and monetary transfer functions required for a damage risk analysis show a high uncertainty. Therefore this study helps to analyse the weak points of the flood risk and damage risk assessment procedure.
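
    Since FLOODMAP itself is not publicly available, the following sketch only illustrates the Monte Carlo idea described above: perturb the DEM with zero-mean Gaussian noise representing elevation uncertainty, re-apply a (here deliberately trivial) flooding criterion, and derive per-cell flooding probabilities. All values are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

dem = rng.uniform(48.0, 52.0, size=(50, 50))   # hypothetical elevations [m]
sigma = 0.5                                    # assumed DEM error std [m]
water_level = 50.0                             # assumed flood water level [m]
n_runs = 1000

# Count, per cell, how often it floods when the DEM is perturbed
# with zero-mean Gaussian noise representing elevation uncertainty.
flood_count = np.zeros_like(dem)
for _ in range(n_runs):
    noisy = dem + rng.normal(0.0, sigma, size=dem.shape)
    flood_count += (noisy < water_level)

flood_probability = flood_count / n_runs       # per-cell probability map
print(flood_probability.mean())
```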

  6. 45 CFR 660.6 - What procedures apply to the selection of programs and activities under these regulations?

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 3 2010-10-01 2010-10-01 false What procedures apply to the selection of programs... Public Welfare (Continued) NATIONAL SCIENCE FOUNDATION INTERGOVERNMENTAL REVIEW OF THE NATIONAL SCIENCE FOUNDATION PROGRAMS AND ACTIVITIES § 660.6 What procedures apply to the selection of programs and activities...

  7. Applying Mathematical Optimization Methods to an ACT-R Instance-Based Learning Model.

    Science.gov (United States)

    Said, Nadia; Engelhart, Michael; Kirches, Christian; Körkel, Stefan; Holt, Daniel V

    2016-01-01

    Computational models of cognition provide an interface to connect advanced mathematical tools and methods to empirically supported theories of behavior in psychology, cognitive science, and neuroscience. In this article, we consider a computational model of instance-based learning, implemented in the ACT-R cognitive architecture. We propose an approach for obtaining mathematical reformulations of such cognitive models that improve their computational tractability. For the well-established Sugar Factory dynamic decision making task, we conduct a simulation study to analyze central model parameters. We show how mathematical optimization techniques can be applied to efficiently identify optimal parameter values with respect to different optimization goals. Beyond these methodological contributions, our analysis reveals the sensitivity of this particular task with respect to initial settings and yields new insights into how average human performance deviates from potential optimal performance. We conclude by discussing possible extensions of our approach as well as future steps towards applying more powerful derivative-based optimization methods.

  8. Applying Mathematical Optimization Methods to an ACT-R Instance-Based Learning Model.

    Directory of Open Access Journals (Sweden)

    Nadia Said

    Full Text Available Computational models of cognition provide an interface to connect advanced mathematical tools and methods to empirically supported theories of behavior in psychology, cognitive science, and neuroscience. In this article, we consider a computational model of instance-based learning, implemented in the ACT-R cognitive architecture. We propose an approach for obtaining mathematical reformulations of such cognitive models that improve their computational tractability. For the well-established Sugar Factory dynamic decision making task, we conduct a simulation study to analyze central model parameters. We show how mathematical optimization techniques can be applied to efficiently identify optimal parameter values with respect to different optimization goals. Beyond these methodological contributions, our analysis reveals the sensitivity of this particular task with respect to initial settings and yields new insights into how average human performance deviates from potential optimal performance. We conclude by discussing possible extensions of our approach as well as future steps towards applying more powerful derivative-based optimization methods.
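
    As a generic illustration of the optimization step (not the authors' ACT-R setup), the sketch below minimizes a toy objective that stands in for running the cognitive model on the Sugar Factory task and scoring its fit, using SciPy's bounded scalar minimizer:

```python
from scipy.optimize import minimize_scalar

def simulated_error(noise):
    """Stand-in for running the cognitive model and scoring its fit;
    a real study would simulate the Sugar Factory task here."""
    return (noise - 0.25) ** 2 + 0.1  # toy objective, optimum at 0.25

result = minimize_scalar(simulated_error, bounds=(0.0, 1.0),
                         method="bounded")
print(result.x)  # estimated optimal parameter value, ~0.25
```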

  9. Personal Computer (PC) based image processing applied to fluid mechanics

    Science.gov (United States)

    Cho, Y.-C.; Mclachlan, B. G.

    1987-01-01

    A PC based image processing system was employed to determine the instantaneous velocity field of a two-dimensional unsteady flow. The flow was visualized using a suspension of seeding particles in water, and a laser sheet for illumination. With a finite time exposure, the particle motion was captured on a photograph as a pattern of streaks. The streak pattern was digitized and processed using various imaging operations, including contrast manipulation, noise cleaning, filtering, statistical differencing, and thresholding. Information concerning the velocity was extracted from the enhanced image by measuring the length and orientation of the individual streaks. The fluid velocities deduced from the randomly distributed particle streaks were interpolated to obtain velocities at uniform grid points. For the interpolation a simple convolution technique with an adaptive Gaussian window was used. The results are compared with a numerical prediction by a Navier-Stokes computation.
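
    A minimal sketch of the interpolation step, assuming NumPy and hypothetical streak-derived samples: scattered velocity values are averaged onto grid points with Gaussian weights, and the window is widened adaptively where the data are sparse (a simplified reading of the adaptive Gaussian window mentioned above):

```python
import numpy as np

def gaussian_window_interp(xy, v, grid_xy, sigma=1.0, min_weight=1e-6):
    """Interpolate scattered velocity samples v at positions xy onto
    grid points grid_xy with a Gaussian-weighted average."""
    out = np.empty(len(grid_xy))
    for k, g in enumerate(grid_xy):
        s = sigma
        while True:
            w = np.exp(-np.sum((xy - g) ** 2, axis=1) / (2 * s * s))
            if w.sum() > min_weight:
                break
            s *= 2.0        # adapt: widen the window in sparse regions
        out[k] = np.dot(w, v) / w.sum()
    return out

# Hypothetical streak-derived samples: positions, one velocity component.
xy = np.array([[0.1, 0.2], [0.9, 0.4], [0.5, 0.8], [0.3, 0.6]])
v = np.array([1.0, 1.4, 0.8, 1.1])
grid = np.array([[x, y] for x in (0.0, 0.5, 1.0) for y in (0.0, 0.5, 1.0)])
print(gaussian_window_interp(xy, v, grid, sigma=0.3))
```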

  10. Computational Modelling in Development of a Design Procedure for Concrete Road

    Directory of Open Access Journals (Sweden)

    B. Novotný

    2000-01-01

    Full Text Available Computational modelling plays a decisive part in the development of a new design procedure for concrete pavements by quantifying the impacts of individual design factors. In the present paper, the emphasis is placed on modelling the structural response of a jointed concrete pavement as a system of interacting rectangular slabs transferring wheel loads into an elastic layered subgrade. The finite element plate analysis is combined with the assumption of a linear contact stress variation over the triangular elements of the contact region division. Linking forces are introduced to model the load transfer across the joints. The unknown contact stress nodal intensities as well as the unknown linking forces are determined in an iterative way so as to fulfil the slab/foundation and slab/slab contact conditions. Temperature effects are also considered, and space is reserved for the modelling of inelastic and additional environmental effects. It is pointed out that pavement design should be based on full data of pavement stressing, in contrast to procedures accounting only for the axle-load-induced stresses.

  11. Computational integration of the phases and procedures of calibration processes for radioprotection

    International Nuclear Information System (INIS)

    Santos, Gleice R. dos; Thiago, Bibiana dos S.; Rocha, Felicia D.G.; Santos, Gelson P. dos; Potiens, Maria da Penha A.; Vivolo, Vitor

    2011-01-01

    This work carried out the computational integration of the process phases by using a single software package, from the arrival of the instrument at the Instrument Calibration Laboratory (LCI-IPEN) to the conclusion of the calibration procedures. Thus, the initial information, such as trademark, model, manufacturer and owner, together with the calibration records, is entered only once, through to the emission of the calibration certificate

  12. Evaluation of computer-based NDE techniques and regional support of inspection activities

    International Nuclear Information System (INIS)

    Taylor, T.T.; Kurtz, R.J.; Heasler, P.G.; Doctor, S.R.

    1991-01-01

    This paper describes the technical progress during fiscal year 1990 for the program entitled 'Evaluation of Computer-Based nondestructive evaluation (NDE) Techniques and Regional Support of Inspection Activities.' Highlights of the technical progress include: development of a seminar to provide basic knowledge required to review and evaluate computer-based systems; review of a typical computer-based field procedure to determine compliance with applicable codes, ambiguities in procedure guidance, and overall effectiveness and utility; design and fabrication of a series of three test blocks for NRC staff use for training or audit of UT systems; technical assistance in reviewing (1) San Onofre ten year reactor pressure vessel inservice inspection activities and (2) the capability of a proposed phased array inspection of the feedwater nozzle at Oyster Creek; completion of design calculations to determine the feasibility and significance of various sizes of mockup assemblies that could be used to evaluate the effectiveness of eddy current examinations performed on steam generators; and discussion of initial mockup design features and methods for fabricating flaws in steam generator tubes

  13. Efficient p-value evaluation for resampling-based tests

    KAUST Repository

    Yu, K.

    2011-01-05

    The resampling-based test, which often relies on permutation or bootstrap procedures, has been widely used for statistical hypothesis testing when the asymptotic distribution of the test statistic is unavailable or unreliable. It requires repeated calculations of the test statistic on a large number of simulated data sets for its significance level assessment, and thus it can become very computationally intensive. Here, we propose an efficient p-value evaluation procedure by adapting the stochastic approximation Markov chain Monte Carlo algorithm. The new procedure can be used easily for estimating the p-value for any resampling-based test. We show through numeric simulations that the proposed procedure can be 100 to 500,000 times as efficient (in terms of computing time) as the standard resampling-based procedure when evaluating a test statistic with a small p-value (e.g. less than 10^-6). With its computational burden reduced by the proposed procedure, the versatile resampling-based test becomes computationally feasible for a much wider range of applications. We demonstrate the application of the new method by applying it to a large-scale genetic association study of prostate cancer.
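
    For orientation, the sketch below implements the standard resampling-based procedure that the paper accelerates, namely a plain two-sample permutation test; the stochastic approximation MCMC speedup itself is not reproduced here:

```python
import numpy as np

def permutation_p_value(x, y, n_perm=10_000, seed=0):
    """Two-sample permutation test for a difference in means.
    This is the standard resampling baseline; its cost grows with the
    number of permutations needed to resolve small p-values."""
    rng = np.random.default_rng(seed)
    observed = x.mean() - y.mean()
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        stat = pooled[: len(x)].mean() - pooled[len(x):].mean()
        count += abs(stat) >= abs(observed)
    return (count + 1) / (n_perm + 1)   # add-one to avoid p = 0

x = np.random.default_rng(1).normal(0.5, 1.0, 30)
y = np.random.default_rng(2).normal(0.0, 1.0, 30)
print(permutation_p_value(x, y))
```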

  14. Research in progress in applied mathematics, numerical analysis, fluid mechanics, and computer science

    Science.gov (United States)

    1994-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, fluid mechanics, and computer science during the period October 1, 1993 through March 31, 1994. The major categories of the current ICASE research program are: (1) applied and numerical mathematics, including numerical analysis and algorithm development; (2) theoretical and computational research in fluid mechanics in selected areas of interest to LaRC, including acoustics and combustion; (3) experimental research in transition and turbulence and aerodynamics involving LaRC facilities and scientists; and (4) computer science.

  15. Applying Behavior Analytic Procedures to Effectively Teach Literacy Skills in the Classroom

    Science.gov (United States)

    Joseph, Laurice M.; Alber-Morgan, Sheila; Neef, Nancy

    2016-01-01

    The purpose of this article is to discuss the application of behavior analytic procedures for advancing and evaluating methods for teaching literacy skills in the classroom. Particularly, applied behavior analysis has contributed substantially to examining the relationship between teacher behavior and student literacy performance. Teacher…

  16. Employing Subgoals in Computer Programming Education

    Science.gov (United States)

    Margulieux, Lauren E.; Catrambone, Richard; Guzdial, Mark

    2016-01-01

    The rapid integration of technology into our professional and personal lives has left many education systems ill-equipped to deal with the influx of people seeking computing education. To improve computing education, we are applying techniques that have been developed for other procedural fields. The present study applied such a technique, subgoal…

  17. Computer-Based Technologies in Dentistry: Types and Applications

    Directory of Open Access Journals (Sweden)

    Rajaa Mahdi Musawi

    2016-10-01

    Full Text Available During dental education, dental students learn how to examine patients, make a diagnosis, plan treatment and perform dental procedures correctly and efficiently. However, progress in computer-based technologies, including virtual reality (VR) simulators, augmented reality (AR) and computer aided design/computer aided manufacturing (CAD/CAM) systems, has resulted in new modalities for the instruction and practice of dentistry. Virtual reality dental simulators enable repeated, objective and assessable practice in various controlled situations. Superimposition of three-dimensional (3D) virtual images on actual images in AR allows surgeons to simultaneously visualize the surgical site and superimpose informative 3D images of invisible regions on the surgical site to serve as a guide. The use of CAD/CAM systems for designing and manufacturing dental appliances and prostheses is well established. This article reviews computer-based technologies, their application in dentistry and their potentials and limitations in promoting dental education, training and practice. Practitioners will be able to choose from a broader spectrum of options in their field of practice by becoming familiar with new modalities of training and practice. Keywords: Virtual Reality Exposure Therapy; Immersion; Computer-Aided Design; Dentistry; Education

  18. Controller Design of DFIG Based Wind Turbine by Using Evolutionary Soft Computational Techniques

    Directory of Open Access Journals (Sweden)

    O. P. Bharti

    2017-06-01

    Full Text Available This manuscript illustrates the controller design for a doubly fed induction generator (DFIG) based variable-speed wind turbine by using a bio-inspired scheme. The methodology is based on exploiting two proficient swarm-intelligence-based evolutionary soft computational procedures. The particle swarm optimization (PSO) and bacterial foraging optimization (BFO) techniques are employed to design the controller intended for the small damping plant of the DFIG. A wind energy overview and the DFIG operating principle, along with the equivalent circuit model, are adequately discussed in this paper. The controller design for the DFIG-based WECS using PSO and BFO is described comparatively in detail. The responses of the DFIG system regarding terminal voltage, current, active-reactive power, and DC-link voltage improve slightly with the evolutionary soft computational procedure. Lastly, the obtained output is compared with a standard technique for performance improvement of the DFIG-based wind energy conversion system.
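
    A bare-bones PSO loop of the kind used for such gain tuning is sketched below; the cost function is a toy stand-in for simulating the DFIG closed loop and integrating the tracking error, and all constants (swarm size, inertia, acceleration coefficients, target gains) are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def cost(gains):
    """Stand-in performance index; a real study would simulate the
    DFIG closed loop and integrate the tracking error here."""
    kp, ki = gains
    return (kp - 2.0) ** 2 + (ki - 0.5) ** 2

n, dims, iters = 20, 2, 100
x = rng.uniform(0, 5, (n, dims))           # particle positions (Kp, Ki)
v = np.zeros_like(x)
pbest = x.copy()
pbest_f = np.array([cost(p) for p in x])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n, dims))
    v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
    x = x + v
    f = np.array([cost(p) for p in x])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print(gbest)   # tuned (Kp, Ki), should approach (2.0, 0.5)
```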

  19. Existence and instability of steady states for a triangular cross-diffusion system: A computer-assisted proof

    Science.gov (United States)

    Breden, Maxime; Castelli, Roberto

    2018-05-01

    In this paper, we present and apply a computer-assisted method to study steady states of a triangular cross-diffusion system. Our approach consists of an a posteriori validation procedure, based on a fixed point argument around a numerically computed solution, in the spirit of the Newton-Kantorovich theorem. It allows us to prove the existence of various non-homogeneous steady states for different parameter values. In some situations, we obtain as many as 13 coexisting steady states. We also apply the a posteriori validation procedure to study the linear stability of the obtained steady states, proving that many of them are in fact unstable.

  20. On a computational method for modelling complex ecosystems by superposition procedure

    International Nuclear Information System (INIS)

    He Shanyu.

    1986-12-01

    In this paper, the Superposition Procedure is concisely described, and a computational method for modelling a complex ecosystem is proposed. With this method, the information contained in acceptable submodels and observed data can be utilized to the maximal degree. (author). 1 ref

  1. Development of utility generic functional requirements for electronic work packages and computer-based procedures

    Energy Technology Data Exchange (ETDEWEB)

    Oxstrand, Johanna [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2017-06-01

    The Nuclear Electronic Work Packages - Enterprise Requirements (NEWPER) initiative is a step toward the vision of implementing an eWP framework that includes many types of eWPs. This will enable immediate paper-related cost savings in work management and provide a path to future labor efficiency gains through enhanced integration and process improvement in support of the Nuclear Promise (Nuclear Energy Institute 2016). The NEWPER initiative was organized by the Nuclear Information Technology Strategic Leadership (NITSL) group, an organization that brings together leaders from the nuclear utility industry and regulatory agencies to address issues involved with information technology used in nuclear-power utilities. NITSL strives to maintain awareness of industry information technology-related initiatives and events and communicates those events to its membership. NITSL and LWRS Program researchers have been coordinating activities, including the joint organization of NEWPER-related meetings and report development. The main goal of the NEWPER initiative was to develop a set of utility-generic functional requirements for eWP systems. This set of requirements will support each utility in its process of identifying plant-specific functional and non-functional requirements. The NEWPER initiative has 140 members; the largest group consists of 19 commercial U.S. nuclear utilities, together with eleven of the most prominent vendors of eWP solutions. Through the NEWPER initiative, two sets of functional requirements were developed: functional requirements for electronic work packages and functional requirements for computer-based procedures. This paper describes the development process and summarizes the requirements.

  2. Reliability assessment of a manual-based procedure towards learning curve modeling and fmea analysis

    Directory of Open Access Journals (Sweden)

    Gustavo Rech

    2013-03-01

    Full Text Available Separation procedures in drug Distribution Centers (DCs) are manual activities prone to failures such as shipping exchanged, expired or broken drugs to the customer. Two interventions seem promising for improving the reliability of the separation procedure: (i) selection and allocation of appropriate operators to the procedure, and (ii) analysis of potential failure modes incurred by the selected operators. This article integrates Learning Curves (LC) and FMEA (Failure Mode and Effect Analysis) with the aim of reducing the occurrence of failures in the manual separation of a drug DC. LC parameters enable the generation of an index identifying the operators recommended to perform the procedures. The FMEA is then applied to the separation procedure carried out by the selected operators in order to identify failure modes. It also deploys the traditional FMEA severity index into two sub-indexes, related to financial issues and to damage to the company's image, in order to characterize failure severity. When applied to a drug DC, the proposed method significantly reduced the frequency and severity of failures in the separation procedure.
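
    The learning-curve step can be illustrated with a Wright-style power-law model fitted in log-log space; the repetition times below are hypothetical, and in the paper's setting the fitted parameters would feed the operator-selection index described above:

```python
import numpy as np

# Hypothetical separation times (minutes) for one operator's
# successive repetitions of the picking procedure.
reps = np.arange(1, 11)
times = np.array([8.0, 6.9, 6.2, 5.8, 5.6, 5.3, 5.2, 5.0, 4.9, 4.8])

# Fit the power-law learning curve t_n = t1 * n**b in log-log space.
b, log_t1 = np.polyfit(np.log(reps), np.log(times), 1)
t1 = np.exp(log_t1)
learning_rate = 2.0 ** b     # time ratio for each doubling of reps
print(f"t1={t1:.2f} min, b={b:.3f}, learning rate={learning_rate:.2%}")
```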

  3. Mathematical calibration procedure of a capacitive sensor-based indexed metrology platform

    International Nuclear Information System (INIS)

    Brau-Avila, A; Valenzuela-Galvan, M; Herrera-Jimenez, V M; Santolaria, J; Aguilar, J J; Acero, R

    2017-01-01

    The demand for faster and more reliable measuring tasks for the control and quality assurance of modern production systems has created new challenges for the field of coordinate metrology. Thus, the search for new solutions in coordinate metrology systems and the need for the development of existing ones still persists. One example of such a system is the portable coordinate measuring machine (PCMM), the use of which in industry has considerably increased in recent years, mostly due to its flexibility for accomplishing in-line measuring tasks as well as its reduced cost and operational advantages compared to traditional coordinate measuring machines. Nevertheless, PCMMs have a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on the capture of data with the measuring instrument from a calibrated gauge object, fixed successively in various positions so that most of the instrument measuring volume is covered, which results in time-consuming, tedious and expensive verification and optimization procedures. In this work the mathematical calibration procedure of a capacitive sensor-based indexed metrology platform (IMP) is presented. This calibration procedure is based on the readings and geometric features of six capacitive sensors and their targets with nanometer resolution. The final goal of the IMP calibration procedure is to optimize the geometric features of the capacitive sensors and their targets in order to use the optimized data in the verification procedures of PCMMs. (paper)

  4. Mathematical calibration procedure of a capacitive sensor-based indexed metrology platform

    Science.gov (United States)

    Brau-Avila, A.; Santolaria, J.; Acero, R.; Valenzuela-Galvan, M.; Herrera-Jimenez, V. M.; Aguilar, J. J.

    2017-03-01

    The demand for faster and more reliable measuring tasks for the control and quality assurance of modern production systems has created new challenges for the field of coordinate metrology. Thus, the search for new solutions in coordinate metrology systems and the need for the development of existing ones still persists. One example of such a system is the portable coordinate measuring machine (PCMM), the use of which in industry has considerably increased in recent years, mostly due to its flexibility for accomplishing in-line measuring tasks as well as its reduced cost and operational advantages compared to traditional coordinate measuring machines. Nevertheless, PCMMs have a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on the capture of data with the measuring instrument from a calibrated gauge object, fixed successively in various positions so that most of the instrument measuring volume is covered, which results in time-consuming, tedious and expensive verification and optimization procedures. In this work the mathematical calibration procedure of a capacitive sensor-based indexed metrology platform (IMP) is presented. This calibration procedure is based on the readings and geometric features of six capacitive sensors and their targets with nanometer resolution. The final goal of the IMP calibration procedure is to optimize the geometric features of the capacitive sensors and their targets in order to use the optimized data in the verification procedures of PCMMs.

  5. Three-dimensional integrated CAE system applying computer graphic technique

    International Nuclear Information System (INIS)

    Kato, Toshisada; Tanaka, Kazuo; Akitomo, Norio; Obata, Tokayasu.

    1991-01-01

    A three-dimensional CAE system for nuclear power plant design is presented. This system utilizes high-speed computer graphic techniques for the plant design review, and an integrated engineering database for handling the large amount of nuclear power plant engineering data in a unified data format. Applying this system makes it possible to construct a nuclear power plant using only computer data from the basic design phase to the manufacturing phase, and it increases the productivity and reliability of the nuclear power plants. (author)

  6. An optimal range of information quantity on computer-based procedure interface design in the advanced main control room

    International Nuclear Information System (INIS)

    Hsieh Minchih; Chiu Mingchuan; Hwang Sheueling

    2015-01-01

    The quantification of information in the interface design is a critical issue. Too much information on an interface can confuse a user while executing a task, and too little information may result in poor user performance. This study focused on the quantification of visible information on computer-based procedures (CBPs). Levels of information quantity and task complexity were considered in this experiment. Simulated CBPs were developed to consist of three levels: high (at least 10 events, i.e. 3.32 bits), medium (4–8 events, i.e. 2–3 bits), and low information quantity (1 or 2 events, i.e. 0 or 1 bits). Task complexity comprised two levels: complex tasks and simple tasks. The dependent variables include operation time, secondary task performance, and mental workload. Results suggested that medium information quantity of five to eight events has a remarkable advantage in supporting operator performance under both simple and complex tasks. This research not only suggested the appropriate range of information quantity on the CBP interface, but also complemented certain deficient results of previous CBP interface design studies. Additionally, based on results obtained by this study, the quantification of information on the CBP interface should be considered to ensure safe operation of nuclear power plants. (author)
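
    The bit figures quoted above follow from the usual information measure H = log2(n) for n equally likely events, as this short check confirms:

```python
import math

# Information quantity of a display listing n concurrent events,
# matching the bit ranges quoted in the abstract: H = log2(n).
for n in (1, 2, 4, 8, 10):
    print(n, "events ->", round(math.log2(n), 2), "bits")
# 1 -> 0.0, 2 -> 1.0, 4 -> 2.0, 8 -> 3.0, 10 -> 3.32
```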

  7. Computer aided planning of orthopaedic surgeries: the definition of generic planning steps for bone removal procedures.

    Science.gov (United States)

    Putzer, David; Moctezuma, Jose Luis; Nogler, Michael

    2017-11-01

    An increasing number of orthopaedic surgeons are using computer-aided planning tools for bone removal applications. The aim of the study was to consolidate a set of generic functions to be used for 3D computer-assisted planning or simulation. A limited subset of 30 surgical procedures was analyzed, and the result was verified against 243 surgical procedures of a surgical atlas. Fourteen generic functions to be used in 3D computer-assisted planning and simulations were extracted. Our results showed that the average procedure comprises 14 ± 10 (SD) steps, with ten different generic planning steps and four generic bone removal steps. In conclusion, the study shows that with a limited number of 14 planning functions it is possible to perform 243 surgical procedures out of Campbell's Operative Orthopaedics atlas. The results may be used as a basis for versatile generic intraoperative planning software.

  8. Global Conference on Applied Computing in Science and Engineering

    CERN Document Server

    2016-01-01

    The Global Conference on Applied Computing in Science and Engineering is organized by academics and researchers belonging to different scientific areas of the C3i/Polytechnic Institute of Portalegre (Portugal) and the University of Extremadura (Spain), with the technical support of ScienceKnow Conferences. The event has the objective of creating an international forum for academics, researchers and scientists from around the world to discuss results and proposals regarding the soundest issues related to Applied Computing in Science and Engineering. The event will include the participation of renowned keynote speakers, oral presentations, poster sessions and technical conferences related to the topics dealt with in the Scientific Program, as well as an attractive social and cultural program. The papers will be published in the Proceedings e-books. The proceedings of the conference will be sent for possible indexing on Thomson Reuters (selective by Thomson Reuters, not all-inclusive) and Google Scholar...

  9. Computer–Based Procedures for Nuclear Power Plant Field Workers: Preliminary Results from Two Evaluation Studies

    Energy Technology Data Exchange (ETDEWEB)

    Katya L Le Blanc; Johanna H Oxstrand

    2013-10-01

    The Idaho National Laboratory and participants from the U.S. nuclear industry are collaborating on a research effort aimed at augmenting the existing guidance on computer-based procedure (CBP) design with specific guidance on how to design CBP user interfaces such that they support procedure execution in ways that exceed the capabilities of paper-based procedures (PBPs) without introducing new errors. Researchers are employing an iterative process in which the human factors issues and interface design principles related to CBP usage are systematically addressed and evaluated in realistic settings. This paper describes the process of developing a CBP prototype and the two studies conducted to evaluate the prototype. The results indicate that CBPs may improve performance by reducing errors, but may increase the time it takes to complete procedural tasks.

  10. State Token Petri Net modeling method for formal verification of computerized procedure including operator's interruptions of procedure execution flow

    International Nuclear Information System (INIS)

    Kim, Yun Goo; Seong, Poong Hyun

    2012-01-01

    The Computerized Procedure System (CPS) is one of the primary operating support systems in the digital Main Control Room. The CPS displays procedures on the computer screen in the form of a flow chart, and displays plant operating information along with the procedure instructions. It also supports operator decision making by providing a system decision. A procedure flow should be correct and reliable, as an error could lead to operator misjudgement and inadequate control. In this paper we present a modeling method for the CPS that enables formal verification based on Petri nets. The proposed State Token Petri Nets (STPN) also support the modeling of a procedure flow that has various interruptions by the operator, according to the plant condition. STPN modeling is compared with Coloured Petri nets when both are applied to an Emergency Operating Computerized Procedure. A program that converts a Computerized Procedure (CP) into an STPN model has also been developed. The formal verification and validation methods for CPs based on STPN increase the safety of a nuclear power plant and provide the digital quality assurance means that are needed as the role and function of the CPS increase.
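
    A minimal token-game sketch conveys the underlying Petri net mechanics; this is an ordinary place/transition net, a deliberate simplification of the paper's STPN extension, and all names are illustrative:

```python
# A transition fires only when every input place holds a token;
# firing consumes input tokens and produces output tokens.
marking = {"step_done": 1, "plant_condition_ok": 1, "next_step": 0}
transitions = {
    "advance": {"in": ["step_done", "plant_condition_ok"],
                "out": ["next_step"]},
}

def fire(name):
    t = transitions[name]
    if all(marking[p] > 0 for p in t["in"]):   # enabled?
        for p in t["in"]:
            marking[p] -= 1                    # consume input tokens
        for p in t["out"]:
            marking[p] += 1                    # produce output tokens
        return True
    return False

print(fire("advance"), marking)
# True {'step_done': 0, 'plant_condition_ok': 0, 'next_step': 1}
```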

  11. Computer-based teaching module design: principles derived from learning theories.

    Science.gov (United States)

    Lau, K H Vincent

    2014-03-01

    The computer-based teaching module (CBTM), which has recently gained prominence in medical education, is a teaching format in which a multimedia program serves as a single source for knowledge acquisition, rather than playing an adjunctive role as it does in computer-assisted learning (CAL). Despite empirical validation in the past decade, there is limited research into the optimisation of CBTM design. This review aims to summarise research on classic and modern multimedia-specific learning theories applied to computer learning, and to collapse the findings into a set of design principles to guide the development of CBTMs. Scopus was searched for: (i) studies of classic cognitivism, constructivism and behaviourism theories (search terms: 'cognitive theory' OR 'constructivism theory' OR 'behaviourism theory' AND 'e-learning' OR 'web-based learning') and their sub-theories applied to computer learning, and (ii) recent studies of modern learning theories applied to computer learning (search terms: 'learning theory' AND 'e-learning' OR 'web-based learning'), for articles published between 1990 and 2012. The first search identified 29 studies, dominated in topic by the cognitive load, elaboration and scaffolding theories. The second search identified 139 studies, with diverse topics in connectivism, discovery and technical scaffolding. Based on their relative representation in the literature, the applications of these theories were collapsed into a list of CBTM design principles. Ten principles were identified and categorised into three levels of design: the global level (managing objectives, framing, minimising technical load); the rhetoric level (optimising modality, making modality explicit, scaffolding, elaboration, spaced repetition); and the detail level (managing text, managing devices). This review examined the literature on the application of learning theories to CAL to develop a set of principles that guide CBTM design. Further research will enable educators to

  12. Efficacy of navigation in skull base surgery using composite computer graphics of magnetic resonance and computed tomography images

    International Nuclear Information System (INIS)

    Hayashi, Nakamasa; Kurimoto, Masanori; Hirashima, Yutaka; Ikeda, Hiroaki; Shibata, Takashi; Tomita, Takahiro; Endo, Shunro

    2001-01-01

    The efficacy of a neurosurgical navigation system using three-dimensional composite computer graphics (CGs) of magnetic resonance (MR) and computed tomography (CT) images was evaluated in skull base surgery. Three-point transformation was used for integration of MR and CT images. MR and CT image data were obtained with three skin markers placed on the patient's scalp. Volume-rendering manipulations of the data produced three-dimensional CGs of the scalp, brain, and lesions from the MR images, and the scalp and skull from the CT. Composite CGs of the scalp, skull, brain, and lesion were created by registering the three markers on the three-dimensional rendered scalp images obtained from MR imaging and CT in the system. This system was used for 14 patients with skull base lesions. Three-point transformation using three-dimensional CGs was easily performed for multimodal registration. Simulation of surgical procedures on composite CGs aided in comprehension of the skull base anatomy and selection of the optimal approaches. Intraoperative navigation aided in determination of actual spatial position in the skull base and the optimal trajectory to the tumor during surgical procedures. (author)
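
    Assuming the three-point transformation corresponds to a least-squares rigid registration of the three marker positions, the registration step can be sketched with the SVD-based Kabsch method; the marker coordinates below are made up:

```python
import numpy as np

def rigid_transform(src, dst):
    """Least-squares rigid transform (rotation + translation) mapping
    the MR marker positions onto the CT marker positions
    (Kabsch/SVD method; coordinates are illustrative)."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # avoid reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cd - R @ cs
    return R, t

mr = [[0, 0, 0], [10, 0, 0], [0, 10, 0]]   # scalp markers in MR space
ct = [[1, 2, 3], [11, 2, 3], [1, 12, 3]]   # the same markers in CT space
R, t = rigid_transform(mr, ct)
print(R @ np.array(mr[1]) + t)             # ~ [11, 2, 3]
```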

  13. Security Considerations and Recommendations in Computer-Based Testing

    Directory of Open Access Journals (Sweden)

    Saleh M. Al-Saleem

    2014-01-01

    Full Text Available Many organizations and institutions around the globe are moving or planning to move their paper-and-pencil based testing to computer-based testing (CBT). However, this conversion will not be the best option for all kinds of exams, and it will require significant resources. These resources may include the preparation of item banks, methods for test delivery, procedures for test administration, and, last but not least, test security. Security aspects may include, but are not limited to, the identification and authentication of the examinee, the risks associated with cheating on the exam, and the procedures related to test delivery to the examinee. This paper will mainly investigate the security considerations associated with CBT and will provide some recommendations for the security of these kinds of tests. We will also propose a palm-based biometric authentication system incorporated with a basic authentication system (username/password) in order to check the identity and authenticity of the examinee.

  14. Security considerations and recommendations in computer-based testing.

    Science.gov (United States)

    Al-Saleem, Saleh M; Ullah, Hanif

    2014-01-01

    Many organizations and institutions around the globe are moving or planning to move their paper-and-pencil based testing to computer-based testing (CBT). However, this conversion will not be the best option for all kinds of exams and it will require significant resources. These resources may include the preparation of item banks, methods for test delivery, procedures for test administration, and last but not least test security. Security aspects may include but are not limited to the identification and authentication of examinee, the risks that are associated with cheating on the exam, and the procedures related to test delivery to the examinee. This paper will mainly investigate the security considerations associated with CBT and will provide some recommendations for the security of these kinds of tests. We will also propose a palm-based biometric authentication system incorporated with basic authentication system (username/password) in order to check the identity and authenticity of the examinee.
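
    A sketch of the proposed combination of factors, under the assumption that the palm matcher returns a similarity score: access is granted only if the password verifies and the score clears a threshold. The matcher, the 0.85 threshold, and the plain SHA-256 hash are placeholders, not values from the paper (a production system would use a salted key-derivation function):

```python
import hashlib
import hmac

def verify(username, password, palm_score, stored_hash, threshold=0.85):
    """Accept the examinee only if both the password hash matches and
    the palm-print matcher score clears the threshold."""
    digest = hashlib.sha256(password.encode()).hexdigest()
    password_ok = hmac.compare_digest(digest, stored_hash)
    return password_ok and palm_score >= threshold

stored = hashlib.sha256(b"s3cret").hexdigest()
print(verify("examinee01", "s3cret", palm_score=0.91,
             stored_hash=stored))   # True
```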

  15. Morphing-Based Shape Optimization in Computational Fluid Dynamics

    Science.gov (United States)

    Rousseau, Yannick; Men'Shov, Igor; Nakamura, Yoshiaki

    In this paper, a Morphing-based Shape Optimization (MbSO) technique is presented for solving Optimum-Shape Design (OSD) problems in Computational Fluid Dynamics (CFD). The proposed method couples Free-Form Deformation (FFD) and Evolutionary Computation, and, as its name suggests, relies on the morphing of shape and computational domain, rather than direct shape parameterization. Advantages of the FFD approach compared to traditional parameterization are first discussed. Then, examples of shape and grid deformations by FFD are presented. Finally, the MbSO approach is illustrated and applied through an example: the design of an airfoil for a future Mars exploration airplane.
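
    The FFD building block can be sketched with a Bernstein-polynomial lattice in 2D; the surface points and control-point offsets below are hypothetical, and in the MbSO setting the offsets would be the design variables an evolutionary optimizer mutates:

```python
import numpy as np
from math import comb

def ffd_2d(points, control_offsets):
    """Deform 2D points in the unit square by a Bernstein free-form
    deformation lattice; control_offsets has shape (l+1, m+1, 2) and
    holds the displacement of each control point."""
    l, m = control_offsets.shape[0] - 1, control_offsets.shape[1] - 1
    out = points.copy()
    for i in range(l + 1):
        for j in range(m + 1):
            # Tensor-product Bernstein weight of control point (i, j).
            b = (comb(l, i) * points[:, 0]**i * (1 - points[:, 0])**(l - i)
                 * comb(m, j) * points[:, 1]**j * (1 - points[:, 1])**(m - j))
            out = out + b[:, None] * control_offsets[i, j]
    return out

airfoil = np.array([[0.0, 0.5], [0.5, 0.6], [1.0, 0.5]])  # toy surface
offsets = np.zeros((3, 3, 2))
offsets[1, 2] = [0.0, 0.1]   # pull the upper-middle control point up
print(ffd_2d(airfoil, offsets))
```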

  16. Validation of an advanced analytical procedure applied to the measurement of environmental radioactivity.

    Science.gov (United States)

    Thanh, Tran Thien; Vuong, Le Quang; Ho, Phan Long; Chuong, Huynh Dinh; Nguyen, Vo Hoang; Tao, Chau Van

    2018-04-01

    In this work, an advanced analytical procedure was applied to calculate the radioactivity in spiked water samples measured in a close-geometry gamma spectroscopy setup. The procedure included the MCNP-CP code in order to calculate the coincidence summing correction factor (CSF). The CSF results were validated against a deterministic method using the ETNA code for both p-type HPGe detectors, and good agreement was found between the two codes. Finally, the validity of the developed procedure was confirmed by a proficiency test in which the activities of various radionuclides were calculated. The results of the radioactivity measurements with both detectors using the advanced analytical procedure received 'Accepted' status in the proficiency test. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Research in progress in applied mathematics, numerical analysis, and computer science

    Science.gov (United States)

    1990-01-01

    Research conducted at the Institute in Science and Engineering in applied mathematics, numerical analysis, and computer science is summarized. The Institute conducts unclassified basic research in applied mathematics in order to extend and improve problem solving capabilities in science and engineering, particularly in aeronautics and space.

  18. Assessment of creep-fatigue damage using the UK strain based procedure

    International Nuclear Information System (INIS)

    Bate, S.K.

    1997-01-01

    The UK strain-based procedures have been developed for the evaluation of damage in structures arising from fatigue cycles and creep processes. The fatigue damage is assessed on the basis of modelling crack growth from about one grain depth to an allowable limit which represents an engineering definition of crack formation. Creep damage is based upon the exhaustion of available ductility by creep strain accumulation. The procedures are applicable only when level A and B service conditions apply, as defined in RCC-MR or ASME Code Case N47. The procedures require the components of strain to be evaluated separately, thus they may be used with either full inelastic analysis or simplified methods. To support the development of the UK strain-based creep-fatigue procedures, an experimental program was undertaken by NNC to study the creep-fatigue interaction of structures operating at high temperature. These tests, collectively known as the SALTBATH tests, considered solid cylinder and tube-plate specimens manufactured from Type 316 stainless steel. These specimens were subjected to thermal cycles between 250 deg. C and 600 deg. C. In all cases the thermal cycle produces tensile residual stresses during dwells at 600 deg. C. One of the tube-plate specimens was used as a benchmark for validating the strain-based creep-fatigue procedures and subsequently as part of a CEC co-operative study. This benchmark work is described in this paper. A thermal and inelastic stress analysis was carried out using the finite element code ABAQUS. The inelastic behaviour of the material was described using the ORNL constitutive equations. A creep-fatigue assessment using the strain-based procedures has been compared with an assessment using the RCC-MR inelastic rules. The analyses indicated that both the UK strain-based procedures and the RCC-MR rules were conservative, but the conservatism was greater for the RCC-MR rules. (author). 8 refs, 8 figs, 4 tabs
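
    The bookkeeping behind ductility exhaustion reduces to simple damage fractions, as the illustrative sketch below shows; the numbers are made up and do not reproduce the UK procedure's actual rules or the SALTBATH data:

```python
# Fatigue damage as a cycle fraction, creep damage as the ratio of
# accumulated creep strain to the available ductility (all values
# illustrative only).
n_cycles, cycles_to_crack = 500, 2_000          # fatigue usage
creep_strain_per_dwell, ductility = 1e-5, 0.01  # strain / ductility [-]

fatigue_damage = n_cycles / cycles_to_crack
creep_damage = n_cycles * creep_strain_per_dwell / ductility
total = fatigue_damage + creep_damage           # compare against 1.0
print(f"D_f={fatigue_damage:.2f}, D_c={creep_damage:.2f}, D={total:.2f}")
```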

  19. International Conference on Applied Mathematics, Modeling and Computational Science & Annual meeting of the Canadian Applied and Industrial Mathematics

    CERN Document Server

    Bélair, Jacques; Kunze, Herb; Makarov, Roman; Melnik, Roderick; Spiteri, Raymond J

    2016-01-01

    Focusing on five main groups of interdisciplinary problems, this book covers a wide range of topics in mathematical modeling, computational science and applied mathematics. It presents a wealth of new results in the development of modeling theories and methods, advancing diverse areas of applications and promoting interdisciplinary interactions between mathematicians, scientists, engineers and representatives from other disciplines. The book offers a valuable source of methods, ideas, and tools developed for a variety of disciplines, including the natural and social sciences, medicine, engineering, and technology. Original results are presented on both the fundamental and applied level, accompanied by an ample number of real-world problems and examples emphasizing the interdisciplinary nature and universality of mathematical modeling, and providing an excellent outline of today’s challenges. Mathematical modeling, with applied and computational methods and tools, plays a fundamental role in modern science a...

  20. A Comparison of Exposure Control Procedures in CAT Systems Based on Different Measurement Models for Testlets

    Science.gov (United States)

    Boyd, Aimee M.; Dodd, Barbara; Fitzpatrick, Steven

    2013-01-01

    This study compared several exposure control procedures for CAT systems based on the three-parameter logistic testlet response theory model (Wang, Bradlow, & Wainer, 2002) and Masters' (1982) partial credit model when applied to a pool consisting entirely of testlets. The exposure control procedures studied were the modified within 0.10 logits…

  1. Applying 'Evidence-Based Medicine' Theory to Interventional Radiology. Part 2: A Spreadsheet for Swift Assessment of Procedural Benefit and Harm

    International Nuclear Information System (INIS)

    MacEneaney, Peter M.; Malone, Dermot E.

    2000-01-01

    AIM: To design a spreadsheet program for the rapid analysis of interventional radiology (IR) data, produced in local research or reported in the literature, using 'evidence-based medicine' (EBM) parameters of treatment benefit and harm. MATERIALS AND METHODS: Microsoft Excel(TM) was used. The spreadsheet consists of three worksheets. The first shows the 'Levels of Evidence and Grades of Recommendations' that can be assigned to therapeutic studies as defined by the Oxford Centre for EBM. The second and third worksheets facilitate the EBM assessment of therapeutic benefit and harm. Validity criteria are described, including assessment of the adequacy of sample size in the detection of possible procedural complications. A contingency (2 x 2) table for raw data on comparative outcomes in treated patients and controls has been incorporated, and the formulae for the EBM calculations are tied to these numerators and denominators in the spreadsheet. The parameters calculated are, for benefit, relative risk reduction, absolute risk reduction and number needed to treat (NNT), and, for harm, relative risk, relative odds and number needed to harm (NNH). Ninety-five per cent confidence intervals are calculated for all these indices. The results change automatically when the data in the therapeutic outcome cells are changed. A final section allows the user to correct the NNT or NNH in their application to individual patients. RESULTS: This spreadsheet can be used on desktop and palmtop computers. The MS Excel(TM) version can be downloaded via the Internet from the URL ftp://radiography.com/pub/TxHarm00.xls. CONCLUSION: A spreadsheet is useful for the rapid analysis of the clinical benefit and harm from IR procedures.
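
    The spreadsheet's core calculations are simple enough to reproduce outside Excel. The sketch below uses the usual 2 x 2 contingency-table definitions of the benefit indices named above (it is not the authors' worksheet layout, and the event counts are hypothetical).

        # EBM benefit indices from a 2 x 2 contingency table (hypothetical counts).
        import math

        def ebm_indices(events_treated, n_treated, events_control, n_control):
            eer = events_treated / n_treated    # experimental event rate
            cer = events_control / n_control    # control event rate
            arr = cer - eer                     # absolute risk reduction
            rrr = arr / cer                     # relative risk reduction
            rr = eer / cer                      # relative risk
            nnt = 1.0 / arr                     # number needed to treat
            # 95% confidence interval for the ARR (normal approximation):
            se = math.sqrt(eer * (1 - eer) / n_treated + cer * (1 - cer) / n_control)
            ci = (arr - 1.96 * se, arr + 1.96 * se)
            return dict(EER=eer, CER=cer, ARR=arr, RRR=rrr, RR=rr, NNT=nnt, ARR_95CI=ci)

        # 10/100 adverse outcomes in treated patients vs 25/100 in controls:
        for name, value in ebm_indices(10, 100, 25, 100).items():
            print(name, value)

    Read the other way round (a treatment causing harm), the same table yields the harm indices: relative risk, relative odds and NNH = 1 / absolute risk increase.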

  2. Power secant method applied to natural frequency extraction of Timoshenko beam structures

    Directory of Open Access Journals (Sweden)

    C.A.N. Dias

    This work deals with an improved plane frame formulation whose exact dynamic stiffness matrix (DSM) presents, uniquely, a null determinant at the natural frequencies. In comparison with the classical DSM, the formulation presented here has some major advantages: local mode shapes are preserved in the formulation, so that, for any positive frequency, the DSM will never be ill-conditioned; and, in the absence of poles, it is possible to employ the secant method in order to obtain a more computationally efficient eigenvalue extraction procedure. Applying the procedure to the more general case of Timoshenko beams, we introduce a new technique, named "power deflation", that makes the secant method suitable for the transcendental nonlinear eigenvalue problems based on the improved DSM. In order to avoid overflow occurrences that can hinder the secant method iterations, limiting frequencies are formulated, with scaling also applied to the eigenvalue problem. Comparisons with results available in the literature demonstrate the strength of the proposed method. Computational efficiency is compared with solutions obtained both by FEM and by the Wittrick-Williams algorithm.
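
    The eigenvalue extraction step rests on a plain secant iteration applied to the determinant of the dynamic stiffness matrix, which by construction is null exactly at the natural frequencies. The sketch below shows only the generic secant update on a scalar function det_dsm(omega); the DSM assembly, the "power deflation" technique and the overflow scaling described in the paper are not reproduced.

        # Secant iteration for det(DSM(omega)) = 0 (generic sketch).
        import math

        def secant(f, w0, w1, tol=1e-10, max_iter=100):
            f0, f1 = f(w0), f(w1)
            for _ in range(max_iter):
                if f1 == f0:
                    break                              # flat secant; stop
                w2 = w1 - f1 * (w1 - w0) / (f1 - f0)   # secant update
                if abs(w2 - w1) < tol:
                    return w2
                w0, f0, w1, f1 = w1, f1, w2, f(w2)
            return w1

        # Toy stand-in for the DSM determinant, with roots playing the role
        # of natural frequencies (zeros at integer multiples of pi):
        det_dsm = math.sin
        print(secant(det_dsm, 3.0, 3.2))   # converges to pi ~ 3.14159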

  3. Computer based approach to fatigue analysis and design

    International Nuclear Information System (INIS)

    Comstock, T.R.; Bernard, T.; Nieb, J.

    1979-01-01

    An approach is presented which uses a mini-computer-based system for data acquisition, analysis and graphic display relative to fatigue life estimation and design. Procedures are developed for identifying and eliminating damaging events due to the overall duty cycle, forced vibration and structural dynamic characteristics. Two case histories, weld failures in heavy vehicles and low cycle fan blade failures, are discussed to illustrate the overall approach. (orig.)

  4. Computer-Based Learning: The Use of SPSS Statistical Program for Improving Biostatistical Competence of Medical Students

    Directory of Open Access Journals (Sweden)

    Zvi H. Perry

    2014-01-01

    Background. We changed the biostatistics curriculum for our medical students and created a course entitled "Multivariate analysis of statistical data, using the SPSS package." Purposes. The aim of this course was to develop students' skills in computerized data analysis, as well as to enhance their ability to read and interpret statistical data analysis in the literature. Methods. Using course-specific evaluation questionnaires, we show that a computer-based course for biostatistics and advanced data analysis is feasible and efficient. Results. Its efficacy is both subjective (our subjects felt better prepared to write their theses and to read articles with advanced statistical data analysis) and objective (their knowledge of how and when to apply statistical procedures improved). Conclusions. A formal evaluative process for such a course is possible, and it enhances the learning experience both for the students and for their teachers.

  5. Teaching and Learning Methodologies Supported by ICT Applied in Computer Science

    Science.gov (United States)

    Capacho, Jose

    2016-01-01

    The main objective of this paper is to show a set of new methodologies applied in the teaching of Computer Science using ICT. The methodologies are framed in the conceptual basis of the following sciences: Psychology, Education and Computer Science. The theoretical framework of the research is supported by Behavioral Theory, Gestalt Theory.…

  6. Computer vision based nacre thickness measurement of Tahitian pearls

    Science.gov (United States)

    Loesdau, Martin; Chabrier, Sébastien; Gabillon, Alban

    2017-03-01

    The Tahitian pearl is the most valuable export product of French Polynesia, contributing over 61 million Euros, more than 50% of the total export income. To maintain its excellent reputation on the international market, an obligatory quality control for every pearl destined for exportation has been established by the local government. One of the controlled quality parameters is the pearl's nacre thickness. The evaluation is currently done manually by experts who visually analyse X-ray images of the pearls. In this article, a computer vision based approach to automate this procedure is presented. Even though computer vision based approaches for pearl nacre thickness measurement exist in the literature, the very specific features of the Tahitian pearl, namely its large shape variety and the occurrence of cavities, have so far not been considered; the presented work closes this gap. Our method consists of segmenting the pearl from X-ray images with a model-based approach, segmenting the pearl's nucleus with a purpose-developed heuristic circle detection, and segmenting possible cavities with region growing. From the obtained boundaries, the 2-dimensional nacre thickness profile can be calculated. A certainty measurement to account for imaging and segmentation imprecision is included in the procedure. The proposed algorithms are tested on 298 manually evaluated Tahitian pearls, showing that it is generally possible to evaluate the nacre thickness of Tahitian pearls automatically with computer vision. Furthermore, the results show that the automatic measurement is more precise and faster than the manual one.
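
    While the model-based pearl segmentation and the heuristic circle detection are specific to the authors' system, the final step, turning the segmented boundaries into a thickness profile, can be sketched with standard tools. The snippet below assumes an X-ray image in which the outer boundary is recoverable by thresholding plus contour extraction and the nucleus by a Hough circle fit; the file name, thresholds and Hough parameters are illustrative placeholders.

        # Nacre thickness profile from segmented boundaries (illustrative sketch).
        import cv2
        import numpy as np

        img = cv2.imread("pearl_xray.png", cv2.IMREAD_GRAYSCALE)

        # 1. Outer pearl boundary: Otsu threshold + largest external contour.
        _, mask = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
        pearl = max(contours, key=cv2.contourArea).reshape(-1, 2)

        # 2. Roughly circular nucleus via Hough transform.
        circles = cv2.HoughCircles(img, cv2.HOUGH_GRADIENT, dp=1, minDist=100,
                                   param1=100, param2=30, minRadius=20, maxRadius=200)
        if circles is None:
            raise SystemExit("no nucleus detected")
        cx, cy, r = circles[0][0]

        # 3. 2D thickness profile: radial distance from the nucleus circle to
        #    each point of the outer boundary.
        thickness = np.hypot(pearl[:, 0] - cx, pearl[:, 1] - cy) - r
        print("min/mean nacre thickness [px]:", thickness.min(), thickness.mean())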

  7. Computed tomography after radical pancreaticoduodenectomy (Whipple's procedure)

    Energy Technology Data Exchange (ETDEWEB)

    Smith, S.L. [Department of Radiology, Ipswich Hospital, Ipswich IP4 5PD (United Kingdom)], E-mail: simon.smith@ipswichhospital.nhs.uk; Hampson, F. [Department of Radiology, Addenbrooke's Hospital NHS Trust, Cambridge (United Kingdom); Duxbury, M. [Department of Surgery, Royal Infirmary of Edinburgh, Little France, Edinburgh EH16 4SU (United Kingdom); Rae, D.M.; Sinclair, M.T. [Department of Pancreaticobiliary Surgery, Ipswich Hospital, Ipswich IP4 5PD (United Kingdom)

    2008-08-15

    Whipple's procedure (radical pancreaticoduodenectomy) is currently the only curative option for patients with periampullary malignancy. The surgery is highly complex and involves multiple anastomoses. Complications are common and can lead to significant postoperative morbidity. Early detection and treatment of complications is vital, and high-quality multidetector computed tomography (MDCT) is currently the best method of investigation. This review outlines the surgical technique and illustrates the range of normal postoperative appearances together with the common complications.

  8. Establishing Reliable Cognitive Change in Children with Epilepsy: The Procedures and Results for a Sample with Epilepsy

    Science.gov (United States)

    van Iterson, Loretta; Augustijn, Paul B.; de Jong, Peter F.; van der Leij, Aryan

    2013-01-01

    The goal of this study was to investigate reliable cognitive change in epilepsy by developing computational procedures to determine reliable change index scores (RCIs) for the Dutch Wechsler Intelligence Scales for Children. First, RCIs were calculated based on stability coefficients from a reference sample. Then, these RCIs were applied to a…
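
    The reliable change index itself follows the standard Jacobson-Truax construction, which needs only the stability (test-retest) coefficient and the standard deviation of the reference sample. The snippet below is a minimal sketch of that textbook formula, not the authors' WISC-specific tables; the inputs are hypothetical.

        # Jacobson-Truax reliable change index (generic sketch).
        import math

        def rci(score_t1, score_t2, sd, r_xx):
            """RCI = (retest - test) / SE_diff,
            with SE_diff = sd * sqrt(2) * sqrt(1 - r_xx)."""
            se_meas = sd * math.sqrt(1.0 - r_xx)   # standard error of measurement
            se_diff = math.sqrt(2.0) * se_meas     # SE of the difference score
            return (score_t2 - score_t1) / se_diff

        # Hypothetical IQ retest: 100 -> 112, sd = 15, stability coefficient 0.90.
        z = rci(100, 112, 15, 0.90)
        print(f"RCI = {z:.2f}; |RCI| > 1.96 suggests reliable change")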

  9. A simple but accurate procedure for solving the five-parameter model

    International Nuclear Information System (INIS)

    Mares, Oana; Paulescu, Marius; Badescu, Viorel

    2015-01-01

    Highlights: • A new procedure for extracting the parameters of the one-diode model is proposed. • Only the basic information listed in the datasheet of PV modules is required. • Results demonstrate a simple, robust and accurate procedure. - Abstract: The current–voltage characteristic of a photovoltaic module is typically evaluated by using a model based on the solar cell equivalent circuit. The complexity of the procedure applied for extracting the model parameters depends on the data available in the manufacturer's datasheet. Since the datasheet is often not detailed enough, simplified models have to be used in many cases. This paper proposes a new procedure for extracting the parameters of the one-diode model in standard test conditions, using only the basic data listed by all manufacturers in datasheets (short circuit current, open circuit voltage and maximum power point). The procedure is validated by using manufacturers' data for six commercial crystalline silicon photovoltaic modules. Comparing the computed and measured current–voltage characteristics, the determination coefficient is in the range 0.976–0.998. Thus, the proposed procedure represents a feasible tool for solving the five-parameter model applied to crystalline silicon photovoltaic modules. The procedure is described in detail, to guide potential users to derive similar models for other types of photovoltaic modules.
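
    The single-diode equation at the heart of the five-parameter model is implicit in the current, so even evaluating the I-V characteristic requires a numerical solve. The sketch below evaluates that equation with scipy for an assumed parameter set; it illustrates the model being solved, not the authors' extraction procedure, and every parameter value is a placeholder.

        # Single-diode model, solved implicitly for I at a given V:
        #   I = Iph - I0*(exp((V + I*Rs)/(n*Vt)) - 1) - (V + I*Rs)/Rsh
        import numpy as np
        from scipy.optimize import brentq

        Iph, I0, Rs, Rsh, n = 8.0, 1e-9, 0.3, 300.0, 1.3   # assumed parameters
        Vt = 0.025852 * 60                                  # thermal voltage x 60 cells

        def current(V):
            f = lambda I: (Iph - I0 * (np.exp((V + I * Rs) / (n * Vt)) - 1)
                           - (V + I * Rs) / Rsh - I)
            return brentq(f, 0.0, 1.1 * Iph)   # bracket between 0 and ~Isc

        for V in (0.0, 20.0, 30.0, 35.0):
            print(f"V = {V:5.1f} V  ->  I = {current(V):.3f} A")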

  10. Cone-beam computed tomography imaging: therapeutic staff dose during chemoembolisation procedure

    International Nuclear Information System (INIS)

    Paul, Jijo; Vogl, Thomas J; Chacko, Annamma; Mbalisike, Emmanuel C

    2014-01-01

    Cone-beam computed tomography (CBCT) imaging is an important requirement for performing real-time therapeutic image-guided procedures on patients. The purpose of this study is to estimate the personal dose equivalent and annual personal dose from CBCT imaging during transarterial chemoembolisation (TACE). Staff doses (therapeutic and assistant physician) were collected during CBCT examinations of 200 patients (65 ± 15 years, range 40–86) over six months. Absorbed doses were assessed using thermo-luminescent dosimeters during hepatic TACE therapy. We estimated the personal dose equivalent (PDE) and annual personal dose (APD) from the absorbed dose based on the International Atomic Energy Agency protocol. The APD for the therapeutic procedure was calculated (therapeutic physician: 5.6 mSv; assistant physician: 5.08 mSv) based on the institutional workload. Regarding PDE, the hands of the staff members received a greater dose than other anatomical locations (therapeutic physician: 56 mSv, 72 mSv; assistant physician: 12 mSv, 14 mSv). The PDE and APD of both staff members were within the recommended ICRP-103 annual limits, and the dose to the assistant physician was lower than the dose to the therapeutic physician during imaging. Annual radiation doses to the eye lenses and hands of both staff members were lower than the limits prescribed by the International Commission on Radiological Protection (ICRP). (paper)

  11. Fuzzy Clustering based Methodology for Multidimensional Data Analysis in Computational Forensic Domain

    OpenAIRE

    Kilian Stoffel; Paul Cotofrei; Dong Han

    2012-01-01

    As an interdisciplinary domain requiring advanced and innovative methodologies, the computational forensics domain is characterized by data that are simultaneously large-scale and uncertain, multidimensional and approximate. Forensic domain experts, trained to discover hidden patterns in crime data, are limited in their analysis without the assistance of a computational intelligence approach. In this paper a methodology and an automatic procedure, based on fuzzy set theory and designed to infer precis...

  12. Interactive computer programs for applied nutrition education.

    Science.gov (United States)

    Wise, A

    1985-12-01

    DIET2 and DIET3 are programs written for a DEC2050 computer and intended for teaching applied nutrition to students of nutrition, dietetics, home economics, and hotel and institutional administration. DIET2 combines all the facilities of the separate dietary programs already available at Robert Gordon's Institute of Technology into a single package, and extends these to give students a large amount of relevant information about the nutritional balance of foods (including DHSS and NACNE recommendations) prior to choosing them for meals. Students are also helped by the inclusion of typical portion weights. They are presented with an analysis of nutrients and their balance in the menu created, with an easy mechanism for amending the menu and adding foods which provide the nutrients that are lacking. At any stage the computer can give the proportion of total nutrient provided by each meal. DIET3 is a relatively simple program that displays the nutritional profile of foods and diets semigraphically.

  13. Virtual rounds: simulation-based education in procedural medicine

    Science.gov (United States)

    Shaffer, David W.; Meglan, Dwight A.; Ferrell, Margaret; Dawson, Steven L.

    1999-07-01

    Computer-based simulation is a goal for training physicians in specialties where traditional training puts patients at risk. Intuitively, interactive simulation of anatomy, pathology, and therapeutic actions should lead to shortening of the learning curve for novice or inexperienced physicians. Effective transfer of knowledge acquired in simulators must be shown for such devices to be widely accepted in the medical community. We have developed an Interventional Cardiology Training Simulator which incorporates real-time graphic interactivity coupled with haptic response, and an embedded curriculum permitting rehearsal, hypertext links, personal archiving and instructor review and testing capabilities. This linking of purely technical simulation with educational content creates a more robust educational purpose for procedural simulators.

  14. Cell-based land use screening procedure for regional siting analysis

    International Nuclear Information System (INIS)

    Jalbert, J.S.; Dobson, J.E.

    1976-01-01

    An energy facility site-screening methodology which permits the land resource planner to identify candidate siting areas was developed. Through the use of spatial analysis procedures and computer graphics, a selection of candidate areas is obtained. Specific sites may then be selected from among the candidate areas for environmental impact analysis. The computerized methodology utilizes a cell-based geographic information system for specifying the suitability of candidate areas for an energy facility. The criteria to be considered may be specified by the user and weighted in terms of importance. Three primary computer programs have been developed; these produce thematic maps, proximity calculations, and suitability calculations. The programs are written so as to be transferable to regional planning or regulatory agencies, to assist in rational and comprehensive power plant site identification and analysis
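
    The heart of such a cell-based screening system, a weighted suitability overlay across criterion grids, is easy to sketch. The snippet below combines hypothetical criterion rasters with user-supplied weights and applies a hard exclusion mask; it is a generic illustration of the approach, not the programs described above.

        # Weighted suitability overlay on a cell grid (hypothetical criteria).
        import numpy as np

        rng = np.random.default_rng(0)
        shape = (50, 50)                                  # cells of the study region

        criteria = {                                      # each scaled to [0, 1]
            "slope": rng.random(shape),
            "water_proximity": rng.random(shape),
        }
        weights = {"slope": 0.6, "water_proximity": 0.4}
        exclusion = rng.integers(0, 2, shape)             # 0 = excluded cell

        suitability = sum(weights[k] * criteria[k] for k in criteria)
        suitability = suitability * exclusion             # mask excluded cells

        # Candidate siting cells: top 5% of suitability scores.
        cutoff = np.quantile(suitability, 0.95)
        print("candidate cells:", int((suitability >= cutoff).sum()))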

  15. A CFD-based aerodynamic design procedure for hypersonic wind-tunnel nozzles

    Science.gov (United States)

    Korte, John J.

    1993-01-01

    A new procedure which unifies the best of current classical design practices, computational fluid dynamics (CFD), and optimization procedures is demonstrated for designing the aerodynamic lines of hypersonic wind-tunnel nozzles. The new procedure can be used to design hypersonic wind-tunnel nozzles with thick boundary layers, where the classical design procedure has been shown to break down. An efficient CFD code, which solves the parabolized Navier-Stokes (PNS) equations using an explicit upwind algorithm, is coupled to a least-squares (LS) optimization procedure. An LS problem is formulated to minimize the difference between the computed flow field and the objective function, consisting of the centerline Mach number distribution and the exit Mach number and flow angle profiles. The aerodynamic lines of the nozzle are defined using a cubic spline, the slopes of which are optimized with the design procedure. The advantages of the new procedure are that it allows full use of powerful CFD codes in the design process, solves an optimization problem to determine the new contour, can be used to design new nozzles or improve sections of existing nozzles, and automatically compensates the nozzle contour for viscous effects as part of the unified design procedure. The new procedure is demonstrated by the design of two Mach 15 nozzles, a Mach 12 nozzle, and a Mach 18 helium nozzle. The flexibility of the procedure is demonstrated by designing the two Mach 15 nozzles under different constraints: the first nozzle for a fixed length and exit diameter, and the second for a fixed length and throat diameter. The computed flow field for the Mach 15 least-squares parabolized Navier-Stokes (LS/PNS) designed nozzle is compared with that of the classically designed nozzle and demonstrates a significant improvement in the flow expansion process and the uniform core region.

  16. Internet messenger based smart virtual class learning using ubiquitous computing

    Science.gov (United States)

    Umam, K.; Mardi, S. N. S.; Hariadi, M.

    2017-06-01

    Internet messenger (IM) has become an important educational technology component in college education; IM makes it possible for students to engage in learning and collaboration in smart virtual class learning (SVCL) using ubiquitous computing. However, models of IM-based SVCL using ubiquitous computing, and empirical evidence that would favor their broad application to improve engagement and behavior, are still limited. In addition, the expectation that IM-based SVCL using ubiquitous computing could improve engagement and behavior in smart classes cannot yet be confirmed, because the majority of the reviewed studies followed instruction paradigms. This article aims to present a model of IM-based SVCL using ubiquitous computing and to show learners' experiences of improved engagement and behavior in learner-learner and learner-lecturer interactions. The method applied in this paper includes a design process and quantitative analysis techniques, with the purpose of identifying scenarios of ubiquitous computing and capturing the impressions of learners and lecturers about engagement and behavior and their contribution to learning.

  17. Evaluation of the performance of MP4-based procedures for a wide range of thermochemical and kinetic properties

    Energy Technology Data Exchange (ETDEWEB)

    Yu, Li-Juan; Wan, Wenchao; Karton, Amir, E-mail: amir.karton@uwa.edu.au

    2016-11-30

    We evaluate the performance of standard and modified MPn procedures for a wide set of thermochemical and kinetic properties, including atomization energies, structural isomerization energies, conformational energies, and reaction barrier heights. The reference data are obtained at the CCSD(T)/CBS level by means of the Wn thermochemical protocols. We find that none of the MPn-based procedures show acceptable performance for the challenging W4-11 and BH76 databases. For the other thermochemical/kinetic databases, the MP2.5 and MP3.5 procedures provide the most attractive accuracy-to-computational cost ratios. The MP2.5 procedure results in a weighted-total-root-mean-square deviation (WTRMSD) of 3.4 kJ/mol, whilst the computationally more expensive MP3.5 procedure results in a WTRMSD of 1.9 kJ/mol (the same WTRMSD obtained for the CCSD(T) method in conjunction with a triple-zeta basis set). We also assess the performance of the computationally economical CCSD(T)/CBS(MP2) method, which provides the best overall performance for all the considered databases, including W4-11 and BH76.
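
    The "half-integer" MPn procedures referenced here interpolate between consecutive perturbation orders; MP2.5, for instance, is conventionally obtained by scaling the third-order correction by one half. The snippet below shows that composition together with a weighted-RMSD style of aggregation over databases; all numerical values, and the exact weighting scheme, are illustrative rather than taken from the paper.

        import math

        # MP2.5: midpoint between MP2 and MP3, i.e. the third-order
        # correction E3 = E(MP3) - E(MP2) scaled by 0.5.
        def mp2_5(e_mp2, e_mp3):
            return e_mp2 + 0.5 * (e_mp3 - e_mp2)

        print(mp2_5(-76.3081, -76.3152))   # hypothetical energies in hartree

        # Weighted-total RMSD over several databases (hypothetical numbers).
        rmsd = {"conformers": 1.1, "isomerization": 2.8, "barriers": 4.5}  # kJ/mol
        weight = {"conformers": 0.5, "isomerization": 0.3, "barriers": 0.2}
        wtrmsd = math.sqrt(sum(weight[k] * rmsd[k] ** 2 for k in rmsd)
                           / sum(weight.values()))
        print(f"WTRMSD = {wtrmsd:.2f} kJ/mol")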

  18. The use of gold nanoparticle aggregation for DNA computing and logic-based biomolecular detection

    International Nuclear Information System (INIS)

    Lee, In-Hee; Yang, Kyung-Ae; Zhang, Byoung-Tak; Lee, Ji-Hoon; Park, Ji-Yoon; Chai, Young Gyu; Lee, Jae-Hoon

    2008-01-01

    The use of DNA molecules as a physical computational material has attracted much interest, especially in the area of DNA computing. DNAs are also useful for logical control and analysis of biological systems if efficient visualization methods are available. Here we present a quick and simple visualization technique that displays the results of the DNA computing process based on a colorimetric change induced by gold nanoparticle aggregation, and we apply it to the logic-based detection of biomolecules. Our results demonstrate its effectiveness in both DNA-based logical computation and logic-based biomolecular detection

  19. Applying activity theory to computer-supported collaborative learning and work-based activities in corporate settings

    NARCIS (Netherlands)

    Collis, Betty; Margaryan, A.

    2004-01-01

    Business needs in many corporations call for learning outcomes that involve problem solutions, and creating and sharing new knowledge within workplace situations that may involve collaboration among members of a team. We argue that work-based activities (WBA) and computer-supported collaborative

  20. Quantum neural network-based EEG filtering for a brain-computer interface.

    Science.gov (United States)

    Gandhi, Vaibhav; Prasad, Girijesh; Coyle, Damien; Behera, Laxmidhar; McGinnity, Thomas Martin

    2014-02-01

    A novel neural information processing architecture inspired by quantum mechanics and incorporating the well-known Schrodinger wave equation is proposed in this paper. The proposed architecture referred to as recurrent quantum neural network (RQNN) can characterize a nonstationary stochastic signal as time-varying wave packets. A robust unsupervised learning algorithm enables the RQNN to effectively capture the statistical behavior of the input signal and facilitates the estimation of signal embedded in noise with unknown characteristics. The results from a number of benchmark tests show that simple signals such as dc, staircase dc, and sinusoidal signals embedded within high noise can be accurately filtered and particle swarm optimization can be employed to select model parameters. The RQNN filtering procedure is applied in a two-class motor imagery-based brain-computer interface where the objective was to filter electroencephalogram (EEG) signals before feature extraction and classification to increase signal separability. A two-step inner-outer fivefold cross-validation approach is utilized to select the algorithm parameters subject-specifically for nine subjects. It is shown that the subject-specific RQNN EEG filtering significantly improves brain-computer interface performance compared to using only the raw EEG or Savitzky-Golay filtered EEG across multiple sessions.
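
    The RQNN itself is beyond a short sketch, but the Savitzky-Golay filtering used as a comparison baseline in the study is a one-liner with scipy. The snippet below applies it to a noisy sinusoid standing in for one raw EEG channel; the sampling rate, window length and polynomial order are illustrative, not the paper's settings.

        # Savitzky-Golay baseline filtering of a noisy signal (EEG stand-in).
        import numpy as np
        from scipy.signal import savgol_filter

        fs = 256                                    # assumed sampling rate [Hz]
        t = np.arange(0, 2, 1 / fs)
        clean = np.sin(2 * np.pi * 10 * t)          # 10 Hz oscillatory component
        noisy = clean + 0.8 * np.random.default_rng(1).standard_normal(t.size)

        filtered = savgol_filter(noisy, window_length=31, polyorder=3)

        mse = lambda x: float(np.mean((x - clean) ** 2))
        print(f"MSE raw: {mse(noisy):.3f} -> MSE filtered: {mse(filtered):.3f}")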

  1. 3rd ACIS International Conference on Applied Computing and Information Technology

    CERN Document Server

    2016-01-01

    This edited book presents scientific results of the 3rd International Conference on Applied Computing and Information Technology (ACIT 2015), which was held on July 12-16, 2015 in Okayama, Japan. The aim of this conference was to bring together researchers and scientists, businessmen and entrepreneurs, teachers, engineers, computer users, and students to discuss the numerous fields of computer science, to share their experiences and exchange new ideas and information in a meaningful way, to present research results about all aspects (theory, applications and tools) of computer and information science, and to discuss the practical challenges encountered along the way and the solutions adopted to solve them.

  2. Computer-based training at Sellafield

    International Nuclear Information System (INIS)

    Cartmell, A.; Evans, M.C.

    1986-01-01

    British Nuclear Fuels Limited (BNFL) operates the United Kingdom's spent-fuel receipt, storage, and reprocessing complex at Sellafield. Spent fuel from graphite-moderated CO2-cooled Magnox reactors has been reprocessed at Sellafield for 22 yr. Spent fuel from light water and advanced gas reactors is stored pending reprocessing in the Thermal Oxide Reprocessing Plant currently being constructed. The range of knowledge and skills needed for plant operation, construction, and commissioning represents a formidable training requirement. In addition, employees need to be acquainted with company practices and procedures. Computer-based training (CBT) is expected to play a significant role in this process. In this paper, current applications of CBT to the field of nuclear criticality safety are described and plans for the immediate future are outlined

  3. An Innovative Adaptive Pushover Procedure Based on Storey Shear

    International Nuclear Information System (INIS)

    Shakeri, Kazem; Shayanfar, Mohsen A.

    2008-01-01

    Since conventional pushover analyses are unable to consider the effect of the higher modes and the progressive variation in dynamic properties, recent years have witnessed the development of advanced adaptive pushover methods. In these methods, however, using quadratic combination rules to combine the modal forces results in positive load-pattern values at all storeys, so the sign reversals of the higher modes are removed; consequently these methods do not have a major advantage over their non-adaptive counterparts. Here an innovative adaptive pushover method based on storey shear is proposed which can take the sign reversals in the higher modes into account. In each storey the applied load pattern is derived from the storey shear profile; consequently, the sign of the applied loads can change in consecutive steps. The accuracy of the proposed procedure is examined by applying it to a 20-storey steel building; it yields a good estimate of the peak response in the inelastic phase

  4. Component-based software for high-performance scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Alexeev, Yuri; Allan, Benjamin A; Armstrong, Robert C; Bernholdt, David E; Dahlgren, Tamara L; Gannon, Dennis; Janssen, Curtis L; Kenny, Joseph P; Krishnan, Manojkumar; Kohl, James A; Kumfert, Gary; McInnes, Lois Curfman; Nieplocha, Jarek; Parker, Steven G; Rasmussen, Craig; Windus, Theresa L

    2005-01-01

    Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.

  6. Computational model for dosimetric purposes in dental procedures

    International Nuclear Information System (INIS)

    Kawamoto, Renato H.; Campos, Tarcisio R.

    2013-01-01

    This study aims to develop a computational model of the oral region for dosimetric purposes, based on the computational tools SISCODES and MCNP-5, to predict deterministic effects and minimize stochastic effects caused by ionizing radiation in radiodiagnosis. Based on a set of digital information provided by computed tomography, a three-dimensional voxel model was created and its tissues represented. The model was exported to the MCNP code. In association with SISCODES, the Monte Carlo N-Particle transport code (MCNP-5) was used to reproduce the statistical process of the interaction of nuclear particles with human tissues. The study will serve as a source of data for dosimetric studies in the oral region, helping to predict deterministic effects and to minimize the stochastic effects of ionizing radiation

  7. Do flow principles of operations management apply to computing centres?

    CERN Document Server

    Abaunza, Felipe; Hameri, Ari-Pekka; Niemi, Tapio

    2014-01-01

    By analysing large data-sets on jobs processed in major computing centres, we study how operations management principles apply to these modern-day processing plants. We show that Little's Law on long-term performance averages holds for computing centres, i.e. work-in-progress equals throughput rate multiplied by process lead time. Contrary to traditional manufacturing principles, the law of variation does not hold for computing centres: the more variation in job lead times, the better the throughput and utilisation of the system. We also show that as the utilisation of the system increases, lead times and work-in-progress increase, which complies with traditional manufacturing. In comparison with current computing centre operations, these results imply that better allocation of jobs could increase throughput and utilisation while less computing resource is needed, thus increasing the overall efficiency of the centre. From a theoretical point of view, in a system with close to zero set-up times, as in the c...
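
    Little's Law, the relation verified here, states that long-run work-in-progress equals throughput multiplied by lead time, L = lambda * W. The snippet below checks it on a hypothetical job log of arrival times and lead times, the same kind of data-set analysis the study performs (with this estimator the identity holds by construction; on real logs the two sides are measured independently).

        # Empirical check of Little's Law (L = lambda * W) on a toy job log.
        import numpy as np

        rng = np.random.default_rng(42)
        n_jobs = 10_000
        arrivals = np.cumsum(rng.exponential(1.0, n_jobs))   # arrival times
        lead_times = rng.exponential(5.0, n_jobs)             # lead times W
        departures = arrivals + lead_times

        horizon = departures.max()
        throughput = n_jobs / horizon                         # lambda [jobs/time]
        mean_lead = lead_times.mean()                         # W

        # Time-averaged work-in-progress: total time in system / horizon.
        wip = lead_times.sum() / horizon                      # L
        print(f"L = {wip:.2f}  vs  lambda * W = {throughput * mean_lead:.2f}")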

  8. [Geometry, analysis, and computation in mathematics and applied science]. Progress report

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, D.

    1994-02-01

    The principal investigators' work on a variety of pure and applied problems in Differential Geometry, Calculus of Variations and Mathematical Physics has been done in a computational laboratory and been based on interactive scientific computer graphics and high speed computation created by the principal investigators to study geometric interface problems in the physical sciences. We have developed software to simulate various physical phenomena from constrained plasma flow to the electron microscope imaging of the microstructure of compound materials, techniques for the visualization of geometric structures that have been used to make significant breakthroughs in the global theory of minimal surfaces, and graphics tools to study evolution processes, such as flow by mean curvature, while simultaneously developing the mathematical foundation of the subject. An increasingly important activity of the laboratory is to extend this environment in order to support and enhance scientific collaboration with researchers at other locations. Toward this end, the Center developed the GANGVideo distributed video software system and software methods for running lab-developed programs simultaneously on remote and local machines. Further, the Center operates a broadcast video network, running in parallel with the Center's data networks, over which researchers can access stored video materials or view ongoing computations. The graphical front-end to GANGVideo can be used to make "multi-media mail" from both "live" computing sessions and stored materials without video editing. Currently, videotape is used as the delivery medium, but GANGVideo is compatible with future "all-digital" distribution systems. Thus as a byproduct of mathematical research, we are developing methods for scientific communication. But, most important, our research focuses on important scientific problems; the parallel development of computational and graphical tools is driven by scientific needs.

  9. A Novel UDT-Based Transfer Speed-Up Protocol for Fog Computing

    Directory of Open Access Journals (Sweden)

    Zhijie Han

    2018-01-01

    Fog computing is a distributed computing model that acts as the middle layer between the cloud data center and IoT devices/sensors. It provides computing, network, and storage capabilities so that cloud-based services can be closer to IoT devices and sensors. Cloud computing requires a lot of bandwidth, and the bandwidth of the wireless network is limited; in contrast, the amount of bandwidth required for fog computing is much less. In this paper, we propose an improved protocol, the Peer Assistant UDT-based Data Transfer Protocol (PaUDT), applied to IoT-cloud computing. Furthermore, we compare the efficiency of the congestion control algorithm of UDT with that of Adobe's Secure Real-Time Media Flow Protocol (RTMFP), which is based entirely on UDP at the transport layer. Finally, we build an evaluation model of UDT performance in terms of RTT and bit error ratio. The theoretical analysis and experimental results show that UDT performs well in IoT-cloud computing.

  10. Encountering the Expertise Reversal Effect with a Computer-Based Environment on Electrical Circuit Analysis

    Science.gov (United States)

    Reisslein, Jana; Atkinson, Robert K.; Seeling, Patrick; Reisslein, Martin

    2006-01-01

    This study examined the effectiveness of a computer-based environment employing three example-based instructional procedures (example-problem, problem-example, and fading) to teach series and parallel electrical circuit analysis to learners classified by two levels of prior knowledge (low and high). Although no differences between the…

  11. A surrogate based multistage-multilevel optimization procedure for multidisciplinary design optimization

    NARCIS (Netherlands)

    Yao, W.; Chen, X.; Ouyang, Q.; Van Tooren, M.

    2011-01-01

    Optimization procedures are among the key techniques for addressing the computational and organizational complexities of multidisciplinary design optimization (MDO). Motivated by the idea of synthetically exploiting the advantages of multiple existing optimization procedures and meanwhile complying with

  12. Applied field test procedures on petroleum release sites

    International Nuclear Information System (INIS)

    Gilbert, G.; Nichols, L.

    1995-01-01

    The effective remediation of petroleum-contaminated soils and ground water is a significant issue for Williams Pipe Line Co. (Williams), costing $6.8 million in 1994. It is therefore in Williams' best interest to adopt approaches and apply technologies that are both cost-effective and compliant with regulations. Williams has found that the use of soil vapor extraction (SVE) and air sparging (AS) field test procedures at the onset of a petroleum release investigation/remediation accomplishes these goals. This paper focuses on the application of AS/SVE as the preferred technology for a specific type of remediation: refined petroleum products. In situ field tests are used prior to designing a full-scale remedial system, first to validate or disprove initial assumptions on the applicability of the technology. During the field test, remedial system design parameters are also collected to tailor the design and operation of a full-scale system to site-specific conditions, minimizing cost and optimizing effectiveness. In situ field tests should be designed and operated to simulate as closely as possible the operation of a full-scale remedial system. The procedures of an in situ field test are presented. The results of numerous field tests and the associated costs are also evaluated and compared with full-scale remedial systems and total project costs to demonstrate overall effectiveness. There are many advantages of AS/SVE technologies over conventional fluid extraction or SVE systems alone; the primary advantage is the ability to simultaneously reduce volatile and biodegradable compound concentrations in the phreatic, capillary fringe, and unsaturated zones

  13. Evaluation of Computer Based Testing in lieu of Regular Examinations in Computer Literacy

    Science.gov (United States)

    Murayama, Koichi

    Because computer-based testing (CBT) has many advantages compared with the conventional paper-and-pencil testing (PPT) examination method, CBT has begun to be used in various situations in Japan, such as in qualifying examinations and the TOEFL. This paper describes the usefulness and the problems of CBT applied to a regular college examination. The regular computer literacy examinations for first-year students were held using CBT, and the results were analyzed. Responses to a questionnaire indicated that many students accepted CBT without unpleasantness and considered CBT a positive factor that improved their motivation to study. CBT also decreased the work of faculty in terms of marking tests and data reduction.

  14. Identity based Encryption and Biometric Authentication Scheme for Secure Data Access in Cloud Computing

    DEFF Research Database (Denmark)

    Cheng, Hongbing; Rong, Chunming; Tan, Zheng-Hua

    2012-01-01

    Cloud computing will be a main information infrastructure in the future; it consists of many large datacenters which are usually geographically distributed and heterogeneous. How to design secure data access for a cloud computing platform is a big challenge. In this paper, we propose a secure data access scheme based on identity-based encryption and biometric authentication for cloud computing. Firstly, we describe the security concerns of cloud computing and then propose an integrated data access scheme; the procedure of the proposed scheme includes parameter setup, key distribution, feature template creation, cloud data processing and secure data access control. Finally, we compare the proposed scheme with other schemes through comprehensive analysis and simulation. The results show that the proposed data access scheme is feasible and secure for cloud computing.

  15. Proposed design procedure for transmission shafting under fatigue loading

    Science.gov (United States)

    Loewenthal, S. H.

    1978-01-01

    The B106 American National Standards Committee is currently preparing a new standard for the design of transmission shafting. A design procedure, proposed for use in the new standard, for computing the diameter of rotating solid steel shafts under combined cyclic bending and steady torsion is presented. The formula is based on an elliptical variation of endurance strength with torque exhibited by combined stress fatigue data. Fatigue factors are cited to correct specimen bending endurance strength data for use in the shaft formula. A design example illustrates how the method is to be applied.
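
    The elliptical interaction of cyclic bending and steady torsion leads to the diameter formula commonly cited for this procedure, d^3 = (32 FS / pi) * sqrt((M / sigma_e)^2 + (3/4) * (T / sigma_y)^2), where sigma_e is the bending endurance strength already corrected by the fatigue factors mentioned above. The snippet below simply evaluates it; the safety factor, strengths and loads are hypothetical.

        # Shaft diameter under cyclic bending M and steady torsion T
        # (elliptical-interaction form; all input values hypothetical).
        import math

        def shaft_diameter(M, T, sigma_e, sigma_y, fs):
            """M, T in N*m; corrected endurance/yield strengths in Pa; d in m."""
            rhs = (32.0 * fs / math.pi) * math.sqrt((M / sigma_e) ** 2
                                                    + 0.75 * (T / sigma_y) ** 2)
            return rhs ** (1.0 / 3.0)

        d = shaft_diameter(M=1200.0, T=3000.0, sigma_e=240e6, sigma_y=530e6, fs=2.0)
        print(f"required diameter ~ {1000 * d:.1f} mm")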

  16. Computer-based nuclear radiation detection and instrumentation teaching laboratory system

    International Nuclear Information System (INIS)

    Ellis, W.H.; He, Q.

    1993-01-01

    The integration of computers into the University of Florida's Nuclear Engineering Sciences teaching laboratories is based on the innovative use of Macintosh II microcomputers, the IEEE-488 (GPIB) communication and control bus system and protocol, compatible modular nuclear instrumentation (NIM) and test equipment, LabVIEW graphics and applications software, and locally prepared, interactive, menu-driven, HyperCard-based multi-exercise laboratory instruction sets and procedures. Results thus far have been highly successful, with the majority of the laboratory exercises having been implemented

  17. Quality control procedures applied to nuclear instruments. Proceedings of a technical meeting

    International Nuclear Information System (INIS)

    2008-11-01

    Quality control (QC) test procedures for nuclear instrumentation are important for assuring the proper and safe operation of the instruments, especially with regard to equipment related to radiological safety, human health and national safety. Correct measurement of radiation parameters must be ensured, i.e., accurate measurement of the number of radioactive events and counting times and, in some cases, accurate measurement of the radiation energy and occurrence time of the nuclear events. There are several kinds of testing of nuclear instruments, for example, type-testing done by suppliers, acceptance testing made by the end users, QC tests after repair, and quality assurance/QC tests made by end users. All of these tests are based in many cases on practical guidelines or on the experience of the specialists themselves; the available standards on this topic also need to be adapted to specific instruments. The IAEA has provided nuclear instruments and supported the operational maintenance efforts of the Member States. Although nuclear instrumentation is continuously upgraded, some older or aged instruments are still in use and in good working condition. Some of these instruments may not, however, meet modern end-user requirements; therefore, Member States, mostly those with emerging economies, modernize/refurbish such instruments to meet end-user demands. As a result, new instrumentation which is not commercially available, or modernized/refurbished instruments, need to be tested or verified with QC procedures to meet national or international certification requirements. A technical meeting on QC procedures applied to nuclear instruments was organized in Vienna from 23 to 24 August 2007. Existing and required QC test procedures necessary for the verification of operation and measurement of the main characteristics of nuclear instruments were the focus of discussion at this meeting. Presentations made at the technical meeting provided

  18. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  19. The Evolution of Computer Based Learning Software Design: Computer Assisted Teaching Unit Experience.

    Science.gov (United States)

    Blandford, A. E.; Smith, P. R.

    1986-01-01

    Describes the style of design of computer simulations developed by Computer Assisted Teaching Unit at Queen Mary College with reference to user interface, input and initialization, input data vetting, effective display screen use, graphical results presentation, and need for hard copy. Procedures and problems relating to academic involvement are…

  20. Computer-based safeguards information and accounting system

    International Nuclear Information System (INIS)

    1977-01-01

    Acquiring, processing and analysing information about inventories and flow of nuclear materials are essential parts of IAEA safeguards. Safeguards information originates from several sources. The information to be provided is specified in the various safeguards agreements between the States and the IAEA, including both NPT agreements and safeguards trilateral agreements. Most of the safeguards information currently received by the IAEA is contained in accounting reports from the States party to the NPT. Within the framework of the material balance concept of NPT, three types of reports are provided to the IAEA by the States: Physical Inventory Listings (PIL); Inventory Change Reports (ICR); Material Balance Reports (MBR). In addition, facility design information is reported when NPT safeguards are applied and whenever there is a change in the facility or its operation. Based on these data, an accounting system is used to make available such information as the book inventories of nuclear material as a function of time, material balance evaluations, and analysis of shipments versus receipts of nuclear material. A second source of NPT safeguards information is the inspection activities carried out in the field as a necessary counterpart for verification of the data presented by the States in their accounting reports. The processing of inspection reports and other inspection data is carried out by the present system in a provisional manner until a new system, which is under development, is available. The major effort currently is directed not to computer processing but toward developing and applying uniform inspection procedures and information requirements. A third source of NPT safeguards information is advanced notifications and notifications of transfer of source materials before the starting point of safeguards. Since, however, the States are not completely aware of the need and requirement to provide these data, this is a point to be emphasized in future workshops and

  1. A new characterization procedure for computed radiography performance levels based on EPS, SNR and basic spatial resolution measurements

    International Nuclear Information System (INIS)

    Ewert, Uwe; Zscherpel, Uwe; Baer, Sylke

    2016-01-01

    The standards EN 14784-1:2005 and ISO 16371-1:2011 describe the classification of computed radiography (CR) systems for industrial applications. After 10 years of classification experience, it can be concluded that all certified NDT CR systems achieve the best classification result, IP 1. The measured basic spatial resolution differs depending on the manufacturer's brand and the imaging plate used. Therefore, a revision was recommended to obtain a better gradation between the different brands. Users in the USA and Europe classify CR systems based on different parameters. Consequently, a new revision of ASTM E 2446-15 was finalized in 2015, which describes the characterization of CR systems based on CR performance levels. The key parameters are the normalized signal-to-noise ratio (SNR_N), the interpolated basic spatial resolution of the detector (iSRb) and the achieved equivalent penetrameter sensitivity (aEPS). A series of further tests is required for complete characterization by manufacturers or certifying laboratories, including, e.g., geometric distortion, laser jitter, PMT non-linearity, scanner slippage, shading or banding, erasure, burn-in, spatial linearity, artefacts, imaging plate response variation and imaging plate fading. ASTM E 2445-15 describes several tests for users to perform as periodic quality assurance. The measurement procedures are described, and the resulting values, such as CR speed, achieved contrast sensitivity and efficiency, are discussed. The results are presented graphically in a spider-net graph in the qualification/certification statement. A revision of the related CEN and ISO standards is discussed.
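
    A central quantity in this characterization is the normalized signal-to-noise ratio: in this family of CR standards the measured SNR is rescaled to a reference basic spatial resolution of 88.6 um. The snippet below shows that normalization; the measured inputs are hypothetical.

        # Normalized SNR: rescale the measured SNR to the 88.6 um reference
        # basic spatial resolution used by the CR standards family.
        def snr_normalized(snr_measured, sr_b_um):
            """sr_b_um: measured (interpolated) basic spatial resolution in um."""
            return snr_measured * 88.6 / sr_b_um

        # Hypothetical scan: SNR of 150 measured at SRb = 130 um.
        print(f"SNR_N = {snr_normalized(150.0, 130.0):.1f}")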

  2. Applying ligands profiling using multiple extended electron distribution based field templates and feature trees similarity searching in the discovery of new generation of urea-based antineoplastic kinase inhibitors.

    Directory of Open Access Journals (Sweden)

    Eman M Dokla

    This study provides a comprehensive computational procedure for the discovery of novel urea-based antineoplastic kinase inhibitors, focusing on diversification of both chemotype and selectivity pattern. It presents a systematic structural analysis of the different binding motifs of urea-based kinase inhibitors and the corresponding configurations of the kinase enzymes. The computational model depends on the simultaneous application of two protocols. The first protocol applies multiple consecutive validated virtual screening filters, including SMARTS, a support vector machine model (ROC = 0.98), a Bayesian model (ROC = 0.86) and structure-based pharmacophore filters based on urea-based kinase inhibitor complexes retrieved from the literature. This is followed by profiling the hits against different extended electron distribution (XED) based field templates representing different kinase targets. The second protocol enables verification of cancericidal activity by using the algorithm of feature trees (Ftrees) similarity searching against the NCI database. Being a proof-of-concept study, this combined procedure was experimentally validated by its utilization in developing a novel series of urea-based derivatives of strong anticancer activity. This new series is based on the 3-benzylbenzo[d]thiazol-2(3H)-one scaffold, which has interesting chemical feasibility and wide diversification capability. The antineoplastic activity of this series was assayed in vitro against the NCI 60 tumor-cell lines, showing very strong inhibition, with GI50 values as low as 0.9 uM. Additionally, its mechanism was elucidated using the KINEX(TM) protein kinase microarray-based small molecule inhibitor profiling platform and cell cycle analysis, showing a peculiar selectivity pattern against Zap70, c-src, Mink1, csk and MeKK2 kinases. Interestingly, it showed activity on syk kinase, confirming recent findings of the high activity of diphenyl-urea-containing compounds against this kinase. Overall, the new series

  3. Applying Computational Scoring Functions to Assess Biomolecular Interactions in Food Science: Applications to the Estrogen Receptors

    Directory of Open Access Journals (Sweden)

    Francesca Spyrakis

    2016-10-01

    Thus, key computational medicinal chemistry methods like molecular dynamics can be used to decipher protein flexibility and to obtain stable models for docking and scoring in food-related studies, and virtual screening is increasingly being applied to identify molecules with potential to act as endocrine disruptors, food mycotoxins, and new nutraceuticals [3,4,5]. All of these methods and simulations are based on protein-ligand interaction phenomena, and represent the basis for any subsequent modification of the targeted receptor's or enzyme's physiological activity. We describe here the energetics of binding of biological complexes, providing a survey of the most common and successful algorithms used in evaluating these energetics, and we report case studies in which computational techniques have been applied to food science issues. In particular, we explore a handful of studies involving the estrogen receptors for which we have a long-term interest.

  4. Development of pharmacophore similarity-based quantitative activity hypothesis and its applicability domain: applied on a diverse data-set of HIV-1 integrase inhibitors.

    Science.gov (United States)

    Kumar, Sivakumar Prasanth; Jasrai, Yogesh T; Mehta, Vijay P; Pandya, Himanshu A

    2015-01-01

    Quantitative pharmacophore hypothesis combines the 3D spatial arrangement of pharmacophore features with biological activities of the ligand data-set and predicts the activities of geometrically and/or pharmacophoric similar ligands. Most pharmacophore discovery programs face difficulties in conformational flexibility, molecular alignment, pharmacophore features sampling, and feature selection to score models if the data-set constitutes diverse ligands. Towards this focus, we describe a ligand-based computational procedure to introduce flexibility in aligning the small molecules and generating a pharmacophore hypothesis without geometrical constraints to define pharmacophore space, enriched with chemical features necessary to elucidate common pharmacophore hypotheses (CPHs). Maximal common substructure (MCS)-based alignment method was adopted to guide the alignment of carbon molecules, deciphered the MCS atom connectivity to cluster molecules in bins and subsequently, calculated the pharmacophore similarity matrix with the bin-specific reference molecules. After alignment, the carbon molecules were enriched with original atoms in their respective positions and conventional pharmacophore features were perceived. Distance-based pharmacophoric descriptors were enumerated by computing the interdistance between perceived features and MCS-aligned 'centroid' position. The descriptor set and biological activities were used to develop support vector machine models to predict the activities of the external test set. Finally, fitness score was estimated based on pharmacophore similarity with its bin-specific reference molecules to recognize the best and poor alignments and, also with each reference molecule to predict outliers of the quantitative hypothesis model. We applied this procedure to a diverse data-set of 40 HIV-1 integrase inhibitors and discussed its effectiveness with the reported CPH model.

  5. Activity-based computing: computational management of activities reflecting human intention

    DEFF Research Database (Denmark)

    Bardram, Jakob E; Jeuris, Steven; Houben, Steven

    2015-01-01

    paradigm that has been applied in personal information management applications as well as in ubiquitous, multidevice, and interactive surface computing. ABC has emerged as a response to the traditional application- and file-centered computing paradigm, which is oblivious to a notion of a user’s activity...

  6. GRAVTool, a Package to Compute Geoid Model by Remove-Compute-Restore Technique

    Science.gov (United States)

    Marotta, G. S.; Blitzkow, D.; Vidotti, R. M.

    2015-12-01

    Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astro-geodetic data or a combination of them. Among the techniques to compute a precise geoid model, the Remove-Compute-Restore (RCR) technique has been widely applied. It considers the short, medium and long wavelengths derived from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data and global geopotential coefficients, respectively. In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models by the integration of the different wavelengths, and that adjust these models to a local vertical datum. This research presents a package called GRAVTool, developed in MATLAB, to compute local geoid models by the RCR technique, and its application to a study area. The study area comprises the Federal District of Brazil (~6000 km²), with wavy relief and heights varying from 600 m to 1340 m, located between the coordinates 48.25°W, 15.45°S and 47.33°W, 16.06°S. The numerical example for the study area shows the local geoid model computed by the GRAVTool package using 1377 terrestrial gravity observations, SRTM data with 3 arc-second resolution, and geopotential coefficients of the EIGEN-6C4 model to degree 360. The accuracy of the computed model (σ = ±0.071 m, RMS = 0.069 m, maximum = 0.178 m, minimum = -0.123 m) matches the uncertainty (σ = ±0.073 m) of 21 randomly spaced points at which the geoid was determined by geometric levelling supported by GNSS positioning. The results were also better than those of the official Brazilian regional geoid model (σ = ±0.099 m, RMS = 0.208 m, maximum = 0.419 m, minimum = -0.040 m).
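
    The RCR decomposition that the package implements can be stated compactly: the observed gravity anomaly is reduced by the global geopotential model (GGM) and terrain contributions, the residual anomaly is turned into a residual geoid (e.g., by Stokes integration), and the removed parts are restored. A standard textbook statement of this split, in LaTeX notation, is:

        \Delta g_{\mathrm{res}} = \Delta g_{\mathrm{obs}} - \Delta g_{\mathrm{GGM}} - \Delta g_{\mathrm{RTM}},
        \qquad
        N = N_{\mathrm{GGM}} + N_{\mathrm{res}} + N_{\mathrm{RTM}}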

  7. Man-machine interfaces analysis system based on computer simulation

    International Nuclear Information System (INIS)

    Chen Xiaoming; Gao Zuying; Zhou Zhiwei; Zhao Bingquan

    2004-01-01

    The paper describes a software assessment system, Dynamic Interaction Analysis Support (DIAS), based on computer simulation technology for the man-machine interfaces (MMI) of a control room. It employs a computer to simulate operators' operating procedures on the man-machine interfaces of a control room, provides a quantified assessment, and at the same time analyzes operator error rates by means of human error rate prediction techniques. Problems in the placement of man-machine interfaces in a control room and in the arrangement of instruments can be detected from the simulation results. The DIAS system can provide good technical support for the design and improvement of the man-machine interfaces of the main control room of a nuclear power plant.

  8. Condensation of pure and near-azeotropic refrigerants in microfin tubes: A new computational procedure

    Energy Technology Data Exchange (ETDEWEB)

    Cavallini, A; Del Col, D; Mancin, S; Rossetto, L [Dipartimento di Fisica Tecnica, University of Padova, Via Venezia 1, Padova 35131 (Italy)

    2009-01-15

    Microfin tubes are widely used in air-cooled and water-cooled heat exchangers for heat pump and refrigeration applications during condensation or evaporation of refrigerants. In order to design heat exchangers and to optimize heat transfer surfaces, accurate procedures for computing pressure drops and heat transfer coefficients are necessary. This paper presents a new, simple model for predicting the heat transfer coefficient during condensation of halogenated and natural refrigerants, pure fluids or nearly azeotropic mixtures, in horizontal microfin tubes. The updated model accounts for the refrigerant physical properties, the two-phase flow patterns in microfin tubes and the geometrical characteristics of the tubes. It is validated against a data bank of 3115 experimental heat transfer coefficients measured in different independent laboratories all over the world, covering diverse inside-tube geometries and different condensing refrigerants, among them R22, R134a, R123, R410A and CO{sub 2}. (author)

  9. Development of a Computer-Based Measure of Listening Comprehension of Science Talk

    Science.gov (United States)

    Lin, Sheau-Wen; Liu, Yu; Chen, Shin-Feng; Wang, Jing-Ru; Kao, Huey-Lien

    2015-01-01

    The purpose of this study was to develop a computer-based assessment for elementary school students' listening comprehension of science talk within an inquiry-oriented environment. The development procedure had 3 steps: a literature review to define the framework of the test, collecting and identifying key constructs of science talk, and…

  10. 14 CFR 382.127 - What procedures apply to stowage of battery-powered mobility aids?

    Science.gov (United States)

    2010-01-01

    Section 382.127 (14 CFR, Aeronautics and Space; Office of the Secretary, Department of Transportation; Nondiscrimination on the Basis of Disability in Air Travel; Stowage of Wheelchairs, Other Mobility Aids, and Other Assistive Devices): What procedures apply to stowage of battery-powered mobility aids? (a) Whenever baggage compartment...

  11. Validation study of a computer-based open surgical trainer: SimPraxis® simulation platform

    Directory of Open Access Journals (Sweden)

    Tran LN

    2013-03-01

    ...Conclusion: We describe an interactive, computer-based simulator designed to assist in mastery of the cognitive steps of an open surgical procedure. This platform is intuitive and flexible, and could be applied to any stepwise medical procedure. Overall, experts outperformed novices in their performance on the trainer. Experts agreed that the content was acceptable, accurate, and representative. Keywords: simulation, surgical education, training, simulator, video

  12. A surrogate based multistage-multilevel optimization procedure for multidisciplinary design optimization

    OpenAIRE

    Yao, W.; Chen, X.; Ouyang, Q.; Van Tooren, M.

    2011-01-01

    Optimization procedure is one of the key techniques to address the computational and organizational complexities of multidisciplinary design optimization (MDO). Motivated by the idea of synthetically exploiting the advantage of multiple existing optimization procedures and meanwhile complying with the general process of satellite system design optimization in conceptual design phase, a multistage-multilevel MDO procedure is proposed in this paper by integrating multiple-discipline-feasible (M...

  13. 24 CFR 1000.54 - What procedures apply to complaints arising out of any of the methods of providing for Indian...

    Science.gov (United States)

    2010-04-01

    Section 1000.54 (24 CFR, Housing and Urban Development; Native American Housing Activities, General): What procedures apply to complaints arising out of any of the methods of providing for Indian preference? ...

  14. Summary of research in applied mathematics, numerical analysis and computer science at the Institute for Computer Applications in Science and Engineering

    Science.gov (United States)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science during the period October 1, 1983 through March 31, 1984 is summarized.

  15. Applying tensor-based morphometry to parametric surfaces can improve MRI-based disease diagnosis.

    Science.gov (United States)

    Wang, Yalin; Yuan, Lei; Shi, Jie; Greve, Alexander; Ye, Jieping; Toga, Arthur W; Reiss, Allan L; Thompson, Paul M

    2013-07-01

    Many methods have been proposed for computer-assisted diagnostic classification. Full tensor information and machine learning with 3D maps derived from brain images may help detect subtle differences or classify subjects into different groups. Here we develop a new approach to apply tensor-based morphometry to parametric surface models for diagnostic classification. We use this approach to identify cortical surface features for use in diagnostic classifiers. First, with holomorphic 1-forms, we compute an efficient and accurate conformal mapping from a multiply connected mesh to the so-called slit domain. Next, the surface parameterization approach provides a natural way to register anatomical surfaces across subjects using a constrained harmonic map. To analyze anatomical differences, we then analyze the full Riemannian surface metric tensors, which retain multivariate information on local surface geometry. As the number of voxels in a 3D image is large, sparse learning is a promising method to select a subset of imaging features and to improve classification accuracy. Focusing on vertices with the greatest effect sizes, we train a diagnostic classifier using the surface features selected by an L1-norm based sparse learning method. Stability selection is applied to validate the selected feature sets. We tested the algorithm on MRI-derived cortical surfaces from 42 subjects with genetically confirmed Williams syndrome and 40 age-matched controls; multivariate statistics on the local tensors gave greater effect sizes for detecting group differences than other TBM-based statistics, including analysis of the Jacobian determinant and the largest eigenvalue of the surface metric. Our method also gave reasonable classification results relative to the Jacobian determinant, the pair of eigenvalues of the Jacobian matrix, and volume features. This analysis pipeline may boost the power of morphometry studies and may assist with image-based classification.
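
    A minimal sketch of the L1-selection-then-classify idea, with hypothetical data and scikit-learn assumed as the sparse-learning implementation (the stability-selection step mentioned above is omitted):

```python
# Sketch: L1-norm sparse selection of vertex-wise surface features followed
# by diagnostic classification. Data are random stand-ins, not real surfaces.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
X = rng.normal(size=(82, 5000))        # 82 subjects x 5000 tensor-derived features
y = np.r_[np.ones(42), np.zeros(40)]   # 42 patients, 40 controls (labels only)

l1 = LogisticRegression(penalty="l1", solver="liblinear", C=1.0).fit(X, y)
selected = np.flatnonzero(l1.coef_[0])           # features surviving the L1 penalty
print(f"{selected.size} features selected")
clf = LogisticRegression(max_iter=1000)
print("5-fold CV accuracy:", cross_val_score(clf, X[:, selected], y, cv=5).mean())
```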

  16. Procedure generation and verification

    International Nuclear Information System (INIS)

    Sheely, W.F.

    1986-01-01

    The Department of Energy has used Artificial Intelligence (AI) concepts to develop two powerful new computer-based techniques to enhance safety in nuclear applications: the Procedure Generation System and the Procedure Verification System, which can also be adapted to other commercial applications, such as a manufacturing plant. The Procedure Generation System can create a procedure to deal with an off-normal condition, so that the operator can take correct actions on the system in minimal time. The Verification System evaluates the logic of the Procedure Generator's conclusions, using logic techniques totally independent of the Procedure Generator. The rapid, accurate generation and verification of corrective procedures can greatly reduce the human error possible in a complex, high-stress situation.

  17. Using Computer Simulations for Promoting Model-based Reasoning. Epistemological and Educational Dimensions

    Science.gov (United States)

    Develaki, Maria

    2017-11-01

    Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and evaluate in a scientific way. This paper aims (a) to contribute to an extended understanding of the nature and pedagogical importance of model-based reasoning and (b) to exemplify how using computer simulations can support students' model-based reasoning. We provide first a background for both scientific reasoning and computer simulations, based on the relevant philosophical views and the related educational discussion. This background suggests that the model-based framework provides an epistemologically valid and pedagogically appropriate basis for teaching scientific reasoning and for helping students develop sounder reasoning and decision-taking abilities and explains how using computer simulations can foster these abilities. We then provide some examples illustrating the use of computer simulations to support model-based reasoning and evaluation activities in the classroom. The examples reflect the procedure and criteria for evaluating models in science and demonstrate the educational advantages of their application in classroom reasoning activities.

  18. Web-based video monitoring of CT and MRI procedures

    Science.gov (United States)

    Ratib, Osman M.; Dahlbom, Magdalena; Kho, Hwa T.; Valentino, Daniel J.; McCoy, J. Michael

    2000-05-01

    A web-based video transmission of images from CT and MRI consoles was implemented in an Intranet environment for real-time monitoring of ongoing procedures. Images captured from the consoles are compressed to video resolution and broadcast through a web server. When called upon, the attending radiologists can view these live images on any computer within the secured Intranet network. With adequate compression, these images can be displayed simultaneously in different locations at a rate of 2 to 5 images/sec over a standard LAN. Although the quality of the images is insufficient for diagnostic purposes, our user survey showed that they were suitable for supervising a procedure, positioning the imaging slices, and routine quality checking before completion of a study. The system was implemented at UCLA to monitor 9 CTs and 6 MRIs distributed over 4 buildings. This system significantly improved the radiologists' productivity by saving precious time spent in trips between reading rooms and examination rooms. It also improved patient throughput by reducing the waiting time for the radiologists to come to check a study before moving the patient from the scanner.

  19. Fuzzy model predictive control algorithm applied in nuclear power plant

    International Nuclear Information System (INIS)

    Zuheir, Ahmad

    2006-01-01

    The aim of this paper is to design a predictive controller based on a fuzzy model. The Takagi-Sugeno fuzzy model with an adaptive B-splines neuro-fuzzy implementation is used and incorporated as a predictor in a predictive controller. An optimization approach with a simplified gradient technique is used to calculate predictions of the future control actions. In this approach, adaptation of the fuzzy model using dynamic process information is carried out to build the predictive controller. The easy description of the fuzzy model and the easy computation of the gradient vector during the optimization procedure are the main advantages of the computation algorithm. The algorithm is applied to the control of a U-tube steam generator (UTSG) unit used for electricity generation. (author)

  20. New layer-based imaging and rapid prototyping techniques for computer-aided design and manufacture of custom dental restoration.

    Science.gov (United States)

    Lee, M-Y; Chang, C-C; Ku, Y C

    2008-01-01

    Fixed dental restoration by conventional methods relies greatly on the skill and experience of the dental technician. The quality and accuracy of the final product depend mostly on the technician's subjective judgment. In addition, the traditional manual operation involves many complex procedures and is a time-consuming and labour-intensive job. Most importantly, no quantitative design and manufacturing information is preserved for future retrieval. In this paper, a new device for scanning the dental profile and reconstructing 3D digital information of a dental model, based on a layer-based imaging technique called abrasive computer tomography (ACT), was designed in-house and proposed for the design of custom dental restorations. The fixed partial dental restoration was then produced by rapid prototyping (RP) and computer numerical control (CNC) machining methods based on the ACT-scanned digital information. A force-feedback sculptor (FreeForm system, Sensible Technologies, Inc., Cambridge MA, USA), which comprises 3D Touch technology, was applied to modify the morphology and design of the fixed dental restoration. In addition, a comparison of conventional manual operation and digital manufacture using both RP and CNC machining technologies for fixed dental restoration production is presented. Finally, a digital custom fixed-restoration manufacturing protocol integrating the proposed layer-based dental profile scanning, computer-aided design, 3D force-feedback feature modification and advanced fixed-restoration manufacturing techniques is illustrated. The proposed method provides solid evidence that computer-aided design and manufacturing technologies may become a new avenue for custom-made fixed restoration design, analysis, and production in the 21st century.

  1. Computer control applied to accelerators

    CERN Document Server

    Crowley-Milling, Michael C

    1974-01-01

    The differences that exist between control systems for accelerators and other types of control systems are outlined. It is further indicated that earlier accelerators had manual control systems to which computers were added, but that it is essential for the new, large accelerators to include computers in the control systems right from the beginning. Details of the computer control designed for the Super Proton Synchrotron are presented. The method of choosing the computers is described, as well as the reasons for CERN having to design the message transfer system. The items discussed include: CAMAC interface systems, a new multiplex system, operator-to-computer interaction (such as touch screen, computer-controlled knob, and non-linear track-ball), and high-level control languages. Brief mention is made of the contributions of other high-energy research laboratories as well as of some other computer control applications at CERN.

  2. Computer mapping software and geographic data base development: Oak Ridge National Laboratory user experience

    International Nuclear Information System (INIS)

    Honea, B.; Johnson, P.

    1978-01-01

    As users of computer display tools, we believe that the researcher's needs should guide and direct the computer scientist's development of mapping software and data bases. Computer graphic techniques developed for the sake of the computer graphics community tend to be esoteric and rarely suitable for user problems. Two types of users exist for computer graphic tools: the researcher, who is generally satisfied with abstract but accurate displays for analysis purposes, and the decision maker, who requires synoptic and easily comprehended displays relevant to the issues he or she must address. Computer mapping software and data bases should be developed for the user in a generalized and standardized format for ease of transfer and to facilitate linking or merging with larger analysis systems. Maximum utility of computer mapping tools is accomplished when they are linked to geographic information and analysis systems. Computer graphic techniques have varying degrees of utility depending upon whether they are used for data validation, analysis procedures or presenting research results.

  3. Practical experimental certification of computational quantum gates using a twirling procedure.

    Science.gov (United States)

    Moussa, Osama; da Silva, Marcus P; Ryan, Colm A; Laflamme, Raymond

    2012-08-17

    Because of the technical difficulty of building large quantum computers, it is important to be able to estimate how faithful a given implementation is to an ideal quantum computer. The common approach of completely characterizing the computation process via quantum process tomography requires an exponential amount of resources, and thus is not practical even for relatively small devices. We solve this problem by demonstrating that twirling experiments previously used to characterize the average fidelity of quantum memories efficiently can be easily adapted to estimate the average fidelity of the experimental implementation of important quantum computation processes, such as unitaries in the Clifford group, in a practical and efficient manner with applicability in current quantum devices. Using this procedure, we demonstrate state-of-the-art coherent control of an ensemble of magnetic moments of nuclear spins in a single crystal solid by implementing the encoding operation for a 3-qubit code with only a 1% degradation in average fidelity discounting preparation and measurement errors. We also highlight one of the advances that was instrumental in achieving such high fidelity control.
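
    The quantity being certified, average gate fidelity, can also be estimated numerically; the sketch below samples random input states for a single-qubit gate under depolarizing noise, a Monte Carlo stand-in for (not an implementation of) the twirling protocol:

```python
# Sketch: Monte Carlo estimate of the average gate fidelity of a noisy
# single-qubit Hadamard under depolarizing noise (probability p).
import numpy as np

H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)  # ideal gate
p = 0.02                                                 # depolarizing probability

def noisy_apply(rho):
    out = H @ rho @ H.conj().T
    return (1.0 - p) * out + p * np.eye(2) / 2.0         # depolarizing channel

rng = np.random.default_rng(3)
fids = []
for _ in range(5000):
    v = rng.normal(size=2) + 1j * rng.normal(size=2)
    v /= np.linalg.norm(v)                               # Haar-random pure state
    ideal = H @ v
    rho_out = noisy_apply(np.outer(v, v.conj()))
    fids.append(np.real(ideal.conj() @ rho_out @ ideal))
print("estimated average fidelity:", np.mean(fids))      # ~ 1 - p/2 = 0.99
```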

  4. Capability-based computer systems

    CERN Document Server

    Levy, Henry M

    2014-01-01

    Capability-Based Computer Systems focuses on computer programs and their capabilities. The text first elaborates capability- and object-based system concepts, including capability-based systems, object-based approach, and summary. The book then describes early descriptor architectures and explains the Burroughs B5000, Rice University Computer, and Basic Language Machine. The text also focuses on early capability architectures. Dennis and Van Horn's Supervisor; CAL-TSS System; MIT PDP-1 Timesharing System; and Chicago Magic Number Machine are discussed. The book then describes Plessey System 25

  5. The J3 SCR model applied to resonant converter simulation

    Science.gov (United States)

    Avant, R. L.; Lee, F. C. Y.

    1985-01-01

    The J3 SCR model is a continuous topology computer model for the SCR. Its circuit analog and parameter estimation procedure are uniformly applicable to popular computer-aided design and analysis programs such as SPICE2 and SCEPTRE. The circuit analog is based on the intrinsic three pn junction structure of the SCR. The parameter estimation procedure requires only manufacturer's specification sheet quantities as a data base.

  6. Computer-based planning of optimal donor sites for autologous osseous grafts

    Science.gov (United States)

    Krol, Zdzislaw; Chlebiej, Michal; Zerfass, Peter; Zeilhofer, Hans-Florian U.; Sader, Robert; Mikolajczak, Pawel; Keeve, Erwin

    2002-05-01

    Bone graft surgery is often necessary for the reconstruction of craniofacial defects after trauma, tumor, infection or congenital malformation. In this operative technique the removed or missing bone segment is filled with a bone graft. The mainstay of craniofacial reconstruction rests with the replacement of the defective bone by autogenous bone grafts. To achieve sufficient incorporation of the autograft into the host bone, precise planning and simulation of the surgical intervention is required. The major problem is to determine as accurately as possible the donor site from which the graft should be dissected and to define the shape of the desired transplant. A computer-aided method for semi-automatic selection of optimal donor sites for autografts in craniofacial reconstructive surgery has been developed. The non-automatic step of graft design and constraint setting is followed by a fully automatic procedure to find the best fitting position. Extending preceding work, a new optimization approach based on the Levenberg-Marquardt method has been implemented and embedded into our computer-based surgical planning system. Once the pre-processing step has been performed, this new technique enables selection of the optimal donor site in less than one minute. The method has been applied during the surgery planning step in more than 20 cases. The postoperative observations have shown that functional results, such as speech and chewing ability as well as restoration of bony continuity, were clearly better compared with conventionally planned operations. Moreover, in most cases the duration of the surgical intervention was distinctly reduced.
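
    The best-fit-position search can be pictured as a Levenberg-Marquardt fit of a rigid transform between point sets. This is a much-simplified sketch with hypothetical surfaces, using SciPy's least_squares rather than the authors' implementation:

```python
# Sketch: Levenberg-Marquardt fit of a rigid transform aligning a target graft
# shape to a candidate donor-site patch (hypothetical point sets).
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(4)
target = rng.random((50, 3))                      # desired graft surface points

def rigid(p, pts):
    a, b, c, tx, ty, tz = p                       # Euler angles + translation
    Rx = np.array([[1, 0, 0], [0, np.cos(a), -np.sin(a)], [0, np.sin(a), np.cos(a)]])
    Ry = np.array([[np.cos(b), 0, np.sin(b)], [0, 1, 0], [-np.sin(b), 0, np.cos(b)]])
    Rz = np.array([[np.cos(c), -np.sin(c), 0], [np.sin(c), np.cos(c), 0], [0, 0, 1]])
    return pts @ (Rz @ Ry @ Rx).T + np.array([tx, ty, tz])

true = np.array([0.1, -0.2, 0.3, 0.5, -0.1, 0.2])
donor = rigid(true, target) + rng.normal(0.0, 0.005, target.shape)  # noisy patch

fit = least_squares(lambda p: (rigid(p, target) - donor).ravel(),
                    x0=np.zeros(6), method="lm")
print("recovered transform parameters:", np.round(fit.x, 3))
```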

  7. High-speed technique based on a parallel projection correlation procedure for digital image correlation

    Science.gov (United States)

    Zaripov, D. I.; Renfu, Li

    2018-05-01

    The implementation of high-efficiency digital image correlation methods based on a zero-normalized cross-correlation (ZNCC) procedure for high-speed, time-resolved measurements using a high-resolution digital camera involves big data processing and is often time consuming. In order to speed up ZNCC computation, a high-speed technique based on a parallel projection correlation procedure is proposed. The proposed technique uses interrogation window projections instead of the window's two-dimensional field of luminous intensity. This simplification accelerates ZNCC computation by up to 28.8 times compared with directly calculated ZNCC, depending on the size of the interrogation window and the region of interest. The results of three synthetic test cases, a one-dimensional uniform flow, a linear shear flow and a turbulent boundary-layer flow, are discussed in terms of accuracy. In the latter case, the proposed technique is implemented together with an iterative window-deformation technique. On the basis of the results of the present work, the proposed technique is recommended for initial velocity field calculation, with further correction using more accurate techniques.
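
    The core idea, correlating short 1-D projections of the interrogation window instead of its full 2-D intensity field, can be sketched as follows (the published algorithm is simplified here and the data are synthetic):

```python
# Sketch: recover a window displacement by ZNCC of row/column projections
# instead of full 2-D correlation (synthetic frames, known shift).
import numpy as np

def zncc(a, b):
    a = a - a.mean(); b = b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

rng = np.random.default_rng(5)
frame1 = rng.random((64, 64))
frame2 = np.roll(frame1, (3, 5), axis=(0, 1))       # true displacement: (3, 5)
win = frame1[16:48, 16:48]
proj_r, proj_c = win.sum(axis=1), win.sum(axis=0)   # 1-D projections of the window

best = max(((zncc(proj_r, frame2[16 + dy:48 + dy, 16 + dx:48 + dx].sum(axis=1))
             + zncc(proj_c, frame2[16 + dy:48 + dy, 16 + dx:48 + dx].sum(axis=0)),
             dy, dx)
            for dy in range(-8, 9) for dx in range(-8, 9)))
print("recovered displacement (dy, dx):", best[1], best[2])
```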

  8. Radiochromic film for dosimetric measurements in radiation shielding composites synthesized for applied in radiology procedures of high dose

    Energy Technology Data Exchange (ETDEWEB)

    Fontainha, C. C. P. [Universidade Federal de Minas Gerais, Departamento de Engenharia Nuclear, Av. Pte. Antonio Carlos 6627, 31270-901 Belo Horizonte, Minas Gerais (Brazil); Baptista N, A. T.; Faria, L. O., E-mail: crissia@gmail.com [Centro de Desenvolvimento da Tecnologia Nuclear / CNEN, Av. Pte. Antonio Carlos 6627, 31270-901 Belo Horizonte, Minas Gerais (Brazil)

    2015-10-15

    Full text: Medical radiology offers great benefits to patients. However, although specific high-dose procedures such as fluoroscopy, interventional radiology and computed tomography (CT) make up a small percentage of imaging procedures, they contribute significantly to the population dose. Patients may suffer tissue damage. The probability of deterministic effects depends on the type of procedure performed, the exposure time, and the dose applied to the irradiated area. Calibrated radiochromic films can identify the size and distribution of the radiation fields and measure dose intensities. Radiochromic films are sensitive to doses ranging from 0.1 to 20 cGy and respond uniformly to X-rays of effective energies from 20 to 100 keV. New radiation-attenuating materials have been widely investigated, resulting in reduced entrance skin dose. In this work, Bi{sub 2}O{sub 3} and ZrO{sub 2}:8 % Y{sub 2}O{sub 3} composites were obtained by mixing them into a P(VDF-TrFe) copolymer matrix by the casting method and then characterized by FTIR. Dosimetric measurements were obtained with XR-QA2 Gafchromic radiochromic films. In this setup, one radiochromic film was directly exposed to the X-ray beam while another measured the beam attenuated by the composite; both were exposed to an absorbed dose of 10 mGy of the RQR5 beam quality (70 kV X-ray beam). Irradiated XR-QA2 films were stored and scanned under the same conditions in order to obtain more reliable results. The attenuation factors, evaluated with the XR-QA2 radiochromic films, indicate that both composites are good candidates for use as patient radiation shielding in high-dose medical procedures. (Author)
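
    The attenuation factor itself reduces to a ratio of the paired film readings. A minimal sketch, assuming (hypothetically) that the factor is reported as the fractional dose reduction behind the composite:

```python
# Sketch: attenuation factor from paired radiochromic film readings.
# The definition and both readings are assumptions for illustration.
dose_direct = 10.0   # mGy, film facing the X-ray beam directly
dose_behind = 2.6    # mGy, film behind the shielding composite (hypothetical)

attenuation_factor = 1.0 - dose_behind / dose_direct
print(f"attenuation factor: {attenuation_factor:.0%}")
```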

  9. Hanford general employee training: Computer-based training instructor's manual

    Energy Technology Data Exchange (ETDEWEB)

    1990-10-01

    The Computer-Based Training portion of the Hanford General Employee Training course is designed to be used in a classroom setting with a live instructor. Future references to "this course" refer only to the computer-based portion of the whole. This course covers the basic Safety, Security, and Quality issues that pertain to all employees of Westinghouse Hanford Company. The topics that are covered were taken from the recommendations and requirements for General Employee Training as set forth by the Institute of Nuclear Power Operations (INPO) in INPO 87-004, Guidelines for General Employee Training, applicable US Department of Energy orders, and Westinghouse Hanford Company procedures and policy. Besides presenting fundamental concepts, this course also contains information on resources that are available to assist students. It does this using Interactive Videodisk technology, which combines computer-generated text and graphics with audio and video provided by a videodisk player.

  10. Computational chemistry and metal-based radiopharmaceuticals

    International Nuclear Information System (INIS)

    Neves, M.; Fausto, R.

    1998-01-01

    Computer-assisted techniques have found extensive use in the design of organic pharmaceuticals but have not been widely applied to metal complexes, particularly radiopharmaceuticals. Some examples of computer-generated structures of complexes of In, Ga and Tc with N, S, O and P donor ligands are presented. Besides parameters directly related to molecular geometries, molecular properties of the predicted structures, such as ionic charges or dipole moments, are considered to be related to biodistribution studies. The structures of a series of neutral oxo-Tc biguanide complexes are predicted by molecular mechanics calculations, and their interactions with water molecules or peptide chains are correlated with experimental data on partition coefficients and percentage of human protein binding. The results stress the interest of using molecular modelling to predict molecular properties of metal-based radiopharmaceuticals, which can be successfully correlated with the results of in vitro studies. (author)

  11. Unit cell-based computer-aided manufacturing system for tissue engineering

    International Nuclear Information System (INIS)

    Kang, Hyun-Wook; Park, Jeong Hun; Kang, Tae-Yun; Seol, Young-Joon; Cho, Dong-Woo

    2012-01-01

    Scaffolds play an important role in the regeneration of artificial tissues or organs. A scaffold is a porous structure with a micro-scale inner architecture in the range of several to several hundreds of micrometers. Therefore, computer-aided construction of scaffolds should provide sophisticated functionality for porous structure design and a tool path generation strategy that can achieve micro-scale architecture. In this study, a new unit cell-based computer-aided manufacturing (CAM) system was developed for the automated design and fabrication of a porous structure with micro-scale inner architecture that can be applied to composite tissue regeneration. The CAM system was developed by first defining a data structure for the computing process of a unit cell representing a single pore structure. Next, an algorithm and software were developed and applied to construct porous structures with a single or multiple pore design using solid freeform fabrication technology and a 3D tooth/spine computer-aided design model. We showed that this system is quite feasible for the design and fabrication of a scaffold for tissue engineering. (paper)

  12. Unit cell-based computer-aided manufacturing system for tissue engineering.

    Science.gov (United States)

    Kang, Hyun-Wook; Park, Jeong Hun; Kang, Tae-Yun; Seol, Young-Joon; Cho, Dong-Woo

    2012-03-01

    Scaffolds play an important role in the regeneration of artificial tissues or organs. A scaffold is a porous structure with a micro-scale inner architecture in the range of several to several hundreds of micrometers. Therefore, computer-aided construction of scaffolds should provide sophisticated functionality for porous structure design and a tool path generation strategy that can achieve micro-scale architecture. In this study, a new unit cell-based computer-aided manufacturing (CAM) system was developed for the automated design and fabrication of a porous structure with micro-scale inner architecture that can be applied to composite tissue regeneration. The CAM system was developed by first defining a data structure for the computing process of a unit cell representing a single pore structure. Next, an algorithm and software were developed and applied to construct porous structures with a single or multiple pore design using solid freeform fabrication technology and a 3D tooth/spine computer-aided design model. We showed that this system is quite feasible for the design and fabrication of a scaffold for tissue engineering.

  13. Assessment of modal-pushover-based scaling procedure for nonlinear response history analysis of ordinary standard bridges

    Science.gov (United States)

    Kalkan, E.; Kwong, N.

    2012-01-01

    The earthquake engineering profession is increasingly utilizing nonlinear response history analyses (RHA) to evaluate seismic performance of existing structures and proposed designs of new structures. One of the main ingredients of nonlinear RHA is a set of ground motion records representing the expected hazard environment for the structure. When recorded motions do not exist (as is the case in the central United States) or when high-intensity records are needed (as is the case in San Francisco and Los Angeles), ground motions from other tectonically similar regions need to be selected and scaled. The modal-pushover-based scaling (MPS) procedure was recently developed to determine scale factors for a small number of records such that the scaled records provide accurate and efficient estimates of “true” median structural responses. The adjective “accurate” refers to the discrepancy between the benchmark responses and those computed from the MPS procedure. The adjective “efficient” refers to the record-to-record variability of responses. In this paper, the accuracy and efficiency of the MPS procedure are evaluated by applying it to four types of existing Ordinary Standard bridges typical of reinforced concrete bridge construction in California. These bridges are the single-bent overpass, multi-span bridge, curved bridge, and skew bridge. As compared with benchmark analyses of unscaled records using a larger catalog of ground motions, it is demonstrated that the MPS procedure provided an accurate estimate of the engineering demand parameters (EDPs) accompanied by significantly reduced record-to-record variability of the EDPs. Thus, it is a useful tool for scaling ground motions as input to nonlinear RHAs of Ordinary Standard bridges.
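
    Only the scaling step is sketched below, in simplified form: each record is scaled so that the peak deformation of a linear elastic single-degree-of-freedom (SDF) oscillator at the fundamental period matches a target value. The full MPS procedure instead matches first-mode inelastic SDF deformations; the records and target here are synthetic:

```python
# Sketch of record scaling: scale factor = target deformation / peak elastic
# SDF deformation at period T1 (central-difference time stepping).
import numpy as np

def sd_elastic(ag, dt, T, zeta=0.05):
    """Peak deformation of a linear SDF oscillator driven by ground accel ag."""
    w = 2.0 * np.pi / T
    u = np.zeros(len(ag))
    for i in range(1, len(ag) - 1):
        u[i + 1] = (2.0 * u[i] - u[i - 1] + dt * zeta * w * u[i - 1]
                    + dt**2 * (-ag[i] - w**2 * u[i])) / (1.0 + dt * zeta * w)
    return np.abs(u).max()

rng = np.random.default_rng(6)
dt, T1, target_sd = 0.02, 1.0, 0.15                        # s, s, m (hypothetical)
records = [rng.normal(0.0, 1.0, 1500) for _ in range(3)]   # synthetic accelerograms

for k, rec in enumerate(records):
    print(f"record {k}: scale factor = {target_sd / sd_elastic(rec, dt, T1):.2f}")
```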

  14. BPH Procedural Treatment: The Case for Value-Based Pay for Performance

    Directory of Open Access Journals (Sweden)

    Mark Stovsky

    2008-01-01

    The concept of “pay for performance” (P4P) applied to the practice of medicine has become a major foundation in current public and private payer reimbursement strategies for both institutional and individual physician providers. “Pay for performance” programs represent a substantial shift from traditional service-based reimbursement to a system of performance-based provider payment using financial incentives to drive improvements in the quality of care. P4P strategies currently embody rudimentary structure and process (as opposed to outcomes) metrics, which set relatively low performance thresholds. P4P strategies that align reimbursement allocation with “free market” type shifts in cognitive and procedural care using evidence-based data and positive reinforcement are more likely to produce large-scale improvements in quality and cost efficiency with respect to clinical urologic care. This paper reviews current paradigms and, using BPH procedural therapy outcomes, cost, and reimbursement data, makes the case for a fundamental change in perspective to value-based pay for performance as a reimbursement system with the potential to align the interests of patients, physicians, and payers and to improve global clinical outcomes while preserving free choice of clinically efficacious treatments.

  15. Evaluation of solution procedures for material and/or geometrically nonlinear structural analysis by the direct stiffness method.

    Science.gov (United States)

    Stricklin, J. A.; Haisler, W. E.; Von Riesemann, W. A.

    1972-01-01

    This paper presents an assessment of the solution procedures available for the analysis of inelastic and/or large-deflection structural behavior. A literature survey is given which summarizes the contributions of other researchers to the analysis of structural problems exhibiting material nonlinearities and combined geometric-material nonlinearities. Attention is focused on evaluating the available computation and solution techniques. Each of the solution techniques is developed from a common equation of equilibrium in terms of pseudo forces. The solution procedures are applied to circular plates and shells of revolution in an attempt to compare and evaluate each with respect to computational accuracy, economy, and efficiency. Based on the numerical studies, observations and comments are made with regard to the accuracy and economy of each solution technique.
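
    The pseudo-force formulation mentioned above keeps the linear stiffness on the left-hand side and moves the nonlinearity to the right as an iteratively updated load. A one-degree-of-freedom sketch with a hypothetical material law:

```python
# Sketch: pseudo-force iteration for a softening spring, k0*u = F + pseudo(u),
# where pseudo(u) = k0*u - f_int(u) collects the nonlinear part of the law.
def f_int(u):
    return 100.0 * u - 40.0 * u**3        # hypothetical softening spring force

k0, F, u = 100.0, 25.0, 0.0               # linear stiffness, load, initial guess
for it in range(100):
    u_new = (F + k0 * u - f_int(u)) / k0  # solve linear system with pseudo force
    if abs(u_new - u) < 1e-10:
        break
    u = u_new
print(f"converged in {it} iterations: u = {u:.6f}, residual = {F - f_int(u):.2e}")
```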

  16. GRAPH-BASED POST INCIDENT INTERNAL AUDIT METHOD OF COMPUTER EQUIPMENT

    Directory of Open Access Journals (Sweden)

    I. S. Pantiukhin

    2016-05-01

    A graph-based method for the post-incident internal audit of computer equipment is proposed. The essence of the proposed solution consists in establishing relationships among hard disk dumps (images), RAM dumps and network data. The method is intended for describing the properties of an information security incident during the internal post-incident audit of computer equipment. The hard disk dumps are received and formed at the first step, followed by the separation of these dumps into a set of components. The set of components includes a large set of attributes that forms the basis for the formation of the graph. The separated data are recorded into a non-relational database management system (NoSQL) adapted for graph storage, fast access and processing. A dump-linking method is applied at the final step. The presented method enables a human expert in information security or computer forensics to carry out a more precise and informative internal audit of computer equipment. The proposed method reduces the time spent on internal audits of computer equipment while increasing the accuracy and informativeness of such audits. The method has development potential and can be applied along with other components in the tasks of user identification and computer forensics.

  17. Procedure and code for calculating black control rods taking into account epithermal absorption, code CAS-1

    International Nuclear Information System (INIS)

    Martinc, R.; Trivunac, N.; Zivkovic, Z.

    1964-12-01

    This report describes the computer code CAS-1 and the calculation method and procedure applied for calculating black control rods taking into account epithermal neutron absorption. Results obtained with the supercell method applied to a regular lattice reflected in the multiplying medium are part of this report, in addition to the computer code manual.

  18. DualTrust: A Trust Management Model for Swarm-Based Autonomic Computing Systems

    Energy Technology Data Exchange (ETDEWEB)

    Maiden, Wendy M. [Washington State Univ., Pullman, WA (United States)

    2010-05-01

    Trust management techniques must be adapted to the unique needs of the application architectures and problem domains to which they are applied. For autonomic computing systems that utilize mobile agents and ant colony algorithms for their sensor layer, certain characteristics of the mobile agent ant swarm -- their lightweight, ephemeral nature and indirect communication -- make this adaptation especially challenging. This thesis looks at the trust issues and opportunities in swarm-based autonomic computing systems and finds that by monitoring the trustworthiness of the autonomic managers rather than the swarming sensors, the trust management problem becomes much more scalable and still serves to protect the swarm. After analyzing the applicability of trust management research as it has been applied to architectures with similar characteristics, this thesis specifies the required characteristics for trust management mechanisms used to monitor the trustworthiness of entities in a swarm-based autonomic computing system and describes a trust model that meets these requirements.

  19. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with "direct" and "adjoint" sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs.

  20. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1986-01-01

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies.
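
    The derivative-propagation idea behind such computer-calculus compilers can be illustrated with forward-mode automatic differentiation via dual numbers. This is only an analogy for what the GRESS precompiler generates, not GRESS itself:

```python
# Sketch: forward-mode automatic differentiation with dual numbers, carrying
# d(output)/d(parameter) through an iterative model alongside its value.
class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def model(k):
    x = Dual(0.0)
    for _ in range(3):          # toy iterative system: x <- 0.5*x + k
        x = 0.5 * x + k
    return x

out = model(Dual(3.0, 1.0))     # seed dk/dk = 1
print("output:", out.val, " sensitivity d(out)/dk:", out.der)
```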

  1. Computer aided instrumented Charpy test applied dynamic fracture toughness evaluation system

    International Nuclear Information System (INIS)

    Kobayashi, Toshiro; Niinomi, Mitsuo

    1986-01-01

    A microcomputer-aided data treatment system and a personal-computer-aided data analysis system were applied to the traditional instrumented Charpy impact test system. The analysis of the Charpy absorbed energies (E_i, E_p, E_t) and loads (P_y, P_m), and the evaluation of dynamic toughness through the whole fracture process, i.e. J_Id, the J_R curve and T_mat, were examined using the newly developed computer-aided instrumented Charpy impact test system. E_i, E_p, E_t, P_y and P_m were effectively analyzed using the moving-average method and printed out automatically by the microcomputer-aided data treatment system. J_Id, the J_R curve and T_mat could be measured by the stop-block test method. J_Id, the J_R curve and T_mat were then effectively estimated using the compliance changing rate method and the key-curve method on the load versus load-point displacement curve of a single fatigue-precracked specimen by the personal-computer-aided data analysis system. (author)
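
    The moving-average step is plain signal smoothing applied before the characteristic loads are read off; a minimal sketch with a synthetic load trace:

```python
# Sketch: moving-average smoothing of a noisy instrumented-impact load signal
# before reading off the maximum load P_m (synthetic signal).
import numpy as np

rng = np.random.default_rng(7)
t = np.linspace(0.0, 1e-3, 500)                                      # time (s)
load = 20e3 * np.sin(np.pi * t / 1e-3) + rng.normal(0, 800, t.size)  # load (N)

window = 15
smooth = np.convolve(load, np.ones(window) / window, mode="same")
print("P_m (peak of smoothed load) ~", round(float(smooth.max())), "N")
```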

  2. Computer-based systems for nuclear power stations

    International Nuclear Information System (INIS)

    Humble, P.J.; Welbourne, D.; Belcher, G.

    1995-01-01

    The published intentions of vendors are for extensive touch-screen control and computer-based protection. The software features needed for acceptance in the UK are indicated. The defence in depth needed is analyzed. Current practice in aircraft flight control systems and the software methods available are discussed. Software partitioning and mathematically formal methods are appropriate for the structures and simple logic needed for nuclear power applications. The potential for claims of diversity and independence between two computer-based subsystems of a protection system is discussed. Features needed to meet a single failure criterion applied to software are discussed. Conclusions are given on the main factors which a design should allow for. The work reported was done for the Health and Safety Executive of the UK (HSE), and acknowledgement is given to them, to NNC Ltd and to GEC-Marconi Avionics Ltd for permission to publish. The opinions and recommendations expressed are those of the authors and do not necessarily reflect those of HSE. (Author)

  3. Applied computation and security systems

    CERN Document Server

    Saeed, Khalid; Choudhury, Sankhayan; Chaki, Nabendu

    2015-01-01

    This book contains the extended versions of the works presented and discussed at the First International Doctoral Symposium on Applied Computation and Security Systems (ACSS 2014), held during April 18-20, 2014 in Kolkata, India. The symposium was jointly organized by the AGH University of Science & Technology, Cracow, Poland, and the University of Calcutta, India. Volume I of this double-volume book contains fourteen high-quality book chapters in three different parts. Part 1, on Pattern Recognition, presents four chapters. Part 2, on Imaging and Healthcare Applications, contains four more book chapters. Part 3 of this volume, on Wireless Sensor Networking, includes as many as six chapters. Volume II of the book has three parts presenting a total of eleven chapters. Part 4 consists of five excellent chapters on Software Engineering, ranging from cloud service design to transactional memory. Part 5 in Volume II is on Cryptography with two book...

  4. Microscope self-calibration based on micro laser line imaging and soft computing algorithms

    Science.gov (United States)

    Apolinar Muñoz Rodríguez, J.

    2018-06-01

    A technique to perform microscope self-calibration via micro laser line and soft computing algorithms is presented. In this technique, the microscope vision parameters are computed by means of soft computing algorithms based on laser line projection. To implement the self-calibration, a microscope vision system is constructed by means of a CCD camera and a 38 μm laser line. From this arrangement, the microscope vision parameters are represented via Bezier approximation networks, which are accomplished through the laser line position. In this procedure, a genetic algorithm determines the microscope vision parameters by means of laser line imaging. Also, the approximation networks compute the three-dimensional vision by means of the laser line position. Additionally, the soft computing algorithms re-calibrate the vision parameters when the microscope vision system is modified during the vision task. The proposed self-calibration improves accuracy of the traditional microscope calibration, which is accomplished via external references to the microscope system. The capability of the self-calibration based on soft computing algorithms is determined by means of the calibration accuracy and the micro-scale measurement error. This contribution is corroborated by an evaluation based on the accuracy of the traditional microscope calibration.

  5. Computer-Based Methods for Collecting Peer Nomination Data: Utility, Practice, and Empirical Support.

    Science.gov (United States)

    van den Berg, Yvonne H M; Gommans, Rob

    2017-09-01

    New technologies have led to several major advances in psychological research over the past few decades. Peer nomination research is no exception. Thanks to these technological innovations, computerized data collection is becoming more common in peer nomination research. However, computer-based assessment is more than simply programming the questionnaire and asking respondents to fill it in on computers. In this chapter the advantages and challenges of computer-based assessments are discussed. In addition, a list of practical recommendations and considerations is provided to inform researchers on how computer-based methods can be applied to their own research. Although the focus is on the collection of peer nomination data in particular, many of the requirements, considerations, and implications are also relevant for those who consider the use of other sociometric assessment methods (e.g., paired comparisons, peer ratings, peer rankings) or computer-based assessments in general. © 2017 Wiley Periodicals, Inc.

  6. ZIVIS: A City Computing Platform Based on Volunteer Computing

    International Nuclear Information System (INIS)

    Antoli, B.; Castejon, F.; Giner, A.; Losilla, G.; Reynolds, J. M.; Rivero, A.; Sangiao, S.; Serrano, F.; Tarancon, A.; Valles, R.; Velasco, J. L.

    2007-01-01

    Volunteer computing has emerged as a new form of distributed computing. Unlike other computing paradigms such as Grids, which are usually based on complex architectures, volunteer computing has demonstrated a great ability to integrate dispersed, heterogeneous computing resources with ease. This article presents ZIVIS, a project which aims to deploy a city-wide computing platform in Zaragoza (Spain). ZIVIS is based on BOINC (Berkeley Open Infrastructure for Network Computing), a popular open source framework to deploy volunteer and desktop grid computing systems. A scientific code which simulates the trajectories of particles moving inside a stellarator fusion device has been chosen as the pilot application of the project. In this paper we describe the approach followed to port the code to the BOINC framework as well as some novel techniques, based on standard Grid protocols, that we have used to access the output data present in the BOINC server from a remote visualizer. (Author)

  7. Future Directions of Applying Healthcare Cloud for Home-based Chronic Disease Care

    OpenAIRE

    Hu, Yan; Eriksén, Sara; Lundberg, Jenny

    2017-01-01

    The care of chronic disease has become the main challenge for healthcare institutions around the world. To meet the growing needs of patients, moving the front desk of healthcare from hospital to home is essential. Recently, cloud computing has been applied to the healthcare domain; however, adapting to and using this technology effectively for home-based care is still in its initial phase. We have proposed a conceptual hybrid cloud model for home-based chronic disease care, and have evaluated it...

  8. Computer-based Astronomy Labs for Non-science Majors

    Science.gov (United States)

    Smith, A. B. E.; Murray, S. D.; Ward, R. A.

    1998-12-01

    We describe and demonstrate two laboratory exercises, Kepler's Third Law and Stellar Structure, which are being developed for use in an astronomy laboratory class aimed at non-science majors. The labs run with Microsoft's Excel 98 (Macintosh) or Excel 97 (Windows). They can be run in a classroom setting or in an independent learning environment. The intent of the labs is twofold; first and foremost, students learn the subject matter through a series of informational frames. Next, students enhance their understanding by applying their knowledge in lab procedures, while also gaining familiarity with the use and power of a widely-used software package and scientific tool. No mathematical knowledge beyond basic algebra is required to complete the labs or to understand the computations in the spreadsheets, although the students are exposed to the concepts of numerical integration. The labs are contained in Excel workbook files. In the files are multiple spreadsheets, which contain either a frame with information on how to run the lab, material on the subject, or one or more procedures. Excel's VBA macro language is used to automate the labs. The macros are accessed through button interfaces positioned on the spreadsheets. This is done intentionally so that students can focus on learning the subject matter and the basic spreadsheet features without having to learn advanced Excel features all at once. Students open the file and progress through the informational frames to the procedures. After each procedure, student comments and data are automatically recorded in a preformatted Lab Report spreadsheet. Once all procedures have been completed, the student is prompted for a filename in which to save their Lab Report. The lab reports can then be printed or emailed to the instructor. The files will have full worksheet and workbook protection, and will have a "redo" feature at the end of the lab for students who want to repeat a procedure.

  9. Design of Intelligent Robot as A Tool for Teaching Media Based on Computer Interactive Learning and Computer Assisted Learning to Improve the Skill of University Student

    Science.gov (United States)

    Zuhrie, M. S.; Basuki, I.; Asto B, I. G. P.; Anifah, L.

    2018-01-01

    The focus of the research is a teaching module which incorporates manufacturing, mechanical design planning, control through microprocessor technology, and the maneuverability of the robot. Computer interactive and computer-assisted learning are strategies that emphasize the use of computers and learning aids (computer assisted learning) in teaching and learning activities. This research applied the 4-D research and development model suggested by Thiagarajan et al. (1974), which consists of four stages: the Define stage, Design stage, Develop stage, and Disseminate stage. The research applied this development design with the objective of producing learning tools in the form of intelligent robot modules and kits based on Computer Interactive Learning and Computer Assisted Learning. From the data of the Indonesia Robot Contest during the period 2009-2015, it can be seen that the developed modules reached the fourth stage of the development method, dissemination. The modules guide students to produce an intelligent robot tool for teaching based on Computer Interactive Learning and Computer Assisted Learning. Students' responses also showed positive feedback on the robotics module and computer-based interactive learning.

  10. Fibonacci’s Computation Methods vs Modern Algorithms

    Directory of Open Access Journals (Sweden)

    Ernesto Burattini

    2013-12-01

    In this paper we discuss some computational procedures given by Leonardo Pisano Fibonacci in his famous Liber Abaci, and we propose their translation into a modern computer language (C++). Among others, we describe the method of “cross” multiplication, evaluate its computational complexity in algorithmic terms, and show the output of a C++ code that traces the development of the method applied to the product of two integers. In a similar way we show the operations on fractions introduced by Fibonacci. The possibility of reproducing Fibonacci's different computational procedures on a computer made it possible to identify some calculation errors present in the different versions of the original text.
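
    The “cross” multiplication itself is easy to reproduce. The following sketch (in Python rather than the paper's C++) sums the digit products along each cross-diagonal together with the running carry, yielding one digit of the result per step:

```python
# Sketch of Fibonacci's "cross" multiplication: one cross-diagonal of digit
# products per step, least-significant digit first, carrying as in Liber Abaci.
def cross_multiply(a: int, b: int) -> int:
    da = [int(d) for d in str(a)][::-1]   # digits, least significant first
    db = [int(d) for d in str(b)][::-1]
    digits, carry = [], 0
    for k in range(len(da) + len(db) - 1):
        s = carry + sum(da[i] * db[k - i]
                        for i in range(len(da)) if 0 <= k - i < len(db))
        digits.append(s % 10)             # k-th digit of the product
        carry = s // 10                   # carried to the next diagonal
    while carry:
        digits.append(carry % 10)
        carry //= 10
    return int("".join(map(str, digits[::-1])))

assert cross_multiply(1234, 5678) == 1234 * 5678
print(cross_multiply(12, 34))             # 408
```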

  11. Human Reliability Analysis For Computerized Procedures

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Gertman, David I.; Le Blanc, Katya

    2011-01-01

    This paper provides a characterization of human reliability analysis (HRA) issues for computerized procedures in nuclear power plant control rooms. It is beyond the scope of this paper to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper provides a review of HRA as applied to traditional paper-based procedures, followed by a discussion of what specific factors should additionally be considered in HRAs for computerized procedures. Performance shaping factors and failure modes unique to computerized procedures are highlighted. Since there is no definitive guide to HRA for paper-based procedures, this paper also serves to clarify the existing guidance on paper-based procedures before delving into the unique aspects of computerized procedures.

  12. Standard operating procedure for computing pangenome trees

    DEFF Research Database (Denmark)

    Snipen, L.; Ussery, David

    2010-01-01

    We present the pan-genome tree as a tool for visualizing similarities and differences between closely related microbial genomes within a species or genus. Distance between genomes is computed as a weighted relative Manhattan distance based on gene family presence/absence. The weights can be chosen...
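
    A minimal sketch of such a tree computation, with hypothetical presence/absence data and an assumed weighting scheme (the paper's exact weights are not reproduced here):

```python
# Sketch: pan-genome tree from a gene-family presence/absence matrix using a
# weighted relative Manhattan distance and average-linkage clustering.
import numpy as np
from scipy.cluster.hierarchy import dendrogram, linkage
from scipy.spatial.distance import pdist

rng = np.random.default_rng(8)
genomes = ["g1", "g2", "g3", "g4", "g5"]
pa = rng.integers(0, 2, size=(5, 200))    # presence/absence of 200 gene families
w = 1.0 / (1.0 + pa.sum(axis=0))          # assumed: down-weight ubiquitous families

dist = pdist(pa, metric=lambda u, v: np.sum(w * np.abs(u - v)) / np.sum(w))
tree = linkage(dist, method="average")
print(dendrogram(tree, labels=genomes, no_plot=True)["ivl"])  # leaf ordering
```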

  13. Summary of research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    Science.gov (United States)

    1989-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1988 through March 31, 1989 is summarized.

  14. Image Analysis Based on Soft Computing and Applied on Space Shuttle During the Liftoff Process

    Science.gov (United States)

    Dominquez, Jesus A.; Klinko, Steve J.

    2007-01-01

    Imaging techniques based on Soft Computing (SC) and developed at Kennedy Space Center (KSC) have been implemented on a variety of prototype applications related to the safe operation of the Space Shuttle during the liftoff process. These SC-based prototype applications include detection and tracking of moving Foreign Object Debris (FOD) during the Space Shuttle liftoff, visual anomaly detection on slidewires used in the emergency egress system for the Space Shuttle at the launch pad, and visual detection of distant birds approaching the Space Shuttle launch pad. This SC-based image analysis capability developed at KSC was also used to analyze images acquired during the accident of the Space Shuttle Columbia and to estimate the trajectory and velocity of the foam that caused the accident.

  15. Documentation for assessment of modal pushover-based scaling procedure for nonlinear response history analysis of "ordinary standard" bridges

    Science.gov (United States)

    Kalkan, Erol; Kwong, Neal S.

    2010-01-01

    The earthquake engineering profession is increasingly utilizing nonlinear response history analyses (RHA) to evaluate the seismic performance of existing structures and proposed designs of new structures. One of the main ingredients of nonlinear RHA is a set of ground-motion records representing the expected hazard environment for the structure. When recorded motions do not exist (as is the case for the central United States), or when high-intensity records are needed (as is the case for San Francisco and Los Angeles), ground motions from other tectonically similar regions need to be selected and scaled. The modal-pushover-based scaling (MPS) procedure was recently developed to determine scale factors for a small number of records, such that the scaled records provide accurate and efficient estimates of 'true' median structural responses. The adjective 'accurate' refers to the discrepancy between the benchmark responses and those computed from the MPS procedure. The adjective 'efficient' refers to the record-to-record variability of responses. Herein, the accuracy and efficiency of the MPS procedure are evaluated by applying it to four types of existing 'ordinary standard' bridges typical of reinforced-concrete bridge construction in California. These bridges are the single-bent overpass, the multi-span bridge, the curved bridge, and the skew bridge. As compared to benchmark analyses of unscaled records using a larger catalog of ground motions, it is demonstrated that the MPS procedure provided an accurate estimate of the engineering demand parameters (EDPs) accompanied by significantly reduced record-to-record variability of the responses. Thus, the MPS procedure is a useful tool for scaling ground motions as input to nonlinear RHAs of 'ordinary standard' bridges.
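
    The following heavily simplified Python sketch conveys only the basic idea of record scaling: match a record's response ordinate at the structure's first-mode period to a target value. The actual MPS procedure works with the inelastic first-"mode" SDF deformation, so the elastic spectral interpolation used here is an illustrative stand-in, and all names are assumptions.

      import numpy as np

      def mps_like_scale_factor(record_spectrum, periods, t1, target_value):
          # periods must be increasing; interpolate the record's ordinate at T1
          sa_t1 = np.interp(t1, periods, record_spectrum)
          return target_value / sa_t1   # factor applied to the whole record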

  16. Development of processing procedure preparing for digital computer controlled equipment on modular design base

    International Nuclear Information System (INIS)

    Starosel'tsev, O.P.; Khrundin, V.I.

    1982-01-01

    In order to reduce the labour required for the technological preparation of production on digital computer controlled machines in the machining of steam turbine articles, a system for the modular design of technological processes and control programs has been created. The basis of the system is a set of typical module-transitions, each comprising a number of surfaces of an article machined with one cutting tool in an optimum sequence, together with a library of cutting tools. Introduction of such a system sharply enhances the efficiency of equipment utilization [ru

  17. 3rd International Conference on Computer Science, Applied Mathematics and Applications

    CERN Document Server

    Nguyen, Ngoc; Do, Tien

    2015-01-01

    This volume contains the extended versions of papers presented at the 3rd International Conference on Computer Science, Applied Mathematics and Applications (ICCSAMA 2015) held on 11-13 May, 2015 in Metz, France. The book contains 5 parts: 1. Mathematical programming and optimization: theory, methods and software; 2. Operational research and decision making; 3. Machine learning, data security, and bioinformatics; 4. Knowledge information systems; 5. Software engineering. All chapters in the book discuss theoretical and algorithmic as well as practical issues connected with computational methods and optimization methods for knowledge engineering and machine learning techniques.

  18. Radiation doses in pediatric computed tomography procedures: challenges facing new technologies

    International Nuclear Information System (INIS)

    Cotelo, E.; Padilla, M.; Dibarboure, L.

    2008-01-01

    Despite the fact that in recent years an increasing number of radiologists and radiological technologists have been applying radiation dose optimization techniques in paediatric Computed Tomography (CT) examinations, dual- and multi-slice CT (MSCT) scanners present a new challenge in Radiation Protection (RP). While on one hand these scanners are provided with Automatic Exposure Control (AEC) devices, dose reduction modes and dose estimation software, on the other hand Quality Control (QC) tests, CT Kerma Index (C) measurements and patient dose estimation present specific difficulties and require changes or adaptations of traditional QC protocols. This implies a major challenge in most developing countries, where Quality Assurance Programmes (QAP) have not been implemented yet and there is a shortage of medical physicists. This paper analyses clinical and technical protocols as well as patient doses in 204 CT body procedures performed on 154 children. The investigation was carried out in a paediatric reference hospital of Uruguay, where an average of 450 paediatric CT examinations per month are performed on a single dual-slice CT scanner. In addition, the C_VOL value reported on the scanner display was registered so that it could be compared with the same dosimetric quantity derived from technical parameters and C values published in tables. Results showed that not all the radiologists applied the same protocol in similar clinical situations, delivering unnecessary patient dose with no significant difference in image quality. Moreover, it was found that dose reduction modes represent a drawback for estimating patient dose when the mA changes according to tissue attenuation, in most cases within each rotation. The study concluded on the importance of QAPs, which must include RP education for radiologists and technologists, as well as on the need for medical physicists to perform QC tests and patient dose estimations and measurements. (author)

  19. What Communication Theories Can Teach the Designer of Computer-Based Training.

    Science.gov (United States)

    Larsen, Ronald E.

    1985-01-01

    Reviews characteristics of computer-based training (CBT) that make application of communication theories appropriate and presents principles from communication theory (e.g., general systems theory, symbolic interactionism, rule theories, and interpersonal communication theories) to illustrate how CBT developers can profitably apply them to…

  20. Computer aided fixture design - A case based approach

    Science.gov (United States)

    Tanji, Shekhar; Raiker, Saiesh; Mathew, Arun Tom

    2017-11-01

    Automated fixture design plays an important role in process planning and in the integration of CAD and CAM. An automated fixture setup design system is developed in which, once fixturing surfaces and points are described, modular fixture components are automatically selected to generate fixture units and are placed into position so that the assembly conditions are satisfied. In the past, various knowledge-based systems have been developed to implement CAFD in practice. In this paper, to obtain an acceptable automated machining fixture design, a case-based reasoning method with a purpose-built retrieval system is proposed. The Visual Basic (VB) programming language is used, integrated with the SolidWorks API (Application Programming Interface) module, to improve the retrieval procedure and reduce computational time. These properties are incorporated in a numerical simulation to determine the best fit for practical use.

  1. Office-based procedures for diagnosis and treatment of esophageal pathology.

    Science.gov (United States)

    Wellenstein, David J; Schutte, Henrieke W; Marres, Henri A M; Honings, Jimmie; Belafsky, Peter C; Postma, Gregory N; Takes, Robert P; van den Broek, Guido B

    2017-09-01

    Diagnostic and therapeutic office-based procedures under topical anesthesia are emerging in the daily practice of laryngologists and head and neck surgeons. Since the introduction of the transnasal esophagoscope, office-based procedures for the esophagus are increasingly performed. We conducted a systematic review of literature on office-based procedures under topical anesthesia for the esophagus. Transnasal esophagoscopy is an extensively investigated office-based procedure. This procedure shows better patient tolerability and equivalent accuracy compared to conventional transoral esophagoscopy, as well as time and cost savings. Secondary tracheoesophageal puncture, esophageal dilatation, esophageal sphincter injection, and foreign body removal are less investigated, but show promising results. With the introduction of the transnasal esophagoscope, an increasing number of diagnostic and therapeutic office-based procedures for the esophagus are possible, with multiple advantages. Further investigation must prove the clinical feasibility and effectiveness of the therapeutic office-based procedures. © 2017 Wiley Periodicals, Inc.

  2. New Procedures for the Management of Computer Accounts

    CERN Multimedia

    2006-01-01

    In today's computing environment where there is a permanent threat of computer security incidents, it is vitally important to control who has access to CERN's computing infrastructure. To obtain a computer account people are already required to be registered at CERN but until now the closing of accounts has relied on an annual review carried out by the group administrators. This process is now being automated and linked into the CERN registration database. The account review will run permanently and trigger action both when an account is unused and when a person's contract is soon to end. Advance warning of actions to be taken will be sent to both users and their supervisors: when an account has been unused for 6 months the account will be blocked. A year later, after a second warning, the account will be deleted; when a person's association with CERN comes to an end (based on the 'end date' in the official registration database) their computer accounts will be blocked. For most members of the personnel t...

  3. New Procedures for the Management of Computer Accounts

    CERN Multimedia

    2006-01-01

    In today's computing environment where there is a permanent threat of computer security incidents, it is vitally important to control who has access to CERN's computing infrastructure. To obtain a computer account it is already required that people be registered at CERN but up till now closing of accounts has relied on an annual review carried out by the group administrators. This process is now being automated and linked into the CERN registration database. The account review will run permanently and trigger action both when an account is unused and when a person's contract is soon to end. Advance warning of actions to be taken will be sent to both users and their supervisors: when an account has been unused for 6 months the account will be blocked. A year later, after a second warning, the account will be deleted; when a person's association with CERN comes to an end (based on the 'end date' in the official registration database) their computer accounts will be blocked. For most members of the personnel...

  4. Parallel computation of rotating flows

    DEFF Research Database (Denmark)

    Lundin, Lars Kristian; Barker, Vincent A.; Sørensen, Jens Nørkær

    1999-01-01

    This paper deals with the simulation of 3‐D rotating flows based on the velocity‐vorticity formulation of the Navier‐Stokes equations in cylindrical coordinates. The governing equations are discretized by a finite difference method. The solution is advanced to a new time level by a two‐step process...... is that of solving a singular, large, sparse, over‐determined linear system of equations, and the iterative method CGLS is applied for this purpose. We discuss some of the mathematical and numerical aspects of this procedure and report on the performance of our software on a wide range of parallel computers. Darbe...

  5. Assessing mental workload and situation awareness in the evaluation of computerized procedures in the main control room

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Chih-Wei, E-mail: yangcw@iner.gov.tw [Institute of Nuclear Energy Research, 1000, Wenhua Rd., Jiaan Village, Longtan Township, Taoyuan County 32546, Taiwan (China); Yang, Li-Chen; Cheng, Tsung-Chieh [Institute of Nuclear Energy Research, 1000, Wenhua Rd., Jiaan Village, Longtan Township, Taoyuan County 32546, Taiwan (China); Jou, Yung-Tsan; Chiou, Shian-Wei [Department of Industrial Engineering, Chung-Yuan Christian University, 200, Chung Pei Rd., Chung-Li 32023, Taiwan (China)

    2012-09-15

    Highlights: ► This study investigates procedure types' effects on operators' performance. ► The computer-based procedure is suggested for implementation in the main control room. ► The computer-based procedure brings the lowest mental workload. ► It also generates fewer errors of omission and the highest situation awareness. ► The shift supervisor has the highest workload and the lowest situation awareness. - Abstract: A computerized procedure (CP) system has been developed within nuclear power plant (NPP) instrumentation and control (I and C) systems. The system may include normal operating procedures (OPs), abnormal operating procedures (AOPs), alarm response procedures (ARPs), surveillance test procedures (STPs) and/or emergency operating procedures (EOPs). While there are many ways to evaluate computerized procedure design, the user's mental workload and situation awareness (SA) are particularly important considerations in the supervisory control of safety-critical systems. Users' mental workload and situation awareness may be influenced by human factors issues relating to computerized procedures, e.g., the level of automation, dealing with (partially) unavailable I and C, and switching to a back-up system (e.g., paper-based procedures). Some of the positive impacts of CPs on operator performance include the following: tasks can be performed more quickly; overall workload can be reduced; cognitive workload can be minimized; and fewer errors may be made in transitioning through or between procedures. However, various challenges have also been identified with CP systems. These should be addressed in the design and implementation of CPs where they are applicable. For example, the narrower 'field of view' provided by CP systems compared with paper-based procedures could reduce crew communications and crewmember awareness of the

  6. Assessing mental workload and situation awareness in the evaluation of computerized procedures in the main control room

    International Nuclear Information System (INIS)

    Yang, Chih-Wei; Yang, Li-Chen; Cheng, Tsung-Chieh; Jou, Yung-Tsan; Chiou, Shian-Wei

    2012-01-01

    Highlights: ► This study investigates procedure types’ effects on operators’ performance. ► The computer-based procedure is suggested for implementation in the main control room. ► The computer-based procedure brings the lowest mental workload. ► It also generates fewer errors of omission and the highest situation awareness. ► The shift supervisor has the highest workload and the lowest situation awareness. - Abstract: A computerized procedure (CP) system has been developed within nuclear power plant (NPP) instrumentation and control (I and C) systems. The system may include normal operating procedures (OPs), abnormal operating procedures (AOPs), alarm response procedures (ARPs), surveillance test procedures (STPs) and/or emergency operating procedures (EOPs). While there are many ways to evaluate computerized procedure design, the user’s mental workload and situation awareness (SA) are particularly important considerations in the supervisory control of safety-critical systems. Users’ mental workload and situation awareness may be influenced by human factors issues relating to computerized procedures, e.g., the level of automation, dealing with (partially) unavailable I and C, and switching to a back-up system (e.g., paper-based procedures). Some of the positive impacts of CPs on operator performance include the following: tasks can be performed more quickly; overall workload can be reduced; cognitive workload can be minimized; and fewer errors may be made in transitioning through or between procedures. However, various challenges have also been identified with CP systems. These should be addressed in the design and implementation of CPs where they are applicable. For example, the narrower “field of view” provided by CP systems compared with paper-based procedures could reduce crew communications and crewmember awareness of the status and progress through the procedure. Based on a human factors experiment in which each participant monitored and controlled multiple simulated

  7. Three-dimensional computer graphics for surgical procedure learning: Web three-dimensional application for cleft lip repair.

    Science.gov (United States)

    Kobayashi, Masahiro; Nakajima, Tatsuo; Mori, Ayako; Tanaka, Daigo; Fujino, Toyomi; Chiyokura, Hiroaki

    2006-05-01

    In surgical procedures for cleft lip, surgeons attempt to use various skin incisions and small flaps to achieve a better and more natural shape postoperatively. They must understand the three-dimensional (3D) structure of the lips. However, they may have difficulty learning the surgical procedures precisely from normal textbooks with two-dimensional illustrations. Recent developments in 3D computed tomography (3D-CT) and laser stereolithography have enabled surgeons to visualize the structures of cleft lips from desired viewpoints. However, this method cannot convey the advantages offered by specific surgical procedures. To solve this problem, we used the benefits offered by 3D computer graphics (3D-CG) and 3D animation. Using scanned 3D-CT image data of patients with cleft lips, 3D-CG models of the cleft lips were created. Several animations of surgical procedures such as incision designs, rotation of small skin flaps, and sutures were made. This system makes it possible to recognize the details of an operative procedure clearly from any viewpoint, something that cannot be acquired from the usual textbook illustrations. The animation system can be used for developing new skin-flap designs, understanding operative procedures, and as a tool in case presentations. The 3D animations can also be uploaded to the World Wide Web for use in teleconferencing.

  8. Advanced computer-based training

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, H D; Martin, H D

    1987-05-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further enhanced by use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32 bit process computers and comes in two versions: as a functional trainer or as an on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment.

  9. Advanced computer-based training

    International Nuclear Information System (INIS)

    Fischer, H.D.; Martin, H.D.

    1987-01-01

    The paper presents new techniques of computer-based training for personnel of nuclear power plants. Training on full-scope simulators is further enhanced by use of dedicated computer-based equipment. An interactive communication system runs on a personal computer linked to a video disc; a part-task simulator runs on 32 bit process computers and comes in two versions: as a functional trainer or as an on-line predictor with an interactive learning system (OPAL), which may be well-tailored to a specific nuclear power plant. The common goal of both developments is the optimization of the cost-benefit ratio for training and equipment. (orig.) [de

  10. Image based Monte Carlo modeling for computational phantom

    International Nuclear Information System (INIS)

    Cheng, M.; Wang, W.; Zhao, K.; Fan, Y.; Long, P.; Wu, Y.

    2013-01-01

    Full text of the publication follows. The evaluation of the effects of ionizing radiation and the risk of radiation exposure for the human body has become one of the most important issues in the radiation protection and radiotherapy fields, as it helps to avoid unnecessary radiation and decrease harm to the human body. In order to accurately evaluate the dose to the human body, it is necessary to construct a more realistic computational phantom. However, manual description and verification of the models for Monte Carlo (MC) simulation are very tedious, error-prone and time-consuming. In addition, it is difficult to locate and fix geometry errors, and difficult to describe material information and assign it to cells. MCAM (CAD/Image-based Automatic Modeling Program for Neutronics and Radiation Transport Simulation) was developed as an interface program to achieve both CAD- and image-based automatic modeling. The advanced version (Version 6) of MCAM can achieve automatic conversion from CT/segmented sectioned images to computational phantoms such as MCNP models. The image-based automatic modeling program (MCAM 6.0) has been tested with several medical images and sectioned images, and it has been applied in the construction of Rad-HUMAN. Following manual segmentation and 3D reconstruction, a whole-body computational phantom of a Chinese adult female, called Rad-HUMAN, was created using MCAM 6.0 from sectioned images of a Chinese visible human dataset. Rad-HUMAN contains 46 organs/tissues, which faithfully represent the average anatomical characteristics of the Chinese female. The dose conversion coefficients (Dt/Ka) from kerma free-in-air to absorbed dose of Rad-HUMAN were calculated. Rad-HUMAN can be applied to predict and evaluate dose distributions in the Treatment Plan System (TPS), as well as radiation exposure of the human body in radiation protection. (authors)

  11. Arbitrated Quantum Signature with Hamiltonian Algorithm Based on Blind Quantum Computation

    Science.gov (United States)

    Shi, Ronghua; Ding, Wanting; Shi, Jinjing

    2018-03-01

    A novel arbitrated quantum signature (AQS) scheme is proposed, motivated by the Hamiltonian algorithm (HA) and blind quantum computation (BQC). The signature generation and verification algorithms are designed based on HA, which enables the scheme to rely less on computational complexity. It is unnecessary to recover the original messages when verifying signatures, since blind quantum computation is applied, which improves the simplicity and operability of our scheme. It is proved that the scheme can be deployed securely, and the extended AQS has extensive applications in E-payment systems, E-government, E-business, etc.

  12. Proof Rules for Recursive Procedures

    NARCIS (Netherlands)

    Hesselink, Wim H.

    1993-01-01

    Four proof rules for recursive procedures in a Pascal-like language are presented. The main rule deals with total correctness and is based on results of Gries and Martin. The rule is easier to apply than Martin's. It is introduced as an extension of a specification format for Pascal-procedures, with

  13. Machine learning based Intelligent cognitive network using fog computing

    Science.gov (United States)

    Lu, Jingyang; Li, Lun; Chen, Genshe; Shen, Dan; Pham, Khanh; Blasch, Erik

    2017-05-01

    In this paper, a Cognitive Radio Network (CRN) based on artificial intelligence is proposed to distribute the limited radio spectrum resources more efficiently. The CRN framework can analyze the time-sensitive signal data close to the signal source using fog computing with different types of machine learning techniques. Depending on the computational capabilities of the fog nodes, different features and machine learning techniques are chosen to optimize spectrum allocation. Also, the computing nodes send a periodic signal summary, which is much smaller than the original signal, to the cloud so that the overall system's spectrum allocation strategies are dynamically updated. Applying fog computing, the system is more adaptive to the local environment and robust to spectrum changes. As most of the signal data is processed at the fog level, it further strengthens system security by reducing the communication burden of the communications network.

  14. Hardware architecture design of image restoration based on time-frequency domain computation

    Science.gov (United States)

    Wen, Bo; Zhang, Jing; Jiao, Zipeng

    2013-10-01

    The image restoration algorithms based on time-frequency domain computation (TFDC) are highly mature and widely applied in engineering. To enable high-speed implementation of these algorithms, the TFDC hardware architecture is proposed. Firstly, the main module is designed by analyzing the common processing and numerical calculations. Then, to improve commonality, an iteration control module is planned for iterative algorithms. In addition, to reduce the computational cost and memory requirements, the necessary optimizations are suggested for the time-consuming modules, which include the two-dimensional FFT/IFFT and the complex-number calculations. Eventually, the TFDC hardware architecture is adopted for the hardware design of a real-time image restoration system. The result proves that the TFDC hardware architecture and its optimizations can be applied to image restoration algorithms based on TFDC, with good algorithm commonality, hardware realizability and high efficiency.
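
    A software sketch of the kind of time-frequency-domain restoration such hardware accelerates: Wiener deconvolution built from the two-dimensional FFT/IFFT and per-pixel complex arithmetic that the abstract names as the time-consuming modules. The filter constant k stands in for an unspecified noise-to-signal term; this is an illustrative stand-in, not the paper's architecture.

      import numpy as np

      def wiener_restore(blurred, psf, k=0.01):
          H = np.fft.fft2(psf, s=blurred.shape)        # transfer function of the blur
          G = np.fft.fft2(blurred)                     # spectrum of the degraded image
          W = np.conj(H) / (np.abs(H) ** 2 + k)        # Wiener filter, k = noise term
          return np.real(np.fft.ifft2(W * G))          # back to the spatial domain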

  15. Performance of the Seven-step Procedure in Problem-based Hospitality Management Education

    Directory of Open Access Journals (Sweden)

    Wichard Zwaal

    2016-12-01

    Full Text Available The study focuses on the seven-step procedure (SSP) in problem-based learning (PBL). The way students apply the seven-step procedure will help us understand how students work in a problem-based learning curriculum. So far, little is known about how students rate the performance and importance of the different steps, the amount of time they spend on each step, and the perceived quality of execution of the procedure. A survey was administered to a sample of 101 students enrolled in a problem-based hospitality management program. Results show that students consider step six (Collect additional information outside the group) to be the most important. The highest performance rating is for step two (Define the problem) and the lowest for step four (Draw a systematic inventory of explanations from step three). Step seven is rated low in performance and high in importance, indicating that it requires urgent attention. The average amount of time spent on the seven steps is 133 minutes, with the largest part of the time spent on self-study outside the group (42 minutes). The assessment of the execution of a set of specific guidelines (the Blue Card) did not completely match the overall performance ratings for the seven steps. The SSP could be improved by reducing the number of steps and incorporating more attention to group dynamics.

  16. Bootstrap-based procedures for inference in nonparametric receiver-operating characteristic curve regression analysis.

    Science.gov (United States)

    Rodríguez-Álvarez, María Xosé; Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Tahoces, Pablo G

    2018-03-01

    Prior to using a diagnostic test in a routine clinical setting, the rigorous evaluation of its diagnostic accuracy is essential. The receiver-operating characteristic curve is the measure of accuracy most widely used for continuous diagnostic tests. However, the possible impact of extra information about the patient (or even the environment) on diagnostic accuracy also needs to be assessed. In this paper, we focus on an estimator for the covariate-specific receiver-operating characteristic curve based on direct regression modelling and nonparametric smoothing techniques. This approach defines the class of generalised additive models for the receiver-operating characteristic curve. The main aim of the paper is to offer new inferential procedures for testing the effect of covariates on the conditional receiver-operating characteristic curve within the above-mentioned class. Specifically, two different bootstrap-based tests are suggested to check (a) the possible effect of continuous covariates on the receiver-operating characteristic curve and (b) the presence of factor-by-curve interaction terms. The validity of the proposed bootstrap-based procedures is supported by simulations. To facilitate the application of these new procedures in practice, an R-package, known as npROCRegression, is provided and briefly described. Finally, data derived from a computer-aided diagnostic system for the automatic detection of tumour masses in breast cancer is analysed.
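
    A much-simplified Python sketch of the bootstrap logic (not the authors' npROCRegression implementation, which is an R package): resample the diseased and healthy scores to approximate the sampling distribution of a difference in empirical AUC between two covariate strata. All function names and the two-strata setup are illustrative assumptions.

      import numpy as np

      def empirical_auc(scores_diseased, scores_healthy):
          # P(score_diseased > score_healthy), ties counted as 1/2
          d = scores_diseased[:, None] - scores_healthy[None, :]
          return (d > 0).mean() + 0.5 * (d == 0).mean()

      def bootstrap_auc_diff(xd1, xh1, xd2, xh2, n_boot=2000, seed=0):
          rng = np.random.default_rng(seed)
          diffs = np.empty(n_boot)
          for b in range(n_boot):
              # resample each group with replacement
              auc1 = empirical_auc(rng.choice(xd1, len(xd1)), rng.choice(xh1, len(xh1)))
              auc2 = empirical_auc(rng.choice(xd2, len(xd2)), rng.choice(xh2, len(xh2)))
              diffs[b] = auc1 - auc2
          return diffs   # e.g., check whether the 95% percentile interval covers 0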

  17. Applied dynamics

    CERN Document Server

    Schiehlen, Werner

    2014-01-01

    Applied Dynamics is an important branch of engineering mechanics widely applied to mechanical and automotive engineering, aerospace and biomechanics, as well as control engineering and mechatronics. The computational methods presented are based on common fundamentals. For this purpose analytical mechanics turns out to be very useful, where D’Alembert’s principle in the Lagrangian formulation proves to be most efficient. The methods of multibody systems, finite element systems and continuous systems are treated consistently. Thus, students get a much better understanding of dynamical phenomena, and engineers in design and development departments using computer codes may check the results more easily by choosing models of different complexity for vibration and stress analysis.

  18. Decomposition and Cross-Product-Based Method for Computing the Dynamic Equation of Robots

    Directory of Open Access Journals (Sweden)

    Ching-Long Shih

    2012-08-01

    Full Text Available This paper aims to demonstrate a clear relationship between the Lagrange equations and the Newton-Euler equations regarding computational methods for robot dynamics, from which we derive a systematic method suitable for either symbolic or on-line numerical computation. Based on the decomposition approach and the cross-product operation, a computing method for robot dynamics can be easily developed. The advantages of this computing framework are that it can be used for both symbolic and on-line numeric computation purposes, and that it can also be applied to biped systems as well as some simple closed-chain robot systems.

  19. Computational modeling applied to stress gradient analysis for metallic alloys

    International Nuclear Information System (INIS)

    Iglesias, Susana M.; Assis, Joaquim T. de; Monine, Vladimir I.

    2009-01-01

    Nowadays, composite materials, including materials reinforced by particles, are at the center of researchers' attention. There are problems with stress measurements in these materials connected with the superficial stress gradient caused by the difference between the stress state of particles on the surface and in the matrix of the composite material. Computer simulation of the diffraction profile formed by the superficial layers of material makes it possible to simulate the diffraction experiment and to resolve the problem of stress measurements when the stress state is characterized by a strong gradient. The aim of this paper is the application of a computer simulation technique, initially developed for homogeneous materials, to diffraction line simulation for composite materials and alloys. Specifically, we applied this technique to silumin fabricated by powder metallurgy. (author)

  20. Determining procedures for simulation-based training in radiology

    DEFF Research Database (Denmark)

    Nayahangan, Leizl Joy; Nielsen, Kristina Rue; Albrecht-Beste, Elisabeth

    2018-01-01

    OBJECTIVES: New training modalities such as simulation are widely accepted in radiology; however, development of effective simulation-based training programs is challenging. They are often unstructured and based on convenience or coincidence. The study objective was to perform a nationwide needs assessment to identify and prioritize technical procedures that should be included in a simulation-based curriculum. METHODS: A needs assessment using the Delphi method was completed among 91 key leaders in radiology. Round 1 identified technical procedures that radiologists should learn. Round 2 explored... ...and basic abdominal ultrasound. CONCLUSION: A needs assessment identified and prioritized 13 technical procedures to include in a simulation-based curriculum. The list may be used as a guide for development of training programs. KEY POINTS: • Simulation-based training can supplement training on patients...

  1. Intelligent Decisional Assistant that Facilitate the Choice of a Proper Computer System Applied in Busines

    OpenAIRE

    Nicolae MARGINEAN

    2009-01-01

    The choice of a proper computer system is not an easy task for a decider. One reason could be the current development of the market for computer systems applied in business. The large number of players on the Romanian market results in a large number of computerized products with a multitude of various properties. Our proposal tries to optimize and facilitate this decisional process within an e-shop where IT packages applied in business are sold, by building an online decisional assistant, a special component ...

  2. A Numerical Procedure for Model Identifiability Analysis Applied to Enzyme Kinetics

    DEFF Research Database (Denmark)

    Daele, Timothy, Van; Van Hoey, Stijn; Gernaey, Krist

    2015-01-01

    The proper calibration of models describing enzyme kinetics can be quite challenging. In the literature, different procedures are available to calibrate these enzymatic models in an efficient way. However, in most cases the model structure is already decided on prior to the actual calibration... and Pronzato (1997), and which can be easily set up for any type of model. In this paper the proposed approach is applied to the forward reaction rate of the enzyme kinetics proposed by Shin and Kim (1998). Structural identifiability analysis showed that no local structural model problems were occurring... identifiability problems. By using the presented approach it is possible to detect potential identifiability problems and avoid pointless calibration (and experimental!) effort.

  3. A Real-Time Plagiarism Detection Tool for Computer-Based Assessments

    Science.gov (United States)

    Jeske, Heimo J.; Lall, Manoj; Kogeda, Okuthe P.

    2018-01-01

    Aim/Purpose: The aim of this article is to develop a tool to detect plagiarism in real time amongst students being evaluated for learning in a computer-based assessment setting. Background: Cheating or copying all or part of source code of a program is a serious concern to academic institutions. Many academic institutions apply a combination of…

  4. Development of a web-based CANDU core management procedures automation system

    International Nuclear Information System (INIS)

    Lee, S.; Park, D.; Yeom, C.; Suh, H.

    2007-01-01

    This paper introduces the CANDU core management procedures automation system (COMPAS), a web-based application which semi-automates several CANDU core management tasks. It provides various functionalities, including selection and evaluation of refueling channels, detector calibration, coolant flow estimation and thermal power calculation, through automated interfacing with analysis codes (RFSP, NUCIRC, etc.) and plant data. It also utilizes brand-new .NET computing technologies such as ASP.NET, smart client, web services and so on. Since almost all functions are abstracted from the previous experience of the current working members of the Wolsong Nuclear Power Plant (NPP), it will lead to efficient and safe operation of CANDU plants. (author)

  5. Development of a web-based CANDU core management procedures automation system

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S.; Park, D.; Yeom, C. [Inst. for Advanced Engineering (IAE), Yongin (Korea, Republic of); Suh, H. [Korea Hydro and Nuclear Power (KHNP), Wolsong (Korea, Republic of)

    2007-07-01

    This paper introduces the CANDU core management procedures automation system (COMPAS), a web-based application which semi-automates several CANDU core management tasks. It provides various functionalities, including selection and evaluation of refueling channels, detector calibration, coolant flow estimation and thermal power calculation, through automated interfacing with analysis codes (RFSP, NUCIRC, etc.) and plant data. It also utilizes brand-new .NET computing technologies such as ASP.NET, smart client, web services and so on. Since almost all functions are abstracted from the previous experience of the current working members of the Wolsong Nuclear Power Plant (NPP), it will lead to efficient and safe operation of CANDU plants. (author)

  6. Ultra-fast computation of electronic spectra for large systems by tight-binding based simplified Tamm-Dancoff approximation (sTDA-xTB)

    International Nuclear Information System (INIS)

    Grimme, Stefan; Bannwarth, Christoph

    2016-01-01

    The computational bottleneck of the extremely fast simplified Tamm-Dancoff approximated (sTDA) time-dependent density functional theory procedure [S. Grimme, J. Chem. Phys. 138, 244104 (2013)] for the computation of electronic spectra for large systems is the determination of the ground state Kohn-Sham orbitals and eigenvalues. This limits such treatments to single structures with a few hundred atoms and hence, e.g., sampling along molecular dynamics trajectories for flexible systems or the calculation of chromophore aggregates is often not possible. The aim of this work is to solve this problem by a specifically designed semi-empirical tight binding (TB) procedure similar to the well established self-consistent-charge density functional TB scheme. The new special purpose method provides orbitals and orbital energies of hybrid density functional character for a subsequent and basically unmodified sTDA procedure. Compared to many previous semi-empirical excited state methods, an advantage of the ansatz is that a general eigenvalue problem in a non-orthogonal, extended atomic orbital basis is solved and therefore correct occupied/virtual orbital energy splittings as well as Rydberg levels are obtained. A key idea for the success of the new model is that the determination of atomic charges (describing an effective electron-electron interaction) and the one-particle spectrum is decoupled and treated by two differently parametrized Hamiltonians/basis sets. The three-diagonalization-step composite procedure can routinely compute broad range electronic spectra (0-8 eV) within minutes of computation time for systems composed of 500-1000 atoms with an accuracy typical of standard time-dependent density functional theory (0.3-0.5 eV average error). An easily extendable parametrization based on coupled-cluster and density functional computed reference data for the elements H–Zn including transition metals is described. The accuracy of the method termed sTDA-xTB is first

  7. Ultra-fast computation of electronic spectra for large systems by tight-binding based simplified Tamm-Dancoff approximation (sTDA-xTB)

    Energy Technology Data Exchange (ETDEWEB)

    Grimme, Stefan, E-mail: grimme@thch.uni-bonn.de; Bannwarth, Christoph [Mulliken Center for Theoretical Chemistry, Institut für Physikalische und Theoretische Chemie, Rheinische Friedrich-Wilhelms Universität Bonn, Beringstraße 4, 53115 Bonn (Germany)

    2016-08-07

    The computational bottleneck of the extremely fast simplified Tamm-Dancoff approximated (sTDA) time-dependent density functional theory procedure [S. Grimme, J. Chem. Phys. 138, 244104 (2013)] for the computation of electronic spectra for large systems is the determination of the ground state Kohn-Sham orbitals and eigenvalues. This limits such treatments to single structures with a few hundred atoms and hence, e.g., sampling along molecular dynamics trajectories for flexible systems or the calculation of chromophore aggregates is often not possible. The aim of this work is to solve this problem by a specifically designed semi-empirical tight binding (TB) procedure similar to the well established self-consistent-charge density functional TB scheme. The new special purpose method provides orbitals and orbital energies of hybrid density functional character for a subsequent and basically unmodified sTDA procedure. Compared to many previous semi-empirical excited state methods, an advantage of the ansatz is that a general eigenvalue problem in a non-orthogonal, extended atomic orbital basis is solved and therefore correct occupied/virtual orbital energy splittings as well as Rydberg levels are obtained. A key idea for the success of the new model is that the determination of atomic charges (describing an effective electron-electron interaction) and the one-particle spectrum is decoupled and treated by two differently parametrized Hamiltonians/basis sets. The three-diagonalization-step composite procedure can routinely compute broad range electronic spectra (0-8 eV) within minutes of computation time for systems composed of 500-1000 atoms with an accuracy typical of standard time-dependent density functional theory (0.3-0.5 eV average error). An easily extendable parametrization based on coupled-cluster and density functional computed reference data for the elements H–Zn including transition metals is described. The accuracy of the method termed sTDA-xTB is first
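
    The key numerical step the abstract mentions, solving a general eigenvalue problem in a non-orthogonal basis, can be sketched in a few lines; the random symmetric Hamiltonian H and positive-definite overlap S below are toy stand-ins for the actual tight-binding matrices.

      import numpy as np
      from scipy.linalg import eigh

      rng = np.random.default_rng(7)
      n = 6
      h = rng.normal(size=(n, n)); h = 0.5 * (h + h.T)          # model Hamiltonian
      a = rng.normal(size=(n, n)); s = a @ a.T + n * np.eye(n)  # positive-definite overlap
      # generalized problem H C = S C diag(e): orbital energies and coefficients
      energies, coeffs = eigh(h, s)

    Solving with an explicit overlap matrix is what yields the correct occupied/virtual energy splittings the abstract emphasizes, in contrast to orthogonalized semi-empirical schemes.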

  8. Computer-based multi-channel analyzer based on internet

    International Nuclear Information System (INIS)

    Zhou Xinzhi; Ning Jiaoxian

    2001-01-01

    Combining Internet technology with the computer-based multi-channel analyzer, a new kind of browser-based computer multi-channel analyzer system is presented. Its framework and principle, as well as its implementation, are discussed.

  9. Multi-binding site model-based curve-fitting program for the computation of RIA data

    International Nuclear Information System (INIS)

    Malan, P.G.; Ekins, R.P.; Cox, M.G.; Long, E.M.R.

    1977-01-01

    In this paper, a comparison is made between model-based and empirical curve-fitting procedures. The implementation of a multiple binding-site curve-fitting model is described which successfully fits a wide range of assay data and which can be run on a mini-computer. This sophisticated model also provides estimates of the binding-site concentrations and the values of the respective equilibrium constants present; the latter have been used for refining assay conditions using computer optimisation techniques. (orig./AJ) [de
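
    A minimal modern sketch of fitting a two-binding-site model by nonlinear least squares; the functional form (a sum of two saturable binding terms with site concentrations b_i and equilibrium constants k_i) is an illustrative stand-in for the paper's multi-site model, and the data here are synthetic.

      import numpy as np
      from scipy.optimize import curve_fit

      def two_site(x, b1, k1, b2, k2):
          # b_i: binding-site concentrations, k_i: equilibrium constants
          return b1 * x / (1.0 / k1 + x) + b2 * x / (1.0 / k2 + x)

      x = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
      y = two_site(x, 2.0, 1.5, 5.0, 0.05) \
          + 0.05 * np.random.default_rng(1).normal(size=x.size)   # noisy synthetic data
      params, cov = curve_fit(two_site, x, y, p0=[1, 1, 1, 0.1], maxfev=10000)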

  10. GPU-BASED MONTE CARLO DUST RADIATIVE TRANSFER SCHEME APPLIED TO ACTIVE GALACTIC NUCLEI

    International Nuclear Information System (INIS)

    Heymann, Frank; Siebenmorgen, Ralf

    2012-01-01

    A three-dimensional parallel Monte Carlo (MC) dust radiative transfer code is presented. To overcome the huge computing-time requirements of MC treatments, the computational power of vectorized hardware is used, utilizing either multi-core computer power or graphics processing units. The approach is a self-consistent way to solve the radiative transfer equation in arbitrary dust configurations. The code calculates the equilibrium temperatures of two populations of large grains and stochastically heated polycyclic aromatic hydrocarbons. Anisotropic scattering is treated by applying the Henyey-Greenstein phase function. The spectral energy distribution (SED) of the object is derived at low spatial resolution by a photon-counting procedure and at high spatial resolution by a vectorized ray tracer. The latter allows computation of high signal-to-noise images of the objects at any frequency and arbitrary viewing angles. We test the robustness of our approach against other radiative transfer codes. The SED and dust temperatures of one- and two-dimensional benchmarks are reproduced at high precision. The parallelization capability of various MC algorithms is analyzed and included in our treatment. We utilize the Lucy algorithm for the optically thin case where the Poisson noise is high, the iteration-free Bjorkman and Wood method to reduce the calculation time, and the Fleck and Canfield diffusion approximation for extremely optically thick cells. The code is applied to model the appearance of active galactic nuclei (AGNs) at optical and infrared wavelengths. The AGN torus is clumpy and includes fluffy composite grains of various sizes made up of silicates and carbon. The dependence of the SED on the number of clumps in the torus and the viewing angle is studied. The appearance of the 10 μm silicate feature in absorption or emission is discussed. The SED of the radio-loud quasar 3C 249.1 is fitted by the AGN model and a cirrus component to account for the far-infrared emission.
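
    One standard ingredient of such an MC scheme, sampling the scattering-angle cosine from the Henyey-Greenstein phase function by inverting its cumulative distribution, can be sketched as follows (g is the asymmetry parameter; g = 0 reduces to isotropic scattering). This is a generic textbook kernel, not code from the paper.

      import numpy as np

      def sample_hg_cos_theta(g, rng, n):
          xi = rng.random(n)
          if abs(g) < 1e-6:
              return 2.0 * xi - 1.0                     # isotropic limit
          frac = (1.0 - g * g) / (1.0 - g + 2.0 * g * xi)
          # standard inversion of the HG cumulative distribution
          return (1.0 + g * g - frac * frac) / (2.0 * g)

      mu = sample_hg_cos_theta(0.6, np.random.default_rng(42), 100_000)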

  11. Nanotube devices based crossbar architecture: toward neuromorphic computing

    International Nuclear Information System (INIS)

    Zhao, W S; Gamrat, C; Agnus, G; Derycke, V; Filoramo, A; Bourgoin, J-P

    2010-01-01

    Nanoscale devices such as carbon nanotube- and nanowire-based transistors, memristors and molecular devices are expected to play an important role in the development of new computing architectures. While their size represents a decisive advantage in terms of integration density, it also raises the critical question of how to efficiently address large numbers of densely integrated nanodevices without the need for complex multi-layer interconnection topologies similar to those used in CMOS technology. Two-terminal programmable devices in crossbar geometry seem particularly attractive, but suffer from severe addressing difficulties due to cross-talk, which implies complex programming procedures. Three-terminal devices can be easily addressed individually, but with limited gain in terms of interconnect integration. We show how optically gated carbon nanotube devices enable efficient individual addressing when arranged in a crossbar geometry with shared gate electrodes. This topology is particularly well suited for parallel programming or learning in the context of neuromorphic computing architectures.

  12. Engineering seismology application of a computer base comprised of French macroseismic data

    International Nuclear Information System (INIS)

    Godefroy, P.; Levret, A.

    1990-01-01

    France, a moderately seismic country, has compiled a computer base of macroseismic data for the purpose of satisfying safety requirements arising from its nuclear power program. This evolving base includes not only information about each event and its epicenter, but also all the individual macroseismic observations. The analysis of these data serves as an input to the deterministic assessment of seismic hazard for high-risk facilities. Current practice implements a seismotectonic approach wherein geological and seismic data are used to determine active faults or, when this is not possible, provinces shown to be homogeneous on the basis of a certain number of criteria. According to the safety procedure applied, the first step is to act as if the reference earthquake could occur at any spot within the entity to which it belongs, and thence at the point nearest the site. The maximum macroseismic intensity induced thereby at the site, obtained either by displacing isoseismals or through the use of laws of intensity attenuation versus distance, constitutes an initial level of seismic hazard with respect to which protective measures in the design of certain types of installation are taken. In the nuclear field, the regulations call for a second level of hazard, intended to afford an additional safety margin expected to cover, notably, uncertainties in the seismotectonic analysis or insufficiencies in the seismic data itself. This second level of hazard, designated the Safety Design Earthquake, the effect of which is to raise the first-level intensity by one degree, is characterized by its response spectrum, in terms of which the facility's safety functions must remain unimpaired. Examples of the procedure just described, drawn from south-eastern France, are presented.

  13. Real-time slicing algorithm for Stereolithography (STL) CAD model applied in additive manufacturing industry

    Science.gov (United States)

    Adnan, F. A.; Romlay, F. R. M.; Shafiq, M.

    2018-04-01

    Owing to the advent of Industry 4.0, the need to further evaluate the processes applied in additive manufacturing, particularly the computational process for slicing, is non-trivial. This paper evaluates a real-time slicing algorithm for slicing an STL-formatted computer-aided design (CAD) model. A line-plane intersection equation was applied to perform the slicing procedure at any given height. The application of this algorithm has been found to provide a better computational time regardless of the number of facets in the STL model. The performance of this algorithm is evaluated by comparing the results of the computational time for different geometries.
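
    The slicing kernel the abstract refers to reduces to intersecting triangle edges with the horizontal plane z = h. A minimal sketch of that line-plane intersection step (a slicer applies it to every facet edge that straddles the plane and chains the resulting segments into contours); the function name and handling of degenerate cases are illustrative choices:

      def edge_plane_intersection(p0, p1, h):
          (x0, y0, z0), (x1, y1, z1) = p0, p1
          if (z0 - h) * (z1 - h) > 0:      # both endpoints on the same side
              return None
          if z0 == z1:                     # edge lies in the plane: no single point
              return None
          t = (h - z0) / (z1 - z0)         # parametric line-plane solution
          return (x0 + t * (x1 - x0), y0 + t * (y1 - y0), h)

    Because each facet is tested independently, the cost per layer is linear in the number of facets, which is consistent with the timing behaviour the paper reports.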

  14. Applying Integrated Computer Assisted Media (ICAM) in Teaching Vocabulary

    Directory of Open Access Journals (Sweden)

    Opick Dwi Indah

    2015-02-01

    Full Text Available The objective of this research was to find out whether the use of integrated computer assisted media (ICAM) is effective in improving the vocabulary achievement of the second semester students of Cokroaminoto Palopo University. The population of this research was the second semester students of the English department of Cokroaminoto Palopo University in academic year 2013/2014. The samples of this research were 60 students, placed into two groups: an experimental and a control group, each consisting of 30 students. This research used the cluster random sampling technique. The research data were collected by applying a vocabulary test and were analyzed by using descriptive and inferential statistics. The result of this research was that integrated computer assisted media (ICAM) can improve the vocabulary achievement of the students of the English department of Cokroaminoto Palopo University. It can be concluded that the use of ICAM in teaching vocabulary is effective to implement in improving the students’ vocabulary achievement.

  15. General rigid motion correction for computed tomography imaging based on locally linear embedding

    Science.gov (United States)

    Chen, Mianyi; He, Peng; Feng, Peng; Liu, Baodong; Yang, Qingsong; Wei, Biao; Wang, Ge

    2018-02-01

    Patient motion can degrade the quality of computed tomography images, which are typically acquired in cone-beam geometry. Rigid patient motion is characterized by six geometric parameters and is more challenging to correct than in fan-beam geometry. We extend our previous rigid patient motion correction method based on the principle of locally linear embedding (LLE) from fan-beam to cone-beam geometry and accelerate the computational procedure with the graphics processing unit (GPU)-based all scale tomographic reconstruction Antwerp toolbox. The major merit of our method is that we need neither fiducial markers nor motion-tracking devices. The numerical and experimental studies show that the LLE-based patient motion correction is capable of calibrating the six parameters of the patient motion simultaneously, reducing patient motion artifacts significantly.
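
    A minimal sketch of the locally linear embedding idea underlying the method, using synthetic data in place of real projection features: measurements acquired under smoothly varying motion lie near a low-dimensional manifold, and LLE recovers coordinates that track the motion. This uses scikit-learn's generic LLE, not the authors' GPU pipeline, and every quantity below is a toy assumption.

      import numpy as np
      from sklearn.manifold import LocallyLinearEmbedding

      rng = np.random.default_rng(3)
      t = np.linspace(0, 1, 200)                          # "gantry angle" / time
      features = np.c_[np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)]
      projections = features @ rng.normal(size=(2, 50))   # high-dimensional lift
      embedding = LocallyLinearEmbedding(n_neighbors=10, n_components=2)
      coords = embedding.fit_transform(projections)       # motion-related coordinates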

  16. Design flood hydrograph estimation procedure for small and fully-ungauged basins

    Science.gov (United States)

    Grimaldi, S.; Petroselli, A.

    2013-12-01

    The Rational Formula is the most applied equation in practical hydrology owing to its simplicity and its effective compromise between theory and data availability. Although the Rational Formula is affected by several drawbacks, it is reliable and surprisingly accurate considering the paucity of input information. However, after more than a century, recent computational and theoretical advances and progress in large-scale monitoring compel us to propose a more advanced yet still empirical procedure for estimating peak discharge in small and ungauged basins. In this contribution an alternative empirical procedure (named EBA4SUB - Event Based Approach for Small and Ungauged Basins) based on the common modelling steps of design hyetograph, rainfall excess, and rainfall-runoff transformation is described. The proposed approach, carefully adapted to the fully-ungauged basin condition, provides a potentially better estimation of the peak discharge and a design hydrograph shape, and, most importantly, reduces the subjectivity of the hydrologist in its application.
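
    For reference, the Rational Formula the abstract starts from is a one-liner: peak discharge Qp = C·i·A, with C the runoff coefficient, i the design rainfall intensity and A the drainage area; with i in mm/h and A in km² the factor 0.278 converts the product to m³/s. The sample values below are illustrative only.

      def rational_peak_discharge(c, i_mm_per_h, area_km2):
          # Qp [m^3/s] = 0.278 * C * i [mm/h] * A [km^2]
          return 0.278 * c * i_mm_per_h * area_km2

      qp = rational_peak_discharge(c=0.45, i_mm_per_h=60.0, area_km2=2.5)  # ~18.8 m^3/s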

  17. Instrumentation for Scientific Computing in Neural Networks, Information Science, Artificial Intelligence, and Applied Mathematics.

    Science.gov (United States)

    1987-10-01

    Instrumentation grant to purchase equipment for support of research in neural networks, information science, artificial intelligence, and applied mathematics. Contract AFOSR 86-0282. Principal Investigator: Stephen...

  18. Computer-Aided Diagnosis of Solid Breast Lesions Using an Ultrasonic Multi-Feature Analysis Procedure

    Science.gov (United States)

    2011-01-01

    ...ultrasound. 1. BACKGROUND AND INTRODUCTION: Breast cancer affects one of every eight women, kills one of 29 women in the United States, and is the leading...

  19. Applied modelling and computing in social science

    CERN Document Server

    Povh, Janez

    2015-01-01

    In social science outstanding results are yielded by advanced simulation methods, based on state of the art software technologies and an appropriate combination of qualitative and quantitative methods. This book presents examples of successful applications of modelling and computing in social science: business and logistic process simulation and optimization, deeper knowledge extractions from big data, better understanding and predicting of social behaviour and modelling health and environment changes.

  20. Soil Erosion Estimation Using Grid-based Computation

    Directory of Open Access Journals (Sweden)

    Josef Vlasák

    2005-06-01

    Full Text Available Soil erosion estimation is an important part of the land consolidation process. The Universal Soil Loss Equation (USLE) was presented by Wischmeier and Smith. The USLE computation uses several factors, namely R - the rainfall factor, K - soil erodibility, L - the slope length factor, S - the slope gradient factor, C - the cropping management factor, and P - the erosion control management factor. The L and S factors are usually combined into one LS factor, the topographic factor. The individual factors are determined from several sources, such as a DTM (Digital Terrain Model), the BPEJ soil type map, aerial and satellite images, etc. The conventional approach to the USLE computation, which is widely used in the Czech Republic, is based on the selection of characteristic profiles for which all the above-mentioned factors must be determined. The result (G - annual soil loss) of such a computation is then applied to a whole area (slope) of interest. Another approach to the USLE computation uses grids as the main data structure. A prerequisite for a grid-based USLE computation is that each of the above-mentioned factors exists as a separate grid layer. The crucial step in this computation is the selection of an appropriate grid resolution (grid cell size). A large cell size can cause undesirable precision degradation. Too small a cell size can noticeably slow down the whole computation. Provided that the cell size is derived from the sources' precision, the appropriate cell size for the Czech Republic varies from 30 m to 50 m. In some cases, especially when new surveying has been done, grid computations can be performed with higher accuracy, i.e. with a smaller grid cell size. For such cases, we have proposed a new method using a two-step computation. The first step uses a bigger cell size and is designed to identify high-erosion spots. The second step then uses a smaller cell size but performs the computation only for the area identified in the previous step. This decomposition allows a
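
    A minimal sketch of the grid-based computation the abstract describes: with every factor available as a co-registered raster layer, the annual soil loss G is simply the cell-wise product of the layers. The array shapes and factor values below are placeholders, not data from the paper.

      import numpy as np

      def usle_soil_loss(r, k, ls, c, p):
          # all inputs: 2-D arrays on the same grid (e.g., 30-50 m cells)
          return r * k * ls * c * p

      shape = (200, 200)
      rng = np.random.default_rng(0)
      g = usle_soil_loss(np.full(shape, 45.0),           # rainfall factor R
                         rng.uniform(0.2, 0.6, shape),   # erodibility K
                         rng.uniform(0.1, 3.0, shape),   # topographic factor LS
                         rng.uniform(0.05, 0.5, shape),  # cropping factor C
                         np.ones(shape))                 # control practice P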

  1. An Implementation of Parallel and Networked Computing Schemes for the Real-Time Image Reconstruction Based on Electrical Tomography

    International Nuclear Information System (INIS)

    Park, Sook Hee

    2001-02-01

    This thesis implements and analyzes parallel and networked computing libraries based on multiprocessor computer architectures as well as networked computers, aiming at improving the computation speed of an ET (Electrical Tomography) system, which requires enormous CPU time to reconstruct the unknown internal state of the target object. As an instance of typical tomography technology, ET partitions the cross-section of the target object into tiny elements and calculates their resistivity from signal values measured at the boundary electrodes surrounding the surface of the object after injecting a predetermined current pattern through the object. The number of elements is determined considering the trade-off between the accuracy of the reconstructed image and the computation time. As the elements become finer, their number increases and the system can obtain a better image. However, the reconstruction time increases polynomially with the number of partitioned elements, since the procedure consists of a number of time-consuming matrix operations such as multiplication, inverse, pseudo-inverse, Jacobian and so on. Consequently, the demand for improving computation speed via multiple processors grows. Moreover, currently released PCs can be equipped with up to 4 CPUs interconnected to shared memory, while some operating systems enable the application process to benefit from such computers by allocating threaded jobs to each CPU, resulting in concurrent processing. In addition, a networked computing or cluster computing environment is commonly available to almost every computer that supports a communication protocol and is connected to a local or global network. After partitioning the given job (numerical operation), each CPU or computer calculates its partial result independently, and the results are merged via common memory to produce the final result. It is desirable to adopt a commonly used library such as Matlab to
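
    The partition-compute-merge pattern described above can be sketched in a few lines. The example below splits one of the time-consuming matrix operations (a matrix-vector product standing in for the heavier inverse and Jacobian steps) across worker processes and merges the partial results; it is an illustration of the scheme, not the thesis's actual library.

```python
# Illustrative partition/merge of a matrix operation across CPUs.
import numpy as np
from multiprocessing import Pool

def _partial(args):
    rows, x = args            # each worker gets a horizontal slice of the matrix
    return rows @ x           # partial result, computed independently

def parallel_matvec(A, x, n_workers=4):
    chunks = np.array_split(A, n_workers, axis=0)   # partition the job
    with Pool(n_workers) as pool:
        parts = pool.map(_partial, [(c, x) for c in chunks])
    return np.concatenate(parts)                    # merge partial results

if __name__ == "__main__":
    A = np.random.rand(1024, 1024)
    x = np.random.rand(1024)
    assert np.allclose(parallel_matvec(A, x), A @ x)
```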

  2. The Implementation of Blended Learning Using Android-Based Tutorial Video in Computer Programming Course II

    Science.gov (United States)

    Huda, C.; Hudha, M. N.; Ain, N.; Nandiyanto, A. B. D.; Abdullah, A. G.; Widiaty, I.

    2018-01-01

    A computer programming course is theoretical, so sufficient practice is necessary to facilitate conceptual understanding and to encourage creativity in designing computer programs/animations. The development of a tutorial video for Android-based blended learning is needed as a guide for students. Using Android-based instructional material, students can learn independently anywhere and anytime. The tutorial video can facilitate students’ understanding of the concepts, materials, and procedures of programming/animation making in detail. This study employed a Research and Development method adapting Thiagarajan’s 4D model. The developed Android-based instructional material and tutorial video were validated by experts in instructional media and experts in physics education. The expert validation results showed that the Android-based material was comprehensive and very feasible. The tutorial video was deemed feasible as it received an average score of 92.9%. It was also revealed that students’ conceptual understanding, skills, and creativity in designing computer programs/animations improved significantly.

  3. Software for computer based systems important to safety in nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2004-01-01

    systems important to safety in nuclear power plants, for all phases of the system life cycle. The guidance is applicable to systems important to safety. Since at present the reliability of a computer based system cannot be predicted on the sole basis of, or built in by, the design process, it is difficult to define and to agree systematically on any possible relaxation in the guidance to apply to software for safety related systems. Whenever possible, recommendations which apply only to safety systems and not to safety related systems are explicitly identified. The guidance relates primarily to the software used in computer based systems important to safety. Guidance on the other aspects of computer based systems, such as those concerned with the design of the computer based system itself and its hardware, is limited to the issues raised by the development, verification and validation of software. The main focus of this Safety Guide is on the preparation of documentation that is used for an adequate demonstration of the safety and reliability of computer based systems important to safety. This Safety Guide applies to all types of software: pre-existing software or firmware (such as an operating system), software to be specifically developed for the project, or software to be developed from an existing pre-developed equipment family of hardware or software modules. This Safety Guide is intended for use by those involved in the production, assessment and licensing of computer based systems, including plant system designers, software designers and programmers, verifiers, validators, certifiers and regulators, as well as plant operators. The various interfaces between those involved are considered. (author)

  4. Software for computer based systems important to safety in nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2005-01-01

    systems important to safety in nuclear power plants, for all phases of the system life cycle. The guidance is applicable to systems important to safety. Since at present the reliability of a computer based system cannot be predicted on the sole basis of, or built in by, the design process, it is difficult to define and to agree systematically on any possible relaxation in the guidance to apply to software for safety related systems. Whenever possible, recommendations which apply only to safety systems and not to safety related systems are explicitly identified. The guidance relates primarily to the software used in computer based systems important to safety. Guidance on the other aspects of computer based systems, such as those concerned with the design of the computer based system itself and its hardware, is limited to the issues raised by the development, verification and validation of software. The main focus of this Safety Guide is on the preparation of documentation that is used for an adequate demonstration of the safety and reliability of computer based systems important to safety. This Safety Guide applies to all types of software: pre-existing software or firmware (such as an operating system), software to be specifically developed for the project, or software to be developed from an existing pre-developed equipment family of hardware or software modules. This Safety Guide is intended for use by those involved in the production, assessment and licensing of computer based systems, including plant system designers, software designers and programmers, verifiers, validators, certifiers and regulators, as well as plant operators. The various interfaces between those involved are considered.

  5. Software for computer based systems important to safety in nuclear power plants. Safety guide

    International Nuclear Information System (INIS)

    2000-01-01

    systems important to safety in nuclear power plants, for all phases of the system life cycle. The guidance is applicable to systems important to safety. Since at present the reliability of a computer based system cannot be predicted on the sole basis of, or built in by, the design process, it is difficult to define and to agree systematically on any possible relaxation in the guidance to apply to software for safety related systems. Whenever possible, recommendations which apply only to safety systems and not to safety related systems are explicitly identified. The guidance relates primarily to the software used in computer based systems important to safety. Guidance on the other aspects of computer based systems, such as those concerned with the design of the computer based system itself and its hardware, is limited to the issues raised by the development, verification and validation of software. The main focus of this Safety Guide is on the preparation of documentation that is used for an adequate demonstration of the safety and reliability of computer based systems important to safety. This Safety Guide applies to all types of software: pre-existing software or firmware (such as an operating system), software to be specifically developed for the project, or software to be developed from an existing pre-developed equipment family of hardware or software modules. This Safety Guide is intended for use by those involved in the production, assessment and licensing of computer based systems, including plant system designers, software designers and programmers, verifiers, validators, certifiers and regulators, as well as plant operators. The various interfaces between those involved are considered.

  6. Office-Based Procedures for the Diagnosis and Treatment of Laryngeal Pathology.

    Science.gov (United States)

    Wellenstein, David J; Schutte, Henrieke W; Takes, Robert P; Honings, Jimmie; Marres, Henri A M; Burns, James A; van den Broek, Guido B

    2017-09-18

    Since the development of distal chip endoscopes with a working channel, diagnostic and therapeutic possibilities in the outpatient clinic in the management of laryngeal pathology have increased. Which of these office-based procedures are currently available, and their clinical indications and possible advantages, remains unclear. Review of literature on office-based procedures in laryngology and head and neck oncology. Flexible endoscopic biopsy (FEB), vocal cord injection, and laser surgery are well-established office-based procedures that can be performed under topical anesthesia. These procedures demonstrate good patient tolerability and multiple advantages. Office-based procedures under topical anesthesia are currently an established method in the management of laryngeal pathology. These procedures offer medical and economic advantages compared with operating room-performed procedures. Furthermore, office-based procedures enhance the speed and timing of the diagnostic and therapeutic process. Copyright © 2017 The Voice Foundation. All rights reserved.

  7. Knowledge-based framework for procedure synthesis and its application to the emergency response in a nuclear power plant

    International Nuclear Information System (INIS)

    Sharma, D.D.

    1986-01-01

    In this dissertation a nuclear power plant operator is viewed as a knowledge-based problem solver. It is shown that, in responding to an abnormal situation, an operator typically solves several problems, for example, plant status monitoring, diagnosis, sensor data validation, consequence prediction, and procedure synthesis. It is proposed that, in order to respond to unexpected situations and handle procedure failures, the capability to synthesize and modify procedures dynamically, at runtime, is required. A knowledge-based framework for dynamically synthesizing procedures (DPS), a knowledge representation language for applying the DPS framework to real problems (DPSRL), and a framework for emergency procedure synthesis (FEPS) for nuclear power plants based on DPS are developed. The DPS framework views the task of synthesis as a process of selecting predefined procedures to match the needs of the dynamically changing plant conditions. The DPSRL language provides the knowledge organization and representation primitives required to support the DPS framework. Specifically, it provides mechanisms to build various plant libraries and procedures to access information from them. The capabilities and the use of DPS, DPSRL, and FEPS are demonstrated by developing an experimental expert system for a typical boiling water reactor and analyzing its performance for various selected abnormal incidents.
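
    A hedged sketch of the selection idea at the core of DPS follows: predefined procedures carry preconditions on the plant state, and the plan is re-synthesized whenever the state changes. The condition names, thresholds and steps below are hypothetical illustrations, not content from the dissertation.

```python
# Hypothetical illustration of selecting predefined procedures to match
# dynamically changing plant conditions (names and values invented).
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Procedure:
    name: str
    applies: Callable[[Dict[str, float]], bool]  # precondition on plant state
    steps: List[str]

LIBRARY = [
    Procedure("reduce_power", lambda s: s["reactor_power"] > 0.9,
              ["insert control rods"]),
    Procedure("restore_level", lambda s: s["sg_level"] < 0.3,
              ["start auxiliary feedwater"]),
]

def synthesize(state: Dict[str, float]) -> List[str]:
    """Re-run on every state update: select the predefined procedures
    whose preconditions match the current plant conditions."""
    plan: List[str] = []
    for proc in LIBRARY:
        if proc.applies(state):
            plan.extend(proc.steps)
    return plan

print(synthesize({"reactor_power": 0.95, "sg_level": 0.25}))
```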

  8. Applied discrete-time queues

    CERN Document Server

    Alfa, Attahiru S

    2016-01-01

    This book introduces the theoretical fundamentals for modeling queues in discrete time, and the basic procedures for developing queuing models in discrete time. There is a focus on applications in modern telecommunication systems. It presents how most queueing models in discrete time can be set up as discrete-time Markov chains. Techniques such as matrix-analytic methods (MAM) that can be used to analyze the resulting Markov chains are included. This book covers single node systems, tandem systems and queueing networks. It shows how queues with time-varying parameters can be analyzed, and illustrates numerical issues associated with computations for the discrete-time queueing systems. Optimal control of queues is also covered. Applied Discrete-Time Queues targets researchers, advanced-level students and analysts in the field of telecommunication networks. It is suitable as a reference book and can also be used as a secondary text book in computer engineering and computer science. Examples and exercises are included.
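
    To make the setting concrete, the sketch below sets up the simplest discrete-time queue, a Geo/Geo/1 system, as a discrete-time Markov chain and extracts its stationary queue-length distribution numerically. Truncating the state space and using plain power iteration are simplifications assumed here for illustration.

```python
# A Geo/Geo/1 discrete-time queue as a truncated discrete-time Markov chain.
import numpy as np

def geo_geo_1(p_arrival=0.3, p_service=0.5, n_states=50):
    P = np.zeros((n_states, n_states))
    a, s = p_arrival, p_service
    for n in range(n_states):
        if n == 0:
            P[0, 1] = a                       # arrival at an empty queue
            P[0, 0] = 1 - a
        else:
            up = a * (1 - s)                  # arrival, no departure
            down = s * (1 - a)                # departure, no arrival
            P[n, min(n + 1, n_states - 1)] += up
            P[n, n - 1] += down
            P[n, n] += 1 - up - down          # both events or neither
    pi = np.ones(n_states) / n_states
    for _ in range(10000):                    # power iteration to stationarity
        pi = pi @ P
    return pi

pi = geo_geo_1()
print("mean queue length:", (np.arange(len(pi)) * pi).sum())
```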

  9. PROSA: A computer program for statistical analysis of near-real-time-accountancy (NRTA) data

    International Nuclear Information System (INIS)

    Beedgen, R.; Bicking, U.

    1987-04-01

    The computer program PROSA (Program for Statistical Analysis of NRTA Data) is a tool to decide, on the basis of statistical considerations, whether or not a loss of material might have occurred in a given sequence of materials balance periods. The evaluation of the material balance data is based on statistical test procedures. In PROSA, three truncated sequential tests are applied to a sequence of material balances. The manual describes the statistical background of PROSA and how to use the computer program on an IBM-PC with DOS 3.1. (orig.) [de]
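
    The flavour of such a truncated sequential test can be sketched as follows; a one-sided CUSUM (Page) test on standardized material-balance results stands in here for PROSA's actual three tests, and the reference value k and threshold h are illustrative.

```python
# Illustrative truncated sequential test (CUSUM / Page), not PROSA's own tests.
def cusum_alarm(muf, sigma, k=0.5, h=4.0):
    """muf: sequence of material-unaccounted-for values; sigma: their std. dev.
    Returns the first balance period raising an alarm, or None."""
    s = 0.0
    for t, m in enumerate(muf, start=1):
        s = max(0.0, s + m / sigma - k)   # accumulate standardized loss evidence
        if s > h:                          # decision threshold reached: alarm
            return t
    return None                            # truncated: no alarm in the sequence

print(cusum_alarm(muf=[0.1, 0.3, 1.2, 1.5, 1.4], sigma=0.5))
```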

  10. Budget-based power consumption for application execution on a plurality of compute nodes

    Science.gov (United States)

    Archer, Charles J; Inglett, Todd A; Ratterman, Joseph D

    2012-10-23

    Methods, apparatus, and products are disclosed for budget-based power consumption for application execution on a plurality of compute nodes that include: assigning an execution priority to each of one or more applications; executing, on the plurality of compute nodes, the applications according to the execution priorities assigned to the applications at an initial power level provided to the compute nodes until a predetermined power consumption threshold is reached; and applying, upon reaching the predetermined power consumption threshold, one or more power conservation actions to reduce power consumption of the plurality of compute nodes during execution of the applications.
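
    A hedged sketch of the disclosed control flow follows: applications run in priority order at an initial power level, cumulative consumption is tracked, and a conservation action (here, a simple throttle to a lower power level) is applied once the budget threshold is reached. All names and numbers are illustrative, not from the patent.

```python
# Illustrative budget-based power control loop (values and actions invented).
def run_with_power_budget(apps, budget_joules, initial_power_w, reduced_power_w):
    """Execute apps in priority order; throttle once the budget is exhausted."""
    apps = sorted(apps, key=lambda a: a["priority"], reverse=True)
    consumed, power = 0.0, initial_power_w
    for app in apps:
        for _ in range(app["runtime_s"]):      # one-second simulation steps
            consumed += power                  # energy at the current power level
            if consumed >= budget_joules:
                power = reduced_power_w        # power conservation action
    return consumed

apps = [{"priority": 2, "runtime_s": 60}, {"priority": 1, "runtime_s": 120}]
print(run_with_power_budget(apps, budget_joules=5e4,
                            initial_power_w=800, reduced_power_w=500))
```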

  11. Computer Simulation of the Formation of Non-Metallic Precipitates During a Continuous Casting of Steel

    Directory of Open Access Journals (Sweden)

    Kalisz D.

    2016-03-01

    Full Text Available The authors’ own computer software, based on the Ueshima mathematical model and taking into account back diffusion determined from the Wołczyński equation, was developed for the simulation calculations. The applied calculation procedure allowed the determination of the chemical composition of the non-metallic phase in steel deoxidised by means of Mn, Si and Al at a given cooling rate. The calculation results were confirmed by the analysis of samples taken from the determined areas of the cast ingot. This indicates that the developed computer software can be applied to designing a steel casting process with a strictly determined chemical composition and to obtaining the required non-metallic precipitates.

  12. Evaluating the Theoretic Adequacy and Applied Potential of Computational Models of the Spacing Effect.

    Science.gov (United States)

    Walsh, Matthew M; Gluck, Kevin A; Gunzelmann, Glenn; Jastrzembski, Tiffany; Krusmark, Michael

    2018-03-02

    The spacing effect is among the most widely replicated empirical phenomena in the learning sciences, and its relevance to education and training is readily apparent. Yet successful applications of spacing effect research to education and training are rare. Computational modeling can provide the crucial link between a century of accumulated experimental data on the spacing effect and the emerging interest in using that research to enable adaptive instruction. In this paper, we review relevant literature and identify 10 criteria for rigorously evaluating computational models of the spacing effect. Five relate to evaluating the theoretic adequacy of a model, and five relate to evaluating its application potential. We use these criteria to evaluate a novel computational model of the spacing effect called the Predictive Performance Equation (PPE). PPE combines elements of earlier models of learning and memory, including the General Performance Equation, Adaptive Control of Thought-Rational, and the New Theory of Disuse, giving rise to a novel computational account of the spacing effect that performs favorably across the complete sets of theoretic and applied criteria. We also implemented two other previously published computational models of the spacing effect and compared them to PPE using the theoretic and applied criteria as guides. © 2018 Cognitive Science Society, Inc.

  13. Containment integrity and leak testing. Procedures applied and experiences gained in European countries

    International Nuclear Information System (INIS)

    1987-01-01

    Containment systems are the ultimate safety barrier for preventing the escape of gaseous, liquid and solid radioactive materials produced in normal operation, not retained in process systems, and for keeping back radioactive materials released by system malfunction or equipment failure. A primary element of the containment shell is therefore its leak-tight design. The report describes the present containment concepts mostly used in European countries. The leak-testing procedures applied and the experiences gained in their application are also discussed. The report refers more particularly to pre-operational testing, periodic testing and extrapolation methods of leak rates measured at test conditions to expected leak rates at calculated accident conditions. The actual problems in periodic containment leak rate testing are critically reviewed. In the appendix to the report a summary is given of the regulations and specifications applied in different member countries

  14. Improving the efficiency of aerodynamic shape optimization procedures

    Science.gov (United States)

    Burgreen, Greg W.; Baysal, Oktay; Eleshaky, Mohamed E.

    1992-01-01

    The computational efficiency of an aerodynamic shape optimization procedure which is based on discrete sensitivity analysis is increased through the implementation of two improvements. The first improvement involves replacing a grid point-based approach for surface representation with a Bezier-Bernstein polynomial parameterization of the surface. Explicit analytical expressions for the grid sensitivity terms are developed for both approaches. The second improvement proposes the use of Newton's method in lieu of an alternating direction implicit (ADI) methodology to calculate the highly converged flow solutions which are required to compute the sensitivity coefficients. The modified design procedure is demonstrated by optimizing the shape of an internal-external nozzle configuration. A substantial factor of 8 decrease in computational time for the optimization process was achieved by implementing both of the design improvements.
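
    The first improvement lends itself to a short illustration: with a Bezier representation, a point on the surface is a Bernstein-weighted sum of control points, so the sensitivity of a surface point with respect to a design variable (a control point) is simply the Bernstein basis value, which is what makes the grid sensitivity terms explicit. The sketch below shows this for a curve; the control points are an illustrative airfoil-like shape, not the paper's nozzle geometry.

```python
# Bezier curve from Bernstein polynomials; the derivative of a curve point
# with respect to control point i is just bernstein(n, i, t).
import numpy as np
from math import comb

def bernstein(n, i, t):
    return comb(n, i) * t**i * (1 - t)**(n - i)

def bezier_curve(control_pts, t):
    """Point on the curve; d(point)/d(control_pt_i) = bernstein(n, i, t)."""
    n = len(control_pts) - 1
    return sum(bernstein(n, i, t) * np.asarray(p)
               for i, p in enumerate(control_pts))

pts = [(0, 0), (0.3, 0.5), (0.7, 0.5), (1, 0)]   # illustrative shape only
print(bezier_curve(pts, 0.5))
```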

  15. Optical high-performance computing: introduction to the JOSA A and Applied Optics feature.

    Science.gov (United States)

    Caulfield, H John; Dolev, Shlomi; Green, William M J

    2009-08-01

    The feature issues in both Applied Optics and the Journal of the Optical Society of America A focus on topics of immediate relevance to the community working in the area of optical high-performance computing.

  16. The Effect of the Extinction Procedure in Function-Based Intervention

    Science.gov (United States)

    Janney, Donna M.; Umbreit, John; Ferro, Jolenea B.; Liaupsin, Carl J.; Lane, Kathleen L.

    2013-01-01

    In this study, we examined the contribution of the extinction procedure in function-based interventions implemented in the general education classrooms of three at-risk elementary-aged students. Function-based interventions included antecedent adjustments, reinforcement procedures, and function-matched extinction procedures. Using a combined ABC…

  17. Decoherence and Noise in Spin-based Solid State Quantum Computers. Approximation-Free Numerical Simulations

    National Research Council Canada - National Science Library

    Harmon, Bruce N; Dobrovitski, Viatcheslav V

    2007-01-01

    ...) have also been developed and applied. Most recently, specific strategies for quantum control have been investigated for realistic systems in order to extend the coherence times for spin-based quantum computing implementations...

  18. Analyze image quality and comparative study between conventional and computed radiography applied to the inspection of alloys

    International Nuclear Information System (INIS)

    Machado, Alessandra S.; Oliveira, Davi F.; Silva, Aline S.S.; Nascimento, Joseilson R.; Lopes, Ricardo T.

    2011-01-01

    Piping system design takes into account relevant factors such as internal coating, dimensioning, system vibration, adequate supports and, principally, piping material. Cost is a decisive factor in the material selection phase. The non-destructive testing method most commonly employed in industry to analyze the structure of an object is radiographic testing. Computed radiography (CR) is a quicker and much more efficient alternative to conventional radiography but, although CR presents numerous advantages, testing procedures are still largely based on trial and error, owing to the lack of an established methodology for choosing parameters such as exists for conventional radiography. Nevertheless, this paper presents a study that uses the computed radiography technique to analyze metal alloys. These metal alloys are used as internal pipe coatings to protect against corrosion and cracks. The study evaluates parameters such as basic spatial resolution, normalized signal-to-noise ratio (SNRN), contrast and intensity, and compares conventional radiography with CR. (author)

  19. simEye: computer-based simulation of visual perception under various eye defects using Zernike polynomials

    OpenAIRE

    Fink, Wolfgang; Micol, Daniel

    2006-01-01

    We describe a computer eye model that allows for aspheric surfaces and a three-dimensional computer-based ray-tracing technique to simulate optical properties of the human eye and visual perception under various eye defects. Eye surfaces, such as the cornea, eye lens, and retina, are modeled or approximated by a set of Zernike polynomials that are fitted to input data for the respective surfaces. A ray-tracing procedure propagates light rays using Snell’s law of refraction from an input objec...
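
    The core ray-propagation step the authors describe, refraction by Snell's law at each modelled surface, can be written compactly in vector form. The sketch below is a generic illustration of that step (unit vectors and simple refractive indices are assumed), not code from simEye.

```python
# Vector form of Snell's law for one refraction step of a ray tracer.
import numpy as np

def refract(d, n, n1, n2):
    """Refract unit direction d at a surface with unit normal n (pointing
    against the incoming ray). Returns None on total internal reflection."""
    eta = n1 / n2
    cos_i = -np.dot(n, d)
    sin2_t = eta**2 * (1.0 - cos_i**2)
    if sin2_t > 1.0:
        return None                      # total internal reflection
    cos_t = np.sqrt(1.0 - sin2_t)
    return eta * d + (eta * cos_i - cos_t) * n

d = np.array([0.0, -np.sin(np.radians(30)), -np.cos(np.radians(30))])
print(refract(d, np.array([0.0, 0.0, 1.0]), 1.0, 1.376))  # air into cornea
```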

  20. Computer Labs | College of Engineering & Applied Science

    Science.gov (United States)


  1. Computer Resources | College of Engineering & Applied Science

    Science.gov (United States)


  2. Computer Science | Classification | College of Engineering & Applied

    Science.gov (United States)


  3. Actor-Network Procedures

    NARCIS (Netherlands)

    Pavlovic, Dusko; Meadows, Catherine; Ramanujam, R.; Ramaswamy, Srini

    2012-01-01

    In this paper we propose actor-networks as a formal model of computation in heterogenous networks of computers, humans and their devices, where these new procedures run; and we introduce Procedure Derivation Logic (PDL) as a framework for reasoning about security in actor-networks, as an extension

  4. APPLYING ARTIFICIAL INTELLIGENCE TECHNIQUES TO HUMAN-COMPUTER INTERFACES

    DEFF Research Database (Denmark)

    Sonnenwald, Diane H.

    1988-01-01

    A description is given of UIMS (User Interface Management System), a system using a variety of artificial intelligence techniques to build knowledge-based user interfaces combining functionality and information from a variety of computer systems that maintain, test, and configure customer telephone and data networks. Three artificial intelligence (AI) techniques used in UIMS are discussed, namely, frame representation, object-oriented programming languages, and rule-based systems. The UIMS architecture is presented, and the structure of the UIMS is explained in terms of the AI techniques.

  5. Intelligent Decisional Assistant that Facilitates the Choice of a Proper Computer System Applied in Business

    Directory of Open Access Journals (Sweden)

    Nicolae MARGINEAN

    2009-01-01

    Full Text Available The choice of a proper computer system is not an easy task for a decision maker. One reason is the current development of the market for computer systems applied in business: the large number of players on the Romanian market results in a large number of computerized products with a multitude of different properties. Our proposal aims to optimize and facilitate this decision process within an e-shop selling IT packages for business, by building an online decisional assistant, a special component conceived to facilitate the decision making needed to select the IT package that fits the requirements of a particular business, as described by the decision maker. The user interacts with the system as an online buyer visiting an e-shop where business IT packages are sold.

  6. Monte Carlo shielding analyses using an automated biasing procedure

    International Nuclear Information System (INIS)

    Tang, J.S.; Hoffman, T.J.

    1988-01-01

    A systematic and automated approach for biasing Monte Carlo shielding calculations is described. In particular, adjoint fluxes from a one-dimensional discrete ordinates calculation are used to generate biasing parameters for a Monte Carlo calculation. The entire procedure of adjoint calculation, biasing parameters generation, and Monte Carlo calculation has been automated. The automated biasing procedure has been applied to several realistic deep-penetration shipping cask problems. The results obtained for neutron and gamma-ray transport indicate that with the automated biasing procedure Monte Carlo shielding calculations of spent-fuel casks can be easily performed with minimum effort and that accurate results can be obtained at reasonable computing cost
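
    The essence of the biasing step, sampling from an importance-weighted source distribution while carrying a compensating statistical weight, can be illustrated in a few lines. The adjoint (importance) values below are stand-ins for the output of the one-dimensional discrete ordinates calculation, and the whole example is schematic rather than the paper's actual procedure.

```python
# Schematic source biasing with adjoint-informed importances (values invented).
import random

cells = [0, 1, 2, 3]
analog_pdf = [0.25, 0.25, 0.25, 0.25]      # unbiased (analog) source distribution
adjoint = [0.1, 0.5, 1.5, 4.0]             # importance from a 1-D adjoint calc

# Biased pdf ~ analog pdf * importance, renormalized.
biased = [p * a for p, a in zip(analog_pdf, adjoint)]
norm = sum(biased)
biased = [b / norm for b in biased]

def sample_source():
    cell = random.choices(cells, weights=biased)[0]
    weight = analog_pdf[cell] / biased[cell]   # weight keeps the mean unbiased
    return cell, weight

print([sample_source() for _ in range(3)])
```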

  7. A Descriptive Study towards Green Computing Practice Application for Data Centers in IT Based Industries

    Directory of Open Access Journals (Sweden)

    Anthony Jnr. Bokolo

    2018-01-01

    Full Text Available The progressive upsurge in demand for processing and computing power has led to a subsequent upsurge in data center carbon emissions, costs incurred, unethical waste management, depletion of natural resources and high energy utilization. This raises the issue of attaining sustainability in the data centers of Information Technology (IT) based industries. Green computing practice can be applied to facilitate sustainability attainment, as IT based industries utilize data centers to provide services to staff, practitioners and end users. But it is a known fact that enterprise servers utilize huge quantities of energy and incur other expenditures in cooling operations, and it is difficult to address the needs of accuracy and efficiency in data centers while encouraging greener application practices alongside cost reduction. Thus this research study focuses on the practical application of Green computing in data centers, which house servers, and presents the Green computing life cycle strategies and best practices for better management of data centers in IT based industries. Data was collected through a questionnaire from 133 respondents in industries that currently operate their own in-house data centers. The analysed data was used to verify the Green computing life cycle strategies presented in this study. Findings from the data show that each of the life cycle strategies is significant in assisting IT based industries to apply Green computing practices in their data centers. This study would be of interest to knowledge and data management practitioners as well as environmental managers and academics interested in deploying Green data centers in their organizations.

  8. Computer-based irrigation scheduling for cotton crop

    International Nuclear Information System (INIS)

    Laghari, K.Q.; Memon, H.M.

    2008-01-01

    In this study a real-time irrigation schedule for a cotton crop has been tested using the Mehran model, a computer-based DSS (Decision Support System). The irrigation schedule was set on a selected MAD (Management Allowable Depletion) and the current root depth position. A total of 451 mm of irrigation water was applied to the crop field. The seasonal computed crop ET (Evapotranspiration) was estimated at 421.32 mm, while the observed actual value (ET/sub ca/) was 413 mm; the model over-estimated seasonal ET by only 1.94%. WUE (Water Use Efficiency) for seed-cotton reached 6.59 kg (ha mm)/sup -1/. The statistical analysis (R/sup 2/=0.96, ARE%=2.00, T=1.17 and F=550.57) showed good performance of the model in terms of simulated and observed ET values. The Mehran model is quite versatile for irrigation scheduling and can be successfully used as an irrigation DSS tool for various crop types. (author)
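
    The MAD-based scheduling rule the study describes can be sketched as follows: irrigate when the depletion of plant-available water in the current root zone exceeds the chosen Management Allowable Depletion. The soil constants below are illustrative assumptions, not the Mehran model's values.

```python
# Illustrative MAD irrigation trigger (soil constants assumed for the example).
def irrigate_today(depletion_mm, root_depth_m, mad=0.5,
                   available_water_mm_per_m=140.0):
    """depletion_mm: cumulative ET minus rain/irrigation since last wetting."""
    taw = available_water_mm_per_m * root_depth_m   # total available water (mm)
    allowable = mad * taw                           # depletion that triggers irrigation
    return depletion_mm >= allowable, allowable

need, trigger = irrigate_today(depletion_mm=45.0, root_depth_m=0.6)
print(f"irrigate: {need} (trigger at {trigger:.0f} mm)")
```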

  9. Static and dynamic buckling of large thin shells. (Design procedure, computation tools. Physical understanding of the mechanisms)

    International Nuclear Information System (INIS)

    Combescure, A.

    1986-04-01

    During the last ten years, the French research institute for nuclear energy (Commissariat a l'Energie Atomique) has carried out many theoretical and experimental studies for designing the first large pool-type fast breeder reactor. Many of the sensitive parts of this reactor are thin shells subjected to high temperatures and loads. Special care has been given to buckling, because it often governs the design. Most of the thin shell structures of the French breeder reactor are axisymmetric; however, imperfections have to be accounted for. In order to keep the advantages of an axisymmetric analysis (low computational costs), a special element has been implemented and used with considerable success in recent years. This element (COMU) is described in the first chapter; its main features are: either a non-axisymmetric imperfection or a non-axisymmetric load, large displacements, non-linear material behaviour, and computational costs about ten times lower than those of the equivalent three-dimensional analysis. This paper, based on a careful comparison between experimental and computational results obtained with the COMU, analyses three problems. First: the design procedure against buckling of thin shell structures subjected to primary loads; second: static post-buckling; third: buckling under seismic loads [fr]

  10. Computing and Systems Applied in Support of Coordinated Energy, Environmental, and Climate Planning

    Science.gov (United States)

    This talk focuses on how Dr. Loughlin is applying Computing and Systems models, tools and methods to more fully understand the linkages among energy systems, environmental quality, and climate change. Dr. Loughlin will highlight recent and ongoing research activities, including: ...

  11. A computational procedure for the investigation of whipping effect on ITER High Energy Piping and its application to the ITER divertor primary heat transfer system

    Energy Technology Data Exchange (ETDEWEB)

    Spagnuolo, G.A., E-mail: Alessandro.Spagnuolo@kraftanlagen.com [Kraftanlagen Heidelberg Gmbh, Im Breitspiel 7, D-69126 Heidelberg (Germany); Dell’Orco, G. [ITER Organization, Route de Vinon-sur-Verdon, CS 90 046, 13067 St. Paul Lez Durance Cedex (France); Di Maio, P.A. [Dipartimento di Energia, Ingegneria dell’Informazione e Modelli Matematici, Università di Palermo Viale delle Scienze, 90128 Palermo (Italy); Mazzei, M. [Kraftanlagen Heidelberg Gmbh, Im Breitspiel 7, D-69126 Heidelberg (Germany)

    2015-10-15

    Highlights: • High Energy Piping (HEP) comprises components containing water or steam at P ≥ 2.0 MPa and/or T ≥ 100 °C. • The whipping effect in HEP may cause a dangerous domino effect with associated rupture propagation. • The rupture is envisaged or postulated according to the stress state of the piping. • An FEM analysis is performed in order to study the dynamics of the whipping effect. • Special supports are studied to avoid and/or mitigate the whipping effect. - Abstract: The Tokamak Cooling Water System of a nuclear facility has the function of removing heat from plasma facing components while maintaining coolant temperatures, pressures and flow rates as required; depending on thermal-hydraulic requirements, its systems are defined as High Energy Piping (HEP) because they contain fluids, such as water or steam, at a pressure greater than or equal to 2.0 MPa and/or a temperature greater than or equal to 100 °C, or gas at a pressure above atmospheric. The French standards contemplate the need to consider the whipping effect in HEP design. This effect occurs when, after a double-ended guillotine break, the reaction force displaces the piping in a way that might affect adjacent components. A research campaign has been performed, in cooperation between the ITER Organization and the University of Palermo, to outline a procedure for checking whether the whipping effect might occur and for assessing its potential damage so as to allow its mitigation. This procedure is based on the guidelines issued by the U.S. Nuclear Regulatory Commission. The proposed procedure has been applied to the analysis of the whipping effect in the divertor primary heat transfer system HEP, using a theoretical–computational approach based on the finite element method.

  12. Predicting the effect of seine rope layout pattern and haul-in procedure on the effectiveness of demersal seine fishing: A Computer simulation-based approach.

    Science.gov (United States)

    Madsen, Nina A H; Aarsæther, Karl G; Herrmann, Bent

    2017-01-01

    Demersal seining is an active fishing method applying two long seine ropes and a seine net. It relies on fish responding to the seine rope as it moves during the fishing process. The seine ropes and net are deployed in a specific pattern encircling an area on the seabed. In some variants of demersal seining, the haul-in procedure includes a towing phase in which the fishing vessel moves forward before starting to winch in the seine ropes. The initial seine-rope-encircled area, its gradual change during the haul-in process, and the fish's reaction to the moving seine ropes play an important role in the catch performance of demersal seine fishing. The current study investigates this subject by applying computer simulation models of demersal seine fishing. Demersal seine fishing is dynamic in nature, and therefore a dynamic model, SeineSolver, is applied to simulate the physical behaviour of the seine ropes during the fishing process. Information about the seine rope behaviour is used as input to another simulation tool, SeineFish, which predicts the catch performance of the demersal seine fishing process. SeineFish implements a simple model of how fish at the seabed react to an approaching seine rope. Here, the SeineSolver and SeineFish tools are applied to investigate catching performance for a Norwegian demersal seine fishery targeting cod (Gadus morhua) in the coastal zone. The effects of the seine rope layout pattern and the duration of the towing phase are investigated. Among the four different layout patterns investigated, the square layout pattern was predicted to perform best, catching 69%-86% more fish than would be obtained with the rectangular layout pattern. Inclusion of a towing phase in the fishing process was found to increase the catch performance for all layout patterns. For the square layout pattern, inclusion of a towing phase of 15 or 35 minutes increased the catch performance by 37% and 48%, respectively, compared to fishing without

  13. Knowledge-based computer security advisor

    International Nuclear Information System (INIS)

    Hunteman, W.J.; Squire, M.B.

    1991-01-01

    The rapid expansion of computer security information and technology has included little support to help the security officer identify the safeguards needed to comply with a policy and to secure a computing system. This paper reports that Los Alamos is developing a knowledge-based computer security system to provide expert knowledge to the security officer. This system includes a model for expressing the complex requirements in computer security policy statements. The model is part of an expert system that allows a security officer to describe a computer system and then determine compliance with the policy. The model provides a generic representation that captures network relationships among the policy concepts to support inferencing based on information represented in the generic policy description.

  14. Design Transformations for Rule-based Procedural Modeling

    KAUST Repository

    Lienhard, Stefan; Lau, Cheryl; Mü ller, Pascal; Wonka, Peter; Pauly, Mark

    2017-01-01

    We introduce design transformations for rule-based procedural models, e.g., for buildings and plants. Given two or more procedural designs, each specified by a grammar, a design transformation combines elements of the existing designs to generate new designs. We introduce two technical components to enable design transformations. First, we extend the concept of discrete rule switching to rule merging, leading to a very large shape space for combining procedural models. Second, we propose an algorithm to jointly derive two or more grammars, called grammar co-derivation. We demonstrate two applications of our work: we show that our framework leads to a larger variety of models than previous work, and we show fine-grained transformation sequences between two procedural models.

  15. Design Transformations for Rule-based Procedural Modeling

    KAUST Repository

    Lienhard, Stefan

    2017-05-24

    We introduce design transformations for rule-based procedural models, e.g., for buildings and plants. Given two or more procedural designs, each specified by a grammar, a design transformation combines elements of the existing designs to generate new designs. We introduce two technical components to enable design transformations. First, we extend the concept of discrete rule switching to rule merging, leading to a very large shape space for combining procedural models. Second, we propose an algorithm to jointly derive two or more grammars, called grammar co-derivation. We demonstrate two applications of our work: we show that our framework leads to a larger variety of models than previous work, and we show fine-grained transformation sequences between two procedural models.

  16. Investigation of UT procedure for crack depth sizing by phased array UT in Ni-based alloy weld

    International Nuclear Information System (INIS)

    Hirasawa, Taiji; Fukutomi, Hiroyuki

    2013-01-01

    Recently, it has been reported that primary water stress corrosion cracking (PWSCC) has occurred in nickel-based alloy weld components, such as steam generator safe end welds and reactor vessel safe end welds, in PWRs. Defect detection and sizing are important in order to ensure the reliable operation and life extension of nuclear power plants. In the reactor vessel safe end weld, it was impossible to measure the crack depth of PWSCC. The cracks have occurred in the axial direction of the safe end weld and have features such as great depth, a large aspect ratio (ratio of crack depth to length), a sharp crack-tip geometry, and so on. Therefore, the development and improvement of defect depth sizing capabilities by ultrasonic testing (UT) have been required. The phased array UT technique was applied to defect depth sizing for inside inspection of Ni-based alloy welds. Phased array UT was examined on a standard block specimen with side-drilled holes (SDHs). From the experimental results, the performance of linear array probes and a dual matrix array probe was investigated. On the basis of the results, a UT procedure for defect depth sizing was investigated and proposed. The UT procedure was applied to defect depth measurement in a Ni-based alloy weld specimen with electric discharge machined (EDM) notches. These results showed good accuracy of defect depth sizing by phased array UT for inside inspection, clarifying the effectiveness of the UT procedure for defect depth sizing in Ni-based alloy welds. (author)

  17. A Dynamic Object Behavior Model and Implementation Based on Computational Reflection

    Institute of Scientific and Technical Information of China (English)

    HE Cheng-wan; HE Fei; HE Ke-qing

    2005-01-01

    A dynamic object behavior model based on computational reflection is proposed. This model consists of a function level and a meta level; the meta objects in the meta level manage the base objects and behaviors in the function level, including dynamic binding and unbinding of base objects and behaviors. We implement this model with the RoleJava language, our own linguistic extension of the Java language. Meta objects are generated automatically at compile time, which makes the reflection mechanism transparent to programmers. Finally an example applying this model to a banking system is presented.
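
    A hedged Python analogue of the described model follows (the paper's actual implementation is RoleJava): a meta object manages dynamic binding and unbinding of behaviors on a base object, echoing the banking example the abstract mentions. All class and method names are invented for illustration.

```python
# Meta level managing dynamic binding/unbinding of behaviors (illustrative).
class MetaObject:
    """Meta level: manages which behaviors are bound to the base object."""
    def __init__(self, base):
        self.base, self.behaviors = base, {}

    def bind(self, name, func):
        self.behaviors[name] = func            # dynamic binding

    def unbind(self, name):
        self.behaviors.pop(name, None)         # dynamic unbinding

    def invoke(self, name, *args):
        return self.behaviors[name](self.base, *args)

class Account:                                 # function level: the base object
    def __init__(self, balance):
        self.balance = balance

meta = MetaObject(Account(100))
meta.bind("withdraw", lambda acc, amt: setattr(acc, "balance", acc.balance - amt))
meta.invoke("withdraw", 30)
print(meta.base.balance)                       # 70
```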

  18. Memristor-based nanoelectronic computing circuits and architectures

    CERN Document Server

    Vourkas, Ioannis

    2016-01-01

    This book considers the design and development of nanoelectronic computing circuits, systems and architectures focusing particularly on memristors, which represent one of today’s latest technology breakthroughs in nanoelectronics. The book studies, explores, and addresses the related challenges and proposes solutions for the smooth transition from conventional circuit technologies to emerging computing memristive nanotechnologies. Its content spans from fundamental device modeling to emerging storage system architectures and novel circuit design methodologies, targeting advanced non-conventional analog/digital massively parallel computational structures. Several new results on memristor modeling, memristive interconnections, logic circuit design, memory circuit architectures, computer arithmetic systems, simulation software tools, and applications of memristors in computing are presented. High-density memristive data storage combined with memristive circuit-design paradigms and computational tools applied t...

  19. Operating and maintenance experience with computer-based systems in nuclear power plants - A report by the PWG-1 Task Group on Computer-based Systems Important to Safety

    International Nuclear Information System (INIS)

    1998-01-01

    changes for the new installation or the upgrade. Procedures, training, and other practices, as well, may be affected by the failure analysis because administration controls, periodic calibration and surveillance procedures may all be used to provide defence against potential failures. An important input to the failure analysis activities comes from the feedback of operating and maintenance experience. Feedback of operating experience in nuclear power plants has long been recognised as a valuable source for improving system design, procedures or human performance to achieve safety and to prevent recurrence of failures. This is particularly true in the case of complex systems such as computer-based systems. The process of feedback would provide designers with information on systems failures, unforeseen scenarios, or unanalysed configurations. The review of operating experience and the identification of causes of failures is also essential for the regulators in performing their safety assessments. Currently, the NRC reviews the electromagnetic compatibility (EMC), software reliability, and the human-machine interface when it performs a safety evaluation of digital upgrades to ensure that the digital system failures resulting from the identified causes are within the acceptable level of a system's reliability. Operating experience with computer-based system is one of the topics raised in the SESAR report. The CSNI Bureau of the OECD has requested NEA Principal Working Group No. 1 (PWG1) to review this topic. A task group led by Canada was therefore formed within PWG1, including France, Japan, U.K., and U.S.A. to address the related issues. The purpose of this report is to summarise the observations and some findings related to the operating and maintenance experience, based on contributions from France, U.S.A., U.K., Japan, and Canada. Additional information from the review of the open literature is also included. A number of the operational incidents, selected as case

  20. Applying Kitaev's algorithm in an ion trap quantum computer

    International Nuclear Information System (INIS)

    Travaglione, B.; Milburn, G.J.

    2000-01-01

    Full text: Kitaev's algorithm is a method of estimating eigenvalues associated with an operator. Shor's factoring algorithm, which enables a quantum computer to crack RSA encryption codes, is a specific example of Kitaev's algorithm. It has been proposed that the algorithm can also be used to generate eigenstates. We extend this proposal for small quantum systems, identifying the conditions under which the algorithm can successfully generate eigenstates. We then propose an implementation scheme based on an ion trap quantum computer. This scheme allows us to illustrate a simple example, in which the algorithm effectively generates eigenstates

  1. DCE. Future IHEP's computing environment

    International Nuclear Information System (INIS)

    Zheng Guorui; Liu Xiaoling

    1995-01-01

    IHEP's computing environment consists of several different computing environments established on IHEP computer networks, of which the BES environment supporting HEP computing is the main part. In connection with the improvement and extension of the BES environment, the authors outline the development of computing environments from the viewpoint of establishing a high energy physics (HEP) environment. The direction of development toward distributed computing for the IHEP computing environment, based on current trends in distributed computing, is presented.

  2. Cloud Computing and Internet of Things Concepts Applied on Buildings Data Analysis

    Directory of Open Access Journals (Sweden)

    Hebean Florin-Adrian

    2017-12-01

    Full Text Available Initially used and developed for the IT industry, the Cloud computing and Internet of Things concepts are now found in many sectors of activity, the building industry being one of them. They can be described as a global computing, monitoring and analysis network, composed of hardware and software resources, with the ability to allocate and dynamically relocate shared resources in accordance with user requirements. Data analysis and process optimization techniques based on these new concepts are used increasingly in the building industry, especially for optimal operation of building installations and for increasing occupant comfort. The multitude of building data taken from HVAC sensors, from automation and control systems and from the other systems connected to the network is optimally managed by these new analysis techniques. Such analysis techniques can identify and manage issues that arise in the operation of building installations, such as critical alarms, non-functional equipment, occupant comfort issues (for example, upper and lower temperature deviations from the set point) and issues related to equipment maintenance. In this study, a new approach to building control is presented, and a generalized methodology for applying data analysis to building services data is described. This methodology is then demonstrated using two case studies.
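
    One of the analyses the article mentions, flagging upper and lower temperature deviations from the set point in HVAC sensor data, reduces to a few lines; the thresholds and readings below are assumed for the example.

```python
# Illustrative set-point deviation check on HVAC temperature readings.
def setpoint_deviations(readings, setpoint=21.0, band=1.5):
    """readings: list of (timestamp, temperature); returns out-of-band samples."""
    return [(t, temp, temp - setpoint)
            for t, temp in readings
            if abs(temp - setpoint) > band]

readings = [("08:00", 20.6), ("09:00", 23.1), ("10:00", 18.9)]
for t, temp, dev in setpoint_deviations(readings):
    print(f"{t}: {temp} °C deviates {dev:+.1f} °C from set point")
```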

  3. Safety analysis procedures for PHWR

    International Nuclear Information System (INIS)

    Min, Byung Joo; Kim, Hyoung Tae; Yoo, Kun Joong

    2004-03-01

    The methodology of safety analyses for CANDU reactors in Canada, the vendor country, uses a combination of best-estimate physical models and conservative input parameters so as to minimize the uncertainty of the plant behavior predictions. By using conservative input parameters, the results of the safety analyses are assured of meeting the regulatory requirements, such as those on public dose, the integrity of fuel and fuel channels, the integrity of containment and reactor structures, etc. However, there are no comprehensive and systematic procedures for safety analyses of CANDU reactors in Korea. In this regard, the development of safety analysis procedures for CANDU reactors is being conducted, not only to establish a safety analysis system, but also to enhance the quality assurance of the safety assessment. In the first phase of this study, the general procedures for deterministic safety analyses were developed. The general safety procedures cover the specification of the initiating event, selection of the methodology and accident sequences, computer codes, safety analysis procedures, verification of errors and uncertainties, etc. Finally, these general safety analysis procedures were applied to the Large Break Loss Of Coolant Accident (LBLOCA) in the Final Safety Analysis Report (FSAR) for Wolsong units 2, 3, 4.

  4. Procedure Redesign Methods : E3-Control: a redesign methodology for control procedures

    NARCIS (Netherlands)

    Liu, J.; Hofman, W.J.; Tan, Y.H.

    2011-01-01

    This chapter highlights the core research methodology, e3-control, that is applied throughout the ITAIDE project for the purpose of control procedure redesign. We present the key concept of the e3-control methodology and its technical guidelines. Based on the output of this chapter, domain experts

  5. MATLAB-implemented estimation procedure for model-based assessment of hepatic insulin degradation from standard intravenous glucose tolerance test data.

    Science.gov (United States)

    Di Nardo, Francesco; Mengoni, Michele; Morettini, Micaela

    2013-05-01

    The present study provides a novel MATLAB-based parameter estimation procedure for the individual assessment of the hepatic insulin degradation (HID) process from standard frequently-sampled intravenous glucose tolerance test (FSIGTT) data. Direct access to the source code, offered by MATLAB, enabled us to design an optimization procedure based on the alternating use of Gauss-Newton's and Levenberg-Marquardt's algorithms, which assures the full convergence of the process and the containment of computational time. Reliability was tested by direct comparison with the application, in eighteen non-diabetic subjects, of the well-known kinetic analysis software package SAAM II, and by application to different data. Agreement between MATLAB and SAAM II was supported by intraclass correlation coefficients ≥0.73, by no significant differences between corresponding mean parameter estimates and predictions of HID rate, and by consistent residual analysis. Moreover, the MATLAB optimization procedure resulted in a significant 51% reduction of the CV% for the parameter worst-estimated by SAAM II, while keeping the CV% acceptable for all model parameters. The MATLAB-based procedure was therefore suggested as a suitable tool for the individual assessment of the HID process. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
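
    The alternating Gauss-Newton/Levenberg-Marquardt idea can be sketched generically: both steps solve the same normal equations, with Levenberg-Marquardt adding a damping term. The toy mono-exponential model, the fixed damping constant and the even/odd alternation below are illustrative assumptions, not the paper's insulin kinetics model or its switching rule.

```python
# Generic alternation of Gauss-Newton (undamped) and Levenberg-Marquardt
# (damped) steps on a toy least-squares fit (illustrative only).
import numpy as np

def residuals(p, t, y):
    return y - p[0] * np.exp(-p[1] * t)

def jacobian(p, t):
    return np.column_stack([-np.exp(-p[1] * t),             # d r / d p0
                            p[0] * t * np.exp(-p[1] * t)])  # d r / d p1

def fit(t, y, p0, iters=40, lam=1e-2):
    p = np.asarray(p0, float)
    for k in range(iters):
        r, J = residuals(p, t, y), jacobian(p, t)
        damping = 0.0 if k % 2 == 0 else lam   # even: Gauss-Newton, odd: LM
        H = J.T @ J + damping * np.eye(len(p))
        p = p - np.linalg.solve(H, J.T @ r)
        # a production version would adapt lam and test convergence here
    return p

t = np.linspace(0, 10, 50)
y = 3.0 * np.exp(-0.4 * t)
print(fit(t, y, p0=[2.0, 0.5]))   # recovers [3.0, 0.4] on noise-free data
```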

  6. Current Human Reliability Analysis Methods Applied to Computerized Procedures

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring

    2012-06-01

    Computerized procedures (CPs) are an emerging technology within nuclear power plant control rooms. While CPs have been implemented internationally in advanced control rooms, to date no US nuclear power plant has implemented CPs in its main control room (Fink et al., 2009). Yet, CPs are a reality of new plant builds and are an area of considerable interest to existing plants, which see advantages in terms of enhanced ease of use and easier records management by omitting the need for updating hardcopy procedures. The overall intent of this paper is to provide a characterization of human reliability analysis (HRA) issues for computerized procedures. It is beyond the scope of this document to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper serves as a review of current HRA as it may be used for the analysis and review of computerized procedures.

  7. Spintronics-based computing

    CERN Document Server

    Prenat, Guillaume

    2015-01-01

    This book provides a comprehensive introduction to spintronics-based computing for the next generation of ultra-low-power/highly reliable logic, which is widely considered a promising candidate to replace conventional, pure CMOS-based logic. It covers aspects from the device to the system level, including magnetic memory cells, device modeling, hybrid circuit structure, design methodology, CAD tools, and technological integration methods. This book is accessible to a variety of readers; little or no background in magnetism and spin electronics is required to understand its content. The multidisciplinary team of expert authors from circuits, devices, computer architecture, CAD and system design reveal to readers the potential of spintronics nanodevices to reduce power consumption, improve reliability and enable new functionality.

  8. 28 CFR 30.6 - What procedures apply to the selection of programs and activities under these regulations?

    Science.gov (United States)

    2010-07-01

    28 CFR 30.6 (Judicial Administration, 2010-07-01): What procedures apply to the selection of programs and activities under these regulations? Section 30.6, DEPARTMENT OF... consult with local elected officials. (b) Each state that adopts a process shall notify the Attorney...

  9. 25 CFR 900.58 - Do the same accountability and control procedures described above apply to Federal property?

    Science.gov (United States)

    2010-04-01

    25 CFR 900.58 (Indians, 2010-04-01): Do the same accountability and control procedures described above apply to Federal property? Section 900.58, BUREAU OF INDIAN AFFAIRS... Organization Management Systems, Property Management System Standards § 900.58 Do the same accountability and...

  10. 49 CFR 17.6 - What procedures apply to the selection of programs and activities under these regulations?

    Science.gov (United States)

    2010-10-01

    49 CFR 17.6 (Transportation, 2010-10-01): What procedures apply to the selection of programs and activities under these regulations? Section 17.6, Office of the Secretary of Transportation, INTERGOVERNMENTAL REVIEW OF DEPARTMENT OF TRANSPORTATION PROGRAMS AND ACTIVITIES § 17.6 What...

  11. Issues involved in a knowledge-based approach to procedure synthesis

    International Nuclear Information System (INIS)

    Hajek, B.K.; Khartabil, L.F.; Miller, D.W.

    1992-01-01

    Many knowledge-based systems (KBSs) have been built to assist human operators in managing nuclear power plant operating functions, such as monitoring, fault diagnosis, alarm filtering, and procedure management. For procedure management, KBSs have been built to display and track existing written procedures or to dynamically follow procedure execution by monitoring plant data and action execution and suggesting recovery steps. More recent works build KBSs able to synthesize procedures. This paper addresses and examines the main issues related to the implementation of on-line procedure synthesis using KBSs. A KBS for procedure synthesis can provide a more robust and effective procedural plan during accidents. Currently existing procedures for abnormal plant conditions, written as precompiled step sets based on the event and symptom approaches, are inherently not robust because anticipation of all potential plant states and associated plant responses is not possible. Thus, their failure recovery capability is limited to the precompiled set. Procedure synthesis has the potential to overcome these two problems because it does not require such precompilation of large sets of plant states and associated recovery procedures. Other benefits obtained from a complete procedure synthesis system are providing (a) a methodology for off-line procedure verification and (b) a methodology for the eventual automation of plant operations

  12. A Computer-Based, Interactive Videodisc Job Aid and Expert System for Electron Beam Lithography Integration and Diagnostic Procedures.

    Science.gov (United States)

    Stevenson, Kimberly

    This master's thesis describes the development of an expert system and interactive videodisc computer-based instructional job aid used for assisting in the integration of electron beam lithography devices. Comparable to all comprehensive training, expert system and job aid development require a criterion-referenced systems approach treatment to…

  13. Methods for testing the logical structure of plant procedure documents

    International Nuclear Information System (INIS)

    Horne, C.P.; Colley, R.; Fahley, J.M.

    1990-01-01

    This paper describes an ongoing EPRI project to investigate computer-based methods to improve the development, maintenance, and verification of plant operating procedures. The project began as an evaluation of the applicability of structured software analysis methods to operating procedures. It was found that these methods offer benefits if procedures are transformed to a structured representation that makes them amenable to computer analysis. The next task was to investigate methods for transforming procedures into a structured representation. The use of natural language techniques to read and compile the procedure documents appears to be viable for this purpose and supports conformity to guidelines. The final task was to consider possibilities for automated verification methods for procedures. Methods to help verify procedures were defined and their information requirements specified. These methods take the structured representation of procedures as input. The software system being constructed in this project is called PASS, standing for Procedures Analysis Software System

  14. A Computer-Based Simulation of an Acid-Base Titration

    Science.gov (United States)

    Boblick, John M.

    1971-01-01

    Reviews the advantages of computer simulated environments for experiments, referring in particular to acid-base titrations. Includes pre-lab instructions and a sample computer printout of a student's use of an acid-base simulation. Ten references. (PR)

  15. The Research of Computer Aided Farm Machinery Designing Method Based on Ergonomics

    Science.gov (United States)

    Gao, Xiyin; Li, Xinling; Song, Qiang; Zheng, Ying

    With the development of the agricultural economy, the range of farm machinery products is gradually increasing, and ergonomics questions are becoming more and more prominent. The widespread application of computer-aided machinery design makes farm machinery design intuitive, flexible and convenient. At present, because existing computer-aided ergonomics software lacks a human body database suited to farm machinery design in China, such designs show deviations in ergonomics analysis. This article proposes using the open database interface in CATIA to establish a human body database aimed at farm machinery design; reading the human body data into the ergonomics module of CATIA produces a virtual human model for practical application, and the human posture analysis and human activity analysis modules can then be used to analyse the ergonomics of farm machinery. In this way, a computer-aided farm machinery design method based on ergonomics can be realized.

  16. Computer-assisted preoperative simulation for positioning and fixation of plate in 2-stage procedure combining maxillary advancement by distraction technique and mandibular setback surgery.

    Science.gov (United States)

    Suenaga, Hideyuki; Taniguchi, Asako; Yonenaga, Kazumichi; Hoshi, Kazuto; Takato, Tsuyoshi

    2016-01-01

    Computer-assisted preoperative simulation is employed to plan and interact with 3D images during orthognathic procedures, and is useful for positioning and fixing the maxilla with a plate. We report a case of maxillary retrusion due to bilateral cleft lip and palate, in which a 2-stage orthognathic procedure (maxillary advancement by distraction technique and mandibular setback surgery) was performed following computer-assisted preoperative simulation planning to achieve positioning and fixation of the plate. High accuracy was achieved in the present case. A 21-year-old male patient presented to our department with a complaint of maxillary retrusion following bilateral cleft lip and palate. Computer-assisted preoperative simulation of the 2-stage orthognathic procedure using the distraction technique and mandibular setback surgery was planned. The preoperative planning of the procedure resulted in good aesthetic outcomes; the error of the maxillary position was less than 1 mm. The implementation of computer-assisted preoperative simulation for the positioning and fixation of the plate in this 2-stage orthognathic procedure yielded good results. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.

  17. Validation procedures of software applied in nuclear instruments. Proceedings of a technical meeting

    International Nuclear Information System (INIS)

    2007-09-01

    The IAEA has supported the availability of well functioning nuclear instruments in Member States for more than three decades. Some older or aged instruments are still in use and in good working condition; however, those instruments may not always meet the modern software requirements of the end user. Therefore, Member States, mostly those with emerging economies, modernize or refurbish such instruments to meet end-user demands. New, advanced software is applied not only to new instrumentation but often also to new and improved applications of modernized and/or refurbished instruments in many Member States, for which in a few cases the IAEA also provided support. Modern software applied in nuclear instrumentation plays a key role in safe operation and in the execution of commands in a user-friendly manner. Correct data handling and transfer have to be ensured, and additional features such as data visualization and interfacing to a PC for control and data storage are often included. To finalize the task, where new instrumentation that is not commercially available is used, or where aged instruments are modernized or refurbished, the applied software has to be verified and validated. A Technical Meeting on 'Validation Procedures of Software Applied in Nuclear Instruments' was organized in Vienna, 20-23 November 2006, to discuss the verification and validation process of software applied to the operation and use of nuclear instruments. The presentations at the technical meeting included valuable information, which has been compiled and summarized in this publication and should be useful for technical staff in Member States when modernizing or refurbishing nuclear instruments. Twenty-two experts in the field of modernization/refurbishment of nuclear instruments, as well as users of applied software, presented their latest results, and discussion sessions followed the presentations. This publication is the outcome of the deliberations during the meeting

  18. Subsea HIPPS design procedure

    International Nuclear Information System (INIS)

    Aaroe, R.; Lund, B.F.; Onshus, T.

    1995-01-01

    The paper is based on a feasibility study investigating the possibilities of using a HIPPS (High Integrity Pressure Protection System) to protect a subsea pipeline that is not rated for full wellhead shut-in pressure. The study was called the Subsea OPPS Feasibility Study, and was performed by SINTEF, Norway. Here, OPPS is an acronym for Overpressure Pipeline Protection System. A design procedure for a subsea HIPPS is described, based on the experience and knowledge gained through the 'Subsea OPPS Feasibility Study'. Before a subsea HIPPS can be applied, its technical feasibility, reliability and profitability must be demonstrated. The subsea HIPPS design procedure will help to organize and plan the design activities both with respect to development and verification of a subsea HIPPS. The paper also gives examples of how some of the discussed design steps were performed in the Subsea OPPS Feasibility Study. Finally, further work required to apply a subsea HIPPS is discussed

  19. Applying Modeling Tools to Ground System Procedures

    Science.gov (United States)

    Di Pasquale, Peter

    2012-01-01

    As part of a long-term effort to revitalize the Ground Systems (GS) Engineering Section practices, Systems Modeling Language (SysML) and Business Process Model and Notation (BPMN) have been used to model existing GS products and the procedures GS engineers use to produce them.

  20. Long range Debye-Hückel correction for computation of grid-based electrostatic forces between biomacromolecules

    International Nuclear Information System (INIS)

    Mereghetti, Paolo; Martinez, Michael; Wade, Rebecca C

    2014-01-01

    Brownian dynamics (BD) simulations can be used to study very large molecular systems, such as models of the intracellular environment, using atomic-detail structures. Such simulations require strategies to contain the computational costs, especially for the computation of interaction forces and energies. A common approach is to compute interaction forces between macromolecules by precomputing their interaction potentials on three-dimensional discretized grids. For long-range interactions, such as electrostatics, grid-based methods are subject to finite size errors. We describe here the implementation of a Debye-Hückel correction to the grid-based electrostatic potential used in the SDA BD simulation software that was applied to simulate solutions of bovine serum albumin and of hen egg white lysozyme. We found that the inclusion of the long-range electrostatic correction increased the accuracy of both the protein-protein interaction profiles and the protein diffusion coefficients at low ionic strength. An advantage of this method is the low additional computational cost required to treat long-range electrostatic interactions in large biomacromolecular systems. Moreover, the implementation described here for BD simulations of protein solutions can also be applied in implicit solvent molecular dynamics simulations that make use of gridded interaction potentials
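
    The idea behind the correction can be sketched in a few lines: inside the precomputed grid the tabulated potential is used, while beyond the grid edge the molecule is treated as a screened monopole. The sketch below is a toy illustration of that hand-off, not the SDA implementation; the charge, ionic strength, and grid-lookup function are made-up placeholders.

```python
import numpy as np

def debye_length_nm(ionic_strength_M):
    """Approximate Debye length in water at 298 K for a 1:1 electrolyte."""
    return 0.304 / np.sqrt(ionic_strength_M)

def screened_monopole(q, r_nm, ionic_strength_M, eps_r=78.5):
    """Debye-Hueckel screened Coulomb potential (arbitrary units)."""
    lam = debye_length_nm(ionic_strength_M)
    return q / (eps_r * r_nm) * np.exp(-r_nm / lam)

def potential(r_nm, grid_edge_nm, grid_lookup, q, ionic_strength_M):
    """Grid value inside the grid extent, analytic long-range tail outside."""
    if r_nm <= grid_edge_nm:
        return grid_lookup(r_nm)          # interpolated precomputed potential
    return screened_monopole(q, r_nm, ionic_strength_M)

# e.g. a protein with net charge -10 e at 50 mM ionic strength, 8 nm away
print(screened_monopole(-10.0, 8.0, 0.05))
```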

  1. Development and application of the automated Monte Carlo biasing procedure in SAS4

    International Nuclear Information System (INIS)

    Tang, J.S.; Broadhead, B.L.

    1993-01-01

    An automated approach for biasing Monte Carlo shielding calculations is described. In particular, adjoint fluxes from a one-dimensional discrete-ordinates calculation are used to generate biasing parameters for a three-dimensional Monte Carlo calculation. The automated procedure consisting of cross-section processing, adjoint flux determination, biasing parameter generation, and the initiation of a MORSE-SGC/S Monte Carlo calculation has been implemented in the SAS4 module of the SCALE computer code system. The automated procedure has been used extensively in the investigation of both computational and experimental benchmarks for the NEACRP working group on shielding assessment of transportation packages. The results of these studies indicate that with the automated biasing procedure, Monte Carlo shielding calculations of spent fuel casks can be easily performed with minimum effort and that accurate results can be obtained at reasonable computing cost. The systematic biasing approach described in this paper can also be applied to other similar shielding problems
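
    The essence of adjoint-based biasing is to sample preferentially from important source regions while carrying compensating statistical weights, so that the tally stays unbiased. The toy sketch below illustrates this with a four-region source and a fake detector response; it is not the SAS4/MORSE implementation, and all numbers are invented for demonstration.

```python
import numpy as np

rng = np.random.default_rng(1)

p_true = np.array([0.25, 0.25, 0.25, 0.25])   # physical source distribution
adjoint = np.array([0.02, 0.1, 0.5, 2.0])     # importance (adjoint flux) per region

p_biased = p_true * adjoint                   # sample important regions more often
p_biased /= p_biased.sum()
weights = p_true / p_biased                   # weight correction keeps the mean unbiased

# fake per-region contribution to the detector tally, for demonstration only
response = np.array([0.001, 0.01, 0.05, 0.2])

n = 100_000
regions = rng.choice(4, size=n, p=p_biased)
tally = weights[regions] * response[regions]
print("biased estimate :", tally.mean())
print("analog answer   :", (p_true * response).sum())
```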

  2. Recent developments in computer vision-based analytical chemistry: A tutorial review.

    Science.gov (United States)

    Capitán-Vallvey, Luis Fermín; López-Ruiz, Nuria; Martínez-Olmos, Antonio; Erenas, Miguel M; Palma, Alberto J

    2015-10-29

    Chemical analysis based on colour changes recorded with imaging devices is gaining increasing interest. This is due to several significant advantages, such as simplicity of use, and the fact that it is easily combinable with portable and widely distributed imaging devices, resulting in friendly analytical procedures in many areas that demand out-of-lab applications for in situ and real-time monitoring. This tutorial review covers computer vision-based analytical chemistry (CVAC) procedures and systems from 2005 to 2015, a period of time when 87.5% of the papers on this topic were published. The background regarding colour spaces and recent analytical system architectures of interest in analytical chemistry is presented in the form of a tutorial. Moreover, issues regarding images, such as the influence of illuminants, and the most relevant techniques for processing and analysing digital images are addressed. Some of the most relevant applications are then detailed, highlighting their main characteristics. Finally, our opinion about future perspectives is discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. The Geospatial Data Cloud: An Implementation of Applying Cloud Computing in Geosciences

    Directory of Open Access Journals (Sweden)

    Xuezhi Wang

    2014-11-01

    The rapid growth in the volume of remote sensing data and its increasing computational requirements bring huge challenges for researchers, as traditional systems cannot adequately satisfy the huge demand for service. Cloud computing has the advantage of high scalability and reliability, which can provide firm technical support. This paper proposes a highly scalable geospatial cloud platform named the Geospatial Data Cloud, which is constructed based on cloud computing. The architecture of the platform is first introduced, and then two subsystems, the cloud-based data management platform and the cloud-based data processing platform, are described. ––– This paper was presented at the First Scientific Data Conference on Scientific Research, Big Data, and Data Science, organized by CODATA-China and held in Beijing on 24-25 February, 2014.

  4. Challenging the in-vivo assessment of biomechanical properties of the uterine cervix: A critical analysis of ultrasound based quasi-static procedures.

    Science.gov (United States)

    Maurer, M M; Badir, S; Pensalfini, M; Bajka, M; Abitabile, P; Zimmermann, R; Mazza, E

    2015-06-25

    Measuring the stiffness of the uterine cervix might be useful in the prediction of preterm delivery, a still unsolved health issue of global dimensions. Recently, a number of clinical studies have addressed this topic, proposing quantitative methods for the assessment of the mechanical properties of the cervix. Quasi-static elastography, maximum compressibility using ultrasound, and aspiration tests have been applied for this purpose. The results obtained with the different methods seem to provide contradictory information about the physiologic development of cervical stiffness during pregnancy. Simulations and experiments were performed in order to rationalize the findings obtained with ultrasound-based, quasi-static procedures. The experimental and computational results clearly illustrate that standardization of quasi-static elastography leads to repeatable strain values, but for different loading forces. Since force cannot be controlled, the current approach does not allow the distinction between a globally soft and a globally stiff cervix. It is further shown that introducing a reference elastomer into the elastography measurement might overcome the problem of force standardization, but a careful mechanical analysis is required to obtain reliable stiffness values for cervical tissue. In contrast, the maximum compressibility procedure leads to a repeatable, semi-quantitative assessment of cervical consistency, due to the nonlinear nature of the mechanical behavior of cervical tissue. The evolution of cervical stiffness in pregnancy obtained with this procedure is in line with data from aspiration tests. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Development of a 3-dimensional flow analysis procedure for axial pump impellers

    International Nuclear Information System (INIS)

    Kim, Min Hwan; Kim, Jong In; Park, Jin Seok; Huh, Houng Huh; Chang, Moon Hee

    1999-06-01

    A fluid dynamic analysis procedure was developed using the three-dimensional solid model of an axial pump impeller which was theoretically designed using I-DEAS CAD/CAM/CAE software. The CFD software FLUENT was used in the flow field analysis. The steady-state flow regime in the MCP impeller and diffuser was simulated using the developed procedure. The calculation results were analyzed to confirm whether the design requirements were properly implemented in the impeller model. The validity of the developed procedure was demonstrated by comparing the calculation results with the available experimental data. The pump performance at the design point could be effectively predicted using the developed procedure. The computed velocity distributions showed good agreement with the experimental data except for the regions near the wall; the computed head, however, was over-predicted compared with the experiment. The design period and cost required for the development of an axial pump impeller can be significantly reduced by applying the proposed methodology. (author). 7 refs., 2 tabs

  6. Automation of reliability evaluation procedures through CARE - The computer-aided reliability estimation program.

    Science.gov (United States)

    Mathur, F. P.

    1972-01-01

    Description of an on-line interactive computer program called CARE (Computer-Aided Reliability Estimation) which can model self-repair and fault-tolerant organizations and perform certain other functions. Essentially, CARE consists of a repository of mathematical equations defining the various basic redundancy schemes. These equations, under program control, are interrelated to generate the desired mathematical model to fit the architecture of the system under evaluation. The mathematical model is then supplied with ground instances of its variables and evaluated to generate values for the reliability-theoretic functions applied to the model.
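
    As an illustration of the kind of redundancy equations such a repository interrelates, the sketch below evaluates two textbook reliability models (a simplex unit and an m-of-n majority system under constant failure rates); these are standard formulas, not CARE's actual model library.

```python
from math import comb, exp

def r_simplex(lam, t):
    """Reliability of one unit with constant failure rate lam (per hour)."""
    return exp(-lam * t)

def r_m_of_n(m, n, lam, t):
    """System survives if at least m of n identical units survive."""
    r = r_simplex(lam, t)
    return sum(comb(n, k) * r**k * (1 - r)**(n - k) for k in range(m, n + 1))

lam, t = 1e-4, 1000.0                                # failure rate, mission time
print("simplex:", round(r_simplex(lam, t), 5))
print("TMR    :", round(r_m_of_n(2, 3, lam, t), 5))  # triple modular redundancy
```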

  7. Learning Mathematics by Designing, Programming, and Investigating with Interactive, Dynamic Computer-Based Objects

    Science.gov (United States)

    Marshall, Neil; Buteau, Chantal

    2014-01-01

    As part of their undergraduate mathematics curriculum, students at Brock University learn to create and use computer-based tools with dynamic, visual interfaces, called Exploratory Objects, developed for the purpose of conducting pure or applied mathematical investigations. A student's Development Process Model of creating and using an Exploratory…

  8. Expert-systems and computer-based industrial systems

    International Nuclear Information System (INIS)

    Terrien, J.F.

    1987-01-01

    Framatome makes wide use of expert systems, computer-assisted engineering, production management and personnel training. It has set up separate business units and subsidiaries and also participates in other companies which have the relevant expertise. Five examples of the products and services available in these areas are discussed: applied artificial intelligence and expert systems, integrated computer-aided design and engineering, structural analysis, computer-related products and services, and document management systems. The structure of the companies involved and the work they are doing are discussed. (UK)

  9. Experimental designs for autoregressive models applied to industrial maintenance

    International Nuclear Information System (INIS)

    Amo-Salas, M.; López-Fidalgo, J.; Pedregal, D.J.

    2015-01-01

    Some time series applications require data which are either expensive or technically difficult to obtain. In such cases, scheduling the points in time at which the information should be collected is of paramount importance in order to optimize the available resources. In this paper time series models are studied from a new perspective, consisting in the use of the Optimal Experimental Design setup to obtain the best times to take measurements, with the principal aim of saving costs or discarding useless information. The model and the covariance function are expressed in an explicit form to apply the usual techniques of Optimal Experimental Design. Optimal designs for various approaches are computed and their efficiencies are compared. The methods are demonstrated in an application to the industrial maintenance of a critical piece of equipment at a petrochemical plant. This simple model allows explicit calculations, in order to show openly the procedure for finding the correlation structure needed for computing the optimal experimental design. In this sense the techniques used in this paper to compute optimal designs may be transferred to other situations following the ideas of the paper, but taking into account the increasing difficulty of the procedure for more complex models. - Highlights: • Optimal experimental design theory is applied to AR models to reduce costs. • The first observation has an important impact on any optimal design. • Either a lack of precision or small starting observations calls for large measurement times. • Reasonable optimal times were obtained by relaxing the efficiency slightly. • Optimal designs were computed in a predictive maintenance context

  10. Bioinspired computation in combinatorial optimization: algorithms and their computational complexity

    DEFF Research Database (Denmark)

    Neumann, Frank; Witt, Carsten

    2012-01-01

    Bioinspired computation methods, such as evolutionary algorithms and ant colony optimization, are being applied successfully to complex engineering and combinatorial optimization problems, and it is very important that we understand the computational complexity of these algorithms. This tutorial...... problems. Classical single objective optimization is examined first. They then investigate the computational complexity of bioinspired computation applied to multiobjective variants of the considered combinatorial optimization problems, and in particular they show how multiobjective optimization can help...... to speed up bioinspired computation for single-objective optimization problems. The tutorial is based on a book written by the authors with the same title. Further information about the book can be found at www.bioinspiredcomputation.com....
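
    A minimal example of the kind of algorithm these runtime analyses start from is the (1+1) evolutionary algorithm on the OneMax problem, whose expected optimization time is known to be Theta(n log n). The sketch below is a generic textbook implementation, not code from the book.

```python
import random

def one_plus_one_ea(n, seed=0):
    """(1+1) EA on OneMax: maximize the number of one-bits in a length-n string."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n)]
    steps = 0
    while sum(x) < n:
        y = [b ^ (rng.random() < 1.0 / n) for b in x]  # flip each bit w.p. 1/n
        if sum(y) >= sum(x):                           # accept if not worse
            x = y
        steps += 1
    return steps

print("steps to optimum for n=100:", one_plus_one_ea(100))
```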

  11. Characterization of breast tissue using energy-dispersive X-ray diffraction computed tomography

    International Nuclear Information System (INIS)

    Pani, S.; Cook, E.J.; Horrocks, J.A.; Jones, J.L.; Speller, R.D.

    2010-01-01

    A method for sample characterization using energy-dispersive X-ray diffraction computed tomography (EDXRDCT) is presented. The procedures for extracting diffraction patterns from the data and the corrections applied are discussed. The procedures were applied to the characterization of breast tissue samples, 6 mm in diameter. Comparison with histological sections of the samples confirmed the possibility of grouping the patterns into five families, corresponding to adipose tissue, fibrosis, poorly differentiated cancer, well differentiated cancer and benign tumour.

  12. Constructing an exposure chart: step by step (based on standard procedures)

    International Nuclear Information System (INIS)

    David, Jocelyn L; Cansino, Percedita T.; Taguibao, Angileo P.

    2000-01-01

    An exposure chart is very important in conducting radiographic inspection of materials. By using an accurate exposure chart, an inspector is able to avoid a trial-and-error determination of the correct time to expose a specimen, thereby producing a radiograph that has an acceptable density based on a standard. The chart gives the following information: x-ray machine model and brand, distance of the x-ray tube from the film, type and thickness of intensifying screens, film type, radiograph density, and film processing conditions. Methods of preparing an exposure chart are available in existing radiographic testing manuals; here they are presented as step-by-step procedures covering the actual laboratory set-up, data gathering, computations, and the transformation of the derived data into a characteristic curve and an exposure chart
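
    Once a chart has been constructed, routine corrections are applied with standard radiographic-testing relationships. The sketch below illustrates two of them, the inverse-square correction for a changed source-to-film distance and the half-value-layer rule for a changed specimen thickness; the numbers are invented for demonstration.

```python
def time_for_new_distance(t_old_s, d_old_mm, d_new_mm):
    """Inverse-square law: exposure time scales with distance squared."""
    return t_old_s * (d_new_mm / d_old_mm) ** 2

def time_for_new_thickness(t_old_s, delta_thickness_mm, hvl_mm):
    """Each extra half-value layer of material doubles the required time."""
    return t_old_s * 2.0 ** (delta_thickness_mm / hvl_mm)

print(time_for_new_distance(60.0, 700.0, 1000.0))   # 60 s at 700 mm -> ~122 s
print(time_for_new_thickness(60.0, 10.0, 12.0))     # 10 mm thicker  -> ~107 s
```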

  13. Efficient Procedure to Compute the Microcanonical Volume of Initial Conditions that Lead to Escape Trajectories from a Multidimensional Potential Well

    NARCIS (Netherlands)

    Waalkens, Holger; Burbanks, Andrew; Wiggins, Stephen

    2005-01-01

    A procedure is presented for computing the phase space volume of initial conditions for trajectories that escape or 'react' from a multidimensional potential well. The procedure combines a phase space transition state theory, which allows one to construct dividing surfaces that are free of local

  14. A full-spectral Bayesian reconstruction approach based on the material decomposition model applied in dual-energy computed tomography

    International Nuclear Information System (INIS)

    Cai, C.; Rodet, T.; Mohammad-Djafari, A.; Legoupil, S.

    2013-01-01

    Purpose: Dual-energy computed tomography (DECT) makes it possible to get two fractions of basis materials without segmentation: one is the soft-tissue-equivalent water fraction and the other is the hard-matter-equivalent bone fraction. Practical DECT measurements are usually obtained with polychromatic x-ray beams. Existing reconstruction approaches based on linear forward models that do not account for beam polychromaticity fail to estimate the correct decomposition fractions and result in beam-hardening artifacts (BHA). The existing BHA correction approaches either need to refer to calibration measurements or suffer from the noise amplification caused by the negative-log preprocessing and the ill-conditioned water and bone separation problem. To overcome these problems, statistical DECT reconstruction approaches based on nonlinear forward models that account for the beam polychromaticity show great potential for giving accurate fraction images. Methods: This work proposes a full-spectral Bayesian reconstruction approach which allows the reconstruction of high-quality fraction images from ordinary polychromatic measurements. This approach is based on a Gaussian noise model with unknown variance assigned directly to the projections without taking the negative log. Referring to Bayesian inference, the decomposition fractions and observation variance are estimated by using the joint maximum a posteriori (MAP) estimation method. Subject to an adaptive prior model assigned to the variance, the joint estimation problem is then simplified into a single estimation problem; it transforms the joint MAP estimation problem into a minimization problem with a nonquadratic cost function. To solve it, the use of a monotone conjugate gradient algorithm with suboptimal descent steps is proposed. Results: The performance of the proposed approach is analyzed with both simulated and experimental data. The results show that the proposed Bayesian approach is robust to noise and materials. It is also

  15. Modal-pushover-based ground-motion scaling procedure

    Science.gov (United States)

    Kalkan, Erol; Chopra, Anil K.

    2011-01-01

    Earthquake engineering increasingly uses nonlinear response history analysis (RHA) to demonstrate the performance of structures. This rigorous method of analysis requires the selection and scaling of ground motions appropriate to design hazard levels. This paper presents a modal-pushover-based scaling (MPS) procedure to scale ground motions for use in nonlinear RHA of buildings. In the MPS method, the ground motions are scaled to match, to a specified tolerance, a target value of the inelastic deformation of the first-mode inelastic single-degree-of-freedom (SDF) system whose properties are determined by first-mode pushover analysis. Appropriate for first-mode dominated structures, this approach is extended to structures with significant contributions of higher modes by considering the elastic deformation of second-mode SDF systems in selecting a subset of the scaled ground motions. Based on results presented for three actual buildings (4, 6, and 13 stories), the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.
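
    The core of the MPS idea can be sketched as a one-dimensional search: scale the ground motion until the peak deformation of an inelastic single-degree-of-freedom system matches the target. The sketch below uses a simplified elastic-perfectly-plastic SDF system, a crude explicit integrator, and a synthetic accelerogram; it is a toy stand-in for, not a reproduction of, the published procedure.

```python
import numpy as np

def peak_epp_deformation(ag, dt, wn, zeta, uy, scale):
    """Peak displacement of an elastic-perfectly-plastic SDF system (unit mass)."""
    k, c = wn**2, 2.0 * zeta * wn
    u = v = fs = 0.0
    fy, peak = k * uy, 0.0
    for a_g in ag:                          # crude semi-implicit Euler stepping
        a = -scale * a_g - c * v - fs
        v += a * dt
        du = v * dt
        u += du
        fs = min(fy, max(-fy, fs + k * du))  # elastoplastic restoring force
        peak = max(peak, abs(u))
    return peak

def mps_scale(ag, dt, wn, zeta, uy, target, lo=0.01, hi=10.0):
    """Bisect on the scale factor so the peak deformation matches the target."""
    for _ in range(40):
        mid = 0.5 * (lo + hi)
        if peak_epp_deformation(ag, dt, wn, zeta, uy, mid) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

rng = np.random.default_rng(0)
ag = 0.5 * rng.standard_normal(3000)        # synthetic accelerogram, arbitrary units
print(mps_scale(ag, dt=0.01, wn=2 * np.pi, zeta=0.05, uy=0.02, target=0.05))
```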

  16. Development of a testing methodology for computerized procedure system based on JUnit framework and MFM

    International Nuclear Information System (INIS)

    Qin, Wei

    2004-02-01

    Paper-based procedures (PBP) and computerized procedure systems (CPS) are studied to demonstrate that it is necessary to develop CPS for nuclear power plant (NPP) instrumentation and control (I and C) systems. A computerized procedure system is actually a software system, and all the desired and undesired properties of a software system can be described and evaluated as software qualities. Generally, software qualities can be categorized into product quality and process quality; in order to achieve product quality, the process quality of a software system should also be considered and achieved. Characteristics of CPS are described to analyse the product and process of an example CPS, ImPRO. At the same time, several main product and process issues are analysed from the verification and validation (V and V) point of view. It is concluded and suggested that V and V activities can also be regarded as a software development process; this point of view is then applied to the V and V activities of ImPRO as a systematic approach to testing ImPRO. To support and realize this approach, suitable testing technologies and testing strategies are suggested based on the JUnit framework and Multi-level Flow Modeling (MFM), as sketched below.
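
    A JUnit-style unit test of procedure logic might look like the following; the sketch uses Python's unittest (for consistency with the other examples in this collection) and a hypothetical step-transition function standing in for a computerized-procedure engine such as ImPRO's.

```python
import unittest

def next_step(current, condition_met):
    """Toy procedure logic: branch on a plant condition at each step (hypothetical)."""
    table = {("check_pressure", True): "open_valve",
             ("check_pressure", False): "start_pump"}
    return table[(current, condition_met)]

class ProcedureStepTest(unittest.TestCase):
    def test_branch_when_condition_holds(self):
        self.assertEqual(next_step("check_pressure", True), "open_valve")

    def test_branch_when_condition_fails(self):
        self.assertEqual(next_step("check_pressure", False), "start_pump")

if __name__ == "__main__":
    unittest.main()
```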

  17. Employing subgoals in computer programming education

    Science.gov (United States)

    Margulieux, Lauren E.; Catrambone, Richard; Guzdial, Mark

    2016-01-01

    The rapid integration of technology into our professional and personal lives has left many education systems ill-equipped to deal with the influx of people seeking computing education. To improve computing education, we are applying techniques that have been developed for other procedural fields. The present study applied such a technique, subgoal labeled worked examples, to explore whether it would improve programming instruction. The first two experiments, conducted in a laboratory, suggest that the intervention improves undergraduate learners' problem-solving performance and affects how learners approach problem-solving. The third experiment demonstrates that the intervention has similar, and perhaps stronger, effects in an online learning environment with in-service K-12 teachers who want to become qualified to teach computing courses. By implementing this subgoal intervention as a tool for educators to teach themselves and their students, education systems could improve computing education and better prepare learners for an increasingly technical world.

  18. CARVEDILOL POPULATION PHARMACOKINETIC ANALYSIS – APPLIED VALIDATION PROCEDURE

    Directory of Open Access Journals (Sweden)

    Aleksandra Catić-Đorđević

    2013-09-01

    Carvedilol is a nonselective beta blocker/alpha-1 blocker used for the treatment of essential hypertension, chronic stable angina, unstable angina and ischemic left ventricular dysfunction. The aim of this study was to describe a carvedilol population pharmacokinetic (PK) analysis as well as the validation of the analytical procedure, which is an important step in this approach. In contemporary clinical practice, population PK analysis is often more important than the standard PK approach in setting up a mathematical model that describes the PK parameters, since it includes variables of particular importance for a drug's pharmacokinetics, such as sex, body mass, dosage, pharmaceutical form, pathophysiological state, diseases associated with the organism, or the presence of a specific polymorphism in an isoenzyme important for the biotransformation of the drug. One of the most frequently used approaches in population PK analysis is nonlinear mixed-effects modeling (NONMEM). The analytical method used in the data collection period is of great importance for the implementation of a population PK analysis of carvedilol, in order to obtain reliable data that can be useful in clinical practice. High performance liquid chromatography (HPLC) analysis of carvedilol is used to confirm the identity of the drug, to provide quantitative results, and to monitor the efficacy of the therapy. Analytical procedures used in other studies could not be fully implemented in our research, as it was necessary to perform certain modifications and a validation of the method with the aim of using the obtained results for the purposes of a population pharmacokinetic analysis. The validation process is the logical terminal phase of analytical procedure development that ensures the applicability of the procedure itself. The goal of validation is to ensure the consistency of the method and the accuracy of results, and to confirm the selection of the analytical method for a given sample

  19. Determining procedures for simulation-based training in radiology: a nationwide needs assessment.

    Science.gov (United States)

    Nayahangan, Leizl Joy; Nielsen, Kristina Rue; Albrecht-Beste, Elisabeth; Bachmann Nielsen, Michael; Paltved, Charlotte; Lindorff-Larsen, Karen Gilboe; Nielsen, Bjørn Ulrik; Konge, Lars

    2018-01-09

    New training modalities such as simulation are widely accepted in radiology; however, the development of effective simulation-based training programs is challenging, as they are often unstructured and based on convenience or coincidence. The study objective was to perform a nationwide needs assessment to identify and prioritize technical procedures that should be included in a simulation-based curriculum. A needs assessment using the Delphi method was completed among 91 key leaders in radiology. Round 1 identified technical procedures that radiologists should learn. Round 2 explored the frequency of each procedure, the number of radiologists performing it, the risk and/or discomfort for patients, and the feasibility of simulation. Round 3 was the elimination and prioritization of procedures. Response rates were 67%, 70% and 66%, respectively. In Round 1, 22 technical procedures were included. Round 2 resulted in a pre-prioritization of procedures. In Round 3, 13 procedures were included in the final prioritized list. The three most highly prioritized procedures were ultrasound-guided (US) histological biopsy and fine-needle aspiration, US-guided needle puncture and catheter drainage, and basic abdominal ultrasound. A needs assessment identified and prioritized 13 technical procedures to include in a simulation-based curriculum, and the list may be used as a guide for the development of training programs. • Simulation-based training can supplement training on patients in radiology. • Development of simulation-based training should follow a structured approach. • The CAMES Needs Assessment Formula explores needs for simulation training. • A national Delphi study identified and prioritized procedures suitable for simulation training. • The prioritized list serves as a guide for development of courses in radiology.

  20. On a problematic procedure to manipulate response biases in recognition experiments: the case of "implied" base rates.

    Science.gov (United States)

    Bröder, Arndt; Malejka, Simone

    2017-07-01

    The experimental manipulation of response biases in recognition-memory tests is an important means for testing recognition models and for estimating their parameters. The textbook manipulations for binary-response formats either vary the payoff scheme or the base rate of targets in the recognition test, with the latter being the more frequently applied procedure. However, some published studies reverted to implying different base rates by instruction rather than actually changing them. Aside from unnecessarily deceiving participants, this procedure may lead to cognitive conflicts that prompt response strategies unknown to the experimenter. To test our objection, implied base rates were compared to actual base rates in a recognition experiment followed by a post-experimental interview to assess participants' response strategies. The behavioural data show that recognition-memory performance was estimated to be lower in the implied base-rate condition. The interview data demonstrate that participants used various second-order response strategies that jeopardise the interpretability of the recognition data. We thus advise researchers against substituting actual base rates with implied base rates.

  1. Computer-based and web-based radiation safety training

    Energy Technology Data Exchange (ETDEWEB)

    Owen, C., LLNL

    1998-03-01

    The traditional approach to delivering radiation safety training has been to provide a stand-up lecture on the topic, with the possible aid of video, and to repeat the same material periodically. New approaches to meeting training requirements are needed to address the advent of flexible work hours and telecommuting, and to better accommodate individuals learning at their own pace. Computer-based and web-based radiation safety training can provide this alternative. Computer-based and web-based training is an interactive form of learning that the student controls, resulting in enhanced and focused learning at a time most often chosen by the student.

  2. Applying analytic hierarchy process to assess healthcare-oriented cloud computing service systems.

    Science.gov (United States)

    Liao, Wen-Hwa; Qiu, Wan-Li

    2016-01-01

    Numerous differences exist between the healthcare industry and other industries. Difficulties in the business operation of the healthcare industry have continually increased because of the volatility and importance of health care, changes to and requirements of health insurance policies, and the statuses of healthcare providers, which are typically considered not-for-profit organizations. Moreover, because of the financial risks associated with constant changes in healthcare payment methods and constantly evolving information technology, healthcare organizations must continually adjust their business operation objectives; therefore, cloud computing presents both a challenge and an opportunity. As a response to aging populations and the prevalence of the Internet in fast-paced contemporary societies, cloud computing can be used to facilitate the task of balancing the quality and costs of health care. To evaluate cloud computing service systems for use in health care, providing decision makers with a comprehensive assessment method for prioritizing decision-making factors is highly beneficial. Hence, this study applied the analytic hierarchy process, compared items related to cloud computing and health care, executed a questionnaire survey, and then classified the critical factors influencing healthcare cloud computing service systems on the basis of statistical analyses of the questionnaire results. The results indicate that the primary factor affecting the design or implementation of optimal cloud computing healthcare service systems is cost effectiveness, with the secondary factors being practical considerations such as software design and system architecture.
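
    The AHP computation itself is standard: priority weights are taken from the principal eigenvector of the pairwise-comparison matrix, and a consistency ratio checks the judgments. The sketch below shows the textbook calculation on an invented 3x3 comparison matrix; it is not tied to the questionnaire data of the study.

```python
import numpy as np

A = np.array([[1.0, 3.0, 5.0],       # hypothetical judgments: cost effectiveness
              [1/3, 1.0, 2.0],       # vs software design
              [1/5, 1/2, 1.0]])      # vs system architecture

vals, vecs = np.linalg.eig(A)
i = int(np.argmax(vals.real))        # principal eigenvalue/eigenvector
w = np.abs(vecs[:, i].real)
w /= w.sum()                         # priority weights of the three criteria

n = A.shape[0]
ci = (vals.real[i] - n) / (n - 1)    # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index
print("weights:", np.round(w, 3), " CR:", round(ci / ri, 3))  # CR < 0.1 is acceptable
```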

  3. Simplified method of computation for fatigue crack growth

    International Nuclear Information System (INIS)

    Stahlberg, R.

    1978-01-01

    A procedure is described for drastically reducing the computation time in calculating crack growth for variable-amplitude fatigue loading when the loading sequence is periodic. In the proposed procedure, the crack growth r per loading period is approximated as a smooth function and its reciprocal is integrated, rather than summing crack growth cycle by cycle. The savings in computation time result because only a few pointwise values of r must be computed to generate an accurate interpolation function for numerical integration. Further time savings can be achieved by selecting the stress intensity coefficient (stress intensity divided by load) as the argument of r. Once r has been obtained as a function of the stress intensity coefficient for a given material, environment, and loading sequence, it applies to any configuration of cracked structure. (orig.)
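
    The shortcut can be illustrated directly: compute r at a handful of crack lengths, interpolate its reciprocal, and integrate N = ∫ da / r(a) instead of summing period by period. In the sketch below, r(a) is an invented smooth function standing in for the pointwise cycle-by-cycle results.

```python
import numpy as np

a_pts = np.array([5.0, 10.0, 20.0, 35.0, 50.0])   # crack length, mm
r_pts = 1e-4 * a_pts**1.5                         # growth per loading period, mm (invented)

a = np.linspace(a_pts[0], a_pts[-1], 2001)
inv_r = np.interp(a, a_pts, 1.0 / r_pts)          # smooth interpolant of 1/r
# trapezoidal integration of da / r(a) gives the number of loading periods
periods = np.sum((inv_r[1:] + inv_r[:-1]) * np.diff(a) / 2.0)
print(f"estimated loading periods from 5 mm to 50 mm: {periods:,.0f}")
```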

  4. Assessment of organ doses by standard X-ray procedures in the GDR

    International Nuclear Information System (INIS)

    Tautz, M.; Brandt, G.A.

    1986-01-01

    A modern method is described for assessing the radiation burden from standard X-ray procedures, taking into account the standards of the Society for Medical Radiology in the GDR. The underlying methodology is a Monte Carlo computer technique which stochastically simulates the energy deposition of X-ray photons in a mathematically described heterogeneous anthropomorphic phantom, due to Rosenstein (US Department of Health, Education and Welfare). To apply the procedure, specific values of the following parameters must be determined for each dose estimation: projection and view, X-ray field size and location, entrance exposure at the skin surface, beam quality, and source-to-image receptor distance. The base data are obtained in terms of the tissue-air ratio. Organ doses were calculated for examinations of the chest, urography, skull, cervical spine, thoracic spine, lumbar spine, pelvis, and lymphography. In conclusion, possibilities for reducing the radiation burden are discussed. 9 refs., 6 figs., 9 tabs. (author)

  5. Bridging Theory and Practice: Developing Guidelines to Facilitate the Design of Computer-based Learning Environments

    Directory of Open Access Journals (Sweden)

    Lisa D. Young

    2003-10-01

    The design of computer-based learning environments has undergone a paradigm shift, moving students away from instruction that was considered to promote technical rationality grounded in objectivism, toward the application of computers to create cognitive tools utilized in constructivist environments. The goal of the resulting computer-based learning environment design principles is to have students learn with technology, rather than from technology. This paper reviews the general constructivist theory that has guided the development of these environments, and offers suggestions for the adaptation of modest, generic guidelines, not mandated principles, that can be flexibly applied and allow for the expression of true constructivist ideals in online learning environments.

  6. Three-dimensional reconstructed computed tomography-magnetic resonance fusion image-based preoperative planning for surgical procedures for spinal lipoma or tethered spinal cord after myelomeningocele repair. Technical note

    International Nuclear Information System (INIS)

    Bamba, Yohei; Nonaka, Masahiro; Nakajima, Shin; Yamasaki, Mami

    2011-01-01

    Surgical procedures for spinal lipoma or tethered spinal cord after myelomeningocele (MMC) repair are often difficult and complicated, because the anatomical structures can be deformed in complex and unpredictable ways. Imaging helps the surgeon understand the patient's spinal anatomy. Whereas two-dimensional images provide only limited information for surgical planning, three-dimensional (3D) reconstructed computed tomography (CT)-magnetic resonance (MR) fusion images produce clearer representations of the spinal regions. Here we describe simple and quick methods for obtaining 3D reconstructed CT-MR fusion images for preoperative planning of surgical procedures using the iPlan cranial (BrainLAB AG, Feldkirchen, Germany) neuronavigation software. 3D CT images of the vertebral bone were combined with heavily T2-weighted MR images of the spinal cord, lipoma, cerebrospinal fluid (CSF) space, and nerve roots through a process of fusion, segmentation, and reconstruction of the 3D images. We also used our procedure called 'Image Overlay' to directly project the 3D reconstructed image onto the body surface using a light emitting diode (LED) projector. The final reconstructed 3D images took 10-30 minutes to obtain, and provided the surgeon with a representation of the individual pathological structures, enabling the design of effective surgical plans even in patients with bony deformity such as scoliosis. None of the 19 patients treated based on our 3D reconstruction method has had neurological complications, except for CSF leakage. This 3D reconstructed imaging method, combined with Image Overlay, improves the visual understanding of complicated surgical situations, and should improve surgical efficiency and outcomes. (author)

  7. Applying systemic-structural activity theory to design of human-computer interaction systems

    CERN Document Server

    Bedny, Gregory Z; Bedny, Inna

    2015-01-01

    Human-Computer Interaction (HCI) is an interdisciplinary field that has gained recognition as an important field in ergonomics. HCI draws on ideas and theoretical concepts from computer science, psychology, industrial design, and other fields. Human-Computer Interaction is no longer limited to trained software users. Today people interact with various devices such as mobile phones, tablets, and laptops. How can you make such interaction user friendly, even when user proficiency levels vary? This book explores methods for assessing the psychological complexity of computer-based tasks. It also p

  8. An XML Representation for Crew Procedures

    Science.gov (United States)

    Simpson, Richard C.

    2005-01-01

    NASA ensures safe operation of complex systems through the use of formally-documented procedures, which encode the operational knowledge of the system as derived from system experts. Crew members use procedure documentation on the ground for training purposes and on-board space shuttle and space station to guide their activities. Investigators at JSC are developing a new representation for procedures that is content-based (as opposed to display-based). Instead of specifying how a procedure should look on the printed page, the content-based representation will identify the components of a procedure and (more importantly) how the components are related (e.g., how the activities within a procedure are sequenced; what resources need to be available for each activity). This approach will allow different sets of rules to be created for displaying procedures on a computer screen, on a hand-held personal digital assistant (PDA), verbally, or on a printed page, and will also allow intelligent reasoning processes to automatically interpret and use procedure definitions. During his NASA fellowship, Dr. Simpson examined how various industries represent procedures (also called business processes or workflows), in areas such as manufacturing, accounting, shipping, or customer service. A useful method for designing and evaluating workflow representation languages is by determining their ability to encode various workflow patterns, which depict abstract relationships between the components of a procedure removed from the context of a specific procedure or industry. Investigators have used this type of analysis to evaluate how well-suited existing workflow representation languages are for various industries based on the workflow patterns that commonly arise across industry-specific procedures. Based on this type of analysis, it is already clear that existing workflow representations capture discrete flow of control (i.e., when one activity should start and stop based on when other

  9. The Safety Assessment of OPR-1000 for Station Blackout Applying Combined Deterministic and Probabilistic Procedure

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dong Gu; Ahn, Seung-Hoon; Cho, Dae-Hyung [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    The complete loss of offsite electric power concurrent with the unavailability of the onsite emergency AC power system is termed station blackout (SBO); it does not generally include the loss of available AC power to safety buses fed by station batteries through inverters or by alternate AC sources. Historically, risk analysis results have indicated that SBO is a significant contributor to the overall core damage frequency. In this study, the safety assessment of the OPR-1000 nuclear power plant for the SBO accident, a typical beyond-design-basis accident (BDBA) and an important contributor to overall plant risk, is performed by applying the combined deterministic and probabilistic procedure (CDPP). In addition, the SBO risk at OPR-1000 is reevaluated by eliminating excessive conservatism in the existing PSA. The reference analysis showed that the CDF and CCDP did not meet the acceptable risk, and it was confirmed that the SBO risk should be reevaluated. By estimating the offsite power restoration time appropriately, the SBO risk was reevaluated, and it was finally confirmed that the current OPR-1000 system lies within the acceptable risk for SBO. It was also demonstrated that the proposed CDPP is applicable to the safety assessment of BDBAs in nuclear power plants without significant erosion of the safety margin.

  10. An Emotional Agent Model Based on Granular Computing

    Directory of Open Access Journals (Sweden)

    Jun Hu

    2012-01-01

    Affective computing has great significance for fulfilling intelligent information processing and harmonious communication between human beings and computers. A new model for an emotional agent is proposed in this paper to give agents the ability to handle emotions, based on granular computing theory and the traditional BDI agent model. First, a new emotion knowledge base based on granular computing for emotion expression is presented in the model. Second, a new emotional reasoning algorithm based on granular computing is proposed. Third, a new emotional agent model based on granular computing is presented. Finally, based on the model, an emotional agent for patient assistance in a hospital is realized; experimental results show that it is efficient at handling simple emotions.

  11. Ultrasound-guided versus computed tomography-scan guided biopsy of pleural-based lung lesions.

    Science.gov (United States)

    Khosla, Rahul; McLean, Anna W; Smith, Jessica A

    2016-01-01

    Computed tomography (CT) guided biopsies have long been the standard technique to obtain tissue from the thoracic cavity and are traditionally performed by interventional radiologists. Ultrasound (US) guided biopsy of pleural-based lesions, performed by pulmonologists, is gaining popularity and has the advantages of multi-planar imaging, real-time technique, and the absence of radiation exposure to patients. In this study, we aim to determine the diagnostic accuracy, the time to diagnosis after the initial consult placement, and the complication rates of the two modalities. A retrospective study of electronic medical records was done of patients who underwent CT-guided and US-guided biopsies for pleural-based lesions between 2005 and 2014, and the collected data were analyzed to compare the two groups. A total of 158 patients underwent 162 procedures during the study period: 86 patients underwent 89 procedures in the US group, and 72 patients underwent 73 procedures in the CT group. The overall yield in the US group was 82/89 (92.1%) versus 67/73 (91.8%) in the CT group (P = 1.0). Average days to the procedure were 7.2 versus 17.5 (P = 0.00001) in the US and CT groups, respectively. The complication rate was higher in the CT group, 17/73 (23.3%), versus 1/89 (1.1%) in the US group. The diagnostic accuracy of US-guided biopsy is thus similar to that of CT-guided biopsy, with a lower complication rate and a significantly reduced time to the procedure.

  12. An Evaluation of Windows-Based Computer Forensics Application Software Running on a Macintosh

    Directory of Open Access Journals (Sweden)

    Gregory H. Carlton

    2008-09-01

    The two most common computer forensics applications perform exclusively on Microsoft Windows operating systems, yet contemporary computer forensics examinations frequently encounter one or more of the three most common operating system environments, namely Windows, OS X, or some form of UNIX or Linux. Additionally, government and private computer forensics laboratories frequently encounter budget constraints that limit their access to computer hardware. Currently, Macintosh computer systems are marketed with the ability to accommodate these three common operating system environments, including Windows XP in native and virtual environments. We performed a series of experiments to measure the functionality and performance of the two most commonly used Windows-based computer forensics applications on a Macintosh running Windows XP in native mode and in two virtual environments, relative to a similarly configured Dell personal computer. The research results are directly beneficial to practitioners, and the process illustrates effective pedagogy whereby students were engaged in applied research.

  13. Applying a machine learning model using a locally preserving projection based feature regeneration algorithm to predict breast cancer risk

    Science.gov (United States)

    Heidari, Morteza; Zargari Khuzani, Abolfazl; Danala, Gopichandh; Mirniaharikandehei, Seyedehnafiseh; Qian, Wei; Zheng, Bin

    2018-03-01

    Both conventional and deep machine learning have been used to develop decision-support tools applied in medical imaging informatics. In order to take advantage of both conventional and deep learning approaches, this study aims to investigate the feasibility of applying a locally preserving projection (LPP) based feature regeneration algorithm to build a new machine learning classifier model to predict short-term breast cancer risk. First, a computer-aided image processing scheme was used to segment and quantify breast fibro-glandular tissue volume. Next, an initial set of 44 image features related to the bilateral mammographic tissue density asymmetry was extracted. Then, an LPP-based feature combination method was applied to regenerate a new operational feature vector using a maximal variance approach. Last, a k-nearest neighborhood (KNN) algorithm based machine learning classifier using the LPP-generated new feature vectors was developed to predict breast cancer risk. A testing dataset involving negative mammograms acquired from 500 women was used; among them, 250 became positive and 250 remained negative in the next subsequent mammography screening. Applied to this dataset, the LPP-generated feature vector reduced the number of features from 44 to 4. Using a leave-one-case-out validation method, the area under the ROC curve produced by the KNN classifier significantly increased from 0.62 to 0.68 for predicting which cases would have breast cancer detected in the next subsequent mammography screening.
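
    A generic version of the pipeline, a locality preserving projection followed by a KNN classifier, can be sketched as below on synthetic data; this is a minimal textbook LPP (heat-kernel affinity, generalized eigenproblem), not the authors' code or their mammographic features.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 44))             # stand-in for 44 image features
y = (X[:, :3].sum(axis=1) + 0.5 * rng.standard_normal(200) > 0).astype(int)

# heat-kernel affinity over each point's nearest neighbours
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.zeros_like(d2)
for i, nb in enumerate(np.argsort(d2, axis=1)[:, 1:8]):
    W[i, nb] = np.exp(-d2[i, nb] / d2.mean())
W = np.maximum(W, W.T)                         # symmetrize the graph
D = np.diag(W.sum(axis=1))
L = D - W                                      # graph Laplacian

# LPP: solve X^T L X a = lambda X^T D X a, keep eigenvectors of smallest lambda
A = X.T @ L @ X
B = X.T @ D @ X + 1e-6 * np.eye(X.shape[1])    # small ridge for numerical stability
_, vecs = eigh(A, B)
P = vecs[:, :4]                                # regenerate 44 features -> 4

Z = X @ P
clf = KNeighborsClassifier(n_neighbors=5).fit(Z[:150], y[:150])
print("holdout accuracy:", clf.score(Z[150:], y[150:]))
```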

  14. Applied Linguistics Project: Student-Led Computer Assisted Research in High School EAL/EAP

    Science.gov (United States)

    Bohát, Róbert; Rödlingová, Beata; Horáková, Nina

    2015-01-01

    The Applied Linguistics Project (ALP) started at the International School of Prague (ISP) in 2013. Every year, Grade 9 English as an Additional Language (EAL) students identify an area of learning in need of improvement and design a research method followed by data collection and analysis using basic computer software tools or online corpora.…

  15. Quantitative computed tomography applied to interstitial lung diseases.

    Science.gov (United States)

    Obert, Martin; Kampschulte, Marian; Limburg, Rebekka; Barańczuk, Stefan; Krombach, Gabriele A

    2018-03-01

    To evaluate a new image marker that retrieves information from computed tomography (CT) density histograms with respect to its classification properties between different lung parenchyma groups, and to compare the new image marker with conventional markers. Density histograms from 220 different subjects (normal = 71; emphysema = 73; fibrotic = 76) were used to compare the conventionally applied emphysema index (EI), 15th percentile value (PV), mean value (MV), variance (V), skewness (S), and kurtosis (K) with a new histogram's functional shape (HFS) method. Multinomial logistic regression (MLR) analysis was performed to calculate predictions of lung parenchyma group membership using the individual methods, as well as combinations thereof, as covariates. Overall correctly assigned subjects (OCA), sensitivity (sens), specificity (spec), and Nagelkerke's pseudo R2 (NR2) effect size were estimated; NR2 was used to set up a ranking list of the different methods. MLR indicates the highest classification power (OCA of 92%; sens 0.95; spec 0.89; NR2 0.95) when all histogram analysis methods are applied together in the MLR. The highest classification power among individually applied methods was found using the HFS concept (OCA 86%; sens 0.93; spec 0.79; NR2 0.80). Conventional methods achieved lower classification potential on their own: EI (OCA 69%; sens 0.95; spec 0.26; NR2 0.52); PV (OCA 69%; sens 0.90; spec 0.37; NR2 0.57); MV (OCA 65%; sens 0.71; spec 0.58; NR2 0.61); V (OCA 66%; sens 0.72; spec 0.53; NR2 0.66); S (OCA 65%; sens 0.88; spec 0.26; NR2 0.55); and K (OCA 63%; sens 0.90; spec 0.16; NR2 0.48). The HFS method, which was previously applied to CT bone density curve analysis, is thus also a remarkable information extraction tool for lung density histograms. Presumably, being a principled mathematical approach, the HFS method can extract valuable health-related information from histograms from completely different areas
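
    The comparison framework, summarizing each density histogram by conventional markers and feeding them to a multinomial logistic regression, can be sketched as follows. The Hounsfield-unit distributions below are crude invented stand-ins for the three parenchyma groups, and the HFS marker itself is not reproduced.

```python
import numpy as np
from scipy import stats
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def sample_subject(group):
    # crude stand-ins for normal / emphysema / fibrosis HU distributions
    mu = {0: -850, 1: -930, 2: -650}[group]
    sigma = {0: 60, 1: 40, 2: 120}[group]
    return rng.normal(mu, sigma, size=5000)

groups = rng.integers(0, 3, size=220)
# conventional histogram markers: mean, variance, skewness, kurtosis
feats = np.array([[d.mean(), d.var(), stats.skew(d), stats.kurtosis(d)]
                  for d in (sample_subject(g) for g in groups)])

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(feats[:170], groups[:170])
print("holdout accuracy:", clf.score(feats[170:], groups[170:]))
```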

  16. Applying a computer-aided scheme to detect a new radiographic image marker for prediction of chemotherapy outcome

    International Nuclear Information System (INIS)

    Wang, Yunzhi; Qiu, Yuchen; Thai, Theresa; Moore, Kathleen; Liu, Hong; Zheng, Bin

    2016-01-01

    To investigate the feasibility of automated segmentation of visceral and subcutaneous fat areas from computed tomography (CT) images of ovarian cancer patients and of applying the computed adiposity-related image features to predict chemotherapy outcome. A computerized image processing scheme was developed to segment visceral and subcutaneous fat areas and compute adiposity-related image features. Then, logistic regression models were applied to analyze the association between the scheme-generated assessment scores and the progression-free survival (PFS) of patients, using a leave-one-case-out cross-validation method and a dataset involving 32 patients. The correlation coefficients between automated and radiologist's manual segmentation of visceral and subcutaneous fat areas were 0.76 and 0.89, respectively. The scheme-generated prediction scores using adiposity-related radiographic image features were significantly associated with patients' PFS (p < 0.01). Using a computerized scheme enables more efficient and robust segmentation of visceral and subcutaneous fat areas. The computed adiposity-related image features also have potential to improve accuracy in predicting chemotherapy outcome.

  17. Variational method for inverting the Kohn-Sham procedure

    International Nuclear Information System (INIS)

    Kadantsev, Eugene S.; Stott, M.J.

    2004-01-01

    A procedure based on a variational principle is developed for determining the local Kohn-Sham (KS) potential corresponding to a given ground-state electron density. This procedure is applied to calculate the exchange-correlation part of the effective KS potential for the neon atom and the methane molecule.
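
    The record does not state the exact functional used; for orientation, a representative variational principle of this kind (the Wu–Yang functional) maximizes, over trial potentials v,

        W_s[v] = 2 \sum_{i=1}^{N/2} \varepsilon_i[v]
                 - \int v(\mathbf{r})\, n_0(\mathbf{r})\, \mathrm{d}^3 r,
        \qquad
        \frac{\delta W_s}{\delta v(\mathbf{r})} = n(\mathbf{r}) - n_0(\mathbf{r}),

    so the functional is stationary exactly when the density n generated by v matches the target density n_0.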

  18. Applying knowledge engineering tools for the personal computer to the operation and maintenance of radiopharmaceutical production systems

    International Nuclear Information System (INIS)

    Alexoff, D.L.

    1990-01-01

    A practical consequence of over three decades of Artificial Intelligence (AI) research has been the emergence of personal computer-based AI programming tools. A special class of this microcomputer-based software, called expert system shells, is now applied routinely outside the realm of classical AI to solve many types of problems, particularly in analytical chemistry. These AI tools offer not only some of the advantages inherent to symbolic programming languages but, just as significantly, they bring with them advanced program development environments which can facilitate software development and maintenance. Exploitation of this enhanced programming environment was a major motivation for using an AI tool. The goal of this work is to evaluate the use of an example-based expert system shell (1st Class FUSION, 1st Class Expert Systems, Inc.) as a programming tool for developing software useful for automated radiopharmaceutical production.

  19. A Web-Based Integration Procedure for the Development of Reconfigurable Robotic Work-Cells

    Directory of Open Access Journals (Sweden)

    Paulo Ferreira

    2013-07-01

    Full Text Available Concepts related to the development of reconfigurable manufacturing systems (RMS and methodologies to provide the best practices in the processing industry and factory automation, such as system integration and web-based technology, are major issues in designing next-generation manufacturing systems (NGMS. Adaptable and integrable devices are crucial for the success of NGMS. In robotic cells the integration of manufacturing components is essential to accelerate system adaptability. Sensors, control architectures and communication technologies have contributed to achieving further agility in reconfigurable factories. In this work a web-based robotic cell integration procedure is proposed to aid the identification of reconfigurable issues and requirements. This methodology is applied to an industrial robot manipulator to enhance system flexibility towards the development of a reconfigurable robotic platform.

  20. Multiphase Simulated Annealing Based on Boltzmann and Bose-Einstein Distribution Applied to Protein Folding Problem.

    Science.gov (United States)

    Frausto-Solis, Juan; Liñán-García, Ernesto; Sánchez-Hernández, Juan Paulo; González-Barbosa, J Javier; González-Flores, Carlos; Castilla-Valdez, Guadalupe

    2016-01-01

    A new hybrid Multiphase Simulated Annealing algorithm using Boltzmann and Bose-Einstein distributions (MPSABBE) is proposed. MPSABBE was designed for solving Protein Folding Problem (PFP) instances. This new approach has four phases: (i) Multiquenching Phase (MQP), (ii) Boltzmann Annealing Phase (BAP), (iii) Bose-Einstein Annealing Phase (BEAP), and (iv) Dynamical Equilibrium Phase (DEP). BAP and BEAP are simulated annealing search procedures based on the Boltzmann and Bose-Einstein distributions, respectively. DEP is also a simulated annealing search procedure, applied at the final temperature of the fourth phase, and can be seen as a second Bose-Einstein phase. MQP is a search process that ranges from extremely high to high temperatures, applies a very fast cooling process, and is not very restrictive in accepting new solutions. By contrast, BAP and BEAP range from high to low and from low to very low temperatures, respectively, and are more restrictive in accepting new solutions. DEP uses a particular heuristic to detect stochastic equilibrium by applying a least squares method during its execution. The MPSABBE parameters are tuned with an analytical method which considers the maximal and minimal deterioration of problem instances. MPSABBE was tested with several instances of PFP, showing that using both distributions is better than using only the Boltzmann distribution, as in classical SA.
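
    A minimal sketch of a two-phase annealing loop with Boltzmann and Bose-Einstein style acceptance rules follows; the energy landscape, neighbour move, and geometric cooling schedule are simplified assumptions, not the authors' PFP formulation.

        # Hedged sketch: simulated annealing with two acceptance distributions.
        import math, random

        def energy(x):                       # toy 1-D landscape standing in for PFP
            return x * x + 10 * math.sin(3 * x)

        def neighbour(x):
            return x + random.uniform(-0.5, 0.5)

        def accept_boltzmann(dE, T):
            return dE <= 0 or random.random() < math.exp(-dE / T)

        def accept_bose_einstein(dE, T):     # stylized Bose-Einstein rule
            if dE <= 0:
                return True
            return random.random() < 1.0 / (math.exp(dE / T) - 1.0 + 1e-12)

        def anneal(x, T0, T1, alpha, accept):
            T = T0
            while T > T1:
                y = neighbour(x)
                if accept(energy(y) - energy(x), T):
                    x = y
                T *= alpha                   # geometric cooling
            return x

        x = random.uniform(-5, 5)
        x = anneal(x, 100.0, 1.0, 0.95, accept_boltzmann)       # BAP-like phase
        x = anneal(x, 1.0, 1e-3, 0.95, accept_bose_einstein)    # BEAP-like phase
        print("final solution:", x, "energy:", energy(x))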

  1. The radiation protective devices for interventional procedures using computed tomography

    International Nuclear Information System (INIS)

    Iida, Hiroji; Chabatake, Mitsuhiro; Shimizu, Mitsuru; Tamura, Sakio

    2002-01-01

    Scattered doses and surface doses from phantom measurements during interventional procedures with computed tomography (IVR-CT) were evaluated. To reduce personnel exposure in IVR-CT, new protective devices were developed and their effect evaluated. Two radiation protection devices were experimentally made using a lead vinyl sheet with a lead equivalent of 0.125 mmPb. The first device is a lead curtain which shields the space between the CT gantry and the phantom during the CT examination. The second device is a lead drape which shields the phantom surface adjacent to the scanning plane during CT fluoroscopy. Scattered dose and phantom surface dose were measured with an abdominal phantom during Cine-CT (130 kV, 150 mA, 5 seconds, 10 mm section thickness) using an ionization chamber dosimeter, with and without the lead curtain and the lead drape. The scattered dose rate was measured at distances of 50-150 cm from the scanning plane, and the surface dose was measured at distances of 4-21 cm from the scanning plane on the phantom. At the operator's standing position, scattered dose rates were from 8.4 to 11.6 μGy/sec during the CT examination. The lead curtain and the lead drape reduced the scattered dose rate at a distance of 50 cm from the scanning plane by 66% and 58.3%, respectively. The surface dose rate was 118 μGy/sec at a distance of 5 cm from the scanning plane during CT fluoroscopy. The lead drape reduced the surface dose by 60.5%. High scattered exposure to personnel may occur during interventional procedures using CT. It was considerably reduced during CT arteriography by attaching the lead curtain to the CT equipment, and substantially reduced during CT fluoroscopy by placing the lead drape adjacent to the scanning plane; in addition, the operator's hand would be protected from unnecessary radiation scattered by the phantom. It was suggested that the scattered exposure to personnel could be sufficiently reduced by using radiation protection devices in IVR-CT. …

  2. A finite element based substructuring procedure for design analysis of large smart structural systems

    International Nuclear Information System (INIS)

    Ashwin, U; Raja, S; Dwarakanathan, D

    2009-01-01

    A substructuring-based design analysis procedure is presented for large smart structural systems using the Craig–Bampton method. The smart structural system is distinctively characterized as an active substructure, modelled as a design problem, and a passive substructure, idealized as an analysis problem. Furthermore, a novel idea is applied by introducing the electro–elastic coupling into the reduction scheme to solve the global structural control problem in a local domain. As an illustration, a smart composite box beam with surface-bonded actuators/sensors is considered, and results of the local-to-global control analysis are presented to show the potential use of the developed procedure. The present numerical scheme is useful for optimally designing the active substructures to study their locations and coupled structure–actuator interaction, and provides a solution to the global design of large smart structural systems.
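
    For reference, the standard Craig–Bampton reduction partitions each substructure's degrees of freedom into interior (i) and boundary (b) sets and approximates the interior motion with fixed-interface normal modes plus static constraint modes; the record's electro-elastic coupling is an extension of this scheme, whose exact form the abstract does not give:

        \begin{Bmatrix} u_i \\ u_b \end{Bmatrix}
        =
        \begin{bmatrix} \Phi_{ik} & \Psi_{ib} \\ 0 & I \end{bmatrix}
        \begin{Bmatrix} q_k \\ u_b \end{Bmatrix},
        \qquad
        \Psi_{ib} = -K_{ii}^{-1} K_{ib},

    where \Phi_{ik} are the fixed-interface normal modes, q_k their modal coordinates, and \Psi_{ib} the static constraint modes derived from the stiffness partitions.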

  3. Dedicated OO expertise applied to Run II software projects

    International Nuclear Information System (INIS)

    Amidei, D.

    2000-01-01

    The change in software language and methodology by CDF and D0 to object-oriented from procedural Fortran is significant. Both experiments requested dedicated expertise that could be applied to software design, coding, advice and review. The Fermilab Run II offline computing outside review panel agreed strongly with the request and recommended that the Fermilab Computing Division hire dedicated OO expertise for the CDF/D0/Computing Division joint project effort. This was done and the two experts have been an invaluable addition to the CDF and D0 upgrade software projects and to the Computing Division in general. These experts have encouraged common approaches and increased the overall quality of the upgrade software. Advice on OO techniques and specific advice on C++ coding has been used. Recently a set of software reviews has been accomplished. This has been a very successful instance of a targeted application of computing expertise, and constitutes a very interesting study of how to move toward modern computing methodologies in HEP

  4. Direct integration of the S-matrix applied to rigorous diffraction

    International Nuclear Information System (INIS)

    Iff, W; Lindlein, N; Tishchenko, A V

    2014-01-01

    A novel Fourier method for rigorous diffraction computation at periodic structures is presented. The procedure is based on a differential equation for the S-matrix, which allows direct integration of the S-matrix blocks. This results in a new method in Fourier space, which can be considered a numerically stable and well-parallelizable alternative to the conventional differential method based on T-matrix integration and subsequent conversion from the T-matrices to S-matrix blocks. Integration of the novel differential equation in an implicit manner is expounded. The applicability of the new method is shown on the basis of 1D periodic structures. It is clear, however, that the new technique can also be applied to arbitrary 2D periodic or periodized structures. The complexity of the new method is O(N³), similar to the conventional differential method, with N being the number of diffraction orders. (fast track communication)

  5. A study on Requirements of Data Base Translator for APR1400 Computerized Procedure System at Shin-Hanul unit 1 and 2

    International Nuclear Information System (INIS)

    Seong, Nokyu; Lee, Sungjin

    2015-01-01

    The CPS is one of the Man-Machine Interface (MMI) resources: it can directly display plant graphic objects from the Digital Control System (DCS), and it can send a request to the DCS to provide a DCS screen, called a step support display, through a DCS link button on a computerized procedure. Procedure writers can insert DCS graphic information into a computerized procedure through the data base provided by the CPS Editing System (CPSES). This data base conforms to the naming rule of DCS graphic objects. The naming rule is defined by the vendor; thus the status of DCS graphic objects in a computerized procedure written for the Shin-Kori plant cannot be displayed on the CPS at the Shin-Hanul plant. To use a computerized procedure written by another plant's procedure writers, the DCS graphic objects must be translated into the plant's own data base. This paper introduces the requirements of a data base translator for the CPSES for the APR1400 CPS at Shin-Hanul units 1 and 2, intended to reduce the burden of translating and re-inserting graphic objects. The translator algorithms shall be tested to update the CPSES data base effectively. A prototype of the translator has been implemented and is being tested using a real plant DB. This translator can be applied to Shin-Hanul units 1 and 2 through software V&V.

  6. A study on Requirements of Data Base Translator for APR1400 Computerized Procedure System at Shin-Hanul unit 1 and 2

    Energy Technology Data Exchange (ETDEWEB)

    Seong, Nokyu; Lee, Sungjin [KHNP, Daejeon (Korea, Republic of)

    2015-05-15

    The CPS is one of the Man-Machine Interface (MMI) resources: it can directly display plant graphic objects from the Digital Control System (DCS), and it can send a request to the DCS to provide a DCS screen, called a step support display, through a DCS link button on a computerized procedure. Procedure writers can insert DCS graphic information into a computerized procedure through the data base provided by the CPS Editing System (CPSES). This data base conforms to the naming rule of DCS graphic objects. The naming rule is defined by the vendor; thus the status of DCS graphic objects in a computerized procedure written for the Shin-Kori plant cannot be displayed on the CPS at the Shin-Hanul plant. To use a computerized procedure written by another plant's procedure writers, the DCS graphic objects must be translated into the plant's own data base. This paper introduces the requirements of a data base translator for the CPSES for the APR1400 CPS at Shin-Hanul units 1 and 2, intended to reduce the burden of translating and re-inserting graphic objects. The translator algorithms shall be tested to update the CPSES data base effectively. A prototype of the translator has been implemented and is being tested using a real plant DB. This translator can be applied to Shin-Hanul units 1 and 2 through software V&V.
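
    The kind of identifier translation such a tool performs can be sketched as below; the plant prefixes, naming pattern, and object IDs are invented for illustration and are not the vendor's actual DCS naming convention.

        # Hedged sketch of a data-base translator for DCS graphic-object IDs.
        import re

        # plant-specific prefixes for DCS graphic objects (assumed rule)
        PLANT_PREFIX = {"shin-kori": "SKN", "shin-hanul": "SHN"}

        def translate_object_id(object_id: str, src: str, dst: str) -> str:
            """Rewrite a DCS graphic-object ID from the source plant's
            naming rule to the destination plant's rule."""
            pattern = rf"^{PLANT_PREFIX[src]}-"
            if not re.match(pattern, object_id):
                raise ValueError(f"{object_id!r} does not follow {src} naming rule")
            return re.sub(pattern, f"{PLANT_PREFIX[dst]}-", object_id)

        # example: migrate one procedure step's display links between plants
        step_links = ["SKN-RCS-PT-401", "SKN-CVCS-LV-122"]
        migrated = [translate_object_id(o, "shin-kori", "shin-hanul")
                    for o in step_links]
        print(migrated)   # ['SHN-RCS-PT-401', 'SHN-CVCS-LV-122']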

  7. Micro-computed tomography-based phenotypic approaches in embryology: procedural artifacts on assessments of embryonic craniofacial growth and development.

    Science.gov (United States)

    Schmidt, Eric J; Parsons, Trish E; Jamniczky, Heather A; Gitelman, Julian; Trpkov, Cvett; Boughner, Julia C; Logan, C Cairine; Sensen, Christoph W; Hallgrímsson, Benedikt

    2010-02-17

    Growing demand for three-dimensional (3D) digital images of embryos for purposes of phenotypic assessment drives the implementation of new histological and imaging techniques. Among these, micro-computed tomography (microCT) has recently been utilized as an effective and practical method for generating images at resolutions permitting 3D quantitative analysis of gross morphological attributes of developing tissues and organs in embryonic mice. However, histological processing in preparation for microCT scanning induces changes in organ size and shape. Establishing normative expectations for experimentally induced changes in size and shape will be an important feature of 3D microCT-based phenotypic assessments, especially if quantifying differences in the values of those parameters between comparison sets of developing embryos is a primary aim. Toward that end, we assessed the nature and degree of morphological artifacts attending microCT scanning following the use of common fixatives, using a two-dimensional (2D) landmark geometric morphometric approach to track the accumulation of distortions affecting the embryonic head from the native, uterine state through fixation and subsequent scanning. Bouin's fixation reduced average centroid sizes of embryonic mouse crania by approximately 30% and substantially altered the morphometric shape, as measured by the shift in Procrustes distance from the unfixed state, after the data were normalized for naturally occurring shape variation. Subsequent microCT scanning produced negligible changes in size but did appear to reduce or even reverse fixation-induced random shape changes. Mixtures of paraformaldehyde + glutaraldehyde reduced average centroid sizes by 2-3%. Changes in craniofacial shape progressively increased post-fixation. The degree to which artifacts are introduced in the generation of random craniofacial shape variation relates to the degree of specimen dehydration during the initial fixation. Fixation methods that …
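
    The two shape statistics used above, centroid size and Procrustes distance, are standard and can be computed as in the following sketch; the landmark coordinates are invented for illustration.

        # Hedged sketch: centroid size and Procrustes distance between a fixed
        # and an unfixed 2-D landmark configuration.
        import numpy as np

        def centroid_size(L):
            c = L.mean(axis=0)
            return np.sqrt(((L - c) ** 2).sum())

        def procrustes_distance(A, B):
            # centre, scale to unit centroid size, then optimally rotate B onto A
            A = (A - A.mean(axis=0)) / centroid_size(A)
            B = (B - B.mean(axis=0)) / centroid_size(B)
            U, _, Vt = np.linalg.svd(A.T @ B)
            R = (U @ Vt).T                       # orthogonal Procrustes rotation
            return np.sqrt(((A - B @ R) ** 2).sum())

        unfixed = np.array([[0.0, 0.0], [1.0, 0.1], [2.0, 0.0], [1.0, 1.0]])
        fixed = 0.7 * unfixed + np.random.default_rng(2).normal(0, 0.02, unfixed.shape)

        shrink = 1 - centroid_size(fixed) / centroid_size(unfixed)
        print(f"centroid size reduction: {shrink:.1%}")
        print(f"Procrustes distance: {procrustes_distance(unfixed, fixed):.4f}")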

  8. The Computer Student Worksheet Based Mathematical Literacy for Statistics

    Science.gov (United States)

    Manoy, J. T.; Indarasati, N. A.

    2018-01-01

    The student worksheet is one teaching medium that can improve activities in the classroom. Indicators of mathematical literacy included in a student worksheet can help students apply concepts in daily life, and the use of computers in learning can create an environment-friendly learning setting. This research used the Thiagarajan Four-D developmental design, which has four stages: define, design, develop, and disseminate. However, this research was completed through the third stage, the develop stage. The computer student worksheet based on mathematical literacy for statistics achieved good quality, which requires meeting three aspects: validity, practicality, and effectiveness. The subjects in this research were grade-eleven students of the 5th Mathematics and Natural Sciences class at the 1st State Senior High School of Driyorejo, Gresik. The worksheet achieved the validity aspect with an average of 3.79 (94.72%) and the practicality aspect with an average of 2.85 (71.43%). Besides, it achieved the effectiveness aspect with a classical completeness percentage of 94.74% and a positive student response percentage of 75%.

  9. Pervasive Computing Support for Hospitals: An Overview of the Activity-Based Computing Project

    DEFF Research Database (Denmark)

    Christensen, Henrik Bærbak; Bardram, Jakob E

    2007-01-01

    The activity-based computing project researched pervasive computing support for clinical hospital work. Such technologies have potential for supporting the mobile, collaborative, and disruptive use of heterogeneous embedded devices in a hospital…

  10. Reheating breakfast: Age and multitasking on a computer-based and a non-computer-based task

    OpenAIRE

    Feinkohl, I.; Cress, U.; Kimmerle, J.

    2016-01-01

    Computer-based assessments are popular means to measure individual differences, including age differences, in cognitive ability, but are rarely tested for the extent to which they correspond to more realistic behavior. In the present study, we explored the extent to which performance on an existing computer-based task of multitasking ('cooking breakfast') may be generalizable by comparing it with a newly developed version of the same task that required interaction with physical objects. Twenty…

  11. How should Fitts' Law be applied to human-computer interaction?

    Science.gov (United States)

    Gillan, D. J.; Holden, K.; Adam, S.; Rudisill, M.; Magee, L.

    1992-01-01

    The paper challenges the notion that any Fitts' Law model can be applied generally to human-computer interaction, and proposes instead that applying Fitts' Law requires knowledge of the users' sequence of movements, direction of movement, and typical movement amplitudes as well as target sizes. Two experiments examined a text selection task with sequences of controlled movements (point-click and point-drag). For the point-click sequence, a Fitts' Law model that used the diagonal across the text object in the direction of pointing (rather than the horizontal extent of the text object) as the target size provided the best fit for the pointing time data, whereas for the point-drag sequence, a Fitts' Law model that used the vertical size of the text object as the target size gave the best fit. Dragging times were fitted well by Fitts' Law models that used either the vertical or horizontal size of the terminal character in the text object. Additional results of note were that pointing in the point-click sequence was consistently faster than in the point-drag sequence, and that pointing in either sequence was consistently faster than dragging. The discussion centres around the need to define task characteristics before applying Fitts' Law to an interface design or analysis, analyses of pointing and of dragging, and implications for interface design.
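

    For context, the Fitts' Law models compared in these experiments are instances of the general form below (the original Fitts and the Shannon formulations), differing in which extent of the text object is taken as the target width W:

        MT = a + b \log_2\!\left(\frac{2A}{W}\right)
        \qquad\text{or}\qquad
        MT = a + b \log_2\!\left(\frac{A}{W} + 1\right),

    where MT is the movement time, A the movement amplitude, W the target width along the relevant direction, and a, b empirically fitted constants.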

  12. Photogrammetry procedures applied to anthropometry.

    Science.gov (United States)

    Okimoto, Maria Lúcialeite Ribeiro; Klein, Alison Alfred

    2012-01-01

    This study aims to evaluate the reliability of, and establish procedures for, the use of digital photogrammetry in anthropometric measurements of the human hand. The methodology included the construction of a platform that keeps the hand at a fixed distance from the camera lens and annuls the effects of parallax. We developed software to perform the measurements from the images and built a proof object cast from a negative mold; this object was subjected to measurements with digital photogrammetry using the data collection platform, with a caliper, and with a Coordinate Measuring Machine (CMM). The results of applying photogrammetry to data collection on the hand segment allow us to conclude that photogrammetry is effective, presenting a precision coefficient below 0.940, within normal and acceptable values given the magnitude of the data used in anthropometry. It was concluded that photogrammetry is reliable, accurate and efficient for carrying out anthropometric surveys of populations, and presents less difficulty for in-place data collection.

  13. A Decentralized Eigenvalue Computation Method for Spectrum Sensing Based on Average Consensus

    Science.gov (United States)

    Mohammadi, Jafar; Limmer, Steffen; Stańczak, Sławomir

    2016-07-01

    This paper considers eigenvalue estimation for the decentralized inference problem in spectrum sensing. We propose a decentralized eigenvalue computation algorithm based on the power method, referred to as the generalized power method (GPM); it is capable of estimating the eigenvalues of a given covariance matrix under certain conditions. Furthermore, we have developed a decentralized implementation of GPM by splitting the iterative operations into local and global computation tasks. The global tasks require data exchange among the nodes. For this task, we apply an average consensus algorithm to efficiently perform the global computations. As a special case, we consider a structured graph that is a tree with clusters of nodes at its leaves. For an accelerated distributed implementation, we propose to use computation over the multiple-access channel (CoMAC) as a building block of the algorithm. Numerical simulations are provided to illustrate the performance of the two algorithms.
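
    A minimal simulation of the local/global split can be sketched as follows; idealized, fully mixed consensus weights stand in for the gossip protocol, and the covariance matrix is synthetic.

        # Hedged sketch: consensus-based power iteration for the top eigenvalue.
        import numpy as np

        rng = np.random.default_rng(3)
        n = 6
        A = rng.normal(size=(n, n))
        R = A @ A.T / n                 # covariance matrix sensed across n nodes

        W = np.full((n, n), 1.0 / n)    # idealized average-consensus weights

        x = rng.normal(size=n)
        for _ in range(200):
            # local task: node i computes (R x)_i from its own row of R
            y = R @ x
            # global task: squared norm via average consensus on y_i^2
            mean_sq = W @ (y * y)       # every node ends with the same average
            x = y / np.sqrt(n * mean_sq[0])

        eig_est = x @ R @ x             # Rayleigh quotient at convergence
        print("largest eigenvalue:", np.linalg.eigvalsh(R)[-1],
              "estimate:", eig_est)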

  14. Short- and long-term effects of clinical audits on compliance with procedures in CT scanning

    International Nuclear Information System (INIS)

    Oliveri, Antonio; Howarth, Nigel; Gevenois, Pierre Alain; Tack, Denis

    2016-01-01

    To test the hypothesis that quality clinical audits improve compliance with the procedures in computed tomography (CT) scanning. This retrospective study was conducted in two hospitals, based on 6950 examinations and four procedures, focusing on the acquisition length in lumbar spine CT, the default tube current applied in abdominal un-enhanced CT, the tube potential selection for portal-phase abdominal CT and the use of a specific "paediatric brain CT" procedure. The first clinical audit reported compliance with these procedures. After presenting the results to the stakeholders, a second audit was conducted to measure the impact of this information on compliance and was repeated the next year. Comparisons of proportions were performed using the Chi-square Pearson test. Depending on the procedure, the compliance rate ranged from 27 to 88 % during the first audit. After presentation of the audit results to the stakeholders, the compliance rate ranged from 68 to 93 % and was significantly improved for all procedures (P ranging from <0.001 to 0.031) in both hospitals, and remained unchanged during the third audit (P ranging from 0.114 to 0.999). Quality improvement through repeated compliance audits of CT procedures durably improves compliance. (orig.)

  15. The interplay of ultrasound and computed tomography in the planning and execution of interventional procedures

    International Nuclear Information System (INIS)

    Stafford, S.; Mueller, P.R.

    1987-01-01

    Even in large academic and private settings, where subspecialists abound and diagnostic and interventional radiologists are divided, both physically and philosophically, the interventionalist has emerged from the fluoroscopic suite to participate in the imaging workup of patients referred for percutaneous procedures. This expanded imaging role for the interventionalist is a natural outgrowth of several developments in radiology training. Computed tomography and ultrasound no longer are obscure techniques, understood only by an elite group of academic radiologists in large centers with access to equipment. All residents receive extensive education in these modalities, as imaging is a major part of general radiology. In addition, fellowship programs have been expanded to emphasize organ-system training as opposed to "modality" training alone. Armed with imaging skills, the interventionalist is able to better evaluate the cross-sectional diagnostic images and to address specific findings and issues with respect to the planned procedure. These specific issues, elucidated by cross-sectional imaging, impact the planning of the interventional procedures addressed in this chapter.

  16. Computational methods assuring nuclear power plant structural integrity and safety: an overview of the recent activities at VTT

    International Nuclear Information System (INIS)

    Keinaenen, H.; Talja, H.; Rintamaa, R.

    1998-01-01

    Numerical, simplified engineering and standardised methods are applied in the safety analyses of primary circuit components and reactor pressure vessels. The integrity assessment procedures require input relating both to the steady-state and transient loading, actual material properties data, and precise knowledge of the size and geometry of defects. Current procedures hold extensive information regarding these aspects. It is important to verify the accuracy of the different assessment methods, especially in the case of complex structures and loading. The focus of this paper is on the recent results and development of computational fracture assessment methods at VTT Manufacturing Technology. The methods include effective engineering-type tools for rapid structural integrity assessments and more sophisticated finite-element based methods. An integrated PC-based program system, MASI, for engineering fracture analysis is described. A summary of the verification of the methods in computational benchmark analyses and against the results of large-scale experiments is presented. (orig.)

  17. Midi-maxi computer interaction in the interpretation of nuclear medicine procedures

    International Nuclear Information System (INIS)

    Schlapper, G.A.

    1977-01-01

    A study of renal function with an Anger gamma camera coupled with a Digital Equipment Corporation Gamma-11 system and an IBM System 370 demonstrates the potential of quantitative determination of physiological function through the application of midi-maxi computer interaction in the interpretation of nuclear medicine procedures. It is shown that radiotracers provide an opportunity to assess the physiological processes of renal function by noninvasively following the path of a tracer as a function of time. Time-activity relationships obtained over seven anatomically defined regions are related to the parameters of a seven-compartment model employed to describe the renal clearance process. The values obtained for clinically significant parameters agree with known renal pathophysiology. Differentiation of acute, chronic, and obstructive forms of failure is indicated.

  18. Procedures monitoring and MAAP analysis

    International Nuclear Information System (INIS)

    May, R.S.

    1991-01-01

    Numerous studies of severe accidents in light water reactors have shown that operator response can play a crucial role in the predicted outcomes of dominant accident scenarios. MAAP provides the capability to specify certain operator actions as input data. However, making reasonable assumptions about the nature and timing of operator response requires substantial knowledge about plant practices and procedures and what they imply for the event being analyzed. The appearance of knowledge-based software technology in the mid-1980s provided a natural format for representing and maintaining procedures as IF-THEN rules. The boiling water reactor (BWR) Emergency Operating Procedures Tracking System (EOPTS) was composed of a rule base of procedures and a dedicated inference engine (problem-solver). Based on the general approach and experience of EOPTS, the authors have developed a prototype procedures-monitoring system that reads MAAP transient output files and evaluates the EOP messages and instructions that would be implied during each transient time interval. The prototype system was built using the NEXPERT OBJECT expert system development system, running on a 386-class personal computer with 4 MB of memory. The limited-scope prototype includes a reduced set of BWR6 EOPs evaluated on a coarse time interval, a simple text-based user interface, and a summary-report generator. The prototype, which is limited to batch-mode analysis of MAAP output, is intended to demonstrate the concept and aid in the design of a production system, which will involve a direct link to MAAP and interactive capabilities.
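
    The IF-THEN evaluation over coarse time slices can be sketched as below; the parameter names, setpoints, and instruction texts are invented illustrations, not the BWR6 EOPs.

        # Hedged sketch: rule-based procedure tracking over MAAP-style output.
        RULES = [
            {"if": lambda s: s["rpv_level_m"] < 0.0,
             "then": "RC/L: restore RPV water level with available injection"},
            {"if": lambda s: s["drywell_pressure_kpa"] > 240.0,
             "then": "PC/P: initiate drywell sprays per suppression-pool limits"},
            {"if": lambda s: s["suppression_pool_temp_c"] > 43.0,
             "then": "SP/T: start suppression pool cooling"},
        ]

        def evaluate(time_slices):
            # fire every rule whose condition holds in each time slice
            for t, state in time_slices:
                for rule in RULES:
                    if rule["if"](state):
                        print(f"t={t:>6.0f}s  {rule['then']}")

        # two coarse time slices as they might be read from a MAAP output file
        evaluate([
            (0.0,    {"rpv_level_m": 1.2, "drywell_pressure_kpa": 110,
                      "suppression_pool_temp_c": 30}),
            (1800.0, {"rpv_level_m": -0.4, "drywell_pressure_kpa": 260,
                      "suppression_pool_temp_c": 55}),
        ])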

  19. Cloud Computing in Science and Engineering and the “SciShop.ru” Computer Simulation Center

    Directory of Open Access Journals (Sweden)

    E. V. Vorozhtsov

    2011-12-01

    Full Text Available Various aspects of cloud computing applications for scientific research, applied design, and remote education are described in this paper. An analysis of the different aspects is performed based on the experience from the “SciShop.ru” Computer Simulation Center. This analysis shows that cloud computing technology has wide prospects in scientific research applications, applied developments and also remote education of specialists, postgraduates, and students.

  20. A sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory.

    Energy Technology Data Exchange (ETDEWEB)

    Johnson, J. D. (Prostat, Mesa, AZ); Oberkampf, William Louis; Helton, Jon Craig (Arizona State University, Tempe, AZ); Storlie, Curtis B. (North Carolina State University, Raleigh, NC)

    2006-10-01

    Evidence theory provides an alternative to probability theory for the representation of epistemic uncertainty in model predictions that derives from epistemic uncertainty in model inputs, where the descriptor epistemic is used to indicate uncertainty that derives from a lack of knowledge with respect to the appropriate values to use for various inputs to the model. The potential benefit, and hence appeal, of evidence theory is that it allows a less restrictive specification of uncertainty than is possible within the axiomatic structure on which probability theory is based. Unfortunately, the propagation of an evidence theory representation for uncertainty through a model is more computationally demanding than the propagation of a probabilistic representation for uncertainty, with this difficulty constituting a serious obstacle to the use of evidence theory in the representation of uncertainty in predictions obtained from computationally intensive models. This presentation describes and illustrates a sampling-based computational strategy for the representation of epistemic uncertainty in model predictions with evidence theory. Preliminary trials indicate that the presented strategy can be used to propagate uncertainty representations based on evidence theory in analysis situations where naive sampling-based (i.e., unsophisticated Monte Carlo) procedures are impracticable due to computational cost.
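
    The sampling-based propagation can be sketched as below: each focal element (an interval cell with a basic probability assignment) is sampled to bound the model's image, and belief and plausibility of an output threshold are accumulated. The model, focal elements, and threshold are invented illustrations.

        # Hedged sketch: evidence-theory propagation by naive sampling.
        import numpy as np

        def model(x1, x2):
            return x1 ** 2 + np.sin(x2)

        # focal elements: ((x1-interval, x2-interval), basic probability assignment)
        focal = [
            (((0.0, 1.0), (0.0, 2.0)), 0.5),
            (((0.5, 1.5), (1.0, 3.0)), 0.3),
            (((1.0, 2.0), (2.0, 4.0)), 0.2),
        ]

        rng = np.random.default_rng(4)
        threshold = 2.0
        bel = pl = 0.0
        for ((a1, b1), (a2, b2)), mass in focal:
            x1 = rng.uniform(a1, b1, 2000)      # sample inside the cell
            x2 = rng.uniform(a2, b2, 2000)
            y = model(x1, x2)
            if y.max() <= threshold:            # entire image below threshold
                bel += mass
            if y.min() <= threshold:            # some of the image below threshold
                pl += mass
        print(f"Bel(y <= {threshold}) = {bel},  Pl(y <= {threshold}) = {pl}")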

  1. FFTF fuel pin design procedure verification for transient operation

    International Nuclear Information System (INIS)

    Baars, R.E.

    1975-05-01

    The FFTF design procedures for evaluating fuel pin transient performance are briefly reviewed, and data where available are compared with design procedure predictions. Specifically, burst conditions derived from Fuel Cladding Transient Tester (FCTT) tests and from ANL loss-of-flow tests are compared with burst pressures computed using the design procedure upon which the cladding integrity limit was based. Failure times are predicted using the design procedure for evaluation of rapid reactivity insertion accidents, for five unterminated TREAT experiments in which well characterized fuel failures were deliberately incurred. (U.S.)

  2. Fragment informatics and computational fragment-based drug design: an overview and update.

    Science.gov (United States)

    Sheng, Chunquan; Zhang, Wannian

    2013-05-01

    Fragment-based drug design (FBDD) is a promising approach for the discovery and optimization of lead compounds. Despite its successes, FBDD also faces some internal limitations and challenges. FBDD requires a high-quality target protein and good solubility of fragments. Biophysical techniques for fragment screening necessitate expensive detection equipment, and the strategies for evolving fragment hits to leads remain to be improved. Regardless, FBDD is necessary for investigating larger chemical space and can be applied to challenging biological targets. In this scenario, cheminformatics and computational chemistry can be used as alternative approaches that can significantly improve the efficiency and success rate of lead discovery and optimization. Cheminformatics and computational tools assist FBDD in a very flexible manner. Computational FBDD can be used independently or in parallel with experimental FBDD for efficiently generating and optimizing leads. Computational FBDD can also be integrated into each step of experimental FBDD and help to play a synergistic role by maximizing its performance. This review will provide critical analysis of the complementarity between computational and experimental FBDD and highlight recent advances in new algorithms and successful examples of their applications. In particular, fragment-based cheminformatics tools, high-throughput fragment docking, and fragment-based de novo drug design will provide the focus of this review. We will also discuss the advantages and limitations of different methods and the trends in new developments that should inspire future research. © 2012 Wiley Periodicals, Inc.

  3. Interactive stereotaxic teleassistance of remote experts during arthroscopic procedures.

    Science.gov (United States)

    Wagner, Arne; Undt, Gerhard; Schicho, Kurt; Wanschitz, Felix; Watzinger, Franz; Murakami, Kenichiro; Czerny, Christian; Ewers, Rolf

    2002-01-01

    This article describes the technical setup for stereotaxic telesurgical assistance of arthroscopic procedures, and outlines the current state, limitations, and feasibility of this technical development. Teleassistance or teleconsultation implemented in endoscopic or arthroscopic procedures has not yet been reported. In this study, 7 computer-assisted arthroscopies of the temporomandibular joint were supported by extramural experts via interactive stereotaxic teleconsultation from distant locations. The external experts were supplied with close to real-time video, audio, and stereotaxic navigation data directly from the operation site. This setup allows the surgeons and external experts to interactively determine portals, target structures, and instrument positions relative to the patient's anatomy and to discuss any step of the procedures. Optoelectronic tracking interfaced to computer-based navigation technology allowed precise positioning of instruments for single or multiple temporomandibular joint punctures. The average error of digitizing probe measurements was 1.3 mm (range, 0.0 to 2.5 mm) and the average standard deviation was 0.7 mm (range, 0.4 to 0.9 mm). Evaluation of the reliability and accuracy of this technique suggests that it is sufficient for controlled navigation, even inside the small temporomandibular joint, a fact that encourages further applications for arthroscopy in general. The minimum requirement for high-quality video transmission in teleassisted procedures is an integrated services digital network (ISDN) connection. Conventional ISDN-based videoconferencing can be combined with computer-aided intraoperative navigation. Transmission control protocol/internet protocol (TCP/IP)-based stereotaxic teleassistance data transmission via ATM or satellite seems to be a promising technique to considerably improve the field of arthroscopy.

  4. Stereotaxic microsurgical procedures of cerebral intracranial tumors guided by image and attended by computer

    International Nuclear Information System (INIS)

    Lopez Flores, Gerardo; Guerra Figueredo, Eritk; Ochoa Zaldivar, Luis

    2000-01-01

    It is reported that spatial guidance during microsurgery is an essential element. This application of stereotaxic surgery at the International Center of Neurological Restoration (Cirene) is described for the period from May 1994 to February 1998, during which 65 microsurgical procedures were performed under stereotaxic conditions on 62 patients with cerebral intracranial tumors. The procedure was divided into 3 stages: image acquisition (CT); surgical planning, with the Stasis planning system; and the microsurgical procedure itself, which used the Leksell, Micromar and Esteroflex stereotaxic systems. Of the patients, 27 presented glial tumors; 33, non-glial tumors; and only 2, non-neoplastic lesions of diverse localization and size. 30 total resections were made. Surgical morbidity was minimal and there was no surgical mortality. The main advantages of this method are: exact localization of the craniotomy, easy spatial guidance, and the opportunity to distinguish the limits between the tumor and sound tissue. The possibility of applying Esteroflex to cerebral microsurgery was demonstrated.

  5. The 38th CERN School of Computing visits Greece: Apply now!

    CERN Multimedia

    Alberto Pace, CSC Director

    2015-01-01

    CERN is organising its Summer Computing School (see here) for the 38th time since 1970. CSC2015 will take place from 14 September to 25 September in Kavala, Greece. The CSCs aim at creating a common technical culture in scientific computing among scientists and engineers involved in particle physics or in sister disciplines.   The two-week programme consists of 50 hours of lectures and hands-on exercises. It covers three main themes: data technologies, base technologies and physics computing, and in particular addresses: Many-core performance optimization Concurrent programming Key aspects of multi-threading Writing code for tomorrow's hardware, today Storage technologies, reliability and performance Cryptography, authentication, authorization and accounting Data replication, caching, monitoring, alarms and quota Writing secure software Observing software with an attacker's eyes Software engineering for physics computing Statistical methods and probability conce...

  6. The 37th CERN School of Computing visits Portugal: Apply now!

    CERN Multimedia

    Alberto Pace, CSC Director

    2014-01-01

    CERN is organising its summer School of Computing (see here) for the 37th time since 1970. CSC2014 (see here) will take place from 25 August to 6 September in Braga, Portugal.   The CSCs aim at creating a common technical culture in scientific computing among scientists and engineers involved in particle physics or in sister disciplines. The two-week programme consists of 50 hours of lectures and hands-on exercises. It covers three main themes: data technologies, base technologies and physics computing, and it addresses in particular: Many-core performance optimisation Concurrent programming Key aspects of multi-threading Writing code for tomorrow's hardware, today Storage technologies, reliability and performance Cryptography, authentication, authorisation and accounting Data replication, caching, monitoring, alarms and quota Writing secure software Observing software with an attacker's eyes Software engineering for physics computing Statistical methods and proba...

  7. 21 CFR 1.383 - What expedited procedures apply when FDA initiates a seizure action against a detained perishable...

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false What expedited procedures apply when FDA initiates a seizure action against a detained perishable food? 1.383 Section 1.383 Food and Drugs FOOD AND... Administrative Detention of Food for Human or Animal Consumption General Provisions § 1.383 What expedited...

  8. Development and verification of symptom based emergency procedure support system

    International Nuclear Information System (INIS)

    Saijou, Nobuyuki; Sakuma, Akira; Takizawa, Yoji; Tamagawa, Naoko; Kubota, Ryuji; Satou, Hiroyuki; Ikeda, Koji; Taminami, Tatsuya

    1998-01-01

    A Computerized Emergency Procedure Guideline (EPG) Support System has been developed for BWRs and evaluated using a training simulator. It aims to enhance the effective utilization of the EPG. The system automatically identifies suitable symptom-based operating procedures for the present plant status. It has two functions: a plant status identification function and a man-machine interface function. For the former, a method was developed which identifies and prioritizes suitable symptom-based operational procedures against the present plant status. As the man-machine interface, an operation flow chart display was developed, which expresses the flow of the identified operating procedures graphically. For easy understanding of the display, important information such as plant status changes, the priority of operating procedures and the completion/non-completion of operations is displayed on the operation flow display in different colors. As an evaluation test, the response of the system to design-basis accidents was evaluated by actual plant operators, using the training simulator at the BWR Training Center. Through the analysis of interviews and questionnaires to operators, it was shown that the system is effective and can be utilized for a real plant. (author)

  9. Short- and long-term effects of clinical audits on compliance with procedures in CT scanning.

    Science.gov (United States)

    Oliveri, Antonio; Howarth, Nigel; Gevenois, Pierre Alain; Tack, Denis

    2016-08-01

    To test the hypothesis that quality clinical audits improve compliance with the procedures in computed tomography (CT) scanning. This retrospective study was conducted in two hospitals, based on 6950 examinations and four procedures, focusing on the acquisition length in lumbar spine CT, the default tube current applied in abdominal un-enhanced CT, the tube potential selection for portal-phase abdominal CT and the use of a specific "paediatric brain CT" procedure. The first clinical audit reported compliance with these procedures. After presenting the results to the stakeholders, a second audit was conducted to measure the impact of this information on compliance and was repeated the next year. Comparisons of proportions were performed using the Chi-square Pearson test. Depending on the procedure, the compliance rate ranged from 27 to 88 % during the first audit. After presentation of the audit results to the stakeholders, the compliance rate ranged from 68 to 93 % and was significantly improved for all procedures (P ranging from <0.001 to 0.031) in both hospitals, and remained unchanged during the third audit (P ranging from 0.114 to 0.999). Quality improvement through repeated compliance audits of CT procedures durably improves compliance. • Compliance with CT procedures is operator-dependent and not perfect. • Compliance differs between procedures and hospitals, even within a unified department. • Compliance is improved through audits followed by communication to the stakeholders. • This improvement is sustainable over a one-year period.

  10. Property-Based Anonymous Attestation in Trusted Cloud Computing

    Directory of Open Access Journals (Sweden)

    Zhen-Hu Ning

    2014-01-01

    Full Text Available In remote attestation on the trusted cloud computing platform (TCCP), the trusted computer (TC) bears an excessive burden, and the anonymity and platform configuration information security of computing nodes cannot be guaranteed. To overcome these defects, based on research on and analysis of current schemes, we propose an anonymous attestation protocol based on property certificates. The platform configuration information is converted by a matrix algorithm into a property certificate, and the remote attestation is implemented by a trusted ring signature scheme based on the Strong RSA Assumption. Through the trusted ring signature scheme based on property certificates, we achieve the anonymity of computing nodes and prevent the leakage of platform configuration information. By simulation, we obtain the computational efficiency of the scheme. We also expand the protocol and obtain an anonymous attestation scheme based on ECC. By scenario comparison, we find that the trusted ring signature scheme based on RSA has advantages as the number of ring members grows.

  11. Computer Based Modelling and Simulation

    Indian Academy of Sciences (India)

    Computer Based Modelling and Simulation … universities, and later did system analysis … personal computers (PCs) and low-cost software packages and tools. They can serve as a useful learning experience through student projects. Models are … Let us consider a numerical example: to calculate the velocity of a trainer aircraft …

  12. An Overview of Computer-Based Natural Language Processing.

    Science.gov (United States)

    Gevarter, William B.

    Computer-based Natural Language Processing (NLP) is the key to enabling humans and their computer-based creations to interact with machines using natural languages (English, Japanese, German, etc.) rather than formal computer languages. NLP is a major research area in the fields of artificial intelligence and computational linguistics. Commercial…

  13. Applied technology center business plan and market survey

    Science.gov (United States)

    Hodgin, Robert F.; Marchesini, Roberto

    1990-01-01

    The business plan and market survey for the Applied Technology Center (ATC), a non-profit corporation for computer technology transfer and development, are presented. The mission of the ATC is to stimulate innovation in state-of-the-art and leading-edge computer-based technology. The ATC encourages the practical utilization of late-breaking computer technologies by firms of all varieties.

  14. Analysis of the computational methods on the equipment shock response based on ANSYS environments

    International Nuclear Information System (INIS)

    Wang Yu; Li Zhaojun

    2005-01-01

    With developments in equipment shock vibration theory, mathematical calculation methods, simulation techniques and other aspects, equipment shock calculation methods have gradually developed from static to dynamic and from linear to non-linear. The equipment shock calculation methods applied worldwide in engineering practice mostly include the equivalent static force method, the Dynamic Design Analysis Method (DDAM) and the real-time simulation method. DDAM is a method based on modal analysis theory, which takes the shock design spectrum as the shock load and obtains the shock response of the integrated system by applying a separate cross-modal integration method in the frequency domain. The real-time simulation method carries out the computational analysis of the equipment shock response in the time domain: it uses time-history curves obtained from real-time measurement or spectrum transformation as the equipment shock load and finds an iterative solution of the differential equation of system motion with a time-domain computational procedure. Conclusions: Using DDAM and the real-time simulation method separately, this paper carries out the shock analysis of a three-dimensional frame floating raft in the ANSYS environment, analyses the results, and draws the following conclusions. Because DDAM does not account for damping, non-linear effects or phase differences between modal responses, its result is much larger than that of the real-time simulation method. The coupling response is complex when the modal result of a 3-dimensional structure is calculated, and the coupling response in the non-shock direction is also much larger than that of the real-time simulation method when DDAM is applied. Both DDAM and the real-time simulation method have their good points and scopes of application. Designers should select the design method that is economical and appropriate according to the features and anti…
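
    The record gives no formulas; for orientation, a representative ingredient of DDAM is the NRL modal summation, which combines the largest-magnitude modal response with the root-sum-square of the remainder:

        R = |R_a| + \sqrt{\sum_{i \neq a} R_i^2},

    where the R_i are the individual modal responses and R_a is the largest in magnitude. Damping and inter-modal phase are not represented in this combination, which is consistent with the conservatism relative to time-domain simulation noted above.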

  15. Applying Machine Learning and High Performance Computing to Water Quality Assessment and Prediction

    Directory of Open Access Journals (Sweden)

    Ruijian Zhang

    2017-12-01

    Full Text Available Water quality assessment and prediction is an increasingly important issue. Traditional approaches either take a lot of time or can only perform assessments. In this research, by applying a machine learning algorithm to a long period of water attribute data, we generate a decision tree that can predict the next day's water quality in an easy and efficient way. The idea is to combine traditional approaches and computer algorithms. Using machine learning algorithms, the assessment of water quality becomes far more efficient, and the generated decision tree makes the prediction quite accurate. The drawback of machine learning modelling is that execution takes quite a long time, especially when we employ a more accurate but more time-consuming clustering algorithm. Therefore, we applied a high performance computing (HPC) system to deal with this problem. Up to now, the pilot experiments have achieved very promising preliminary results. The visualized water quality assessment and prediction obtained from this project will be published on an interactive website so that the public and environmental managers can use the information for their decision making.
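
    The decision-tree idea can be sketched as follows; the feature names, labels, and generating rule are illustrative assumptions, not the project's data.

        # Hedged sketch: predict next-day water quality class with a decision tree.
        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.tree import DecisionTreeClassifier

        rng = np.random.default_rng(5)
        n = 1000
        X = np.column_stack([
            rng.normal(7.2, 0.5, n),    # pH
            rng.normal(6.0, 2.0, n),    # dissolved oxygen (mg/L)
            rng.normal(15.0, 8.0, n),   # turbidity (NTU)
        ])
        # toy rule generating next-day quality labels: 1 = acceptable
        y = ((X[:, 1] > 5.0) & (X[:, 2] < 20.0)).astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        tree = DecisionTreeClassifier(max_depth=3).fit(X_tr, y_tr)
        print("held-out accuracy:", tree.score(X_te, y_te))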

  16. Development of a Web-based CANDU Core Management Procedure Automation System

    International Nuclear Information System (INIS)

    Lee, Sanghoon; Kim, Eunggon; Park, Daeyou; Yeom, Choongsub; Suh, Hyungbum; Kim, Sungmin

    2006-01-01

    A CANDU reactor core needs efficient core management to increase safety, stability and performance as well as to decrease operational cost. The most characteristic feature of CANDU is the so-called 'on-power refueling': there is no shutdown during refueling, in opposition to PWR practice. Although on-power refueling increases the efficiency of the plant, it imposes a heavy operational burden and difficulties in real-time operation, such as regulating the power distribution, the burnup distribution, the liquid zone controller (LZC) status, the positions of control devices and so on. Several approaches help the operator and reduce these difficulties; one of them is COMOS (CANDU Core On-line Monitoring System), developed as an online core surveillance system based on the standard in-core instrumentation and numerical analysis codes such as RFSP (Reactor Fueling Simulation Program). As the procedures become more complex and the number of programs increases, an integrated and cooperative system is required. KHNP and IAE have therefore been developing a new web-based system, called COMPAS (CANDU cOre Management Procedure Automation System), which can support an effective and accurate reactor operational environment. To ensure development of a successful system, several steps of requirements identification were performed and a Software Requirement Specification (SRS) document was developed. In this paper, we emphasize how to keep consistency between the requirements and the system products by applying a requirement traceability methodology.

  17. Learning-based computing techniques in geoid modeling for precise height transformation

    Science.gov (United States)

    Erol, B.; Erol, S.

    2013-03-01

    Precise determination of the local geoid is of particular importance for establishing height control in geodetic GNSS applications, since the classical leveling technique is too laborious. A geoid model can be accurately obtained from properly distributed benchmarks having both GNSS and leveling observations, using an appropriate computing algorithm. Besides the classical multivariable polynomial regression equations (MPRE), this study evaluates learning-based computing algorithms: artificial neural networks (ANNs), the adaptive network-based fuzzy inference system (ANFIS) and especially the wavelet neural network (WNN) approach for geoid surface approximation. These algorithms were developed in parallel with advances in computer technologies and have recently been used for solving complex nonlinear problems in many applications; however, they are rather new in the precise modeling of the Earth's gravity field. In the scope of the study, these methods were applied to Istanbul GPS Triangulation Network data. The performance of the methods was assessed using the validation results of the geoid models at the observation points. In conclusion, ANFIS and WNN revealed higher prediction accuracies than the ANN and MPRE methods. Besides their prediction capabilities, these methods are also compared and discussed from a practical point of view in the conclusions.
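
    The regression task can be sketched with a plain multilayer perceptron (one member of the ANN family discussed above); the analytic surface generating the data is a stand-in for real benchmark observations, and the coordinates are invented.

        # Hedged sketch: fit geoid undulations N at GPS/levelling benchmarks as a
        # function of position, then validate at held-out points.
        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.preprocessing import StandardScaler
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(6)
        lat = rng.uniform(40.8, 41.3, 300)   # benchmark latitudes (deg)
        lon = rng.uniform(28.5, 29.5, 300)   # benchmark longitudes (deg)
        N = (36 + 2.0 * np.sin(3 * lat) + 1.5 * np.cos(2 * lon)
             + rng.normal(0, 0.02, 300))     # synthetic undulations (m)

        X = np.column_stack([lat, lon])
        model = make_pipeline(StandardScaler(),
                              MLPRegressor(hidden_layer_sizes=(20, 20),
                                           max_iter=5000, random_state=0))
        model.fit(X[:250], N[:250])          # train on 250 benchmarks
        rmse = np.sqrt(np.mean((model.predict(X[250:]) - N[250:]) ** 2))
        print(f"validation RMSE: {100 * rmse:.1f} cm")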

  18. An electrochemical procedure coupled with a Schiff base method; application to electroorganic synthesis of new nitrogen-containing heterocycles

    International Nuclear Information System (INIS)

    Dowlati, Bahram; Othman, Mohamed Rozali

    2013-01-01

    The synthesis of nitrogen-containing heterocycles has been achieved using chemical and electrochemical methods, respectively. The direct chemical synthesis of the nucleophiles proceeds through the Schiff base reaction; this procedure offers an alternative in which the reaction between dicarbonyl compounds and diamines leads to the formation of the products. The results indicate that the Schiff base chemical method for synthesis of the product was successfully performed in excellent overall yield. In the electrochemical step, a series of nitrogen-containing compounds was electrosynthesized. Various parameters, such as the applied potential, the pH of the electrolytic solution, the cell configuration and the purification techniques, were examined to optimize the yields of the corresponding products. New nitrogen-containing heterocycle derivatives were synthesized using an electrochemical procedure coupled with a Schiff base reaction as a facile, efficient and practical method. The products were characterized after purification by IR, ¹H NMR, ¹³C NMR and ESI-MS².

  19. An automated sensitivity analysis procedure for the performance assessment of nuclear waste isolation systems

    International Nuclear Information System (INIS)

    Pin, F.G.; Worley, B.A.; Oblow, E.M.; Wright, R.Q.; Harper, W.V.

    1986-01-01

    To support an effort to make large-scale sensitivity analyses feasible, cost-efficient and quantitatively complete, the authors have developed an automated procedure making use of computer calculus. The procedure, called GRESS (GRadient Enhanced Software System), is embodied in a precompiler that can process Fortran computer codes and add derivative-taking capabilities to the normal calculation scheme. In this paper, the automated GRESS procedure is described and applied to the code UCB-NE-10.2, which simulates the migration through a sorption medium of the radionuclide members of a decay chain. The sensitivity calculations for a sample problem are verified by comparison with analytical and perturbation analysis results. Conclusions are drawn about the applicability of GRESS to more general large-scale sensitivity studies, and the role of such techniques in an overall sensitivity and uncertainty analysis program is discussed.
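
    GRESS itself is a Fortran precompiler, but the derivative-augmentation idea it automates can be sketched with forward-mode automatic differentiation via dual numbers; the model function below is a made-up stand-in for a code block processed by the precompiler.

        # Minimal dual-number sketch of forward-mode automatic differentiation,
        # illustrating the derivative-augmented calculation scheme (GRESS adds
        # this capability to Fortran source; this Python class is illustrative).
        class Dual:
            def __init__(self, value, deriv=0.0):
                self.value, self.deriv = value, deriv

            def __add__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                return Dual(self.value + other.value, self.deriv + other.deriv)

            __radd__ = __add__

            def __mul__(self, other):
                other = other if isinstance(other, Dual) else Dual(other)
                # Product rule: (uv)' = u'v + uv'
                return Dual(self.value * other.value,
                            self.deriv * other.value + self.value * other.deriv)

            __rmul__ = __mul__

        def model(k):
            # Stand-in for a simulation response depending on parameter k.
            return k * k + 3.0 * k + 1.0

        k = Dual(2.0, 1.0)           # seed derivative dk/dk = 1
        out = model(k)
        print(out.value, out.deriv)  # response 11.0 and sensitivity dR/dk = 2k + 3 = 7.0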

  20. Web-based computational chemistry education with CHARMMing I: Lessons and tutorial.

    Science.gov (United States)

    Miller, Benjamin T; Singh, Rishi P; Schalk, Vinushka; Pevzner, Yuri; Sun, Jingjun; Miller, Carrie S; Boresch, Stefan; Ichiye, Toshiko; Brooks, Bernard R; Woodcock, H Lee

    2014-07-01

    This article describes the development, implementation, and use of web-based "lessons" to introduce students and other newcomers to computer simulations of biological macromolecules. These lessons, i.e., interactive step-by-step instructions for performing common molecular simulation tasks, are integrated into the collaboratively developed CHARMM INterface and Graphics (CHARMMing) web user interface (http://www.charmming.org). Several lessons have already been developed with new ones easily added via a provided Python script. In addition to CHARMMing's new lessons functionality, web-based graphical capabilities have been overhauled and are fully compatible with modern mobile web browsers (e.g., phones and tablets), allowing easy integration of these advanced simulation techniques into coursework. Finally, one of the primary objections to web-based systems like CHARMMing has been that "point and click" simulation set-up does little to teach the user about the underlying physics, biology, and computational methods being applied. In response to this criticism, we have developed a freely available tutorial to bridge the gap between graphical simulation setup and the technical knowledge necessary to perform simulations without user interface assistance.

  1. Web-based computational chemistry education with CHARMMing I: Lessons and tutorial.

    Directory of Open Access Journals (Sweden)

    Benjamin T Miller

    2014-07-01

    Full Text Available This article describes the development, implementation, and use of web-based "lessons" to introduce students and other newcomers to computer simulations of biological macromolecules. These lessons, i.e., interactive step-by-step instructions for performing common molecular simulation tasks, are integrated into the collaboratively developed CHARMM INterface and Graphics (CHARMMing) web user interface (http://www.charmming.org). Several lessons have already been developed with new ones easily added via a provided Python script. In addition to CHARMMing's new lessons functionality, web-based graphical capabilities have been overhauled and are fully compatible with modern mobile web browsers (e.g., phones and tablets), allowing easy integration of these advanced simulation techniques into coursework. Finally, one of the primary objections to web-based systems like CHARMMing has been that "point and click" simulation set-up does little to teach the user about the underlying physics, biology, and computational methods being applied. In response to this criticism, we have developed a freely available tutorial to bridge the gap between graphical simulation setup and the technical knowledge necessary to perform simulations without user interface assistance.

  2. Drawing-Based Procedural Modeling of Chinese Architectures.

    Science.gov (United States)

    Fei Hou; Yue Qi; Hong Qin

    2012-01-01

    This paper presents a novel modeling framework to build 3D models of Chinese architectures from elevation drawings. Our algorithm integrates the capability of automatic drawing recognition with powerful procedural modeling to extract production rules from an elevation drawing. First, different from previous symbol-based floor plan recognition, small horizontal repetitive regions of the elevation drawing are clustered in a bottom-up manner, based on the novel concept of repetitive pattern trees, to form architectural components with maximum repetition, which collectively serve as building blocks for 3D model generation. Second, to discover the global architectural structure and its components' interdependencies, the components are structured into a shape tree in a top-down subdivision manner and recognized hierarchically at each level of the shape tree based on Markov Random Fields (MRFs). Third, shape grammar rules can be derived to construct the 3D semantic model and its possible variations with the help of a 3D component repository. The salient contribution lies in the novel integration of procedural modeling with elevation drawings, with a unique application to Chinese architectures.

  3. Computer-based control of nuclear power information systems at international level

    International Nuclear Information System (INIS)

    Boniface, Ekechukwu; Okonkwo, Obi

    2011-01-01

    In most highly industrialized countries, information plays a major role in anti-nuclear campaigns. Information and discussion on nuclear power need critical and objective analysis before structured information is presented to the public, in order to avoid biased anti-nuclear information on the one side and neglect of the real risks of nuclear power on the other. This research develops a computer-based information system for the control of nuclear power information at the international level. The system provides easy and fast information highways for the following: (1) low regulatory dose and activity limits as indicators of high danger for individuals and the public; (2) provision of relevant technical and scientific education among the information carriers in the nuclear power countries. The research is a fact-oriented investigation of radioactivity; it also deals with fact-oriented education about nuclear accidents and safety. A standard procedure for disseminating the latest findings, using technical and scientific experts in nuclear technology, is developed. The information highway clearly analyzes the factual information about radiation risk and nuclear energy. Radiation cannot be removed from our environment, and the necessity of its uses makes nuclear energy a two-edged sword. It is therefore possible to use a computer-based information system to project and disseminate expert knowledge about nuclear technology positively, and also to guide the public on the safety and control of nuclear energy. The computer-based information highway for nuclear energy technology is intended to assist scientific research and technological development at the international level. (author)

  4. AGOA: A Hydration Procedure and Its Application to the 1-Phenyl-beta-Carboline Molecule

    OpenAIRE

    Hernandes, Marcelo Z.; Silva, João B. P. da; Longo, Ricardo L.

    2002-01-01

    A new procedure, named AGOA, has been developed and implemented in a computer program written in FORTRAN 77 to explore the hydration structures of polar solutes using its molecular electrostatic potential (MEP). This procedure can be generalized to polar solvents other than water. It has been tested for several small molecules, and applied to complex molecules of pharmacological interest, such as the beta-carbolinic systems derived from indole. This is a stringent, but not general, test of th...

  5. Accuracy of the microcanonical Lanczos method to compute real-frequency dynamical spectral functions of quantum models at finite temperatures

    Science.gov (United States)

    Okamoto, Satoshi; Alvarez, Gonzalo; Dagotto, Elbio; Tohyama, Takami

    2018-04-01

    We examine the accuracy of the microcanonical Lanczos method (MCLM) developed by Long et al. [Phys. Rev. B 68, 235106 (2003), 10.1103/PhysRevB.68.235106] to compute dynamical spectral functions of interacting quantum models at finite temperatures. The MCLM is based on the microcanonical ensemble, which becomes exact in the thermodynamic limit. To apply the microcanonical ensemble at a fixed temperature, one has to find energy eigenstates with the energy eigenvalue corresponding to the internal energy in the canonical ensemble. Here, we propose to use thermal pure quantum state methods by Sugiura and Shimizu [Phys. Rev. Lett. 111, 010401 (2013), 10.1103/PhysRevLett.111.010401] to obtain the internal energy. After obtaining the energy eigenstates using the Lanczos diagonalization method, dynamical quantities are computed via a continued fraction expansion, a standard procedure for Lanczos-based numerical methods. Using one-dimensional antiferromagnetic Heisenberg chains with S = 1/2, we demonstrate that the proposed procedure is reasonably accurate, even for relatively small systems.
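
    The continued-fraction step mentioned in the record is standard for Lanczos-based methods; the sketch below tridiagonalizes a toy 4x4 two-site S = 1/2 Hamiltonian and evaluates G(w) = <v0|(w + i*eta - H)^(-1)|v0> from the Lanczos coefficients. The matrix and seed vector are illustrative, not a finite-temperature MCLM calculation.

        import numpy as np

        def lanczos_continued_fraction(H, v0, n_iter=100, eta=0.05):
            """Lanczos tridiagonalization plus bottom-up continued fraction:
            returns G(w) = <v0|(w + i*eta - H)^(-1)|v0> as a callable."""
            a, b = [], []                      # tridiagonal coefficients
            v_prev = np.zeros_like(v0, dtype=float)
            v = v0 / np.linalg.norm(v0)
            beta = 0.0
            for _ in range(n_iter):
                w = H @ v - beta * v_prev
                alpha = np.vdot(v, w).real
                w = w - alpha * v
                beta = np.linalg.norm(w)
                a.append(alpha)
                b.append(beta)
                if beta < 1e-12:               # Krylov space exhausted
                    break
                v_prev, v = v, w / beta

            def G(omega):
                z = omega + 1j * eta
                g = 0.0                        # evaluate fraction bottom-up
                for alpha, beta in zip(reversed(a), reversed(b)):
                    g = 1.0 / (z - alpha - beta**2 * g)
                return g

            return G

        # Toy two-site S = 1/2 Heisenberg matrix in the {uu, ud, du, dd} basis:
        H = np.array([[0.25, 0.0, 0.0, 0.0],
                      [0.0, -0.25, 0.5, 0.0],
                      [0.0, 0.5, -0.25, 0.0],
                      [0.0, 0.0, 0.0, 0.25]])
        v0 = np.array([0.0, 1.0, 0.0, 0.0])
        G = lanczos_continued_fraction(H, v0, n_iter=4)
        print(-G(0.25).imag / np.pi)   # spectral weight near the pole at w = 0.25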

  6. A Compute Environment of ABC95 Array Computer Based on Multi-FPGA Chip

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The ABC95 array computer is a multi-function-network computer based on FPGA technology. The multi-function network supports conflict-free access by the processors to data in memory, and supports processor-to-processor data exchange based on an enhanced MESH network. The ABC95 instruction set includes control instructions, scalar instructions and vector instructions; the network instructions are introduced in most detail. A programming environment for ABC95 assembly language is designed, and a programming environment for the ABC95 array computer under VC++ is proposed. It includes functions for loading ABC95 programs and data, a store function, a run function and so on. In particular, the data type for ABC95 conflict-free access is defined. The results show that these technologies allow programs for the ABC95 array computer to be developed effectively.

  7. The use of Brainsuite iCT for frame-based stereotactic procedures

    DEFF Research Database (Denmark)

    Skjøth-Rasmussen, Jane; Jespersen, Bo; Brennum, Jannick

    2015-01-01

    BACKGROUND: Frame-based stereotactic procedures are the gold standard because of their superior stereotactic accuracy. The procedure used to be in multiple steps and was especially cumbersome and hazardous in intubated patients. A single-step procedure using intraoperative CT was created...

  8. Computational electromagnetics and model-based inversion a modern paradigm for eddy-current nondestructive evaluation

    CERN Document Server

    Sabbagh, Harold A; Sabbagh, Elias H; Aldrin, John C; Knopp, Jeremy S

    2013-01-01

    Computational Electromagnetics and Model-Based Inversion: A Modern Paradigm for Eddy Current Nondestructive Evaluation describes the natural marriage of the computer to eddy-current NDE. Three distinct topics are emphasized in the book: (a) fundamental mathematical principles of volume-integral equations as a subset of computational electromagnetics, (b) mathematical algorithms applied to signal-processing and inverse scattering problems, and (c) applications of these two topics to problems in which real and model data are used. By showing how mathematics and the computer can solve problems more effectively than current analog practices, this book defines the modern technology of eddy-current NDE. This book will be useful to advanced students and practitioners in the fields of computational electromagnetics, electromagnetic inverse-scattering theory, nondestructive evaluation, materials evaluation and biomedical imaging. Users of eddy-current NDE technology in industries as varied as nuclear power, aerospace,...

  9. Multispectral medical image fusion in Contourlet domain for computer based diagnosis of Alzheimer’s disease

    International Nuclear Information System (INIS)

    Bhateja, Vikrant; Moin, Aisha; Srivastava, Anuja; Bao, Le Nguyen; Lay-Ekuakille, Aimé; Le, Dac-Nhuong

    2016-01-01

    Computer based diagnosis of Alzheimer’s disease can be performed by dint of the analysis of the functional and structural changes in the brain. Multispectral image fusion deliberates upon fusion of the complementary information while discarding the surplus information to achieve a solitary image which encloses both spatial and spectral details. This paper presents a Non-Sub-sampled Contourlet Transform (NSCT) based multispectral image fusion model for computer-aided diagnosis of Alzheimer’s disease. The proposed fusion methodology involves color transformation of the input multispectral image. The multispectral image in YIQ color space is decomposed using NSCT followed by dimensionality reduction using modified Principal Component Analysis algorithm on the low frequency coefficients. Further, the high frequency coefficients are enhanced using non-linear enhancement function. Two different fusion rules are then applied to the low-pass and high-pass sub-bands: Phase congruency is applied to low frequency coefficients and a combination of directive contrast and normalized Shannon entropy is applied to high frequency coefficients. The superiority of the fusion response is depicted by the comparisons made with the other state-of-the-art fusion approaches (in terms of various fusion metrics).
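
    The record's actual rules (phase congruency on the low-pass band; directive contrast plus normalized Shannon entropy on the high-pass bands) operate on NSCT coefficients; the sketch below only shows the generic shape of coefficient-domain fusion, with an average rule and a max-absolute activity rule standing in for them, on hypothetical sub-band arrays.

        import numpy as np

        # Simplified stand-in for the paper's fusion rules, assuming the NSCT
        # sub-band coefficients of two co-registered modalities are available.
        def fuse_lowpass(lowA, lowB):
            # Average rule for the approximation band (placeholder for the
            # phase-congruency-driven rule described in the record).
            return 0.5 * (lowA + lowB)

        def fuse_highpass(highA, highB):
            # Max-absolute (activity) selection, a common stand-in for
            # contrast/entropy-driven rules on detail bands.
            return np.where(np.abs(highA) >= np.abs(highB), highA, highB)

        # Hypothetical sub-bands (random placeholders for decomposed images):
        rng = np.random.default_rng(1)
        lowA, lowB = rng.normal(size=(64, 64)), rng.normal(size=(64, 64))
        highA, highB = rng.normal(size=(64, 64)), rng.normal(size=(64, 64))
        fused_low = fuse_lowpass(lowA, lowB)
        fused_high = fuse_highpass(highA, highB)
        # The fused image would then be obtained by inverting the NSCT.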

  10. Multispectral medical image fusion in Contourlet domain for computer based diagnosis of Alzheimer’s disease

    Energy Technology Data Exchange (ETDEWEB)

    Bhateja, Vikrant, E-mail: bhateja.vikrant@gmail.com, E-mail: nhuongld@hus.edu.vn; Moin, Aisha; Srivastava, Anuja [Shri Ramswaroop Memorial Group of Professional Colleges (SRMGPC), Lucknow, Uttar Pradesh 226028 (India); Bao, Le Nguyen [Duytan University, Danang 550000 (Viet Nam); Lay-Ekuakille, Aimé [Department of Innovation Engineering, University of Salento, Lecce 73100 (Italy); Le, Dac-Nhuong, E-mail: bhateja.vikrant@gmail.com, E-mail: nhuongld@hus.edu.vn [Duytan University, Danang 550000 (Viet Nam); Haiphong University, Haiphong 180000 (Viet Nam)

    2016-07-15

    Computer based diagnosis of Alzheimer’s disease can be performed by dint of the analysis of the functional and structural changes in the brain. Multispectral image fusion deliberates upon fusion of the complementary information while discarding the surplus information to achieve a solitary image which encloses both spatial and spectral details. This paper presents a Non-Sub-sampled Contourlet Transform (NSCT) based multispectral image fusion model for computer-aided diagnosis of Alzheimer’s disease. The proposed fusion methodology involves color transformation of the input multispectral image. The multispectral image in YIQ color space is decomposed using NSCT followed by dimensionality reduction using modified Principal Component Analysis algorithm on the low frequency coefficients. Further, the high frequency coefficients are enhanced using non-linear enhancement function. Two different fusion rules are then applied to the low-pass and high-pass sub-bands: Phase congruency is applied to low frequency coefficients and a combination of directive contrast and normalized Shannon entropy is applied to high frequency coefficients. The superiority of the fusion response is depicted by the comparisons made with the other state-of-the-art fusion approaches (in terms of various fusion metrics).

  11. Transforming bases to bytes: Molecular computing with DNA

    Indian Academy of Sciences (India)

    Despite the popular image of silicon-based computers for computation, an embryonic field of molecular computation is emerging, where molecules in solution perform computational … [4] Mao C, Sun W, Shen Z and Seeman N C 1999 A nanomechanical device based on the B-Z transition of DNA; Nature 397 144–146.

  12. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    Science.gov (United States)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that harness the computing power of millions of computers on the Internet and use them to run large-scale environmental simulations and models for the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Users can easily enable their websites so that visitors can volunteer their computer resources to help run advanced hydrological models and simulations. A web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational pieces. A relational database system is utilized for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.

  13. A Delphi Study on Technology Enhanced Learning (TEL) Applied on Computer Science (CS) Skills

    Science.gov (United States)

    Porta, Marcela; Mas-Machuca, Marta; Martinez-Costa, Carme; Maillet, Katherine

    2012-01-01

    Technology Enhanced Learning (TEL) is a new pedagogical domain aiming to study the usage of information and communication technologies to support teaching and learning. The following study investigated how this domain is used to increase technical skills in Computer Science (CS). A Delphi method was applied, using three rounds of online surveys…

  14. Wheeze sound analysis using computer-based techniques: a systematic review.

    Science.gov (United States)

    Ghulam Nabi, Fizza; Sundaraj, Kenneth; Chee Kiang, Lam; Palaniappan, Rajkumar; Sundaraj, Sebastian

    2017-10-31

    Wheezes are high-pitched continuous respiratory acoustic sounds produced as a result of airway obstruction. Computer-based analyses of wheeze signals have been extensively used for parametric analysis, spectral analysis, identification of airway obstruction, feature extraction and disease or pathology classification. While this area is currently an active field of research, the available literature has not yet been reviewed. This systematic review identified articles describing wheeze analyses using computer-based techniques in the SCOPUS, IEEE Xplore, ACM, PubMed, Springer and Elsevier electronic databases. After a set of selection criteria was applied, 41 articles were selected for detailed analysis. The findings reveal that (1) computerized wheeze analysis can be used for the identification of disease severity level or pathology, (2) further research is required to achieve acceptable rates of identification of the degree of airway obstruction with normal breathing, and (3) analysis using combinations of features and subgroups of the respiratory cycle has provided a pathway to classify various diseases or pathologies that stem from airway obstruction.

  15. Yahtzee: an anonymized group level matching procedure.

    Directory of Open Access Journals (Sweden)

    Jason J Jones

    Full Text Available Researchers often face the problem of needing to protect the privacy of subjects while also needing to integrate data that contains personal information from diverse data sources. The advent of computational social science and the enormous amount of data about people that is being collected makes protecting the privacy of research subjects ever more important. However, strict privacy procedures can hinder the process of joining diverse sources of data that contain information about specific individual behaviors. In this paper we present a procedure to keep information about specific individuals from being "leaked" or shared in either direction between two sources of data without need of a trusted third party. To achieve this goal, we randomly assign individuals to anonymous groups before combining the anonymized information between the two sources of data. We refer to this method as the Yahtzee procedure, and show that it performs as predicted by theoretical analysis when we apply it to data from Facebook and public voter records.
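
    A minimal sketch of the group-level idea follows, assuming a shared key and a keyed hash as a simplified stand-in for the authors' random group assignment; the names and turnout example are invented.

        import hashlib
        import secrets

        # Both data holders derive the same anonymous group for each individual
        # from a shared key, then exchange only group-level aggregates, so no
        # individual-level information crosses between the two data sources.
        def assign_group(identifier: str, shared_key: str, k: int = 100) -> int:
            digest = hashlib.sha256((shared_key + identifier).encode()).hexdigest()
            return int(digest, 16) % k

        shared_key = secrets.token_hex(16)           # agreed upon out of band
        voters = {"alice": 1, "bob": 0, "carol": 1}  # e.g., voted / did not vote
        group_turnout = {}
        for person, voted in voters.items():
            g = assign_group(person, shared_key, k=10)
            n, s = group_turnout.get(g, (0, 0))
            group_turnout[g] = (n + 1, s + voted)
        print(group_turnout)   # only group-level counts leave either party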

  16. Yahtzee: an anonymized group level matching procedure.

    Science.gov (United States)

    Jones, Jason J; Bond, Robert M; Fariss, Christopher J; Settle, Jaime E; Kramer, Adam D I; Marlow, Cameron; Fowler, James H

    2013-01-01

    Researchers often face the problem of needing to protect the privacy of subjects while also needing to integrate data that contains personal information from diverse data sources. The advent of computational social science and the enormous amount of data about people that is being collected makes protecting the privacy of research subjects ever more important. However, strict privacy procedures can hinder the process of joining diverse sources of data that contain information about specific individual behaviors. In this paper we present a procedure to keep information about specific individuals from being "leaked" or shared in either direction between two sources of data without need of a trusted third party. To achieve this goal, we randomly assign individuals to anonymous groups before combining the anonymized information between the two sources of data. We refer to this method as the Yahtzee procedure, and show that it performs as predicted by theoretical analysis when we apply it to data from Facebook and public voter records.

  17. Computer-Based Learning in Chemistry Classes

    Science.gov (United States)

    Pietzner, Verena

    2014-01-01

    Currently not many people would doubt that computers play an essential role in both public and private life in many countries. However, somewhat surprisingly, evidence of computer use is difficult to find in German state schools although other countries have managed to implement computer-based teaching and learning in their schools. This paper…

  18. Applying and evaluating computer-animated tutors

    Science.gov (United States)

    Massaro, Dominic W.; Bosseler, Alexis; Stone, Patrick S.; Connors, Pamela

    2002-05-01

    We have developed computer-assisted speech and language tutors for deaf, hard of hearing, and autistic children. Our language-training program utilizes our computer-animated talking head, Baldi, as the conversational agent, who guides students through a variety of exercises designed to teach vocabulary and grammar, to improve speech articulation, and to develop linguistic and phonological awareness. Baldi is an accurate three-dimensional animated talking head appropriately aligned with either synthesized or natural speech. Baldi has a tongue and palate, which can be displayed by making his skin transparent. Two specific language-training programs have been evaluated to determine if they improve word learning and speech articulation. The results indicate that the programs are effective in teaching receptive and productive language. Advantages of utilizing a computer-animated agent as a language tutor are the popularity of computers and embodied conversational agents with autistic children, the perpetual availability of the program, and individualized instruction. Students enjoy working with Baldi because he offers extreme patience, he doesn't become angry, tired, or bored, and he is in effect a perpetual teaching machine. The results indicate that the psychology and technology of Baldi hold great promise in language learning and speech therapy. [Work supported by NSF Grant Nos. CDA-9726363 and BCS-9905176 and Public Health Service Grant No. PHS R01 DC00236.]

  19. Corrosion Surveillance In Pipe By Computed Radiography

    International Nuclear Information System (INIS)

    Nguyen The Man; Dao Duy Dung; Dang Thu Hong; Le Duc Thinh; Ha Hong Thu; Nguyen Trong Nghia

    2014-01-01

    Computed Radiography (CR) is a technique of digital industrial radiology which has been developed to replace conventional radiography. With a CR system, the outer and inner wall surfaces of a pipe are usually detected by edge-detection and filter algorithms applied to the profile line at the position under investigation. For industrial applications, radiographic examination shall be performed in accordance with a written procedure. This paper summarizes collected knowledge and experimental results used to establish a procedure for radiographic monitoring of corrosion in small-bore pipes. (author)
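
    The edge-detection step described above can be sketched as locating the extrema of the smoothed intensity gradient along a profile line; the profile below is synthetic, not actual CR scanner output.

        import numpy as np

        # Wall positions show up as a rising and a falling edge of the
        # intensity profile across the pipe wall; the smoothed gradient's
        # extrema give their locations.
        def wall_edges(profile, smooth=5):
            kernel = np.ones(smooth) / smooth
            smoothed = np.convolve(profile, kernel, mode="same")
            grad = np.gradient(smoothed)
            outer = int(np.argmax(grad))    # rising edge
            inner = int(np.argmin(grad))    # falling edge
            return outer, inner

        x = np.linspace(0, 50, 500)   # position across the wall, in mm
        # Synthetic step-like profile with edges near 10 mm and 18 mm:
        profile = 1.0 / (1 + np.exp(-(x - 10))) - 1.0 / (1 + np.exp(-(x - 18)))
        o, i = wall_edges(profile)
        print("wall thickness ~", abs(x[i] - x[o]), "mm")  # compare to nominal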

  20. Music Learning Based on Computer Software

    Directory of Open Access Journals (Sweden)

    Baihui Yan

    2017-12-01

    Full Text Available In order to better develop and improve students' music learning, the authors propose a method of music learning based on computer software. Using computer music software to assist teaching is still a new field. We therefore conducted an in-depth analysis of computer-enabled music learning and of the current state of music learning in secondary schools, obtaining specific analytical data. The survey data show that students have many cognitive problems in the current music classroom, and teachers have not yet found reasonable countermeasures. Against this background, introducing computer music software into music learning is a new trial that can not only cultivate students' initiative in music learning but also enhance their ability to learn music. It is therefore concluded that computer-software-based music learning is of great significance for improving current music learning modes and means.

  1. Three-phase short circuit calculation method based on pre-computed surface for doubly fed induction generator

    Science.gov (United States)

    Ma, J.; Liu, Q.

    2018-02-01

    This paper presents an improved short circuit calculation method, based on pre-computed surfaces, to determine the short circuit current of a distribution system with multiple doubly fed induction generators (DFIGs). The short circuit current injected into the power grid by a DFIG is determined by its low voltage ride through (LVRT) control and protection under a grid fault; this complexity makes the DFIG short circuit current difficult to calculate with existing methods in engineering practice. A short circuit calculation method based on pre-computed surfaces is therefore proposed, in which the surface of the short circuit current is developed as a function of the calculating impedance and the open circuit voltage, and the short circuit currents are derived taking into account the rotor excitation and the crowbar activation time. Finally, the pre-computed surfaces of the short circuit current at different times were established, and the procedure for DFIG short circuit calculation considering LVRT was designed. The correctness of the proposed method was verified by simulation.
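
    Once the surfaces are pre-computed, a fault calculation reduces to a table lookup; the sketch below does bilinear interpolation on a made-up I(Z, U) grid (the real surfaces would come from the LVRT-aware derivation described in the record).

        import numpy as np

        def interp_surface(Zg, Ug, I_surf, z, u):
            """Bilinear interpolation of a pre-computed current surface
            I(Z, U) at calculating impedance z and open-circuit voltage u."""
            i = np.clip(np.searchsorted(Zg, z) - 1, 0, len(Zg) - 2)
            j = np.clip(np.searchsorted(Ug, u) - 1, 0, len(Ug) - 2)
            tz = (z - Zg[i]) / (Zg[i + 1] - Zg[i])
            tu = (u - Ug[j]) / (Ug[j + 1] - Ug[j])
            # Weighted average of the four surrounding grid points.
            return ((1 - tz) * (1 - tu) * I_surf[i, j]
                    + tz * (1 - tu) * I_surf[i + 1, j]
                    + (1 - tz) * tu * I_surf[i, j + 1]
                    + tz * tu * I_surf[i + 1, j + 1])

        Zg = np.linspace(0.1, 2.0, 20)       # p.u. calculating impedance
        Ug = np.linspace(0.0, 1.0, 11)       # p.u. open-circuit voltage
        I_surf = Ug[None, :] / Zg[:, None]   # placeholder surface, I = U/Z
        print(interp_surface(Zg, Ug, I_surf, 0.5, 0.9))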

  2. Novel Schemes for Measurement-Based Quantum Computation

    International Nuclear Information System (INIS)

    Gross, D.; Eisert, J.

    2007-01-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics (based on finitely correlated or projected entangled pair states) to go beyond the cluster-state based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems

  3. Novel schemes for measurement-based quantum computation.

    Science.gov (United States)

    Gross, D; Eisert, J

    2007-06-01

    We establish a framework which allows one to construct novel schemes for measurement-based quantum computation. The technique develops tools from many-body physics (based on finitely correlated or projected entangled pair states) to go beyond the cluster-state based one-way computer. We identify resource states radically different from the cluster state, in that they exhibit nonvanishing correlations, can be prepared using nonmaximally entangling gates, or have very different local entanglement properties. In the computational models, randomness is compensated in a different manner. It is shown that there exist resource states which are locally arbitrarily close to a pure state. We comment on the possibility of tailoring computational models to specific physical systems.

  4. Novel computer-based endoscopic camera

    Science.gov (United States)

    Rabinovitz, R.; Hai, N.; Abraham, Martin D.; Adler, Doron; Nissani, M.; Fridental, Ron; Vitsnudel, Ilia

    1995-05-01

    We have introduced a computer-based endoscopic camera which includes (a) unique real-time digital image processing to optimize image visualization, by reducing overexposed glared areas, brightening dark areas, and accentuating sharpness and fine structures, and (b) patient data documentation and management. The image processing is based on i Sight's iSP1000™ digital video processor chip and the patented Adaptive Sensitivity™ scheme for capturing and displaying images with a wide dynamic range of light, taking into account local neighborhood image conditions and global image statistics. It provides the medical user with the ability to view images under difficult lighting conditions, without losing details 'in the dark' or in completely saturated areas. The patient data documentation and management allows storage of images (approximately 1 MB per image for a full 24-bit color image) to any storage device installed in the camera, or to an external host medium via a network. The patient data included with every image describe essential information on the patient and procedure. The operator can assign custom data descriptors and can search for a stored image/data record by typing any image descriptor. The camera optics has an extended zoom range of f = 20-45 mm, allowing control of the diameter of the field displayed on the monitor, such that the complete field of view of the endoscope can be displayed over the whole area of the screen. All these features provide a versatile endoscopic camera with excellent image quality and documentation capabilities.

  5. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth

    2003-01-01

    of prior experiences, recommendations from a trusted entity or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models … from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based…

  6. Applications of meshless methods for damage computations with finite strains

    International Nuclear Information System (INIS)

    Pan Xiaofei; Yuan Huang

    2009-01-01

    Material defects such as cavities have great effects on the damage process in ductile materials. Computations based on finite element methods (FEMs) often suffer from instability due to material failure as well as large distortions. To improve computational efficiency and robustness, the element-free Galerkin (EFG) method is applied within the micro-mechanical constitutive damage model proposed by Gurson and modified by Tvergaard and Needleman (the GTN damage model). The EFG algorithm is implemented in the general-purpose finite element code ABAQUS via the user element interface (UEL). With the help of the EFG method, damage processes in uniaxial tension specimens and notched specimens are analyzed and verified against experimental data. Computational results reveal that damage initiating in the interior of specimens extends to the exterior and causes fracture of the specimens; damage evolution is fast relative to the whole tension process. The EFG method provides a more stable and robust numerical solution compared with the FEM analysis

  7. Some computer simulations based on the linear relative risk model

    International Nuclear Information System (INIS)

    Gilbert, E.S.

    1991-10-01

    This report presents the results of computer simulations designed to evaluate and compare the performance of the likelihood ratio statistic and the score statistic for making inferences about the linear relative risk model. The work was motivated by data on workers exposed to low doses of radiation, and the report includes illustration of several procedures for obtaining confidence limits for the excess relative risk coefficient based on data from three studies of nuclear workers. The computer simulations indicate that with small sample sizes and highly skewed dose distributions, asymptotic approximations to the score statistic or to the likelihood ratio statistic may not be adequate. For testing the null hypothesis that the excess relative risk is equal to zero, the asymptotic approximation to the likelihood ratio statistic was adequate, but use of the asymptotic approximation to the score statistic rejected the null hypothesis too often. Frequently the likelihood was maximized at the lower constraint, and when this occurred, the asymptotic approximations for the likelihood ratio and score statistics did not perform well in obtaining upper confidence limits. The score statistic and likelihood ratio statistic were found to perform comparably in terms of power and width of the confidence limits. It is recommended that with modest sample sizes, confidence limits be obtained using computer simulations based on the score statistic. Although nuclear worker studies are emphasized in this report, its results are relevant for any study investigating linear dose-response functions with highly skewed exposure distributions. 22 refs., 14 tabs
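
    A hedged sketch of the kind of computation discussed here: a grid-based profile likelihood for the excess relative risk beta in RR(d) = 1 + beta*d with Poisson counts, honoring the lower constraint beta > -1/max(d); the data are simulated, not the nuclear-worker data.

        import numpy as np

        rng = np.random.default_rng(42)
        dose = rng.gamma(0.5, 0.1, size=200)         # highly skewed doses (Sv)
        expected = rng.uniform(0.5, 2.0, size=200)   # background expected cases
        beta_true = 1.5
        cases = rng.poisson(expected * (1 + beta_true * dose))

        def loglik(beta):
            # Poisson log-likelihood (up to a constant) under RR(d) = 1 + beta*d.
            mu = expected * (1 + beta * dose)
            return np.sum(cases * np.log(mu) - mu)

        # Profile over a grid; rates must stay positive, hence the lower bound.
        grid = np.linspace(-1.0 / dose.max() + 1e-6, 20.0, 4000)
        ll = np.array([loglik(b) for b in grid])
        bhat = grid[ll.argmax()]
        # Likelihood-ratio 95% limits: points where 2*(llmax - ll) <= 3.84.
        inside = ll >= ll.max() - 3.84 / 2
        lo, hi = grid[inside][0], grid[inside][-1]
        print(f"beta_hat={bhat:.2f}, 95% CI=({lo:.2f}, {hi:.2f})")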

  8. Performance of computer-aided detection applied to full-field digital mammography in detection of breast cancers

    International Nuclear Information System (INIS)

    Sadaf, Arifa; Crystal, Pavel; Scaranelo, Anabel; Helbich, Thomas

    2011-01-01

    Objective: The aim of this retrospective study was to evaluate the performance of computer-aided detection (CAD) with full-field digital mammography (FFDM) in the detection of breast cancers. Materials and Methods: CAD was retrospectively applied to the standard mammographic views of 127 cases of biopsy-proven breast cancer detected with FFDM (Senographe 2000, GE Medical Systems). CAD sensitivity was assessed for the total group of 127 cases and for subgroups based on breast density, mammographic lesion type, mammographic lesion size, histopathology and mode of presentation. Results: Overall CAD sensitivity was 91% (115 of 127 cases). There were no statistical differences (p > 0.1) in CAD detection of cancers in dense breasts, 90% (53/59), versus non-dense breasts, 91% (62/68). Sensitivity differed with lesion size, reaching 97% (22/23) for lesions larger than 20 mm. Conclusion: CAD applied to FFDM showed 100% sensitivity in identifying cancers manifesting as microcalcifications only and a high sensitivity of 86% (71/83) for other mammographic appearances of cancer. Sensitivity is influenced by lesion size. CAD in FFDM is an adjunct that helps the radiologist in the early detection of breast cancers.

  9. Computational Intelligence based techniques for islanding detection of distributed generation in distribution network: A review

    International Nuclear Information System (INIS)

    Laghari, J.A.; Mokhlis, H.; Karimi, M.; Bakar, A.H.A.; Mohamad, Hasmaini

    2014-01-01

    Highlights: • Unintentional and intentional islanding, their causes, and solutions are presented. • Remote, passive, active and hybrid islanding detection techniques are discussed. • The limitations of these techniques in accurately detecting islanding are discussed. • The ability of computational intelligence techniques to detect islanding is discussed. • A review of ANN, fuzzy logic control, ANFIS and decision tree techniques is provided. - Abstract: Accurate and fast islanding detection of distributed generation is highly important for its successful operation in distribution networks. Up to now, various islanding detection techniques based on communication, passive, active and hybrid methods have been proposed. However, each technique suffers from certain demerits that cause inaccuracies in islanding detection. Computational intelligence based techniques, owing to their robustness and flexibility in dealing with complex nonlinear systems, are an option that might solve this problem. This paper aims to provide a comprehensive review of computational intelligence based techniques applied to islanding detection of distributed generation. Moreover, the paper compares the accuracy of computational intelligence based techniques with that of existing techniques, to provide useful information for industry and utility researchers in determining the best method for their respective systems

  10. Modeling soft factors in computer-based wargames

    Science.gov (United States)

    Alexander, Steven M.; Ross, David O.; Vinarskai, Jonathan S.; Farr, Steven D.

    2002-07-01

    Computer-based wargames have seen much improvement in recent years due to rapid increases in computing power. Because these games have been developed for the entertainment industry, most of these advances have centered on the graphics, sound, and user interfaces integrated into these wargames with less attention paid to the game's fidelity. However, for a wargame to be useful to the military, it must closely approximate as many of the elements of war as possible. Among the elements that are typically not modeled or are poorly modeled in nearly all military computer-based wargames are systematic effects, command and control, intelligence, morale, training, and other human and political factors. These aspects of war, with the possible exception of systematic effects, are individually modeled quite well in many board-based commercial wargames. The work described in this paper focuses on incorporating these elements from the board-based games into a computer-based wargame. This paper will also address the modeling and simulation of the systemic paralysis of an adversary that is implied by the concept of Effects Based Operations (EBO). Combining the fidelity of current commercial board wargames with the speed, ease of use, and advanced visualization of the computer can significantly improve the effectiveness of military decision making and education. Once in place, the process of converting board wargames concepts to computer wargames will allow the infusion of soft factors into military training and planning.

  11. Application of a partitioning procedure based on Rao quadratic entropy index to characterize the temporal evolution of in situ varietal and genetic diversity of bread wheat in France over the period 1981-2006.

    Science.gov (United States)

    Perronne, Rémi; Goldringer, Isabelle

    2018-04-01

    We present and highlight a partitioning procedure based on the Rao quadratic entropy index to assess temporal in situ inter-annual varietal and genetic changes in crop diversity. For decades, Western-European agroecosystems have undergone profound changes, among them a reduction of crop genetic diversity. These changes have been highlighted in numerous studies, but no unified partitioning procedure has been proposed to compute the inter-annual variability in both varietal and genetic diversity. To fill this gap, we tested, adjusted and applied a partitioning procedure based on the Rao quadratic entropy index that makes it possible to describe the different components of crop diversity as well as to account for the relative acreages of varieties. To emphasize the relevance of this procedure, we relied on a case study of the temporal evolution of bread wheat diversity in France over the period 1981-2006 at both national and district scales. At the national scale, we highlighted a decrease of the weighted genetic replacement, indicating that varieties sown in the most recent years were more genetically similar than older ones. At the district scale, we highlighted sudden changes in weighted genetic replacement in some agricultural regions, which could be due to fast shifts of successive leading varieties over time. Other regions presented a relatively continuous increase of genetic similarity over time, potentially due to the coexistence of a larger number of co-leading varieties that got closer genetically. Based on the partitioning procedure, we argue that a tendency toward in situ genetic homogenization can be compared with some of its potential causes, such as a decrease in the speed of replacement or an increase in between-variety genetic similarity over time.
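
    The Rao quadratic entropy underlying the procedure is Q = sum_ij p_i p_j d_ij, with p the vector of variety acreage shares and d_ij a between-variety (e.g., genetic) dissimilarity; the sketch below computes Q for two hypothetical years (values invented, not the French bread wheat data).

        import numpy as np

        def rao_q(p, D):
            """Rao quadratic entropy Q = p' D p for acreage shares p and a
            symmetric dissimilarity matrix D with zero diagonal."""
            p = np.asarray(p, dtype=float)
            p = p / p.sum()                 # normalize to acreage shares
            return float(p @ D @ p)

        # Three varieties, pairwise genetic dissimilarities in [0, 1]:
        D = np.array([[0.0, 0.4, 0.7],
                      [0.4, 0.0, 0.5],
                      [0.7, 0.5, 0.0]])
        p_1981 = [0.6, 0.3, 0.1]
        p_2006 = [0.9, 0.05, 0.05]
        print(rao_q(p_1981, D), rao_q(p_2006, D))  # a drop in Q suggests homogenization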

  12. Computational Inquiry in Introductory Statistics

    Science.gov (United States)

    Toews, Carl

    2017-01-01

    Inquiry-based pedagogies have a strong presence in proof-based undergraduate mathematics courses, but can be difficult to implement in courses that are large, procedural, or highly computational. An introductory course in statistics would thus seem an unlikely candidate for an inquiry-based approach, as these courses typically steer well clear of…

  13. Special data base of Informational - Computational System 'INM RAS - Black Sea' for solving inverse and data assimilation problems

    Science.gov (United States)

    Zakharova, Natalia; Piskovatsky, Nicolay; Gusev, Anatoly

    2014-05-01

    Development of Informational-Computational Systems (ICS) for data assimilation procedures is a multidisciplinary problem. To study and solve such problems one needs to apply modern results from different disciplines and recent developments in mathematical modeling, the theory of adjoint equations and optimal control, inverse problems, numerical methods theory, numerical algebra and scientific computing. These problems are studied at the Institute of Numerical Mathematics of the Russian Academy of Sciences (INM RAS) in ICS for personal computers. In this work the results of the development of the special database for the ICS "INM RAS - Black Sea" are presented: the input information for the ICS is discussed and some special data processing procedures are described. Results of forecasts using the ICS "INM RAS - Black Sea" with operational observation data assimilation are also presented. This study was supported by the Russian Foundation for Basic Research (project No 13-01-00753) and by the Presidium Program of the Russian Academy of Sciences (project P-23 "Black Sea as an imitational ocean model"). References: 1. V.I. Agoshkov, M.V. Assovskii, S.A. Lebedev, Numerical simulation of Black Sea hydrothermodynamics taking into account tide-forming forces. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, pp. 5-31. 2. E.I. Parmuzin, V.I. Agoshkov, Numerical solution of the variational assimilation problem for sea surface temperature in the model of the Black Sea dynamics. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, pp. 69-94. 3. V.B. Zalesny, N.A. Diansky, V.V. Fomin, S.N. Moshonkin, S.G. Demyshev, Numerical model of the circulation of the Black Sea and the Sea of Azov. Russ. J. Numer. Anal. Math. Modelling (2012) 27, No.1, pp. 95-111. 4. Agoshkov V.I., Assovsky M.B., Giniatulin S.V., Zakharova N.B., Kuimov G.V., Parmuzin E.I., Fomin V.V. Informational Computational system of variational assimilation of observation data "INM RAS - Black sea" // Ecological…

  14. Computer Based Porosity Design by Multi Phase Topology Optimization

    Science.gov (United States)

    Burblies, Andreas; Busse, Matthias

    2008-02-01

    A numerical simulation technique called Multi Phase Topology Optimization (MPTO), based on the finite element method, has been developed and refined by Fraunhofer IFAM during the last five years. MPTO is able to determine the optimum distribution of two or more different materials in components under thermal and mechanical loads. The objective of the optimization is to minimize the component's elastic energy. Conventional topology optimization methods which simulate adaptive bone mineralization have the disadvantage that mass changes continuously through the growth process. MPTO keeps all initial material concentrations fixed and uses methods adapted from molecular dynamics to find the energy minimum. Applying MPTO to mechanically loaded components with a high number of different material densities, the optimization results show graded and sometimes anisotropic porosity distributions which are very similar to natural bone structures. It is now possible to design the macro- and microstructure of a mechanical component in one step. Structures from computer-based porosity design can be manufactured by new rapid prototyping technologies; Fraunhofer IFAM has successfully applied 3D printing and selective laser sintering methods to produce very stiff, lightweight components with graded porosities calculated by MPTO.
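
    A conceptual sketch of the search loop described above: material concentrations are held fixed and phase assignments are swapped whenever a swap lowers the energy; `elastic_energy` is a hypothetical placeholder standing in for the finite element evaluation, not Fraunhofer IFAM's implementation.

        import random

        def elastic_energy(phases):
            # Hypothetical placeholder: penalize neighboring cells of unlike
            # phase; a real MPTO run would evaluate an FE model here.
            return sum(phases[i] != phases[i + 1] for i in range(len(phases) - 1))

        def mpto_swap_search(phases, n_steps=10_000, seed=0):
            rng = random.Random(seed)
            energy = elastic_energy(phases)
            for _ in range(n_steps):
                i, j = rng.randrange(len(phases)), rng.randrange(len(phases))
                phases[i], phases[j] = phases[j], phases[i]      # mass-conserving swap
                new_energy = elastic_energy(phases)
                if new_energy <= energy:
                    energy = new_energy                          # accept improvement
                else:
                    phases[i], phases[j] = phases[j], phases[i]  # revert
            return phases, energy

        cells = [0] * 20 + [1] * 10      # two phases, fixed concentrations
        random.Random(1).shuffle(cells)
        print(mpto_swap_search(cells))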

  15. Solution Procedure for Transport Modeling in Effluent Recharge Based on Operator-Splitting Techniques

    Directory of Open Access Journals (Sweden)

    Shutang Zhu

    2008-01-01

    Full Text Available The coupling of groundwater movement and reactive transport during groundwater recharge with wastewater leads to a complicated mathematical model, involving terms to describe convection-dispersion, adsorption/desorption and/or biodegradation, and so forth. It has been found very difficult to solve such a coupled model either analytically or numerically. The present study adopts operator-splitting techniques to decompose the coupled model into two submodels with different intrinsic characteristics. By applying an upwind finite difference scheme to the finite volume integral of the convection flux term, an implicit solution procedure is derived to solve the convection-dominant equation. The dispersion term is discretized in a standard central-difference scheme, while the dispersion-dominant equation is solved using either the preconditioned Jacobi conjugate gradient (PJCG) method or the Thomas method based on a local one-dimensional scheme. The solution method proposed in this study is applied successfully to the demonstration project of groundwater recharge with secondary effluent at Gaobeidian sewage treatment plant (STP).
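
    A minimal sketch of one operator-split time step as described above: an implicit upwind solve for the convection-dominant part, followed by an implicit central-difference dispersion solve using the Thomas algorithm; boundary conditions and reaction/adsorption terms are simplified away, and all values are illustrative.

        import numpy as np

        def thomas(a, b, c, d):
            """Solve a tridiagonal system: a = sub-, b = main, c = super-diagonal."""
            n = len(b)
            cp, dp = np.empty(n), np.empty(n)
            cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
            for i in range(1, n):
                m = b[i] - a[i] * cp[i - 1]
                cp[i] = c[i] / m
                dp[i] = (d[i] - a[i] * dp[i - 1]) / m
            x = np.empty(n)
            x[-1] = dp[-1]
            for i in range(n - 2, -1, -1):
                x[i] = dp[i] - cp[i] * x[i + 1]
            return x

        def split_step(C, u, D, dx, dt):
            n = len(C)
            # 1) Implicit upwind convection (u > 0): -r*C[i-1] + (1+r)*C[i] = C_old[i]
            r = u * dt / dx
            a = np.full(n, -r); b = np.full(n, 1 + r); c = np.zeros(n)
            a[0] = 0.0
            C_star = thomas(a, b, c, C)
            # 2) Implicit central-difference dispersion, solved by Thomas:
            s = D * dt / dx**2
            a = np.full(n, -s); b = np.full(n, 1 + 2 * s); c = np.full(n, -s)
            a[0] = 0.0; c[-1] = 0.0
            return thomas(a, b, c, C_star)

        x = np.linspace(0, 1, 101)
        C = np.exp(-((x - 0.2) / 0.05) ** 2)   # initial concentration pulse
        for _ in range(50):
            C = split_step(C, u=1.0, D=1e-3, dx=x[1] - x[0], dt=0.002)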

  16. Semantic computing and language knowledge bases

    Science.gov (United States)

    Wang, Lei; Wang, Houfeng; Yu, Shiwen

    2017-09-01

    With the proposal of the next-generation Web, the Semantic Web, semantic computing has been drawing more and more attention within the research community and the industries. A lot of research has been conducted on the theory and methodology of the subject, and potential applications have also been investigated and proposed in many fields. The progress of semantic computing made so far cannot be detached from its supporting pivot: language resources, for instance, language knowledge bases. This paper proposes three perspectives on semantic computing from a macro view and describes the current status of the construction of language knowledge bases, and the related research and applications that have been carried out on the basis of these resources, via a case study in the Institute of Computational Linguistics at Peking University.

  17. Comparison of four software packages applied to a scattering problem

    DEFF Research Database (Denmark)

    Albertsen, Niels Christian; Chesneaux, Jean-Marie; Christiansen, Søren

    1999-01-01

    We investigate characteristic features of four different software packages by applying them to the numerical solution of a non-trivial physical problem in computer simulation, viz., scattering of waves from a sinusoidal boundary. The numerical method used is based on boundary collocation...

  18. Using a micro computer based test bank

    International Nuclear Information System (INIS)

    Hamel, R.T.

    1987-01-01

    Utilizing a micro computer based test bank offers a training department many advantages and can have a positive impact upon training procedures and examination standards. Prior to data entry, Training Department management must pre-review the examination questions and answers to ensure compliance with examination standards and to verify the validity of all questions. Management must adhere to the TSD format, since all questions require an enabling-objective numbering scheme. Each question is entered under the enabling objective upon which it is based and is then selected via that enabling objective. This eliminates any instructor bias because a random number generator chooses the test questions; however, the instructor may load specific questions to create an emphasis theme for any test. The examination, answer and cover sheets are produced and printed within minutes. The test bank eliminates the large amount of time normally required for an instructor to formulate an examination. The need for clerical support is reduced by eliminating the typing of examinations and also by the software's ability to maintain and generate student/course lists, attendance sheets, and grades. Software security measures limit access to the test bank, and the impromptu method used to generate and print an examination enhances its security.

  19. Above the cloud computing: applying cloud computing principles to create an orbital services model

    Science.gov (United States)

    Straub, Jeremy; Mohammad, Atif; Berk, Josh; Nervold, Anders K.

    2013-05-01

    Large satellites and exquisite planetary missions are generally self-contained. They have onboard all of the computational, communications and other capabilities required to perform their designated functions. Because of this, the satellite or spacecraft carries hardware that may be utilized only a fraction of the time; however, the full costs of development and launch are still borne by the program. Small satellites do not have this luxury. Due to mass and volume constraints, they cannot afford to carry numerous pieces of barely utilized equipment or large antennas. This paper proposes a cloud-computing model for exposing satellite services in an orbital environment. Under this approach, each satellite with available capabilities broadcasts a service description for each service that it can provide (e.g., general computing capacity, DSP capabilities, specialized sensing capabilities, transmission capabilities, etc.) together with its orbital elements. Consumer spacecraft retain a cache of service providers and select one using decision-making heuristics (e.g., suitability of performance, opportunity to transmit instructions and receive results, based on the orbits of the two craft). The two craft negotiate service provisioning (e.g., when the service can be available and for how long) based on the operating rules prioritizing use of (and allowing access to) the service on the provider craft, given the credentials of the consumer. Service description, negotiation and sample service performance protocols are presented. The required components of each consumer or provider spacecraft are reviewed. These include fully autonomous control capabilities (for provider craft), a lightweight orbit determination routine (to determine when consumer and provider craft can see each other and, possibly, pointing requirements for craft with directional antennas) and an authentication and resource utilization priority-based access decision making subsystem (for provider craft).
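
    A hypothetical sketch of the service-description and selection step described above; all field names and the selection heuristic are invented for illustration, not taken from the paper's protocols.

        from dataclasses import dataclass

        @dataclass
        class ServiceAdvertisement:
            craft_id: str
            service_type: str        # e.g. "dsp", "compute", "transmit"
            capacity: float          # service-specific units, e.g. MIPS or Mbit/s
            orbital_elements: tuple  # Keplerian elements for visibility prediction
            priority_rules: str      # who may use the service, and when

        def select_provider(ads, needed_type, needed_capacity):
            """Pick a provider from cached advertisements. Simplistic heuristic:
            first match with sufficient capacity; a real consumer would also
            score contact windows computed from the orbital elements."""
            for ad in ads:
                if ad.service_type == needed_type and ad.capacity >= needed_capacity:
                    return ad
            return None

        cache = [
            ServiceAdvertisement("sat-A", "compute", 50.0,
                                 (7000, 0.001, 51.6, 0, 0, 0), "open"),
            ServiceAdvertisement("sat-B", "dsp", 20.0,
                                 (6900, 0.002, 97.8, 0, 0, 0), "open"),
        ]
        print(select_provider(cache, "compute", 30.0))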

  20. Current research activities: Applied and numerical mathematics, fluid mechanics, experiments in transition and turbulence and aerodynamics, and computer science

    Science.gov (United States)

    1992-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, fluid mechanics including fluid dynamics, acoustics, and combustion, aerodynamics, and computer science during the period 1 Apr. 1992 - 30 Sep. 1992 is summarized.