WorldWideScience

Sample records for logic evaluation software

  1. Automating Software Development Process using Fuzzy Logic

    NARCIS (Netherlands)

    Marcelloni, Francesco; Aksit, Mehmet; Damiani, Ernesto; Jain, Lakhmi C.; Madravio, Mauro

    2004-01-01

    In this chapter, we aim to highlight how fuzzy logic can be a valid expressive tool to manage the software development process. We characterize a software development method in terms of two major components: artifact types and methodological rules. Classes, attributes, operations, and inheritance

  2. Logical inference and evaluation

    International Nuclear Information System (INIS)

    Perey, F.G.

    1981-01-01

    Most methodologies of evaluation currently used are based upon the theory of statistical inference. It is generally perceived that this theory is not capable of dealing satisfactorily with what are called systematic errors. Theories of logical inference should be capable of treating all of the information available, including that not involving frequency data. A theory of logical inference is presented as an extension of deductive logic via the concept of plausibility and the application of group theory. Some conclusions, based upon the application of this theory to evaluation of data, are also given

  3. Universal Programmable Logic Controller Software

    International Nuclear Information System (INIS)

    Mohd Arif Hamzah; Azhar Shamsudin; Fadil Ismail; Muhammad Nor Atan; Anwar Abdul Rahman

    2013-01-01

    A Programmable Logic Controller (PLC) is electronic hardware widely used in the manufacturing and processing industries. It also serves as the main control-system hardware for running production and manufacturing processes. More than ten well-known companies produce PLC hardware, each with its own specialties, including the programming method and language used. Malaysia Nuclear Agency has various plants and equipment run and controlled by PLCs, such as the Mintex Sinagama Plant, the Alurtron Plant, and several pieces of laboratory equipment. Since these plants and equipment are fitted with PLCs of various brands from different manufacturers, it is difficult for the supporting staff to master the control programs. The same problem occurs with new applications of this hardware, since there is no policy of purchasing only one specific brand of PLC. (author)

  4. Applications of Logic Coverage Criteria and Logic Mutation to Software Testing

    Science.gov (United States)

    Kaminski, Garrett K.

    2011-01-01

    Logic is an important component of software. Thus, software logic testing has enjoyed significant research over a period of decades, with renewed interest in the last several years. One approach to detecting logic faults is to create and execute tests that satisfy logic coverage criteria. Another approach to detecting faults is to perform mutation…
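    The difference between the two approaches is easy to see in miniature. The sketch below (illustrative only, not taken from the dissertation) shows a test set that satisfies predicate coverage on a decision yet fails to kill a simple operator-replacement mutant:

```python
def decision(a, b, c):
    # predicate under test
    return (a and b) or c

def mutant(a, b, c):
    # mutant: the "or" operator replaced by "and"
    return (a and b) and c

# This suite satisfies predicate coverage (the decision takes both
# truth values) yet cannot distinguish the mutant from the original:
weak_suite = [(True, True, True), (False, False, False)]
assert {decision(*t) for t in weak_suite} == {True, False}
assert all(decision(*t) == mutant(*t) for t in weak_suite)

# A single additional test kills the mutant:
killer = (False, False, True)
assert decision(*killer) != mutant(*killer)
```

    Coverage criteria and mutation thus measure different things: the weak suite exercises both outcomes of the decision but never puts the two operators in disagreement.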

  5. Reliability evaluation programmable logic devices

    International Nuclear Information System (INIS)

    Srivani, L.; Murali, N.; Thirugnana Murthy, D.; Satya Murty, S.A.V.

    2014-01-01

    Programmable Logic Devices (PLDs) are widely used as basic building modules in high-integrity systems, owing to robust features such as gate density, performance, and speed. PLDs are used to implement digital designs such as bus interface logic, control logic, sequencing logic, and glue logic. Due to semiconductor evolution, new PLDs with state-of-the-art features keep arriving on the market. Since these devices are reliable as per the manufacturer's specifications, they have been used in the design of safety systems. But due to their short market life, the availability of performance data is limited, so evaluating a PLD before deploying it in a safety system is very important. This paper presents a survey on the use of PLDs in the nuclear domain and the steps involved in the evaluation of a PLD using Quantitative Accelerated Life Testing. (author)

  6. Modeling of Some Chaotic Systems with AnyLogic Software

    Directory of Open Access Journals (Sweden)

    Biljana Zlatanovska

    2018-05-01

    Full Text Available Chaotic systems are well known in the theory of chaos. This paper analyzes the following chaotic systems: the Rossler, Chua and Chen systems, all of which are systems of ordinary differential equations. Their graphical representation as continuous dynamical systems via the mathematical software Mathematica and MATLAB is already known. Here the systems are analyzed by computer simulation, via examples, using the AnyLogic software. We would like to present how ordinary differential equations are modeled with AnyLogic, one of the simplest software tools for this purpose.
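    As an illustration of the kind of model involved, the Rossler system can be integrated with a plain fourth-order Runge-Kutta scheme. The sketch below uses the standard parameters a = 0.2, b = 0.2, c = 5.7 and is independent of any particular simulation tool:

```python
def rossler(state, a=0.2, b=0.2, c=5.7):
    # Rossler system: dx/dt = -y - z, dy/dt = x + a*y, dz/dt = b + z*(x - c)
    x, y, z = state
    return (-y - z, x + a * y, b + z * (x - c))

def rk4_step(f, state, h):
    # one classical fourth-order Runge-Kutta step of size h
    k1 = f(state)
    k2 = f(tuple(s + 0.5 * h * k for s, k in zip(state, k1)))
    k3 = f(tuple(s + 0.5 * h * k for s, k in zip(state, k2)))
    k4 = f(tuple(s + h * k for s, k in zip(state, k3)))
    return tuple(s + (h / 6.0) * (p + 2 * q + 2 * r + w)
                 for s, p, q, r, w in zip(state, k1, k2, k3, k4))

state = (1.0, 1.0, 1.0)
h = 0.01
trajectory = [state]
for _ in range(20000):          # integrate from t = 0 to t = 200
    state = rk4_step(rossler, state, h)
    trajectory.append(state)
# with these parameters the trajectory stays on a bounded attractor
```

    Plotting the (x, y) components of `trajectory` reproduces the familiar Rossler attractor; tools such as AnyLogic, Mathematica or MATLAB perform essentially this integration internally.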

  7. The TSO Logic and G2 Software Product

    Science.gov (United States)

    Davis, Derrick D.

    2014-01-01

    This internship assignment for spring 2014 was at the John F. Kennedy Space Center (KSC), in NASA's Engineering and Technology (NE) group, in support of the Control and Data Systems Division (NE-C) within the Systems Hardware Engineering Branch (NE-C4). The primary focus was system integration and benchmarking utilizing two separate computer software products. The first half of the internship was spent assisting NE-C4's Electronics and Embedded Systems Engineer, Kelvin Ruiz, and fellow intern Scott Ditto with the evaluation of a new piece of software called G2. It is developed by the Gensym Corporation and was introduced to the group as a tool for monitoring launch environments. All of the interns and employees in the G2 group worked together to better understand the significance of the G2 application and how KSC can benefit from its capabilities. The second stage of the spring project was to assist with the ongoing integration of a benchmarking tool developed by a group of engineers from a Canadian organization known as TSO Logic. Guided by NE-C4's Computer Engineer, Allen Villorin, the NASA 2014 interns put forth great effort in helping to integrate TSO's software into the Spaceport Processing Systems Development Laboratory (SPSDL) for further testing and evaluation. The TSO Logic group claims that their software, designed for monitoring and reducing energy consumption at in-house server farms and large data centers, allows data centers to control the power state of servers without impacting availability or performance and without changes to infrastructure; the focus of the assignment was to test this claim. TSO's Aaron Rallo, Founder and CEO, and Chris Tivel, CTO, both came to KSC to assist with the installation of their software in the SPSDL laboratory. TSO's software was installed onto 24 individual workstations running three different operating systems.
The workstations were divided into three groups of 8 with each group having its

  8. Constrained mathematics evaluation in probabilistic logic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Arlin Cooper, J

    1998-06-01

    A challenging problem in mathematically processing uncertain operands is that constraints inherent in the problem definition can require computations that are difficult to implement. Examples of possible constraints are that the sum of the probabilities of partitioned possible outcomes must be one, and repeated appearances of the same variable must all have the identical value. The latter, called the 'repeated variable problem', will be addressed in this paper in order to show how interval-based probabilistic evaluation of Boolean logic expressions, such as those describing the outcomes of fault trees and event trees, can be facilitated in a way that can be readily implemented in software. We will illustrate techniques that can be used to transform complex constrained problems into trivial problems in most tree logic expressions, and into tractable problems in most other cases.
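    The repeated variable problem is easy to demonstrate in miniature. In the hypothetical sketch below (not the paper's actual transformation technique), naively multiplying the intervals for p and 1 - p as if they were independent overstates the width of p(1 - p); forcing both appearances of p to take the same value yields the tight interval:

```python
def interval_mul(x, y):
    # naive interval product: treats the two operands as independent
    ps = (x[0] * y[0], x[0] * y[1], x[1] * y[0], x[1] * y[1])
    return (min(ps), max(ps))

p = (0.2, 0.4)                       # uncertain probability p
q = (1.0 - p[1], 1.0 - p[0])         # interval for 1 - p: (0.6, 0.8)

naive = interval_mul(p, q)           # about (0.12, 0.32): too wide

# Tight bounds: both occurrences of p must take the same value,
# so scan p over its interval and evaluate p*(1 - p) directly.
grid = [p[0] + i * (p[1] - p[0]) / 1000 for i in range(1001)]
vals = [v * (1.0 - v) for v in grid]
tight = (min(vals), max(vals))       # about (0.16, 0.24)
```

    The naive product admits combinations such as p = 0.2 paired with 1 - p = 0.8, which violate the constraint that both factors refer to the same variable; recognizing the constraint shrinks the result interval considerably.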

  9. Educational Software for First Order Logic Semantics in Introductory Logic Courses

    Science.gov (United States)

    Mauco, María Virginia; Ferrante, Enzo; Felice, Laura

    2014-01-01

    Basic courses on logic are common in most computer science curricula. Students often have difficulties in handling formalisms and getting familiar with them. Educational software helps to motivate and improve the teaching-learning processes. Therefore, incorporating these kinds of tools becomes important, because they contribute to gaining…

  10. Software component quality evaluation

    Science.gov (United States)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  11. Formal Verification of Digital Protection Logic and Automatic Testing Software

    Energy Technology Data Exchange (ETDEWEB)

    Cha, S. D.; Ha, J. S.; Seo, J. S. [KAIST, Daejeon (Korea, Republic of)]

    2008-06-15

    - Technical aspect: It is intended that digital I and C software be safe and reliable, and the project results help the software acquire a license. The software verification techniques resulting from this project can be used for digital I and C in nuclear power plants (NPPs) in the future. This research introduces many meaningful results on the verification of digital protection logic and suggests an I and C software testing strategy. These results apply to the verification of nuclear fusion devices, accelerators, nuclear waste management systems and nuclear medical devices that require dependable software and highly reliable controllers. Moreover, they can be used for military, medical or aerospace-related software. - Economic and industrial aspect: Since the safety of digital I and C software is highly important, it is essential for the software to be verified, but verification and license acquisition for digital I and C software are costly. This project benefits the domestic economy by substituting the verification and testing techniques introduced here for foreign techniques. The operation rate of NPPs will rise when NPP safety-critical software is verified with an intelligent V and V tool. It is expected that such software will replace safety-critical software that is wholly dependent on foreign suppliers. Consequently, the results of this project have high commercial value, and recognition of the software development work can spread to industrial circles. - Social and cultural aspect: People expect nuclear power generation to help relieve environmental problems because it emits less harmful air pollution than other forms of power generation. To earn society's trust in nuclear power generation, we should convince people that an NPP is a highly safe system. From that point of view, we can present highly reliable I and C, proven by an intelligent V and V technique, as evidence

  12. 242-A Control System device logic software documentation. Revision 2

    International Nuclear Information System (INIS)

    Berger, J.F.

    1995-01-01

    A Distributive Process Control system was purchased by Project B-534. This computer-based control system, called the Monitor and Control System (MCS), was installed in the 242-A Evaporator located in the 200 East Area. The purpose of the MCS is to monitor and control the Evaporator and to monitor a number of alarms and other signals from various Tank Farm facilities. Applications software for the MCS was developed by the Waste Treatment System Engineering Group of Westinghouse. This document describes the device logic for this system

  13. Software product family evaluation

    NARCIS (Netherlands)

    van der Linden, F; Bosch, J; Kamsties, E; Kansala, K; Krzanik, L; Obbink, H; VanDerLinden, F

    2004-01-01

    This paper proposes a 4-dimensional software product family engineering evaluation model. The 4 dimensions relate to the software engineering concerns of business, architecture, organisation and process. The evaluation model is meant to be used within organisations to determine the status of their

  14. Formalization of software requirements for information systems using fuzzy logic

    Science.gov (United States)

    Yegorov, Y. S.; Milov, V. R.; Kvasov, A. S.; Sorokoumova, S. N.; Suvorova, O. V.

    2018-05-01

    The paper considers an approach to the design of information systems based on flexible software development methodologies. The possibility of improving the management of the life cycle of information systems by assessing the functional relationship between requirements and business objectives is described. An approach is proposed to establish the relationship between the degree of achievement of business objectives and the fulfillment of requirements for the projected information system. Solutions are described that allow one to formalize the formation of functional and non-functional requirements with the help of the fuzzy logic apparatus. The form of the objective function is built on the basis of expert knowledge and is refined via learning from a very small data set.

  15. Methods of software V and V for a programmable logic controller in NPPs

    International Nuclear Information System (INIS)

    Kim, Jang Yeol; Lee, Young Jun; Cha, Kyung Ho; Cheon, Se Woo; Son, Han Seong; Lee, Jang Soo; Kwon, Kee Choon

    2004-01-01

    This paper addresses the Verification and Validation (V and V) process and methodology for the embedded real-time software of a safety-grade Programmable Logic Controller (PLC). This safety-grade PLC is being developed in the Korea Nuclear Instrumentation and Control System (KNICS) projects. The KNICS projects are developing a Reactor Protection System (RPS) and an Engineered Safety Feature-Component Control System (ESF-CCS) as well as the safety-grade PLC. The safety-grade PLC will be a major component composing the RPS and ESF-CCS systems as nuclear instrumentation and control equipment. This paper describes the V and V guidelines and procedures, the V and V environment, the V and V process and methodology, and the V and V tools of the KNICS projects. Specifically, it describes the real-time operating system V and V experience corresponding to the requirement analysis phase of the software development life cycle. The main activities of the real-time operating system Software Requirement Specification (SRS) V and V of the PLC are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, software safety analysis, and software configuration management. The proposed V and V methodology satisfies the Standard Review Plan (SRP)/Branch Technical Position (BTP)-14 criteria (MOST-KSRG 7/Appendix 15 in Korea will be issued soon) for safety software in nuclear power plants. The proposed V and V methodology will be used to verify the upcoming software life cycle phases in the KNICS projects. (author)

  16. V and V methods of a safety-critical software for a programmable logic controller

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jang Yeol; Lee, Young Jun; Cha, Kyung Ho; Cheon, Se Woo; Lee, Jang Soo; Kwon, Kee Choon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]; Kong, Seung Ju [Korea Hydro and Nuclear Power Co., Ltd, Daejeon (Korea, Republic of)]

    2005-11-15

    This paper addresses the Verification and Validation (V and V) process and methodology for the embedded real-time software of a safety-grade Programmable Logic Controller (PLC). This safety-grade PLC is being developed as part of the Korean Nuclear Instrumentation and Control System (KNICS) projects. The KNICS projects are developing a Reactor Protection System (RPS) and an Engineered Safety Feature-Component Control System (ESF-CCS) as well as a safety-grade PLC. The safety-grade PLC will be a major component encompassing the RPS and ESF-CCS systems as nuclear instrumentation and control equipment. This paper describes the V and V guidelines and procedures, the V and V environment, the V and V process and methodology, and the V and V tools in the KNICS projects. Specifically, it describes the real-time operating system V and V experience corresponding to the requirement analysis phase, the design phase, and the implementation and testing phase of the software development life cycle. The main activities of the V and V for the PLC system software are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, software safety analysis, and software configuration management. The proposed V and V methodology satisfies the Standard Review Plan (SRP)/Branch Technical Position (BTP)-14 criteria for safety software in nuclear power plants. The proposed V and V methodology will be used to verify the upcoming software life cycle phases in the KNICS projects.

  17. Software V and V methods for a safety - grade programmable logic controller

    International Nuclear Information System (INIS)

    Jang Yeol Kim; Young Jun Lee; Kyung Ho Cha; Se Woo Cheon; Jang Soo Lee; Kee Choon Kwon

    2006-01-01

    This paper addresses the Verification and Validation (V and V) process and methodology for the embedded real-time software of a safety-grade Programmable Logic Controller (PLC). This safety-grade PLC is being developed as one of the Korean Nuclear Instrumentation and Control System (KNICS) projects. The KNICS projects are developing a Reactor Protection System (RPS) and an Engineered Safety Feature-Component Control System (ESF-CCS) as well as a safety-grade PLC. The safety-grade PLC will be a major component encompassing the RPS and ESF-CCS systems as nuclear instrumentation and control equipment. This paper describes the V and V guidelines and procedures, the V and V environment, the V and V process and methodology, and the V and V tools in the KNICS projects. Specifically, it describes the real-time operating system V and V experience corresponding to the requirement analysis phase, the design phase, and the implementation and testing phase of the software development life cycle. The main activities of the V and V for the PLC system software are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, software safety analysis, and software configuration management. The proposed V and V methodology satisfies the Standard Review Plan (SRP)/Branch Technical Position (BTP)-14 criteria for safety software in nuclear power plants. The proposed V and V methodology will be used to verify the upcoming software life cycle phases in the KNICS projects. (author)

  18. V and V methods of a safety-critical software for a programmable logic controller

    International Nuclear Information System (INIS)

    Kim, Jang Yeol; Lee, Young Jun; Cha, Kyung Ho; Cheon, Se Woo; Lee, Jang Soo; Kwon, Kee Choon; Kong, Seung Ju

    2005-01-01

    This paper addresses the Verification and Validation (V and V) process and methodology for the embedded real-time software of a safety-grade Programmable Logic Controller (PLC). This safety-grade PLC is being developed as part of the Korean Nuclear Instrumentation and Control System (KNICS) projects. The KNICS projects are developing a Reactor Protection System (RPS) and an Engineered Safety Feature-Component Control System (ESF-CCS) as well as a safety-grade PLC. The safety-grade PLC will be a major component encompassing the RPS and ESF-CCS systems as nuclear instrumentation and control equipment. This paper describes the V and V guidelines and procedures, the V and V environment, the V and V process and methodology, and the V and V tools in the KNICS projects. Specifically, it describes the real-time operating system V and V experience corresponding to the requirement analysis phase, the design phase, and the implementation and testing phase of the software development life cycle. The main activities of the V and V for the PLC system software are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, software safety analysis, and software configuration management. The proposed V and V methodology satisfies the Standard Review Plan (SRP)/Branch Technical Position (BTP)-14 criteria for safety software in nuclear power plants. The proposed V and V methodology will be used to verify the upcoming software life cycle phases in the KNICS projects

  19. RCM and component evaluation logic

    International Nuclear Information System (INIS)

    Davis, B.; Anderson, J.

    1991-01-01

    This paper reports on the preventive maintenance program at the Palo Verde Nuclear Generating Station (PVNGS), initially developed from recommendations found in vendor technical manuals and accepted industry standards. That method was used until there was sufficient operating history to justify changes. The Reliability Centered Maintenance (RCM) project was started to provide another approach to the implementation of preventive maintenance. RCM evaluations were performed on nine systems. RCM was then suspended, since it was found during implementation that additional documentation was necessary. RCM was selected as a preventive maintenance development process because it provides a documented, analytical approach to establishing a preventive maintenance program, and it is being adopted throughout the industry as a standard approach to preventive maintenance. PVNGS became interested in performing RCM analyses primarily to ensure that system and component reliability is maintained at the highest level possible and that hidden or rare failures are addressed by appropriate and effective maintenance. A secondary reason was to save maintenance expenditures

  20. Using Abductive Research Logic: "The Logic of Discovery", to Construct a Rigorous Explanation of Amorphous Evaluation Findings

    Science.gov (United States)

    Levin-Rozalis, Miri

    2010-01-01

    Background: Two kinds of research logic prevail in scientific research: deductive research logic and inductive research logic. However, both fail in the field of evaluation, especially evaluation conducted in unfamiliar environments. Purpose: In this article I wish to suggest the application of a research logic--"abduction"--"the logic of…

  1. Data Logic

    DEFF Research Database (Denmark)

    Nilsson, Jørgen Fischer

    A gentle introduction to logical languages, logical modeling, formal reasoning and computational logic for computer science and software engineering students.

  2. SEffEst: Effort estimation in software projects using fuzzy logic and neural networks

    Directory of Open Access Journals (Sweden)

    Israel

    2012-08-01

    Full Text Available Academia and practitioners confirm that software project effort prediction is crucial for accurate software project management. However, software development effort estimation is uncertain by nature. The literature has developed methods to improve estimation correctness, in many cases using artificial intelligence techniques. Following this path, this paper presents SEffEst, a framework based on fuzzy logic and neural networks designed to increase effort estimation accuracy on software development projects. Trained on ISBSG data, SEffEst presents remarkable results in terms of prediction accuracy.
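    A minimal sketch of the fuzzy-logic half of such a framework is shown below. The membership functions, rule consequents and numbers are invented for illustration and are not taken from SEffEst or the ISBSG data set:

```python
def tri(x, a, b, c):
    # triangular membership function: 0 outside (a, c), peak 1 at b
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def estimate_effort(size_fp):
    """Mamdani-style sketch: fuzzify project size (function points),
    fire one rule per linguistic term, defuzzify by weighted average."""
    # hypothetical membership functions over project size
    small  = tri(size_fp, 0, 100, 300)
    medium = tri(size_fp, 100, 400, 700)
    large  = tri(size_fp, 400, 900, 1400)
    # hypothetical rule consequents: representative effort (person-hours)
    rules = [(small, 500.0), (medium, 2500.0), (large, 8000.0)]
    num = sum(w * e for w, e in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

# a 250-FP project is partly "small" and partly "medium",
# so its estimate falls between the two rule consequents
mid_estimate = estimate_effort(250)
```

    In a framework like SEffEst, a neural network would tune parameters such as the membership breakpoints against historical project data instead of fixing them by hand as above.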

  3. Saltwell PIC Skid Programmable Logic Controller (PLC) Software Configuration Management Plan

    International Nuclear Information System (INIS)

    KOCH, M.R.

    1999-01-01

    This document provides the procedures and guidelines necessary for computer software configuration management activities during the operation and maintenance phases of the Saltwell PIC Skids, as required by LMH-PRO-309/Rev. 0, Computer Software Quality Assurance, Section 2.6, Software Configuration Management. The software configuration management plan (SCMP) integrates technical and administrative controls to establish and maintain technical consistency among requirements, physical configuration, and documentation for the Saltwell PIC Skid Programmable Logic Controller (PLC) software during the Hanford application, operations and maintenance phases. This SCMP establishes the Saltwell PIC Skid PLC software baseline and status changes to that baseline, and ensures that the software meets design and operational requirements and is tested in accordance with its design basis

  4. Saltwell Leak Detector Station Programmable Logic Controller (PLC) Software Configuration Management Plan (SCMP)

    International Nuclear Information System (INIS)

    WHITE, K.A.

    2000-01-01

    This document provides the procedures and guidelines necessary for computer software configuration management activities during the operation and maintenance phases of the Saltwell Leak Detector Stations, as required by HNF-PRO-309/Rev. 1, Computer Software Quality Assurance, Section 2.4, Software Configuration Management. The software configuration management plan (SCMP) integrates technical and administrative controls to establish and maintain technical consistency among requirements, physical configuration, and documentation for the Saltwell Leak Detector Station Programmable Logic Controller (PLC) software during the Hanford application, operations and maintenance phases. This SCMP establishes the Saltwell Leak Detector Station PLC software baseline and status changes to that baseline, and ensures that the software meets design and operational requirements and is tested in accordance with its design basis

  5. FUZZY LOGIC BASED SOFTWARE PROCESS IMPROVIZATION FRAMEWORK FOR INDIAN SMALL SCALE SOFTWARE ORGANIZATIONS

    OpenAIRE

    A. M. Kalpana; A. Ebenezer Jeyakumar

    2010-01-01

    In this paper, the authors elaborate on the results obtained after analyzing and assessing the software process activities in five small to medium sized Indian software companies. This work demonstrates a cost effective framework for software process appraisal, specifically targeted at Indian software Small-to-Medium-sized Enterprises (SMEs). Improvisation deals with the unforeseen. It involves continual experimentation with new possibilities to create innovative and improved solutions outside cu...

  6. Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic

    Science.gov (United States)

    Leucht, Kurt W.; Semmel, Glenn S.

    2008-01-01

    The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.

  7. Experiment to evaluate software safety

    International Nuclear Information System (INIS)

    Soubies, B.; Henry, J.Y.

    1994-01-01

    The process of licensing nuclear power plants for operation consists of mandatory steps featuring detailed examination of the instrumentation and control system, including its software, by the safety authorities. The criticality of this software obliges the manufacturer to develop it in accordance with IEC 880, 'Computer software in nuclear power plant safety systems', issued by the International Electrotechnical Commission. The evaluation approach, a two-stage assessment, is described in detail. In this context, the IPSN (Institute of Protection and Nuclear Safety), the technical support body of the safety authority, uses the MALPAS tool to analyse the quality of the programs. (R.P.). 4 refs

  8. Logic Models for Program Design, Implementation, and Evaluation: Workshop Toolkit. REL 2015-057

    Science.gov (United States)

    Shakman, Karen; Rodriguez, Sheila M.

    2015-01-01

    The Logic Model Workshop Toolkit is designed to help practitioners learn the purpose of logic models, the different elements of a logic model, and the appropriate steps for developing and using a logic model for program evaluation. Topics covered in the sessions include an overview of logic models, the elements of a logic model, an introduction to…

  9. Saphire models and software for ASP evaluations

    International Nuclear Information System (INIS)

    Sattison, M.B.

    1997-01-01

    The Idaho National Engineering Laboratory (INEL) has, over the past three years, created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response, with a unique set of event trees for each plant class; (2) plant-specific fault trees using supercomponents; (3) generation and retention of all system and sequence cutsets; (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results; and (5) a user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented
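    The generation of cutsets mentioned in characteristic (3) rests on a standard construction: the minimal cutsets of an AND/OR fault tree. A toy version (not the SAPHIRE algorithm, and with invented event names) can be written as:

```python
from itertools import product

def cutsets(node):
    """Return the minimal cutsets of an AND/OR fault tree.
    Leaves are basic-event names (strings); internal nodes are
    tuples ("AND", child, ...) or ("OR", child, ...)."""
    if isinstance(node, str):
        return {frozenset([node])}
    op, children = node[0], node[1:]
    child_sets = [cutsets(c) for c in children]
    if op == "OR":
        # any child's cutset fails the gate
        combined = set().union(*child_sets)
    else:
        # "AND": one cutset from each child must occur together
        combined = {frozenset().union(*combo)
                    for combo in product(*child_sets)}
    # minimize: drop any cutset that strictly contains another
    return {cs for cs in combined
            if not any(other < cs for other in combined)}

top = ("OR",
       ("AND", "pump_fails", "valve_fails"),
       "power_loss",
       ("AND", "pump_fails", "power_loss"))
mcs = cutsets(top)
# {pump_fails, power_loss} is absorbed by {power_loss}, leaving
# two minimal cutsets: {power_loss} and {pump_fails, valve_fails}
```

    Production codes such as SAPHIRE add quantification (multiplying basic-event probabilities per cutset), truncation thresholds, and far more scalable algorithms, but the underlying notion of a minimal cutset is the same.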

  10. [The development and evaluation of software to verify diagnostic accuracy].

    Science.gov (United States)

    Jensen, Rodrigo; de Moraes Lopes, Maria Helena Baena; Silveira, Paulo Sérgio Panse; Ortega, Neli Regina Siqueira

    2012-02-01

    This article describes the development and evaluation of software that verifies the accuracy of diagnoses made by nursing students. The software is based on a model that uses fuzzy logic concepts and was built with Perl and the MySQL database for Internet accessibility, using the NANDA-I 2007-2008 classification system. The software was evaluated in terms of its technical quality and usability through specific instruments. The activity proposed in the software involves four stages in which students establish relationship values between nursing diagnoses, defining characteristics/risk factors and clinical cases. The relationship values determined by the students are compared to those of specialists, generating performance scores for the students. In the evaluation, the software demonstrated satisfactory outcomes regarding technical quality and, according to the students, helped in their learning and may become an educational tool for teaching the process of nursing diagnosis.

  11. R-1 (C-620-A) and R-2 (C-620-B) air compressor control logic, computer software description. Revision 1

    International Nuclear Information System (INIS)

    Walter, K.E.

    1995-01-01

    This document provides an updated computer software description for the software used on the FFTF R-1 (C-620-A) and R-2 (C-620-B) air compressor programmable controllers. Logic software design changes were required to allow automatic starting of a compressor that had not been previously started

  12. INDONESIA PUBLIC BANKS PERFORMANCE EVALUATION USING FUZZY LOGIC

    Directory of Open Access Journals (Sweden)

    Sugiarto Sugiarto

    2016-10-01

    Full Text Available Return on Assets (ROA) is the variable with the greatest ability to predict public banks' stock prices in Indonesia. The coefficient of determination of ROA on public banks' stock prices in Indonesia reaches 54.8%, and ROA has a significant positive influence on those prices. A fuzzy logic analysis of the performance of the 15 public banks in Indonesia was carried out using ROA data for the period 2010 to 2013. The reference bank performance according to ROA is based on Bank Indonesia Letter No. 6/23DPNP/2011. The performance of each bank was analyzed by the conventional method and, as a comparison, by fuzzy logic. The evaluation with the fuzzy logic method is able to provide added value to the currently enforced performance evaluation method: there is a significant difference between the conclusions of the fuzzy logic model and those of the conventional method.
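    A minimal sketch of how fuzzy membership can grade ROA values; the linguistic labels and cut-off points below are invented for illustration and are not the Bank Indonesia thresholds.

```python
# Illustrative fuzzy rating of an ROA value. The category cut-offs below
# are hypothetical, not the actual Bank Indonesia reference thresholds.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def roa_memberships(roa_pct):
    """Degree of membership of an ROA value (in %) in each linguistic rating."""
    return {
        "poor":    tri(roa_pct, -1.0, 0.0, 1.0),
        "fair":    tri(roa_pct,  0.5, 1.0, 1.5),
        "healthy": tri(roa_pct,  1.0, 2.0, 3.0),
    }

m = roa_memberships(1.25)
print(max(m, key=m.get))  # → fair (the dominant linguistic rating)
```

    Unlike a hard threshold, an ROA of 1.25% here belongs partly to "fair" (0.5) and partly to "healthy" (0.25), which is the added nuance the fuzzy evaluation provides.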

  13. Evaluating pharmacy leader development through the seven action logics.

    Science.gov (United States)

    Philip, Achsah; Desai, Avani; Nguyen, Phouc Anne; Birney, Patrick; Colavecchia, Anthony; Karralli, Rusol; Smith, Lindsey; Lorimer, Dirk; Burgess, Gwen; Munch, Kyle; Daniel, Nelvin; Lionetti, Jason; Garey, Kevin W

    2016-01-15

    Pharmacy leader development over time was analyzed using the seven action logics. As part of an ongoing leadership seminar series, students were required to select a visionary pharmacy leader and conduct a structured interview to evaluate pharmacy leaders' action logics. A standardized questionnaire comprising 13 questions was created by the class. Questions addressed leadership qualities during the leaders' early years, education years, and work years. Transcripts were then coded by two separate trained investigators based on the leader's stage of life to provide a score for each action logic individually over time. Kappa coefficient was used to evaluate interrater agreement. A total of 14 leaders were interviewed. All leaders were currently employed and had won national awards for their contributions to pharmacy practice. Overall, there was 82% agreement between the two evaluators' scores for the various characteristics. Action logics changed based on the leaders' life stage. Using aggregate data from all leader interviews, a progression from lower-order action logics (opportunist, diplomat, expert) to higher-order action logics (strategist, alchemist) was found. Ten leaders (71%) were diplomats during their early years. Six leaders (43%) were experts during their education years, and 4 (29%) were strategists or alchemists. During the third life stage analyzed (the work years), 6 leaders (43%) were strategists, and 2 were alchemists. During their work years, all leaders had a percentage of their answers coded as alchemist (range, 5-22%). Throughout their professional careers, pharmacy leaders continually develop skills through formal education and mentorship that follow action logics. Copyright © 2016 by the American Society of Health-System Pharmacists, Inc. All rights reserved.
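    The interrater-agreement statistic reported above, Cohen's kappa, can be computed directly from the two coders' labels; the labels below are hypothetical.

```python
# Cohen's kappa for two trained coders assigning action-logic labels,
# the interrater-agreement statistic the study reports. Labels are invented.

from collections import Counter

def cohens_kappa(rater1, rater2):
    n = len(rater1)
    po = sum(a == b for a, b in zip(rater1, rater2)) / n      # observed agreement
    c1, c2 = Counter(rater1), Counter(rater2)
    # Chance agreement: probability both raters pick the same category at random.
    pe = sum((c1[c] / n) * (c2[c] / n) for c in set(c1) | set(c2))
    return (po - pe) / (1 - pe)

coder_a = ["diplomat", "diplomat", "expert", "expert"]
coder_b = ["diplomat", "diplomat", "expert", "diplomat"]
print(cohens_kappa(coder_a, coder_b))  # → 0.5
```

    Kappa corrects raw percent agreement (here 75%) for the agreement expected by chance, which is why it is preferred over a plain match rate.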

  14. A Logic Model for Evaluating the Academic Health Department.

    Science.gov (United States)

    Erwin, Paul Campbell; McNeely, Clea S; Grubaugh, Julie H; Valentine, Jennifer; Miller, Mark D; Buchanan, Martha

    2016-01-01

    Academic Health Departments (AHDs) are collaborative partnerships between academic programs and practice settings. While case studies have informed our understanding of the development and activities of AHDs, there has been no formal published evaluation of AHDs, either singularly or collectively. Developing a framework for evaluating AHDs has potential to further aid our understanding of how these relationships may matter. In this article, we present a general theory of change, in the form of a logic model, for how AHDs impact public health at the community level. We then present a specific example of how the logic model has been customized for a specific AHD. Finally, we end with potential research questions on the AHD based on these concepts. We conclude that logic models are valuable tools, which can be used to assess the value and ultimate impact of the AHD.

  15. An Exploratory Study to Assess Analytical and Logical Thinking Skills of the Software Practitioners using a Gamification Perspective

    Directory of Open Access Journals (Sweden)

    Şahin KAYALI

    2016-12-01

    Full Text Available The link between analytical and logical thinking skills and the success of software practitioners has attracted increasing attention in the last decade. Several studies report that the ability to think logically is a requirement for improving software development skills, which demand strong reasoning. Analytical thinking is likewise a vital part of software development, for example when dividing a task into elemental parts with respect to basic rules and principles. Using the basic essence of gamification, this study proposes a mobile testing platform for assessing the analytical and logical thinking skills of software practitioners as well as computer engineering students. The assessment questions were taken from the literature and transformed into a gamified tool based on the software requirements. A focus group study was conducted to capture the requirements, which were then discussed by a group of experts using the Delphi method to reach a multidisciplinary understanding; a level of moderate agreement was achieved. In light of these findings, an assessment tool was developed and tested on both software practitioners from industry and senior computer engineering students. Our results suggest that individuals who exhibit skills in analytical and logical thinking are also more inclined to be successful in software development.

  16. Automated Software Acceleration in Programmable Logic for an Efficient NFFT Algorithm Implementation: A Case Study.

    Science.gov (United States)

    Rodríguez, Manuel; Magdaleno, Eduardo; Pérez, Fernando; García, Cristhian

    2017-03-28

    The non-equispaced Fast Fourier Transform (NFFT) is a very important algorithm in several technological and scientific areas such as synthetic aperture radar, computational photography, medical imaging, telecommunications, and seismic analysis. However, its computational complexity is high. In this paper, we describe an efficient NFFT implementation with a hardware coprocessor using an All-Programmable System-on-Chip (APSoC). This is a hybrid device that employs an Advanced RISC Machine (ARM) as the Processing System, together with Programmable Logic for high-performance digital signal processing through parallelism and pipelining. The algorithm has been coded in C with pragma directives to optimize the architecture of the system. We have used the novel Software-Defined System-on-Chip (SDSoC) development tool, which simplifies the interface and partitioning between hardware and software. This provides shorter development cycles and iterative improvements by exploring several architectures of the global system. The computational results show that hardware acceleration significantly outperforms the software-based implementation.
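    For reference, the quantity an NFFT approximates is the non-equispaced DFT below. This O(N^2) sketch is not the paper's APSoC implementation; it simply checks that at equispaced nodes the transform reduces to the ordinary FFT.

```python
# The NFFT approximates the non-equispaced DFT in O(N log N); this sketch
# evaluates the O(N^2) definition directly and verifies that, at equispaced
# nodes, it coincides with the ordinary FFT.

import numpy as np

def ndft(x, f):
    """f_hat[k] = sum_j f[j] * exp(-2*pi*i*k*x[j]), with nodes x[j] in [0, 1)."""
    N = len(f)
    k = np.arange(N)
    return np.exp(-2j * np.pi * np.outer(k, x)) @ f

N = 8
f = np.random.rand(N)

x_equi = np.arange(N) / N                  # equispaced nodes: plain DFT
assert np.allclose(ndft(x_equi, f), np.fft.fft(f))

x_rand = np.sort(np.random.rand(N))        # non-equispaced nodes
f_hat = ndft(x_rand, f)                    # what an NFFT would approximate fast
```

    The hardware coprocessor in the paper accelerates exactly this computation for large N, where the direct evaluation becomes prohibitive.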

  17. Cloud-Centric and Logically Isolated Virtual Network Environment Based on Software-Defined Wide Area Network

    Directory of Open Access Journals (Sweden)

    Dongkyun Kim

    2017-12-01

    Full Text Available Recent development of distributed cloud environments requires advanced network infrastructure in order to facilitate network automation, virtualization, high-performance data transfer, and secured access to end-to-end resources across regional boundaries. In order to meet these innovative cloud networking requirements, software-defined wide area networking (SD-WAN) is primarily demanded to converge distributed cloud resources (e.g., virtual machines (VMs)) in a programmable and intelligent manner over distant networks. Therefore, this paper proposes a logically isolated networking scheme designed to integrate distributed cloud resources into dynamic and on-demand virtual networking over SD-WAN. The performance evaluation and experimental results of the proposed scheme indicate that virtual network convergence time is minimized in two different network models: (1) an operating OpenFlow-oriented SD-WAN infrastructure (KREONET-S), which is deployed on the advanced national research network in Korea, and (2) Mininet-based experimental and emulated networks.

  18. A psychometric evaluation of the digital logic concept inventory

    Science.gov (United States)

    Herman, Geoffrey L.; Zilles, Craig; Loui, Michael C.

    2014-10-01

    Concept inventories hold tremendous promise for promoting the rigorous evaluation of teaching methods that might remedy common student misconceptions and promote deep learning. The measurements from concept inventories can be trusted only if the concept inventories are evaluated both by expert feedback and statistical scrutiny (psychometric evaluation). Classical Test Theory and Item Response Theory provide two psychometric frameworks for evaluating the quality of assessment tools. We discuss how these theories can be applied to assessment tools generally and then apply them to the Digital Logic Concept Inventory (DLCI). We demonstrate that the DLCI is sufficiently reliable for research purposes when used in its entirety and as a post-course assessment of students' conceptual understanding of digital logic. The DLCI can also discriminate between students across a wide range of ability levels, providing the most information about weaker students' ability levels.
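    One Classical Test Theory reliability statistic used in psychometric evaluations of this kind, Cronbach's alpha, can be sketched as follows; the toy response matrix is invented for illustration.

```python
# Cronbach's alpha, a Classical Test Theory internal-consistency estimate,
# computed over a (students x items) matrix of 0/1 scores. Data is invented.

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(scores):
    """scores: one row per student, one 0/1 column per inventory item."""
    k = len(scores[0])                                   # number of items
    item_vars = [variance([row[i] for row in scores]) for i in range(k)]
    total_var = variance([sum(row) for row in scores])   # variance of totals
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

scores = [[1, 1, 1],      # each row: one student's item scores
          [0, 0, 0],
          [1, 1, 1],
          [0, 0, 0]]
print(cronbach_alpha(scores))  # → 1.0 (items are perfectly consistent)
```

    Real inventories such as the DLCI report alpha well below 1.0; values around 0.7 or higher are conventionally taken as sufficient for research use.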

  19. Evaluation of high-performance computing software

    Energy Technology Data Exchange (ETDEWEB)

    Browne, S.; Dongarra, J. [Univ. of Tennessee, Knoxville, TN (United States); Rowan, T. [Oak Ridge National Lab., TN (United States)

    1996-12-31

    The absence of unbiased and up-to-date comparative evaluations of high-performance computing software complicates a user's search for the appropriate software package. The National HPCC Software Exchange (NHSE) is attacking this problem using an approach that includes independent evaluations of software, incorporation of author and user feedback into the evaluations, and Web access to the evaluations. We are applying this approach to the Parallel Tools Library (PTLIB), a new software repository for parallel systems software and tools, and HPC-Netlib, a high-performance branch of the Netlib mathematical software repository. Updating the evaluations with feedback and making them available via the Web helps ensure accuracy and timeliness, and using independent reviewers produces unbiased comparative evaluations that are difficult to find elsewhere.

  20. Management information systems software evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Al-Tunisi, N.; Ghazzawi, A.; Gruyaert, F.; Clarke, D. [Saudi Aramco, Dhahran (Saudi Arabia). Process and Control Systems Dept.

    1995-11-01

    In November 1993, Saudi Aramco management endorsed a proposal to coordinate the development of the Management Information Systems (MISs) of four concurrent projects for its facilities Controls Modernization Program. The affected projects were the Ras Tanura Refinery Upgrade Project, the Abqaiq Plant Controls Modernization and the Shedgum and Uthmaniyah Gas Plants Control Upgrade Projects. All of these projects had a significant requirement for MISs in their scope. Under the leadership of the Process and Control Systems Department, an MIS Coordination Team was formed with representatives of several departments. An MIS Applications Evaluation procedure was developed based on the Kepner-Tregoe Decision Analysis Process, and general questionnaires were sent to over a hundred potential vendors. The applications were divided into several categories, such as: Data Capture and Historization, Human User Interface, Trending, Reporting, Graphic Displays, Data Reconciliation, Statistical Analysis, Expert Systems, Maintenance Applications, Document Management, and Operations Planning and Scheduling. For each of the MIS application areas, detailed follow-up questionnaires were used to short-list the candidate products. In May and June 1994, selected vendors were invited to Saudi Arabia for an exhibition which was open to all Saudi Aramco employees. In conjunction with this, the vendors were subjected to a rigorous product testing exercise by independent teams of testers. The paper will describe the methods used and the lessons learned in this extensive software evaluation phase, which was a first for Saudi Aramco.
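    The core of a Kepner-Tregoe style evaluation is a weighted scoring of alternatives against criteria. The criteria, weights, and ratings below are invented for illustration, not Saudi Aramco's actual evaluation data.

```python
# Hedged sketch of the weighted-scoring idea behind a Kepner-Tregoe
# decision analysis: per-criterion weights times per-vendor ratings.
# All names and numbers here are hypothetical.

weights = {"trending": 10, "reporting": 8, "data_capture": 9}

vendors = {
    "vendor_a": {"trending": 7, "reporting": 9, "data_capture": 6},
    "vendor_b": {"trending": 8, "reporting": 6, "data_capture": 8},
}

def weighted_score(ratings):
    """Sum of criterion weight times the vendor's rating on that criterion."""
    return sum(weights[c] * ratings[c] for c in weights)

ranked = sorted(vendors, key=lambda v: weighted_score(vendors[v]), reverse=True)
print(ranked[0])  # → vendor_b (200 points vs 196)
```

    In a full Kepner-Tregoe analysis the weighted "wants" scoring above is preceded by a pass/fail screen on mandatory "musts", which the sketch omits.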

  1. Management information systems software evaluation

    International Nuclear Information System (INIS)

    Al-Tunisi, N.; Ghazzawi, A.; Gruyaert, F.; Clarke, D.

    1995-01-01

    In November 1993, Saudi Aramco management endorsed a proposal to coordinate the development of the Management Information Systems (MISs) of four concurrent projects for its facilities Controls Modernization Program. The affected projects were the Ras Tanura Refinery Upgrade Project, the Abqaiq Plant Controls Modernization and the Shedgum and Uthmaniyah Gas Plants Control Upgrade Projects. All of these projects had a significant requirement for MISs in their scope. Under the leadership of the Process and Control Systems Department, an MIS Coordination Team was formed with representatives of several departments. An MIS Applications Evaluation procedure was developed based on the Kepner-Tregoe Decision Analysis Process, and general questionnaires were sent to over a hundred potential vendors. The applications were divided into several categories, such as: Data Capture and Historization, Human User Interface, Trending, Reporting, Graphic Displays, Data Reconciliation, Statistical Analysis, Expert Systems, Maintenance Applications, Document Management, and Operations Planning and Scheduling. For each of the MIS application areas, detailed follow-up questionnaires were used to short-list the candidate products. In May and June 1994, selected vendors were invited to Saudi Arabia for an exhibition which was open to all Saudi Aramco employees. In conjunction with this, the vendors were subjected to a rigorous product testing exercise by independent teams of testers. The paper will describe the methods used and the lessons learned in this extensive software evaluation phase, which was a first for Saudi Aramco

  2. Evaluation of properties over phylogenetic trees using stochastic logics.

    Science.gov (United States)

    Requeno, José Ignacio; Colom, José Manuel

    2016-06-14

    Model checking has recently been introduced as an integrated framework for extracting information from phylogenetic trees using temporal logics as a querying language, an extension of modal logic that imposes restrictions on a boolean formula along a path of events. The phylogenetic tree is considered a transition system modeling evolution as a sequence of genomic mutations (we understand mutation as the different ways that DNA can be changed), while this kind of logic is suitable for traversing it in a strict and exhaustive way. Given a biological property that we wish to inspect over the phylogeny, the verifier returns true if the specification is satisfied, or a counterexample that falsifies it. However, this approach has only been considered over qualitative aspects of the phylogeny. In this paper, we address the limitations of the previous framework by including and handling quantitative information such as explicit time or probability. To this end, we apply current probabilistic continuous-time extensions of model checking to phylogenetics. We reinterpret a catalog of qualitative properties in a numerical way, and we also present new properties that could not be analyzed before. For instance, we obtain the likelihood of a tree topology according to a mutation model. As a case study, we analyze several phylogenies in order to obtain the maximum likelihood with the model checking tool PRISM. In addition, we have adapted the software to optimize the computation of maximum likelihoods. We have shown that probabilistic model checking is a competitive framework for describing and analyzing quantitative properties over phylogenetic trees. This formalism adds soundness and readability to the definition of models and specifications. Besides, the existence of model checking tools hides the underlying technology, omitting the extension, upgrade, debugging and maintenance of a software tool to the biologists. A set of benchmarks justify the feasibility of our
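    As a toy version of "likelihood of a tree according to a mutation model": a two-leaf tree with a symmetric two-state mutation process, evaluated by Felsenstein's pruning algorithm. This is an illustrative stand-in, not the PRISM encoding used in the paper.

```python
# Likelihood of observed leaf states on a two-leaf ("cherry") tree under a
# symmetric two-state continuous-time mutation model, via pruning.
# A toy stand-in for the probabilistic model checking described above.

import math

def p_trans(same, mu, t):
    """Transition probability of a symmetric 2-state CTMC after time t."""
    stay = 0.5 + 0.5 * math.exp(-2 * mu * t)
    return stay if same else 1.0 - stay

def cherry_likelihood(leaf1, leaf2, t1, t2, mu=1.0):
    """Sum over root states (uniform prior) of the product of branch terms."""
    total = 0.0
    for root in (0, 1):
        l1 = p_trans(root == leaf1, mu, t1)   # branch to first leaf
        l2 = p_trans(root == leaf2, mu, t2)   # branch to second leaf
        total += 0.5 * l1 * l2
    return total

print(cherry_likelihood(0, 0, t1=0.0, t2=0.0))  # → 0.5 (no time to mutate)
```

    With zero-length branches only identical leaves are possible (likelihood 0.5 under the uniform root prior), while very long branches drive every leaf pair toward likelihood 0.25; maximum-likelihood inference searches branch lengths between these extremes.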

  3. Contribution to the aid to computer-aided design. Simulation of digital and logical sets. The CHAMBOR software

    International Nuclear Information System (INIS)

    Mansuy, Guy

    1973-01-01

    This report presents simulation software belonging to a suite of tools for the design, analysis, test and tracing of electronic and logical assemblies. The software simulates operation over time, considering the propagation of signals through the network elements and taking into account the delay introduced by each of them. The author presents some generalities (modules, description, library, simulation of a network as a function of time), then gives a general and a detailed description of the software: data interpretation, processing of dynamic data and network simulation, and display of results on a graphical workstation
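    The delay-aware simulation described above can be sketched as a small event-driven simulator in which each gate propagates its output after its own delay. The gate set and API below are invented for illustration, not the CHAMBOR design.

```python
# Minimal event-driven logic simulator: signal changes are events on a time-
# ordered queue, and each gate schedules its output after its own delay.
# Illustrative only; not the CHAMBOR software's actual structure.

import heapq

class Simulator:
    def __init__(self):
        self.values = {}     # net name -> current logic value
        self.gates = []      # (function, input nets, output net, delay)
        self.events = []     # heap of (time, net, new value)

    def add_gate(self, func, inputs, output, delay):
        self.gates.append((func, inputs, output, delay))

    def set_input(self, time, net, value):
        heapq.heappush(self.events, (time, net, value))

    def run(self):
        while self.events:
            t, net, value = heapq.heappop(self.events)
            if self.values.get(net) == value:
                continue                      # no change: nothing propagates
            self.values[net] = value
            for func, ins, out, delay in self.gates:
                if net in ins:                # re-evaluate gates fed by this net
                    new = func(*(self.values.get(i, 0) for i in ins))
                    heapq.heappush(self.events, (t + delay, out, new))
        return self.values

sim = Simulator()
sim.add_gate(lambda a: 1 - a, ["a"], "n1", delay=2)          # inverter
sim.add_gate(lambda a, b: a & b, ["n1", "b"], "y", delay=3)  # AND gate
sim.set_input(0, "a", 0)
sim.set_input(0, "b", 1)
print(sim.run()["y"])  # → 1 (y glitches to 0 at t=3, settles to 1 at t=5)
```

    Because each gate carries its own delay, the simulator reproduces transient glitches that a purely combinational evaluation would miss, which is precisely the point of simulating "in function of time".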

  4. Repository Evaluation of Software Reuse

    OpenAIRE

    Banker, Rajiv D.; Kauffman, Robert J.; Zweig, Dani

    1993-01-01

    The article of record as published may be found at: 10.1109/32.223805. Stern School of Business Working Paper IS-93-28 (replaces Working Paper IS-93-1). The traditional unit of analysis and control for software managers is the software project, and subsequently the resulting application system. Today, with the emerging capabilities of computer-aided software engineering ...

  5. Software design specification and analysis (NuFDS) approach for the safety critical software based on programmable logic controller (PLC)

    International Nuclear Information System (INIS)

    Koo, Seo Ryong; Seong, Poong Hyun; Jung, Jin Yong; Choi, Seong Soo

    2004-01-01

    This paper introduces a software design specification and analysis technique for safety-critical systems based on the Programmable Logic Controller (PLC). Among the software development phases, the design phase plays an important role in connecting the requirements phase and the implementation phase, as the process of translating problem requirements into software structures. In this work, the Nuclear FBD-style Design Specification and analysis (NuFDS) approach was proposed. The NuFDS approach for nuclear Instrumentation and Control (I and C) software is suggested in a straightforward manner. It consists of four major specifications: Database, Software Architecture, System Behavior, and PLC Hardware Configuration. Additionally, correctness, completeness, consistency, and traceability check techniques are suggested for formal design analysis in the NuFDS approach. For tool support, we are developing the NuSDS tool, based on the NuFDS approach, especially for software design specification in nuclear fields

  6. Experiment on safety software evaluation

    International Nuclear Information System (INIS)

    Soubies, B.; Henry, J.Y.

    1994-06-01

    The licensing process for nuclear plants includes compulsory steps that involve a thorough examination of the instrumentation and control system. In this context the IPSN uses a tool called MALPAS to analyze the quality of the software involved in safety control. The IPSN is also working to automate the generation of the test sets needed for dynamic analysis. The MALPAS tool highlights the particularities of programming that can influence the testability and the maintainability of the studied software. (TEC). 4 refs

  7. HALOE test and evaluation software

    Science.gov (United States)

    Edmonds, W.; Natarajan, S.

    1987-01-01

    Computer programming, system development and analysis efforts during this contract were carried out in support of the Halogen Occultation Experiment (HALOE) at NASA/Langley. Support in the major areas of data acquisition and monitoring, data reduction and system development are described along with a brief explanation of the HALOE project. Documented listings of major software are located in the appendix.

  8. Methodology of formal software evaluation

    International Nuclear Information System (INIS)

    Tuszynski, J.

    1998-01-01

    Sydkraft AB, the major Swedish utility, owner of ca 6000 MWe installed in nuclear (NPP Barsebaeck and NPP Oskarshamn), fossil fuel and hydro power plants, is facing modernization of the control systems of its plants. The applicable standards require structured, formal methods for implementation of the control functions in modern, real-time software systems. This presentation introduces the implementation methodology as presently discussed within the Sydkraft organisation. The suggested approach is based upon the co-operation of three parties taking part in the implementation: the owner of the plant, the vendor, and a Quality Assurance (QA) organisation. QA will be based on tools for formal software validation and on systematic gathering by the owner of validated and proved-by-operation control modules for concern-wide utilisation. (author)

  9. A quantitative calculation for software reliability evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Young-Jun; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    To meet regulatory requirements, software used in the nuclear safety field is assured through development, validation, safety analysis, and quality assurance activities throughout the entire life cycle, from the planning phase to the installation phase. A variety of activities, such as quality assurance activities, are also required to improve the quality of the software. However, there are limits to how far such activities can ensure that quality is improved sufficiently. Therefore, efforts continue to calculate the reliability of the software for a quantitative evaluation instead of a qualitative one. In this paper, we propose a quantitative calculation method for the software used for a specific operation of a digital controller in an NPP. After injecting random faults into the internal space of a developed controller and calculating the ability to detect the injected faults using diagnostic software, we can evaluate the software reliability of a digital controller in an NPP. We calculated the software reliability of the controller using a new method that differs from the traditional one: it calculates the fault detection coverage after injecting faults into the software memory space, rather than assessing activities throughout the life cycle process. We attempt differentiation by creating a new definition of the fault, imitating software faults using the hardware, and assigning considerations and weights to the injected faults.
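    The fault detection coverage idea can be sketched as follows: inject bit-flip faults into a memory image, run a diagnostic (here a simple checksum, standing in for the paper's diagnostic software), and report the fraction of injected faults the diagnostic detects.

```python
# Toy fault-injection coverage calculation: flip every bit of a memory image
# in turn and count how many flips a checksum-based diagnostic detects.
# The checksum is a simplified stand-in for real diagnostic software.

def checksum(memory):
    return sum(memory) & 0xFFFFFFFF

def detection_coverage(memory):
    reference = checksum(memory)
    injected = detected = 0
    for addr in range(len(memory)):
        for bit in range(8):                   # inject one bit flip at a time
            faulty = list(memory)
            faulty[addr] ^= 1 << bit
            injected += 1
            if checksum(faulty) != reference:  # diagnostic fires
                detected += 1
    return detected / injected

memory = [0x12, 0x34, 0x56, 0x78]
print(detection_coverage(memory))  # → 1.0: the checksum catches every single bit flip
```

    Real diagnostics detect only a fraction of injected faults, and that fraction (weighted per fault class, as the paper proposes) is what enters the reliability estimate.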

  10. Evaluation of a Postdischarge Call System Using the Logic Model.

    Science.gov (United States)

    Frye, Timothy C; Poe, Terri L; Wilson, Marisa L; Milligan, Gary

    2018-02-01

    This mixed-method study was conducted to evaluate a postdischarge call program for congestive heart failure patients at a major teaching hospital in the southeastern United States. The program was implemented based on the premise that it would improve patient outcomes and overall quality of life, but it had never been evaluated for effectiveness. The Logic Model was used to evaluate the input of key staff members to determine whether the outputs and results of the program matched the expectations of the organization. Interviews, online surveys, reviews of existing patient outcome data, and reviews of publicly available program marketing materials were used to ascertain current program output. After analyzing both qualitative and quantitative data from the evaluation, recommendations were made to the organization to improve the effectiveness of the program.

  11. Software for Evaluation of Conceptual Design

    DEFF Research Database (Denmark)

    Hartvig, Susanne C

    1998-01-01

    by the prototype, it addresses the requirements that the methods imply, and it explains the actual implementation of the prototype. Finally, it discusses what has been learned from developing and testing the prototype. In this paper it is suggested that a software tool which supports evaluation of design can be developed with a limited effort, and that such tools could support a structured evaluation process as opposed to no evaluation. Compared to manual evaluation, the introduced software-based evaluation tool offers automation of tasks, such as performing assessments when they are based on prior evaluations.

  12. DEVELOPING EVALUATION INSTRUMENT FOR MATHEMATICS EDUCATIONAL SOFTWARE

    Directory of Open Access Journals (Sweden)

    Wahyu Setyaningrum

    2012-02-01

    Full Text Available The rapid increase and availability of mathematics software, whether for classroom or individual learning activities, presents a challenge for teachers. It has been argued that many products are limited in quality. Some of the more commonly used software products have been criticized for poor content, activities which fail to address some learning issues, poor graphics presentation, inadequate documentation, and other technical problems. The challenge for schools is to ensure that the educational software used in classrooms is appropriate and effective in supporting intended outcomes and goals. This paper aims to develop an instrument for evaluating mathematics educational software in order to help teachers select appropriate software. The instrument considers educational aspects, including content, teaching and learning skill, interaction, and feedback and error correction, and technical aspects, including design, clarity, assessment and documentation, cost, and hardware and software interdependence. The instrument uses a checklist approach, one of the easier and more effective methods for assessing the quality of educational software: the user simply puts a tick against each criterion. The criteria in this instrument are adapted and extended from standard evaluation instruments in several references. Keywords: mathematics educational software, educational aspect, technical aspect.

  13. Speech to Text Software Evaluation Report

    CERN Document Server

    Martins Santo, Ana Luisa

    2017-01-01

    This document compares the out-of-box performance of three commercially available speech recognition software packages: Vocapia VoxSigma, Google Cloud Speech, and Limecraft Transcriber. A set of evaluation criteria and test methods for speech recognition software is defined. The evaluation of these packages in noisy environments is also included for testing purposes. Recognition accuracy was compared across noisy environments and languages. Testing in an "ideal", non-noisy environment of a quiet room was also performed for comparison.
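    A common metric behind such accuracy comparisons is the word error rate (WER), the word-level edit distance divided by the reference length; whether the report used exactly this metric is an assumption.

```python
# Word error rate (WER): the standard ASR accuracy metric, computed as the
# word-level Levenshtein distance between reference and hypothesis,
# divided by the number of reference words.

def wer(reference, hypothesis):
    r, h = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over words.
    d = [[0] * (len(h) + 1) for _ in range(len(r) + 1)]
    for i in range(len(r) + 1):
        d[i][0] = i                      # deletions
    for j in range(len(h) + 1):
        d[0][j] = j                      # insertions
    for i in range(1, len(r) + 1):
        for j in range(1, len(h) + 1):
            sub = d[i - 1][j - 1] + (r[i - 1] != h[j - 1])
            d[i][j] = min(sub, d[i - 1][j] + 1, d[i][j - 1] + 1)
    return d[len(r)][len(h)] / len(r)

print(wer("the cat sat", "the cat sat on"))  # one insertion → WER = 1/3
```

    Note that WER can exceed 1.0 when the hypothesis contains many insertions, which is common for out-of-box systems in noisy conditions.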

  14. A FUZZY LOGIC APPROACH TO MEASURE THE PRECISE TESTABILITY INDEX OF SOFTWARE

    OpenAIRE

    NAVDEEP KAUR,; MANINDERPAL SINGH

    2011-01-01

    Much software fails as a result of poor quality. For large software projects, testing has a deep influence on the overall acceptability and quality of the final software. The testability of software can be effectively measured from the testability effort and the time required to test the software. In today's software development environment, object-oriented design and development have become important. There is a strong relationship between object-oriented metrics and the testability effor...

  15. Risk evaluation in Colombian electricity market using fuzzy logic

    International Nuclear Information System (INIS)

    Medina, S.; Moreno, J.

    2007-01-01

    This article proposes a model based on fuzzy logic to evaluate the market risk that a trading agent faces in electric power negotiation in Colombia, as part of a general negotiation model. The proposed model considers external factors such as regulatory changes, social and political issues, and the condition of the national transmission network. Market variables associated with these risk factors were selected, and graphic and statistical analyses were made in order to check their relationship with electricity prices and to determine why experts consider these factors in their analyses. Based on the results obtained, a Mamdani fuzzy inference system containing the expert knowledge was developed; it is presented as a fuzzy cognitive map. (author)
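    A one-rule-pair Mamdani inference step can be sketched as follows; the input variable, membership functions, and rules are invented for illustration and are not the article's actual system.

```python
# Hedged sketch of a single-input Mamdani fuzzy inference step for market
# risk: min-max rule activation followed by centroid defuzzification.
# Memberships, rules, and universes are hypothetical.

def volatility_high(v): return max(0.0, min(1.0, (v - 0.3) / 0.4))
def volatility_low(v):  return max(0.0, min(1.0, (0.7 - v) / 0.4))
def risk_high(r):       return max(0.0, min(1.0, (r - 0.4) / 0.4))
def risk_low(r):        return max(0.0, min(1.0, (0.6 - r) / 0.4))

def infer_risk(volatility, steps=100):
    """Mamdani min-max inference with discrete centroid defuzzification."""
    w_high = volatility_high(volatility)  # rule: high volatility -> high risk
    w_low = volatility_low(volatility)    # rule: low volatility  -> low risk
    num = den = 0.0
    for i in range(steps + 1):
        r = i / steps                     # point on the risk universe [0, 1]
        mu = max(min(w_high, risk_high(r)), min(w_low, risk_low(r)))
        num += r * mu
        den += mu
    return num / den

print(round(infer_risk(0.9), 2))  # high price volatility → risk well above 0.5
```

    The article's Mamdani system works the same way but with several inputs (regulatory, social, transmission conditions) and a full rule base encoding the experts' knowledge.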

  16. Ensuring system security through formal software evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Howell, J A; Fuyat, C [Los Alamos National Lab., NM (United States); Elvy, M [Marble Associates, Boston, MA (United States)

    1992-01-01

    With the increasing use of computer systems and networks to process safeguards information in nuclear facilities, the issue of system and data integrity is receiving worldwide attention. Among the many considerations are validation that the software performs as intended and that the information is adequately protected. Such validations are often requested of the Safeguards Systems Group of the Los Alamos National Laboratory. This paper describes our methodology for performing these software evaluations.

  17. Usability Evaluation Method for Agile Software Development

    Directory of Open Access Journals (Sweden)

    Saad Masood Butt

    2015-02-01

    Full Text Available Agile methods are the best fit for the tremendously growing software industry due to their flexible and dynamic nature. But does software developed using agile methods meet usability standards? To answer this question, we observe that the majority of agile software development projects currently involve interactive user interface designs, which are only possible by following User Centered Design (UCD) in agile methods. The question is how to integrate UCD with agile models. Both agile models and UCD are iterative in nature, but agile models focus on coding and development of software, whereas UCD focuses on the user interface of the software. Similarly, both have testing features: the agile model involves automated testing of code, while UCD involves an expert or a user testing the user interface. In this paper, a new agile usability model is proposed, and an evaluation of the proposed model is presented by practically implementing it in three real-life projects. Key results from these projects clearly show that the proposed agile model incorporates usability evaluation methods and improves the ability of usability experts to work with agile software experts; in addition, it allows agile developers to incorporate the results from UCD into subsequent iterations.

  18. Virtual optical network provisioning with unified service logic processing model for software-defined multidomain optical networks

    Science.gov (United States)

    Zhao, Yongli; Li, Shikun; Song, Yinan; Sun, Ji; Zhang, Jie

    2015-12-01

    Hierarchical control architecture is designed for software-defined multidomain optical networks (SD-MDONs), and a unified service logic processing model (USLPM) is first proposed for various applications. USLPM-based virtual optical network (VON) provisioning process is designed, and two VON mapping algorithms are proposed: random node selection and per controller computation (RNS&PCC) and balanced node selection and hierarchical controller computation (BNS&HCC). Then an SD-MDON testbed is built with OpenFlow extension in order to support optical transport equipment. Finally, VON provisioning service is experimentally demonstrated on the testbed along with performance verification.
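The two mapping policies contrast a random pick with load balancing. A minimal sketch of the node-selection step is given below; the function names `rns`/`bns`, the load metric, and the mapping loop are illustrative assumptions — the paper's RNS&PCC and BNS&HCC additionally involve per-controller and hierarchical-controller computation not modeled here:

```python
import random

def rns(nodes, load):
    """Random node selection: pick any candidate substrate node."""
    return random.choice(nodes)

def bns(nodes, load):
    """Balanced node selection: pick the least-loaded substrate node."""
    return min(nodes, key=lambda n: load[n])

def map_von(virtual_nodes, substrate, load, select):
    """Map each virtual node of a VON request onto a substrate node."""
    mapping = {}
    for vn in virtual_nodes:
        sn = select(substrate, load)
        mapping[vn] = sn
        load[sn] += 1          # one more VON hosted on this substrate node
    return mapping
```

With `bns`, successive virtual nodes spread toward the least-loaded substrate nodes, which is the balancing behavior the abstract attributes to BNS&HCC.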

  19. Software Dependability and Safety Evaluations ESA's Initiative

    Science.gov (United States)

    Hernek, M.

ESA has allocated funds for an initiative to evaluate Dependability and Safety methods of Software. The objectives of this initiative are: · More extensive validation of Safety and Dependability techniques for Software; · Providing valuable results to improve the quality of the Software, thus promoting the application of Dependability and Safety methods and techniques. ESA space systems are being developed according to defined PA requirement specifications. These requirements may be implemented through various design concepts, e.g. redundancy, diversity, etc., varying from project to project. Analysis methods (FMECA, FTA, HA, etc.) are frequently used during requirements analysis and design activities to assure the correct implementation of system PA requirements. The criticality level of failures, functions and systems is determined, thereby identifying the critical sub-systems on which dependability and safety techniques are to be applied during development. Proper performance of the software development requires the development of a technical specification for the products at the beginning of the life cycle. Such a technical specification comprises both functional and non-functional requirements. These non-functional requirements address characteristics of the product such as quality, dependability, safety and maintainability. Software in space systems is used more and more in critical functions, and the trend towards more frequent use of COTS and reusable components poses new difficulties in terms of assuring reliable and safe systems. Because of this, software dependability and safety must be carefully analysed. ESA identified and documented techniques, methods and procedures to ensure that software dependability and safety requirements are specified and taken into account during the design and development of a software system, and to verify/validate that the implemented software systems comply with these requirements [R1].

  20. Evaluation Logic of Main Control Board Fire Risk

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Kim, Kilyoo; Lim, Ho Gon [KAERI, Daejeon (Korea, Republic of)

    2015-05-15

The main control board (MCB) is defined as the collection of control panels inside the main control room (MCR) of a nuclear power plant (NPP). As the MCB has the control and instrumentation circuits of redundant trains for almost all plant systems, small fires within the control panels may be detrimental to the safe shutdown capability. A big fire affecting many panels in the MCB can cause a forced MCR abandonment by the operators as well as function failures or spurious operations of the control and instrumentation-related components. If the MCR is not habitable, a safe shutdown from outside the MCR can be achieved and maintained at an alternate shutdown panel electrically and physically independent from the MCR. Because the MCB consists of many electrical panels, it may have internal barriers between them to prevent a fire from spreading from its origin to neighboring locations. However, most MCBs of domestic NPPs do not have internal barriers within them. If the MCB cabinets are not separated by a double wall with an air gap, the propagation of an MCB panel fire cannot be ruled out. Recently, Joglar et al. proposed a new evaluation logic for MCB panel fires and mentioned that an MCB fire can be divided into propagating and non-propagating fires for abandonment and non-abandonment fire scenarios. However, they did not present details on the fire modeling approaches and probability formulas for the fire scenarios. In this paper, a decision tree for evaluating the risk of an MCB fire is proposed to systematically determine the fire scenarios in terms of the fire modeling approaches.

  1. Development of an object-oriented software based on fuzzy-logic for controlling temperatures in PAC experiments

    International Nuclear Information System (INIS)

    Lapolli, Andre L.; Yamagishi, Sueli; Domienikan, Claudio; Schoueri, Roberto M.; Carbonari, Artur W.; Saxena, Rajendra N.

    2009-01-01

The Hyperfine Interaction Laboratory at Instituto de Pesquisas Energeticas e Nucleares (IPEN) has been using the Perturbed Angular Correlation (PAC) technique for studying materials science for more than 20 years. One of the important aspects of the research involves the study of the behavior of measured properties of samples as a function of temperature. For temperatures higher than room temperature, a small resistance furnace is used to heat the sample, and the PAC measurements must be carried out at predefined temperature steps in a programmed manner. The present work describes a procedure for furnace temperature control and automatic data acquisition at different temperatures based on fuzzy logic. The procedure consists in determining the linguistic input (temp, Δtemp) and output (pow) variables and their pertinence functions. After defining the variables, an object-oriented program is written in the Java language as an interface between the principal data acquisition program and the electronic temperature controller of the mini furnace. In addition to the implementation of the class that contains the fuzzy logic and of classes with strategic algorithms defined for each temperature range, there are classes for communication between systems based on the Modbus RTU (Remote Terminal Unit) protocol connected to the serial interface RS-488. In this manner, the applied technology extends the useful life of the software, requiring only small alterations or the implementation of new classes for use with new equipment. (author)
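The linguistic variables named in the abstract (temperature error, its change, and output power) can be sketched with triangular membership functions and a small Mamdani-style rule base. The membership ranges, the rule table, and the function names below are illustrative assumptions, not IPEN's actual Java classes:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_power(err, derr):
    """Map temperature error (setpoint - measured, degC) and its rate of
    change to a heater power fraction in [0, 1], centroid-defuzzified."""
    # Input memberships (ranges are illustrative)
    neg, zero, pos = tri(err, -40, -20, 0), tri(err, -10, 0, 10), tri(err, 0, 20, 40)
    steady, rising = tri(derr, -1, 0, 1), tri(derr, 0, 2, 4)
    # Rule strengths (min for AND) -> output singletons: low=0.1, mid=0.5, high=0.9
    rules = [
        (min(pos, steady), 0.9),   # far below setpoint, steady: high power
        (min(zero, steady), 0.5),  # at setpoint: hold mid power
        (min(neg, steady), 0.1),   # above setpoint: low power
        (min(pos, rising), 0.5),   # approaching from below: back off
        (min(zero, rising), 0.1),  # starting to overshoot: cut power
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

A loop would read the thermocouple, call `fuzzy_power`, and write the result to the furnace controller at each step of the programmed temperature schedule.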

  2. Development of an object-oriented software based on fuzzy-logic for controlling temperatures in PAC experiments

    Energy Technology Data Exchange (ETDEWEB)

    Lapolli, Andre L.; Yamagishi, Sueli; Domienikan, Claudio; Schoueri, Roberto M.; Carbonari, Artur W.; Saxena, Rajendra N., E-mail: alapolli@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2009-07-01

The Hyperfine Interaction Laboratory at Instituto de Pesquisas Energeticas e Nucleares (IPEN) has been using the Perturbed Angular Correlation (PAC) technique for studying materials science for more than 20 years. One of the important aspects of the research involves the study of the behavior of measured properties of samples as a function of temperature. For temperatures higher than room temperature, a small resistance furnace is used to heat the sample, and the PAC measurements must be carried out at predefined temperature steps in a programmed manner. The present work describes a procedure for furnace temperature control and automatic data acquisition at different temperatures based on fuzzy logic. The procedure consists in determining the linguistic input (temp, Δtemp) and output (pow) variables and their pertinence functions. After defining the variables, an object-oriented program is written in the Java language as an interface between the principal data acquisition program and the electronic temperature controller of the mini furnace. In addition to the implementation of the class that contains the fuzzy logic and of classes with strategic algorithms defined for each temperature range, there are classes for communication between systems based on the Modbus RTU (Remote Terminal Unit) protocol connected to the serial interface RS-488. In this manner, the applied technology extends the useful life of the software, requiring only small alterations or the implementation of new classes for use with new equipment. (author)

  3. Evaluating Predictive Models of Software Quality

    Science.gov (United States)

    Ciaschini, V.; Canaparo, M.; Ronchieri, E.; Salomoni, D.

    2014-06-01

Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this: stability and predictability are of paramount importance, and a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, and to only deliver software with a risk lower than an agreed threshold. In this article we evaluate two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent to discover the risk factor of each product and compare it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and finally we conclude by suggesting directions for further studies.
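The release policy described — ship only when predicted risk stays under an agreed threshold — can be sketched as a simple gate. The churn-based risk proxy, package names, and threshold below are invented for illustration; they are not the predictive models evaluated in the paper:

```python
import math

def risk_from_churn(lines_changed, total_lines):
    """Toy risk proxy: a saturating function of relative code churn
    (0 = stable codebase, approaching 1 as churn grows)."""
    churn = lines_changed / total_lines
    return 1.0 - math.exp(-5.0 * churn)

def release_gate(packages, threshold=0.3):
    """Ship only packages whose predicted risk is below the threshold."""
    return sorted(pkg for pkg, risk in packages.items() if risk < threshold)

# Hypothetical packages with their recent churn
pkgs = {
    "emi-auth":    risk_from_churn(100, 10000),   # ~1% churn: low risk
    "emi-compute": risk_from_churn(2000, 10000),  # 20% churn: held back
}
```

Here the low-churn package passes the gate while the heavily churned one is held for further testing, mirroring the threshold-based delivery policy.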

  4. Evaluating predictive models of software quality

    International Nuclear Information System (INIS)

    Ciaschini, V; Canaparo, M; Ronchieri, E; Salomoni, D

    2014-01-01

Applications from the High Energy Physics scientific community are constantly growing and are implemented by a large number of developers. This implies strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this: stability and predictability are of paramount importance, and a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, and to only deliver software with a risk lower than an agreed threshold. In this article we evaluate two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent to discover the risk factor of each product and compare it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and finally we conclude by suggesting directions for further studies.

  5. A Logic Programming Based Approach to Applying Abstract Interpretation to Embedded Software

    DEFF Research Database (Denmark)

    Henriksen, Kim Steen

Logic programming is a programming paradigm with a solid mathematical foundation. One of its characteristics is the separation of logic (the meaning of a program) from control (how the program is executed), which makes logic programming very well suited to program analysis. In this thesis, logic programming is applied to the analysis of programs developed for embedded systems. A given embedded system is modelled as an emulator written in a variant of logic programming called constraint logic programming (CLP). The emulator is specialized with respect to a given program, resulting in a new CLP program that is isomorphic to the program written for the embedded system. When analyzers based on abstract interpretation are applied to the specialized program, the results of the analysis carry over directly to the embedded program, since that program and the specialized emulator are isomorphic.

  6. D-VASim: A Software Tool to Simulate and Analyze Genetic Logic Circuits

    DEFF Research Database (Denmark)

    Baig, Hasan; Madsen, Jan

    2016-01-01

…early-stage researchers with limited experience in the field of biology. The Solution: Using LabVIEW to develop a user-friendly simulation tool named Dynamic Virtual Analyzer and Simulator (D-VASim), the first software tool in the domain of synthetic biology that provides a virtual laboratory environment…

  7. Federal Highway Administration research and technology evaluation final report : Eco-Logical

    Science.gov (United States)

    2018-03-01

This report documents an evaluation of the Federal Highway Administration's (FHWA) Research and Technology Program activities on the implementation of the Eco-Logical approach by State transportation departments and metropolitan planning organizations.

  8. Evaluating system behavior through Dynamic Master Logic Diagram (DMLD) modeling

    International Nuclear Information System (INIS)

    Hu, Y.-S.; Modarres, Mohammad

    1999-01-01

    In this paper, the Dynamic Master Logic Diagram (DMLD) is introduced for representing full-scale time-dependent behavior and uncertain behavior of complex physical systems. Conceptually, the DMLD allows one to decompose a complex system hierarchically to model and to represent: (1) partial success/failure of the system, (2) full-scale logical, physical and fuzzy connectivity relations, (3) probabilistic, resolutional or linguistic uncertainty, (4) multiple-state system dynamics, and (5) floating threshold and transition effects. To demonstrate the technique, examples of using DMLD to model, to diagnose and to control dynamic behavior of a system are presented. A DMLD-based expert system building tool, called Dynamic Reliability Expert System (DREXs), is introduced to automate the DMLD modeling process
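The partial success/failure idea can be sketched as a hierarchy whose gates combine degrees in [0, 1] instead of binary states. The min/max gate semantics and the example tree below are illustrative assumptions, not the DMLD's full fuzzy and physical connectivity model:

```python
AND = min   # all children required: the weakest success degree dominates
OR = max    # any child suffices: the strongest success degree dominates

def evaluate(node, states):
    """Recursively evaluate a (gate, children) tree of success degrees.
    Leaves are component names looked up in `states`; internal nodes
    apply their gate over the children's degrees."""
    if isinstance(node, str):            # leaf: basic component state
        return states[node]
    gate, children = node
    return gate(evaluate(child, states) for child in children)

# Hypothetical hierarchy: cooling succeeds if the pump OR its backup
# works; the system needs cooling AND power.
system = (AND, [(OR, ["pump", "backup"]), "power"])
```

Feeding partially degraded component states (e.g. a pump at 0.7 of capacity) yields a graded system-level degree rather than a binary success/failure verdict, which is the behavior the DMLD is built to capture.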

  9. The Logical Basis of Evaluation Order and Pattern-Matching

    Science.gov (United States)

    2009-04-17

…various people in the 1930s, particularly Wittgenstein ("It is what is regarded as the justification of an assertion that constitutes the sense of the assertion" [Wittgenstein, 1974, I, §40]), and Brouwer-Heyting-Kolmogorov in their interpretations of intuitionistic logic [Heyting, 1974; Kolmogorov, 1932]. … Technical Report CMU-CS-02-101, Department of Computer Science, Carnegie Mellon University, 2002. Revised May 2003.

  10. Software for Library Management: Selection and Evaluation.

    Science.gov (United States)

    Notowitz, Carol I.

    1987-01-01

    This discussion of library software packages includes guidelines for library automation with microcomputers; criteria to aid in software selection; comparison of some features of available acquisitions, circulation and overdues software; references for software reviews; additional information on microsoftware; and a directory of producers and…

  11. Software configuration plan for the 1,000 CFM portable exhausters' small logic control system

    International Nuclear Information System (INIS)

    Kaiser, T.D.

    1998-01-01

This document describes the formal documentation for maintaining the control system associated with the 1,000 CFM portable exhausters. The objective of the software configuration control plan is to provide assurance that the portable exhausters' control systems will be operable for the duration of 241-C-106 and 241-AY-102 operations (project 320). The design was based upon the criteria documented in the portable exhauster functional design criteria (HNF-SD-WM-DB-035) and the procurement specification (HNF-S-0490) for the exhauster interlock systems.

  12. Software Development and Feedback from Usability Evaluations

    DEFF Research Database (Denmark)

    Høegh, Rune Thaarup

    2008-01-01

This paper presents a study of the strengths and weaknesses of written, multimedia and oral feedback from usability evaluations to developers. The strengths and weaknesses are related to how well the feedback supports the developers in addressing usability problems in a software system. The study concludes that using the traditional written usability report as the only form of feedback from usability evaluations is associated with problems related to the report not supporting the process of addressing the usability problems. The report is criticized for representing an overwhelming amount of information, while still not offering the information required to address usability problems. Other forms of feedback, such as oral or multimedia feedback, help the developer understand the usability problems better, but are on the other hand less cost-effective than a written description.

  13. An evaluation and acceptance of COTS software for FPGA-based controllers in NPPs

    International Nuclear Information System (INIS)

    Jung, Sejin; Kim, Eui-Sub; Yoo, Junbeom; Kim, Jang-Yeol; Choi, Jong Gyun

    2016-01-01

Highlights: • All direct/indirect COTS SW should be dedicated. • FPGA synthesis tools are important for the safety of new digital I&Cs. • No standards/reports are yet available to deal with the indirect SW – FPGA synthesis tools. • This paper proposes a new evaluation/acceptance process and criteria for indirect SW. - Abstract: FPGA (Field-Programmable Gate Array) has received much attention from the nuclear industry as an alternative platform to PLC (Programmable Logic Controller)-based digital I&C (Instrumentation & Control). The software aspect of FPGA development encompasses several commercial tools, such as logic synthesis and P&R (Place & Route), which should first be dedicated in accordance with domestic standards based on EPRI NP-5652. Even though the state-of-the-art supplement EPRI TR-1025243 makes an effort, the dedication of indirect COTS (Commercial Off-The-Shelf) SW such as FPGA logic synthesis tools has remained in dispute. This paper proposes an acceptance process and evaluation criteria specific to COTS SW, not commercial-grade direct items. It specifically incorporates indirect COTS SW, provides categorized evaluation criteria for acceptance, and establishes an explicit linkage between acceptance methods (verification and validation techniques) and evaluation criteria. We performed the evaluation and acceptance process on a commercial FPGA logic synthesis tool being used to develop a new FPGA-based digital I&C system in Korea, and could confirm its applicability.

  14. Evaluation of Agricultural Accounting Software. Improved Decision Making. Third Edition.

    Science.gov (United States)

    Lovell, Ashley C., Comp.

    Following a discussion of the evaluation criteria for choosing accounting software, this guide contains reviews of 27 accounting software programs that could be used by farm or ranch business managers. The information in the reviews was provided by the software vendors and covers the following points for each software package: general features,…

  15. Optimization and evaluation of probabilistic-logic sequence models

    DEFF Research Database (Denmark)

    Christiansen, Henning; Lassen, Ole Torp

Analysis of biological sequence data demands more and more sophisticated and fine-grained models, but these in turn introduce hard computational problems. A class of probabilistic-logic models is considered, which increases the expressibility from HMMs' regular languages and SCFGs' context-free languages to, in principle, Turing-complete languages. In general, such models are computationally far too complex for direct use, so optimization by pruning and approximation is needed. The first steps are made towards a methodology for optimizing such models by approximations using auxiliary models…

  16. Evaluating Accounting Software in Secondary Schools.

    Science.gov (United States)

    Chalupa, Marilyn

    1988-01-01

    The secondary accounting curriculum must be modified to include computers and software. Educators must be aware of the computer skills needed on the job and of the accounting software that is available. Software selection must be tailored to fit the curriculum and the time available. (JOW)

  17. STEM - software test and evaluation methods. A study of failure dependency in diverse software

    International Nuclear Information System (INIS)

    Bishop, P.G.; Pullen, F.D.

    1989-02-01

    STEM is a collaborative software reliability project undertaken in partnership with Halden Reactor Project, UKAEA, and the Finnish Technical Research Centre. The objective of STEM is to evaluate a number of fault detection and fault estimation methods which can be applied to high integrity software. This Report presents a study of the observed failure dependencies between faults in diversely produced software. (author)

  18. Development of a Program Logic Model and Evaluation Plan for a Participatory Ergonomics Intervention in Construction

    Science.gov (United States)

    Jaegers, Lisa; Dale, Ann Marie; Weaver, Nancy; Buchholz, Bryan; Welch, Laura; Evanoff, Bradley

    2013-01-01

    Background Intervention studies in participatory ergonomics (PE) are often difficult to interpret due to limited descriptions of program planning and evaluation. Methods In an ongoing PE program with floor layers, we developed a logic model to describe our program plan, and process and summative evaluations designed to describe the efficacy of the program. Results The logic model was a useful tool for describing the program elements and subsequent modifications. The process evaluation measured how well the program was delivered as intended, and revealed the need for program modifications. The summative evaluation provided early measures of the efficacy of the program as delivered. Conclusions Inadequate information on program delivery may lead to erroneous conclusions about intervention efficacy due to Type III error. A logic model guided the delivery and evaluation of our intervention and provides useful information to aid interpretation of results. PMID:24006097

  19. Development of a program logic model and evaluation plan for a participatory ergonomics intervention in construction.

    Science.gov (United States)

    Jaegers, Lisa; Dale, Ann Marie; Weaver, Nancy; Buchholz, Bryan; Welch, Laura; Evanoff, Bradley

    2014-03-01

    Intervention studies in participatory ergonomics (PE) are often difficult to interpret due to limited descriptions of program planning and evaluation. In an ongoing PE program with floor layers, we developed a logic model to describe our program plan, and process and summative evaluations designed to describe the efficacy of the program. The logic model was a useful tool for describing the program elements and subsequent modifications. The process evaluation measured how well the program was delivered as intended, and revealed the need for program modifications. The summative evaluation provided early measures of the efficacy of the program as delivered. Inadequate information on program delivery may lead to erroneous conclusions about intervention efficacy due to Type III error. A logic model guided the delivery and evaluation of our intervention and provides useful information to aid interpretation of results. © 2013 Wiley Periodicals, Inc.

  20. Research on software behavior trust based on hierarchy evaluation

    Science.gov (United States)

    Long, Ke; Xu, Haishui

    2017-08-01

In view of the correlation of software behaviors, we evaluate software behavior credibility at two levels: control flow and data flow. At the control flow level, a method for tracing software behavior based on support vector machines (SVM) is proposed. At the data flow level, behavioral evidence evaluation based on a fuzzy decision analysis method is put forward.
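At the control-flow level, classifying behavior traces by call-sequence features might look like the sketch below. To keep the example dependency-free, a perceptron stands in for the SVM named in the abstract (both learn a linear separator); the traces, bigram features, and labels are invented:

```python
def bigrams(trace):
    """Feature extraction: adjacent call pairs from an execution trace."""
    return [tuple(trace[i:i + 2]) for i in range(len(trace) - 1)]

def featurize(trace, vocab):
    vec = [0.0] * len(vocab)
    for g in bigrams(trace):
        if g in vocab:
            vec[vocab[g]] += 1.0
    return vec

def train(X, y, epochs=50):
    """Perceptron training (a linear stand-in for the SVM separator)."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b) <= 0:
                w = [wj + yi * xj for wj, xj in zip(w, xi)]
                b += yi
    return w, b

def predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Invented traces: +1 = trusted behavior, -1 = anomalous behavior.
traces = [(["open", "read", "write", "close"], 1),
          (["open", "read", "close"], 1),
          (["open", "exec", "send", "close"], -1),
          (["exec", "send"], -1)]
vocab = {}
for t, _ in traces:
    for g in bigrams(t):
        vocab.setdefault(g, len(vocab))
X = [featurize(t, vocab) for t, _ in traces]
y = [label for _, label in traces]
w, b = train(X, y)
```

After training, an unseen trace is featurized the same way and scored against the learned separator to decide whether the observed control flow is credible.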

  1. Study of evaluation techniques of software configuration management and reliability

    Energy Technology Data Exchange (ETDEWEB)

    Youn, Cheong; Baek, Y. W.; Kim, H. C.; Han, H. C.; Choi, C. R. [Chungnam National Univ., Taejon (Korea, Republic of)

    2001-03-15

Activities to address software safety and quality must be based on an established software development process for digitalized nuclear plants. In particular, software testing and verification and validation (V&V) must be performed. For this purpose, methodologies and tools which can improve software quality are evaluated, and software testing, V&V, and configuration management applicable to the software life cycle are investigated. This study establishes a guideline that can be used to assure software safety and reliability requirements in digitalized nuclear plant systems.

  2. Software vulnerability: Definition, modelling, and practical evaluation for E-mail transfer software

    International Nuclear Information System (INIS)

    Kimura, Mitsuhiro

    2006-01-01

This paper proposes a method of assessing software vulnerability quantitatively. By expanding the concept of the IPO (input-program-output) model, we first define software vulnerability and construct a stochastic model. Then we evaluate the software vulnerability of the sendmail system by analyzing actual security-hole data collected from its release notes. We also show the relationship between the estimated software reliability and the vulnerability of the analyzed system

  3. Development of a software tool using deterministic logic for the optimization of cochlear implant processor programming.

    Science.gov (United States)

    Govaerts, Paul J; Vaerenberg, Bart; De Ceulaer, Geert; Daemers, Kristin; De Beukelaer, Carina; Schauwers, Karen

    2010-08-01

An intelligent agent, Fitting to Outcomes eXpert, was developed to optimize and automate cochlear implant (CI) programming. The current article describes the rationale, development, and features of this tool. Cochlear implant fitting is a time-consuming procedure to define the values of a subset of the available electric parameters based primarily on behavioral responses. It is comfort-driven, with high intraindividual and interindividual variability with respect to both the patient and the clinician. Its validity in terms of process control can be questioned. Good clinical practice would require an outcome-driven approach. An intelligent agent may help solve the complexity of addressing more electric parameters based on a range of outcome measures. A software application was developed that consists of deterministic rules that analyze the map settings in the processor together with psychoacoustic test results (audiogram, A§E phoneme discrimination, A§E loudness scaling, speech audiogram) obtained with that map. The rules were based on the daily clinical practice and the expertise of the CI programmers. The data transfer to and from this agent is either manual or through seamless digital communication with the CI fitting database and the psychoacoustic test suite. It recommends and executes modifications to the map settings to improve the outcome. Fitting to Outcomes eXpert is an operational intelligent agent, the principles of which are described. Its development and modes of operation are outlined, and a case example is given. Fitting to Outcomes eXpert has been in use for more than a year and appears capable of improving the measured outcome. It is argued that this novel tool allows a systematic approach focusing on outcome, reducing the fitting time, and improving the quality of fitting. It introduces principles of artificial intelligence into the process of CI fitting.
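The outcome-driven deterministic rules described above can be sketched as functions that inspect the current map plus one outcome measure and return a recommended change. Every rule, threshold, parameter name, and numeric step below is a hypothetical illustration, not FOX's actual rule base:

```python
def rule_soft_sounds(map_settings, audiogram_dbhl):
    """If any aided threshold is poor (> 35 dB HL), raise T-levels slightly."""
    if max(audiogram_dbhl.values()) > 35:
        return {"t_levels": map_settings["t_levels"] + 2}
    return None

def rule_discrimination(map_settings, phoneme_score):
    """If phoneme discrimination is low, widen the electrical dynamic range."""
    if phoneme_score < 0.8:
        return {"c_levels": map_settings["c_levels"] + 3}
    return None

def recommend(map_settings, outcomes):
    """Run every rule against its outcome measure and merge the
    recommended map modifications into one change set."""
    changes = {}
    for rule, key in ((rule_soft_sounds, "audiogram"),
                      (rule_discrimination, "phoneme_score")):
        rec = rule(map_settings, outcomes[key])
        if rec:
            changes.update(rec)
    return changes
```

The agent's loop would apply `recommend`, write the changed map to the processor, re-run the psychoacoustic tests, and iterate until no rule fires.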

  4. POLE.VAULT: A Semantic Framework for Health Policy Evaluation and Logical Testing.

    Science.gov (United States)

    Shaban-Nejad, Arash; Okhmatovskaia, Anya; Shin, Eun Kyong; Davis, Robert L; Buckeridge, David L

    2017-01-01

    The major goal of our study is to provide an automatic evaluation framework that aligns the results generated through semantic reasoning with the best available evidence regarding effective interventions to support the logical evaluation of public health policies. To this end, we have designed the POLicy EVAlUation & Logical Testing (POLE.VAULT) Framework to assist different stakeholders and decision-makers in making informed decisions about different health-related interventions, programs and ultimately policies, based on the contextual knowledge and the best available evidence at both individual and aggregate levels.

  5. Software Maintenance Management Evaluation and Continuous Improvement

    CERN Document Server

    April, Alain

    2008-01-01

    This book explores the domain of software maintenance management and provides road maps for improving software maintenance organizations. It describes full maintenance maturity models organized by levels 1, 2, and 3, which allow for benchmarking and continuous improvement paths. Goals for each key practice area are also provided, and the model presented is fully aligned with the architecture and framework of software development maturity models of CMMI and ISO 15504. It is complete with case studies, figures, tables, and graphs.

  6. Evaluating Graphene as a Channel Material in Spintronic Logic Devices

    Science.gov (United States)

    Anugrah, Yoska

Spintronics, a class of devices that exploit the spin properties of electrons in addition to the charge properties, promises the possibility of nonvolatile logic and memory devices that operate at low power. Graphene is a material in which the spin orientation of electrons can be conserved over a long distance, which makes it an attractive channel material in spintronic devices. In this dissertation, the properties of graphene that are interesting for spintronics applications are explored. A robust fabrication process is described for graphene spin valves using Al2O3 tunnel barriers and Co ferromagnetic contacts. Spin transport was characterized in both few-layer exfoliated and single-layer graphene, and spin diffusion lengths and spin relaxation times were extracted using the nonlocal spin valve geometry and Hanle measurements. The effect of input-output asymmetry on the spin transport was investigated. The effect of an applied drift electric field on spin transport was investigated, and the spin diffusion length was found to be tunable by a factor of 8 (suppressed to 1.6 μm and enhanced to 13 μm from the intrinsic length of 4.6 μm using electric fields of ±1800 V/cm). A mechanism to induce asymmetry without excess power dissipation is also described, which utilizes a double buried-gate structure to tune the Fermi levels on the input and output sides of a graphene spin logic device independently. It was found that different spin scattering mechanisms were at play in the two halves of a small graphene strip. This suggests that the spin properties of graphene are strongly affected by its local environment, e.g. impurities, surface topography, defects. Finally, two-dimensional materials beyond graphene have been explored as spin channels. One such material is phosphorene, which has low spin-orbit coupling and high mobility, and the interface properties of ferromagnets (cobalt and permalloy) with this material were explored. This work could…

  7. Integrating Usability Evaluations into the Software Development Process

    DEFF Research Database (Denmark)

    Lizano, Fulvio

    This thesis addresses the integration of usability evaluations into the software development process. The integration here is contextualized in terms of how to include usability evaluation as an activity in the software development lifecycle. Even though usability evaluations are considered as relevant and strategic human–computer interaction (HCI) activities in the software development process, there are obstacles that limit the complete, effective and efficient integration of this kind of testing into the software development process. Two main obstacles are the cost of usability evaluations and the software developers' resistance to accepting users' opinions regarding the lack of usability in their software systems. The 'cost obstacle' refers to the constraint of conducting usability evaluations in the software process due to the significant amount of resources required by this type of testing. Some…

  8. A reliability evaluation method for NPP safety DCS application software

    International Nuclear Information System (INIS)

    Li Yunjian; Zhang Lei; Liu Yuan

    2014-01-01

    In the field of nuclear power plant (NPP) digital I and C applications, reliability evaluation of safety DCS application software is a key obstacle to be removed. In order to quantitatively evaluate the reliability of NPP safety DCS application software, this paper proposes an evaluation method based on the V and V defect density characteristics of every stage of the software development life cycle, by which the operating reliability level of the software can be predicted before its delivery. This helps to improve the reliability of safety-important NPP software. (authors)
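
A minimal sketch of the stage-wise idea described above: defect densities found by V&V at each life-cycle stage are combined into a residual-defect estimate, which feeds an exponential reliability prediction. All stage names, densities, the removal-efficiency figure, and the per-defect failure rate below are illustrative assumptions, not values from the paper.

```python
import math

def residual_defect_density(stage_densities, removal_efficiency=0.9):
    """Estimate residual defects/KLOC: defects found at each V&V stage
    imply defects injected; a fixed removal efficiency leaves a residue."""
    total_found = sum(stage_densities.values())
    total_injected = total_found / removal_efficiency
    return total_injected - total_found

def predicted_reliability(residual_density, kloc, exposure_hours,
                          fail_rate_per_defect_hour=1e-4):
    """Map residual defects to an exponential reliability estimate."""
    residual_defects = residual_density * kloc
    lam = residual_defects * fail_rate_per_defect_hour
    return math.exp(-lam * exposure_hours)

# Hypothetical defects/KLOC found per development stage
stages = {"requirements": 2.0, "design": 1.5, "coding": 3.0, "testing": 1.0}
rho = residual_defect_density(stages)
R = predicted_reliability(rho, kloc=20, exposure_hours=100)
print(f"residual density: {rho:.2f}/KLOC, predicted R(100h): {R:.3f}")
```

The point of the sketch is the shape of the method (per-stage defect data in, pre-delivery reliability estimate out), not the particular exponential model chosen here.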

  9. Microcomputer Software Engineering, Documentation and Evaluation

    Science.gov (United States)

    1981-03-31


  10. Disability Policy Evaluation: Combining Logic Models and Systems Thinking

    Science.gov (United States)

    Claes, Claudia; Ferket, Neelke; Vandevelde, Stijn; Verlet, Dries; De Maeyer, Jessica

    2017-01-01

    Policy evaluation focuses on the assessment of policy-related personal, family, and societal changes or benefits that follow as a result of the interventions, services, and supports provided to those persons to whom the policy is directed. This article describes a systematic approach to policy evaluation based on an evaluation framework and an…

  11. Presenting an Evaluation Model for the Cancer Registry Software.

    Science.gov (United States)

    Moghaddasi, Hamid; Asadi, Farkhondeh; Rabiei, Reza; Rahimi, Farough; Shahbodaghi, Reihaneh

    2017-12-01

    As cancer is increasingly growing, the cancer registry is of great importance as the main core of cancer control programs, and many different software packages have been designed for this purpose. Establishing a comprehensive evaluation model is therefore essential to evaluate and compare a wide range of such software. In this study, the criteria of cancer registry software were determined by studying the documents and two functional software packages in this field. The evaluation tool was a checklist, and in order to validate the model, this checklist was presented to experts in the form of a questionnaire. To analyze the results of validation, an agreed coefficient of 75% was set as the threshold for applying changes. Finally, when the model was approved, the final version of the evaluation model for cancer registry software was presented. The evaluation model of this study comprises an evaluation tool and an evaluation method. The evaluation tool is a checklist including the general and specific criteria of cancer registry software along with their sub-criteria. Based on the findings, a criteria-based evaluation method was chosen. The model encompasses various dimensions of cancer registry software and a proper method for evaluating it. The strong point of this evaluation model is the separation between general criteria and specific ones, while trying to keep the criteria comprehensive. Since this model has been validated, it can be used as a standard to evaluate cancer registry software.
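
The questionnaire-validation step with its 75% agreement coefficient can be sketched as follows; the criteria and expert votes are hypothetical, invented for illustration.

```python
def validate_criteria(votes, threshold=0.75):
    """votes: {criterion: [True/False per expert]}.
    Keep criteria whose approval ratio meets the threshold."""
    return {c: sum(v) / len(v) for c, v in votes.items()
            if sum(v) / len(v) >= threshold}

votes = {
    "data security":      [True, True, True, True],    # 1.00 -> kept
    "report generation":  [True, True, True, False],   # 0.75 -> kept
    "multi-site support": [True, False, False, True],  # 0.50 -> needs revision
}
approved = validate_criteria(votes)
print(sorted(approved))  # ['data security', 'report generation']
```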

  12. Training Software Developers and Designers to Conduct Usability Evaluations

    Science.gov (United States)

    Skov, Mikael Brasholt; Stage, Jan

    2012-01-01

    Many efforts to improve the interplay between usability evaluation and software development rely either on better methods for conducting usability evaluations or on better formats for presenting evaluation results in ways that are useful for software designers and developers. Both of these approaches depend on a complete division of work between…

  13. Technical Evaluation Report 37: Assistive Software for Disabled Learners

    Directory of Open Access Journals (Sweden)

    Jon Baggaley

    2004-11-01

    Full Text Available Previous reports in this series (#32 and #36) have discussed online software features of value to disabled learners in distance education. The current report evaluates four specific assistive software products with useful features for visually and hearing impaired learners: ATutor, ACollab, Natural Voice, and Just Vanilla. The evaluative criteria discussed include the purpose, uses, costs, and features of each software product, all considered primarily from the accessibility perspective.

  14. Using program logic model analysis to evaluate and better deliver what works

    International Nuclear Information System (INIS)

    Megdal, Lori; Engle, Victoria; Pakenas, Larry; Albert, Scott; Peters, Jane; Jordan, Gretchen

    2005-01-01

    There is a rich history of using program theories and logic models (PT/LM) for evaluation, monitoring, and program refinement in a variety of fields, such as health care, social, and education programs. The use of these tools to evaluate and improve energy efficiency programs has grown over the last 5-7 years. This paper provides an overview of state-of-the-art methods of logic model development, with analysis that significantly contributed to: Assessing the logic behind how the program expects to meet its ultimate goals, including the 'who', the 'how', and the mechanism, so that gaps and questions that still need to be addressed can be identified. Identifying and prioritizing the indicators that should be measured to evaluate the program and program theory. Determining key researchable questions that need to be answered by evaluation/research to assess whether the mechanism assumed to cause the changes in actions, attitudes, behaviours, and business practices is workable and efficient, and to assess the validity of the program logic and the likelihood that the program can accomplish its ultimate goals. Incorporating analysis of prior like programs and social science theories in a framework to identify opportunities for potential program refinements. The paper provides an overview of the tools, techniques and references, and uses as an example the energy efficiency program analysis conducted for the New York State Energy Research and Development Authority's (NYSERDA) New York ENERGY $MART SM programs
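
A program logic model of the kind described can be sketched as a plain data structure, which makes gaps, such as outcomes without measurable indicators, mechanically checkable. All stages and entries below are hypothetical, not drawn from the NYSERDA programs.

```python
logic_model = {
    "inputs":     ["program budget", "trained staff"],
    "activities": ["marketing to contractors", "rebate processing"],
    "outputs":    ["audits completed", "rebates issued"],
    "short_term_outcomes": ["increased awareness"],
    "long_term_outcomes":  ["reduced energy use"],
    # indicator definitions; None marks an outcome not yet measurable
    "indicators": {"audits completed": "count per quarter",
                   "increased awareness": None},
}

def unmeasured(model):
    """Flag indicators that still need a measurement definition."""
    return [k for k, v in model["indicators"].items() if v is None]

print(unmeasured(logic_model))  # ['increased awareness']
```

Encoding the model this way turns "identify and prioritize the indicators" into a simple traversal rather than a manual review.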

  15. Symbolic Evaluation Graphs and Term Rewriting — A General Methodology for Analyzing Logic Programs

    DEFF Research Database (Denmark)

    Giesl, J.; Ströder, T.; Schneider-Kamp, P.

    2013-01-01

    There exist many powerful techniques to analyze termination and complexity of term rewrite systems (TRSs). Our goal is to use these techniques for the analysis of other programming languages as well. For instance, approaches to prove termination of definite logic programs by a transformation to TRSs have been studied for decades. However, a challenge is to handle languages with more complex evaluation strategies (such as Prolog, where predicates like the cut influence the control flow). We present a general methodology for the analysis of such programs. Here, the logic program is first … information on the termination or complexity of the original logic program. More information can be found in the full paper [1]. © 2013 Springer-Verlag.

  16. A Characteristics Approach to the Evaluation of Economics Software Packages.

    Science.gov (United States)

    Lumsden, Keith; Scott, Alex

    1988-01-01

    Utilizes Bloom's Taxonomy to identify elements of teacher and student interest. Depicts the way in which these interests are developed into characteristics for use in analytically evaluating software. Illustrates the use of this evaluation technique by appraising the much-used software package "Running the British Economy." (KO)

  17. Direct Integration: Training Software Developers to Conduct Usability Evaluations

    DEFF Research Database (Denmark)

    Skov, Mikael B.; Stage, Jan

    2008-01-01

    Many improvements of the interplay between usability evaluation and software development rely either on better methods for conducting usability evaluations or on better formats for presenting evaluation results in ways that are useful for software designers and developers. Both approaches involve a complete division of work between developers and evaluators, which is an undesirable complexity for many software development projects. This paper takes a different approach by exploring to what extent software developers and designers can be trained to carry out their own usability evaluations. The paper is based on an empirical study where 36 teams with a total of 234 first-year university students on software development and design educations were trained in a simple approach for user-based website usability testing that was taught in a 40 hour course. This approach supported them in planning, conducting…

  18. Evaluation of selected environmental decision support software

    International Nuclear Information System (INIS)

    Sullivan, T.M.; Moskowitz, P.D.; Gitten, M.

    1997-06-01

    Decision Support Software (DSS) continues to be developed to support analysis of decisions pertaining to environmental management. Decision support systems are computer-based systems that facilitate the use of data, models, and structured decision processes in decision making. The optimal DSS should attempt to integrate, analyze, and present environmental information to remediation project managers in order to select cost-effective cleanup strategies. The optimal system should have a balance between the sophistication needed to address the wide range of complicated sites and site conditions present at DOE facilities, and ease of use (e.g., the system should not require data that is typically unknown and should have robust error checking of problem definition through input, etc.). In the first phase of this study, an extensive review of the literature, the Internet, and discussions with sponsors and developers of DSS led to identification of approximately fifty software packages that met the preceding definition

  19. Evaluation for nuclear safety-critical software reliability of DCS

    International Nuclear Information System (INIS)

    Liu Ying

    2015-01-01

    With the development of control and information technology at NPPs, software reliability is important because software failure is usually considered one form of common cause failure in Digital I and C Systems (DCS). The reliability analysis of DCS, particularly the qualitative and quantitative evaluation of nuclear safety-critical software reliability, is a great challenge. To solve this problem, this paper builds not only a comprehensive evaluation model but also stage evaluation models, and applies prediction and sensitivity analysis to them. This provides a basis for evaluating the reliability and safety of DCS. (author)

  20. Integrated evaluation framework. Based on the logical framework approach for project cycle management

    International Nuclear Information System (INIS)

    1996-11-01

    This Integrated Evaluation Framework (IEF) was developed by TC Evaluation with the aim of presenting in a comprehensive manner the logic of thinking used when evaluating projects and programmes. Thus, in the first place, the intended audience for this report is evaluation officers, so that when applying the evaluation procedures and checklists, data can be organized following a systematic and logical scheme and conclusions can be derived ''objectively''. The value of such a framework for reporting on performance and in providing a quality reference for disbursements represents one of its major advantages. However, when developing and applying the IEF, it was realized that a Logical Framework Approach (LFA), like the one upon which the IEF is based, needs to be followed throughout the project life cycle, from the Country Programme Framework planning stage through project design and implementation. The benefits then flow into project design quality and smooth implementation. It is only in such an environment that meaningful and consistent evaluation can take place. Therefore, the main audience for this report is Agency staff involved in planning, designing and implementing TC projects, as well as their counterparts in Member States. With this understanding, the IEF was subjected to review by a consultants meeting, which included both external consultants and Agency staff. This Consultants Review Meeting encouraged the Secretariat to further adopt the LFA into the TC management process

  1. MATHEMATICAL MODEL FOR SOFTWARE USABILITY AUTOMATED EVALUATION AND ASSURANCE

    Directory of Open Access Journals (Sweden)

    І. Гученко

    2011-04-01

    The subject of the research is software usability, and the aim is the construction of a mathematical model for evaluating and assuring a set level of usability. The methodology of structural analysis, methods of multicriterion optimization and decision-making theory, the method of convolution, and scientific methods of analysis and analogy are used in the research. The result of the executed work is a model for automated software usability evaluation and assurance that allows one not only to estimate the current level of usability during every iteration of agile development but also to manage the usability of the created software products. The results can be used for the construction of automated support systems for managing software usability.
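
The convolution (weighted aggregation) step mentioned above can be sketched as a simple additive model over normalized criterion scores; the criteria and weights here are assumptions for illustration, not the authors' actual model.

```python
def additive_convolution(scores, weights):
    """Weighted sum of normalized criterion scores in [0, 1]."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(scores[c] * w for c, w in weights.items())

# Hypothetical per-iteration usability measurements
scores  = {"learnability": 0.8, "efficiency": 0.6, "satisfaction": 0.7}
weights = {"learnability": 0.4, "efficiency": 0.35, "satisfaction": 0.25}
usability = additive_convolution(scores, weights)
print(round(usability, 3))  # 0.705
```

Recomputing this score at every agile iteration gives the kind of running usability estimate the abstract describes.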

  2. SITE PROGRAM DEMONSTRATION ECO LOGIC INTERNATIONAL GAS-PHASE CHEMICAL REDUCTION PROCESS, BAY CITY, MICHIGAN TECHNOLOGY EVALUATION REPORT

    Science.gov (United States)

    The SITE Program funded a field demonstration to evaluate the Eco Logic Gas-Phase Chemical Reduction Process developed by ELI Eco Logic International Inc. (ELI), Ontario, Canada. The Demonstration took place at the Middleground Landfill in Bay City, Michigan using landfill wa...

  3. Development of a Logic Model to Guide Evaluations of the ASCA National Model for School Counseling Programs

    Science.gov (United States)

    Martin, Ian; Carey, John

    2014-01-01

    A logic model was developed based on an analysis of the 2012 American School Counselor Association (ASCA) National Model in order to provide direction for program evaluation initiatives. The logic model identified three outcomes (increased student achievement/gap reduction, increased school counseling program resources, and systemic change and…

  4. Evaluating bacterial gene-finding HMM structures as probabilistic logic programs.

    Science.gov (United States)

    Mørk, Søren; Holmes, Ian

    2012-03-01

    Probabilistic logic programming offers a powerful way to describe and evaluate structured statistical models. To investigate the practicality of probabilistic logic programming for structure learning in bioinformatics, we undertook a simplified bacterial gene-finding benchmark in PRISM, a probabilistic dialect of Prolog. We evaluate Hidden Markov Model structures for bacterial protein-coding gene potential, including a simple null model structure, three structures based on existing bacterial gene finders and two novel model structures. We test standard versions as well as ADPH length modeling and three-state versions of the five model structures. The models are all represented as probabilistic logic programs and evaluated using the PRISM machine learning system in terms of statistical information criteria and gene-finding prediction accuracy, in two bacterial genomes. Neither of our implementations of the two currently most used model structures is best performing in terms of statistical information criteria or prediction performance, suggesting that better-fitting models might be achievable. The source code of all PRISM models, data and additional scripts are freely available for download at: http://github.com/somork/codonhmm. Supplementary data are available at Bioinformatics online.
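
The information-criterion comparison used in the benchmark can be sketched as follows; the log-likelihoods, parameter counts, and sample size are invented for illustration (the paper's actual scores come from PRISM's machine learning system).

```python
import math

def aic(loglik, k):
    """Akaike information criterion (lower is better)."""
    return 2 * k - 2 * loglik

def bic(loglik, k, n):
    """Bayesian information criterion (lower is better)."""
    return k * math.log(n) - 2 * loglik

models = {                     # name: (log-likelihood, free parameters)
    "null":        (-125000.0, 12),
    "three_state": (-123500.0, 40),
}
n = 50000                      # number of observations scored
for name, (ll, k) in models.items():
    print(name, round(aic(ll, k), 1), round(bic(ll, k, n), 1))
```

BIC penalizes the extra parameters more heavily than AIC, which is why the two criteria can rank richer model structures differently on small datasets.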

  5. Presenting an evaluation model of the trauma registry software.

    Science.gov (United States)

    Asadi, Farkhondeh; Paydar, Somayeh

    2018-04-01

    Trauma causes about 10% of deaths worldwide and is considered a global concern. This problem has made healthcare policy makers and managers adopt a basic strategy in this context. Trauma registries have an important and basic role in decreasing mortality and the disabilities due to injuries resulting from trauma. Today, different software packages are designed for trauma registries. Evaluation of this software improves management and increases the efficiency and effectiveness of these systems. Therefore, the aim of this study is to present an evaluation model for trauma registry software. The present study is applied research. In this study, general and specific criteria of trauma registry software were identified by reviewing literature including books, articles, scientific documents, valid websites and related software in this domain. According to the general and specific criteria and related software, a model for evaluating trauma registry software was proposed. Based on the proposed model, a checklist was designed and its validity and reliability evaluated. The model was presented, using the Delphi technique, to 12 experts and specialists. To analyze the results, an agreed coefficient of 75% was set as the threshold for applying changes. Finally, when the model was approved by the experts and professionals, the final version of the evaluation model for trauma registry software was presented. For evaluating the criteria of trauma registry software, two groups were presented: 1- general criteria, 2- specific criteria. General criteria of trauma registry software were classified into four main categories: 1- usability, 2- security, 3- maintainability, and 4- interoperability. Specific criteria were divided into four main categories: 1- data submission and entry, 2- reporting, 3- quality control, 4- decision and research support.
The presented model in this research has introduced important general and specific criteria of trauma registry software

  6. Review of the Educational Software Evaluation Forms and Scales

    Directory of Open Access Journals (Sweden)

    Ahmet ARSLAN

    2016-12-01

    The main purpose of this study is to review existing evaluation forms and scales that have been prepared for educational software evaluation. In addition, the study aims to provide insight and guidance for future studies in this context. In total, forty-two studies including evaluation forms and scales were taken into consideration. “Educational software evaluation”, “Software evaluation” and “Educational software evaluation forms/scales” were searched as keywords in the “Education Resources Information Centre (ERIC)”, “Marmara University e-Library”, “National Thesis Center” and “Science Direct” databases. Twenty-nine of the studies met the review selection criteria and were evaluated. There was an increase in the number of evaluation tools between 2006 and 2010. However, it was noticed that there are not enough evaluation tools targeting “educational games”. It was concluded that reliability and validity studies are a very important part of developing educational software evaluation tools, a matter that should be considered in future studies.

  7. Development of evaluation method for software hazard identification techniques

    International Nuclear Information System (INIS)

    Huang, H. W.; Chen, M. H.; Shih, C.; Yih, S.; Kuo, C. T.; Wang, L. H.; Yu, Y. C.; Chen, C. W.

    2006-01-01

    This research evaluated the software hazard identification techniques applicable nowadays, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flow-graph Methodology (DFM), and simulation-based model analysis; and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. With this proposed method, analysts can evaluate various software hazard identification combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (with transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for realizing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages are completeness, complexity, and implementation cost. This evaluation method can be a platform for reaching common consensus among the stakeholders. Following the evolution of software hazard identification techniques, the evaluation results could change. However, the insight into software hazard identification techniques is much more important than the numbers obtained by the evaluation. (authors)
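
One hedged way to read "determined indexes in view of their characteristics" is as a weighted scoring of technique combinations on the seven indexes; the ratings and equal weights below are hypothetical, not the paper's actual assessments.

```python
INDEXES = ["dynamic capability", "completeness", "achievability",
           "detail", "signal/noise ratio", "complexity", "implementation cost"]

def score(ratings, weights):
    """ratings: index -> 1..5 (5 best; complexity/cost pre-inverted so
    that a higher rating always means a more favorable characteristic)."""
    return sum(ratings[i] * weights[i] for i in INDEXES)

weights = {i: 1 / len(INDEXES) for i in INDEXES}   # equal weighting
combos = {
    "PHA+FMEA+FTA+Markov": dict(zip(INDEXES, [2, 4, 2, 4, 3, 3, 2])),
    "DFM":                 dict(zip(INDEXES, [5, 3, 4, 4, 4, 2, 2])),
}
ranked = sorted(combos, key=lambda c: score(combos[c], weights), reverse=True)
print(ranked[0])  # 'DFM'
```

A stakeholder group could negotiate the weights rather than the final ranking, which is the "common consensus platform" role the abstract describes.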

  8. Computer Assisted Language Learning (CALL) Software: Evaluation ...

    African Journals Online (AJOL)

    Evaluating the nature and extent of the influence of Computer Assisted Language Learning (CALL) on the quality of language learning is highly problematic. This is owing to the number and complexity of interacting variables involved in setting the items for teaching and learning languages. This paper identified and ...

  9. Expert evaluation of innovation projects of mining enterprises on the basis of methods of system analysis and fuzzy logics

    Directory of Open Access Journals (Sweden)

    Pimonov Alexander

    2017-01-01

    This paper presents a multipurpose approach to the evaluation of research and innovation projects for the mining industry, based on the method of analysis of hierarchies (the analytic hierarchy process) and fuzzy logic. The approach, implemented as part of a decision support system, can reduce the degree of subjectivity during examinations by taking into account both quantitative and qualitative characteristics of the compared innovative alternatives; it does not depend on the specific conditions of an examination and allows the engagement of experts from various fields of knowledge. The system includes a mechanism for reconciling several experts' views. The use of fuzzy logic allows the qualitative characteristics of innovations to be evaluated in the form of formalized logical conclusions.
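
The hierarchy-analysis step can be sketched as a standard AHP priority computation: derive criterion weights from a pairwise comparison matrix. The criteria and pairwise judgments below are hypothetical.

```python
def ahp_priorities(matrix):
    """Approximate the principal eigenvector by averaging normalized columns."""
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    return [sum(matrix[i][j] / col_sums[j] for j in range(n)) / n
            for i in range(n)]

# Criteria: cost, technical risk, expected benefit (Saaty's 1-9 scale);
# matrix[i][j] = how much more important criterion i is than criterion j.
M = [[1,     3,   1 / 2],
     [1 / 3, 1,   1 / 5],
     [2,     5,   1]]
w = ahp_priorities(M)
print([round(x, 3) for x in w])  # priority weights summing to 1
```

With these judgments, "expected benefit" receives the largest weight; each project alternative would then be scored against the weighted criteria, with fuzzy labels supplying the qualitative scores.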

  10. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    International Nuclear Information System (INIS)

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-01-01

    The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including costs estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a ''living document'' that will be modified over the course of the execution of this work

  11. Encapsulating Software Platform Logic by Aspect-Oriented Programming : A Case Study in Using Aspects for Language Portability

    NARCIS (Netherlands)

    Kats, L.C.; Visser, E.

    2010-01-01

    Software platforms such as the Java Virtual Machine or the CLR .NET virtual machine have their own ecosystem of a core programming language or instruction set, libraries, and developer community. Programming languages can target multiple software platforms to increase interoperability or to boost

  12. Fuzzy Logic-based expert system for evaluating cake quality of freeze-dried formulations.

    Science.gov (United States)

    Trnka, Hjalte; Wu, Jian X; Van De Weert, Marco; Grohganz, Holger; Rantanen, Jukka

    2013-12-01

    Freeze-drying of peptide- and protein-based pharmaceuticals is an increasingly important field of research. The diverse nature of these compounds, limited understanding of excipient functionality, and difficult-to-analyze quality attributes, together with the increasing importance of the biosimilarity concept, complicate the development phase of safe and cost-effective drug products. To streamline the development phase and to make high-throughput formulation screening possible, efficient solutions are needed for analyzing critical quality attributes such as cake quality with minimal material consumption. The aim of this study was to develop a fuzzy logic system based on image analysis (IA) for analyzing cake quality. Freeze-dried samples with different visual quality attributes were prepared in well plates. Imaging solutions together with image analytical routines were developed for extracting critical visual features such as the degree of cake collapse, glassiness, and color uniformity. On the basis of the IA outputs, a fuzzy logic system for analysis of these freeze-dried cakes was constructed. After this development phase, the system was tested with a new screening well plate. The developed fuzzy logic-based system was found to give quality scores comparable to visual evaluation, making high-throughput classification of cake quality possible. © 2013 Wiley Periodicals, Inc. and the American Pharmacists Association.
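
A toy sketch of the fuzzy mapping from image-analysis outputs to a crisp quality score; the membership shapes, rule base, and defuzzification below are assumptions for illustration, not the authors' actual system.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def cake_quality(collapse, glassiness):
    """Inputs in [0, 1] from image analysis; returns a score in [0, 1]."""
    # Rule 1: low collapse AND low glassiness -> good cake
    good = min(tri(collapse, -0.5, 0.0, 0.4), tri(glassiness, -0.5, 0.0, 0.5))
    # Rule 2: high collapse OR high glassiness -> bad cake
    bad = max(tri(collapse, 0.2, 1.0, 1.5), tri(glassiness, 0.3, 1.0, 1.5))
    # Weighted-average defuzzification over outputs "good"=1, "bad"=0
    return good / (good + bad) if (good + bad) else 0.5

print(round(cake_quality(0.1, 0.1), 2))  # mostly intact cake -> high score
print(round(cake_quality(0.8, 0.6), 2))  # collapsed cake -> low score
```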

  13. Development of evaluation method for software safety analysis techniques

    International Nuclear Information System (INIS)

    Huang, H.; Tu, W.; Shih, C.; Chen, C.; Yang, W.; Yih, S.; Kuo, C.; Chen, M.

    2006-01-01

    Full text: Following the massive adoption of digital Instrumentation and Control (I and C) systems in nuclear power plants (NPP), various Software Safety Analysis (SSA) techniques are used to evaluate NPP safety when adopting an appropriate digital I and C system, and thereby to reduce risk to an acceptable level. However, each technique has its specific advantages and disadvantages. If two or more techniques can be complementarily incorporated, the SSA combination will be more acceptable. As a result, if proper evaluation criteria are available, the analyst can choose an appropriate technique combination to perform analysis on the basis of available resources. This research evaluated the applicable software safety analysis techniques nowadays, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis; and then determined indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. These indexes may help decision makers and software safety analysts choose the best SSA combination and arrange their own software safety plan. With this proposed method, analysts can evaluate various SSA combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (without transfer rate) combination is not competitive due to the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for realizing the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio.
However, their disadvantages are completeness and complexity

  14. Beyond Open Source: Evaluating the Community Availability of Software

    Directory of Open Access Journals (Sweden)

    Bret Davidson

    2016-01-01

    The Code4Lib community has produced an increasingly impressive collection of open source software over the last decade, but much of this creative work remains out of reach for large portions of the library community. Do the relatively privileged institutions represented by a majority of Code4Lib participants have a professional responsibility to support the adoption of their innovations? Drawing from old and new software packaging and distribution approaches (from freeware to Docker), we propose extending the open source software values of collaboration and transparency to include the wide and affordable distribution of software. We believe this will not only simplify the process of sharing our applications within the library community, but also make it possible for less well-resourced institutions to actually use our software. We identify areas of need, present our experiences with the users of our own open source projects, discuss our attempts to go beyond open source, propose a preliminary set of technology availability performance indicators for evaluating software availability, and make an argument for the internal value of supporting and encouraging a vibrant library software ecosystem.

  15. Human action quality evaluation based on fuzzy logic with application in underground coal mining.

    Science.gov (United States)

    Ionica, Andreea; Leba, Monica

    2015-01-01

    The work system is defined by its components, their roles, and the relationships between them. Any work system gravitates around the human resource and the interdependencies between the human factor and its other components. Research in this field agrees that the human factor and its actions are difficult to quantify and predict. The objective of this paper is to apply a method of human action evaluation in order to estimate possible risks and prevent possible system faults, both at the human factor level and at the equipment level. In order to point out the importance of the human factor's influence on all elements of working systems, we propose a fuzzy logic based methodology for the quality evaluation of human actions. This methodology has a multidisciplinary character, as it gathers ideas and methods from quality management, ergonomics, work safety, and artificial intelligence. The results presented refer to a work system with a high degree of specificity, namely underground coal mining, and provide a valuable pattern for human resource risk evaluation. The fuzzy logic evaluation of human actions leads to early detection of possibly dangerous evolutions of the work system and alerts the persons in charge.

  16. Expert System for Competences Evaluation 360° Feedback Using Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    Alberto Alfonso Aguilar Lasserre

    2014-01-01

    Full Text Available Performance evaluation (PE) is a process that estimates the employee's overall performance during a given period, and it is a common function carried out inside modern companies. PE is important because it is an instrument that encourages employees, organizational areas, and the whole company to have appropriate behavior and continuous improvement. In addition, PE is useful in decision making about personnel allocation, productivity bonuses, incentives, promotions, disciplinary measures, and dismissals. There are many performance evaluation methods; however, none is universal and common to all companies. This paper proposes an expert performance evaluation system based on a fuzzy logic model with 360° competence feedback oriented to human behavior. This model uses linguistic labels and adjustable numerical values to represent ambiguous concepts, such as imprecision and subjectivity. The model was validated in the administrative department of a real Mexican manufacturing company, where final results and conclusions show the advantages of the fuzzy logic method in comparison with traditional 360° performance evaluation methodologies.
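
    The fuzzy treatment of linguistic labels described above can be sketched in a few lines. The labels, triangular breakpoints, and 0-10 rating scale below are illustrative assumptions, not the paper's actual model:

```python
# Minimal sketch of a fuzzy linguistic evaluation step. The labels,
# triangle breakpoints, and rating scale are assumptions for illustration.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Linguistic labels for a 0-10 competence rating.
LABELS = {
    "poor":      lambda x: tri(x, -1, 0, 5),
    "average":   lambda x: tri(x, 0, 5, 10),
    "excellent": lambda x: tri(x, 5, 10, 11),
}
# Representative (centroid-like) value of each label, used to defuzzify.
CENTERS = {"poor": 0.0, "average": 5.0, "excellent": 10.0}

def fuzzy_score(rating):
    """Fuzzify a crisp rating, then defuzzify by weighted average."""
    mu = {name: f(rating) for name, f in LABELS.items()}
    total = sum(mu.values())
    return sum(mu[n] * CENTERS[n] for n in mu) / total

def evaluate_360(ratings):
    """Aggregate ratings from several evaluators (self, peers, supervisor)."""
    return sum(fuzzy_score(r) for r in ratings) / len(ratings)
```

    With this convention, a rating of 7.5 fuzzifies to 0.5 "average" and 0.5 "excellent" and defuzzifies back to 7.5; a real 360° system adds rule bases and per-competence weights on top of this step.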

  17. Oak Ridge National Laboratory Technology Logic Diagram. Volume 1, Technology Evaluation: Part B, Remedial Action

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    The Oak Ridge National Laboratory Technology Logic Diagram (TLD) was developed to provide a decision-support tool that relates environmental restoration (ER) and waste management (WM) problems at Oak Ridge National Laboratory (ORNL) to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed to develop these technologies to a state that allows technology transfer and application to decontamination and decommissioning (D&D), remedial action (RA), and WM activities. The TLD consists of three fundamentally separate volumes: Vol. 1 (Technology Evaluation), Vol. 2 (Technology Logic Diagram), and Vol. 3 (Technology Evaluation Data Sheets). Part A of Vols. 1 and 2 focuses on D&D. Part B of Vols. 1 and 2 focuses on RA of contaminated facilities. Part C of Vols. 1 and 2 focuses on WM. Each part of Vol. 1 contains an overview of the TLD, an explanation of the program-specific responsibilities, a review of identified technologies, and the ranking of remedial technologies. Volume 2 (Pts. A, B, and C) contains the logic linkages among EM goals, environmental problems, and the various technologies that have the potential to solve these problems. Volume 3 (Pts. A, B, and C) contains the TLD data sheets. The focus of Vol. 1, Pt. B, is RA, and it has been divided into six chapters. The first chapter is an introduction, which defines problems specific to the ER Program for ORNL. Chapter 2 provides a general overview of the TLD. Chapters 3 through 5 are organized into necessary subelement categories: RA, characterization, and robotics and automation. The final chapter contains regulatory compliance information concerning RA.

  18. Performance evaluation software moving object detection and tracking in videos

    CERN Document Server

    Karasulu, Bahadir

    2013-01-01

    Performance Evaluation Software: Moving Object Detection and Tracking in Videos introduces a software approach for the real-time evaluation and performance comparison of the methods specializing in moving object detection and/or tracking (D&T) in video processing. Digital video content analysis is an important item for multimedia content-based indexing (MCBI), content-based video retrieval (CBVR) and visual surveillance systems. There are some frequently-used generic algorithms for video object D&T in the literature, such as Background Subtraction (BS), Continuously Adaptive Mean-shift (CMS),

  19. Evaluating the Potential Business Benefits of Ecodesign Implementation: A Logic Model Approach

    Directory of Open Access Journals (Sweden)

    Vinícius P. Rodrigues

    2018-06-01

    Full Text Available The business benefits attained from ecodesign programs in manufacturing companies have been regularly documented by several studies from both the academic and corporate spheres. However, there are still significant challenges for adopting ecodesign, especially regarding the evaluation of these potential business benefits prior to the actual ecodesign implementation. To address this gap, this study proposes an exploratory and theory-driven framework based on logic models to support the development of business cases for ecodesign implementation. The objective is to offer an outlook into how ecodesign implementation can potentially affect key corporate performance outcomes. This paper is based on a three-stage research methodology with six steps. Two full systematic literature reviews were performed, along with two thematic analyses and a grounded theory approach with the aim of developing the business case framework, which was then evaluated by seven industry experts. This research contributes to the ecodesign literature especially by laying out an ecodesign-instantiated logic model, which is readily available to be adapted and customized for further test and use in practice. Discussions on the usefulness and applicability of the framework and directions for future research are presented.

  20. Software

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, R.; Budd, G.; Ross, E.; Wells, P.

    2010-07-15

    The software section of this journal presented new software programs that have been developed to help in the exploration and development of hydrocarbon resources. Software provider IHS Inc. has made additions to its geological and engineering analysis software tool, IHS PETRA, a product used by geoscientists and engineers to visualize, analyze and manage well production, well log, drilling, reservoir, seismic and other related information. IHS PETRA also includes a directional well module and a decline curve analysis module to improve analysis capabilities in unconventional reservoirs. Petris Technology Inc. has developed software to help manage large volumes of data. PetrisWinds Enterprise (PWE) helps users find and manage wellbore data, including conventional wireline and MWD core data; analysis of core photos and images; waveforms and NMR; and external file documentation. Ottawa-based Ambercore Software Inc. has been collaborating with Nexen on the Petroleum iQ software for steam assisted gravity drainage (SAGD) producers. Petroleum iQ integrates geology and geophysics data with engineering data in 3D and 4D. Calgary-based Envirosoft Corporation has developed software that reduces the costly and time-consuming effort required to comply with Directive 39 of the Alberta Energy Resources Conservation Board. The product includes emissions modelling software. Houston-based Seismic Micro-Technology (SMT) has developed the Kingdom software that features the latest in seismic interpretation. Holland-based Joa Oil and Gas and Calgary-based Computer Modelling Group have both supplied the petroleum industry with advanced reservoir simulation software that enables reservoir interpretation. The 2010 software survey included a guide to new software applications designed to facilitate petroleum exploration, drilling and production activities. Oil and gas producers can use the products for a range of functions, including reservoir characterization and accounting. In

  1. Software for MUF evaluating in item nuclear material accounting

    International Nuclear Information System (INIS)

    Wang Dong; Zhang Quanhu; He Bin; Wang Hua; Yang Daojun

    2009-01-01

    Nuclear material accounting is a key measure for nuclear safeguards. Software for MUF (material unaccounted for) evaluation in item nuclear material accounting was developed in this paper. It is composed of several modules, including an input module, a data processing module, a data query module, a data printing module, and a system settings module. It can be used to check the variance of the measurements and to estimate the confidence interval associated with the MUF value. To ensure data security, a multi-user management function was implemented in the software. (authors)
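
    The core MUF computation can be sketched briefly. The function names and numbers below are illustrative; real item accounting must also treat correlated and systematic measurement errors, which this simplification ignores:

```python
import math

def muf(begin_inv, receipts, end_inv, shipments):
    """Material Unaccounted For: book inventory minus physical inventory."""
    return (begin_inv + receipts) - (end_inv + shipments)

def muf_sigma(variances):
    """Propagate independent measurement variances (a simplification:
    real accounting must handle correlated and systematic errors)."""
    return math.sqrt(sum(variances))

def muf_confidence_interval(muf_value, sigma, z=1.96):
    """Two-sided interval, assuming MUF is approximately normal."""
    return (muf_value - z * sigma, muf_value + z * sigma)
```

    For a balance period with beginning inventory 100.0 kg, receipts 50.0 kg, ending inventory 98.0 kg and shipments 51.5 kg, MUF is 0.5 kg; if the 95% interval around it covers zero, the difference is not statistically significant.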

  2. Fuzzy Logic vs. Neutrosophic Logic: Operations Logic

    Directory of Open Access Journals (Sweden)

    Salah Bouzina

    2016-12-01

    Full Text Available The goal of this research is first to show how different, thorough, widespread and effective the logical operations of neutrosophic logic are compared to those of fuzzy logic. The second aim is to observe how a fully new logic, neutrosophic logic, is established by changing the previous logical perspective of fuzzy logic; by that we mean changing the truth values from the truth and falsity degrees of membership in fuzzy logic to the truth, falsity and indeterminacy degrees of membership in neutrosophic logic. The third aim is to observe that there is no limit to logical discoveries: we only change the principle, and the system changes completely.
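
    The contrast between the two sets of connectives can be made concrete. The min/max convention below is one common choice; the neutrosophic literature defines several variants, so treat this as an illustrative sketch:

```python
# Fuzzy connectives (standard min/max convention).
f_and = min
f_or = max
def f_not(t):
    return 1.0 - t

# Neutrosophic truth values are triples (T, I, F). The operators below use
# one common min/max convention; the literature defines several variants.
def n_and(a, b):
    return (min(a[0], b[0]), max(a[1], b[1]), max(a[2], b[2]))

def n_or(a, b):
    return (max(a[0], b[0]), min(a[1], b[1]), min(a[2], b[2]))

def n_not(a):
    return (a[2], 1.0 - a[1], a[0])
```

    The fuzzy conjunction collapses everything to a single degree, while the neutrosophic one carries the indeterminacy component through the computation explicitly.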

  3. Evaluation of flux-based logic schemes for high-Tc applications

    International Nuclear Information System (INIS)

    Fleishman, J.; Feld, D.; Xiao, P.; Van Duzer, T.

    1991-01-01

    This paper presents analyses of three digital logic families that can be made using nonhysteretic Josephson junctions, potentially the only kind of Josephson device realizable with superconductors having high transition temperatures. These logic families utilize magnetic flux-transfer and are characterized by very low power dissipation. Rapid Single Flux Quantum (RSFQ) and Phase Mode logic are both based on pulse propagation. The Quantum Flux Parametron (QFP) logic family is based on current latching. Simulations of RSFQ, Phase Mode, and QFP logic families using high-Tc junction parameters are presented to demonstrate the compatibility of these logic families with the new perovskite superconductors. The operation of these logic families is analyzed and the advantages and disadvantages of each are discussed

  4. Software for evaluation of EPR-dosimetry performance

    International Nuclear Information System (INIS)

    Shishkina, E.A.; Timofeev, Yu.S.; Ivanov, D.V.

    2014-01-01

    Electron paramagnetic resonance (EPR) with tooth enamel is a method extensively used for retrospective external dosimetry. Different research groups apply different equipment, sample preparation procedures and spectrum processing algorithms for EPR dosimetry. A uniform algorithm for description and comparison of performances was designed and implemented in a new computer code. The aim of the paper is to introduce the new software 'EPR-dosimetry performance'. The computer code is a user-friendly tool for providing a full description of method-specific capabilities of EPR tooth dosimetry, from metrological characteristics to practical limitations in applications. The software designed for scientists and engineers has several applications, including support of method calibration by evaluation of calibration parameters, evaluation of critical value and detection limit for registration of radiation-induced signal amplitude, estimation of critical value and detection limit for dose evaluation, estimation of minimal detectable value for anthropogenic dose assessment and description of method uncertainty. (authors)
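
    The paper's exact algorithms are not reproduced in the abstract, but the critical value and detection limit it refers to are conventionally computed with the Currie formulation, sketched below (sigma0 is the standard deviation of the background signal expressed in dose units; the default k is an assumption corresponding to roughly 5% error rates):

```python
def critical_value(sigma0, k=1.645):
    """Currie critical value: the decision threshold above which a measured
    signal is judged present, at false-positive rate ~5% for k=1.645."""
    return k * sigma0

def detection_limit(sigma0, k=1.645):
    """Currie detection limit for constant background sigma: the true
    signal level that is reliably detected (false-negative rate ~5%)."""
    return 2 * k * sigma0
```

    With a background scatter of 50 mGy, for example, measured signals below the critical value of about 82 mGy would be reported as non-detections, and the minimal reliably detectable dose is about twice that.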

  5. "SABER": A new software tool for radiotherapy treatment plan evaluation.

    Science.gov (United States)

    Zhao, Bo; Joiner, Michael C; Orton, Colin G; Burmeister, Jay

    2010-11-01

    Both spatial and biological information are necessary in order to perform true optimization of a treatment plan and for predicting clinical outcome. The goal of this work is to develop an enhanced treatment plan evaluation tool which incorporates biological parameters and retains spatial dose information. A software system is developed which provides biological plan evaluation with a novel combination of features. It incorporates hyper-radiosensitivity using the induced-repair model and applies the new concept of dose convolution filter (DCF) to simulate dose wash-out effects due to cell migration, bystander effect, and/or tissue motion during treatment. Further, the concept of spatial DVH (sDVH) is introduced to evaluate and potentially optimize the spatial dose distribution in the target volume. Finally, generalized equivalent uniform dose is derived from both the physical dose distribution (gEUD) and the distribution of equivalent dose in 2 Gy fractions (gEUD2) and the software provides three separate models for calculation of tumor control probability (TCP), normal tissue complication probability (NTCP), and probability of uncomplicated tumor control (P+). TCP, NTCP, and P+ are provided as a function of prescribed dose and multivariable TCP, NTCP, and P+ plots are provided to illustrate the dependence on individual parameters used to calculate these quantities. Ten plans from two clinical treatment sites are selected to test the three calculation models provided by this software. By retaining both spatial and biological information about the dose distribution, the software is able to distinguish features of radiotherapy treatment plans not discernible using commercial systems. Plans that have similar DVHs may have different spatial and biological characteristics and the application of novel tools such as sDVH and DCF within the software may substantially change the apparent plan quality or predicted plan metrics such as TCP and NTCP. For the cases examined
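
    The gEUD quantity used above has a standard closed form, sketched here from a binned DVH (the software's own implementation and parameter values are not given in the abstract):

```python
def gEUD(doses, volumes, a):
    """Generalized equivalent uniform dose for a DVH given as bin doses
    (Gy) and fractional volumes summing to 1. a=1 reduces to mean dose;
    large positive a approaches max dose (serial organs), while a < 0
    weights cold spots (tumors)."""
    assert abs(sum(volumes) - 1.0) < 1e-9
    return sum(v * d ** a for d, v in zip(doses, volumes)) ** (1.0 / a)
```

    A uniform dose distribution returns the dose itself for any a, while a hot/cold 60/40 Gy split gives the mean (50 Gy) at a=1 and moves toward 60 Gy as a grows — which is how the parameter encodes tissue architecture.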

  6. A fuzzy logic expert system for evaluating policy progress towards sustainability goals.

    Science.gov (United States)

    Cisneros-Montemayor, Andrés M; Singh, Gerald G; Cheung, William W L

    2017-12-16

    Evaluating progress towards environmental sustainability goals can be difficult due to a lack of measurable benchmarks and insufficient or uncertain data. Marine settings are particularly challenging, as stakeholders and objectives tend to be less well defined and ecosystem components have high natural variability and are difficult to observe directly. Fuzzy logic expert systems are useful analytical frameworks to evaluate such systems, and we develop such a model here to formally evaluate progress towards sustainability targets based on diverse sets of indicators. Evaluation criteria include recent (since policy enactment) and historical (from earliest known state) change, type of indicators (state, benefit, pressure, response), time span and spatial scope, and the suitability of an indicator in reflecting progress toward a specific objective. A key aspect of the framework is that all assumptions are transparent and modifiable to fit different social and ecological contexts. We test the method by evaluating progress towards four Aichi Biodiversity Targets in Canadian oceans, including quantitative progress scores, information gaps, and the sensitivity of results to model and data assumptions. For Canadian marine systems, national protection plans and biodiversity awareness show good progress, but species and ecosystem states overall do not show strong improvement. Well-defined goals are vital for successful policy implementation, as ambiguity allows for conflicting potential indicators, which in natural systems increases uncertainty in progress evaluations. Importantly, our framework can be easily adapted to assess progress towards policy goals with different themes, globally or in specific regions.

  7. Developing and Using a Logic Model for Evaluation and Assessment of University Student Affairs Programming: A Case Study

    Science.gov (United States)

    Cooper, Jeff

    2009-01-01

    This dissertation addresses theory and practice of evaluation and assessment in university student affairs, by applying logic modeling/program theory to a case study. I intend to add knowledge to ongoing dialogue among evaluation scholars and practitioners on student affairs program planning and improvement as integral considerations that serve…

  8. Evaluation of the procedures in medical applications of X-rays using fuzzy logic

    International Nuclear Information System (INIS)

    Silva, Luiz A.C.; Teixeira, Marcello G.; Ferreira, Nadya M.P.D.

    2005-01-01

    A project is being developed in a large level III hospital, located in the city of Rio de Janeiro, with the objective of implementing coordinated actions and procedures to optimize the images obtained with conventional radiology equipment, taking into account the lowest risk to the patient and images carrying enough information for a safe diagnosis. In this paper fuzzy logic was used to model the problem of evaluating the image quality of chest radiographs. The evaluation system was modeled as a fuzzy network of three layers. The first layer is formed by the input variables of the system that were considered relevant to the decision-making process on radiographic image quality; these relate to questions observed by radiologists while reporting on chest examinations. The second layer is formed by the outputs of two inferences that evaluate sharpness and visibility, and the third consists of a final inference that combines the two second-layer inferences, providing the final evaluation of the radiograph. Comparison of the results obtained with the evaluation of the same chest radiographs by medical experts shows that the results of this modeling are consistent

  9. Study of evaluation techniques of software testing and V and V in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Youn, Cheong; Baek, Y. W.; Kim, H. C.; Shin, C. Y.; Park, N. J. [Chungnam Nationl Univ., Taejon (Korea, Republic of)

    2000-03-15

    Activities to assure software safety and quality must be carried out on the basis of an established software development process for digitalized nuclear power plants; in particular, software testing and verification and validation (V and V) must be performed. For this purpose, methodologies and tools which can improve software quality are evaluated, and software testing and V and V techniques applicable across the software life cycle are investigated. This study establishes a guideline that can assure software safety and reliability requirements in digitalized nuclear plant systems and can be used by software development organizations as a guidebook for a software development process that assures software quality.

  10. Study on scenario evaluation methodology for decommissioning nuclear facilities using fuzzy logic

    International Nuclear Information System (INIS)

    Matsuhashi, Kazuya; Yanagihara, Satoshi

    2015-01-01

    Since there are many possible scenarios for the process from start to completion of a decommissioning project, it is important to study decommissioning scenarios by evaluating such properties as safety, cost, and technology. An optimum scenario with the highest feasibility in accordance with the facility and environmental conditions should be selected on the basis of the results of the study. For analyzing a decommissioning scenario, we prepared structured work packages by using the work breakdown structure (WBS) method together with qualitative evaluation of the technologies applied to work packages located at the bottom (the third level) of the WBS. A calculation model was constructed to evaluate the feasibility of a scenario, where fuzzy logic is applied to derive a technology performance score and TOPSIS is applied to obtain a feasibility grade of the scenario from the technical performance scores. As a case study, the model was applied to the debris removal scenario of the Fukushima Daiichi Nuclear Power Plant to confirm its applicability. Two scenarios, underwater and in-air debris removal, were characterized by extracting the work packages with the lowest feasibility and by obtaining total average scores of the scenarios. It is confirmed that the methodology developed is useful for the scenario evaluation of decommissioning nuclear facilities. (author)
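
    The TOPSIS step can be sketched independently of the fuzzy scoring front-end. The decision matrix below is illustrative; in the paper, TOPSIS is fed fuzzy-derived technology performance scores:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) against criteria (columns) by closeness
    to the ideal solution. benefit[j] is True when a higher value of
    criterion j is better. Returns one closeness score in [0, 1] per row."""
    m, n = len(matrix), len(matrix[0])
    # Vector-normalize each criterion column, then apply the weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    # Ideal and anti-ideal points across alternatives, per criterion.
    ideal = [max(row[j] for row in v) if benefit[j] else min(row[j] for row in v)
             for j in range(n)]
    anti = [min(row[j] for row in v) if benefit[j] else max(row[j] for row in v)
            for j in range(n)]
    d_pos = [math.sqrt(sum((v[i][j] - ideal[j]) ** 2 for j in range(n))) for i in range(m)]
    d_neg = [math.sqrt(sum((v[i][j] - anti[j]) ** 2 for j in range(n))) for i in range(m)]
    return [d_neg[i] / (d_pos[i] + d_neg[i]) for i in range(m)]
```

    A dominated alternative scores 0 and a dominating one scores 1, so the closeness score directly ranks scenario feasibility.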

  11. Towards an Evaluation Framework for Software Process Improvement

    OpenAIRE

    Cheng, Chow Kian; Permadi, Rahadian Bayu

    2009-01-01

    Software has gained an essential role in our daily life in the last decades. This condition demands high-quality software. To produce high-quality software, many practitioners and researchers pay close attention to the software development process, and large investments are poured into improving it. Software Process Improvement (SPI) is a research area aimed at addressing the assessment and improvement issues in the software development process. One of the most impor...

  12. Action Type Deontic Logic

    DEFF Research Database (Denmark)

    Bentzen, Martin Mose

    2014-01-01

    A new deontic logic, Action Type Deontic Logic, is presented. To motivate this logic, a number of benchmark cases are shown, representing inferences a deontic logic should validate. Some of the benchmark cases are singled out for further comments and some formal approaches to deontic reasoning are evaluated with respect to the benchmark cases. After that follows an informal introduction to the ideas behind the formal semantics, focussing on the distinction between action types and action tokens. Then the syntax and semantics of Action Type Deontic Logic is presented and it is shown to meet

  13. Materials accounting software for evaluation of inventory differences

    International Nuclear Information System (INIS)

    Picard, R.R.; Hafer, J.F.

    1991-01-01

    As a consequence of facility efforts to better understand inventory differences (IDs) and the desire to comply with related regulatory requirements, propagation of uncertainties has received much attention in recent years. This paper reviews several issues regarding software for ID evaluation. Some of these issues are generic (e.g., the importance of identifying individual measured values and individual special nuclear material items by name and the generality needed to handle a wide variety of accounting problems) and others are facility-specific (e.g., interfacing the facility's database to a variance propagation engine and subsequent uses of that engine). In this paper the history of a Los Alamos engine, MAWST, is briefly reviewed and some of the lessons learned during its development are described. Major hurdles to implementation do not involve shortcomings in software or in statistical theory

  14. Combining geographic information system, multicriteria evaluation techniques and fuzzy logic in siting MSW landfills

    Science.gov (United States)

    Gemitzi, Alexandra; Tsihrintzis, Vassilios A.; Voudrias, Evangelos; Petalas, Christos; Stravodimos, George

    2007-01-01

    This study presents a methodology for siting municipal solid waste landfills, coupling geographic information systems (GIS), fuzzy logic, and multicriteria evaluation techniques. Both exclusionary and non-exclusionary criteria are used. Factors, i.e., non-exclusionary criteria, are divided into two distinct groups which do not have the same level of trade-off. The first group comprises factors related to the physical environment, which cannot be expressed in terms of monetary cost and, therefore, do not easily trade off. The second group includes those factors related to human activities, i.e., socioeconomic factors, which can be expressed as financial cost, thus showing a high level of trade-off. GIS are used for geographic data acquisition and processing. The analytical hierarchy process (AHP) is the multicriteria evaluation technique used, enhanced with fuzzy factor standardization. Besides assigning weights to factors through the AHP, control over the level of risk and trade-off in the siting process is achieved through a second set of weights, i.e., order weights, applied to factors in each factor group, on a pixel-by-pixel basis, thus taking into account the local site characteristics. The method has been applied to Evros prefecture (NE Greece), an area of approximately 4,000 km². The siting methodology results in two intermediate suitability maps, one related to environmental and the other to socioeconomic criteria. Combination of the two intermediate maps results in the final composite suitability map for landfill siting.
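
    The second set of weights described above, i.e., the order weights, can be sketched as ordered weighted averaging (OWA). The factor scores below are illustrative pixel suitabilities, not values from the study:

```python
def owa(scores, order_weights):
    """Ordered weighted averaging: order weights attach to rank positions
    (largest score first), not to particular factors, so they control the
    level of risk and trade-off rather than factor importance."""
    assert abs(sum(order_weights) - 1.0) < 1e-9
    ranked = sorted(scores, reverse=True)
    return sum(w * s for w, s in zip(order_weights, ranked))
```

    Putting all order weight on the last rank reproduces the risk-averse minimum (a logical AND), all weight on the first rank the risk-taking maximum (OR), and equal weights the full-trade-off average; intermediate weight vectors tune risk and trade-off continuously between these extremes.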

  15. A logic model framework for evaluation and planning in a primary care practice-based research network (PBRN)

    Science.gov (United States)

    Hayes, Holly; Parchman, Michael L.; Howard, Ray

    2012-01-01

    Evaluating effective growth and development of a Practice-Based Research Network (PBRN) can be challenging. The purpose of this article is to describe the development of a logic model and how the framework has been used for planning and evaluation in a primary care PBRN. An evaluation team was formed consisting of the PBRN directors, staff and its board members. After the mission and the target audience were determined, facilitated meetings and discussions were held with stakeholders to identify the assumptions, inputs, activities, outputs, outcomes and outcome indicators. The long-term outcomes outlined in the final logic model are two-fold: 1.) Improved health outcomes of patients served by PBRN community clinicians; and 2.) Community clinicians are recognized leaders of quality research projects. The Logic Model proved useful in identifying stakeholder interests and dissemination activities as an area that required more attention in the PBRN. The logic model approach is a useful planning tool and project management resource that increases the probability that the PBRN mission will be successfully implemented. PMID:21900441

  16. QUALITY SERVICES EVALUATION MODEL BASED ON DEDICATED SOFTWARE TOOL

    Directory of Open Access Journals (Sweden)

    ANDREEA CRISTINA IONICĂ

    2012-10-01

    Full Text Available In this paper we introduce a new model, called Service Quality (SQ), which combines the QFD and SERVQUAL methods. This model takes from the SERVQUAL method the five dimensions of requirements and three of characteristics, and from the QFD method the application methodology. The originality of the SQ model consists in computing a global index that reflects how well the quality characteristics fulfill the customers' requirements. In order to prove the viability of the SQ model, a software tool was developed and applied for the evaluation of a health care services provider.

  17. Embedding Logics into Product Logic

    Czech Academy of Sciences Publication Activity Database

    Baaz, M.; Hájek, Petr; Krajíček, Jan; Švejda, David

    1998-01-01

    Roč. 61, č. 1 (1998), s. 35-47 ISSN 0039-3215 R&D Projects: GA AV ČR IAA1030601 Grant - others:COST(XE) Action 15 Keywords : fuzzy logic * Lukasiewicz logic * Gödel logic * product logic * computational complexity * arithmetical hierarchy Subject RIV: BA - General Mathematics

  18. Evaluation of The Virtual Cells Software: a Teaching Tool

    Directory of Open Access Journals (Sweden)

    C.C.P. da Silva

    2005-07-01

    Full Text Available Studies show that the use of games and interactive materials at schools is a good educational strategy, motivating students to create mental outlines, developing reasoning and facilitating learning. In this context, the Scientific Dissemination Coordination of the Center for Structural Molecular Biotechnology (CBME) developed a series of educational materials intended for elementary and high schools, universities and the general public. Among these, we highlight the Virtual Cells software, which was developed with the aim of helping in the understanding of the basic concepts of cell types, their structures, organelles and specific functions. Characterized by its interactive interface, this software shows images of eukaryotic and prokaryotic cells, where organelles are shown as dynamic structures. In addition, it presents exercises in another step that reinforce the comprehension of cytology. A speaker narrates the resources offered by the program and the steps necessary for its use. During the development stage of the software, students and teachers of public and private high schools from the city of Sao Carlos, Sao Paulo State, were invited to register their opinions regarding the language and content of the software in order to help us improve it. After this stage, the Scientific Dissemination Coordination of CBME organized a series of workshops, where 120 individuals (high school students and teachers, and other undergraduate students) evaluated the software. For this evaluation, a questionnaire was elaborated based on the current international literature in the area of science teaching, and it was applied after the interactive session with the software. The analysis of the results demonstrated that most of the individuals considered the software of easy

  19. evaluation of a multi-variable self-learning fuzzy logic controller

    African Journals Online (AJOL)

    Dr Obe

    2003-03-01

    Mar 1, 2003 ... The most challenging aspect of the design of a fuzzy logic controller is ... inaccuracy (or structured uncertainty) and unmodelled ... mathematical analysis on paper is impossible ... output (SISO) system that can self-construct ...

  20. Safety assurance logic techniques for evaluation of accident prevention and mitigation

    International Nuclear Information System (INIS)

    McWethy, L.M.; Hagan, J.W.

    1976-01-01

    Safety assurance methods have been developed and applied in reactor safety assessments of FFTF. These methods promote visibility of the total safety provided by the plant, both in prevention of off-normal or accident conditions as well as provision of various features which terminate conditions within acceptable bounds if such conditions should occur. One of the primary techniques applied in safety assurance is the development of safety assurance diagrams. These diagrams explicitly identify the multiple lines of defense which prevent accident progression. The diagrams graphically demonstrate the defense-in-depth provided by the plant for each postulated occurrence. Lines of defense are shown against ever having an occurrence in the first place; thus giving appropriate emphasis on accident prevention, and visibility to the designer's role in promoting this level of safety. These diagrams, or accident process trees, also show graphically the various paths of postulated accident progression to their logical termination. Evaluation of the importance and strength of each line-of-defense assures fulfillment of the safety objectives of the overall plant system
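
    If the lines of defense in such a diagram are assumed to fail independently, the quantitative payoff of defense-in-depth reduces to a product of failure probabilities. This is a deliberate simplification (common-cause failures break the independence assumption), and the numbers below are illustrative:

```python
from math import prod

def progression_probability(line_failure_probs):
    """Probability that a postulated occurrence breaches every line of
    defense, assuming the lines fail independently (a strong assumption:
    real assessments must treat common-cause failures explicitly)."""
    return prod(line_failure_probs)
```

    Three lines that each fail once in a hundred demands yield a combined progression probability of one in a million, which is why the diagrams emphasize identifying, counting and strengthening independent lines of defense.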

  1. Evaluation of software architecture using fuzzy colored Petri nets

    Directory of Open Access Journals (Sweden)

    Vahid Abroshan

    2013-02-01

    Full Text Available Software Architecture (SA) is one of the most important artifacts in the life cycle of a software system because it incorporates important decisions and principles for the system development. On the other hand, the development of systems based on uncertain and ambiguous requirements has increased significantly. Therefore, SA requirements have received significant attention. In this paper, we present a new method for evaluating performance characteristics of an SA, such as response time and queue length, based on a use case. Since there are ambiguities associated with the considered systems, we use the idea of Fuzzy UML (F-UML) diagrams. In addition, these diagrams have been enriched with performance annotations using the proposed Fuzzy-SPT sub-profile, an extended version of the SPT profile proposed by the OMG. Then, these diagrams are mapped into an executable model based on Fuzzy Colored Petri Nets (FCPN) and finally the performance metrics are calculated using the proposed algorithms. We used CPN Tools for creating and evaluating the FCPN model.
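
    The executable model rests on the basic Petri-net firing rule, sketched below for plain (uncolored, crisp) nets; the fuzzy colored extension in the paper additionally attaches typed data and fuzzy values to tokens, which this sketch omits. The place and transition names are hypothetical:

```python
def enabled(marking, pre):
    """A transition is enabled when every input place holds enough tokens."""
    return all(marking.get(p, 0) >= n for p, n in pre.items())

def fire(marking, pre, post):
    """Fire a transition: consume its pre tokens, produce its post tokens."""
    if not enabled(marking, pre):
        raise ValueError("transition not enabled")
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m
```

    For a marking {'queue': 2, 'server_idle': 1}, firing a "start service" transition that consumes one token from each input place yields {'queue': 1, 'server_idle': 0, 'in_service': 1}, after which the transition is disabled until the server becomes idle again; performance metrics such as queue length are then read off the markings reached during simulation.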

  2. Fuzzy logic approach for energetic and economic evaluation of hydroelectric projects

    International Nuclear Information System (INIS)

    Iliev, Atanas M.

    2003-01-01

    A mathematical model for the energetic and economic evaluation of hydroelectric projects is developed. The main advantage of the proposed methodology is that the model considers the uncertainty and vagueness that appear during the decision-making process. Because it models variables that are non-statistical in character, a fuzzy logic approach is fully incorporated in the model. The first step in the energetic evaluation of hydro power projects is determination of the efficiency characteristic of the units to be installed in hydro power plants. For this purpose, a model that uses the best characteristics of the Adaptive Network-based Fuzzy Inference System (ANFIS) is applied. The method is tested on real systems: HPP Tikves, a power plant in operation, and HPP Kozjak, a power plant under construction. The results obtained from practical implementation show that the proposed approach gives better results than classical polynomial approximation. The model for determining the consumption characteristic of a hydro power plant is developed with a Sugeno fuzzy logic system with polynomials in the consequent part of the rules. The model takes into account the variable gross head of the HPP, as well as the number of units in operation for a given output. The gross head and power output are modeled by expert-designed membership functions. This model is applied in practice on HPP Tikves to determine the consumption characteristic for several gross heads. The plausible yearly production of electricity from a hydro power project, which is important for estimating the benefit of the project, is calculated by a mixed fuzzy-statistical model. In this approach, a fuzzy set of the inflow is constructed according to the statistical parameters. The calculation of the production of electricity is realized for several hydrological conditions described by linguistic variables.
Finally, Mamdani Fuzzy Inference System with fuzzy number in consequent part

  3. Logical labyrinths

    CERN Document Server

    Smullyan, Raymond

    2008-01-01

    This book features a unique approach to the teaching of mathematical logic by putting it in the context of the puzzles and paradoxes of common language and rational thought. It serves as a bridge from the author's puzzle books to his technical writing in the fascinating field of mathematical logic. Using the logic of lying and truth-telling, the author introduces readers to informal reasoning, preparing them for the formal study of symbolic logic, from propositional logic to first-order logic, a subject that has many important applications to philosophy, mathematics, and computer science.

  4. Evaluation of breast parenchymal density with QUANTRA software

    International Nuclear Information System (INIS)

    Pahwa, Shivani; Hari, Smriti; Thulkar, Sanjay; Angraal, Suveen

    2015-01-01

    To evaluate breast parenchymal density using QUANTRA software and to correlate the numerical breast density values obtained from QUANTRA with ACR BI-RADS breast density categories. Two-view digital mammograms of 545 consecutive women (mean age - 47.7 years) were categorized visually by three independent radiologists into one of the four ACR BI-RADS categories (D1-D4). Numerical breast density values obtained by QUANTRA software were then used to establish the cutoff values for each category using receiver operating characteristic (ROC) analysis. Numerical breast density values obtained by QUANTRA (range - 7-42%) were systematically lower than visual estimates. A QUANTRA breast density value of less than 14.5% could accurately differentiate category D1 from categories D2, D3, and D4 [area under curve (AUC) on ROC analysis - 94.09%, sensitivity - 85.71%, specificity - 84.21%]. QUANTRA density values of <19.5% accurately differentiated categories D1 and D2 from D3 and D4 (AUC - 94.4%, sensitivity - 87.50%, specificity - 84.60%); QUANTRA density values of <26.5% accurately differentiated categories D1, D2, and D3 from category D4 (AUC - 90.75%, sensitivity - 88.89%, specificity - 88.62%). Breast density values obtained by QUANTRA software can be used to obtain objective cutoff values for each ACR BI-RADS breast density category. Although the numerical density values obtained by QUANTRA are lower than visual estimates, they correlate well with the BI-RADS breast density categories assigned visually to the mammograms.
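
    The cutoffs above come from ROC analysis. A hedged sketch of how such a threshold can be chosen (via Youden's J statistic); the density values and labels below are invented for illustration, not QUANTRA output:

    ```python
    # For each candidate threshold, compute sensitivity and specificity for
    # separating two categories, and keep the threshold maximizing
    # J = sensitivity + specificity - 1 (Youden's index).

    def best_cutoff(values, labels):
        """Return (threshold, sensitivity, specificity) maximizing Youden's J.
        labels: 1 = higher-density category, 0 = lower-density category.
        A case is called positive when its value >= threshold."""
        best = None
        for t in sorted(set(values)):
            tp = sum(1 for v, y in zip(values, labels) if v >= t and y == 1)
            fn = sum(1 for v, y in zip(values, labels) if v < t and y == 1)
            tn = sum(1 for v, y in zip(values, labels) if v < t and y == 0)
            fp = sum(1 for v, y in zip(values, labels) if v >= t and y == 0)
            sens = tp / (tp + fn)
            spec = tn / (tn + fp)
            j = sens + spec - 1
            if best is None or j > best[0]:
                best = (j, t, sens, spec)
        return best[1], best[2], best[3]

    density = [8, 9, 11, 13, 12, 18, 20, 22, 25, 30]     # % density (made up)
    category = [0, 0, 0, 0, 0, 1, 1, 1, 1, 1]            # 0 = D1, 1 = D2-D4
    print(best_cutoff(density, category))                # → (18, 1.0, 1.0)
    ```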

  5. Sensory evaluation of aromatic foods packed in developed starch based films using fuzzy logic

    Directory of Open Access Journals (Sweden)

    Tanima Chowdhury

    2015-04-01

    Full Text Available The last two decades have seen attempts to replace non-biodegradable, synthetic food packaging films with alternatives made from biopolymers. The objective of the present work was to evaluate the sensory quality of tea leaf and culinary tastemaker powder sealed in pouches based on starch films. Films were developed from corn starch and a functional polysaccharide (FP) from amylose (AM), methylcellulose (MC), and hydroxypropylmethylcellulose (HPMC), using a casting technique. Pouches were stored inside a secondary package (plastic jar) under ambient conditions for 90 days. Sensory attributes of the stored food samples were evaluated (tea in liquor form) and the scores analysed by fuzzy logic. Results were compared with similarly stored foods packed in market-available poly-pouches. For tea and tastemaker in general, the relative importance of the sensory attributes under consideration was assessed as: aroma (Highly important) > taste (Highly important) > colour (Highly important) > strength (Important) for tea, and taste (Highly important) > aroma (Highly important) > colour (Important) > appearance (Important) for tastemaker. Among the three films developed, the highly important sensory attributes of aroma and taste were maintained as 'Very good' when the foods were packed in the starch-HPMC/AM film. The products packed in market-available poly-pouches exhibited similar attributes. With the exception of 'Very good' maintenance of the colour of tastemaker by the commercial pouch, irrespective of film and food, the colour and strength/appearance were retained in the 'Good' to 'Satisfactory' range. The overall sensory score of tea was also maintained as 'Very good' in the starch-HPMC film.
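
    Fuzzy-logic sensory evaluation of this kind typically combines triangular fuzzy attribute scores with relative weights and maps the result to a verbal grade. A hedged sketch in that spirit (not the paper's exact procedure; all weights, scores, and grade boundaries are invented):

    ```python
    # Each attribute gets a triangular fuzzy score (a, m, b) on a 0-100 scale
    # from the panel; the weighted sum is defuzzified by its centroid and
    # mapped to a linguistic grade.

    GRADES = [(90, "Excellent"), (75, "Very good"), (60, "Good"),
              (40, "Satisfactory"), (0, "Not satisfactory")]

    def overall(scores, weights):
        """Weighted sum of triangular fuzzy scores; weights must sum to 1."""
        a = sum(w * s[0] for s, w in zip(scores, weights))
        m = sum(w * s[1] for s, w in zip(scores, weights))
        b = sum(w * s[2] for s, w in zip(scores, weights))
        return (a, m, b)

    def grade(tfn):
        centroid = sum(tfn) / 3.0
        for cutoff, label in GRADES:
            if centroid >= cutoff:
                return label

    # aroma, taste, colour, strength of a stored tea sample (illustrative)
    scores = [(70, 80, 90), (65, 75, 85), (55, 65, 75), (50, 60, 70)]
    weights = [0.3, 0.3, 0.2, 0.2]
    print(grade(overall(scores, weights)))   # → Good
    ```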

  6. Logic functions and equations examples and exercises

    CERN Document Server

    Steinbach, Bernd

    2009-01-01

    With a free, downloadable software package available to help solve the exercises, this book focuses on practical and relevant problems that arise in the field of binary logics, with its two main applications - digital circuit design, and propositional logics.

  7. Use Case Evaluation (UCE): A Method for Early Usability Evaluation in Software Development

    DEFF Research Database (Denmark)

    Stage, Jan; Høegh, Rune Thaarup; Hornbæk, K.

    2007-01-01

    It is often argued that usability problems should be identified as early as possible during software development, but many usability evaluation methods do not fit well in early development activities. We propose a method for usability evaluation of use cases, a widely used representation of design ideas produced early in software development processes. The method proceeds by systematic inspection of use cases with reference to a set of guidelines for usable design. To validate the method, four evaluators inspected a set of use cases for a health care application.

  8. Ambiguity, logic, simplicity, and dynamics: Wittgensteinian evaluative criteria in peer review of quantitative research on categorization.

    Science.gov (United States)

    Shimp, Charles P

    2004-06-30

    Research on categorization has changed over time, and some of these changes resemble how Wittgenstein's views changed from his Tractatus Logico-Philosophicus to his Philosophical Investigations. Wittgenstein initially focused on unambiguous, abstract, parsimonious, logical propositions and rules, and on independent, static, "atomic facts." This approach subsequently influenced the development of logical positivism and thereby may have indirectly influenced method and theory in research on categorization: much animal research on categorization has focused on learning simple, static, logical rules unambiguously interrelating small numbers of independent features. He later rejected logical simplicity and rigor and focused instead on Gestalt ideas about figure-ground reversals and context, the ambiguity of family resemblance, and the function of details of everyday language. Contemporary contextualism has been influenced by this latter position, some features of which appear in contemporary empirical research on categorization. These developmental changes are illustrated by research on avian local and global levels of visual perceptual analysis, categorization of rectangles and moving objects, and artificial grammar learning. Implications are described for peer review of quantitative theory in which ambiguity, logical rigor, simplicity, or dynamics are designed to play important roles.

  9. Some relationships between logic programming and multiple-valued logic

    International Nuclear Information System (INIS)

    Rine, D.C.

    1986-01-01

    There have been suggestions in the artificial intelligence literature that investigations into relationships between logic programming and multiple-valued logic may be helpful. This paper presents some of these relationships through equivalent algebraic evaluations
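
    One concrete instance of such a relationship is evaluating connectives under Kleene's strong three-valued semantics, where the third value stands for an as-yet underivable goal. A toy sketch (the numeric encoding is an illustrative choice, not taken from the paper):

    ```python
    # Kleene's strong three-valued logic via min/max/complement over
    # truth values encoded as numbers: True = 1.0, Unknown = 0.5, False = 0.0.

    T, U, F = 1.0, 0.5, 0.0

    def neg(x):     return 1.0 - x
    def conj(x, y): return min(x, y)
    def disj(x, y): return max(x, y)

    # The classical tautology "p or not p" is no longer fully true
    # when p is unknown:
    p = U
    print(disj(p, neg(p)))   # → 0.5, i.e. unknown
    ```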

  10. Mathematical logic

    CERN Document Server

    Kleene, Stephen Cole

    1967-01-01

    Undergraduate students with no prior instruction in mathematical logic will benefit from this multi-part text. Part I offers an elementary but thorough overview of mathematical logic of 1st order. Part II introduces some of the newer ideas and the more profound results of logical research in the 20th century. 1967 edition.

  11. BDI Logics

    NARCIS (Netherlands)

    Meyer, J.J.Ch.; Broersen, J.M.; Herzig, A.

    2015-01-01

    This paper presents an overview of so-called BDI logics, logics where the notion of Beliefs, Desires and Intentions play a central role. Starting out from the basic ideas about BDI by Bratman, we consider various formalizations in logic, such as the approach of Cohen and Levesque, slightly

  12. Metric-based Evaluation of Implemented Software Architectures

    NARCIS (Netherlands)

    Bouwers, E.M.

    2013-01-01

    Software systems make up an important part of our daily lives. Just like all man-made objects, the possibilities of a software system are constrained by the choices made during its creation. The complete set of these choices can be referred to as the software architecture of a system. Since the

  13. Continuously variable rating: a new, simple and logical procedure to evaluate original scientific publications

    Directory of Open Access Journals (Sweden)

    Mauricio Rocha e Silva

    2011-01-01

    Full Text Available OBJECTIVE: Impact Factors (IF) are widely used surrogates to evaluate single articles, in spite of known shortcomings imposed by the skewness of citation distributions. We quantify this asymmetry and propose a simple computer-based procedure for evaluating individual articles. METHOD: (a) Analysis of symmetry. Journals clustered around nine Impact Factor points were selected from the medical "Subject Categories" in Journal Citation Reports 2010. Citable items published in 2008 were retrieved and ranked by granted citations over the Jan/2008-Jun/2011 period. Frequency distribution of cites, normalized cumulative cites, and absolute cites/decile were determined for each journal cluster. (b) Positive Predictive Value. Three arbitrarily established evaluation classes were generated: LOW (1.33.9. The Positive Predictive Value for journal clusters within each class range was estimated. (c) Continuously Variable Rating. An alternative evaluation procedure is proposed to allow the rating of individual published articles in comparison to all articles published in the same journal within the same year of publication. The general guiding lines for the construction of a fully dedicated software program are delineated. RESULTS AND CONCLUSIONS: Skewness followed the Pareto Distribution for (1
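
    The proposed rating reduces, in essence, to ranking an article against all articles published in the same journal in the same year. A minimal sketch of that computation (citation counts are invented):

    ```python
    # Rate an article by the fraction of same-journal, same-year articles
    # it outranks in citations, instead of by the journal's impact factor.

    def rating(article_cites, journal_year_cites):
        """Percentile-style rank in [0, 1]: share of cohort articles
        with strictly fewer citations."""
        below = sum(1 for c in journal_year_cites if c < article_cites)
        return below / len(journal_year_cites)

    # Citation counts of all 2008 articles in one hypothetical journal:
    cohort = [0, 0, 1, 1, 2, 3, 5, 8, 13, 40]
    print(rating(5, cohort))   # → 0.6, outranks 6 of the 10 cohort articles
    ```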

  14. Research and Evaluations of the Health Aspects of Disasters, Part VI: Interventional Research and the Disaster Logic Model.

    Science.gov (United States)

    Birnbaum, Marvin L; Daily, Elaine K; O'Rourke, Ann P; Kushner, Jennifer

    2016-04-01

    Disaster-related interventions are actions or responses undertaken during any phase of a disaster to change the current status of an affected community or a Societal System. Interventional disaster research aims to evaluate the results of such interventions in order to develop standards and best practices in Disaster Health that can be applied to disaster risk reduction. Considering interventions as production functions (transformation processes) structures the analyses and cataloguing of interventions/responses that are implemented prior to, during, or following a disaster or other emergency. Since it is currently not possible to conduct randomized, controlled studies of disasters, the results of individual studies must be compared and synthesized with results from other studies (ie, systematic reviews) in order to validate the derived standards and best practices. Such reviews will be facilitated by the selected studies being structured using accepted frameworks. A logic model is a graphic representation of the transformation processes of a program [project] that shows the intended relationships between investments and results. Logic models are used to describe a program and its theory of change, and they provide a method for analyzing and evaluating interventions. The Disaster Logic Model (DLM) is an adaptation of a logic model used for the evaluation of educational programs and provides the structure required for the analysis of disaster-related interventions. It incorporates: a definition of the current functional status of a community or Societal System; identification of needs; definition of goals; selection of objectives; implementation of the intervention(s); and evaluation of the effects, outcomes, costs, and impacts of the interventions. It is useful for determining the value of an intervention, and it also provides the structure for analyzing the processes used in providing the intervention according to the Relief/Recovery and Risk-Reduction Frameworks.

  15. Oak Ridge National Laboratory Technology Logic Diagram. Volume 3, Technology evaluation data sheets: Part B, Dismantlement, Remedial action

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    The Oak Ridge National Laboratory Technology Logic Diagram (TLD) was developed to provide a decision support tool that relates environmental restoration (ER) and waste management (WM) problems at Oak Ridge National Laboratory (ORNL) to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed to develop these technologies to a state that allows technology transfer and application to decontamination and decommissioning (D&D), remedial action (RA), and WM activities. The TLD consists of three fundamentally separate volumes: Vol. 1, Technology Evaluation; Vol. 2, Technology Logic Diagram; and Vol. 3, Technology Evaluation Data Sheets. Part A of Vols. 1 and 2 focuses on RA. Part B of Vols. 1 and 2 focuses on the D&D of contaminated facilities. Part C of Vols. 1 and 2 focuses on WM. Each part of Vol. 1 contains an overview of the TLD, an explanation of the problems facing the volume-specific program, a review of identified technologies, and rankings of technologies applicable to the site. Volume 2 (Pts. A, B, and C) contains the logic linkages among EM goals, environmental problems, and the various technologies that have the potential to solve these problems. Volume 3 (Pts. A, B, and C) contains the TLD data sheets. This volume provides the technology evaluation data sheets (TEDS) for ER/WM activities (D&D, RA, and WM) that are referenced by a TEDS code number in Vol. 2 of the TLD. Each of these sheets represents a single logic trace across the TLD. These sheets contain more detail than is given for the technologies in Vol. 2.

  16. Oak Ridge National Laboratory Technology Logic Diagram. Volume 3, Technology evaluation data sheets: Part C, Robotics/automation, Waste management

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    The Oak Ridge National Laboratory Technology Logic Diagram (TLD) was developed to provide a decision support tool that relates environmental restoration (ER) and waste management (WM) problems at Oak Ridge National Laboratory (ORNL) to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed to develop these technologies to a state that allows technology transfer and application to decontamination and decommissioning (D&D), remedial action (RA), and WM activities. The TLD consists of three fundamentally separate volumes: Vol. 1, Technology Evaluation; Vol. 2, Technology Logic Diagram; and Vol. 3, Technology Evaluation Data Sheets. Part A of Vols. 1 and 2 focuses on RA. Part B of Vols. 1 and 2 focuses on the D&D of contaminated facilities. Part C of Vols. 1 and 2 focuses on WM. Each part of Vol. 1 contains an overview of the TLD, an explanation of the problems facing the volume-specific program, a review of identified technologies, and rankings of technologies applicable to the site. Volume 2 (Pts. A, B, and C) contains the logic linkages among EM goals, environmental problems, and the various technologies that have the potential to solve these problems. Volume 3 (Pts. A, B, and C) contains the TLD data sheets. This volume provides the technology evaluation data sheets (TEDS) for ER/WM activities (D&D, RA, and WM) that are referenced by a TEDS code number in Vol. 2 of the TLD. Each of these sheets represents a single logic trace across the TLD. These sheets contain more detail than is given for the technologies in Vol. 2.

  17. Innovative teaching tools of automatic control and evaluation of trainees’s mathematical knowledge using fuzzy logic

    Directory of Open Access Journals (Sweden)

    Светлана Николаевна Дворяткина

    2014-12-01

    Full Text Available This article focuses on the topical problem of designing information systems for the automated control of students' mathematical knowledge using fuzzy logic, systems that take into account the shortcomings of modern evaluation and control approaches. These shortcomings include a limited number of response formats, a two-point scoring system, inflexible procedures for calculating the final assessment, the lack of estimation of the depth and breadth of knowledge, and no adaptation of the estimation procedure to the individual characteristics of the students.

  18. The MEDA Project: Developing Evaluation Competence in the Training Software Domain.

    Science.gov (United States)

    Machell, Joan; Saunders, Murray

    1992-01-01

    The MEDA (Methodologie d'Evaluation des Didacticiels pour les Adultes) tool is a generic instrument to evaluate training courseware. It was developed for software designers to improve products, for instructors to select appropriate courseware, and for distributors and consultants to match software to client needs. Describes software evaluation…

  19. PM 3655 PHILIPS Logic analyzer

    CERN Multimedia

    A logic analyzer is an electronic instrument that captures and displays multiple signals from a digital system or digital circuit. A logic analyzer may convert the captured data into timing diagrams, protocol decodes, state machine traces, assembly language, or may correlate assembly with source-level software. Logic Analyzers have advanced triggering capabilities, and are useful when a user needs to see the timing relationships between many signals in a digital system.

  20. Evaluation of a Multi-Variable Self-Learning Fuzzy Logic Controller ...

    African Journals Online (AJOL)

    In spite of the usefulness of fuzzy control, its main drawback comes from lack of a systematic control design methodology. The most challenging aspect of the design of a fuzzy logic controller is the elicitation of the control rules for its rule base. In this paper, a scheme capable of elicitation of acceptable rules for multivariable ...

  1. Supervision of tunnelling constructions and software used for their evaluation

    Science.gov (United States)

    Caravanas, Aristotelis; Hilar, Matous

    2017-09-01

    Supervision is a common instrument for controlling the construction of tunnels. To suit the purposes of a particular project, the supervision procedure is adapted to local conditions, habits, codes, and the way the tunnelling project is allocated. The duties of tunnel supervision are specified in an agreement with the client and can include a wide range of activities. On large-scale tunnelling projects, supervision tasks are performed by a large number of people of different professions, so teamwork, smooth communication, and coordination are required to fulfil them successfully. The efficiency and quality of tunnel supervision work are enhanced when specialized software applications are used. Such applications should allow on-line data management and the prompt evaluation, reporting, and sharing of relevant construction information. The client is provided with an as-built database containing all the relevant information related to the construction process, which is a valuable tool for claim management as well as for the evaluation of structural defects that may occur in the future. As a result, the level of risk related to tunnel construction is decreased.

  2. Study of evaluation techniques of software safety and reliability in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Youn, Cheong; Baek, Y. W.; Kim, H. C.; Park, N. J.; Shin, C. Y. [Chungnam National Univ., Taejon (Korea, Republic of)

    1999-04-15

    Software system development processes and software quality assurance activities are examined in this study. In particular, software safety and reliability requirements in nuclear power plants are investigated. For this purpose, methodologies and tools that can be applied to the software analysis, design, implementation, testing, and maintenance steps are evaluated. The necessary tasks for each step are investigated, and the duty, input, and detailed activity for each task are defined to establish a development process for high-quality software systems. This means applying the basic concepts of software engineering and the principles of system development. This study establishes a guideline that can assure software safety and reliability requirements in digitalized nuclear plant systems and that can be used by software development organizations as a guidebook for a development process that assures software quality.

  3. Y-12 Plant remedial action Technology Logic Diagram: Volume 3, Technology evaluation data sheets: Part A, Remedial action

    International Nuclear Information System (INIS)

    1994-09-01

    The Y-12 Plant Remedial Action Technology Logic Diagram (TLD) was developed to provide a decision-support tool that relates environmental restoration (ER) problems at the Y-12 Plant to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed for sufficient development of these technologies to allow for technology transfer and application to remedial action (RA) activities. The TLD consists of three volumes. Volume 1 contains an overview of the TLD, an explanation of the program-specific responsibilities, a review of identified technologies, and the rankings of remedial technologies. Volume 2 contains the logic linkages among environmental management goals, environmental problems and the various technologies that have the potential to solve these problems. Volume 3 contains the TLD data sheets. This report is Part A of Volume 3 and contains the Remedial Action section

  4. Y-12 Plant remedial action Technology Logic Diagram: Volume 3, Technology evaluation data sheets: Part B, Characterization; robotics/automation

    International Nuclear Information System (INIS)

    1994-09-01

    The Y-12 Plant Remedial Action Technology Logic Diagram (TLD) was developed to provide a decision-support tool that relates environmental restoration (ER) problems at the Y-12 Plant to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed for sufficient development of these technologies to allow for technology transfer and application to remedial action (RA) activities. The TLD consists of three volumes. Volume 1 contains an overview of the TLD, an explanation of the program-specific responsibilities, a review of identified technologies, and the rankings of remedial technologies. Volume 2 contains the logic linkages among environmental management goals, environmental problems, and the various technologies that have the potential to solve these problems. Volume 3 contains the TLD data sheets. This report is Part B of Volume 3 and contains the Characterization and Robotics/Automation sections

  5. Y-12 Plant remedial action Technology Logic Diagram: Volume 3, Technology evaluation data sheets: Part A, Remedial action

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-09-01

    The Y-12 Plant Remedial Action Technology Logic Diagram (TLD) was developed to provide a decision-support tool that relates environmental restoration (ER) problems at the Y-12 Plant to potential technologies that can remediate these problems. The TLD identifies the research, development, demonstration, testing, and evaluation needed for sufficient development of these technologies to allow for technology transfer and application to remedial action (RA) activities. The TLD consists of three volumes. Volume 1 contains an overview of the TLD, an explanation of the program-specific responsibilities, a review of identified technologies, and the rankings of remedial technologies. Volume 2 contains the logic linkages among environmental management goals, environmental problems and the various technologies that have the potential to solve these problems. Volume 3 contains the TLD data sheets. This report is Part A of Volume 3 and contains the Remedial Action section.

  6. Performance Evaluation of Software Routers with VPN Features

    Directory of Open Access Journals (Sweden)

    H. Redžović

    2017-11-01

    Full Text Available This paper presents the implementation and analysis of a VPN software router based on the Quagga and strongSwan open-source software tools. We validated the functionalities of strongSwan and Quagga in a realistic environment, including scenarios with link failures. We also measured and analyzed the performance of the encryption and hash algorithms supported by the strongSwan software, in order to recommend an optimal VPN configuration that provides the best performance.

  7. Methodology for economic evaluation of software development projects

    International Nuclear Information System (INIS)

    Witte, D.M.

    1990-01-01

    Many oil and gas exploration and production companies develop computer software in-house or with contract programmers to support their exploration activities. Software development projects compete for funding with exploration and development projects, though most companies lack valid comparison measures for the two types of projects. This paper presents a methodology of pro forma cash flow analysis for software development proposals intended for internal use. This methodology, based on estimates of development and support costs, exploration benefits, and the probability of successful development and implementation, can be used to compare proposed software development projects directly with competing exploration proposals.
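
    A hedged sketch of the pro forma comparison the abstract describes: discount the project's cost and benefit streams and weight the benefits by the probability that development and implementation succeed. All figures below are invented for illustration:

    ```python
    # Risked net present value: costs are committed either way, benefits
    # accrue only if the software is successfully developed and adopted.

    def npv(cashflows, rate):
        """Net present value of year-indexed cash flows (year 0 = today)."""
        return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

    def risked_npv(costs, benefits, p_success, rate):
        return npv(costs, rate) + p_success * npv(benefits, rate)

    dev_costs = [-400.0, -100.0, -50.0, -50.0]   # $k/year: build, then support
    benefits  = [0.0, 200.0, 300.0, 300.0]       # $k/year if the tool succeeds

    print(round(risked_npv(dev_costs, benefits, 0.9, 0.10), 1))   # → 19.8
    ```

    With a 90% success probability the risked NPV is slightly positive; lowering it to 70% turns the project negative, which is the kind of direct comparison with exploration proposals the methodology enables.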

  8. A system for automatic evaluation of simulation software

    Science.gov (United States)

    Ryan, J. P.; Hodges, B. C.

    1976-01-01

    Within the field of computer software, simulation and verification are complementary processes. Simulation methods can be used to verify software by performing variable range analysis. More general verification procedures, such as those described in this paper, can implicitly be viewed as attempts at modeling the end-product software. From the standpoint of software requirements methodology, each component of the verification system has some element of simulation to it. Conversely, general verification procedures can be used to analyze simulation software. A dynamic analyzer is described which can be used to obtain properly scaled variables for an analog simulation, which is first digitally simulated. In a similar way, it is thought that the other system components, and indeed the whole system itself, could be used effectively in a simulation environment.

  9. Dispositional logic

    Science.gov (United States)

    Le Balleur, J. C.

    1988-01-01

    The applicability of conventional mathematical analysis (based on the combination of two-valued logic and probability theory) to problems in which human judgment, perception, or emotions play significant roles is considered theoretically. It is shown that dispositional logic, a branch of fuzzy logic, has particular relevance to the common-sense reasoning typical of human decision-making. The concepts of dispositionality and usuality are defined analytically, and a dispositional conjunctive rule and dispositional modus ponens are derived.
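
    The "usuality" concept can be sketched as a fuzzy proportion: the truth degree of "usually X" is how well the observed proportion of X fits a fuzzy quantifier such as "most". A toy illustration (the membership function is an invented example, not Zadeh's exact definition):

    ```python
    # Dispositional statement "birds usually fly", evaluated through a fuzzy
    # quantifier over an observed sample.

    def most(p):
        """Illustrative membership function for the fuzzy quantifier 'most':
        0 below a proportion of 0.5, 1 above 0.9, linear in between."""
        if p <= 0.5:
            return 0.0
        if p >= 0.9:
            return 1.0
        return (p - 0.5) / 0.4

    # 8 of 10 observed birds fly, so "birds usually fly" holds to degree ~0.75:
    print(most(8 / 10))
    ```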

  10. Software Quality Evaluation Models Applicable in Health Information and Communications Technologies. A Review of the Literature.

    Science.gov (United States)

    Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa

    2016-01-01

    The use of Information and Communications Technologies in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation, and health, we selected and analysed 20 original research papers published from 2005-2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software and improve use outcomes. The ISO/IEC 25000 standard is shown to be the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition.

  11. Evaluating Business Intelligence/Business Analytics Software for Use in the Information Systems Curriculum

    Science.gov (United States)

    Davis, Gary Alan; Woratschek, Charles R.

    2015-01-01

    Business Intelligence (BI) and Business Analytics (BA) Software has been included in many Information Systems (IS) curricula. This study surveyed current and past undergraduate and graduate students to evaluate various BI/BA tools. Specifically, this study compared several software tools from two of the major software providers in the BI/BA field.…

  12. Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.

    Science.gov (United States)

    Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed

    2015-02-01

    Evaluating and selecting software packages that meet an organization's requirements are difficult aspects of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and the functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed in which a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the evaluation basis, and the systems were ranked on a set of metric outcomes using the integrated Analytic Hierarchy Process (AHP) and TOPSIS. The experimental results showed that GNUmed and OpenEMR achieved better ranking scores than the other open-source EMR software packages. Copyright © 2014 Elsevier Inc. All rights reserved.
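The AHP-plus-TOPSIS ranking this record describes can be sketched as follows. The AHP step is replaced here by a fixed weight vector, and the decision matrix, criteria, and package scores are all invented for illustration:

```python
# Minimal TOPSIS ranking sketch: normalise the decision matrix, weight it,
# then score each alternative by closeness to the ideal solution.
import math

def topsis(matrix, weights, benefit):
    # matrix: rows = alternatives, cols = criteria
    ncols = len(weights)
    # vector-normalise each column, then apply the criterion weights
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(ncols)]
    v = [[weights[j] * row[j] / norms[j] for j in range(ncols)] for row in matrix]
    # ideal / anti-ideal points (max for benefit criteria, min for cost)
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    anti = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = math.dist(row, ideal)
        d_neg = math.dist(row, anti)
        scores.append(d_neg / (d_pos + d_neg))  # relative closeness to ideal
    return scores

# Three hypothetical EMR packages scored on usability, features, cost.
scores = topsis(
    matrix=[[7, 9, 3], [8, 7, 4], [6, 6, 2]],
    weights=[0.5, 0.3, 0.2],       # stand-in for AHP-derived weights
    benefit=[True, True, False],   # cost is a cost criterion
)
best = scores.index(max(scores))
```

The closeness score lies in [0, 1]; the alternative with the largest score is ranked first.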

  13. Multi-channel software defined radio experimental evaluation and analysis

    CSIR Research Space (South Africa)

    Van der Merwe, JR

    2014-09-01

    Full Text Available Multi-channel software-defined radios (SDRs) can be utilised as inexpensive prototyping platforms for transceiver arrays. The application for multi-channel prototyping is discussed and measured results of coherent channels for both receiver...

  14. [Logic model of the Franche-Comté Regional Health Project: advantages and limitations for the evaluation process].

    Science.gov (United States)

    Michaud, Claude; Sannino, Nadine; Duboudin, Cédric; Baudier, François; Guillin, Caroline; Billondeau, Christine; Mansion, Sylvie

    2014-01-01

    The French "Hospitals, patients, health and territories" law of July 2009 created the Regional Health Project (PRS) to support regional health policy, and requires evaluation of these projects. The construction of these projects, which include prevention planning, care planning, and medical and social welfare planning, presents an unprecedented complexity in France, where evaluation programmes are still in their infancy. To support future evaluations, the Franche-Comté Regional Health Agency (ARS FC), assisted by the expertise of EFECT Consultants, decided to reconstruct the PRS logic model. This article analyzes the advantages and limitations of this approach. The resulting logic model allows visualization of the strategy adopted to achieve the Franche-Comté PRS ambitions and expected results. The model highlights four main aspects of structural change to the health system that are often poorly visible in PRS presentation documents. It also establishes links with the usual public policy evaluation issues and facilitates their prioritization. This approach likewise shows the importance of analysing how the programme was constructed, rather than directly analysing its effects, which is the natural tendency of current practice. The main limitation concerns the retrospective design of the PRS logic model, both in terms of the reliability of interpretation and its adoption by actors not directly involved in the initiative.

  15. Logic Meeting

    CERN Document Server

    Tugué, Tosiyuki; Slaman, Theodore

    1989-01-01

    These proceedings include the papers presented at the logic meeting held at the Research Institute for Mathematical Sciences, Kyoto University, in the summer of 1987. The meeting mainly covered the current research in various areas of mathematical logic and its applications in Japan. Several lectures were also presented by logicians from other countries, who visited Japan in the summer of 1987.

  16. Veterinary software application for comparison of thermograms for pathology evaluation

    Science.gov (United States)

    Pant, Gita; Umbaugh, Scott E.; Dahal, Rohini; Lama, Norsang; Marino, Dominic J.; Sackman, Joseph

    2017-09-01

    The bilateral symmetry property in mammals allows for the detection of pathology by comparison of opposing sides. For any pathological disorder, thermal patterns differ from those of the normal body part. A software application for veterinary clinics has been under development that takes as input two thermograms of body parts on both sides, one normal and the other unknown, compares them based on extracted features and appropriate similarity and difference measures, and outputs the likelihood of pathology. Here thermographic image data from 19 °C to 40 °C was linearly remapped to create images with 256 gray-level values. Features were extracted from these images, including histogram, texture and spectral features. The comparison metrics used are the vector inner product, Tanimoto, Euclidean, city block, Minkowski and maximum value metrics. Previous research with anterior cruciate ligament (ACL) pathology in dogs suggested that any thermogram variation below a threshold of 40% of the Euclidean distance is normal and above 40% is abnormal. Here the 40% threshold was applied to a new ACL image set and achieved a sensitivity of 75%, an improvement over the 55% sensitivity of the previous work. With the new data set it was determined that a threshold of 20% provided a much improved 92% sensitivity. However, further research is required to determine the corresponding specificity. Additionally, it was found that the anterior view provided better results than the lateral view, and that better results were obtained with all three feature sets than with just the histogram and texture sets. Further experiments are ongoing with larger image datasets, additional pathologies, new features, and comparison-metric evaluation to determine more accurate threshold values for separating normal and abnormal images.
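The threshold comparison this record describes can be sketched as a relative Euclidean distance between feature vectors from the two sides. The feature values below and the normalisation by the magnitude of the normal side's vector are assumptions for illustration; the paper does not specify its exact normalisation:

```python
# Compare a "normal" and an "unknown" thermogram feature vector and flag
# the unknown side as abnormal when the relative distance exceeds a
# threshold (40% in the earlier work, 20% in this study).
import math

def relative_euclidean(normal, unknown):
    """Euclidean distance between feature vectors, expressed as a
    fraction of the magnitude of the normal side's vector."""
    dist = math.dist(normal, unknown)
    ref = math.sqrt(sum(x * x for x in normal))
    return dist / ref

def classify(normal, unknown, threshold=0.20):
    return "abnormal" if relative_euclidean(normal, unknown) > threshold else "normal"

# Hypothetical histogram features (mean temp, std dev, energy, entropy)
healthy = [31.2, 2.1, 0.40, 5.1]
injured = [38.8, 4.9, 0.22, 6.4]
```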

  17. Improved Fuzzy Logic System to Evaluate Milk Electrical Conductivity Signals from On-Line Sensors to Monitor Dairy Goat Mastitis

    Directory of Open Access Journals (Sweden)

    Mauro Zaninelli

    2016-07-01

    Full Text Available The aim of this study was to develop and test a new fuzzy logic model for monitoring the udder health status (HS) of goats. The model evaluated, as input variables, the milk electrical conductivity (EC) signal, acquired on-line for each gland by a dedicated sensor, together with the bandwidth length and the frequency and amplitude of the first main peak of the Fourier frequency spectrum of the recorded milk EC signal. Two foremilk gland samples were collected from eight Saanen goats for six months at morning milking, across three lactation stages (LS: 0–60 Days In Milking (DIM), 61–120 DIM, 121–180 DIM), for a total of 5592 samples. Bacteriological analyses and somatic cell counts (SCC) were used to define the HS of the glands. With negative bacteriological analyses and SCC < 1,000,000 cells/mL, glands were classified as healthy. When bacteriological analyses were positive or SCC > 1,000,000 cells/mL, glands were classified as not healthy (NH). For each EC signal, an estimated EC value was calculated and a relative deviation was obtained. Furthermore, the Fourier frequency spectrum was evaluated, and the bandwidth length and the frequency and amplitude of the first main peak were identified. Before these indexes were used as input variables of the fuzzy logic model, a linear mixed-effects model was developed to evaluate the acquired data, considering HS, LS and LS × HS as explanatory variables. Results showed that the performance of a fuzzy logic model in monitoring mammary gland HS could be improved by using EC indexes derived from the Fourier frequency spectra of gland milk EC signals recorded by on-line EC sensors.
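The spectral indexes this record uses (frequency and amplitude of the first main peak of the EC signal's Fourier spectrum) can be sketched as follows. The simulated EC trace, its 1 Hz sampling rate, and the plain DFT implementation are all assumptions for illustration:

```python
# Extract the dominant spectral peak of a (simulated) milk EC signal.
import cmath
import math

def dft_magnitudes(signal):
    """One-sided DFT magnitude spectrum, normalised by signal length."""
    n = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n))) / n
            for k in range(n // 2 + 1)]

def first_main_peak(signal, sample_rate=1.0):
    """Frequency and amplitude of the dominant non-DC spectral bin."""
    mags = dft_magnitudes(signal)
    k = max(range(1, len(mags)), key=lambda i: mags[i])  # skip DC (k = 0)
    freq = k * sample_rate / len(signal)
    return freq, mags[k]

# Simulated EC trace: constant baseline plus a periodic component that
# completes 8 cycles over 64 samples (i.e. 0.125 in normalised frequency).
n = 64
ec = [5.0 + 0.8 * math.sin(2 * math.pi * 8 * t / n) for t in range(n)]
freq, amp = first_main_peak(ec)
```

For a pure sinusoid on an exact bin, the one-sided normalised magnitude is half the sine amplitude, so `amp` comes out near 0.4 here.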

  18. MANUAL LOGIC CONTROLLER (MLC)

    OpenAIRE

    Claude Ziad Bayeh

    2015-01-01

    The “Manual Logic Controller”, also called MLC, is an electronic circuit invented and designed by the author in 2008 in order to replace the well-known PLC (Programmable Logic Controller) in many applications, given its advantages and its low cost of fabrication. The function of the MLC is somewhat similar to that of the well-known PLC, but instead of inserting a written program into the PLC using a computer or specific software inside the PLC, it is manually programmed in a manner to h...

  19. A technical survey on issues of the quantitative evaluation of software reliability

    International Nuclear Information System (INIS)

    Park, J. K; Sung, T. Y.; Eom, H. S.; Jeong, H. S.; Park, J. H.; Kang, H. G.; Lee, K. Y.; Park, J. K.

    2000-04-01

    To develop a methodology for evaluating the reliability of the software included in digital instrumentation and control (I and C) systems, many kinds of methodologies/techniques that have been proposed in the software reliability engineering field are analyzed to identify their strong and weak points. According to the analysis results, no existing methodology/technique can be directly applied to the evaluation of software reliability. Thus additional research to combine the most appropriate of the existing methodologies/techniques would be needed to evaluate the software reliability. (author)

  20. Evaluation of peak-fitting software for gamma spectrum analysis

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.; Moralles, Mauricio

    2009-01-01

    In all applications of gamma-ray spectroscopy, one of the most important and delicate parts of the data analysis is the fitting of the gamma-ray spectra, where information such as the number of counts, the position of the centroid and the width, for instance, is associated with each peak of each spectrum. There is a huge choice of computer programs that perform this type of analysis, and the ones most commonly used in routine work automatically locate and fit the peaks. This fit can be made in several different ways: the most common are to fit a Gaussian function to each peak or simply to integrate the area under the peak, but some programs go far beyond this and include several small corrections to the simple Gaussian peak function in order to compensate for secondary effects. In this work several gamma-ray spectroscopy programs are compared in the task of finding and fitting the gamma-ray peaks in spectra taken with standard sources of 137Cs, 60Co, 133Ba and 152Eu. The results show that all of the automatic programs can be properly used in the task of finding and fitting peaks, with the exception of GammaVision; also, it was possible to verify that the automatic peak-fitting programs performed as well as, and sometimes even better than, manual peak-fitting software. (author)
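A minimal sketch of the Gaussian peak-fitting idea this record discusses: for counts following A·exp(-(x-c)²/(2σ²)), the log-counts form a parabola, so fitting a parabola to the three channels around the maximum recovers the centroid and width. Real spectroscopy software adds background terms, tails, and other corrections on top of this; the synthetic peak parameters below are invented:

```python
# Three-point log-parabola estimate of a Gaussian peak's centroid and width.
import math

def fit_gaussian_3pt(counts):
    # locate the maximum channel (excluding the endpoints)
    m = max(range(1, len(counts) - 1), key=lambda i: counts[i])
    ya, yb, yc = (math.log(counts[m - 1]), math.log(counts[m]),
                  math.log(counts[m + 1]))
    # vertex offset of the log-parabola relative to channel m
    delta = 0.5 * (ya - yc) / (ya - 2 * yb + yc)
    centroid = m + delta
    # second difference of log-counts equals -1/sigma^2 for a Gaussian
    sigma = math.sqrt(-1.0 / (ya - 2 * yb + yc))
    return centroid, sigma

# Synthetic noiseless peak: centroid 10.3 channels, sigma 1.5, amplitude 1000
counts = [1000 * math.exp(-((x - 10.3) ** 2) / (2 * 1.5 ** 2)) for x in range(21)]
centroid, sigma = fit_gaussian_3pt(counts)
```

For a noiseless Gaussian the three-point estimate is exact; with real, noisy spectra a full least-squares fit over more channels is preferred.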

  1. Using a logic model to evaluate the Kids Together early education inclusion program for children with disabilities and additional needs.

    Science.gov (United States)

    Clapham, Kathleen; Manning, Claire; Williams, Kathryn; O'Brien, Ginger; Sutherland, Margaret

    2017-04-01

    Despite clear evidence that learning and social opportunities for children with disabilities and special needs are more effective in inclusive rather than segregated settings, there are few known effective inclusion programs available to children with disabilities, their families or teachers in the early years within Australia. The Kids Together program was developed to support children with disabilities/additional needs aged 0-8 years attending mainstream early learning environments. Using a key worker transdisciplinary team model, the program aligns with the individualised package approach of the National Disability Insurance Scheme (NDIS). This paper reports on the use of a logic model to underpin the process, outcomes and impact evaluation of the Kids Together program. The research team worked across 15 Early Childhood Education and Care (ECEC) centres and in home and community settings. A realist evaluation using mixed methods was undertaken to understand what works, for whom and in what contexts. The development of a logic model provided a structured way to explore how the program was implemented and achieved short, medium and long term outcomes within a complex community setting. Kids Together was shown to be a highly effective and innovative model for supporting the inclusion of children with disabilities/additional needs in a range of environments central to early childhood learning and development. The use of a logic model provided a visual representation of the Kids Together model and its component parts and enabled a theory of change to be inferred, showing how a coordinated and collaborative approach can work across multiple environments. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Oak Ridge National Laboratory Technology Logic Diagram. Volume 1, Technology Evaluation: Part A, Decontamination and Decommissioning

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    The Strategic Roadmap for the Oak Ridge Reservation is a generalized planning document that identifies broad categories of issues that keep ORNL outside full compliance with the law and other legally binding agreements. Possible generic paths to compliance, the issues, and the schedule for resolving them are identified. The role of the Oak Ridge National Laboratory Technology Logic Diagram (TLD) is then to identify specific site issues (problems), identify specific technologies that can be brought to bear on the issues, and assess the current status and readiness of these remediation technologies within the constraints of the schedule commitment. Regulatory requirements and commitments contained in the Strategic Roadmap for the Oak Ridge Reservation are also included in the TLD as constraints on the application of immature technological solutions. Some otherwise attractive technological solutions may not be employed because they may not be deployable on the schedule enumerated in the regulatory agreements. The roadmap for ORNL includes a list of 46 comprehensive logic diagrams for waste management (WM) of low-level, mixed radioactive, hazardous, sanitary and industrial, and TRU waste. The roadmapping process compares the installation as it exists with the installation as it should exist under full compliance. The identification of the issues is the goal of roadmapping; this allows accurate and timely formulation of activities.

  3. Y-12 Plant Remedial Action technology logic diagram. Volume I: Technology evaluation

    International Nuclear Information System (INIS)

    1994-09-01

    The Y-12 Plant Remedial Action Program addresses remediation of the contaminated groundwater, surface water and soil in the following areas located on the Oak Ridge Reservation: Chestnut Ridge, Bear Creek Valley, the Upper and Lower East Fork Poplar Creek Watersheds, CAPCA 1, which includes several areas in which remediation has been completed, and CAPCA 2, which includes dense nonaqueous phase liquid wells and a storage facility. There are many facilities within these areas that are contaminated by uranium, mercury, organics, and other materials. This Technology Logic Diagram identifies possible remediation technologies that can be applied to the soil, water, and contaminants for characterization, treatment, and waste management. These technology options are supplemented by the identification of possible robotics or automation technologies, which would facilitate the cleanup effort by improving the safety of remediation, improving the final remediation product, or decreasing the remediation cost. The Technology Logic Diagram was prepared by a diverse group of more than 35 scientists and engineers from across the Oak Ridge Reservation, most of them specialists in the areas of their contributions. 22 refs., 25 tabs

  4. Oak Ridge K-25 Site Technology Logic Diagram. Volume 1, Technology evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Fellows, R.L. [ed.

    1993-02-26

    The Oak Ridge K-25 Technology Logic Diagram (TLD), a decision support tool for the K-25 Site, was developed to provide a planning document that relates environmental restoration and waste management problems at the Oak Ridge K-25 Site to potential technologies that can remediate these problems. The TLD technique identifies the research necessary to develop these technologies to a state that allows for technology transfer and application to waste management, remedial action, and decontamination and decommissioning activities. The TLD consists of four separate volumes: Vol. 1, Vol. 2, Vol. 3A, and Vol. 3B. This volume, Volume 1, provides introductory and overview information about the TLD. Volume 2 contains logic diagrams. Volume 3 has been divided into two separate volumes to facilitate handling and use. This volume is divided into ten chapters. The first chapter is a brief introduction, and the second chapter details the technical approach of the TLD, which is organized around six categories: the work activities necessary for successful decontamination and decommissioning, waste management, and remedial action of the K-25 Site. The categories are characterization, decontamination, dismantlement, robotics and automation, remedial action, and waste management. Materials disposition is addressed in Chap. 9. The final chapter contains regulatory compliance information concerning waste management, remedial action, and decontamination and decommissioning.

  5. Evaluating software for safety systems in nuclear power plants

    International Nuclear Information System (INIS)

    Lawrence, J.D.; Persons, W.L.; Preckshot, G.G.; Gallagher, J.

    1994-01-01

    In 1991, LLNL was asked by the NRC to provide technical assistance in various aspects of computer technology that apply to computer-based reactor protection systems. This has involved the review of safety aspects of new reactor designs and the provision of technical advice on the use of computer technology in systems important to reactor safety. The latter includes determining and documenting state-of-the-art subjects that require regulatory involvement by the NRC because of their importance in the development and implementation of digital computer safety systems. These subjects include data communications, formal methods, testing, software hazards analysis, verification and validation, computer security, performance, software complexity and others. One topic, software reliability and safety, is the subject of this paper.

  6. Staying in the Light: Evaluating Sustainability Models for Brokering Software

    Science.gov (United States)

    Powers, L. A.; Benedict, K. K.; Best, M.; Fyfe, S.; Jacobs, C. A.; Michener, W. K.; Pearlman, J.; Turner, A.; Nativi, S.

    2015-12-01

    The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding sources, implementation frameworks and obstacles, and policy and legal considerations. The issue of sustainability is not unique to brokering software, and these models may be relevant to many applications. Results of this comprehensive analysis highlight advantages and disadvantages of the various models with respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis while recognizing that all software is part of an evolutionary process and has a lifespan.

  7. Applying a Framework to Evaluate Assignment Marking Software: A Case Study on Lightwork

    Science.gov (United States)

    Heinrich, Eva; Milne, John

    2012-01-01

    This article presents the findings of a qualitative evaluation on the effect of a specialised software tool on the efficiency and quality of assignment marking. The software, Lightwork, combines with the Moodle learning management system and provides support through marking rubrics and marker allocations. To enable the evaluation a framework has…

  8. Extensive Evaluation of Using a Game Project in a Software Architecture Course

    Science.gov (United States)

    Wang, Alf Inge

    2011-01-01

    This article describes an extensive evaluation of introducing a game project to a software architecture course. In this project, university students have to construct and design a type of software architecture, evaluate the architecture, implement an application based on the architecture, and test this implementation. In previous years, the domain…

  9. Real Time Robot Soccer Game Event Detection Using Finite State Machines with Multiple Fuzzy Logic Probability Evaluators

    Directory of Open Access Journals (Sweden)

    Elmer P. Dadios

    2009-01-01

    Full Text Available This paper presents a new algorithm for real-time event detection using Finite State Machines with multiple Fuzzy Logic Probability Evaluators (FLPEs). A machine referee for a robot soccer game is developed and used as the platform to test the proposed algorithm. A novel technique is presented to detect collisions and other events in a microrobot soccer game under inaccurate and insufficient information. Robot collisions are used to determine goalkeeper-charging and goal-score events, which are crucial for the machine referee's decisions. The Main State Machine (MSM) handles the schedule of event activation. The FLPEs calculate the probabilities of the true occurrence of the events. Final decisions about the occurrences of events are evaluated and compared against threshold crisp probability values. The outputs of FLPEs can be combined to calculate the probability of an event composed of subevents. By using multiple fuzzy logic systems, each FLPE utilizes a minimal number of rules and can be tuned individually. Experimental results show the accuracy and robustness of the proposed algorithm.
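The core FLPE idea in this record, mapping noisy inputs through fuzzy memberships into an event "probability" that is compared to a crisp threshold, can be sketched as follows. The membership breakpoints, the input variables (distance, closing speed), and the 0.5 threshold are all invented; the paper's actual rule base is not reproduced here:

```python
# Sketch of a fuzzy-logic probability evaluator (FLPE) for a collision event.

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: 0 below a, ramps up to 1 on [b, c], 0 above d."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def collision_probability(distance_cm, closing_speed_cms):
    near = trapezoid(distance_cm, -1, 0, 3, 8)           # robots close together
    fast = trapezoid(closing_speed_cms, 2, 10, 50, 60)   # approaching quickly
    # single rule: collision if near AND fast (min t-norm)
    return min(near, fast)

def collision_detected(distance_cm, closing_speed_cms, threshold=0.5):
    """Crisp decision: compare the fuzzy output to a threshold."""
    return collision_probability(distance_cm, closing_speed_cms) >= threshold
```

Outputs of several such evaluators could be combined (e.g. by min/product) to score a composite event built from subevents, as the abstract describes.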

  10. Evaluation of Farm Accounting Software. Improved Decision Making.

    Science.gov (United States)

    Lovell, Ashley C., Comp.

    This guide contains information on 36 computer programs used for farm and ranch accounting. This information and assessment of software features were provided by the manufacturers and vendors. Information is provided on the following items, among others: program name, vendor's name and address, computer and operating system, type of accounting and…

  11. Evaluation of open source data mining software packages

    Science.gov (United States)

    Bonnie Ruefenacht; Greg Liknes; Andrew J. Lister; Haans Fisk; Dan Wendt

    2009-01-01

    Since 2001, the USDA Forest Service (USFS) has used classification and regression-tree technology to map USFS Forest Inventory and Analysis (FIA) biomass, forest type, forest type groups, and National Forest vegetation. This prior work used Cubist/See5 software for the analyses. The objective of this project, sponsored by the Remote Sensing Steering Committee (RSSC),...

  12. An Evaluation of ADLs on Modeling Patterns for Software Architecture

    NARCIS (Netherlands)

    Waqas Kamal, Ahmad; Avgeriou, Paris

    2007-01-01

    Architecture patterns provide solutions to recurring design problems at the architecture level. In order to model patterns during software architecture design, one may use a number of existing Architecture Description Languages (ADLs), including the UML, a generic language but also a de facto

  13. Fuzzy Logic-based expert system for evaluating cake quality of freeze-dried formulations

    DEFF Research Database (Denmark)

    Trnka, Hjalte; Wu, Jian-Xiong; van de Weert, Marco

    2013-01-01

    Freeze-drying of peptide and protein-based pharmaceuticals is an increasingly important field of research. The diverse nature of these compounds, limited understanding of excipient functionality, and difficult-to-analyze quality attributes, together with the increasing importance of the biosimilars market... critical visual features such as the degree of cake collapse, glassiness, and color uniformity. On the basis of the IA outputs, a fuzzy logic system for analysis of these freeze-dried cakes was constructed. After this development phase, the system was tested with a new screening well plate. The developed...

  14. Comparative evaluation of fuzzy logic and genetic algorithms models for portfolio optimization

    Directory of Open Access Journals (Sweden)

    Heidar Masoumi Soureh

    2017-03-01

    Full Text Available Selection of optimal methods with appropriate speed and precision for planning and decision-making has always been a challenge for investors and managers. One of their most important concerns is investment planning and optimization for the acquisition of desirable wealth, under controlled risk, with the best return. This paper proposes a model based on the Markowitz theorem, taking the aforementioned limitations into account, to support effective decision-making for portfolio selection. The model is then investigated using fuzzy logic and genetic algorithms for the optimization of a portfolio of selected active companies listed on the Tehran Stock Exchange over the period 2012-2016, and the results of the two models are discussed. The results show that the two models differ functionally in portfolio optimization and its tools, and that they can complement each other in portfolio selection.
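The genetic-algorithm side of the comparison can be sketched as a toy Markowitz-style optimizer: maximize expected return minus a risk penalty over long-only weights. The return and covariance numbers, the fitness form, and all GA settings below are invented; this is not the paper's calibrated model:

```python
# Toy genetic algorithm for mean-variance portfolio selection.
import random

MU = [0.12, 0.10, 0.07]                      # hypothetical expected returns
COV = [[0.10, 0.02, 0.01],
       [0.02, 0.08, 0.02],
       [0.01, 0.02, 0.04]]                   # hypothetical covariance matrix
RISK_AVERSION = 3.0

def normalise(w):
    """Rescale weights so they stay long-only and sum to 1."""
    s = sum(w)
    return [x / s for x in w]

def fitness(w):
    """Expected return minus a quadratic risk penalty."""
    ret = sum(wi * mi for wi, mi in zip(w, MU))
    var = sum(w[i] * COV[i][j] * w[j] for i in range(3) for j in range(3))
    return ret - RISK_AVERSION * var

def evolve(generations=200, pop_size=30, seed=1):
    rng = random.Random(seed)
    pop = [normalise([rng.random() for _ in range(3)]) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]          # keep the fitter half
        children = []
        for _ in range(pop_size - len(elite)):
            a, b = rng.sample(elite, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]           # crossover
            i = rng.randrange(3)
            child[i] = max(1e-6, child[i] + rng.gauss(0, 0.05))   # mutation
            children.append(normalise(child))
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
```

Elitism plus normalisation keeps every candidate a valid long-only portfolio throughout the search.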

  15. Integrating a Fuzzy-Logic strategy, by means of software patterns, into a dynamic load-balancing service under CORBA

    OpenAIRE

    Gricelda Medina Veloz; Francisco Javier Luna Rosas; Jaime Muñoz Arteaga; Julio César Martínez Romo

    2008-01-01

    The use of patterns in software development has become popular worldwide, and their use within software engineering has spread to such a degree that catalogues of patterns have been, and continue to be, created to solve software development problems at different levels of abstraction. This work focuses on the pattern-based design of a load-balancing service for distributed applications developed under ...

  16. Evaluation of software and electronics technologies for the control of the E-ELT instruments: a case study

    International Nuclear Information System (INIS)

    Di Marcantonio, P.; Cirami, R.; Coretti, I.; Chiozzi, G.; Kiekebusch, M.

    2012-01-01

    In the scope of the evaluation of architecture and technologies for the control system of the E-ELT (European Extremely Large Telescope) instruments, a collaboration has been set up between the Instrumentation and Control Group of the INAF-OATs and the ESO Directorate of Engineering. The first result of this collaboration is the design and implementation of a prototype of a small but representative control system for a kind of multi-object (optical) spectrograph. The electronics is based on PLCs (Programmable Logic Controllers) and Ethernet-based fieldbuses from different vendors, but using international standards such as IEC 61131-3 and PLCopen Motion Control. The baseline design for the control software follows the architecture of the VLT (Very Large Telescope) Instrumentation application framework, but it has been implemented using the ACS (ALMA Common Software), an open source software framework developed for the ALMA project and based on CORBA middleware. The communication among the software components is based on two models: CORBA calls for command/reply using the client/server paradigm, and the CORBA notification channel for distributing device status using the publisher/subscriber paradigm. The communication with the PLCs is based on OPC UA, an international standard for communication with industrial controllers. The results of this work will contribute to the definition of the architecture of the control system that will be provided to all consortia responsible for the actual implementation of the E-ELT instruments. This paper presents the prototype motivation, its architecture, design and implementation. (authors)

  17. Evaluation procedure of software requirements specification for digital I and C of KNGR

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Park, Jong Kyun; Lee, Ki Young; Kim, Jang Yeol; Cheon, Se Woo

    2001-06-01

    The accuracy of the specification of requirements of a digital system is of prime importance to the acceptance and success of the system. The development, use, and regulation of computer systems in nuclear reactor Instrumentation and Control (I and C) systems to enhance reliability and safety is a complex issue. This report is one of a series of reports from the Korean Next Generation Reactor (KNGR) Software Safety Verification and Validation (SSVV) Task, Korea Atomic Energy Research Institute, which investigates different aspects of computer software in reactor I and C systems and describes the engineering procedures for developing such software. The purpose of this guideline is to give the software safety evaluator the trail map between the codes-and-standards layer and the design-methodology-and-documents layer for software important to safety in nuclear power plants. Recently, the requirements specification of safety-critical software systems and its safety analysis have been recognized as important issues in the software life cycle, and new regulatory positions and standards are being developed by regulatory and standardization organizations such as the IAEA, IEC, and IEEE. We present the procedure for evaluating the software requirements specifications of the KNGR protection systems. We believe it can be useful for both licenser and licensee in evaluating safety during the requirements phase of software development. The guideline consists of the requirements engineering for software of the KNGR protection systems in chapter 1, the evaluation checklist of the software requirements specification in chapter 2.3, and the safety evaluation procedure of the KNGR software requirements specification in chapter 2.4.

  18. Survivability as a Tool for Evaluating Open Source Software

    Science.gov (United States)

    2015-06-01

    tremendously successful in certain applications such as the Mozilla Firefox web browser and the Apache web server [10]. Open source software is often...source versions (such as Internet Explorer compared to Mozilla Firefox), which typically conclude that vulnerabilities are, in fact, much more... [Flattened table fragment; recoverable entries: ACS components — ROS (autonomous functionality, no license stated); PX4 Firmware, PX4 FMU driver (BSD 3-clause); PX4 NuttX, real-time OS (BSD).]

  19. Short-circuit logic

    NARCIS (Netherlands)

    Bergstra, J.A.; Ponse, A.

    2010-01-01

    Short-circuit evaluation denotes the semantics of propositional connectives in which the second argument is only evaluated if the first argument does not suffice to determine the value of the expression. In programming, short-circuit evaluation is widely used. A short-circuit logic is a variant of
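The short-circuit evaluation this record defines is the standard behaviour of `and`/`or` in mainstream languages. A small Python demonstration, logging which operands actually run:

```python
# Short-circuit evaluation: the second operand of `and`/`or` is only
# evaluated when the first does not already determine the result.
log = []

def probe(name, value):
    """Record that this operand was evaluated, then return its value."""
    log.append(name)
    return value

a = probe("p", False) and probe("q", True)   # q is never evaluated
b = probe("r", True) or probe("s", True)     # s is never evaluated
c = probe("t", True) and probe("u", False)   # both sides are evaluated
```

Note that Python's `and`/`or` return the deciding operand itself, not a coerced boolean, which is why `a` is the value `False` returned by `probe("p", False)`.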

  20. Automation of Military Civil Engineering and Site Design Functions: Software Evaluation

    Science.gov (United States)

    1989-09-01

    AutoCAD, in-house application programs written in the AutoCAD command language, AutoLISP, and BASIC. Would like to obtain: surveying, earthwork, utilities...Experiment Station (WES) Corps library programs; no one software package is being used more than another. For drafting, AutoCAD has been the most commonly used of the software packages evaluated. D.C.A. Engineering Software: D.C.A. software is used to enhance the AutoCAD drafting package and operates solely within AutoCAD.

  1. Evaluation procedure of software safety plan for digital I and C of KNGR

    International Nuclear Information System (INIS)

    Lee, Jang Soo; Park, Jong Kyun; Lee, Ki Young; Kwon, Ki Choon; Kim, Jang Yeol; Cheon, Se Woo

    2000-05-01

    The development, use, and regulation of computer systems in nuclear reactor instrumentation and control (I and C) systems to enhance reliability and safety is a complex issue. This report is one of a series of reports from the Korean next generation reactor (KNGR) software safety verification and validation (SSVV) task, Korea Atomic Energy Research Institute, which investigates different aspects of computer software in reactor I and C systems and describes the engineering procedures for developing such software. The purpose of this guideline is to give the software safety evaluator the trail map between the code and standards layer and the design methodology and documents layer for the software important to safety in nuclear power plants. Recently, safety planning for safety-critical software systems has been recognized as the most important phase in the software life cycle, and new regulatory positions and standards are being developed by regulatory and standardization organizations. The requirements for software important to safety of nuclear reactors are described in such positions and standards, for example, the new standard review plan (SRP), the IEC 880 supplements, IEEE standard 1228-1994, IEEE standard 7-4.3.2-1993, and IAEA safety series No. 50-SG-D3 and D8. We presented the guidance for evaluating the safety plan of the software in the KNGR protection systems. The guideline consists of the regulatory requirements for software safety in chapter 2, the evaluation checklist of the software safety plan in chapter 3, and the evaluation results of the KNGR software safety plan in chapter 4.

  2. Evaluation procedure of software safety plan for digital I and C of KNGR

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jang Soo; Park, Jong Kyun; Lee, Ki Young; Kwon, Ki Choon; Kim, Jang Yeol; Cheon, Se Woo

    2000-05-01

    The development, use, and regulation of computer systems in nuclear reactor instrumentation and control (I and C) systems to enhance reliability and safety is a complex issue. This report is one of a series of reports from the Korean next generation reactor (KNGR) software safety verification and validation (SSVV) task, Korea Atomic Energy Research Institute, which investigates different aspects of computer software in reactor I and C systems and describes the engineering procedures for developing such software. The purpose of this guideline is to give the software safety evaluator the trail map between the code and standards layer and the design methodology and documents layer for the software important to safety in nuclear power plants. Recently, safety planning for safety-critical software systems has been recognized as the most important phase in the software life cycle, and new regulatory positions and standards are being developed by regulatory and standardization organizations. The requirements for software important to safety of nuclear reactors are described in such positions and standards, for example, the new standard review plan (SRP), the IEC 880 supplements, IEEE standard 1228-1994, IEEE standard 7-4.3.2-1993, and IAEA safety series No. 50-SG-D3 and D8. We presented the guidance for evaluating the safety plan of the software in the KNGR protection systems. The guideline consists of the regulatory requirements for software safety in chapter 2, the evaluation checklist of the software safety plan in chapter 3, and the evaluation results of the KNGR software safety plan in chapter 4.

  3. Propositional Logics of Dependence

    NARCIS (Netherlands)

    Yang, F.; Väänänen, J.

    2016-01-01

    In this paper, we study logics of dependence on the propositional level. We prove that several interesting propositional logics of dependence, including propositional dependence logic, propositional intuitionistic dependence logic as well as propositional inquisitive logic, are expressively complete

  4. Intuitionistic hybrid logic

    DEFF Research Database (Denmark)

    Braüner, Torben

    2011-01-01

    Intuitionistic hybrid logic is hybrid modal logic over an intuitionistic logic basis instead of a classical logical basis. In this short paper we introduce intuitionistic hybrid logic and we give a survey of work in the area.

  5. Microhard MHX 2420 Orbital Performance Evaluation Using RT Logic T400CS

    Science.gov (United States)

    Kearney, Stuart; Lombardi, Mark; Attai, Watson; Oyadomari, Ken; Al Rumhi, Ahmed Saleh Nasser; Rakotonarivo, Sebastien; Chardon, Loic; Gazulla, Oriol Tintore; Wolfe, Jasper; Salas, Alberto Guillen; et al.

    2012-01-01

    A major upfront cost of building low cost Nanosatellites is the communications sub-system. Most radios built for space missions cost over $4,000 per unit, which exceeds many budgets. One possible cost-effective solution is the Microhard MHX2420, a commercial off-the-shelf transceiver with a unit cost under $1000. This paper aims to support the Nanosatellite community seeking an inexpensive radio by characterizing Microhard's performance envelope. Though the radio is not intended for space operations, the ability to test edge cases and to increase average data transfer speeds through optimization positions it as a solution for Nanosatellite communications by expanding usage to include more missions. The second objective of this paper is to test and verify the optimal radio settings for the most common cases to improve downlinking. All tests were conducted with the aid of the RT Logic T400CS, a hardware-in-the-loop channel simulator designed to emulate real-world radio frequency (RF) link effects. This study provides recommended settings to optimize downlink speed, as well as the environmental parameters that cause the link to fail.

  6. Evaluation of Magnetospheric Internal Magnetic Field models and Existing Software

    Science.gov (United States)

    1990-01-31

    than an order of magnitude. The polar maxima are still visible, but they are not as distinct. The SAA is still apparent, but again it is not as... completely overlap each other either. These plots show polar maxima (the right two panels) and the SAA minimum (the lower right panel). Note that the... 4. DISCUSSION OF SOFTWARE 4.1 INTRODUCTION Three basic software packages were used

  7. Object-oriented software for evaluating measurement uncertainty

    Science.gov (United States)

    Hall, B. D.

    2013-05-01

    An earlier publication (Hall 2006 Metrologia 43 L56-61) introduced the notion of an uncertain number that can be used in data processing to represent quantity estimates with associated uncertainty. The approach can be automated, allowing data processing algorithms to be decomposed into convenient steps, so that complicated measurement procedures can be handled. This paper illustrates the uncertain-number approach using several simple measurement scenarios and two different software tools. One is an extension library for Microsoft Excel®. The other is a special-purpose calculator using the Python programming language.
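The uncertain-number idea can be sketched in a few lines of Python. The class below is an illustrative first-order (GUM-style) propagation for uncorrelated inputs; the class name and operators are assumptions for this sketch, not the API of Hall's actual tools:

```python
import math

class UncertainNumber:
    """A value paired with a standard uncertainty, propagated to first
    order. Inputs are assumed uncorrelated (an illustrative sketch)."""

    def __init__(self, value, u):
        self.value = value
        self.u = u

    def __add__(self, other):
        # uncertainties of a sum combine in quadrature
        return UncertainNumber(self.value + other.value,
                               math.hypot(self.u, other.u))

    def __mul__(self, other):
        # relative uncertainties of a product combine in quadrature
        v = self.value * other.value
        u = abs(v) * math.hypot(self.u / self.value, other.u / other.value)
        return UncertainNumber(v, u)

# V = I * R: intermediate results carry their uncertainty automatically,
# which is what lets a long calculation be decomposed into convenient steps.
current = UncertainNumber(2.0, 0.02)      # amperes
resistance = UncertainNumber(10.0, 0.1)   # ohms
voltage = current * resistance
```

Because every operator returns another uncertain number, a complicated measurement equation can be written as ordinary arithmetic and the uncertainty budget falls out of the data flow.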

  8. Object-oriented software for evaluating measurement uncertainty

    International Nuclear Information System (INIS)

    Hall, B D

    2013-01-01

    An earlier publication (Hall 2006 Metrologia 43 L56–61) introduced the notion of an uncertain number that can be used in data processing to represent quantity estimates with associated uncertainty. The approach can be automated, allowing data processing algorithms to be decomposed into convenient steps, so that complicated measurement procedures can be handled. This paper illustrates the uncertain-number approach using several simple measurement scenarios and two different software tools. One is an extension library for Microsoft Excel®. The other is a special-purpose calculator using the Python programming language. (paper)

  9. Software Organization in Student Data Banks for Research and Evaluation: Four Institutional Models.

    Science.gov (United States)

    Friedman, Charles P.

    Student data banks for ongoing research and evaluation have been implemented by a number of professional schools. Institutions selecting software designs for the establishment of such systems are often faced with making their choice before all the possible uses of the system are determined. Making software design decisions involves "rational"…

  10. Free and open-source software application for the evaluation of coronary computed tomography angiography images.

    Science.gov (United States)

    Hadlich, Marcelo Souza; Oliveira, Gláucia Maria Moraes; Feijóo, Raúl A; Azevedo, Clerio F; Tura, Bernardo Rangel; Ziemer, Paulo Gustavo Portela; Blanco, Pablo Javier; Pina, Gustavo; Meira, Márcio; Souza e Silva, Nelson Albuquerque de

    2012-10-01

    The standardization of images used in Medicine in 1993 was performed using the DICOM (Digital Imaging and Communications in Medicine) standard. Several tests use this standard and it is increasingly necessary to design software applications capable of handling this type of image; however, these software applications are not usually free and open-source, and this fact hinders their adjustment to the most diverse interests. To develop and validate a free and open-source software application capable of handling DICOM coronary computed tomography angiography images, we developed and tested the ImageLab software in the evaluation of 100 tests randomly selected from a database. We carried out 600 tests divided between two observers using ImageLab and another software sold with Philips Brilliance computed tomography appliances in the evaluation of coronary lesions and plaques around the left main coronary artery (LMCA) and the anterior descending artery (ADA). To evaluate intraobserver, interobserver and intersoftware agreements, we used simple agreement and kappa statistics. The agreements observed between software applications were generally classified as substantial or almost perfect in most comparisons. The ImageLab software agreed with the Philips software in the evaluation of coronary computed tomography angiography tests, especially in patients without lesions or with lesions <70%; the agreement for lesions >70% in the ADA was lower, but this is also observed when the anatomical reference standard is used.
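The kappa agreement statistic this record relies on can be computed from a square rater-agreement table. The function below is a generic sketch of Cohen's kappa (not ImageLab's code); the 2x2 example counts are hypothetical:

```python
def cohens_kappa(table):
    """Cohen's kappa for a square agreement table: rows index rater A's
    categories, columns rater B's; cell [i][j] counts joint ratings."""
    k = len(table)
    n = sum(sum(row) for row in table)
    p_observed = sum(table[i][i] for i in range(k)) / n
    p_a = [sum(row) / n for row in table]                       # rater A marginals
    p_b = [sum(table[i][j] for i in range(k)) / n for j in range(k)]
    p_expected = sum(a * b for a, b in zip(p_a, p_b))           # chance agreement
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical example: two observers rating 100 scans as lesion / no lesion
kappa = cohens_kappa([[45, 5],
                      [5, 45]])   # 90% raw agreement, kappa close to 0.8
```

Kappa discounts the agreement expected by chance, which is why it is preferred over simple percent agreement when categories are unbalanced.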

  11. Organ doses in interventional radiology procedures: Evaluation of software

    International Nuclear Information System (INIS)

    Tort, I.; Ruiz-Cruces, R.; Perez-Martinez, M.; Carrera, F.; Ojeda, C.; Diez de los Rios, A.

    2001-01-01

    Interventional Radiology (IR) procedures require long fluoroscopy times and a large number of radiological images, so patient radiation levels are high, which motivates calculating organ doses. The objective of this work is to estimate and compare the results given by the different software packages available for calculating organ doses in complex IR procedures. To do this, 28 patients were selected, distributed among the 3 procedures with the highest doses. The determination of organ doses and effective doses was made using the projections employed and different software packages based on Monte Carlo methods: Eff-dose, PCXMC and Diasoft. We obtained very high dispersion in the average organ dose between the 3 programs. In many cases it is higher than 25%, and in some particular cases it is greater than 100%. The dispersion obtained in effective doses is not so high, being under 20% in all cases. This shows that a better solution to the problem of organ dose calculation is needed: a more accurate method that provides a trustworthy approximation to reality, which is not available at the moment. (author)

  12. Evaluating Sustainability Models for Interoperability through Brokering Software

    Science.gov (United States)

    Pearlman, Jay; Benedict, Karl; Best, Mairi; Fyfe, Sue; Jacobs, Cliff; Michener, William; Nativi, Stefano; Powers, Lindsay; Turner, Andrew

    2016-04-01

    Sustainability of software and research support systems is an element of innovation that is not often discussed. Yet, sustainment is essential if we expect research communities to make the time investment to learn and adopt new technologies. As the Research Data Alliance (RDA) is developing new approaches to interoperability, the question of uptake and sustainability is important. Brokering software sustainability is one of the areas that is being addressed in RDA. The Business Models Team of the Research Data Alliance Brokering Governance Working Group examined several support models proposed to promote the long-term sustainability of brokering middleware. The business model analysis includes examination of funding source, implementation frameworks and challenges, and policy and legal considerations. Results of this comprehensive analysis highlight advantages and disadvantages of the various models with respect to the specific requirements for brokering services. We offer recommendations based on the outcomes of this analysis that suggest that hybrid funding models present the most likely avenue to long term sustainability.

  13. State-of-the-Art Resources (SOAR) for Software Vulnerability Detection, Test, and Evaluation

    Science.gov (United States)

    2014-07-01

    preclude in-depth analysis, and widespread use of a Software-as-a-Service (SaaS) model that limits data availability and application to DoD systems... provide mobile application analysis using a Software-as-a-Service (SaaS) model. In this case, any software to be analyzed must be sent to the... tools are only available through a SaaS model. The widespread use of a Software-as-a-Service (SaaS) model as a sole evaluation model limits data

  14. Evaluation Framework for Telemedicine Using the Logical Framework Approach and a Fishbone Diagram.

    Science.gov (United States)

    Chang, Hyejung

    2015-10-01

    Technological advances using telemedicine and telehealth are growing in healthcare fields, but the evaluation framework for them is inconsistent and limited. This paper suggests a comprehensive evaluation framework for telemedicine system implementation that will support related stakeholders' decision-making by promoting general understanding and resolving arguments and controversies. This study focused on developing a comprehensive evaluation framework by summarizing themes across the range of evaluation techniques and organizing foundational evaluation frameworks that are generally applicable, drawing on studies and cases of diverse telemedicine. Evaluation factors related to information technology (the satisfaction of service providers and consumers, cost, quality, and information security) are organized using the fishbone diagram. It was not easy to develop a monitoring and evaluation framework for telemedicine, since evaluation frameworks for telemedicine are very complex, with many potential inputs, activities, outputs, outcomes, and stakeholders. A conceptual framework was developed that incorporates the key dimensions that need to be considered in the evaluation of telehealth implementation, supporting a formal structured approach to the evaluation of a service. The suggested framework consists of six major dimensions and subsequent branches for each dimension. To implement telemedicine and telehealth services, stakeholders should make decisions based on sufficient evidence of quality and safety as measured by the comprehensive evaluation framework. Further work would be valuable in applying more comprehensive evaluations to verify and improve the framework across a variety of contexts, with more factors and participant group dimensions.

  15. Fuzzy logic

    CERN Document Server

    Smets, P

    1995-01-01

    We start by describing the nature of imperfect data, and giving an overview of the various models that have been proposed. Fuzzy set theory is shown to be an extension of classical set theory, and as such has a prominent role in modelling imperfect data. The mathematics of fuzzy set theory is detailed, in particular the role of the triangular norms. The use of fuzzy set theory in fuzzy logic and possibility theory, the nature of the generalized modus ponens, and the implication operator for approximate reasoning are analysed. The use of fuzzy logic is detailed for applications oriented towards process control and database problems.
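The two ingredients this abstract names, fuzzy membership functions and triangular norms, are easy to sketch. The functions and the "comfortable temperature" set below are generic illustrations, not taken from the cited lectures:

```python
def triangular(a, b, c):
    """Membership function of a triangular fuzzy set: support [a, c], peak at b."""
    def mu(x):
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)
    return mu

# A hypothetical fuzzy set "comfortable temperature", peaking at 22 degrees C:
# membership rises from 0 at 15 to 1 at 22, then falls back to 0 at 30.
comfortable = triangular(15.0, 22.0, 30.0)

# Two common triangular norms (t-norms) used for fuzzy conjunction:
def t_min(x, y):   # Zadeh's minimum t-norm
    return min(x, y)

def t_prod(x, y):  # product t-norm
    return x * y
```

Different t-norms give different semantics to "A and B" over partial memberships, which is exactly the design choice the record says is detailed for process control and database applications.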

  16. Separation Logic

    DEFF Research Database (Denmark)

    Reynolds, John C.

    2002-01-01

    In joint work with Peter O'Hearn and others, based on early ideas of Burstall, we have developed an extension of Hoare logic that permits reasoning about low-level imperative programs that use shared mutable data structure. The simple imperative programming language is extended with commands (not...... with the inductive definition of predicates on abstract data structures, this extension permits the concise and flexible description of structures with controlled sharing. In this paper, we will survey the current development of this program logic, including extensions that permit unrestricted address arithmetic...

  17. MIAWARE Software

    DEFF Research Database (Denmark)

    Wilkowski, Bartlomiej; Pereira, Oscar N. M.; Dias, Paulo

    2008-01-01

    is automatically generated. Furthermore, MIAWARE software is accompanied with an intelligent search engine for medical reports, based on the relations between parts of the lungs. A logical structure of the lungs is introduced to the search algorithm through the specially developed ontology. As a result...

  18. Guidelines for evaluating software configuration management plans for digital instrumentation and control systems

    International Nuclear Information System (INIS)

    Cheon, Se Woo; Park, Jong Kyun; Lee, Ki Young; Lee, Jang Soo; Kim, Jang Yeon

    2001-08-01

    Software configuration management (SCM) is the process for identifying software configuration items (CIs), controlling the implementation of and changes to software, recording and reporting the status of changes, and verifying the completeness and correctness of the released software. SCM consists of two major aspects: planning and implementation. Effective SCM involves planning how activities are to be performed, and performing these activities in accordance with the Plan. This report first reviews the background of SCM, including key standards, SCM disciplines, SCM basic functions, baselines, software entities, the SCM process, the implementation of SCM, and the tools of SCM. In turn, the report provides guidelines for evaluating the SCM Plan for digital I and C systems of nuclear power plants. Most of the guidelines in the report are based on IEEE Std 828 and ANSI/IEEE Std 1042. According to BTP-14, NUREG-0800, the evaluation topics for the SCM Plan are classified into three categories: management, implementation, and resource characteristics.

  19. Evaluation of the Usability of Different Virtual Lab Software Used in Physics Courses

    Directory of Open Access Journals (Sweden)

    O. Karagoz

    2010-11-01

    Full Text Available In recent years the use of virtual lab software has become ubiquitous in education settings. The purpose of the current study is to develop an Evaluation Scale to allow the easy and fast assessment of virtual lab software. During the development of the Evaluation Scale, theoretical and experimental studies investigating the effects of different software on learning, particularly in terms of usability, were utilized. The Evaluation Scale was created by adding new attributes to pre-existing scales, and comprises three sections – attributes related to the interface of the software; attributes related to its use as a material in education; and attributes related to product and service support – and 79 items. Testing of the Evaluation Scale was carried out using two different virtual lab programmes, with the help of a checklist. The evaluation was carried out by physics teachers and academics that had previously used similar software, and consistency between the results was considered to represent inter-rater reliability. At the end of the study, the usability of the Evaluation Scale was tested, and the instructor evaluations regarding the usability characteristics considered sufficient and that require improvement in virtual lab software were also investigated.

  20. Evaluation of Available Software for Reconstruction of a Structure from its Imagery

    Science.gov (United States)

    2017-04-01

    scene and signature generation for ladar and imaging sensors, in Proc. SPIE 9071, Infrared Imaging Systems: Design, Analysis, Modeling, and Testing XXV... Evaluation of Available Software for Reconstruction of a Structure from its Imagery, Leonid K Antanovskii, Weapons and Combat Systems... project. The Computer Vision System toolbox of MATLAB® and the Visual Structure from Motion (VisualSFM) software are evaluated on three datasets of

  1. Develop Quality Characteristics Based Quality Evaluation Process for Ready to Use Software Products

    OpenAIRE

    Daiju Kato; Hiroshi Ishikawa

    2016-01-01

    Users of ready-to-use software products would benefit from getting the products’ quality information classified by some kind of global standard metrics or technique for their evaluation. But many of those software products’ companies don’t provide the quality information, because the products are developed by their own development and evaluation process. But those users want to get quality i...

  2. Using CONFIG for Simulation of Operation of Water Recovery Subsystems for Advanced Control Software Evaluation

    Science.gov (United States)

    Malin, Jane T.; Flores, Luis; Fleming, Land; Throop, Daiv

    2002-01-01

    A hybrid discrete/continuous simulation tool, CONFIG, has been developed to support evaluation of the operability of life support systems. CONFIG simulates operations scenarios in which flows and pressures change continuously while system reconfigurations occur as discrete events. In simulations, intelligent control software can interact dynamically with hardware system models. CONFIG simulations have been used to evaluate control software and intelligent agents for automating life support systems operations. A CONFIG model of an advanced biological water recovery system has been developed to interact with intelligent control software that is being used in a water system test at NASA Johnson Space Center.

  3. Evaluating Open Source Software for Use in Library Initiatives: A Case Study Involving Electronic Publishing

    Science.gov (United States)

    Samuels, Ruth Gallegos; Griffy, Henry

    2012-01-01

    This article discusses best practices for evaluating open source software for use in library projects, based on the authors' experience evaluating electronic publishing solutions. First, it presents a brief review of the literature, emphasizing the need to evaluate open source solutions carefully in order to minimize Total Cost of Ownership. Next,…

  4. Evaluation of Distribution Analysis Software for DER Applications

    Energy Technology Data Exchange (ETDEWEB)

    Staunton, RH

    2003-01-23

    unstoppable. In response, energy providers will be forced to both fully acknowledge the trend and plan for accommodating DER [3]. With bureaucratic barriers [4], lack of time/resources, tariffs, etc. still seen in certain regions of the country, changes still need to be made. Given continued technical advances in DER, the time is fast approaching when the industry, nation-wide, must not only accept DER freely but also provide or review in-depth technical assessments of how DER should be integrated into and managed throughout the distribution system. Characterization studies are needed to fully understand how both the utility system and DER devices themselves will respond to all reasonable events (e.g., grid disturbances, faults, rapid growth, diverse and multiple DER systems, large reactive loads). Some of this work has already begun as it relates to operation and control of DER [5] and microturbine performance characterization [6,7]. One of the most urgently needed tools that can provide these types of analyses is a distribution network analysis program in combination with models for various DER. Together, they can be used for (1) analyzing DER placement in distribution networks and (2) helping to ensure that adequate transmission reliability is maintained. Surveys of the market show products that represent a partial match to these needs; specifically, software that has been developed to plan electrical distribution systems and analyze reliability (in a near total absence of DER). The first part of this study (Sections 2 and 3 of the report) looks at a number of these software programs and provides both summary descriptions and comparisons. The second part of this study (Section 4 of the report) considers the suitability of these analysis tools for DER studies. It considers steady state modeling and assessment work performed by ORNL using one commercially available tool on feeder data provided by a southern utility. 
Appendix A provides a technical report on the results of

  5. Logic Assumptions and Risks Framework Applied to Defence Campaign Planning and Evaluation

    Science.gov (United States)

    2013-05-01

    based on prescriptive targets of reduction in particular crime statistics in a certain timeframe. Similarly, if overall desired effects are not well... the Evaluation Journal of Australasia, Australasian Evaluation Society. (DSTO-TR-2840) These six campaign functions... Callahan’s article in “Anecdotally” Newsletter January 2013, Anecdote Pty Ltd., a commercial consultancy specialising in narrative technique for business

  6. Software Testbed for Developing and Evaluating Integrated Autonomous Subsystems

    Science.gov (United States)

    Ong, James; Remolina, Emilio; Prompt, Axel; Robinson, Peter; Sweet, Adam; Nishikawa, David

    2015-01-01

    To implement fault tolerant autonomy in future space systems, it will be necessary to integrate planning, adaptive control, and state estimation subsystems. However, integrating these subsystems is difficult, time-consuming, and error-prone. This paper describes Intelliface/ADAPT, a software testbed that helps researchers develop and test alternative strategies for integrating planning, execution, and diagnosis subsystems more quickly and easily. The testbed's architecture, graphical data displays, and implementations of the integrated subsystems support easy plug and play of alternate components to support research and development in fault-tolerant control of autonomous vehicles and operations support systems. Intelliface/ADAPT controls NASA's Advanced Diagnostics and Prognostics Testbed (ADAPT), which comprises batteries, electrical loads (fans, pumps, and lights), relays, circuit breakers, inverters, and sensors. During plan execution, an experimenter can inject faults into the ADAPT testbed by tripping circuit breakers, changing fan speed settings, and closing valves to restrict fluid flow. The diagnostic subsystem, based on NASA's Hybrid Diagnosis Engine (HyDE), detects and isolates these faults to determine the new state of the plant, ADAPT. Intelliface/ADAPT then updates its model of the ADAPT system's resources and determines whether the current plan can be executed using the reduced resources. If not, the planning subsystem generates a new plan that reschedules tasks, reconfigures ADAPT, and reassigns the use of ADAPT resources as needed to work around the fault. The resource model, planning domain model, and planning goals are expressed using NASA's Action Notation Modeling Language (ANML). Parts of the ANML model are generated automatically, and other parts are constructed by hand using the Planning Model Integrated Development Environment, a visual Eclipse-based IDE that accelerates ANML model development. Because native ANML planners are currently

  7. Evaluation of drug interaction microcomputer software: Dambro's Drug Interactions.

    Science.gov (United States)

    Poirier, T I; Giudici, R A

    1990-01-01

    Dambro's Drug Interactions was evaluated using general and specific criteria. The installation process, ease of learning and use were rated excellent. The user documentation and quality of the technical support were good. The scope of coverage, clinical documentation, frequency of updates, and overall clinical performance were fair. The primary advantages of the program are the quick searching and detection of drug interactions, and the attempt to provide useful interaction data, i.e., significance and reference. The disadvantages are the lack of current drug interaction information, outdated references, lack of evaluative drug interaction information, and the inability to save or print patient profiles. The program is not a good value for the pharmacist but has limited use as a quick screening tool.

  8. Using Computer-Aided Software Engineering (CASE)--tools to document the current logical model of a system for DoD requirements specifications.

    OpenAIRE

    Ganzer, Donna A.

    1987-01-01

    Approved for public release; distribution is unlimited The Naval Postgraduate School's final exam scheduling system serves as a test case with which to compare two commercially available Computer-Aided Software Engineering (CASE) tools. The tools, Nastec Corporation's DesignAid (Release 3.55) and Index Technology's Excelerator (Release 1.7) are used to create Section 4.1 of two Abbreviated Systems Decision Papers to determine if their output can satisfy and should replace some of the Life...

  9. Evaluating bacterial gene-finding HMM structures as probabilistic logic programs

    DEFF Research Database (Denmark)

    Mørk, Søren; Holmes, Ian

    2012-01-01

    , a probabilistic dialect of Prolog. Results: We evaluate Hidden Markov Model structures for bacterial protein-coding gene potential, including a simple null model structure, three structures based on existing bacterial gene finders and two novel model structures. We test standard versions as well as ADPH length...
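The gene-potential models compared in this record are Hidden Markov Models. As a generic illustration of the underlying machinery (not the PRISM programs the paper evaluates), here is a minimal Viterbi decoder over a hypothetical two-state GC-content toy model, a common first sketch of bacterial coding-potential detection:

```python
import math

def viterbi(obs, states, start, trans, emit):
    """Most probable state path of a discrete HMM, computed in log space
    to avoid numerical underflow on long sequences."""
    score = {s: math.log(start[s]) + math.log(emit[s][obs[0]]) for s in states}
    path = {s: [s] for s in states}
    for symbol in obs[1:]:
        new_score, new_path = {}, {}
        for s in states:
            best, prev = max(
                (score[p] + math.log(trans[p][s]) + math.log(emit[s][symbol]), p)
                for p in states)
            new_score[s], new_path[s] = best, path[prev] + [s]
        score, path = new_score, new_path
    return path[max(states, key=lambda s: score[s])]

# Toy gene-finding flavour: 'H' = GC-rich (coding-like), 'L' = AT-rich.
# All probabilities below are illustrative, not fitted to any genome.
states = ["H", "L"]
start = {"H": 0.5, "L": 0.5}
trans = {"H": {"H": 0.9, "L": 0.1}, "L": {"H": 0.1, "L": 0.9}}
emit = {"H": {"A": 0.2, "C": 0.3, "G": 0.3, "T": 0.2},
        "L": {"A": 0.3, "C": 0.2, "G": 0.2, "T": 0.3}}
decoded = viterbi("GGCG", states, start, trans, emit)
```

Expressing the same model as a probabilistic logic program, as the paper does with PRISM, makes the model structure itself a first-class object that can be varied and evaluated.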

  10. Preference Evaluation System for Construction Products Using QFD-TOPSIS Logic by Considering Trade-Off Technical Characteristics

    Directory of Open Access Journals (Sweden)

    Jaeho Cho

    2017-01-01

    Full Text Available This paper investigates the feasibility of quality function deployment, technique for the order of preference by similarity to ideal solution (QFD-TOPSIS), in presenting user preferences for multiple alternatives, such as construction technologies, products, systems, and design solutions, with trade-off technical characteristics (TC). The original QFD as house of quality (HOQ) defines the requirements and features as subjective matrix relations, which cause interpretations to vary across users and limit its industrial applications. QFD-TOPSIS is a new model that combines the benefits of QFD with those of TOPSIS, maintains the subjectivity and objectivity evaluation of the technical characteristics (TC), and rates the preferences by considering users’ individual propensity for requirements. In addition, QFD-TOPSIS rates the preferences through the reciprocal compensation effects of trade-off TC and filters unsuitable alternatives with predefined restrictive conditions. Trade-off refers to conflicts and/or contradictions between attributes, often arising in multicriteria decision-making. Users or project stakeholder groups define the priorities of trade-off TC that directly influence product preferences and decision-making. In the present study, we have developed a Web system based on the QFD-TOPSIS logic and tested its operation to verify its industrial applicability and viability for automatic quality evaluation.
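The TOPSIS half of the combined model follows a standard recipe: vector-normalize the decision matrix, weight it, then rank alternatives by relative closeness to the ideal and anti-ideal solutions. The function below is a generic sketch of that recipe, not the authors' QFD-TOPSIS web system, and the product/criteria numbers are hypothetical:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives (rows) against criteria (columns) by relative
    closeness to the ideal solution. benefit[j] is True when a higher
    value is better for criterion j (a cost criterion uses False)."""
    m, n = len(matrix), len(matrix[0])
    # vector normalization, then weighting
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    cols = list(zip(*v))
    ideal = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    anti = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    # relative closeness: 1.0 at the ideal solution, 0.0 at the anti-ideal
    return [math.dist(row, anti) / (math.dist(row, ideal) + math.dist(row, anti))
            for row in v]

# Hypothetical construction products scored on quality (higher is better)
# and cost (lower is better):
scores = topsis([[8.0, 300.0], [6.0, 200.0], [9.0, 450.0]],
                weights=[0.6, 0.4], benefit=[True, False])
```

Because distances to the ideal and anti-ideal are combined into a single closeness ratio, strong and weak criteria can compensate for each other, which is the reciprocal trade-off behaviour the abstract describes.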

  11. The outcome competency framework for practitioners in infection prevention and control: use of the outcome logic model for evaluation.

    Science.gov (United States)

    Burnett, E; Curran, E; Loveday, H P; Kiernan, M A; Tannahill, M

    2014-01-01

    Healthcare is delivered in a dynamic environment with frequent changes in populations, methods, equipment and settings. Infection prevention and control practitioners (IPCPs) must ensure that they are competent in addressing the challenges they face and are equipped to develop infection prevention and control (IPC) services in line with a changing world of healthcare provision. A multifaceted Framework was developed to assist IPCPs to enhance competence at an individual, team and organisational level to enable quality performance and improved quality of care. However, if these aspirations are to be met, it is vital that competency frameworks are fit for purpose or they risk being ignored. The aim of this unique study was to evaluate short- and medium-term outcomes as set out in the Outcome Logic Model to assist with the evaluation of the impact and success of the Framework. This study found that while the Framework is being used effectively in some areas, it is not being used as much or in the ways that were anticipated. The findings will enable future work on revision, communication and dissemination, and will provide intelligence to those initiating education and training in the utilisation of the competences.

  12. The Computer-based Health Evaluation Software (CHES): a software for electronic patient-reported outcome monitoring

    Directory of Open Access Journals (Sweden)

    Holzner Bernhard

    2012-11-01

    Full Text Available Abstract Background Patient-reported Outcomes (PROs capturing e.g., quality of life, fatigue, depression, medication side-effects or disease symptoms, have become important outcome parameters in medical research and daily clinical practice. Electronic PRO data capture (ePRO with software packages to administer questionnaires, store data, and present results has facilitated PRO assessment in hospital settings. Compared to conventional paper-pencil versions of PRO instruments, ePRO is more economical with regard to staff resources and time, and allows immediate presentation of results to the medical staff. The objective of our project was to develop software (CHES – Computer-based Health Evaluation System for ePRO in hospital settings and at home, with a special focus on the presentation of individual patients’ results. Methods Following the Extreme Programming development approach, the architecture was not fixed up-front but was developed in close, continuous collaboration with software end users (medical staff, researchers and patients to meet their specific demands. Developed features include sophisticated longitudinal charts linking patients’ PRO data to clinical characteristics and to PRO scores from reference populations, a web-interface for questionnaire administration, and a tool for the convenient creation and editing of questionnaires. Results By 2012 CHES had been implemented at various institutions in Austria, Germany, Switzerland, and the UK, and about 5000 patients had participated in ePRO (with around 15000 assessments in total. Data entry is done by the patients themselves via tablet PCs, with a study nurse or an intern approaching patients and supervising questionnaire completion. Discussion During the last decade several software packages for ePRO have emerged for different purposes. Whereas commercial products are available primarily for ePRO in clinical trials, academic projects have focused on data collection and presentation in daily

  13. Learning from examples - Generation and evaluation of decision trees for software resource analysis

    Science.gov (United States)

    Selby, Richard W.; Porter, Adam A.

    1988-01-01

    A general solution method for the automatic generation of decision (or classification) trees is investigated. The approach is to provide insights through in-depth empirical characterization and evaluation of decision trees for software resource data analysis. The trees identify classes of objects (software modules) that had high development effort. Sixteen software systems ranging from 3,000 to 112,000 source lines were selected for analysis from a NASA production environment. The collection and analysis of 74 attributes (or metrics), for over 4,700 objects, captured information about the development effort, faults, changes, design style, and implementation style. A total of 9,600 decision trees were automatically generated and evaluated. The trees correctly identified 79.3 percent of the software modules that had high development effort or faults, and the trees generated from the best parameter combinations correctly identified 88.4 percent of the modules on the average.
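The core of such automatic tree generation is choosing, at each node, the metric threshold with the greatest information gain. A toy single-split version on invented module data (the metrics and cutoffs are illustrative, not the paper's) can be sketched as:

```python
import math
from collections import Counter

# Hypothetical modules: ((source_lines, num_changes), high_effort?)
modules = [
    ((3200, 4), False), ((45000, 31), True), ((12000, 9), False),
    ((98000, 57), True), ((7600, 12), False), ((61000, 40), True),
]

def entropy(labels):
    total = len(labels)
    return -sum((n / total) * math.log2(n / total)
                for n in Counter(labels).values())

def best_split(data, feature):
    """Return (information gain, threshold) of the best binary split on one metric."""
    labels = [y for _, y in data]
    base = entropy(labels)
    best = (0.0, None)
    values = sorted({x[feature] for x, _ in data})
    for lo, hi in zip(values, values[1:]):
        t = (lo + hi) / 2  # candidate threshold between adjacent observed values
        left = [y for x, y in data if x[feature] <= t]
        right = [y for x, y in data if x[feature] > t]
        gain = base - (len(left) / len(data)) * entropy(left) \
                    - (len(right) / len(data)) * entropy(right)
        if gain > best[0]:
            best = (gain, t)
    return best

gain, threshold = best_split(modules, feature=0)
print(f"split on source lines <= {threshold:.0f} (information gain {gain:.2f})")
```

A full generator would recurse on each side of the chosen threshold until the leaves are (nearly) pure, which is how whole decision trees over many metrics are produced.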

  14. Interplay between usability evaluation and software development (I-USED 2009)

    DEFF Research Database (Denmark)

    Abrahão, S.; Hornbæk, Kasper Anders Søren; Law, E.

    2009-01-01

    This workshop is aimed at bringing together researchers and practitioners from the Human-Computer Interaction (HCI) and Software Engineering (SE) fields to determine the state-of-the-art in the interplay between usability evaluation and software development and to generate ideas for new...... and improved relations between these activities. The aim is to base the determination of the current state on empirical studies. Presentations of new ideas on how to improve the interplay between HCI & SE to the design of usable software systems should also be based on empirical studies....

  15. Development and evaluation of new semi-automatic TLD reader software

    International Nuclear Information System (INIS)

    Pathan, M.S.; Pradhan, S.M.; Palani Selvam, T.; Datta, D.

    2018-01-01

    Nowadays, technology advancement is primarily focused on creating a user-friendly environment for operating any machine and on minimizing human error through the automation of procedures. In the present study, the development and evaluation of new software for the semi-automatic TLD badge reader (TLDBR-7B) are presented. The software provides an interactive interface and is compatible with the latest Windows OS as well as the USB mode of data communication. Important new features of the software are automatic glow-curve analysis for identifying any abnormality, an event log register, user-defined limits on TL count and temperature-stabilization time for readout interruption, and auto reading resumption options

  16. The logic of XACML

    DEFF Research Database (Denmark)

    Ramli, Carroline Dewi Puspa Kencana; Nielson, Hanne Riis; Nielson, Flemming

    2014-01-01

    We study the international standard XACML 3.0 for describing security access control policies in a compositional way. Our main contributions are (i) to derive a logic that precisely captures the intentions of the standard, (ii) to formally define a semantics for the XACML 3.0 component evaluation...

  17. Expert System for Competences Evaluation 360° Feedback Using Fuzzy Logic

    OpenAIRE

    Alberto Alfonso Aguilar Lasserre; Marina Violeta Lafarja Solabac; Roberto Hernandez-Torres; Rubén Posada-Gomez; Ulises Juárez-Martínez; Gregorio Fernández Lambert

    2014-01-01

    Performance evaluation (PE) is a process that estimates the employee overall performance during a given period, and it is a common function carried out inside modern companies. PE is important because it is an instrument that encourages employees, organizational areas, and the whole company to have an appropriate behavior and continuous improvement. In addition, PE is useful in decision making about personnel allocation, productivity bonuses, incentives, promotions, disciplinary measures, and...

  18. Continuous Markovian Logics

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian; Cardelli, Luca; Larsen, Kim Guldstrand

    2012-01-01

    Continuous Markovian Logic (CML) is a multimodal logic that expresses quantitative and qualitative properties of continuous-time labelled Markov processes with arbitrary (analytic) state-spaces, henceforth called continuous Markov processes (CMPs). The modalities of CML evaluate the rates...... of the exponentially distributed random variables that characterize the duration of the labeled transitions of a CMP. In this paper we present weak and strong complete axiomatizations for CML and prove a series of metaproperties, including the finite model property and the construction of canonical models. CML...... characterizes stochastic bisimilarity and it supports the definition of a quantified extension of the satisfiability relation that measures the "compatibility" between a model and a property. In this context, the metaproperties allows us to prove two robustness theorems for the logic stating that one can...

  19. STEM - software test and evaluation methods: fault detection using static analysis techniques

    International Nuclear Information System (INIS)

    Bishop, P.G.; Esp, D.G.

    1988-08-01

    STEM is a software reliability project with the objective of evaluating a number of fault detection and fault estimation methods which can be applied to high-integrity software. This report gives some interim results of applying both manual and computer-based static analysis techniques, in particular SPADE, to an early CERL version of the PODS software containing known faults. The main results of this study are as follows. The scope for thorough verification is determined by the quality of the design documentation; documentation defects become especially apparent when verification is attempted. For well-defined software, the thoroughness of SPADE-assisted verification for detecting a large class of faults was successfully demonstrated. For imprecisely defined software (not recommended for high-integrity systems) the use of tools such as SPADE is difficult and inappropriate. Analysis and verification tools are helpful through their reliability and thoroughness. However, they are designed to assist, not replace, a human in validating software. Manual inspection can still reveal errors (such as errors in specification and errors of transcription of system constants) which current tools cannot detect. There is a need for tools to automatically detect typographical errors in system constants, for example by reporting outliers to patterns. To obtain the maximum benefit from advanced tools, they should be applied during software development (when verification problems can be detected and corrected) rather than retrospectively. (author)
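The report's closing suggestion, automatically flagging typographical errors in system constants by reporting outliers to patterns, could be prototyped as a pattern-grouped deviation check. The constant names, values, and threshold below are invented for illustration:

```python
import re
from statistics import median

# Hypothetical system constants; 10.3 is meant to look like a mistyped 1.03
constants = {"CH1_GAIN": 1.02, "CH2_GAIN": 0.98, "CH3_GAIN": 1.01, "CH4_GAIN": 10.3}

def outliers(consts, pattern, tol=3.0):
    """Flag constants matching a name pattern whose value deviates from the
    group median by more than tol median absolute deviations."""
    vals = {k: v for k, v in consts.items() if re.fullmatch(pattern, k)}
    med = median(vals.values())
    mad = median(abs(v - med) for v in vals.values()) or 1e-9  # avoid divide-by-zero
    return [k for k, v in vals.items() if abs(v - med) / mad > tol]

print(outliers(constants, r"CH\d_GAIN"))
```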

  20. Choreographies, Logically

    DEFF Research Database (Denmark)

    Carbone, Marco; Montesi, Fabrizio; Schürmann, Carsten

    2014-01-01

    In Choreographic Programming, a distributed system is programmed by giving a choreography, a global description of its interactions, instead of separately specifying the behaviour of each of its processes. Process implementations in terms of a distributed language can then be automatically...... projected from a choreography. We present Linear Compositional Choreographies (LCC), a proof theory for reasoning about programs that modularly combine choreographies with processes. Using LCC, we logically reconstruct a semantics and a projection procedure for programs. For the first time, we also obtain...... a procedure for extracting choreographies from process terms....

  1. Combinations of Methods for Collaborative Evaluation of the Usability of Interactive Software Systems

    Directory of Open Access Journals (Sweden)

    Andrés Solano

    2016-01-01

    Full Text Available Usability is a fundamental quality characteristic for the success of an interactive system. It is a concept that includes a set of metrics and methods in order to obtain easy-to-learn and easy-to-use systems. Usability Evaluation Methods, UEM, are quite diverse; their application depends on variables such as costs, time availability, and human resources. A large number of UEM can be employed to assess interactive software systems, but questions arise when deciding which method and/or combination of methods gives more (relevant information. We propose Collaborative Usability Evaluation Methods, CUEM, following the principles defined by the Collaboration Engineering. This paper analyzes a set of CUEM conducted on different interactive software systems. It proposes combinations of CUEM that provide more complete and comprehensive information about the usability of interactive software systems than those evaluation methods conducted independently.

  2. Evaluation of the litcit software for thermal simulation of superficial lasers such as hair removal lasers

    Directory of Open Access Journals (Sweden)

    Shirkavand A

    2007-01-01

    Full Text Available Background and Objectives: In this study, we evaluate the LITCIT software for its application as thermal simulation software for superficial hair-removal laser systems. Materials and Methods: Two articles were used as our references. Complete information regarding the tissues, such as optical/thermal properties and geometrical modeling, and the laser systems, such as wavelength, spot size, pulse duration and fluence, was extracted from these texts. This information regarding the tissues and systems was then entered into the LITCIT simulation software. We ran the program and saved the results. Finally, we compared our results with the results in the references and evaluated them. Results: The outputs of LITCIT are consistent with the results of the references, which were calculated with a different thermal model. The small average error demonstrates the accuracy of the software in simulating and calculating temperature. Conclusions: This simulation software is well suited for use as treatment-planning software for superficial lasers. Thus, it can be used for the optimization of treatment parameters and protocols.

  3. Evaluation of high-fidelity simulation training in radiation oncology using an outcomes logic model

    International Nuclear Information System (INIS)

    Giuliani, Meredith; Gillan, Caitlin; Wong, Olive; Harnett, Nicole; Milne, Emily; Moseley, Doug; Thompson, Robert; Catton, Pamela; Bissonnette, Jean-Pierre

    2014-01-01

    To evaluate the feasibility and educational value of high-fidelity, interprofessional team-based simulation in radiation oncology. The simulation event was conducted in a radiation oncology department during a non-clinical day. It involved 5 simulation scenarios that were run over three 105-minute timeslots in a single day. High-acuity, low-frequency clinical situations were selected and included HDR brachytherapy emergency, 4D CT artifact management, pediatric emergency clinical mark-up, electron scalp trial set-up and a cone beam CT misregistration incident. A purposive sample of a minimum of 20 trainees was required to assess recruitment feasibility. A faculty radiation oncologist (RO), medical physicist (MP) or radiation therapist (RTT) facilitated each case. Participants completed a pre-event survey of demographic data and motivation for participation. A post-event survey collected perceptions of familiarity with the clinical content, comfort with interprofessional practice, and event satisfaction, scored on a 1–10 scale in terms of clinical knowledge, clinical decision making, clinical skills, exposure to other trainees and interprofessional communication. Means and standard deviations were calculated. Twenty-one trainees participated, including 6 ROs (29%), 6 MPs (29%), and 9 RTTs (43%). All 12 cases (100%) were completed within the allocated 105 minutes. Nine faculty facilitators (3 MP, 2 RO, 4 RTT) were required for 405 minutes each. Additional costs associated with this event were 154 hours to build the high-fidelity scenarios, 2 standardized patients (SPs) for a total of 15.5 hours, and consumables. The mean (±SD) educational value score reported by participants with respect to clinical knowledge was 8.9 (1.1), clinical decision making 8.9 (1.3), clinical skills 8.9 (1.1), exposure to other trainees 9.1 (2.3) and interprofessional communication 9.1 (1.0). Fifteen (71%) participants reported the cases were of an appropriate complexity. The importance

  4. Evaluating the Effect of Software Quality Characteristics on Health Care Quality Indicators

    Directory of Open Access Journals (Sweden)

    Sakineh Aghazadeh

    2015-07-01

    Full Text Available Introduction: Various types of software are used in health care organizations to manage information and care processes. The quality of software has been an important concern for both health authorities and designers of health information technology. Thus, assessing the effect of software quality on the performance quality of healthcare institutions is essential. Method: The most important health care quality indicators in relation to software quality characteristics were identified via a previously performed literature review. The ISO 9126 standard model is used for the definition and integration of various characteristics of software quality. The effects of software quality characteristics and sub-characteristics on the healthcare indicators are evaluated through expert opinion analyses. A questionnaire comprising 126 questions on a 10-point Likert scale was used to gather the opinions of experts in the field of Medical/Health Informatics. The data was analyzed using Structural Equation Modeling. Results: Our findings showed that software Maintainability was rated as the most effective factor on user satisfaction (R2 = 0.89) and Functionality as the most important and independent variable affecting patient care quality (R2 = 0.98). Efficiency was considered the most effective factor on workflow (R2 = 0.97), and Maintainability the most important factor affecting healthcare communication (R2 = 0.95). Usability and Efficiency were rated as the most effective factors affecting patient satisfaction (R2 = 0.80, 0.81). Reliability, Maintainability, and Efficiency were considered the main factors affecting care costs (R2 = 0.87, 0.74, 0.87). Conclusion: We presented a new model based on ISO standards. The model demonstrates and weighs the relations between software quality characteristics and healthcare quality indicators. The clear relationships between variables and the type of the metrics and measurement methods used in the model make it a reliable method to assess

  5. Quantum logic

    International Nuclear Information System (INIS)

    Mittelstaedt, P.

    1979-01-01

    The subspaces of Hilbert space constitute an orthocomplemented quasimodular lattice Lsub(q) for which neither a two-valued function nor generalized truth function exist. A generalisation of the dialogic method can be used as an interpretation of a lattice Lsub(qi), which may be considered as the intuitionistic part of Lsub(q). Some obvious modifications of the dialogic method are introduced which come from the possible incommensurability of propositions about quantum mechanical systems. With the aid of this generalized dialogic method a propositional calculus Qsub(eff) is derived which is similar to the calculus of effective (intuitionistic) logic, but contains a few restrictions which are based on the incommensurability of quantum mechanical propositions. It can be shown within the framework of the calculus Qsub(eff) that the value-definiteness of the elementary propositions which are proved by quantum mechanical propositions is inherited by all finite compound propositions. In this way one arrives at the calculus Q of full quantum logic which incorporates the principle of excluded middle for all propositions and which is a model for the lattice Lsub(q). (Auth.)

  6. Y-12 Plant decontamination and decommissioning technology logic diagram for Building 9201-4. Volume 3: Technology evaluation data sheets; Part B: Decontamination, robotics/automation, waste management

    International Nuclear Information System (INIS)

    1994-09-01

    The Y-12 Plant Decontamination and Decommissioning Technology Logic Diagram for Building 9201-4 (TLD) was developed to provide a decision-support tool that relates decontamination and decommissioning (D and D) problems at Bldg. 9201-4 to potential technologies that can remediate these problems. The TLD uses information from the Strategic Roadmap for the Oak Ridge Reservation, the Oak Ridge K-25 Site Technology Logic Diagram, the Oak Ridge National Laboratory Technology Logic Diagram, and a previous Hanford logic diagram. This TLD identifies the research, development, demonstration, testing, and evaluation needed for sufficient development of these technologies to allow for technology transfer and application to D and D and waste management (WM) activities. It is essential that follow-on engineering studies be conducted to build on the output of this project. These studies will begin by selecting the most promising technologies identified in the TLD and by finding an optimum mix of technologies that will provide a socially acceptable balance between cost and risk. This report consists of the decontamination, robotics/automation, and WM data sheets

  7. Y-12 Plant decontamination and decommissioning technology logic diagram for Building 9201-4. Volume 3: Technology evaluation data sheets; Part A: Characterization, dismantlement

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-09-01

    The Y-12 Plant Decontamination and Decommissioning Technology Logic Diagram for Building 9201-4 (TLD) was developed to provide a decision-support tool that relates decontamination and decommissioning (D and D) problems at Bldg. 9201-4 to potential technologies that can remediate these problems. The TLD uses information from the Strategic Roadmap for the Oak Ridge Reservation, the Oak Ridge K-25 Site Technology Logic Diagram, the Oak Ridge National Laboratory Technology Logic Diagram, and a previous Hanford logic diagram. This TLD identifies the research, development, demonstration, testing, and evaluation needed for sufficient development of these technologies to allow for technology transfer and application to D and D and waste management (WM) activities. It is essential that follow-on engineering studies be conducted to build on the output of this project. These studies will begin by selecting the most promising technologies identified in the TLD and by finding an optimum mix of technologies that will provide a socially acceptable balance between cost and risk. This report consists of the characterization and dismantlement data sheets.

  8. Y-12 Plant decontamination and decommissioning technology logic diagram for Building 9201-4. Volume 3: Technology evaluation data sheets; Part A: Characterization, dismantlement

    International Nuclear Information System (INIS)

    1994-09-01

    The Y-12 Plant Decontamination and Decommissioning Technology Logic Diagram for Building 9201-4 (TLD) was developed to provide a decision-support tool that relates decontamination and decommissioning (D and D) problems at Bldg. 9201-4 to potential technologies that can remediate these problems. The TLD uses information from the Strategic Roadmap for the Oak Ridge Reservation, the Oak Ridge K-25 Site Technology Logic Diagram, the Oak Ridge National Laboratory Technology Logic Diagram, and a previous Hanford logic diagram. This TLD identifies the research, development, demonstration, testing, and evaluation needed for sufficient development of these technologies to allow for technology transfer and application to D and D and waste management (WM) activities. It is essential that follow-on engineering studies be conducted to build on the output of this project. These studies will begin by selecting the most promising technologies identified in the TLD and by finding an optimum mix of technologies that will provide a socially acceptable balance between cost and risk. This report consists of the characterization and dismantlement data sheets

  9. Evaluation and Satisfaction Survey on the Interface Usability of Online Publishing Software

    Directory of Open Access Journals (Sweden)

    Ying-Jye Lee

    2014-01-01

    Full Text Available Digital publishing is one of the national key programs. Unlike traditional digital publishing models, consumers can create personal digital publications with the editing program provided by businesses and combine them with web-to-print to output printed publications. Nevertheless, the usability of online publishing software is related to consumers’ acceptance or intention of product purchase. In this case, the Focus Group method is utilized to screen representative online publishing software (including TinTint, Photobook, and Hypo) for evaluating interface usability, investigating users’ Subjective Satisfaction, and further proposing suggestions for interface modification. Learnability and the number of user errors are set as the evaluation indicators of usability. Within the evaluation indicators of Learnability, the results show that nine typical tasks, except for Storing, show significant differences between the various online publishing software. The typical tasks of entering basic information of works, appending pictures, adjusting pictures, changing type version, and changing pages reveal significant differences in the number of user errors on distinct online publishing software. Regarding the evaluation of overall Subjective Satisfaction with the interface, TinTint and Hypo outperform Photobook, and no significant difference appears between TinTint and Hypo. It is expected that the research model could serve as an application reference for interface development and evaluation in digital content industries.

  10. Application of fuzzy logic control in industry

    International Nuclear Information System (INIS)

    Van der Wal, A.J.

    1994-01-01

    An overview is given of the various ways fuzzy logic can be used to improve industrial control. The application of fuzzy logic in control is illustrated by two case studies. The first example shows how fuzzy logic, incorporated in the hardware of an industrial controller, helps to fine-tune a PID controller, without the operator having any a priori knowledge of the system to be controlled. The second example is from process industry. Here, fuzzy logic supervisory control is implemented in software and enhances the operation of a sintering oven through a subtle combination of priority management and deviation-controlled timing
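The first case study's idea, fuzzy rules nudging a PID gain without a plant model, can be sketched with two Sugeno-style rules. The membership shapes and correction factors below are illustrative assumptions, not the cited controller:

```python
def fuzzy_kp_correction(error, kp):
    """Two illustrative rules: IF |error| is SMALL THEN lower Kp slightly;
    IF |error| is LARGE THEN raise Kp."""
    e = min(abs(error), 1.0)
    mu_small = max(0.0, 1.0 - e / 0.5)               # shoulder: 1 at e=0, 0 for e >= 0.5
    mu_large = max(0.0, min(1.0, (e - 0.3) / 0.4))   # ramp: 0 below 0.3, 1 above 0.7
    # Weighted-average defuzzification of the two rule outputs
    num = mu_small * (-0.05 * kp) + mu_large * (0.20 * kp)
    den = mu_small + mu_large
    return kp + (num / den if den else 0.0)

print(fuzzy_kp_correction(0.1, 2.0))  # small error: gain eased down
print(fuzzy_kp_correction(0.8, 2.0))  # large error: gain pushed up
```

In a supervisory setting such rules would run continuously alongside the PID loop, easing the gain near the setpoint and pushing it up on large deviations.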

  11. Software package evaluation for the TJ-II Data Acquisition System

    International Nuclear Information System (INIS)

    Cremy, C.; Sanchez, E.; Portas, A.; Vega, J.

    1996-01-01

    The TJ-II Data Acquisition System (DAS) has to provide a user interface which will allow setup of sampling channels, discharge signal visualization and reduced data processing, all in run time. On the other hand, the DAS will provide a high-level software capability for signal analysis, processing and data visualization, either in run time or off line. A set of software packages, including Builder Xcessory, X-Designer, ILOG Builder, Toolmaster, AVS 5, AVS/Express, PV-WAVE and IRIS Explorer, has been evaluated by the Data Acquisition Group of the Fusion Division. The software evaluation, summarized in this paper, resulted in a global solution being found which meets all of the DAS requirements. (Author)

  12. The design of an instrument to evaluate software for EFL/ESL pronunciation teaching

    Directory of Open Access Journals (Sweden)

    Cristiana Gomes de Freitas Menezes Martins

    2016-01-01

    Full Text Available http://dx.doi.org/10.5007/2175-8026.2016v69n1p141 The purpose of this study was to develop and test the reliability and validity of an instrument to evaluate the extent to which software programs teach English as a Foreign Language and/or Second Language (EFL/ESL) pronunciation following the principles of the Communicative Approach (Celce-Murcia et al., 2010), and thus have the potential to develop English pronunciation. After the development of the instrument, 46 EFL/ESL teachers used it to analyze an online version of the software program Pronunciation Power 2. The responses of the participants were submitted to statistical analysis, and the validity and reliability of the instrument were tested. The good reliability indexes obtained in this study suggest the instrument has some degree of validity for evaluating how well an ESL/EFL pronunciation teaching software program potentially develops English pronunciation.

  13. AgesGalore-A software program for evaluating spatially resolved luminescence data

    International Nuclear Information System (INIS)

    Greilich, S.; Harney, H.-L.; Woda, C.; Wagner, G.A.

    2006-01-01

    Low-light luminescence is usually recorded by photomultiplier tubes (PMTs) yielding integrated photon-number data. Highly sensitive CCD (charge-coupled device) detectors allow for the spatially resolved recording of luminescence. The resulting two-dimensional images require suitable software for data processing. We present a recently developed software program specially designed for equivalent-dose evaluation in the framework of optically stimulated luminescence (OSL) dating. The software is capable of appropriate CCD data handling, parameter estimation using a Bayesian approach, and the pixel-wise fitting of functions for time and dose dependencies to the luminescence signal. The results of the fitting procedure and the equivalent-dose evaluation can be presented and analyzed both as spatial and as frequency distributions
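The pixel-wise fitting of dose dependencies can be illustrated on a synthetic CCD stack with a saturating-exponential dose response (a standard OSL growth model). The grid-search least squares below is only a stand-in for the program's Bayesian estimation, and all numbers are invented:

```python
import numpy as np

# Saturating-exponential dose response: L(D) = L_max * (1 - exp(-D / D0))
def dose_response(D, L_max, D0):
    return L_max * (1.0 - np.exp(-D / D0))

rng = np.random.default_rng(0)
doses = np.array([0.0, 2.0, 5.0, 10.0, 20.0, 40.0])  # regenerative doses (Gy)
# Synthetic 4x4-pixel CCD frames, one per dose, with true L_max = 1000, D0 = 12
frames = dose_response(doses, 1000.0, 12.0)[:, None, None] * np.ones((1, 4, 4))
frames += rng.normal(0.0, 5.0, frames.shape)

# Pixel-wise least-squares fit via a coarse parameter grid search
L_grid = np.linspace(900.0, 1100.0, 41)
D0_grid = np.linspace(8.0, 16.0, 81)
D0_map = np.empty((4, 4))
for i in range(4):
    for j in range(4):
        sse = [((dose_response(doses, L, D0) - frames[:, i, j]) ** 2).sum()
               for L in L_grid for D0 in D0_grid]
        D0_map[i, j] = D0_grid[int(np.argmin(sse)) % len(D0_grid)]
print(D0_map)
```

The resulting per-pixel parameter maps are exactly the kind of spatial distributions the record above describes.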

  14. Quality assurance (QA) procedures for software: Evaluation of an ADC quality system

    International Nuclear Information System (INIS)

    Efstathopoulos, E. P.; Benekos, O.; Molfetas, M.; Charou, E.; Kottou, S.; Argentos, S.; Kelekis, N. L.

    2005-01-01

    Image viewing and processing software in computed radiography manipulates image contrast in such a way that all relevant image features are rendered to an appropriate degree of visibility, and improves image quality using enhancement algorithms. The purpose of this study was to investigate procedures for the quality assessment of image processing software for computed radiography with the use of existing test objects and to assess the influence that processing introduces on physical image quality characteristics. Measurements of high-contrast resolution, low-contrast resolution, spatial resolution, grey scale (characteristic curve) and geometric distortion were performed 'subjectively' by three independent observers and 'objectively' by the use of criteria based on pixel intensity values. Results show that quality assessment using digital images is possible without the need for human evaluators. It was discovered that the processing software evaluated in this study was able to improve some aspects of image quality, without introducing geometric distortion. (authors)

  15. An Embedded Reconfigurable Logic Module

    Science.gov (United States)

    Tucker, Jerry H.; Klenke, Robert H.; Shams, Qamar A. (Technical Monitor)

    2002-01-01

    A Miniature Embedded Reconfigurable Computer and Logic (MERCAL) module has been developed and verified. MERCAL was designed to be a general-purpose, universal module that can provide significant hardware and software resources to meet the requirements of many of today's complex embedded applications. This is accomplished in the MERCAL module by combining a sub-credit-card-size PC in a DIMM form factor with a Xilinx Spartan II FPGA. The PC has the ability to download program files to the FPGA to configure it for different hardware functions and to transfer data to and from the FPGA via the PC's ISA bus during run time. The MERCAL module combines, in a compact package, the computational power of a 133 MHz PC with up to 150,000 gate equivalents of digital logic that can be reconfigured by software. The general architecture and functionality of the MERCAL hardware and system software are described.

  16. A Framework for Evaluating the Software Product Quality of Pregnancy Monitoring Mobile Personal Health Records.

    Science.gov (United States)

    Idri, Ali; Bachiri, Mariam; Fernández-Alemán, José Luis

    2016-03-01

    Stakeholders' needs and expectations are identified by means of software quality requirements, which have an impact on software product quality. In this paper, we present a set of requirements for mobile personal health records (mPHRs) for pregnancy monitoring, which have been extracted from literature and existing mobile apps on the market. We also use the ISO/IEC 25030 standard to suggest the requirements that should be considered during the quality evaluation of these mPHRs. We then go on to design a checklist in which we contrast the mPHRs for pregnancy monitoring requirements with software product quality characteristics and sub-characteristics in order to calculate the impact of these requirements on software product quality, using the ISO/IEC 25010 software product quality standard. The results obtained show that the requirements related to the user's actions and the app's features have the most impact on the external sub-characteristics of the software product quality model. The only sub-characteristic affected by all the requirements is Appropriateness of Functional suitability. The characteristic Operability is affected by 95% of the requirements while the lowest degrees of impact were identified for Compatibility (15%) and Transferability (6%). Lastly, the degrees of the impact of the mPHRs for pregnancy monitoring requirements are discussed in order to provide appropriate recommendations for the developers and stakeholders of mPHRs for pregnancy monitoring.

  17. Assessment of three different software systems in the evaluation of dynamic MRI of the breast

    International Nuclear Information System (INIS)

    Kurz, K.D.; Steinhaus, D.; Klar, V.; Cohnen, M.; Wittsack, H.J.; Saleh, A.; Moedder, U.; Blondin, D.

    2009-01-01

    Objective: The aim was to compare the diagnostic performance and handling of dynamic contrast-enhanced MRI of the breast with two commercial software solutions ('CADstream' and '3TP') and one self-developed software system ('Mammatool'). Materials and methods: Identical data sets of dynamic breast MRI from 21 patients were evaluated retrospectively with all three software systems. The exams were classified according to the BI-RADS classification. The number of lesions in the parametric mapping was compared to histology or follow-up of more than 2 years. In addition, 25 quality criteria were judged by 3 independent investigators with a score from 0 to 5. Statistical analysis was performed to document the quality ranking of the different software systems. Results: There were 9 invasive carcinomas, one pure DCIS, one papilloma, one radial scar, three histologically proven changes due to mastopathy, one adenosis and two fibroadenomas. Additionally, two patients with enhancing parenchyma followed with MRI for more than 3 years and one scar after breast conserving therapy were included. All malignant lesions were classified as BI-RADS 4 or 5 using all software systems and showed significant enhancement in the parametric mapping. 'CADstream' showed the best score on subjective quality criteria. '3TP' showed the lowest number of false-positive results. 'Mammatool' produced the lowest number of benign tissues indicated with parametric overlay. Conclusion: All three software programs tested were adequate for sensitive and efficient assessment of dynamic MRI of the breast. Improvements in specificity may be achievable.

  18. Assessment of three different software systems in the evaluation of dynamic MRI of the breast.

    Science.gov (United States)

    Kurz, K D; Steinhaus, D; Klar, V; Cohnen, M; Wittsack, H J; Saleh, A; Mödder, U; Blondin, D

    2009-02-01

    The aim was to compare the diagnostic performance and handling of dynamic contrast-enhanced MRI of the breast with two commercial software solutions ("CADstream" and "3TP") and one self-developed software system ("Mammatool"). Identical data sets of dynamic breast MRI from 21 patients were evaluated retrospectively with all three software systems. The exams were classified according to the BI-RADS classification. The number of lesions in the parametric mapping was compared to histology or follow-up of more than 2 years. In addition, 25 quality criteria were judged by 3 independent investigators with a score from 0 to 5. Statistical analysis was performed to document the quality ranking of the different software systems. There were 9 invasive carcinomas, one pure DCIS, one papilloma, one radial scar, three histologically proven changes due to mastopathy, one adenosis and two fibroadenomas. Additionally two patients with enhancing parenchyma followed with MRI for more than 3 years and one scar after breast conserving therapy were included. All malignant lesions were classified as BI-RADS 4 or 5 using all software systems and showed significant enhancement in the parametric mapping. "CADstream" showed the best score on subjective quality criteria. "3TP" showed the lowest number of false-positive results. "Mammatool" produced the lowest number of benign tissues indicated with parametric overlay. All three software programs tested were adequate for sensitive and efficient assessment of dynamic MRI of the breast. Improvements in specificity may be achievable.

  19. A Comparison and Evaluation of Real-Time Software Systems Modeling Languages

    Science.gov (United States)

    Evensen, Kenneth D.; Weiss, Kathryn Anne

    2010-01-01

    A model-driven approach to real-time software systems development enables the conceptualization of software, fostering a more thorough understanding of its often complex architecture and behavior while promoting the documentation and analysis of concerns common to real-time embedded systems such as scheduling, resource allocation, and performance. Several modeling languages have been developed to assist in the model-driven software engineering effort for real-time systems, and these languages are beginning to gain traction with practitioners throughout the aerospace industry. This paper presents a survey of several real-time software system modeling languages, namely the Architectural Analysis and Design Language (AADL), the Unified Modeling Language (UML), Systems Modeling Language (SysML), the Modeling and Analysis of Real-Time Embedded Systems (MARTE) UML profile, and the AADL for UML profile. Each language has its advantages and disadvantages, and in order to adequately describe a real-time software system's architecture, a complementary use of multiple languages is almost certainly necessary. This paper aims to explore these languages in the context of understanding the value each brings to the model-driven software engineering effort and to determine if it is feasible and practical to combine aspects of the various modeling languages to achieve more complete coverage in architectural descriptions. To this end, each language is evaluated with respect to a set of criteria such as scope, formalisms, and architectural coverage. An example is used to help illustrate the capabilities of the various languages.

  20. Software engineering

    CERN Document Server

    Thorin, Marc

    1985-01-01

    Software Engineering describes the conceptual bases as well as the main methods and rules on computer programming. This book presents software engineering as a coherent and logically built synthesis and makes it possible to properly carry out an application of small or medium difficulty that can later be developed and adapted to more complex cases. This text is comprised of six chapters and begins by introducing the reader to the fundamental notions of entities, actions, and programming. The next two chapters elaborate on the concepts of information and consistency domains and show that a proc

  1. Nanomagnetic Logic

    Science.gov (United States)

    Carlton, David Bryan

    The exponential improvements in speed, energy efficiency, and cost that the computer industry has relied on for growth during the last 50 years are in danger of ending within the decade. These improvements all have relied on scaling the size of the silicon-based transistor that is at the heart of every modern CPU down to smaller and smaller length scales. However, as the size of the transistor reaches scales that are measured in the number of atoms that make it up, it is clear that this scaling cannot continue forever. As a result of this, there has been a great deal of research effort directed at the search for the next device that will continue to power the growth of the computer industry. However, due to the billions of dollars of investment that conventional silicon transistors have received over the years, it is unlikely that a technology will emerge that will be able to beat it outright in every performance category. More likely, different devices will possess advantages over conventional transistors for certain applications and uses. One of these emerging computing platforms is nanomagnetic logic (NML). NML-based circuits process information by manipulating the magnetization states of single-domain nanomagnets coupled to their nearest neighbors through magnetic dipole interactions. The state variable is magnetization direction and computations can take place without passing an electric current. This makes them extremely attractive as a replacement for conventional transistor-based computing architectures for certain ultra-low power applications. In most work to date, nanomagnetic logic circuits have used an external magnetic clocking field to reset the system between computations. The clocking field is then subsequently removed very slowly relative to the magnetization dynamics, guiding the nanomagnetic logic circuit adiabatically into its magnetic ground state. 
In this dissertation, I will discuss the dynamics behind this process and show that it is greatly

  2. Advances in Modal Logic

    DEFF Research Database (Denmark)

    Modal logic is a subject with ancient roots in the western logical tradition. Up until the last few generations, it was pursued mainly as a branch of philosophy. But in recent years, the subject has taken new directions with connections to topics in computer science and mathematics. This volume is the proceedings of the conference of record in its field, Advances in Modal Logic. Its contributions are state-of-the-art papers. The topics include decidability and complexity results for specific modal logics, proof theory of modal logic, logics for reasoning about time and space, provability logic, dynamic epistemic logic, and the logic of evidence.

  3. The threshold contrast thickness evaluated with different CDMAM phantoms and software

    Directory of Open Access Journals (Sweden)

    Fabiszewska Ewa

    2016-03-01

    The image quality in digital mammography is described by specifying the thickness and diameter of disks with threshold visibility. The European Commission recommends the CDMAM phantom as a tool to evaluate threshold contrast visibility in digital mammography [1, 2]. Inaccuracy of the manufacturing process of CDMAM 3.4 phantoms (Artinis Medical System BV), as well as differences between the software used to analyze the images, may lead to discrepancies in the evaluation of threshold contrast visibility. The authors of this work used three CDMAM 3.4 phantoms with serial numbers 1669, 1840, and 1841 and two mammography systems of the same manufacturer with identical types of detectors. The images were analyzed with EUREF software (version 1.5.5 with CDCOM 1.6.exe file) and Artinis software (version 1.2 with CDCOM 1.6.exe file). The differences between the observed thicknesses of the threshold contrast structures, which were caused by differences between the CDMAM 3.4 phantoms, were not reproduced in the same way on two mammography units of the same type. The thickness reported by the Artinis software (version 1.2 with CDCOM 1.6.exe file) was generally greater than that determined by the EUREF software (version 1.5.5 with CDCOM 1.6.exe file), but the ratio of the results depended on the phantom and the diameter of the structure. It was not possible to establish correction factors which would allow correction of the differences between the results obtained for different CDMAM 3.4 phantoms, or to correct the differences between software. Great care must be taken when results of tests performed with different CDMAM 3.4 phantoms and with different software applications are interpreted.

  4. A comprehensive evaluation of popular proteomics software workflows for label-free proteome quantification and imputation.

    Science.gov (United States)

    Välikangas, Tommi; Suomi, Tomi; Elo, Laura L

    2017-05-31

    Label-free mass spectrometry (MS) has developed into an important tool applied in various fields of biological and life sciences. Several software tools exist to process the raw MS data into quantified protein abundances, including open source and commercial solutions. Each tool includes a set of unique algorithms for the different tasks of the MS data processing workflow. While many of these algorithms have been compared separately, a thorough and systematic evaluation of their overall performance is missing. Moreover, systematic information is lacking about the amount of missing values produced by the different proteomics software and the capabilities of different data imputation methods to account for them. In this study, we evaluated the performance of five popular quantitative label-free proteomics software workflows using four different spike-in data sets. Our extensive testing included the number of proteins quantified and the number of missing values produced by each workflow, the accuracy of detecting differential expression and logarithmic fold change, and the effect of different imputation and filtering methods on the differential expression results. We found that the Progenesis software performed consistently well in the differential expression analysis and produced few missing values. The missing values produced by the other software decreased their performance, but this difference could be mitigated using proper data filtering or imputation methods. Among the imputation methods, we found that local least squares (lls) regression imputation consistently increased the performance of the software in the differential expression analysis, and a combination of both data filtering and local least squares imputation increased performance the most in the tested data sets. © The Author 2017. Published by Oxford University Press.
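    The local least squares imputation the authors highlight can be sketched in a few lines: each protein row with missing values is regressed, over its observed columns, on the complete rows most correlated with it, and the fit predicts the missing entries. The sketch below is a minimal illustration under simplifying assumptions (complete neighbor rows exist, and every incomplete row has at least two observed columns); it is not the implementation evaluated in the study.

```python
import numpy as np

def lls_impute(data, k=5):
    """Local least squares imputation for a proteins x samples matrix.

    Each row containing NaNs is regressed (on its observed columns) on
    the k complete rows most correlated with it; the fitted coefficients
    then predict the missing entries.
    """
    data = np.asarray(data, dtype=float)
    complete = data[~np.isnan(data).any(axis=1)]
    out = data.copy()
    for i, row in enumerate(data):
        miss = np.isnan(row)
        if not miss.any():
            continue
        obs = ~miss
        # rank the complete rows by absolute correlation on observed columns
        corr = np.array([abs(np.corrcoef(row[obs], c[obs])[0, 1])
                         for c in complete])
        nbr = complete[np.argsort(corr)[::-1][:k]]
        # least squares: find coef with nbr[:, obs].T @ coef ~ row[obs]
        coef, *_ = np.linalg.lstsq(nbr[:, obs].T, row[obs], rcond=None)
        out[i, miss] = coef @ nbr[:, miss]
    return out
```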

  5. Implementation of fuzzy logic control algorithm in embedded ...

    African Journals Online (AJOL)

    Fuzzy logic control algorithm solves problems that are difficult to address with traditional control techniques. This paper describes an implementation of fuzzy logic control algorithm using inexpensive hardware as well as how to use fuzzy logic to tackle a specific control problem without any special software tools. As a case ...
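    As the abstract notes, fuzzy control needs no special software tools. A minimal sketch of a two-rule Mamdani-style controller with triangular membership functions and weighted-average defuzzification (the membership ranges and rule outputs below are illustrative, not taken from the paper):

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fan_speed(temp):
    """Two-rule fuzzy controller with weighted-average defuzzification.

    Rule 1: IF temperature is cold THEN fan speed is 10 %.
    Rule 2: IF temperature is hot  THEN fan speed is 90 %.
    """
    cold = tri(temp, 0, 10, 25)   # degree to which temp is 'cold'
    hot = tri(temp, 15, 30, 40)   # degree to which temp is 'hot'
    num = cold * 10 + hot * 90    # rule outputs weighted by firing strength
    den = cold + hot
    return num / den if den else 0.0
```

    At 20 degrees both rules fire equally and the output settles midway at 50 % speed, illustrating the smooth interpolation that makes fuzzy control attractive for plants that are hard to model analytically.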

  6. The Design and Evaluation of a Cryptography Teaching Strategy for Software Engineering Students

    Science.gov (United States)

    Dowling, T.

    2006-01-01

    The present paper describes the design, implementation and evaluation of a cryptography module for final-year software engineering students. The emphasis is on implementation architectures and practical cryptanalysis rather than a standard mathematical approach. The competitive continuous assessment process reflects this approach and rewards…

  7. Architecture and evaluation of software-defined optical switching matrix for hybrid data centers

    DEFF Research Database (Denmark)

    Mehmeri, Victor; Vegas Olmos, Juan José; Tafur Monroy, Idelfonso

    2016-01-01

    A software architecture is proposed for hybrid packet/optical data centers employing programmable NETCONF-enabled optical switching matrix, and a performance evaluation is presented comparing hybrid and electrical-only architectures for elephant flows under different traffic patterns. Network...

  8. A Java-platform software for the evaluation of mass attenuation and ...

    African Journals Online (AJOL)

    Computer software was written for the evaluation of the mass attenuation coefficient (μ/ρ) and mass energy-absorption coefficient (μen/ρ) for body tissues and substitutes of arbitrary elemental composition and percentage-by-weight of elemental constituents, using the Java development platform, which could run on any ...
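    For a material of arbitrary elemental composition, such a calculation presumably rests on the standard mixture rule, (μ/ρ)mix = Σi wi (μ/ρ)i, where wi is the fraction by weight of element i. A minimal sketch (the elemental coefficients below are illustrative placeholders, not tabulated NIST data):

```python
def mixture_mu_rho(weight_fractions, element_mu_rho):
    """Mixture rule: (mu/rho)_mix = sum_i w_i * (mu/rho)_i.

    weight_fractions: element -> percentage (or fraction) by weight.
    element_mu_rho:   element -> mass attenuation coefficient (cm^2/g).
    """
    total = sum(weight_fractions.values())  # normalize percentages
    return sum((w / total) * element_mu_rho[el]
               for el, w in weight_fractions.items())

# Illustrative placeholder coefficients (cm^2/g) at some fixed energy --
# a real calculation would interpolate tabulated values per element.
mu = {"H": 0.15, "O": 0.13}
water = mixture_mu_rho({"H": 11.2, "O": 88.8}, mu)  # percent by weight
```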

  9. Contribution at the evaluation of safety softwares in nuclear power plants control systems

    International Nuclear Information System (INIS)

    Soubies, B.; Le Meur, M.; Henry, J.Y.; Boulc'h, J.

    1993-06-01

    The introduction of programmable systems such as the SPIN (Numerical Integrated Protection System) has led to particular provisions for the design and use of such systems. The use of such systems up to 1983 led to modifications in the maintenance procedures. The new methods used for the N4 project in the evaluation of safety software are given in this report.

  10. Dynamic CT myocardial perfusion imaging: performance of 3D semi-automated evaluation software

    Energy Technology Data Exchange (ETDEWEB)

    Ebersberger, Ullrich [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Heart Center Munich-Bogenhausen, Department of Cardiology and Intensive Care Medicine, Munich (Germany); Marcus, Roy P.; Nikolaou, Konstantin; Bamberg, Fabian [University of Munich, Institute of Clinical Radiology, Munich (Germany); Schoepf, U.J.; Gray, J.C.; McQuiston, Andrew D. [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Lo, Gladys G. [Hong Kong Sanatorium and Hospital, Department of Diagnostic and Interventional Radiology, Hong Kong (China); Wang, Yining [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Department of Radiology, Beijing (China); Blanke, Philipp [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); University Hospital Freiburg, Department of Diagnostic Radiology, Freiburg (Germany); Geyer, Lucas L. [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); University of Munich, Institute of Clinical Radiology, Munich (Germany); Cho, Young Jun [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Konyang University College of Medicine, Department of Radiology, Daejeon (Korea, Republic of); Scheuering, Michael; Canstein, Christian [Siemens Healthcare, CT Division, Forchheim (Germany); Hoffmann, Ellen [Heart Center Munich-Bogenhausen, Department of Cardiology and Intensive Care Medicine, Munich (Germany)

    2014-01-15

    To evaluate the performance of three-dimensional semi-automated evaluation software for the assessment of myocardial blood flow (MBF) and blood volume (MBV) at dynamic myocardial perfusion computed tomography (CT). Volume-based software relying on marginal space learning and probabilistic boosting tree-based contour fitting was applied to CT myocardial perfusion imaging data of 37 subjects. In addition, all image data were analysed manually and both approaches were compared with SPECT findings. Study endpoints included time of analysis and conventional measures of diagnostic accuracy. Of 592 analysable segments, 42 showed perfusion defects on SPECT. Average analysis times for the manual and software-based approaches were 49.1 ± 11.2 and 16.5 ± 3.7 min, respectively (P < 0.01). There was strong agreement between the two measures of interest (MBF, ICC = 0.91, and MBV, ICC = 0.88, both P < 0.01), and no significant difference in diagnostic accuracy between the manual and software-based approaches for either MBF or MBV (all comparisons P > 0.05). Three-dimensional semi-automated evaluation of dynamic myocardial perfusion CT data provides similar measures and diagnostic accuracy to manual evaluation, albeit with substantially reduced analysis times. This capability may aid the integration of this test into clinical workflows. (orig.)

  11. Moralized Rationality: Relying on Logic and Evidence in the Formation and Evaluation of Belief Can Be Seen as a Moral Issue

    OpenAIRE

    Ståhl, Tomas; Zaal, Maarten P.; Skitka, Linda J.

    2016-01-01

    In the present article we demonstrate stable individual differences in the extent to which a reliance on logic and evidence in the formation and evaluation of beliefs is perceived as a moral virtue, and a reliance on less rational processes is perceived as a vice. We refer to this individual difference variable as moralized rationality. Eight studies are reported in which an instrument to measure individual differences in moralized rationality is validated. Results show that the Moralized Rat...

  12. Historical and logical approach of the strategic planning impact evaluation for the Higher Polytechnical Institute of Guayaquil

    Directory of Open Access Journals (Sweden)

    Beatriz Rodríguez-Herkt

    2016-10-01

    The article addresses the historical and logical behavior of the impact assessment of the institutional development plan at the Bolivarian Polytechnic Institute in Guayaquil, Ecuador, preceded by an assessment of the most relevant models that have marked the history of this process, characterizing its fundamental stages on the basis of key indicators. The historical and logical analysis of the impact assessment of strategic planning requires an examination of the different models associated with this process, so as to contextualize the choices made in this investigation.

  13. Coupling photon Monte Carlo simulation and CAD software. Application to X-ray nondestructive evaluation

    International Nuclear Information System (INIS)

    Tabary, J.; Gliere, A.

    2001-01-01

    A Monte Carlo radiation transport simulation program, EGS Nova, and a computer aided design software, BRL-CAD, have been coupled within the framework of Sindbad, a nondestructive evaluation (NDE) simulation system. In its current status, the program is very valuable in a NDE laboratory context, as it helps simulate the images due to the uncollided and scattered photon fluxes in a single NDE software environment, without having to switch to a Monte Carlo code parameters set. Numerical validations show a good agreement with EGS4 computed and published data. As the program's major drawback is the execution time, computational efficiency improvements are foreseen. (orig.)

  14. Paraconsistent Computational Logic

    DEFF Research Database (Denmark)

    Jensen, Andreas Schmidt; Villadsen, Jørgen

    2012-01-01

    In classical logic everything follows from inconsistency and this makes classical logic problematic in areas of computer science where contradictions seem unavoidable. We describe a many-valued paraconsistent logic, discuss the truth tables and include a small case study.
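    A many-valued paraconsistent logic of the kind described can be illustrated with three truth values and min/max truth tables; the point is that an inconsistent premise no longer entails everything. A minimal sketch (the specific three-valued tables below follow Priest's Logic of Paradox and are chosen for illustration, not taken from the paper):

```python
# Three truth values: 0 = false, 0.5 = both true and false, 1 = true.
NOT = lambda a: 1 - a
AND = lambda a, b: min(a, b)
OR = lambda a, b: max(a, b)
designated = lambda a: a >= 0.5   # values that count as 'assertable'

def entails(premise, conclusion, values=(0, 0.5, 1)):
    """Entailment: every valuation designating the premise designates
    the conclusion. premise/conclusion map the atoms (p, q) to values."""
    return all(designated(conclusion(p, q))
               for p in values for q in values
               if designated(premise(p, q)))

# Explosion fails: a contradiction p AND NOT p does not entail arbitrary q.
explosion = entails(lambda p, q: AND(p, NOT(p)), lambda p, q: q)
```

    With p valued 0.5 the contradictory premise is designated while q may be plain false, so `explosion` is False; in two-valued classical logic the same check would succeed vacuously because no valuation designates a contradiction.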

  15. Microelectromechanical reprogrammable logic device

    KAUST Repository

    Hafiz, Md Abdullah Al; Kosuru, Lakshmoji; Younis, Mohammad I.

    2016-01-01

    on the electrothermal frequency modulation scheme of a single microelectromechanical resonator, capable of performing all the fundamental 2-bit logic functions as well as n-bit logic operations. Logic functions are performed by actively tuning the linear resonance

  16. Stochastic coalgebraic logic

    CERN Document Server

    Doberkat, Ernst-Erich

    2009-01-01

    Combining coalgebraic reasoning, stochastic systems and logic, this volume presents the principles of coalgebraic logic from a categorical perspective. Modal logics are also discussed, including probabilistic interpretations and an analysis of Kripke models.

  17. [Development and practice evaluation of blood acid-base imbalance analysis software].

    Science.gov (United States)

    Chen, Bo; Huang, Haiying; Zhou, Qiang; Peng, Shan; Jia, Hongyu; Ji, Tianxing

    2014-11-01

    To develop blood gas and acid-base imbalance analysis software that systematically, rapidly, accurately and automatically determines the type of acid-base imbalance, and to evaluate its clinical application. Using the VBA programming language, computer-aided diagnostic software for the judgment of acid-base balance was developed. The clinical data of 220 patients admitted to the Second Affiliated Hospital of Guangzhou Medical University were retrospectively analyzed. The arterial blood gas data [pH value, HCO(3)(-), arterial partial pressure of carbon dioxide (PaCO₂)] and electrolyte data (Na⁺ and Cl⁻) were collected. Data were entered into the software for acid-base imbalance judgment. At the same time, the type of acid-base imbalance was determined manually from the data using the H-H compensation formula. The consistency of judgment results from software and manual calculation was evaluated, and the judgment times of the two methods were compared. The clinical diagnoses of the types of acid-base imbalance for the 220 patients were: 65 normal cases, 90 cases of simple type, 41 of mixed type, and 24 of triplex type. The accuracy of the judgment results of the normal and triplex types from the computer software compared with those calculated manually was 100%; the accuracy was 98.9% for the simple type and 78.0% for the mixed type, and the total accuracy was 95.5%. The Kappa value of judgment results from software and manual judgment was 0.935, P=0.000, demonstrating very good consistency. The time for the software to determine acid-base imbalances was significantly shorter than manual judgment (seconds: 18.14 ± 3.80 vs. 43.79 ± 23.86, t=7.466, P=0.000), so the software method was much faster than the manual method. Software judgment can replace manual judgment with the characteristics of being rapid, accurate and convenient, and can improve the work efficiency and quality of clinical doctors and has great
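    A rule-based screen of the kind the software automates can be sketched as follows. This is a deliberately simplified single-disorder classifier using Winter's formula for expected respiratory compensation; the published software additionally resolves mixed and triplex disorders, and the exact H-H compensation criteria it applies are not reproduced here. All thresholds are illustrative textbook values:

```python
def classify(ph, hco3, paco2):
    """Simplified primary acid-base screen (illustrative thresholds).

    ph: arterial pH; hco3: bicarbonate (mmol/L); paco2: PaCO2 (mmHg).
    Mixed and triplex disorders are deliberately out of scope.
    """
    if ph < 7.35:
        if hco3 < 22:
            expected_paco2 = 1.5 * hco3 + 8   # Winter's formula
            if abs(paco2 - expected_paco2) <= 2:
                return "metabolic acidosis, appropriate compensation"
            return "metabolic acidosis with respiratory component"
        return "respiratory acidosis"
    if ph > 7.45:
        return "metabolic alkalosis" if hco3 > 26 else "respiratory alkalosis"
    return "no simple disorder detected"
```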

  18. AN INITIAL EVALUATION OF THE BTRACKS BALANCE PLATE AND SPORTS BALANCE SOFTWARE FOR CONCUSSION DIAGNOSIS.

    Science.gov (United States)

    Goble, Daniel J; Manyak, Kristin A; Abdenour, Thomas E; Rauh, Mitchell J; Baweja, Harsimran S

    2016-04-01

    As recently dictated by the American Medical Society, balance testing is an important component in the clinical evaluation of concussion. Despite this, previous research on the efficacy of balance testing for concussion diagnosis suggests low sensitivity (∼30%), based primarily on the popular Balance Error Scoring System (BESS). The Balance Tracking System (BTrackS, Balance Tracking Systems Inc., San Diego, CA, USA) consists of a force plate (BTrackS Balance Plate) and software (BTrackS Sport Balance) which can quickly perform balance testing with gold-standard accuracy. The present study aimed to determine the sensitivity of the BTrackS Balance Plate and Sport Balance software for concussion diagnosis. Cross-sectional study. Preseason baseline balance testing of 519 healthy Division I college athletes playing sports with a relatively high risk of concussion was performed with the BTrackS Balance Test. Testing was administered by certified athletic training staff using the BTrackS Balance Plate and Sport Balance software. Of the baselined athletes, 25 later experienced a concussion during the ensuing sport season. Post-injury balance testing was performed on these concussed athletes within 48 hours of injury, and the sensitivity of the BTrackS Balance Plate and Sport Balance software was estimated based on the number of athletes showing a balance decline according to the criteria specified in the Sport Balance software. This criterion is based on the minimal detectable change statistic with a 90% confidence level (i.e. 90% specificity). Of the 25 athletes who experienced concussions, 16 had balance declines relative to baseline testing results according to the BTrackS Sport Balance software criteria. This corresponds to an estimated concussion sensitivity of 64%, which is twice as great as that reported previously for the BESS. The BTrackS Balance Plate and Sport Balance software has the greatest concussion sensitivity of any balance testing instrument reported to date. Level 2.
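    The decline criterion described, a minimal detectable change at 90% confidence, is conventionally computed as MDC90 = 1.645 · √2 · SEM. A sketch of how such a criterion could be applied (the function and its arguments are hypothetical illustrations, not the Sport Balance software's API):

```python
import math

def balance_declined(baseline_sway, post_sway, sem, z=1.645):
    """Flag a post-injury balance decline when the increase in sway
    exceeds the 90 % minimal detectable change: MDC90 = z * sqrt(2) * SEM.

    sem is the standard error of measurement of the sway score; z = 1.645
    gives a one-sided 90 % confidence level (~90 % specificity).
    """
    mdc90 = z * math.sqrt(2) * sem
    return (post_sway - baseline_sway) > mdc90

# The study's sensitivity estimate is then simply the fraction of
# concussed athletes flagged by this criterion: 16 of 25, i.e. 0.64.
```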

  19. An economic evaluation system software for in-situ leaching mining of sandstone uranium deposits

    International Nuclear Information System (INIS)

    Yao Yixuan; Su Xuebin; Xie Weixing; Que Weimin

    2001-01-01

    The author presents the results of a study applying computer technology to quantitatively evaluate the technical-economic feasibility of in-situ leaching mining of sandstone uranium deposits. A computer system software has been developed. For specified deposit conditions and a given production size per year, the software will generate total capital and mine-life operating costs, and solve for the dynamic and static financial assessment targets through discounted cash flow analysis. According to the characteristics of two kinds of sandstone uranium deposits, a database of economic and technical parameters of in-situ leaching has been designed. The system software can also be used to study the economic value of deposits and to optimize the key project parameters. Its features, data input method and requirements, main functions, structure and operating environment are described.

  20. Classical logic and logicism in human thought

    OpenAIRE

    Elqayam, Shira

    2012-01-01

    This chapter explores the role of classical logic as a theory of human reasoning. I distinguish between classical logic as a normative, computational and algorithmic system, and review its role in theories of human reasoning since the 1960s. The thesis I defend is that psychological theories have been moving further and further away from classical logic on all three levels. I examine some prominent examples of logicist theories, which incorporate logic in their psychological account, including...

  1. Logic programming extensions of Horn clause logic

    Directory of Open Access Journals (Sweden)

    Ron Sigal

    1988-11-01

    Logic programming is now firmly established as an alternative programming paradigm, distinct from and arguably superior to the still dominant imperative style of, for instance, the Algol family of languages. The concept of a logic programming language is not precisely defined, but it is generally understood to be characterized by: a declarative nature; a foundation in some well understood logical system, e.g., first order logic.
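    The declarative flavor of Horn clause logic is easy to demonstrate: a program is a set of facts and rules (body implies head), and an interpreter derives consequences rather than executing imperative steps. A minimal propositional forward-chaining sketch (real logic programming languages such as Prolog use first-order terms and backward chaining; the atoms and rules below are invented for illustration):

```python
def forward_chain(facts, rules):
    """Derive all consequences of propositional Horn rules.

    facts: set of atoms known true; rules: iterable of (body, head)
    pairs meaning 'if every atom in body holds, head holds'.
    Iterates to a fixpoint, as in naive datalog evaluation.
    """
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in known and all(b in known for b in body):
                known.add(head)
                changed = True
    return known

# A toy Horn program.
rules = [({"parent", "male"}, "father"),
         ({"father"}, "ancestor")]
derived = forward_chain({"parent", "male"}, rules)
```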

  2. Availability increase evaluation of a CANDU-600 reactor based on the 2 out of 4 shutdown logic

    International Nuclear Information System (INIS)

    Stefanescu, P.; Stancu-Ciolac, O.

    1995-01-01

    Quality, reliability and maintenance are the three directly decisive factors for the availability of the nuclear process and safety systems of a nuclear power plant. Since the reliability of nuclear equipment and components, based on the efforts performed to perfect them, can rapidly reach a 'saturation' point, the only way to improve a system's availability is to find possibilities to optimize its structure so as to strongly minimize its unavailability. Reliability analyses prove that very good results have been obtained by replacing the simple redundant schemes (1 out of 2; 2 out of 3) with more sophisticated ones (2 out of 4; 2 x 2 out of 3). The paper reveals the advantages gained by a CANDU-600 reactor if Shutdown System Number 1 is based on the 2 out of 4 logic instead of 2 out of 3. The investigation's framework is a new 2 out of 4 shutdown scheme, entailing only relay changes and aiming for identical design requirements and purposes as the initial one. The calculations use the classical logical block diagrams, reliability factors and equations, and demonstrate the advantage of the proposed logic by computing and comparing the availability factors for the 2 out of 4 and 2 out of 3 logic. The efficiency of the method is established by estimating, in comparison with the initial 2 out of 3 logic, the additional investment owed to the 4 extra release channels and 1 measurement channel. The determined increase of the availability factor (5.86·10⁻⁶ year/years) and the corresponding rise in investment (8.6 million lei) support the proposed method. (Author) 2 Tabs
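    For identical, independent channels, the availability of a k-out-of-n voting structure follows the binomial formula A = Σ (i = k..n) C(n, i) · aⁱ · (1 − a)ⁿ⁻ⁱ, which makes the 2 out of 4 versus 2 out of 3 comparison easy to reproduce. A sketch (the per-channel availability used below is illustrative, not a CANDU-600 figure):

```python
from math import comb

def k_out_of_n_availability(k, n, a):
    """Availability of a k-out-of-n:G structure of identical,
    independent channels, each available with probability a."""
    return sum(comb(n, i) * a**i * (1 - a)**(n - i)
               for i in range(k, n + 1))

a = 0.999  # illustrative per-channel availability
gain = k_out_of_n_availability(2, 4, a) - k_out_of_n_availability(2, 3, a)
```

    With these numbers the 2-out-of-4 scheme gains a few parts in a million of availability over 2-out-of-3, the same order of magnitude as the improvement reported in the abstract.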

  3. Pulse coded safety logic for PFBR

    International Nuclear Information System (INIS)

    Anwer, Md. Najam; Satheesh, N.; Nagaraj, C.P.; Krishnakumar, B.

    2002-01-01

    Full text: Reactor safety logic is designed to initiate safety action against design basis events. The reactor is shut down by de-energizing electromagnets and dropping the absorber rods under gravity. In the prototype fast breeder reactor (PFBR), shutdown is effected by two independent shutdown systems, viz., the control and safety rod drive mechanism (CSRDM) and the diverse safety rod drive mechanism (DSRDM). Two separate safety logics are proposed for CSRDM and DSRDM, i.e. solid state logic with on-line fine impulse test (FIT) for CSRDM and pulse coded safety logic (PCSL) for DSRDM. The PCSL primarily exploits the fact that the vast majority of faults in the logic circuitry result in static conditions at the output. It is arranged such that the presence of pulses is required to hold the shutdown actuators, and any DC logic state, either logic 0 or logic 1, releases them. It is a dynamic, self-testing logic and is used in a number of reactors. This paper describes the principle of operation of PCSL, its advantages, the concept of guard line logic (GLL), detection of stuck-at-0 and stuck-at-1 faults, and its fail-safe and diversity features. The implementation of PCSL using Altera Max+Plus II software for PFBR trip signals and the results of simulation are discussed. This paper also describes a test jig using an 80186-based system for testing PCSL for various input parameter combinations and monitoring the outputs.
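
The fail-safe principle of PCSL (pulses hold the actuators; any static level, whether a stuck-at-0 or stuck-at-1 fault or a genuine trip demand, releases them) can be mimicked by a toy watchdog over a sampled signal. A sketch only, not the PFBR implementation; the max_static_run threshold is arbitrary:

```python
def actuator_held(samples, max_static_run=3):
    """Dynamic-logic watchdog sketch: the electromagnet stays energised
    only while the coded signal keeps pulsing. Any static run longer than
    max_static_run samples de-energises it, i.e. a fail-safe release."""
    run = 1
    for prev, cur in zip(samples, samples[1:]):
        run = run + 1 if cur == prev else 1
        if run > max_static_run:
            return False  # static level detected: release the rods
    return True

print(actuator_held([0, 1, 0, 1, 0, 1]))  # healthy pulse train -> True
print(actuator_held([0, 1, 1, 1, 1, 1]))  # stuck-at-1 -> False
print(actuator_held([0, 1, 0, 0, 0, 0]))  # stuck-at-0 -> False
```

The point of the dynamic scheme is visible in the last two calls: both fault polarities produce the same safe outcome, which static logic cannot guarantee without separate test circuitry.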

  4. Automated quantification of epicardial adipose tissue using CT angiography: evaluation of a prototype software

    International Nuclear Information System (INIS)

    Spearman, James V.; Silverman, Justin R.; Krazinski, Aleksander W.; Costello, Philip; Meinel, Felix G.; Geyer, Lucas L.; Schoepf, U.J.; Apfaltrer, Paul; Canstein, Christian; De Cecco, Carlo Nicola

    2014-01-01

    This study evaluated the performance of a novel automated software tool for epicardial fat volume (EFV) quantification compared to a standard manual technique at coronary CT angiography (cCTA). cCTA data sets of 70 patients (58.6 ± 12.9 years, 33 men) were retrospectively analysed using two different post-processing software applications. Observer 1 performed a manual single-plane pericardial border definition and EFV M segmentation (manual approach). Two observers used a software program with fully automated 3D pericardial border definition and EFV A calculation (automated approach). EFV and time required for measuring EFV (including software processing time and manual optimization time) for each method were recorded. Intraobserver and interobserver reliability was assessed on the prototype software measurements. T test, Spearman's rho, and Bland-Altman plots were used for statistical analysis. The final EFV A (with manual border optimization) was strongly correlated with the manual axial segmentation measurement (60.9 ± 33.2 mL vs. 65.8 ± 37.0 mL, rho = 0.970, P < 0.001). A mean of 3.9 ± 1.9 manual border edits were performed to optimize the automated process. The software prototype required significantly less time to perform the measurements (135.6 ± 24.6 s vs. 314.3 ± 76.3 s, P < 0.001) and showed high reliability (ICC > 0.9). Automated EFV A quantification is an accurate and time-saving method for quantification of EFV compared to established manual axial segmentation methods. (orig.)

  5. Miniaturization of Josephson logic circuits

    International Nuclear Information System (INIS)

    Ko, H.; Van Duzer, T.

    1985-01-01

    The performances of Current Injection Logic (CIL) and Resistor Coupled Josephson Logic (RCJL) have been evaluated for minimum feature sizes ranging from 5 μm to 0.2 μm. The logic delay is limited to about 10 ps for both the CIL AND gate and the RCJL OR gate biased at 70% of maximum bias current. The maximum circuit count on a 6.35 x 6.35 chip is 13,000 for CIL gates and 20,000 for RCJL gates. Some suggestions are given for further improvements.

  6. Vrednovanje lokacija za uspostavljanje mosnog mesta prelaska preko vodenih prepreka primenom fuzzy logike / Evaluating locations for river crossing using fuzzy logic

    Directory of Open Access Journals (Sweden)

    Darko I. Božanić

    2010-01-01

    pontoon bridge location for the purpose of overcoming water obstacles. The decision-making process involves a higher or lower level of indefiniteness in the criteria needed for making a relevant decision. Since fuzzy logic is very suitable for expressing indefiniteness and uncertainty, the paper presents the decision-making process using a fuzzy logic approach. Characteristics of multi-criteria methods and selection of methods for evaluation: With the development of evaluation theory, evaluation models were developed as well. Different objectives of evaluation, and other differences in the overall procedure, drove the development of many evaluation models adapted to different requirements. The main objective of multi-criteria methods is to define priorities among particular variants or criteria in situations with a large number of decision makers and a large number of decision-making criteria over repeated periods of time. Main notions of fuzzy logic and fuzzy sets: In a broader sense, fuzzy logic is a synonym for fuzzy set theory, which deals with classes of objects with unclear borders, membership in which is measured by a degree. The essence of fuzzy logic is different from that of the traditional logic system, which, based on clear and precisely defined rules, has its foundation in set theory: an element either is or is not a member of a set, which means that sets have clearly determined borders. Contrary to conventional logic, fuzzy logic does not define the membership of an element in a set precisely; the membership value may be expressed as a percentage, for example. Fuzzy logic is thus very close to human perception. Fuzzy system modeling for evaluation of selected locations: Fuzzy logic is usually used for modeling complex systems, when it is difficult to define the interdependences between certain variables by other methods.
The criteria for the selection of locations for
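
Graded membership, as described above, can be illustrated with the common triangular membership function. A minimal sketch with a hypothetical river-width criterion; the paper's actual criteria and membership functions are not reproduced here:

```python
def triangular(a: float, b: float, c: float):
    """Triangular fuzzy membership function with support [a, c] and peak at b."""
    def mu(x: float) -> float:
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)
    return mu

# Hypothetical criterion: suitability of river width (m) for a pontoon bridge.
suitable_width = triangular(20.0, 60.0, 120.0)
for width in (20.0, 60.0, 90.0):
    print(f"width {width} m -> membership {suitable_width(width):.2f}")
```

Unlike a crisp set ("widths between 20 m and 120 m are acceptable"), the membership degree falls off gradually towards the edges of the support, which is exactly the indefiniteness the abstract contrasts with conventional logic.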

  7. Three-valued logics in modal logic

    NARCIS (Netherlands)

    Kooi, Barteld; Tamminga, Allard

    2013-01-01

    Every truth-functional three-valued propositional logic can be conservatively translated into the modal logic S5. We prove this claim constructively in two steps. First, we define a Translation Manual that converts any propositional formula of any three-valued logic into a modal formula. Second, we
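
The record above concerns translating three-valued logics into S5; the Translation Manual itself is in the paper, not the abstract, but a truth-functional three-valued logic is easy to illustrate. A minimal sketch of strong Kleene K3 (one such logic), with 0.5 standing for the third, "undefined" value:

```python
# Strong Kleene three-valued logic K3: 0 = false, 0.5 = undefined, 1 = true.
def k3_not(x): return 1 - x
def k3_and(x, y): return min(x, y)
def k3_or(x, y): return max(x, y)

vals = (0, 0.5, 1)
print("x    y    x AND y  x OR y")
for x in vals:
    for y in vals:
        print(f"{x:<4} {y:<4} {k3_and(x, y):<8} {k3_or(x, y)}")
```

"Truth-functional" means exactly what the table enumeration shows: the value of a compound formula is a function of the values of its parts, which is the property the translation into S5 relies on.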

  8. Ease of adoption of clinical natural language processing software: An evaluation of five systems.

    Science.gov (United States)

    Zheng, Kai; Vydiswaran, V G Vinod; Liu, Yang; Wang, Yue; Stubbs, Amber; Uzuner, Özlem; Gururaj, Anupama E; Bayer, Samuel; Aberdeen, John; Rumshisky, Anna; Pakhomov, Serguei; Liu, Hongfang; Xu, Hua

    2015-12-01

    In recognition of potential barriers that may inhibit the widespread adoption of biomedical software, the 2014 i2b2 Challenge introduced a special track, Track 3 - Software Usability Assessment, in order to develop a better understanding of the adoption issues that might be associated with the state-of-the-art clinical NLP systems. This paper reports the ease of adoption assessment methods we developed for this track, and the results of evaluating five clinical NLP system submissions. A team of human evaluators performed a series of scripted adoptability test tasks with each of the participating systems. The evaluation team consisted of four "expert evaluators" with training in computer science, and eight "end user evaluators" with mixed backgrounds in medicine, nursing, pharmacy, and health informatics. We assessed how easy it is to adopt the submitted systems along the following three dimensions: communication effectiveness (i.e., how effective a system is in communicating its designed objectives to intended audience), effort required to install, and effort required to use. We used a formal software usability testing tool, TURF, to record the evaluators' interactions with the systems and 'think-aloud' data revealing their thought processes when installing and using the systems and when resolving unexpected issues. Overall, the ease of adoption ratings that the five systems received are unsatisfactory. Installation of some of the systems proved to be rather difficult, and some systems failed to adequately communicate their designed objectives to intended adopters. Further, the average ratings provided by the end user evaluators on ease of use and ease of interpreting output are -0.35 and -0.53, respectively, indicating that this group of users generally deemed the systems extremely difficult to work with. While the ratings provided by the expert evaluators are higher, 0.6 and 0.45, respectively, these ratings are still low indicating that they also experienced

  9. Comparative exploration of multidimensional flow cytometry software: a model approach evaluating T cell polyfunctional behavior.

    Science.gov (United States)

    Spear, Timothy T; Nishimura, Michael I; Simms, Patricia E

    2017-08-01

    Advancement in flow cytometry reagents and instrumentation has allowed for simultaneous analysis of large numbers of lineage/functional immune cell markers. Highly complex datasets generated by polychromatic flow cytometry require proper analytical software to answer investigators' questions. A problem among many investigators and flow cytometry Shared Resource Laboratories (SRLs), including our own, is a lack of access to a flow cytometry-knowledgeable bioinformatics team, making it difficult to learn and choose appropriate analysis tool(s). Here, we comparatively assess various multidimensional flow cytometry software packages for their ability to answer a specific biologic question and provide graphical representation output suitable for publication, as well as their ease of use and cost. We assessed polyfunctional potential of TCR-transduced T cells, serving as a model evaluation, using multidimensional flow cytometry to analyze 6 intracellular cytokines and degranulation on a per-cell basis. Analysis of 7 parameters resulted in 128 possible combinations of positivity/negativity, far too complex for basic flow cytometry software to analyze fully. Various software packages were used, analysis methods used in each described, and representative output displayed. Of the tools investigated, automated classification of cellular expression by nonlinear stochastic embedding (ACCENSE) and coupled analysis in Pestle/simplified presentation of incredibly complex evaluations (SPICE) provided the most user-friendly manipulations and readable output, evaluating effects of altered antigen-specific stimulation on T cell polyfunctionality. This detailed approach may serve as a model for other investigators/SRLs in selecting the most appropriate software to analyze complex flow cytometry datasets. Further development and awareness of available tools will help guide proper data analysis to answer difficult biologic questions arising from incredibly complex datasets. © Society
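
The 128 figure above is simply the 2^7 positivity/negativity patterns over 7 parameters, which is easy to enumerate. The marker names below are illustrative assumptions, not taken from the study:

```python
from itertools import product

# Illustrative 7-parameter panel (6 cytokines plus degranulation); the
# specific marker names are assumptions, not taken from the study.
markers = ["IFNg", "TNFa", "IL2", "IL4", "IL17", "IL21", "CD107a"]

combos = list(product([False, True], repeat=len(markers)))
print(len(combos))  # 2**7 = 128 positivity/negativity patterns

# Render one pattern as a Boolean gate label:
gate = "".join(m + ("+" if pos else "-") for m, pos in zip(markers, combos[97]))
print(gate)
```

Enumerating the combinations is trivial; the hard part the record describes is tallying, comparing, and visualising per-cell frequencies across all 128 gates, which is what tools such as SPICE automate.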

  10. Novel software for quantitative evaluation and graphical representation of masticatory efficiency.

    Science.gov (United States)

    Halazonetis, D J; Schimmel, M; Antonarakis, G S; Christou, P

    2013-05-01

    Blending of chewing gums of different colours is used in the clinical setting as a simple and reliable means for the assessment of chewing efficiency. However, the available software is difficult to use in an everyday clinical setting, and there is no possibility of automated classification of the patient's chewing ability in a graph to facilitate visualisation of the results and evaluation of potential chewing difficulties. The aims of this study were to test the validity of ViewGum - a novel image analysis software for the evaluation of boli derived from a two-colour mixing ability test - and to establish a baseline graph for the representation of masticatory efficiency in a healthy population. Image analysis demonstrated a significant decrease in hue variation as the number of chewing cycles increased, indicating a higher degree of colour mixture. Standard deviation of hue (SDHue) was significantly different between all chewing cycles. Regression of the log-transformed values of the medians of SDHue on the number of chewing cycles showed a high statistically significant correlation (r² = 0.94). The software differs from currently available test methods by the simplicity of its application. The newly developed ViewGum software provides speed, ease of use and immediate extraction of clinically useful conclusions to the already established method of chewing efficiency evaluation and is a valid adjunct for the evaluation of masticatory efficiency with two-colour chewing gum. © 2013 Blackwell Publishing Ltd.
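
The SDHue measure described above (standard deviation of hue over the bolus image) can be sketched directly. A toy version, assuming RGB pixels in [0, 1], made-up gum colours, and ignoring hue's circular wrap-around, which real software such as ViewGum would have to handle:

```python
import colorsys
from statistics import pstdev

def sd_hue(pixels):
    """SDHue sketch: population standard deviation of pixel hues.
    A well-blended two-colour bolus yields a narrow hue distribution,
    hence a low SDHue. Hue wrap-around is ignored in this toy version."""
    hues = [colorsys.rgb_to_hsv(r, g, b)[0] for r, g, b in pixels]
    return pstdev(hues)

unmixed = [(1.0, 0.2, 0.2)] * 5 + [(0.2, 0.2, 1.0)] * 5  # two distinct gum colours
mixed = [(0.6, 0.2, 0.6)] * 10                            # one uniform blended colour
print(sd_hue(unmixed) > sd_hue(mixed))  # True: chewing (mixing) lowers SDHue
```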

  11. Development and evaluation of a web-based software for crash data collection, processing and analysis.

    Science.gov (United States)

    Montella, Alfonso; Chiaradonna, Salvatore; Criscuolo, Giorgio; De Martino, Salvatore

    2017-02-05

    The first step in the development of an effective safety management system is to create reliable crash databases, since the quality of decision making in road safety depends on the quality of the data on which decisions are based. Improving crash data is a worldwide priority, as highlighted in the Global Plan for the Decade of Action for Road Safety adopted by the United Nations, which recognizes that the overall goal of the plan will be attained by improving the quality of data collection at the national, regional and global levels. Crash databases provide the basic information for effective highway safety efforts at any level of government, but a lack of uniformity among countries and among the different jurisdictions within the same country is observed. Several existing databases show significant drawbacks which hinder their effective use for safety analysis and improvement. Furthermore, modern technologies offer great potential for significant improvements of existing methods and procedures for crash data collection, processing and analysis. To address these issues, in this paper we present the development and evaluation of a web-based platform-independent software for crash data collection, processing and analysis. The software is designed for mobile and desktop electronic devices and enables a guided and automated drafting of the crash report, assisting police officers both on-site and in the office. The software development was based both on a detailed critical review of existing Australasian, EU, and U.S. crash databases and software as well as on continuous consultation with the stakeholders. The evaluation was carried out by comparing the completeness, timeliness, and accuracy of crash data before and after the use of the software in the city of Vico Equense, in the south of Italy, showing significant advantages. The amount of collected information increased from 82 variables to 268 variables, i.e., a 227% increase. The time saving was more than one hour per crash.

  12. Evaluating the effects of a new qualitative simulation software (DynaLearn) on learning behavior, factual and causal understanding

    NARCIS (Netherlands)

    Zitek, A.; Poppe, M.; Stelzhammer, M.; Muhar, S.; Bredeweg, B.; Biswas, G.; Bull, S.; Kay, J.; Mitrovic, A.

    2011-01-01

    The DynaLearn software, a new intelligent learning environment aimed at supporting a better conceptual and causal understanding of environmental sciences was evaluated. The main goals of these pilot evaluations were to provide information on (1) usability of the software and problems learners

  13. Automated quantification of epicardial adipose tissue using CT angiography: evaluation of a prototype software

    Energy Technology Data Exchange (ETDEWEB)

    Spearman, James V.; Silverman, Justin R.; Krazinski, Aleksander W.; Costello, Philip [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Meinel, Felix G.; Geyer, Lucas L. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Ludwig-Maximilians-University Hospital, Institute for Clinical Radiology, Munich (Germany); Schoepf, U.J. [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); Medical University of South Carolina, Division of Cardiology, Department of Medicine, Charleston, SC (United States); Apfaltrer, Paul [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); University Medical Center Mannheim, Medical Faculty Mannheim - Heidelberg University, Institute of Clinical Radiology and Nuclear Medicine, Mannheim (Germany); Canstein, Christian [Siemens Medical Solutions USA, Inc., Malvern, PA (United States); De Cecco, Carlo Nicola [Medical University of South Carolina, Department of Radiology and Radiological Science, Charleston, SC (United States); University of Rome ' ' Sapienza' ' - Polo Pontino, Department of Radiological Sciences, Oncology and Pathology, Latina (Italy)

    2014-02-15

    This study evaluated the performance of a novel automated software tool for epicardial fat volume (EFV) quantification compared to a standard manual technique at coronary CT angiography (cCTA). cCTA data sets of 70 patients (58.6 ± 12.9 years, 33 men) were retrospectively analysed using two different post-processing software applications. Observer 1 performed a manual single-plane pericardial border definition and EFV{sub M} segmentation (manual approach). Two observers used a software program with fully automated 3D pericardial border definition and EFV{sub A} calculation (automated approach). EFV and time required for measuring EFV (including software processing time and manual optimization time) for each method were recorded. Intraobserver and interobserver reliability was assessed on the prototype software measurements. T test, Spearman's rho, and Bland-Altman plots were used for statistical analysis. The final EFV{sub A} (with manual border optimization) was strongly correlated with the manual axial segmentation measurement (60.9 ± 33.2 mL vs. 65.8 ± 37.0 mL, rho = 0.970, P < 0.001). A mean of 3.9 ± 1.9 manual border edits were performed to optimize the automated process. The software prototype required significantly less time to perform the measurements (135.6 ± 24.6 s vs. 314.3 ± 76.3 s, P < 0.001) and showed high reliability (ICC > 0.9). Automated EFV{sub A} quantification is an accurate and time-saving method for quantification of EFV compared to established manual axial segmentation methods. (orig.)
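
The correlation statistic reported in the record above (Spearman's rho) is the Pearson correlation of rank-transformed data. In practice one would call scipy.stats.spearmanr, but a stdlib sketch with hypothetical paired EFV values makes the computation explicit:

```python
def ranks(values):
    """1-based average ranks, ties sharing the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    out = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        mean_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            out[order[k]] = mean_rank
        i = j + 1
    return out

def spearman_rho(x, y):
    """Spearman's rho = Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical paired EFV measurements (mL), manual vs. automated:
manual = [48.0, 55.0, 61.0, 70.0, 95.0]
auto = [45.0, 52.0, 60.0, 66.0, 90.0]
print(round(spearman_rho(manual, auto), 3))  # 1.0 (identical rank order)
```

Because rho is rank-based, a method that systematically under-reads the volume (as the automated approach does here, 60.9 vs. 65.8 mL) can still score near 0.97; that is why the study pairs it with Bland-Altman plots.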

  14. Assessment of three different software systems in the evaluation of dynamic MRI of the breast

    Energy Technology Data Exchange (ETDEWEB)

    Kurz, K.D. [Department of Radiology, Stavanger University Hospital, Postbox 8100, Stavanger (Norway)], E-mail: kurk@sus.no; Steinhaus, D. [Institute of Diagnostic Radiology, Duesseldorf University Hospital, Moorenstr. 5, 40225 Duesseldorf (Germany)], E-mail: Daniele.Steinhaus@med.uni-duesseldorf.de; Klar, V. [Institute of Diagnostic Radiology, Duesseldorf University Hospital, Moorenstr. 5, 40225 Duesseldorf (Germany)], E-mail: verena.klar@uni-duesseldorf.de; Cohnen, M. [Institute of Diagnostic Radiology, Duesseldorf University Hospital, Moorenstr. 5, 40225 Duesseldorf (Germany)], E-mail: cohnen@med.uni-duesseldorf.de; Wittsack, H.J. [Institute of Diagnostic Radiology, Duesseldorf University Hospital, Moorenstr. 5, 40225 Duesseldorf (Germany)], E-mail: wittsack@uni-duesseldorf.de; Saleh, A. [Institute of Diagnostic Radiology, Duesseldorf University Hospital, Moorenstr. 5, 40225 Duesseldorf (Germany)], E-mail: saleh@uni-duesseldorf.de; Moedder, U. [Institute of Diagnostic Radiology, Duesseldorf University Hospital, Moorenstr. 5, 40225 Duesseldorf (Germany)], E-mail: moedder@med.uni-duesseldorf.de; Blondin, D. [Institute of Diagnostic Radiology, Duesseldorf University Hospital, Moorenstr. 5, 40225 Duesseldorf (Germany)], E-mail: blondin@med.uni-duesseldorf.de

    2009-02-15

    Objective: The aim was to compare the diagnostic performance and handling of dynamic contrast-enhanced MRI of the breast with two commercial software solutions ('CADstream' and '3TP') and one self-developed software system ('Mammatool'). Materials and methods: Identical data sets of dynamic breast MRI from 21 patients were evaluated retrospectively with all three software systems. The exams were classified according to the BI-RADS classification. The number of lesions in the parametric mapping was compared to histology or follow-up of more than 2 years. In addition, 25 quality criteria were judged by 3 independent investigators with a score from 0 to 5. Statistical analysis was performed to document the quality ranking of the different software systems. Results: There were 9 invasive carcinomas, one pure DCIS, one papilloma, one radial scar, three histologically proven changes due to mastopathy, one adenosis and two fibroadenomas. Additionally two patients with enhancing parenchyma followed with MRI for more than 3 years and one scar after breast conserving therapy were included. All malignant lesions were classified as BI-RADS 4 or 5 using all software systems and showed significant enhancement in the parametric mapping. 'CADstream' showed the best score on subjective quality criteria. '3TP' showed the lowest number of false-positive results. 'Mammatool' produced the lowest number of benign tissues indicated with parametric overlay. Conclusion: All three software programs tested were adequate for sensitive and efficient assessment of dynamic MRI of the breast. Improvements in specificity may be achievable.

  15. Evaluation of features to support safety and quality in general practice clinical software

    Science.gov (United States)

    2011-01-01

    Background Electronic prescribing is now the norm in many countries. We wished to find out if clinical software systems used by general practitioners in Australia include features (functional capabilities and other characteristics) that facilitate improved patient safety and care, with a focus on quality use of medicines. Methods Seven clinical software systems used in general practice were evaluated. Fifty software features that were previously rated as likely to have a high impact on safety and/or quality of care in general practice were tested and are reported here. Results The range of results for the implementation of 50 features across the 7 clinical software systems was as follows: 17-31 features (34-62%) were fully implemented, 9-13 (18-26%) partially implemented, and 9-20 (18-40%) not implemented. Key findings included: Access to evidence based drug and therapeutic information was limited. Decision support for prescribing was available but varied markedly between systems. During prescribing there was potential for medicine mis-selection in some systems, and linking a medicine with its indication was optional. The definition of 'current medicines' versus 'past medicines' was not always clear. There were limited resources for patients, and some medicines lists for patients were suboptimal. Results were provided to the software vendors, who were keen to improve their systems. Conclusions The clinical systems tested lack some of the features expected to support patient safety and quality of care. Standards and certification for clinical software would ensure that safety features are present and that there is a minimum level of clinical functionality that clinicians could expect to find in any system.

  16. Software-based evaluation of toric IOL orientation in a multicenter clinical study.

    Science.gov (United States)

    Kasthurirangan, Sanjeev; Feuchter, Lucas; Smith, Pamela; Nixon, Donald

    2014-12-01

    To evaluate the rotational stability of a new one-piece hydrophobic acrylic toric intraocular lens (IOL) using custom-developed software for the analysis of slit-lamp photographs. In a prospective, multicenter study, 174 eyes were implanted with the TECNIS Toric IOL (Abbott Medical Optics, Inc., Santa Ana, CA). Custom-developed software was used to analyze high-resolution slit-lamp photographs of 156 eyes taken at day 1 (baseline) and 1, 3, and 6 months postoperatively. The software uses iris and sclera landmarks to align the baseline image and later images for comparison. Validation of the software was performed through repeated analyses of protractor images rotated from 0.1° to 10.0° and randomly selected photographs of 20 eyes. Software validation showed a precision (repeatability plus reproducibility variation) of 0.02° using protractor images and 2.22° using slit-lamp photographs. Good quality slit-lamp images and clear landmarks were necessary for precise measurements. At 6 months, 94.2% of eyes had 5° or less change in IOL orientation versus baseline; only 2 eyes (1.4%) had an axis shift greater than 30°. Most eyes showed 5° or less of rotation between 1 and 3 months (92.9%) and between 3 and 6 months (94.1%). Mean absolute axis change (± standard deviation) from 1 day to 6 months was 2.70° ± 5.51°. The new custom software was precise and quick in analyzing slit-lamp photographs to determine postoperative toric IOL rotation. Copyright 2014, SLACK Incorporated.
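
The software described above aligns baseline and follow-up photographs via iris and scleral landmarks before reading off the IOL axis change. A generic least-squares 2D rotation estimate from paired landmarks, not the study's actual algorithm, can be sketched as:

```python
from math import atan2, degrees, cos, sin, radians

def rotation_angle(baseline, follow_up):
    """Least-squares 2D rotation (degrees) mapping baseline landmarks
    onto follow-up landmarks, after removing the centroid shift."""
    n = len(baseline)
    cbx = sum(p[0] for p in baseline) / n
    cby = sum(p[1] for p in baseline) / n
    cfx = sum(p[0] for p in follow_up) / n
    cfy = sum(p[1] for p in follow_up) / n
    num = den = 0.0
    for (x, y), (u, v) in zip(baseline, follow_up):
        x, y, u, v = x - cbx, y - cby, u - cfx, v - cfy
        num += x * v - y * u   # cross terms -> sine component
        den += x * u + y * v   # dot terms   -> cosine component
    return degrees(atan2(num, den))

# Synthetic check: rotate three landmarks by 2.7 degrees.
theta = radians(2.7)
pts = [(10.0, 0.0), (0.0, 8.0), (-5.0, -4.0)]
rot = [(x * cos(theta) - y * sin(theta), x * sin(theta) + y * cos(theta)) for x, y in pts]
print(round(rotation_angle(pts, rot), 3))  # 2.7
```

Subtracting the centroids first separates translation (head repositioning between visits) from the rotation of interest, which is why stable anatomical landmarks rather than the IOL marks themselves are used for alignment.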

  17. Oak Ridge K-25 Site Technology Logic Diagram. Volume 3, Technology evaluation data sheets; Part A, Characterization, decontamination, dismantlement

    Energy Technology Data Exchange (ETDEWEB)

    Fellows, R.L. [ed.

    1993-02-26

    The Oak Ridge K-25 Technology Logic Diagram (TLD), a decision support tool for the K-25 Site, was developed to provide a planning document that relates environmental restoration and waste management problems at the Oak Ridge K-25 Site to potential technologies that can remediate these problems. The TLD technique identifies the research necessary to develop these technologies to a state that allows for technology transfer and application to waste management, remedial action, and decontamination and decommissioning activities. The TLD consists of four separate volumes: Vol. 1, Vol. 2, Vol. 3A, and Vol. 3B. Volume 1 provides introductory and overview information about the TLD. Volume 2 contains logic diagrams. Volume 3 has been divided into two separate volumes to facilitate handling and use. This report is part A of Volume 3 concerning characterization, decontamination, and dismantlement.

  18. Y-12 Plant Decontamination and Decommissioning Technology Logic Diagram for Building 9201-4. Volume 1: Technology evaluation

    International Nuclear Information System (INIS)

    1994-09-01

    During World War II, the Oak Ridge Y-12 Plant was built as part of the Manhattan Project to supply enriched uranium for weapons production. In 1945, Building 9201-4 (Alpha-4) was originally used to house a uranium isotope separation process based on electromagnetic separation technology. With the startup of the Oak Ridge K-25 Site gaseous diffusion plant in 1947, Alpha-4 was placed on standby. In 1953, the uranium enrichment process was removed, and installation of equipment for the Colex process began. The Colex process--which uses a mercury solvent and lithium hydroxide as the lithium feed material--was shut down in 1962 and drained of process materials. Residual quantities of mercury and lithium hydroxide have remained in the process equipment. Alpha-4 contains more than one-half million ft² of floor area; 15,000 tons of process and electrical equipment; and 23,000 tons of insulation, mortar, brick, flooring, handrails, ducts, utilities, burnables, and sludge. Because much of this equipment and construction material is contaminated with elemental mercury, cleanup is necessary. The goal of the Y-12 Plant Decontamination and Decommissioning Technology Logic Diagram for Building 9201-4 is to provide a planning document that relates decontamination and decommissioning and waste management problems at the Alpha-4 building to the technologies that can be used to remediate these problems. The Y-12 Plant Decontamination and Decommissioning Technology Logic Diagram for Building 9201-4 builds on the methodology transferred by the U.S. Air Force to the Environmental Management organization within DOE and draws from previous technology logic diagram efforts: the logic diagrams for Hanford, the K-25 Site, and ORNL.

  19. Y-12 Plant Decontamination and Decommissioning Technology Logic Diagram for Building 9201-4. Volume 1: Technology evaluation

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-09-01

    During World War II, the Oak Ridge Y-12 Plant was built as part of the Manhattan Project to supply enriched uranium for weapons production. In 1945, Building 9201-4 (Alpha-4) was originally used to house a uranium isotope separation process based on electromagnetic separation technology. With the startup of the Oak Ridge K-25 Site gaseous diffusion plant in 1947, Alpha-4 was placed on standby. In 1953, the uranium enrichment process was removed, and installation of equipment for the Colex process began. The Colex process--which uses a mercury solvent and lithium hydroxide as the lithium feed material--was shut down in 1962 and drained of process materials. Residual quantities of mercury and lithium hydroxide have remained in the process equipment. Alpha-4 contains more than one-half million ft{sup 2} of floor area; 15,000 tons of process and electrical equipment; and 23,000 tons of insulation, mortar, brick, flooring, handrails, ducts, utilities, burnables, and sludge. Because much of this equipment and construction material is contaminated with elemental mercury, cleanup is necessary. The goal of the Y-12 Plant Decontamination and Decommissioning Technology Logic Diagram for Building 9201-4 is to provide a planning document that relates decontamination and decommissioning and waste management problems at the Alpha-4 building to the technologies that can be used to remediate these problems. The Y-12 Plant Decontamination and Decommissioning Technology Logic Diagram for Building 9201-4 builds on the methodology transferred by the U.S. Air Force to the Environmental Management organization within DOE and draws from previous technology logic diagram efforts: the logic diagrams for Hanford, the K-25 Site, and ORNL.

  20. Comparison of SOAP and REST Based Web Services Using Software Evaluation Metrics

    Directory of Open Access Journals (Sweden)

    Tihomirovs Juris

    2016-12-01

    Full Text Available The usage of Web services has recently increased, so it is important to select the right type of Web service at the project design stage. The most common implementations are based on the SOAP (Simple Object Access Protocol) and REST (Representational State Transfer) styles. Maintainability of REST and SOAP Web services has become an important issue as the popularity of Web services grows. Choosing the right approach is not an easy decision, since it is influenced by development requirements and maintenance considerations. In the present research, we present a comparison of SOAP- and REST-based Web services using software evaluation metrics. To achieve this aim, a systematic literature review is carried out to compare REST and SOAP Web services in terms of these metrics.
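
The stylistic difference the record compares can be made concrete by encoding one and the same operation both ways. A sketch with a hypothetical GetQuote operation; the paper's comparison is a metric-based literature review, not a payload measurement:

```python
import json
import xml.etree.ElementTree as ET

# REST style: the operation's input as a plain JSON body.
rest_body = json.dumps({"symbol": "ABC"})

# SOAP style: the same input wrapped in an XML envelope.
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
ET.register_namespace("soap", SOAP_NS)
env = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
op = ET.SubElement(body, "GetQuote")  # hypothetical operation name
ET.SubElement(op, "symbol").text = "ABC"
soap_body = ET.tostring(env, encoding="unicode")

print(len(rest_body), len(soap_body))  # the SOAP envelope carries far more markup
```

The extra envelope markup is not mere overhead: it is the hook for WS-* standards (security, addressing, typed contracts), and trading that machinery against the lighter REST style is exactly the maintainability question the study evaluates.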

  1. Oak Ridge K-25 Site Technology Logic Diagram. Volume 3, Technology evaluation data sheets; Part B, Remedial action, robotics/automation, waste management

    Energy Technology Data Exchange (ETDEWEB)

    Fellows, R.L. [ed.

    1993-02-26

    The Oak Ridge K-25 Technology Logic Diagram (TLD), a decision support tool for the K-25 Site, was developed to provide a planning document that relates environmental restoration (ER) and waste management (WM) problems at the Oak Ridge K-25 Site to potential technologies that can remediate these problems. The TLD technique identifies the research necessary to develop these technologies to a state that allows for technology transfer and application to waste management, remediation, decontamination, and decommissioning activities. The TLD consists of four separate volumes: Vol. 1, Vol. 2, Vol. 3A, and Vol. 3B. Volume 1 provides introductory and overview information about the TLD. Volume 2 contains logic diagrams. Volume 3 has been divided into two separate volumes to facilitate handling and use. This Volume 3B provides the Technology Evaluation Data Sheets (TEDS) for the ER/WM activities (remedial action, robotics/automation, and waste management) that are referenced by a TEDS code number in Vol. 2 of the TLD. Each of these sheets represents a single logic trace across the TLD and contains more detail than the corresponding technology entry in Vol. 2. The TEDS are arranged alphanumerically by the TEDS code number in the upper right corner of each data sheet. Volume 3 can be used in two ways: (1) technologies identified in Vol. 2 can be referenced directly in Vol. 3 by using the TEDS codes, and (2) technologies and general technology areas (alternatives) can be located in the index at the front of this volume.

  2. SU-F-J-72: A Clinical Usable Integrated Contouring Quality Evaluation Software for Radiotherapy

    International Nuclear Information System (INIS)

    Jiang, S; Dolly, S; Cai, B; Mutic, S; Li, H

    2016-01-01

    Purpose: To introduce the Auto Contour Evaluation (ACE) software, a clinically usable, user-friendly, efficient, all-in-one toolbox for automatically identifying common contouring errors in radiotherapy treatment planning using supervised machine learning techniques. Methods: ACE is developed in C# using the Microsoft .Net framework and Windows Presentation Foundation (WPF) for elegant GUI design and smooth GUI transition animations, through the integration of graphics engines and high dots-per-inch (DPI) settings on modern high-resolution monitors. The industry-standard software design pattern Model-View-ViewModel (MVVM) was chosen as the major architecture of ACE for a neat coding structure, deep modularization, easy maintainability and seamless communication with other clinical software. ACE consists of 1) a patient data importing module integrated with the clinical patient database server, 2) a module for simultaneously displaying 2D DICOM images and RT structures, 3) a 3D RT structure visualization module using the Visualization Toolkit (VTK) library and 4) a contour evaluation module using supervised pattern recognition algorithms to detect contouring errors and display detection results. ACE relies on supervised learning algorithms to handle all image processing and data processing jobs. Implementations of the related algorithms are powered by the Accord.Net scientific computing library for better efficiency and effectiveness. Results: ACE can take a patient’s CT images and RT structures from commercial treatment planning software via direct user input or from the patient database. All functionalities, including 2D and 3D image visualization and RT contour error detection, have been demonstrated with real clinical patient cases. Conclusion: ACE implements supervised learning algorithms and combines image processing and graphical visualization modules for RT contour verification. ACE has great potential for automated radiotherapy contouring quality verification.
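
As a rough illustration of the supervised pattern-recognition step (not ACE's actual algorithm or features, which the abstract does not specify), a minimal nearest-centroid classifier over hypothetical contour features might look like:

```python
import math

def train_centroids(samples):
    """Compute per-class mean feature vectors from labelled examples.
    samples: list of (label, feature_vector)."""
    sums, counts = {}, {}
    for label, vec in samples:
        acc = sums.setdefault(label, [0.0] * len(vec))
        for i, v in enumerate(vec):
            acc[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in acc] for label, acc in sums.items()}

def classify(centroids, vec):
    """Assign the class whose centroid is nearest (Euclidean distance)."""
    return min(centroids, key=lambda lab: math.dist(centroids[lab], vec))

# Hypothetical features: (relative volume, surface roughness).
training = [("ok", [1.0, 0.2]), ("ok", [0.9, 0.3]),
            ("error", [0.4, 0.8]), ("error", [0.5, 0.9])]
centroids = train_centroids(training)
print(classify(centroids, [0.45, 0.85]))  # → error
```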

  4. Carotid artery stenosis: Performance of advanced vessel analysis software in evaluating CTA

    International Nuclear Information System (INIS)

    Tsiflikas, Ilias; Biermann, Christina; Thomas, Christoph; Ketelsen, Dominik; Claussen, Claus D.; Heuschmid, Martin

    2012-01-01

    Objectives: The aim of this study was to evaluate the time efficiency and diagnostic reproducibility of an advanced vessel analysis software application for the diagnosis of carotid artery stenosis. Material and methods: 40 patients with suspected carotid artery stenosis received head and neck DE-CTA as part of their pre-interventional workup. Acquired data were evaluated by 2 independent radiologists. Stenosis grading was performed by MPR eyeballing with freely adjustable MPRs and with a preliminary prototype of the meanwhile available client-server advanced visualization software syngo.via CT Vascular (Siemens Healthcare, Erlangen, Germany). Stenoses were graded according to the following 5 categories: I: 0%, II: 1–50%, III: 51–69%, IV: 70–99% and V: total occlusion. Furthermore, time to diagnosis for each carotid artery was recorded. Results: Both readers achieved very good specificity values and good to very good sensitivity values, without significant differences between the two reading methods. Furthermore, there was very good correlation between both readers for both reading methods, without significant differences (kappa values: standard image interpretation k = 0.809; advanced vessel analysis software k = 0.863). Using the advanced vessel analysis software resulted in a significant time saving (p < 0.0001) for both readers. Time to diagnosis was decreased by approximately 55%. Conclusions: The advanced vessel analysis application CT Vascular of the new imaging software syngo.via (Siemens Healthcare, Forchheim, Germany) provides a high rate of reproducibility in the assessment of carotid artery stenosis. Furthermore, a significant time saving in comparison to standard image interpretation is achievable.

  6. Evaluation of the BreastSimulator software platform for breast tomography

    Science.gov (United States)

    Mettivier, G.; Bliznakova, K.; Sechopoulos, I.; Boone, J. M.; Di Lillo, F.; Sarno, A.; Castriconi, R.; Russo, P.

    2017-08-01

    The aim of this work was the evaluation of the software BreastSimulator, a breast x-ray imaging simulation software, as a tool for the creation of 3D uncompressed breast digital models and for the simulation and the optimization of computed tomography (CT) scanners dedicated to the breast. Eight 3D digital breast phantoms were created with glandular fractions in the range 10%-35%. The models are characterised by different sizes and modelled realistic anatomical features. X-ray CT projections were simulated for a dedicated cone-beam CT scanner and reconstructed with the FDK algorithm. X-ray projection images were simulated for 5 mono-energetic (27, 32, 35, 43 and 51 keV) and 3 poly-energetic x-ray spectra typically employed in current CT scanners dedicated to the breast (49, 60, or 80 kVp). Clinical CT images acquired from two different clinical breast CT scanners were used for comparison purposes. The quantitative evaluation included calculation of the power-law exponent, β, from simulated and real breast tomograms, based on the power spectrum fitted with a function of the spatial frequency, f, of the form S(f) = α/f^β. The breast models were validated by comparison against clinical breast CT and published data. We found that the calculated β coefficients were close to that of clinical CT data from a dedicated breast CT scanner and reported data in the literature. In evaluating the software package BreastSimulator to generate breast models suitable for use with breast CT imaging, we found that the breast phantoms produced with the software tool can reproduce the anatomical structure of real breasts, as evaluated by calculating the β exponent from the power spectral analysis of simulated images. As such, this research tool might contribute considerably to the further development, testing and optimisation of breast CT imaging techniques.
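
The β estimation described here amounts to a linear fit in log-log space, since log S = log α − β log f. A minimal sketch with synthetic data (the α and β values are illustrative, not from the study):

```python
import numpy as np

def fit_power_law_exponent(freqs, power):
    """Fit S(f) = alpha / f**beta by linear regression in log-log space.

    The slope of the fitted line is -beta; the intercept is log(alpha).
    """
    slope, intercept = np.polyfit(np.log(freqs), np.log(power), 1)
    return float(np.exp(intercept)), float(-slope)

# Synthetic spectrum with known alpha = 2.0, beta = 2.8 (values chosen
# purely for illustration).
f = np.linspace(0.1, 10.0, 200)
S = 2.0 / f**2.8
alpha, beta = fit_power_law_exponent(f, S)
print(round(alpha, 3), round(beta, 3))  # → 2.0 2.8
```

On real tomograms the power spectrum would first be estimated from the image (e.g. via a radially averaged FFT) before this fitting step.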

  7. Software V ampersand V methods for digital plant protection system

    International Nuclear Information System (INIS)

    Kim, Hung-Jun; Han, Jai-Bok; Chun, Chong-Son; Kim, Sung; Kim, Kern-Joong.

    1997-01-01

    Careful thought must be given to software design in the development of digital-based systems that play a critical role in the successful operation of nuclear power plants. To evaluate software verification and validation methods, as well as to verify system performance capabilities for the upgraded instrumentation and control systems in future Korean nuclear power plants, a prototype Digital Plant Protection System (DPPS) based on a Programmable Logic Controller (PLC) has been constructed. The system design description and features are briefly presented, and the software design and the software verification and validation methods are the focus. 6 refs., 2 figs

  8. Quantifiers for quantum logic

    OpenAIRE

    Heunen, Chris

    2008-01-01

    We consider categorical logic on the category of Hilbert spaces. More generally, in fact, any pre-Hilbert category suffices. We characterise closed subobjects, and prove that they form orthomodular lattices. This shows that quantum logic is just an incarnation of categorical logic, enabling us to establish an existential quantifier for quantum logic, and conclude that there cannot be a universal quantifier.

  9. Evaluation of Solid Rocket Motor Component Data Using a Commercially Available Statistical Software Package

    Science.gov (United States)

    Stefanski, Philip L.

    2015-01-01

    Commercially available software packages today allow users to quickly perform the routine evaluations of (1) descriptive statistics to numerically and graphically summarize both sample and population data, (2) inferential statistics that draw conclusions about a given population from samples taken of it, (3) probability determinations that can be used to generate estimates of reliability allowables, and finally (4) the setup of designed experiments and analysis of their data to identify significant material and process characteristics for application in both product manufacturing and performance enhancement. This paper presents examples of analysis and experimental design work that has been conducted using Statgraphics® statistical software to obtain useful information with regard to solid rocket motor propellants and internal insulation material. Data were obtained from a number of programs (Shuttle, Constellation, and Space Launch System) and sources that include solid propellant burn rate strands, tensile specimens, sub-scale test motors, full-scale operational motors, rubber insulation specimens, and sub-scale rubber insulation analog samples. Besides facilitating the experimental design process to yield meaningful results, statistical software has demonstrated its ability to quickly perform complex data analyses and yield significant findings that might otherwise have gone unnoticed. One caveat to these successes is that useful results not only derive from the inherent power of the software package, but also from the skill and understanding of the data analyst.
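
The first two routine evaluations, descriptive statistics and a confidence interval on the mean, can be reproduced in a few lines of a general-purpose language. The burn-rate values below are illustrative, not data from the cited programs:

```python
import math
import statistics

# Hypothetical burn-rate measurements (in/s) from propellant strand tests.
burn_rates = [0.368, 0.372, 0.359, 0.364, 0.370, 0.361, 0.366, 0.373]

# Descriptive statistics summarize the sample itself.
mean = statistics.mean(burn_rates)
stdev = statistics.stdev(burn_rates)

# Inferential statistics draw a conclusion about the population: a 95%
# confidence interval on the mean (t critical value for 7 degrees of
# freedom is 2.365).
t_crit = 2.365
half_width = t_crit * stdev / math.sqrt(len(burn_rates))
ci = (mean - half_width, mean + half_width)
```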

  10. Evaluation of PMBOK and scrum practices for software development in the vision of specialists

    Directory of Open Access Journals (Sweden)

    Elton Fernandes Gonçalves

    2017-07-01

    Full Text Available In project management, the challenge for software development is to achieve success in the proposed projects, using methods such as PMBOK and Scrum. Knowledge of the advantages of these methods is a critical success factor for product development. Therefore, the purpose of this study was to verify the perception of specialists in the area of software development regarding project management practices. The methods used in this study were bibliographic, exploratory and qualitative research, with the construction of a questionnaire with 14 items on the advantages of project management practices of various natures, sizes and complexities, which was applied to 90 specialists. The results of the research demonstrated that all the experts agreed with the advantages of the project management practices for software development, identified on the basis of the literature review, thus validating the proposed items of the questionnaire. For future research, case studies are recommended that explore practical models for evaluating the use of the practices studied in the scope of software development. It is important that these future studies derive metrics and indicators for each of the advantages cited in the present study.

  11. The contribution of instrumentation and control software to system reliability

    International Nuclear Information System (INIS)

    Fryer, M.O.

    1984-01-01

    Advanced instrumentation and control systems are usually implemented using computers that monitor the instrumentation and issue commands to control elements. The control commands are based on instrument readings and software control logic. The reliability of the total system will be affected by the software design. When comparing software designs, an evaluation of how each design can contribute to the reliability of the system is desirable. Unfortunately, the science of reliability assessment of combined hardware and software systems is in its infancy. Reliability assessment of combined hardware/software systems is often based on over-simplified assumptions about software behavior. A new method of reliability assessment of combined software/hardware systems is presented. The method is based on a procedure called fault tree analysis which determines how component failures can contribute to system failure. Fault tree analysis is a well developed method for reliability assessment of hardware systems and produces quantitative estimates of failure probability based on component failure rates. It is shown how software control logic can be mapped into a fault tree that depicts both software and hardware contributions to system failure. The new method is important because it provides a way for quantitatively evaluating the reliability contribution of software designs. In many applications, this can help guide designers in producing safer and more reliable systems. An application to the nuclear power research industry is discussed
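
The core of the proposed method, propagating component failure probabilities through the AND/OR gates of a fault tree, can be sketched as follows (the event probabilities are hypothetical):

```python
def fault_tree_probability(gate):
    """Evaluate a fault tree of independent basic events.

    A gate is either a float (basic-event failure probability) or a
    tuple ('AND' | 'OR', [children]). For an OR gate over independent
    events, P = 1 - prod(1 - p_i); for an AND gate, P = prod(p_i).
    """
    if isinstance(gate, float):
        return gate
    kind, children = gate
    probs = [fault_tree_probability(c) for c in children]
    p = 1.0
    if kind == "AND":
        for q in probs:
            p *= q
        return p
    for q in probs:          # OR gate
        p *= 1.0 - q
    return 1.0 - p

# Hypothetical system: failure if the sensor fails OR both the software
# check and the hardware interlock fail together.
tree = ("OR", [0.001, ("AND", [0.01, 0.02])])
print(round(fault_tree_probability(tree), 7))  # → 0.0011998
```

Mapping software control logic into such a tree, as the abstract describes, means expressing each software failure mode as a basic event or sub-gate alongside the hardware events.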

  12. Verification and validation process for the safety software in KNICS

    International Nuclear Information System (INIS)

    Kwon, Kee-Choon; Lee, Jang-Soo; Kim, Jang-Yeol

    2004-01-01

    This paper describes the Verification and Validation (V and V) process for the safety software of the Programmable Logic Controller (PLC), Digital Reactor Protection System (DRPS), and Engineered Safety Feature-Component Control System (ESF-CCS) that are being developed in the Korea Nuclear Instrumentation and Control System (KNICS) projects. Specifically, it presents DRPS V and V experience according to the software development life cycle. The main activities of the DRPS V and V process are preparation of software planning documentation; verification of the Software Requirement Specification (SRS), Software Design Specification (SDS) and codes; and testing of the integrated software and the integrated system. In addition, they include software safety analysis and software configuration management. The SRS V and V activities of the DRPS are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparation of the integrated system test plan, software safety analysis, and software configuration management. Likewise, the SDS V and V activities of the DRPS are technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, preparation of the integrated software test plan, software safety analysis, and software configuration management. The code V and V activities of the DRPS are traceability analysis, source code inspection, test case and test procedure generation, software safety analysis, and software configuration management. Testing is the major V and V activity of the software integration and system integration phases. Software safety analysis at the SRS phase uses the Hazard and Operability (HAZOP) method; at the SDS phase it uses HAZOP and Fault Tree Analysis (FTA); and at the implementation phase it uses FTA. Finally, software configuration management is performed using the Nu-SCM (Nuclear Software Configuration Management) tool developed by the KNICS project. Through these activities, we believe we can achieve the functionality, performance, reliability and safety that are V
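
The traceability analysis mentioned among the V and V activities can be illustrated with a minimal sketch: every requirement should trace to a design element, and every design element to a test case. The identifiers below are hypothetical, not from KNICS documents:

```python
def traceability_gaps(requirements, design_trace, test_trace):
    """Return requirements with no design element and design items with no test.

    design_trace maps requirement id -> design ids; test_trace maps
    design id -> test case ids.
    """
    untraced_reqs = [r for r in requirements if not design_trace.get(r)]
    design_items = {d for ds in design_trace.values() for d in ds}
    untested_designs = sorted(d for d in design_items if not test_trace.get(d))
    return untraced_reqs, untested_designs

reqs = ["SRS-1", "SRS-2", "SRS-3"]
d_trace = {"SRS-1": ["SDS-10"], "SRS-2": ["SDS-11", "SDS-12"]}
t_trace = {"SDS-10": ["TC-100"], "SDS-12": ["TC-101"]}
gaps = traceability_gaps(reqs, d_trace, t_trace)
print(gaps)  # → (['SRS-3'], ['SDS-11'])
```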

  13. Application software profiles 2010

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    2010-04-15

    This article presented information on new software applications designed to facilitate petroleum exploration, drilling and production activities. Computer modelling and analysis enables oil and gas producers to characterize reservoirs, estimate reserves, forecast production, plan operations and manage assets. Seven Calgary-based organizations were highlighted along with their sophisticated software tools, the applications and the new features available in each product. The geoSCOUT version 7.7 by GeoLOGIC Systems Ltd. integrates public and proprietary data on wells, well logs, reserves, pipelines, production, ownership and seismic location data. The Value Navigator and AFE Navigator by Energy Navigator provide control over reserves, production and cash flow forecasting. FAST Harmony, FAST Evolution, FAST CBM, FAST FieldNotes, FAST Piper, FAST RTA, FAST VirtuWell and FAST WellTest by Fekete Associates Inc. provide reserve evaluations for reservoir engineering projects and production data analysis. The esi.manage software program by 3esi improves business results for upstream oil and gas companies through enhanced decision making and workforce effectiveness. WELLFLO, PIPEFLO, FORGAS, OLGA, Drillbench, and MEPO wellbore solutions by Neotec provide unique platforms for flow simulation to optimize oil and gas production systems. Petrel, ECLIPSE, Avocet, PipeSim and Merak software tools by Schlumberger Information Solutions are petroleum systems modelling tools for geologic mapping, visualization modelling and reservoir engineering. StudioSL by Streamsim Technologies Inc. is a modelling tool for optimizing flood management. figs.

  14. [Evaluation of pressure ulcers area using the softwares Motic and AutoCAD®].

    Science.gov (United States)

    Reis, Camila Letícia Dias dos; Cavalcante, Janaína Mortosa; Rocha Júnior, Edvar Ferreira da; Neves, Rinaldo Souza; Santana, Levy Aniceto; Guadagnin, Renato da Veiga; Brasil, Lourdes Mattos

    2012-01-01

    Pressure ulcer is a lesion that affects skin layers in some regions of the body and its healing can be followed up using image processing. The analysis of pressure ulcer area is relevant to evaluate its evolution and response to therapeutic procedures. Such areas can be evaluated through contour marking with the softwares Motic and AutoCAD®. In this study 35 volunteers computed areas from two grade III pressure ulcers using these instruments. It was possible to conclude that results are clinically equivalent and so can be considered to follow up healing evolution from pressure ulcers.
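
Computing the area enclosed by a marked wound contour, as done with the Motic and AutoCAD® tools, reduces to the shoelace formula over the contour vertices. A minimal sketch:

```python
def polygon_area(contour):
    """Shoelace formula: area enclosed by a marked contour.

    contour is a list of (x, y) vertices in order (e.g. in centimetres).
    """
    n = len(contour)
    twice_area = 0.0
    for i in range(n):
        x1, y1 = contour[i]
        x2, y2 = contour[(i + 1) % n]
        twice_area += x1 * y2 - x2 * y1
    return abs(twice_area) / 2.0

# A 3 cm x 2 cm rectangle has area 6 cm^2.
area = polygon_area([(0, 0), (3, 0), (3, 2), (0, 2)])
print(area)  # → 6.0
```

In practice the vertices would come from the traced contour, with pixel coordinates scaled to physical units first.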

  15. Use of VAP3D software in the construction of pathological anthropomorphic phantoms for dosimetric evaluations

    International Nuclear Information System (INIS)

    Lima, Lindeval Fernandes de; Lima, Fernando R.A.

    2011-01-01

    This paper presents a new type of dosimetric evaluation, using a pathological voxel phantom (a phantom representing a sick person). The software VAP3D (Visualization and Analysis of Phantoms 3D) was used to introduce three-dimensional regions simulating tumors into a healthy phantom (a phantom representing a healthy person). The Monte Carlo EGSnrc code was used to simulate X-ray photon transport and its interaction with matter, and to evaluate the absorbed dose in organs and tissues of the thorax region of the healthy phantom and its pathological version. This is a computational model of a typical exposure for planning treatments in radiodiagnostics.
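
The central operation described, inserting a three-dimensional tumor region into a healthy voxel phantom, can be sketched as follows (the array shape, tissue labels and geometry are illustrative, not VAP3D's actual data model):

```python
import numpy as np

def insert_spherical_lesion(phantom, centre, radius, lesion_id):
    """Overwrite voxels inside a sphere with a lesion label.

    phantom is a 3D integer array of tissue IDs; centre and radius are
    in voxel units. Returns the number of voxels changed.
    """
    z, y, x = np.indices(phantom.shape)
    cz, cy, cx = centre
    inside = (z - cz) ** 2 + (y - cy) ** 2 + (x - cx) ** 2 <= radius ** 2
    count = int(inside.sum())
    phantom[inside] = lesion_id
    return count

# Start from an all-background "healthy" phantom and add one lesion.
healthy = np.zeros((20, 20, 20), dtype=np.int16)
changed = insert_spherical_lesion(healthy, centre=(10, 10, 10), radius=3, lesion_id=9)
```

The modified tissue-ID array would then be handed to the Monte Carlo transport code, which maps each ID to a material composition.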

  16. Metamathematics of fuzzy logic

    CERN Document Server

    Hájek, Petr

    1998-01-01

    This book presents a systematic treatment of deductive aspects and structures of fuzzy logic understood as many valued logic sui generis. Some important systems of real-valued propositional and predicate calculus are defined and investigated. The aim is to show that fuzzy logic as a logic of imprecise (vague) propositions does have well-developed formal foundations and that most things usually named `fuzzy inference' can be naturally understood as logical deduction.
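
As a concrete taste of real-valued propositional calculus, the Łukasiewicz conjunction (a t-norm) and its residuum (the corresponding implication), one of the systems treated in this tradition, evaluate truth degrees in [0, 1]:

```python
def lukasiewicz_and(x, y):
    """Łukasiewicz t-norm: a standard real-valued conjunction."""
    return max(0.0, x + y - 1.0)

def lukasiewicz_implies(x, y):
    """Residuum of the t-norm: the matching fuzzy implication."""
    return min(1.0, 1.0 - x + y)

# Truth degrees are reals in [0, 1]; classical logic is the 0/1 special case.
conj = lukasiewicz_and(0.7, 0.6)
impl = lukasiewicz_implies(0.7, 0.6)
print(round(conj, 2), round(impl, 2))  # → 0.3 0.9
```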

  17. What are Institutional Logics

    OpenAIRE

    Berg Johansen, Christina; Bock Waldorff, Susanne

    2015-01-01

    This study presents new insights into the explanatory power of the institutional logics perspective. Taking as its outset a discussion of seminal theory texts, we identify two fundamental topics that frame institutional logics: overarching institutional orders guided by institutional logics, as well as change and agency generated by friction between logics. We use these topics as a basis for an analysis of selected empirical papers, with the aim of understanding how institutional logics contribute to...

  18. Integrated software system for seismic evaluation of nuclear power plant structures

    International Nuclear Information System (INIS)

    Xu, J.; Graves, H.L.

    1993-01-01

    The computer software CARES (Computer Analysis for Rapid Evaluation of Structures) was developed by the Brookhaven National Laboratory for the U.S. Nuclear Regulatory Commission. It represents an effort to utilize established numerical methodologies commonly employed by industry for structural safety evaluations of nuclear power plant facilities and incorporates them into an integrated computer software package operated on personal computers. CARES was developed with the objective of including all aspects of seismic performance evaluation of nuclear power structures. It can be used to evaluate the validity and accuracy of analysis methodologies used for structural safety evaluations of nuclear power plants by various utilities. CARES has a modular format, each module performing a specific type of analysis. The seismic module integrates all the steps of a complete seismic analysis into a single package with many user-friendly features such as interactiveness and quick turnaround. Linear structural theory and pseudo-linear convolution theory are utilized as the bases for the development with a special emphasis on the nuclear regulatory requirements for structural safety of nuclear plants. The organization of the seismic module is arranged in eight options, each performing a specific step of the analysis with most of input/output interfacing processed by the general manager. Finally, CARES provides comprehensive post-processing capability for displaying results graphically or in tabular form so that direct comparisons can be easily made. (author)

  19. Evaluation of the finite element software ABAQUS for biomechanical modelling of biphasic tissues.

    Science.gov (United States)

    Wu, J Z; Herzog, W; Epstein, M

    1998-02-01

    The biphasic cartilage model proposed by Mow et al. (1980) has proven successful to capture the essential mechanical features of articular cartilage. In order to analyse the joint contact mechanics in real, anatomical joints, the cartilage model needs to be implemented into a suitable finite element code to approximate the irregular surface geometries of such joints. However, systematic and extensive evaluation of the capacity of commercial software for modelling the contact mechanics with biphasic cartilage layers has not been made. This research was aimed at evaluating the commercial finite element software ABAQUS for analysing biphasic soft tissues. The solutions obtained using ABAQUS were compared with those obtained using other finite element models and analytical solutions for three numerical tests: an unconfined indentation test, a test with the contact of a spherical cartilage surface with a rigid plate, and an axi-symmetric joint contact test. It was concluded that the biphasic cartilage model can be implemented into the commercial finite element software ABAQUS to analyse practical joint contact problems with biphasic articular cartilage layers.

  20. Classical Logic and Quantum Logic with Multiple and Common Lattice Models

    Directory of Open Access Journals (Sweden)

    Mladen Pavičić

    2016-01-01

    Full Text Available We consider a proper propositional quantum logic and show that it has multiple disjoint lattice models, only one of which is an orthomodular lattice (the algebra underlying Hilbert (quantum) space). We give an equivalent proof for classical logic, which turns out to have disjoint distributive and nondistributive ortholattice models. In particular, we prove that both classical logic and quantum logic are sound and complete with respect to each of these lattices. We also show that there is one common nonorthomodular lattice that is a model of both quantum and classical logic. In technical terms, this enables us to run the same classical logic on both a digital (standard, two-subset, 0-1-bit) computer and a nondigital (say, a six-subset) computer (with appropriate chips and circuits). With quantum logic, the same six-element common lattice can serve as a benchmark for an efficient evaluation of equations of bigger lattice models or theorems of the logic.
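
The hexagon ortholattice O6 is a standard example of a six-element nonorthomodular lattice of the kind discussed here. A small sketch can verify mechanically that the orthomodular law x ≤ y ⟹ y = x ∨ (x′ ∧ y) fails in it:

```python
from itertools import product

# The hexagon ("benzene ring") ortholattice O6: 0 <= a <= b <= 1 and
# 0 <= b' <= a' <= 1, with orthocomplement ': a <-> a', b <-> b', 0 <-> 1.
ELEMENTS = ["0", "a", "b", "b'", "a'", "1"]
LEQ = {(x, "1") for x in ELEMENTS} | {("0", x) for x in ELEMENTS}
LEQ |= {(x, x) for x in ELEMENTS} | {("a", "b"), ("b'", "a'")}
COMP = {"0": "1", "1": "0", "a": "a'", "a'": "a", "b": "b'", "b'": "b"}

def leq(x, y):
    return (x, y) in LEQ

def meet(x, y):
    # Greatest lower bound: the lower bound above all other lower bounds.
    lower = [z for z in ELEMENTS if leq(z, x) and leq(z, y)]
    return max(lower, key=lambda z: sum(leq(w, z) for w in lower))

def join(x, y):
    # Least upper bound: the upper bound below all other upper bounds.
    upper = [z for z in ELEMENTS if leq(x, z) and leq(y, z)]
    return max(upper, key=lambda z: sum(leq(z, w) for w in upper))

def orthomodular_law_holds(x, y):
    """x <= y implies y = x v (x' ^ y)."""
    return not leq(x, y) or join(x, meet(COMP[x], y)) == y

failures = [(x, y) for x, y in product(ELEMENTS, ELEMENTS)
            if not orthomodular_law_holds(x, y)]
print(failures)  # O6 violates orthomodularity, e.g. at (a, b)
```

This kind of brute-force lattice evaluation is exactly how a small common lattice can serve as a benchmark for testing candidate equations before attacking bigger models.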

  1. Digital System Reliability Test for the Evaluation of safety Critical Software of Digital Reactor Protection System

    Directory of Open Access Journals (Sweden)

    Hyun-Kook Shin

    2006-08-01

    Full Text Available A new Digital Reactor Protection System (DRPS) based on a VME-bus Single Board Computer has been developed by KOPEC to prevent software Common Mode Failure (CMF) inside the digital system. The new DRPS has been proved to be an effective digital safety system for preventing CMF by a Defense-in-Depth and Diversity (DID&D) analysis. However, for practical use in nuclear power plants, performance tests and reliability tests are essential for digital system qualification. In this study, a single channel of the DRPS prototype has been manufactured for the evaluation of DRPS capabilities. Integrated functional tests were performed and the system reliability was analyzed and tested. The results of the reliability test show that the application software of the DRPS has very high reliability compared with analog reactor protection systems.

  2. [Evaluation of Web-based software applications for administrating and organising an ophthalmological clinical trial site].

    Science.gov (United States)

    Kortüm, K; Reznicek, L; Leicht, S; Ulbig, M; Wolf, A

    2013-07-01

    The importance and complexity of clinical trials are continuously increasing, especially in innovative specialties like ophthalmology. Therefore, an efficient clinical trial site organisational structure is essential. In modern internet times, this can be accomplished by web-based applications. In total, 3 software applications (Vibe on Prem, SharePoint and an open source application) were evaluated at a clinical trial site in ophthalmology. Assessment criteria were defined: reliability, ease of administration, usability, scheduling, task list, knowledge management, operating costs and worldwide availability. Vibe on Prem, customised by the local university, met the assessment criteria best. The other applications were not as strong. By introducing a web-based application for administrating and organising an ophthalmological trial site, studies can be conducted in a more efficient and reliable manner. Georg Thieme Verlag KG Stuttgart · New York.

  3. Towards a Framework for the Evaluation Design of Enterprise Social Software

    DEFF Research Database (Denmark)

    Herzog, Christian; Richter, Alexander; Steinhüser, Melanie

    2015-01-01

    While the use of Enterprise Social Software (ESS) increases, reports from science and practice show that evaluating its impact remains a major challenge. Various interests and points of view make each ESS evaluation an individual matter and lead to diverse requirements. In this paper, we propose a design theory that highlights the various design options and ensures completeness and consistency. Based on a comprehensive literature analysis, as well as an interview study with 31 ESS experts from 29 companies, we suggest a conceptual framework intended as decision support for the ESS evaluation design for different stakeholders. Beyond providing an orientation, the framework also reveals six evaluation classes that represent typical application instantiations and can be understood as principles of implementation. A first validation in five organizations confirms that the framework can lead...

  4. Logic Model Checking of Unintended Acceleration Claims in Toyota Vehicles

    Science.gov (United States)

    Gamble, Ed

    2012-01-01

    Part of the US Department of Transportation investigation of Toyota sudden unintended acceleration (SUA) involved analysis of the throttle control software. The JPL Laboratory for Reliable Software applied several techniques, including static analysis and logic model checking, to the software. A handful of logic models were built and some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses.

  5. Connections among quantum logics

    International Nuclear Information System (INIS)

    Lock, P.F.; Hardegree, G.M.

    1985-01-01

    In this paper, a theory of quantum logics is proposed which is general enough to enable us to re-examine previous work on quantum logics in the context of this theory. It is then easy to assess the differences between the various systems studied. The quantum logical systems which are incorporated are divided into two groups, which we call ''quantum propositional logics'' and ''quantum event logics''. The work of Kochen and Specker (partial Boolean algebras) is included, and so is that of Greechie and Gudder (orthomodular partially ordered sets), Domotar (quantum mechanical systems), and Foulis and Randall (operational logics), in quantum propositional logics; and Abbott (semi-Boolean algebras) and Foulis and Randall (manuals) in quantum event logics. In this part of the paper, an axiom system for quantum propositional logics is developed and the above structures are examined in the context of this system. (author)

  6. Designing an educational software for teaching and evaluation of radiology course in dentistry

    Directory of Open Access Journals (Sweden)

    Fatemeh Ezoddini Ardakani

    2009-08-01

    Full Text Available Background and purpose: The radiology course has several parts in the dentistry curriculum of Iranian medical universities. The third part of this course involves the diagnosis of facial and jaw bone lesions. In this study an educational software package was designed, using the Access database software, to teach this course to dentistry students; after one semester its efficacy was assessed by surveying the students' opinions of it. Methods: The third part of the radiology course was taught to 32 students in two parts. In the first part (before the mid-semester exam) the traditional teaching method was used, and in the second part (after the mid-semester exam) the designed software was used for teaching and evaluating the students. After each part the students' opinions about the course were surveyed using a standard questionnaire. The students in the first and second parts of the semester were considered the control and case groups, respectively. Results: Most of the students (90.6%) believed that the software was useful in education and helped them to learn the subject. In addition, 84.4% of the students believed that the software can evaluate the clinical skills of students in detecting radiological lesions and that the program can save their studying time. Overall, the marks of students in the case group were significantly higher than those in the control group. The overall satisfaction of 78.1% of the students with the program was good, while 9.3% did not feel positively about it and 12.5% did not express an opinion. Conclusion: This study shows the importance of using computers for educational purposes, especially in courses with a large amount of material to be memorized by students. Computer science can help students memorize the different signs and symptoms of many disorders; in addition, it can help teachers evaluate the students at the end of the course. Key words: Computer, Software, Education, Radiology, Dentistry.

  7. Application of configurable logic in nuclear fuel handling

    International Nuclear Information System (INIS)

    Ernst, W.H.; Rayment, D.J.

    1996-01-01

    Control and protective systems operating in the older nuclear power stations are nearing the end of their reliable operating life. These systems are still subject to frequent logic changes. Testing the software logic changes is becoming a significant task with ever greater expense. The software based systems can be replaced with systems using configurable logic. These systems provide new, more reliable technology, offer the capability for change, and provide capability for complete logic simulation and test before installation. There is a base of operating experience with these devices and many potential applications where they can be used to advantage. (author). 5 refs

  8. Application of configurable logic in nuclear fuel handling

    Energy Technology Data Exchange (ETDEWEB)

    Ernst, W H; Rayment, D J [Canadian General Electric Co. Ltd., Peterborough, ON (Canada)

    1997-12-31

    Control and protective systems operating in the older nuclear power stations are nearing the end of their reliable operating life. These systems are still subject to frequent logic changes. Testing the software logic changes is becoming a significant task with ever greater expense. The software based systems can be replaced with systems using configurable logic. These systems provide new, more reliable technology, offer the capability for change, and provide capability for complete logic simulation and test before installation. There is a base of operating experience with these devices and many potential applications where they can be used to advantage. (author). 5 refs.

  9. Structural Logical Relations

    DEFF Research Database (Denmark)

    Schürmann, Carsten; Sarnat, Jeffrey

    2008-01-01

    Tait's method (a.k.a. proof by logical relations) is a powerful proof technique frequently used for showing foundational properties of languages based on typed lambda-calculi. Historically, these proofs have been extremely difficult to formalize in proof assistants with weak meta-logics......, such as Twelf, and yet they are often straightforward in proof assistants with stronger meta-logics. In this paper, we propose structural logical relations as a technique for conducting these proofs in systems with limited meta-logical strength by explicitly representing and reasoning about an auxiliary logic...

  10. What are Institutional Logics

    DEFF Research Database (Denmark)

    Berg Johansen, Christina; Waldorff, Susanne Boch

    This study presents new insights into the explanatory power of the institutional logics perspective. Taking its outset in a discussion of seminal theory texts, we identify two fundamental topics that frame institutional logics: overarching institutional orders guided by institutional logics, as well...... as change and agency generated by friction between logics. We use these topics as the basis for an analysis of selected empirical papers, with the aim of understanding how institutional logics contribute to institutional theory at large, and which social matters institutional logics can and cannot explore...

  11. Indeterministic Temporal Logic

    Directory of Open Access Journals (Sweden)

    Trzęsicki Kazimierz

    2015-09-01

    Full Text Available The questions of determinism, causality, and freedom have been the main philosophical problems debated since the beginning of temporal logic. The issue of the logical value of sentences about the future was stated by Aristotle in the famous passage on tomorrow's sea-battle. The question inspired Łukasiewicz's idea of many-valued logics and was a motive of A. N. Prior's considerations about the logic of tenses. Within temporal logic there are different solutions to the problem. In the paper we consider indeterministic temporal logic based on the idea of temporal worlds and the relation of accessibility between them.

  12. Evaluation of components and systems of RSG-GAS and developing of maintenance evaluation software

    International Nuclear Information System (INIS)

    Pane, Jupiter-Sitorus; Kadarusmanto

    2003-01-01

    In order to enhance the evaluation of components and systems, an inventory of failure and repair data has been compiled. The evaluation was done by calculating the failure distribution function, failure rate, MTTF, MTTR, and unavailability. The evaluation is implemented in the SYSCOM-RSG-GAS computer program, which can perform statistical analysis, compute reliability parameters, and carry out Fault Tree Analysis. This paper describes some results of the program
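
    As a rough illustration of the reliability parameters this record lists (MTTF, MTTR, failure rate and unavailability), the following sketch computes them from hypothetical failure/repair records; it is not the SYSCOM-RSG-GAS code itself, and the times are made up.

    ```python
    def reliability_parameters(times_to_failure, times_to_repair):
        """Return MTTF, MTTR, failure rate and steady-state unavailability.

        Times are in hours; the failure rate assumes exponentially
        distributed times to failure (constant-hazard approximation).
        """
        mttf = sum(times_to_failure) / len(times_to_failure)
        mttr = sum(times_to_repair) / len(times_to_repair)
        failure_rate = 1.0 / mttf                  # per hour
        unavailability = mttr / (mttf + mttr)      # steady-state fraction of downtime
        return mttf, mttr, failure_rate, unavailability

    # hypothetical component history: three failures and three repairs
    mttf, mttr, lam, u = reliability_parameters(
        times_to_failure=[900.0, 1100.0, 1000.0],
        times_to_repair=[8.0, 12.0, 10.0],
    )
    # mttf = 1000.0 h, mttr = 10.0 h, lam = 0.001 /h, u ≈ 0.0099
    ```
    
    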

  13. EVALUATION OF PATCHY ATROPHY SECONDARY TO HIGH MYOPIA BY SEMIAUTOMATED SOFTWARE FOR FUNDUS AUTOFLUORESCENCE ANALYSIS.

    Science.gov (United States)

    Miere, Alexandra; Capuano, Vittorio; Serra, Rita; Jung, Camille; Souied, Eric; Querques, Giuseppe

    2017-05-31

    To evaluate the progression of patchy atrophy in high myopia using semiautomated software for fundus autofluorescence (FAF) analysis. The medical records and multimodal imaging of 21 consecutive highly myopic patients with macular chorioretinal patchy atrophy (PA) were retrospectively analyzed. All patients underwent repeated fundus autofluorescence and spectral domain optical coherence tomography over at least 12 months. Color fundus photography was also performed in a subset of patients. Total atrophy area was measured on FAF images using the Region Finder semiautomated software embedded in Spectralis (Heidelberg Engineering, Heidelberg, Germany) at baseline and during follow-up visits. Region Finder was compared with manually measured PA on FAF images. Twenty-two eyes of 21 patients (14 women, 7 men; mean age 62.8 ± 13.0 years, range 32-84 years) were included. Mean PA area using Region Finder was 2.77 ± 2.91 (SD) mm² at baseline, 3.12 ± 2.68 mm² at Month 6, 3.43 ± 2.68 mm² at Month 12, and 3.73 ± 2.74 mm² at Month 18 (overall P …). Fundus autofluorescence analysis by the Region Finder semiautomated software provides accurate measurements of lesion area and allows us to quantify the progression of PA in high myopia. In our series, PA enlarged significantly over at least 12 months, and its progression seemed to be related to the lesion size at baseline.

  14. The Performance Evaluation of Multi-Image 3d Reconstruction Software with Different Sensors

    Science.gov (United States)

    Mousavi, V.; Khosravi, M.; Ahmadi, M.; Noori, N.; Naveh, A. Hosseini; Varshosaz, M.

    2015-12-01

    Today, multi-image 3D reconstruction is an active research field, and generating three-dimensional models of objects is one of the most discussed issues in photogrammetry and computer vision; it can be accomplished using range-based or image-based methods. The very accurate and dense point clouds generated by range-based methods, such as structured light systems and laser scanners, have established them as reliable tools in industry. Image-based 3D digitization methodologies offer the option of reconstructing an object from a set of unordered images that depict it from different viewpoints. As their hardware requirements are narrowed down to a digital camera and a computer system, they constitute an attractive 3D digitization approach: although range-based methods are generally very accurate, image-based methods are low-cost and can easily be used by non-professional users. One of the factors affecting the accuracy of the obtained model in image-based methods is the software and algorithm used to generate the three-dimensional model. These algorithms are provided in the form of commercial software, open source software and web-based services. Another important factor in the accuracy of the obtained model is the type of sensor used. The availability of mobile sensors to the public, the popularity of professional sensors and the advent of stereo sensors make a comparison of these three sensor types an effective step in evaluating and finding an optimized method for generating three-dimensional models. Much research has been carried out to identify suitable software and algorithms for achieving an accurate and complete model, but little attention has been paid to the type of sensor used and its effect on the quality of the final model. The purpose of this paper is to examine and introduce an appropriate combination of sensor and software that provides a complete model with the highest accuracy. 
To do this, different software, used in previous studies, were compared and

  15. Enhancing programming logic thinking using analogy mapping

    Science.gov (United States)

    Sukamto, R. A.; Megasari, R.

    2018-05-01

    Programming logic thinking is the most important competence for computer science students. However, programming is one of the most difficult subjects in computer science programs. This paper reports our work on enhancing students' programming logic thinking using Analogy Mapping in a basic programming course. Analogy Mapping is a computer application which converts source code into analogy images. This research used a time-series evaluation, and the results showed that Analogy Mapping can enhance students' programming logic thinking.

  16. Practical Findings from Applying the PSD Model for Evaluating Software Design Specifications

    Science.gov (United States)

    Räisänen, Teppo; Lehto, Tuomas; Oinas-Kukkonen, Harri

    This paper presents practical findings from applying the PSD model to evaluating the support for persuasive features in software design specifications for a mobile Internet device. On the one hand, our experiences suggest that the PSD model fits relatively well for evaluating design specifications. On the other hand, the model would benefit from more specific heuristics for evaluating each technique to avoid unnecessary subjectivity. Better distinction between the design principles in the social support category would also make the model easier to use. Practitioners who have no theoretical background can apply the PSD model to increase the persuasiveness of the systems they design. The greatest benefit of the PSD model for researchers designing new systems may be achieved when it is applied together with a sound theory, such as the Elaboration Likelihood Model. Using the ELM together with the PSD model, one may increase the chances for attitude change.

  17. Safety evaluation for communication network software modifications of PCS in Ulchin NPP unit 3

    International Nuclear Information System (INIS)

    Ji, S. H.; Koh, J. S.; Kim, B. R.; Oh, S. H.

    1999-01-01

    On February 2, 1999, an incident occurred at Ulchin Nuclear Power Plant Unit 3 which resulted in the corruption of data on the Perform Net of the Plant Control System (PCS). The incident was caused by the ASIC (Application Specific Integrated Circuit) chip on the Rehostable Module, which is part of the Network Interface Module. Regarding this incident, we required that the utility propose new algorithms to detect hardware failures of the ASIC chip, and we evaluated the appropriateness of the network software modifications. As a result of this evaluation process, we required that the safety-related interlock signals carried over the data communication path be hardwired, to compensate for the vulnerability of the system architecture. In this paper, we discuss the system architecture of the PCS, the fault analysis, and the evaluation findings

  18. Clinical evaluation of monitor unit software and the application of action levels

    International Nuclear Information System (INIS)

    Georg, Dietmar; Nyholm, Tufve; Olofsson, Joergen; Kjaer-Kristoffersen, Flemming; Schnekenburger, Bruno; Winkler, Peter; Nystroem, Hakan; Ahnesjoe, Anders; Karlsson, Mikael

    2007-01-01

    Purpose: The aim of this study was the clinical evaluation of an independent dose and monitor unit verification (MUV) software which is based on sophisticated semi-analytical modelling. The software was developed within the framework of an ESTRO project. Finally, consistent handling of dose calculation deviations applying individual action levels is discussed. Materials and methods: A Matlab-based software ('MUV') was distributed to five well-established treatment centres in Europe (Vienna, Graz, Basel, Copenhagen, and Umea) and evaluated as a quality assurance (QA) tool in clinical routine. Results were acquired for 226 individual treatment plans including a total of 815 radiation fields. About 150 beam verification measurements were performed for a portion of the individual treatment plans, mainly with time variable fluence patterns. The deviations between dose calculations performed with a treatment planning system (TPS) and the MUV software were scored with respect to treatment area, treatment technique, geometrical depth, radiological depth, etc. Results: In general good agreement was found between calculations performed with the different TPSs and MUV, with a mean deviation per field of 0.2 ± 3.5% (1 SD) and mean deviations of 0.2 ± 2.2% for composite treatment plans. For pelvic treatments less than 10% of all fields showed deviations larger than 3%. In general, when using the radiological depth for verification calculations the results and the spread in the results improved significantly, especially for head-and-neck and for thorax treatments. For IMRT head-and-neck beams, mean deviations between MUV and the local TPS were -1.0 ± 7.3% for dynamic, and -1.3 ± 3.2% for step-and-shoot IMRT delivery. For dynamic IMRT beams in the pelvis good agreement was obtained between MUV and the local TPS (mean: -1.6 ± 1.5%). Treatment site and treatment technique dependent action levels between ±3% and ±5% seem to be clinically realistic if a radiological depth

  19. Self-assembling software generator

    Science.gov (United States)

    Bouchard, Ann M [Albuquerque, NM; Osbourn, Gordon C [Albuquerque, NM

    2011-11-25

    A technique to generate an executable task includes inspecting a task specification data structure to determine what software entities are to be generated to create the executable task, inspecting the task specification data structure to determine how the software entities will be linked after generating the software entities, inspecting the task specification data structure to determine logic to be executed by the software entities, and generating the software entities to create the executable task.
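
    The three inspection steps this record enumerates (which entities to generate, how to link them, and what logic each executes) can be sketched roughly as follows; the `spec` structure and `Entity` class are illustrative assumptions, not the patented implementation.

    ```python
    # Hypothetical sketch: a task-specification structure is inspected to
    # decide (1) which software entities to generate, (2) how they are
    # linked, and (3) the logic each entity executes.

    class Entity:
        def __init__(self, name, logic):
            self.name, self.logic, self.next = name, logic, None

        def run(self, value):
            value = self.logic(value)                     # execute this entity's logic
            return self.next.run(value) if self.next else value

    def generate_task(spec):
        # step 1: generate the entities named in the specification
        entities = {name: Entity(name, logic) for name, logic in spec["entities"].items()}
        # step 2: link them as the specification dictates
        for src, dst in spec["links"]:
            entities[src].next = entities[dst]
        # the executable task is the entry-point entity
        return entities[spec["start"]]

    spec = {
        "entities": {"double": lambda x: 2 * x, "inc": lambda x: x + 1},
        "links": [("double", "inc")],
        "start": "double",
    }
    task = generate_task(spec)
    # task.run(5) -> 11  (5 doubled to 10, then incremented)
    ```
    
    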

  20. Quantum Logic as a Dynamic Logic

    NARCIS (Netherlands)

    Baltag, A.; Smets, S.

    We address the old question whether a logical understanding of Quantum Mechanics requires abandoning some of the principles of classical logic. Against Putnam and others (Among whom we may count or not E. W. Beth, depending on how we interpret some of his statements), our answer is a clear “no”.

  1. Quantum logic as a dynamic logic

    NARCIS (Netherlands)

    Baltag, Alexandru; Smets, Sonja

    We address the old question whether a logical understanding of Quantum Mechanics requires abandoning some of the principles of classical logic. Against Putnam and others (Among whom we may count or not E. W. Beth, depending on how we interpret some of his statements), our answer is a clear "no".

  2. Transforming equality logic to propositional logic

    NARCIS (Netherlands)

    Zantema, H.; Groote, J.F.

    2003-01-01

    We investigate and compare various ways of transforming equality formulas to propositional formulas, in order to solve satisfiability in equality logic by means of satisfiability in propositional logic. We propose equality substitution as a new approach combining desirable
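
    A standard illustration of this kind of reduction (sketched below; this is not the authors' equality-substitution method) abstracts each equality atom x=y to a Boolean variable and adds transitivity constraints so the propositional encoding does not admit assignments no equality interpretation could produce. Satisfiability is checked here by brute-force enumeration for clarity.

    ```python
    from itertools import combinations, product

    def equalities_satisfiable(variables, formula):
        """Check satisfiability of a formula over equality atoms.

        `formula` is a function of a dict mapping frozenset({x, y}) -> bool,
        one Boolean variable per equality atom x=y.
        """
        atoms = [frozenset(p) for p in combinations(variables, 2)]
        for values in product([False, True], repeat=len(atoms)):
            e = dict(zip(atoms, values))
            # transitivity: x=y and y=z imply x=z (the Boolean abstraction
            # alone would lose this, so it is added as a constraint)
            consistent = all(
                not (e[frozenset({x, y})] and e[frozenset({y, z})])
                or e[frozenset({x, z})]
                for x, y, z in product(variables, repeat=3)
                if len({x, y, z}) == 3
            )
            if consistent and formula(e):
                return True
        return False

    # x=y and y=z together with x!=z is unsatisfiable once transitivity
    # is enforced, while x=y with y!=z is satisfiable.
    ```
    
    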

  3. A European Commission software tool for radon risk calculation and evaluation of countermeasures

    International Nuclear Information System (INIS)

    Degrange, J.P.; Levy, F.P.; Birchall, A.; Haylock, R.; Marsh, J.; Muirhead, C.; Janssens, A.

    2000-01-01

    The effects of exposure to radon on workers and members of the public have been examined for many years. Recent advances have been made in evaluating the risk associated with radon exposure and in implementing remediation programmes in dwellings. However, decisions about whether to implement countermeasures to reduce radon exposures may benefit from an enhanced capability to evaluate and understand the associated health risk. In this context, the European Commission has launched a project to develop a user friendly software package based upon current information on radon epidemiology, radon dosimetry, demography, and countermeasure efficiency. The software has been designed to perform lung cancer risk calculations specific to European populations for various exposure profiles and to evaluate, in terms of risk reduction, the efficiency of various countermeasures in dwellings. This paper presents an overview of the general structure of the software and outlines its most important modelling approaches. The software is able to evaluate the risk of fatal lung cancer associated with individual or collective exposure to radon. The individual risk calculation module determines, for an individual of given age, sex and tobacco consumption, the excess risk for a given exposure (past or future exposure) time-profile, on the basis of demographic data (WHO, 1996 and 1998) specific to the selected population. The collective risk calculation module determines the excess risk for each occupant of a dwelling, for an exposure time-profile derived from data describing the dwelling, its occupancy, and the chosen countermeasures. These calculations may be done using one of two classical approaches: 1) The epidemiological approach in which the risk is obtained directly from the radon exposure data using a risk model (BEIR IV, 1988; BEIR VI, 1999) derived from epidemiological studies of uranium miners; 2) The dosimetric approach in which the risk is derived from the radon exposure data

  4. A European Commission software tool for radon risk calculation and evaluation of countermeasures

    Energy Technology Data Exchange (ETDEWEB)

    Degrange, J.P.; Levy, F.P. [CEPN, Fontenay-aux-Roses Cedex (France); Birchall, A.; Haylock, R.; Marsh, J.; Muirhead, C. [National Radiological Protection Board, Chilton (United Kingdom); Janssens, A. [European Commission, DG XI (Luxembourg)

    2000-05-01

    The effects of exposure to radon on workers and members of the public have been examined for many years. Recent advances have been made in evaluating the risk associated with radon exposure and in implementing remediation programmes in dwellings. However, decisions about whether to implement countermeasures to reduce radon exposures may benefit from an enhanced capability to evaluate and understand the associated health risk. In this context, the European Commission has launched a project to develop a user friendly software package based upon current information on radon epidemiology, radon dosimetry, demography, and countermeasure efficiency. The software has been designed to perform lung cancer risk calculations specific to European populations for various exposure profiles and to evaluate, in terms of risk reduction, the efficiency of various countermeasures in dwellings. This paper presents an overview of the general structure of the software and outlines its most important modelling approaches. The software is able to evaluate the risk of fatal lung cancer associated with individual or collective exposure to radon. The individual risk calculation module determines, for an individual of given age, sex and tobacco consumption, the excess risk for a given exposure (past or future exposure) time-profile, on the basis of demographic data (WHO, 1996 and 1998) specific to the selected population. The collective risk calculation module determines the excess risk for each occupant of a dwelling, for an exposure time-profile derived from data describing the dwelling, its occupancy, and the chosen countermeasures. These calculations may be done using one of two classical approaches: 1) The epidemiological approach in which the risk is obtained directly from the radon exposure data using a risk model (BEIR IV, 1988; BEIR VI, 1999) derived from epidemiological studies of uranium miners; 2) The dosimetric approach in which the risk is derived from the radon exposure data

  5. LOGIC SIMULATION OF LIFE SUPPORT SYSTEM COMPONENT IN REAL TIME

    Directory of Open Access Journals (Sweden)

    A. S. Marchenko

    2016-01-01

    Full Text Available The article proposes the use of simulation methods for evaluating the effectiveness of stepped fan motor speed control that keeps the air flow volume within the set boundaries of the «fan-filter» system. A detailed algorithm of the program was built with the AnyLogic software package. The possibility of using the proposed method in the design of ventilation systems is analyzed. The proposed method allows one to determine, at the design stage, the maximum replacement intervals for the system's filter elements, as well as to predict when to switch the fan motor speed. Using the technique makes it possible to dispense with complex air flow control systems and to maximize the life of the filter element set. Methods of modeling logical processes allow one to reduce construction costs and improve the energy efficiency of buildings. 

  6. Product-oriented Software Certification Process for Software Synthesis

    Science.gov (United States)

    Nelson, Stacy; Fischer, Bernd; Denney, Ewen; Schumann, Johann; Richardson, Julian; Oh, Phil

    2004-01-01

    The purpose of this document is to propose a product-oriented software certification process to facilitate use of software synthesis and formal methods. Why is such a process needed? Currently, software is tested until deemed bug-free rather than proving that certain software properties exist. This approach has worked well in most cases, but unfortunately, deaths still occur due to software failure. Using formal methods (techniques from logic and discrete mathematics like set theory, automata theory and formal logic as opposed to continuous mathematics like calculus) and software synthesis, it is possible to reduce this risk by proving certain software properties. Additionally, software synthesis makes it possible to automate some phases of the traditional software development life cycle resulting in a more streamlined and accurate development process.

  7. Many-valued logics

    CERN Document Server

    Bolc, Leonard

    1992-01-01

    Many-valued logics were developed as an attempt to handle philosophical doubts about the "law of excluded middle" in classical logic. The first many-valued formal systems were developed by J. Lukasiewicz in Poland and E. Post in the U.S.A. in the 1920s, and since then the field has expanded dramatically as the applicability of the systems to other philosophical and semantic problems was recognized. Intuitionistic logic, for example, arose from deep problems in the foundations of mathematics. Fuzzy logics, approximation logics, and probability logics all address questions that classical logic alone cannot answer. All these interpretations of many-valued calculi motivate specific formal systems that allow detailed mathematical treatment. In this volume, the authors are concerned with finite-valued logics, and especially with three-valued logical calculi. Matrix constructions, axiomatizations of propositional and predicate calculi, syntax, semantic structures, and methodology are discussed. Separate chapters deal w...

  8. Against Logical Form

    Directory of Open Access Journals (Sweden)

    P N Johnson-Laird

    2010-10-01

    Full Text Available An old view in logic going back to Aristotle is that an inference is valid in virtue of its logical form. Many psychologists have adopted the same point of view about human reasoning: the first step is to recover the logical form of an inference, and the second step is to apply rules of inference that match these forms in order to prove that the conclusion follows from the premises. The present paper argues against this idea. The logical form of an inference transcends the grammatical forms of the sentences used to express it, because logical form also depends on context. Context is not readily expressed in additional premises. And the recovery of logical form leads ineluctably to the need for infinitely many axioms to capture the logical properties of relations. An alternative theory is that reasoning depends on mental models, and this theory obviates the need to recover logical form.

  9. Logic an introductory course

    CERN Document Server

    Newton-Smith, WH

    2003-01-01

    A complete introduction to logic for first-year university students with no background in logic, philosophy or mathematics. In easily understood steps it shows the mechanics of the formal analysis of arguments.

  10. Anticoincidence logic using PALs

    International Nuclear Information System (INIS)

    Bolanos, L.; Arista Romeu, E.

    1997-01-01

    This paper describes the functioning principle of an anticoincidence logic circuit and a design based on programmable logic. The circuit was included in the discriminator of an instrument for single-photon absorptiometry
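
    The anticoincidence principle itself can be illustrated with a truth-table sketch (hypothetical; the abstract does not give the actual PAL equations): a detector pulse is accepted only when the veto channel is quiet.

    ```python
    def anticoincidence(detector: bool, veto: bool) -> bool:
        """Accept a detector pulse only if the veto channel did not fire."""
        return detector and not veto

    # full truth table: only the (pulse, no-veto) case is accepted
    table = [(d, v, anticoincidence(d, v))
             for d in (False, True) for v in (False, True)]
    ```
    
    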

  11. Evaluation of glomerular filtration rate of 99mTc-DTPA using PIP software

    International Nuclear Information System (INIS)

    Opazo, C.; Troncoso, M.; Gutierrez, E.; Guerrero, B.; Mena, J.

    2002-01-01

    Aim: Our purpose is to compare measurements of glomerular filtration rate obtained from the DTPA renogram (DTPA-GFR) using the PIP software with those obtained from 24-hour creatinine clearance (CC), in order to evaluate the results provided by the procedure and the software we used. The need for this method, well known since the early eighties, arose from the practical difficulty of obtaining an accurate CC in the pediatric population, especially in outpatients, as well as from the fact that no radiation, time, morbidity or discomfort is added to the renogram. Methods: In a prospective study running from September 2001 up to now, 18 patients aged 1 to 18 years underwent DTPA renograms. DTPA-GFR was calculated from the renogram on a computer with the PIP software system, developed for the IAEA to be attached to analogue gamma cameras. The procedure involves a 30-minute DTPA renogram, measuring the activity of the full and empty DTPA syringe, entering the patient's height and weight, and drawing ROIs around the kidneys with background ROIs below and lateral to both kidneys. The results are provided automatically by the software using a kidney uptake index with the Gates method, and are expressed in ml/min for both kidneys together and for each kidney separately. No blood samples were used. All patients had CC measured within 48 hours of the renogram, using a 24-hour urine collection and the serum creatinine level. We made sure patients were well hydrated orally before starting the renogram acquisition. Results: The mean DTPA-GFR was 81.6 ml/min (22.5-153.6). The mean CC was 78.8 ml/min (14.8-132). The comparison between DTPA-GFR and CC measurements showed an acceptable R² coefficient (0.9228) and a slope close to the identity line (0.9504). The intercept was 6.75 ml/min and the T value was 0.2983. Conclusion: We have found an acceptable correlation between DTPA-GFR and CC with the results obtained up to now. DTPA-GFR is a very easy procedure that adds no extra time or cost to the renogram. The information provided can be useful to be considered by the
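
    The agreement statistics quoted in this record (slope, intercept, R²) are those of an ordinary least-squares fit between paired measurements. The sketch below computes them for illustration; the data values are made up, not the study's.

    ```python
    def linear_fit(xs, ys):
        """Ordinary least-squares fit ys ~ slope * xs + intercept; returns R² too."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxx = sum((x - mx) ** 2 for x in xs)
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        slope = sxy / sxx
        intercept = my - slope * mx
        ss_res = sum((y - (slope * x + intercept)) ** 2 for x, y in zip(xs, ys))
        ss_tot = sum((y - my) ** 2 for y in ys)
        r2 = 1 - ss_res / ss_tot
        return slope, intercept, r2

    # perfectly agreeing paired measurements lie on the identity line:
    slope, intercept, r2 = linear_fit([20.0, 60.0, 100.0], [20.0, 60.0, 100.0])
    # slope = 1.0, intercept = 0.0, r2 = 1.0
    ```
    
    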

  12. A real time fuzzy logic power management strategy for a fuel cell vehicle

    International Nuclear Information System (INIS)

    Hemi, Hanane; Ghouili, Jamel; Cheriti, Ahmed

    2014-01-01

    Highlights: • We present a real time fuzzy logic power management strategy. • This strategy is applied to a hybrid electric vehicle dynamic model. • Three configurations are evaluated during a drive cycle. • The hydrogen consumption is analysed for the three configurations. - Abstract: This paper presents a real time fuzzy logic controller (FLC) approach used to design a power management strategy for a hybrid electric vehicle and to protect the battery from overcharging during repetitive braking energy accumulation. The fuel cell (FC) and the battery (B)/supercapacitor (SC) are the primary and secondary power sources, respectively. The paper analyzes and evaluates the performance of the three configurations, FC/B, FC/SC and FC/B/SC, under real time driving conditions and an unknown driving cycle. The MATLAB/Simulink and SimPowerSystems software packages are used to model the electrical and mechanical elements of the hybrid vehicle and to implement the fuzzy logic strategy
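
    As a minimal sketch of the kind of fuzzy rule evaluation such a power management strategy uses, the snippet below combines made-up triangular membership functions and singleton outputs into a fuel-cell power share; the paper's actual membership functions and rule base are not given here, so every set and constant is an illustrative assumption.

    ```python
    def tri(x, a, b, c):
        """Triangular membership function rising from a, peaking at b, falling to c."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def fuel_cell_share(demand_kw, soc):
        """Fuzzy rule: low battery SOC OR high power demand -> lean on the fuel cell.

        Hypothetical membership sets; defuzzification is a weighted average
        of two singleton outputs (0.9 share if the rule fires, 0.3 otherwise).
        """
        low_soc = tri(soc, -0.01, 0.0, 0.6)          # fully "low" at SOC = 0
        high_demand = tri(demand_kw, 20.0, 60.0, 60.01)
        w = max(low_soc, high_demand)                # fuzzy OR as max
        return w * 0.9 + (1 - w) * 0.3
    ```

    For example, with a full battery the share rises smoothly from 0.3 at low demand to 0.9 at 60 kW, which is the graceful interpolation that distinguishes a fuzzy rule from a hard threshold.
    
    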

  13. Connections among quantum logics

    International Nuclear Information System (INIS)

    Lock, P.F.; Hardegree, G.M.

    1985-01-01

    This paper gives a brief introduction to the major areas of work in quantum event logics: manuals (Foulis and Randall) and semi-Boolean algebras (Abbott). The two theories are compared, and the connection between quantum event logics and quantum propositional logics is made explicit. In addition, the work on manuals provides us with many examples of results stated in Part I. (author)

  14. Equational type logic

    NARCIS (Netherlands)

    Manca, V.; Salibra, A.; Scollo, Giuseppe

    1990-01-01

    Equational type logic is an extension of (conditional) equational logic, that enables one to deal in a single, unified framework with diverse phenomena such as partiality, type polymorphism and dependent types. In this logic, terms may denote types as well as elements, and atomic formulae are either

  15. Concurrent weighted logic

    DEFF Research Database (Denmark)

    Xue, Bingtian; Larsen, Kim Guldstrand; Mardare, Radu Iulian

    2015-01-01

    We introduce Concurrent Weighted Logic (CWL), a multimodal logic for concurrent labeled weighted transition systems (LWSs). The synchronization of LWSs is described using dedicated functions that, in various concurrency paradigms, allow us to encode the compositionality of LWSs. To reflect these......-completeness results for this logic. To complete these proofs we involve advanced topological techniques from Model Theory....

  16. Real Islamic Logic

    NARCIS (Netherlands)

    Bergstra, J.A.

    2011-01-01

    Four options for assigning a meaning to Islamic Logic are surveyed including a new proposal for an option named "Real Islamic Logic" (RIL). That approach to Islamic Logic should serve modern Islamic objectives in a way comparable to the functionality of Islamic Finance. The prospective role of RIL

  17. Evaluation and selection of security products for authentication of computer software

    Science.gov (United States)

    Roenigk, Mark W.

    2000-04-01

    Software piracy is estimated to cost software companies over eleven billion dollars per year in lost revenue worldwide. Over fifty-three percent of all intellectual property in the form of software is pirated on a global basis. Software piracy has a dramatic effect on the employment figures for the information industry as well. In the US alone, over 130,000 jobs are lost annually as a result of software piracy.

  18. Fracture toughness evaluation using circumferential notched tensile specimens by the tensile test and ANSYS software

    Energy Technology Data Exchange (ETDEWEB)

    Meydanlik, N. [Mechanical Engineering Department, Trakya University, Edirne (Turkey)

    2013-07-01

    Fracture toughness (K{sub Ic}) is the most important parameter defining the mechanical behaviour of materials used in machine design. Since fracture tests are both difficult and time-consuming, researchers have long been investigating easier ways of evaluating K{sub Ic}. In this work, K{sub Ic} values have been obtained using ANSYS software, based on the experimental values evaluated in previous studies. It was shown that there is no significant difference between the experimental values and the ones obtained with ANSYS. This procedure can provide an important advantage in obtaining K{sub Ic} values. Key words: Fracture toughness (K{sub Ic}), circumferential notched tensile specimens, ANSYS.

  19. Evaluation of atlas-based auto-segmentation software in prostate cancer patients

    International Nuclear Information System (INIS)

    Greenham, Stuart; Dean, Jenna; Fu, Cheuk Kuen Kenneth; Goman, Joanne; Mulligan, Jeremy; Tune, Deanna; Sampson, David; Westhuyzen, Justin; McKay, Michael

    2014-01-01

    The performance and limitations of an atlas-based auto-segmentation software package (ABAS; Elekta Inc.) was evaluated using male pelvic anatomy as the area of interest. Contours from 10 prostate patients were selected to create atlases in ABAS. The contoured regions of interest were created manually to align with published guidelines and included the prostate, bladder, rectum, femoral heads and external patient contour. Twenty-four clinically treated prostate patients were auto-contoured using a randomised selection of two, four, six, eight or ten atlases. The concordance between the manually drawn and computer-generated contours was evaluated statistically using Pearson's product–moment correlation coefficient (r) and clinically in a validated qualitative evaluation. In the latter evaluation, six radiation therapists classified the degree of agreement for each structure using seven clinically appropriate categories. The ABAS software generated clinically acceptable contours for the bladder, rectum, femoral heads and external patient contour. For these structures, ABAS-generated volumes were highly correlated with ‘as treated’ volumes, manually drawn; for four atlases, for example, bladder r = 0.988 (P < 0.001), rectum r = 0.739 (P < 0.001) and left femoral head r = 0.560 (P < 0.001). The poorest results were seen for the prostate (r = 0.401, P < 0.05) (four atlases); however, this was attributed to the comparison prostate volume being contoured on magnetic resonance imaging (MRI) rather than computed tomography (CT) data. For all structures, increasing the number of atlases did not consistently improve accuracy. ABAS-generated contours are clinically useful for a range of structures in the male pelvis. Clinically appropriate volumes were created, but editing of some contours was inevitably required. The ideal number of atlases to improve generated automatic contours is yet to be determined

  20. Abductive Logic Grammars

    DEFF Research Database (Denmark)

    Christiansen, Henning; Dahl, Veronica

    2009-01-01

    By extending logic grammars with constraint logic, we give them the ability to create knowledge bases that represent the meaning of an input string. Semantic information is thus defined through extra-grammatical means, and a sentence's meaning logically follows as a by-product of string rewriting....... We formalize these ideas, and exemplify them both within and outside first-order logic, and for both fixed and dynamic knowledge bases. Within the latter variety, we consider the usual left-to-right derivations that are traditional in logic grammars, but also -- in a significant departure from...

  1. Product Lukasiewicz Logic

    Czech Academy of Sciences Publication Activity Database

    Horčík, Rostislav; Cintula, Petr

    2004-01-01

    Roč. 43, - (2004), s. 477-503 ISSN 1432-0665 R&D Projects: GA AV ČR IAA1030004; GA ČR GA201/02/1540 Grant - others:GA CTU(CZ) project 0208613; net CEEPUS(SK) SK-042 Institutional research plan: CEZ:AV0Z1030915 Keywords : fuzzy logic * many-valued logic * Lukasiewicz logic * Lpi logic * Takeuti-Titani logic * MV-algebras * product MV-algebras Subject RIV: BA - General Mathematics Impact factor: 0.295, year: 2004

  2. Evaluation of software based redundancy algorithms for the EOS storage system at CERN

    International Nuclear Information System (INIS)

    Peters, Andreas-Joachim; Sindrilaru, Elvin Alin; Zigann, Philipp

    2012-01-01

    EOS is a new disk based storage system used in production at CERN since autumn 2011. It is implemented using the plug-in architecture of the XRootD software framework and allows remote file access via XRootD protocol or POSIX-like file access via FUSE mounting. EOS was designed to fulfill specific requirements of disk storage scalability and IO scheduling performance for LHC analysis use cases. This is achieved by following a strategy of decoupling disk and tape storage as individual storage systems. A key point of the EOS design is to provide high availability and redundancy of files via a software implementation which uses disk-only storage systems without hardware RAID arrays. All this is aimed at reducing the overall cost of the system and also simplifying the operational procedures. This paper presents the advantages and disadvantages of redundancy by hardware (most classical storage installations) in comparison to redundancy by software. The latter is implemented in the EOS system and achieves its goal by spawning data and parity stripes via remote file access over nodes. The gain in redundancy and reliability comes with a trade-off in the following areas: • Increased complexity of the network connectivity • CPU intensive parity computations during file creation and recovery • Performance loss through remote disk coupling An evaluation and performance figures of several redundancy algorithms are presented for dual parity RAID and Reed-Solomon codecs. Moreover, the characteristics and applicability of these algorithms are discussed in the context of reliable data storage systems.
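    The trade-off between hardware RAID and software redundancy is easiest to see with the simplest scheme, single XOR parity across data stripes. This sketch is illustrative only; EOS's actual codecs discussed in the paper include dual parity and Reed-Solomon:

    ```python
    def make_parity(stripes):
        """XOR parity block over equal-length data stripes (single parity)."""
        parity = bytearray(len(stripes[0]))
        for stripe in stripes:
            for i, byte in enumerate(stripe):
                parity[i] ^= byte
        return bytes(parity)

    def recover(stripes, parity, lost_index):
        """Rebuild one lost stripe from the survivors plus the parity block.
        XORing all survivors with the parity cancels them out, leaving the
        lost stripe (the CPU cost mentioned in the abstract lives here)."""
        survivors = [s for i, s in enumerate(stripes) if i != lost_index]
        return make_parity(survivors + [parity])

    # Hypothetical 8-byte stripes spread over three storage nodes
    data = [b"node-A..", b"node-B..", b"node-C.."]
    p = make_parity(data)
    ```

    With single parity one lost stripe is recoverable; tolerating two failures requires a second, independent parity (dual parity) or a Reed-Solomon code.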

  3. Generic physical protection logic trees

    International Nuclear Information System (INIS)

    Paulus, W.K.

    1981-10-01

    Generic physical protection logic trees, designed for application to nuclear facilities and materials, are presented together with a method of qualitative evaluation of the trees for design and analysis of physical protection systems. One or more defense zones are defined where adversaries interact with the physical protection system. Logic trees that are needed to describe the possible scenarios within a defense zone are selected. Elements of a postulated or existing physical protection system are tagged to the primary events of the logic tree. The likelihood of adversary success in overcoming these elements is evaluated on a binary, yes/no basis. The effect of these evaluations is propagated through the logic of each tree to determine whether the adversary is likely to accomplish the end event of the tree. The physical protection system must be highly likely to overcome the adversary before he accomplishes his objective. The evaluation must be conducted for all significant states of the site. Deficiencies uncovered become inputs to redesign and further analysis, closing the loop on the design/analysis cycle
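    The propagation of binary yes/no element evaluations through the logic of a tree, as described above, can be sketched as follows; the tree structure and element names are hypothetical:

    ```python
    def evaluate(node, assessments):
        """Propagate binary yes/no leaf evaluations up an AND/OR logic tree.
        `node` is ('AND', children), ('OR', children), or a leaf element name;
        `assessments` records, per element, whether the adversary is likely
        to overcome it (True = adversary succeeds)."""
        if isinstance(node, str):
            return assessments[node]
        op, children = node
        results = [evaluate(child, assessments) for child in children]
        return all(results) if op == 'AND' else any(results)

    # Hypothetical scenario: the adversary must breach the fence AND
    # (defeat the sensor OR defeat the guard) to reach the end event.
    tree = ('AND', ['breach_fence', ('OR', ['defeat_sensor', 'defeat_guard'])])
    ```

    If the end event evaluates True for any significant site state, the elements tagged to the failing branch become inputs to redesign, closing the design/analysis loop.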

  4. Generic physical protection logic trees

    Energy Technology Data Exchange (ETDEWEB)

    Paulus, W.K.

    1981-10-01

    Generic physical protection logic trees, designed for application to nuclear facilities and materials, are presented together with a method of qualitative evaluation of the trees for design and analysis of physical protection systems. One or more defense zones are defined where adversaries interact with the physical protection system. Logic trees that are needed to describe the possible scenarios within a defense zone are selected. Elements of a postulated or existing physical protection system are tagged to the primary events of the logic tree. The likelihood of adversary success in overcoming these elements is evaluated on a binary, yes/no basis. The effect of these evaluations is propagated through the logic of each tree to determine whether the adversary is likely to accomplish the end event of the tree. The physical protection system must be highly likely to overcome the adversary before he accomplishes his objective. The evaluation must be conducted for all significant states of the site. Deficiencies uncovered become inputs to redesign and further analysis, closing the loop on the design/analysis cycle.

  5. Henkin and Hybrid Logic

    DEFF Research Database (Denmark)

    Blackburn, Patrick Rowan; Huertas, Antonia; Manzano, Maria

    2014-01-01

    Leon Henkin was not a modal logician, but there is a branch of modal logic that has been deeply influenced by his work. That branch is hybrid logic, a family of logics that extend orthodox modal logic with special proposition symbols (called nominals) that name worlds. This paper explains why...... Henkin’s techniques are so important in hybrid logic. We do so by proving a completeness result for a hybrid type theory called HTT, probably the strongest hybrid logic that has yet been explored. Our completeness result builds on earlier work with a system called BHTT, or basic hybrid type theory...... is due to the first-order perspective, which lies at the heart of Henkin’s best known work and hybrid logic....

  6. Logic and Ontology

    Directory of Open Access Journals (Sweden)

    Newton C. A. da Costa

    2002-12-01

    In view of the present state of development of non-classical logic, especially of paraconsistent logic, a new stand regarding the relations between logic and ontology is defended. In a parody of a dictum of Quine, my stand may be summarized as follows: to be is to be the value of a variable in a specific language with a given underlying logic. Yet my stand differs from Quine’s because, among other reasons, I accept some first-order heterodox logics as genuine alternatives to classical logic. I also discuss some questions of non-classical logic to substantiate my argument, and suggest that my position complements and extends some ideas advanced by L. Apostel.

  7. Institutional Logics in Action

    DEFF Research Database (Denmark)

    Lounsbury, Michael; Boxenbaum, Eva

    2013-01-01

    This double volume presents state-of-the-art research and thinking on the dynamics of actors and institutional logics. In the introduction, we briefly sketch the roots and branches of institutional logics scholarship before turning to the new buds of research on the topic of how actors engage...... institutional logics in the course of their organizational practice. We introduce an exciting line of new works on the meta-theoretical foundations of logics, institutional logic processes, and institutional complexity and organizational responses. Collectively, the papers in this volume advance the very...... prolific stream of research on institutional logics by deepening our insight into the active use of institutional logics in organizational action and interaction, including the institutional effects of such (inter)actions....

  8. A software package for evaluating the performance of a star sensor operation

    Science.gov (United States)

    Sarpotdar, Mayuresh; Mathew, Joice; Sreejith, A. G.; Nirmal, K.; Ambily, S.; Prakash, Ajin; Safonova, Margarita; Murthy, Jayant

    2017-02-01

    We have developed a low-cost off-the-shelf component star sensor ( StarSense) for use in minisatellites and CubeSats to determine the attitude of a satellite in orbit. StarSense is an imaging camera with a limiting magnitude of 6.5, which extracts information from star patterns it records in the images. The star sensor implements a centroiding algorithm to find centroids of the stars in the image, a Geometric Voting algorithm for star pattern identification, and a QUEST algorithm for attitude quaternion calculation. Here, we describe the software package to evaluate the performance of these algorithms as a star sensor single operating system. We simulate the ideal case where sky background and instrument errors are omitted, and a more realistic case where noise and camera parameters are added to the simulated images. We evaluate such performance parameters of the algorithms as attitude accuracy, calculation time, required memory, star catalog size, sky coverage, etc., and estimate the errors introduced by each algorithm. This software package is written for use in MATLAB. The testing is parametrized for different hardware parameters, such as the focal length of the imaging setup, the field of view (FOV) of the camera, angle measurement accuracy, distortion effects, etc., and therefore, can be applied to evaluate the performance of such algorithms in any star sensor. For its hardware implementation on our StarSense, we are currently porting the codes in form of functions written in C. This is done keeping in view its easy implementation on any star sensor electronics hardware.
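    The first stage of the pipeline, centroiding, is commonly implemented as an intensity-weighted center of mass over a star's pixel patch. The sketch below illustrates that idea only; it is not the StarSense code, which is written in MATLAB and ported to C:

    ```python
    def centroid(image):
        """Intensity-weighted centroid (x, y) of a star image patch,
        given as a list of pixel rows."""
        total = sum(sum(row) for row in image)
        cy = sum(y * sum(row) for y, row in enumerate(image)) / total
        cx = sum(x * v for row in image for x, v in enumerate(row)) / total
        return cx, cy

    # A 3x3 patch with all the flux in the central pixel
    patch = [[0, 0, 0],
             [0, 9, 0],
             [0, 0, 0]]
    ```

    Sub-pixel centroid accuracy from this weighting is what lets the downstream pattern-identification and QUEST attitude steps reach arcsecond-level precision.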

  9. Supervisory control system implemented in programmable logical controller web server

    OpenAIRE

    Milavec, Simon

    2012-01-01

    In this thesis, we study the feasibility of supervisory control and data acquisition (SCADA) system realisation in a web server of a programmable logic controller. With the introduction of Ethernet protocol to the area of process control, the more powerful programmable logic controllers obtained integrated web servers. The web server of a programmable logic controller, produced by Siemens, will also be described in this thesis. Firstly, the software and the hardware equipment used for real...

  10. Implementing conventional logic unconventionally: photochromic molecular populations as registers and logic gates.

    Science.gov (United States)

    Chaplin, J C; Russell, N A; Krasnogor, N

    2012-07-01

    In this paper we detail experimental methods to implement registers, logic gates and logic circuits using populations of photochromic molecules exposed to sequences of light pulses. Photochromic molecules are molecules with two or more stable states that can be switched reversibly between states by illuminating with appropriate wavelengths of radiation. Registers are implemented by using the concentration of molecules in each state in a given sample to represent an integer value. The register's value can then be read using the intensity of a fluorescence signal from the sample. Logic gates have been implemented using a register with inputs in the form of light pulses to implement 1-input/1-output and 2-input/1-output logic gates. A proof-of-concept logic circuit is also demonstrated; coupled with the software workflow, it describes the transition from a circuit design to the corresponding sequence of light pulses. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
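    The register idea, a molecular population fraction standing in for a stored value and driven by light pulses, can be mimicked with a toy two-state model. The pulse names and switching efficiencies below are invented for illustration and are not the paper's experimental parameters:

    ```python
    def apply_pulse(frac_b, wavelength, efficiency=1.0):
        """Toy two-state photochromic population model: a 'uv' pulse converts
        state A -> B and a 'vis' pulse converts B -> A, each acting on
        `efficiency` of the convertible population. The fraction in state B
        is the register value, read out via fluorescence intensity."""
        if wavelength == 'uv':
            return frac_b + efficiency * (1.0 - frac_b)
        if wavelength == 'vis':
            return frac_b - efficiency * frac_b
        return frac_b

    def run_sequence(pulses, frac_b=0.0, efficiency=0.5):
        """Drive the register through a sequence of light pulses."""
        for wavelength in pulses:
            frac_b = apply_pulse(frac_b, wavelength, efficiency)
        return frac_b
    ```

    Gates then emerge by mapping input pulse sequences to the final population fraction and thresholding the fluorescence readout.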

  11. Computational logic: its origins and applications.

    Science.gov (United States)

    Paulson, Lawrence C

    2018-02-01

    Computational logic is the use of computers to establish facts in a logical formalism. Originating in nineteenth century attempts to understand the nature of mathematical reasoning, the subject now comprises a wide variety of formalisms, techniques and technologies. One strand of work follows the 'logic for computable functions (LCF) approach' pioneered by Robin Milner, where proofs can be constructed interactively or with the help of users' code (which does not compromise correctness). A refinement of LCF, called Isabelle, retains these advantages while providing flexibility in the choice of logical formalism and much stronger automation. The main application of these techniques has been to prove the correctness of hardware and software systems, but increasingly researchers have been applying them to mathematics itself.

  12. 4th International Conference on Quantitative Logic and Soft Computing

    CERN Document Server

    Chen, Shui-Li; Wang, San-Min; Li, Yong-Ming

    2017-01-01

    This book is the proceedings of the Fourth International Conference on Quantitative Logic and Soft Computing (QLSC2016), held 14-17 October 2016 at Zhejiang Sci-Tech University, Hangzhou, China. It includes 61 papers, of which 5 are plenary talks (3 abstracts and 2 full-length talks). QLSC2016 was the fourth in a series of conferences on Quantitative Logic and Soft Computing. This conference was a major symposium for scientists, engineers and practitioners to present their updated results, ideas, developments and applications in all areas of quantitative logic and soft computing. The book aims to strengthen relations between industry research laboratories and universities in fields such as quantitative logic and soft computing worldwide as follows: (1) Quantitative Logic and Uncertainty Logic; (2) Automata and Quantification of Software; (3) Fuzzy Connectives and Fuzzy Reasoning; (4) Fuzzy Logical Algebras; (5) Artificial Intelligence and Soft Computing; (6) Fuzzy Sets Theory and Applications.

  13. a java-platform software for the evaluation of mass attenuation and ...

    African Journals Online (AJOL)

    USER

    of software programs such as XCOM (Berger and Hubbell, 1987) ... programming language used in the design of the software are as ... interfaces (GUIs) and adding rich graphics functionality ... program at the first launching of the application. It.

  14. A study on the quantitative evaluation of the reliability for safety critical software using Bayesian belief nets

    International Nuclear Information System (INIS)

    Eom, H. S.; Jang, S. C.; Ha, J. J.

    2003-01-01

    Despite efforts to avoid undesirable risks, or at least to bring them under control, new risks that are highly difficult to manage continue to emerge from the use of new technologies, such as digital instrumentation and control (I and C) components in nuclear power plants. Whenever new risk issues have come out, we have endeavored to find the most effective ways to reduce risks, or to allocate limited resources to do this. One of the major challenges is the reliability analysis of safety-critical software associated with digital safety systems. Although many activities, such as testing and verification and validation (V and V), are carried out in the design stage of software, a process for quantitatively evaluating the reliability of safety-critical software has not yet been developed, because conventional software reliability techniques are not directly applicable to digital safety systems. This paper focuses on the applicability of Bayesian Belief Net (BBN) techniques to quantitatively estimate the reliability of safety-critical software adopted in digital safety systems. In this paper, a typical BBN model was constructed using the dedication process of the Commercial-Off-The-Shelf (COTS) software installed by KAERI. In conclusion, the adoption of BBN techniques can facilitate the process of evaluating safety-critical software reliability in nuclear power plants, as well as provide very useful information (e.g., 'what if' analysis) associated with software reliability from the viewpoint of practicality
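    A single-node flavor of the belief updating a BBN performs can be shown with Bayes' rule alone. The probabilities below are invented for illustration; a real BBN combines many such nodes (V and V results, development quality, testing evidence) over a directed graph:

    ```python
    def posterior_defect_free(prior, p_pass_given_good, p_pass_given_bad, passed):
        """Bayes-rule update of the belief that software is defect-free,
        given the outcome of one piece of V&V evidence.
        prior              : P(defect-free) before the evidence
        p_pass_given_good  : P(V&V passes | defect-free)
        p_pass_given_bad   : P(V&V passes | defective)
        passed             : observed V&V outcome (True/False)"""
        if passed:
            num = p_pass_given_good * prior
            den = num + p_pass_given_bad * (1.0 - prior)
        else:
            num = (1.0 - p_pass_given_good) * prior
            den = num + (1.0 - p_pass_given_bad) * (1.0 - prior)
        return num / den
    ```

    The 'what if' analyses mentioned in the abstract amount to re-running such updates under hypothetical evidence combinations.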

  15. The classification and evaluation of Computer-Aided Software Engineering tools

    OpenAIRE

    Manley, Gary W.

    1990-01-01

    Approved for public release; distribution unlimited. The use of Computer-Aided Software Engineering (CASE) tools has been viewed as a remedy for the software development crisis by achieving improved productivity and system quality via the automation of all or part of the software engineering process. The proliferation and tremendous variety of tools available have stretched the understanding of experienced practitioners and has had a profound impact on the software engineering process itse...

  16. Comparison of Anti-Virus Programs using Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    Vaclav Bezdek

    2013-07-01

    This work follows the author's previous paper, Possible Use of Fuzzy Logic in Database. It shows an application of fuzzy logic in selecting the best anti-virus software, based on testing by AV-Comparatives.

  17. Improving Object-Oriented Methods by using Fuzzy Logic

    NARCIS (Netherlands)

    Marcelloni, Francesco; Aksit, Mehmet

    2000-01-01

    Object-oriented methods create software artifacts through the application of a large number or rules. Rules are typically formulated in two-valued logic. There are a number of problems on how rules are defined and applied in current methods. First, two-valued logic can capture completely neither

  18. Project maturity evaluation model for SMEs from the software development sub-sector

    Directory of Open Access Journals (Sweden)

    ÁLVARO JULIO CUADROS LÓPEZ

    The purpose of the paper is to present a project management maturity model for SMEs oriented to software development. The proposal is based on the CMMI capability maturity model and the SCAMPI evaluation method. The proposal includes a quantitative satisfaction scale, redundant evidence assessment, and multiple criteria for selecting experts. The proposal was validated with a case study carried out in a medium-sized company from the Information and Communications Technology sector. The model concluded that the company did not reach maturity level 2; however, it showed that 92% of the processes from maturity level 2 and 77% of the total processes had already been implemented, which allows the company to adopt a specific orientation for its improvement efforts.

  19. CT and MRI slice separation evaluation by LabView developed software.

    Science.gov (United States)

    Acri, Giuseppe; Testagrossa, Barbara; Sestito, Angela; Bonanno, Lilla; Vermiglio, Giuseppe

    2018-02-01

    The efficient use of Computed Tomography (CT) and Magnetic Resonance Imaging (MRI) equipment necessitates establishing adequate quality-control (QC) procedures. In particular, the accuracy of slice separation, during multislices acquisition, requires scan exploration of phantoms containing test objects. To simplify such procedures, a novel phantom and a computerised LabView-based procedure have been devised, enabling determination the midpoint of full width at half maximum (FWHM) in real time while the distance from the profile midpoint of two progressive images is evaluated and measured. The results were compared with those obtained by processing the same phantom images with commercial software. To validate the proposed methodology the Fisher test was conducted on the resulting data sets. In all cases, there was no statistically significant variation between the commercial procedure and the LabView one, which can be used on any CT and MRI diagnostic devices. Copyright © 2017. Published by Elsevier GmbH.
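    Locating the midpoint of the FWHM of a slice profile, the quantity the LabView procedure tracks in real time, can be sketched as follows. This is an assumed generic approach (linear interpolation at the half-maximum crossings), not the authors' code:

    ```python
    def half_max_crossings(profile):
        """Linearly interpolated positions where a profile crosses half maximum."""
        half = max(profile) / 2.0
        crossings = []
        for i in range(len(profile) - 1):
            lo, hi = profile[i], profile[i + 1]
            if (lo - half) * (hi - half) < 0:  # sign change -> crossing here
                crossings.append(i + (half - lo) / (hi - lo))
        return crossings

    def fwhm_midpoint(profile):
        """Midpoint position and width of the FWHM of a single-peaked profile."""
        crossings = half_max_crossings(profile)
        left, right = crossings[0], crossings[-1]
        return (left + right) / 2.0, right - left
    ```

    Slice separation is then the distance between the FWHM midpoints of two progressive images, exactly the quantity the paper compares against commercial software.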

  20. ANT: Software for Generating and Evaluating Degenerate Codons for Natural and Expanded Genetic Codes.

    Science.gov (United States)

    Engqvist, Martin K M; Nielsen, Jens

    2015-08-21

    The Ambiguous Nucleotide Tool (ANT) is a desktop application that generates and evaluates degenerate codons. Degenerate codons are used to represent DNA positions that have multiple possible nucleotide alternatives. This is useful for protein engineering and directed evolution, where primers specified with degenerate codons are used as a basis for generating libraries of protein sequences. ANT is intuitive and can be used in a graphical user interface or by interacting with the code through a defined application programming interface. ANT comes with full support for nonstandard, user-defined, or expanded genetic codes (translation tables), which is important because synthetic biology is being applied to an ever widening range of natural and engineered organisms. The Python source code for ANT is freely distributed so that it may be used without restriction, modified, and incorporated in other software or custom data pipelines.
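    Degenerate-codon expansion itself is a small exercise in IUPAC nucleotide ambiguity codes. The sketch below is not ANT's code, but it shows the core idea of enumerating the concrete codons a degenerate codon represents:

    ```python
    from itertools import product

    # IUPAC nucleotide ambiguity codes -> the bases each code stands for
    IUPAC = {
        'A': 'A', 'C': 'C', 'G': 'G', 'T': 'T',
        'R': 'AG', 'Y': 'CT', 'S': 'CG', 'W': 'AT', 'K': 'GT', 'M': 'AC',
        'B': 'CGT', 'D': 'AGT', 'H': 'ACT', 'V': 'ACG', 'N': 'ACGT',
    }

    def expand_codon(degenerate):
        """All concrete codons encoded by a three-letter degenerate codon."""
        return [''.join(p)
                for p in product(*(IUPAC[b] for b in degenerate.upper()))]
    ```

    Evaluating a degenerate codon, as ANT does, then means translating each expanded codon under the chosen translation table and scoring the resulting amino-acid distribution.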

  1. Technical Evaluation Report 24: Open Source Software: an alternative to costly Learning Management Systems

    Directory of Open Access Journals (Sweden)

    Jim Depow

    2003-10-01

    This is the first in a series of two reports discussing the use of open source software (OSS) and free software (FS) in online education as an alternative to expensive proprietary software. It details the steps taken in a Canadian community college to download and install the Linux operating system in order to support an OSS/FS learning management system (LMS).

  2. Individual radiation therapy patient whole-body phantoms for peripheral dose evaluations: method and specific software

    International Nuclear Information System (INIS)

    Alziar, I; Vicente, C; Giordana, G; Ben-Harrath, O; De Vathaire, F; Diallo, I; Bonniaud, G; Couanet, D; Chavaudra, J; Lefkopoulos, D; Ruaud, J B; Diaz, J C; Grandjean, P; Kafrouni, H

    2009-01-01

    This study presents a method aimed at creating radiotherapy (RT) patient-adjustable whole-body phantoms to permit retrospective and prospective peripheral dose evaluations for enhanced patient radioprotection. Our strategy involves virtual whole-body patient models (WBPM) in different RT treatment positions for both genders and for different age groups. It includes a software tool designed to match the anatomy of the phantoms with the anatomy of the actual patients, based on the quality of patient data available. The procedure for adjusting a WBPM to patient morphology includes typical dimensions available in basic auxological tables for the French population. Adjustment is semi-automatic. Because of the complexity of the human anatomy, skilled personnel are required to validate changes made in the phantom anatomy. This research is part of a global project aimed at proposing appropriate methods and software tools capable of reconstituting the anatomy and dose evaluations in the entire body of RT patients in an adapted treatment planning system (TPS). The graphic user interface is that of a TPS adapted to obtain a comfortable working process. Such WBPM have been used to supplement patient therapy planning images, usually restricted to regions involved in treatment. Here we report, as an example, the case of a patient treated for prostate cancer whose therapy planning images were complemented by an anatomy model. Although present results are preliminary and our research is ongoing, they appear encouraging, since such patient-adjusted phantoms are crucial in the optimization of radiation protection of patients and for follow-up studies. (note)

  3. Individual radiation therapy patient whole-body phantoms for peripheral dose evaluations: method and specific software.

    Science.gov (United States)

    Alziar, I; Bonniaud, G; Couanet, D; Ruaud, J B; Vicente, C; Giordana, G; Ben-Harrath, O; Diaz, J C; Grandjean, P; Kafrouni, H; Chavaudra, J; Lefkopoulos, D; de Vathaire, F; Diallo, I

    2009-09-07

    This study presents a method aimed at creating radiotherapy (RT) patient-adjustable whole-body phantoms to permit retrospective and prospective peripheral dose evaluations for enhanced patient radioprotection. Our strategy involves virtual whole-body patient models (WBPM) in different RT treatment positions for both genders and for different age groups. It includes a software tool designed to match the anatomy of the phantoms with the anatomy of the actual patients, based on the quality of patient data available. The procedure for adjusting a WBPM to patient morphology includes typical dimensions available in basic auxological tables for the French population. Adjustment is semi-automatic. Because of the complexity of the human anatomy, skilled personnel are required to validate changes made in the phantom anatomy. This research is part of a global project aimed at proposing appropriate methods and software tools capable of reconstituting the anatomy and dose evaluations in the entire body of RT patients in an adapted treatment planning system (TPS). The graphic user interface is that of a TPS adapted to obtain a comfortable working process. Such WBPM have been used to supplement patient therapy planning images, usually restricted to regions involved in treatment. Here we report, as an example, the case of a patient treated for prostate cancer whose therapy planning images were complemented by an anatomy model. Although present results are preliminary and our research is ongoing, they appear encouraging, since such patient-adjusted phantoms are crucial in the optimization of radiation protection of patients and for follow-up studies.

  4. Evaluating the Governance Model of Hardware-Dependent Software Ecosystems – A Case Study of the Axis Ecosystem

    OpenAIRE

    Wnuk, Krzysztof; Manikas, Konstantinos; Runeson, Per; Matilda, Lantz; Oskar, Weijden; Munir, Hussan

    2014-01-01

    Ecosystem governance becomes gradually more relevant for a set of companies or actors characterized by symbiotic relations evolved on top of a technological platform, i.e. a software ecosystem. In this study, we focus on the governance of a hardware-dependent software ecosystem. More specifically, we evaluate the governance model applied by Axis, a network video and surveillance camera producer, which is the platform owner and orchestrator of the Application Development Partner (ADP) softw...

  5. Use of Fuzzy Logic to Investigate Weather Parameter Impact

    African Journals Online (AJOL)

    2016-07-03

    Jul 3, 2016 ... developed in the Simulink environment of a MATLAB software. The model ... smoothing, stochastic process, ARMA (autoregressive integrated moving .... 2.3 Building of Fuzzy Logic Simulation Model. The fuzzy model is ...

  6. Evaluation of expert systems - An approach and case study. [of determining software functional requirements for command management of satellites

    Science.gov (United States)

    Liebowitz, J.

    1985-01-01

    Techniques that were applied in defining an expert system prototype for first-cut evaluations of the software functional requirements of NASA satellite command management activities are described. The prototype was developed using the Knowledge Engineering System. Criteria were selected for evaluating the satellite software before defining the expert system prototype. Application of the prototype system is illustrated in terms of the evaluation procedures used with the COBE satellite to be launched in 1988. The limited number of options which can be considered by the program mandates that biases in the system output must be well understood by the users.

  7. Validation of the Mobile Information Software Evaluation Tool (MISET) With Nursing Students.

    Science.gov (United States)

    Secco, M Loretta; Furlong, Karen E; Doyle, Glynda; Bailey, Judy

    2016-07-01

    This study evaluated the Mobile Information Software Evaluation Tool (MISET) with a sample of Canadian undergraduate nursing students (N = 240). Psychometric analyses determined how well the MISET assessed the extent that nursing students find mobile device-based information resources useful and supportive of learning in the clinical and classroom settings. The MISET has a valid three-factor structure with high explained variance (74.7%). Internal consistency reliabilities were high for the MISET total (.90) and three subscales: Usefulness/Helpfulness, Information Literacy Support, and Use of Evidence-Based Sources (.87 to .94). Construct validity evidence included significantly higher mean total MISET, Helpfulness/Usefulness, and Information Literacy Support scores for senior students and those with higher computer competence. The MISET is a promising tool to evaluate mobile information technologies and information literacy support; however, longitudinal assessment of changes in scores over time would determine scale sensitivity and responsiveness. [J Nurs Educ. 2016;55(7):385-390.]. Copyright 2016, SLACK Incorporated.

  8. Software reliability evaluation of digital plant protection system development process using V and V

    International Nuclear Information System (INIS)

    Lee, Na Young; Hwang, Il Soon; Seong, Seung Hwan; Oh, Seung Rok

    2001-01-01

    In the nuclear power industry, digital technology has recently been introduced for the Instrumentation and Control (I and C) of reactor systems. For its application to safety-critical systems such as the Reactor Protection System (RPS), a reliability assessment is indispensable. Unlike traditional reliability models, software reliability is hard to evaluate and should be evaluated throughout the development lifecycle. In the development process of the Digital Plant Protection System (DPPS), the concept of verification and validation (V and V) was introduced to assure the quality of the product. Testing must also be performed to assure reliability. The verification procedure based on model checking is relatively well defined; testing, however, is labor-intensive and not well organized. In this paper, we developed a methodological process that combines verification with validation test case generation. For this, we used PVS for table specification and for theorem proving. As a result, we not only saved time in designing test cases but also obtained a more effective and complete set of verification-related test cases. In addition, we could extract meaningful factors useful for reliability evaluation both from the V and V process and from the combined verification tests
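
    The record above pairs a tabular specification with generated test cases. As a purely illustrative sketch (the trip logic, setpoints, and function names below are hypothetical examples, not the DPPS design), one can enumerate every row of a small decision table and check an implementation against the expected output for each row:

```python
from itertools import product

# Hypothetical two-channel trip table: trip when either the pressure or
# the temperature channel exceeds its setpoint.
P_SET, T_SET = 15.0, 320.0

def trip_spec(pressure_high, temp_high):
    """Tabular specification: expected trip output for each input row."""
    return pressure_high or temp_high

def trip_impl(pressure, temperature):
    """Implementation under test."""
    return pressure > P_SET or temperature > T_SET

# Derive one concrete test case per row of the specification table.
samples = {False: (10.0, 300.0), True: (20.0, 340.0)}  # below/above setpoints
for p_high, t_high in product([False, True], repeat=2):
    p = samples[p_high][0]
    t = samples[t_high][1]
    assert trip_impl(p, t) == trip_spec(p_high, t_high)
print("all 4 specification rows verified")
```

Because every input combination of the table is covered, the generated test set is complete with respect to the specification rows, which is the property the record's combined V and V approach aims for.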

  9. Application of a Software tool for Evaluating Human Factors in Accident Sequences

    International Nuclear Information System (INIS)

    Queral, Cesar; Exposito, Antonio; Gonzalez, Isaac; Quiroga, Juan Antonio; Ibarra, Aitor; Hortal, Javier; Hulsund, John-Einar; Nilsen, Svein

    2006-01-01

    Probabilistic Safety Assessment (PSA) includes operator actions among the protection features considered during accident sequences. Nevertheless, their impact throughout a sequence is not analyzed dynamically. More detailed studies of their importance in the dynamics of the sequences are therefore desirable, allowing sensitivity studies with respect to human reliability and response times. For this reason, the CSN is involved in several activities oriented toward developing a new safety analysis methodology, the Integrated Safety Assessment (ISA), which must be able to incorporate operator actions into conventional thermo-hydraulic (TH) simulations. One of them is the collaboration project between CSN, HRP and the DSE-UPM that started in 2003. In the framework of this project, a software tool has been developed to incorporate operator actions into TH simulations. As part of the ISA, this tool makes it possible to quantify human error probabilities (HEP) and to evaluate their impact on the final state of the plant. Independently, it can be used to evaluate the impact of operators' execution of procedures and guidelines on the final state of the plant, and to evaluate the allowable response times for manual operator actions. The results obtained in the first pilot case are included in this paper. (authors)

  10. Design and evaluation of a software prototype for participatory planning of environmental adaptations.

    Science.gov (United States)

    Eriksson, J; Ek, A; Johansson, G

    2000-03-01

    A software prototype to support the planning process for adapting home and work environments for people with physical disabilities was designed and later evaluated. The prototype exploits low-cost three-dimensional (3-D) graphics products in the home computer market. The essential features of the prototype are: interactive rendering with optional hardware acceleration, interactive walk-throughs, direct manipulation tools for moving objects and measuring distances, and import of 3-D-objects from a library. A usability study was conducted, consisting of two test sessions (three weeks apart) and a final interview. The prototype was then tested and evaluated by representatives of future users: five occupational therapist students, and four persons with physical disability, with no previous experience of the prototype. Emphasis in the usability study was placed on the prototype's efficiency and learnability. We found that it is possible to realise a planning tool for environmental adaptations, both regarding usability and technical efficiency. The usability evaluation confirms our findings from previous case studies, regarding the relevance and positive attitude towards this kind of planning tool. Although the prototype was found to be satisfactorily efficient for the basic tasks, the paper presents several suggestions for improvement of future prototype versions.

  11. Logic and structure

    CERN Document Server

    Dalen, Dirk

    1983-01-01

    A book which efficiently presents the basics of propositional and predicate logic, van Dalen’s popular textbook contains a complete treatment of elementary classical logic, using Gentzen’s Natural Deduction. Propositional and predicate logic are treated in separate chapters in a leisured but precise way. Chapter Three presents the basic facts of model theory, e.g. compactness, Skolem-Löwenheim, elementary equivalence, non-standard models, quantifier elimination, and Skolem functions. The discussion of classical logic is rounded off with a concise exposition of second-order logic. In view of the growing recognition of constructive methods and principles, one chapter is devoted to intuitionistic logic. Completeness is established for Kripke semantics. A number of specific constructive features, such as apartness and equality, the Gödel translation, the disjunction and existence property have been incorporated. The power and elegance of natural deduction is demonstrated best in the part of proof theory cal...

  12. The Football of Logic

    Directory of Open Access Journals (Sweden)

    Schang Fabien

    2017-03-01

    An analogy is made between two rather different domains, namely logic and football (or soccer). Starting from a comparative table between the two activities, an alternative explanation of logic is given in terms of players, ball, goal, and the like. Our main thesis is that, just as the task of logic is to preserve truth from the premises to the conclusion, footballers strive to keep the ball all the way to the opposing goal. This analogy may help us think about logic in the same way as in dialogical logic, but it should also present truth-values in an alternative sense, as speech-acts occurring in a dialogue. The relativity of truth-values is thereby brought into focus, leading to an additional form of logical pluralism.

  13. Logic of likelihood

    International Nuclear Information System (INIS)

    Wall, M.J.W.

    1992-01-01

    The notion of "probability" is generalized to that of "likelihood," and a natural logical structure is shown to exist for any physical theory which predicts likelihoods. Two physically based axioms are given for this logical structure to form an orthomodular poset, with an order-determining set of states. The results strengthen the basis of the quantum logic approach to axiomatic quantum theory. 25 refs

  14. Logical database design principles

    CERN Document Server

    Garmany, John; Clark, Terry

    2005-01-01

    INTRODUCTION TO LOGICAL DATABASE DESIGN: Understanding a Database; Database Architectures; Relational Databases; Creating the Database; System Development Life Cycle (SDLC); Systems Planning: Assessment and Feasibility; System Analysis: Requirements; System Analysis: Requirements Checklist; Models Tracking and Schedules; Design Modeling; Functional Decomposition Diagram; Data Flow Diagrams; Data Dictionary; Logical Structures and Decision Trees; System Design: Logical. SYSTEM DESIGN AND IMPLEMENTATION: The ER Approach; Entities and Entity Types; Attribute Domains; Attributes; Set-Valued Attributes; Weak Entities; Constraint

  15. Erotetic epistemic logic

    Czech Academy of Sciences Publication Activity Database

    Peliš, Michal

    2017-01-01

    Roč. 26, č. 3 (2017), s. 357-381 ISSN 1425-3305 R&D Projects: GA ČR(CZ) GC16-07954J Institutional support: RVO:67985955 Keywords : epistemic logic * erotetic implication * erotetic logic * logic of questions Subject RIV: AA - Philosophy ; Religion OBOR OECD: Philosophy, History and Philosophy of science and technology http://apcz.umk.pl/czasopisma/index.php/LLP/article/view/LLP.2017.007

  16. Logic for Physicists

    Science.gov (United States)

    Pereyra, Nicolas A.

    2018-06-01

    This book gives a rigorous yet 'physics-focused' introduction to mathematical logic that is geared towards natural science majors. We present the science major with a robust introduction to logic, focusing on the specific knowledge and skills that will unavoidably be needed in calculus topics and natural science topics in general (rather than taking a philosophical-math-fundamental oriented approach that is commonly found in mathematical logic textbooks).

  17. A Hybrid Parallel Execution Model for Logic Based Requirement Specifications (Invited Paper

    Directory of Open Access Journals (Sweden)

    Jeffrey J. P. Tsai

    1999-05-01

    It is well known that undiscovered errors in a requirements specification are extremely expensive to fix when discovered in the software maintenance phase. Errors in the requirements phase can be reduced through validation and verification of the requirements specification. Many logic-based requirements specification languages have been developed to achieve these goals. However, the execution and reasoning of a logic-based requirements specification can be very slow. An effective way to improve performance is to execute and reason over the logic-based requirements specification in parallel. In this paper, we present a hybrid model to facilitate the parallel execution of a logic-based requirements specification language. A logic-based specification is first processed by a data dependency analysis technique which can find all the mode combinations that exist within a specification clause. This mode information is used to support a novel hybrid parallel execution model, which combines both top-down and bottom-up evaluation strategies. The new execution model can find the failure in the deepest node of the search tree at an early stage of the evaluation, and can thus reduce the total number of nodes searched in the tree, the total number of processes that need to be generated, and the total number of communication channels needed in the search process. A simulator has been implemented to analyze the execution behavior of the new model. Experiments show significant improvement based on several criteria.
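
    The two evaluation strategies this abstract combines can be illustrated on a toy logic program; the transitive-closure rules below are a generic textbook example, not the authors' specification language. Bottom-up evaluation derives all facts to a fixpoint, while top-down evaluation works goal-directed and can fail early:

```python
# Toy Datalog-style program: path(X, Z) over a set of edge facts.
edges = {("a", "b"), ("b", "c"), ("c", "d")}

def bottom_up_paths(edges):
    """Bottom-up (forward-chaining) evaluation: derive all path facts to a fixpoint."""
    paths = set(edges)
    changed = True
    while changed:
        changed = False
        for (x, y) in list(paths):
            for (y2, z) in edges:
                if y == y2 and (x, z) not in paths:
                    paths.add((x, z))
                    changed = True
    return paths

def top_down_path(x, z, seen=frozenset()):
    """Top-down (goal-directed) evaluation: prove path(x, z), failing early on dead ends."""
    if (x, z) in edges:
        return True
    return any(y not in seen and top_down_path(y, z, seen | {y})
               for (x2, y) in edges if x2 == x)

assert ("a", "d") in bottom_up_paths(edges)
assert top_down_path("a", "d") and not top_down_path("d", "a")
print("both evaluation strategies agree on path(a, d)")
```

A hybrid model in the spirit of the abstract would interleave the two: bottom-up passes supply ground facts that prune the top-down search tree.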

  18. What is mathematical logic?

    CERN Document Server

    Crossley, J N; Brickhill, CJ; Stillwell, JC

    2010-01-01

    Although mathematical logic can be a formidably abstruse topic, even for mathematicians, this concise book presents the subject in a lively and approachable fashion. It deals with the very important ideas in modern mathematical logic without the detailed mathematical work required of those with a professional interest in logic.The book begins with a historical survey of the development of mathematical logic from two parallel streams: formal deduction, which originated with Aristotle, Euclid, and others; and mathematical analysis, which dates back to Archimedes in the same era. The streams beg

  19. Indexical Hybrid Tense Logic

    DEFF Research Database (Denmark)

    Blackburn, Patrick Rowan; Jørgensen, Klaus Frovin

    2012-01-01

    In this paper we explore the logic of now, yesterday, today and tomorrow by combining the semantic approach to indexicality pioneered by Hans Kamp [9] and refined by David Kaplan [10] with hybrid tense logic. We first introduce a special now nominal (our @now corresponds to Kamp’s original now operator N) and prove completeness results for both logical and contextual validity. We then add propositional constants to handle yesterday, today and tomorrow; our system correctly treats sentences like “Niels will die yesterday” as contextually unsatisfiable. Building on our completeness results for now, we prove completeness for the richer language, again for both logical and contextual validity.

  20. A Logic for Choreographies

    DEFF Research Database (Denmark)

    Lopez, Hugo Andres; Carbone, Marco; Hildebrandt, Thomas

    2010-01-01

    We explore logical reasoning for the global calculus, a coordination model based on the notion of choreography, with the aim to provide a methodology for the specification and verification of structured communications. Starting with an extension of Hennessy-Milner logic, we present the global logic (GL), a modal logic describing possible interactions among participants in a choreography. We illustrate its use by giving examples of properties on service specifications. Finally, we show that, although GL is undecidable, there is a significant decidable fragment which we provide with a sound and complete proof...

  1. Superconductor fluxoid logic

    International Nuclear Information System (INIS)

    Andronov, A.A.; Kurin, V.V.; Levichev, M.Yu.; Ryndyk, D.A.; Vostokov, V.I.

    1993-01-01

    In recent years there has been much interest in superconductor logical devices. Our paper is devoted to the analysis of some new possibilities in this field. The main problems here are minimizing the time of logical operations and reducing the device scale. Josephson systems are quite appropriate for this purpose because of their small size, short characteristic times, and small energy losses. Two different types of Josephson logic have been investigated in recent years. The first type is based on the hysteretic V-A characteristic of a single Josephson junction: the superconducting and resistive (nonzero voltage) states are taken as logical zero and logical one. The second, rapid single flux quantum logic, has been developed recently and is based on SQUID-like bistability. Different logical states are states with different numbers of magnetic flux quanta inside a closed superconducting contour. Information is represented by voltage pulses with fixed ''area'' (∫ V(t) dt). These pulses are generated when the logical state of the SQUID-like elementary cell changes. The fundamental role of magnetic flux quantization in this type of logic requires a sufficiently large self-inductance of the superconducting contour and thus sets limits on minimal device dimensions. (orig.)

  2. A Logic for Choreographies

    Directory of Open Access Journals (Sweden)

    Marco Carbone

    2011-10-01

    We explore logical reasoning for the global calculus, a coordination model based on the notion of choreography, with the aim to provide a methodology for the specification and verification of structured communications. Starting with an extension of Hennessy-Milner logic, we present the global logic (GL), a modal logic describing possible interactions among participants in a choreography. We illustrate its use by giving examples of properties on service specifications. Finally, we show that, although GL is undecidable, there is a significant decidable fragment which we provide with a sound and complete proof system for checking the validity of formulae.

  3. Introduction to mathematical logic

    CERN Document Server

    Mendelson, Elliott

    2015-01-01

    The new edition of this classic textbook, Introduction to Mathematical Logic, Sixth Edition explores the principal topics of mathematical logic. It covers propositional logic, first-order logic, first-order number theory, axiomatic set theory, and the theory of computability. The text also discusses the major results of Gödel, Church, Kleene, Rosser, and Turing.The sixth edition incorporates recent work on Gödel's second incompleteness theorem as well as restoring an appendix on consistency proofs for first-order arithmetic. This appendix last appeared in the first edition. It is offered in th

  4. Clinical evaluation of a computerized topography software method for fitting rigid gas permeable contact lenses.

    Science.gov (United States)

    Szczotka, L B; Capretta, D M; Lass, J H

    1994-10-01

    Computerized videokeratoscope software programs now have the ability to assist in the design of rigid gas permeable (RGP) contact lenses and simulate fluorescein patterns. We evaluated the performance of Computed Anatomy's Topographic Modeling System (TMS-1) and its Contact Lens Fitting Program (version 1.41) in fitting RGP lenses in 31 subjects. Computerized topographic analysis, balanced manifest refraction, slit lamp examination, and keratometry were performed. Initial lens parameters were ordered according to the manufacturer's programmed recommendations for base curve, power, lens diameter, optic zone diameter, and edge lift. Final lens parameters were based on clinical performance. Lenses were reordered for base curve changes of 0.1 mm or more, power alterations of +/- 0.50 D or more, or for any alteration in diameter/optic zone. Twenty-seven patients were analyzed for all five recommended parameters. Thirteen of 27 patients (48%) required no parameter changes. Nine of 27 patients (33%) required one parameter change, four of 27 patients (15%) required two parameter changes, and one patient (4%) needed three parameters altered. The most prevalent change was a power alteration, required in nine of 27 patients (33%); however, comparisons of all initial to final parameters showed no statistically significant differences. Comparison of initial base curves to those that would have been chosen via standard keratometry also showed no significant difference. This study found the TMS-1 default lens recommendations to be clinically unacceptable. This system, however, could be an alternative method of initial lens selection if used to titrate a fit or if software enhancements are incorporated to account for lens movement and flexure.

  5. The Blue Dog: evaluation of an interactive software program to teach young children how to interact safely with dogs.

    Science.gov (United States)

    Schwebel, David C; Morrongiello, Barbara A; Davis, Aaron L; Stewart, Julia; Bell, Melissa

    2012-04-01

    A pre-post randomized design was used to evaluate The Blue Dog, a dog-safety software program. Seventy-six children aged 3.5-6 years completed three tasks to evaluate dog safety pre- and postintervention: (a) pictures (recognition of safe/risky behavior), (b) dollhouse (recall of safe behavior via simulated dollhouse scenarios), and (c) live dog (actual behavior with an unfamiliar live dog). Following preintervention evaluation, children were randomly assigned to dog- or fire-safety conditions, each involving 3 weeks of home computer software use. Children using Blue Dog showed greater change in recognition of risky dog situations than children learning fire safety. No between-group differences emerged in recall (dollhouse) or engagement in risky behavior (live dog). Families enjoyed using the software. Blue Dog taught children knowledge about safe engagement with dogs, but did not influence recall or implementation of safe behaviors. Dog bites represent a significant pediatric injury concern, and continued development of effective interventions is needed.

  6. Research on the evaluation model of the software reliability in nuclear safety class digital instrumentation and control system

    International Nuclear Information System (INIS)

    Liu Ying; Yang Ming; Li Fengjun; Ma Zhanguo; Zeng Hai

    2014-01-01

    In order to analyze software reliability (SR) in nuclear safety class digital instrumentation and control (D-I and C) systems, the international software design standards were first analyzed and a framework of the standards was built; we found that D-I and C software standards should follow NUREG-0800 BTP7-14, according to the NRC NUREG-0800 review requirements. Secondly, a quantitative evaluation model of SR using a Bayesian Belief Network, with thirteen sub-model frameworks, was established. Thirdly, each sub-model and the weights of the corresponding indexes in the evaluation model were analyzed. Finally, the safety case was introduced. These models lay a foundation for the review and quantitative evaluation of SR in nuclear safety class D-I and C. (authors)
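
    As a loose illustration of the kind of evidence combination a Bayesian belief network performs (the node names and probabilities below are invented placeholders, not values from the cited model), a single-edge network update looks like this:

```python
# Minimal two-node Bayesian belief network sketch (hypothetical numbers):
# SoftwareQuality -> TestResult. We update the belief that quality is
# "high" after observing that all acceptance tests pass.
p_quality_high = 0.7        # prior P(Q = high)
p_pass_given_high = 0.99    # P(tests pass | Q = high)
p_pass_given_low = 0.60     # P(tests pass | Q = low)

def posterior_high(prior, like_high, like_low):
    """Bayes' rule for a binary node given one piece of evidence."""
    num = like_high * prior
    return num / (num + like_low * (1.0 - prior))

post = posterior_high(p_quality_high, p_pass_given_high, p_pass_given_low)
print(f"P(high quality | tests pass) = {post:.3f}")
```

A full evaluation model chains many such updates, one per sub-model index, and weights them as the record describes.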

  7. A study on the quantitative evaluation for the software included in digital systems of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Park, J. K.; Sung, T. Y.; Eom, H. S.; Jeong, H. S.; Kang, H. G.; Lee, K. Y.; Park, J. K. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-03-01

    In general, probabilistic safety analysis (PSA) has been used as one of the most important methods to evaluate the safety of NPPs. Because most NPPs have installed and used analog I and C systems, PSA has been performed from a hardware perspective. Since the tendency to use digital I and C systems, including software, instead of analog I and C systems is increasing, the need for quantitative evaluation methods with which to perform PSA is also increasing. Nevertheless, several factors make such a PSA difficult: for example, software does not age, and software failure rates are very hard to estimate because of software's non-linearity. In this study, in order to perform PSA including software more efficiently, test-based software reliability estimation methods are reviewed and a preliminary procedure is suggested that can provide reasonable guidance for quantifying software failure rates. In addition, the requisite activities to enhance the applicability of the suggested procedure are discussed. 67 refs., 11 figs., 5 tabs. (Author)
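
    One classical test-based estimate, shown here only as a sketch of the general idea rather than the specific procedure the report suggests, bounds the per-demand failure probability after a series of failure-free tests: with n independent demands and zero failures, the (1 - α) upper confidence bound p solves (1 - p)^n = α.

```python
# Upper confidence bound on per-demand failure probability after
# n failure-free tests: solve (1 - p)**n = alpha for p.
def upper_bound_failure_prob(n_tests, alpha=0.05):
    return 1.0 - alpha ** (1.0 / n_tests)

for n in (100, 1000, 10000):
    p = upper_bound_failure_prob(n)
    print(f"{n:6d} failure-free tests -> 95% bound on p <= {p:.2e}")
```

The familiar rule of thumb that the bound is roughly 3/n follows from -ln(0.05) ≈ 3, which makes the scale of the required testing effort for safety-critical targets explicit.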

  8. Evaluation of Dosimetry Check software for IMRT patient-specific quality assurance.

    Science.gov (United States)

    Narayanasamy, Ganesh; Zalman, Travis; Ha, Chul S; Papanikolaou, Niko; Stathakis, Sotirios

    2015-05-08

    The purpose of this study is to evaluate the use of the Dosimetry Check system for patient-specific IMRT QA. Typical QA methods measure the dose in an array dosimeter surrounded by a homogeneous medium for which the treatment plan has been recomputed. With the Dosimetry Check system, fluence measurements acquired on a portal dosimeter are applied to the patient's CT scans. Instead of making dose comparisons in a plane, the Dosimetry Check system produces isodose lines and dose-volume histograms based on the planning CT images. By exporting the dose distribution from the treatment planning system into the Dosimetry Check system, one can make a direct comparison between the calculated dose and the planned dose. The versatility of the software was evaluated with respect to two IMRT techniques: step-and-shoot and volumetric arc therapy. The system analyzed measurements made using an EPID, the PTW seven29, and the IBA MatriXX, and an intercomparison study was performed. Plans from patients previously treated at our institution for brain, head and neck, liver, lung, and prostate sites were analyzed using the Dosimetry Check system for any anatomical-site dependence. We conclude with recommendations and possible precautions that may be necessary to ensure proper QA with the Dosimetry Check system.

  9. PlanetLab Europe as Geographically-Distributed Testbed for Software Development and Evaluation

    Directory of Open Access Journals (Sweden)

    Dan Komosny

    2015-01-01

    In this paper, we analyse the use of PlanetLab Europe for the development and evaluation of geographically-oriented Internet services. PlanetLab is a global research network whose main purpose is to support the development of new Internet services and protocols. PlanetLab is divided into several branches; one of them is PlanetLab Europe, which consists of about 350 nodes at 150 geographically distinct sites. The nodes are accessible by remote login, and users can run their software on them. In the paper, we study the properties of PlanetLab that are significant for its use as a geographically distributed testbed, including node position accuracy and service availability and stability. We find a considerable number of location inaccuracies and a number of services that cannot be considered reliable. Based on the results, we propose a simple approach to node selection in testbeds for the development and evaluation of geographically-oriented Internet services.

  10. Using the Visualization Software Evaluation Rubric to explore six freely available visualization applications

    Directory of Open Access Journals (Sweden)

    Thea P. Atwood

    2018-01-01

    Objective: As a variety of visualization tools become available to librarians and researchers, it can be challenging to select a tool that is robust and flexible enough to provide the desired visualization outcomes for work or personal use. In this article, the authors provide guidance on several freely available tools and offer a rubric for use in evaluating visualization tools. Methods: A rubric was generated to assist the authors in assessing the six selected freely available visualization tools. Each author analyzed three tools and discussed the differences, similarities, challenges, and successes of each. Results: Of the six visualization tools, two emerged with high marks. The authors found that the rubric was a successful evaluation tool and that it facilitated discussion of the strengths and weaknesses of the six selected visualization tools. Conclusions: The six visualization tools analyzed all had different functions and features available to best meet the needs of users. In a situation where many options are available and it is difficult at first glance to determine a clear winner, a rubric can be useful in providing a method to quickly assess and communicate the effectiveness of a tool.
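
    A rubric of this kind reduces to a simple scoring computation. The criteria, tool names, and scores below are hypothetical placeholders, not the authors' actual rubric:

```python
# Hypothetical rubric: each criterion scored 0-3 per tool, then totalled.
criteria = ["ease of use", "documentation", "export options", "interactivity"]
scores = {
    "ToolA": [3, 2, 3, 2],
    "ToolB": [2, 3, 1, 3],
}

def total(tool):
    """Sum of criterion scores for one tool."""
    return sum(scores[tool])

ranked = sorted(scores, key=total, reverse=True)
for tool in ranked:
    print(f"{tool}: {total(tool)}/{3 * len(criteria)}")
```

Even a flat sum like this makes the comparison explicit and repeatable; weighting the criteria is a one-line change if some matter more than others.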

  11. High-reliability logic system evaluation of a programmed multiprocessor solution. Application in the nuclear reactor safety field

    International Nuclear Information System (INIS)

    Lallement, Dominique.

    1979-01-01

    Nuclear reactors are monitored by several systems combined. The hydraulic and mechanical limitations on the equipment and the heat transfer requirements in the core define a reliable working range for the boiler, with certain safety margins. The control system tends to keep the power plant within this working range. The protection system covers all the electrical and mechanical equipment needed to safeguard the boiler in the event of abnormal transients or accidents accounted for in the design of the plant. On units in service, protection is handled by hard-wired automatic systems. For better reliability and safer operation, greater flexibility of use (modularity, adaptability), and improved start-up criteria through data processing, the tendency is to use digital programmed systems. Computers are already present in control systems, but their introduction into protection systems meets with some reticence on the part of the nuclear safety authorities. A study on the replacement of conventional protection systems by digital ones is presented. From choices partly made on the principles which should govern the hardware and software of a protection system, the reliability of different structures and elements was examined and an experimental model was built, with its simulator and test facilities. A prototype based on these options and studies is being built and is to be set up on one of the CEN-G reactors for tests [fr

  12. Evaluation of an improved algorithm for producing realistic 3D breast software phantoms: Application for mammography

    International Nuclear Information System (INIS)

    Bliznakova, K.; Suryanarayanan, S.; Karellas, A.; Pallikarakis, N.

    2010-01-01

    Purpose: This work presents an improved algorithm for the generation of 3D breast software phantoms and its evaluation for mammography. Methods: The improved methodology has evolved from a previously presented method for modeling 3D noncompressed breasts, used to create breast models of different size, shape, and composition. The breast phantom is composed of the breast surface, the duct system and terminal ductal lobular units, Cooper's ligaments, lymphatic and blood vessel systems, pectoral muscle, skin, 3D mammographic background texture, and breast abnormalities. The key improvement is the development of a new algorithm for 3D mammographic texture generation. Simulated images of the enhanced 3D breast model without lesions were produced by simulating mammographic image acquisition and were evaluated subjectively and quantitatively. For evaluation purposes, a database with regions of interest taken from simulated and real mammograms was created. Four experienced radiologists participated in a visual subjective evaluation trial, judging the quality of mammograms simulated with the new algorithm against mammograms obtained with the old modeling approach. In addition, extensive quantitative evaluation included power spectral analysis and calculation of the fractal dimension, skewness, and kurtosis of simulated and real mammograms from the database. Results: The results from the subjective evaluation strongly suggest that the new methodology for mammographic breast texture creates improved breast models compared to the old approach. Parameters calculated on simulated images, such as the β exponent deduced from the power-law spectral analysis and the fractal dimension, are similar to those calculated on real mammograms. The results for the kurtosis and skewness are also in good agreement with those calculated from clinical images. Comparison with similar calculations published in the literature showed good agreement in the majority of cases. Conclusions: The
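
    The quantitative measures the record names (skewness, kurtosis, and the β exponent of the power spectrum) can be sketched for a toy image as follows; synthetic Gaussian noise stands in for a real mammogram region of interest, and the radial-averaging scheme is a generic one, not necessarily the authors' exact procedure:

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.normal(size=(128, 128))      # synthetic stand-in for a mammogram ROI

# First-order texture statistics on the pixel intensities.
x = img.ravel()
z = (x - x.mean()) / x.std()
skewness = np.mean(z ** 3)
kurtosis = np.mean(z ** 4) - 3.0       # excess kurtosis (0 for a Gaussian)

# Power-law exponent beta of the radially averaged power spectrum,
# P(f) ~ 1 / f**beta, from a log-log least-squares fit.
F = np.fft.fftshift(np.fft.fft2(img))
power = np.abs(F) ** 2
cy, cx = power.shape[0] // 2, power.shape[1] // 2
yy, xx = np.indices(power.shape)
r = np.hypot(yy - cy, xx - cx).astype(int)
sums = np.bincount(r.ravel(), weights=power.ravel())
counts = np.bincount(r.ravel())
radial = sums[:65] / counts[:65]       # radial bins up to the Nyquist radius
freqs = np.arange(1, len(radial))      # skip the DC bin
slope, _ = np.polyfit(np.log(freqs), np.log(radial[freqs]), 1)
beta = -slope

print(f"skewness={skewness:.3f}  kurtosis={kurtosis:.3f}  beta={beta:.3f}")
```

For white noise all three values are near zero; real mammographic texture typically shows a clearly positive β, which is what makes the exponent useful for comparing simulated and clinical images.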

  13. Real-time Kernel Implementation Practice Program for Embedded Software Engineers' Education and its Evaluation

    Science.gov (United States)

    Yoshida, Toshio; Matsumoto, Masahide; Seo, Katsuhiko; Chino, Shinichiro; Sugino, Eiji; Sawamoto, Jun; Koizumi, Hisao

    A real-time kernel (henceforth RTK) is at the core of embedded software technology, and an understanding of RTK is indispensable for embedded system design. To implement an RTK, it is necessary to understand, in addition to the RTK itself, the languages that describe RTK program code, system programming conventions, software development tools, the CPU on which the RTK runs, and the interface between software and hardware. This means the RTK implementation process largely covers the embedded software implementation process. Therefore, an RTK implementation practice program is an effective means of acquiring common embedded software skills as well as a deeper understanding of the RTK itself. In this paper, we propose applying an RTK implementation practice program to the education of embedded software engineers. We newly developed a very small, step-up type RTK named μK for educational use, and held a seminar using μK as teaching material for information science students and software-house engineers. As a result, we confirmed that an RTK implementation practice program is very effective for the acquisition of common embedded software skills.

  14. Understanding Social Media Logic

    Directory of Open Access Journals (Sweden)

    José van Dijck

    2013-08-01

    Full Text Available Over the past decade, social media platforms have penetrated deeply into the mechanics of everyday life, affecting people's informal interactions, as well as institutional structures and professional routines. Far from being neutral platforms for everyone, social media have changed the conditions and rules of social interaction. In this article, we examine the intricate dynamic between social media platforms, mass media, users, and social institutions by calling attention to social media logic—the norms, strategies, mechanisms, and economies—underpinning its dynamics. This logic will be considered in light of what has been identified as mass media logic, which has helped spread the media's powerful discourse outside its institutional boundaries. Theorizing social media logic, we identify four grounding principles—programmability, popularity, connectivity, and datafication—and argue that these principles become increasingly entangled with mass media logic. The logic of social media, rooted in these grounding principles and strategies, is gradually invading all areas of public life. Besides print news and broadcasting, it also affects law and order, social activism, politics, and so forth. Therefore, its sustaining logic and widespread dissemination deserve to be scrutinized in detail in order to better understand its impact in various domains. Concentrating on the tactics and strategies at work in social media logic, we reassess the constellation of power relationships in which social practices unfold, raising questions such as: How does social media logic modify or enhance existing mass media logic? And how is this new media logic exported beyond the boundaries of (social or mass) media proper? The underlying principles, tactics, and strategies may be relatively simple to identify, but it is much harder to map the complex connections between platforms that distribute this logic: users that employ them, technologies that

  15. A Trustworthiness Evaluation Method for Software Architectures Based on the Principle of Maximum Entropy (POME) and the Grey Decision-Making Method (GDMM)

    Directory of Open Access Journals (Sweden)

    Rong Jiang

    2014-09-01

    Full Text Available As the early design decision-making structure, a software architecture plays a key role in the final software product quality and the whole project. In the software design and development process, an effective evaluation of the trustworthiness of a software architecture can help make scientific and reasonable decisions on the architecture, which are necessary for the construction of highly trustworthy software. Given the lack of trustworthiness evaluation and measurement studies for software architectures, this paper provides a trustworthy-attribute model of software architecture. Based on this model, the paper proposes using the Principle of Maximum Entropy (POME) and the Grey Decision-Making Method (GDMM) as the trustworthiness evaluation method for a software architecture, demonstrates the scientific soundness and rationality of this method, and verifies its feasibility through a case analysis.

  16. Systematic evaluation program review of NRC safety topic VII-2 associated with the electrical, instrumentation and control portions of the ESF system control logic and design for the Dresden Station, Unit II nuclear power plant

    International Nuclear Information System (INIS)

    St Leger-Barter, G.

    1980-11-01

    This report documents the technical evaluation and review of NRC Safety Topic VII-2, associated with the electrical, instrumentation, and control portions of the ESF system control logic and design for the Dresden Station Unit II nuclear power plant, using current licensing criteria.

  17. Evaluation of a Surveillance Review Software based on Automatic Image Summaries

    International Nuclear Information System (INIS)

    Rocchi, S.; Hadfi, G.; John, M.; Moeslinger, M.; Murray, J.; Juengling, K.; Sequeira, V.; Versino, C.; )

    2015-01-01

    Surveillance streams from safeguards instruments contain thousands of images. Inspectors review them in order to find safeguards-relevant events. Statistically, only a very small fraction of the images is expected to be safeguards-relevant. For this reason, inspectors need a tool that helps them focus their attention directly on the relevant parts of the surveillance stream. The current approach to surveillance review makes use of scene-change detection within areas of interest (AOIs). The data reduction provided can be effective for the review of regular processes, but it requires specific knowledge of the process/environment under review for the proper setting of the AOIs. The VideoZoom approach, developed by the European Commission Joint Research Centre-Institute for Transuranium Elements (JRC-ITU), detects scene changes on the whole image plane. Changes are then summarized and rendered at different levels of abstraction in four layers of summaries, each one revealing more information about the image changes. By means of a zooming interface, the reviewer can navigate the summary layers and decide which are to be examined in full photographic detail and which can be skipped because they are clearly not safeguards-relevant. In this way, reviewers can make the best use of their time by investigating what really requires their attention. VideoZoom was evaluated by a group of IAEA inspectors on a benchmark of image reviews, with promising results in terms of identification of safeguards-relevant events, efficiency, and usability. Following the positive results collected during the preliminary benchmark, the IAEA initiated a task under the European Commission Support Programme (EC SP) aimed at the research, development, and evaluation of surveillance review software based on VideoZoom and compatible with surveillance streams produced by NGSS cameras, the current safeguards surveillance technology deployed by the IAEA. This paper provides a description of the VideoZoom approach to

  18. Logic in the curricula of Computer Science

    Directory of Open Access Journals (Sweden)

    Margareth Quindeless

    2014-12-01

    Full Text Available The aim of programs in Computer Science is to educate and train students to understand problems and build systems that solve them. This process involves applying a special kind of reasoning to model the interactions, capabilities, and limitations of the components involved. A good curriculum must include tools that assist in these tasks, and one that can be considered fundamental is logic, because through it students develop the necessary reasoning. Moreover, software developers analyze the behavior of programs during design, debugging, and testing; hardware designers perform minimization and equivalence verification of circuits; designers of operating systems validate routing protocols, scheduling, and synchronization; and formal logic underlies all of these activities. Therefore, a strong background in applied logic would help students develop or strengthen their ability to reason about complex systems. Unfortunately, few curricula provide proper training in logic. Most include only one or two courses in Discrete Mathematics, which in a few weeks cover truth tables and the propositional calculus, and nothing more. This is not enough; higher-level courses are needed in which these and many other logical concepts are applied. Otherwise, students will not see the importance of logic in their careers, and curriculum committees will need to modify or adapt the curriculum to reverse this situation.

  19. Weakly Intuitionistic Quantum Logic

    NARCIS (Netherlands)

    Hermens, Ronnie

    2013-01-01

    In this article von Neumann's proposal that in quantum mechanics projections can be seen as propositions is followed. However, the quantum logic derived by Birkhoff and von Neumann is rejected due to the failure of the law of distributivity. The options for constructing a distributive logic while

  20. Modal Logics and Definability

    OpenAIRE

    Kuusisto, Antti

    2013-01-01

    In recent years, research into the mathematical foundations of modal logic has become increasingly popular. One of the main reasons for this is the fact that modal logic seems to adapt well to the requirements of a wide range of different fields of application. This paper is a summary of some of the author’s contributions to the understanding of modal definability theory.

  1. Modal logics are coalgebraic

    NARCIS (Netherlands)

    Cirstea, C.; Kurz, A.; Pattinson, D.; Schröder, L.; Venema, Y.

    2011-01-01

    Applications of modal logics are abundant in computer science, and a large number of structurally different modal logics have been successfully employed in a diverse spectrum of application contexts. Coalgebraic semantics, on the other hand, provides a uniform and encompassing view on the large

  2. Description logics of context

    CSIR Research Space (South Africa)

    Klarman, S

    2013-05-01

    Full Text Available We introduce Description Logics of Context (DLCs) - an extension of Description Logics (DLs) for context-based reasoning. Our approach descends from J. McCarthy's tradition of treating contexts as formal objects over which one can quantify...

  3. Criteria for logical formalization

    Czech Academy of Sciences Publication Activity Database

    Peregrin, Jaroslav; Svoboda, Vladimír

    2013-01-01

    Roč. 190, č. 14 (2013), s. 2897-2924 ISSN 0039-7857 R&D Projects: GA ČR(CZ) GAP401/10/1279 Institutional support: RVO:67985955 Keywords : logic * logical form * formalization * reflective equilibrium Subject RIV: AA - Philosophy ; Religion Impact factor: 0.637, year: 2013

  4. Automata, Logic, and XML

    OpenAIRE

    NEVEN, Frank

    2002-01-01

    We survey some recent developments in the broad area of automata and logic which are motivated by the advent of XML. In particular, we consider unranked tree automata, tree-walking automata, and automata over infinite alphabets. We focus on their connection with logic and on questions imposed by XML.

  5. One reason, several logics

    Directory of Open Access Journals (Sweden)

    Evandro Agazzi

    2011-06-01

    Full Text Available Humans have used arguments for defending or refuting statements long before the creation of logic as a specialized discipline. This can be interpreted as meaning that an intuitive notion of "logical consequence", or a psychic disposition to articulate reasoning according to this pattern, is present in common sense, and that logic simply aims at describing and codifying the features of this spontaneous capacity of human reason. It is well known, however, that several arguments easily accepted by common sense are actually "logical fallacies", and this indicates that logic is not just a descriptive, but also a prescriptive or normative enterprise, in which the notion of logical consequence is defined in a precise way and certain rules are then established in order to keep the discourse in keeping with this notion. Yet in justifying the correctness and adequacy of these rules, commonsense reasoning must necessarily be used, and in this way its foundational role is recognized. Moreover, it remains true that several branches and forms of logic have been elaborated precisely in order to reflect the structural features of correct argument used in different fields of human reasoning and yet insufficiently mirrored by the most familiar logical formalisms.

  6. The logic of ACP

    NARCIS (Netherlands)

    A. Ponse (Alban); M.B. van der Zwaag

    2002-01-01

    We distinguish two interpretations for the truth value 'undefined' in Kleene's three-valued logic. Combining these two interpretations leads to a four-valued propositional logic that characterizes two particular ingredients of process algebra: 'choice' and 'inaction'. We study two
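The record above starts from Kleene's three-valued logic, in which 'undefined' sits between true and false. A minimal sketch of the strong Kleene connectives, encoding the three values as 0, 0.5, and 1 (the encoding is a common convention, not taken from the paper):

```python
# Strong Kleene three-valued logic: False < Undefined < True, encoded as
# 0, 0.5, 1. Conjunction is the minimum, disjunction the maximum, and
# negation reflects the order.
F, U, T = 0.0, 0.5, 1.0

def k_and(a, b): return min(a, b)
def k_or(a, b):  return max(a, b)
def k_not(a):    return 1.0 - a

# 'Undefined' propagates unless the other operand decides the result:
print(k_and(T, U))  # 0.5  (True AND Undefined = Undefined)
print(k_and(F, U))  # 0.0  (False AND Undefined = False)
print(k_or(T, U))   # 1.0  (True OR Undefined = True)
```

The four-valued logic the abstract refers to refines this by splitting 'undefined' into two distinct values, one per interpretation.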

  7. Anselm's logic of agency

    NARCIS (Netherlands)

    Uckelman, S.L.

    2009-01-01

    The origins of treating agency as a modal concept go back at least to the 11th century when Anselm, Archbishop of Canterbury, provided a modal explication of the Latin facere ‘to do’, which can be formalized within the context of modern modal logic and neighborhood semantics. The agentive logic

  8. Temporalized Epistemic Default Logic

    NARCIS (Netherlands)

    van der Hoek, W.; Meyer, J.J.; Treur, J.; Gabbay, D.

    2001-01-01

    The nonmonotonic logic Epistemic Default Logic (EDL) [Meyer and van der Hoek, 1993] is based on the metaphor of a meta-level architecture. It has already been established [Meyer and van der Hoek, 1993] how upward reflection can be formalized by a nonmonotonic entailment based on epistemic states,

  9. Logic Programming: PROLOG.

    Science.gov (United States)

    Lopez, Antonio M., Jr.

    1989-01-01

    Provides background material on logic programming and presents PROLOG as a high-level artificial intelligence programming language that borrows its basic constructs from logic. Suggests the language is one which will help the educator to achieve various goals, particularly the promotion of problem solving ability. (MVL)
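The core construct PROLOG borrows from logic is the clause: facts plus rules, queried by backward chaining. A toy propositional sketch of that idea in Python (the facts, the rule, and the mini-interpreter are invented for illustration; real Prolog additionally resolves goals containing variables via unification):

```python
# Toy backward chaining over ground facts and one rule, illustrating the
# logic-programming style underlying PROLOG. No variables/unification here.
facts = {"parent(tom, bob)", "parent(bob, ann)"}
# rule: the head is provable if every goal in the body is provable
rules = [
    ("grandparent(tom, ann)", ["parent(tom, bob)", "parent(bob, ann)"]),
]

def prove(goal):
    if goal in facts:
        return True
    return any(head == goal and all(prove(g) for g in body)
               for head, body in rules)

print(prove("grandparent(tom, ann)"))  # True
print(prove("grandparent(ann, tom)"))  # False
```

In Prolog proper the rule would be written once with variables, `grandparent(X, Z) :- parent(X, Y), parent(Y, Z).`, and cover every instance.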

  10. Honesty in partial logic

    NARCIS (Netherlands)

    W. van der Hoek (Wiebe); J.O.M. Jaspars; E. Thijsse

    1995-01-01

    We propose an epistemic logic in which knowledge is fully introspective and implies truth, although truth need not imply epistemic possibility. The logic is presented in sequential format and is interpreted in a natural class of partial models, called balloon models. We examine the

  11. Using fuzzy logic to improve the project time and cost estimation based on Project Evaluation and Review Technique (PERT

    Directory of Open Access Journals (Sweden)

    Farhad Habibi

    2018-09-01

    Full Text Available Among different factors, correct scheduling is one of the vital elements for project management success. There are several ways to schedule projects, including the Critical Path Method (CPM) and the Program Evaluation and Review Technique (PERT). Due to problems in estimating the durations of activities, these methods cannot accurately and completely model actual projects. The use of fuzzy theory is a basic way to improve scheduling and deal with such problems. Fuzzy theory brings project scheduling models closer to reality by taking into account uncertainties in decision parameters as well as expert experience and mental models. This paper provides a step-by-step approach for accurate estimation of the time and cost of projects using the Project Evaluation and Review Technique (PERT) and expert views expressed as fuzzy numbers. The proposed method includes several steps. In the first step, the necessary information on project time and cost is estimated using the Critical Path Method (CPM) and the Project Evaluation and Review Technique (PERT). The second step treats the duration and cost of the project activities as trapezoidal fuzzy numbers, and the time and cost of the project are then recalculated. The duration and cost of activities are estimated using questionnaires, followed by weighting of the expert opinions, averaging, and defuzzification based on a step-by-step algorithm. The calculation procedures are applied to a real project, and the results obtained are explained.
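The steps described above, expert estimates as trapezoidal fuzzy numbers, a weighted average, then defuzzification to a crisp duration, can be sketched as follows. The expert estimates, weights, and the centroid defuzzification formula are illustrative assumptions, not the paper's exact algorithm:

```python
# Fuzzy-PERT sketch: an activity duration is a trapezoidal fuzzy number
# (a, b, c, d) with a <= b <= c <= d; membership is 1 on [b, c] and ramps
# linearly to 0 at a and d.
def weighted_average(estimates, weights):
    """Componentwise weighted average of trapezoidal numbers."""
    total = sum(weights)
    return tuple(sum(w * e[i] for e, w in zip(estimates, weights)) / total
                 for i in range(4))

def defuzzify(t):
    """Centroid of the trapezoid (a, b, c, d) — one common defuzzification."""
    a, b, c, d = t
    num = (d**2 + c**2 + c*d) - (a**2 + b**2 + a*b)
    den = 3 * ((d + c) - (a + b))
    return num / den if den else (a + b + c + d) / 4

# two hypothetical expert estimates for one activity, in days
experts = [(4, 5, 6, 8), (3, 5, 7, 9)]
weights = [2, 1]                       # first expert judged more reliable
combined = weighted_average(experts, weights)
print(combined, round(defuzzify(combined), 2))
```

The same combine-then-defuzzify step would be applied per activity before feeding the crisp durations back into the CPM/PERT network.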

  12. Performance Evaluation of a Software Engineering Tool for Automated Design of Cooling Systems in Injection Moulding

    DEFF Research Database (Denmark)

    Jauregui-Becker, Juan M.; Tosello, Guido; van Houten, Fred J.A.M.

    2013-01-01

    This paper presents a software tool for automating the design of cooling systems for injection moulding and a validation of its performance. Cooling system designs were automatically generated by the proposed software tool and by applying a best practice tool engineering design approach. The two...

  13. Analyzing the State of Static Analysis: A Large-Scale Evaluation in Open Source Software

    NARCIS (Netherlands)

    Beller, M.; Bholanath, R.; McIntosh, S.; Zaidman, A.E.

    2016-01-01

    The use of automatic static analysis has been a software engineering best practice for decades. However, we still do not know a lot about its use in real-world software projects: How prevalent is the use of Automated Static Analysis Tools (ASATs) such as FindBugs and JSHint? How do developers use

  14. Empirical Evaluation of Two Best-Practices for Energy-Efficient Software Development

    NARCIS (Netherlands)

    Procaccianti, G.; Fernandez, H.J.; Lago, P.

    2016-01-01

    Background. Energy efficiency is an increasingly important property of software. A large number of empirical studies have been conducted on the topic. However, the current state of the art does not provide empirically validated guidelines for developing energy-efficient software. Aim. This study aims at

  15. Microelectromechanical reprogrammable logic device

    Science.gov (United States)

    Hafiz, M. A. A.; Kosuru, L.; Younis, M. I.

    2016-01-01

    In modern computing, the Boolean logic operations are set by interconnect schemes between the transistors. As the miniaturization in the component level to enhance the computational power is rapidly approaching physical limits, alternative computing methods are vigorously pursued. One of the desired aspects in the future computing approaches is the provision for hardware reconfigurability at run time to allow enhanced functionality. Here we demonstrate a reprogrammable logic device based on the electrothermal frequency modulation scheme of a single microelectromechanical resonator, capable of performing all the fundamental 2-bit logic functions as well as n-bit logic operations. Logic functions are performed by actively tuning the linear resonance frequency of the resonator operated at room temperature and under modest vacuum conditions, reprogrammable by the a.c.-driving frequency. The device is fabricated using complementary metal oxide semiconductor compatible mass fabrication process, suitable for on-chip integration, and promises an alternative electromechanical computing scheme. PMID:27021295

  16. Amplifying genetic logic gates.

    Science.gov (United States)

    Bonnet, Jerome; Yin, Peter; Ortiz, Monica E; Subsoontorn, Pakpoom; Endy, Drew

    2013-05-03

    Organisms must process information encoded via developmental and environmental signals to survive and reproduce. Researchers have also engineered synthetic genetic logic to realize simpler, independent control of biological processes. We developed a three-terminal device architecture, termed the transcriptor, that uses bacteriophage serine integrases to control the flow of RNA polymerase along DNA. Integrase-mediated inversion or deletion of DNA encoding transcription terminators or a promoter modulates transcription rates. We realized permanent amplifying AND, NAND, OR, XOR, NOR, and XNOR gates actuated across common control signal ranges and sequential logic supporting autonomous cell-cell communication of DNA encoding distinct logic-gate states. The single-layer digital logic architecture developed here enables engineering of amplifying logic gates to control transcription rates within and across diverse organisms.

  17. Heterogeneous logics of competition

    DEFF Research Database (Denmark)

    Mossin, Christiane

    2015-01-01

    of competition are only realized as particular forms of social organization by virtue of interplaying with other kinds of logics, like legal logics. (2) Competition logics enjoy a peculiar status in-between constructedness and givenness; although competition depends on laws and mechanisms of socialization, we still experience competition as an expression of spontaneous human activities. On the basis of these perspectives, a study of fundamental rights of EU law, springing from the principle of ‘free movement of people’, is conducted. The first part of the empirical analysis seeks to detect the presence of a presumed logic of competition within EU law, whereas the second part focuses on particular legal logics. In this respect, the so-called ‘real link criterion’ (determining the access to transnational social rights for certain groups of unemployed people) is given special attention. What is particularly

  18. Microelectromechanical reprogrammable logic device

    KAUST Repository

    Hafiz, Md Abdullah Al

    2016-03-29

    In modern computing, the Boolean logic operations are set by interconnect schemes between the transistors. As the miniaturization in the component level to enhance the computational power is rapidly approaching physical limits, alternative computing methods are vigorously pursued. One of the desired aspects in the future computing approaches is the provision for hardware reconfigurability at run time to allow enhanced functionality. Here we demonstrate a reprogrammable logic device based on the electrothermal frequency modulation scheme of a single microelectromechanical resonator, capable of performing all the fundamental 2-bit logic functions as well as n-bit logic operations. Logic functions are performed by actively tuning the linear resonance frequency of the resonator operated at room temperature and under modest vacuum conditions, reprogrammable by the a.c.-driving frequency. The device is fabricated using complementary metal oxide semiconductor compatible mass fabrication process, suitable for on-chip integration, and promises an alternative electromechanical computing scheme.

  19. Designing an Adaptive Neuro-Fuzzy Inference System for Evaluating the Business Intelligence System Implementation in the Software Industry

    Directory of Open Access Journals (Sweden)

    Iman Raeesi Vanani

    2015-03-01

    Full Text Available The main goal of this research is to design an adaptive neuro-fuzzy inference system for evaluating the implementation of business intelligence systems in the software industry. Iranian software development organizations have faced many problems in implementing business intelligence systems. This system would be helpful in recognizing the conditions and prerequisites of success or failure. Organizations can recalculate the neuro-fuzzy system outputs under varying inputs to determine which inputs have the greatest effect on the implementation outcome. By resolving the problems with those inputs, organizations can achieve a better level of implementation success. The designed system has been trained on a data set and subsequently evaluated. The trained system reached an error value of 0.08. Finally, some recommendations are provided for software development firms on the areas that might need more consideration and improvement.

  20. Evaluation of Model Driven Development of Safety Critical Software in the Nuclear Power Plant I and C system

    International Nuclear Information System (INIS)

    Jung, Jae Cheon; Chang, Hoon Seon; Chang, Young Woo; Kim, Jae Hack; Sohn, Se Do

    2005-01-01

    The major issues for safety critical software are formalism and V and V. Implementing these two characteristics in safety critical software will greatly enhance the quality of the software product. Structure-based development requires many output documents from the requirements phase through the testing phase, yet the requirements analysis phase is often omitted. According to the Standish Group report in 2001, 49% of software projects are cancelled before completion or never implemented. In addition, 23% are completed and become operational, but over budget, over the estimated schedule, and with fewer features and functions than initially specified. The report identified ten success factors. Among them, firm basic requirements and formal methods are technically achievable factors, while the remaining eight are management related. Misunderstanding of requirements due to a lack of communication between the design engineer and the verification engineer causes unexpected results such as system functionality errors. Safety critical software shall comply with such characteristics as modularity, simplicity, minimization of subroutines, and exclusion of interrupt routines. In addition, crosslink faults and erroneous functions shall be eliminated, and ease of repair after installation shall be achieved as well. In consideration of the above issues, we evaluate model driven development (MDD) methods for nuclear I and C system software. For qualitative analysis, the unified modeling language (UML), functional block language (FBL), and the safety critical application environment (SCADE) are tested against the above characteristics

  1. Quality assurance applied to mammographic equipments using phantoms and software for its evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Mayo, Patricia, E-mail: p.mayo@titaniast.co [Titania Servicios Tecnologicos S.L., Grupo Dominguis, Apartado 46015, Valencia (Spain); Rodenas, Francisco [Departamento de Matematica Aplicada, Universidad Politecnica de Valencia, Apartado 46022, Valencia (Spain); Manuel Campayo, Juan [Hospital Clinico Universitario de Valencia, Avda. Blasco Ibanez, Apartado 46017, Valencia (Spain); Verdu, Gumersido [Departamento de Ingenieria Quimica y Nuclear, Universidad Politecnica de Valencia, Apartado 46022, Valencia (Spain)

    2010-07-21

    The image quality assessment of radiographic equipment is a very important item in the complete quality control of the radiographic image chain. The periodic evaluation of radiographic image quality must guarantee the constancy of this quality so that a suitable diagnosis can be carried out. Mammographic phantom images are usually used to study the quality of images obtained with a given mammographic unit. Digital image processing techniques make it possible to analyze the phantom image automatically. In this work we apply digital image processing techniques to automatically analyze the image quality of the mammographic phantoms CIRS SP01 and RACON under different operating conditions of the mammographic equipment. The CIRS SP01 phantom is usually used with analogue mammographic equipment, while the RACON phantom has been specifically developed by the authors for acceptance and constancy tests of image quality in digital radiographic equipment, following the recommendations of international associations. The purpose of this work is to analyze the image quality for both phantoms by means of an automatic software utility. This analysis allows us to study the functioning of the image chain of the mammographic system objectively, so that abnormal functioning of the radiographic equipment can be detected.

  2. Quality assurance applied to mammographic equipments using phantoms and software for its evaluation

    International Nuclear Information System (INIS)

    Mayo, Patricia; Rodenas, Francisco; Manuel Campayo, Juan; Verdu, Gumersido

    2010-01-01

    The image quality assessment of radiographic equipment is a very important item in the complete quality control of the radiographic image chain. The periodic evaluation of radiographic image quality must guarantee the constancy of this quality so that a suitable diagnosis can be carried out. Mammographic phantom images are usually used to study the quality of images obtained with a given mammographic unit. Digital image processing techniques make it possible to analyze the phantom image automatically. In this work we apply digital image processing techniques to automatically analyze the image quality of the mammographic phantoms CIRS SP01 and RACON under different operating conditions of the mammographic equipment. The CIRS SP01 phantom is usually used with analogue mammographic equipment, while the RACON phantom has been specifically developed by the authors for acceptance and constancy tests of image quality in digital radiographic equipment, following the recommendations of international associations. The purpose of this work is to analyze the image quality for both phantoms by means of an automatic software utility. This analysis allows us to study the functioning of the image chain of the mammographic system objectively, so that abnormal functioning of the radiographic equipment can be detected.

  3. Multi-Valued Modal Fixed Point Logics for Model Checking

    Science.gov (United States)

    Nishizawa, Koki

    In this paper, I will show how multi-valued logics are used for model checking. Model checking is an automatic technique for analyzing the correctness of hardware and software systems. A model checker is based on a temporal logic or a modal fixed point logic. That is to say, the system to be checked is formalized as a Kripke model, the property to be satisfied by the system is formalized as a temporal or modal formula, and the model checker checks that the Kripke model satisfies the formula. Although most existing model checkers are based on 2-valued logics, new attempts have recently been made to extend the underlying logics of model checkers to multi-valued logics. I will summarize these new results.
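A tiny illustration of the multi-valued idea described above: states of a Kripke model carry truth values from {0, 0.5, 1} for a proposition p, and a modal formula such as "EX p" (p holds in some successor) evaluates to the best value among successors rather than a plain true/false. The model, labelling, and formula below are invented for illustration:

```python
# Toy 3-valued model checking sketch over a Kripke model.
# succ maps each state to its successors; p gives the truth value of
# proposition p at each state: 0 (false), 0.5 (unknown), 1 (true).
succ = {"s0": ["s1", "s2"], "s1": ["s1"], "s2": ["s0"]}
p    = {"s0": 0.0, "s1": 0.5, "s2": 1.0}

def ex_p(state):
    """Value of the modal formula 'EX p' at a state: the existential
    quantifier over successors becomes a maximum over truth values."""
    return max(p[t] for t in succ[state])

print(ex_p("s0"))  # 1.0: p is definitely true in successor s2
print(ex_p("s1"))  # 0.5: the best successor value is 'unknown'
```

Fixed-point operators for full temporal properties iterate this kind of evaluation until the state values stabilize, just as in 2-valued model checking.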

  4. Error Free Software

    Science.gov (United States)

    1985-01-01

    A mathematical theory for the development of "higher order" software to catch computer mistakes resulted from a Johnson Space Center contract for Apollo spacecraft navigation. Two women who were involved in the project formed Higher Order Software, Inc. to develop and market the system of error analysis and correction. They designed software that is logically error-free and that, in one instance, was found to increase productivity by 600%. USE.IT defines its objectives using AXES -- a user can write in English and the system converts to computer languages. It is employed by several large corporations.

  5. Evaluation of a software package for automated quality assessment of contrast detail images-comparison with subjective visual assessment

    International Nuclear Information System (INIS)

    Pascoal, A; Lawinski, C P; Honey, I; Blake, P

    2005-01-01

    Contrast detail analysis is commonly used to assess image quality (IQ) associated with diagnostic imaging systems. Applications include routine assessment of equipment performance and optimization studies. Most frequently, the evaluation of contrast detail images involves human observers visually detecting the threshold contrast-detail combinations in the image. However, the subjective nature of human perception and the variations in the decision threshold limit the minimum image quality variations that can be detected reliably. Objective methods of image quality assessment, such as automated scoring, have the potential to overcome these limitations. A software package (CDRAD analyser) developed for automated scoring of images produced with the CDRAD test object was evaluated. Its performance in assessing absolute and relative IQ was compared with that of an average observer. Results show that the software does not mimic the absolute performance of the average observer. The software proved more sensitive and was able to detect smaller low-contrast variations. The observer's performance was superior to the software's in the detection of smaller details. Both scoring methods showed frequent agreement in the detection of image quality variations resulting from changes in kVp and detector KERMA, which indicates the potential of the CDRAD analyser software for assessment of relative IQ

  6. Using Coupled Energy, Airflow and IAQ Software (TRNSYS/CONTAM) to Evaluate Building Ventilation Strategies.

    Science.gov (United States)

    Dols, W Stuart; Emmerich, Steven J; Polidoro, Brian J

    2016-03-01

    tool to couple CONTAM with existing energy analysis software to address the interaction between indoor air quality considerations and energy conservation measures in building design and analysis. This paper presents two practical case studies using the coupled modelling tool to evaluate IAQ performance of a CO2-based demand-controlled ventilation system under different levels of building envelope airtightness and the design and analysis of a natural ventilation system.

  7. Logic Learning in Hopfield Networks

    OpenAIRE

    Sathasivam, Saratha; Abdullah, Wan Ahmad Tajuddin Wan

    2008-01-01

    Synaptic weights for neurons in logic programming can be calculated either by using Hebbian learning or by Wan Abdullah's method. In other words, Hebbian learning for governing events corresponding to some respective program clauses is equivalent with learning using Wan Abdullah's method for the same respective program clauses. In this paper we will evaluate experimentally the equivalence between these two types of learning through computer simulations.
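
    As a rough illustration of the Hebbian side of this comparison, the sketch below stores two bipolar patterns in a small Hopfield network with the classical outer-product rule and recalls one of them from a corrupted cue. The clause-based weight derivation of Wan Abdullah's method is not reproduced here, and the patterns, network size, and update schedule are all invented for the example.

```python
import numpy as np

def hebbian_weights(patterns):
    # Classical Hebbian outer-product rule: W = (1/N) * sum_p outer(p, p),
    # with the self-connections (diagonal) zeroed out.
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)
    return w

def recall(w, state, steps=10):
    # Synchronous sign updates; on a zero local field the neuron keeps its state.
    s = state.copy()
    for _ in range(steps):
        h = w @ s
        s = np.where(h > 0, 1, np.where(h < 0, -1, s))
    return s

patterns = np.array([[1, -1, 1, -1, 1, -1],
                     [1,  1, -1, -1, 1,  1]])
w = hebbian_weights(patterns)
noisy = np.array([1, -1, 1, -1, 1, 1])   # first pattern with the last bit flipped
print(recall(w, noisy))                  # recovers the first stored pattern
```

    The same network could instead be given weights derived from program clauses via Wan Abdullah's method; the paper's point is that, for the corresponding clauses, the two routes yield equivalent synaptic weights.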

  8. TEMAS: fleet-based bio-economic simulation software to evaluate management strategies accounting for fleet behaviour

    DEFF Research Database (Denmark)

    Ulrich, Clara; Andersen, Bo Sølgaard; Sparre, Per Johan

    2007-01-01

    TEMAS (technical management measures) is a fleet-based bio-economic software for evaluating management strategies accounting for technical measures and fleet behaviour. It focuses on mixed fisheries in which several fleets can choose among several fishing activities to target different stocks...

  9. 3D MODELLING BY LOW-COST RANGE CAMERA: SOFTWARE EVALUATION AND COMPARISON

    Directory of Open Access Journals (Sweden)

    R. Ravanelli

    2017-11-01

    Full Text Available The aim of this work is to present a comparison among three software applications currently available for the Occipital Structure SensorTM; all of these applications were developed for collecting 3D models of objects easily and in real time with this structured-light range camera. The SKANECT, itSeez3D and Scanner applications were tested: a DUPLOTM brick construction was scanned with the three applications and the obtained models were compared to a model virtually generated with standard CAD software, which served as reference. The results demonstrate that all the applications are generally characterized by the same level of geometric accuracy, which amounts to very few millimetres. However, the itSeez3D software, which requires a payment of $7 to export each model, surely represents the best solution, both in terms of geometric accuracy and, especially, of color restitution. On the other hand, Scanner, which is free, presents an accuracy comparable to that of itSeez3D; at the same time, though, the colors are often smoothed and not perfectly overlapped with the corresponding parts of the model. Lastly, SKANECT is the application that generates the highest number of points, but it also has some issues with the rendering of the colors.

  10. Evaluation of multiple intelligences in children aged 7 to 11 years through the implementation of an interactive software

    OpenAIRE

    Rebolledo Rodríguez, Rigel A.; Samaniego González, Euclides

    2017-01-01

    This document describes the evaluation project of multiple intelligences in children aged 7 to 11 years through the implementation of an interactive software application, developed on Android for tablets, allowing parents, teachers, tutors, psychologists or other responsible adults to identify the different types of intelligences that children have, in order to know them and develop their skills and their future potential. Similarly, research was developed to evaluate the effectiveness of the t...

  11. Evaluating the governance model of hardware-dependent software ecosystems - a case study of the axis ecosystem

    DEFF Research Database (Denmark)

    Wnuk, Krzysztof; Manikas, Konstantinos; Runeson, Per

    2014-01-01

    specifically, we evaluate the governance model applied by Axis, a network video and surveillance camera producer, that is the platform owner and orchestrator of the Application Development Partner (ADP) software ecosystem. We conduct an exploratory case study collecting data from observations and interviews...... and apply the governance model for prevention and improvement of the software ecosystem health proposed by Jansen and Cusumano. Our results reveal that although the governance actions do not address the majority of their governance model, the ADP ecosystem is considered a growing ecosystem providing...

  12. THRESHOLD LOGIC IN ARTIFICIAL INTELLIGENCE

    Science.gov (United States)

    COMPUTER LOGIC, ARTIFICIAL INTELLIGENCE, BIONICS, GEOMETRY, INPUT OUTPUT DEVICES, LINEAR PROGRAMMING, MATHEMATICAL LOGIC, MATHEMATICAL PREDICTION, NETWORKS, PATTERN RECOGNITION, PROBABILITY, SWITCHING CIRCUITS, SYNTHESIS

  13. Evaluating the accuracy of the Wechsler Memory Scale-Fourth Edition (WMS-IV) logical memory embedded validity index for detecting invalid test performance.

    Science.gov (United States)

    Soble, Jason R; Bain, Kathleen M; Bailey, K Chase; Kirton, Joshua W; Marceaux, Janice C; Critchfield, Edan A; McCoy, Karin J M; O'Rourke, Justin J F

    2018-01-08

    Embedded performance validity tests (PVTs) allow for continuous assessment of invalid performance throughout neuropsychological test batteries. This study evaluated the utility of the Wechsler Memory Scale-Fourth Edition (WMS-IV) Logical Memory (LM) Recognition score as an embedded PVT using the Advanced Clinical Solutions (ACS) for WAIS-IV/WMS-IV Effort System. This mixed clinical sample was comprised of 97 total participants, 71 of whom were classified as valid and 26 as invalid based on three well-validated, freestanding criterion PVTs. Overall, the LM embedded PVT demonstrated poor concordance with the criterion PVTs and unacceptable psychometric properties using ACS validity base rates (42% sensitivity/79% specificity). Moreover, 15-39% of participants obtained an invalid ACS base rate despite having a normatively intact age-corrected LM Recognition total score. Receiver operating characteristic curve analysis revealed that a Recognition total score cutoff of < 61% correct improved specificity (92%) while sensitivity remained weak (31%). Thus, results indicated the LM Recognition embedded PVT is not appropriate for use from an evidence-based perspective, and that clinicians may be faced with reconciling how a normatively intact cognitive performance on the Recognition subtest could simultaneously reflect invalid performance validity.
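
    The sensitivity/specificity arithmetic behind such a cutoff is easy to reproduce. The sketch below computes both for a hypothetical set of Recognition percent-correct scores (the numbers are invented, not the study's data), flagging scores below the cutoff as invalid performance.

```python
def sens_spec(invalid_scores, valid_scores, cutoff):
    # A score below the cutoff is flagged as invalid ("positive").
    true_pos = sum(s < cutoff for s in invalid_scores)   # invalid correctly flagged
    true_neg = sum(s >= cutoff for s in valid_scores)    # valid correctly passed
    return true_pos / len(invalid_scores), true_neg / len(valid_scores)

# hypothetical Recognition percent-correct scores for the two criterion groups
invalid = [45, 55, 70, 80]
valid = [65, 85, 90, 95, 100]
sens, spec = sens_spec(invalid, valid, cutoff=61)
print(sens, spec)   # 0.5 1.0 on this made-up data
```

    Sweeping the cutoff over all observed scores and plotting sensitivity against 1 - specificity is exactly the ROC analysis the abstract describes.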

  14. Moralized Rationality: Relying on Logic and Evidence in the Formation and Evaluation of Belief Can Be Seen as a Moral Issue

    Science.gov (United States)

    Skitka, Linda J.

    2016-01-01

    In the present article we demonstrate stable individual differences in the extent to which a reliance on logic and evidence in the formation and evaluation of beliefs is perceived as a moral virtue, and a reliance on less rational processes is perceived as a vice. We refer to this individual difference variable as moralized rationality. Eight studies are reported in which an instrument to measure individual differences in moralized rationality is validated. Results show that the Moralized Rationality Scale (MRS) is internally consistent, and captures something distinct from the personal importance people attach to being rational (Studies 1–3). Furthermore, the MRS has high test-retest reliability (Study 4), is conceptually distinct from frequently used measures of individual differences in moral values, and it is negatively related to common beliefs that are not supported by scientific evidence (Study 5). We further demonstrate that the MRS predicts morally laden reactions, such as a desire for punishment, of people who rely on irrational (vs. rational) ways of forming and evaluating beliefs (Studies 6 and 7). Finally, we show that the MRS uniquely predicts motivation to contribute to a charity that works to prevent the spread of irrational beliefs (Study 8). We conclude that (1) there are stable individual differences in the extent to which people moralize a reliance on rationality in the formation and evaluation of beliefs, (2) that these individual differences do not reduce to the personal importance attached to rationality, and (3) that individual differences in moralized rationality have important motivational and interpersonal consequences. PMID:27851777

  15. Moralized Rationality: Relying on Logic and Evidence in the Formation and Evaluation of Belief Can Be Seen as a Moral Issue.

    Directory of Open Access Journals (Sweden)

    Tomas Ståhl

    Full Text Available In the present article we demonstrate stable individual differences in the extent to which a reliance on logic and evidence in the formation and evaluation of beliefs is perceived as a moral virtue, and a reliance on less rational processes is perceived as a vice. We refer to this individual difference variable as moralized rationality. Eight studies are reported in which an instrument to measure individual differences in moralized rationality is validated. Results show that the Moralized Rationality Scale (MRS) is internally consistent, and captures something distinct from the personal importance people attach to being rational (Studies 1-3). Furthermore, the MRS has high test-retest reliability (Study 4), is conceptually distinct from frequently used measures of individual differences in moral values, and it is negatively related to common beliefs that are not supported by scientific evidence (Study 5). We further demonstrate that the MRS predicts morally laden reactions, such as a desire for punishment, of people who rely on irrational (vs. rational) ways of forming and evaluating beliefs (Studies 6 and 7). Finally, we show that the MRS uniquely predicts motivation to contribute to a charity that works to prevent the spread of irrational beliefs (Study 8). We conclude that (1) there are stable individual differences in the extent to which people moralize a reliance on rationality in the formation and evaluation of beliefs, (2) that these individual differences do not reduce to the personal importance attached to rationality, and (3) that individual differences in moralized rationality have important motivational and interpersonal consequences.

  16. Relativistic quantum logic

    International Nuclear Information System (INIS)

    Mittelstaedt, P.

    1983-01-01

    On the basis of the well-known quantum logic and quantum probability, a formal language of relativistic quantum physics is developed. This language incorporates quantum logical as well as relativistic restrictions. It is shown that relativity imposes serious restrictions on the validity regions of propositions in space-time. By an additional postulate this relativistic quantum logic can be made consistent. The results of this paper are derived exclusively within the formal quantum language; they are, however, in accordance with well-known facts of relativistic quantum physics in Hilbert space. (author)

  17. Coherent quantum logic

    International Nuclear Information System (INIS)

    Finkelstein, D.

    1987-01-01

    The von Neumann quantum logic lacks two basic symmetries of classical logic, that between sets and classes, and that between lower and higher order predicates. Similarly, the structural parallel between the set algebra and linear algebra of Grassmann and Peano was left incomplete by them in two respects. In this work a linear algebra is constructed that completes this correspondence and is interpreted as a new quantum logic that restores these invariances, and as a quantum set theory. It applies to experiments with coherent quantum phase relations between the quantum and the apparatus. The quantum set theory is applied to model a Lorentz-invariant quantum time-space complex

  18. Layered Fixed Point Logic

    DEFF Research Database (Denmark)

    Filipiuk, Piotr; Nielson, Flemming; Nielson, Hanne Riis

    2012-01-01

    We present a logic for the specification of static analysis problems that goes beyond the logics traditionally used. Its most prominent feature is the direct support for both inductive computations of behaviors as well as co-inductive specifications of properties. Two main theoretical contributions...... are a Moore Family result and a parametrized worst case time complexity result. We show that the logic and the associated solver can be used for rapid prototyping of analyses and illustrate a wide variety of applications within Static Analysis, Constraint Satisfaction Problems and Model Checking. In all cases...

  19. A multiplicity logic unit

    International Nuclear Information System (INIS)

    Bialkowski, J.; Moszynski, M.; Zagorski, A.

    1981-01-01

    The logic diagram, principle of operation and some details of the design of the multiplicity logic unit are presented. This unit was specially designed to fulfil the requirements of a multidetector arrangement for gamma-ray multiplicity measurements. The unit is equipped with 16 inputs controlled by a common coincidence gate. It delivers a linear output pulse with a height proportional to the multiplicity of coincidences, and logic pulses corresponding to 0, 1, ... up to >= 5-fold coincidences. These last outputs are used to steer the routing unit working with the multichannel analyser. (orig.)
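
    The unit's behaviour can be mimicked in a few lines. The sketch below counts coincident inputs and derives both the proportional linear output and the 0- to >=5-fold logic outputs; the 0.1 V-per-fold scale is an invented placeholder, not a parameter from the paper.

```python
def multiplicity_outputs(inputs, gate_open=True):
    # inputs: booleans for the 16 detector channels within the coincidence gate
    m = sum(inputs) if gate_open else 0
    linear_height = 0.1 * m                    # pulse height proportional to multiplicity
    logic = {k: m == k for k in range(5)}      # exclusive 0..4-fold logic outputs
    logic['>=5'] = m >= 5                      # combined >=5-fold output
    return m, linear_height, logic

# three detectors fire inside the coincidence gate
m, height, logic = multiplicity_outputs([True] * 3 + [False] * 13)
print(m, logic[3], logic['>=5'])   # 3 True False
```

    In the real unit the 0..>=5-fold outputs steer the routing unit of the multichannel analyser, so each event is histogrammed under its observed multiplicity.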

  20. An evaluation of software tools for the design and development of cockpit displays

    Science.gov (United States)

    Ellis, Thomas D., Jr.

    1993-01-01

    The use of all-glass cockpits at the NASA Langley Research Center (LaRC) simulation facility has changed the means of design, development, and maintenance of instrument displays. The human-machine interface has evolved from a physical hardware device to a software-generated electronic display system. This has subsequently caused an increased workload at the facility. As computer processing power increases and the glass cockpit becomes predominant in facilities, software tools used in the design and development of cockpit displays are becoming both feasible and necessary for a more productive simulation environment. This paper defines LaRC requirements of a display software development tool and compares two available applications against these requirements. As a part of the software engineering process, these tools reduce development time, provide a common platform for display development, and produce exceptional real-time results.

  1. An Evaluation of GeoBEST Contingency Beddown Planning Software Using the Technology Acceptance Model

    National Research Council Canada - National Science Library

    Jensen, Shawn

    2002-01-01

    .... The Technology Acceptance Model (TAM) was applied, which measures a prospective user's perceptions of the technology's usefulness and ease-of-use and predicts their intentions to use the software in the future...

  2. Logic integer programming models for signaling networks.

    Science.gov (United States)

    Haus, Utz-Uwe; Niermann, Kathrin; Truemper, Klaus; Weismantel, Robert

    2009-05-01

    We propose a static and a dynamic approach to model biological signaling networks, and show how each can be used to answer relevant biological questions. For this, we use the two different mathematical tools of Propositional Logic and Integer Programming. The power of discrete mathematics for handling qualitative as well as quantitative data has so far not been exploited in molecular biology, which is mostly driven by experimental research, relying on first-order or statistical models. The arising logic statements and integer programs are analyzed and can be solved with standard software. For a restricted class of problems the logic models reduce to a polynomial-time solvable satisfiability algorithm. Additionally, a more dynamic model enables enumeration of possible time resolutions in poly-logarithmic time. Computational experiments are included.
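
    As a minimal example of the kind of encoding involved (a standard 0/1 linearization of a logic gate, not a construction taken from the paper), the constraints y <= x1, y <= x2 and y >= x1 + x2 - 1 admit exactly the truth table of y = x1 AND x2:

```python
from itertools import product

def and_constraints_hold(x1, x2, y):
    # standard integer-programming linearization of y = x1 AND x2
    return y <= x1 and y <= x2 and y >= x1 + x2 - 1

# enumerate all 0/1 inputs: the only feasible y is the logical AND
for x1, x2 in product((0, 1), repeat=2):
    feasible = [y for y in (0, 1) if and_constraints_hold(x1, x2, y)]
    assert feasible == [x1 & x2]
print("AND linearization verified")
```

    Chaining such gate constraints over a network yields an integer program whose feasible points are the consistent signaling states, which a standard solver can then enumerate or optimize over.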

  3. SOFTWARE EFFORT PREDICTION: AN EMPIRICAL EVALUATION OF METHODS TO TREAT MISSING VALUES WITH RAPIDMINER ®

    OpenAIRE

    OLGA FEDOTOVA; GLADYS CASTILLO; LEONOR TEIXEIRA; HELENA ALVELOS

    2011-01-01

    Missing values are a common problem in data analysis in all areas, software engineering being no exception. Particularly, missing data is a widespread phenomenon observed during the elaboration of effort prediction models (EPMs) required for budget, time and functionality planning. The current work presents the results of a study carried out in a Portuguese medium-sized software development organization in order to obtain a formal method for EPM elicitation in development processes. Thi...

  4. Logic Model Checking of Unintended Acceleration Claims in the 2005 Toyota Camry Electronic Throttle Control System

    Science.gov (United States)

    Gamble, Ed; Holzmann, Gerard

    2011-01-01

    Part of the US DOT investigation of Toyota SUA involved analysis of the throttle control software. JPL LaRS applied several techniques, including static analysis and logic model checking, to the software. A handful of logic models were built. Some weaknesses were identified; however, no cause for SUA was found. The full NASA report includes numerous other analyses.

  5. Quantification frameworks and their application for evaluating the software quality factor using quality characteristic value

    International Nuclear Information System (INIS)

    Kim, C.; Chung, C.H.; Won-Ahn, K.

    2004-01-01

    Many safety-related problems frequently occur because digital instrumentation and control systems are widely used and are expanding their range to many applications in nuclear power plants. There is, however, no generally accepted way to estimate an appropriate software quality. Thus, the Quality Characteristic Value, a software quality factor spanning each software life cycle, is suggested in this paper. The Quality Characteristic Value is obtained by the following procedure: 1) scoring the Quality Characteristic Factors (especially correctness, traceability, completeness, and understandability) from Software Verification and Validation results, 2) deriving the diamond-shaped graphs by setting the value of each factor on its own axis and connecting the points, and lastly 3) measuring the area of the graph to obtain the Quality Characteristic Value. In this paper, this methodology is applied to a Plant Control System. In addition, the series of quantification frameworks exhibits some good characteristics from the viewpoint of the software quality factor. More than anything else, it is believed that the introduced framework may be applicable to regulatory guides and software approval procedures, due to its soundness and simple characteristics. (authors)
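
    Under one plausible reading of this construction (the four factor scores placed on four mutually orthogonal axes, an assumption of ours since the abstract does not fix the axis layout), the diamond's area reduces to a shoelace formula:

```python
def qcv_area(correctness, traceability, completeness, understandability):
    # Vertices at (a, 0), (0, b), (-c, 0), (0, -d) on orthogonal axes;
    # the shoelace formula for this kite collapses to (a + c) * (b + d) / 2.
    a, b, c, d = correctness, traceability, completeness, understandability
    return 0.5 * (a + c) * (b + d)

print(qcv_area(3, 4, 3, 4))   # 24.0
```

    With all four factors scored equally at s, the area is 2 * s**2, so the metric rewards balanced improvement across factors rather than a single high score.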

  6. Exploiting the potential of free software to evaluate root canal biomechanical preparation outcomes through micro-CT images.

    Science.gov (United States)

    Neves, A A; Silva, E J; Roter, J M; Belladona, F G; Alves, H D; Lopes, R T; Paciornik, S; De-Deus, G A

    2015-11-01

    To propose an automated image processing routine based on free software to quantify root canal preparation outcomes in pairs of sound and instrumented roots after micro-CT scanning procedures. Seven mesial roots of human mandibular molars with different canal configuration systems were studied: (i) Vertucci's type 1, (ii) Vertucci's type 2, (iii) two individual canals, (iv) Vertucci's type 6, canals (v) with and (vi) without debris, and (vii) canal with visible pulp calcification. All teeth were instrumented with the BioRaCe system and scanned in a Skyscan 1173 micro-CT before and after canal preparation. After reconstruction, the instrumented stack of images (IS) was registered against the preoperative sound stack of images (SS). Image processing included contrast equalization and noise filtering. Sound canal volumes were obtained by a minimum threshold. For the IS, a fixed conservative threshold was chosen as the best compromise between instrumented canal and dentine whilst avoiding debris, resulting in instrumented canal plus empty spaces. Arithmetic and logical operations between sound and instrumented stacks were used to identify debris. Noninstrumented dentine was calculated using a minimum threshold in the IS and subtracting from the SS and total debris. Removed dentine volume was obtained by subtracting SS from IS. Quantitative data on total debris present in the root canal space after instrumentation, noninstrumented areas and removed dentine volume were obtained for each test case, as well as three-dimensional volume renderings. After standardization of acquisition, reconstruction and image processing micro-CT images, a quantitative approach for calculation of root canal biomechanical outcomes was achieved using free software. © 2014 International Endodontic Journal. Published by John Wiley & Sons Ltd.
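
    The arithmetic and logical stack operations described can be sketched with boolean arrays. The tiny synthetic "slices" below are invented for illustration (True = canal or empty space, False = dentine); the real workflow applies the same operations to registered micro-CT volumes after thresholding.

```python
import numpy as np

# synthetic registered slices: True = canal/empty space, False = dentine
sound = np.array([[0, 1, 1, 0],
                  [0, 1, 1, 0]], dtype=bool)          # canal before preparation
instrumented = np.array([[0, 1, 1, 1],
                         [0, 0, 1, 1]], dtype=bool)   # canal + empty space after

debris = sound & ~instrumented            # inside the original canal, now reads as material
removed_dentine = instrumented & ~sound   # new space cut outside the original canal

print(debris.sum(), removed_dentine.sum())   # voxel "volumes": 1 2
```

    Multiplying each voxel count by the scanner's voxel volume converts these masks into the debris and removed-dentine volumes reported in the study.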

  7. An evaluation of the subtraction photoshop software accuracy to detect minor changes in optical density by radiovisiography

    Directory of Open Access Journals (Sweden)

    Talaeipour AR.

    2004-06-01

    Full Text Available Statement of Problem: Subtraction is a newly presented radiographic technique to detect minor density changes that are not visible on conventional radiography. Purpose: The aim of this in-vitro study was to evaluate the efficacy of Photoshop subtraction software for detecting minor density changes between two dental images. Materials and Methods: In this research, five dried human mandibles were held in a fixed position while thin aluminium sheets were superimposed on each mandible in the 1st and 2nd molar regions. A reference image, without aluminium sheet placement, was obtained from each mandible; subsequently, a series of 20 images with aluminium sheets, ranging from 50 µ to 500 µ, was recorded by a radiovisiography (RVG) system. Initial images were subtracted from subsequent ones by the Photoshop subtraction software. The difference in density between the two images at the 1st and 2nd molar sites was related to the aluminium sheets. The optical density of the aluminium sheets was determined by a densitometer. Results: In the present study, 6.6% of the optical density changes, at a minimum aluminium thickness of 300 µ, could be detected by the Photoshop software. Conclusion: The findings of this study showed that the accuracy of the Photoshop subtraction software was equal to that of conventional subtraction software. Additionally, the accuracy of this software proved to be suitable for clinical investigation of small localized changes in alveolar bone.

  8. An integrated software testing framework for FGA-based controllers in nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Jae Yeob; Kim, Eun Sub; Yoo, Jun Beom; Lee, Young Jun; Choi, Jong Gyun

    2016-01-01

    Field-programmable gate arrays (FPGAs) have received much attention from the nuclear industry as an alternative platform to programmable logic controllers for digital instrumentation and control. The software aspect of FPGA development consists of several steps of synthesis and refinement, and also requires verification activities, such as simulations that are performed individually at each step. This study proposed an integrated software-testing framework for simulating all artifacts of the FPGA software development simultaneously and evaluating whether all artifacts work correctly using common oracle programs. This method also generates a massive number of meaningful simulation scenarios that reflect reactor shutdown logics. The experiment, which was performed on two FPGA software implementations, showed that it can dramatically save both time and costs
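
    The common-oracle idea can be illustrated in a few lines: run every artifact on the same generated scenarios and compare each against a reference shutdown logic. Everything below (the limits, signal ranges, and stand-in models) is invented for the sketch; a real flow would drive HDL simulators for each synthesis stage instead of Python functions.

```python
import random

def shutdown_oracle(temp, pressure):
    # reference (oracle) shutdown logic: trip on any limit violation (limits invented)
    return temp > 350.0 or pressure > 15.5

def artifact_under_test(temp, pressure):
    # stand-in for one synthesized FPGA artifact; logically equivalent by De Morgan
    return not (temp <= 350.0 and pressure <= 15.5)

# generate many simulation scenarios and check every artifact against the oracle
random.seed(0)
for _ in range(1000):
    t = random.uniform(300.0, 400.0)
    p = random.uniform(10.0, 20.0)
    assert artifact_under_test(t, p) == shutdown_oracle(t, p)
print("artifact agrees with oracle on 1000 scenarios")
```

    Scenario generators biased toward limit crossings, as the abstract suggests for reactor shutdown logics, exercise exactly the transitions where synthesis refinements are most likely to diverge.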

  9. Advances in temporal logic

    CERN Document Server

    Fisher, Michael; Gabbay, Dov; Gough, Graham

    2000-01-01

    Time is a fascinating subject that has captured mankind's imagination from ancient times to the present. It has been, and continues to be studied across a wide range of disciplines, from the natural sciences to philosophy and logic. More than two decades ago, Pnueli in a seminal work showed the value of temporal logic in the specification and verification of computer programs. Today, a strong, vibrant international research community exists in the broad community of computer science and AI. This volume presents a number of articles from leading researchers containing state-of-the-art results in such areas as pure temporal/modal logic, specification and verification, temporal databases, temporal aspects in AI, tense and aspect in natural language, and temporal theorem proving. Earlier versions of some of the articles were given at the most recent International Conference on Temporal Logic, University of Manchester, UK. Readership: Any student of the area - postgraduate, postdoctoral or even research professor ...

  10. Logic and Learning

    DEFF Research Database (Denmark)

    Hendricks, Vincent Fella; Gierasimczuk, Nina; de Jong, Dick

    2014-01-01

    Learning and learnability have been long-standing topics of interest within the linguistic, computational, and epistemological accounts of inductive inference. Johan van Benthem's vision of the "dynamic turn" has not only brought renewed life to research agendas in logic as the study of information processing, but likewise helped bring logic and learning in close proximity. This proximity relation is examined with respect to learning and belief revision, updating and efficiency, and with respect to how learnability fits in the greater scheme of dynamic epistemic logic and scientific method.

  11. Magnonic logic circuits

    International Nuclear Information System (INIS)

    Khitun, Alexander; Bao Mingqiang; Wang, Kang L

    2010-01-01

    We describe and analyse possible approaches to magnonic logic circuits and basic elements required for circuit construction. A distinctive feature of the magnonic circuitry is that information is transmitted by spin waves propagating in the magnetic waveguides without the use of electric current. The latter makes it possible to exploit spin wave phenomena for more efficient data transfer and enhanced logic functionality. We describe possible schemes for general computing and special task data processing. The functional throughput of the magnonic logic gates is estimated and compared with the conventional transistor-based approach. Magnonic logic circuits allow scaling down to the deep submicrometre range and THz frequency operation. The scaling is in favour of the magnonic circuits offering a significant functional advantage over the traditional approach. The disadvantages and problems of the spin wave devices are also discussed.

  12. Model for qualitative evaluation of risk in ducts through the fuzzy logic in conformity with the methodology of IBR; Modelo para avaliacao qualitativa do risco em oleodutos atraves da logica fuzzy segundo a metodologia da IBR

    Energy Technology Data Exchange (ETDEWEB)

    Mishina, Koje Daniel Vasconcelos [Centro Federal de Educacao Tecnologica da Bahia (CEFET-BA), Salvador, BA (Brazil); Silva, Jose Felicio da; Silva, Joao Bosco de Aquino [Universidade Federal da Paraiba (UFPb), Joao Pessoa, PB (Brazil)

    2006-07-01

    This work proposes a system for qualitative evaluation of risk in onshore oil pipelines, based on the Risk-Based Inspection (RBI) methodology and fuzzy logic. To this end, a risk matrix associated with the transport of oil and its derivatives was developed, which defines the type of monitoring and the frequency of inspection with an instrumented pig as a function of the risk level found in the pipeline stretch under study. This risk matrix considers the probabilities and consequences associated with corrosion damage. The evaluation of the proposed system was based on the consistency of the risk levels found relative to those that would be found in practice. The results demonstrated that the RBI methodology and fuzzy logic can be used together as a quite efficient alternative technique for risk evaluation in corroded pipelines. (author)
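
    A toy version of such a fuzzy risk matrix can look like the sketch below; the linguistic terms, 0-10 scales, membership shapes and rule table are all invented for illustration, not taken from the paper.

```python
def tri(x, a, b, c):
    # triangular membership function with feet at a and c, peak at b
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# illustrative linguistic terms for probability and consequence on a 0-10 scale
terms = {'low':  lambda x: tri(x, -1, 0, 5),
         'med':  lambda x: tri(x, 0, 5, 10),
         'high': lambda x: tri(x, 5, 10, 11)}

def risk_level(p, c):
    # Mamdani-style max-min inference over a 3x3 rule matrix,
    # returning the output label of the most strongly fired rule
    rules = {('low', 'low'): 'low',   ('low', 'med'): 'low',    ('low', 'high'): 'medium',
             ('med', 'low'): 'low',   ('med', 'med'): 'medium', ('med', 'high'): 'high',
             ('high', 'low'): 'medium', ('high', 'med'): 'high', ('high', 'high'): 'high'}
    best, strength = None, -1.0
    for (tp, tc), out in rules.items():
        s = min(terms[tp](p), terms[tc](c))   # min acts as the fuzzy AND
        if s > strength:
            best, strength = out, s
    return best

print(risk_level(9, 9), risk_level(1, 1))   # high low
```

    A production system would aggregate all fired rules and defuzzify (e.g. by centroid) instead of picking the strongest one, but the matrix-plus-membership structure is the same.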

  13. Programming Games for Logical Thinking

    Directory of Open Access Journals (Sweden)

    H. Tsalapatas

    2013-03-01

    Full Text Available Analytical thinking is a transversal skill that helps learners synthesize knowledge across subject areas: from mathematics, science, and technology to critical reading, critical examination, and evaluation of lessons. While few would doubt the importance of analytical capacity in academic settings, or the growing demand for this skill in professional environments, school curricula do not comprehensively address its development. As a result, the responsibility for structuring related learning activities falls to teachers. This work examines learning paradigms that can be integrated into mathematics and science school education for developing logical thinking through game-based exercises based on programming. The proposed learning design promotes structured algorithmic mindsets, is based on inclusive universal logic present in all cultures, and promotes constructivist educational approaches encouraging learners to drive knowledge building by composing past and emerging experiences.

  14. Characterization of quantum logics

    International Nuclear Information System (INIS)

    Lahti, P.J.

    1980-01-01

    The quantum logic approach to axiomatic quantum mechanics is used to analyze the conceptual foundations of the traditional quantum theory. The universal quantum of action h>0 is incorporated into the theory by introducing the uncertainty principle, the complementarity principle, and the superposition principle into the framework. A characterization of those quantum logics (L,S) which may provide quantum descriptions is then given. (author)

  15. A Conceptual Space Logic

    DEFF Research Database (Denmark)

    Nilsson, Jørgen Fischer

    1999-01-01

    Conceptual spaces have been proposed as topological or geometric means for establishing conceptual structures and models. This paper, after briefly reviewing conceptual spaces, focusses on the relationship between conceptual spaces and logical concept languages with operations for combining concepts to form concepts. Specifically, an algebraic concept logic is introduced, for which conceptual spaces are installed as semantic domain, as replacement for, or enrichment of, the traditional....

  16. Automated remedial assessment methodology software system

    International Nuclear Information System (INIS)

    Whiting, M.; Wilkins, M.; Stiles, D.

    1994-11-01

    The Automated Remedial Analysis Methodology (ARAM) software system has been developed by the Pacific Northwest Laboratory to assist the U.S. Department of Energy (DOE) in evaluating cleanup options for over 10,000 contaminated sites across the DOE complex. The automated methodology comprises modules for decision logic diagrams, technology applicability and effectiveness rules, mass balance equations, cost and labor estimating factors and equations, and contaminant stream routing. ARAM is used to select technologies for meeting cleanup targets; determine the effectiveness of the technologies in destroying, removing, or immobilizing contaminants; decide the nature and amount of secondary waste requiring further treatment; and estimate the cost and labor involved when applying technologies

  17. Extending Value Logic Thinking to Value Logic Portfolios

    DEFF Research Database (Denmark)

    Andersen, Poul Houman; Ritter, Thomas

    2014-01-01

    Based on value creation logic theory (Stabell & Fjeldstad, 1998), this paper suggests an extension of the original Stabell & Fjeldstad model with an additional, fourth value logic: the value system logic. Furthermore, instead of allowing only one dominant value creation logic for a given firm or transaction, an understanding of firms and transactions as a portfolio of value logics (i.e. an interconnected coexistence of different value creation logics) is proposed. These additions to the original value creation logic theory imply interesting avenues for both strategic decision making in firms...

  18. Towards a Formal Occurrence Logic based on Predicate Logic

    DEFF Research Database (Denmark)

    Badie, Farshad; Götzsche, Hans

    2015-01-01

    In this discussion we concentrate on the main characteristics of an alternative kind of logic invented by Hans Götzsche: Occurrence Logic, which is not based on truth functionality. Our approach builds on the temporal logic developed and elaborated by A. N. Prior. We focus on characterising argumentation based on formal Occurrence Logic concerning events and occurrences, and illustrate the relations between Predicate Logic and Occurrence Logic. The relationships (and dependencies) are conducive to an approach that can analyse the occurrences of "logical statements based on different logical principles" at different moments. We also conclude that Götzsche's Occurrence Logic could direct us to a truth-functionally independent computer-based logic for analysing argumentation based on events and occurrences.

  19. Integrated development environment for fuzzy logic applications

    Science.gov (United States)

    Pagni, Andrea; Poluzzi, Rinaldo; Rizzotto, GianGuido; Lo Presti, Matteo

    1993-12-01

    During the last five years, fuzzy logic has gained enormous popularity in both the academic and industrial worlds, breaking down the traditional resistance to change thanks to its innovative approach to problem formalization. The success of this new methodology is pushing the creation of a brand new class of devices, called fuzzy machines, to overcome the limitations of traditional computing systems when acting as fuzzy systems, together with adequate software tools to efficiently develop new applications. This paper presents a complete development environment for the definition of fuzzy-logic-based applications. The environment is coupled with a sophisticated software tool for semiautomatic synthesis and optimization of the rules, with stability verification. The architecture of WARP is then presented, a dedicated VLSI programmable chip that computes a fuzzy control process in real time. The article concludes with two application examples carried out using the aforementioned tools and devices.

  20. Application of newly developed Fluoro-QC software for image quality evaluation in cardiac X-ray systems.

    Science.gov (United States)

    Oliveira, M; Lopez, G; Geambastiani, P; Ubeda, C

    2018-05-01

    A quality assurance (QA) program is a valuable tool for the continuous production of optimal-quality images. The aim of this paper is to assess newly developed automatic computer software for image quality (IQ) evaluation in fluoroscopy X-ray systems. Test object images were acquired using one fluoroscopy system, a Siemens Axiom Artis model (Siemens AG, Medical Solutions, Erlangen, Germany). The software was developed as an ImageJ plugin. Two image quality parameters were assessed: high-contrast spatial resolution (HCSR) and signal-to-noise ratio (SNR). The times required by the manual and automatic image quality assessment procedures were compared. The paired t-test was used to assess the data; p values of less than 0.05 were considered significant. The Fluoro-QC software generated faster IQ evaluation results (mean = 0.31 ± 0.08 min) than the manual procedure (mean = 4.68 ± 0.09 min); the mean difference between techniques was 4.36 min. Discrepancies were identified in the region of interest (ROI) areas drawn manually, with evidence of user dependence. The new software presented the results of the two tests (HCSR = 3.06, SNR = 5.17) and also collected information from the DICOM header. Significant differences were not identified between manual and automatic measures of SNR (p value = 0.22) and HCSR (p value = 0.46). The Fluoro-QC software is a feasible, fast and free-to-use method for evaluating image quality parameters on fluoroscopy systems. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.
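The two computations behind this comparison are simple enough to sketch: an ROI-based SNR (mean divided by standard deviation of the pixel values) and the paired t statistic used to compare matched manual/automatic timings. The pixel and timing values below are invented placeholders, not data from the study.

```python
import math

def roi_snr(pixels):
    """Signal-to-noise ratio of a region of interest: mean / sample stdev."""
    n = len(pixels)
    mean = sum(pixels) / n
    var = sum((p - mean) ** 2 for p in pixels) / (n - 1)
    return mean / math.sqrt(var)

def paired_t(a, b):
    """Paired t statistic for two matched samples
    (e.g. manual vs. automatic evaluation times per image)."""
    d = [x - y for x, y in zip(a, b)]
    n = len(d)
    md = sum(d) / n
    sd = math.sqrt(sum((x - md) ** 2 for x in d) / (n - 1))
    return md / (sd / math.sqrt(n))

manual = [4.70, 4.60, 4.80, 4.65, 4.70]   # invented minutes per evaluation
auto = [0.30, 0.32, 0.28, 0.35, 0.31]
print(paired_t(manual, auto))
```

The t statistic would then be compared against the t distribution with n-1 degrees of freedom to obtain the p value reported in the abstract.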

  1. Evaluation of static analysis tools used to assess software important to nuclear power plant safety

    Energy Technology Data Exchange (ETDEWEB)

    Ourghanlian, Alain [EDF Lab CHATOU, Simulation and Information Technologies for Power Generation Systems Department, EDF R and D, Cedex (France)

    2015-03-15

    We describe a comparative analysis of different tools used to assess safety-critical software used in nuclear power plants. To enhance the credibility of safety assessments and to optimize safety justification costs, Électricité de France (EDF) investigates the use of methods and tools for source code semantic analysis, to obtain indisputable evidence and help assessors focus on the most critical issues. EDF has been using the PolySpace tool for more than 10 years. Currently, new industrial tools based on the same formal approach, Abstract Interpretation, are available. Practical experimentation with these new tools shows that the precision obtained on one of our shutdown-system software packages is substantially improved. In the first part of this article, we present the analysis principles of the tools used in our experimentation. In the second part, we present the main characteristics of protection-system software, and why these characteristics are well adapted to the new analysis tools.

  2. Evaluation of the efficiency and fault density of software generated by code generators

    Science.gov (United States)

    Schreur, Barbara

    1993-01-01

    Flight computers and flight software are used for GN&C (guidance, navigation, and control), engine controllers, and avionics during missions. Software development requires the generation of a considerable amount of code; the engineers who generate the code make mistakes, and generating a large body of code with high reliability takes considerable time. Computer-aided software engineering (CASE) tools are available which generate code automatically from inputs supplied through graphical interfaces. These tools are referred to as code generators. In theory, code generators could write highly reliable code quickly and inexpensively. The various code generators offer different levels of reliability checking: some check only the finished product, while some also allow checking of individual modules and combined sets of modules. Considering NASA's requirement for reliability, comparison with in-house manually generated code is needed. Furthermore, automatically generated code is reputed to be as efficient in execution as the best manually generated code. In-house verification is warranted.

  3. Statistical Software Engineering

    Science.gov (United States)

    1998-04-13

    ... multiversion software subject to coincident errors. IEEE Trans. Software Eng. SE-11:1511-1517. Eckhardt, D.E., A.K. Caglayan, J.C. Knight, L.D. Lee, D.F. ... Knight, J.C. and N.G. Leveson. 1986. Experimental evaluation of the assumption of independence in multiversion software. IEEE Trans. Software ...

  4. Languages for Software-Defined Networks

    Science.gov (United States)

    2013-02-01

    ... (switches, firewalls, and middleboxes) with closed and proprietary configuration interfaces. Software-Defined Networks (SDN) are poised to change this. Recent years, however, have seen growing interest in software-defined networks (SDNs), in which a logically-centralized controller manages the packet-processing ...

  5. Neutrosophic Logic Applied to Decision Making

    DEFF Research Database (Denmark)

    Madsen, Henrik; Albeanu, Grigore; Burtschy, Bernard

    2014-01-01

    Decision making addresses the usage of various methods to select "the best", in some way, alternative strategy (from the many available) when a problem is given for solving. The authors propose the usage of the neutrosophic way of thinking, also called Smarandache's logic, to select a model by experts when degrees of trustability, untrustability (falsehood), and indeterminacy are used to decide. The procedures deal with multi-attribute neutrosophic decision making, and a case study on e-learning software objects is presented.
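The triple of degrees mentioned above lends itself to a small sketch. This is a hedged illustration, not the authors' procedure: it uses one common neutrosophic score function, s = T + (1 - I) + (1 - F), and the alternatives and expert ratings are invented.

```python
# Hypothetical multi-attribute neutrosophic ranking: each expert rates an
# alternative with a triple (T, I, F) of truth/trustability, indeterminacy
# and falsehood degrees in [0, 1]; alternatives are ranked by average score.
def score(t, i, f):
    """One common neutrosophic score function: higher is better."""
    return t + (1 - i) + (1 - f)

def rank(alternatives):
    """alternatives: {name: [(T, I, F), ...]} -> names sorted best-first."""
    avg = {name: sum(score(*r) for r in ratings) / len(ratings)
           for name, ratings in alternatives.items()}
    return sorted(avg, key=avg.get, reverse=True)

# Invented ratings for two e-learning software objects:
learning_objects = {
    "module_a": [(0.8, 0.1, 0.1), (0.7, 0.2, 0.1)],
    "module_b": [(0.6, 0.3, 0.2), (0.5, 0.3, 0.3)],
}
print(rank(learning_objects))  # module_a scores higher on average
```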

  6. Modern logic and quantum mechanics

    International Nuclear Information System (INIS)

    Garden, R.W.

    1984-01-01

    The book applies the methods of modern logic and probabilities to "interpreting" quantum mechanics. The subject is described and discussed under the chapter headings: classical and quantum mechanics, modern logic, the propositional logic of mechanics, states and measurement in mechanics, the traditional analysis of probabilities, the probabilities of mechanics, and the modal logic of predictions. (U.K.)

  7. Semantic theory for logic programming

    Energy Technology Data Exchange (ETDEWEB)

    Brown, F M

    1981-01-01

    The author axiomatizes a number of metatheoretic concepts which have been used in logic programming, including meaning, logical truth, nonentailment, assertion, and erasure, thus showing that these concepts are logical in nature and need not be defined, as they have previously been, in terms of the operations of any particular interpreter for logic programs. 10 references.

  8. Relational Parametricity and Separation Logic

    DEFF Research Database (Denmark)

    Birkedal, Lars; Yang, Hongseok

    2008-01-01

    Separation logic is a recent extension of Hoare logic for reasoning about programs with references to shared mutable data structures. In this paper, we provide a new interpretation of the logic for a programming language with higher types. Our interpretation is based on Reynolds's relational parametricity, and it provides a formal connection between separation logic and data abstraction. Publication date: 2008...

  9. U.S. Army Armament Research, Development and Engineering Center Grain Evaluation Software to Numerically Predict Linear Burn Regression for Solid Propellant Grain Geometries

    Science.gov (United States)

    2017-10-01

    U.S. ARMY ARMAMENT RESEARCH, DEVELOPMENT AND ENGINEERING CENTER GRAIN EVALUATION SOFTWARE TO NUMERICALLY PREDICT LINEAR BURN REGRESSION FOR SOLID PROPELLANT GRAIN GEOMETRIES. Brian ... Distribution is unlimited. Munitions Engineering Technology Center, Picatinny ...

  10. Evaluation of a breast software model for 2D and 3D X-ray imaging studies of the breast.

    Science.gov (United States)

    Baneva, Yanka; Bliznakova, Kristina; Cockmartin, Lesley; Marinov, Stoyko; Buliev, Ivan; Mettivier, Giovanni; Bosmans, Hilde; Russo, Paolo; Marshall, Nicholas; Bliznakov, Zhivko

    2017-09-01

    In X-ray imaging, test objects reproducing the characteristics of breast anatomy are built to optimize issues such as image processing or reconstruction, lesion detection performance, image quality and radiation-induced detriment. Recently, a physical phantom with a structured background has been introduced for both 2D mammography and breast tomosynthesis. A software version of this phantom and a few related versions are now available, and a comparison between these 3D software phantoms and the physical phantom is presented here. The software breast phantom simulates a semi-cylindrical container filled with spherical beads of different diameters. Four computational breast phantoms were generated with a dedicated software application; for two of these, physical phantoms are also available and are used for the side-by-side comparison. Planar projections in mammography and tomosynthesis were simulated under identical incident air kerma conditions. Tomosynthesis slices were reconstructed with in-house developed reconstruction software. In addition to a visual comparison, parameters like fractal dimension, power law exponent β and second order statistics (skewness, kurtosis) of planar projections and tomosynthesis reconstructed images were compared. Visually, an excellent agreement between simulated and real planar and tomosynthesis images is observed. The comparison also shows an overall very good agreement between parameters evaluated from simulated and experimental images. The computational breast phantoms showed a close match with their physical versions, and the detailed mathematical analysis of the images confirms the agreement between real and simulated 2D mammography and tomosynthesis images. The software phantom is ready for optimization purposes and for extrapolation to other breast imaging techniques. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
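The second-order statistics mentioned in this comparison (skewness and kurtosis of pixel values) are straightforward to compute. A minimal sketch, with invented pixel samples standing in for an image region; the population (biased) moment form is used here, which may differ from the exact estimator in the study:

```python
import math

def skewness(xs):
    """Third standardized moment (population form)."""
    n = len(xs)
    m = sum(xs) / n
    s = math.sqrt(sum((x - m) ** 2 for x in xs) / n)
    return sum((x - m) ** 3 for x in xs) / (n * s ** 3)

def kurtosis(xs):
    """Fourth standardized moment (population form, not excess kurtosis)."""
    n = len(xs)
    m = sum(xs) / n
    s2 = sum((x - m) ** 2 for x in xs) / n
    return sum((x - m) ** 4 for x in xs) / (n * s2 ** 2)

# Invented pixel values standing in for a projection-image region:
pixels = [98, 100, 102, 101, 99, 100, 103, 97]
print(skewness(pixels), kurtosis(pixels))
```

Comparing these scalars between simulated and experimental images gives a quick numeric check to back up the visual comparison.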

  11. Modelling Imprecise Arguments in Description Logic

    Directory of Open Access Journals (Sweden)

    LETIA, I. A.

    2009-10-01

    Real arguments are a mixture of fuzzy linguistic variables and ontological knowledge. This paper focuses on modelling imprecise arguments in order to obtain a better interleaving of human and software agent argumentation, which may prove useful for extending the number of real-life argumentation-based applications. We propose Fuzzy Description Logic as the adequate technical instrument for filling the gap between human arguments and software agent arguments. A proof-of-concept scenario has been tested with the fuzzyDL reasoner.
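The core machinery behind such reasoning can be sketched with fuzzy connectives over membership degrees in [0, 1], here under Gödel/Zadeh semantics (one of the semantics a fuzzyDL-style reasoner supports). The "strong argument" example and its degrees are invented for illustration:

```python
# Fuzzy connectives under Goedel/Zadeh semantics: conjunction is min,
# disjunction is max, negation is 1 - x.
def f_and(a, b):
    return min(a, b)

def f_or(a, b):
    return max(a, b)

def f_not(a):
    return 1.0 - a

# Hypothetical: the degree to which an argument is both "relevant"
# and "well-supported" determines how strong it is.
relevant, supported = 0.9, 0.6
strong_argument = f_and(relevant, supported)
print(strong_argument)  # limited by the weaker premise: 0.6
```

A description-logic reasoner extends this idea from propositions to concepts and roles, but the truth-degree algebra is the same.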

  12. Non-logic devices in logic processes

    CERN Document Server

    Ma, Yanjun

    2017-01-01

    This book shows readers how to design semiconductor devices using the most common and lowest-cost logic CMOS processes. Readers will benefit from the author's extensive industrial experience and the practical approach he describes for efficiently designing semiconductor devices that would typically have to be implemented using specialized processes that are expensive, time-consuming, and low-yield. The author presents an integrated picture of semiconductor device physics and manufacturing techniques, as well as numerous practical examples of device designs that are tried and true.

  13. An Empirical Evaluation of an Activity-Based Infrastructure for Supporting Cooperation in Software Engineering

    DEFF Research Database (Denmark)

    Tell, Paolo; Babar, Muhammad Ali

    2016-01-01

    Software engineering (SE) is predominantly a team effort that needs close cooperation among several people who may be geographically distributed. It has been recognized that appropriate tool support is a prerequisite to improve cooperation within SE teams. In an effort to contribute to this line...

  14. Designing an economic-driven evaluation framework for process-oriented software technologies.

    NARCIS (Netherlands)

    Mutschler, B.B.; Bumiller, J.; Reichert, M.U.

    2006-01-01

    During the last decade there has been a dramatic increase in the number of paradigms, standards and tools that can be used to realize process-oriented information systems. A major problem neglected in software engineering research so far has been the systematic determination of costs, benefits, and

  15. EDNA-An expert software system for comparison and evaluation of DNA profiles in forensic casework

    DEFF Research Database (Denmark)

    Haldemann, B.; Dornseifer, S.; Heylen, T.

    2015-01-01

    eDNA is an expert software system for DNA profile comparison, match interpretation and automated report generation in forensic DNA casework. Process automation and intelligent graphical representation maximise reliability of DNA evidence, while facilitating and accelerating the work of DNA experts....

  16. SU-E-I-13: Evaluation of Metal Artifact Reduction (MAR) Software On Computed Tomography (CT) Images

    International Nuclear Information System (INIS)

    Huang, V; Kohli, K

    2015-01-01

    Purpose: New commercially available metal artifact reduction (MAR) software for computed tomography (CT) imaging was evaluated with phantoms in the presence of metals. The goal was to assess the ability of the software to restore the CT number in the vicinity of the metals without impacting image quality. Methods: A Catphan 504 was scanned with a GE Optima RT 580 CT scanner (GE Healthcare, Milwaukee, WI) and the images were reconstructed with and without the MAR software. Both datasets were analyzed with Image Owl QA software (Image Owl Inc, Greenwich, NY). CT number sensitometry, MTF, low contrast, uniformity, noise and spatial accuracy were compared for scans with and without the MAR software. In addition, an in-house phantom was scanned with and without a stainless steel insert at three different locations, and the accuracy of the CT number and of the metal insert dimension was investigated as well. Results: Comparisons between Catphan scans with and without the MAR algorithm demonstrate similar image quality, although noise was slightly higher with the MAR algorithm. The CT number was also evaluated at various locations of the in-house phantom: the baseline HU, obtained from the scan without a metal insert, was compared to scans with the stainless steel insert at three different locations, and the HU difference from baseline was reduced when the MAR algorithm was applied. In addition, the physical diameter of the stainless steel rod was over-estimated by the MAR algorithm by 0.9 mm. Conclusion: This work indicates that, in the presence of metal in CT scans, the MAR algorithm is capable of providing a more accurate CT number without compromising overall image quality. Future work will include the dosimetric impact of the MAR algorithm.
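The HU-accuracy check described here reduces to comparing the mean CT number of an ROI against a metal-free baseline. A minimal sketch; the ROI values are invented and only illustrate why a corrected reconstruction scores better than a streaked one:

```python
# Hypothetical check of HU restoration: compare the mean CT number of the
# same ROI in a metal-free baseline scan, a scan with metal artifacts,
# and the MAR-corrected reconstruction.
def mean_hu(roi):
    return sum(roi) / len(roi)

def hu_error(baseline_roi, test_roi):
    """Absolute difference in mean HU relative to the metal-free baseline."""
    return abs(mean_hu(test_roi) - mean_hu(baseline_roi))

baseline = [40, 42, 41, 39]          # invented soft-tissue ROI values
with_artifact = [120, 95, 130, 110]  # streak artifacts inflate the mean
with_mar = [45, 47, 44, 46]          # corrected values sit near baseline

print(hu_error(baseline, with_artifact), hu_error(baseline, with_mar))
```

Repeating this per insert location, as in the phantom study, shows whether the correction holds regardless of where the metal sits.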

  17. Evaluation of Software Quality to Improve Application Performance Using Mc Call Model

    Directory of Open Access Journals (Sweden)

    Inda D Lestantri

    2018-04-01

    The existence of software should have more value than its primary function of automation: it should improve the performance of the organization. Before being implemented in an operational environment, software must pass testing in stages to ensure that it functions properly, meets user needs, and is convenient to use. This test was performed on a web-based application, taking a test case from the e-SAP application. e-SAP is an application used to monitor teaching and learning activities at a university in Jakarta. To measure software quality, testing can be done on randomly selected users. The user sample selected in this test comprised users aged 18 to 25 years with an information technology background; the test was conducted on 30 respondents using the McCall model. The McCall testing model consists of 11 dimensions grouped into 3 categories. This paper describes testing with reference to the product operation category, which includes 5 dimensions: correctness, usability, efficiency, reliability, and integrity. The paper discusses testing on each dimension to measure software quality as an effort to improve performance. The result is that the e-SAP application has good quality, with a product operation value of 85.09%. This indicates that the e-SAP application has great quality, so it deserves to be examined in the next stage in the operational environment.
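An aggregate score like the 85.09% above is typically a weighted average over the per-dimension ratings. A hedged sketch; the dimension values and the equal weights below are invented, not the study's questionnaire data:

```python
# Sketch of a McCall-style product-operation score: each dimension is
# rated (e.g. from user questionnaires, as percentages) and the category
# score is their weighted average.
def mccall_score(ratings, weights=None):
    if weights is None:
        weights = {k: 1.0 for k in ratings}  # equal weights by default
    total = sum(weights.values())
    return sum(ratings[k] * weights[k] for k in ratings) / total

# Invented per-dimension ratings for the product operation category:
product_operation = {
    "correctness": 88.0,
    "usability": 84.0,
    "efficiency": 83.0,
    "reliability": 86.0,
    "integrity": 84.0,
}
print(mccall_score(product_operation))
```

Non-uniform weights let an evaluator emphasize, say, reliability over usability without changing the scoring code.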

  18. Preliminary evaluation of lung care software of 16-slice helical CT in the study of pulmonary nodules

    International Nuclear Information System (INIS)

    Song Wei; Jin Zhengyu; Yan Hongzhen; Wang Yun; Zhang Yunqing; Wang Linhui; Zhu Haifeng; Liang Jixiang; Qi Bing

    2005-01-01

    Objective: To evaluate the auxiliary diagnostic ability and applicability of the Lung Care software for the study of pulmonary nodules. Methods: Fifty-six patients underwent low-dose CT scan with 1.5 mm collimation, 4 mm reconstruction interval, and 4 mm reconstruction slice in group A, and with 1.5 mm collimation, 2 mm reconstruction interval, and 2 mm reconstruction slice in group B. 12 patients underwent low-dose CT with 0.75 mm collimation, 0.75 mm reconstruction interval, and 0.75 mm reconstruction slice in group C. The nodules detected in groups A, B, and C were analyzed by r-MPR or VOI of the Lung Care software to distinguish the true pulmonary nodules from the vessels. The volume and density distribution of the true pulmonary nodules in groups A, B, and C were measured with the Lung Care software. Results: It was difficult to observe diffuse pulmonary nodules by r-MPR or VOI of the Lung Care software. The images of each patient in group C were too many to be applied in the clinic. The observations of pulmonary nodules by r-MPR and VOI were statistically consistent, but the agreement was not good (Kappa = 0.369, P = 0.002). There was a statistically significant difference in showing faint nodules between r-MPR and VOI (P = 0.001); r-MPR was better than VOI. There was a statistically significant difference between groups A and B in showing (χ2 = 3.886, P = 0.045), but no statistically significant difference in showing 5-10 mm nodules (χ2 = 0.170, P = 0.680). The volume and density distribution of most 5-≤20 mm nodules were successfully measured with the Lung Care software, whereas those of most (χ2 = 5.811, P = 0.016) and 5-10 mm nodules (χ2 = 13.500, P 10-≤20 mm nodules (χ2 = 0.000, P = 1.000). Conclusion: For distinguishing the true pulmonary nodules from others, the Lung Care software is suitable for the well-edged pulmonary nodules and most faint nodules, but not suitable for nodules such as ground-glass opacity. For measuring the volume and

  19. Evaluation of the free, open source software WordPress as electronic portfolio system in undergraduate medical education.

    Science.gov (United States)

    Avila, Javier; Sostmann, Kai; Breckwoldt, Jan; Peters, Harm

    2016-06-03

    Electronic portfolios (ePortfolios) are used to document and support learning activities, and ePortfolios with mobile capabilities allow even more flexibility. However, the development or acquisition of ePortfolio software is often costly, and at the same time, commercially available systems may not sufficiently fit an institution's needs. The aim of this study was to design and evaluate an ePortfolio system with mobile capabilities using a free and open-source software solution. We created an online ePortfolio environment using the blogging software WordPress, selected on the basis of reported capability features of such software using a qualitative weight-and-sum method. Technical implementation and usability were evaluated by 25 medical students during their clinical training, by quantitative and qualitative means, using online questionnaires and focus groups. The WordPress ePortfolio environment allowed students a broad spectrum of activities, often documented via mobile devices: collection of multimedia evidence, posting reflections, messaging, web publishing, ePortfolio searches, collaborative learning, knowledge management in a content management system including a wiki and RSS feeds, and the use of aid tools for studying. The students' experience with WordPress revealed a few technical problems, and this report provides workarounds. The WordPress ePortfolio was rated positively by the students as a content management system (67 % of the students), for exchange with other students (74 %), as a note pad for reflections (53 %), and for its potential as an information source for assessment (48 %) and for exchange with a mentor (68 %). On the negative side, 74 % of the students in this pilot study did not find it easy to get started with the system, and 63 % rated the ePortfolio as not user-friendly. Qualitative analysis indicated a need for more introductory information and training. It is possible to build an advanced ePortfolio system with mobile

  20. Validation of acute physiologic and chronic health evaluation II scoring system software developed at The Aga Khan University, Pakistan.

    Science.gov (United States)

    Hashmi, M; Asghar, A; Shamim, F; Khan, F H

    2016-01-01

    To assess the predictive performance of Acute Physiologic and Chronic Health Evaluation II (APACHE II) software available on the hospital intranet, and to analyze the interrater reliability of calculating the APACHE II score by the gold-standard manual method versus automatically using the software. An expert scorer not involved in the data collection had calculated the APACHE II score of 213 patients admitted to the surgical Intensive Care Unit using the gold-standard manual method for a previous study performed in the department. The same data were entered into the computer software available on the hospital intranet (http://intranet/apacheii) to recalculate the APACHE II score automatically, along with the predicted mortality. The receiver operating characteristic (ROC) curve, the Hosmer-Lemeshow goodness-of-fit statistical test, and Pearson's correlation coefficient were computed. The 213 patients had an average APACHE II score of 17.20 ± 8.24, the overall mortality rate was 32.8%, and the standardized mortality ratio was 1.00. The area under the ROC curve, 0.827, was significantly greater than 0.5, and the Hosmer-Lemeshow goodness-of-fit test showed good calibration (H = 5.46, P = 0.71). Interrater reliability using Pearson's product-moment correlation demonstrated a strong positive relationship between the computer and the manual expert scorer (r = 0.98, P = 0.0005). The APACHE II software available on the hospital's intranet has satisfactory calibration and discrimination, and interrater reliability is good when compared with the gold-standard manual method.
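The two headline statistics in this validation, the area under the ROC curve (discrimination) and Pearson's r (interrater agreement), can both be sketched directly. The scores and outcome labels below are invented illustrative data, not the study's 213-patient dataset:

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney statistic: the
    probability that a randomly chosen death (label 1) received a higher
    predicted score than a randomly chosen survivor (label 0)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def pearson_r(a, b):
    """Pearson product-moment correlation between two raters' scores."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a)
    vb = sum((y - mb) ** 2 for y in b)
    return cov / (va * vb) ** 0.5

# Invented APACHE II scores and observed hospital mortality (1 = died):
print(auc([25, 30, 12, 8, 20, 15], [1, 1, 0, 0, 1, 0]))
```

An AUC near 0.5 would mean the score discriminates no better than chance; values like the study's 0.827 indicate useful discrimination.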