Directory of Open Access Journals (Sweden)
Douglas Walton
2015-12-01
This paper presents a formalization of informal logic using the Carneades Argumentation System (CAS), a formal, computational model of argument that consists of a formal model of argument graphs and audiences. Conflicts between pro and con arguments are resolved using proof standards, such as preponderance of the evidence. CAS also formalizes argumentation schemes, which can be used to check whether a given argument instantiates the types of argument deemed normatively appropriate for the type of dialogue.
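The resolution of pro and con arguments by a proof standard can be sketched in a few lines. This is a hypothetical illustration of the "preponderance of the evidence" standard described above, not the actual CAS implementation; the weights and the `preponderance` function are invented for the example.

```python
# Hypothetical sketch of resolving pro/con arguments with the
# "preponderance of the evidence" proof standard (not the CAS code).

def preponderance(pro_weights, con_weights):
    """A claim meets the standard if its strongest pro argument
    outweighs its strongest con argument (audience-assigned weights)."""
    best_pro = max(pro_weights, default=0.0)
    best_con = max(con_weights, default=0.0)
    return best_pro > best_con

# The audience assigns weights to the arguments for and against a claim:
accepted = preponderance(pro_weights=[0.6, 0.4], con_weights=[0.5])
print(accepted)  # True: strongest pro (0.6) outweighs strongest con (0.5)
```

Other proof standards (e.g. "beyond reasonable doubt") would replace the comparison with a stricter threshold.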
Criteria for logical formalization
Czech Academy of Sciences Publication Activity Database
Peregrin, Jaroslav; Svoboda, Vladimír
2013-01-01
Roč. 190, č. 14 (2013), s. 2897-2924 ISSN 0039-7857 R&D Projects: GA ČR(CZ) GAP401/10/1279 Institutional support: RVO:67985955 Keywords : logic * logical form * formalization * reflective equilibrium Subject RIV: AA - Philosophy ; Religion Impact factor: 0.637, year: 2013
Formalized Epistemology, Logic, and Grammar
Bitbol, Michel
The task of a formal epistemology is defined. It appears that a formal epistemology must be a generalization of "logic" in the sense of Wittgenstein's Tractatus. The generalization is required because, whereas logic presupposes a strict relation between activity and language, this relation may be broken in some domains of experimental enquiry (e.g., in microscopic physics). However, a formal epistemology should also retain a major feature of Wittgenstein's "logic": It must not be a discourse about scientific knowledge, but rather a way of making manifest the structures usually implicit in knowledge-gaining activity. This strategy is applied to the formalism of quantum mechanics.
Logicism, intuitionism, and formalism
Symons, John
2008-01-01
Aims to review the programmes in the foundations of mathematics from the classical period and to assess their possible relevance for contemporary philosophy of mathematics. This work is suitable for researchers and graduate students of philosophy, logic, mathematics and theoretical computer science.
Towards a Formal Occurrence Logic based on Predicate Logic
DEFF Research Database (Denmark)
Badie, Farshad; Götzsche, Hans
2015-01-01
In this discussion we concentrate on the main characteristics of an alternative kind of logic invented by Hans Götzsche: Occurrence Logic, which is not based on truth functionality. Our approach builds on the temporal logic developed and elaborated by A. N. Prior. We focus on characterising argumentation based on formal Occurrence Logic concerning events and occurrences, and illustrate the relations between Predicate Logic and Occurrence Logic. These relationships (and dependencies) are conducive to an approach that can analyse the occurrences of "logical statements based on different logical principles" at different moments. We also conclude that Götzsche's Occurrence Logic could direct us towards a truth-functionally independent computer-based logic for analysing argumentation based on events and occurrences.
Formalization of Many-Valued Logics
DEFF Research Database (Denmark)
Villadsen, Jørgen; Schlichtkrull, Anders
2017-01-01
Partiality is a key challenge for computational approaches to artificial intelligence in general and natural language in particular. Various extensions of classical two-valued logic to many-valued logics have been investigated in order to meet this challenge. We use the proof assistant Isabelle to formalize the syntax and semantics of many-valued logics with determinate as well as indeterminate truth values. The formalization allows for a concise presentation and makes automated verification possible.
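A three-valued logic in the strong Kleene style gives the flavour of the many-valued logics with an indeterminate truth value mentioned above. The encoding of the middle value as 0.5 is an assumption for this sketch, not the paper's notation.

```python
# Minimal sketch of a three-valued (strong Kleene) logic; 0.5 stands
# for an indeterminate truth value (assumed encoding, not the paper's).

T, U, F = 1.0, 0.5, 0.0  # true, undetermined, false

def neg(a):        # negation flips the degree; U stays indeterminate
    return 1.0 - a

def conj(a, b):    # conjunction takes the minimum degree
    return min(a, b)

def disj(a, b):    # disjunction takes the maximum degree
    return max(a, b)

# Partiality in action: 'p or not p' is not fully true when p is unknown.
print(disj(U, neg(U)))  # 0.5 — excluded middle fails for indeterminate p
```

The failure of excluded middle for indeterminate values is exactly the kind of meta-property that a proof assistant formalization can verify mechanically.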
Towards a formal logic of design rationalization
DEFF Research Database (Denmark)
Galle, Per
1997-01-01
Certain extensions to standard predicate logic are proposed and used as a framework for critical logical study of patterns of inference in design reasoning. It is shown that within this framework a modal logic of design rationalization (suggested by an empirical study reported earlier) can be formally defined in terms of quantification over a universe of discourse of ‘relevant points of view’. Five basic principles of the extended predicate logic are listed, on the basis of which the validity of ten modal patterns of inference encountered in design rationalization is tested. The basic idea...
A Formal Semantics for Concept Understanding relying on Description Logics
DEFF Research Database (Denmark)
Badie, Farshad
2017-01-01
In this research, Description Logics (DLs) will be employed for logical description, logical characterisation, logical modelling and ontological description of concept understanding in terminological systems. It is strongly believed that using a formal descriptive logic could support us in revealing logical assumptions whose discovery may lead us to a better understanding of ‘concept understanding’. The Structure of Observed Learning Outcomes (SOLO) model, as an appropriate model of the increasing complexity of human understanding, has supported the formal analysis.
Combining Formal Logic and Machine Learning for Sentiment Analysis
DEFF Research Database (Denmark)
Petersen, Niklas Christoffer; Villadsen, Jørgen
2014-01-01
This paper presents a formal logical method for deep structural analysis of the syntactical properties of texts using machine learning techniques for efficient syntactical tagging. To evaluate the method it is used for entity level sentiment analysis as an alternative to pure machine learning...
The formal logic of business rules
Directory of Open Access Journals (Sweden)
Ivana Rábová
2007-01-01
Identification of improvement areas and utilization of information and communication technologies have gained value and priority in our knowledge-driven society. Rules define the constraints, conditions and policies of how business processes are to be performed, but they also affect the behavior of resources and facilitate the achievement of strategic business goals. They control the business and represent business knowledge. Research on business rules shows how to specify and classify business rules from the business perspective and how to establish an approach to managing them that enables faster change in business processes and other business concepts in all areas of the business. Concretely, this paper deals with four approaches to business rule formalization, i.e. OCL notation, inference rules, decision tables and predicate logic, and with their general evaluation. The article also shows the advantages and disadvantages of each approach, illustrated with an example of every approach mentioned.
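Two of the formalization approaches listed above, decision tables and predicate-style rules, can be contrasted in a few lines. The discount policy below is invented for the illustration; it is not taken from the paper.

```python
# Hedged illustration: one business rule written as a decision table and,
# for one row, as a predicate-logic rule. The policy itself is invented.

# Decision table: (is_member, order_total >= 100) -> discount rate
DECISION_TABLE = {
    (True,  True):  0.15,
    (True,  False): 0.05,
    (False, True):  0.10,
    (False, False): 0.00,
}

def discount(is_member, order_total):
    """Look up the applicable rate from the decision table."""
    return DECISION_TABLE[(is_member, order_total >= 100)]

# Equivalent predicate-logic style rule for the first row:
#   member(c) ∧ total(c) ≥ 100  →  discount(c) = 0.15
print(discount(True, 120))  # 0.15
```

The decision table makes completeness easy to check (all four condition combinations are covered), while the predicate form is closer to OCL-style constraints.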
Formalizing a Paraconsistent Logic in the Isabelle Proof Assistant
DEFF Research Database (Denmark)
Villadsen, Jørgen; Schlichtkrull, Anders
2017-01-01
We present a formalization of a so-called paraconsistent logic that avoids the catastrophic explosiveness of inconsistency in classical logic. The paraconsistent logic has a countably infinite number of non-classical truth values. We show how to use the proof assistant Isabelle to formally prove theorems in the logic as well as meta-theorems about the logic. In particular, we formalize a meta-theorem that allows us to reduce the infinite number of truth values to a finite number of truth values, for a given formula, and we use this result in a formalization of a small case study.
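The non-explosiveness mentioned above can be shown with a single countermodel in a small many-valued logic. The sketch below uses a three-valued logic in the style of Priest's LP as a stand-in for the paper's infinitely-valued logic (an assumption, not the formalized system itself).

```python
# Toy countermodel to explosion in a three-valued paraconsistent logic
# (LP-style stand-in, not the paper's Isabelle formalization).
# 'B' (both true and false) is a designated value, so p ∧ ¬p can be
# designated without every proposition q being entailed.

T, B, F = 1.0, 0.5, 0.0
designated = {T, B}                 # values counting as "true enough"

def neg(a): return 1.0 - a
def conj(a, b): return min(a, b)

def entails(premise_vals, conclusion_val):
    """One valuation refutes entailment if the premises are designated
    but the conclusion is not."""
    return not (all(v in designated for v in premise_vals)
                and conclusion_val not in designated)

# p = B makes p ∧ ¬p designated, yet an unrelated false q is not entailed:
print(entails([conj(B, neg(B))], F))  # False — no explosion
```

A single such valuation suffices to block ex falso quodlibet, which is exactly the "catastrophic explosiveness" the logic is designed to avoid.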
Biological formal counterparts of logical machines
Energy Technology Data Exchange (ETDEWEB)
Moreno-diaz, R; Hernandez Guarch, F
1983-01-01
The significance of the McCulloch-Pitts formal neural net theory (1943) is still frequently misunderstood, and its basic units are wrongly considered to be factual models of neurons. As a consequence, the whole original theory and its later addenda are unreasonably criticized for their simplicity. But, as was proved then and since, the theory is in fact the modular neurophysiological counterpart of logical machines, so that it actually provides biologically plausible models for automata, Turing machines, etc., and not vice versa. In its true context, no theory has surpassed its proposals. In memory of McCulloch and Pitts, and for the sake of future theoretical research, the authors stress this important historical point, including also some recent results on the neurophysiological counterparts of modular arbitrary probabilistic automata. 16 references.
Proposal for the Formalization of Dialectical Logic
Directory of Open Access Journals (Sweden)
José Luis Usó-Doménech
2016-12-01
Classical logic is typically concerned with abstract analysis. The problem for a synthetic logic is to transcend and unify available data to reconstruct the object as a totality. Three rules are proposed to pass from classical logic to synthetic logic. We present the category logic of qualitative opposition using examples from various sciences. This logic has been defined to include the neuter as part of qualitative opposition. The application of these rules to qualitative opposition, and in particular its neuter, demonstrates that a synthetic logic allows the truth of some contradictions. This synthetic logic is dialectical, with a multi-valued logic that gives every proposition a truth value in the interval [0,1] equal to the square of the modulus of a complex number. In this dialectical logic, contradictions of the neuter of an opposition may be true.
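The truth-value assignment described above, a value in [0,1] obtained as the squared modulus of a complex number, can be computed directly. The amplitudes below are illustrative choices, not values from the paper.

```python
# Sketch of the paper's multi-valued assignment: a proposition's truth
# value is |z|**2 for a complex amplitude z with |z| <= 1. The specific
# amplitudes here are invented for the example.

def truth_value(z: complex) -> float:
    """Truth value as the squared modulus of a complex amplitude."""
    return abs(z) ** 2

p = truth_value(0.6 + 0.8j)   # 0.36 + 0.64 = 1.0: fully true
q = truth_value(0.5 + 0.5j)   # 0.5: a truth value strictly between 0 and 1
print(p, q)
```

Intermediate values like `q` are what allow some contradictions to come out true in the dialectical logic, since a proposition and its negation can both carry non-zero degree.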
Formalization of the Resolution Calculus for First-Order Logic
DEFF Research Database (Denmark)
Schlichtkrull, Anders
2016-01-01
A formalization in Isabelle/HOL of the resolution calculus for first-order logic is presented. Its soundness and completeness are formally proven using the substitution lemma, semantic trees, Herbrand’s theorem, and the lifting lemma. In contrast to previous formalizations of resolution...
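A single resolution step, the core inference rule of the calculus formalized above, fits in a few lines at the propositional level (first-order unification and lifting, which the paper handles, are omitted from this sketch).

```python
# Toy propositional resolution step, in the spirit of the resolution
# calculus (first-order lifting omitted). Clauses are frozensets of
# literals; a negative literal is the pair ('~', p).

def negate(lit):
    """Flip the polarity of a literal."""
    return lit[1] if lit[0] == '~' else ('~', lit)

def resolve(c1, c2):
    """All resolvents obtainable from two clauses."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return out

# Refuting {p}, {~p} derives the empty clause, i.e. unsatisfiability:
c1, c2 = frozenset({'p'}), frozenset({('~', 'p')})
print(resolve(c1, c2))  # [frozenset()] — the empty clause
```

Completeness, as proven in the formalization, says this derivation of the empty clause exists for every unsatisfiable clause set.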
TOWARDS A PHILOSOPHICAL UNDERSTANDING OF THE LOGICS OF FORMAL INCONSISTENCY
Directory of Open Access Journals (Sweden)
WALTER CARNIELLI
2015-01-01
In this paper we present a philosophical motivation for the logics of formal inconsistency, a family of paraconsistent logics whose distinctive feature is having resources for expressing the notion of consistency within the object language, in such a way that consistency may be logically independent of non-contradiction. We defend the view according to which logics of formal inconsistency may be interpreted as theories of logical consequence of an epistemological character. We also argue that in order to philosophically justify paraconsistency there is no need to endorse dialetheism, the thesis that there are true contradictions. Furthermore, we show that mbC, a logic of formal inconsistency based on classical logic, may be enhanced in order to express the basic ideas of an intuitive interpretation of contradictions as conflicting evidence.
International Nuclear Information System (INIS)
Adam, J.; Tsupko-Sitnikov, V.M.
1996-01-01
A new, rigorously substantiated approach to the construction of decay schemes on the basis of γ-γ coincidence data is described. Complete decay modes (concrete cascades of transitions from excited states to the ground state of a nucleus excited in a decay or a reaction) and continuity regions in complete modes (regions of successive transitions) are isolated by logical operations on the rows (columns) of a symmetrical coincidence matrix, where rows (columns) correspond to energies of coinciding transitions and matrix elements are ones and zeros, depending on the presence or absence of the given coincidence. To reject false complete modes and continuity regions arising from incompleteness of the coincidence data and errors in them, energy selection is introduced for complete modes and continuity regions, which demands that the total energy of their constituent transitions be equal to the total energy of some other complete modes and continuity regions and to the energies of singles. With the continuity regions found, it is possible to order transitions in the selected complete modes and to algorithmize the matching of complete modes into a decay scheme. 10 refs., 3 figs.
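The energy-selection criterion described above reduces to a simple sum check per candidate cascade. The energies and tolerance below are invented for the sketch; they are not data from the paper.

```python
# Sketch of the energy-selection test: a candidate complete mode is kept
# only if its transition energies sum to the expected total within a
# tolerance. All numeric values here are invented for illustration.

def energy_consistent(cascade_keV, total_keV, tol_keV=1.0):
    """Accept a candidate cascade iff its energies sum to the total."""
    return abs(sum(cascade_keV) - total_keV) <= tol_keV

# A hypothetical two-transition cascade feeding a 2085 keV level:
print(energy_consistent([847.0, 1238.0], 2085.0))  # True
```

In the full method this check is applied across all candidate complete modes and continuity regions isolated from the coincidence matrix, rejecting the false ones.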
A Survey of Formal Methods in Software Development
DEFF Research Database (Denmark)
Bjørner, Dines
2012-01-01
The use of formal methods and formal techniques in industry is steadily growing. In this survey we shall characterise what we mean by software development and by a formal method; briefly overview a history of formal specification languages - some of which are: VDM (Vienna Development Method, 1974...... need for multi-language formalisation (Petri Nets, MSC, StateChart, Temporal Logics); the sociology of university and industry acceptance of formal methods; the inevitability of the use of formal software development methods; while referring to seminal monographs and textbooks on formal methods....
A formalized design process for bacterial consortia that perform logic computing.
Directory of Open Access Journals (Sweden)
Weiyue Ji
The concept of microbial consortia is of great attractiveness in synthetic biology. Despite all its benefits, however, problems remain for large-scale multicellular gene circuits, for example, how to reliably design and distribute the circuits in microbial consortia with a limited number of well-behaved genetic modules and wiring quorum-sensing molecules. To manage this problem, here we propose a formalized design process: (i) determine the basic logic units (AND, OR and NOT gates) based on mathematical and biological considerations; (ii) establish rules to search for and distribute the simplest logic design; (iii) assemble the assigned basic logic units in each logic-operating cell; and (iv) fine-tune the circuiting interface between logic operators. We analyzed in silico gene circuits with inputs ranging from two to four, comparing our method with pre-existing ones. Results showed that this formalized design process is more feasible in terms of the number of cells required. Furthermore, as a proof of principle, an Escherichia coli consortium that performs the XOR function, a typical complex computing operation, was designed. The construction and characterization of logic operators is independent of "wiring" and provides predictive information for fine-tuning. This formalized design process provides guidance for the design of microbial consortia that perform distributed biological computation.
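The XOR decomposition into the basic AND/OR/NOT units can be written out directly. The two "cells" below are plain functions standing in for engineered strains, an assumption for illustration, not the paper's E. coli construct.

```python
# Illustrative decomposition of XOR into basic AND/OR/NOT logic units,
# distributed across two "cells" (plain functions standing in for the
# engineered strains; not the paper's actual construct).

def cell_1(a, b):          # computes a AND (NOT b)
    return a and not b

def cell_2(a, b):          # computes (NOT a) AND b
    return not a and b

def consortium_xor(a, b):  # OR-ing the two cells' outputs yields XOR
    return cell_1(a, b) or cell_2(a, b)

print([consortium_xor(a, b) for a in (False, True) for b in (False, True)])
# [False, True, True, False]
```

In the biological setting the final OR is realized by the "wiring" quorum-sensing molecules that carry each cell's output to a common readout.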
Remmel, Jeffrey; Shore, Richard; Sweedler, Moss; Progress in Computer Science and Applied Logic
1993-01-01
The twenty-six papers in this volume reflect the wide and still expanding range of Anil Nerode's work. A conference on Logical Methods was held in honor of Nerode's sixtieth birthday (4 June 1992) at the Mathematical Sciences Institute, Cornell University, 1-3 June 1992. Some of the conference papers are here, but others are from students, co-workers and other colleagues. The intention of the conference was to look forward, and to see the directions currently being pursued, in the development of work by, or with, Nerode. Here is a brief summary of the contents of this book. We give a retrospective view of Nerode's work. A number of specific areas are readily discerned: recursive equivalence types, recursive algebra and model theory, the theory of Turing degrees and r.e. sets, polynomial-time computability and computer science. Nerode began with automata theory and has also taken a keen interest in the history of mathematics. All these areas are represented. The one area missing is Nerode's applied mathematica...
Optimization methods for logical inference
Chandru, Vijay
2011-01-01
Merging logic and mathematics in deductive inference: an innovative, cutting-edge approach. Optimization methods for logical inference? Absolutely, say Vijay Chandru and John Hooker, two major contributors to this rapidly expanding field. And even though "solving logical inference problems with optimization methods may seem a bit like eating sauerkraut with chopsticks . . . it is the mathematical structure of a problem that determines whether an optimization model can help solve it, not the context in which the problem occurs." Presenting powerful, proven optimization techniques for logic in
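The book's central idea, treating logical inference as optimization, starts from encoding each CNF clause as a 0-1 linear inequality. The sketch below shows that encoding; brute-force enumeration stands in for an integer-programming solver (the clauses are invented for the example).

```python
# Minimal sketch of logic-as-optimization: each clause becomes a 0-1
# linear inequality 'sum >= 1', and satisfiability becomes feasibility
# of the integer program. Brute force stands in for an IP solver here.

from itertools import product

# CNF over x1..x3, literals as signed ints: (x1 or ~x2) and (~x1 or x3)
clauses = [[1, -2], [-1, 3]]

def clause_value(clause, assign):
    """Sum of x_i for positive literals and (1 - x_i) for negative ones;
    the clause holds for the assignment iff this sum is >= 1."""
    return sum(assign[l - 1] if l > 0 else 1 - assign[-l - 1] for l in clause)

def feasible(clauses, n):
    """Is there a 0-1 assignment satisfying every clause inequality?"""
    return any(all(clause_value(c, a) >= 1 for c in clauses)
               for a in product((0, 1), repeat=n))

print(feasible(clauses, 3))  # True — e.g. x1=1, x2=0, x3=1 satisfies both
```

A real solver exploits the linear structure (cutting planes, branch and bound) instead of enumerating assignments, which is precisely the book's subject.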
DEFF Research Database (Denmark)
Bjørner, Dines; Havelund, Klaus
2014-01-01
In this "40 years of formal methods" essay we shall first delineate, Sect. 1, what we mean by method, formal method, computer science, computing science, software engineering, and model-oriented and algebraic methods. Based on this, we shall characterize a spectrum from specification-oriented met...
Methods in Logic Based Control
DEFF Research Database (Denmark)
Christensen, Georg Kronborg
1999-01-01
Design and theory of Logic Based Control systems: Boolean algebra, Karnaugh maps, the Quine-McCluskey algorithm. Sequential control design. Logic Based Control Method, Cascade Control Method. Implementation techniques: relay, pneumatic, TTL/CMOS, PAL, and PLC and Soft-PLC implementation. PLC...
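The core step of the Quine-McCluskey minimization mentioned above is merging two implicants that differ in exactly one bit. A minimal sketch (prime-implicant selection and the full tabulation are omitted):

```python
# Sketch of the core Quine-McCluskey step: two implicants merge when
# they differ in exactly one bit position; the dash '-' marks the
# variable eliminated by the merge.

def merge(a, b):
    """Merge two implicant strings like '110' if they differ in one bit,
    else return None."""
    diff = [i for i in range(len(a)) if a[i] != b[i]]
    if len(diff) == 1:
        i = diff[0]
        return a[:i] + '-' + a[i + 1:]
    return None

# Minterms 6 (110) and 7 (111) of f(a,b,c) combine: c is eliminated.
print(merge('110', '111'))  # '11-'  i.e. the product term a·b
```

Repeating this pass until no more merges are possible yields the prime implicants, the same result one reads off a Karnaugh map by grouping adjacent cells.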
Formalization of software requirements for information systems using fuzzy logic
Yegorov, Y. S.; Milov, V. R.; Kvasov, A. S.; Sorokoumova, S. N.; Suvorova, O. V.
2018-05-01
The paper considers an approach to the design of information systems based on flexible software development methodologies. The possibility of improving the management of the life cycle of information systems by assessing the functional relationship between requirements and business objectives is described. An approach is proposed to establish the relationship between the degree of achievement of business objectives and the fulfillment of requirements for the projected information system. It describes solutions that allow one to formalize the process of forming functional and non-functional requirements with the help of the fuzzy logic apparatus. The form of the objective function is derived from expert knowledge and is refined via learning from a very small data set.
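The fuzzy formalization of a non-functional requirement can be sketched as a membership function mapping a measured quantity to a fulfillment degree in [0, 1]. The requirement and the breakpoints below are invented; in the paper's approach they would come from expert knowledge and be refined by learning.

```python
# Hedged illustration: the degree to which a non-functional requirement
# ("response time around 2 s or better") is fulfilled, as a fuzzy
# membership value in [0, 1]. Breakpoints are invented for the example.

def fulfillment(response_time_s):
    """Piecewise-linear membership: 1 below 1 s, 0 above 3 s."""
    if response_time_s <= 1.0:
        return 1.0
    if response_time_s >= 3.0:
        return 0.0
    return (3.0 - response_time_s) / 2.0  # linear ramp from 1 s to 3 s

# Aggregate achievement of a business objective as the minimum (fuzzy AND)
# over its requirements' fulfillment degrees:
print(min(fulfillment(0.8), fulfillment(2.0)))  # 0.5
```

The min-aggregation gives the "degree of achievement of business objectives" a concrete reading: an objective is only as fulfilled as its weakest requirement.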
A Temporal Fuzzy Logic Formalism for Knowledge Based Systems
Directory of Open Access Journals (Sweden)
Vasile MAZILESCU
2012-11-01
This paper shows that the influence of knowledge on new forms of work organisation can be described in terms of mutual relationships. Different changes in work organisation also have a strong influence on the increasing importance of the knowledge of different individual and collective actors in working situations. After that, we characterize a basic formal system, an Extended Fuzzy Logic System (EFLS) with temporal attributes, to conceptualize future DKMSs based on imprecise human knowledge for distributed just-in-time decisions. Approximate reasoning is perceived as a derivation of new formulas with the corresponding temporal attributes, within a fuzzy theory defined by the fuzzy set of special axioms. In a management application, the reasoning is evolutionary because of unexpected events which may change the state of the DKMS. In such situations it is necessary to elaborate certain mechanisms in order to maintain the coherence of the obtained conclusions, to figure out their degree of reliability and the time domain for which they are true. These last aspects stand as possible further directions of development, at a basic logic level, for future technologies that must automate knowledge organizational processes.
Industrial use of formal methods formal verification
Boulanger, Jean-Louis
2012-01-01
At present the literature offers students and researchers only very general books on formal techniques. The purpose of this book is to present, in a single volume, a report of experience on the use of formal techniques (such as proof and model checking) on industrial examples from the transportation domain. The book is based on the experience of people who are fully involved in the realization and evaluation of safety-critical software-based systems. The involvement of industrial practitioners makes it possible to raise the problems of confidentiality which could appear and so allow
Thinking skills in the context of Formal Logic, Informal Logic and Critical Thinking
Directory of Open Access Journals (Sweden)
Pieter van Veuren
1995-03-01
The aim of this essay is to explore the concept of thinking skills in three different contexts, i.e. Formal Logic, Informal Logic and Critical Thinking. The essay traces some contemporary historical connections between these approaches and illustrates the differences and overlap between them by referring to the contents pages of textbooks representative of the different approaches. In evaluating the historical developments sketched in the essay, the conclusion is reached that the open and pragmatic way in which Critical Thinking handles the topic of thinking skills has advantages for interdisciplinary contact and cooperation. However, this pragmatic approach also has a possible downside: the concept of thinking skills can become so vague as to be of no use.
The Logic Process Formalism of the Informational Domain
Directory of Open Access Journals (Sweden)
2007-01-01
The performance of present-day information technologies has two main properties: the universality of the structures used and the flexibility of the final user's interfaces. The first determines the potential coverage of the informational domain. The second determines the diversity and efficiency of the processing methods of the procedures being automated. These aspects are of great importance in agriculture and ecology because both fields involve complex processes and considerable volumes of information. For example, meteorological processes are part of the ecological ones, as existential conditions of habitats, and are known to pose a complex prognostic problem, requiring considerable computational resources to solve the appropriate equations. Likewise, agriculture, as a controlled activity under strong impact from natural conditions, has the same high requirements for diverse structures and flexibility of information processing.
Formal Methods: Practice and Experience
DEFF Research Database (Denmark)
Woodcock, Jim; Larsen, Peter Gorm; Bicarregui, Juan
2009-01-01
Based on this, we discuss the issues surrounding the industrial adoption of formal methods. Finally, we look to the future and describe the development of a Verified Software Repository, part of the worldwide Verified Software Initiative. We introduce the initial projects being used to populate the repository, and describe the challenges they address. © 2009 ACM. (146 refs.)
Formalization of the Resolution Calculus for First-Order Logic
DEFF Research Database (Denmark)
Schlichtkrull, Anders
2018-01-01
between unsatisfiable sets of clauses and finite semantic trees is formalized in Herbrand’s theorem. I discuss the difficulties that I had formalizing proofs of the lifting lemma found in the literature, and I formalize a correct proof. The completeness proof is by induction on the size of a finite...
Taming Living Logic using Formal Methods
DEFF Research Database (Denmark)
Baig, Hasan; Madsen, Jan
2017-01-01
One of the goals of synthetic biology is to build genetic circuits to control the behavior of a cell for different application domains, such as medical, environmental, and biotech. During the design process of genetic circuits, biologists are often interested in the probability of a system to work...
Formal Verification of Digital Protection Logic and Automatic Testing Software
Energy Technology Data Exchange (ETDEWEB)
Cha, S. D.; Ha, J. S.; Seo, J. S. [KAIST, Daejeon (Korea, Republic of)
2008-06-15
- Technical aspect: It is intended that digital I and C software be safe and reliable. The project results help such software to acquire a license. The software verification techniques resulting from this project can be used for digital NPPs (nuclear power plants) in the future. This research introduces many meaningful verification results on digital protection logic and suggests an I and C software testing strategy. These results apply to the verification of nuclear fusion devices, accelerators, nuclear waste management and nuclear medical devices that require dependable software and highly reliable controllers. Moreover, they can be used for military, medical or aerospace-related software. - Economical and industrial aspect: Since the safety of digital I and C software is highly important, it is essential for the software to be verified. But verification and license acquisition for digital I and C software are costly. This project benefits the domestic economy by using the introduced verification and testing techniques instead of foreign ones. The operation rate of NPPs will rise when NPP safety-critical software is verified with an intelligent V and V tool. It is expected that this software will substitute for safety-critical software that currently depends wholly on foreign suppliers. Consequently, the results of this project have high commercial value, and recognition of the software development work can spread to industrial circles. - Social and cultural aspect: People expect nuclear power generation to contribute to relieving environmental problems because it does not emit more harmful air pollution than other forms of power generation. To give society more trust in and expectation of nuclear power generation, we should convince people that NPPs are highly safe systems. From that point of view, we can present highly reliable I and C systems, proofed by intelligent V and V techniques, as evidence
Meaning and proscription in formal logic variations on the propositional logic of William T. Parry
Ferguson, Thomas Macaulay
2017-01-01
This book aids in the rehabilitation of the wrongfully deprecated work of William Parry, and is the only full-length investigation into Parry-type propositional logics. A central tenet of the monograph is that the sheer diversity of the contexts in which the mereological analogy emerges – its effervescence with respect to fields ranging from metaphysics to computer programming – provides compelling evidence that the study of logics of analytic implication can be instrumental in identifying connections between topics that would otherwise remain hidden. More concretely, the book identifies and discusses a host of cases in which analytic implication can play an important role in revealing distinct problems to be facets of a larger, cross-disciplinary problem. It introduces an element of constancy and cohesion that has previously been absent in a regrettably fractured field, shoring up those who are sympathetic to the worth of mereological analogy. Moreover, it generates new interest in the field by illustrat...
Automatic Testing with Formal Methods
Tretmans, G.J.; Belinfante, Axel
1999-01-01
The use of formal system specifications makes it possible to automate the derivation of test cases from specifications. This allows to automate the whole testing process, not only the test execution part of it. This paper presents the state of the art and future perspectives in testing based on
Kumar, R. Renjith
2017-01-01
The study of formal logic helps to improve the process of thinking and tries to refine and improve the thinking ability. The objectives of this study are to know the effectiveness of formal logic course and to determine the critical thinking variables that are effective and that are ineffective. A sample of 214 students is selected from all the…
Formal Methods for Life-Critical Software
Butler, Ricky W.; Johnson, Sally C.
1993-01-01
The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.
Industrial Practice in Formal Methods : A Review
DEFF Research Database (Denmark)
Bicarregui, Juan C.; Fitzgerald, John; Larsen, Peter Gorm
2009-01-01
We examine the industrial application of formal methods using data gathered in a review of 62 projects taking place over the last 25 years. The review suggests that formal methods are being applied in a wide range of application domains, with increasingly strong tool support. Significant challenges remain in providing usable tools that can be integrated into established development processes; in education and training; in taking formal methods from first use to second use; and in gathering evidence to support informed selection of methods and tools.
Formal methods for discrete-time dynamical systems
Belta, Calin; Aydin Gol, Ebru
2017-01-01
This book bridges fundamental gaps between control theory and formal methods. Although it focuses on discrete-time linear and piecewise affine systems, it also provides general frameworks for abstraction, analysis, and control of more general models. The book is self-contained, and while some mathematical knowledge is necessary, readers are not expected to have a background in formal methods or control theory. It rigorously defines concepts from formal methods, such as transition systems, temporal logics, model checking and synthesis. It then links these to the infinite state dynamical systems through abstractions that are intuitive and only require basic convex-analysis and control-theory terminology, which is provided in the appendix. Several examples and illustrations help readers understand and visualize the concepts introduced throughout the book.
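The book's basic objects, finite transition systems and temporal-logic properties checked over them, can be illustrated with a tiny reachability-based safety check. The system below is invented for the example; it is not from the book.

```python
# A tiny finite transition system and a safety check by reachability,
# illustrating the model-checking concepts the book defines rigorously.
# The system itself is invented for the example.

transitions = {            # state -> list of successor states
    's0': ['s1'],
    's1': ['s0', 's2'],
    's2': ['s2'],
}
bad = {'s3'}               # states violating a safety property

def reachable(init, transitions):
    """All states reachable from the initial state (depth-first search)."""
    seen, frontier = set(), [init]
    while frontier:
        s = frontier.pop()
        if s not in seen:
            seen.add(s)
            frontier.extend(transitions.get(s, []))
    return seen

# A safety invariant (AG !bad in CTL terms) holds iff no bad state is reachable:
print(reachable('s0', transitions).isdisjoint(bad))  # True
```

Abstraction, as used in the book, replaces the (possibly infinite) state space of a dynamical system with a finite transition system like this one, on which such checks become decidable.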
Application of Formal Methods in Software Engineering
Directory of Open Access Journals (Sweden)
Adriana Morales
2011-12-01
Full Text Available The purpose of this research work is to examine: (1 why are necessary the formal methods for software systems today, (2 high integrity systems through the methodology C-by-C –Correctness-by-Construction–, and (3 an affordable methodology to apply formal methods in software engineering. The research process included reviews of the literature through Internet, in publications and presentations in events. Among the Research results found that: (1 there is increasing the dependence that the nations have, the companies and people of software systems, (2 there is growing demand for software Engineering to increase social trust in the software systems, (3 exist methodologies, as C-by-C, that can provide that level of trust, (4 Formal Methods constitute a principle of computer science that can be applied software engineering to perform reliable process in software development, (5 software users have the responsibility to demand reliable software products, and (6 software engineers have the responsibility to develop reliable software products. Furthermore, it is concluded that: (1 it takes more research to identify and analyze other methodologies and tools that provide process to apply the Formal Software Engineering methods, (2 Formal Methods provide an unprecedented ability to increase the trust in the exactitude of the software products and (3 by development of new methodologies and tools is being achieved costs are not more a disadvantage for application of formal methods.
Applied Formal Methods for Elections
DEFF Research Database (Denmark)
Wang, Jian
development time, or second dynamically, i.e. monitoring while an implementation is used during an election, or after the election is over, for forensic analysis. This thesis contains two chapters on this subject: the chapter Analyzing Implementations of Election Technologies describes a technique...... process. The chapter Measuring Voter Lines describes an automated data collection method for measuring voters' waiting time, and discusses statistical models designed to provide an understanding of the voter behavior in polling stations....
Applied Formal Methods for Elections
DEFF Research Database (Denmark)
Wang, Jian
Information technology is changing the way elections are organized. Technology renders the electoral process more efficient, but things can also go wrong: voting software is complex, consisting of thousands of lines of code, which makes it error-prone. Technical problems may cause delays...... bounded model-checking and satisfiability modulo theories (SMT) solvers can be used to check these criteria. Voter Experience: Technology profoundly affects the voter experience. These effects need to be measured and the data should be used to make decisions regarding the implementation of the electoral...... at polling stations, or even delay the announcement of the final result. This thesis describes a set of methods to be used, for example, by system developers, administrators, or decision makers to examine election technologies, social choice algorithms and voter experience. Technology: Verifiability refers...
Protocol design and implementation using formal methods
van Sinderen, Marten J.; Ferreira Pires, Luis; Pires, L.F.; Vissers, C.A.
1992-01-01
This paper reports on a number of formal methods that support correct protocol design and implementation. These methods are placed in the framework of a design methodology for distributed systems that was studied and developed within the ESPRIT II Lotosphere project (2304). The paper focuses on
What makes industries believe in formal methods
Vissers, C.A.; van Sinderen, Marten J.; Ferreira Pires, Luis; Pires, L.F.; Danthine, A.S.; Leduc, G.; Wolper, P.
1993-01-01
The introduction of formal methods in the design and development departments of an industrial company has far reaching and long lasting consequences. In fact it changes the whole environment of methods, tools and skills that determine the design culture of that company. A decision to replace current
FORMAL METHOD TO IMPLEMENT FUZZY REQUIREMENTS
Directory of Open Access Journals (Sweden)
MARLENE GONCALVES
2012-01-01
Full Text Available ABSTRACT: Many user requirements involve preference criteria expressed in natural language through fuzzy terms; these are called fuzzy requirements. Database query languages, in turn, have been extended with fuzzy logic to handle user preferences. Few of the known methodologies for developing database applications consider fuzzy queries. This work proposes a method for database applications whose goal is to develop software systems that support fuzzy queries. Its novelty is an extension of the tuple calculus for the formal specification of fuzzy queries. The method also includes rules for translating a formal specification into a query in SQLf (structured query language + fuzzy logic), a fuzzy query language over crisp databases. Its usefulness is illustrated through application to a real case study.
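To illustrate the fuzzy-query idea (this is not the paper's method or SQLf syntax), here is a hypothetical sketch in which a crisp SQL filter is replaced by a fuzzy membership degree with a satisfaction threshold; the table, names, and membership function are invented.

```python
# Fuzzy predicate "cheap": degree 1 below `lo`, 0 above `hi`, linear between.
def cheap(price, lo=50.0, hi=150.0):
    if price <= lo:
        return 1.0
    if price >= hi:
        return 0.0
    return (hi - price) / (hi - lo)

hotels = [("H1", 40.0), ("H2", 100.0), ("H3", 200.0)]

# Roughly: SELECT name FROM hotels WHERE price IS cheap, keeping rows whose
# membership degree meets a calibration threshold of 0.4 (SQLf-like spirit).
result = [(name, cheap(price)) for name, price in hotels if cheap(price) >= 0.4]
assert result == [("H1", 1.0), ("H2", 0.5)]
```

The crisp query either keeps or drops a row; the fuzzy version ranks rows by how well they satisfy the preference, which is what the tuple-calculus extension formalizes.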
Rethinking logic logic in relation to mathematics, evolution, and method
Cellucci, Carlo
2014-01-01
This book examines the limitations of mathematical logic and proposes a new approach intended to overcome them. It formulates new rules of discovery, such as induction, analogy, generalization, specialization, metaphor, metonymy, definition and diagrams.
On the Formal-Logical Analysis of the Foundations of Mathematics Applied to Problems in Physics
Kalanov, Temur Z.
2016-03-01
An analysis of the foundations of mathematics as applied to problems in physics is proposed. The unity of formal logic and rational dialectics is the methodological basis of the analysis. It is shown that a critical analysis of the concept of mathematical quantity - the central concept of mathematics - leads to the following conclusions: (1) The concept of ``mathematical quantity'' is the result of the following mental operations: (a) abstraction of the ``quantitative determinacy of physical quantity'' from the ``physical quantity'', whereby the ``quantitative determinacy of physical quantity'' becomes an independent object of thought; (b) abstraction of the ``amount (i.e., abstract number)'' from the ``quantitative determinacy of physical quantity'', whereby the ``amount (i.e., abstract number)'' becomes an independent object of thought. In this case, unnamed, abstract numbers are the only sign of the ``mathematical quantity''. This sign is not an essential sign of material objects. (2) The concept of mathematical quantity is a meaningless, erroneous, and inadmissible concept in science because it represents the following formal-logical and dialectical-materialistic error: negation of the existence of the essential sign of the concept (i.e., negation of the existence of the essence of the concept) and negation of the existence of a measure of the material object.
A brief overview of NASA Langley's research program in formal methods
1992-01-01
An overview of NASA Langley's research program in formal methods is presented. The major goal of this work is to bring formal methods technology to a sufficiently mature level for use by the United States aerospace industry. Towards this goal, work is underway to design and formally verify a fault-tolerant computing platform suitable for advanced flight control applications. Also, several direct technology transfer efforts have been initiated that apply formal methods to critical subsystems of real aerospace computer systems. The research team consists of six NASA civil servants and contractors from Boeing Military Aircraft Company, Computational Logic Inc., Odyssey Research Associates, SRI International, University of California at Davis, and Vigyan Inc.
On the Need for Practical Formal Methods
1998-01-01
additional research and engineering that is needed to make the current set of formal methods more practical. To illustrate the ideas, I present several exam ...either a good violin or a highly talented violinist. Light-weight techniques offer software developers good violins. A user need not be a talented
Formal Method of Description Supporting Portfolio Assessment
Morimoto, Yasuhiko; Ueno, Maomi; Kikukawa, Isao; Yokoyama, Setsuo; Miyadera, Youzou
2006-01-01
Teachers need to assess learner portfolios in the field of education. However, they need support in designing and deciding what kinds of portfolios are to be assessed. To solve this problem, a formal method is proposed for describing the relations between lesson forms and the portfolios that need to be collected, and the relations between…
A Formal Methods Approach to the Analysis of Mode Confusion
Butler, Ricky W.; Miller, Steven P.; Potts, James N.; Carreno, Victor A.
2004-01-01
The goal of the new NASA Aviation Safety Program (AvSP) is to reduce the civil aviation fatal accident rate by 80% in ten years and 90% in twenty years. This program is being driven by the accident data with a focus on the most recent history. Pilot error is the most commonly cited cause for fatal accidents (up to 70%) and obviously must be given major consideration in this program. While the greatest source of pilot error is the loss of situation awareness, mode confusion is increasingly becoming a major contributor as well. The January 30, 1995 issue of Aviation Week lists 184 incidents and accidents involving mode awareness, including the Bangalore A320 crash 2/14/90, the Strasbourg A320 crash 1/20/92, the Mulhouse-Habsheim A320 crash 6/26/88, and the Toulouse A330 crash 6/30/94. These incidents and accidents reveal that pilots sometimes become confused about what the cockpit automation is doing. Consequently, human factors research is an obvious investment area. However, even a cursory look at the accident data reveals that the mode confusion problem is much deeper than just training deficiencies and a lack of human-oriented design. This is readily acknowledged by human factors experts. It seems that further progress in human factors must come through a deeper scrutiny of the internals of the automation. It is in this arena that formal methods can contribute. Formal methods refers to the use of techniques from logic and discrete mathematics in the specification, design, and verification of computer systems, both hardware and software. The fundamental goal of formal methods is to capture requirements, designs and implementations in a mathematically based model that can be analyzed in a rigorous manner. Research in formal methods is aimed at automating this analysis as much as possible. By capturing the internal behavior of a flight deck in a rigorous and detailed formal model, the dark corners of a design can be analyzed. This paper will explore how formal
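The formal-model idea behind mode-confusion analysis can be shown in miniature: encode the mode logic as a finite state machine and inspect its transitions for under-annunciated mode changes. The modes, events, and transition table below are invented for illustration, not taken from any real autopilot.

```python
# Toy flight-deck mode logic as a transition table; unlisted (mode, event)
# pairs leave the mode unchanged.
def step(mode, event):
    table = {
        ("HOLD", "arm"): "CLIMB",
        ("CLIMB", "near_target"): "CAPTURE",
        ("CAPTURE", "new_target"): "CLIMB",   # silent reversion to CLIMB
    }
    return table.get((mode, event), mode)

# The kind of fact a formal analysis can surface: entering a new target while
# in CAPTURE silently re-arms CLIMB, a mode change a pilot might not expect.
assert step("CAPTURE", "new_target") == "CLIMB"
assert step("HOLD", "near_target") == "HOLD"   # event ignored in HOLD
```

A model checker would explore all (mode, event) pairs exhaustively and report any transition violating a stated annunciation property, rather than relying on the two spot checks shown here.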
DEFF Research Database (Denmark)
Nilsson, Jørgen Fischer
A Gentle introduction to logical languages, logical modeling, formal reasoning and computational logic for computer science and software engineering students.
Preliminary Proceedings First International Workshop on Formal Methods for WirelessSystems
DEFF Research Database (Denmark)
The FMWS workshops aim at bringing together researchers interested in formal methods for wireless systems, more specifically in theories of semantics, logics, and verification techniques for wireless systems. Wireless systems are rapidly gaining success in real-world applications, while...
Versatile Formal Methods Applied to Quantum Information.
Energy Technology Data Exchange (ETDEWEB)
Witzel, Wayne [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Rudinger, Kenneth Michael [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Sarovar, Mohan [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)
2015-11-01
Using a novel formal methods approach, we have generated computer-verified proofs of major theorems pertinent to the quantum phase estimation algorithm. This was accomplished using our Prove-It software package in Python. While many formal methods tools are available, their practical utility is limited. Translating a problem of interest into these systems and working through the steps of a proof is an art form that requires much expertise. One must surrender to the preferences and restrictions of the tool regarding how mathematical notions are expressed and what deductions are allowed. Automation is a major driver that forces restrictions. Our focus, on the other hand, is to produce a tool that allows users to confirm proofs that are essentially known already. This goal is valuable in itself. We demonstrate the viability of our approach, which allows the user great flexibility in expressing statements and composing derivations. There were no major obstacles in following a textbook proof of the quantum phase estimation algorithm. There were tedious details of algebraic manipulations that we needed to implement (and a few that we did not have time to enter into our system) and some basic components that we needed to rethink, but there were no serious roadblocks. In the process, we made a number of convenient additions to our Prove-It package that will make certain algebraic manipulations easier to perform in the future. In fact, our intent is for our system to build upon itself in this manner.
The HACMS program: using formal methods to eliminate exploitable bugs.
Fisher, Kathleen; Launchbury, John; Richards, Raymond
2017-10-13
For decades, formal methods have offered the promise of verified software that does not have exploitable bugs. Until recently, however, it has not been possible to verify software of sufficient complexity to be useful. Recently, that situation has changed. SeL4 is an open-source operating system microkernel efficient enough to be used in a wide range of practical applications. Its designers proved it to be fully functionally correct, ensuring the absence of buffer overflows, null pointer exceptions, use-after-free errors, etc., and guaranteeing integrity and confidentiality. The CompCert Verifying C Compiler maps source C programs to provably equivalent assembly language, ensuring the absence of exploitable bugs in the compiler. A number of factors have enabled this revolution, including faster processors, increased automation, more extensive infrastructure, specialized logics and the decision to co-develop code and correctness proofs rather than verify existing artefacts. In this paper, we explore the promise and limitations of current formal-methods techniques. We discuss these issues in the context of DARPA's HACMS program, which had as its goal the creation of high-assurance software for vehicles, including quadcopters, helicopters and automobiles. This article is part of the themed issue 'Verified trustworthy software systems'. © 2017 The Authors.
The HACMS program: using formal methods to eliminate exploitable bugs
Launchbury, John; Richards, Raymond
2017-01-01
For decades, formal methods have offered the promise of verified software that does not have exploitable bugs. Until recently, however, it has not been possible to verify software of sufficient complexity to be useful. Recently, that situation has changed. SeL4 is an open-source operating system microkernel efficient enough to be used in a wide range of practical applications. Its designers proved it to be fully functionally correct, ensuring the absence of buffer overflows, null pointer exceptions, use-after-free errors, etc., and guaranteeing integrity and confidentiality. The CompCert Verifying C Compiler maps source C programs to provably equivalent assembly language, ensuring the absence of exploitable bugs in the compiler. A number of factors have enabled this revolution, including faster processors, increased automation, more extensive infrastructure, specialized logics and the decision to co-develop code and correctness proofs rather than verify existing artefacts. In this paper, we explore the promise and limitations of current formal-methods techniques. We discuss these issues in the context of DARPA’s HACMS program, which had as its goal the creation of high-assurance software for vehicles, including quadcopters, helicopters and automobiles. This article is part of the themed issue ‘Verified trustworthy software systems’. PMID:28871050
A Formal Verification Method of Function Block Diagram
International Nuclear Information System (INIS)
Koh, Kwang Yong; Seong, Poong Hyun; Jee, Eun Kyoung; Jeon, Seung Jae; Park, Gee Yong; Kwon, Kee Choon
2007-01-01
Programmable Logic Controller (PLC), an industrial computer specialized for real-time applications, is widely used in diverse control systems in chemical processing plants, nuclear power plants and traffic control systems. As a PLC is often used to implement safety-critical embedded software, rigorous safety demonstration of PLC code is necessary. Function block diagram (FBD) is a standard application programming language for the PLC and is currently being used in the development of a fully digitalized reactor protection system (RPS), called the IDiPS, under the KNICS project. The verification of FBD programs is therefore a pressing problem of great importance. In this paper, we propose a formal verification method for FBD programs: we define FBD programs formally in compliance with IEC 61131-3, translate the programs into a Verilog model, and finally verify the model using the model checker SMV. To demonstrate the feasibility and effectiveness of this approach, we applied it to the IDiPS, which is currently being developed under the KNICS project. The remainder of this paper is organized as follows. Section 2 briefly describes Verilog and Cadence SMV. In Section 3, we introduce FBD2V, a tool implemented to support the proposed FBD verification framework. A summary and conclusion are provided in Section 4.
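The flavor of such verification can be conveyed with a toy example. The two-out-of-three voting block below is a common FBD pattern in protection logic (illustrative only, not taken from the IDiPS); checking its trip property by exhaustive enumeration of inputs is model checking in miniature, which is what a tool like SMV does symbolically at scale.

```python
from itertools import product

# A 2-out-of-3 voting function block: trip when at least two sensors trip.
def vote_2oo3(a, b, c):
    return (a and b) or (a and c) or (b and c)

# Exhaustively verify the safety property over all 8 input combinations:
# two or more trip signals must trip the output; one or fewer must not.
for a, b, c in product([False, True], repeat=3):
    if sum([a, b, c]) >= 2:
        assert vote_2oo3(a, b, c)
    else:
        assert not vote_2oo3(a, b, c)
```

For real FBD programs the state space includes timers and memory, so exhaustive enumeration gives way to symbolic model checking, but the property-checking idea is the same.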
Fuzzy logic control to be conventional method
Energy Technology Data Exchange (ETDEWEB)
Eker, Ilyas [University of Gaziantep, Gaziantep (Turkey). Department of Electrical and Electronic Engineering; Torun, Yunis [University of Gaziantep, Gaziantep (Turkey). Technical Vocational School of Higher Education
2006-03-01
Increasing demands for flexibility and fast reactions in modern process operation and production methods result in nonlinear system behaviour of partly unknown systems, and this necessitates application of alternative control methods to meet the demands. Fuzzy logic (FL) control can play an important role because knowledge based design rules can easily be implemented in systems with unknown structure, and it is going to be a conventional control method since the control design strategy is simple and practical and is based on linguistic information. Computational complexity is not a limitation any more because the computing power of computers has been significantly improved even for high speed industrial applications. This makes FL control an important alternative method to the conventional PID control method for use in nonlinear industrial systems. This paper presents a practical implementation of the FL control to an electrical drive system. Such drive systems used in industry are composed of masses moving under the action of position and velocity dependent forces. These forces exhibit nonlinear behaviour. For a multi-mass drive system, the nonlinearities, like Coulomb friction and dead zone, significantly influence the operation of the systems. The proposed FL control configuration is based on speed error and change of speed error. The feasibility and effectiveness of the control method are experimentally demonstrated. The results obtained from conventional FL control, fuzzy PID and adaptive FL control are compared with traditional PID control for the dynamic responses of the closed loop drive system. (author)
Fuzzy logic control to be conventional method
International Nuclear Information System (INIS)
Eker, Ilyas; Torun, Yunis
2006-01-01
Increasing demands for flexibility and fast reactions in modern process operation and production methods result in nonlinear system behaviour of partly unknown systems, and this necessitates application of alternative control methods to meet the demands. Fuzzy logic (FL) control can play an important role because knowledge based design rules can easily be implemented in systems with unknown structure, and it is going to be a conventional control method since the control design strategy is simple and practical and is based on linguistic information. Computational complexity is not a limitation any more because the computing power of computers has been significantly improved even for high speed industrial applications. This makes FL control an important alternative method to the conventional PID control method for use in nonlinear industrial systems. This paper presents a practical implementation of the FL control to an electrical drive system. Such drive systems used in industry are composed of masses moving under the action of position and velocity dependent forces. These forces exhibit nonlinear behaviour. For a multi-mass drive system, the nonlinearities, like Coulomb friction and dead zone, significantly influence the operation of the systems. The proposed FL control configuration is based on speed error and change of speed error. The feasibility and effectiveness of the control method are experimentally demonstrated. The results obtained from conventional FL control, fuzzy PID and adaptive FL control are compared with traditional PID control for the dynamic responses of the closed loop drive system
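The control configuration described above, based on speed error and change of speed error, can be sketched as a minimal Mamdani-style fuzzy controller. The membership functions, rule table, and output singletons below are illustrative assumptions, not the authors' tuned values.

```python
# Triangular membership function peaking at b, zero outside (a, c).
def tri(x, a, b, c):
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_control(error, d_error):
    # Fuzzify both inputs into Negative / Zero / Positive over [-1, 1].
    sets = {"N": (-2, -1, 0), "Z": (-1, 0, 1), "P": (0, 1, 2)}
    # Rule table: (error label, d_error label) -> crisp output singleton.
    rules = {("N", "N"): -1.0, ("N", "Z"): -0.5, ("N", "P"): 0.0,
             ("Z", "N"): -0.5, ("Z", "Z"):  0.0, ("Z", "P"): 0.5,
             ("P", "N"):  0.0, ("P", "Z"):  0.5, ("P", "P"): 1.0}
    num = den = 0.0
    for (le, ld), out in rules.items():
        w = min(tri(error, *sets[le]), tri(d_error, *sets[ld]))  # rule firing
        num += w * out
        den += w
    return num / den if den else 0.0   # weighted-average defuzzification

assert fuzzy_control(0.0, 0.0) == 0.0   # no error -> no corrective action
assert fuzzy_control(1.0, 0.0) > 0.0    # positive error -> positive action
```

The knowledge-based appeal the abstract mentions is visible here: the whole control law lives in a small linguistic rule table rather than in a tuned transfer function.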
On Fitting a Formal Method into Practice
DEFF Research Database (Denmark)
Gmehlich, Rainer; Grau, Katrin; Hallerstede, Stefan
2011-01-01
. The interaction between the two proved to be crucial for the success of the case study. The heart of the problem was tracing informal requirements from Problem Frames descriptions to formal Event-B models. To a large degree, this issue dictated the approach that had to be used for formal modelling. A dedicated...
Eriksson Lundström, Jenny S. Z.
2009-01-01
Argumentation is a highly dynamical and dialectical process drawing on human cognition. Successful argumentation is ubiquitous to human interaction. Comprehensive formal modeling and analysis of argumentation presupposes a dynamical approach to the following phenomena: the deductive logic notion, the dialectical notion and the cognitive notion of justified belief. For each step of an argumentation these phenomena form networks of rules which determine the propositions to be allowed to make se...
Formal methods for industrial critical systems a survey of applications
Margaria-Steffen, Tiziana
2012-01-01
"Today, formal methods are widely recognized as an essential step in the design process of industrial safety-critical systems. In its more general definition, the term formal methods encompasses all notations having a precise mathematical semantics, together with their associated analysis methods, that allow description and reasoning about the behavior of a system in a formal manner.Growing out of more than a decade of award-winning collaborative work within the European Research Consortium for Informatics and Mathematics, Formal Methods for Industrial Critical Systems: A Survey of Applications presents a number of mainstream formal methods currently used for designing industrial critical systems, with a focus on model checking. The purpose of the book is threefold: to reduce the effort required to learn formal methods, which has been a major drawback for their industrial dissemination; to help designers to adopt the formal methods which are most appropriate for their systems; and to offer a panel of state-of...
Formal Methods Applications in Air Transportation
Farley, Todd
2009-01-01
The U.S. air transportation system is the most productive in the world, moving far more people and goods than any other. It is also the safest system in the world, thanks in part to its venerable air traffic control system. But as demand for air travel continues to grow, the air traffic control system's aging infrastructure and labor-intensive procedures are impinging on its ability to keep pace with demand. And that impinges on the growth of our economy. Air traffic control modernization has long held the promise of a more efficient air transportation system. Part of NASA's current mission is to develop advanced automation and operational concepts that will expand the capacity of our national airspace system while still maintaining its excellent record for safety. It is a challenging mission, as efforts to modernize have, for decades, been hamstrung by the inability to assure safety to the satisfaction of system operators, system regulators, and/or the traveling public. In this talk, we'll provide a brief history of air traffic control, focusing on the tension between efficiency and safety assurance, and the promise of formal methods going forward.
Formal methods in software development: A road less travelled
Directory of Open Access Journals (Sweden)
John A van der Poll
2010-08-01
Full Text Available An integration of traditional verification techniques and formal specifications in software engineering is presented. Advocates of such techniques claim that mathematical formalisms allow them to produce quality, verifiably correct, or at least highly dependable software, and that the testing and maintenance phases are shortened. Critics on the other hand maintain that software formalisms are hard to master, tedious to use and not well suited to the fast turnaround times demanded by industry. In this paper some popular formalisms and the advantages of using them during the early phases of the software development life cycle are presented. Employing the Floyd-Hoare verification principles during the formal specification phase facilitates reasoning about the properties of a specification. Some observations that may help to alleviate the formal-methods controversy are made, and a number of formal-methods successes are presented. Possible conditions for an increased acceptance of formalisms in software development are discussed.
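The Floyd-Hoare principle mentioned above can be illustrated with runtime assertions standing in for proved pre- and postconditions. The example function is invented; a static verifier would discharge these conditions for all inputs rather than check them on each run.

```python
# Integer square root with its Hoare-style contract made explicit:
# precondition n >= 0; postcondition r*r <= n < (r+1)*(r+1).
def integer_sqrt(n):
    assert n >= 0                          # precondition {n >= 0}
    r = 0
    while (r + 1) * (r + 1) <= n:
        assert r * r <= n                  # loop invariant {r*r <= n}
        r += 1
    assert r * r <= n < (r + 1) * (r + 1)  # postcondition
    return r

assert integer_sqrt(10) == 3
assert integer_sqrt(16) == 4
```

The invariant plus the negated loop guard entail the postcondition, which is exactly the reasoning step a Floyd-Hoare proof makes formal.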
Terfve, Camille; Cokelaer, Thomas; Henriques, David; MacNamara, Aidan; Goncalves, Emanuel; Morris, Melody K; van Iersel, Martijn; Lauffenburger, Douglas A; Saez-Rodriguez, Julio
2012-10-18
Cells process signals using complex and dynamic networks. Studying how this is performed in a context and cell type specific way is essential to understand signaling both in physiological and diseased situations. Context-specific medium/high throughput proteomic data measured upon perturbation is now relatively easy to obtain but formalisms that can take advantage of these features to build models of signaling are still comparatively scarce. Here we present CellNOptR, an open-source R software package for building predictive logic models of signaling networks by training networks derived from prior knowledge to signaling (typically phosphoproteomic) data. CellNOptR features different logic formalisms, from Boolean models to differential equations, in a common framework. These different logic model representations accommodate state and time values with increasing levels of detail. We provide in addition an interface via Cytoscape (CytoCopteR) to facilitate use and integration with Cytoscape network-based capabilities. Models generated with this pipeline have two key features. First, they are constrained by prior knowledge about the network but trained to data. They are therefore context and cell line specific, which results in enhanced predictive and mechanistic insights. Second, they can be built using different logic formalisms depending on the richness of the available data. Models built with CellNOptR are useful tools to understand how signals are processed by cells and how this is altered in disease. They can be used to predict the effect of perturbations (individual or in combinations), and potentially to engineer therapies that have differential effects/side effects depending on the cell type or context.
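A minimal sketch of the Boolean formalism CellNOptR supports (this is not its actual R API): a signaling network encoded as logic rules and synchronously updated to a fixed point. The pathway, node names, and AND-NOT rule are invented for illustration.

```python
# Synchronously update a Boolean `state` with `rules` until it stabilizes.
def simulate(rules, state, steps=10):
    for _ in range(steps):
        new = {**state, **{node: f(state) for node, f in rules.items()}}
        if new == state:
            break
        state = new
    return state

# Toy pathway: ligand -> receptor -> kinase -> transcription factor (tf),
# with an inhibitor that blocks the kinase (AND NOT logic).
rules = {
    "receptor": lambda s: s["ligand"],
    "kinase":   lambda s: s["receptor"] and not s["inhibitor"],
    "tf":       lambda s: s["kinase"],
}
init = {"ligand": True, "inhibitor": False,
        "receptor": False, "kinase": False, "tf": False}

assert simulate(rules, init)["tf"]                            # signal reaches tf
assert not simulate(rules, {**init, "inhibitor": True})["tf"]  # blocked by inhibitor
```

Training in the CellNOptR sense would then prune or keep such edges so that the model's fixed points match the measured phosphoproteomic readouts.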
Improving Object-Oriented Methods by using Fuzzy Logic
Marcelloni, Francesco; Aksit, Mehmet
2000-01-01
Object-oriented methods create software artifacts through the application of a large number of rules. Rules are typically formulated in two-valued logic. There are a number of problems with how rules are defined and applied in current methods. First, two-valued logic can capture completely neither
Towards a Wide-coverage Tableau Method for Natural Logic
Abzianidze, Lasha; Murata, Tsuyoshi; Mineshima, Koji; Bekki, Daisuke
2015-01-01
The first step towards a wide-coverage tableau prover for natural logic is presented. We describe an automated method for obtaining Lambda Logical Forms from surface forms and use this method with an implemented prover to hunt for new tableau rules in textual entailment data sets. The collected
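To convey the tableau idea itself (with an assumed, simplified propositional encoding rather than the paper's natural-logic rules): formulas are decomposed branch by branch, and a set of formulas is unsatisfiable exactly when every branch closes on a contradiction.

```python
# Formulas: atoms are strings; ("not", p), ("and", p, q), ("or", p, q).
# Only negated atoms are handled here -- a deliberate simplification.
def closes(branch):
    """A branch closes if it contains both an atom and its negation."""
    return any(("not", f) in branch for f in branch)

def tableau(branch):
    """Return True iff every branch closes, i.e. the set is unsatisfiable."""
    for f in branch:
        if isinstance(f, tuple) and f[0] == "and":
            return tableau((branch - {f}) | {f[1], f[2]})   # extend branch
        if isinstance(f, tuple) and f[0] == "or":
            rest = branch - {f}
            return tableau(rest | {f[1]}) and tableau(rest | {f[2]})  # split
    return closes(branch)

# {p or q, not p, not q} is unsatisfiable; drop "not q" and it is satisfiable.
assert tableau(frozenset([("or", "p", "q"), ("not", "p"), ("not", "q")]))
assert not tableau(frozenset([("or", "p", "q"), ("not", "p")]))
```

The wide-coverage prover works the same way, except its rules operate on typed lambda terms derived from natural-language sentences instead of propositional atoms.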
Universal programmable logic gate and routing method
Fijany, Amir (Inventor); Vatan, Farrokh (Inventor); Akarvardar, Kerem (Inventor); Blalock, Benjamin (Inventor); Chen, Suheng (Inventor); Cristoloveanu, Sorin (Inventor); Kolawa, Elzbieta (Inventor); Mojarradi, Mohammad M. (Inventor); Toomarian, Nikzad (Inventor)
2009-01-01
A universal and programmable logic gate based on G⁴-FET technology is disclosed, leading to the design of more efficient logic circuits. A new full adder design based on the G⁴-FET is also presented. The G⁴-FET can also function as a unique router device offering coplanar crossing of signal paths that are isolated from and perpendicular to one another. This has the potential of overcoming major limitations in VLSI design, where complex interconnection schemes have become increasingly problematic.
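The idea of a single programmable universal gate can be emulated in software: one device whose control inputs select which Boolean function it computes. The encoding below (a 4-bit truth table as the "program") is an illustrative abstraction, not the G⁴-FET's physical mechanism, and the full adder is a generic textbook construction, not the patent's design.

```python
# `program` is the 4-bit truth table of the selected 2-input function,
# indexed by the input pair (a, b).
def universal_gate(a, b, program):
    return program[(a << 1) | b]

AND, OR, XOR = (0, 0, 0, 1), (0, 1, 1, 1), (0, 1, 1, 0)

# A full adder built entirely from one gate type with different programs.
def full_adder(a, b, cin):
    s1 = universal_gate(a, b, XOR)
    total = universal_gate(s1, cin, XOR)
    carry = universal_gate(universal_gate(a, b, AND),
                           universal_gate(s1, cin, AND), OR)
    return total, carry

assert full_adder(1, 1, 0) == (0, 1)   # 1 + 1 = 10 in binary
assert full_adder(1, 1, 1) == (1, 1)   # 1 + 1 + 1 = 11 in binary
```

Reprogramming one gate type instead of wiring many fixed gate types is the efficiency argument such universal devices make at the circuit level.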
Back to inertia : Theoretical implications of alternative styles of logical formalization
Péli, Gabor; Pólos, L.; Hannan, M.T.
This article applies two new criteria, desirability and faithfulness, to evaluate Péli et al.'s (1994) formalization of Hannan and Freeman's structural inertia argument (1984, 1989). We conclude that this formalization fails to meet these criteria. We argue that part of the rational reconstruction
A logic circuit for solving linear function by digital method
International Nuclear Information System (INIS)
Ma Yonghe
1986-01-01
A mathematical method for determining the linear relation of a physical quantity to radiation intensity is described. A logic circuit has been designed to solve linear functions by a digital method. Some applications and the circuit's function are discussed.
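The underlying computation such a circuit digitizes can be sketched as ordinary least squares: recovering the linear relation y = a*x + b between a physical quantity and measured radiation intensity from sampled points. The data values below are invented, and the abstract does not specify that least squares is the circuit's method.

```python
# Ordinary least-squares fit of y = a*x + b from paired samples.
def fit_line(xs, ys):
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]          # exactly y = 2x + 1 (invented data)
a, b = fit_line(xs, ys)
assert abs(a - 2.0) < 1e-9 and abs(b - 1.0) < 1e-9
```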
The Correspondence between Horn Logic Programs and Formal Grammars
Institute of Scientific and Technical Information of China (English)
陈文彬; 王驹
2003-01-01
The paper studies Horn logic programs from a grammatical point of view. A correspondence between Horn logic programs and grammars is established: a method is given by which type-0 grammars generate the least Herbrand models of logic programs, and a method by which Horn logic programs generate the languages of type-0 grammars. A characterization is also given of the Horn logic programs that are semantically equivalent to type-2 and type-3 grammars.
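The semantic object this correspondence is built on, the least Herbrand model, can be computed for a propositional Horn program by forward chaining. The program below is invented for illustration, and the propositional case omits the terms that a full Herbrand model involves.

```python
# `rules` are (body, head) pairs; a fact is a rule with an empty body.
# Repeatedly fire rules whose bodies are satisfied until nothing new appears.
def least_model(rules):
    model = set()
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if head not in model and all(b in model for b in body):
                model.add(head)
                changed = True
    return model

# p.   q :- p.   r :- p, q.   s :- t.   (t is never derivable)
rules = [((), "p"), (("p",), "q"), (("p", "q"), "r"), (("t",), "s")]
assert least_model(rules) == {"p", "q", "r"}
```

Read grammatically, each rule firing resembles a production application, which is the intuition behind letting type-0 grammars generate exactly these models.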
DEFF Research Database (Denmark)
Schürmann, Carsten; Sarnat, Jeffrey
2008-01-01
Tait's method (a.k.a. proof by logical relations) is a powerful proof technique frequently used for showing foundational properties of languages based on typed lambda-calculi. Historically, these proofs have been extremely difficult to formalize in proof assistants with weak meta-logics, such as Twelf, and yet they are often straightforward in proof assistants with stronger meta-logics. In this paper, we propose structural logical relations as a technique for conducting these proofs in systems with limited meta-logical strength by explicitly representing and reasoning about an auxiliary logic...
Formal specification level concepts, methods, and algorithms
Soeken, Mathias
2015-01-01
This book introduces a new level of abstraction that closes the gap between the textual specification of embedded systems and the executable model at the Electronic System Level (ESL). Readers will be enabled to operate at this new, Formal Specification Level (FSL), using models which not only allow significant verification tasks in this early stage of the design flow, but also can be extracted semi-automatically from the textual specification in an interactive manner. The authors explain how to use these verification tasks to check conceptual properties, e.g. whether requirements are in conflict, as well as dynamic behavior, in terms of execution traces. • Serves as a single-source reference to a new level of abstraction for embedded systems, known as the Formal Specification Level (FSL); • Provides a variety of use cases which can be adapted to readers’ specific design flows; • Includes a comprehensive illustration of Natural Language Processing (NLP) techniques, along with examples of how to i...
Formal Methods and Safety Certification: Challenges in the Railways Domain
DEFF Research Database (Denmark)
Fantechi, Alessandro; Ferrari, Alessio; Gnesi, Stefania
2016-01-01
The railway signalling sector has historically been a source of success stories about the adoption of formal methods in the certification of software safety of computer-based control equipment.
Automated logic conversion method for plant controller systems
International Nuclear Information System (INIS)
Wada, Yutaka; Kobayashi, Yasuhiro; Miyo, Tsunemasa; Okano, Masato.
1990-01-01
An automated method is proposed for logic conversion from functional description diagrams to detailed logic schematics, incorporating expert knowledge of plant controller systems design. The method uses the connection data of function elements in the functional description diagram as input, synthesizes a detailed logic structure by adding elements to the given connection data incrementally, and then generates detailed logic schematics. In logic synthesis, knowledge is applied in groups so that complex synthesis procedures can be built up from generally described knowledge; the search order of the groups is given by upper-level knowledge. Furthermore, the knowledge is expressed in terms of two classes of rules: one for generating hypotheses for individual synthesis operations, and the other for weighing several hypotheses to determine the connection ordering of the elements to be added. In the generation of detailed logic schematics, knowledge is used as rules for deriving various kinds of layout conditions on schematics, and as rules for generating two-dimensional coordinates of layout objects. Rules in the latter class use layout conditions to predict intersections among layout objects before their coordinates are fixed. The effectiveness of the method, with 150 rules, was verified by its experimental application to some logic conversions in a real power plant design. Evaluation showed the results to be equivalent to those obtained by well-qualified designers. (author)
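The group-wise application of rules described above can be sketched as follows; all names and rules here are illustrative assumptions, not the paper's actual knowledge base:

```python
# Toy sketch of rule-based synthesis: rules are applied in groups,
# and the order of the groups is supplied by upper-level knowledge.
def synthesize(connections, rule_groups, group_order):
    logic = list(connections)
    for group in group_order:          # upper-level knowledge: group order
        for rule in rule_groups[group]:
            logic = rule(logic)        # each rule refines the logic structure
    return logic

# Hypothetical rules: add a NOT gate for interlocks, then a timer element.
add_not_gates = lambda logic: logic + ["NOT"] if "interlock" in logic else logic
add_timers = lambda logic: logic + ["TIMER"]

rule_groups = {"gates": [add_not_gates], "timing": [add_timers]}
print(synthesize(["interlock"], rule_groups, ["gates", "timing"]))
# -> ['interlock', 'NOT', 'TIMER']
```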
Moderation instead of modelling: some arguments against formal engineering methods
Rauterberg, G.W.M.; Sikorski, M.; Rauterberg, G.W.M.
1998-01-01
The more formal the engineering techniques used, the fewer non-technical facts can be captured. Several business process reengineering and software development projects fail because project management concentrates too much on formal methods and modelling approaches. A successful change of
An exact method for solving logical loops in reliability analysis
International Nuclear Information System (INIS)
Matsuoka, Takeshi
2009-01-01
This paper presents an exact method for solving logical loops in reliability analysis. Systems that include logical loops are usually described by simultaneous Boolean equations. First, we present a basic rule for solving simultaneous Boolean equations. Next, we show the analysis procedure for a three-component system with external supports. Third, a more detailed discussion is given of the establishment of logical loop relations. Finally, we take up two typical structures that include more than one logical loop; their analysis results and corresponding GO-FLOW charts are given. The proposed analytical method is applicable to loop structures that can be described by simultaneous Boolean equations, and it is very useful in evaluating the reliability of complex engineering systems.
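A minimal sketch of the fixed-point treatment of such a loop, assuming a hypothetical two-component system rather than the paper's GO-FLOW formulation:

```python
from itertools import product

# Hypothetical loop: component A backs up B and vice versa.
# Simultaneous Boolean equations:  A = s1 or B ;  B = s2 and A
def solve_loop(s1, s2):
    # Iterate to the least fixed point, starting from (False, False).
    a = b = False
    while True:
        a2 = s1 or b
        b2 = s2 and a2
        if (a2, b2) == (a, b):
            return a, b
        a, b = a2, b2

# Enumerate all support states to tabulate the loop's solution.
for s1, s2 in product([False, True], repeat=2):
    print(s1, s2, solve_loop(s1, s2))
```

The least fixed point avoids the spurious self-sustaining solution (A and B both true with no external support) that a naive reading of the equations would admit.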
Multiband discrete ordinates method: formalism and results
International Nuclear Information System (INIS)
Luneville, L.
1998-06-01
The multigroup discrete ordinates method is a classical way to solve the transport (Boltzmann) equation for neutral particles. Self-shielding effects are not correctly treated, due to large variations of the cross sections within a group (in the resonance range). To treat the resonance domain, the multiband method is introduced. The main idea is to divide the cross-section domain into bands. We obtain the multiband parameters using the moment method; the code CALENDF provides probability tables for these parameters. We present our implementation in an existing discrete ordinates code: SN1D. We study deep penetration benchmarks and show the improvement of the method in the treatment of self-shielding effects. (author)
SYNTHESIS METHODS OF ALGEBRAIC NORMAL FORM OF MANY-VALUED LOGIC FUNCTIONS
Directory of Open Access Journals (Sweden)
A. V. Sokolov
2016-01-01
Full Text Available The rapid development of methods of error-correcting coding, cryptography, and signal synthesis theory based on the principles of many-valued logic determines the need for a more detailed study of the forms of representation of many-valued logic functions. In particular, the algebraic normal form of Boolean functions, also known as the Zhegalkin polynomial, which describes many of the cryptographic properties of Boolean functions well, is widely used. In this article, we formalize the notion of the algebraic normal form for many-valued logic functions. We develop a fast method for synthesizing the algebraic normal form of 3-functions and 5-functions that works similarly to the Reed-Muller transform for Boolean functions: on the basis of recurrently synthesized transform matrices. We propose a hypothesis that determines the rules for synthesizing these matrices for the transformation from the truth table to the coefficients of the algebraic normal form, and for the inverse transform, for any given number of variables of 3-functions or 5-functions. The article also introduces the definition of the algebraic degree of nonlinearity of many-valued logic functions and of S-boxes based on the principles of many-valued logic. The methods of synthesis of the algebraic normal form of 3-functions are then applied to the known construction of recurrent synthesis of S-boxes of length N = 3k, whereby their algebraic degrees of nonlinearity are computed. The results could be the basis for further theoretical research and for practical applications such as the development of new cryptographic primitives, error-correcting codes, data compression algorithms, signal structures, and block and stream encryption algorithms, all based on the promising principles of many-valued logic. In addition, the fast method of synthesis of the algebraic normal form of many-valued logic functions is the basis for their software and hardware implementation.
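For the Boolean (two-valued) case, the Reed-Muller-style transform the abstract alludes to is standard and can be sketched as a fast Möbius transform over GF(2); the 3-valued and 5-valued generalizations developed in the paper are not reproduced here:

```python
def anf_coefficients(truth_table):
    """Fast Möbius (Reed-Muller) transform over GF(2).

    truth_table: list of 0/1 values of length 2**n, indexed by the
    input bits. Returns the Zhegalkin (ANF) coefficients in place.
    """
    n = len(truth_table)
    assert n & (n - 1) == 0, "length must be a power of two"
    c = list(truth_table)
    step = 1
    while step < n:
        for i in range(0, n, 2 * step):
            for j in range(i, i + step):
                c[j + step] ^= c[j]   # butterfly: XOR lower half into upper
        step *= 2
    return c

# XOR has truth table [0,1,1,0]; its ANF is f = x + y (coefficients [0,1,1,0]).
print(anf_coefficients([0, 1, 1, 0]))
```

Running the transform on OR, `anf_coefficients([0, 1, 1, 1])`, yields `[0, 1, 1, 1]`, i.e. f = x + y + xy over GF(2).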
Formal methods in design and verification of functional specifications
International Nuclear Information System (INIS)
Vaelisuo, H.
1995-01-01
It is claimed that formal methods should be applied already when specifying the functioning of the control/monitoring system, i.e. when planning how to implement the desired operation of the plant. Formal methods are seen as a way to mechanize and thus automate part of the planning. All mathematical methods which can be applied on related problem solving should be considered as formal methods. Because formal methods can only support the designer, not replace him/her, they must be integrated into a design support tool. Such a tool must also aid the designer in getting the correct conception of the plant and its behaviour. The use of a hypothetic design support tool is illustrated to clarify the requirements such a tool should fulfill. (author). 3 refs, 5 figs
Directory of Open Access Journals (Sweden)
Pimonov Alexander
2017-01-01
Full Text Available This paper presents a multipurpose approach to the evaluation of research and innovation projects in the mining industry, based on the analytic hierarchy process and fuzzy logic. The approach, implemented as part of a decision support system, can reduce the degree of subjectivity during examinations by taking into account both quantitative and qualitative characteristics of the compared innovative alternatives; it does not depend on the specific conditions of an examination and allows the engagement of experts from various fields of knowledge. The system includes a mechanism for reconciling the views of several experts. The use of fuzzy logic allows the qualitative characteristics of innovations to be evaluated in the form of formalized logical conclusions.
FORMED: Bringing Formal Methods to the Engineering Desktop
2016-02-01
FORMED: Bringing Formal Methods to the Engineering Desktop. BAE Systems, February 2016, final technical report, approved for public release. Contract number FA8750-14-C-0024; program element 63781D.
Data Mining and Knowledge Discovery via Logic-Based Methods
Triantaphyllou, Evangelos
2010-01-01
There are many approaches to data mining and knowledge discovery (DM&KD), including neural networks, closest neighbor methods, and various statistical methods. This monograph, however, focuses on the development and use of a novel approach, based on mathematical logic, that the author and his research associates have worked on over the last 20 years. The methods presented in the book deal with key DM&KD issues in an intuitive manner and in a natural sequence. Compared to other DM&KD methods, those based on mathematical logic offer a direct and often intuitive approach for extracting easily int
Logic-based aggregation methods for ranking student applicants
Directory of Open Access Journals (Sweden)
Milošević Pavle
2017-01-01
Full Text Available In this paper, we present logic-based aggregation models used for ranking student applicants and we compare them with a number of existing aggregation methods, each more complex than the previous one. The proposed models aim to capture dependencies in the data using logical aggregation (LA). LA is an aggregation method based on interpolative Boolean algebra (IBA), a consistent multi-valued realization of Boolean algebra. This technique is used for Boolean-consistent aggregation of attributes that are logically dependent. The comparison is performed on the case of applicants for master programs at the University of Belgrade. We show that LA has some advantages over the other presented aggregation methods. A software realization of all applied aggregation methods is also provided. This paper may be of interest not only for student ranking, but also for similar problems of ranking people, e.g. employees, team members, etc.
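As a rough illustration of Boolean-consistent aggregation in the spirit of IBA, using the ordinary product as the generalized product (an assumption) and invented criteria rather than the paper's actual attributes:

```python
# Logical connectives realized as generalized Boolean polynomials,
# here instantiated with the ordinary product as the generalized product.
def and_(x, y): return x * y
def or_(x, y):  return x + y - x * y
def not_(x):    return 1.0 - x

# Hypothetical ranking criterion for applicants (scores normalized to [0, 1]):
# "(good GPA AND good entrance test) OR strong recommendation"
def score(gpa, test, rec):
    return or_(and_(gpa, test), rec)

applicants = {"A": (0.9, 0.8, 0.3), "B": (0.6, 0.9, 0.9)}
ranked = sorted(applicants, key=lambda k: score(*applicants[k]), reverse=True)
print(ranked)   # -> ['B', 'A']
```

Unlike a plain weighted sum, the polynomial form preserves the logical structure of the criterion, so logically dependent attributes are combined consistently.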
Formalization and Implementation of Algebraic Methods in Geometry
Directory of Open Access Journals (Sweden)
Filip Marić
2012-02-01
Full Text Available We describe our ongoing project on the formalization of algebraic methods for geometry theorem proving (Wu's method and the Groebner bases method), their implementation, and their integration in educational tools. The project includes formal verification of the algebraic methods within the Isabelle/HOL proof assistant and development of a new, open-source Java implementation of the algebraic methods. The project should fill in some gaps still existing in this area (e.g., the lack of formal links between algebraic methods and synthetic geometry and the lack of self-contained implementations of algebraic methods suitable for integration with dynamic geometry tools) and should enable new applications of theorem proving in education.
Formal methods for dependable real-time systems
Rushby, John
1993-01-01
The motivation for using formal methods to specify and reason about real-time properties is outlined, and approaches that have been proposed and used are sketched. The formal verifications of clock synchronization algorithms show that mechanically supported reasoning about complex real-time behavior is feasible. Moreover, there has been a significant increase in the effectiveness of verification systems since those verifications were performed, and it is to be expected that verifications of comparable difficulty will become fairly routine. The current challenge lies in developing perspicuous and economical approaches to the formalization and specification of real-time properties.
Spacecraft early design validation using formal methods
International Nuclear Information System (INIS)
Bozzano, Marco; Cimatti, Alessandro; Katoen, Joost-Pieter; Katsaros, Panagiotis; Mokos, Konstantinos; Nguyen, Viet Yen; Noll, Thomas; Postma, Bart; Roveri, Marco
2014-01-01
The size and complexity of software in spacecraft is increasing exponentially, and this trend complicates its validation within the context of the overall spacecraft system. Current validation methods are labor-intensive as they rely on manual analysis, review and inspection. For future space missions, we developed – with challenging requirements from the European space industry – a novel modeling language and toolset for a (semi-)automated validation approach. Our modeling language is a dialect of AADL and enables engineers to express the system, the software, and their reliability aspects. The COMPASS toolset utilizes state-of-the-art model checking techniques, both qualitative and probabilistic, for the analysis of requirements related to functional correctness, safety, dependability and performance. Several pilot projects have been performed by industry, with two of them having focused on the system-level of a satellite platform in development. Our efforts resulted in a significant advancement of validating spacecraft designs from several perspectives, using a single integrated system model. The associated technology readiness level increased from level 1 (basic concepts and ideas) to early level 4 (laboratory-tested)
Proceedings of the First NASA Formal Methods Symposium
Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)
2009-01-01
Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.
Dominant partition method. [based on a wave function formalism
Dixon, R. M.; Redish, E. F.
1979-01-01
By use of the L'Huillier, Redish, and Tandy (LRT) wave function formalism, a partially connected method, the dominant partition method (DPM) is developed for obtaining few body reductions of the many body problem in the LRT and Bencze, Redish, and Sloan (BRS) formalisms. The DPM maps the many body problem to a fewer body one by using the criterion that the truncated formalism must be such that consistency with the full Schroedinger equation is preserved. The DPM is based on a class of new forms for the irreducible cluster potential, which is introduced in the LRT formalism. Connectivity is maintained with respect to all partitions containing a given partition, which is referred to as the dominant partition. Degrees of freedom corresponding to the breakup of one or more of the clusters of the dominant partition are treated in a disconnected manner. This approach for simplifying the complicated BRS equations is appropriate for physical problems where a few body reaction mechanism prevails.
Software Components and Formal Methods from a Computational Viewpoint
Lambertz, Christian
2012-01-01
Software components and the methodology of component-based development offer a promising approach to master the design complexity of huge software products because they separate the concerns of software architecture from individual component behavior and allow for reusability of components. In combination with formal methods, the specification of a formal component model of the later software product or system allows for establishing and verifying important system properties in an automatic a...
The importance of training in formal methods in Software Engineering
Directory of Open Access Journals (Sweden)
John Polansky
2014-12-01
Full Text Available The paradigm of formal methods provides systematic and rigorous techniques for software development and, due to the growing complexity and quality requirements of current products, it is necessary to introduce them into the software engineering curriculum. This article analyzes the importance of training in formal methods and describes specific techniques for achieving it efficiently. These techniques are the result of more than fifteen years of classroom experience in undergraduate and graduate programs, as well as company training. A curriculum proposal for the systematic introduction of this paradigm is also presented, together with a description of a training program that has been successful in industry. Results show that students gain confidence in formal methods only once they discover their benefits in the context of software engineering.
Fuzzy logic controller using different inference methods
International Nuclear Information System (INIS)
Liu, Z.; De Keyser, R.
1994-01-01
In this paper the design of fuzzy controllers using different inference methods is introduced. The configuration of the fuzzy controllers includes a general rule base, which is a collection of fuzzy PI or PD rules, a triangular fuzzy data model, and a centre-of-gravity defuzzification algorithm. The generalized modus ponens (GMP) is used with the minimum operator of the triangular norm. Under the sup-min inference rule, six fuzzy implication operators are employed to calculate the fuzzy look-up tables for each rule base. The performance is tested on simulated systems with MATLAB/SIMULINK. Results show the effects of using fuzzy controllers with the different inference methods applied to different test processes
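A minimal sketch of such a controller with triangular membership functions, sup-min inference and centre-of-gravity defuzzification; the rule base and universes here are invented for illustration, not taken from the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaked at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def mamdani_step(error, rules, universe):
    """One sup-min inference step with centre-of-gravity defuzzification.

    rules: list of ((input mf params), (output mf params)) pairs.
    """
    num = den = 0.0
    for y in universe:
        mu = 0.0
        for (ia, ib, ic), (oa, ob, oc) in rules:
            strength = tri(error, ia, ib, ic)                 # firing strength
            mu = max(mu, min(strength, tri(y, oa, ob, oc)))   # min implication, max aggregation
        num += y * mu   # centre-of-gravity accumulation
        den += mu
    return num / den if den else 0.0

# Three hypothetical PI-style rules: negative/zero/positive error.
rules = [((-2, -1, 0), (-2, -1, 0)),
         ((-1, 0, 1), (-1, 0, 1)),
         ((0, 1, 2), (0, 1, 2))]
universe = [i / 10 - 2 for i in range(41)]   # discretized output universe
print(mamdani_step(0.5, rules, universe))
```

With this symmetric rule base the defuzzified action tracks the error, e.g. an error of 0.5 yields an action close to 0.5.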
Formal Methods for Abstract Specifications – A Comparison of Concepts
DEFF Research Database (Denmark)
Instenberg, Martin; Schneider, Axel; Schnetter, Sabine
2006-01-01
In industry, formal methods are becoming increasingly important for the verification of hardware and software designs. However, current practice for specifying system and protocol functionality at a high level of abstraction is textual description, and manual inspections and tests are the usual means of verifying system behavior. To facilitate the introduction of formal methods into the development process of complex systems and protocols, two different tools that evolved from research activities – UPPAAL and SpecEdit – have been investigated and compared regarding their concepts and functionality...
Bicycle Frame Prediction Techniques with Fuzzy Logic Method
Directory of Open Access Journals (Sweden)
Rafiuddin Syam
2015-03-01
Full Text Available In general, an appropriately sized bike frame gives comfort to the rider while biking. This study aims to predict bike frame sizes with a simulated fuzzy logic system. The testing method used is simulation: the fuzzy logic is implemented in Matlab to test its performance. The Mamdani fuzzy logic uses 3 input variables and 1 output variable, with triangular membership functions for both input and output. The controller is designed as a Mamdani type with max-min composition, and defuzzification uses the centre-of-gravity method. The results show that height, inseam and crank size generate the frame size appropriate for rider comfort. Height ranges between 142 cm and 201 cm, inseam between 64 cm and 97 cm, and crank size between 175 mm and 180 mm. The simulated frame sizes range between 13 inches and 22 inches. Using fuzzy logic, the bicycle frame size suitable for the rider can be predicted.
Description logic-based methods for auditing frame-based medical terminological systems.
Cornet, Ronald; Abu-Hanna, Ameen
2005-07-01
Medical terminological systems (TSs) play an increasingly important role in health care by supporting recording, retrieval and analysis of patient information. As the size and complexity of TSs are growing, the need arises for means to audit them, i.e. verify and maintain (logical) consistency and (semantic) correctness of their contents. This is not only important for the management of TSs but also for providing their users with confidence about the reliability of their contents. Formal methods have the potential to play an important role in the audit of TSs, although there are few empirical studies to assess the benefits of using these methods. In this paper we propose a method based on description logics (DLs) for the audit of TSs. This method is based on the migration of the medical TS from a frame-based representation to a DL-based one. Our method is characterized by a process in which initially stringent assumptions are made about concept definitions. The assumptions allow the detection of concepts and relations that might comprise a source of logical inconsistency. If the assumptions hold then definitions are to be altered to eliminate the inconsistency, otherwise the assumptions are revised. In order to demonstrate the utility of the approach in a real-world case study we audit a TS in the intensive care domain and discuss decisions pertaining to building DL-based representations. This case study demonstrates that certain types of inconsistencies can indeed be detected by applying the method to a medical terminological system. The added value of the method described in this paper is that it provides a means to evaluate the compliance to a number of common modeling principles in a formal manner. The proposed method reveals potential modeling inconsistencies, helping to audit and (if possible) improve the medical TS. In this way, it contributes to providing confidence in the contents of the terminological system.
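The kind of inconsistency such an audit can reveal may be illustrated by a toy check on conjunctive concept definitions; real description logic reasoning (e.g. with a tableau reasoner) is far more general than this sketch, and the terminology below is invented:

```python
# Toy audit: a concept defined as a conjunction of atomic concepts and
# their negations is unsatisfiable if some atom appears both ways.
def unsatisfiable(positives, negatives):
    return bool(set(positives) & set(negatives))

# Hypothetical fragment of an intensive-care terminology:
# concept name -> (asserted atoms, negated atoms)
terminology = {
    "VentilatedPatient": (["Patient", "Ventilated"], []),
    "BrokenConcept": (["Ventilated"], ["Ventilated"]),
}

for name, (pos, neg) in terminology.items():
    if unsatisfiable(pos, neg):
        print(f"{name}: logically inconsistent definition")
```

A DL reasoner extends this idea to full concept constructors (existential restrictions, disjunction, role hierarchies), which is what makes the migration from a frame-based representation worthwhile.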
Dual deep modeling: multi-level modeling with dual potencies and its formalization in F-Logic.
Neumayr, Bernd; Schuetz, Christoph G; Jeusfeld, Manfred A; Schrefl, Michael
2018-01-01
An enterprise database contains a global, integrated, and consistent representation of a company's data. Multi-level modeling facilitates the definition and maintenance of such an integrated conceptual data model in a dynamic environment of changing data requirements of diverse applications. Multi-level models transcend the traditional separation of class and object with clabjects as the central modeling primitive, which allows for a more flexible and natural representation of many real-world use cases. In deep instantiation, the number of instantiation levels of a clabject or property is indicated by a single potency. Dual deep modeling (DDM) differentiates between source potency and target potency of a property or association and supports the flexible instantiation and refinement of the property by statements connecting clabjects at different modeling levels. DDM comes with multiple generalization of clabjects, subsetting/specialization of properties, and multi-level cardinality constraints. Examples are presented using a UML-style notation for DDM together with UML class and object diagrams for the representation of two-level user views derived from the multi-level model. Syntax and semantics of DDM are formalized and implemented in F-Logic, supporting the modeler with integrity checks and rich query facilities.
Tretmans, G.J.; Wijbrans, K.C.J.; Chaudron, M.
2001-01-01
This paper discusses the use of formal methods in the development of the control system for the Maeslant Kering. The Maeslant Kering is the movable dam which has to protect Rotterdam from floodings while, at (almost) the same time, not restricting ship traffic to the port of Rotterdam. The control
Classification of Children Intelligence with Fuzzy Logic Method
Syahminan; ika Hidayati, Permata
2018-04-01
Children's intelligence is an important thing for parents to know early on. Typing can be done by grouping the dominant characteristics of each type of intelligence. To make it easier for parents to determine their child's type of intelligence and how to respond to it, a classification system was created that groups children's intelligence using the fuzzy logic method to determine the degree of each intelligence type. From the analysis we conclude that with this fuzzy logic classification system, determining a child's type of intelligence can be done more easily, and the conclusions are more accurate than manual tests.
Meyer, J.J.Ch.; Broersen, J.M.; Herzig, A.
2015-01-01
This paper presents an overview of so-called BDI logics, logics where the notion of Beliefs, Desires and Intentions play a central role. Starting out from the basic ideas about BDI by Bratman, we consider various formalizations in logic, such as the approach of Cohen and Levesque, slightly
A Survey of Formal Methods for Intelligent Swarms
Truszkowski, Walt; Rash, James; Hinchey, Mike; Rouff, Christopher A.
2004-01-01
cutting edge in system correctness, and requires higher levels of assurance than other (traditional) missions that use a single or small number of spacecraft that are deterministic in nature and have near continuous communication access. One of the highest possible levels of assurance comes from the application of formal methods. Formal methods are mathematics-based tools and techniques for specifying and verifying (software and hardware) systems. They are particularly useful for specifying complex parallel systems, such as exemplified by the ANTS mission, where the entire system is difficult for a single person to fully understand, a problem that is multiplied with multiple developers. Once written, a formal specification can be used to prove properties of a system (e.g., the underlying system will go from one state to another or not into a specific state) and check for particular types of errors (e.g., race or livelock conditions). A formal specification can also be used as input to a model checker for further validation. This report gives the results of a survey of formal methods techniques for verification and validation of space missions that use swarm technology. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft using the ANTS mission as an example system. This report is the first result of the project to determine formal approaches that are promising for formally specifying swarm-based systems. From this survey, the most promising approaches were selected and are discussed relative to their possible application to the ANTS mission. Future work will include the application of an integrated approach, based on the selected approaches identified in this report, to the formal specification of the ANTS mission.
NASA Langley's Formal Methods Research in Support of the Next Generation Air Transportation System
Butler, Ricky W.; Munoz, Cesar A.
2008-01-01
This talk will provide a brief introduction to the formal methods developed at NASA Langley and the National Institute of Aerospace (NIA) for air traffic management applications. NASA Langley's formal methods research supports the Interagency Joint Planning and Development Office (JPDO) effort to define and develop the 2025 Next Generation Air Transportation System (NGATS). The JPDO was created by the passage of the Vision 100 Century of Aviation Reauthorization Act in December 2003. The NGATS vision calls for a major transformation of the nation's air transportation system that will enable growth to 3 times the traffic of the current system. The transformation will require an unprecedented level of safety-critical automation used in complex procedural operations based on 4-dimensional (4D) trajectories that enable dynamic reconfiguration of airspace scalable to geographic and temporal demand. The goal of our formal methods research is to provide verification methods that can be used to ensure the safety of the NGATS system. Our work has focused on the safety assessment of concepts of operation and fundamental algorithms for conflict detection and resolution (CD&R) and self-spacing in the terminal area. Formal analysis of a concept of operations is a novel area of application of formal methods: here one must establish that a system concept involving aircraft, pilots, and ground resources is safe. The formal analysis of algorithms is a more traditional endeavor. However, the formal analysis of ATM algorithms involves reasoning about the interaction of algorithmic logic and aircraft trajectories defined over an airspace. These trajectories are described using 2D and 3D vectors and are often constrained by trigonometric relations; thus, in many cases it has been necessary to bring the full power of an advanced theorem prover to bear. The verification challenge is to establish that the safety-critical algorithms produce valid solutions that are guaranteed to maintain separation
A vector matching method for analysing logic Petri nets
Du, YuYue; Qi, Liang; Zhou, MengChu
2011-11-01
Batch processing function and passing value indeterminacy in cooperative systems can be described and analysed by logic Petri nets (LPNs). To directly analyse the properties of LPNs, the concept of transition enabling vector sets is presented and a vector matching method used to judge the enabling transitions is proposed in this article. The incidence matrix of LPNs is defined; an equation about marking change due to a transition's firing is given; and a reachable tree is constructed. The state space explosion is mitigated to a certain extent from directly analysing LPNs. Finally, the validity and reliability of the proposed method are illustrated by an example in electronic commerce.
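The incidence-matrix firing rule M' = M + C·f and the enabling test can be sketched for an ordinary Petri net as follows; logic Petri nets add logical input/output expressions on top of this basic machinery:

```python
# Sketch: incidence-matrix firing for an ordinary Petri net.
# C[i][j] = post[i][j] - pre[i][j] = net tokens added to place i
# when transition j fires.
def enabled(marking, pre, t):
    # Transition t is enabled if every input place holds enough tokens.
    return all(m >= pre[i][t] for i, m in enumerate(marking))

def fire(marking, incidence, firing_vector):
    # M' = M + C . f
    return [m + sum(c * f for c, f in zip(row, firing_vector))
            for m, row in zip(marking, incidence)]

# Two places, one transition moving a token from p0 to p1.
pre = [[1], [0]]       # tokens consumed per transition
post = [[0], [1]]      # tokens produced per transition
C = [[post[i][j] - pre[i][j] for j in range(1)] for i in range(2)]
m0 = [1, 0]
assert enabled(m0, pre, 0)
print(fire(m0, C, [1]))   # -> [0, 1]
```

Building the reachability tree then amounts to repeatedly applying `enabled` and `fire` to each marking, which is where the transition enabling vector sets of the paper come into play for the logic extensions.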
Formal methods applied to industrial complex systems implementation of the B method
Boulanger, Jean-Louis
2014-01-01
This book presents real-world examples of formal techniques in an industrial context. It covers formal methods such as SCADE and/or the B Method, in various fields such as railways, aeronautics, and the automotive industry. The purpose of this book is to present a summary of experience on the use of "formal methods" (based on formal techniques such as proof, abstract interpretation and model-checking) in industrial examples of complex systems, based on the experience of people currently involved in the creation and assessment of safety critical system software. The involvement of people from
Smullyan, Raymond
2008-01-01
This book features a unique approach to the teaching of mathematical logic by putting it in the context of the puzzles and paradoxes of common language and rational thought. It serves as a bridge from the author's puzzle books to his technical writing in the fascinating field of mathematical logic. Using the logic of lying and truth-telling, the author introduces the readers to informal reasoning, preparing them for the formal study of symbolic logic, from propositional logic to first-order logic, a subject that has many important applications to philosophy, mathematics, and computer science.
What is the method in applying formal methods to PLC applications?
Mader, Angelika H.; Engel, S.; Wupper, Hanno; Kowalewski, S.; Zaytoon, J.
2000-01-01
The question we investigate is how to obtain PLC applications with confidence in their proper functioning. In particular, we are interested in the contribution that formal methods can provide for their development. Our maxim is that the place of a particular formal method in the total picture of system
Wollbold, Johannes; Jaster, Robert; Müller, Sarah; Rateitschak, Katja; Wolkenhauer, Olaf
2014-09-24
Recent findings suggest that in pancreatic acinar cells stimulated with bile acid, a pro-apoptotic effect of reactive oxygen species (ROS) dominates their effect on necrosis and the spreading of inflammation. The first effect presumably occurs via cytochrome C release from the inner mitochondrial membrane. A pro-necrotic effect, similar to that of Ca2+, can arise from strong opening of mitochondrial pores, leading to breakdown of the membrane potential, ATP depletion, sustained Ca2+ increase and premature activation of digestive enzymes. To explain published data and to understand ROS effects during the onset of acute pancreatitis, a model using multi-valued logic is constructed. Formal concept analysis (FCA) is used to validate the model against data as well as to analyze and visualize rules that capture the dynamics. Simulations for two different levels of bile stimulation and for inhibition or addition of antioxidants reproduce the qualitative behaviour shown in the experiments. Based on reported differences in ROS production and in ROS-induced pore opening, the model predicts a more uniform apoptosis/necrosis ratio for higher and lower bile stimulation in liver cells than in pancreatic acinar cells. FCA confirms that essential dynamical features of the data are captured by the model. For instance, high necrosis always occurs together with at least a medium level of apoptosis. At the same time, FCA helps to reveal subtle differences between data and simulations. The FCA visualization underlines the protective role of ROS against necrosis. The analysis of the model demonstrates how ROS and decreased antioxidant levels contribute to apoptosis. Studying the induction of necrosis via a sustained Ca2+ increase, we implemented the commonly accepted hypothesis of ATP depletion after strong bile stimulation. Using an alternative model, we demonstrate that this process is not necessary to generate the dynamics of the measured variables. Opening of plasma membrane channels could
A Fuzzy Logic Based Method for Analysing Test Results
Directory of Open Access Journals (Sweden)
Le Xuan Vinh
2017-11-01
Full Text Available Network operators must perform many tasks to ensure smooth operation of the network, such as planning, monitoring, etc. Among those tasks, regular testing of network performance and network errors, and troubleshooting, is very important. Meaningful test results allow the operators to evaluate network performance, identify shortcomings, and better plan for network upgrades. Due to the diverse and mainly unquantifiable nature of network testing results, there is a need to develop a method for systematically and rigorously analysing these results. In this paper, we present STAM (System Test-result Analysis Method), which employs a bottom-up hierarchical processing approach using fuzzy logic. STAM is capable of combining all test results into a quantitative description of the network performance in terms of network stability, the significance of various network errors, and the performance of each function block within the network. The validity of this method has been successfully demonstrated in assisting the testing of a VoIP system at the Research Institute of Post and Telecoms in Vietnam. The paper is organized as follows. The first section gives an overview of fuzzy logic theory, the concepts of which will be used in the development of STAM. The next section describes STAM. The last section, demonstrating STAM's capability, presents a success story in which STAM is successfully applied.
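The bottom-up hierarchical aggregation that STAM performs can be sketched roughly as below; the hierarchy, weights and scores are invented for illustration, and STAM's actual aggregation uses fuzzy membership functions and rule bases rather than this plain weighted mean.

```python
# Sketch of bottom-up hierarchical aggregation: individual test results are
# expressed as scores on a 0..1 scale and combined upward into higher-level
# quality indicators. All names, weights and scores are illustrative.

def weighted_mean(scores):
    """Aggregate a list of (weight, score) pairs into one score."""
    return sum(w * s for w, s in scores) / sum(w for w, _ in scores)

# Leaf level: scores for individual tests, grouped by function block.
call_setup = weighted_mean([(2, 0.9), (1, 0.7)])   # two call-setup tests
media_path = weighted_mean([(1, 0.6), (1, 0.8)])   # two media-path tests

# Intermediate level: function blocks combined into overall network performance.
network_perf = weighted_mean([(3, call_setup), (2, media_path)])
print(round(network_perf, 3))
```

In STAM proper, each aggregation step would be a fuzzy inference (fuzzify the child scores, apply rules, defuzzify) instead of a weighted mean, but the hierarchical flow is the same.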
Formal methods in the design of Ada 1995
Guaspari, David
1995-01-01
Formal, mathematical methods are most useful when applied early in the design and implementation of a software system--that, at least, is the familiar refrain. I will report on a modest effort to apply formal methods at the earliest possible stage, namely, in the design of the Ada 95 programming language itself. This talk is an 'experience report' that provides brief case studies illustrating the kinds of problems we worked on, how we approached them, and the extent (if any) to which the results proved useful. It also derives some lessons and suggestions for those undertaking future projects of this kind. Ada 95 is the first revision of the standard for the Ada programming language. The revision began in 1988, when the Ada Joint Programming Office first asked the Ada Board to recommend a plan for revising the Ada standard. The first step in the revision was to solicit criticisms of Ada 83. A set of requirements for the new language standard, based on those criticisms, was published in 1990. A small design team, the Mapping Revision Team (MRT), became exclusively responsible for revising the language standard to satisfy those requirements. The MRT, from Intermetrics, is led by S. Tucker Taft. The work of the MRT was regularly subject to independent review and criticism by a committee of distinguished Reviewers and by several advisory teams--for example, the two User/Implementor teams, each consisting of an industrial user (attempting to make significant use of the new language on a realistic application) and a compiler vendor (undertaking, experimentally, to modify its current implementation in order to provide the necessary new features). One novel decision established the Language Precision Team (LPT), which investigated language proposals from a mathematical point of view. The LPT applied formal mathematical analysis to help improve the design of Ada 95 (e.g., by clarifying the language proposals) and to help promote its acceptance (e.g., by identifying a
Controlling Smart Green House Using Fuzzy Logic Method
Directory of Open Access Journals (Sweden)
Rafiuddin Syam
2017-03-01
Full Text Available To increase agricultural output, a system is needed that can maintain environmental conditions for optimum plant growth. A smart greenhouse allows plants to grow optimally, because the temperature and humidity can be controlled so that there are no drastic changes. An optimal smart greenhouse therefore needs a system that manipulates the environment in accordance with the needs of the plant, in this case by setting the temperature and humidity in the greenhouse. Using an automated system to maintain such environmental conditions is thus important. In this study, the authors use fuzzy logic to make the duration of watering the plants more dynamic in accordance with the input temperature and humidity, so that the temperature and humidity in the greenhouse are maintained in accordance with the reference condition. Based on the experimental results, the fuzzy logic method is effective at controlling the watering duration and maintaining the optimum temperature and humidity inside the greenhouse.
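A minimal sketch of such a fuzzy watering controller is given below; the membership functions, rule base and output durations are illustrative assumptions, not the authors' actual design.

```python
# Mamdani-style sketch: fuzzify temperature and humidity, fire a small rule
# base, and defuzzify into a watering duration. All parameters are invented.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def watering_duration(temp_c, humidity_pct):
    # Fuzzify the inputs.
    hot   = tri(temp_c, 25, 35, 45)
    warm  = tri(temp_c, 15, 25, 35)
    dry   = tri(humidity_pct, 0, 20, 50)
    moist = tri(humidity_pct, 30, 60, 90)

    # Rule base ("min" implements fuzzy AND); each rule proposes a duration in minutes.
    rules = [
        (min(hot, dry),   10.0),  # hot and dry   -> water long
        (min(hot, moist),  5.0),  # hot but moist -> medium
        (min(warm, dry),   6.0),  # warm and dry  -> medium-long
        (min(warm, moist), 2.0),  # mild and moist -> short
    ]
    # Weighted-average defuzzification.
    num = sum(w * d for w, d in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print(watering_duration(35, 20))   # hot, dry day  -> 10.0 minutes
print(watering_duration(25, 60))   # mild, moist   -> 2.0 minutes
```

Intermediate conditions blend the rules smoothly, which is exactly the "more dynamic" watering duration the abstract describes.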
Directory of Open Access Journals (Sweden)
Serhii D. Bushuiev
2017-12-01
Full Text Available The current state of project management has been steadily demonstrating a trend toward an increasing role for flexible "soft" management practices. A method for preparing decisions on the formation of a value-oriented portfolio, based on a comparison of the levels of internal organizational values, is proposed. The method formalizes the methodological foundations of value-oriented portfolio management in the development of organizations in the form of approaches, basic terms and technological methods using ICT, which makes it possible to use them as an integral knowledge system for creating an automated system for managing the portfolios of organizations. The result of the study is a deepening of the theoretical provisions for managing the development of organizations through the implementation of a value-oriented portfolio of projects, which made it possible to formalize the method of recording value memes in the development portfolios of organizations and to disclose its logic, essence, objective basis and rules.
Edge detection methods based on generalized type-2 fuzzy logic
Gonzalez, Claudia I; Castro, Juan R; Castillo, Oscar
2017-01-01
In this book four new methods are proposed. In the first method the generalized type-2 fuzzy logic is combined with the morphological gradient technique. The second method combines general type-2 fuzzy systems (GT2 FSs) and the Sobel operator; in the third approach the methodology based on the Sobel operator and GT2 FSs is improved to be applied to color images. In the fourth approach, a novel edge-detection method is proposed in which a digital image is converted into a generalized type-2 fuzzy image. The book also includes a comparative study of type-1, interval type-2 and generalized type-2 fuzzy systems as tools to enhance edge detection in digital images when used in conjunction with the morphological gradient and the Sobel operator. The proposed generalized type-2 fuzzy edge-detection methods were tested with benchmark images and synthetic images, in grayscale and color formats. Another contribution of this book is that the generalized type-2 fuzzy edge detector method is applied in the preprocessing...
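The Sobel-plus-fuzzy idea can be sketched as follows, reduced to a type-1 fuzzy membership for brevity (the book's methods use generalized type-2 fuzzy systems); the test image and membership parameters are illustrative.

```python
# Sketch: compute the Sobel gradient magnitude at a pixel, then assign a fuzzy
# degree of "edgeness" instead of a hard threshold. Type-1 membership only;
# a type-2 system would additionally model uncertainty in the membership itself.

SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def gradient_magnitude(img, y, x):
    """Sobel gradient magnitude at interior pixel (y, x)."""
    gx = sum(SOBEL_X[j][i] * img[y + j - 1][x + i - 1]
             for j in range(3) for i in range(3))
    gy = sum(SOBEL_Y[j][i] * img[y + j - 1][x + i - 1]
             for j in range(3) for i in range(3))
    return (gx * gx + gy * gy) ** 0.5

def edge_membership(g, low=0.0, high=4.0):
    """Fuzzy degree (0..1) to which gradient g counts as an edge (ramp)."""
    return max(0.0, min(1.0, (g - low) / (high - low)))

# 5x5 test image: dark left half, bright right half -> one vertical edge.
img = [[0, 0, 1, 1, 1] for _ in range(5)]
row = [round(edge_membership(gradient_magnitude(img, 2, x)), 2)
       for x in range(1, 4)]
print(row)   # high membership at the intensity step, zero in the flat region
```

The fuzzy output can then feed further rules (e.g. combining gradient directions) before a final crisp edge map is produced.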
Methods for converging correlation energies within the dielectric matrix formalism
Dixit, Anant; Claudot, Julien; Gould, Tim; Lebègue, Sébastien; Rocca, Dario
2018-03-01
Within the dielectric matrix formalism, the random-phase approximation (RPA) and analogous methods that include exchange effects are promising approaches to overcome some of the limitations of traditional density functional theory approximations. The RPA-type methods, however, have a significantly higher computational cost and, similarly to correlated quantum-chemical methods, are characterized by slow basis-set convergence. In this work we analyzed two different schemes to converge the correlation energy: one based on a more traditional complete-basis-set extrapolation, and one that converges energy differences by accounting for the size-consistency property. These two approaches have been systematically tested on the A24 test set, for six points on the potential-energy surface of the methane-formaldehyde complex, and for reaction energies involving the breaking and formation of covalent bonds. While both methods converge to similar results at similar rates, the computation of size-consistent energy differences has the advantage of not relying on the choice of a specific extrapolation model.
Foundations of mathematical logic
Curry, Haskell B
2010-01-01
Written by a pioneer of mathematical logic, this comprehensive graduate-level text explores the constructive theory of first-order predicate calculus. It covers formal methods, including algorithms and epitheory, offers a brief treatment of Markov's approach to algorithms, explains elementary facts about lattices and similar algebraic systems, and more. 1963 edition.
Automatic control logics to eliminate xenon oscillation based on Axial Offsets Trajectory Method
International Nuclear Information System (INIS)
Shimazu, Yoichiro
1996-01-01
We have proposed the Axial Offsets (AO) Trajectory Method for xenon oscillation control in pressurized water reactors. A key feature of this method is that it clearly gives the control operations necessary to eliminate xenon oscillations. It is therefore expected that automatic control logics for xenon oscillations based on these features can be simple and easily realized. We investigated such automatic control logics. The AO Trajectory Method yields a very simple logic for merely eliminating xenon oscillations; however, additional considerations were necessary to eliminate a xenon oscillation while achieving a given axial power distribution. Another control logic, based on modern control theory, was also studied for comparison of control performance. The results show that the automatic control logics based on the AO Trajectory Method are very simple and effective. (author)
Kaliszyk, C.; Urban, J.; Vyskocil, J.; Geuvers, J.H.; Watt, S.M.; Davenport, J.H.; Sexton, A.P.; Sojka, P.; Urban, J.
2014-01-01
The goal of this project is to (i) accumulate annotated informal/formal mathematical corpora suitable for training semi-automated translation between informal and formal mathematics by statistical machine-translation methods, (ii) to develop such methods oriented at the formalization task, and in
Digital system verification a combined formal methods and simulation framework
Li, Lun
2010-01-01
Integrated circuit capacity follows Moore's law, and chips are commonly produced at the time of this writing with over 70 million gates per device. Ensuring correct functional behavior of such large designs before fabrication poses an extremely challenging problem. Formal verification validates the correctness of the implementation of a design with respect to its specification through mathematical proof techniques. Formal techniques have been emerging as commercialized EDA tools in the past decade. Simulation remains a predominantly used tool to validate a design in industry. After more than 5
Recent trends related to the use of formal methods in software engineering
Prehn, Soren
1986-01-01
An account is given of some recent developments and trends related to the development and use of formal methods in software engineering. Ongoing activities in Europe are focussed on, since there seems to be a notable difference in attitude towards industrial usage of formal methods in Europe and in the U.S. A more detailed account is given of the currently most widespread formal method in Europe: the Vienna Development Method. Finally, the use of Ada is discussed in relation to the application of formal methods, and the potential for constructing Ada-specific tools based on that method is considered.
An automatic tuning method of a fuzzy logic controller for nuclear reactors
International Nuclear Information System (INIS)
Ramaswamy, P.; Lee, K.Y.; Edwards, R.M.
1993-01-01
The design, and evaluation by simulation, of an automatically tuned fuzzy logic controller is presented. Typically, fuzzy logic controllers are designed based on an expert's knowledge of the process. However, this approach has its limitations in that the controller is hard to optimize or tune to get the desired control action. A method to automate the tuning process using a simplified Kalman filter approach is presented, in which the fuzzy logic controller tracks a suitable reference trajectory. Here, for purposes of illustration, an optimal controller's response is used as the reference trajectory to determine the rules of the fuzzy logic controller automatically. To demonstrate the robustness of this design approach, a nonlinear six-delayed-neutron-group plant is controlled using a fuzzy logic controller that utilizes estimated reactor temperatures from a one-delayed-neutron-group observer. The fuzzy logic controller displayed good stability and performance robustness characteristics for a wide range of operation.
Logic-based methods for optimization combining optimization and constraint satisfaction
Hooker, John
2011-01-01
A pioneering look at the fundamental role of logic in optimization and constraint satisfaction While recent efforts to combine optimization and constraint satisfaction have received considerable attention, little has been said about using logic in optimization as the key to unifying the two fields. Logic-Based Methods for Optimization develops for the first time a comprehensive conceptual framework for integrating optimization and constraint satisfaction, then goes a step further and shows how extending logical inference to optimization allows for more powerful as well as flexible
SACS2: Dynamic and Formal Safety Analysis Method for Complex Safety Critical System
International Nuclear Information System (INIS)
Koh, Kwang Yong; Seong, Poong Hyun
2009-01-01
Fault tree analysis (FTA) is one of the most widely used safety analysis techniques in the development of safety-critical systems. However, over the years, several drawbacks of conventional FTA have become apparent. One major drawback is that conventional FTA uses only static gates and hence cannot precisely capture the dynamic behaviors of complex systems. Although several attempts, such as dynamic fault trees (DFT), PANDORA, and formal fault trees (FFT), have been made to overcome this problem, they still cannot model absolute or actual time, because they adopt a relative time concept and can capture only sequential behaviors of the system. A second drawback of conventional FTA is its lack of rigorous semantics: because it is informal in nature, safety analysis results depend heavily on an analyst's ability and are error-prone. Finally, the reasoning process, which checks whether basic events really cause top events, is done manually and hence is very labor-intensive and time-consuming for complex systems. In this paper, we propose a new qualitative safety analysis method for complex safety-critical systems. We introduce several temporal gates based on timed computation tree logic (TCTL), which can represent a quantitative notion of time. We then translate the information of the fault trees into the UPPAAL query language, and the reasoning process is carried out automatically by UPPAAL, a model checker for time-critical systems.
Newton-Smith, WH
2003-01-01
A complete introduction to logic for first-year university students with no background in logic, philosophy or mathematics. In easily understood steps it shows the mechanics of the formal analysis of arguments.
A comparative study of formal methods for safety critical software in nuclear power plant
International Nuclear Information System (INIS)
Sohn, Se Do; Seong Poong Hyun
2000-01-01
The required ultra-high reliability of safety-critical software cannot be demonstrated by testing alone, so specification based on a formal method is recommended for safety system software. But there exist various kinds of formal methods, and this variety is recognized as an obstacle to their wide use. In this paper, six different formal methods have been applied to the same part of the functional requirements, a part that is calculation-algorithm intensive. The specification results were compared against criteria derived from the characteristics that good software requirements specifications should have and that regulatory bodies recommend. The application experience shows that the critical characteristics should be defined first, and then an appropriate method selected. In our case, the Software Cost Reduction method was recommended for checking internal conditions or calculation algorithms, and the statechart method is recommended for the external behavioral description. (author)
Methodological imperfection and formalizations in scientific activity
International Nuclear Information System (INIS)
Svetlichny, G.
1987-01-01
Any mathematical formalization of scientific activity allows for imperfections in the methodology that is formalized. These can be of three types: dirty, rotten, and damned. Restricting mathematical attention to those methods that cannot be construed as imperfect drastically reduces the class of objects that must be analyzed, and relates all other objects to these more regular ones. Examples are drawn from empirical logic.
Higher order temporal finite element methods through mixed formalisms.
Kim, Jinkyu
2014-01-01
The extended framework of Hamilton's principle and the mixed convolved action principle provide new rigorous weak variational formalism for a broad range of initial boundary value problems in mathematical physics and mechanics. In this paper, their potential when adopting temporally higher order approximations is investigated. The classical single-degree-of-freedom dynamical systems are primarily considered to validate and to investigate the performance of the numerical algorithms developed from both formulations. For the undamped system, all the algorithms are symplectic and unconditionally stable with respect to the time step. For the damped system, they are shown to be accurate with good convergence characteristics.
THE RATIONALE OF LAW. THE ROLE AND IMPORTANCE OF THE LOGICAL METHOD OF INTERPRETATION OF LEGAL NORMS
Directory of Open Access Journals (Sweden)
Mihai BĂDESCU
2017-05-01
Full Text Available The interpretation of law was and remains an indispensable, inherent and most significant element in the application of the law. Through interpretation the aim is to clarify the obscure text, to rectify the imperfections of the text of the legal norm, to remedy its shortcomings, and in consequence to specify the exact meaning of the legal norm. Interpretation concerns itself with emphasizing the authentic meaning of normative texts, finding the spirit of the lawmaker-author, the authentic legal sense of the actions that occurred and of the conduct of the perpetrator, and the significant legal connection of these meanings. The necessity of interpreting legal norms is justified by several considerations, of which the most important remains the fact that the lawmaker cannot and need not provide for everything in the normative text. The unity between the spirit and the letter of the law, the continuity of interpretation, and the useful effect of the legal norm are just a few of the principles that need to be taken into account in interpretation. Be it official (obligatory) or unofficial (doctrinary), interpretation remains an extremely important stage in the application of the law: the literature of the specialty consecrates five important methods of interpretation (grammatical, historical, systematic, teleological, and logical). The latter method allows the interpreter to formulate certain rational assessments through operations of generalization, logical analysis of the text, and analogy, by applying formal logic. The present study will mainly deal with this method, analyzing the main logical arguments used in interpretation.
Metamathematics of fuzzy logic
Hájek, Petr
1998-01-01
This book presents a systematic treatment of deductive aspects and structures of fuzzy logic understood as many valued logic sui generis. Some important systems of real-valued propositional and predicate calculus are defined and investigated. The aim is to show that fuzzy logic as a logic of imprecise (vague) propositions does have well-developed formal foundations and that most things usually named `fuzzy inference' can be naturally understood as logical deduction.
Enzyme-Based Logic Gates and Networks with Output Signals Analyzed by Various Methods.
Katz, Evgeny
2017-07-05
The paper overviews various methods that are used for the analysis of output signals generated by enzyme-based logic systems. The considered methods include optical techniques (optical absorbance, fluorescence spectroscopy, surface plasmon resonance), electrochemical techniques (cyclic voltammetry, potentiometry, impedance spectroscopy, conductivity measurements, use of field effect transistor devices, pH measurements), and various mechanoelectronic methods (using atomic force microscope, quartz crystal microbalance). Although each of the methods is well known for various bioanalytical applications, their use in combination with biomolecular logic systems is rather new and sometimes not trivial. Many of the discussed methods have been combined with the use of signal-responsive materials to transduce and amplify biomolecular signals generated by the logic operations. Interfacing of biocomputing logic systems with electronics and "smart" signal-responsive materials allows logic operations to be extended to actuation functions; for example, stimulating molecular release and switchable features of bioelectronic devices, such as biofuel cells. The purpose of this review article is to emphasize the broad variability of the bioanalytical systems applied for signal transduction in biocomputing processes. All bioanalytical systems discussed in the article are exemplified with specific logic gates and multi-gate networks realized with enzyme-based biocatalytic cascades. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
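Reading a Boolean output out of an analog biochemical signal, as these analysis methods must, can be sketched as below; the response model, gain, leak and threshold are invented for illustration and would be calibrated experimentally in a real system.

```python
# Sketch of thresholding an analog readout (e.g. optical absorbance) from an
# enzyme-based AND gate. The toy model assumes the product, and hence the
# signal, forms only when both substrates are present; `leak` models the
# background signal that makes thresholding necessary in practice.

def and_gate_absorbance(substrate_a, substrate_b, gain=1.0, leak=0.05):
    """Toy analog response of an enzymatic AND gate to two inputs in [0, 1]."""
    return gain * substrate_a * substrate_b + leak

def to_logic(signal, threshold=0.5):
    """Map the analog signal onto a Boolean output value."""
    return 1 if signal >= threshold else 0

for a in (0.0, 1.0):
    for b in (0.0, 1.0):
        y = to_logic(and_gate_absorbance(a, b))
        print(f"A={a:.0f} B={b:.0f} -> {y}")   # only A=1, B=1 yields 1
```

The same thresholding step applies whatever the transduction channel is (absorbance, current, frequency shift); only the calibration of `gain`, `leak` and `threshold` changes.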
On the Application of Formal Methods to Clinical Guidelines, an Artificial Intelligence Perspective
Hommersom, A.J.
2008-01-01
In computer science, all kinds of methods and techniques have been developed to study systems, such as simulation of the behaviour of a system. Furthermore, it is possible to study these systems by proving formal properties or by searching through all the possible states that a system may be
A Mixed Methods Study of Barriers to Formal Diagnosis of Autism Spectrum Disorder in Adults
Lewis, Laura Foran
2017-01-01
Delayed diagnosis of autism spectrum disorder (ASD) into adulthood is common, and self-diagnosis is a growing phenomenon. This mixed methods study aimed to explore barriers to formal diagnosis of ASD in adults. In a qualitative strand, secondary analysis of data on the experiences of 114 individuals who were self-diagnosed or formally diagnosed…
Directory of Open Access Journals (Sweden)
V. I. Freyman
2015-11-01
Full Text Available Subject of Research. Representation features of education results for competence-based educational programs are analyzed. The importance of decoding and proficiency estimation for elements and components of discipline parts of competences is shown. The purpose and objectives of the research are formulated. Methods. The paper deals with methods of mathematical logic, Boolean algebra, and parametric analysis of complex diagnostic test results that control the proficiency of some discipline competence elements. Results. A method of logical condition analysis is created. It makes it possible to formulate logical conditions for determining the proficiency of each discipline competence element controlled by a complex diagnostic test. The normalized test result is divided into non-intersecting zones, and a logical condition about the proficiency of the controlled elements is formulated for each of them. Summarized characteristics of the test result zones are given. An example of forming logical conditions for a diagnostic test with preset features is provided. Practical Relevance. The proposed method of logical condition analysis is applied in the decoding algorithm of proficiency test diagnosis for discipline competence elements. It makes it possible to automate the search procedure for elements with insufficient proficiency, and is also usable for estimating the education results of a discipline or a component of a competence-based educational program.
Method of ATMS operators in the formalism of Faddeev equations
International Nuclear Information System (INIS)
Zubarev, D.A.
1991-01-01
The method of ATMS operators is generalized to the case of the Faddeev equations. A method to construct effective equations for both elastic scattering and scattering with rearrangement is presented. Properties of the obtained equations are considered.
INFORMATIONAL-METHODICAL SUPPORT OF THE COURSE «MATHEMATICAL LOGIC AND THEORY OF ALGORITHMS»
Directory of Open Access Journals (Sweden)
Y. I. Sinko
2010-06-01
Full Text Available In this article the basic principles of the technique for training future teachers of mathematics in the foundations of mathematical logic and the theory of algorithms at the Kherson State University, with the use of information technologies, are examined. A general description is given of the functioning of the methodical system for learning mathematical logic with the use of information technologies, in the variant where information technologies are represented by the integrated specialized educational software environment «MatLog».
Scalable Techniques for Formal Verification
Ray, Sandip
2010-01-01
This book presents state-of-the-art approaches to formal verification techniques to seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking to get a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issues.
Multi-objective decision-making under uncertainty: Fuzzy logic methods
Hardy, Terry L.
1995-01-01
Fuzzy logic allows for quantitative representation of vague or fuzzy objectives, and therefore is well-suited for multi-objective decision-making. This paper presents methods employing fuzzy logic concepts to assist in the decision-making process. In addition, this paper describes software developed at NASA Lewis Research Center for assisting in the decision-making process. Two diverse examples are used to illustrate the use of fuzzy logic in choosing an alternative among many options and objectives. One example is the selection of a lunar lander ascent propulsion system, and the other example is the selection of an aeration system for improving the water quality of the Cuyahoga River in Cleveland, Ohio. The fuzzy logic techniques provided here are powerful tools which complement existing approaches, and therefore should be considered in future decision-making activities.
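The core idea of fuzzy multi-objective selection can be sketched as follows; the alternatives, objectives and membership scores are invented for illustration, and min-aggregation is only one of the operators such methods use.

```python
# Sketch of fuzzy multi-objective decision-making: each alternative gets a
# membership score in [0, 1] per objective, and a fuzzy aggregation operator
# (here "min": an alternative is only as good as its worst objective) ranks
# the alternatives. All names and scores are illustrative.

alternatives = {
    "option_A": {"cost": 0.9, "performance": 0.4, "risk": 0.7},
    "option_B": {"cost": 0.6, "performance": 0.8, "risk": 0.6},
    "option_C": {"cost": 0.3, "performance": 0.9, "risk": 0.9},
}

def aggregate(memberships):
    """Conservative min-aggregation over the objective memberships."""
    return min(memberships.values())

best = max(alternatives, key=lambda name: aggregate(alternatives[name]))
print(best)   # option_B: its worst objective (0.6) beats A's (0.4) and C's (0.3)
```

Replacing `min` with a weighted product or mean changes how much a single weak objective penalizes an alternative, which is exactly the kind of modeling choice these fuzzy decision methods make explicit.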
Industrial applications of formal methods to model, design and analyze computer systems
Craigen, Dan
1995-01-01
Formal methods are mathematically-based techniques, often supported by reasoning tools, that can offer a rigorous and effective way to model, design and analyze computer systems. The purpose of this study is to evaluate international industrial experience in using formal methods. The cases selected are representative of industrial-grade projects and span a variety of application domains. The study had three main objectives: · To better inform deliberations within industry and government on standards and regulations; · To provide an authoritative record on the practical experience of formal m
Practice-Oriented Formal Methods to Support the Software Development of Industrial Control Systems
AUTHOR|(CDS)2088632; Blanco Viñuela, Enrique
Formal specification and verification methods provide ways to describe requirements precisely and to check whether the requirements are satisfied by the design or the implementation. In other words, they can prevent development faults and therefore improve the quality of the developed systems. These methods are part of the state-of-the-practice in application domains with high criticality, such as avionics, railway or nuclear industry. The situation is different in the industrial control systems domain. As the criticality of the systems is much lower, formal methods are rarely used. The two main obstacles to using formal methods in systems with low- or medium-criticality are performance and usability. Overcoming these obstacles often needs deep knowledge and high effort. Model checking, one of the main formal verification techniques, is computationally difficult, therefore the analysis of non-trivial systems requires special considerations. Furthermore, the mainly academic tools implementing different model c...
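As background to why model checking is computationally difficult: at its core it is an exhaustive search of the system's state space. A minimal explicit-state reachability check can be sketched as follows; the toy counter model is illustrative, not taken from the thesis.

```python
# Minimal explicit-state reachability check (the core of model checking):
# exhaustively explore a transition system and report whether a "bad"
# state is reachable. The state space blows up combinatorially for
# non-trivial systems, which is the performance obstacle discussed above.
from collections import deque

def reachable_bad(initial, successors, is_bad):
    """Breadth-first search over the state space."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        s = frontier.popleft()
        if is_bad(s):
            return True
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return False

# Toy model: a mod-8 counter stepping by +2 must never reach 5.
succ = lambda s: [(s + 2) % 8]
print(reachable_bad(0, succ, lambda s: s == 5))  # prints: False
```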
An Introduction to Formal Methods for the Development of Safety-critical Applications
DEFF Research Database (Denmark)
Haxthausen, Anne Elisabeth
2010-01-01
This report is a delivery to The Danish Government's railway authority, Trafikstyrelsen, as a part of the Public Sector Consultancy service offered by the Technical University of Denmark. The purpose of the report is to give the reader an insight into the state-of-the-art of formal methods. The reader...... is assumed to have some knowledge about software development, but not on formal methods. The background for the railway authorities' interest in formal methods is the fact that during the next decade a total renewal of the Danish signalling infrastructure is going to take place. Central parts of the new...... systems will be software components that must fulfill strong safety requirements: in order to get the software certified at the highest Safety Integrity Levels of the European CENELEC standards for railway applications, the software providers are expected to use formal methods....
The relationship between the Wigner-Weyl kinetic formalism and the complex geometrical optics method
Maj, Omar
2004-01-01
The relationship between two different asymptotic techniques developed in order to describe the propagation of waves beyond the standard geometrical optics approximation, namely, the Wigner-Weyl kinetic formalism and the complex geometrical optics method, is addressed. More specifically, a solution of the wave kinetic equation, relevant to the Wigner-Weyl formalism, is obtained which yields the same wavefield intensity as the complex geometrical optics method. Such a relationship is also disc...
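For reference, a standard one-dimensional form of the Wigner transform that underlies such wave kinetic descriptions (textbook material, not quoted from the paper) is

```latex
W(x,k)=\frac{1}{2\pi}\int E\!\left(x+\frac{s}{2}\right)E^{*}\!\left(x-\frac{s}{2}\right)e^{-iks}\,ds ,
```

which in the geometrical-optics limit obeys a Liouville-type wave kinetic equation, so that $W$ plays the role of a phase-space wavefield intensity.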
DEFF Research Database (Denmark)
Bentzen, Martin Mose
2014-01-01
A new deontic logic, Action Type Deontic Logic, is presented. To motivate this logic, a number of benchmark cases are shown, representing inferences a deontic logic should validate. Some of the benchmark cases are singled out for further comments and some formal approaches to deontic reasoning...... are evaluated with respect to the benchmark cases. After that follows an informal introduction to the ideas behind the formal semantics, focussing on the distinction between action types and action tokens. Then the syntax and semantics of Action Type Deontic Logic is presented and it is shown to meet...
Analysis and synthesis of a logic control circuit by binary analysis methods
International Nuclear Information System (INIS)
Chicheportiche, Armand
1974-06-01
The analytical study of the logic circuits described in this report clearly shows the efficiency of the methods proposed by Binary Analysis. This study is a very new approach in logic, and these mathematical methods are systematically precise in their applications. The detailed operations of an automatic system can be studied in a way which cannot be reached by other methods. The definition and utilization of transition equations allow the determination of the different commutations in the auxiliary switch functions of a sequential system. This new way of analyzing digital circuits will certainly develop in the near future [fr]
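The transition-equation idea can be sketched by brute force: for a Boolean function, enumerate the single-variable input changes that commute (toggle) the output. The latch-like function and variable names below are illustrative, not from the report.

```python
# Sketch of the "transition" idea from binary analysis: find every
# (state, variable) pair where flipping that one variable toggles the
# output of a Boolean function f. Enumeration is exponential in the
# number of inputs, so this is only viable for small circuits.
from itertools import product

def transitions(f, n):
    """Yield (input_vector, variable_index) pairs where flipping that variable toggles f."""
    for bits in product([0, 1], repeat=n):
        for i in range(n):
            flipped = list(bits)
            flipped[i] ^= 1
            if f(*bits) != f(*flipped):
                yield bits, i

# Hypothetical latch-like function: f(set, reset, q) = set or (q and not reset)
f = lambda s, r, q: int(s or (q and not r))
commutations = sorted(set(transitions(f, 3)))
```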
A survey of formal methods for determining functional joint axes.
Ehrig, Rainald M; Taylor, William R; Duda, Georg N; Heller, Markus O
2007-01-01
Axes of rotation, e.g., at the knee, are often generated from clinical gait analysis data to be used in the assessment of kinematic abnormalities, the diagnosis of disease, or the ongoing monitoring of a patient's condition. They are additionally used in musculoskeletal models to aid in the description of joint and segment kinematics for patient-specific analyses. Currently available methods to describe joint axes from segment marker positions share the problem that when one segment is transformed into the coordinate system of another, artefacts associated with motion of the markers relative to the bone can become magnified. In an attempt to address this problem, a symmetrical axis of rotation approach (SARA) is presented here to determine a unique axis of rotation that can consider the movement of two dynamic body segments simultaneously, and its performance is then compared in a survey against a number of previously proposed techniques. Using a generated virtual joint, with superimposed marker error conditions to represent skin movement artefacts, fitting methods (geometric axis fit, cylinder axis fit, algebraic axis fit) and transformation techniques (axis transformation technique, mean helical axis, Schwartz approach) were classified and compared with the SARA. Nearly all approaches were able to estimate the axis of rotation to within an RMS error of 0.1 cm at large ranges of motion (90 degrees). Although the geometric axis fit produced the least RMS error of approximately 1.2 cm at lower ranges of motion (5 degrees) with a stationary axis, the SARA and the axis transformation technique outperformed all other approaches under the most demanding marker artefact conditions for all ranges of motion. The cylinder and algebraic axis fit approaches were unable to compute competitive AoR estimates. Whilst these initial results using the SARA are promising and the computations are fast enough to be determined "on-line", the technique must now be proven in a clinical environment.
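A much simpler relative of the fitting methods surveyed here can be sketched to show the underlying geometry: a marker on a segment rotating about a fixed axis traces a circle in a plane perpendicular to that axis, so the direction of least variance of its trajectory estimates the axis direction. This toy illustration is not the SARA algorithm, and the synthetic trajectory is invented.

```python
# Toy axis-of-rotation estimate: the smallest principal component of a
# marker trajectory that circles a fixed axis points along that axis.
import numpy as np

def axis_direction(trajectory):
    """trajectory: (N, 3) marker positions; returns a unit axis estimate."""
    centered = trajectory - trajectory.mean(axis=0)
    # Smallest right singular vector = direction of least variance.
    _, _, vt = np.linalg.svd(centered)
    return vt[-1]

# Synthetic marker at radius 1 rotating about the z-axis over a
# 90-degree range of motion (noise-free, so the estimate is exact).
angles = np.linspace(0, np.pi / 2, 50)
traj = np.column_stack([np.cos(angles), np.sin(angles), np.full(50, 0.3)])
axis = axis_direction(traj)  # approximately +/- [0, 0, 1]
```

Real marker data includes skin-movement artefacts, which is precisely why the paper compares more robust formulations such as the SARA.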
Mathematical logic foundations for information science
Li, Wei
2014-01-01
Mathematical logic is a branch of mathematics that takes axiom systems and mathematical proofs as its objects of study. This book shows how it can also provide a foundation for the development of information science and technology. The first five chapters systematically present the core topics of classical mathematical logic, including the syntax and models of first-order languages, formal inference systems, computability and representability, and Gödel’s theorems. The last five chapters present extensions and developments of classical mathematical logic, particularly the concepts of version sequences of formal theories and their limits, the system of revision calculus, proschemes (formal descriptions of proof methods and strategies) and their properties, and the theory of inductive inference. All of these themes contribute to a formal theory of axiomatization and its application to the process of developing information technology and scientific theories. The book also describes the paradigm of three kinds...
Methods for testing the logical structure of plant procedure documents
International Nuclear Information System (INIS)
Horne, C.P.; Colley, R.; Fahley, J.M.
1990-01-01
This paper describes an ongoing EPRI project to investigate computer-based methods to improve the development, maintenance, and verification of plant operating procedures. This project began as an evaluation of the applicability of structured software analysis methods to operating procedures. It was found that these methods offer benefits if procedures are transformed to a structured representation to make them amenable to computer analysis. The next task was to investigate methods to transform procedures into a structured representation. The use of natural language techniques to read and compile the procedure documents appears to be viable for this purpose and supports conformity to guidelines. The final task was to consider possibilities of automated verification methods for procedures. Methods to help verify procedures were defined and information requirements specified. These methods take the structured representation of procedures as input. The software system being constructed in this project is called PASS, standing for Procedures Analysis Software System
Bolc, Leonard
1992-01-01
Many-valued logics were developed as an attempt to handle philosophical doubts about the "law of excluded middle" in classical logic. The first many-valued formal systems were developed by J. Lukasiewicz in Poland and E. Post in the U.S.A. in the 1920s, and since then the field has expanded dramatically as the applicability of the systems to other philosophical and semantic problems was recognized. Intuitionistic logic, for example, arose from deep problems in the foundations of mathematics. Fuzzy logics, approximation logics, and probability logics all address questions that classical logic alone cannot answer. All these interpretations of many-valued calculi motivate specific formal systems that allow detailed mathematical treatment. In this volume, the authors are concerned with finite-valued logics, and especially with three-valued logical calculi. Matrix constructions, axiomatizations of propositional and predicate calculi, syntax, semantic structures, and methodology are discussed. Separate chapters deal w...
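The three-valued calculus at the heart of such matrix constructions is small enough to write down directly. The following sketch implements the standard Łukasiewicz three-valued connectives over the values 0 (false), 1/2 (undetermined), and 1 (true), and shows the failure of excluded middle that motivated the field.

```python
# Lukasiewicz three-valued logic: negation = 1 - x, conjunction = min,
# disjunction = max, implication x -> y = min(1, 1 - x + y).
from fractions import Fraction

HALF = Fraction(1, 2)

neg  = lambda x: 1 - x
conj = lambda x, y: min(x, y)
disj = lambda x, y: max(x, y)
impl = lambda x, y: min(Fraction(1), 1 - x + y)

# The law of excluded middle fails: p or not-p is only 1/2 when p is 1/2.
print(disj(HALF, neg(HALF)))  # prints: 1/2
```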
Classical Mathematical Logic The Semantic Foundations of Logic
Epstein, Richard L
2011-01-01
In Classical Mathematical Logic, Richard L. Epstein relates the systems of mathematical logic to their original motivations to formalize reasoning in mathematics. The book also shows how mathematical logic can be used to formalize particular systems of mathematics. It sets out the formalization not only of arithmetic, but also of group theory, field theory, and linear orderings. These lead to the formalization of the real numbers and Euclidean plane geometry. The scope and limitations of modern logic are made clear in these formalizations. The book provides detailed explanations of all proo
Formal Solutions for Polarized Radiative Transfer. II. High-order Methods
Energy Technology Data Exchange (ETDEWEB)
Janett, Gioele; Steiner, Oskar; Belluzzi, Luca, E-mail: gioele.janett@irsol.ch [Istituto Ricerche Solari Locarno (IRSOL), 6605 Locarno-Monti (Switzerland)
2017-08-20
When integrating the radiative transfer equation for polarized light, the necessity of high-order numerical methods is well known. In fact, well-performing high-order formal solvers enable higher accuracy and the use of coarser spatial grids. Aiming to provide a clear comparison between formal solvers, this work presents different high-order numerical schemes and applies the systematic analysis proposed by Janett et al., emphasizing their advantages and drawbacks in terms of order of accuracy, stability, and computational cost.
Applying formal method to design of nuclear power plant embedded protection system
International Nuclear Information System (INIS)
Kim, Jin Hyun; Kim, Il Gon; Sung, Chang Hoon; Choi, Jin Young; Lee, Na Young
2001-01-01
Nuclear power embedded protection systems are typical safety-critical systems: they detect failures and shut down the operation of the nuclear reactor. Because failures of these systems are very dangerous, they absolutely require safety and reliability. Therefore, a nuclear power embedded protection system should undergo complete verification and validation from the design stage onward. Various V&V methods have been provided for developing embedded systems, and design using formal methods in particular has been studied in other advanced countries. In this paper, we introduce a design method for nuclear power embedded protection systems using various formal methods, in various respects following the nuclear power plant software development guideline
Analysis of Formal Methods for Specification of E-Commerce Applications
Directory of Open Access Journals (Sweden)
Sadiq Ali Khan
2016-01-01
Full Text Available E-commerce applications are characterized by high dynamics and a decentralized nature, and the extreme emphasis on structural design and implementation makes such applications highly valued. Significant research reveals that applying formal methods to the challenges inherent in e-commerce applications contributes to the reliability and robustness of the system. Anticipating and designing a sturdy e-process and its concurrent implementation gives application behavior extra strength against errors, fraud, and hacking, minimizing program faults during operation. Programmers find it extremely difficult, though not impossible, to guarantee correct processing under all circumstances. Concealed flaws and errors, triggered only under unexpected and unanticipated scenarios, lead to subtle mistakes and appalling failures. Code authors utilize various formal methods to reduce these flaws; prominent methods include ASM (Abstract State Machines), the B-Method, the Z language, and UML (Unified Modelling Language). This paper primarily focuses on the different formal methods applied in specification and verification techniques for cost-effective e-commerce applications.
A fuzzy logic based PROMETHEE method for material selection problems
Directory of Open Access Journals (Sweden)
Muhammet Gul
2018-03-01
Full Text Available Material selection is a complex problem in the design and development of products for diverse engineering applications. This paper presents a fuzzy PROMETHEE (Preference Ranking Organization Method for Enrichment Evaluation) method based on trapezoidal fuzzy interval numbers that can be applied to the selection of materials for an automotive instrument panel. It also makes a significant contribution to the literature by applying a fuzzy decision-making approach to material selection problems. The method is illustrated, validated, and compared against three different fuzzy MCDM methods (fuzzy VIKOR, fuzzy TOPSIS, and fuzzy ELECTRE) in terms of ranking performance. The relationships between the compared methods and the proposed scenarios for fuzzy PROMETHEE are evaluated via Spearman's correlation coefficient. Styrene Maleic Anhydride and Polypropylene are identified as suitable materials for the automotive instrument panel case. We propose a generic fuzzy MCDM methodology that can be practically applied to material selection problems. The main advantages of the methodology are its consideration of the vagueness, uncertainty, and fuzziness of the decision-making environment.
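The outranking mechanics of PROMETHEE II can be sketched with crisp numbers (the paper's trapezoidal-fuzzy version reduces to something like this after defuzzification). The materials, criteria, weights, and scores below are invented for illustration and are not the paper's data.

```python
# Crisp PROMETHEE II sketch: pairwise preferences aggregated into
# positive and negative outranking flows; alternatives are ranked by
# net flow. Uses the "usual" preference function (P = 1 if better).
def promethee_ii(scores, weights):
    """scores: {alt: [criterion values]}, higher is better on every criterion.
    Returns alternatives ranked by net outranking flow."""
    alts = list(scores)
    n = len(alts)
    def preference(a, b):
        # Weighted count of criteria on which a strictly beats b.
        return sum(w for w, x, y in zip(weights, scores[a], scores[b]) if x > y)
    flows = {}
    for a in alts:
        plus  = sum(preference(a, b) for b in alts if b != a) / (n - 1)
        minus = sum(preference(b, a) for b in alts if b != a) / (n - 1)
        flows[a] = plus - minus
    return sorted(alts, key=flows.get, reverse=True)

weights = [0.5, 0.3, 0.2]  # hypothetical criterion weights
scores = {"SMA": [7, 6, 5], "PP": [5, 8, 7], "ABS": [6, 5, 4]}
ranking = promethee_ii(scores, weights)
```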
A Mathematical Formalization Proposal for Business Growth
Directory of Open Access Journals (Sweden)
Gheorghe BAILESTEANU
2013-01-01
Full Text Available Economic sciences have known a spectacular evolution in the last century, beginning to use axiomatic methods and applying mathematical instruments as decision-making tools. The quest for formalization needs to be addressed from various angles: reducing entry and operating formal costs, increasing the incentives for firms to operate formally, reducing obstacles to their growth, and searching for inexpensive approaches through which to enforce compliance with government regulations. This paper proposes a formalized approach to business growth, based on mathematics and logic, taking into consideration the particularities of the economic sector.
Interferometric architectures based All-Optical logic design methods and their implementations
Singh, Karamdeep; Kaur, Gurmeet
2015-06-01
All-Optical Signal Processing is an emerging technology that avoids costly optical-electronic-optical (O-E-O) conversions, which are usually compulsory in traditional electronic signal processing systems, thus greatly enhancing the operating bit rate, with added advantages such as electromagnetic interference immunity and low power consumption. In order to implement complex signal processing tasks, All-Optical logic gates are required as backbone elements. This review describes the advances in the field of All-Optical logic design methods based on interferometric architectures such as the Mach-Zehnder Interferometer (MZI), Sagnac interferometers and the Ultrafast Non-Linear Interferometer (UNI). All-Optical logic implementations for the realization of arithmetic and signal processing applications based on each interferometric arrangement are also presented in a categorized manner.
System and method for embedding emotion in logic systems
Curtis, Steven A. (Inventor)
2012-01-01
A system, method, and computer-readable media for creating a stable synthetic neural system. The method includes training an intellectual choice-driven synthetic neural system (SNS), training an emotional rule-driven SNS by generating emotions from rules, incorporating the rule-driven SNS into the choice-driven SNS through an evolvable interface, and balancing the emotional SNS and the intellectual SNS to achieve stability in a nontrivial autonomous environment with a Stability Algorithm for Neural Entities (SANE). Generating emotions from rules can include coding the rules into the rule-driven SNS in a self-consistent way. Training the emotional rule-driven SNS can occur during a training stage in parallel with training the choice-driven SNS. The training stage can include a self-assessment loop which measures performance characteristics of the rule-driven SNS against core genetic code. The method uses a stability threshold to measure stability of the incorporated rule-driven SNS and choice-driven SNS using SANE.
Ash-Shiddieqy, M. H.; Suparmi, A.; Sunarno, W.
2018-04-01
The purpose of this research is to determine the effectiveness of a module based on the guided inquiry method in improving students' logical thinking ability. The research evaluates students' logical ability after learning activities that used the developed physics module based on the guided inquiry method, using a test instrument adapted from the TOLT instrument. The sample consists of 68 students of grade XI taken from SMA Negeri 4 Surakarta. The results show that in both the experimental and control classes, the posttest value for the probabilistic reasoning aspect is the highest, whereas the posttest value for the proportional reasoning aspect is the lowest. The average N-gain in the experimental class is 0.39, while in the control class it is 0.30. Since the N-gain values obtained in the experimental class are larger than in the control class, the guided inquiry-based module is considered more effective for improving students' logical thinking. The data obtained from the research also show that the module helps teachers and students in learning activities. The developed physics module is integrated with every syntax present in the guided inquiry method, so it can be used to improve students' logical thinking ability.
Baxter, Susan K; Blank, Lindsay; Woods, Helen Buckley; Payne, Nick; Rimmer, Melanie; Goyder, Elizabeth
2014-05-10
There is increasing interest in innovative methods to carry out systematic reviews of complex interventions. Theory-based approaches, such as logic models, have been suggested as a means of providing additional insights beyond that obtained via conventional review methods. This paper reports the use of an innovative method which combines systematic review processes with logic model techniques to synthesise a broad range of literature. The potential value of the model produced was explored with stakeholders. The review identified 295 papers that met the inclusion criteria. The papers consisted of 141 intervention studies and 154 non-intervention quantitative and qualitative articles. A logic model was systematically built from these studies. The model outlines interventions, short term outcomes, moderating and mediating factors and long term demand management outcomes and impacts. Interventions were grouped into typologies of practitioner education, process change, system change, and patient intervention. Short-term outcomes identified that may result from these interventions were changed physician or patient knowledge, beliefs or attitudes and also interventions related to changed doctor-patient interaction. A range of factors which may influence whether these outcomes lead to long term change were detailed. Demand management outcomes and intended impacts included content of referral, rate of referral, and doctor or patient satisfaction. The logic model details evidence and assumptions underpinning the complex pathway from interventions to demand management impact. The method offers a useful addition to systematic review methodologies. PROSPERO registration number: CRD42013004037.
Structural logical relations with case analysis and equality reasoning
DEFF Research Database (Denmark)
Rasmussen, Ulrik Terp; Filinski, Andrzej
2013-01-01
requires the assertion logic to be extended with reasoning principles not present in the original presentation of the formalization method. We address this by generalizing the assertion logic to include dependent sorts, and demonstrate that the original cut elimination proof continues to apply without...
Relational Parametricity and Separation Logic
DEFF Research Database (Denmark)
Birkedal, Lars; Yang, Hongseok
2008-01-01
Separation logic is a recent extension of Hoare logic for reasoning about programs with references to shared mutable data structures. In this paper, we provide a new interpretation of the logic for a programming language with higher types. Our interpretation is based on Reynolds's relational...... parametricity, and it provides a formal connection between separation logic and data abstraction. Publication date: 2008...
Mathematical logic foundations for information science
Li, Wei
2010-01-01
This book presents the basic principles and formal calculus of mathematical logic. It covers core contents, extensions and developments of classical mathematical logic, and it offers formal proofs and concrete examples for all theoretical results.
Towards practical defeasible reasoning for description logics
CSIR Research Space (South Africa)
Casini, G
2013-07-01
Full Text Available The formalisation of defeasible reasoning in automated systems is becoming increasingly important. Description Logics (DLs) are nowadays the main logical formalism in the field of formal ontologies. Our focus in this paper is to devise a practical...
Preferential reasoning for modal logics
CSIR Research Space (South Africa)
Britz, K
2011-11-01
Full Text Available Modal logic is the foundation for a versatile and well-established class of knowledge representation formalisms in artificial intelligence. Enriching modal logics with non-monotonic reasoning capabilities such as preferential reasoning as developed...
Directory of Open Access Journals (Sweden)
Fernando Almeida
2017-12-01
Full Text Available Many patients present to mental health clinics with depressive symptoms, anxiety, psychosomatic complaints, and sleeping problems. These symptoms may originate from marital problems, conflictual interpersonal relationships, problems in securing work, and housing issues, among many others. These issues might underlie the difficulties that patients face in maintaining faultless logical reasoning (FLR) and faultless logical functioning (FLF). FLR implies correctly assessing premises, rules, and conclusions; FLF implies assessing not only FLR, but also the circumstances, life experience, personality, and events that validate a conclusion. The symptomatology is almost always accompanied by intense emotional changes. Clinical experience shows that a logic-based psychotherapy (LBP) approach is rarely practiced, and that therapists resort to psychopharmacotherapy or other psychotherapeutic approaches that are not focused on logical reasoning and, especially, logical functioning. As a result, patients do not learn to overcome their reasoning and functioning errors. The aim of this work was to investigate how LBP improves patients' ability to think and function in a faultless logical way; for this purpose, the treatment of three patients is described. With this psychotherapeutic approach, patients gain knowledge that can be applied not only to the issues that led them to the consultation, but also to other problems they have experienced, thus creating a learning experience and helping to prevent such patients from becoming involved in similar problematic situations. This highlights that LBP is a way of treating symptoms that interfere on some level with daily functioning. This psychotherapeutic approach is relevant for improving patients' quality of life, and it fills a gap in the literature by describing
An Entry Point for Formal Methods: Specification and Analysis of Event Logs
Directory of Open Access Journals (Sweden)
Howard Barringer
2010-03-01
Full Text Available Formal specification languages have long languished, due to the grave scalability problems faced by complete verification methods. Runtime verification promises to use formal specifications to automate part of the more scalable art of testing, but has not been widely applied to real systems, and often falters due to the cost and complexity of instrumentation for online monitoring. In this paper we discuss work in progress to apply an event-based specification system to the logging mechanism of the Mars Science Laboratory mission at JPL. By focusing on log analysis, we exploit the "instrumentation" already implemented and required for communicating with the spacecraft. We argue that this work both shows a practical method for using formal specifications in testing and opens interesting research avenues, including a challenging specification learning problem.
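The log-analysis idea can be sketched with a minimal runtime-verification-style monitor that checks an event log against a response property ("every trigger is eventually matched by a response"). The event names and logs below are invented, not the MSL specification language.

```python
# Minimal offline log monitor for a response property: every 'trigger'
# event must eventually be matched by a 'response' event later in the log.
def check_response(log, trigger, response):
    pending = 0
    for event in log:
        if event == trigger:
            pending += 1
        elif event == response and pending:
            pending -= 1
    # The property holds iff no trigger is left unmatched at end of log.
    return pending == 0

ok  = check_response(["open", "write", "close"], "open", "close")
bad = check_response(["open", "write", "open", "close"], "open", "close")
```

Because the check runs over logs the system already emits, no extra instrumentation is needed, which is exactly the entry point the paper argues for.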
A peak value searching method of the MCA based on digital logic devices
International Nuclear Information System (INIS)
Sang Ziru; Huang Shanshan; Chen Lian; Jin Ge
2010-01-01
Digital multi-channel analyzers play an increasingly important role in multi-channel pulse-height analysis. The trend toward digitalization is characterized by powerful pulse-processing ability, high throughput, and improved stability and flexibility. This paper introduces a method of searching the peak value of a waveform based on digital logic with an FPGA. The method reduces the dead time, and offline data correction can improve the non-linearity of the MCA. The α energy spectrum of 241Am is given. (authors)
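The streaming peak search can be sketched in software much as it would be coded in FPGA logic: track the running maximum while a pulse is above threshold and latch it when the signal falls back below. The sample values and threshold are illustrative, not from the paper.

```python
# Streaming peak-value search over a digitized pulse train, mirroring
# what a pipelined FPGA comparator/register pair would do: hold the
# running maximum during each pulse and latch it on the falling edge.
def peak_values(samples, threshold):
    """Return the peak value of each pulse that exceeds the threshold."""
    peaks, current, in_pulse = [], 0, False
    for s in samples:
        if s > threshold:
            in_pulse = True
            current = max(current, s)   # running maximum within the pulse
        elif in_pulse:                  # falling edge: latch peak, reset
            peaks.append(current)
            current, in_pulse = 0, False
    return peaks                        # a pulse still open at end-of-stream is dropped

pulse_train = [0, 1, 5, 9, 7, 2, 0, 0, 3, 8, 12, 6, 1, 0]
print(peak_values(pulse_train, 2))  # prints: [9, 12]
```

Histogramming these latched peaks channel by channel is what builds the energy spectrum.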
International Nuclear Information System (INIS)
Caudrelier, Jean-Michel; Vial, Stephane; Gibon, David; Kulik, Carine; Fournier, Charles; Castelain, Bernard; Coche-Dequeant, Bernard; Rousseau, Jean
2003-01-01
Purpose: Three-dimensional (3D) volume determination is one of the most important problems in conformal radiation therapy. Techniques of volume determination from tomographic medical imaging are usually based on two-dimensional (2D) contour definition with the result dependent on the segmentation method used, as well as on the user's manual procedure. The goal of this work is to describe and evaluate a new method that reduces the inaccuracies generally observed in the 2D contour definition and 3D volume reconstruction process. Methods and Materials: This new method has been developed by integrating the fuzziness in the 3D volume definition. It first defines semiautomatically a minimal 2D contour on each slice that definitely contains the volume and a maximal 2D contour that definitely does not contain the volume. The fuzziness region in between is processed using possibility functions in possibility theory. A volume of voxels, including the membership degree to the target volume, is then created on each slice axis, taking into account the slice position and slice profile. A resulting fuzzy volume is obtained after data fusion between multiorientation slices. Different studies have been designed to evaluate and compare this new method of target volume reconstruction and a classical reconstruction method. First, target definition accuracy and robustness were studied on phantom targets. Second, intra- and interobserver variations were studied on radiosurgery clinical cases. Results: The absolute volume errors are less than or equal to 1.5% for phantom volumes calculated by the fuzzy logic method, whereas the values obtained with the classical method are much larger than the actual volumes (absolute volume errors up to 72%). With increasing MRI slice thickness (1 mm to 8 mm), the phantom volumes calculated by the classical method are increasing exponentially with a maximum absolute error up to 300%. In contrast, the absolute volume errors are less than 12% for phantom
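The fuzzy-volume idea can be sketched on a spherical phantom: each voxel gets a membership degree (1 inside the minimal contour, 0 outside the maximal contour, linear in between), and the fuzzy volume is the membership-weighted sum of voxel volumes. The ramp shape, radii, and grid are illustrative simplifications of the possibility-theory machinery in the paper.

```python
# Toy fuzzy volume of a spherical phantom: membership is 1 for
# r <= r_min (definitely inside), 0 for r >= r_max (definitely outside),
# and linear in the fuzziness region in between.
import numpy as np

def fuzzy_sphere_volume(r_min, r_max, grid_halfsize=20, voxel=0.5):
    axis = np.arange(-grid_halfsize, grid_halfsize + 1) * voxel
    x, y, z = np.meshgrid(axis, axis, axis, indexing="ij")
    r = np.sqrt(x**2 + y**2 + z**2)
    membership = np.clip((r_max - r) / (r_max - r_min), 0.0, 1.0)
    # Fuzzy volume = membership-weighted sum of voxel volumes.
    return membership.sum() * voxel**3

# With min and max contours tight around radius 5, the fuzzy volume
# approaches the crisp sphere volume (4/3)*pi*5^3 ~ 523.6.
v = fuzzy_sphere_volume(r_min=4.8, r_max=5.2)
```

The symmetric ramp is why the fuzzy estimate stays close to the true volume even at coarse voxel sizes, in contrast to crisp contouring on thick slices.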
International Nuclear Information System (INIS)
Turek, M.; Heiden, W.; Riesen, A.; Chhabda, T.A.; Schubert, J.; Zander, W.; Krueger, P.; Keusgen, M.; Schoening, M.J.
2009-01-01
The cross-sensitivity of chemical sensors for several metal ions resembles in a way the overlapping sensitivity of some biological sensors, like the optical colour receptors of human retinal cone cells. While it is difficult to assign crisp classification values to measurands based on complex overlapping sensory signals, fuzzy logic offers a possibility to mathematically model such systems. Current work goes into the direction of mixed heavy metal solutions and the combination of fuzzy logic with heavy metal-sensitive, silicon-based chemical sensors for training scenarios of arbitrary sensor/probe combinations in terms of an electronic tongue. Heavy metals play an important role in environmental analysis. As trace elements as well as water impurities released from industrial processes they occur in the environment. In this work, the development of a new fuzzy logic method based on potentiometric measurements performed with three different miniaturised chalcogenide glass sensors in different heavy metal solutions will be presented. The critical validation of the developed fuzzy logic program will be demonstrated by means of measurements in unknown single- and multi-component heavy metal solutions. Limitations of this program and a comparison between calculated and expected values in terms of analyte composition and heavy metal ion concentration will be shown and discussed.
Energy Technology Data Exchange (ETDEWEB)
Turek, M. [Institute of Nano- and Biotechnologies (INB), Aachen University of Applied Sciences, Campus Juelich, Juelich (Germany); Institute of Bio- and Nanosystems (IBN), Research Centre Juelich GmbH, Juelich (Germany); Heiden, W.; Riesen, A. [Bonn-Rhein-Sieg University of Applied Sciences, Sankt Augustin (Germany); Chhabda, T.A. [Institute of Nano- and Biotechnologies (INB), Aachen University of Applied Sciences, Campus Juelich, Juelich (Germany); Schubert, J.; Zander, W. [Institute of Bio- and Nanosystems (IBN), Research Centre Juelich GmbH, Juelich (Germany); Krueger, P. [Institute of Biochemistry and Molecular Biology, RWTH Aachen, Aachen (Germany); Keusgen, M. [Institute for Pharmaceutical Chemistry, Philipps-University Marburg, Marburg (Germany); Schoening, M.J. [Institute of Nano- and Biotechnologies (INB), Aachen University of Applied Sciences, Campus Juelich, Juelich (Germany); Institute of Bio- and Nanosystems (IBN), Research Centre Juelich GmbH, Juelich (Germany)], E-mail: m.j.schoening@fz-juelich.de
2009-10-30
Application of path integral method to heavy ion reactions, 1. General formalism
Energy Technology Data Exchange (ETDEWEB)
Fujita, J; Negishi, T [Tokyo Univ. of Education (Japan). Dept. of Physics
1976-03-01
The semiclassical approach to heavy ion reactions has become more and more important in analyzing rapidly accumulating data. The purpose of this paper is to lay a quantum-mechanical foundation for the conventional semiclassical treatments in heavy ion physics by using Feynman's path integral method, on the basis of the second paper of Pechukas, and to discuss simple consequences of the formalism.
Discrete mathematics, formal methods, the Z schema and the software life cycle
Bown, Rodney L.
1991-01-01
The proper role and scope for the use of discrete mathematics and formal methods in support of engineering the security and integrity of components within deployed computer systems are discussed. It is proposed that the Z schema can be used as the specification language to capture the precise definition of system and component interfaces. This can be accomplished with an object oriented development paradigm.
Bolton, Matthew L.; Bass, Ellen J.
2009-01-01
Both the human factors engineering (HFE) and formal methods communities are concerned with finding and eliminating problems in safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields, combining model checking with HFE practices to perform formal verification of a human-interactive system. Despite the use of a seemingly simple target system, a patient-controlled analgesia pump, the initial model proved difficult for the model checker to verify in a reasonable amount of time. This resulted in a number of model revisions that affected the HFE architectural, representativeness, and understandability goals of the effort. If formal methods are to meet the needs of the HFE community, additional modeling tools and technological developments are necessary.
DEFF Research Database (Denmark)
Christiansen, Henning; Dahl, Veronica
2009-01-01
By extending logic grammars with constraint logic, we give them the ability to create knowledge bases that represent the meaning of an input string. Semantic information is thus defined through extra-grammatical means, and a sentence's meaning logically follows as a by-product of string rewriting....... We formalize these ideas, and exemplify them both within and outside first-order logic, and for both fixed and dynamic knowledge bases. Within the latter variety, we consider the usual left-to-right derivations that are traditional in logic grammars, but also -- in a significant departure from...
Formal Methods in Air Traffic Management: The Case of Unmanned Aircraft Systems
Munoz, Cesar A.
2015-01-01
As the technological and operational capabilities of unmanned aircraft systems (UAS) continue to grow, so too does the need to introduce these systems into civil airspace. Unmanned Aircraft Systems Integration in the National Airspace System is a NASA research project that addresses the integration of civil UAS into non-segregated airspace operations. One of the major challenges of this integration is the lack of an onboard pilot to comply with the legal requirement that pilots see and avoid other aircraft. The need to provide an equivalent to this requirement for UAS has motivated the development of a detect and avoid (DAA) capability to provide the appropriate situational awareness and maneuver guidance in avoiding and remaining well clear of traffic aircraft. Formal methods has played a fundamental role in the development of this capability. This talk reports on the formal methods work conducted under NASA's Safe Autonomous System Operations project in support of the development of DAA for UAS. This work includes specification of low-level and high-level functional requirements, formal verification of algorithms, and rigorous validation of software implementations. The talk also discusses technical challenges in formal methods research in the context of the development and safety analysis of advanced air traffic management concepts.
Directory of Open Access Journals (Sweden)
Paweł Sitek
2016-01-01
Full Text Available This paper presents a hybrid method for modeling and solving supply chain optimization problems with soft, hard, and logical constraints. The ability to implement soft and logical constraints is a very important functionality for supply chain optimization models. Such constraints are particularly useful for modeling problems resulting from commercial agreements, contracts, competition, technology, safety, and environmental conditions. Two programming and solving environments, mathematical programming (MP) and constraint logic programming (CLP), were combined in the hybrid method. This integration, hybridization, and an adequate multidimensional transformation of the problem (as a presolving method) helped to substantially reduce the search space of combinatorial models for supply chain optimization problems. The operations research MP and the declarative CLP, in which constraints are modeled in different ways and different solving procedures are implemented, were linked together to exploit the strengths of both. This approach is particularly important for decision and combinatorial optimization models in which the objective function and constraints involve many decision variables that are summed (common in manufacturing, supply chain management, project management, and logistics problems). The ECLiPSe system with the Eplex library was used to implement the hybrid method. Additionally, the proposed hybrid transformed model is compared with a MILP (Mixed Integer Linear Programming) model on the same data instances. For illustrative models, its use allowed optimal solutions to be found eight to one hundred times faster and reduced the size of the combinatorial problem to a significant extent.
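The presolve-then-optimize idea of the hybrid MP/CLP method can be illustrated with a toy sketch; the shipment problem, domains and costs are invented, and brute-force enumeration stands in for the ECLiPSe/Eplex MILP solver:

```python
from itertools import product

# Toy supply problem: integer shipments x1 in 0..10, x2 in 0..5,
# hard constraint x1 + x2 == 10, logical constraint x1 > 0 -> x2 >= 3,
# objective: minimise cost 3*x1 + 2*x2.

def propagate(d1, d2):
    """CLP-style presolve: prune values that cannot satisfy x1 + x2 == 10."""
    d1 = [v for v in d1 if any(v + w == 10 for w in d2)]
    d2 = [w for w in d2 if any(v + w == 10 for v in d1)]
    return d1, d2

def solve(d1, d2):
    d1, d2 = propagate(d1, d2)  # reduced search space for the "solver"
    feasible = [(v, w) for v, w in product(d1, d2)
                if v + w == 10 and (v == 0 or w >= 3)]
    return min(feasible, key=lambda p: 3 * p[0] + 2 * p[1])

best = solve(range(11), range(6))  # -> (5, 5), cost 25
```

Here propagation shrinks the domain of x1 from eleven values to six before enumeration, mirroring (in miniature) how the hybrid transformation reduces the combinatorial search space handed to the MP solver.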
Method of Automatic Ontology Mapping through Machine Learning and Logic Mining
Institute of Scientific and Technical Information of China (English)
王英林
2004-01-01
Ontology mapping is the bottleneck of handling conflicts among heterogeneous ontologies and of implementing reconfiguration or interoperability of legacy systems. We propose an ontology mapping method using machine learning, type constraints and logic mining techniques. This method is able to find concept correspondences through instances, and the result is optimized by using an error function; it is able to find attribute correspondences between two equivalent concepts, and the mapping accuracy is enhanced by combining instance learning, type constraints and the logic relations that are embedded in instances; moreover, it solves the most common kind of categorization conflicts. We then propose a merging algorithm to generate the shared ontology and a reconfigurable architecture for interoperation based on multiple agents. The legacy systems are encapsulated as information agents to participate in the integration system. Finally, we give a simplified case study.
The gap values in the profile matching method by fuzzy logic
Sitepu, S. A.; Efendi, S.; Situmorang, Z.
2018-03-01
In this research, appropriate Gap values are determined for the assessment of promotion criteria in an institution/company. The authors use Sugeno fuzzy logic to determine the Gap values used in the Profile Matching method. Test results for 5 employees yielded promotion eligibility with Z* values between 3.20 and 4.11.
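The gap-weighting step of profile matching can be sketched as follows; the gap-to-weight table and criterion values are hypothetical, not those of the study:

```python
# Hypothetical gap-to-weight table in the spirit of profile matching:
# gap = employee value - target value; smaller |gap| earns a higher weight.
GAP_WEIGHTS = {0: 5.0, 1: 4.5, -1: 4.0, 2: 3.5, -2: 3.0}

def match_score(profile, target):
    """Average gap weight over all criteria (equal weighting assumed)."""
    weights = [GAP_WEIGHTS[p - t] for p, t in zip(profile, target)]
    return sum(weights) / len(weights)

# Three criteria scored 1..5: employee profile vs. the position's target.
score = match_score(profile=[3, 4, 5], target=[3, 5, 4])  # -> 4.5
```

In the paper the crisp table lookup is replaced by Sugeno fuzzy inference over the gaps; the aggregation into a single ranking score is analogous.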
A self-consistent nodal method in response matrix formalism for the multigroup diffusion equations
International Nuclear Information System (INIS)
Malambu, E.M.; Mund, E.H.
1996-01-01
We develop a nodal method for the multigroup diffusion equations, based on the transverse integration procedure (TIP). The efficiency of the method rests upon the convergence properties of a high-order multidimensional nodal expansion and upon numerical implementation aspects. The discrete 1D equations are cast in response matrix formalism. The derivation of the transverse leakage moments is self-consistent, i.e., it does not require additional assumptions. An outstanding feature of the method lies in the linear spatial shape of the local transverse leakage for the first-order scheme. The method is described in the two-dimensional case and validated on some classical benchmark problems. (author)
CSIR Research Space (South Africa)
Klarman, S
2013-05-01
Full Text Available We introduce Description Logics of Context (DLCs) - an extension of Description Logics (DLs) for context-based reasoning. Our approach descends from J. McCarthy's tradition of treating contexts as formal objects over which one can quantify...
Uckelman, S.L.
2009-01-01
The origins of treating agency as a modal concept go back at least to the 11th century when Anselm, Archbishop of Canterbury, provided a modal explication of the Latin facere ‘to do’, which can be formalized within the context of modern modal logic and neighborhood semantics. The agentive logic
Temporalized Epistemic Default Logic
van der Hoek, W.; Meyer, J.J.; Treur, J.; Gabbay, D.
2001-01-01
The nonmonotonic logic Epistemic Default Logic (EDL) [Meyer and van der Hoek, 1993] is based on the metaphor of a meta-level architecture. It has already been established [Meyer and van der Hoek, 1993] how upward reflection can be formalized by a nonmonotonic entailment based on epistemic states,
Modelling of the automatic stabilization system of the aircraft course by a fuzzy logic method
Mamonova, T.; Syryamkin, V.; Vasilyeva, T.
2016-04-01
This paper concerns the development of a fuzzy model of an aircraft course stabilization system. The modelling of the aircraft course stabilization system with the application of fuzzy logic is specified, using data for an ordinary passenger plane. As a result of the study, the stabilization system models were realised in the Simulink environment of the Matlab package, on the basis of a PID regulator and of fuzzy logic. The authors show that the use of this artificial intelligence method allows the regulation time to be reduced to 1, which is 50 times faster than when standard techniques of control theory are used. This fact demonstrates a positive influence of fuzzy regulation.
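The PID baseline against which the fuzzy controller is compared can be sketched as a discrete simulation; the first-order plant, gains and time constants below are illustrative assumptions, not the aircraft model used by the authors:

```python
def simulate_pid(kp, ki, kd, setpoint=1.0, dt=0.01, steps=2000):
    """Discrete PID loop driving a first-order plant y' = (u - y) / tau."""
    tau, y, integ, prev_err = 0.5, 0.0, 0.0, setpoint
    for _ in range(steps):
        err = setpoint - y
        integ += err * dt                  # integral term accumulator
        deriv = (err - prev_err) / dt      # backward-difference derivative
        u = kp * err + ki * integ + kd * deriv
        y += dt * (u - y) / tau            # Euler step of the plant
        prev_err = err
    return y

# After 20 s of simulated time the output settles near the setpoint.
final = simulate_pid(kp=2.0, ki=1.0, kd=0.05)
```

A fuzzy controller would replace the fixed-gain control law `u = ...` with rule-based inference on `err` and `deriv`; the surrounding simulation loop stays the same, which is what makes the regulation-time comparison in the paper possible.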
Multi-class Mode of Action Classification of Toxic Compounds Using Logic Based Kernel Methods.
Lodhi, Huma; Muggleton, Stephen; Sternberg, Mike J E
2010-09-17
Toxicity prediction is essential for drug design and development of effective therapeutics. In this paper we present an in silico strategy, to identify the mode of action of toxic compounds, that is based on the use of a novel logic based kernel method. The technique uses support vector machines in conjunction with the kernels constructed from first order rules induced by an Inductive Logic Programming system. It constructs multi-class models by using a divide and conquer reduction strategy that splits multi-classes into binary groups and solves each individual problem recursively hence generating an underlying decision list structure. In order to evaluate the effectiveness of the approach for chemoinformatics problems like predictive toxicology, we apply it to toxicity classification in aquatic systems. The method is used to identify and classify 442 compounds with respect to the mode of action. The experimental results show that the technique successfully classifies toxic compounds and can be useful in assessing environmental risks. Experimental comparison of the performance of the proposed multi-class scheme with the standard multi-class Inductive Logic Programming algorithm and multi-class Support Vector Machine yields statistically significant results and demonstrates the potential power and benefits of the approach in identifying compounds of various toxic mechanisms. Copyright © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
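The divide-and-conquer reduction to a decision list can be sketched generically; the threshold stub below is a placeholder for the SVM with ILP-derived kernels used in the paper, and the sample data are invented:

```python
def train_decision_list(samples, labels, binary_learner):
    """Recursively split {first class} vs. rest, yielding a decision list."""
    classes = sorted(set(labels))
    if len(classes) == 1:
        return [("default", classes[0])]
    first = classes[0]
    clf = binary_learner(samples, [l == first for l in labels])
    rest = [(s, l) for s, l in zip(samples, labels) if l != first]
    return [(clf, first)] + train_decision_list(
        [s for s, _ in rest], [l for _, l in rest], binary_learner)

def predict(decision_list, x):
    """Fire rules in order; the last entry is an unconditional default."""
    for rule, cls in decision_list:
        if rule == "default" or rule(x):
            return cls

def stub_learner(samples, is_positive):
    """Placeholder binary learner: threshold on the first feature."""
    cut = max(s[0] for s, p in zip(samples, is_positive) if p)
    return lambda x: x[0] <= cut

# Invented 1-D descriptors with three toxic modes of action.
X = [(0.1,), (0.2,), (0.6,), (0.7,), (1.5,)]
y = ["narcosis", "narcosis", "reactive", "reactive", "uncoupler"]
dl = train_decision_list(X, y, stub_learner)
```

Each recursion level solves one binary problem and passes the remaining classes down, producing exactly the underlying decision-list structure the abstract describes.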
DEFF Research Database (Denmark)
Zambach, Sine; Madsen, Bodil Nistrup
2009-01-01
By applying formal terminological methods to model an ontology within the domain of enzyme inhibition, we aim to clarify concepts and to obtain consistency. Additionally, we propose a procedure for implementing this ontology in OWL with the aim of obtaining a strict structure which can form...
Systematic methods for the design of a class of fuzzy logic controllers
Yasin, Saad Yaser
2002-09-01
Fuzzy logic control, a relatively new branch of control, can be used effectively whenever conventional control techniques become inapplicable or impractical. Various attempts have been made to create a generalized fuzzy control system and to formulate an analytically based fuzzy control law. In this study, two methods, the left and right parameterization method and the normalized spline-based membership function method, were utilized for formulating analytical fuzzy control laws in important practical control applications. The first model was used to design an idle speed controller, while the second was used for an inverted control problem. The results of both showed that a fuzzy logic control system based on the developed models could be used effectively to control highly nonlinear and complex systems. This study also investigated the application of fuzzy control in areas not yet fully utilizing fuzzy logic control. Three important practical applications pertaining to the automotive industry were studied. The first automotive-related application was the idle speed control of spark ignition engines, using two fuzzy control methods: (1) left and right parameterization, and (2) fuzzy clustering techniques and experimental data. The simulation and experimental results showed that a fuzzy controller with performance comparable to a conventional controller could be designed based only on experimental data and intuitive knowledge of the system. In the second application, the automotive cruise control problem, a fuzzy control model was developed using a parameter-adaptive Proportional plus Integral plus Derivative (PID)-type fuzzy logic controller. Results were comparable to those using linearized conventional PID and linear quadratic regulator (LQR) controllers and, in certain cases and conditions, the developed controller outperformed the conventional PID and LQR controllers. The third application involved the air/fuel ratio control problem, using fuzzy clustering techniques, experimental
New data structures and algorithms for logic synthesis and verification
Amaru, Luca Gaetano
2017-01-01
This book introduces new logic primitives for electronic design automation tools. The author approaches fundamental EDA problems from a different, unconventional perspective, in order to demonstrate the key role of rethinking EDA solutions in overcoming technological limitations of present and future technologies. The author discusses techniques that improve the efficiency of logic representation, manipulation and optimization tasks by taking advantage of majority and biconditional logic primitives. Readers will be enabled to accelerate formal methods by studying core properties of logic circuits and developing new frameworks for logic reasoning engines. · Provides a comprehensive, theoretical study on majority and biconditional logic for logic synthesis; · Updates the current scenario in synthesis and verification – especially in light of emerging technologies; · Demonstrates applications to CMOS technology and emerging technologies.
Bernardo, M.; Vink, de E.P.; Di Pierro, A.; Wiklicky, H.
2013-01-01
Preface. This volume presents a set of papers accompanying the lectures of the 13th International School on Formal Methods for the Design of Computer, Communication, and Software Systems (SFM). This series of schools addresses the use of formal methods in computer science as a prominent approach to
Perturbative method for the derivation of quantum kinetic theory based on closed-time-path formalism
International Nuclear Information System (INIS)
Koide, Jun
2002-01-01
Within the closed-time-path formalism, a perturbative method is presented, which reduces the microscopic field theory to the quantum kinetic theory. In order to make this reduction, the expectation value of a physical quantity must be calculated under the condition that the Wigner distribution function is fixed, because it is the independent dynamical variable in the quantum kinetic theory. It is shown that when a nonequilibrium Green function in the form of the generalized Kadanoff-Baym ansatz is utilized, this condition appears as a cancellation of a certain part of contributions in the diagrammatic expression of the expectation value. Together with the quantum kinetic equation, which can be derived in the closed-time-path formalism, this method provides a basis for the kinetic-theoretical description
International Nuclear Information System (INIS)
Edgar, S.B.
1990-01-01
The structures of the N.P. and G.H.P. formalisms are reviewed in order to understand and demonstrate the important role played by the commutator equations in the associated integration procedures. Particular attention is focused on how the commutator equations are to be satisfied, or checked for consistency. It is shown that Held's integration method will only guarantee genuine solutions of Einstein's equations when all the commutator equations are correctly and completely satisfied. (authors)
Directory of Open Access Journals (Sweden)
Emma Emanuilova Yandybaeva
2015-03-01
Full Text Available Methods for formalizing the rules of interaction between information subjects and objects in an electronic trading platform system have been developed. They are based on a mathematical model of mandatory role-based access control. As a result of the work, we have defined a set of user roles and constructed a role hierarchy. Restrictions have been imposed on the role hierarchy to ensure the safety of the information system.
Formal Analysis Of Use Case Diagrams
Directory of Open Access Journals (Sweden)
Radosław Klimek
2010-01-01
Full Text Available Use case diagrams play an important role in modeling with UML. Careful modeling is crucial in obtaining a correct and efficient system architecture. The paper refers to the formal analysis of the use case diagrams. A formal model of use cases is proposed and its construction for typical relationships between use cases is described. Two methods of formal analysis and verification are presented. The first one, based on a states' exploration, represents a model checking approach. The second one refers to symbolic reasoning using formal methods of temporal logic. A simple but representative example of use case scenario verification is discussed.
Loregic: A Method to Characterize the Cooperative Logic of Regulatory Factors
Wang, Daifeng; Yan, Koon-Kiu; Sisu, Cristina; Cheng, Chao; Rozowsky, Joel; Meyerson, William; Gerstein, Mark B.
2015-01-01
The topology of the gene-regulatory network has been extensively analyzed. Now, given the large amount of available functional genomic data, it is possible to go beyond this and systematically study regulatory circuits in terms of logic elements. To this end, we present Loregic, a computational method integrating gene expression and regulatory network data, to characterize the cooperativity of regulatory factors. Loregic uses all 16 possible two-input-one-output logic gates (e.g. AND or XOR) to describe triplets of two factors regulating a common target. We attempt to find the gate that best matches each triplet’s observed gene expression pattern across many conditions. We make Loregic available as a general-purpose tool (github.com/gersteinlab/loregic). We validate it with known yeast transcription-factor knockout experiments. Next, using human ENCODE ChIP-Seq and TCGA RNA-Seq data, we are able to demonstrate how Loregic characterizes complex circuits involving both proximally and distally regulating transcription factors (TFs) and also miRNAs. Furthermore, we show that MYC, a well-known oncogenic driving TF, can be modeled as acting independently from other TFs (e.g., using OR gates) but antagonistically with repressing miRNAs. Finally, we inter-relate Loregic’s gate logic with other aspects of regulation, such as indirect binding via protein-protein interactions, feed-forward loop motifs and global regulatory hierarchy. PMID:25884877
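The core gate-matching step of Loregic can be sketched in a few lines; the binarized expression vectors below are invented for illustration, not drawn from the ENCODE/TCGA data:

```python
from itertools import product

# All 16 two-input Boolean gates, keyed by their truth-table bit string
# over inputs (RF1, RF2) in the order (0,0), (0,1), (1,0), (1,1).
GATES = {bits: {inp: b for inp, b in zip(product((0, 1), repeat=2), bits)}
         for bits in product((0, 1), repeat=4)}

def best_gate(rf1, rf2, target):
    """Pick the gate whose output most often equals the target's state."""
    def score(gate):
        return sum(gate[(a, b)] == t for a, b, t in zip(rf1, rf2, target))
    return max(GATES.values(), key=score)

# Binarized activity of two regulatory factors and their common target
# across eight conditions; here the target happens to follow AND.
rf1    = [0, 0, 1, 1, 0, 1, 1, 0]
rf2    = [0, 1, 0, 1, 1, 1, 0, 0]
target = [0, 0, 0, 1, 0, 1, 0, 0]

gate = best_gate(rf1, rf2, target)  # truth table of the best-fit gate
```

The triplet above is perfectly consistent with the AND gate (output 1 only when both regulators are active), so that gate wins the matching score.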
Formal methods and their applicability in the development of safety critical software systems
International Nuclear Information System (INIS)
Sievertsen, T.
1995-01-01
The OECD Halden Reactor Project has for a number of years been involved in the development and application of a formal software specification and development method based on algebraic specification and the HRP Prover. In parallel to this activity the Project has been evaluating and comparing different methods and approaches to formal software development by their application on realistic case examples. Recent work has demonstrated that algebraic specification and the HRP Prover can be used both in the specification and design of a software system, even down to a concrete model which can be translated into the chosen implementation language. The HRP Prover is currently being used in a case study on the applicability of the methodology in the development of a power range monitoring system for a nuclear power plant. The presentation reviews some of the experiences drawn from the Project's research activities in this area, with special emphasis on questions relating to applicability and limitations, and the role of formal methods in the development of safety-critical software systems. (14 refs., 1 fig.)
Efficient formalism for treating tapered structures using the Fourier modal method
DEFF Research Database (Denmark)
Østerkryger, Andreas Dyhl; Gregersen, Niels
2016-01-01
We investigate the development of the mode occupations in tapered structures using the Fourier modal method. In order to use the Fourier modal method, tapered structures are divided into layers of uniform refractive index in the propagation direction and the optical modes are found within each...... layer. This is not very efficient and in this proceeding we take the first steps towards a more efficient formalism for treating tapered structures using the Fourier modal method. We show that the coupling coefficients through the structure are slowly varying and that only the first few modes...
International Nuclear Information System (INIS)
Mittelstaedt, P.
1983-01-01
On the basis of the well-known quantum logic and quantum probability, a formal language of relativistic quantum physics is developed. This language incorporates quantum logical as well as relativistic restrictions. It is shown that relativity imposes serious restrictions on the validity regions of propositions in space-time. By an additional postulate, this relativistic quantum logic can be made consistent. The results of this paper are derived exclusively within the formal quantum language; they are, however, in accordance with well-known facts of relativistic quantum physics in Hilbert space. (author)
Andova, S.; McIver, A.; D'Argenio, P.R.; Cuijpers, P.J.L.; Markovski, J.; Morgan, C.; Núñez, M.
2009-01-01
This volume contains the papers presented at the 1st workshop on Quantitative Formal Methods: Theory and Applications, which was held in Eindhoven on 3 November 2009 as part of the International Symposium on Formal Methods 2009. This volume contains the final versions of all contributions accepted
Formalization of the Access Control on ARM-Android Platform with the B Method
Ren, Lu; Wang, Wei; Zhu, Xiaodong; Man, Yujia; Yin, Qing
2018-01-01
ARM-Android is a widespread mobile platform whose multi-layer access control mechanisms are security-critical in the system. Many access control vulnerabilities still exist due to the coarse-grained policy and numerous engineering defects, which have been widely studied. However, few studies focus on formalizing these mechanisms, including the Android permission framework, kernel process management and hardware isolation. This paper first develops a comprehensive formal access control model of the ARM-Android platform using the B method, from the Android middleware down to the hardware layer. All the model specifications are type-checked and proved to be well-defined, with 75% of proof obligations discharged automatically. The results show that the proposed B model is feasible for specifying and verifying access control schemes in the ARM-Android system, and capable of implementing a practical control module.
Fuzzy-logic based strategy for validation of multiplex methods: example with qualitative GMO assays.
Bellocchi, Gianni; Bertholet, Vincent; Hamels, Sandrine; Moens, W; Remacle, José; Van den Eede, Guy
2010-02-01
This paper illustrates the advantages that a fuzzy-based aggregation method could bring to the validation of a multiplex method for GMO detection (DualChip GMO kit, Eppendorf). Guidelines for validation of chemical, biochemical, pharmaceutical and genetic methods have been developed, and ad hoc validation statistics are available and routinely used for in-house and inter-laboratory testing and decision-making. Fuzzy logic allows the information obtained from independent validation statistics to be summarised into one synthetic indicator of overall method performance. The microarray technology, introduced for simultaneous identification of multiple GMOs, poses specific validation issues (patterns of performance for a variety of GMOs at different concentrations). A fuzzy-based indicator for overall evaluation is illustrated in this paper, and applied to validation data for different genetically modified elements. Remarks were drawn on the analytical results. The fuzzy-logic based rules were shown to be applicable to improve interpretation of results and facilitate overall evaluation of the multiplex method.
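The fuzzy aggregation of independent validation statistics into one synthetic indicator can be sketched as follows; the statistic names, thresholds and the minimum t-norm aggregation are illustrative assumptions, not the actual rules of the cited study:

```python
def mu_good(value, good, bad):
    """Linear membership in 'good performance': 1 at good, 0 at bad."""
    return max(0.0, min(1.0, (value - bad) / (good - bad)))

# Hypothetical validation statistics for one GM element:
# (observed value, fully-good threshold, fully-bad threshold).
stats = {
    "accuracy":    (0.93, 1.00, 0.80),
    "sensitivity": (0.88, 1.00, 0.70),
    "specificity": (0.97, 1.00, 0.85),
}

memberships = {name: mu_good(*v) for name, v in stats.items()}
# Aggregate with the minimum t-norm: the weakest statistic dominates,
# so one poor performance pattern cannot be masked by the others.
overall = min(memberships.values())
```

Other aggregation operators (weighted averages, ordered weighted averaging) are common alternatives; the minimum is shown here because it is the most conservative choice for a pass/fail-style overall indicator.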
DEFF Research Database (Denmark)
Masses of Formal Philosophy is an outgrowth of Formal Philosophy. That book gathered the responses of some of the most prominent formal philosophers to five relatively open and broad questions initiating a discussion of metaphilosophical themes and problems surrounding the use of formal methods in philosophy. Including contributions from a wide range of philosophers, Masses of Formal Philosophy contains important new responses to the original five questions.
Methods of software V and V for a programmable logic controller in NPPs
International Nuclear Information System (INIS)
Kim, Jang Yeol; Lee, Young Jun; Cha, Kyung Ho; Cheon, Se Woo; Son, Han Seong; Lee, Jang Soo; Kwon, Kee Choon
2004-01-01
This paper addresses the Verification and Validation (V and V) process and methodology for the embedded real-time software of a safety-grade Programmable Logic Controller (PLC). This safety-grade PLC is being developed in the Korea Nuclear Instrumentation and Control System (KNICS) projects. The KNICS projects are developing a Reactor Protection System (RPS) and an Engineered Safety Feature-Component Control System (ESF-CCS) as well as the safety-grade PLC. The safety-grade PLC will be a major component composing the RPS and ESF-CCS systems as nuclear instrumentation and control equipment. This paper describes the V and V guidelines and procedures, V and V environment, V and V process and methodology, and the V and V tools in the KNICS projects. Specifically, it describes the real-time operating system V and V experience, which corresponds to the requirement analysis phase of the software development life cycle. Main activities of the real-time operating system Software Requirement Specification (SRS) V and V of the PLC are the technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, software safety analysis, and software configuration management. The proposed V and V methodology satisfies the Standard Review Plan (SRP)/Branch Technical Position (BTP)-14 (MOST-KSRG 7/Appendix 15 in Korea will be issued soon) criteria for safety software in nuclear power plants. The proposed V and V methodology is going to be used to verify the upcoming software life cycle in the KNICS projects. (author)
V and V methods of a safety-critical software for a programmable logic controller
Energy Technology Data Exchange (ETDEWEB)
Kim, Jang Yeol; Lee, Young Jun; Cha, Kyung Ho; Cheon, Se Woo; Lee, Jang Soo; Kwon, Kee Choon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Kong, Seung Ju [Korea Hydro and Nuclear Power Co., Ltd, Daejeon (Korea, Republic of)
2005-11-15
This paper addresses the Verification and Validation (V and V) process and methodology for the embedded real-time software of a safety-grade Programmable Logic Controller (PLC). This safety-grade PLC is being developed as part of the Korean Nuclear Instrumentation and Control System (KNICS) projects. The KNICS projects are developing a Reactor Protection System (RPS) and an Engineered Safety Feature-Component Control System (ESF-CCS) as well as a safety-grade PLC. The safety-grade PLC will be a major component that encompasses the RPS systems and the ESF-CCS systems as nuclear instrumentation and control equipment. This paper describes the V and V guidelines and procedures, V and V environment, V and V process and methodology, and the V and V tools in the KNICS projects. Specifically, it describes the real-time operating system V and V experience, which corresponds to the requirement analysis phase, design phase, and the implementation and testing phase of the software development life cycle. Main activities of the V and V for the PLC system software are a technical evaluation, licensing suitability evaluation, inspection and traceability analysis, formal verification, software safety analysis, and a software configuration management. The proposed V and V methodology satisfies the Standard Review Plan (SRP)/Branch Technical Position (BTP)-14 criteria for the safety software in nuclear power plants. The proposed V and V methodology is going to be used to verify the upcoming software life cycle in the KNICS projects.
Software V and V methods for a safety - grade programmable logic controller
International Nuclear Information System (INIS)
Jang Yeol Kim; Young Jun Lee; Kyung Ho Cha; Se Woo Cheon; Jang Soo Lee; Kee Choon Kwon
2006-01-01
This paper addresses the Verification and Validation (V and V) process and methodology for the embedded real-time software of a safety-grade Programmable Logic Controller (PLC). This safety-grade PLC is being developed as one of the Korean Nuclear Instrumentation and Control System (KNICS) projects. The KNICS projects are developing a Reactor Protection System (RPS) and an Engineered Safety Feature-Component Control System (ESF-CCS) as well as a safety-grade PLC. The safety-grade PLC will be a major component that encompasses the RPS and ESF-CCS systems as nuclear instrumentation and control equipment. This paper describes the V and V guidelines and procedures, the V and V environment, the V and V process and methodology, and the V and V tools in the KNICS projects. Specifically, it describes the real-time operating system V and V experience corresponding to the requirements analysis, design, and implementation and testing phases of the software development life cycle. The main V and V activities for the PLC system software are a technical evaluation, a licensing suitability evaluation, inspection and traceability analysis, formal verification, software safety analysis, and software configuration management. The proposed V and V methodology satisfies the Standard Review Plan (SRP)/Branch Technical Position (BTP)-14 criteria for safety software in nuclear power plants. The proposed V and V methodology is going to be used to verify the upcoming software life cycle in the KNICS projects. (author)
V and V methods of a safety-critical software for a programmable logic controller
International Nuclear Information System (INIS)
Kim, Jang Yeol; Lee, Young Jun; Cha, Kyung Ho; Cheon, Se Woo; Lee, Jang Soo; Kwon, Kee Choon; Kong, Seung Ju
2005-01-01
This paper addresses the Verification and Validation (V and V) process and methodology for the embedded real-time software of a safety-grade Programmable Logic Controller (PLC). This safety-grade PLC is being developed as one of the Korean Nuclear Instrumentation and Control System (KNICS) projects. The KNICS projects are developing a Reactor Protection System (RPS) and an Engineered Safety Feature-Component Control System (ESF-CCS) as well as a safety-grade PLC. The safety-grade PLC will be a major component that encompasses the RPS and ESF-CCS systems as nuclear instrumentation and control equipment. This paper describes the V and V guidelines and procedures, the V and V environment, the V and V process and methodology, and the V and V tools in the KNICS projects. Specifically, it describes the real-time operating system V and V experience corresponding to the requirements analysis, design, and implementation and testing phases of the software development life cycle. The main V and V activities for the PLC system software are a technical evaluation, a licensing suitability evaluation, inspection and traceability analysis, formal verification, software safety analysis, and software configuration management. The proposed V and V methodology satisfies the Standard Review Plan (SRP)/Branch Technical Position (BTP)-14 criteria for safety software in nuclear power plants. The proposed V and V methodology is going to be used to verify the upcoming software life cycle in the KNICS projects.
Energy Technology Data Exchange (ETDEWEB)
Le Campion, J.M.
1996-01-30
To support the safety analysis of the complex real-time systems developed for the protection of French nuclear plants, the CEA is interested in software testing and validation techniques. These series of tests are made by a purely software simulation of the system. The purpose is to establish the truth of some critical properties of the programs, either at simulation run time or after execution. The operator can describe the variation of some input parameters of the programs and display the results with graphics facilities. An important need was to describe formally some categories of properties expressed in terms of academic examples. We thought that a textual logical language was appropriate for this formal expression. This thesis describes a new data-flow language called EFRI, extending the semantics of interval temporal logics. We then describe a calculus using regular languages on arrays which associates a regular expression with each formula of the EFRI language. With this method, the verification of a property described by an EFRI formula can be viewed as a classical problem of language theory: does a word belong to a regular language? We can then build a finite automaton to recognize complex temporal diagrams. (author). 38 refs., 7 tabs., 4 appends.
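The reduction described in the abstract can be sketched concretely (this is not the EFRI implementation; the alphabet, the property, and the hand-compiled automaton below are illustrative assumptions): a temporal property is compiled into a regular language, and checking a recorded simulation trace amounts to deciding whether a word belongs to that language via a finite automaton.

```python
# Sketch: a property such as "every alarm is eventually followed by a reset"
# compiled (here, by hand) into a DFA over the observation alphabet
# {'a' (alarm), 'r' (reset), 'n' (nominal)}. A simulation trace is a word;
# verification is plain DFA membership.
def make_dfa():
    # states: 0 = no pending alarm (accepting), 1 = alarm pending
    delta = {
        (0, 'a'): 1, (0, 'r'): 0, (0, 'n'): 0,
        (1, 'a'): 1, (1, 'r'): 0, (1, 'n'): 1,
    }
    return delta, 0, {0}

def accepts(trace):
    delta, state, accepting = make_dfa()
    for symbol in trace:
        state = delta[(state, symbol)]
    return state in accepting

print(accepts("nnarn"))   # True: the alarm is later reset
print(accepts("nna"))     # False: alarm never reset
```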
Directory of Open Access Journals (Sweden)
V. Vakhshoori
2016-09-01
Full Text Available A regional-scale basin susceptible to landslides, located in the Qaemshahr area of northern Iran, was chosen for comparing the reliability of the weight of evidence (WofE), fuzzy logic, and frequency ratio (FR) methods for landslide susceptibility mapping. The locations of 157 landslides were identified using Google Earth® or extracted from archived data; of these, 22 rockslides were eliminated from the data-set due to their different conditions. The 135 remaining landslides were randomly divided into two groups of modelling (70%) and validation (30%) data-sets. Elevation, slope degree, slope aspect, lithology, land use/cover, normalized difference vegetation index, rainfall, and distance to the drainage network, roads, and faults were considered as landslide causative factors. The landslide susceptibility maps were prepared using the three methods mentioned. The validation process was measured by the success and prediction rates calculated from the area under the receiver operating characteristic curve. The ‘OR’, ‘AND’, ‘SUM’, and ‘PRODUCT’ operators of the fuzzy logic method were unacceptable because they classify the target area into either very high or very low susceptibility zones, which is inconsistent with the physical conditions of the study area. The results of the fuzzy ‘GAMMA’ operator were relatively reliable, while the FR and WofE methods showed more reliable results.
DEFF Research Database (Denmark)
Ramli, Carroline Dewi Puspa Kencana; Nielson, Hanne Riis; Nielson, Flemming
2011-01-01
We study the international standard XACML 3.0 for describing security access control policies in a compositional way. Our main contribution is to derive a logic that precisely captures the idea behind the standard and to formally define the semantics of the policy combining algorithms of XACML. To guard against modelling artefacts we provide an alternative way of characterizing the policy combining algorithms and we formally prove the equivalence of these approaches. This allows us to pinpoint the shortcomings of previous approaches to formalization based either on Belnap logic or on D-algebra.
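The policy combining algorithms under study can be illustrated with a simplified sketch (reduced to three decisions; real XACML 3.0 also distinguishes several Indeterminate variants, which is precisely where formalizations diverge):

```python
# Simplified XACML-style combining algorithms over three decisions.
PERMIT, DENY, NOT_APPLICABLE = "Permit", "Deny", "NotApplicable"

def deny_overrides(decisions):
    if DENY in decisions:
        return DENY
    if PERMIT in decisions:
        return PERMIT
    return NOT_APPLICABLE

def permit_overrides(decisions):
    if PERMIT in decisions:
        return PERMIT
    if DENY in decisions:
        return DENY
    return NOT_APPLICABLE

ds = [NOT_APPLICABLE, PERMIT, DENY]
print(deny_overrides(ds))    # Deny
print(permit_overrides(ds))  # Permit
```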
Löwe, B.; Pacuit, E.; Saraf, S.
2009-01-01
Finding out what makes two stories equivalent is a daunting task for a formalization of narratives. Using a high-level language of beliefs and preferences for describing stories and a simple algorithm for analyzing them, we determine the doxastic game fragment of actual narratives from the TV crime
Crossley, J N; Brickhill, CJ; Stillwell, JC
2010-01-01
Although mathematical logic can be a formidably abstruse topic, even for mathematicians, this concise book presents the subject in a lively and approachable fashion. It deals with the very important ideas in modern mathematical logic without the detailed mathematical work required of those with a professional interest in logic. The book begins with a historical survey of the development of mathematical logic from two parallel streams: formal deduction, which originated with Aristotle, Euclid, and others; and mathematical analysis, which dates back to Archimedes in the same era. The streams beg
Duhem’s Analysis of Newtonian Method and the Logical Priority of Physics over Metaphysics
Directory of Open Access Journals (Sweden)
Eduardo Salles de Oliveira Barra
2017-06-01
Full Text Available This article discusses the Duhemian analysis of Newton's method in the Principia, considering both the traditional responses to this analysis (Popper et al.) and more recent ones (Harper et al.). It is argued that in the General Scholium to the Principia, Newton is not advocating what Duhem suggests in his best-known criticism, but is proposing something very close to the establishment of a logical priority of physics over metaphysics, a familiar thesis defended by the French physicist himself.
Analysis of selected structures for model-based measuring methods using fuzzy logic
Energy Technology Data Exchange (ETDEWEB)
Hampel, R.; Kaestner, W.; Fenske, A.; Vandreier, B.; Schefter, S. [Hochschule fuer Technik, Wirtschaft und Sozialwesen Zittau/Goerlitz (FH), Zittau (DE). Inst. fuer Prozesstechnik, Prozessautomatisierung und Messtechnik e.V. (IPM)
2000-07-01
Monitoring and diagnosis of safety-related technical processes in nuclear engineering can be improved with the help of intelligent methods of signal processing such as analytical redundancies. This chapter gives an overview of combined methods in the form of hybrid models, using model-based measuring methods (observers) and knowledge-based methods (fuzzy logic). Three variants of hybrid observers (fuzzy-supported observer, hybrid observer with variable gain, and hybrid non-linear operating point observer) are explained. As a result of the combination of analytical and fuzzy-based algorithms, a new quality of monitoring and diagnosis is achieved. The results are demonstrated for the example of water level estimation within pressure vessels (pressurizer, steam generator, and boiling water reactor) with a water-steam mixture during accidental depressurization. (orig.)
Analysis of selected structures for model-based measuring methods using fuzzy logic
International Nuclear Information System (INIS)
Hampel, R.; Kaestner, W.; Fenske, A.; Vandreier, B.; Schefter, S.
2000-01-01
Monitoring and diagnosis of safety-related technical processes in nuclear engineering can be improved with the help of intelligent methods of signal processing such as analytical redundancies. This chapter gives an overview of combined methods in the form of hybrid models, using model-based measuring methods (observers) and knowledge-based methods (fuzzy logic). Three variants of hybrid observers (fuzzy-supported observer, hybrid observer with variable gain, and hybrid non-linear operating point observer) are explained. As a result of the combination of analytical and fuzzy-based algorithms, a new quality of monitoring and diagnosis is achieved. The results are demonstrated for the example of water level estimation within pressure vessels (pressurizer, steam generator, and boiling water reactor) with a water-steam mixture during accidental depressurization. (orig.)
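The "fuzzy-supported observer" idea can be sketched minimally (the scalar plant model, the membership functions, and the gain values below are invented for illustration; the chapter's observers are for much richer water-level models): a level estimate is corrected by the measurement residual, with the observer gain scheduled by a crude fuzzy rule base on the residual size.

```python
def fuzzy_gain(residual):
    r = abs(residual)
    # triangular memberships for "small" and "large" residual on [0, 1]
    small = max(0.0, 1.0 - r)
    large = min(1.0, r)
    # rules: small residual -> gentle gain 0.1; large residual -> strong gain 0.8
    return (small * 0.1 + large * 0.8) / (small + large)

def observer_step(x_hat, u, y, dt=0.1, a=-0.5, b=1.0):
    residual = y - x_hat
    x_hat_dot = a * x_hat + b * u + fuzzy_gain(residual) * residual
    return x_hat + dt * x_hat_dot

x_hat = 0.0
for y in [1.0, 1.0, 1.0, 1.0, 1.0]:   # constant measured level, zero input
    x_hat = observer_step(x_hat, u=0.0, y=y)
print(round(x_hat, 3))  # the estimate moves toward the measurement
```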
A critical study of fuzzy logic as a scientific method in social sciences ...
African Journals Online (AJOL)
The logic of the social sciences has, from its inception, been certain and classical. With the advent of fuzzy logic, its use gradually became common because of the capabilities and applications attributed to it in resolving the problems of this science. Changing the logic of a science or epistemic system has many ...
Topical Roots of Formal Dialectic
Krabbe, Erik C. W.
Formal dialectic has its roots in ancient dialectic. We can trace this influence in Charles Hamblin's book on fallacies, in which he introduced his first formal dialectical systems. Earlier, Paul Lorenzen proposed systems of dialogical logic, which were in fact formal dialectical systems avant la
DEFF Research Database (Denmark)
Braüner, Torben
2011-01-01
Hybrid logic is an extension of modal logic which allows us to refer explicitly to points of the model in the syntax of formulas. It is easy to justify interest in hybrid logic on applied grounds, given the usefulness of the additional expressive power. For example, when reasoning about time one often wants to build up a series of assertions about what happens at a particular instant, and standard modal formalisms do not allow this. What is less obvious is that the route hybrid logic takes to overcome this problem often actually improves the behaviour of the underlying modal formalism. For example, it becomes far simpler to formulate proof systems for hybrid logic, and completeness results can be proved of a generality that is simply not available in modal logic. That is, hybridization is a systematic way of remedying a number of known deficiencies of modal logic. First-order hybrid logic...
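The extra expressive power can be made concrete with a tiny model checker (the frame, nominals, and valuation are an invented example): nominals name single points, and the satisfaction operator @_i says "the formula holds at the point named i", which plain modal logic cannot express.

```python
succ = {0: {1}, 1: {2}, 2: set()}        # accessibility ("later instants")
nominal = {"i": 0, "j": 2}               # each nominal names exactly one world
valuation = {"p": {1, 2}}

def holds(w, phi):
    kind = phi[0]
    if kind == "prop":
        return w in valuation[phi[1]]
    if kind == "nom":                    # a nominal is true only at its world
        return nominal[phi[1]] == w
    if kind == "dia":                    # diamond: phi holds at some successor
        return any(holds(v, phi[1]) for v in succ[w])
    if kind == "at":                     # @_i phi: jump to the named world
        return holds(nominal[phi[1]], phi[2])
    raise ValueError(kind)

# @_i <>p: "at the instant named i, p holds at some later instant"
print(holds(0, ("at", "i", ("dia", ("prop", "p")))))  # True
print(holds(0, ("at", "j", ("dia", ("prop", "p")))))  # False: j has no successors
```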
Software Safety Analysis of Digital Protection System Requirements Using a Qualitative Formal Method
International Nuclear Information System (INIS)
Lee, Jang-Soo; Kwon, Kee-Choon; Cha, Sung-Deok
2004-01-01
The safety analysis of requirements is a key problem area in the development of software for the digital protection systems of a nuclear power plant. When specifying requirements for software of the digital protection systems and conducting safety analysis, engineers find that requirements are often known only in qualitative terms and that existing fault-tree analysis techniques provide little guidance on formulating and evaluating potential failure modes. A framework for the requirements engineering process is proposed that consists of a qualitative method for requirements specification, called the qualitative formal method (QFM), and a safety analysis method for the requirements based on causality information, called the causal requirements safety analysis (CRSA). CRSA is a technique that qualitatively evaluates causal relationships between software faults and physical hazards. This technique, extending the qualitative formal method process and utilizing information captured in the state trajectory, provides specific guidelines on how to identify failure modes and the relationships among them. The QFM and CRSA processes are described using shutdown system 2 of the Wolsong nuclear power plants as the digital protection system example.
Method of resonating groups in the Faddeev-Hahn equation formalism for three-body nuclear problem
Nasirov, M Z
2002-01-01
The Faddeev-Hahn equation formalism for the three-body nuclear problem is considered. To solve the equations, the resonating group method has been applied. Calculations of the tritium binding energy and the doublet nd-scattering length have been carried out. The results obtained show that the Faddeev-Hahn equation formalism is very simple and effective. (author)
GOAL Agents Instantiate Intention Logic
Hindriks, Koen; van der Hoek, Wiebe
2008-01-01
It is commonly believed that there is a big gap between agent logics and computational agent frameworks. In this paper, we show that this gap is not as big as believed, by showing that GOAL agents instantiate the Intention Logic of Cohen and Levesque. That is, we show that GOAL agent programs can be formally related to Intention Logic. We do so by proving that the GOAL Verification Logic can be embedded into Intention Logic. It follows that (a fragment of) Intention Logic can be used t...
Nuno David; Jaime Simão Sichman; Helder Coelho
2005-01-01
WOS:000235217900009 (Web of Science accession number) The classical theory of computation does not represent an adequate model of reality for simulation in the social sciences. The aim of this paper is to construct a methodological perspective that is able to reconcile the formal and empirical logic of program verification in computer science with the interpretative and multiparadigmatic logic of the social sciences. We attempt to evaluate whether social simulation implies an additional pers...
Using CASE-tools based on formal methods in real-life system development of distributed systems
International Nuclear Information System (INIS)
Stoelen, Ketil; Karlsen, Tore Willy; Mohn, Peter; Sandmark, Haaakon
1998-03-01
Within the OECD Halden Reactor Project (HRP), the development and application of formal methods to enhance system quality have been prioritised tasks for the last three years. The three-year programme 1997-1999 identifies the need to gain experience from applying formal methods in larger real-life system developments. This motivated the initiation of the HRP research activity Integration of Formal Specification in the Development of HAMMLAB 2000 (INT-FS). The principal objective of INT-FS is to experiment with formal methods in system developments connected to HAMMLAB 2000 and thereby gain a better understanding of their suitability to support practical software engineering. In particular, INT-FS will try to measure the effect of formal methods and gain experience in combining formal methods with traditional development techniques. INT-FS was started up in January 1997. This report describes the status of INT-FS as of February 1998. The report identifies objectives and plans; it motivates the choice of formal methods, CASE-tool, and software process; and it motivates and defines metrics for measuring achievement and the effect of formalization. The report also provides preliminary results from an experimental development of a communication manager; it describes the component to be developed and the background of the participants; it offers some provisional statistics and summarises the experiences with methods and tools. The development of the communication manager is the first attempt to exploit state-of-the-art CASE-tools for formal methods in practical software engineering at the HRP. (author)
A Method for Capturing and Reconciling Stakeholder Intentions Based on the Formal Concept Analysis
Aoyama, Mikio
Information systems are ubiquitous in our daily life. Thus, information systems need to work appropriately anywhere, at any time, for everybody. Conventional information systems engineering tends to engineer systems from the viewpoint of system functionality. However, the diversity of usage contexts requires a fundamental change in our current thinking on information systems: from the functionality the systems provide to the goals the systems should achieve. The intentional approach embraces the goals and related aspects of information systems. This chapter presents a method for capturing, structuring and reconciling the diverse goals of multiple stakeholders. The heart of the method lies in the hierarchical structuring of goals by a goal lattice based on formal concept analysis, a semantic extension of lattice theory. We illustrate the effectiveness of the presented method through application to self-checkout systems for large-scale supermarkets.
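The formal-concept-analysis core that the goal lattice builds on can be sketched as follows (the goal/attribute table is invented, not taken from the chapter): a formal concept is a pair (extent, intent) in which the set of goals and the set of attributes each derive the other.

```python
# Hypothetical goal-to-attribute table for a checkout system.
goals = {
    "fast_checkout":   {"customer", "throughput"},
    "fraud_control":   {"retailer", "safety"},
    "easy_ui":         {"customer", "usability"},
    "queue_reduction": {"customer", "retailer", "throughput"},
}

def common_attributes(goal_set):
    # derivation operator: attributes shared by all goals in the set
    return set.intersection(*(goals[g] for g in goal_set)) if goal_set else set()

def goals_with(attr_set):
    # dual derivation operator: goals having all attributes in the set
    return {g for g, attrs in goals.items() if attr_set <= attrs}

extent = {"fast_checkout", "queue_reduction"}
intent = common_attributes(extent)
print(sorted(intent))                     # ['customer', 'throughput']
print(goals_with(intent) == extent)       # True: a closed pair, i.e. a concept
```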
Fuzzy Logic vs. Neutrosophic Logic: Operations Logic
Directory of Open Access Journals (Sweden)
Salah Bouzina
2016-12-01
Full Text Available The goal of this research is, first, to show how different, thorough, widespread and effective the operations logic of neutrosophic logic is compared to that of fuzzy logic. The second aim is to observe how a fully new logic, neutrosophic logic, is established by changing the previous logical perspective of fuzzy logic; by that we mean changing the truth values from the truth and falsity membership degrees in fuzzy logic to the truth, falsity and indeterminacy membership degrees in neutrosophic logic. Thirdly, we observe that there is no limit to logical discoveries: we only change the principle, and the system changes completely.
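The change of truth values can be sketched concretely (this follows one common convention for the operators; the neutrosophic literature defines several, so treat the exact min/max choices as an assumption): a fuzzy value is a single membership degree t, while a neutrosophic value is a triple (t, i, f) of truth, indeterminacy and falsity degrees, combined component-wise.

```python
def fuzzy_and(a, b):
    # classical fuzzy conjunction (minimum t-norm)
    return min(a, b)

def neut_and(x, y):
    # one common neutrosophic conjunction: min on truth, max on the rest
    t1, i1, f1 = x
    t2, i2, f2 = y
    return (min(t1, t2), max(i1, i2), max(f1, f2))

print(fuzzy_and(0.7, 0.4))                          # 0.4
print(neut_and((0.7, 0.2, 0.1), (0.4, 0.5, 0.3)))   # (0.4, 0.5, 0.3)
```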
Semantic foundation for preferential description logics
CSIR Research Space (South Africa)
Britz, K
2011-12-01
Full Text Available Description logics are a well-established family of knowledge representation formalisms in Artificial Intelligence. Enriching description logics with non-monotonic reasoning capabilities, especially preferential reasoning as developed by Lehmann...
A Linguistic Truth-Valued Temporal Reasoning Formalism and Its Implementation
Lu, Zhirui; Liu, Jun; Augusto, Juan C.; Wang, Hui
Temporality and uncertainty are important features of many real-world systems. Solving problems in such systems requires the use of formal mechanisms such as logic systems, statistical methods, or other reasoning and decision-making methods. In this paper, we propose a linguistic truth-valued temporal reasoning formalism to enable the management of both features concurrently, using a linguistic truth-valued logic and a temporal logic. We also provide a backward reasoning algorithm which allows the answering of user queries. A simple but realistic scenario in a smart home application is used to illustrate our work.
EMRlog method for computer security for electronic medical records with logic and data mining.
Martínez Monterrubio, Sergio Mauricio; Frausto Solis, Juan; Monroy Borja, Raúl
2015-01-01
The proper functioning of a hospital computer system is arduous work for managers and staff. However, inconsistent policies are frequent and can produce enormous problems, such as stolen information, frequent failures, and loss of all or part of the hospital data. This paper presents a new method named EMRlog for computer security systems in hospitals. EMRlog is focused on two kinds of security policies: directive and implemented policies. Security policies are applied to computer systems that handle huge amounts of information such as databases, applications, and medical records. Firstly, a syntactic verification step is applied by using predicate logic. Then data mining techniques are used to detect which security policies have really been implemented by the computer systems staff. Subsequently, consistency is verified in both kinds of policies; in addition, these subsets are contrasted and validated. This is performed by an automatic theorem prover. Thus, many kinds of vulnerabilities can be removed, achieving a safer computer system.
EMRlog Method for Computer Security for Electronic Medical Records with Logic and Data Mining
Directory of Open Access Journals (Sweden)
Sergio Mauricio Martínez Monterrubio
2015-01-01
Full Text Available The proper functioning of a hospital computer system is arduous work for managers and staff. However, inconsistent policies are frequent and can produce enormous problems, such as stolen information, frequent failures, and loss of all or part of the hospital data. This paper presents a new method named EMRlog for computer security systems in hospitals. EMRlog is focused on two kinds of security policies: directive and implemented policies. Security policies are applied to computer systems that handle huge amounts of information such as databases, applications, and medical records. Firstly, a syntactic verification step is applied by using predicate logic. Then data mining techniques are used to detect which security policies have really been implemented by the computer systems staff. Subsequently, consistency is verified in both kinds of policies; in addition, these subsets are contrasted and validated. This is performed by an automatic theorem prover. Thus, many kinds of vulnerabilities can be removed, achieving a safer computer system.
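The consistency-checking step can be illustrated with a toy sketch (the policy set and the propositional encoding are invented; EMRlog itself uses predicate logic and an automatic theorem prover): each directive policy is encoded as a constraint over facts, and a brute-force model search stands in for the prover, with the policies consistent iff some assignment satisfies all of them.

```python
from itertools import product

atoms = ["backup_daily", "remote_access", "audit_log"]
policies = [
    lambda v: v["backup_daily"],                           # backups required
    lambda v: (not v["remote_access"]) or v["audit_log"],  # remote access => audit
]

def consistent(policies):
    # try every truth assignment; consistent iff one satisfies all policies
    for values in product([False, True], repeat=len(atoms)):
        v = dict(zip(atoms, values))
        if all(p(v) for p in policies):
            return True
    return False

print(consistent(policies))                                      # True
print(consistent(policies + [lambda v: not v["backup_daily"]]))  # False
```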
Application of multi response optimization with grey relational analysis and fuzzy logic method
Winarni, Sri; Wahyu Indratno, Sapto
2018-01-01
Multi-response optimization is an optimization process that considers multiple responses simultaneously. The purpose of this research is to find the optimum point in a multi-response optimization process using grey relational analysis and the fuzzy logic method. The optimum point is determined from the Fuzzy-GRG (Grey Relational Grade) variable, which is the conversion of the signal-to-noise ratios of the responses involved. The case study used in this research is the optimization of electrical process parameters in electrical discharge machining. It was found that the combination of treatments resulting in optimum MRR and SR was a gap voltage of 70 V, a peak current of 9 A, and a duty factor of 0.8.
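The grey-relational step can be sketched as follows (the trial numbers are invented and the normalization is "larger-the-better" for both responses, which is an assumption): each response is normalized, deviations from the ideal sequence are turned into grey relational coefficients, and their mean is the grade that the fuzzy step would then convert into Fuzzy-GRG.

```python
def normalize_ltb(xs):
    # larger-the-better normalization onto [0, 1]
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

def grey_relational_grade(rows, zeta=0.5):
    cols = [normalize_ltb(col) for col in zip(*rows)]
    norm_rows = list(zip(*cols))
    grades = []
    for row in norm_rows:
        deltas = [1.0 - v for v in row]      # deviation from the ideal 1.0
        # coefficient = (d_min + zeta*d_max) / (d + zeta*d_max), with
        # d_min = 0 and d_max = 1 after normalization
        coeffs = [zeta / (d + zeta) for d in deltas]
        grades.append(sum(coeffs) / len(coeffs))
    return grades

# three trials, two responses (e.g. MRR and an inverted SR)
grades = grey_relational_grade([[2.0, 0.8], [3.5, 0.5], [5.0, 0.9]])
print([round(g, 3) for g in grades])  # the best trial has the highest grade
```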
Formal verification of Simulink/Stateflow diagrams a deductive approach
Zhan, Naijun; Zhao, Hengjun
2017-01-01
This book presents a state-of-the-art technique for formal verification of continuous-time Simulink/Stateflow diagrams, featuring an expressive hybrid system modelling language, a powerful specification logic and deduction-based verification approach, and some impressive, realistic case studies. Readers will learn the HCSP/HHL-based deductive method and the use of corresponding tools for formal verification of Simulink/Stateflow diagrams. They will also gain some basic ideas about fundamental elements of formal methods such as formal syntax and semantics, and especially the common techniques applied in formal modelling and verification of hybrid systems. By investigating the successful case studies, readers will realize how to apply the pure theory and techniques to real applications, and hopefully will be inspired to start to use the proposed approach, or even develop their own formal methods in their future work.
Fault tree construction of hybrid system requirements using qualitative formal method
International Nuclear Information System (INIS)
Lee, Jang-Soo; Cha, Sung-Deok
2005-01-01
When specifying requirements for software controlling hybrid systems and conducting safety analysis, engineers experience that requirements are often known only in qualitative terms and that existing fault tree analysis techniques provide little guidance on formulating and evaluating potential failure modes. In this paper, we propose Causal Requirements Safety Analysis (CRSA) as a technique to qualitatively evaluate causal relationships between software faults and physical hazards. This technique, extending the qualitative formal method process and utilizing information captured in the state trajectory, provides specific guidelines on how to identify failure modes and the relationships among them. Using a simplified electrical power system as an example, we describe step-by-step procedures for conducting CRSA. Our experience of applying CRSA to perform fault tree analysis on requirements for the Wolsong nuclear power plant shutdown system indicates that CRSA is an effective technique for assisting safety engineers.
Implementation of a method for calculating temperature-dependent resistivities in the KKR formalism
Mahr, Carsten E.; Czerner, Michael; Heiliger, Christian
2017-10-01
We present a method to calculate the electron-phonon induced resistivity of metals in scattering-time approximation based on the nonequilibrium Green's function formalism. The general theory as well as its implementation in a density-functional theory based Korringa-Kohn-Rostoker code are described and subsequently verified by studying copper as a test system. We model the thermal expansion by fitting a Debye-Grüneisen curve to experimental data. Both the electronic and vibrational structures are discussed for different temperatures, and employing a Wannier interpolation of these quantities we evaluate the scattering time by integrating the electron linewidth on a triangulation of the Fermi surface. Based thereupon, the temperature-dependent resistivity is calculated and found to be in good agreement with experiment. We show that the effect of thermal expansion has to be considered in the whole calculation regime. Further, for low temperatures, an accurate sampling of the Fermi surface becomes important.
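A much simpler, textbook-level picture of how a phonon-limited resistivity acquires its temperature dependence is the Bloch-Grüneisen model (this is not the paper's NEGF/KKR scheme; the Debye temperature below is roughly that of copper, and rho0 and A are illustrative numbers):

```python
import math

def bloch_gruneisen(T, theta_D=343.0, rho0=0.002, A=0.1, n_steps=2000):
    # rho(T) = rho0 + A * (T/theta_D)^5 * integral_0^{theta_D/T}
    #          x^5 / ((e^x - 1)(1 - e^-x)) dx, evaluated by a Riemann sum
    upper = theta_D / T
    h = upper / n_steps
    integral = 0.0
    for k in range(1, n_steps):
        x = k * h
        integral += x**5 / ((math.exp(x) - 1.0) * (1.0 - math.exp(-x)))
    integral *= h
    return rho0 + A * (T / theta_D) ** 5 * integral

# resistivity grows monotonically with temperature
rhos = [bloch_gruneisen(T) for T in (50, 150, 300)]
print([round(r, 4) for r in rhos])
```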
Kandemir, Ekrem; Borekci, Selim; Cetin, Numan S.
2018-04-01
Photovoltaic (PV) power generation has been widely used in recent years, and techniques for increasing power efficiency represent one of the most important issues. The maximum available power of a PV panel depends on environmental conditions such as solar irradiance and temperature. To extract the maximum available power from a PV panel, various maximum-power-point tracking (MPPT) methods are used. In this work, two different MPPT methods were implemented for a 150-W PV panel. The first method, known as incremental conductance (Inc. Cond.) MPPT, determines the maximum power by measuring the derivative of the PV voltage and current. The other method is based on reduced-rule compressed fuzzy logic control (RR-FLC), with which it is relatively easier to determine the maximum power because a single input variable is used to reduce the computing load. In this study, a 150-W PV panel system model was realized using these MPPT methods in MATLAB and the results were compared. According to the simulation results, the proposed RR-FLC-based MPPT increases the response rate and tracking accuracy by 4.66% under standard test conditions.
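One update of the incremental conductance method can be sketched as follows (the sample readings and the fixed step size are invented; the paper's MATLAB model is not reproduced here): at the maximum power point dP/dV = 0, which is equivalent to dI/dV = -I/V, so the controller compares the incremental and instantaneous conductances to decide which way to move the operating voltage.

```python
def inc_cond_step(v, i, v_prev, i_prev, v_step=0.5):
    dv, di = v - v_prev, i - i_prev
    if dv == 0:
        if di == 0:
            return v                     # at the MPP, hold
        return v + v_step if di > 0 else v - v_step
    if di / dv == -i / v:
        return v                         # dP/dV == 0: at the MPP
    if di / dv > -i / v:
        return v + v_step                # left of the MPP: raise voltage
    return v - v_step                    # right of the MPP: lower voltage

# left of the MPP (power still rising with voltage) -> step up
print(inc_cond_step(v=17.0, i=8.0, v_prev=16.5, i_prev=8.1))  # 17.5
```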
Energy Technology Data Exchange (ETDEWEB)
Luneville, L
1998-06-01
The multigroup discrete ordinates method is a classical way to solve the transport (Boltzmann) equation for neutral particles. Self-shielding effects are not correctly treated due to large variations of the cross sections within a group (in the resonance range). To treat the resonance domain, the multiband method is introduced. The main idea is to divide the cross-section domain into bands. We obtain the multiband parameters using the moment method; the code CALENDF provides probability tables for these parameters. We present our implementation in an existing discrete ordinates code, SN1D. We study deep-penetration benchmarks and show the improvement of the method in the treatment of self-shielding effects. (author) 15 refs.
Formal Verification Method for Configuration of Integrated Modular Avionics System Using MARTE
Directory of Open Access Journals (Sweden)
Lisong Wang
2018-01-01
Full Text Available The configuration information of an Integrated Modular Avionics (IMA) system includes almost all details of the whole system architecture; it is used to configure the hardware interfaces, the operating system, and the interactions among applications so that an IMA system works correctly and reliably. It is very important to ensure the correctness and integrity of the configuration in the IMA system design phase. In this paper, we focus on the modelling and verification of the configuration information of an IMA/ARINC653 system based on MARTE (Modelling and Analysis of Real-Time and Embedded systems). Firstly, we define a semantic mapping from the key concepts of the configuration (such as modules, partitions, memory, processes, and communications) to MARTE model elements and propose a method for model transformation between XML-formatted configuration information and MARTE models. Then we present a formal verification framework for ARINC653 system configuration based on theorem-proving techniques, including the construction of corresponding REAL theorems according to the semantics of the key components of the configuration information and the formal verification of theorems for IMA properties such as time constraints, spatial isolation, and health monitoring. After that, the special issue of schedulability analysis of an ARINC653 system is studied. We design a hierarchical scheduling strategy that takes the characteristics of the ARINC653 system into account, and the scheduling analyzer MAST-2 is used to implement the hierarchical schedule analysis. Lastly, we design a prototype tool, called Configuration Checker for ARINC653 (CC653), and two case studies show that the methods proposed in this paper are feasible and efficient.
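Two of the checks behind an ARINC653-style two-level schedule can be sketched as follows (all window and process numbers are invented; real schedulability analysis, as done by MAST-2, is far richer): partition windows must tile the major time frame without overlap, and each partition's processes must fit inside the time share its windows provide.

```python
def windows_fit(major_frame, windows):
    # windows are (start, duration); they must not overlap or spill past the frame
    end = 0.0
    for start, duration in sorted(windows):
        if start < end or start + duration > major_frame:
            return False
        end = start + duration
    return True

def partition_utilization_ok(major_frame, windows, processes):
    # utilization bound: total process demand must fit the partition's time share
    share = sum(d for _, d in windows) / major_frame
    demand = sum(wcet / period for wcet, period in processes)
    return demand <= share

windows_p1 = [(0.0, 40.0), (60.0, 20.0)]     # 60% of a 100 ms major frame
print(windows_fit(100.0, windows_p1 + [(40.0, 20.0)]))                           # True
print(partition_utilization_ok(100.0, windows_p1, [(5.0, 20.0), (10.0, 50.0)]))  # True
```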
A formal safety analysis for PLC software-based safety critical system using Z
International Nuclear Information System (INIS)
Koh, Jung Soo; Seong, Poong Hyun
1997-01-01
This paper describes a formal safety analysis technique demonstrated by performing an empirical formal safety analysis on the case study of the beamline hutch door interlock system, developed using PLC (Programmable Logic Controller) systems at the Pohang Accelerator Laboratory. In order to perform the formal safety analysis, we built Z formal specifications from the user requirements written in ambiguous natural language and from the target PLC ladder logic, respectively. We also studied an effective method to express the typical PLC timer component using specific Z formal notation supported by temporal history. We present a formal proof technique for specifying and verifying that hazardous states are not introduced into the ladder logic in the PLC-based safety-critical system.
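The timer behaviour that the paper captures with a history-based Z schema can be illustrated by a small executable sketch (this is Python, not Z notation, and the on-delay semantics below is the standard IEC-style TON behaviour, assumed rather than taken from the paper): the output only goes true after the input has been continuously true for the preset time.

```python
def ton(trace, preset, dt=1):
    # on-delay timer over a sampled input trace; 'elapsed' is the history state
    elapsed, outputs = 0, []
    for inp in trace:
        elapsed = min(elapsed + dt, preset) if inp else 0
        outputs.append(bool(inp and elapsed >= preset))
    return outputs

# preset = 3 ticks: a 2-tick pulse never fires; a 4-tick pulse fires on its 3rd tick
print(ton([1, 1, 0, 1, 1, 1, 1], preset=3))
# [False, False, False, False, False, True, True]
```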
An evaluation of learning resources in the teaching of formal philosophical methods
Directory of Open Access Journals (Sweden)
Susan A.J. Stuart
2003-12-01
Full Text Available In any discipline, across a wide variety of subjects, there are numerous learning resources available to students. For many students the resources that will be most beneficial to them are quickly apparent, but, because of the nature of philosophy and the philosophical method, it is not immediately clear which resources will be most valuable to students for whom the development of critical thinking skills is crucial. If we are to support these students effectively in their learning, we must establish what these resources are, how we can continue to maintain and improve them, and how we can encourage students to make good use of them. In this paper we describe and assess our evaluation of the use made by students of learning resources in the context of learning logic and developing their critical thinking skills. We also assess the use of a new resource, electronic handsets, whose purpose is to encourage students to respond to questions in lectures and to gain feedback about how they are progressing with the material.
On the use of the hybrid causal logic method in offshore risk analysis
International Nuclear Information System (INIS)
Roed, Willy; Mosleh, Ali; Vinnem, Jan Erik; Aven, Terje
2009-01-01
In the Norwegian offshore oil and gas industry, risk analyses have been used to provide decision support for more than 20 years. The focus has traditionally been on the planning phase, but in recent years a need for better risk analysis methods for the operational phase has been identified. Such methods should take human and organizational factors into consideration more explicitly than traditional risk analysis methods do. Recently, a framework called hybrid causal logic (HCL) has been developed, based on traditional risk analysis tools combined with Bayesian belief networks (BBNs), using the aviation industry as a case. This paper reviews this framework and discusses its applicability to the offshore industry, as well as its relationship to existing research projects such as the barrier and operational risk analysis (BORA) project. The paper also addresses specific features of the framework and suggests a new approach for the probability assignment process. This approach simplifies the assignment process considerably without losing the flexibility that is needed to properly reflect the phenomena being studied.
A novel fuzzy logic-based image steganography method to ensure medical data security.
Karakış, R; Güler, I; Çapraz, I; Bilir, E
2015-12-01
This study aims to secure medical data by combining them into one file format using steganographic methods. The electroencephalogram (EEG) is selected as the hidden data, and magnetic resonance (MR) images are used as the cover image. In addition to the EEG, the message is composed of the doctor's comments and patient information from the file header of the images. Two new image steganography methods, based on fuzzy logic and similarity, are proposed to select the non-sequential least significant bits (LSB) of image pixels. The similarity values of the gray levels of the pixels are used to hide the message. The message is secured against attacks by using lossless compression and symmetric encryption algorithms. Stego image quality is measured by mean squared error (MSE), peak signal-to-noise ratio (PSNR), structural similarity measure (SSIM), universal quality index (UQI), and correlation coefficient (R). According to the results, the proposed method ensures the confidentiality of the patient information and increases the data repository and transmission capacity of both MR images and EEG signals.
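The LSB embedding idea underlying the method can be sketched in a few lines. This is a minimal illustration using sequential LSB replacement on a grayscale image array; the paper's method instead selects non-sequential LSBs via fuzzy logic and similarity, and adds compression and encryption, none of which is shown here. Function names are illustrative.

```python
import numpy as np

def embed_lsb(cover: np.ndarray, message: bytes) -> np.ndarray:
    """Hide `message` in the least significant bits of a uint8 cover image."""
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = cover.flatten()  # flatten() returns a copy, cover stays intact
    if bits.size > flat.size:
        raise ValueError("message too large for cover image")
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits  # overwrite the LSBs
    return flat.reshape(cover.shape)

def extract_lsb(stego: np.ndarray, n_bytes: int) -> bytes:
    """Recover `n_bytes` of hidden data from the stego image."""
    bits = stego.flatten()[:n_bytes * 8] & 1
    return np.packbits(bits).tobytes()
```

Because only the lowest bit of each pixel can change, the per-pixel distortion is at most one gray level, which is why quality metrics such as MSE and PSNR are barely affected.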
Analysis of maizena drying system using temperature control based fuzzy logic method
Arief, Ulfah Mediaty; Nugroho, Fajar; Purbawanto, Sugeng; Setyaningsih, Dyah Nurani; Suryono
2018-03-01
Corn is a promising rice-substitute food. Corn can be processed into maizena (corn starch), which can be used to make various foods such as brownies, egg rolls, and other cookies. Generally, maizena is obtained by a drying process carried out for 2-3 days under the sun; however, this is not possible during the rainy season. The drying process can instead be performed with an automatic drying tool. This study analyzes the design and manufacture of a maizena drying system with temperature control based on the fuzzy logic method. The results show that the temperature of the drying system with a set point of 40°C-60°C works in suitable conditions. A water content of 15% (BSN) and a temperature of 50°C yield a good drying process. The time required to reach the 50°C set point is 7.05 minutes. The drying time for 500 g samples at 50°C and a power capacity of 127.6 W was 1 hour. Based on these results, a drying process using fuzzy logic-based temperature control can achieve better energy efficiency than the conventional method of drying under direct sunlight, whose temperature cannot be controlled and therefore yields flour of erratic quality.
Chen, Carla Chia-Ming; Schwender, Holger; Keith, Jonathan; Nunkesser, Robin; Mengersen, Kerrie; Macrossan, Paula
2011-01-01
Due to advancements in computational ability, enhanced technology, and a reduction in the price of genotyping, more data are being generated for understanding genetic associations with diseases and disorders. However, with the availability of large data sets come inherent challenges for new methods of statistical analysis and modeling. Considering that a complex phenotype may be the effect of a combination of multiple loci, various statistical methods have been developed for identifying genetic epistasis effects. Among these methods, logic regression (LR) is an intriguing approach incorporating tree-like structures. Various methods have built on the original LR to improve different aspects of the model. In this study, we review four variations of LR, namely Logic Feature Selection, Monte Carlo Logic Regression, Genetic Programming for Association Studies, and Modified Logic Regression-Gene Expression Programming, and investigate the performance of each method using simulated and real genotype data. We contrast these with another tree-like approach, namely Random Forests, and with a Bayesian logistic regression with stochastic search variable selection.
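The boolean tree structures that LR-style methods search over can be made concrete with a tiny evaluator. This is a hypothetical sketch showing only the representation and evaluation of a candidate logic tree over binary predictors; the actual LR algorithm fits such trees (e.g. by simulated annealing), which is not shown.

```python
# A logic tree is either a leaf (feature_index, negated) or ("and"/"or", left, right).
def eval_tree(tree, x):
    """Evaluate a boolean logic tree on a sample x of binary predictors."""
    if tree[0] in ("and", "or"):
        op, left, right = tree
        a, b = eval_tree(left, x), eval_tree(right, x)
        return (a and b) if op == "and" else (a or b)
    idx, negated = tree
    return (not x[idx]) if negated else bool(x[idx])

# Hypothetical epistasis model: case iff (SNP1 AND NOT SNP3) OR SNP5
model = ("or", ("and", (1, False), (3, True)), (5, False))
```

Fitting then amounts to searching the space of such trees for the one whose predictions best match the observed phenotype.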
International Nuclear Information System (INIS)
Gran, Bjoern Axel; Sivertsen, Terje; Stoelen, Ketil; Thunem, Harald; Zhang, Wenhui
1999-02-01
The workshop 'Improved system development using case-tools based on formal methods' was organised in Halden, December 1-2, 1998. The purpose of the workshop was to present and discuss the state of the art with respect to formal approaches. The workshop had two invited presentations: 'Formality in specification and modelling: developments in software engineering practice' by John Fitzgerald (Centre for Software Reliability, UK), and 'Formal methods in industry - reaching results when correctness is not the only issue' by Oeystein Haugen (Ericsson NorARC, Norway). The workshop also had several presentations divided into three sessions on industrial experience, tools, and combined approaches. Each day there was a discussion: the first on the effect of formalization, the second on the role of formal verification. At the end of the workshop, the presentations and discussions were summarised into specific recommendations. This report summarises the presentations of the speakers, the discussions, the recommendations, and the demonstrations given at the workshop.
Energy Technology Data Exchange (ETDEWEB)
Ruthruff, Joseph. R. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Armstrong, Robert C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Davis, Benjamin Garry [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Mayo, Jackson R. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Punnoose, Ratish J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States)
2012-09-01
Formal methods describe a class of system analysis techniques that seek to prove specific properties about analyzed designs, or to locate flaws compromising those properties. As an analysis capability, these techniques are the subject of increased interest from both internal and external customers of Sandia National Laboratories. Given this lab's other areas of expertise, Sandia is uniquely positioned to advance the state of the art with respect to several research and application areas within formal methods. This research project was a one-year effort funded by Sandia's Cyber Security S&T Investment Area in its Laboratory Directed Research & Development program to investigate the opportunities for formal methods to impact Sandia's present mission areas, to more fully understand the needs of the research community in the area of formal methods and where Sandia can contribute, and to clarify which of those potential research paths would best advance the mission-area interests of Sandia. The accomplishments from this project reinforce the utility of formal methods in Sandia, particularly in areas relevant to cyber security, and set the stage for continued Sandia investments to ensure this capability is utilized and advanced within this laboratory to serve the national interest.
Ionospheric forecasting model using fuzzy logic-based gradient descent method
Directory of Open Access Journals (Sweden)
D. Venkata Ratnam
2017-09-01
Full Text Available Space weather phenomena cause satellite-to-ground or satellite-to-aircraft transmission outages over the VHF to L-band frequency range, particularly in the low-latitude region. The Global Positioning System (GPS) is primarily susceptible to this form of space weather. Faulty GPS signals are attributed to ionospheric error, which is a function of Total Electron Content (TEC). Importantly, precise forecasts of space weather conditions and the appropriate hazard warnings required for ionospheric space weather observations are limited. In this paper, a fuzzy logic-based gradient descent method is proposed to forecast ionospheric TEC values. In this technique, the membership functions are tuned based on gradient descent estimates. The proposed algorithm has been tested on TEC data for two geomagnetic storms at the low-latitude station of KL University, Guntur, India (16.44°N, 80.62°E). It was found that the gradient descent method performs well and that the predicted TEC values are close to the original TEC measurements.
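The idea of tuning a fuzzy model by gradient descent can be illustrated with a zero-order Sugeno-style model whose rule consequents are fitted by gradient descent on the squared prediction error. This is a rough sketch under assumed model form and parameter values, not the authors' implementation (which tunes the membership functions themselves); all names are illustrative.

```python
import math

def gauss(x, center, sigma):
    """Gaussian membership degree of input x in a fuzzy set."""
    return math.exp(-((x - center) ** 2) / (2 * sigma ** 2))

def predict(x, centers, sigma, consequents):
    """Normalized weighted average of rule consequents (zero-order Sugeno)."""
    w = [gauss(x, c, sigma) for c in centers]
    s = sum(w)
    return sum(wi * ci for wi, ci in zip(w, consequents)) / s

def train(data, centers, sigma, lr=0.1, epochs=500):
    """Gradient descent on squared prediction error, updating consequents."""
    cons = [0.0] * len(centers)
    for _ in range(epochs):
        for x, y in data:
            w = [gauss(x, c, sigma) for c in centers]
            s = sum(w)
            err = sum(wi * ci for wi, ci in zip(w, cons)) / s - y
            for i in range(len(cons)):
                cons[i] -= lr * err * w[i] / s  # dE/dc_i = err * w_i / s
    return cons
```

The same chain-rule recipe extends to tuning the membership centers and widths, which is the step the paper focuses on.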
Transforming PLC Programs into Formal Models for Verification Purposes
Darvas, D; Blanco, E
2013-01-01
Most of CERN's industrial installations rely on PLC-based (Programmable Logic Controller) control systems developed using the UNICOS framework. This framework contains common, reusable program modules, and their correctness is a high priority. Testing is already applied to find errors, but this method has limitations. In this work an approach is proposed to automatically transform PLC programs into formal models, with the goal of applying formal verification to ensure their correctness. We target model checking, which is a precise, mathematically based method for automatically checking formalized requirements against the system.
Meta-Logical Reasoning in Higher-Order Logic
DEFF Research Database (Denmark)
Villadsen, Jørgen; Schlichtkrull, Anders; Hess, Andreas Viktor
The semantics of first-order logic (FOL) can be described in the meta-language of higher-order logic (HOL). Using HOL one can prove key properties of FOL such as soundness and completeness. Furthermore, one can prove sentences in FOL valid using the formalized FOL semantics. To aid...
Directory of Open Access Journals (Sweden)
Evandro Agazzi
2011-06-01
Full Text Available Humans have used arguments for defending or refuting statements long before the creation of logic as a specialized discipline. This can be interpreted as the fact that an intuitive notion of "logical consequence" or a psychic disposition to articulate reasoning according to this pattern is present in common sense, and logic simply aims at describing and codifying the features of this spontaneous capacity of human reason. It is well known, however, that several arguments easily accepted by common sense are actually "logical fallacies", and this indicates that logic is not just a descriptive, but also a prescriptive or normative enterprise, in which the notion of logical consequence is defined in a precise way and then certain rules are established in order to maintain the discourse in keeping with this notion. Yet in the justification of the correctness and adequacy of these rules commonsense reasoning must necessarily be used, and in such a way its foundational role is recognized. Moreover, it remains also true that several branches and forms of logic have been elaborated precisely in order to reflect the structural features of correct argument used in different fields of human reasoning and yet insufficiently mirrored by the most familiar logical formalisms.
New method of contour image processing based on the formalism of spiral light beams
Volostnikov, Vladimir G.; Kishkin, S. A.; Kotova, S. P.
2013-07-01
The possibility of applying the mathematical formalism of spiral light beams to the problems of contour image recognition is theoretically studied. The advantages and disadvantages of the proposed approach are evaluated; the results of numerical modelling are presented.
DEFF Research Database (Denmark)
Ramli, Carroline Dewi Puspa Kencana; Nielson, Hanne Riis; Nielson, Flemming
2014-01-01
We study the international standard XACML 3.0 for describing security access control policies in a compositional way. Our main contributions are (i) to derive a logic that precisely captures the intentions of the standard, (ii) to formally define a semantics for the XACML 3.0 component evaluation...
DEFF Research Database (Denmark)
Ramli, Carroline Dewi Puspa Kencana; Nielson, Hanne Riis; Nielson, Flemming
2011-01-01
We study the international standard XACML 3.0 for describing security access control policy in a compositional way. Our main contribution is to derive a logic that precisely captures the idea behind the standard and to formally define the semantics of the policy combining algorithms of XACML...
Structures for Epistemic Logic
Bezhanishvili, N.; Hoek, W. van der
2013-01-01
Epistemic modal logic in a narrow sense studies and formalises reasoning about knowledge. In a wider sense, it gives a formal account of the informational attitude that agents may have, and covers notions like knowledge, belief, uncertainty, and hence incomplete or partial information. As is so
Duration Calculus: Logical Foundations
DEFF Research Database (Denmark)
Hansen, Michael Reichhardt; Chaochen, Zhou
1997-01-01
The Duration Calculus (abbreviated DC) represents a logical approach to formal design of real-time systems, where real numbers are used to model time and Boolean-valued functions over time are used to model states and events of real-time systems. Since its introduction, DC has been applied to many...
Temporalizing Epistemic Default Logic
van der Hoek, Wiebe; Meyer, John Jules; Treur, Jan
1998-01-01
We present an epistemic default logic, based on the metaphor of a meta-level architecture. Upward reflection is formalized by a nonmonotonic entailment relation, based on the objective facts that are either known or unknown at the object level. Then, the meta (monotonic) reasoning process generates
Applying Formal Methods to NASA Projects: Transition from Research to Practice
Othon, Bill
2009-01-01
NASA project managers attempt to manage risk by relying on mature, well-understood process and technology when designing spacecraft. In the case of crewed systems, the margin for error is even tighter and leads to risk aversion. But as we look to future missions to the Moon and Mars, the complexity of the systems will increase as the spacecraft and crew work together with less reliance on Earth-based support. NASA will be forced to look for new ways to do business. Formal methods technologies can help NASA develop complex but cost effective spacecraft in many domains, including requirements and design, software development and inspection, and verification and validation of vehicle subsystems. To realize these gains, the technologies must be matured and field-tested so that they are proven when needed. During this discussion, current activities used to evaluate FM technologies for Orion spacecraft design will be reviewed. Also, suggestions will be made to demonstrate value to current designers, and mature the technology for eventual use in safety-critical NASA missions.
Computability, complexity, logic
Börger, Egon
1989-01-01
The theme of this book is formed by a pair of concepts: the concept of formal language as carrier of the precise expression of meaning, facts and problems, and the concept of algorithm or calculus, i.e. a formally operating procedure for the solution of precisely described questions and problems. The book is a unified introduction to the modern theory of these concepts, to the way in which they developed first in mathematical logic and computability theory and later in automata theory, and to the theory of formal languages and complexity theory. Apart from considering the fundamental themes an
Bernardes, Juliana S; Carbone, Alessandra; Zaverucha, Gerson
2011-03-23
Remote homology detection is a hard computational problem. Most approaches have trained computational models using either full protein sequences or multiple sequence alignments (MSA), including all positions. However, when dealing with proteins in the "twilight zone" we can observe that only some segments of sequences (motifs) are conserved. We introduce a novel logical representation that allows us to represent the physico-chemical properties of sequences, conserved amino acid positions, and conserved physico-chemical positions in the MSA. From this, Inductive Logic Programming (ILP) finds the most frequent patterns (motifs) and uses them to train propositional models, such as decision trees and support vector machines (SVM). We use the SCOP database to perform our experiments by evaluating protein recognition within the same superfamily. Our results show that our methodology, when using SVM, performs significantly better than some of the state-of-the-art methods, and comparably to others. However, our method provides a comprehensible set of logical rules that can help to understand what determines a protein's function. The strategy of selecting only the most frequent patterns is effective for remote homology detection. This is possible through a suitable first-order logical representation of homologous properties, and through a set of frequent patterns, found by an ILP system, that summarizes the essential features of protein functions.
Modern logic and quantum mechanics
International Nuclear Information System (INIS)
Garden, R.W.
1984-01-01
The book applies the methods of modern logic and probability to 'interpreting' quantum mechanics. The subject is described and discussed under the chapter headings: classical and quantum mechanics, modern logic, the propositional logic of mechanics, states and measurement in mechanics, the traditional analysis of probabilities, the probabilities of mechanics, and the modal logic of predictions. (U.K.)
Embedding Logics into Product Logic
Czech Academy of Sciences Publication Activity Database
Baaz, M.; Hájek, Petr; Krajíček, Jan; Švejda, David
1998-01-01
Roč. 61, č. 1 (1998), s. 35-47 ISSN 0039-3215 R&D Projects: GA AV ČR IAA1030601 Grant - others:COST(XE) Action 15 Keywords : fuzzy logic * Lukasiewicz logic * Gödel logic * product logic * computational complexity * arithmetical hierarchy Subject RIV: BA - General Mathematics
A formal safety analysis for PLC software-based safety critical system using Z
International Nuclear Information System (INIS)
Koh, Jung Soo
1997-02-01
This paper describes a formal safety analysis technique, demonstrated by performing an empirical formal safety analysis as a case study of the beamline hutch door interlock system developed using PLC (Programmable Logic Controller) systems at the Pohang Accelerator Laboratory. In order to perform the formal safety analysis, we built Z formal specification representations from the user requirements, written in ambiguous natural language, and from the target PLC ladder logic, respectively. We also studied an effective method to express the typical PLC timer component using specific Z formal notation supported by temporal history. We present a formal proof technique for specifying and verifying that hazardous states are not introduced into the ladder logic of the PLC-based safety-critical system. While analyzing the consistency and completeness of the translated Z specifications, we also found some errors and mismatches between the user requirements and the final implemented PLC ladder logic. In the case of relatively small systems like the beamline hutch door interlock system, a formal safety analysis including explicit proof is highly recommended, so that the safety of the PLC-based critical system may be enhanced and guaranteed. It also helps considerably in comprehending user requirements expressed in ambiguous natural language
Breaking the fault tree circular logic
International Nuclear Information System (INIS)
Lankin, M.
2000-01-01
The event tree - fault tree approach to modelling failures of nuclear plants, as well as of other complex facilities, is clearly dominant now. This approach implies modelling an object in the form of a unidirectional logical graph (a tree), i.e. a graph without circular logic. However, real nuclear plants intrinsically exhibit quite a few logical loops (circular logic), especially where electrical systems are involved. This paper shows the incorrectness of the existing practice of breaking circular logic by eliminating some of the logical dependencies, and puts forward a formal algorithm which enables the analyst to correctly model, in the form of a fault tree, the failure of a complex object that involves logical dependencies between systems and components. (author)
Indian Academy of Sciences (India)
by testing of the components, and successful testing leads to the software being ... Formal verification is based on formal methods, which are mathematically based ... scenario under which a similar error could occur. There are various other ...
Stoelinga, Mariëlle; Pinger, Ralf
This volume contains the papers presented at FMICS 2012, the 17th International Workshop on Formal Methods for Industrial Critical Systems, taking place August 27–28, 2012, in Paris, France. The aim of the FMICS workshop series is to provide a forum for researchers who are interested in the
Rienties, Bart; Hosein, Anesa
2015-01-01
How and with whom academics develop and maintain formal and informal networks for reflecting on their teaching practice has received limited attention even though academic development (AD) programmes have become an almost ubiquitous feature of higher education. The primary goal of this mixed-method study is to unpack how 114 academics in an AD…
Roever, de W.P.; Barringer, H.; Courcoubetis, C.; Gabbay, D.M.; Gerth, R.T.; Jonsson, B.; Pnueli, A.; Reed, M.; Sifakis, J.; Vytopil, J.; Wolper, P.
1990-01-01
The Basic Research Action No. 3096, Formal Methods and Tools for the Development of Distributed and Real Time Systems, is funded in the Area of Computer Science, under the ESPRIT Programme of the European Community. The coordinating institution is the Department of Computing Science, Eindhoven
A system for deduction-based formal verification of workflow-oriented software models
Directory of Open Access Journals (Sweden)
Klimek Radosław
2014-12-01
Full Text Available The work concerns formal verification of workflow-oriented software models using the deductive approach. The formal correctness of a model's behaviour is considered. Manually building logical specifications, which are regarded as a set of temporal logic formulas, seems to be a significant obstacle for an inexperienced user when applying the deductive approach. A system, along with its architecture, for deduction-based verification of workflow-oriented models is proposed. The inference process is based on the semantic tableaux method, which has some advantages when compared with traditional deduction strategies. An algorithm for the automatic generation of logical specifications is proposed. The generation procedure is based on predefined workflow patterns for BPMN, which is the standard and dominant notation for modelling business processes. The main idea behind the approach is to consider patterns, defined in terms of temporal logic, as a kind of (logical) primitive which enables the transformation of models into temporal logic formulas constituting a logical specification. Automation of the generation process is crucial for bridging the gap between the intuitiveness of deductive reasoning and the difficulty of its practical application when logical specifications are built manually. This approach has gone some way towards supporting, and hopefully enhancing, our understanding of deduction-based formal verification of workflow-oriented models.
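The semantic tableaux method at the heart of the inference process can be illustrated on its classical propositional core: a formula set is satisfiable iff some tableau branch stays open, i.e. free of complementary literals. The temporal operators used for workflow specifications are omitted; this is an illustrative sketch, not the paper's system.

```python
def satisfiable(formulas, literals=frozenset()):
    """Semantic tableaux for propositional logic.

    `formulas` is a tuple of formulas built from ("var", name),
    ("not", f), ("and", f, g), ("or", f, g). Returns True iff some
    tableau branch remains open, i.e. the set is satisfiable."""
    if not formulas:
        return True  # everything expanded, no contradiction: open branch
    f, rest = formulas[0], formulas[1:]
    kind = f[0]
    if kind == "var" or (kind == "not" and f[1][0] == "var"):
        name, sign = (f[1], True) if kind == "var" else (f[1][1], False)
        if (name, not sign) in literals:
            return False  # complementary literals close the branch
        return satisfiable(rest, literals | {(name, sign)})
    if kind == "not":
        g = f[1]
        if g[0] == "not":  # double negation
            return satisfiable((g[1],) + rest, literals)
        if g[0] == "and":  # not(a and b): branch on not-a | not-b
            return (satisfiable((("not", g[1]),) + rest, literals)
                    or satisfiable((("not", g[2]),) + rest, literals))
        if g[0] == "or":   # not(a or b): both not-a and not-b
            return satisfiable((("not", g[1]), ("not", g[2])) + rest, literals)
    if kind == "and":      # a and b: extend the branch with both conjuncts
        return satisfiable((f[1], f[2]) + rest, literals)
    if kind == "or":       # a or b: split the branch
        return (satisfiable((f[1],) + rest, literals)
                or satisfiable((f[2],) + rest, literals))
    raise ValueError(f"unknown connective: {kind!r}")
```

A formula `f` is valid iff `satisfiable((("not", f),))` is False; a deductive verifier proves a property by showing that every tableau branch for its negation closes.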
The Dynamic Turn in Quantum Logic
Baltag, A.; Smets, S.
2012-01-01
In this paper we show how ideas coming from two areas of research in logic can reinforce each other. The first such line of inquiry concerns the "dynamic turn" in logic and especially the formalisms inspired by Propositional Dynamic Logic (PDL); while the second line concerns research into the
An Adequate First Order Logic of Intervals
DEFF Research Database (Denmark)
Chaochen, Zhou; Hansen, Michael Reichhardt
1998-01-01
This paper introduces left and right neighbourhoods as primitive interval modalities to define other unary and binary modalities of intervals in a first order logic with interval length. A complete first order logic for the neighbourhood modalities is presented. It is demonstrated how the logic can support formal specification and verification of liveness and fairness, and also of various notions of real analysis.
HDL to verification logic translator
Gambles, J. W.; Windley, P. J.
1992-01-01
The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.
García-Remesal, M; Maojo, V; Billhardt, H; Crespo, J
2010-01-01
Bringing together structured and text-based sources is an exciting challenge for biomedical informaticians, since most relevant biomedical sources belong to one of these categories. In this paper we evaluate the feasibility of integrating relational and text-based biomedical sources using: i) an original logical schema acquisition method for textual databases developed by the authors, and ii) OntoFusion, a system originally designed by the authors for the integration of relational sources. We conducted an integration experiment involving a test set of seven differently structured sources covering the domain of genetic diseases. We used our logical schema acquisition method to generate schemas for all textual sources. The sources were integrated using the methods and tools provided by OntoFusion. The integration was validated using a test set of 500 queries. A panel of experts answered a questionnaire to evaluate i) the quality of the extracted schemas, ii) the query processing performance of the integrated set of sources, and iii) the relevance of the retrieved results. The results of the survey show that our method extracts coherent and representative logical schemas. Experts' feedback on the performance of the integrated system and the relevance of the retrieved results was also positive. Regarding the validation of the integration, the system successfully provided correct results for all queries in the test set. The results of the experiment suggest that text-based sources including a logical schema can be regarded as equivalent to structured databases. Using our method, previous research and existing tools designed for the integration of structured databases can be reused - possibly subject to minor modifications - to integrate differently structured sources.
Description logic-based methods for auditing frame-based medical terminological systems
Cornet, Ronald; Abu-Hanna, Ameen
2005-01-01
Objective: Medical terminological systems (TSs) play an increasingly important role in health care by supporting recording, retrieval and analysis of patient information. As the size and complexity of TSs are growing, the need arises for means to audit them, i.e. verify and maintain (logical)
step by step process from logic model to case study method as an ...
African Journals Online (AJOL)
Global Journal
Logic models and the case study approach to programme evaluation have proven ... in qualitative methodology. There is ... Note: IEHPs = internationally educated health professionals ... interviews with the programme managers ... programme assessed to ensure that the IEHPs are ready to face the certification ... Comparison.
New method of contour image processing based on the formalism of spiral light beams
International Nuclear Information System (INIS)
Volostnikov, Vladimir G; Kishkin, S A; Kotova, S P
2013-01-01
The possibility of applying the mathematical formalism of spiral light beams to the problems of contour image recognition is theoretically studied. The advantages and disadvantages of the proposed approach are evaluated; the results of numerical modelling are presented. (optical image processing)
Using Formal Methods to Cultivate Trust in Smart Card Operating Systems
Alberda, Marjan I.; Hartel, Pieter H.; de Jong, Eduard K.
To be widely accepted, smart cards must contain completely trustworthy software. Because smart cards contain relatively simple computers, and are used only for a specific class of applications, it is feasible to make the language used to program the software components focused and tiny. Formal
International Nuclear Information System (INIS)
Mittelstaedt, P.
1979-01-01
The subspaces of Hilbert space constitute an orthocomplemented quasimodular lattice Lsub(q) for which neither a two-valued function nor generalized truth functions exist. A generalisation of the dialogic method can be used as an interpretation of a lattice Lsub(qi), which may be considered as the intuitionistic part of Lsub(q). Some obvious modifications of the dialogic method are introduced which come from the possible incommensurability of propositions about quantum mechanical systems. With the aid of this generalized dialogic method a propositional calculus Qsub(eff) is derived which is similar to the calculus of effective (intuitionistic) logic, but contains a few restrictions which are based on the incommensurability of quantum mechanical propositions. It can be shown within the framework of the calculus Qsub(eff) that the value-definiteness of the elementary propositions which are proved by quantum mechanical propositions is inherited by all finite compound propositions. In this way one arrives at the calculus Q of full quantum logic which incorporates the principle of excluded middle for all propositions and which is a model for the lattice Lsub(q). (Auth.)
Logical foundation of quantum mechanics
International Nuclear Information System (INIS)
Stachow, E.W.
1980-01-01
The subject of this article is the reconstruction of quantum mechanics on the basis of a formal language of quantum mechanical propositions. During recent years, research in the foundations of the language of science has given rise to a dialogic semantics that is adequate in the case of a formal language for quantum physics. The system of sequential logic which is comprised by the language is more general than classical logic; it includes the classical system as a special case. Although the system of sequential logic can be founded without reference to the empirical content of quantum physical propositions, it establishes an essential part of the structure of the mathematical formalism used in quantum mechanics. It is the purpose of this paper to demonstrate the connection between the formal language of quantum physics and its representation by mathematical structures in a self-contained way. (author)
Kleene, Stephen Cole
1967-01-01
Undergraduate students with no prior instruction in mathematical logic will benefit from this multi-part text. Part I offers an elementary but thorough overview of mathematical logic of 1st order. Part II introduces some of the newer ideas and the more profound results of logical research in the 20th century. 1967 edition.
Ayu Nurul Handayani, Hemas; Waspada, Indra
2018-05-01
Non-formal Early Childhood Education (non-formal ECE) is education provided for children under 4 years old. In the District of Banyumas, non-formal ECE is monitored by the District Government of Banyumas with the help of Sanggar Kegiatan Belajar (SKB) Purwokerto, one of the organizers of non-formal education. The government has a program for extending ECE to all villages in Indonesia; however, the locations for constructing ECE schools in the years ahead have not yet been decided. Therefore, to support that program, a decision support system was built to recommend villages for constructing ECE buildings. The data are projected with Brown's double exponential smoothing method, and the Preference Ranking Organization Method for Enrichment Evaluation (Promethee) is utilized to generate a priority order. As its recommendation output, the system generates a map visualization coloured according to the priority level of each sub-district and village area. The system was tested with black-box testing, Promethee testing, and usability testing. The results showed that the system functionality and the Promethee algorithm worked properly, and that users were satisfied.
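The projection step named in the abstract, Brown's double exponential smoothing, can be sketched in a few lines; the series, smoothing constant, and horizon below are illustrative assumptions, not data from the study.

```python
# Sketch of Brown's double (linear) exponential smoothing, used in the
# abstract to project demand. Data and alpha are illustrative assumptions.

def brown_des_forecast(series, alpha, horizon):
    """Forecast `horizon` steps ahead with Brown's double exponential smoothing."""
    s1 = s2 = series[0]          # first- and second-order smoothed statistics
    for y in series[1:]:
        s1 = alpha * y + (1 - alpha) * s1
        s2 = alpha * s1 + (1 - alpha) * s2
    level = 2 * s1 - s2                          # estimated current level
    trend = (alpha / (1 - alpha)) * (s1 - s2)    # estimated current trend
    return level + trend * horizon

# Hypothetical yearly counts of children needing ECE places in one village:
history = [120, 132, 141, 155, 163]
print(round(brown_des_forecast(history, alpha=0.5, horizon=2), 1))
```

The forecasts for each village would then feed Promethee as one criterion among several when ranking candidate construction sites.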
Evens, Aden
2015-01-01
Building a foundational understanding of the digital, Logic of the Digital reveals a unique digital ontology. Beginning from formal and technical characteristics, especially the binary code at the core of all digital technologies, Aden Evens traces the pathways along which the digital domain of abstract logic encounters the material, human world. How does a code using only 0s and 1s give rise to the vast range of applications and information that constitutes a great and growing portion of our world? Evens' analysis shows how any encounter between the actual and the digital must cross an ontolo
How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project
Butler, Ricky W.; Hagen, George; Maddalon, Jeffrey M.; Munoz, Cesar A.; Narkawicz, Anthony; Dowek, Gilles
2010-01-01
In this paper we describe a process of algorithmic discovery that was driven by our goal of achieving complete, mechanically verified algorithms that compute conflict prevention bands for use in en route air traffic management. The algorithms were originally defined in the PVS specification language and have subsequently been implemented in Java and C++. We do not present the proofs in this paper; instead, we describe the process of discovery and the key ideas that enabled the final formal proof of correctness.
Consumer experience of formal crisis-response services and preferred methods of crisis intervention.
Boscarato, Kara; Lee, Stuart; Kroschel, Jon; Hollander, Yitzchak; Brennan, Alice; Warren, Narelle
2014-08-01
The manner in which people with mental illness are supported in a crisis is crucial to their recovery. The current study explored mental health consumers' experiences with formal crisis services (i.e. police and crisis assessment and treatment (CAT) teams), preferred crisis supports, and opinions of four collaborative interagency response models. Eleven consumers completed one-on-one, semistructured interviews. The results revealed that the perceived quality of previous formal crisis interventions varied greatly. Most participants preferred family members or friends to intervene. However, where a formal response was required, general practitioners and mental health case managers were preferred; no participant wanted a police response, and only one indicated a preference for CAT team assistance. Most participants welcomed collaborative crisis interventions. Of four collaborative interagency response models currently being trialled internationally, participants most strongly supported the Ride-Along Model, which enables a police officer and a mental health clinician to jointly respond to distressed consumers in the community. The findings highlight the potential for an interagency response model to deliver a crisis response aligned with consumers' preferences. © 2014 Australian College of Mental Health Nurses Inc.
Dalen, Dirk
1983-01-01
Efficiently presenting the basics of propositional and predicate logic, van Dalen's popular textbook contains a complete treatment of elementary classical logic, using Gentzen's Natural Deduction. Propositional and predicate logic are treated in separate chapters in a leisurely but precise way. Chapter Three presents the basic facts of model theory, e.g. compactness, Skolem-Löwenheim, elementary equivalence, non-standard models, quantifier elimination, and Skolem functions. The discussion of classical logic is rounded off with a concise exposition of second-order logic. In view of the growing recognition of constructive methods and principles, one chapter is devoted to intuitionistic logic. Completeness is established for Kripke semantics. A number of specific constructive features, such as apartness and equality, the Gödel translation, the disjunction and existence property have been incorporated. The power and elegance of natural deduction is demonstrated best in the part of proof theory cal...
Methods and Tools for the Analysis, Verification and Synthesis of Genetic Logic Circuits,
DEFF Research Database (Denmark)
Baig, Hasan
2017-01-01
…This usually requires simulating the mathematical models of these genetic circuits and perceiving whether or not the circuit behaves appropriately. Furthermore, synthetic biology utilizes the concepts from electronic design automation (EDA) of abstraction and automated construction to generate genetic circuits… that the proposed approach is effective to determine the variation in the behavior of genetic circuits when the circuit's parameters are changed. In addition, the thesis also attempts to propose a synthesis and technology mapping tool, called GeneTech, for genetic circuits. It allows users to construct a genetic… important design characteristics. This thesis also introduces an automated approach to analyze the behavior of genetic logic circuits from the simulation data. With this capability, the Boolean logic of complex genetic circuits can be analyzed and/or verified automatically. It is also shown in this thesis…
Logic-based models in systems biology: a predictive and parameter-free network analysis method.
Wynn, Michelle L; Consul, Nikita; Merajver, Sofia D; Schnell, Santiago
2012-11-01
Highly complex molecular networks, which play fundamental roles in almost all cellular processes, are known to be dysregulated in a number of diseases, most notably in cancer. As a consequence, there is a critical need to develop practical methodologies for constructing and analysing molecular networks at a systems level. Mathematical models built with continuous differential equations are an ideal methodology because they can provide a detailed picture of a network's dynamics. To be predictive, however, differential equation models require that numerous parameters be known a priori and this information is almost never available. An alternative dynamical approach is the use of discrete logic-based models that can provide a good approximation of the qualitative behaviour of a biochemical system without the burden of a large parameter space. Despite their advantages, there remains significant resistance to the use of logic-based models in biology. Here, we address some common concerns and provide a brief tutorial on the use of logic-based models, which we motivate with biological examples.
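As a minimal illustration of the parameter-free, discrete logic-based modelling the abstract advocates, a toy Boolean network can be simulated with nothing but update rules; the three-gene wiring below is an assumption for demonstration, not a model from the paper.

```python
# Minimal sketch of a discrete logic-based (Boolean) network model.
# The toy wiring (A activates B, B activates C, C represses A) is an
# illustrative assumption; no kinetic parameters are needed.

def step(state):
    """Synchronously update the 3-gene network (a, b, c)."""
    a, b, c = state
    return (not c, a, b)

def trajectory(state, n):
    """Return the state sequence over n synchronous updates."""
    states = [state]
    for _ in range(n):
        state = step(state)
        states.append(state)
    return states

# Starting with only A expressed, the network settles into a length-6 cycle:
for s in trajectory((True, False, False), 6):
    print(tuple(int(x) for x in s))
```

The qualitative attractors (fixed points and cycles) of such a model stand in for the steady states a differential equation model would produce, without any rate constants.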
Energy Technology Data Exchange (ETDEWEB)
Determan, John Clifford [Los Alamos National Laboratory; Longo, Joseph F [Los Alamos National Laboratory; Michel, Kelly D [Los Alamos National Laboratory
2009-01-01
The Unattended and Remote Monitoring (UNARM) system is a collection of specialized hardware and software used by the International Atomic Energy Agency (IAEA) to institute nuclear safeguards at many nuclear facilities around the world. The hardware consists of detectors, instruments, and networked computers for acquiring various forms of data, including but not limited to radiation data, global position coordinates, camera images, isotopic data, and operator declarations. The software provides two primary functions: the secure and reliable collection of this data from the instruments and the ability to perform an integrated review and analysis of the disparate data sources. Several years ago the team responsible for maintaining the software portion of the UNARM system began the process of formalizing its operations. These formal operations include a configuration management system, a change control board, an issue tracking system, and extensive formal testing, for both functionality and reliability. Functionality is tested with formal test cases chosen to fully represent the data types and methods of analysis that will be commonly encountered. Reliability is tested with iterative, concurrent testing where up to five analyses are executed simultaneously for thousands of cycles. Iterative concurrent testing helps ensure that there are no resource conflicts or leaks when multiple system components are in use simultaneously. The goal of this work is to provide a high quality, reliable product, commensurate with the criticality of the application. Testing results will be presented that demonstrate that this goal has been achieved and the impact of the introduction of a formal software engineering framework to the UNARM product will be presented.
Probabilistic logics and probabilistic networks
Haenni, Rolf; Wheeler, Gregory; Williamson, Jon; Andrews, Jill
2014-01-01
Probabilistic Logic and Probabilistic Networks presents a groundbreaking framework within which various approaches to probabilistic logic naturally fit. Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.
Vested Madsen, Matias; Macario, Alex; Yamamoto, Satoshi; Tanaka, Pedro
2016-06-01
In this study, we examined the regularly scheduled, formal teaching sessions in a single anesthesiology residency program to (1) map the most common primary instructional methods, (2) map the use of 10 known teaching techniques, and (3) assess if residents scored sessions that incorporated active learning as higher quality than sessions with little or no verbal interaction between teacher and learner. A modified Delphi process was used to identify useful teaching techniques. A representative sample of each of the formal teaching session types was mapped, and residents anonymously completed a 5-question written survey rating the session. The most common primary instructional methods were computer slides-based classroom lectures (66%), workshops (15%), simulations (5%), and journal club (5%). The number of teaching techniques used per formal teaching session averaged 5.31 (SD, 1.92; median, 5; range, 0-9). Clinical applicability (85%) and attention grabbers (85%) were the 2 most common teaching techniques. Thirty-eight percent of the sessions defined learning objectives, and one-third of sessions engaged in active learning. The overall survey response rate equaled 42%, and passive sessions had a mean score of 8.44 (range, 5-10; median, 9; SD, 1.2) compared with a mean score of 8.63 (range, 5-10; median, 9; SD, 1.1) for active sessions (P = 0.63). Slides-based classroom lectures were the most common instructional method, and faculty used an average of 5 known teaching techniques per formal teaching session. The overall education scores of the sessions as rated by the residents were high.
Stereotypical Reasoning: Logical Properties
Lehmann, Daniel
2002-01-01
Stereotypical reasoning assumes that the situation at hand is one of a kind and that it enjoys the properties generally associated with that kind of situation. It is one of the most basic forms of nonmonotonic reasoning. A formal model for stereotypical reasoning is proposed and the logical properties of this form of reasoning are studied. Stereotypical reasoning is shown to be cumulative under weak assumptions.
Le Balleur, J. C.
1988-01-01
The applicability of conventional mathematical analysis (based on the combination of two-valued logic and probability theory) to problems in which human judgment, perception, or emotions play significant roles is considered theoretically. It is shown that dispositional logic, a branch of fuzzy logic, has particular relevance to the common-sense reasoning typical of human decision-making. The concepts of dispositionality and usuality are defined analytically, and a dispositional conjunctive rule and dispositional modus ponens are derived.
International Nuclear Information System (INIS)
Bohn, M.P.
1992-12-01
As part of the NRC-sponsored program to study the implications of Generic Issue 57, ''Effects of Fire Protection System Actuation on Safety-Related Equipment,'' a subtask was performed to evaluate the applicability of formal decision analysis methods to generic-issue cost/benefit-type decisions and to apply these methods to the GI-57 results. In this report, the numerical results obtained from the analysis of three plants (two PWRs and one BWR) as developed in the technical resolution program for GI-57 were studied. For each plant, these results included a calculation of the person-REM averted due to various accident scenarios and various proposed modifications to mitigate the accident scenarios identified. These results were recomputed to break out the benefit in terms of contributions due to random event scenarios, fire event scenarios, and seismic event scenarios. Furthermore, the benefits associated with risk (in terms of person-REM) averted from earthquakes at three different seismic ground motion levels were separately considered. Given this data, formal decision methodologies involving decision trees, value functions, and utility functions were applied to this basic data. It is shown that the formal decision methodology can be applied at several different levels. Examples are given in which the decision between several retrofits is changed from that resulting from a simple cost/benefit-ratio criterion by virtue of the decision-maker's expressed (and assumed) preferences.
Leaphart, Cynthia L; Gonwa, Thomas A; Mai, Martin L; Prendergast, Mary B; Wadei, Hani M; Tepas, Joseph J; Taner, C Burcin
2012-09-01
A broad-based formal quality improvement curriculum emphasizing Six Sigma and the DMAIC approach, developed by our institution, is required for physicians in training. DMAIC methods evaluated the common outcome of postoperative hyponatremia, resulting in collaboration to prevent hyponatremia in the renal transplant population. To define postoperative hyponatremia in renal transplant recipients, a project charter outlined the project aims. To measure postoperative hyponatremia, serum sodium at admission and immediately postoperatively was recorded by retrospective review of renal transplant recipient charts from June 29, 2010 to December 31, 2011. An Ishikawa diagram was generated to analyze potential causative factors. Interdisciplinary collaboration and hospital policy assessment determined the improvements necessary to prevent hyponatremia. Continuous monitoring in the control phase was performed by establishing the goal of the DMAIC approach. The formal quality curriculum for trainees addresses core competencies by providing a framework for problem solving, interdisciplinary collaboration, and process improvement. Copyright © 2012 Elsevier Inc. All rights reserved.
The Fuzzy Logic Method to Efficiently Optimize Electricity Consumption in Individual Housing
Directory of Open Access Journals (Sweden)
Sébastien Bissey
2017-10-01
Full Text Available Electricity demand shifting and reduction still raise a huge interest for end-users at the household level, especially because of the ongoing design of a dynamic pricing approach. In particular, end-users must act as the starting point for decreasing their consumption during peak hours to prevent the need to extend the grid and thus save considerable costs. This article points out the relevance of a fuzzy logic algorithm to efficiently predict short term load consumption (STLC. This approach is the cornerstone of a new home energy management (HEM algorithm which is able to optimize the cost of electricity consumption, while smoothing the peak demand. The fuzzy logic modeling involves a strong reliance on a complete database of real consumption data from many instrumented show houses. The proposed HEM algorithm enables any end-user to manage his electricity consumption with a high degree of flexibility and transparency, and “reshape” the load profile. For example, this can be mainly achieved using smart control of a storage system coupled with remote management of the electric appliances. The simulation results demonstrate that an accurate prediction of STLC gives the possibility of achieving optimal planning and operation of the HEM system.
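A rule-based fuzzy predictor of the kind described can be sketched as follows; the membership functions, rules, and load values here are illustrative assumptions, whereas the paper's model is fitted to a large database of real show-house consumption data.

```python
# Hedged sketch of a fuzzy rule-based predictor of short-term load
# consumption (STLC). Memberships, rules, and kW values are assumed
# for illustration only.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def predict_load(temp_c):
    """Weighted-average defuzzification over three temperature rules."""
    cold = tri(temp_c, -10, 0, 12)
    mild = tri(temp_c, 5, 14, 23)
    warm = tri(temp_c, 18, 28, 40)
    # Rule consequents: assumed typical household loads in kW.
    rules = [(cold, 4.0), (mild, 2.0), (warm, 1.0)]
    num = sum(w * v for w, v in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0

print(round(predict_load(8.0), 2))   # colder forecast hour -> higher load
```

In the HEM setting, such a prediction would then drive the scheduling of the storage system and appliance control so that consumption is shifted away from peak-price hours.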
A Novel Strain-Based Method to Estimate Tire Conditions Using Fuzzy Logic for Intelligent Tires
Directory of Open Access Journals (Sweden)
Daniel Garcia-Pozuelo
2017-02-01
Full Text Available The so-called intelligent tires are one of the most promising research fields for automotive engineers. These tires are equipped with sensors which provide information about vehicle dynamics. Up to now, the commercial intelligent tires only provide information about inflation pressure and their contribution to stability control systems is currently very limited. Nowadays one of the major problems for intelligent tire development is how to embed feasible and low cost sensors to obtain reliable information such as inflation pressure, vertical load or rolling speed. These parameters provide key information for vehicle dynamics characterization. In this paper, we propose a novel algorithm based on fuzzy logic to estimate the mentioned parameters by means of a single strain-based system. Experimental tests have been carried out in order to prove the suitability and durability of the proposed on-board strain sensor system, as well as its low cost advantages, and the accuracy of the obtained estimations by means of fuzzy logic.
State-of-the-art report for the testing and formal verification methods for FBD program
International Nuclear Information System (INIS)
Jee, Eun Kyoung; Lee, Jang Soo; Lee, Young Jun; Yoo, Jun Beom
2011-10-01
The importance of PLC testing has increased in the nuclear I and C domain. While regulation authorities require both functional and structural testing for safety system software, FBD testing relies only on functional testing, and there has been little research on structural testing techniques for FBD programs. We aim to analyze current techniques related to FBD testing and to develop a structural testing technique appropriate to FBD programs. We developed structural test coverage criteria applicable to FBD programs, focusing on data paths from input edges to output edges of FBD programs. A data path condition (DPC), under which input data can flow into the output edge, is defined for each data path. We defined basic coverage, input condition coverage and complex condition coverage criteria based on the formal definition of DPC. We also developed a measurement procedure for FBD testing adequacy and a supporting tool prototype.
On the formal equivalence of the TAP and thermodynamic methods in the SK model
International Nuclear Information System (INIS)
Cavagna, Andrea; Giardina, Irene; Parisi, Giorgio; Mezard, Marc
2003-01-01
We revisit two classic Thouless-Anderson-Palmer (TAP) studies of the Sherrington-Kirkpatrick model (Bray A J and Moore M A 1980 J. Phys. C: Solid State Phys. 13 L469; De Dominicis C and Young A P 1983 J. Phys. A: Math. Gen. 16 2063). By using the Becchi-Rouet-Stora-Tyutin (BRST) supersymmetry, we prove the general equivalence of TAP and replica partition functions, and show that the annealed calculation of the TAP complexity is formally identical to the quenched thermodynamic calculation of the free energy at the one-step level of replica symmetry breaking. The complexity we obtain by means of the BRST symmetry turns out to be considerably smaller than the previous non-symmetric value.
Formal Verification of Computerized Procedure with Colored Petri Nets
International Nuclear Information System (INIS)
Kim, Yun Goo; Shin, Yeong Cheol
2008-01-01
The Computerized Procedure System (CPS) supports nuclear power plant operators in performing operating procedures, which are instructions that guide the monitoring, decision making and control of nuclear power plants. A Computerized Procedure (CP) must be loaded into the CPS. Due to its execution characteristics, a computerized procedure acts like software in the CPS. For example, procedure flows are determined by operator evaluation and by pre-defined computerized procedure logic. Therefore, verification of the computerized procedure logic and execution flow is needed before computerized procedures are installed in the system. Formal verification methods are proposed and the modeling of operating procedures with Coloured Petri Nets (CP-nets) is presented.
Krylov, Piotr
2017-01-01
This monograph is a comprehensive account of formal matrices, examining homological properties of modules over formal matrix rings and summarising the interplay between Morita contexts and K theory. While various special types of formal matrix rings have been studied for a long time from several points of view and appear in various textbooks, for instance to examine equivalences of module categories and to illustrate rings with one-sided non-symmetric properties, this particular class of rings has, so far, not been treated systematically. Exploring formal matrix rings of order 2 and introducing the notion of the determinant of a formal matrix over a commutative ring, this monograph further covers the Grothendieck and Whitehead groups of rings. Graduate students and researchers interested in ring theory, module theory and operator algebras will find this book particularly valuable. Containing numerous examples, Formal Matrices is a largely self-contained and accessible introduction to the topic, assuming a sol...
Historical analysis of logic and rhetoric: problems in the procedural use of enthymemes
Directory of Open Access Journals (Sweden)
Alexandre Freire Pimentel
2017-06-01
Full Text Available Through historical and bibliographical methods, this article presents a critical appraisal of the evolution of logic and rhetoric, specifically of how enthymemes are currently used to support judicial decisions. The paper intends to show that decision-making precedes the writing process and is not necessarily guided by formal logical reasoning but, especially, by ideological motivation, and also that the wording can be composed on the basis of enthymemes.
Empirical logic and quantum mechanics
International Nuclear Information System (INIS)
Foulis, D.J.; Randall, C.H.
1976-01-01
This article discusses some of the basic notions of quantum physics within the more general framework of operational statistics and empirical logic (as developed in Foulis and Randall, 1972, and Randall and Foulis, 1973). Empirical logic is a formal mathematical system in which the notion of an operation is primitive and undefined; all other concepts are rigorously defined in terms of such operations (which are presumed to correspond to actual physical procedures). (Auth.)
Observation Predicates in Flow Logic
DEFF Research Database (Denmark)
Nielson, Flemming; Nielson, Hanne Riis; Sun, Hongyan
2003-01-01
…in such a way that the hard constraints are satisfied exactly when the observation predicates report no violations. The development is carried out in a large fragment of a first-order logic with negation and also takes care of the transformations necessary in order to adhere to the stratification restrictions… inherent in Alternation-free Least Fixed Point Logic and similar formalisms such as Datalog.
Parodi, Stefano; Manneschi, Chiara; Verda, Damiano; Ferrari, Enrico; Muselli, Marco
2018-03-01
This study evaluates the performance of a set of machine learning techniques in predicting the prognosis of Hodgkin's lymphoma using clinical factors and gene expression data. Analysed samples from 130 Hodgkin's lymphoma patients included a small set of clinical variables and more than 54,000 gene features. Machine learning classifiers included three black-box algorithms (k-nearest neighbour, Artificial Neural Network, and Support Vector Machine) and two methods based on intelligible rules (Decision Tree and the innovative Logic Learning Machine method). Support Vector Machine clearly outperformed any of the other methods. Among the two rule-based algorithms, Logic Learning Machine performed better and identified a set of simple intelligible rules based on a combination of clinical variables and gene expressions. Decision Tree identified a non-coding gene (XIST) involved in the early phases of X chromosome inactivation that was overexpressed in females and in non-relapsed patients. XIST expression might be responsible for the better prognosis of female Hodgkin's lymphoma patients.
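The contrast the abstract draws between black-box classifiers and intelligible rules can be made concrete with a toy example; the data, threshold, and labels below are invented for illustration (the real inputs comprised clinical variables and over 54,000 gene features).

```python
# Toy contrast between a black-box classifier (1-nearest-neighbour) and an
# intelligible single rule, echoing the black-box vs rule-based distinction.
# Training points, the 0.5 threshold, and labels are illustrative assumptions.

def nn1(train, x):
    """Predict the label of the closest training point (black box)."""
    return min(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))[1]

def rule(x):
    """An intelligible rule: relapse if the first marker's expression is low."""
    return "relapse" if x[0] < 0.5 else "no-relapse"

train = [((0.2, 1.1), "relapse"), ((0.9, 0.3), "no-relapse"),
         ((0.3, 0.8), "relapse"), ((1.2, 0.4), "no-relapse")]
sample = (0.25, 0.9)
print(nn1(train, sample), rule(sample))
```

The nearest-neighbour prediction gives no reason for its answer, while the rule can be read, audited, and discussed with clinicians, which is the trade-off the study weighs against raw accuracy.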
International Nuclear Information System (INIS)
Vosoughi, Naser; Ekrami, AmirHasan; Naseri, Zahra
2003-01-01
Since suitable control of the water level can greatly enhance the operation of a power station, a fuzzy logic controller is applied to control the steam generator water level in a pressurized water reactor. The method does not require a detailed mathematical model of the object to be controlled. Two inputs, a single output and the least number of rules (9 rules) are considered for the controller, and the ANFIS training method is employed to model functions in the controlled system. By using the ANFIS training method, initial membership functions are trained and appropriate functions are generated to control the water level inside the steam generator while using the stated rules. The proposed architecture can construct an input-output mapping based on both human knowledge (in the form of fuzzy if-then rules) and stipulated input-output data. This fuzzy logic controller is applied to steam generator level control in computer simulations. The simulation results confirm the excellent performance of this control architecture in comparison with a well-tuned PID controller. (author)
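The two-input, single-output, nine-rule structure described above can be sketched as a plain Mamdani-style controller; the membership functions, rule table, and output values below are illustrative assumptions (the paper tunes its functions with ANFIS training rather than fixing them by hand).

```python
# Sketch of a two-input, one-output, nine-rule fuzzy level controller.
# Inputs: level error and error rate, both normalized to [-1, 1].
# All memberships and consequents are illustrative assumptions.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzify(x):
    """Negative / zero / positive memberships on the normalized range."""
    return {"N": tri(x, -2, -1, 0), "Z": tri(x, -1, 0, 1), "P": tri(x, 0, 1, 2)}

# 9-rule table: (level error, error rate) -> normalized valve command.
RULES = {("N", "N"): -1.0, ("N", "Z"): -0.7, ("N", "P"): 0.0,
         ("Z", "N"): -0.5, ("Z", "Z"): 0.0, ("Z", "P"): 0.5,
         ("P", "N"): 0.0, ("P", "Z"): 0.7, ("P", "P"): 1.0}

def control(error, rate):
    """Min for rule firing strength, weighted average for defuzzification."""
    e, r = fuzzify(error), fuzzify(rate)
    strengths = [(min(e[i], r[j]), out) for (i, j), out in RULES.items()]
    den = sum(w for w, _ in strengths)
    return sum(w * v for w, v in strengths) / den if den else 0.0

print(round(control(0.5, 0.0), 3))
```

ANFIS training would replace the hand-picked triangles and consequents with functions fitted to recorded input-output data, which is what the abstract reports.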
Tugué, Tosiyuki; Slaman, Theodore
1989-01-01
These proceedings include the papers presented at the logic meeting held at the Research Institute for Mathematical Sciences, Kyoto University, in the summer of 1987. The meeting mainly covered the current research in various areas of mathematical logic and its applications in Japan. Several lectures were also presented by logicians from other countries, who visited Japan in the summer of 1987.
Microelectromechanical reprogrammable logic device
Hafiz, M. A. A.; Kosuru, L.; Younis, M. I.
2016-01-01
In modern computing, the Boolean logic operations are set by interconnect schemes between the transistors. As the miniaturization in the component level to enhance the computational power is rapidly approaching physical limits, alternative computing methods are vigorously pursued. One of the desired aspects in the future computing approaches is the provision for hardware reconfigurability at run time to allow enhanced functionality. Here we demonstrate a reprogrammable logic device based on the electrothermal frequency modulation scheme of a single microelectromechanical resonator, capable of performing all the fundamental 2-bit logic functions as well as n-bit logic operations. Logic functions are performed by actively tuning the linear resonance frequency of the resonator operated at room temperature and under modest vacuum conditions, reprogrammable by the a.c.-driving frequency. The device is fabricated using complementary metal oxide semiconductor compatible mass fabrication process, suitable for on-chip integration, and promises an alternative electromechanical computing scheme. PMID:27021295
Directory of Open Access Journals (Sweden)
Pablo Pancardo
2018-01-01
Full Text Available Knowing the perceived exertion of workers during their physical activities facilitates supervisors' decision-making regarding the allocation of workers to appropriate jobs, actions to prevent accidents, and the reassignment of tasks, among others. However, although wearable heart rate sensors represent an effective way to capture perceived exertion, ergonomic methods are generic and do not consider the diffuse nature of the ranges that classify the efforts. Personalized monitoring is needed to enable a real and efficient classification of perceived individual efforts. In this paper, we propose a heart rate-based personalized method to assess perceived exertion; our method uses fuzzy logic as an option to manage imprecision and uncertainty in the variables involved. We conducted experiments with cleaning staff and obtained results that highlight the importance of a custom method to classify the perceived exertion of people doing physical work.
A connection between the asymptotic iteration method and the continued fractions formalism
International Nuclear Information System (INIS)
Matamala, A.R.; Gutierrez, F.A.; Diaz-Valdes, J.
2007-01-01
In this work, we show that there is a connection between the asymptotic iteration method (a method for solving second-order linear ordinary differential equations) and the older method of continued fractions for solving differential equations.
International Nuclear Information System (INIS)
Witte, N.S.; Shankar, R.
1999-01-01
We examine the Ising chain in a transverse field at zero temperature from the point of view of a family of moment formalisms based upon the cumulant generating function, where we find exact solutions for the generating functions and cumulants at arbitrary couplings and hence for both the ordered and disordered phases of the model. In a t-expansion analysis, the exact Horn-Weinstein function E(t) has cuts along an infinite set of curves in the complex Jt-plane which are confined to the left-hand half-plane Im(Jt) < -1/4 for the phase containing the trial state (disordered), but are not so confined for the other phase (ordered). For finite couplings the expansion has a finite radius of convergence. Asymptotic forms for this function exhibit a crossover at the critical point, giving the excited state gap in the ground state sector for the disordered phase, and the first excited state gap in the ordered phase. Convergence of the t-expansion with respect to truncation order is found in the disordered phase right up to the critical point, for both the ground state energy and the excited state gap. However, convergence is found in only one of the connected moments expansions (CMX), the CMX-LT, and the ground state energy again shows convergence right to the critical point, although to a limited accuracy.
Bacon, John B; McCarty, David Charles; Bacon, John B
1999-01-01
First published in the most ambitious international philosophy project for a generation, the Routledge Encyclopedia of Philosophy, Logic from A to Z is a unique glossary of terms used in formal logic and the philosophy of mathematics. Over 500 entries include key terms found in the study of logic (argument, Turing machine, variable), set and model theory (isomorphism, function), and computability theory (algorithm, Turing machine), plus a table of logical symbols. Extensively cross-referenced to help comprehension and add detail, Logic from A to Z provides an indispensable reference source for students of all branches of logic.
Directory of Open Access Journals (Sweden)
Antonio Aguilera Ontiveros
2011-12-01
Full Text Available Arguments are part of a communicative process through which one tries to influence the actions of others. Gilbert (1994) identifies four modes of argumentation: the logical, the emotional, the visceral, and the kisceral. Following the line of research in computational psychology established by Ortony, Clore and Collins (1988) and the model of conflict resolution through argument-based negotiation proposed by Jung and Tambe (2001), this paper presents a formal logical model for the study of a specific mode of emotional arguments within the context of consensus formation, framed in a negotiation/coordination process. Its implications for emotional cognitive models based on the appraisal/evaluation process of emotion are discussed.
Formal modeling of a system of chemical reactions under uncertainty.
Ghosh, Krishnendu; Schlipf, John
2014-10-01
We describe a novel formalism representing a system of chemical reactions, with imprecise rates of reactions and concentrations of chemicals, and describe a model reduction method, pruning, based on the chemical properties. We present two algorithms, midpoint approximation and interval approximation, for construction of efficient model abstractions with uncertainty in data. We evaluate computational feasibility by posing queries in computation tree logic (CTL) on a prototype of extracellular-signal-regulated kinase (ERK) pathway.
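A minimal sketch of the interval idea, assuming simple mass-action kinetics (the rate law, interval bounds, and function names below are illustrative, not the paper's algorithms): the interval approximation propagates lower/upper bounds through the rate expression, while the midpoint approximation collapses each uncertain quantity to its midpoint first.

```python
def imul(x, y):
    """Interval product [x]*[y]: min/max over all endpoint products."""
    ps = [a * b for a in x for b in y]
    return (min(ps), max(ps))

def rate_interval(k, A, B):
    """Interval enclosure of a mass-action rate v = k*[A]*[B]."""
    return imul(imul(k, A), B)

def rate_midpoint(k, A, B):
    """Midpoint approximation: evaluate the rate at interval midpoints."""
    mid = lambda iv: 0.5 * (iv[0] + iv[1])
    return mid(k) * mid(A) * mid(B)
```

The interval result brackets every rate consistent with the data, whereas the midpoint result is a single cheap representative inside that bracket.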
Formalizing the Process of Constructing Chains of Lexical Units
Directory of Open Access Journals (Sweden)
Grigorij Chetverikov
2015-06-01
Full Text Available The paper investigates mathematical aspects of describing the construction of chains of lexical units on the basis of finite-predicate algebra. An analysis of the construction peculiarities is carried out, and an application of the method of finding the power of a linear logical transformation for removing characteristic words of a dictionary entry is given. Analysis and perspectives of the results of the study are provided.
Directory of Open Access Journals (Sweden)
E. Tazik
2014-10-01
Full Text Available Landslides are among the most important natural hazards that lead to modification of the environment, so studying this phenomenon is important in many areas. Given the climatic, geologic, and geomorphologic characteristics of the region, the purpose of this study was landslide hazard assessment using fuzzy logic, the frequency ratio, and the Analytical Hierarchy Process (AHP) method in the Dozein basin, Iran. First, landslides that occurred in the Dozein basin were identified using aerial photos and field studies. The landslide-conditioning parameters used in this study, including slope, aspect, elevation, lithology, precipitation, land cover, distance from fault, distance from road, and distance from river, were obtained from different sources and maps. Using these factors and the identified landslides, fuzzy membership values were calculated by the frequency ratio. Then, to account for the importance of each factor in landslide susceptibility, the weight of each factor was determined based on a questionnaire and the AHP method. Finally, the fuzzy map of each factor was multiplied by its weight obtained using the AHP method. Lastly, to compute prediction accuracy, the produced map was verified by comparison with existing landslide locations. The results indicate that the combination of the three methods (fuzzy logic, frequency ratio, and AHP) is a relatively good estimator of landslide susceptibility in the study area. According to the landslide susceptibility map, about 51% of the occurred landslides fall into the high and very high susceptibility zones, while approximately 26% are located in the low and very low susceptibility zones.
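The weighted-overlay step described above can be sketched as follows; the factor maps, membership values, and AHP weights are fabricated toy numbers, not the study's data:

```python
import numpy as np

# Hypothetical fuzzy membership maps (frequency-ratio derived), values in
# [0, 1], one small 2x2 raster per conditioning factor.
fuzzy = {
    "slope":     np.array([[0.9, 0.2], [0.4, 0.7]]),
    "lithology": np.array([[0.6, 0.1], [0.8, 0.3]]),
    "rainfall":  np.array([[0.5, 0.4], [0.9, 0.2]]),
}

# Hypothetical AHP-derived weights, normalised to sum to 1.
weights = {"slope": 0.5, "lithology": 0.3, "rainfall": 0.2}

# Susceptibility index: each fuzzy factor map multiplied by its AHP
# weight, then summed cell by cell.
susceptibility = sum(weights[f] * fuzzy[f] for f in fuzzy)
```

Thresholding the resulting index then yields the very-low-to-very-high susceptibility zones used for verification.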
Directory of Open Access Journals (Sweden)
Wei-I Lee
2016-12-01
Full Text Available The New Taipei City Government developed a Code-checking System (CCS) using Building Information Modeling (BIM) technology to facilitate architectural design review in 2014. This system was intended to solve problems caused by cognitive gaps between designer and reviewer in the design review process. Besides the information technology involved, the most important issue for the system's development has been the logicalization of literal building codes. Therefore, to enhance the reliability and performance of the CCS, this study uses the Fuzzy Delphi Method (FDM), on the basis of design thinking and communication theory, to investigate the semantic differences and cognitive gaps among participants in the design review process and to propose a direction for system development. Our empirical results lead us to recommend grouping, multi-stage screening, and weighted assisted logicalization of non-quantitative building codes to improve the operability of the CCS. Furthermore, the CCS should integrate an Expert Evaluation System (EES) to evaluate the design value under qualitative building codes.
"Debate" Learning Method and Its Implications for the Formal Education System
Najafi, Mohammad; Motaghi, Zohre; Nasrabadi, Hassanali Bakhtiyar; Heshi, Kamal Nosrati
2016-01-01
Regarding the importance of enhancing learners' social skills, especially in the learning process, this study introduces one of the group learning programs, entitled "debate", as a teaching method in Iranian religious universities. It also considers the concept and the history of this method through a qualitative, descriptive-analytical…
International Nuclear Information System (INIS)
Song, Myung Jun; Koo, Seo Ryong; Seong, Poong Hyun
2004-01-01
As PLCs are widely used in the digital I&C systems of nuclear power plants (NPPs), the safety of PLC software has become a primary consideration. Software safety is an important property for safety-critical systems, especially those in aerospace, satellite, and nuclear power applications, whose failure could endanger human life, property, or the environment. It is becoming more important due to the increase in the complexity and size of safety-critical systems. This research proposes a method to perform effective verification tasks in traceability analysis and software design evaluation during the software design phase. In order to perform traceability analysis between a Software Requirements Specification (SRS) written in natural language and a Software Design Specification (SDS) written in Function Block Diagram (FBD), this method uses extended-structured decision tables (ESDTs). ESDTs include information related to the traceability analysis from a text-based SRS and an FBD-based SDS, respectively. By comparing ESDTs from an SRS with ESDTs from an SDS, effective traceability analysis between a text-based SRS and an FBD-based SDS can be achieved. For the software design evaluation, model checking, which is widely used to verify PLC programs formally, is used in this research. An FBD-style design specification is translated into the input language of SMV by translation rules, and the design specification can then be formally analyzed using SMV. (author)
International Nuclear Information System (INIS)
Le Tellier, R.; Hebert, A.
2004-01-01
The method of characteristics is well known for its slow convergence; consequently, as is often done for SN methods, the Generalized Minimal Residual approach (GMRES) has been investigated for its practical implementation and its high reliability. GMRES is one of the most effective Krylov iterative methods for solving large linear systems. Moreover, the system has been left-preconditioned with the Algebraic Collapsing Acceleration (ACA), a variant of the Diffusion Synthetic Acceleration (DSA) based on I. Suslov's earlier works. This paper presents the first numerical results of these methods in 2D geometries with material discontinuities. Indeed, previous investigations have shown a degraded effectiveness of Diffusion Synthetic Acceleration with this kind of geometry. Results are presented for 9 x 9 Cartesian assemblies in terms of the speed of convergence of the inner (fixed source) iterations of the method of characteristics, and show a significant improvement in the convergence rate. (authors)
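The transport sweep and the ACA preconditioner are beyond a short sketch, but the underlying iteration can be illustrated generically: the code below is a textbook unrestarted GMRES with an optional left preconditioner (a Jacobi diagonal is used as a stand-in for ACA in the test; all names are illustrative, and this is not the authors' implementation):

```python
import numpy as np

def gmres(A, b, tol=1e-10, max_iter=50, M_inv=None):
    """Minimal unrestarted GMRES with optional left preconditioning:
    solves M_inv A x = M_inv b via Arnoldi plus a small least-squares."""
    precond = (lambda v: M_inv @ v) if M_inv is not None else (lambda v: v)
    r0 = precond(b)                       # x0 = 0, so r0 = M_inv b
    beta = np.linalg.norm(r0)
    Q = [r0 / beta]                       # orthonormal Krylov basis
    H = np.zeros((max_iter + 1, max_iter))
    x = np.zeros_like(b, dtype=float)
    for j in range(max_iter):
        w = precond(A @ Q[j])             # next Krylov direction
        for i in range(j + 1):            # modified Gram-Schmidt
            H[i, j] = Q[i] @ w
            w -= H[i, j] * Q[i]
        H[j + 1, j] = np.linalg.norm(w)
        e1 = np.zeros(j + 2); e1[0] = beta
        # Minimise ||beta*e1 - H y|| over the current Krylov subspace.
        y, *_ = np.linalg.lstsq(H[:j + 2, :j + 1], e1, rcond=None)
        x = np.column_stack(Q) @ y
        if np.linalg.norm(e1 - H[:j + 2, :j + 1] @ y) < tol or H[j + 1, j] < 1e-14:
            break
        Q.append(w / H[j + 1, j])
    return x
```

A good preconditioner (here, ACA in the paper; Jacobi in the toy test) clusters the spectrum so that far fewer Krylov iterations are needed.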
Geometry and Formal Linguistics.
Huff, George A.
This paper presents a method of encoding geometric line-drawings in a way which allows sets of such drawings to be interpreted as formal languages. A characterization of certain geometric predicates in terms of their properties as languages is obtained, and techniques usually associated with generative grammars and formal automata are then applied…
Applying Formal Methods to an Information Security Device: An Experience Report
National Research Council Canada - National Science Library
Kirby, Jr, James; Archer, Myla; Heitmeyer, Constance
1999-01-01
.... This paper describes a case study in which the SCR method was used to specify and analyze a different class of system, a cryptographic system called CD, which must satisfy a large set of security properties...
Symbolic logic syntax, semantics, and proof
Agler, David
2012-01-01
Brimming with visual examples of concepts, derivation rules, and proof strategies, this introductory text is ideal for students with no previous experience in logic. Students will learn translation both from formal language into English and from English into formal language; how to use truth trees and truth tables to test propositions for logical properties; and how to construct and strategically use derivation rules in proofs.
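Truth-table testing of propositions, as taught in the text, is mechanical enough to sketch in a few lines of Python (the helper names below are invented for illustration):

```python
from itertools import product

def is_tautology(prop, variables):
    """Check a proposition (a function of booleans) on every
    row of its truth table: true iff it holds on all rows."""
    return all(prop(*row) for row in product([True, False], repeat=len(variables)))

# Material implication, and contraposition (p -> q) <-> (~q -> ~p).
implies = lambda a, b: (not a) or b
contraposition = lambda p, q: implies(p, q) == implies(not q, not p)
```

Enumerating 2^n rows is exactly the truth-table method; truth trees refute a formula's negation instead, but agree on which propositions are tautologies.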
Baxter, S; Killoran, A; Kelly, M P; Goyder, E
2010-02-01
The nature of public health evidence presents challenges for conventional systematic review processes, with increasing recognition of the need to include a broader range of work including observational studies and qualitative research, yet with methods to combine diverse sources remaining underdeveloped. The objective of this paper is to report the application of a new approach for review of evidence in the public health sphere. The method enables a diverse range of evidence types to be synthesized in order to examine potential relationships between a public health environment and outcomes. The study drew on previous work by the National Institute for Health and Clinical Excellence on conceptual frameworks. It applied and further extended this work to the synthesis of evidence relating to one particular public health area: the enhancement of employee mental well-being in the workplace. The approach utilized thematic analysis techniques from primary research, together with conceptual modelling, to explore potential relationships between factors and outcomes. The method enabled a logic framework to be built from a diverse document set that illustrates how elements and associations between elements may impact on the well-being of employees. Whilst recognizing potential criticisms of the approach, it is suggested that logic models can be a useful way of examining the complexity of relationships between factors and outcomes in public health, and of highlighting potential areas for interventions and further research. The use of techniques from primary qualitative research may also be helpful in synthesizing diverse document types. Copyright 2010 The Royal Society for Public Health. Published by Elsevier Ltd. All rights reserved.
Energy Technology Data Exchange (ETDEWEB)
Choi, J. Y.; Choi, B. J.; Song, H. J.; Hwang, D. Y.; Song, G. H.; Lee, H. [Korea University, Seoul (Korea, Republic of)
2008-06-15
This research aims to help develop a nuclear power plant control system through the development of requirement specification and verification methods. As a result of applying the test method, a test standard was obtained through test documentation writing support, along with a test document reflecting the standard test activities based on that standard. The specification and verification of the pCOS system and the unified testing documentation and execution help the entire project to progress and enable us to obtain the documents and technology necessary to develop a safety-critical system.
International Nuclear Information System (INIS)
Choi, J. Y.; Choi, B. J.; Song, H. J.; Hwang, D. Y.; Song, G. H.; Lee, H.
2008-06-01
This research aims to help develop a nuclear power plant control system through the development of requirement specification and verification methods. As a result of applying the test method, a test standard was obtained through test documentation writing support, along with a test document reflecting the standard test activities based on that standard. The specification and verification of the pCOS system and the unified testing documentation and execution help the entire project to progress and enable us to obtain the documents and technology necessary to develop a safety-critical system.
A Dynamic Logic for Learning Theory
DEFF Research Database (Denmark)
Baltag, Alexandru; Gierasimczuk, Nina; Özgün, Aybüke
2017-01-01
Building on previous work that bridged Formal Learning Theory and Dynamic Epistemic Logic in a topological setting, we introduce a Dynamic Logic for Learning Theory (DLLT), extending Subset Space Logics with dynamic observation modalities, as well as with a learning operator, which encodes the learner's conjecture after observing a finite sequence of data. We completely axiomatise DLLT, study its expressivity and use it to characterise various notions of knowledge, belief, and learning.
Reasoning by cases in Default Logic
Roos, N.; Roos, Nico
1998-01-01
Reiter's Default Logic is one of the most popular formalisms for describing default reasoning. One important defect of Default Logic, however, is the inability to reason by cases. Over the years, several solutions to this problem have been proposed. All these proposals deal with deriving new
Quantum logic: is it necessarily orthocomplemented
International Nuclear Information System (INIS)
Mielnik, B.
1976-01-01
There exist conservative arguments supporting the necessity of the present-day form of quantum theory, which are found in the axiomatics of quantum logic. In this paper the axioms of quantum logic are critically reexamined. The lattice of macroscopic measurements, the motivation of the Hilbert space formalism, and the convex scheme of quantum mechanics are among the topics discussed. (B.R.H.)
Evaluation of formal methods in hip joint center assessment: an in vitro analysis.
Lopomo, Nicola; Sun, Lei; Zaffagnini, Stefano; Giordano, Giovanni; Safran, Marc R
2010-03-01
The hip joint center is a fundamental landmark in the identification of the lower limb mechanical axis; errors in its location lead to substantial inaccuracies both in joint reconstruction and in gait analysis. Currently, in Computer Aided Surgery, functional non-invasive procedures have been tested for identifying this landmark, but an anatomical validation is scarcely discussed. A navigation system was used to acquire data on eight cadaveric hips. A pivoting functional maneuver and the hip joint anatomy were analyzed. Two functional methods, both with and without a pelvic tracker, were evaluated: specifically, a sphere fit method and a transformation technique. The positions of the estimated centers with respect to the anatomical center of the femoral head, and the influence of this deviation on the kinematic assessment and on the identification of the femoral mechanical axis, were analyzed. We found that the implemented transformation technique gave the most reliable estimation of the hip joint center, introducing a mean (SD) difference of 1.6 (2.7) mm from the anatomical center with the pelvic tracker, whereas the sphere fit method without it demonstrated the lowest accuracy, with 25.2 (18.9) mm of deviation. Otherwise, both methods showed similar accuracy (<3 mm of deviation). The functional estimations were, in the best case, on average less than 2 mm from the anatomical center, which corresponds to angular deviations of the femoral mechanical axis smaller than 1.7 (1.3) degrees and negligible errors in the kinematic assessment of angular displacements.
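The navigation workflow itself cannot be reproduced here, but a sphere fit of the kind mentioned above is commonly implemented as an algebraic least-squares problem: each tracked point p satisfies |p|^2 = 2 p.c + (r^2 - |c|^2), which is linear in the center c and the constant d = r^2 - |c|^2. A generic sketch (not the authors' implementation) follows:

```python
import numpy as np

def fit_sphere(points):
    """Algebraic least-squares sphere fit to an (n, 3) point cloud.
    Solves [2P | 1] [c; d] = |p|^2 in the least-squares sense."""
    P = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * P, np.ones((len(P), 1))])
    f = (P ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, f, rcond=None)
    center, d = sol[:3], sol[3]
    radius = np.sqrt(d + center @ center)   # r^2 = d + |c|^2
    return center, radius
```

For hip center estimation the "points" would be femoral marker positions recorded during the pivoting maneuver; the fitted center estimates the joint center.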
International Nuclear Information System (INIS)
Kim, Yun Goo; Seong, Poong Hyun
2012-01-01
The Computerized Procedure System (CPS) is one of the primary operating support systems in the digital Main Control Room. The CPS displays procedures on the computer screen in the form of a flow chart, and displays plant operating information along with the procedure instructions. It also supports operator decision-making by providing system decisions. A procedure flow should be correct and reliable, as an error could lead to operator misjudgement and inadequate control. In this paper we present a model of the CPS that enables formal verification based on Petri nets. The proposed State Token Petri Nets (STPN) also support modeling of a procedure flow that has various interruptions by the operator, according to the plant condition. STPN modeling is compared with Coloured Petri Nets when both are applied to an Emergency Operating Computerized Procedure. A program for converting a Computerized Procedure (CP) to STPN has also been developed. The formal verification and validation of CPs with STPN increases the safety of a nuclear power plant and provides the digital quality assurance means that are needed as the role and function of the CPS grow.
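The STPN formalism is not reproduced here; as background for the Petri-net machinery it builds on, a minimal place/transition net with token firing can be sketched as follows (the example net, a procedure step gated on an operator acknowledgement, is an invented toy):

```python
class PetriNet:
    """Minimal place/transition net; markings are dicts place -> token count."""
    def __init__(self, transitions):
        # transitions: name -> (consumed, produced), each a dict place -> tokens
        self.transitions = transitions

    def enabled(self, marking, name):
        consumed, _ = self.transitions[name]
        return all(marking.get(p, 0) >= n for p, n in consumed.items())

    def fire(self, marking, name):
        """Fire a transition: remove consumed tokens, add produced ones."""
        if not self.enabled(marking, name):
            raise ValueError(f"transition {name!r} not enabled")
        consumed, produced = self.transitions[name]
        m = dict(marking)
        for p, n in consumed.items():
            m[p] -= n
        for p, n in produced.items():
            m[p] = m.get(p, 0) + n
        return m

# Toy procedure step: an instruction token moves from 'pending' to 'done'
# only when an operator acknowledgement token is also present.
net = PetriNet({"ack_step": ({"pending": 1, "ack": 1}, {"done": 1})})
```

Formal verification then amounts to exploring the reachable markings of such a net and checking properties (e.g. that every pending step can eventually reach 'done').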
Logical Entity Level Sentiment Analysis
DEFF Research Database (Denmark)
Petersen, Niklas Christoffer; Villadsen, Jørgen
2017-01-01
We present a formal logical approach using a combinatory categorial grammar for entity-level sentiment analysis that utilizes machine learning techniques for efficient syntactical tagging and performs a deep structural analysis of the syntactical properties of texts in order to yield precise results.
Propositional Logics of Dependence
Yang, F.; Väänänen, J.
2016-01-01
In this paper, we study logics of dependence on the propositional level. We prove that several interesting propositional logics of dependence, including propositional dependence logic, propositional intuitionistic dependence logic, as well as propositional inquisitive logic, are expressively complete.
Directory of Open Access Journals (Sweden)
Hanbing Liu
2012-08-01
Full Text Available A fuzzy logic system (FLS) is established for damage identification of a simply supported bridge. A novel damage indicator is developed based on the ratios of mode shape components before and after damage. A numerical simulation of a simply supported bridge is presented to demonstrate the memory, inference, and anti-noise ability of the proposed method. The bridge is divided into eight elements and nine nodes, and the damage indicator vector at characteristic nodes is used as the input measurement of the FLS. Results reveal that the FLS can detect damage of training patterns with an accuracy of 100%. For other test patterns, the FLS also possesses favorable inference ability; the identification accuracy for a single damage location is up to 93.75%. Tests with noise-simulated data show that the FLS possesses favorable anti-noise ability.
Formalized Verification of Snapshotable Trees: Separation and Sharing
DEFF Research Database (Denmark)
Mehnert, Hannes; Sieczkowski, Filip; Birkedal, Lars
2012-01-01
We use separation logic to specify and verify a Java program that implements snapshotable search trees, fully formalizing the specification and verification in the Coq proof assistant. We achieve local and modular reasoning about a tree and its snapshots and their iterators, although… for full functional specification and verification, whether by separation logic or by other formalisms.
Modern Logical Frameworks Design
DEFF Research Database (Denmark)
Murawska, Agata Anna
2017-01-01
lack support for reasoning about, or programming with, the mechanised systems. Our main motivation is to eventually make it possible to model and reason about complex concurrent systems and protocols. No matter the application, be it the development of a logic for multiparty session types or a cryptographic protocol used in a voting system, we need the ability to model and reason about both the building blocks of these systems and the intricate connections between them. To this end, this dissertation is an investigation into LF-based formalisms that might help address the aforementioned issues. We design and provide the meta-theory of two new frameworks, HyLF and Lincx. The former aims to extend the expressiveness of LF to include proof irrelevance and some user-defined behaviours, using ideas from hybrid logics. The latter is a showcase for an easier-to-implement framework, while also allowing…
Chu, Shih-I.; Telnov, Dmitry A.
2004-02-01
The advancement of high-power and short-pulse laser technology in the past two decades has generated considerable interest in the study of multiphoton and very high-order nonlinear optical processes of atomic and molecular systems in intense and superintense laser fields, leading to the discovery of a host of novel strong-field phenomena which cannot be understood by conventional perturbation theory. The Floquet theorem and the time-independent Floquet Hamiltonian method provide a powerful theoretical framework for the study of bound-bound multiphoton transitions driven by periodically time-dependent fields. However, a number of significant strong-field processes cannot be directly treated by the conventional Floquet methods. In this review article, we discuss several recent developments of generalized Floquet theorems, formalisms, and quasienergy methods, beyond the conventional Floquet theorem, for accurate nonperturbative treatment of a broad range of strong-field atomic and molecular processes and phenomena of current interest. Topics covered include (a) the artificial intelligence (AI) most-probable-path approach (MPPA) for effective treatment of ultralarge Floquet matrix problems; (b) non-Hermitian Floquet formalisms and complex quasienergy methods for nonperturbative treatment of bound-free and free-free processes such as multiphoton ionization (MPI) and above-threshold ionization (ATI) of atoms and molecules, multiphoton dissociation (MPD) and above-threshold dissociation (ATD) of molecules, chemical bond softening and hardening, charge-resonance enhanced ionization (CREI) of molecular ions, and multiple high-order harmonic generation (HHG), etc.; (c) the many-mode Floquet theorem (MMFT) for exact treatment of multiphoton processes in multi-color laser fields with a nonperiodic time-dependent Hamiltonian; (d) the Floquet-Liouville supermatrix (FLSM) formalism for exact nonperturbative treatment of the time-dependent Liouville equation (allowing for relaxations and
International Nuclear Information System (INIS)
Chu, S.-I.; Telnov, D.A.
2004-01-01
The advancement of high-power and short-pulse laser technology in the past two decades has generated considerable interest in the study of multiphoton and very high-order nonlinear optical processes of atomic and molecular systems in intense and superintense laser fields, leading to the discovery of a host of novel strong-field phenomena which cannot be understood by conventional perturbation theory. The Floquet theorem and the time-independent Floquet Hamiltonian method provide a powerful theoretical framework for the study of bound-bound multiphoton transitions driven by periodically time-dependent fields. However, a number of significant strong-field processes cannot be directly treated by the conventional Floquet methods. In this review article, we discuss several recent developments of generalized Floquet theorems, formalisms, and quasienergy methods, beyond the conventional Floquet theorem, for accurate nonperturbative treatment of a broad range of strong-field atomic and molecular processes and phenomena of current interest. Topics covered include (a) the artificial intelligence (AI) most-probable-path approach (MPPA) for effective treatment of ultralarge Floquet matrix problems; (b) non-Hermitian Floquet formalisms and complex quasienergy methods for nonperturbative treatment of bound-free and free-free processes such as multiphoton ionization (MPI) and above-threshold ionization (ATI) of atoms and molecules, multiphoton dissociation (MPD) and above-threshold dissociation (ATD) of molecules, chemical bond softening and hardening, charge-resonance enhanced ionization (CREI) of molecular ions, and multiple high-order harmonic generation (HHG), etc.; (c) the many-mode Floquet theorem (MMFT) for exact treatment of multiphoton processes in multi-color laser fields with a nonperiodic time-dependent Hamiltonian; (d) the Floquet-Liouville supermatrix (FLSM) formalism for exact nonperturbative treatment of the time-dependent Liouville equation (allowing for relaxations and
DEFF Research Database (Denmark)
Braüner, Torben
2011-01-01
Intuitionistic hybrid logic is hybrid modal logic over an intuitionistic logic basis instead of a classical logical basis. In this short paper we introduce intuitionistic hybrid logic and give a survey of work in the area.
Software Formal Inspections Guidebook
1993-01-01
The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.
Formal Analysis of SET and NSL Protocols Using the Interpretation Functions-Based Method
Directory of Open Access Journals (Sweden)
Hanane Houmani
2012-01-01
Full Text Available Most applications on the Internet, such as e-banking and e-commerce, use the SET and the NSL protocols to protect the communication channel between the client and the server. It is therefore crucial to ensure that these protocols respect security properties such as confidentiality, authentication, and integrity. In this paper, we analyze the SET and the NSL protocols with respect to the confidentiality (secrecy) property. To perform this analysis, we use the interpretation functions-based method. The main idea behind the interpretation functions-based technique is to give sufficient conditions that guarantee that a cryptographic protocol respects the secrecy property. The flexibility of the proposed conditions allows the verification of daily-life protocols such as SET and NSL. This method can also be used under different assumptions, such as a variety of intruder abilities, including algebraic properties of cryptographic primitives. The NSL protocol, for instance, is analyzed with and without the homomorphism property. We also show, using the SET protocol, the usefulness of this approach for correcting weaknesses and problems discovered during the analysis.
DEFF Research Database (Denmark)
Hendricks, Vincent Fella; Gierasimczuk, Nina; de Jong, Dick
2014-01-01
Learning and learnability have been long-standing topics of interest within the linguistic, computational, and epistemological accounts of inductive inference. Johan van Benthem's vision of the "dynamic turn" has not only brought renewed life to research agendas in logic as the study of information processing, but likewise helped bring logic and learning into close proximity. This proximity relation is examined with respect to learning and belief revision, updating and efficiency, and with respect to how learnability fits in the greater scheme of dynamic epistemic logic and scientific method.
A Two-Stage Fuzzy Logic Control Method of Traffic Signal Based on Traffic Urgency Degree
Yan Ge
2014-01-01
City intersection traffic signal control is an important method to improve the efficiency of the road network and alleviate traffic congestion. This paper studies a fuzzy traffic signal control method for a single intersection. A two-stage traffic signal control method based on traffic urgency degree is proposed, using two-stage fuzzy inference at a single intersection. At the first stage, the traffic urgency degree is calculated for all red phases using a traffic urgency evaluation module, and select t...
Smets, P
1995-01-01
We start by describing the nature of imperfect data and giving an overview of the various models that have been proposed. Fuzzy set theory is shown to be an extension of classical set theory, and as such has a prominent role in modelling imperfect data. The mathematics of fuzzy set theory is detailed, in particular the role of the triangular norms. The use of fuzzy set theory in fuzzy logic and possibility theory, the nature of the generalized modus ponens, and the implication operator for approximate reasoning are analysed. The use of fuzzy logic is detailed for applications oriented towards process control and database problems.
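The triangular norms mentioned above admit a compact illustration: the three classical t-norms (fuzzy conjunctions) and their De Morgan-dual conorms can be written directly. This is a generic textbook sketch, not tied to this paper's treatment:

```python
# Three standard triangular norms on [0, 1].
t_min  = lambda a, b: min(a, b)              # Goedel / minimum t-norm
t_prod = lambda a, b: a * b                  # product t-norm
t_luk  = lambda a, b: max(0.0, a + b - 1.0)  # Lukasiewicz t-norm

def s_norm(t):
    """Dual triangular conorm (fuzzy disjunction) obtained from a
    t-norm via De Morgan with the standard negation n(x) = 1 - x."""
    return lambda a, b: 1.0 - t(1.0 - a, 1.0 - b)
```

All t-norms are bounded above by the minimum and satisfy t(a, 1) = a, which is why min is the most "optimistic" conjunction and Lukasiewicz the most "pessimistic" of the three.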
DEFF Research Database (Denmark)
Reynolds, John C.
2002-01-01
In joint work with Peter O'Hearn and others, based on early ideas of Burstall, we have developed an extension of Hoare logic that permits reasoning about low-level imperative programs that use shared mutable data structure. The simple imperative programming language is extended with commands (not...... with the inductive definition of predicates on abstract data structures, this extension permits the concise and flexible description of structures with controlled sharing. In this paper, we will survey the current development of this program logic, including extensions that permit unrestricted address arithmetic...
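A toy model of the key connective of this program logic, the separating conjunction, can be sketched as follows (an illustrative simplification, not the authors' formalism): a heap, modelled as a finite map from addresses to values, satisfies P * Q iff it splits into two disjoint sub-heaps satisfying P and Q respectively.

```python
# Toy semantics for the separating conjunction P * Q of separation logic:
# a heap satisfies P * Q iff it splits into two DISJOINT sub-heaps
# satisfying P and Q respectively.
from itertools import combinations

def splits(heap):
    """Yield all ways to split a heap into two disjoint sub-heaps."""
    addrs = list(heap)
    for r in range(len(addrs) + 1):
        for part in combinations(addrs, r):
            left = {a: heap[a] for a in part}
            right = {a: heap[a] for a in heap if a not in part}
            yield left, right

def sep_conj(p, q):
    """Predicate for P * Q, given heap predicates p and q."""
    return lambda heap: any(p(l) and q(r) for l, r in splits(heap))

def points_to(addr, val):
    """The assertion "addr |-> val": a singleton heap."""
    return lambda heap: heap == {addr: val}

h = {10: "x", 20: "y"}
p = sep_conj(points_to(10, "x"), points_to(20, "y"))
print(p(h))          # True
print(p({10: "x"}))  # False: no disjoint split covers both conjuncts
```

The disjointness requirement is what lets separation logic reason locally about one data structure without worrying about aliasing into another.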
Quantum supports and modal logic
International Nuclear Information System (INIS)
Svetlichny, G.
1986-01-01
Recently Foulis, Piron, and Randall introduced a new interpretation of empirical and quantum logics which substitutes for the notion of a probabilistic weight a combinatorial notion called a support. The informal use of the notion of ''possible outcomes of experiments'' suggests that this interpretation can be related to corresponding formal notions as treated by modal logic. The purpose of this paper is to prove that supports are in fact in one-to-one correspondence with the sets of possibly true elementary propositions in Kripke models of a set of modal formulas associated to the empirical or quantum logic. This hopefully provides a sufficiently detailed link between the two rather distinct logical systems to shed useful light on both.
Logical Theories for Agent Introspection
DEFF Research Database (Denmark)
Bolander, Thomas
2004-01-01
Artificial intelligence systems (agents) generally have models of the environments they inhabit which they use for representing facts, for reasoning about these facts and for planning actions. Much intelligent behaviour seems to involve an ability to model not only one's external environment...... by self-reference. In the standard approach taken in artificial intelligence, the model that an agent has of its environment is represented as a set of beliefs. These beliefs are expressed as logical formulas within a formal, logical theory. When the logical theory is expressive enough to allow...... introspective reasoning, the presence of self-reference causes the theory to be prone to inconsistency. The challenge therefore becomes to construct logical theories supporting introspective reasoning while at the same time ensuring that consistency is retained. In the thesis, we meet this challenge by devising...
Formal Analysis of Domain Models
National Research Council Canada - National Science Library
Bharadwaj, Ramesh
2002-01-01
Recently, there has been a great deal of interest in the application of formal methods, in particular, precise formal notations and automatic analysis tools for the creation and analysis of requirements specifications (i.e...
Formal representation of complex SNOMED CT expressions
Directory of Open Access Journals (Sweden)
Markó Kornél
2008-10-01
Full Text Available Abstract Background Definitory expressions about clinical procedures, findings and diseases constitute a major benefit of a formally founded clinical reference terminology which is ontologically sound and suited for formal reasoning. SNOMED CT claims to support formal reasoning by description-logic based concept definitions. Methods On the basis of formal ontology criteria we analyze complex SNOMED CT concepts, such as "Concussion of Brain with(out) Loss of Consciousness", using alternatively full first-order logic and the description logic EL. Results Typical complex SNOMED CT concepts, including negations or not, can be expressed in full first-order logic. Negations cannot be properly expressed in the description logic EL underlying SNOMED CT. All concepts whose meaning implies a temporal scope may be subject to diverging interpretations, which are often unclear in SNOMED CT as their contextual determinants are not made explicit. Conclusion The description of complex medical occurrents is ambiguous, as the same situations can be described as (i) a complex occurrent C that has A and B as temporal parts, (ii) a simple occurrent A' defined as a kind of A followed by some B, or (iii) a simple occurrent B' defined as a kind of B preceded by some A. As negative statements in SNOMED CT cannot be exactly represented without
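To make the expressivity point concrete, a standard presentation of the concept grammar of the description logic $\mathcal{EL}$ (a textbook formulation, not taken from the article itself) is:

```latex
C,\, D \;::=\; \top \;\mid\; A \;\mid\; C \sqcap D \;\mid\; \exists r.C
```

Negation ($\neg C$), disjunction and universal restriction are absent from this grammar, which is why an explicitly negated finding such as "without loss of consciousness" has no direct $\mathcal{EL}$ rendering.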
Problems Encountered in Teaching Logic in Faculties of Theology
Directory of Open Access Journals (Sweden)
Hülya ALTUNYA
2014-04-01
Full Text Available From the past until today, logic has always been affiliated with religious education in institutes of higher learning. In the history of Islamic thought, education in logic has at times held an important place in the curriculum, while at other times it has only been represented symbolically. Throughout the history of thought, religious sciences have not only possessed a hierarchical structure based on classification, but have also been institutionalized in order to protect the accumulation of the knowledge that has been attained. As a result, in order to function as a vehicle in the structuring of this knowledge, logic has become what is known as an introductory science. Over time, as a vehicle of the religious sciences and an introductory science, logic has become a productive method by which different academic disciplines can attain information. Thus, until today in religious sciences education, logic has been used as a method both in the higher religious education provided in faculties of theology and in the madrasas which continue this education. In this paper, the education in logic that is given in faculties of theology will be examined, including how much of the curriculum is devoted to this subject, the quality of instruction, the integration of this subject with other lessons, the interest of students in this subject, and whether or not the necessary productivity in logic instruction is being attained. In addition, the extent to which logic can contribute new thought and comprehension techniques for solving the theological problems of today will be investigated. An additional research question asked here is the extent to which students enrolled in theology faculties in formal and informal education, the very people who will later act as instructors in the religious sciences, are aware of the importance of being familiar with and skilled in “correct reasoning techniques”.
Mapping Modular SOS to Rewriting Logic
DEFF Research Database (Denmark)
Braga, Christiano de Oliveira; Haeusler, Edward Hermann; Meseguer, José
2003-01-01
Modular SOS (MSOS) is a framework created to improve the modularity of structural operational semantics specifications, a formalism frequently used in the fields of programming language semantics and process algebras. With the objective of defining formal tools to support the execution...... and verification of MSOS specifications, we have defined a mapping, named , from MSOS to rewriting logic (RWL), a logic which has been proposed as a logical and semantic framework. We have proven the correctness of and implemented it as a prototype, the MSOS-SL Interpreter, in the Maude system, a high......
A Logical Analysis of Quantum Voting Protocols
Rad, Soroush Rafiee; Shirinkalam, Elahe; Smets, Sonja
2017-12-01
In this paper we provide a logical analysis of the Quantum Voting Protocol for Anonymous Surveying as developed by Horoshko and Kilin in (Phys. Lett. A 375, 1172-1175 2011). In particular we make use of the probabilistic logic of quantum programs as developed in (Int. J. Theor. Phys. 53, 3628-3647 2014) to provide a formal specification of the protocol and to derive its correctness. Our analysis is part of a wider program on the application of quantum logics to the formal verification of protocols in quantum communication and quantum computation.
Indian Academy of Sciences (India)
dimensional superfields, is a clear signature of the presence of the (anti-)BRST invariance in the original. 4D theory. Keywords. Non-Abelian 1-form gauge theory; Dirac fields; (anti-)Becchi–Roucet–Stora–. Tyutin invariance; superfield formalism; ...
The formal and the formalized: the cases of syllogistic and supposition theory
Dutilh Novaes, Catarina
2015-01-01
As a discipline, logic is arguably constituted of two main sub-projects: formal theories of argument validity on the basis of a small number of patterns, and theories of how to reduce the multiplicity of arguments in non-logical, informal contexts to the small number of patterns whose validity is
Datta, Sutapa; Mukhopadhyay, Subhasis
2013-01-01
An important step in understanding gene regulation is to identify the promoter regions where transcription factor binding takes place. Predicting a promoter region de novo has been a theoretical goal for many researchers for a long time. There exist a number of in silico methods to predict promoter regions de novo, but most of these methods still suffer from various shortcomings, a major one being the selection of appropriate features of the promoter region that distinguish it from non-promoters. In this communication, we have proposed a new composite method that predicts promoter sequences based on the interrelationship between structural profiles of DNA and primary sequence elements of the promoter regions. We have shown that a Context Free Grammar (CFG) can formalize the relationships between different primary sequence features, and by utilizing the CFG, we demonstrate that an efficient parser can be constructed for extracting these relationships from DNA sequences to distinguish true promoter sequences from non-promoter sequences. Along with the CFG, we have extracted the structural features of the promoter region to improve the efficiency of our prediction system. Extensive experiments performed on different datasets reveal that our method is effective in predicting promoter sequences on a genome-wide scale and performs satisfactorily compared to other promoter prediction techniques. PMID:23437045
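To illustrate how a CFG parser can decide membership of a sequence pattern, here is a minimal CYK recognizer. The grammar is a hypothetical toy (accepting the motif "ta" repeated one or more times), not the promoter grammar of the paper:

```python
# A minimal CYK recognizer for a context-free grammar in Chomsky normal
# form. The grammar is a HYPOTHETICAL toy: it accepts ("ta")+ .

RULES = [
    ("T", ("t",)), ("A", ("a",)),  # terminal rules
    ("S", ("T", "A")),             # S -> T A   (one "ta")
    ("S", ("S", "S")),             # S -> S S   (repetition)
]

def cyk(word, rules, start="S"):
    n = len(word)
    if n == 0:
        return False
    # table[i][j] = set of nonterminals deriving word[i : i+j+1]
    table = [[set() for _ in range(n)] for _ in range(n)]
    for i, ch in enumerate(word):
        for lhs, rhs in rules:
            if rhs == (ch,):
                table[i][0].add(lhs)
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            for split in range(1, span):
                for lhs, rhs in rules:
                    if len(rhs) == 2 and rhs[0] in table[i][split - 1] \
                            and rhs[1] in table[i + split][span - split - 1]:
                        table[i][span - 1].add(lhs)
    return start in table[0][n - 1]

print(cyk("tata", RULES))  # True
print(cyk("taat", RULES))  # False
```

A grammar over the DNA alphabet whose rules encode promoter sequence features could be fed to the same algorithm; CYK runs in O(n^3 · |G|), which is why practical systems typically use a more specialized parser.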
Kassian, Alexei
2015-01-01
A lexicostatistical classification is proposed for 20 languages and dialects of the Lezgian group of the North Caucasian family, based on meticulously compiled 110-item wordlists, published as part of the Global Lexicostatistical Database project. The lexical data have been subsequently analyzed with the aid of the principal phylogenetic methods, both distance-based and character-based: Starling neighbor joining (StarlingNJ), Neighbor joining (NJ), Unweighted pair group method with arithmetic mean (UPGMA), Bayesian Markov chain Monte Carlo (MCMC), Unweighted maximum parsimony (UMP). Cognation indexes within the input matrix were marked by two different algorithms: traditional etymological approach and phonetic similarity, i.e., the automatic method of consonant classes (Levenshtein distances). Due to certain reasons (first of all, high lexicographic quality of the wordlists and a consensus about the Lezgian phylogeny among Caucasologists), the Lezgian database is a perfect testing area for appraisal of phylogenetic methods. For the etymology-based input matrix, all the phylogenetic methods, with the possible exception of UMP, have yielded trees that are sufficiently compatible with each other to generate a consensus phylogenetic tree of the Lezgian lects. The obtained consensus tree agrees with the traditional expert classification as well as some of the previously proposed formal classifications of this linguistic group. Contrary to theoretical expectations, the UMP method has suggested the least plausible tree of all. In the case of the phonetic similarity-based input matrix, the distance-based methods (StarlingNJ, NJ, UPGMA) have produced the trees that are rather close to the consensus etymology-based tree and the traditional expert classification, whereas the character-based methods (Bayesian MCMC, UMP) have yielded less likely topologies.
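The phonetic-similarity algorithm mentioned above can be sketched as follows: map each consonant to a coarse sound class, then compare the resulting class strings by Levenshtein distance. The class table here is a simplified, hypothetical grouping, not the exact inventory used in the paper:

```python
# Sketch of consonant-class comparison: replace each consonant with a
# coarse sound class, then measure Levenshtein distance between the
# class strings. The class table is a simplified HYPOTHETICAL grouping.
CLASSES = {
    "P": "pbf", "T": "td", "S": "szc", "K": "kgqx",
    "M": "mn", "R": "rl", "W": "wv", "J": "jy",
}
TO_CLASS = {c: cls for cls, chars in CLASSES.items() for c in chars}

def class_string(word: str) -> str:
    """Keep only consonants, replaced by their sound class."""
    return "".join(TO_CLASS[c] for c in word.lower() if c in TO_CLASS)

def levenshtein(a: str, b: str) -> int:
    """Standard dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1,
                           prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

print(class_string("pater"), class_string("vader"))              # PTR WTR
print(levenshtein(class_string("pater"), class_string("fader")))  # 0
```

Automatic class-based matching trades etymological accuracy for objectivity, which is exactly the contrast the study evaluates against expert cognacy judgments.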
International Nuclear Information System (INIS)
Grecco, Claudio H.S.; Santos, Isaac J.A.L.; Carvalho, Paulo V.R.; Vidal, Mario C.R.; Cosenza, Carlos A.N.
2013-01-01
Resilience is the intrinsic ability of a system to adjust its functioning prior to, during, or following changes and disturbances, so that it can sustain required operations under expected and unexpected conditions. This definition focuses on the ability to function, rather than on being impervious to failure, and thereby overcomes the traditional conflict between productivity and safety. Resilience engineering (RE) has fast become recognized as a valuable complement to the established approaches to safety of complex socio-technical systems and methods to monitor organizational resilience are needed. However, few, if any, comprehensive and systematic research studies focus on developing an objective, reliable and practical assessment model for monitoring organizational resilience. Most methods cannot fully solve the subjectivity of resilience evaluation. In order to remedy this deficiency, the aim of this research is to adopt a Fuzzy Set Theory (FST) approach to establish a method for resilience assessment in organizations based on leading safety performance indicators, defined according to the resilience engineering principles. The method uses FST concepts and properties to model the indicators and to assess the results of their application. To exemplify the method we performed an exploratory case study at the process of radiopharmaceuticals dispatch package of a Brazilian radioactive facility. (author)
Computational logic: its origins and applications.
Paulson, Lawrence C
2018-02-01
Computational logic is the use of computers to establish facts in a logical formalism. Originating in nineteenth century attempts to understand the nature of mathematical reasoning, the subject now comprises a wide variety of formalisms, techniques and technologies. One strand of work follows the 'logic for computable functions (LCF) approach' pioneered by Robin Milner, where proofs can be constructed interactively or with the help of users' code (which does not compromise correctness). A refinement of LCF, called Isabelle, retains these advantages while providing flexibility in the choice of logical formalism and much stronger automation. The main application of these techniques has been to prove the correctness of hardware and software systems, but increasingly researchers have been applying them to mathematics itself.
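The LCF idea described above can be sketched in a few lines: "theorem" is an abstract type whose values can only be produced by a small trusted kernel of inference rules, so arbitrary user code cannot compromise soundness. This is an illustrative toy (propositions as nested tuples), not Isabelle's actual kernel:

```python
# LCF-style kernel sketch: Theorem values can only be constructed by the
# trusted rules below, which hold a private capability object. User code
# can combine rules freely but cannot forge a theorem.

class Theorem:
    _key = object()  # kernel capability (illustrative; Python cannot truly hide it)

    def __init__(self, prop, key):
        if key is not Theorem._key:
            raise ValueError("theorems can only be made by the kernel")
        self.prop = prop

# --- trusted kernel: the only producers of Theorem values ---

def axiom_k(p, q):
    """K axiom: p -> (q -> p)."""
    return Theorem(("->", p, ("->", q, p)), Theorem._key)

def modus_ponens(th_imp: Theorem, th_p: Theorem) -> Theorem:
    op, antecedent, consequent = th_imp.prop
    assert op == "->" and antecedent == th_p.prop, "rule does not apply"
    return Theorem(consequent, Theorem._key)

# --- untrusted user code ---
step1 = axiom_k("p", "q")   # |- p -> (q -> p)
print(step1.prop)
# Theorem("anything", None) would raise: no access to the kernel key.
```

Because every Theorem is traceable to kernel rules, user-supplied proof search code can be arbitrarily clever without endangering correctness, which is the point made in the abstract.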
International Nuclear Information System (INIS)
Koo, S. R.; Seong, P. H.
1999-01-01
In this work, a formal requirements analysis method for Nuclear Power Plant (NPP) I and C systems is suggested. This method uses the Unified Modeling Language (UML) for modeling systems visually and the Software Cost Reduction (SCR) formalism for checking the system models. Since the object-oriented method analyzes a document in terms of the objects in a real system, UML models based on the object-oriented method are useful for understanding problems and communicating with everyone involved in the project. In order to analyze the requirements more formally, an SCR tabular notation is derived from the UML models. To support the flow-through from UML models to SCR specifications, additional syntactic extensions for UML notation and a converting procedure are defined. The combined method has been applied to the Dynamic Safety System (DSS). From this application, three kinds of errors were detected in the existing DSS requirements.
DEFF Research Database (Denmark)
Vested Madsen, Matias; Macario, Alex; Yamamoto, Satoshi
2016-01-01
learning as higher quality than sessions with little or no verbal interaction between teacher and learner. A modified Delphi process was used to identify useful teaching techniques. A representative sample of each of the formal teaching session types was mapped, and residents anonymously completed a 5...... formal teaching session. The overall education scores of the sessions as rated by the residents were high....
Logical thinking in the pyramidal schema of concepts the logical and mathematical elements
Geldsetzer, Lutz
2014-01-01
This book proposes a new way of formalizing in logic and mathematics - a "pyramidal graph system," devised by the author and based on Porphyrian trees and modern concepts of classification, in both of which pyramids act as the organizing schema.
Quantum Enhanced Inference in Markov Logic Networks.
Wittek, Peter; Gogolin, Christian
2017-04-19
Markov logic networks (MLNs) reconcile two opposing schools in machine learning and artificial intelligence: causal networks, which account for uncertainty extremely well, and first-order logic, which allows for formal deduction. An MLN is essentially a first-order logic template to generate Markov networks. Inference in MLNs is probabilistic and it is often performed by approximate methods such as Markov chain Monte Carlo (MCMC) Gibbs sampling. An MLN has many regular, symmetric structures that can be exploited at both first-order level and in the generated Markov network. We analyze the graph structures that are produced by various lifting methods and investigate the extent to which quantum protocols can be used to speed up Gibbs sampling with state preparation and measurement schemes. We review different such approaches, discuss their advantages, theoretical limitations, and their appeal to implementations. We find that a straightforward application of a recent result yields exponential speedup compared to classical heuristics in approximate probabilistic inference, thereby demonstrating another example where advanced quantum resources can potentially prove useful in machine learning.
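To ground the discussion of Gibbs sampling, here is a toy sampler for a two-node binary Markov network with a single pairwise potential rewarding agreement, the classical inference baseline that lifted and quantum approaches aim to accelerate. The potential is made up for illustration:

```python
# Toy Gibbs sampler for a two-node binary Markov network with potential
# phi(x_i, x_j) = exp(J * [x_i == x_j]). Hypothetical parameters.
import math
import random

COUPLING = 1.0  # strength J of the agreement potential

def conditional_p1(other: int) -> float:
    """P(x_i = 1 | x_j = other) under the agreement potential."""
    w1 = math.exp(COUPLING * (1 == other))
    w0 = math.exp(COUPLING * (0 == other))
    return w1 / (w0 + w1)

def gibbs(n_steps: int, seed: int = 0) -> float:
    """Estimate P(x_0 == x_1) by sweeping both variables repeatedly."""
    rng = random.Random(seed)
    x = [0, 0]
    agree = 0
    for _ in range(n_steps):
        for i in (0, 1):
            x[i] = 1 if rng.random() < conditional_p1(x[1 - i]) else 0
        agree += (x[0] == x[1])
    return agree / n_steps

# The chain should agree far more often than the 50% of independence
# (exact value: e / (1 + e), about 0.73).
print(gibbs(20_000))
```

In a real MLN the conditionals are computed from all groundings of first-order formulas touching a variable; the symmetric, repetitive structure of those groundings is what both lifting and the quantum state-preparation schemes exploit.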
Interpreting Quantum Logic as a Pragmatic Structure
Garola, Claudio
2017-12-01
Many scholars maintain that the language of quantum mechanics introduces a quantum notion of truth which is formalized by (standard, sharp) quantum logic and is incompatible with the classical (Tarskian) notion of truth. We show that quantum logic can be identified (up to an equivalence relation) with a fragment of a pragmatic language LGP of assertive formulas, which are justified or unjustified rather than true or false. Quantum logic can then be interpreted as an algebraic structure that formalizes properties of the notion of empirical justification according to quantum mechanics, rather than properties of a quantum notion of truth. This conclusion agrees with a general integrationist perspective that interprets nonstandard logics as theories of metalinguistic notions different from truth, thus avoiding incompatibility with classical notions and preserving the globality of logic.
Logic for computer science foundations of automatic theorem proving
Gallier, Jean H
2015-01-01
This advanced text for undergraduate and graduate students introduces mathematical logic with an emphasis on proof theory and procedures for algorithmic construction of formal proofs. The self-contained treatment is also useful for computer scientists and mathematically inclined readers interested in the formalization of proofs and basics of automatic theorem proving. Topics include propositional logic and its resolution, first-order logic, Gentzen's cut elimination theorem and applications, and Gentzen's sharpened Hauptsatz and Herbrand's theorem. Additional subjects include resolution in fir
Temporal logics and real time expert systems.
Blom, J A
1996-10-01
This paper introduces temporal logics. Due to the eternal compromise between expressive adequacy and reasoning efficiency that must be decided upon in any application, full (first-order logic or modal logic based) temporal logics are frequently not suitable. This is especially true in real time expert systems, where a fixed (and usually small) response time must be guaranteed. One such expert system, Fagan's VM, is reviewed, and a delineation is given of how to formally describe and reason with time in medical protocols. It is shown that Petri net theory is a useful tool to check the correctness of formalised protocols.
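The Petri net machinery involved is small: a transition is enabled iff every input place holds enough tokens, and firing consumes input tokens and produces output tokens. The tiny "protocol" net below is a hypothetical illustration, not one from the paper:

```python
# Minimal Petri net firing rule. Markings and arc weights are Counters
# over place names; the example net is a HYPOTHETICAL protocol step.
from collections import Counter

def enabled(marking: Counter, inputs: Counter) -> bool:
    """A transition is enabled iff each input place holds enough tokens."""
    return all(marking[p] >= n for p, n in inputs.items())

def fire(marking: Counter, inputs: Counter, outputs: Counter) -> Counter:
    """Consume input tokens, produce output tokens."""
    assert enabled(marking, inputs), "transition not enabled"
    m = Counter(marking)
    m.subtract(inputs)
    m.update(outputs)
    return m

# places: patient_ready, drug_prepared, drug_given
give_drug = (Counter(patient_ready=1, drug_prepared=1),  # inputs
             Counter(drug_given=1))                      # outputs

m0 = Counter(patient_ready=1, drug_prepared=1)
m1 = fire(m0, *give_drug)
print(enabled(m1, give_drug[0]))  # False: tokens consumed, cannot re-fire
print(m1["drug_given"])           # 1
```

Correctness properties of a formalised protocol (e.g. "the drug can never be given twice") then become reachability questions over markings, which is what makes Petri nets a convenient checking tool.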
Directory of Open Access Journals (Sweden)
Alper Şen
2016-12-01
Full Text Available The inadequate evaluation of geological factors and unqualified, unplanned construction play an effective role in the giant damage and loss of life caused by earthquakes; faulty choices of site and of building construction cause building damage during an earthquake and thus also giant loss of life. Istanbul province and its immediate environment are located north of the North Anatolian Fault Zone, which is 1500 km long. Hence, the settlement's Sea of Marmara coastal region lies in the 1st seismic belt. The vulnerability and the earthquake risk in Istanbul, together with the related risk factors, should therefore be determined. In this study, a mathematical model has been created in geographic information systems for the Kadıköy, Maltepe and Prince Islands sub-provinces by using the fuzzy logic method, one of the artificial intelligence methods, considering 4 vulnerability parameters, and an earthquake vulnerability analysis has been carried out. The parameters used are the location with respect to the fault line, the geological structure, the building structure and the number of floors. The vulnerability grades that emerged as a result of the analysis have been studied, and the distribution of buildings according to those levels is presented via a thematic map. Pre-earthquake precautions should be determined for the study area by considering the vulnerability grades in case of any earthquake, so that the loss of life and property is minimized.
Yeşilkanat, Cafer Mert; Kobya, Yaşar; Taşkın, Halim; Çevik, Uğur
2017-09-01
The aim of this study was to determine spatial risk dispersion of ambient gamma dose rate (AGDR) by using both artificial neural network (ANN) and fuzzy logic (FL) methods, compare the performances of methods, make dose estimations for intermediate stations with no previous measurements and create dose rate risk maps of the study area. In order to determine the dose distribution by using artificial neural networks, two main networks and five different network structures were used; feed forward ANN; Multi-layer perceptron (MLP), Radial basis functional neural network (RBFNN), Quantile regression neural network (QRNN) and recurrent ANN; Jordan networks (JN), Elman networks (EN). In the evaluation of estimation performance obtained for the test data, all models appear to give similar results. According to the cross-validation results obtained for explaining AGDR distribution, Pearson's r coefficients were calculated as 0.94, 0.91, 0.89, 0.91, 0.91 and 0.92 and RMSE values were calculated as 34.78, 43.28, 63.92, 44.86, 46.77 and 37.92 for MLP, RBFNN, QRNN, JN, EN and FL, respectively. In addition, spatial risk maps showing distributions of AGDR of the study area were created by all models and results were compared with geological, topological and soil structure.
Denning, Peter J.
1991-01-01
The ongoing debate over the role of formalism and formal specifications in software features many speakers with diverse positions. Yet, in the end, they share the conviction that the requirements of a software system can be unambiguously specified, that acceptable software is a product demonstrably meeting the specifications, and that the design process can be carried out with little interaction between designers and users once the specification has been agreed to. This conviction is part of a larger paradigm prevalent in American management thinking, which holds that organizations are systems that can be precisely specified and optimized. This paradigm, which traces historically to the works of Frederick Taylor in the early 1900s, is no longer sufficient for organizations and software systems today. In the domain of software, a new paradigm, called user-centered design, overcomes the limitations of pure formalism. Pioneered in Scandinavia, user-centered design is spreading through Europe and is beginning to make its way into the U.S.
The logical foundations of scientific theories languages, structures, and models
Krause, Decio
2016-01-01
This book addresses the logical aspects of the foundations of scientific theories. Even though the relevance of formal methods in the study of scientific theories is now widely recognized and regaining prominence, the issues covered here are still not generally discussed in philosophy of science. The authors focus mainly on the role played by the underlying formal apparatuses employed in the construction of the models of scientific theories, relating the discussion to the so-called semantic approach to scientific theories. The book describes the role played by this metamathematical framework in three main aspects: considerations of the formal languages employed to axiomatize scientific theories, the role of the axiomatic method itself, and the way set-theoretical structures, which play the role of the models of theories, are developed. The authors also discuss the differences and philosophical relevance of the two basic ways of axiomatizing a scientific theory, namely Patrick Suppes’ set theoretical predicate...
Directory of Open Access Journals (Sweden)
Julian Galindo Losada
2016-11-01
Full Text Available Emerging companies involved in the design and implementation of innovative products need multidisciplinary teams to be competitive in the market. This need mainly requires designers to extend their knowledge not only of the User Interface elements of the design process but also of software methodologies, to cover the lack of resources and expertise in start-ups. It raises the question of how designers can align HCI techniques with best practices in software development while preserving usability and ease-of-use principles. To explore this gap, this paper proposes an approach that combines existing technology and methods by studying the nexus between HCI prototyping and software engineering. The approach is applied in a case study on the design of a virtual shop, harmonizing the use of storyboards and the spiral model. A comprehensive analysis is performed using a Technology Acceptance Model (TAM) with respect to two variables: usability and ease of use. The findings underline the positive integration of HCI techniques and formal methods without compromising user satisfaction, with a potential benefit for small companies in a formation stage.
Formal analysis of physical theories
International Nuclear Information System (INIS)
Dalla Chiara, M.L.; Toraldo di Francia, G.
1979-01-01
The rules of inference that are made use of in formalization are considered. It is maintained that a physical law represents the universal assertion of a probability, and not the assessment of the probability of a universal assertion. The precision of the apparatus used to collect the experimental evidence is introduced as an essential part of the theoretical structure of physics. This approach allows the authors to define the concept of truth in a satisfactory way, abandoning the unacceptable notion of approximate truth. It is shown that a considerable amount of light can be shed on a number of much debated problems arising in the logic of quantum mechanics. It is stressed that the deductive structure of quantum theory seems to be essentially founded on a kind of mixture of different logics. Two different concepts of truth are distinguished within quantum theory, an empirical truth and a quantum-logical truth. (Auth.)
International Nuclear Information System (INIS)
Martin-del-Campo, C.; Francois, J.L.; Barragan, A.M.; Palomera, M.A.
2005-01-01
In this paper we develop a methodology based on the Fuzzy Logic technique to build multi-objective functions for use in optimization processes applied to in-core nuclear fuel management. As an example, we selected the problem of determining optimal radial fuel enrichment and gadolinia distributions in a typical Boiling Water Reactor (BWR) fuel lattice. The methodology is based on the mathematical capability of Fuzzy Logic to model nonlinear functions of arbitrary complexity. Fuzzy Logic maps an input space into an output space, and the primary mechanism for doing this is a list of if-then statements called rules. The rules refer to variables and to adjectives that describe those variables; the Fuzzy Logic technique interprets the values in the input vectors and, based on the set of rules, assigns values to the output vector. The methodology was developed for the radial optimization of a BWR lattice, where the optimization algorithm employed is Tabu Search. The global objective is to find the optimal distribution of enrichments and burnable poison concentrations in a 10×10 BWR lattice. To this end, a fuzzy control inference system was developed using the Fuzzy Logic Toolbox of Matlab and linked to the Tabu Search optimization process. Results show that Tabu Search combined with Fuzzy Logic performs very well, obtaining lattices with optimal fuel utilization. (authors)
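The if-then rule mechanism described above can be sketched in a few lines. The single input (a local power-peaking factor), the output (a merit score), the membership functions, and the three rules below are illustrative assumptions, not the authors' actual rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def mamdani_merit(peaking):
    """Map a (hypothetical) local power-peaking factor to a [0, 1] merit
    score with three if-then rules, min-implication, max-aggregation and
    centroid defuzzification -- the generic Mamdani-style mechanism the
    abstract describes, not the paper's model."""
    rules = [  # (antecedent firing strength, consequent triangle over merit axis)
        (tri(peaking, 1.00, 1.10, 1.25), (0.5, 1.0, 1.5)),   # good -> merit high
        (tri(peaking, 1.15, 1.30, 1.45), (0.2, 0.5, 0.8)),   # fair -> merit medium
        (tri(peaking, 1.35, 1.60, 2.00), (-0.5, 0.0, 0.5)),  # poor -> merit low
    ]
    num = den = 0.0
    for i in range(201):                      # discretised output universe [0, 1]
        y = i / 200.0
        m = max(min(w, tri(y, *abc)) for w, abc in rules)
        num += y * m
        den += m
    return num / den if den else 0.0

print(mamdani_merit(1.1) > mamdani_merit(1.6))  # prints True
```

Scaling this mechanism to several inputs and objectives yields the kind of multi-objective function that an optimizer such as Tabu Search can evaluate for each candidate lattice.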
Market incentives in the Dutch home care debate: Applying the logics of care method
Directory of Open Access Journals (Sweden)
Stijn Verhagen
2009-12-01
Since the 1990s, politicians and policymakers have frequently stressed the benefits of market incentives in Dutch home care. Since then, however, waiting lists, staff shortages, and other problems have drawn negative publicity to the sector. In this article, the author presents a method for examining the underlying logics and dilemmas of market incentives in home care. On the basis of 443 documents (from the Dutch government, political parties, social partners, professional associations, insurers, and patient and consumer organizations), the author shows that four competing logics of care are present in the debate on market incentives: the economic, political, familial, and professional logics of care. Whereas some actors point to the conflicts and contradictions between the different logics, others emphasize their mutual complementarity. The latter happens relatively rarely, but when parties do emphasize the mutual complementarity of the logics of care, the chances of a joint and effective approach to the problems of home care appear to increase. This observation suggests a link between the debate on the one hand and concrete policy and implementation practice on the other.
Carlsson, Christer; Fullér, Robert
2004-01-01
Fuzzy Logic in Management demonstrates that difficult problems and changes in the management environment can be more easily handled by bringing fuzzy logic into the practice of management. This explicit theme is developed throughout the book as follows: Chapter 1, "Management and Intelligent Support Technologies", is a short survey of management leadership and what can be gained from support technologies. Chapter 2, "Fuzzy Sets and Fuzzy Logic", provides a short introduction to fuzzy sets, fuzzy relations, the extension principle, fuzzy implications and linguistic variables. Chapter 3, "Group Decision Support Systems", deals with group decision making, and discusses methods for supporting the consensus reaching processes. Chapter 4, "Fuzzy Real Options for Strategic Planning", summarizes research where the fuzzy real options theory was implemented as a series of models. These models were thoroughly tested on a number of real-life investments, and validated in 2001. Chapter 5, "Soft Computing Methods for Reducing...
Directory of Open Access Journals (Sweden)
Víctor Vásquez-Villalobos
2015-06-01
Full Text Available The sensory preference (sp) and shelf life of sensory acceptability (SLSA) of canned artichoke hearts were modeled using fuzzy logic (FL) and accelerated testing. The artichoke hearts were marinated in oil of sacha inchi (Plukenetia volubilis), soybean (Glycine max), and olive (Olea europaea), and evaluated using a ranking test with a semi-trained panel to identify the best preference both for flavor (f) and limpidity (l). We evaluated a global sp through intersection (AND) and union (OR) fuzzy operations on f and l, using triangular membership functions with the Mamdani method for defuzzification through 25 linguistic rules. The intersection showed the best modeling performance, with the highest sp value of 3.30 for the treatment with sacha inchi (50%), olive (25%), and soybean (25%) oil (p << 0.05), which was subjected to accelerated testing at 37 °C, 49 °C, and 55 °C and evaluated for sensory acceptability (SA) through an unstructured scale test in terms of f and l. The SLSA was determined by accelerated testing with FL through the intersection fuzzy operation on f and l, triangular membership functions for f and l, and again 25 linguistic rules. A SLSA at 20 °C of 296 days was determined for a "high" SA, and 569 days for a SA between "high" and the "beginning of medium" SA. Both values were lower than the 892 days determined by accelerated testing when evaluating the peroxide index in the canned products.
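In the simplest Zadeh formulation, the fuzzy intersection and union compared in this study reduce to the min and max of the membership degrees. The sketch below uses made-up triangular terms for flavor and limpidity on a 1-5 scale; the study's 25-rule Mamdani system is not reproduced:

```python
def tri(x, a, b, c):
    """Triangular membership: 0 outside (a, c), peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def preference(f, l, op="and"):
    """Global sensory preference from flavor f and limpidity l (1-5 scale,
    illustrative term) using the fuzzy intersection (min) or union (max)
    operations the abstract compares."""
    mf = tri(f, 1.0, 3.0, 5.0)   # membership of f in an assumed "preferred" term
    ml = tri(l, 1.0, 3.0, 5.0)   # membership of l in the same assumed term
    return min(mf, ml) if op == "and" else max(mf, ml)

print(preference(3.3, 2.5, "and") <= preference(3.3, 2.5, "or"))  # prints True
```

By construction the intersection is never larger than the union, so a rule base built on AND is the more conservative aggregate, which is consistent with it being the stricter of the two modeling choices.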
1991-10-01
Subject terms: engineering management; information systems; method formalization; information engineering; process modeling; information systems requirements definition methods; knowledge acquisition methods; systems engineering.
Access control, security, and trust: a logical approach
Chin, Shiu-Kai
2010-01-01
Access Control, Security, Trust, and Logic; Deconstructing Access Control Decisions; A Logical Approach to Access Control. PRELIMINARIES: A Language for Access Control; Sets and Relations; Syntax; Semantics. Reasoning about Access Control: Logical Rules; Formal Proofs and Theorems; Soundness of Logical Rules. Basic Concepts: Reference Monitors; Access Control Mechanisms: Tickets and Lists; Authentication. Security Policies: Confidentiality, Integrity, and Availability; Discretionary Security Policies; Mandatory Security Policies; Military Security Policies; Commercial Policies. DISTRIBUTED ACCESS CONTROL: Digital Authenti...
Paraconsistency properties in degree-preserving fuzzy logics
Czech Academy of Sciences Publication Activity Database
Ertola, R.; Esteva, F.; Flaminio, T.; Godo, L.; Noguera, Carles
2015-01-01
Roč. 19, č. 3 (2015), s. 531-546 ISSN 1432-7643 R&D Projects: GA ČR GAP202/10/1826 Institutional support: RVO:67985556 Keywords : Mathematical fuzzy logic * degree-preserving fuzzy logics * paraconsistent logics * logics of formal inconsistency Subject RIV: BA - General Mathematics Impact factor: 1.630, year: 2015 http://library.utia.cas.cz/separaty/2016/MTR/noguera-0469166.pdf
Fuzzy Versions of Epistemic and Deontic Logic
Gounder, Ramasamy S.; Esterline, Albert C.
1998-01-01
Epistemic and deontic logics are modal logics, respectively, of knowledge and of the normative concepts of obligation, permission, and prohibition. Epistemic logic is useful in formalizing systems of communicating processes and knowledge and belief in AI (Artificial Intelligence). Deontic logic is useful in computer science wherever we must distinguish between actual and ideal behavior, as in fault tolerance and database integrity constraints. We here discuss fuzzy versions of these logics. In the crisp versions, various axioms correspond to various properties of the structures used in defining the semantics of the logics. Thus, any axiomatic theory will be characterized not only by its axioms but also by the set of properties holding of the corresponding semantic structures. Fuzzy logic does not proceed with axiomatic systems, but fuzzy versions of the semantic properties exist and can be shown to correspond to some of the axioms for the crisp systems in special ways that support dependency networks among assertions in a modal domain. This in turn allows one to implement truth maintenance systems. To our knowledge, we are the first to address fuzzy epistemic and fuzzy deontic logic explicitly and to consider the different systems and semantic properties available. We give the syntax and semantics of epistemic logic and discuss the correspondence between axioms of epistemic logic and properties of semantic structures. The same topics are covered for deontic logic. For fuzzy epistemic and fuzzy deontic logic, we discuss the relationship between axioms and semantic properties. Our results can be exploited in truth maintenance systems.
Formal Semantics: Origins, Issues, Early Impact
Directory of Open Access Journals (Sweden)
Barbara H. Partee
2010-12-01
Directory of Open Access Journals (Sweden)
Diana-Maria Drigă
2015-12-01
Full Text Available The concept of resilience has in recent years been a leading concern in Romania, within the European Union, and worldwide. Specialists in economics, management, finance, legal sciences, political sciences, sociology, and psychology take a particular interest in this concept. Multidisciplinary research on resilience has over time produced multiple conceptualizations and theorizations, but without a consensus among specialists on its content, specificity, and scope. This paper aims to clarify the concept of resilience by exploring its evolution in ecological, social, and economic settings. At the same time, the paper presents aspects of feedback mechanisms and proposes a formalization of resilience using logic and mathematical analysis.
A Formal Framework for Workflow Analysis
Cravo, Glória
2010-09-01
In this paper we provide a new formal framework to model and analyse workflows. A workflow is the formal definition of a business process that consists in the execution of tasks in order to achieve a certain objective. In our work we describe a workflow as a graph whose vertices represent tasks and whose arcs are associated with workflow transitions. Each task has an associated input/output logic operator. This logic operator can be the logical AND (•), the OR (⊗), or the XOR, exclusive-or (⊕). Moreover, we introduce algebraic concepts in order to completely describe the structure of workflows. We also introduce the concept of logical termination. Finally, we provide a necessary and sufficient condition for this property to hold.
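A minimal executable reading of such a workflow graph: each task fires according to its input operator over its incoming arcs, and logical termination asks whether the final task fires. The task names and the specific graph below are invented for illustration, and the check covers one truth assignment rather than the paper's algebraic characterization:

```python
# Toy workflow graph: each task maps to (input operator, incoming arcs),
# with AND = all inputs, OR = at least one, XOR = exactly one.
WORKFLOW = {
    "pay":   ("AND", ["order", "invoice"]),
    "ship":  ("XOR", ["pay", "credit"]),
    "close": ("OR",  ["ship"]),
}

def fires(op, inputs):
    n = sum(inputs)
    return {"AND": n == len(inputs), "OR": n >= 1, "XOR": n == 1}[op]

def terminates(workflow, sources, final):
    """Logical termination for one truth assignment to the source arcs:
    propagate activation through the tasks (keys listed in topological
    order) and test whether the final task fires."""
    state = dict(sources)
    for task, (op, ins) in workflow.items():
        state[task] = fires(op, [state.get(i, False) for i in ins])
    return state[final]

print(terminates(WORKFLOW, {"order": True, "invoice": True, "credit": False}, "close"))
# prints True; with "credit" also True, the XOR at "ship" blocks termination
```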
A tristate optical logic system
Basuray, A.; Mukhopadhyay, S.; Kumar Ghosh, Hirak; Datta, A. K.
1991-09-01
A method is described to represent data in a tristate logic system, in which the data are subsequently encoded as Modified Trinary Numbers (MTN). This system is advantageous in parallel processing, as carry- and borrow-free operations in arithmetic computation are possible. The logical operations are also modified according to the three states available. A possible practical application using polarized light is also suggested.
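Signed-digit ternary encodings of this kind can be illustrated with standard balanced ternary, whose digits {-1, 0, 1} underlie carry-limited arithmetic schemes; the paper's exact MTN recoding is not reproduced here:

```python
def to_balanced_ternary(n):
    """Encode an integer with digits {-1, 0, 1}, most significant first.
    This is plain balanced ternary, shown only as a relative of the MTN
    encoding the abstract mentions."""
    digits = []
    while n:
        r = n % 3
        if r == 2:          # remap remainder 2 to digit -1, carrying 1 up
            r = -1
        n = (n - r) // 3
        digits.append(r)
    return digits[::-1] or [0]

def from_balanced_ternary(digits):
    v = 0
    for d in digits:
        v = 3 * v + d
    return v

print(to_balanced_ternary(5))   # prints [1, -1, -1]  (9 - 3 - 1)
```

Negation in this representation is just digit-wise sign flip, which is one reason signed-digit systems are attractive for borrow-free subtraction.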
[How to write an article: formal aspects].
Corral de la Calle, M A; Encinas de la Iglesia, J
2013-06-01
Scientific research and the publication of the results of the studies go hand in hand. Exquisite research methods can only be adequately reflected in a formal publication with the optimum structure. To ensure the success of this process, it is necessary to follow orderly steps, including selecting the journal in which to publish, strictly following the instructions to authors, and observing the guidelines developed by various societies of editors and other institutions. It is also necessary to structure the contents of the article in a logical and attractive way and to use an accurate, clear, and concise style of language. Although not all the authors are directly involved in the actual writing, elaborating a scientific article is a collective undertaking that does not finish until the article is published. This article provides practical advice about formal and not-so-formal details to take into account when writing a scientific article, as well as references that will help readers find more information in greater detail. Copyright © 2012 SERAM. Published by Elsevier España. All rights reserved.
DEFF Research Database (Denmark)
Carbone, Marco; Montesi, Fabrizio; Schürmann, Carsten
2014-01-01
In Choreographic Programming, a distributed system is programmed by giving a choreography, a global description of its interactions, instead of separately specifying the behaviour of each of its processes. Process implementations in terms of a distributed language can then be automatically projected from a choreography. We present Linear Compositional Choreographies (LCC), a proof theory for reasoning about programs that modularly combine choreographies with processes. Using LCC, we logically reconstruct a semantics and a projection procedure for programs. For the first time, we also obtain a procedure for extracting choreographies from process terms.
Applications of Mathematical Logic in Philosophy and Linguistics, Papers of a Conference
Malzkom, Wolfgang; Räsch, Thoralf
2003-01-01
"Foundations of the Formal Sciences" (FotFS) is a series of interdisciplinary conferences in mathematics, philosophy, computer science and linguistics. The main goal is to reestablish the traditionally strong links between these areas of research that have been lost in the past decades. The second conference in the series had the subtitle "Applications of Mathematical Logic in Philosophy and Linguistics" and brought speakers from all parts of the Formal Sciences together to give a holistic view of how mathematical methods can improve our philosophical and technical understanding of language and scientific discourse, ranging from the theoretical level up to applications in language recognition software. Audience: This volume is of interest to all formal philosophers and theoretical linguists. In addition to that, logicians interested in the applications of their field and logic students in mathematics, computer science, philosophy and linguistics can use the volume to broaden their knowledge of applications of...
The role of formal specifications
International Nuclear Information System (INIS)
McHugh, J.
1994-01-01
The role of formal requirements specification is discussed under the premise that the primary purpose of such specifications is to facilitate clear and unambiguous communications among the communities of interest for a given project. An example is presented in which the failure to reach such an understanding resulted in an accident at a chemical plant. Following the example, specification languages based on logical formalisms and notations are considered. These are rejected as failing to serve the communications needs of diverse communities. The notion of a specification as a surrogate for a program is also considered and rejected. The paper ends with a discussion of the type of formal notation that will serve the communications role and several encouraging developments are noted
Directory of Open Access Journals (Sweden)
Yu-Shan Cheng
2018-02-01
Full Text Available Self-consumption of household photovoltaic (PV) storage systems has become profitable for residential owners under the trends of limited feed-in power and decreasing PV feed-in tariffs. For individual PV-storage systems, the challenge mainly lies in managing the flow of surplus generation into the battery and the grid, ideally without relying on error-prone forecasts for both generation and consumption. Considering the large variation in power profiles of different houses in a neighborhood, the strategy is also supposed to be beneficial and applicable for the entire community. In this study, an adaptable battery charging control strategy is designed to obtain minimum costs for houses without any meteorological or load forecasts. Based on fuzzy logic control (FLC), the battery state-of-charge (SOC) and the variation of SOC (∆SOC) are taken as input variables to dynamically determine the output charging power with minimum costs. The proposed FLC-based algorithm charges the battery as much as possible during the daytime, while properly preserving capacity at midday, when there is a high possibility of curtailment loss. In addition, because of the distinct power profiles of the individual houses, the input membership functions of the FLC are tuned by particle swarm optimization (PSO) to achieve better overall performance. A neighborhood with 74 houses in Germany is set up as a scenario for comparison to prior studies. Without forecasts of generation and consumption power, the proposed method leads to minimum costs in 98.6% of the houses in the community and attains the lowest average yearly expenses for a single house.
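A toy version of such a controller: SOC and ∆SOC are fuzzified and a small rule base picks a normalized charging power. The membership functions, the rules, and the weighted-average output below are assumptions for illustration, not the study's PSO-tuned design:

```python
def tri(x, a, b, c):
    """Triangular membership: 0 outside (a, c), peak 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def charge_power(soc, dsoc, p_max=5.0):
    """Sketch of an FLC charging rule base with SOC (0-1) and its
    variation dSOC as inputs, as in the abstract; all numbers here are
    invented, and the output is a simple weighted average of rules."""
    low  = tri(soc, -0.5, 0.0, 0.5)
    mid  = tri(soc,  0.2, 0.5, 0.8)
    high = tri(soc,  0.5, 1.0, 1.5)
    rising = tri(dsoc, 0.0, 0.1, 0.2)       # SOC climbing fast (midday sun)
    rules = [                                # (firing strength, normalised power)
        (low,              1.0),             # SOC low    -> charge at full power
        (mid,              0.5),             # SOC medium -> moderate power
        (high,             0.0),             # SOC high   -> stop charging
        (min(mid, rising), 0.2),             # filling fast -> hold back capacity
    ]
    den = sum(w for w, _ in rules)
    return p_max * sum(w * u for w, u in rules) / den if den else 0.0

print(charge_power(0.2, 0.0) > charge_power(0.9, 0.0))  # prints True
```

The fourth rule is the "preserve capacity at midday" behaviour in miniature: when SOC is already medium and rising quickly, the recommended power drops even though the battery is not full.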
Directory of Open Access Journals (Sweden)
Alexey E. Fedoseev
2014-01-01
Full Text Available The article considers the development of formal methods for assessing rating criterion exponents. It presents a mathematical model that makes it possible to connect quantitative rating criterion characteristics, measured on various scales, with intuitive ideas about them. A solution to the problem of rating criterion estimation is proposed.
Miloševic Zupancic, Vesna
2018-01-01
Research from the field of non-formal education (NFE) in youth work emphasises the central role of experiential learning and learning in groups. The present paper aims to research teaching methods and teaching forms in NFE in youth work. The research sought to answer the following research questions: 'What teaching forms can be found in NFE for…
Institute of Scientific and Technical Information of China (English)
LIN; Kuang-Jang; LIN; Chii-Ruey
2010-01-01
The photovoltaic array has an optimal operating point at which it delivers maximum power. However, the optimal operating point shifts with the strength and angle of solar radiation and with changes in the environment and load. Because these conditions change constantly, it is very difficult to locate the optimal operating point with a fixed mathematical model. Therefore, this study focuses on the application of Fuzzy Logic Control theory and the Three-point Weight Comparison Method to locate the optimal operating point of a solar panel and achieve maximum efficiency in power generation. The Three-point Weight Comparison Method compares points on the characteristic curve of photovoltaic-array voltage versus output power; it is a rather simple way to track the maximum power. Fuzzy Logic Control, on the other hand, can be used to solve problems that cannot be effectively handled by calculation rules, such as concepts, contemplation, deductive reasoning, and identification. This paper therefore simulates both methods in turn. The simulation results show that the Three-point Comparison Method is more effective in environments where solar radiation changes frequently, whereas Fuzzy Logic Control tracks more efficiently when solar radiation changes violently.
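One step of the three-point idea can be sketched as follows: sample the P-V curve at v − dv, v, and v + dv and move toward the neighbour with the larger power. The toy curve and the simplification (an unweighted comparison, whereas the original method weights the two comparisons) are assumptions:

```python
def three_point_step(p_of_v, v, dv=0.5):
    """One maximum-power-tracking step: evaluate power at v - dv, v and
    v + dv on the P-V curve and step toward higher power. A simplified
    reading of the three-point comparison in the abstract."""
    p_left, p_mid, p_right = p_of_v(v - dv), p_of_v(v), p_of_v(v + dv)
    if p_right > p_mid:
        return v + dv
    if p_left > p_mid:
        return v - dv
    return v   # already at a (discretised) maximum

# Toy P-V curve with its maximum at v = 17 (illustrative numbers only).
curve = lambda v: -(v - 17.0) ** 2 + 100.0

v = 12.0
for _ in range(20):
    v = three_point_step(curve, v)
print(v)  # prints 17.0
```

A fuzzy controller would instead adapt the step size from the observed power change, which is why it copes better with violent irradiance swings while this fixed-step scheme suits gentler ones.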
Towards an arithmetical logic: the arithmetical foundations of logic
Gauthier, Yvon
2015-01-01
This book offers an original contribution to the foundations of logic and mathematics, and focuses on the internal logic of mathematical theories, from arithmetic or number theory to algebraic geometry. Arithmetical logic is the term used to refer to the internal logic of classical arithmetic, here called Fermat-Kronecker arithmetic, and combines Fermat’s method of infinite descent with Kronecker’s general arithmetic of homogeneous polynomials. The book also includes a treatment of theories in physics and mathematical physics to underscore the role of arithmetic from a constructivist viewpoint. The scope of the work intertwines historical, mathematical, logical and philosophical dimensions in a unified critical perspective; as such, it will appeal to a broad readership from mathematicians to logicians, to philosophers interested in foundational questions. Researchers and graduate students in the fields of philosophy and mathematics will benefit from the author’s critical approach to the foundations of l...
Cleaveland, Rance; Luettgen, Gerald; Bushnell, Dennis M. (Technical Monitor)
2002-01-01
This paper presents the Logical Process Calculus (LPC), a formalism that supports heterogeneous system specifications containing both operational and declarative subspecifications. Syntactically, LPC extends Milner's Calculus of Communicating Systems with operators from the alternation-free linear-time mu-calculus (LT(mu)). Semantically, LPC is equipped with a behavioral preorder that generalizes Hennessy's and DeNicola's must-testing preorder as well as LT(mu)'s satisfaction relation, while being compositional for all LPC operators. From a technical point of view, the new calculus is distinguished by the inclusion of: (1) both minimal and maximal fixed-point operators, and (2) an unimplementability predicate on process terms, which tags inconsistent specifications. The utility of LPC is demonstrated by means of an example highlighting the benefits of heterogeneous system specification.
Carlton, David Bryan
The exponential improvements in speed, energy efficiency, and cost that the computer industry has relied on for growth during the last 50 years are in danger of ending within the decade. These improvements all have relied on scaling the size of the silicon-based transistor that is at the heart of every modern CPU down to smaller and smaller length scales. However, as the size of the transistor reaches scales that are measured in the number of atoms that make it up, it is clear that this scaling cannot continue forever. As a result of this, there has been a great deal of research effort directed at the search for the next device that will continue to power the growth of the computer industry. However, due to the billions of dollars of investment that conventional silicon transistors have received over the years, it is unlikely that a technology will emerge that will be able to beat it outright in every performance category. More likely, different devices will possess advantages over conventional transistors for certain applications and uses. One of these emerging computing platforms is nanomagnetic logic (NML). NML-based circuits process information by manipulating the magnetization states of single-domain nanomagnets coupled to their nearest neighbors through magnetic dipole interactions. The state variable is magnetization direction and computations can take place without passing an electric current. This makes them extremely attractive as a replacement for conventional transistor-based computing architectures for certain ultra-low power applications. In most work to date, nanomagnetic logic circuits have used an external magnetic clocking field to reset the system between computations. The clocking field is then subsequently removed very slowly relative to the magnetization dynamics, guiding the nanomagnetic logic circuit adiabatically into its magnetic ground state. In this dissertation, I will discuss the dynamics behind this process and show that it is greatly
DEFF Research Database (Denmark)
Modal logic is a subject with ancient roots in the western logical tradition. Up until the last few generations, it was pursued mainly as a branch of philosophy. But in recent years, the subject has taken new directions with connections to topics in computer science and mathematics. This volume is the proceedings of the conference of record in its field, Advances in Modal Logic. Its contributions are state-of-the-art papers. The topics include decidability and complexity results for specific modal logics, proof theory of modal logic, logics for reasoning about time and space, provability logic, dynamic epistemic logic, and the logic of evidence.
Stepping Theories of Active Logic with Two Kinds of Negation
Directory of Open Access Journals (Sweden)
Mikhail M. Vinkov
2017-01-01
Full Text Available This paper formulates a stepping-theory formalism with two kinds of negation, addressing one of the areas of Active Logic, a new kind of logic aimed at performing practical tasks in real-time knowledge-based AI systems. In addition to the standard logical negation, the proposed formalism uses so-called subjective negation, interpreted as the inability to arrive at some conclusion through reasoning by the current time. The semantics of the proposed formalism is defined as an argumentation structure.
Flow Logic for Process Calculi
DEFF Research Database (Denmark)
Nielson, Hanne Riis; Nielson, Flemming; Pilegaard, Henrik
2012-01-01
Flow Logic is an approach to statically determining the behavior of programs and processes. It borrows methods and techniques from Abstract Interpretation, Data Flow Analysis and Constraint Based Analysis while presenting the analysis in a style more reminiscent of Type Systems. Traditionally developed for programming languages, this article provides a tutorial development of the approach of Flow Logic for process calculi based on a decade of research. We first develop a simple analysis for the π-calculus; this consists of the specification, semantic soundness (in the form of subject reduction...), and finally, we extend it to a relational analysis. A Flow Logic is a program logic, in the same sense that a Hoare logic is. We conclude with an executive summary presenting the highlights of the approach from this perspective, including a discussion of theoretical properties as well as implementation...
Flat Coalgebraic Fixed Point Logics
Schröder, Lutz; Venema, Yde
Fixed point logics are widely used in computer science, in particular in artificial intelligence and concurrency. The most expressive logics of this type are the μ-calculus and its relatives. However, popular fixed point logics tend to trade expressivity for simplicity and readability, and in fact often live within the single variable fragment of the μ-calculus. The family of such flat fixed point logics includes, e.g., CTL, the *-nesting-free fragment of PDL, and the logic of common knowledge. Here, we extend this notion to the generic semantic framework of coalgebraic logic, thus covering a wide range of logics beyond the standard μ-calculus including, e.g., flat fragments of the graded μ-calculus and the alternating-time μ-calculus (such as ATL), as well as probabilistic and monotone fixed point logics. Our main results are completeness of the Kozen-Park axiomatization and a timed-out tableaux method that matches ExpTime upper bounds inherited from the coalgebraic μ-calculus but avoids using automata.
Formalizing physical security procedures
Meadows, C.; Pavlovic, Dusko
Although the problems of physical security emerged more than 10,000 years before the problems of computer security, no formal methods have been developed for them, and the solutions have been evolving slowly, mostly through social procedures. But as the traffic on physical and social networks is now
Directory of Open Access Journals (Sweden)
Andrade, Pedro
2016-07-01
Full Text Available In this essay I will present some results of the project Public Communication of Art, which developed a seminal theory and methodology intended to cope with hybridity and new media literacy in our globalized and inter/transcultural world. Some of the methods used blend vision with touch and are called ‘hybrid methods’ or ‘hybrimethods’. Examples include a Multitouch Interactive Table, a Multitouch Questionnaire, a Trichotomies Game and a GeoNeoLogic Novel, the last being a hybrid novel activated by the fusion of vision, touch and GPS coordinates. Another hybrimethod is a sort of discursive analysis, named Hybrid Discourse Analysis (HDA), which uses ‘semantic-logical networks’ organized by concepts and ‘relation-concepts’. HDA is here articulated with Critical Sociology and applied to the analysis of a text on Magic Realism, which is also a hybrid genre within the social field of literature.
Is there a relationship between logic and psychology? The question for human reasoning
Directory of Open Access Journals (Sweden)
Jaime Castro Martínez
2013-11-01
Full Text Available This paper presents a debate on the relations between logic and psychology. It starts with the presentation of Keysser’s logicism. It describes some background to the debate on Mill’s psychologism and Husserl’s criticism of it, which contrasts the laws of logic with the laws of the nature of human thought. It continues with the contributions to the discussion by Gestalt theory. Then, the Piagetian case for a mental logic is presented. The essay concludes with the need to consider psychological logic as distinct from formal logic: a dynamic logic of facts, or logic of experience.
The nature of Formal Reasoning among Ghanaian Basic School ...
African Journals Online (AJOL)
cce
theories of cognitive development separated the organization of knowledge from practice. ..... Formal operators must be able to distinguish between false and logical arguments. ... Poole, B. (1997) Education for an Information age. Boston: ...
A Formalization of Linkage Analysis
DEFF Research Database (Denmark)
Ingolfsdottir, Anna; Christensen, A.I.; Hansen, Jens A.
In this report a formalization of genetic linkage analysis is introduced. Linkage analysis is a computationally hard biomathematical method whose purpose is to locate genes on the human genome. It is rooted in the new area of bioinformatics and no formalization of the method has previously been ...
Formal specification and implementation of operations in information management systems
International Nuclear Information System (INIS)
Sandewall, E.
1983-02-01
Among information management systems we include general purpose systems, such as text editors and data editors (forms management systems), as well as special purpose systems such as mail systems and computer based calendars. Based on a method for formal specification of some aspects of IMS, namely the structure of the data base, the update operations, and the user dialogue, the paper shows how reasonable procedures for executing IMS operations can be written in the notation of a first-order theory, in such a way that the procedure is a logical consequence of the specification. (Author)
Paraconsistent Computational Logic
DEFF Research Database (Denmark)
Jensen, Andreas Schmidt; Villadsen, Jørgen
2012-01-01
In classical logic everything follows from inconsistency and this makes classical logic problematic in areas of computer science where contradictions seem unavoidable. We describe a many-valued paraconsistent logic, discuss the truth tables and include a small case study....
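The truth-table idea in this abstract can be sketched in a few lines. The tables below are a generic Kleene-style three-valued logic with the middle ("both") value designated; they are an illustration only, not Jensen and Villadsen's actual tables:

```python
# Three truth values: true, "both" (inconsistent), false.
T, B, F = 1.0, 0.5, 0.0

def neg(a):            # negation leaves the inconsistent value fixed
    return 1.0 - a

def conj(a, b):        # conjunction as minimum
    return min(a, b)

def disj(a, b):        # disjunction as maximum
    return max(a, b)

def designated(a):     # both T and B count as "holding"
    return a >= B

# Explosion fails: a contradiction at value B "holds", yet an arbitrary
# falsehood q does not follow from it.
p, q = B, F
assert designated(conj(p, neg(p)))   # p and not-p is designated
assert not designated(q)             # ...but q is not thereby forced
```

This is what makes the logic paraconsistent: inconsistency no longer entails everything, because the contradiction itself receives the non-explosive middle value.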
Microelectromechanical reprogrammable logic device
Hafiz, Md Abdullah Al; Kosuru, Lakshmoji; Younis, Mohammad I.
2016-01-01
on the electrothermal frequency modulation scheme of a single microelectromechanical resonator, capable of performing all the fundamental 2-bit logic functions as well as n-bit logic operations. Logic functions are performed by actively tuning the linear resonance
Doberkat, Ernst-Erich
2009-01-01
Combining coalgebraic reasoning, stochastic systems and logic, this volume presents the principles of coalgebraic logic from a categorical perspective. Modal logics are also discussed, including probabilistic interpretations and an analysis of Kripke models.
Co-constructive Logics for Proofs and Refutations
Directory of Open Access Journals (Sweden)
Trafford James
2015-01-01
Full Text Available This paper considers logics which are formally dual to intuitionistic logic in order to investigate a co-constructive logic for proofs and refutations. This is philosophically motivated by a set of problems regarding the nature of constructive truth, and its relation to falsity. It is well known both that intuitionism cannot deal constructively with negative information, and that defining falsity by means of intuitionistic negation leads, under widely-held assumptions, to a justification of bivalence. For example, we do not want to equate falsity with the non-existence of a proof, since this would render a statement such as “pi is transcendental” false prior to 1882. In addition, the intuitionist account of negation as shorthand for the derivation of absurdity is inadequate, particularly outside of purely mathematical contexts. To deal with these issues, I investigate the dual of intuitionistic logic, co-intuitionistic logic, as a logic of refutation, alongside the intuitionistic logic of proofs. Direct proof and refutation are dual to each other, and are constructive, whilst there also exist syntactic, weak negations within both logics. In this respect, the logic of refutation is weakly paraconsistent in the sense that it allows for statements for which neither they nor their negations are refuted. I provide a proof theory for the co-constructive logic, a formal dualizing map between the logics, and a Kripke-style semantics. This is given an intuitive philosophical rendering in a re-interpretation of Kolmogorov's logic of problems.
Classical logic and logicism in human thought
Elqayam, Shira
2012-01-01
This chapter explores the role of classical logic as a theory of human reasoning. I distinguish between classical logic as a normative, computational and algorithmic system, and review its role in theories of human reasoning since the 1960s. The thesis I defend is that psychological theories have been moving further and further away from classical logic on all three levels. I examine some prominent examples of logicist theories, which incorporate logic in their psychological account, includin...
Logic programming extensions of Horn clause logic
Directory of Open Access Journals (Sweden)
Ron Sigal
1988-11-01
Full Text Available Logic programming is now firmly established as an alternative programming paradigm, distinct from and arguably superior to the still dominant imperative style of, for instance, the Algol family of languages. The concept of a logic programming language is not precisely defined, but it is generally understood to be characterized by: a declarative nature; foundation in some well understood logical system, e.g., first order logic.
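The declarative flavour described above can be illustrated with a minimal forward-chaining interpreter for propositional Horn clauses. This is only a sketch of the idea; real logic programming languages such as Prolog use resolution with unification rather than this naive saturation loop:

```python
# A Horn clause is (head, body): head holds if every body atom holds.
def consequences(clauses, facts):
    """Saturate: repeatedly fire clauses until no new atoms are derived."""
    known = set(facts)
    changed = True
    while changed:
        changed = False
        for head, body in clauses:
            if head not in known and all(b in known for b in body):
                known.add(head)
                changed = True
    return known

program = [
    ("mortal", ["human"]),   # mortal :- human.
    ("human", ["greek"]),    # human  :- greek.
]
print(sorted(consequences(program, ["greek"])))
# -> ['greek', 'human', 'mortal']
```

The program states *what* follows from what; the interpreter, not the programmer, decides the order of derivation, which is the essence of the declarative reading.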
Three-valued logics in modal logic
Kooi, Barteld; Tamminga, Allard
2013-01-01
Every truth-functional three-valued propositional logic can be conservatively translated into the modal logic S5. We prove this claim constructively in two steps. First, we define a Translation Manual that converts any propositional formula of any three-valued logic into a modal formula. Second, we
A Formal Calculus for Categories
DEFF Research Database (Denmark)
Cáccamo, Mario José
This dissertation studies the logic underlying category theory. In particular we present a formal calculus for reasoning about universal properties. The aim is to systematise judgements about functoriality and naturality central to categorical reasoning. The calculus is based on a language which...... extends the typed lambda calculus with new binders to represent universal constructions. The types of the language are interpreted as locally small categories and the expressions represent functors. The logic supports a syntactic treatment of universality and duality. Contravariance requires a definition...... of universality generous enough to deal with functors of mixed variance. Ends generalise limits to cover these kinds of functors and moreover provide the basis for a very convenient algebraic manipulation of expressions. The equational theory of the lambda calculus is extended with new rules for the definitions......
Logical operations realized on the Ising chain of N qubits
International Nuclear Information System (INIS)
Asano, Masanari; Tateda, Norihiro; Ishii, Chikara
2004-01-01
Multiqubit logical gates are proposed as implementations of logical operations on N qubits realized physically by the local manipulation of qubits before and after the one-time evolution of an Ising chain. This construction avoids complicated tuning of the interactions between qubits. The general rules of the action of multiqubit logical gates are derived by decomposing the process into the product of two-qubit logical operations. The formalism is demonstrated by the construction of a special type of multiqubit logical gate that is simulated by a quantum circuit composed of controlled-NOT gates
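The decomposition into two-qubit operations relies on elementary algebraic facts about gates such as controlled-NOT. As a small stand-alone illustration (not the authors' Ising-chain construction), the CNOT gate can be written as a 4x4 matrix and its basic rules checked directly:

```python
# CNOT on two qubits, in the computational basis |00>, |01>, |10>, |11>:
# it flips the target qubit exactly when the control qubit is |1>.
CNOT = [
    [1, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 0, 0, 1],
    [0, 0, 1, 0],
]

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(len(b)))
             for j in range(len(b[0]))] for i in range(len(a))]

def apply(gate, state):
    return [sum(gate[i][j] * state[j] for j in range(len(state)))
            for i in range(len(gate))]

# |10> -> |11>: control is 1, so the target is flipped.
assert apply(CNOT, [0, 0, 1, 0]) == [0, 0, 0, 1]
# |01> is unchanged: control is 0.
assert apply(CNOT, [0, 1, 0, 0]) == [0, 1, 0, 0]
# CNOT is its own inverse -- composing it with itself gives the identity.
I4 = [[1 if i == j else 0 for j in range(4)] for i in range(4)]
assert matmul(CNOT, CNOT) == I4
```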
Nonmonotonic reasoning in description logics. Rational closure for the ABox
CSIR Research Space (South Africa)
Casini, G
2013-07-01
Full Text Available The introduction of defeasible reasoning in Description Logics has been a main research topic in the field in the last years. Despite the fact that various interesting formalizations of nonmonotonic reasoning for the TBox have been proposed...
Formal verification of complex properties on PLC programs
Darvas, D; Voros, A; Bartha, T; Blanco Vinuela, E; Gonzalez Suarez, V M
2014-01-01
Formal verification has become a recommended practice in the safety-critical application areas. However, due to the complexity of practical control and safety systems, the state space explosion often prevents the use of formal analysis. In this paper we extend our former verification methodology with effective property preserving reduction techniques. For this purpose we developed general rule-based reductions and a customized version of the Cone of Influence (COI) reduction. Using these methods, the verification of complex requirements formalised with temporal logics (e.g. CTL, LTL) can be orders of magnitude faster. We use the NuSMV model checker on a real-life PLC program from CERN to demonstrate the performance of our reduction techniques.
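The Cone of Influence reduction mentioned above can be sketched as backward reachability over a variable-dependency graph: keep only the variables on which the verified property transitively depends. The dependency data below is invented for illustration, not taken from an actual PLC model:

```python
def cone_of_influence(deps, property_vars):
    """deps: var -> set of vars its next-state value depends on.
    Returns all variables the property variables transitively depend on."""
    cone, frontier = set(property_vars), list(property_vars)
    while frontier:
        v = frontier.pop()
        for u in deps.get(v, ()):
            if u not in cone:
                cone.add(u)
                frontier.append(u)
    return cone

deps = {"alarm": {"sensor", "mode"}, "mode": {"button"}, "log": {"alarm"}}
# A property over "alarm" needs 4 of the 5 variables; "log" is pruned away,
# shrinking the state space the model checker must explore.
assert cone_of_influence(deps, {"alarm"}) == {"alarm", "sensor", "mode", "button"}
```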
International Nuclear Information System (INIS)
Yavar, A.R.; Sukiman Sarmani; Tan, C.Y.; Rafie, N.N.; Lim, S.W.E.; Khoo, K.S.
2012-01-01
An electronic database has been developed and implemented for the k0-INAA method in Malaysia. Databases are often developed according to national requirements. This database contains the constant nuclear data for the k0-INAA method and offers the Hogdahl convention and the Westcott formalism as 3 separate command user interfaces. It has been created using Microsoft Access 2007 under a Windows operating system. This database saves time, and the quality of results can be assured, when the neutron flux parameters and the concentrations of elements are calculated by the k0-INAA method. An evaluation of the database was conducted with the IAEA Soil-7 reference material, and the published results showed a high level of consistency. (Author)
Multi-Valued Modal Fixed Point Logics for Model Checking
Nishizawa, Koki
In this paper, I will show how multi-valued logics are used for model checking. Model checking is an automatic technique to analyze correctness of hardware and software systems. A model checker is based on a temporal logic or a modal fixed point logic. That is to say, a system to be checked is formalized as a Kripke model, a property to be satisfied by the system is formalized as a temporal formula or a modal formula, and the model checker checks that the Kripke model satisfies the formula. Although most existing model checkers are based on 2-valued logics, recently new attempts have been made to extend the underlying logics of model checkers to multi-valued logics. I will summarize these new results.
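The fixed-point machinery behind such model checkers can be sketched with a two-valued checker for the EX and EF modalities over a Kripke model; the multi-valued extension described in the abstract replaces the Boolean set operations with lattice operations. The model below is invented for illustration:

```python
def EX(states, trans, target):
    """States with at least one successor in target ("exists next")."""
    return {s for s in states if any(t in target for t in trans.get(s, []))}

def EF(states, trans, target):
    """Least fixed point: target ∪ EX(target) ∪ EX(EX(target)) ∪ ..."""
    reach = set(target)
    while True:
        new = reach | EX(states, trans, reach)
        if new == reach:
            return reach
        reach = new

S = {0, 1, 2, 3}
R = {0: [1], 1: [2], 2: [2], 3: [0]}   # transition relation
assert EX(S, R, {2}) == {1, 2}          # states one step from 2
assert EF(S, R, {2}) == {0, 1, 2, 3}    # every state can reach 2
```

EF is computed by iterating EX to a fixed point, which is exactly how a modal fixed point logic expresses reachability properties.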
Automated Formal Verification for PLC Control Systems
Fernández Adiego, Borja
2014-01-01
Programmable Logic Controllers (PLCs) are devices widely used in industrial control systems. Ensuring that the PLC software is compliant with its specification is a challenging task. Formal verification has become a recommended practice to ensure the correctness of safety-critical software. However, these techniques are still not widely applied in industry due to the complexity of building formal models that represent the system and of formalizing requirement specifications. We propose a general methodology to perform automated model checking of complex properties expressed in temporal logics (e.g. CTL, LTL) on PLC programs. This methodology is based on an Intermediate Model (IM), meant to transform PLC programs written in any of the languages described in the IEC 61131-3 standard (ST, IL, etc.) into the different modeling languages of verification tools. This approach has been applied to CERN PLC programs, validating the methodology.
Hybrid-Logical Reasoning in False-Belief Tasks
Brauner, Torben
2013-01-01
The main aim of the present paper is to use a proof system for hybrid modal logic to formalize what are called false-belief tasks in cognitive psychology, thereby investigating the interplay between cognition and logical reasoning about belief. We consider two different versions of the Smarties task, involving respectively a shift of perspective to another person and to another time. Our formalizations disclose that despite this difference, the two versions of the Smarties task have exactly th...
Many-dimensional modal logics theory and applications
Gabbay, D M; Wolter, F; Zakharyaschev, M
2003-01-01
Modal logics, originally conceived in philosophy, have recently found many applications in computer science, artificial intelligence, the foundations of mathematics, linguistics and other disciplines. Celebrated for their good computational behaviour, modal logics are used as effective formalisms for talking about time, space, knowledge, beliefs, actions, obligations, provability, etc. However, the nice computational properties can drastically change if we combine some of these formalisms into a many-dimensional system, say, to reason about knowledge bases developing in time or moving objects.
Elbouz, Marwa; Alfalou, Ayman; Brosseau, Christian
2011-06-01
Home automation is being implemented in more and more homes of the elderly and disabled in order to maintain their independence and safety. For that purpose, we propose and validate a surveillance video system, which detects various posture-based events. One of the novel points of this system is to use adapted VanderLugt correlator (VLC) and joint transform correlator (JTC) techniques to make decisions on the identity of a patient and his three-dimensional (3-D) position in order to overcome the problem of crowded environments. We propose a fuzzy logic technique to reach decisions on the subject's behavior. Our system is focused on the goals of accuracy, convenience, and cost, and does not require any devices attached to the subject. The system permits one to study and model subject responses to behavioral change intervention, because several levels of alarm can be incorporated according to the different situations considered. Our algorithm performs a fast 3-D recovery of the subject's head position by locating the eyes within the face image, and involves model-based prediction and optical correlation techniques to guide the tracking procedure. The object detection is based on the (hue, saturation, value) color space. The system also involves an adapted fuzzy logic control algorithm to make a decision based on the information given to the system. Furthermore, the principles described here are applicable to a very wide range of situations and robust enough to be implementable in ongoing experiments.
Heunen, Chris
2008-01-01
We consider categorical logic on the category of Hilbert spaces. More generally, in fact, any pre-Hilbert category suffices. We characterise closed subobjects, and prove that they form orthomodular lattices. This shows that quantum logic is just an incarnation of categorical logic, enabling us to establish an existential quantifier for quantum logic, and conclude that there cannot be a universal quantifier.
On BAN logics for industrial security protocols
Agray, N.; Hoek, van der W.; Vink, de E.P.; Dunin-Keplicz, B.; Nawarecki, E.
2002-01-01
This paper reports on two case-studies of applying BAN logic to industrial strength security protocols. These studies demonstrate the flexibility of the BAN language, as it caters for the addition of appropriate constructs and rules. We argue that, although a semantical foundation of the formalism
Interval logic. Proof theory and theorem proving
DEFF Research Database (Denmark)
Rasmussen, Thomas Marthedal
2002-01-01
of a direction of an interval, and present a sound and complete Hilbert proof system for it. Because of its generality, SIL can conveniently act as a general formalism in which other interval logics can be encoded. We develop proof theory for SIL including both a sequent calculus system and a labelled natural...
Berg Johansen, Christina; Bock Waldorff, Susanne
2015-01-01
This study presents new insights into the explanatory power of the institutional logics perspective. With outset in a discussion of seminal theory texts, we identify two fundamental topics that frame institutional logics: overarching institutional orders guided by institutional logics, as well as change and agency generated by friction between logics. We use these topics as basis for an analysis of selected empirical papers, with the aim of understanding how institutional logics contribute to...
On the Equivalence of Formal Grammars and Machines.
Lund, Bruce
1991-01-01
Explores concepts of formal language and automata theory underlying computational linguistics. A computational formalism is described known as a "logic grammar," with which computational systems process linguistic data, with examples in declarative and procedural semantics and definite clause grammars. (13 references) (CB)
Comparing Two Tests of Formal Reasoning in a College Chemistry Context
Jiang, Bo; Xu, Xiaoying; Garcia, Alicia; Lewis, Jennifer E.
2010-01-01
The Test of Logical Thinking (TOLT) and the Group Assessment of Logical Thinking (GALT) are two of the instruments most widely used by science educators and researchers to measure students' formal reasoning abilities. Based on Piaget's cognitive development theory, formal thinking ability has been shown to be essential for student achievement in…
Energy Technology Data Exchange (ETDEWEB)
Fajkos, A.; Klimek, M.
1980-01-01
The possibility of using mathematical-logical modeling to improve the quality of mine shaft operation planning in Czechoslovakia is studied, based on the example of the Sverma mine in Ostrava, with complex mining-geological conditions. The basic criteria assumed were: the extraction plan, number of shifts in the long walls, time period for beginning and ending long wall operation, processing of reserves with consideration of existing conditions, output and dip angle of a formation, and quality of extracted coal; and also: time intervals for processing separate formations, limitation of the extraction load in a long wall in connection with gas emission, timbering, the necessity of ensuring normal operating conditions, concentration of extraction, and the time relationship of preparatory and extraction operations.
Empirical logic and tensor products
International Nuclear Information System (INIS)
Foulis, D.J.; Randall, C.H.
1981-01-01
In our work we are developing a formalism called empirical logic to support a generalization of conventional statistics; the resulting generalization is called operational statistics. We are not attempting to develop or advocate any particular physical theory; rather, we are formulating a precise 'language' in which such theories can be expressed, compared, evaluated, and related to laboratory experiments. We believe that only in such a language can the connections between real physical procedures (operations) and physical theories be made explicit and perspicuous. (orig./HSI)
Machine Learning-based Intelligent Formal Reasoning and Proving System
Chen, Shengqing; Huang, Xiaojian; Fang, Jiaze; Liang, Jia
2018-03-01
Reasoning systems can be used in many fields, and improving reasoning efficiency is central to their design. Through a formal description of proofs and a rule-matching algorithm, augmented with a machine learning algorithm, the intelligent formal reasoning and verification system achieves high efficiency. The experimental results show that the system can verify the correctness of propositional logic reasoning and reuse propositional reasoning results, so as to obtain the implicit knowledge in the knowledge base and provide a basic reasoning model for the construction of intelligent systems.
Connections among quantum logics
International Nuclear Information System (INIS)
Lock, P.F.; Hardegree, G.M.
1985-01-01
In this paper, a theory of quantum logics is proposed which is general enough to enable us to reexamine previous work on quantum logics in the context of this theory. It is then easy to assess the differences between the different systems studied. The quantum logical systems which are incorporated are divided into two groups, which we call ''quantum propositional logics'' and ''quantum event logics''. The work of Kochen and Specker (partial Boolean algebras) is included, and so is that of Greechie and Gudder (orthomodular partially ordered sets), Domotar (quantum mechanical systems), and Foulis and Randall (operational logics) in quantum propositional logics; and Abbott (semi-Boolean algebras) and Foulis and Randall (manuals) in quantum event logics. In this part of the paper, an axiom system for quantum propositional logics is developed and the above structures are examined in the context of this system. (author)
Theorem Proving In Higher Order Logics
Carreno, Victor A. (Editor); Munoz, Cesar A.; Tahar, Sofiene
2002-01-01
The TPHOLs International Conference serves as a venue for the presentation of work in theorem proving in higher-order logics and related areas in deduction, formal specification, software and hardware verification, and other applications. Fourteen papers were submitted to Track B (Work in Progress), which are included in this volume. Authors of Track B papers gave short introductory talks that were followed by an open poster session. The FCM 2002 Workshop aimed to bring together researchers working on the formalisation of continuous mathematics in theorem proving systems with those needing such libraries for their applications. Many of the major higher order theorem proving systems now have a formalisation of the real numbers and various levels of real analysis support. This work is of interest in a number of application areas, such as formal methods development for hardware and software application and computer supported mathematics. The FCM 2002 consisted of three papers, presented by their authors at the workshop venue, and one invited talk.
Allabakash, S.; Yasodha, P.; Bianco, L.; Venkatramana Reddy, S.; Srinivasulu, P.; Lim, S.
2017-09-01
This paper presents the efficacy of a "tuned" fuzzy logic method at determining the height of the boundary layer using the measurements from a 1280 MHz lower atmospheric radar wind profiler located in Gadanki (13.5°N, 79°E, 375 m above mean sea level), India, and discusses the diurnal and seasonal variations of the measured convective boundary layer over this tropical station. The original fuzzy logic (FL) method estimates the height of the atmospheric boundary layer combining the information from the range-corrected signal-to-noise ratio, the Doppler spectral width of the vertical velocity, and the vertical velocity itself, measured by the radar, through a series of thresholds and rules, which did not prove to be optimal for our radar system and geographical location. For this reason the algorithm was tuned to perform better on our data set. Atmospheric boundary layer heights obtained by this tuned FL method, the original FL method, and by a "standard method" (that only uses the information from the range-corrected signal-to-noise ratio) are compared with those obtained from potential temperature profiles measured by a collocated Global Positioning System Radio Sonde during the years 2011 and 2013. The comparison shows that the tuned FL method is more accurate than the other methods. Maximum convective boundary layer heights are observed between 14:00 and 15:00 local time (LT = UTC + 5:30) for clear-sky days. These daily maxima are found to be lower during winter and postmonsoon seasons and higher during premonsoon and monsoon seasons, due to net surface radiation and convective processes over this region being more intense during premonsoon and monsoon seasons and less intense in winter and postmonsoon seasons.
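The combination step of such a fuzzy-logic detector can be sketched as follows: each radar observable gets a membership score in [0, 1] for "boundary-layer top here", the scores are aggregated per range gate, and the gate with the highest aggregate wins. The membership shapes, weights, and radar values below are illustrative assumptions, not the tuned parameters of the paper:

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership: rises on [a,b], flat on [b,c], falls on [c,d]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

def bl_top_height(gates):
    """gates: list of (height_m, snr_dB, spectral_width, vertical_velocity)."""
    def score(snr, width, w):
        return (0.5 * trapezoid(snr, 0, 10, 40, 60)            # strong echo
                + 0.3 * trapezoid(width, 0.5, 1.0, 3.0, 5.0)   # turbulence
                + 0.2 * trapezoid(abs(w), 0.2, 0.5, 2.0, 4.0)) # convection
    return max(gates, key=lambda g: score(g[1], g[2], g[3]))[0]

obs = [(500, 5, 0.3, 0.1), (1200, 30, 2.0, 1.0), (2500, 8, 0.4, 0.2)]
print(bl_top_height(obs))  # the 1200 m gate scores highest here
```

"Tuning" the method then amounts to adjusting the trapezoid breakpoints and weights against reference profiles, which is what the paper reports doing against radiosonde data.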
Logic, mathematics, and computer science modern foundations with practical applications
Nievergelt, Yves
2015-01-01
This text for first- or second-year undergraduates in mathematics, logic, computer science, or the social sciences introduces the reader to logic, proofs, sets, and number theory. It also serves as an excellent independent study reference and resource for instructors. Adapted from Foundations of Logic and Mathematics: Applications to Science and Cryptography © 2002 Birkhäuser, this second edition provides a modern introduction to the foundations of logic, mathematics, and computer science, developing the theory that demonstrates construction of all mathematics and theoretical computer science from logic and set theory. The focus is on foundations, with specific statements of all the associated axioms and rules of logic and set theory, and complete details and derivations of formal proofs. Copious references to literature that documents historical development are also provided. Answers are found to many questions that usually remain unanswered: Why is the truth table for logical implication so uni...
Moral Particularism and Deontic Logic
Parent, Xavier
The aim of this paper is to strengthen the point made by Horty about the relationship between reason holism and moral particularism. In the literature, prima facie obligations have been considered the only source of reason holism. I strengthen Horty's point in two ways. First, I show that contrary-to-duty obligations provide another, independent support for reason holism. Next I outline a formal theory that is able to capture these two sources of holism. While in simple settings the proposed account coincides with Horty's, this is not true in more complicated or "realistic" settings in which more than two norms collide. My chosen formalism is so-called input/output logic.
DEFF Research Database (Denmark)
Berg Johansen, Christina; Waldorff, Susanne Boch
This study presents new insights into the explanatory power of the institutional logics perspective. With outset in a discussion of seminal theory texts, we identify two fundamental topics that frame institutional logics: overarching institutional orders guided by institutional logics, as well...... as change and agency generated by friction between logics. We use these topics as basis for an analysis of selected empirical papers, with the aim of understanding how institutional logics contribute to institutional theory at large, and which social matters institutional logics can and cannot explore...
Indeterministic Temporal Logic
Directory of Open Access Journals (Sweden)
Trzęsicki Kazimierz
2015-09-01
Full Text Available The questions of determinism, causality, and freedom have been the main philosophical problems debated since the beginning of temporal logic. The issue of the logical value of sentences about the future was stated by Aristotle in the famous sea-battle tomorrow passage. The question inspired Łukasiewicz’s idea of many-valued logics and was a motive of A. N. Prior’s considerations about the logic of tenses. In the scheme of temporal logic there are different solutions to the problem. In the paper we consider an indeterministic temporal logic based on the idea of temporal worlds and the relation of accessibility between them.
Saleh, F.; Flipo, N.; de Fouquet, C.
2012-04-01
The main objective of this study is to provide a realistic simulation of river stage in regional river networks in order to improve the quantification of stream-aquifer exchanges and better assess the associated aquifer responses that are often impacted by the magnitude and the frequency of the river stage fluctuations. The study focuses on the Oise basin (17 000 km2, part of the 65 000 km2 Seine basin in Northern France) where stream-aquifer exchanges cannot be assessed directly by experimental methods. Nowadays numerical methods are the most appropriate approaches for assessing stream-aquifer exchanges at this scale. A regional distributed process-based hydro(geo)logical model, Eau-Dyssée, is used, which aims at the integrated modeling of the hydrosystem to manage the various elements involved in the quantitative and qualitative aspects of water resources. Eau-Dyssée simulates pseudo 3D flow in aquifer systems solving the diffusivity equation with a finite difference numerical scheme. River flow is simulated with a Muskingum model. In addition to the in-stream discharge, a river stage estimate is needed to calculate the water exchange at the stream-aquifer interface using the Darcy law. Three methods for assessing in-stream river stages are explored to determine the most appropriate representation at regional scale over 25 years (1980-2005). The first method consists in defining rating curves for each cell of a 1D Saint-Venant hydraulic model. The second method consists in interpolating observed rating curves (at gauging stations) onto the river cells of the hydro(geo)logical model. The interpolation technique is based on geostatistics. The last method assesses river stage using Manning equation with a simplified rectangular cross-section (water depth equals the hydraulic radius). Compared to observations, the geostatistical and the Manning methodologies lead to slightly less accurate (but still acceptable) results offering a low computational cost opportunity
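The third river-stage method can be sketched directly: under the wide-channel approximation, where the water depth h is taken to equal the hydraulic radius, the Manning equation Q = (1/n) A R^(2/3) S^(1/2) with a rectangular section A = B·h becomes Q = (1/n) B h^(5/3) sqrt(S), which inverts in closed form. The numeric parameter values below are illustrative only, not taken from the Oise basin model:

```python
import math

def manning_depth(Q, B, S, n):
    """Water depth h (m) for discharge Q (m^3/s), channel width B (m),
    slope S (dimensionless), and Manning roughness n, assuming a
    rectangular cross-section with h equal to the hydraulic radius."""
    return (Q * n / (B * math.sqrt(S))) ** 0.6  # 0.6 = 3/5

h = manning_depth(Q=100.0, B=50.0, S=1e-4, n=0.03)
# Round trip: the computed depth reproduces the discharge it came from.
Q_back = (1.0 / 0.03) * 50.0 * h ** (5.0 / 3.0) * math.sqrt(1e-4)
assert abs(Q_back - 100.0) < 1e-9
```

The closed form is what makes this method cheap at regional scale: each river cell needs only an algebraic evaluation per time step, rather than a hydraulic simulation or an interpolated rating curve.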
First-Order Hybrid Logic: Introduction and Survey
DEFF Research Database (Denmark)
Braüner, Torben
2014-01-01
often wants to build up a series of assertions about what happens at a particular instant, and standard modal formalisms do not allow this. What is less obvious is that the route hybrid logic takes to overcome this problem often actually improves the behaviour of the underlying modal formalism...
Quantum Logic as a Dynamic Logic
Baltag, A.; Smets, S.
We address the old question whether a logical understanding of Quantum Mechanics requires abandoning some of the principles of classical logic. Against Putnam and others (Among whom we may count or not E. W. Beth, depending on how we interpret some of his statements), our answer is a clear “no”.
Transforming equality logic to propositional logic
Zantema, H.; Groote, J.F.
2003-01-01
We investigate and compare various ways of transforming equality formulas to propositional formulas, in order to be able to solve satisfiability in equality logic by means of satisfiability in propositional logic. We propose equality substitution as a new approach combining desirable
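The baseline reduction behind such transformations can be sketched as follows: each equality atom over a finite set of variables becomes a Boolean variable e[i][j], and transitivity constraints e[i][j] ∧ e[j][k] → e[i][k] ensure that satisfying assignments correspond to genuine equivalence relations. Zantema and Groote's equality substitution is a refinement of this idea; the code below is only the naive version:

```python
import itertools

def transitivity_clauses(n):
    """Clauses (as lists of signed equality variables) forcing transitivity."""
    clauses = []
    for i, j, k in itertools.permutations(range(n), 3):
        # not e_ij  or  not e_jk  or  e_ik
        clauses.append([(-1, (i, j)), (-1, (j, k)), (1, (i, k))])
    return clauses

def consistent(assignment, n):
    """assignment: symmetric dict (i, j) -> bool. True iff transitive."""
    for clause in transitivity_clauses(n):
        if not any((assignment[v] if s > 0 else not assignment[v])
                   for s, v in clause):
            return False
    return True

# x0 = x1 and x1 = x2 but x0 != x2 violates transitivity...
a = {(i, j): False for i in range(3) for j in range(3) if i != j}
a[(0, 1)] = a[(1, 0)] = True
a[(1, 2)] = a[(2, 1)] = True
assert not consistent(a, 3)
# ...and adding x0 = x2 repairs it.
a[(0, 2)] = a[(2, 0)] = True
assert consistent(a, 3)
```

A propositional SAT solver run on the encoded formula plus these clauses then decides satisfiability of the original equality formula.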
Dual Logic and Cerebral Coordinates for Reciprocal Interaction in Eye Contact
Lee, Ray F.
2015-01-01
In order to scientifically study the human brain’s response to face-to-face social interaction, the scientific method itself needs to be reconsidered so that both quantitative observation and symbolic reasoning can be adapted to the situation where the observer is also observed. In light of the recent development of dyadic fMRI, which can directly observe two interacting brains in one MRI scanner, this paper aims to establish a new form of logic, dual logic, which provides a theoretical platform for deductive reasoning in a complementary dual system with an emergence mechanism. Applying dual logic in the dfMRI experimental design and data analysis, the exogenous and endogenous dual systems in the BOLD responses can be identified; the non-reciprocal responses in the dual system can be suppressed; and a cerebral coordinate for reciprocal interaction can be generated. Elucidated by dual logic deductions, the cerebral coordinate for reciprocal interaction suggests that the exogenous and endogenous systems consist of the empathy network and the mentalization network, respectively; the default-mode network emerges from the resting state to activation in the endogenous system during reciprocal interaction; and the cingulate plays an essential role in the emergence from the exogenous system to the endogenous system. Overall, the dual logic deductions are supported by the dfMRI experimental results and are consistent with the current literature. Both the theoretical framework and the experimental method set the stage to formally apply the scientific method in studying complex social interaction. PMID:25885446
Dual logic and cerebral coordinates for reciprocal interaction in eye contact.
Directory of Open Access Journals (Sweden)
Ray F Lee
Full Text Available In order to scientifically study the human brain's response to face-to-face social interaction, the scientific method itself needs to be reconsidered so that both quantitative observation and symbolic reasoning can be adapted to the situation where the observer is also observed. In light of the recent development of dyadic fMRI, which can directly observe two interacting brains in one MRI scanner, this paper aims to establish a new form of logic, dual logic, which provides a theoretical platform for deductive reasoning in a complementary dual system with an emergence mechanism. Applying dual logic in the dfMRI experimental design and data analysis, the exogenous and endogenous dual systems in the BOLD responses can be identified; the non-reciprocal responses in the dual system can be suppressed; and a cerebral coordinate for reciprocal interaction can be generated. Elucidated by dual logic deductions, the cerebral coordinate for reciprocal interaction suggests that the exogenous and endogenous systems consist of the empathy network and the mentalization network, respectively; the default-mode network emerges from the resting state to activation in the endogenous system during reciprocal interaction; and the cingulate plays an essential role in the emergence from the exogenous system to the endogenous system. Overall, the dual logic deductions are supported by the dfMRI experimental results and are consistent with the current literature. Both the theoretical framework and the experimental method set the stage to formally apply the scientific method in studying complex social interaction.
The Pedagogical Value of the Lecture Method: The Case of a Non-Formal Education Programme in Ghana
Addae, David; Quan-Baffour, Kofi
2018-01-01
Adult learning rests on the foundation of learner experience and involvement in the teaching and learning process. The methods employed in facilitating adult learning have to a large extent sought to place the learner at the centre of the entire teaching and learning encounter. The lecture method is one of the many methods used to facilitate…
Formalization of the Integral Calculus in the PVS Theorem Prover
Directory of Open Access Journals (Sweden)
Ricky Wayne Butler
2009-04-01
Full Text Available The PVS theorem prover is a widely used formal verification tool for the analysis of safety-critical systems. Although the PVS prover is fully equipped to support deduction in a very general logic framework, namely higher-order logic, it must nevertheless be augmented with the definitions and associated theorems for every branch of mathematics and computer science that is used in a verification. This is a formidable task, ultimately requiring the contributions of researchers and developers all over the world. This paper reports on the formalization of the integral calculus in the PVS theorem prover. All of the basic definitions and theorems covered in a first course on integral calculus have been completed. The theory and proofs were based on Rosenlicht’s classic text on real analysis and follow the traditional epsilon-delta method. The goal of this work was to provide a practical set of PVS theories that could be used for verification of hybrid systems that arise in air traffic management systems and other aerospace applications. All of the basic linearity, integrability, boundedness, and continuity properties of the integral calculus were proved. The work culminated in the proof of the Fundamental Theorem of Calculus. There is a brief discussion about why mechanically checked proofs are so much longer than standard mathematics textbook proofs.
Formalization of the Integral Calculus in the PVS Theorem Prover
Butler, Ricky W.
2004-01-01
The PVS theorem prover is a widely used formal verification tool for the analysis of safety-critical systems. Although the PVS prover is fully equipped to support deduction in a very general logic framework, namely higher-order logic, it must nevertheless be augmented with the definitions and associated theorems for every branch of mathematics and computer science that is used in a verification. This is a formidable task, ultimately requiring the contributions of researchers and developers all over the world. This paper reports on the formalization of the integral calculus in the PVS theorem prover. All of the basic definitions and theorems covered in a first course on integral calculus have been completed. The theory and proofs were based on Rosenlicht's classic text on real analysis and follow the traditional epsilon-delta method. The goal of this work was to provide a practical set of PVS theories that could be used for verification of hybrid systems that arise in air traffic management systems and other aerospace applications. All of the basic linearity, integrability, boundedness, and continuity properties of the integral calculus were proved. The work culminated in the proof of the Fundamental Theorem of Calculus. There is a brief discussion about why mechanically checked proofs are so much longer than standard mathematics textbook proofs.
Fuzzy logic control of nuclear power plant
International Nuclear Information System (INIS)
Yao Liangzhong; Guo Renjun; Ma Changwen
1996-01-01
The main advantage of fuzzy logic control is that the method does not require a detailed mathematical model of the object to be controlled. In this paper, the shortcomings and limitations of the model-based method in nuclear power plant control were presented, the theory of fuzzy logic control was briefly introduced, and applications of fuzzy logic control technology in nuclear power plant control were surveyed. Finally, the problems to be solved in applying fuzzy logic control in nuclear power plants were discussed.
RTL2RTL Formal Equivalence: Boosting the Design Confidence
Directory of Open Access Journals (Sweden)
M V Achutha Kiran Kumar
2014-07-01
Full Text Available Increasing design complexity driven by feature and performance requirements and Time to Market (TTM) constraints forces a faster design and validation closure. This in turn enforces novel ways of identifying and debugging behavioral inconsistencies early in the design cycle. Addition of incremental features and timing fixes may alter the legacy design behavior and inadvertently result in undesirable bugs. The most common method of verifying the correctness of a changed design is to run a dynamic regression test suite before and after the intended changes and compare the results, a method which is not exhaustive. Modern Formal Verification (FV) techniques involving new methods of proving Sequential Hardware Equivalence enable a new set of solutions for this problem, with a complete coverage guarantee. Formal Equivalence can be applied to prove functional integrity after design changes resulting from a wide variety of reasons, ranging from simple pipeline optimizations to complex logic redistributions. We present here our experience of successfully applying RTL to RTL (RTL2RTL) Formal Verification across a wide spectrum of problems on a graphics design. RTL2RTL FV allowed the design sanity to be checked in a very short time, enabling faster and safer design churn. The techniques presented in this paper are applicable to any complex hardware design.
Directory of Open Access Journals (Sweden)
P N Johnson-Laird
2010-10-01
Full Text Available An old view in logic going back to Aristotle is that an inference is valid in virtue of its logical form. Many psychologists have adopted the same point of view about human reasoning: the first step is to recover the logical form of an inference, and the second step is to apply rules of inference that match these forms in order to prove that the conclusion follows from the premises. The present paper argues against this idea. The logical form of an inference transcends the grammatical forms of the sentences used to express it, because logical form also depends on context. Context is not readily expressed in additional premises. And the recovery of logical form leads ineluctably to the need for infinitely many axioms to capture the logical properties of relations. An alternative theory is that reasoning depends on mental models, and this theory obviates the need to recover logical form.
Anticoincidence logic using PALs
International Nuclear Information System (INIS)
Bolanos, L.; Arista Romeu, E.
1997-01-01
This paper describes the functioning principle of an anticoincidence logic and a design based on programmable logic. The circuit was included in a discriminator of an instrument for single-photon absorptiometry.
Pingbo, An; Li, Wang; Hongxi, Lu; Zhiguo, Yu; Lei, Liu; Xin, Xi; Lixia, Zhao; Junxi, Wang; Jinmin, Li
2016-06-01
The internal quantum efficiency (IQE) of light-emitting diodes can be calculated as the ratio of the external quantum efficiency (EQE) to the light extraction efficiency (LEE). The EQE can be measured experimentally, but the LEE is difficult to calculate due to complicated LED structures. In this work, a model was established to calculate the LEE by combining the transfer matrix formalism and an in-plane ray tracing method. With the calculated LEE, the IQE was determined and was in good agreement with that obtained by the ABC model and the temperature-dependent photoluminescence method. The proposed method makes the determination of the IQE more practical and convenient. Project supported by the National Natural Science Foundation of China (Nos. 11574306 and 61334009), the China International Science and Technology Cooperation Program (No. 2014DFG62280), and the National High Technology Program of China (No. 2015AA03A101).
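The ratio at the core of this determination is simple to state. A minimal sketch; the example EQE and LEE values are hypothetical, not from the paper:

```python
def internal_quantum_efficiency(eqe, lee):
    """IQE = EQE / LEE: external quantum efficiency divided by the
    light extraction efficiency (a fraction in (0, 1])."""
    if not 0.0 < lee <= 1.0:
        raise ValueError("LEE must lie in (0, 1]")
    return eqe / lee

# Hypothetical measured EQE of 45% with a modeled LEE of 60%.
iqe = internal_quantum_efficiency(eqe=0.45, lee=0.60)
```

The hard part, as the abstract notes, is obtaining the LEE itself; once the optical model supplies it, the IQE follows from this one division.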
Game, set, maths : formal investigations into logic with imperfect information
Dechesne, F.
2005-01-01
With his book "The Principles of Mathematics Revisited" (1996), Jaakko Hintikka intends to "wake [his] fellow philosophers of mathematics from their skeptical slumbers, and to point out to them a wealth of new constructive possibilities in the foundations of mathematics" (p. ix). The latter is
Riedinger, Kelly; Marbach-Ad, Gili; McGinnis, J. Randy; Hestness, Emily; Pease, Rebecca
2011-01-01
We investigated curricular and pedagogical innovations in an undergraduate science methods course for elementary education majors at the University of Maryland. The goals of the innovative elementary science methods course included: improving students' attitudes toward and views of science and science teaching, to model innovative science teaching…
On the Predictability of Classical Propositional Logic
Finger, Marcelo; Reis, Poliana
2013-01-01
In this work we provide a statistical form of empirical analysis of classical propositional logic decision methods called SAT solvers. This work is perceived as an empirical counterpart of a theoretical movement, called the enduring scandal of deduction, that opposes considering Boolean Logic as trivial in any sense. For that, we study the predictability of classical logic, which we take to be the distribution of the runtime of its decision process. We present a series of experiments that det...
Connections among quantum logics
International Nuclear Information System (INIS)
Lock, P.F.; Hardegree, G.M.
1985-01-01
This paper gives a brief introduction to the major areas of work in quantum event logics: manuals (Foulis and Randall) and semi-Boolean algebras (Abbott). The two theories are compared, and the connection between quantum event logics and quantum propositional logics is made explicit. In addition, the work on manuals provides us with many examples of results stated in Part I. (author)
Manca, V.; Salibra, A.; Scollo, Giuseppe
1990-01-01
Equational type logic is an extension of (conditional) equational logic, that enables one to deal in a single, unified framework with diverse phenomena such as partiality, type polymorphism and dependent types. In this logic, terms may denote types as well as elements, and atomic formulae are either
DEFF Research Database (Denmark)
Xue, Bingtian; Larsen, Kim Guldstrand; Mardare, Radu Iulian
2015-01-01
We introduce Concurrent Weighted Logic (CWL), a multimodal logic for concurrent labeled weighted transition systems (LWSs). The synchronization of LWSs is described using dedicated functions that, in various concurrency paradigms, allow us to encode the compositionality of LWSs. To reflect these......-completeness results for this logic. To complete these proofs we involve advanced topological techniques from Model Theory....
Bergstra, J.A.
2011-01-01
Four options for assigning a meaning to Islamic Logic are surveyed including a new proposal for an option named "Real Islamic Logic" (RIL). That approach to Islamic Logic should serve modern Islamic objectives in a way comparable to the functionality of Islamic Finance. The prospective role of RIL
Cosmic logic: a computational model
International Nuclear Information System (INIS)
Vanchurin, Vitaly
2016-01-01
We initiate a formal study of logical inferences in the context of the measure problem in cosmology, or what we call cosmic logic. We describe a simple computational model of cosmic logic suitable for the analysis of, for example, discretized cosmological systems. The construction is based on a particular model of computation, developed by Alan Turing, with cosmic observers (CO), cosmic measures (CM) and cosmic symmetries (CS) described by Turing machines. CO machines always start with a blank tape, and CM machines take a CO's Turing number (also known as description number or Gödel number) as input and output the corresponding probability. Similarly, CS machines take CO Turing numbers as input, but output either one if the CO machines are in the same equivalence class or zero otherwise. We argue that CS machines are more fundamental than CM machines and, thus, should be used as building blocks in constructing CM machines. We prove the non-computability of a CS machine which discriminates between two classes of CO machines: mortal ones that halt in finite time and immortal ones that run forever. In the context of eternal inflation this result implies that it is impossible to construct CM machines to compute probabilities on the set of all CO machines using cut-off prescriptions. The cut-off measures can still be used if the set is reduced to include only machines which halt after a finite and predetermined number of steps.
Educational Software for First Order Logic Semantics in Introductory Logic Courses
Mauco, María Virginia; Ferrante, Enzo; Felice, Laura
2014-01-01
Basic courses on logic are common in most computer science curricula. Students often have difficulties in handling formalisms and getting familiar with them. Educational software helps to motivate and improve the teaching-learning processes. Therefore, incorporating these kinds of tools becomes important, because they contribute to gaining…
Formal Testing of Correspondence Carrying Software
Bujorianu, M.C.; Bujorianu, L.M.; Maharaj, S.
2008-01-01
Nowadays formal software development is characterised by the use of a multitude of formal specification languages. Test case generation from formal specifications depends in general on a specific language, and, moreover, there are competing methods for each language. There is a need for a generic approach to
Towards Safe Navigation by Formalizing Navigation Rules
Directory of Open Access Journals (Sweden)
Arne Kreutzmann
2013-06-01
Full Text Available One crucial aspect of safe navigation is to obey all applicable navigation regulations, in particular the collision regulations issued by the International Maritime Organization (IMO Colregs). Therefore, decision support systems for navigation need to respect the Colregs, and this feature should be verifiably correct. We tackle compliance with navigation regulations from the perspective of software verification. One common approach is to use formal logic, but it requires bridging a wide gap between navigation concepts and simple logic. We introduce a novel domain specification language based on a spatio-temporal logic that allows us to overcome this gap. We are able to capture complex navigation concepts in an easily comprehensible representation that can directly be utilized by various bridge systems and that allows for software verification.
Directory of Open Access Journals (Sweden)
S. Psakhie
2013-04-01
Full Text Available A general approach to the realization of models of elasticity, plasticity and fracture of heterogeneous materials within the framework of particle-based numerical methods is proposed in the paper. It is based on building many-body forces of particle interaction, which provide a response of the particle ensemble correctly conforming to the response (including elastic-plastic behavior and fracture) of the simulated solids. Implementation of the proposed approach within particle-based methods is demonstrated by the example of the movable cellular automaton (MCA) method, which integrates the possibilities of the particle-based discrete element method (DEM) and cellular automaton methods. Emergent advantages of the developed approach to the formulation of many-body interaction are discussed. Chief among them are its applicability to various realizations of the concept of discrete elements and the possibility of realizing various rheological models (including elastic-plastic or visco-elastic-plastic) and models of fracture to study deformation and fracture of solid-phase materials and media. The capabilities of particle-based modeling of heterogeneous solids are demonstrated by the problem of simulation of deformation and fracture of particle-reinforced metal-ceramic composites.
Nilsson, Annika; Engström, Maria
2015-05-06
Among staff working in elderly care, a considerable proportion lack formal competence for their work. Lack of formal competence, in turn, has been linked to higher staff ratings of stress symptoms, sleep disturbances and workload. 1) To describe the strengths and weaknesses of an e-assessment and subsequent e-training program used among elderly care staff who lack formal competence and 2) to study the effects of an e-training program on staff members' working life (quality of care and psychological and structural empowerment) and well-being (job satisfaction and psychosomatic health). The hypothesis was that staff who had completed the e-assessment and the e-training program would rate greater improvements in working life and well-being than would staff who had only participated in the e-assessments. An intervention study with a mixed-methods approach using quantitative (2010-2011) and qualitative data (2011) was conducted in Swedish elderly care. Participants included a total of 41 staff members. To describe the strengths and weaknesses of the e-assessment and the e-training program, qualitative data were gathered using semi-structured interviews together with a study-specific questionnaire. To study the effects of the intervention, quantitative data were collected using questionnaires on: job satisfaction, psychosomatic health, psychological empowerment, structural empowerment and quality of care in an intervention and a comparison group. Staff who completed the e-assessments and the e-training program primarily experienced strengths associated with this approach. The results were also in line with our hypotheses: Staff who completed the e-assessment and the e-training program rated improvements in their working life and well-being. Use of the e-assessments and e-training program employed in the present study could be one way to support elderly care staff who lack formal education by increasing their competence; increased competence, in turn, could improve their
Czech Academy of Sciences Publication Activity Database
Horčík, Rostislav; Cintula, Petr
2004-01-01
Roč. 43, - (2004), s. 477-503 ISSN 1432-0665 R&D Projects: GA AV ČR IAA1030004; GA ČR GA201/02/1540 Grant - others:GA CTU(CZ) project 0208613; net CEEPUS(SK) SK-042 Institutional research plan: CEZ:AV0Z1030915 Keywords : fuzzy logic * many-valued logic * Lukasiewicz logic * Lpi logic * Takeuti-Titani logic * MV-algebras * product MV-algebras Subject RIV: BA - General Mathematics Impact factor: 0.295, year: 2004
DEFF Research Database (Denmark)
Blackburn, Patrick Rowan; Huertas, Antonia; Manzano, Maria
2014-01-01
Leon Henkin was not a modal logician, but there is a branch of modal logic that has been deeply influenced by his work. That branch is hybrid logic, a family of logics that extend orthodox modal logic with special proposition symbols (called nominals) that name worlds. This paper explains why...... Henkin’s techniques are so important in hybrid logic. We do so by proving a completeness result for a hybrid type theory called HTT, probably the strongest hybrid logic that has yet been explored. Our completeness result builds on earlier work with a system called BHTT, or basic hybrid type theory...... is due to the first-order perspective, which lies at the heart of Henkin’s best known work and hybrid logic....
Directory of Open Access Journals (Sweden)
Newton C. A. da Costa
2002-12-01
Full Text Available In view of the present state of development of non-classical logic, especially of paraconsistent logic, a new stand regarding the relations between logic and ontology is defended. In a parody of a dictum of Quine, my stand may be summarized as follows: to be is to be the value of a variable in a specific language with a given underlying logic. Yet my stand differs from Quine's because, among other reasons, I accept some first-order heterodox logics as genuine alternatives to classical logic. I also discuss some questions of non-classical logic to substantiate my argument, and suggest that my position complements and extends some ideas advanced by L. Apostel.
Institutional Logics in Action
DEFF Research Database (Denmark)
Lounsbury, Michael; Boxenbaum, Eva
2013-01-01
This double volume presents state-of-the-art research and thinking on the dynamics of actors and institutional logics. In the introduction, we briefly sketch the roots and branches of institutional logics scholarship before turning to the new buds of research on the topic of how actors engage...... institutional logics in the course of their organizational practice. We introduce an exciting line of new works on the meta-theoretical foundations of logics, institutional logic processes, and institutional complexity and organizational responses. Collectively, the papers in this volume advance the very...... prolific stream of research on institutional logics by deepening our insight into the active use of institutional logics in organizational action and interaction, including the institutional effects of such (inter)actions....
Freedom and enforcement in action a study in formal action theory
Czelakowski, Janusz
2015-01-01
Action theory is the object of growing attention in a variety of scientific disciplines, and this is the first volume to offer a synthetic view of the range of approaches possible in the topic. The volume focuses on the nexus of formal action theory with a startlingly diverse set of subjects, which range from logic, linguistics, artificial intelligence, and automata theory to jurisprudence, deontology, and economics. It covers semantic, mathematical and logical aspects of action, showing how the problem of action breaks the boundaries of traditional branches of logic located in syntactics and semantics and now lies on the borderline between logical pragmatics and praxeology. The chapters here focus on specialized tasks in formal action theory, beginning with a thorough description and formalization of the language of action, and moving through material on the differing models of action theory to focus on probabilistic models, the relations of formal action theory to deontic logic, and its key appl...
Logic in the curricula of Computer Science
Directory of Open Access Journals (Sweden)
Margareth Quindeless
2014-12-01
Full Text Available The aim of programs in Computer Science is to educate and train students to understand problems and build systems that solve them. This process involves applying a special kind of reasoning to model interactions, capabilities, and limitations of the components involved. A good curriculum must involve the use of tools to assist in these tasks, and one that could be considered fundamental is logic, because with it students develop the necessary reasoning. Moreover, software developers analyze the behavior of programs during design, debugging, and testing; hardware designers perform minimization and equivalence verification of circuits; designers of operating systems validate routing protocols, scheduling, and synchronization; and formal logic underlies all these activities. Therefore, a strong background in applied logic would help students develop or strengthen their ability to reason about complex systems. Unfortunately, few curricula provide proper training in logic. Most include only one or two courses in Discrete Mathematics, which in a few weeks cover truth tables and the propositional calculus, and nothing more. This is not enough, and higher-level courses in which these and many other logical concepts are applied are needed. Otherwise, students will not see the importance of logic in their careers, and curriculum committees need to modify or adapt the curriculum to reverse this situation.
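The truth-table material that such introductory courses cover can be mechanized in a few lines. A minimal sketch of exhaustive propositional evaluation; the helper names and the example formula are illustrative only:

```python
from itertools import product

def truth_table(variables, formula):
    """Enumerate all truth assignments for the given variables and
    evaluate a propositional formula, given as a Python predicate
    over a dict of truth values."""
    rows = []
    for values in product([False, True], repeat=len(variables)):
        env = dict(zip(variables, values))
        rows.append((env, formula(env)))
    return rows

def is_tautology(variables, formula):
    """A formula is a tautology iff it is true in every row."""
    return all(result for _, result in truth_table(variables, formula))

# Modus ponens as a formula: ((p -> q) and p) -> q,
# encoding the conditional a -> b as (not a) or b.
mp = lambda e: (not ((not e["p"] or e["q"]) and e["p"])) or e["q"]
```

A course exercise might then ask students to verify classic validities such as modus ponens with `is_tautology(["p", "q"], mp)`.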
Pragmatics for formal semantics
DEFF Research Database (Denmark)
Danvy, Olivier
2011-01-01
This tech talk describes how to write and how to inter-derive formal semantics for sequential programming languages. The progress reported here is (1) concrete guidelines to write each formal semantics to alleviate their proof obligations, and (2) simple calculational tools to obtain a formal...
Directory of Open Access Journals (Sweden)
Ali Selamat
2012-01-01
Full Text Available The sensitivity-based linear learning method (SBLLM) has recently been used as a predictive tool due to its unique characteristics and performance, particularly its high stability and consistency during predictions. However, the generalisation capability of SBLLM is sometimes limited depending on the nature of the dataset, particularly on whether uncertainty is present in the dataset or not. Since it makes use of sensitivity analysis on the datasets used, it is naturally prone to being affected by the nature of the dataset. In order to reduce the effects of uncertainties in SBLLM prediction and improve its generalisation ability, this paper proposes a hybrid system through the unique combination of type-2 fuzzy logic systems (type-2 FLSs) and SBLLM; thereafter, the hybrid system was used to model PVT properties of crude oil systems. A type-2 FLS was chosen in order to better handle uncertainties existing in datasets beyond the capability of type-1 fuzzy logic systems. In the proposed hybrid, the type-2 FLS is used to handle uncertainties in reservoir data, so that the cleaned data from the type-2 FLS are then passed to the SBLLM for training; final prediction using the testing dataset follows. Comparative studies have been carried out to compare the performance of the newly proposed T2-SBLLM hybrid system with each of the constituent type-2 FLS and SBLLM. Empirical results from simulation show that the proposed T2-SBLLM hybrid system has greatly improved upon the performance of SBLLM, while also maintaining a performance above that of the type-2 FLS.
Energy Technology Data Exchange (ETDEWEB)
Meyer, L.; Witzel, G.; Ghez, A. M. [Department of Physics and Astronomy, University of California, Los Angeles, CA 90095-1547 (United States); Longstaff, F. A. [UCLA Anderson School of Management, University of California, Los Angeles, CA 90095-1481 (United States)
2014-08-10
Continuously time variable sources are often characterized by their power spectral density and flux distribution. These quantities can undergo dramatic changes over time if the underlying physical processes change. However, some changes can be subtle and not distinguishable using standard statistical approaches. Here, we report a methodology that aims to identify distinct but similar states of time variability. We apply this method to the Galactic supermassive black hole, where 2.2 μm flux is observed from a source associated with Sgr A* and where two distinct states have recently been suggested. Our approach is taken from mathematical finance and works with conditional flux density distributions that depend on the previous flux value. The discrete, unobserved (hidden) state variable is modeled as a stochastic process and the transition probabilities are inferred from the flux density time series. Using the most comprehensive data set to date, in which all Keck and a majority of the publicly available Very Large Telescope data have been merged, we show that Sgr A* is sufficiently described by a single intrinsic state. However, the observed flux densities exhibit two states: noise dominated and source dominated. Our methodology reported here will prove extremely useful to assess the effects of the putative gas cloud G2 that is on its way toward the black hole and might create a new state of variability.
Towards a Formal Model of Privacy-Sensitive Dynamic Coalitions
Directory of Open Access Journals (Sweden)
Sebastian Bab
2012-04-01
Full Text Available The concept of dynamic coalitions (also virtual organizations) describes the temporary interconnection of autonomous agents who share information or resources in order to achieve a common goal. Through modern technologies these coalitions may form across company, organization and system borders. Therefore, questions of access control and security are of vital significance for the architectures supporting these coalitions. In this paper, we present our first steps toward a formal framework for modeling and verifying the design of privacy-sensitive dynamic coalition infrastructures and their processes. In order to do so, we extend existing dynamic coalition modeling approaches with an access-control concept which manages access to information through policies. Furthermore, we consider the processes underlying these coalitions and present initial work on formalizing them. As a result, we illustrate the usefulness of the Abstract State Machine (ASM) method for this task. We demonstrate a formal treatment of privacy-sensitive dynamic coalitions by two example ASMs which model certain access-control situations. A logical consideration of these ASMs can lead to a better understanding and a verification of the ASMs against the desired specification.
International Nuclear Information System (INIS)
Wong, W.H.; Li, H.; Uribe, J.
1998-01-01
A new method for processing signals from Anger position-sensitive detectors used in gamma cameras and PET is proposed for very high count-rate imaging, where multiple-event pileups are the norm. This method is designed to sort out and recover every impinging event from multiple-event pileups while maximizing the collection of scintillation signal for every event, to achieve optimal accuracy in the measurement of energy and position. For every detected event, this method cancels the remnant signals from previous events and excludes the pileup of signals from following events. The remnant subtraction is exact even for multiple pileup events. A prototype circuit for energy recovery demonstrated that the maximum count rates can be increased by more than 10 times compared to the pulse-shaping method, while the energy resolution is as good as pulse shaping (or fixed integration) at low count rates. At 2 × 10^6 events/sec on NaI(Tl), the true counts acquired with this method are 3.3 times more than with the delay-line clipping method (256 ns clipping), due to events recovered from pileups. Pulse-height spectra up to 3.5 × 10^6 events/sec have been studied. Monte Carlo simulation studies have been performed for image-quality comparisons between different processing methods.
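The key idea — exact cancellation of a previous event's remnant over the integration window of the current one — can be illustrated for a single-exponential scintillation model. The decay constant, amplitudes, and window below are illustrative assumptions, not values or circuitry from the paper.

```python
import math

TAU = 230e-9  # assumed NaI(Tl) scintillation decay constant (s)

def pulse(amplitude, t0, t):
    """Scintillation light rate at time t from an event at t0 (single exponential)."""
    return 0.0 if t < t0 else (amplitude / TAU) * math.exp(-(t - t0) / TAU)

# Two piled-up events (illustrative amplitudes): integrate the second
# event's window numerically, then cancel the first event's remnant
# analytically -- for exponential decay the subtraction is exact.
a1, t1 = 1000.0, 0.0
a2, t2 = 800.0, 300e-9
T_INT = 4 * TAU                      # integration window for event 2
dt = 1e-10
window = [t2 + k * dt for k in range(int(T_INT / dt))]
raw = sum((pulse(a1, t1, t) + pulse(a2, t2, t)) * dt for t in window)
remnant1 = a1 * (math.exp(-(t2 - t1) / TAU) - math.exp(-(t2 + T_INT - t1) / TAU))
collected2 = raw - remnant1          # charge attributable to event 2 alone
print(round(collected2 / (a2 * (1 - math.exp(-T_INT / TAU))), 3))  # → 1.0
```

The recovered charge matches the isolated-event value to within the numerical integration error, which is the sense in which the remnant subtraction is "exact".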
Stökl, A.
2008-11-01
Context: In spite of all the advances in multi-dimensional hydrodynamics, investigations of stellar evolution and stellar pulsations still depend on one-dimensional computations. This paper devises an alternative to the mixing-length theory or turbulence models usually adopted in modelling convective transport in such studies. Aims: The present work attempts to develop a time-dependent description of convection, which reflects the essential physics of convection and that is only moderately dependent on numerical parameters and far less time consuming than existing multi-dimensional hydrodynamics computations. Methods: Assuming that the most extensive convective patterns generate the majority of convective transport, the convective velocity field is described using two parallel, radial columns to represent up- and downstream flows. Horizontal exchange, in the form of fluid flow and radiation, over their connecting interface couples the two columns and allows a simple circulating motion. The main parameters of this convective description have straightforward geometrical meanings, namely the diameter of the columns (corresponding to the size of the convective cells) and the ratio of the cross-section between up- and downdrafts. For this geometrical setup, the time-dependent solution of the equations of radiation hydrodynamics is computed from an implicit scheme that has the advantage of being unaffected by the Courant-Friedrichs-Lewy time-step limit. This implementation is part of the TAPIR-Code (short for The adaptive, implicit RHD-Code). Results: To demonstrate the approach, results for convection zones in Cepheids are presented. The convective energy transport and convective velocities agree with expectations for Cepheids and the scheme reproduces both the kinetic energy flux and convective overshoot. A study of the parameter influence shows that the type of solution derived for these stars is in fact fairly robust with respect to the constitutive numerical
On Logic and Standards for Structuring Documents
Eyers, David M.; Jones, Andrew J. I.; Kimbrough, Steven O.
The advent of XML has been widely seized upon as an opportunity to develop document representation standards that lend themselves to automated processing. This is a welcome development and much good has come of it. That said, present standardization efforts may be criticized on a number of counts. We explore two issues associated with document XML standardization efforts. We label them (i) the dynamic point and (ii) the logical point. Our dynamic point is that in many cases experience has shown that the search for a final, or even reasonably permanent, document representation standard is futile. The case is especially strong for electronic data interchange (EDI). Our logical point is that formalization into symbolic logic is materially helpful for understanding and designing dynamic document standards.
The Quantum Logical Challenge: Peter Mittelstaedt's Contributions to Logic and Philosophy of Science
Beltrametti, E.; Dalla Chiara, M. L.; Giuntini, R.
2017-12-01
Peter Mittelstaedt's contributions to quantum logic and to the foundational problems of quantum theory have significantly realized the most authentic spirit of the International Quantum Structures Association: an original research about hard technical problems, which are often "entangled" with the emergence of important changes in our general world-conceptions. During a time where both the logical and the physical community often showed a skeptical attitude towards Birkhoff and von Neumann's quantum logic, Mittelstaedt brought into light the deeply innovating features of a quantum logical thinking that allows us to overcome some strong and unrealistic assumptions of classical logical arguments. Later on his intense research on the unsharp approach to quantum theory and to the measurement problem stimulated the increasing interest for unsharp forms of quantum logic, creating a fruitful interaction between the work of quantum logicians and of many-valued logicians. Mittelstaedt's general views about quantum logic and quantum theory seem to be inspired by a conjecture that is today more and more confirmed: there is something universal in the quantum theoretic formalism that goes beyond the limits of microphysics, giving rise to interesting applications to a number of different fields.
The latitude of logic in legal hermeneutics
Directory of Open Access Journals (Sweden)
Medar Suzana
2014-01-01
Full Text Available Legal hermeneutics (the interpretation of law) has always taken a highly significant place in general hermeneutics. The interpretation of laws involves an intricate task of determining the real meaning or rationale of legal norms. Considering the complexity of this goal, the most frequent classification of legal hermeneutics is based on the interpretation instruments. In traditional theory, the most widely recognized instruments for the interpretation of legal norms are language, logic, legal system, history and purpose of a legal norm. Under the influence of general analytic philosophy, a particular interest in language as the basic instrument for the interpretation of law may be found in the mid-20th century. The interest in the language of law is closely related to the study of legal logic and legal argumentation. In theory, there is no dispute about the logical interpretation in a narrow sense, which is based on drawing true conclusions by applying the basic rules of formal reasoning. Yet, it has given a head start to argumentation as 'a problem-based reasoning skill' which provides answers to the questions raised in contentious cases. Argumentation is closely associated with the dialectic method of reasoning (widely recognized since Ancient Greece), where conclusions are based on probable premises. One of the most significant goals of argumentation theory is to locate the sources or common grounds for developing arguments; these basic argumentative patterns are generally known as 'topoi' or 'loci, sedes argumentorum'. On the other hand, 'topica' is the part of rhetorical art dealing with the theoretical explanation of the basic argumentative patterns (topoi) and how they are structured, including the location of new topoi and arguments. The most significant proponents of topical reasoning are Chaïm Perelman and Theodor Viehweg. Perelman relates topical reasoning to judicial reasoning and considers that specific legal topoi
The formal operations: Piaget’s concept, researches and main critics
Directory of Open Access Journals (Sweden)
Stepanović Ivana Ž.
2004-01-01
Full Text Available This paper deals with Piaget's concept of formal operations, research on formal operations, and criticism related to the concept. The first part of the work is dedicated to the formal operations concept. The main characteristics of formal operational thought and the structure of formal operations, as well as the structure's logical model, are presented in that part of the work. The second part is a review of research on formal operations and is divided in three parts: (1) problems addressed by the research, (2) characteristics of the applied methodology, and (3) the authors' approaches as a specific research context. In the last part of the work the main criticisms of the formal operations concept are presented and discussed.
International Nuclear Information System (INIS)
Letkovicova, M.; Rehak, R.; Korec, J.; Mihaly, B.; Prikazsky, V.
1998-01-01
Our paper examines the surrounding areas of NPPs through the proportion of premature death-rate, which is one of the complex indicators of the health situation of the population. Specifically, attention is focused on the NPP in Bohunice (SE-EBO), which has been in operation for the last 30 years, and NPP Mochovce (SE-EMO), which was still under construction when data was collected. WHO considers every death of an individual before 65 years of age a premature death, except deaths of children younger than 1 year. Because of the diversity of the population, this factor is standardized for the population of the Slovak Republic (SR) as well as for the European population. The objective of the work is to show that even long-term production of energy in an NPP does not cause health problems for the population living in the surrounding areas that could be detected through analysis of premature deaths. Using the fuzzy-logic method when searching for similar objects and evaluating the influence of the NPP on its surrounding area seems more natural than the classical clustering method, which separates objects into groups. With classical clustering, objects in a particular cluster are more similar than two objects in different clusters; with the fuzzy-logic method, similarity is defined more naturally. Within the observed regions of the NPPs, the percentage of directly standardized premature deaths is almost identical with the average for the SR. The most closely observed region of SE-EMO, the zone up to 5 kilometers, even shows the lowest percentage. We also did not record any areas with unfavourable values from the perspective of wind streams, nor from the local water-stream recipients of SE-EBO, Manivier and Dudvah. The region of SE-EMO is also within the SR average; unfavourable coherent areas of premature deaths are non-existent. The Galanta city region comes out of the comparison with the relatively worse
Directory of Open Access Journals (Sweden)
Schang Fabien
2017-03-01
Full Text Available An analogy is made between two rather different domains: logic, and football (or soccer). Starting from a comparative table between the two activities, an alternative explanation of logic is given in terms of players, ball, goal, and the like. Our main thesis is that, just as the task of logic is to preserve truth from the premises to the conclusion, footballers strive to keep the ball all the way to the opposing goal. Assuming this analogy may help us think about logic in the same way as in dialogical logic, but it should also present truth-values in an alternative sense, as speech-acts occurring in a dialogue. The relativity of truth-values is thereby brought into focus, leading to an additional variety of logical pluralism.
The Leibniz principle in quantum logic
International Nuclear Information System (INIS)
Giuntini, R.; Mittelstaedt, P.
1989-01-01
The principle of the identity of indiscernibles (Leibniz Principle) is investigated within the framework of the formal language of quantum physics, which is given by an orthomodular lattice. The authors show that the validity of this principle is based on very strong preconditions (concerning the existence of convenient predicates) which are given in the language of classical physics but which cannot be fulfilled in orthomodular quantum logic
FUZZY LOGIC IN LEGAL EDUCATION
Directory of Open Access Journals (Sweden)
Z. Gonul BALKIR
2011-04-01
Full Text Available The necessity of examining every case within its peculiar conditions in the social sciences requires different approaches complying with the spirit and nature of the social sciences. Multiple realities require different and various perceptual interpretations. In the modern world and in the social sciences, the interpretation of valued and multi-valued perception has come to be understood through the principles of fuzziness and fuzzy logic. Having verbally expressible degrees of truth such as true, very true, rather true, etc., fuzzy logic provides the opportunity to interpret especially complex and rather vague sets of information through the flexibility or equivalence of the variables of fuzzy limitations. The methods and principles of fuzzy logic can be of benefit in examining the methodological problems of law, especially in filling the legal loopholes arising from ambiguities and interpretation problems, in order to understand legal rules in a more comprehensible and applicable way and to improve the efficiency of legal implications. On the other hand, fuzzy logic can be used as a technical legal method in legal education, and especially in legal case studies and legal practice applications, in order to convey the perception of law as a value and a more comprehensive, higher-quality perception and interpretation of the value of justice, which is the core value of law. In perceiving what happened as it has happened in legal relationships and formations, understanding social reality and sociological legal rules from a multi-valued perspective, and applying them in accordance with the methods of fuzzy logic, could produce more equivalent and just results. It can be useful for young lawyers and law students as a facilitating legal method, especially in materializing the perception and interpretation of multi-valued concepts and variables. Using methods and principles of fuzzy logic in legal
Kim, Changhwa; Shin, DongHyun
2017-05-12
There are wireless networks in which communications are typically unsafe. Most terrestrial wireless sensor networks belong to this category of networks. Another example of an unsafe communication network is an underwater acoustic sensor network (UWASN). In UWASNs in particular, communication failures occur frequently, and failure durations can range from seconds up to a few hours, days, or even weeks. These communication failures can cause data losses significant enough to seriously damage human life or property, depending on the application area. In this paper, we propose a framework to reduce sensor data loss during communication failures, and we present a formal approach to the Selection by Minimum Error and Pattern (SMEP) method, which plays the most important role in the reduction of sensor data loss under the proposed framework. The SMEP method is compared with other methods to validate its effectiveness through experiments using real-field sensor data sets. Based on our experimental results and performance comparisons, the SMEP method has been validated to be better than the others in terms of the average sensor-data-value error rate caused by sensor data loss.
Orthogonal Algorithm of Logic Probability and Syndrome-Testable Analysis
Institute of Scientific and Technical Information of China (English)
无
1990-01-01
A new method, the orthogonal algorithm, is presented to compute logic probabilities (i.e. signal probabilities) accurately. The transfer properties of logic probabilities are studied first, which are useful for calculating the logic probability of a circuit with random independent inputs. Then the orthogonal algorithm is described to compute the logic probability of a Boolean function realized by a combinational circuit. This algorithm makes the Boolean function “orthogonal” so that the logic probabilities can be easily calculated by summing up the logic probabilities of all orthogonal terms of the Boolean function.
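The summation over orthogonal (mutually disjoint) terms can be sketched with a brute-force example, using minterms as the extreme case of an orthogonal cover: minterms never overlap, so the signal probability is just the sum of their probabilities. This is an illustration of the underlying identity, not the paper's efficient algorithm for producing a compact orthogonal form.

```python
from itertools import product

def signal_probability(f, p):
    """Exact signal probability of Boolean function f by summing the
    probabilities of its disjoint (orthogonal) minterms, given
    independent input probabilities p[var] = P(var = 1)."""
    vars_ = sorted(p)
    total = 0.0
    for bits in product([0, 1], repeat=len(vars_)):
        assign = dict(zip(vars_, bits))
        if f(assign):
            term_p = 1.0
            for v in vars_:
                term_p *= p[v] if assign[v] else (1 - p[v])
            total += term_p
    return total

# f = a OR b with independent inputs: P = pa + pb - pa*pb
p = {"a": 0.3, "b": 0.5}
print(signal_probability(lambda s: s["a"] | s["b"], p))  # → 0.65
```

Because the minterms are orthogonal, no inclusion-exclusion correction is needed; an efficient implementation would instead orthogonalize a smaller term cover of the circuit's function.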
International Nuclear Information System (INIS)
Wall, M.J.W.
1992-01-01
The notion of "probability" is generalized to that of "likelihood," and a natural logical structure is shown to exist for any physical theory which predicts likelihoods. Two physically based axioms are given for this logical structure to form an orthomodular poset, with an order-determining set of states. The results strengthen the basis of the quantum logic approach to axiomatic quantum theory. 25 refs
Logical database design principles
Garmany, John; Clark, Terry
2005-01-01
INTRODUCTION TO LOGICAL DATABASE DESIGN: Understanding a Database; Database Architectures; Relational Databases; Creating the Database; System Development Life Cycle (SDLC); Systems Planning: Assessment and Feasibility; System Analysis: Requirements; System Analysis: Requirements Checklist; Models Tracking and Schedules; Design Modeling; Functional Decomposition Diagram; Data Flow Diagrams; Data Dictionary; Logical Structures and Decision Trees; System Design: Logical. SYSTEM DESIGN AND IMPLEMENTATION: The ER Approach; Entities and Entity Types; Attribute Domains; Attributes; Set-Valued Attributes; Weak Entities; Constraint
Czech Academy of Sciences Publication Activity Database
Peliš, Michal
2017-01-01
Roč. 26, č. 3 (2017), s. 357-381 ISSN 1425-3305 R&D Projects: GA ČR(CZ) GC16-07954J Institutional support: RVO:67985955 Keywords : epistemic logic * erotetic implication * erotetic logic * logic of questions Subject RIV: AA - Philosophy ; Religion OBOR OECD: Philosophy, History and Philosophy of science and technology http://apcz.umk.pl/czasopisma/index.php/LLP/article/view/LLP.2017.007
Pereyra, Nicolas A.
2018-06-01
This book gives a rigorous yet 'physics-focused' introduction to mathematical logic that is geared towards natural science majors. We present the science major with a robust introduction to logic, focusing on the specific knowledge and skills that will unavoidably be needed in calculus topics and natural science topics in general (rather than taking a philosophical-math-fundamental oriented approach that is commonly found in mathematical logic textbooks).
Towards Formal Verification of a Separation Microkernel
Butterfield, Andrew; Sanan, David; Hinchey, Mike
2013-08-01
The best approach to verifying an IMA separation kernel is to use a (fixed) time-space partitioning kernel with a multiple independent levels of separation (MILS) architecture. We describe an activity that explores the cost and feasibility of doing a formal verification of such a kernel to the Common Criteria (CC) levels mandated by the Separation Kernel Protection Profile (SKPP). We are developing a Reference Specification of such a kernel, and are using higher-order logic (HOL) to construct formal models of this specification and key separation properties. We then plan to do a dry run of part of a formal proof of those properties using the Isabelle/HOL theorem prover.
DEFF Research Database (Denmark)
Blackburn, Patrick Rowan; Jørgensen, Klaus Frovin
2012-01-01
In this paper we explore the logic of now, yesterday, today and tomorrow by combining the semantic approach to indexicality pioneered by Hans Kamp [9] and refined by David Kaplan [10] with hybrid tense logic. We first introduce a special now nominal (our @now corresponds to Kamp’s original now operator N) and prove completeness results for both logical and contextual validity. We then add propositional constants to handle yesterday, today and tomorrow; our system correctly treats sentences like “Niels will die yesterday” as contextually unsatisfiable. Building on our completeness results for now, we prove completeness for the richer language, again for both logical and contextual validity.
DEFF Research Database (Denmark)
Lopez, Hugo Andres; Carbone, Marco; Hildebrandt, Thomas
2010-01-01
We explore logical reasoning for the global calculus, a coordination model based on the notion of choreography, with the aim to provide a methodology for specification and verification of structured communications. Starting with an extension of Hennessy-Milner logic, we present the global logic (GL), a modal logic describing possible interactions among participants in a choreography. We illustrate its use by giving examples of properties on service specifications. Finally, we show that, although GL is undecidable, there is a significant decidable fragment which we provide with a sound and complete proof system.
International Nuclear Information System (INIS)
Andronov, A.A.; Kurin, V.V.; Levichev, M.Yu.; Ryndyk, D.A.; Vostokov, V.I.
1993-01-01
In recent years there has been much interest in superconductor logical devices. Our paper is devoted to the analysis of some new possibilities in this field. The main problems here are minimization of the time of logical operations and reduction of device scale. Josephson systems are quite appropriate for this purpose because of their small size, short characteristic times and small energy losses. Two different types of Josephson logic have been investigated in recent years. The first type is based on the hysteretic V-A characteristic of a single Josephson junction; the superconducting and resistive (nonzero-voltage) states are treated as logical zero and logical one. The second type, rapid single flux quantum logic, has been developed recently and is based on SQUID-like bistability. The different logical states are states with different numbers of magnetic flux quanta inside a closed superconducting contour. Information is represented by voltage pulses with fixed ''area'' (∫V(t)dt). These pulses are generated when the logical state of the SQUID-like elementary cell changes. The fundamental role of magnetic flux quantization in this type of logic leads to the necessity of a large enough self-inductance of the superconducting contour, and thus to limitations on minimal device dimensions. (orig.)
Directory of Open Access Journals (Sweden)
Marco Carbone
2011-10-01
Full Text Available We explore logical reasoning for the global calculus, a coordination model based on the notion of choreography, with the aim to provide a methodology for specification and verification of structured communications. Starting with an extension of Hennessy-Milner logic, we present the global logic (GL), a modal logic describing possible interactions among participants in a choreography. We illustrate its use by giving examples of properties on service specifications. Finally, we show that, although GL is undecidable, there is a significant decidable fragment which we provide with a sound and complete proof system for checking validity of formulae.
Introduction to mathematical logic
Mendelson, Elliott
2015-01-01
The new edition of this classic textbook, Introduction to Mathematical Logic, Sixth Edition explores the principal topics of mathematical logic. It covers propositional logic, first-order logic, first-order number theory, axiomatic set theory, and the theory of computability. The text also discusses the major results of Gödel, Church, Kleene, Rosser, and Turing.The sixth edition incorporates recent work on Gödel's second incompleteness theorem as well as restoring an appendix on consistency proofs for first-order arithmetic. This appendix last appeared in the first edition. It is offered in th
Directory of Open Access Journals (Sweden)
Elena Solana-Arellano
2008-09-01
Full Text Available Seagrass beds provide much of the primary production in estuaries, host many fishes and fish larvae, and abate erosion. The present study presents original analytical methods for estimating mean leaf-growth rates of eelgrass (Zostera marina). The method was calibrated using data collected in a Z. marina meadow at Punta Banda estuary in Baja California, Mexico. The analytical assessments were based on measurements of leaf length and standard regression procedures. We present a detailed explanation of the formal procedures involved in the derivation of these analytical methods. The measured daily leaf-growth rate was 10.9 mm d-1 leaf-1; the corresponding value projected by our method was 10.2 mm d-1 leaf-1, with associated standard errors of 0.53 and 0.56 mm d-1 leaf-1, respectively. The method was validated by projecting leaf-growth rates from an independent data set, which gave consistent results. The use of the method to obtain the mean leaf-growth rate of a transplanted plot is also illustrated. Comparison of our leaf-growth data with previously reported assessments shows the significant forcing of sea-surface temperature on eelgrass leaf dynamics. The formal constructs provided here are of general scope and can be applied to equivalent eelgrass data sets in a straightforward manner. Rev. Biol. Trop. 56 (3): 1003-1013. Epub 2008 September 30.
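The regression-based estimation of a mean growth rate from leaf-length measurements can be sketched as an ordinary least-squares slope. The data below are hypothetical, and this simple fit stands in for the paper's full analytical construction, which it does not reproduce.

```python
# Hypothetical leaf-length records (mm) for one tagged leaf over days;
# the mean growth rate is the slope of an ordinary least-squares fit.
days = [0, 2, 4, 6, 8, 10]
length = [120.0, 141.0, 163.0, 184.0, 206.0, 228.0]

n = len(days)
mx = sum(days) / n
my = sum(length) / n
# OLS slope: covariance of (days, length) over variance of days.
slope = (sum((x - mx) * (y - my) for x, y in zip(days, length))
         / sum((x - mx) ** 2 for x in days))
print(round(slope, 2), "mm d-1 leaf-1")  # → 10.8 mm d-1 leaf-1
```

Averaging such slopes over many marked leaves, with a standard error attached, gives a mean rate directly comparable to the 10.9 and 10.2 mm d-1 leaf-1 figures quoted in the abstract.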
Pancardo, Pablo; Hernández-Nolasco, J. A.; Acosta-Escalante, Francisco
2018-01-01
Knowing the perceived exertion of workers during their physical activities facilitates supervisors' decision-making regarding worker allocation to the appropriate job, actions to prevent accidents, and reassignment of tasks, among others. However, although wearable heart rate sensors represent an effective way to capture perceived exertion, ergonomic methods are generic and do not consider the diffuse nature of the ranges that classify the efforts. Personalized monitoring is ne...
Directory of Open Access Journals (Sweden)
Sahbi Marrouchi
2014-01-01
Full Text Available Due to the continuous increase of population and the perpetual progress of industry, energy management is nowadays a relevant topic for researchers in electrical engineering. Indeed, in order to establish a good exploitation of the electrical grid, it is necessary to solve technical and economic problems. This can only be done through the resolution of the Unit Commitment Problem, which optimizes the combination of the production units' states and determines their production planning in order to satisfy the expected consumption at minimal cost over a specified period, usually ranging from 24 hours to one week. However, each production unit has constraints that make this problem complex, combinatorial, and nonlinear. This paper presents a comparative study between a strategy based on a hybrid gradient-genetic algorithm method and two strategies based on metaheuristic methods, fuzzy logic and a genetic algorithm, in order to predict the combinations and the unit-commitment scheduling of each production unit on the one hand, and to minimize the total production cost on the other. To test their performance, the proposed optimization strategies have been applied to the IEEE 14-bus electrical network, and the obtained results are very promising.
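The combinatorial core of unit commitment — choosing on/off states so that committed capacity covers demand at minimum cost — can be sketched by brute force for a toy case. All numbers are invented for illustration; a realistic formulation (and the paper's methods) must also handle multiple periods, ramp rates, minimum up/down times, and startup costs, which is why metaheuristics are used at scale.

```python
from itertools import product

# Toy single-period unit commitment over 3 units (illustrative data).
capacity = [100, 80, 50]     # MW available if the unit is committed
cost =     [500, 450, 320]   # running cost per period if committed
demand = 130                 # MW that must be covered

best = None
for states in product([0, 1], repeat=3):      # enumerate all 2^3 commitments
    cap = sum(c * s for c, s in zip(capacity, states))
    if cap >= demand:                         # feasibility: cover the demand
        total = sum(c * s for c, s in zip(cost, states))
        if best is None or total < best[0]:
            best = (total, states)
print(best)  # → (770, (0, 1, 1))
```

Exhaustive enumeration is exponential in the number of units, which motivates the gradient-genetic hybrid and fuzzy-logic strategies compared in the paper.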
Understanding Social Media Logic
Directory of Open Access Journals (Sweden)
José van Dijck
2013-08-01
Full Text Available Over the past decade, social media platforms have penetrated deeply into the mechanics of everyday life, affecting people's informal interactions, as well as institutional structures and professional routines. Far from being neutral platforms for everyone, social media have changed the conditions and rules of social interaction. In this article, we examine the intricate dynamic between social media platforms, mass media, users, and social institutions by calling attention to social media logic—the norms, strategies, mechanisms, and economies—underpinning its dynamics. This logic will be considered in light of what has been identified as mass media logic, which has helped spread the media's powerful discourse outside its institutional boundaries. Theorizing social media logic, we identify four grounding principles—programmability, popularity, connectivity, and datafication—and argue that these principles become increasingly entangled with mass media logic. The logic of social media, rooted in these grounding principles and strategies, is gradually invading all areas of public life. Besides print news and broadcasting, it also affects law and order, social activism, politics, and so forth. Therefore, its sustaining logic and widespread dissemination deserve to be scrutinized in detail in order to better understand its impact in various domains. Concentrating on the tactics and strategies at work in social media logic, we reassess the constellation of power relationships in which social practices unfold, raising questions such as: How does social media logic modify or enhance existing mass media logic? And how is this new media logic exported beyond the boundaries of (social or mass) media proper? The underlying principles, tactics, and strategies may be relatively simple to identify, but it is much harder to map the complex connections between platforms that distribute this logic: users that employ them, technologies that
Survey of Existing Tools for Formal Verification.
Energy Technology Data Exchange (ETDEWEB)
Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.; Jackson, Mayo
2014-12-01
Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.
Kwiatkowski, Maciej; Todd, Benjamin
The present thesis was realised within the framework of the Doctoral Student programme at the European Organisation for Nuclear Research CERN, which is situated near Geneva. The aim of this thesis was to develop a method for reliable firmware implementation and to use that method to implement a new firmware for the Safe Machine Parameters (SMP) system. That system relies heavily on the Field Programmable Gate Arrays (FPGA) and it is one of the key machine protection systems of the Large Hadron Collider (LHC). The conception of the SMP hardware originates from the fully tested Beam Interlock System (BIS) being a result of another PhD thesis. For that reason the reliable SMP hardware was preserved unchanged. The first version of the SMP was ready for the LHC startup in the year 2008. Nevertheless the quality of the SMP firmware was objectionable. There were new requirements and therefore the SMP specification was extended. On that occasion it was decided that the existing SMP firmware will not be continued and ...
Kwiatkowski, M
2014-01-01
The present thesis was realised within the framework of the Doctoral Student programme at the European Organisation for Nuclear Research CERN, which is situated near Geneva. The aim of this thesis was to develop a method for reliable firmware implementation and to use that method to implement a new firmware for the Safe Machine Parameters (SMP) system. That system relies heavily on the Field Programmable Gate Arrays (FPGA) and it is one of the key machine protection systems of the Large Hadron Collider (LHC). The conception of the SMP hardware originates from the fully tested Beam Interlock System (BIS) being a result of another PhD thesis [1]. For that reason the reliable SMP hardware was preserved unchanged. The first version of the SMP was ready for the LHC startup in the year 2008. Nevertheless the quality of the SMP firmware was objectionable. There were new requirements and therefore the SMP specification was extended. On that occasion it was decided that the existing SMP firmware will not be continued and that it...
Reconfigurable chaotic logic gates based on novel chaotic circuit
International Nuclear Information System (INIS)
Behnia, S.; Pazhotan, Z.; Ezzati, N.; Akhshani, A.
2014-01-01
Highlights: • A novel method for implementing logic gates based on chaotic maps is introduced. • The logic gates can be implemented without any changes in the threshold voltage. • The chaos-based logic gates may serve as basic components of future computing devices. - Abstract: Logical operations are one of the key issues in today’s computer architecture. Nowadays, there is great interest in developing alternative ways to obtain logic operations by chaos computing. In this paper, a novel implementation method for reconfigurable logic gates based on one-parameter families of chaotic maps is introduced. The special behavior of these chaotic maps can be utilized to provide the same threshold voltage for all logic gates, while leaving a wide interval for choosing a control parameter for all reconfigurable logic gates. Furthermore, an experimental implementation of this nonlinear system is presented to demonstrate the robustness of the computing capability of chaotic circuits.
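The general chaos-computing recipe — encode the logic inputs as shifts of a chaotic map's initial condition, iterate, and threshold the output — can be sketched with the logistic map. This is a generic textbook-style illustration, not the paper's one-parameter map family or its fixed-threshold scheme; every constant below is an assumption chosen to make the truth tables work out.

```python
def logistic(x, r=4.0):
    """One iterate of the logistic map in its fully chaotic regime."""
    return r * x * (1 - x)

def chaotic_gate(i1, i2, x0, delta, threshold):
    # Encode the two logic inputs as shifts of the initial condition,
    # iterate the chaotic map once, and threshold the result.
    x = x0 + delta * (i1 + i2)
    return int(logistic(x) > threshold)

# One map, reconfigured into different gates purely by the pair
# (initial condition x0, output threshold); delta is fixed at 0.25.
gates = {"AND": (0.10, 0.93), "OR": (0.10, 0.80), "XOR": (0.30, 0.90)}
for name, (x0, thr) in gates.items():
    table = [chaotic_gate(a, b, x0, 0.25, thr) for a, b in
             [(0, 0), (0, 1), (1, 0), (1, 1)]]
    print(name, table)
```

The XOR case shows why a nonlinear, non-monotone map is essential: a single linear threshold element cannot realize XOR, but the hump of the logistic map maps the middle input level above the threshold and both extremes below it.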