Published: 14th April 2022 DOI: 10.4204/EPTCS.358 ISSN: 2075-2180
https://easychair.org/smart-program/NCL'22/
The Program Committee received 44 high-quality submissions, which were evaluated on the basis of their significance, novelty and technical correctness. Reviewing was single-blind and each paper received at least three independent reviews, followed by a thorough discussion within the Program Committee. 22 submissions were selected for presentation on the basis of their quality. This volume contains abstracts of invited talks, descriptions of workshops and tutorials, and full versions of accepted submissions.

I sketch an approach to logic and indexicality, which I illustrate using hybrid tense logic and temporal indexicals. I have chosen temporal indexicals because they are probably the easiest natural language indexicals to model in a propositional logic. I have chosen hybrid tense logic because, while it is rooted in the Priorean tradition of taking the internal perspective on time seriously, it also allows us to incorporate both Reichenbach's approach to temporal reference and (as we shall see) a Kaplan-style character approach to indexicality. This will enable us to view "now" as a bridge that leads us to the logic(s) of "yesterday," "today," "tomorrow," and indeed other indexicals.
We start with an arbitrary but fixed positive integer n. If n is even, we divide it by 2 and get n/2. If n is odd, we multiply n by 3 and add 1 to obtain the even number 3n + 1. Then we repeat the procedure indefinitely. Each sequence defined in this way is referred to as a Collatz sequence. The Collatz conjecture says that no matter what number n we start with, we will always eventually reach 1. For example, if one starts with 5, one gets
5, 3 · 5 + 1 = 16, 8, 4, 2, 1.
The number 7 yields the sequence: 7, 3 · 7 + 1 = 22, 11, 3 · 11 + 1 = 34, 17, 3 · 17 + 1 = 52, 26, 13, 3 · 13 + 1 = 40, 20, 10, 5, 3 · 5 + 1 = 16, 8, 4, 2, 1.
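The procedure above can be sketched as a short program (a minimal illustration; the function name is mine, not from the text):

```python
def collatz_sequence(n):
    """Return the Collatz sequence starting at n, stopping at 1."""
    seq = [n]
    while n != 1:
        # halve even numbers; map odd n to the even number 3n + 1
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        seq.append(n)
    return seq

print(collatz_sequence(5))   # [5, 16, 8, 4, 2, 1]
print(collatz_sequence(7))   # [7, 22, 11, 34, 17, 52, 26, 13, 40, 20, 10, 5, 16, 8, 4, 2, 1]
```

The conjecture asserts that this loop terminates for every positive starting value; no proof is known.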
The Collatz conjecture is an old problem in number theory, named after Lothar Collatz, who proposed it in 1937.
In this talk, we present logical formalisms in which reasoning about concrete domains is embedded in formulae at the atomic level. These mainly include temporal/description logics with concrete domains. For the simple concrete domain (ℕ,<), we present known proof techniques for handling satisfiable (infinite) symbolic models, sometimes at the cost of going beyond omega-regularity. The talk is freely inspired by [1].
References:
[1] Stéphane Demri & Karin Quaas (2021): Concrete Domains in Logics: A Survey. ACM SIGLOG News 8(3), pp. 6–29, doi:10.1145/3477986.3477988.
We present CEGAR-Tableaux, a tableau-like method for propositional modal logics utilising SAT-solvers, modal clause-learning and multiple optimisations from modal and description logic tableau calculi. We use the standard Counterexample-Guided Abstraction Refinement (CEGAR) strategy for SAT-solvers to mimic a tableau-like search strategy that explores a rooted tree-model, with the classical propositional logic part of each Kripke world evaluated using a SAT-solver. Unlike modal encodings into SAT-solvers and modal resolution methods, we do not explicitly represent the accessibility relation but track it implicitly via recursion. By using "satisfiability under unit assumptions", we can iterate rather than "backtrack" over the satisfiable diamonds at the same modal level (context) of the tree model with one SAT-solver. By keeping modal contexts separate from one another, we add further refinements for reflexivity and transitivity which manipulate modal contexts once only. Our solver CEGARBox is, overall, the best for modal logics K, KT and S4 over the standard benchmarks, sometimes by orders of magnitude. It can now also handle all of the 15 logics in the modal cube as well as multimodal tense logic Kt(n) (AKA description logic 𝒜ℒ𝒞ℐ). We believe that in the future, all tableau calculi implementations will use CEGAR-Tableaux.
The notion of a tolerance relation, seen as a formalization of the intuitive notion of resemblance, was introduced by E. C. Zeeman in the 1960s. In mathematics, as a natural generalization of congruences, tolerances have proved to be a very useful tool, especially in universal algebra.
Tolerances of a lattice L are reflexive, symmetric relations compatible with the lattice operations of L. They form a lattice denoted by Tol(L).
A block of a tolerance T ∊ Tol(L) is a maximal set X ⊆ L satisfying X × X ⊆ T. Blocks are convex sublattices of L, and it was shown by G. Czédli that they form a lattice (denoted by L/T), called the factor lattice of L modulo T. Although this construction generalizes the notion of a factor lattice L/φ defined by means of a congruence φ ∊ Con(L), the properties of factor lattices modulo tolerances are significantly different. For instance, L/φ belongs to the same equational class as L, whereas the lattice L/T does not, in general.
It is known that for any φ ∊ Con(L), the congruence lattice of the factor lattice L/φ is isomorphic to the principal filter [φ) of φ in Con(L) (homomorphism theorem). Moreover, any ψ ∊ Con(L) with ψ ≥ φ induces the congruence ψ/φ on the factor lattice L/φ such that (L/φ)/(ψ/φ) ≅ L/ψ holds (second isomorphism theorem).
In this talk we formulate analogous results for tolerance factors defining a new partial order ⊑ on the lattice Tol(L) such that for any S ∊ Tol(L) with T ⊑ S, the tolerance S/T is induced on the factor lattice L/T. We also discuss the philosophical background of tolerances and their significance in lattice theory.
References
[1] Ivan Chajda (1991): Algebraic Theory of Tolerance Relations. Univerzita Palackého Olomouc, Olomouc.
[2] George Grätzer (2011): Lattice Theory: Foundation. Springer, Basel, doi:10.1007/978-3-0348-0018-1.
[3] Joanna Grygiel (2019): Many faces of lattice tolerances. Bulletin of the Section of Logic 48(4), pp. 285–298, doi:10.18778/0138-0680.48.4.03.
[4] Joanna Grygiel & Sándor Radeleczki (2013): On the tolerance lattice of tolerance factors. Acta Mathematica Hungarica 141(3), pp. 220–237, doi:10.1007/s10474-013-0340-x.
[5] Eric Christopher Zeeman (1962): The topology of the brain and visual perception. In Marion Kirkland Fort, editor: Topology of 3-Manifolds and Related Topics, Dover Publications, New Jersey.
There is a common agreement that each connexive logic should satisfy Aristotle's and Boethius' theses (AB). However, the AB theses alone do not guarantee any common content or other form of "connexion", as they are true in the binary matrix {0,1} with distinguished value 1, with classical material implication and negation defined as ~1 = ~0 = 1. Similarly, AB are true in a binary matrix with classical negation and implication defined as x ⇒ y = 1 iff x = y.
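The two matrices just described can be checked mechanically. The brute-force sketch below is my own illustration (the function names and the particular rendering of the four AB theses are assumptions, not taken from the text); it confirms that all four theses take the designated value 1 in both matrices:

```python
import itertools

def classical_imp(x, y): return 1 if (x == 0 or y == 1) else 0
def const_neg(x): return 1                 # ~1 = ~0 = 1 (first counterexample)
def classical_neg(x): return 1 - x
def identity_imp(x, y): return 1 if x == y else 0   # second counterexample

def ab_theses_hold(imp, neg):
    """Check Aristotle's and Boethius' theses in the matrix (imp, neg)."""
    for a, b in itertools.product((0, 1), repeat=2):
        theses = [
            neg(imp(a, neg(a))),                    # Aristotle: ~(A -> ~A)
            neg(imp(neg(a), a)),                    # Aristotle: ~(~A -> A)
            imp(imp(a, b), neg(imp(a, neg(b)))),    # Boethius: (A->B) -> ~(A->~B)
            imp(imp(a, neg(b)), neg(imp(a, b))),    # Boethius: (A->~B) -> ~(A->B)
        ]
        if any(t != 1 for t in theses):
            return False
    return True

print(ab_theses_hold(classical_imp, const_neg))    # True
print(ab_theses_hold(identity_imp, classical_neg)) # True
print(ab_theses_hold(classical_imp, classical_neg))  # False: classical logic fails AB
```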
The counterexamples above show that the AB theses alone are very weak and should be strengthened in some way. We can eliminate the first counterexample by assuming that negation behaves in a classical way, as we did in [6] and [7]. This brings us to the notion of Boolean connexive logic. Boolean algebras are defined in terms of ∧, ∨, ¬; in Boolean connexive logics these connectives behave in a classical way, which is why we refer to George Boole here.
We can eliminate the second counterexample by considering connexive logics with material implication. It is not known whether this could lead to a promising research area; in any case, such logics could be called `material connexive logics'. Obviously, we could also eliminate both counterexamples, which would bring us to connexive logics that are neither Boolean nor material.
In this lecture we concentrate on Boolean connexive logics. We indicate a number of possible desiderata strengthening Aristotle's and Boethius' theses and investigate them by means of relating semantics.
Relating semantics has its origins in the papers of Epstein and Walton; in its full generality, however, it was proposed by Jarmużek and Kaczkowski in [4]. It has been studied extensively in, among others, [3], [5], [10]. The general idea of relating semantics is based on a relating relation R. The fact that ARB – that sentences A and B are related with respect to R – can be interpreted in many ways, depending on the motivations of a given logical system. For example, A and B could be related causally, they could be related by temporal order, or they could share common content. Generally, two sentences are related because they have something in common.
In Boolean connexive logics only implication depends on the relating relation R. The truth conditions for propositional letters and for ∧, ∨, ¬ remain standard, i.e. as in classical logic. Only implication → has an intensional nature; it depends on the relation R:

〈v,R〉 ⊨ A → B iff [〈v,R〉 ⊭ A or 〈v,R〉 ⊨ B] and ARB.
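The truth condition above can be illustrated with a toy model checker. This is a sketch of mine, not part of the talk: formulas are encoded as nested tuples, and the relating relation R is a set of formula pairs.

```python
# Formulas: ('p',) is a letter; ('not', A), ('and', A, B), ('or', A, B),
# ('imp', A, B) are compound.  A model is a pair (v, R): a classical
# valuation v of letters and a relating relation R on formulas.

def sat(model, f):
    v, R = model
    op = f[0]
    if op == 'not':
        return not sat(model, f[1])
    if op == 'and':
        return sat(model, f[1]) and sat(model, f[2])
    if op == 'or':
        return sat(model, f[1]) or sat(model, f[2])
    if op == 'imp':
        # <v,R> |= A -> B  iff  [<v,R> |/= A or <v,R> |= B]  and  A R B
        classical = (not sat(model, f[1])) or sat(model, f[2])
        return classical and (f[1], f[2]) in R
    return v[op]                      # propositional letter

p, q = ('p',), ('q',)
model = ({'p': True, 'q': True}, {(p, q)})
print(sat(model, ('imp', p, q)))   # True: classically true and p R q
print(sat(model, ('imp', q, p)))   # False: q R p fails, despite classical truth
```

The second query shows the intensional character of →: a classically true implication fails when its components are not related.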
By a minimal Boolean connexive logic we mean the least set of sentences containing all classical tautologies expressed by means of ∧, ∨, ¬, the theses (A1), (A2), (B1), (B2), the formula (A → B) ⊃ (A ⊃ B), and closed under substitution and modus ponens with respect to ⊃, where ⊃ denotes material implication.
Let JT denote the class of all relations R satisfying (a1), (a2), (b1), (b2). Let JT¬ denote the class of all relations from JT satisfying (c1), where:
Many-sorted logic is not only a natural logic for reasoning about more than one sort of object, but also a unifying logic and the best target logic for translations, thanks to its efficient proof theory, flexibility, naturalness, and versatility in adapting to reasoning about a variety of objects.
Translations can be seen as the path to completeness in three possible stages: abstract completeness at the first stage, and weak and strong completeness at the other two. The case of second-order logic can act as a source of inspiration for translations of other logical systems. The idea is to include, in the many-sorted structure obtained by direct conversion of the logic being translated, a domain for sets (and relations); namely, for all sets and relations definable in the original logic by its own formulas. In doing so, we are in effect shifting to second-order logic. In these structures, the Comprehension Schema restricted to formulas that are translations is always true. The idea of restricting CS to obtain new logics is taken from Henkin [1].
References:
[1] Leon Henkin (1953): Banishing the rule of substitution for functional variables. Journal of Symbolic Logic 18(3), pp. 201–208, doi:10.2307/2267403.
In this paper, a paradox is shown to arise from prima facie highly plausible assumptions for the existence predicate as applied to definite descriptions. There are several possibilities to evade the paradox; all involve modifications in first-order logic with identity, existence, and definite descriptions; some stay within classical logic, others leave it. The merits of the various "ways out" are compared. It is proposed, and supported by argument, that the most attractive "way out" is within classical logic and has the consequence that there is a new logical truth: "There is at least one nonexistent object." But this "exit" will certainly not be to everyone's taste and liking. Thus, the paradox defies complete resolution (as every good paradox should).
In the Polish tradition, a (propositional) logic L = LA arises from a structure (matrix) A on a universe U equipped with operations and constants, often including a top (designated) element 1. To take care of the fundamental notion of "consequence", A is usually equipped with a U-valued operation → on U² satisfying the conditions (a) x → y = 1 iff x ≤ y, and (b) x → (y → z) = y → (x → z). For instance, when U = {0,1} and ¬x is the derived operation x → 0, the only algebra A = (U, ¬, →) satisfying condition (a) is the two-element Boolean algebra, (b) automatically follows from (a), and LA is Boolean logic. When U = [0,1] ⊆ ℝ and → is a continuous function satisfying conditions (a)–(b), then ([0,1], ¬, →) is the standard Wajsberg algebra [3]. Wajsberg algebras, like their term-equivalent MV-algebras, provide algebraic counterparts of infinite-valued Łukasiewicz logic Ł∞.
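In the standard Wajsberg algebra the continuous implication satisfying (a)–(b) is the Łukasiewicz implication x → y = min(1, 1 − x + y), with derived negation ¬x = 1 − x. The following sketch (my own, using exact rationals to avoid rounding artefacts) checks both conditions on a finite grid; a grid check is of course an illustration, not a proof.

```python
from fractions import Fraction

def imp(x, y):
    """Łukasiewicz implication on [0,1]: x -> y = min(1, 1 - x + y)."""
    return min(Fraction(1), 1 - x + y)

def neg(x):
    """Derived negation: ¬x = x -> 0 = 1 - x."""
    return imp(x, Fraction(0))

grid = [Fraction(i, 10) for i in range(11)]
for x in grid:
    for y in grid:
        # (a) x -> y = 1 exactly when x <= y
        assert (imp(x, y) == 1) == (x <= y)
        for z in grid:
            # (b) exchange: x -> (y -> z) = y -> (x -> z)
            assert imp(x, imp(y, z)) == imp(y, imp(x, z))
print("conditions (a) and (b) hold on the grid")
```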
Let K0 be Grothendieck's functor and Γ the categorical equivalence between unital lattice-ordered abelian groups and MV-algebras. Combining K0 and Γ, it follows from Elliott's classification [1] that for every AF C*-algebra C with lattice-ordered K0(C), every Ł∞-formula φ codes a Murray-von Neumann equivalence class [p] = [p]φ,C of projections of C. Conversely, for every projection p of C there is an Ł∞-formula coding p. The algorithmic machinery of Ł∞ can now be applied to decision problems for AF C*-algebras, for instance: Is p a central projection of C? Is p = 0? Does [p] precede [q] in the Murray-von Neumann order of C? In [2] and [4], polytime reductions are constructed among these problems for any fixed AF algebra C such that K0(C) is lattice-ordered – a property of most AF algebras in the literature. The complexity of all these problems turns out to be polytime for many relevant AF algebras, including the Behncke-Leptin algebras Am,n, the CAR algebra, the Farey-Stern-Brocot algebra, Glimm's universal UHF algebra, and every Effros-Shen algebra Fθ for θ a real algebraic integer, or θ = 1/e, with e Euler's number.
References:
[1] George A. Elliott (1976): On the Classification of Inductive Limits of Sequences of Semisimple Finite-Dimensional Algebras. Journal of Algebra 38, pp. 29–44, doi:10.1016/0021-8693(76)90242-8.
[2] Daniele Mundici (2018): Word problems in Elliott monoids. Advances in Mathematics 335, pp. 343–371, doi:10.1016/j.aim.2018.07.015.
[3] Daniele Mundici (2020): What the Łukasiewicz axioms mean. The Journal of Symbolic Logic 85(3), pp. 906–917, doi:10.1017/jsl.2020.74.
[4] Daniele Mundici (2021): Bratteli diagrams via the De Concini-Procesi theorem. Communications in Contemporary Mathematics 23(7), doi:10.1142/S021919972050073X. Article No. 2050073.
The varieties of regular double Stone algebras and regular Kleene Stone algebras are fairly well known and well studied. The striking similarity in their structures led me to introduce a new variety as a common generalization of the two. The algebras in this new variety are called "Gautama algebras" in memory and honor of the founders of Indian logic: Medhatithi Gautama and Akshapada Gautama. It turns out that the variety 𝔾 of Gautama algebras is the join of the variety of regular double Stone algebras and the variety of regular Kleene Stone algebras, and also that 𝔾 is the equivalent algebraic semantics of a new logic (in the sense of Blok and Pigozzi).
More recently, the variety 𝔾 has been further generalized to "Almost Gautama algebras." It turns out that this new variety also has a corresponding logic.
In this lecture, I wish to present results about these new varieties and their corresponding logics.
This presentation will give an overview of our ongoing work in developing knowledge extraction methods for description logic-based ontologies. Because the stored knowledge is given not only by the axioms stated in an ontology but also by the knowledge that can be inferred from these axioms, knowledge extraction is a challenging problem. Forgetting creates a compact and faithful representation of the stored knowledge over a user-specified signature by performing inferences on the symbols not in this signature. After an introduction to the idea of forgetting, an overview of our forgetting tools and some applications we have explored, I will discuss recent work we have done in collaboration with SNOMED Intl to use our tools on the medical ontology SNOMED CT. Building on these experiences, we have developed a new consequence-based approach for computing self-contained subontologies that satisfy the SNOMED modelling guidelines. Such subontologies make it easier to reuse and share content, assist with ontology analysis, and speed up querying and classification.
The research was undertaken in several projects spanning several years funded by the UKRI/EPSRC (research, Doctoral awards, IAA), the University of Manchester and IHTSDO (SNOMED Intl).
Structural reasoning is reasoning that is governed exclusively by structural rules. In this context a proof system can be said to be structural if all of its proper inference rules are structural. A logic is considered to be structuralizable if it can be equipped with a sound and complete structural proof system. In my talk I will provide a general formulation of the problem of structuralizability, giving specific consideration to a family of logics that are based on the Dunn-Belnap four-valued semantics. It is shown how sound and complete structural proof systems can be constructed for a spectrum of logics within different logical frameworks.
When we talk about intensional proof-theoretic semantics, we normally mean an approach that takes into account not only whether something has been proved, but also how it has been proved and whether several ways of proving it should be considered different or not. Its central concept is therefore that of identity of proofs. However, there are other topics where intensionality comes in, which I would like to discuss. Using the rule of contraction as an example, I argue that we must distinguish between different ways the same formula is given to us (to use Frege's terminology). In the context of the paradoxes, there are cases where two occurrences of a formula cannot plausibly be contracted into one, as they are given to us in different ways. As another example I consider inductive definitions and argue that an alteration of their form which does not alter their deductive strength nevertheless affects which information can be extracted from them.
The idea of Non-Fregean Logics emerged in the sixties together with Roman Suszko's wish to formalize Wittgenstein's Tractatus [5,4]. Non-Fregean Logics (NFLs, for short) owe their name to the rejection of the so-called Fregean Axiom, which says that the identity of the referents of two given sentences boils down to the identity of their logical values [3]. In NFLs the semantic correlates of sentences are not their logical values, but situations. The language of an NFL is equipped with the identity connective, ≡, which is intended to reflect the fact that two sentences describe the same situation.
The weakest NFL considered by Suszko is the Sentential Calculus with Identity, SCI for short, built upon the Classical Sentential Calculus by the addition of the following axioms characterizing ≡:
(≡1) φ ≡ φ
(≡2) (φ ≡ ψ) → (¬ φ ≡ ¬ ψ)
(≡3) (φ ≡ ψ) → (φ ↔ ψ)
(≡4) ((φ ≡ ψ) ∧ (α ≡ β)) → ((φ ⊗ α) ≡ (ψ ⊗ β))
Suszko's identity refers to a congruence relation, and it is clearly stronger than equivalence. The only valid formulas having ≡ as the main connective are of the form φ ≡ φ; in this sense, the Non-Fregean identity formalized within SCI is a very strong connective. This interpretation is usually relaxed by extending the axiomatic basis of SCI.
The tutorial Proof-Theoretical Analysis of Non-Fregean Logics, proposed for the conference Non-Classical Logics. Theory and Applications 2022, focuses on proof-theory for NFLs. We pay special attention to structural proof theory taking into consideration sequent systems and natural deduction systems. The issues of cut-elimination and normalization are discussed. We also outline ideas for possible extensions of SCI and its intuitionistic counterpart called ISCI.
We begin with a synthetic tableau system (ST-system, for short) for SCI. In this method there is exactly one binary branching rule and a collection of linear rules called synthetic or synthesizing, since they build complex formulas from their subformulas and/or from their negations. The identity connective is characterized by a collection of such synthesizing rules encoding the basic properties of congruence.
We then move to structural proof theory for SCI; we present a sequent calculus that encapsulates the main properties of identity expressed by axioms (≡1)–(≡4) [1]. This particular calculus is then modified to fit the intuitionistic setting of ISCI [2], for which we also discuss the decidability issue. We also cover WB – a Boolean extension of SCI obtained through the addition of six axioms extending the properties of the identity connective. In contrast to SCI, we can now consider a larger set of valid formulas built with the identity connective. We discuss a sequent calculus formalization of WB and its algebraic semantics. We also briefly discuss other extensions considered by Suszko: WT and WH.
In the remaining parts of the tutorial we focus on natural deduction systems for ISCI. The intuitionistic setting requires a constructive interpretation of identity, different ideas of which are discussed. This discussion, in turn, leads us to identity's relation to isomorphism, to which we come back in the last part of the workshop. We also address cut elimination in the sequent calculi and normalization in the natural deduction systems.
The most studied extensions of SCI are WB, WT and WH. We believe it is beneficial to consider analogous extensions of ISCI as well. The two usual ways of introducing extensions of an NFL are the addition of axioms extending the properties of the identity connective and the addition of inference rules. Here we discuss a less straightforward strategy. In WB a formula of the form p ≡ ¬¬p is valid. Naturally, the addition of this formula to ISCI makes it an extension of SCI at once: the law of excluded middle becomes derivable and the starting-point logic is no longer intuitionistic. Thus we need to consider only those extensions that do not affect the constructive character of the logic. We consider two such extensions; one of them is related to the notion of propositional isomorphism, the other introduces a special case of the law of excluded middle.
References:
[1] Szymon Chlebowski (2018): Sequent Calculi for SCI. Studia Logica 106(3), pp. 541–563, doi:10.1007/s11225-017-9754-8.
[2] Szymon Chlebowski & Dorota Leszczyńska-Jasion (2019): An investigation into intuitionistic logic with identity. Bulletin of the Section of Logic 48(4), pp. 259–283, doi:10.18778/0138-0680.48.4.02.
[3] Gottlob Frege (1948): Sense and reference. The Philosophical Review 57(3), pp. 209–230, doi:10.2307/2181485.
[4] Mieczysław Omyła (1986): Zarys Logiki Niefregowskiej (Introduction to Non-Fregean Logic). Państwowe Wydawnictwo Naukowe, Warszawa.
[5] Roman Suszko (1975): Abolition of the Fregean axiom. In Rohit Parikh, editor: Logic Colloquium, Lecture Notes in Mathematics 453, Springer, Berlin, Heidelberg, pp. 169–239, doi:10.1007/BFb0064874.
Acknowledgement
The work of Dorota Leszczyńska-Jasion, Szymon Chlebowski and Agata Tomczyk was supported financially by the National Science Centre, grant no. 2017/26/E/HS1/00127.
Relating Logic is the logic of relating connectives (just as Modal Logic is the logic of modal operators). The basic idea behind a relating connective is that the logical value of a given complex proposition is the result of two things:
(i) the logical values of the main components of this complex proposition, supplemented with
(ii) a valuation of the relation between these components.
The latter element is a formal representation of an intensional relation that emerges from the connection of several simpler propositions into one more complex proposition. More formally, let A1, ..., An be propositions with some fixed logical values and let c be an n-ary relating connective. Then the logical value of the complex sentence c(A1, ..., An) depends not only on the logical values of A1, ..., An, but also on the value of the connection between A1, ..., An. It therefore depends on an additional valuation of pairs (n-tuples) that is part of the overall process of evaluating the logical values of complex propositions built with relating connectives. In this way we can form logical systems for reasoning about non-logical relationships. They are determined by relating semantics.
Although the simplest model in relating semantics is a pair ⟨v, R⟩, the situation may get more complicated. We can use multi-relating models to represent more types of non-logical relations between sentences. In addition, the valuation of relationships between sentences need not be binary but may be many-valued or more subtly graded. Furthermore, we can mix relating semantics with possible-world semantics, equipping all worlds with additional valuations of complex sentences. Last but not least, any semantics may be treated as a relating one, when we assume that in the case of complex sentences the relationship is represented by a universal relation.
The solution that relating semantics offers seems to be quite natural, since when two (or more) propositions in natural language are connected by a connective, some sort of emergence occurs. In fact, the key feature of intensionality is that adding a new connective results in the emergence of a new quality, which itself does not belong to the components of a given complex proposition built by means of the same connective. An additional valuation function determines precisely this quality.
Talking about emergence is justified here, because the quality that arises as a result of the connections between the constituent propositions is not reducible to the properties of those propositions. Consequently, if the phenomenon of emergence is to be properly captured, we need additional valuations in a model. The key feature of relating semantics is that it enables us to treat non-logical relations between sentences seriously.
References:
[1] Tomasz Jarmużek & Francesco Paoli (2021): Relating Logic and Relating Semantics. History, Philosophical Applications and Some of Technical Problems. Logic and Logical Philosophy 30(4), pp. 563–577, doi:10.12775/LLP.2021.025.
[2] Mateusz Klonowski (2021): History of Relating Logic. The Origin and Research Directions. Logic and Logical Philosophy 30(4), pp. 579–629, doi:10.12775/LLP.2021.021.
Ontology is a theoretical subdiscipline of philosophy. `What exists, what is?' is the fundamental question of ontological research. There exist and there are people, animals, plants, clouds, mountains. There are also qualities, relations, actions, sensations, time and space. Sometimes ontology (metaphysics) is defined as the philosophical science of being as being. Formal ontology is a certain variant of ontology as such, a certain way of investigating and inquiring into ontological questions, problems, concepts, and theorems. What kind of way is it? In formal ontology we use formal tools and methods, among them methods found in logic, algebra, systems theory, and, recently, also general topology. For example, in 1905 Russell's theory of descriptions (On Denoting) was presented, which explained, with the help of logical tools, how to use names correctly. In the 1920s, the Polish logician J. Łukasiewicz used 3-valued logic to study the problem of determinism/indeterminism. In the 1980s and 1990s, the philosophy of action (cf. Belnap and Perloff) was developed; this philosophy and logic (cf. STIT logic) can be seen as a formal analysis of one of Aristotle's categories, namely the category of action. In the 20th century, between 1920 and 1930, Russell and Wittgenstein proposed the ontology of logical atomism, which relied heavily on the achievements of the sentential and first-order logic of the time. In turn, in the 1980s, Wittgenstein's proposal presented in the Tractatus Logico-Philosophicus was interpreted by the Polish philosopher and ontologist B. Wolniewicz; his interpretation makes use of lattice theory. In the last few decades, general topology and the mathematical theory of categories (mathematical because we want to distinguish it from the theory of categories already proposed by Aristotle) have also been used to study and interpret philosophical and ontological problems (e.g., R. Thom, K. Kelly, O. Schulte and C. Juhl, T. Mormann). During the workshop, selected propositions, topics, and problems in the field of formal ontology, including topological ontology, will be presented. The following philosophers, ontologists, and logicians will take part in it:
Axiomatic Formal Ontology is a fairly comprehensive systematic treatise on general metaphysics. The axiomatic method is applied throughout the book. Its main theme is the construction of a general non-set-theoretical theory of intensional entities. Other important matters discussed are the metaphysics of modality, the nature of actual existence, mereology and the taxonomy of entities.
Graded logic (GL) is the logic of human reasoning with graded percepts. Human percepts of truth, importance, satisfaction, suitability, preference, and many others are all graded, i.e., they are a matter of degree. Without loss of generality, all degrees are normalized in the interval [0,1], where 0 denotes the lowest and 1 the highest degree of intensity of a given percept. Truth is the most important of graded percepts because every graded percept can be described as the degree of truth of the statement asserting that the percept has the highest intensity. For example, the degree of truth of the statement "the object A completely satisfies all our requirements" is equivalent to the degree of satisfaction (or suitability) of A. Obviously, truth is not an anonymous real number, but a percept that has both an intensity and a clear semantic identity: a role, meaning, and importance for a specific stakeholder. The stakeholder is defined as an individual or an organization who specifies an assertion and then needs to know its degree of truth. Usually, that is done in the process of decision-making, where the stakeholder evaluates alternatives and selects the most suitable one. GL is fully human-centric: its main goal is to be consistent with observable properties of human reasoning. That is the motivation for the development of GL.
GL is based on the concept that both simultaneity (conjunction) and substitutability (disjunction) are a matter of degree. This concept was introduced in 1973. The conjunction degree (later renamed andness) is an indicator of the similarity between a logic aggregator and the full (or pure) conjunction (the minimum function). The disjunction degree (later renamed orness) is an indicator of the similarity between a logic aggregator and the full (pure) disjunction (the maximum function). Andness and orness are adjustable and complementary indicators, observable and measurable in human reasoning. The highest orness (1) corresponds to the lowest andness (0), and the highest andness (1) corresponds to the lowest orness (0). Andness 1 denotes the full conjunction, and orness 1 denotes the full disjunction. Between these extreme points we have the graded (or partial) conjunction and disjunction functions with adjustable degrees of andness/orness. Therefore, models of the graded conjunction/disjunction must provide a continuous transition from the full conjunction to the full disjunction.
Both the inputs and the output of the graded conjunction and disjunction functions are degrees of truth. The graded conjunction is a basic GL function where andness is greater than orness. It is the model of simultaneity, and consequently the output of this function is primarily affected by the low values of its arguments. The graded disjunction is a basic GL function where orness is greater than andness. It is the model of substitutability, and consequently the output of this function is primarily affected by the high values of its arguments. Human logical reasoning provably combines simultaneity and substitutability.
All means are functions that return values between the minimum and the maximum of their
arguments. Thus, means can be interpreted as logic functions. The centroid of all means, where andness
equals orness (i.e., both have the value 1/2), is the traditional arithmetic mean. The arithmetic mean is
interpreted as the logic neutrality because it simultaneously has 50% of conjunctive properties and 50%
of disjunctive properties. Parameterized means, such as power means and exponential means, provide the
desirable continuous transition from the full conjunction to the full disjunction. In addition, when used as
logic functions, means provide another necessary property: they can be weighted, and the weights express
the degree of importance of the input arguments. In human reasoning, each truth has a meaning and consequently
also a specific degree of importance for its stakeholder. It is no exaggeration to claim that any logic
model in which truth does not come with a degree of importance must be rejected as a realistic model of
human reasoning. Many means, such as weighted power means, use weights to express the impact/importance of individual arguments; such means are therefore the primary candidates for serving as graded logic aggregators.
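A weighted power mean combines both adjustments in one function: the exponent selects the andness/orness level, while normalized weights express the relative importance of the inputs. A minimal sketch (assuming normalized weights and a nonzero exponent, for brevity):

```python
def weighted_power_mean(xs, ws, r):
    """Weighted power mean (sum_i w_i * x_i^r)^(1/r), with sum(ws) == 1.

    The exponent r selects the andness/orness level; the weights w_i
    express the relative importance of the input degrees of truth.
    r != 0 is assumed (r == 0 would need the geometric-mean limit).
    """
    assert abs(sum(ws) - 1.0) < 1e-9, "weights must be normalized"
    if r < 0 and any(x == 0 for x in xs):
        return 0.0                    # limit: a zero input annihilates for r < 0
    return sum(w * x ** r for w, x in zip(ws, xs)) ** (1.0 / r)

# Same degrees of truth, different importance: the aggregated
# degree of truth is pulled toward the more important input.
print(weighted_power_mean([0.4, 0.9], [0.7, 0.3], 1))  # 0.55
print(weighted_power_mean([0.4, 0.9], [0.3, 0.7], 1))  # 0.75
```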
The use of annihilators is a fundamental observable property of the graded conjunction and the graded disjunction used in human reasoning. If the graded conjunction supports the annihilator 0, then
such an operator is called hard; if the annihilator 0 is not supported, then the operator is called soft. Similarly, if the
graded disjunction supports the annihilator 1, then the operator is called hard; if the annihilator 1 is not
supported, then the operator is called soft. The hard graded conjunction can be verbalized as the "must
have" condition, where each input is considered mandatory, so that its nonsatisfaction unconditionally
yields the output 0. The hard graded disjunction can be verbalized as the "enough to have" condition,
where each input is considered sufficient to fully satisfy the stakeholder's requirements: its value 1
unconditionally yields the output 1. Both the graded conjunction and the graded disjunction can be soft,
verbalized as the "nice to have" condition, where the properties of aggregation are conjunctive or
disjunctive but the annihilators are not supported. Both hard and soft logic aggregators are provably
present and used in human reasoning.
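The hard/soft distinction is easy to see with power means (used here only as an illustrative family): exponents r &lt; 0 support the annihilator 0, while exponents with 0 &lt; r &lt; 1 are conjunctive but soft, so a fully failed input reduces, but does not annihilate, the output. A sketch:

```python
def power_mean(xs, r):
    # r != 0 assumed; the r == 0 case would need the geometric-mean limit
    if r < 0 and any(x == 0 for x in xs):
        return 0.0  # annihilator 0: a single zero forces the output to 0
    return (sum(x ** r for x in xs) / len(xs)) ** (1.0 / r)

# One criterion fully failed (0), the other well satisfied (0.8):
print(power_mean([0.0, 0.8], -1))   # 0.0: hard ("must have"), zero annihilates
print(power_mean([0.0, 0.8], 0.5))  # 0.2: soft ("nice to have"), partial credit remains
```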
The basic Graded Conjunction/Disjunction (basic GCD) is the fundamental GL aggregation function
that includes and integrates 7 types of logic aggregation: full conjunction, hard graded conjunction, soft
graded conjunction, neutrality (the weighted arithmetic mean), soft graded disjunction, hard graded
disjunction, and the full disjunction. The basic GCD and negation are sufficient to create all other
compound GL functions, such as graded absorption, implication, abjunction, equivalence, exclusive
disjunction, and others. In the vertices of the unit hypercube, GL and classical Boolean logic are identical.
Therefore, GL is a seamless generalization of classical Boolean logic inside the whole unit
hypercube.
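The agreement at the vertices of the unit hypercube can be spot-checked directly: on inputs restricted to {0, 1}, the full conjunction (min), the full disjunction (max), and the standard negation (1 - x) reproduce the Boolean truth tables. A small sketch:

```python
from itertools import product

# At the vertices of the unit hypercube [0,1]^2, GL operations
# degenerate to the classical Boolean connectives.
for a, b in product([0, 1], repeat=2):
    assert min(a, b) == (a and b)   # full conjunction = Boolean AND
    assert max(a, b) == (a or b)    # full disjunction = Boolean OR
    assert (1 - a) == (not a)       # standard negation = Boolean NOT
print("GL agrees with Boolean logic at all vertices")
```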
Conjunctive properties can be stronger than the full conjunction, and disjunctive properties can be
stronger than the full disjunction. The extreme conjunctive function is called the drastic conjunction (the
function where the output is 1 only if all inputs are 1, and in all other cases the output is 0). The extreme
disjunctive function is called the drastic disjunction (the function where the output is 0 only if all inputs
are 0, and in all other cases the output is 1). To cover the full spectrum of logic aggregators in GL we also
use the extended GCD, which is a function that provides the andness-directed continuous transition from
the drastic conjunction to the drastic disjunction. The extended GCD includes the basic GCD in the
segment between the pure conjunction and the pure disjunction. All forms of GCD satisfy De Morgan
duality.
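The drastic pair and its De Morgan duality follow directly from the definitions above; a sketch:

```python
def drastic_conjunction(xs):
    # output 1 only if every input is 1, otherwise 0
    return 1.0 if all(x == 1 for x in xs) else 0.0

def drastic_disjunction(xs):
    # output 0 only if every input is 0, otherwise 1
    return 0.0 if all(x == 0 for x in xs) else 1.0

# De Morgan duality: disjunction(x) == 1 - conjunction(1 - x)
for xs in ([0.0, 0.0], [0.3, 0.9], [1.0, 1.0]):
    dual = 1.0 - drastic_conjunction([1.0 - x for x in xs])
    assert drastic_disjunction(xs) == dual

print(drastic_conjunction([0.99, 1.0]))  # 0.0: even 0.99 is not enough
print(drastic_disjunction([0.01, 0.0]))  # 1.0: any nonzero input suffices
```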
The extended GCD in the range of high andness between the full conjunction and the drastic
conjunction is called hyperconjunction. Similarly, hyperdisjunction is the extended GCD in the high
orness range between the full disjunction and the drastic disjunction. Hyperconjunction covers the area of
t-norms, and hyperdisjunction covers the area of t-conorms. That includes the product t-norm and
t-conorm, which model the probability of independent events. In this way GL includes models of
probabilistic reasoning, building useful bridges between logic and probability theory.
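The probabilistic bridge can be illustrated with the product t-norm and its dual t-conorm (the probabilistic sum), which compute the probability that two independent events both occur, or that at least one occurs. Note that the product lies below the minimum, and the probabilistic sum above the maximum, i.e., these operators are more conjunctive/disjunctive than the full conjunction/disjunction:

```python
def product_tnorm(a, b):
    """Hyperconjunction example: P(A and B) for independent events."""
    return a * b

def probabilistic_sum(a, b):
    """Hyperdisjunction example: P(A or B) for independent events."""
    return a + b - a * b

a, b = 0.5, 0.5
print(product_tnorm(a, b))      # 0.25, below min(a, b) = 0.5
print(probabilistic_sum(a, b))  # 0.75, above max(a, b) = 0.5
```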
One of the central results of GL is the graded logic conjecture, which claims that 10 basic types of logic
functions (the 7 function types of the basic GCD, plus hyperconjunction, hyperdisjunction, and negation)
are necessary and sufficient to adequately model the mental processes by which human beings aggregate
subjective categories, i.e., degrees of truth corresponding to various graded percepts. After selecting an
appropriate type of GCD aggregator, humans regularly fine-tune the aggregator by additionally
adjusting both the desired importance and the desired andness/orness. The extended GCD and negation
are necessary and sufficient components for building a graded propositional calculus that can process
graded/partial truth in a way consistent with observable properties of human reasoning.
GL is highly applicable and supported by a collection of professional software tools. In industrial
settings, GL is the fundamental component of decision engineering, an area of professional decision-
making based on complex logic criteria that can use hundreds of inputs. Published applications of GL are
in the areas of ecology, evaluation, optimization, and selection of computer hardware and software,
medicine (evaluation of disease severity and patient disability), public health (evaluation of priority for vaccination), geography and space management (suitability maps), agriculture, urban planning, evaluation
of data management systems, web browsers, search engines, windowed environments, websites, e-commerce sites, homes (in online real estate), groundwater contamination, cybersecurity, and others. A
detailed presentation of GL, its decision engineering applications (evaluation,
optimization, comparison, and selection of complex alternatives), and a survey of the GL-based LSP decision method and the
corresponding literature can be found in the monograph by J. Dujmović [1].
References:
[1] Jozo Dujmović (2018): Soft Computing Evaluation Logic: The LSP Decision Method and Its Applications.
John Wiley & Sons, Inc., doi:10.1002/9781119256489.