1.5: “And”

    5. “And”

    5.1 The conjunction

    To make our logical language easier and more intuitive to use, we can now add elements that let it express the equivalents of other kinds of sentences from a natural language like English. Our translations will not be exact, but they will be close enough that, first, we will have a way to understand the language we are constructing more quickly; and, second, we will have a way to speak English more precisely when that is required of us.

    Consider the following expressions. How would we translate them into our logical language?

    Tom will go to Berlin and Paris.

    The number a is evenly divisible by 2 and 3.

    Steve is from Texas but not from Dallas.

    We could translate each of these using an atomic sentence. But then we would have lost—or rather we would have hidden—information that is clearly there in the English sentences. We can capture this information by introducing a new connective, one that corresponds to our “and”.

    To see this, consider whether you will agree that these sentences above are equivalent to the following sentences.

    Tom will go to Berlin and Tom will go to Paris.

    The number a is evenly divisible by 2 and the number a is evenly divisible by 3.

    Steve is from Texas and it is not the case that Steve is from Dallas.

    Once we grant that these sentences are equivalent to those above, we see that we can treat the “and” in each sentence as a truth functional connective.

    Suppose we assume the following key.

    P: Tom will go to Berlin.

    Q: Tom will go to Paris.

    R: a is evenly divisible by 2.

    S: a is evenly divisible by 3.

    T: Steve is from Texas.

    U: Steve is from Dallas.

    A partial translation of these sentences would then be:

    P and Q

    R and S

    T and ¬U

    Our third sentence above might generate some controversy. How should we understand “but”? Consider that in terms of the truth value of the connected sentences, “but” is the same as “and”. That is, if you say “P but Q” you are asserting that both P and Q are true. However, in English there is extra meaning; the English “but” seems to indicate that the additional sentence is unexpected or counter-intuitive. “P but Q” seems to say, “P is true, and you will find it surprising or unexpected that Q is true also.” That extra meaning is lost in our logic. We will not be representing surprise or expectations. So, we can treat “but” as being the same as “and”. This captures the truth value of the sentence formed using “but”, which is all that we require of our logic.

    Following our method up until now, we want a symbol to stand for “and”. In recent years the most commonly used symbol has been “^”.

    The syntax for “^” is simple. If Φ and Ψ are sentences, then

    (Φ^Ψ)

    is a sentence. Our translations of our three example sentences should thus look like this:

    (P^Q)

    (R^S)

    (T^¬U)

    Each of these is called a “conjunction”. The two parts of a conjunction are called “conjuncts”.

    The semantics of the conjunction are given by its truth table. Most people find the conjunction’s semantics obvious. If I claim that both Φ and Ψ are true, then, on normal usage, if Φ is false, or Ψ is false, or both are false, I have spoken falsely.

    Consider an example. Suppose your employer says, “After one year of employment you will get a raise and two weeks vacation”. A year passes. Suppose now that this employer gives you a raise but no vacation, or a vacation but no raise, or neither a raise nor a vacation. In each case, the employer has broken his promise. The sentence forming the promise turned out to be false.

    Thus, the semantics for the conjunction are given with the following truth table. For any sentences Φ and Ψ:

    Φ Ψ (Φ^Ψ)
    T T T
    T F F
    F T F
    F F F
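
    For readers who like to experiment, here is a minimal Python sketch (it is not part of the original text, and the name conj is my own) that generates the four rows of this table.

        from itertools import product

        def conj(phi, psi):
            # (phi ^ psi) is true exactly when both conjuncts are true
            return phi and psi

        # print one row for each possible assignment of truth values
        for phi, psi in product([True, False], repeat=2):
            print(phi, psi, conj(phi, psi))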

    5.2 Alternative phrasings, and a different “and”

    We have noted that in English, “but” is an alternative to “and”, and can be translated the same way in our propositional logic. There are other phrases that have a similar meaning: they are best translated by conjunctions, but they convey (in English) a sense of surprise or failure of expectations. For example, consider the following sentence.

    Even though they lost the battle, they won the war.

    Here “even though” seems to do the same work as “but”. The implication is that it is surprising—that one might expect that if they lost the battle then they lost the war. But, as we already noted, we will not capture expectations with our logic. So, we would take this sentence to be sufficiently equivalent to:

    They lost the battle and they won the war.

    Apart from “but”, there seems to be no other single word in English that serves as an alternative to “and” with the same meaning. However, there are many ways that one can imply a conjunction. To see this, consider the following sentences.

    Tom, who won the race, also won the championship.

    The star Phosphorus, which we see in the morning, is the Evening Star.

    The Evening Star, which is called “Hesperus”, is also the Morning Star.

    While Steve is tall, Tom is not.

    Dogs are vertebrate terrestrial mammals.

    Depending on what elements we take as basic in our language, these sentences all include implied conjunctions. They are equivalent to the following sentences, for example:

    Tom won the race and Tom won the championship.

    Phosphorus is the star that we see in the morning and Phosphorus is the Evening Star.

    The Evening Star is called “Hesperus” and the Evening Star is the Morning Star.

    Steve is tall and it is not the case that Tom is tall.

    Dogs are vertebrates and dogs are terrestrial and dogs are mammals.

    Thus, we need to be sensitive to complex sentences that are conjunctions but that do not use “and” or “but” or phrases like “even though”.

    Unfortunately, in English there are some uses of “and” that are not conjunctions. The same is true for equivalent terms in some other natural languages. Here is an example.

    Rochester is between Buffalo and Albany.

    The “and” in this sentence is not a conjunction. To see this, note that this sentence is not equivalent to the following:

    Rochester is between Buffalo and Rochester is between Albany.

    That sentence is not even well formed. What is happening in the original sentence?

    The issue here is that “is between” is what we call a “predicate”. We will learn about predicates in chapter 11, but what we can say here is that some predicates take several names in order to form a sentence. In English, if a predicate takes more than two names, then we typically use the “and” to combine names that are being described by that predicate. In contrast, the conjunction in our propositional logic only combines sentences. So, we must say that there are some uses of the English “and” that are not equivalent to our conjunction.

    This could be confusing because sometimes in English we put “and” between names and there is an implied conjunction. Consider:

    Steve is older than Joe and Karen.

    Superficially, this looks to have the same structure as “Rochester is between Buffalo and Albany”. But this sentence really is equivalent to:

    Steve is older than Joe and Steve is older than Karen.

    The difference, however, is that there must be three things in order for one to be between the other two. There need only be two things for one to be older than the other. So, in the sentence “Rochester is between Buffalo and Albany”, we need all three names (“Rochester”, “Buffalo”, and “Albany”) to make a single proper atomic sentence with “between”. This tells us that the “and” is just being used to combine these names, and not to combine implied sentences (since there can be no implied sentence about what is “between”, using just two or just one of these names).

    That sounds complex. Do not despair, however. The use of “and” to combine names governed by a single predicate is less common than its use as a conjunction. Also, after we discuss predicates in chapter 11, and after you have practiced translating different kinds of sentences, the distinction between these uses of “and” will become easy to identify in almost all cases. In the meantime, we shall pick examples that do not invite this confusion.

    5.3 Inference rules for conjunctions

    Looking at the truth table for the conjunction should tell us two things very clearly. First, if a conjunction is true, what else must be true? The obvious answer is that both of the parts, the conjuncts, must be true. We can introduce a rule to capture this insight. In fact, we can introduce two rules and call them by the same name, since the order of conjuncts does not affect their truth value. These rules are often called “simplification”.

    (Φ^Ψ)

    _____

    Φ

    And:

    (Φ^Ψ)

    _____

    Ψ

    In other words, if (Φ^Ψ) is true, then Φ must be true; and if (Φ^Ψ) is true, then Ψ must be true.

    We can also introduce a rule to show a conjunction, based on what we see from the truth table. That is, it is clear that there is only one kind of condition in which (Φ^Ψ) is true, and that is when Φ is true and when Ψ is true. This suggests the following rule:

    Φ

    Ψ

    _____

    (Φ^Ψ)

    We might call this rule “conjunction”, but to avoid confusion with the name of the sentences, we will call this rule “adjunction”.
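
    As a rough illustration, here is a short Python sketch of how these two rules might be represented; the tuple encoding ('and', left, right) and the function names are my own choices, not part of the text.

        def simplification(sentence):
            # from ('and', phi, psi) we may conclude phi, and we may conclude psi
            op, left, right = sentence
            assert op == 'and'
            return left, right

        def adjunction(phi, psi):
            # from phi and from psi (on separate lines), we may conclude (phi ^ psi)
            return ('and', phi, psi)

        print(simplification(('and', 'P', 'Q')))   # ('P', 'Q')
        print(adjunction('P', 'Q'))                # ('and', 'P', 'Q')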

    5.4 Reasoning with conjunctions

    It would be helpful to consider some examples of reasoning with conjunctions. Let’s begin with an argument in a natural language.

    Tom and Steve will go to London. If Steve goes to London, then he will ride the Eye. Tom will ride the Eye too, provided that he goes to London. So, both Steve and Tom will ride the Eye.

    We need a translation key.

    T: Tom will go to London.

    S: Steve will go to London.

    U: Steve will ride the Eye.

    V: Tom will ride the Eye.

    Thus our argument is:

    (T^S)

    (S→U)

    (T→V)

    _____

    (V^U)

    Our direct proof will look like this.

    \[ \fitchprf{\pline[1.] {(T \land S)} [premise]\\ \pline[2.]{(S \lif U)} [premise]\\ \pline[3.]{(T \lif V)} [premise]\\ } { \pline[4.]{T}[simplification, 1]\\ \pline[5.]{V}[modus ponens, 3, 4]\\ \pline[6.]{S}[simplification, 1]\\ \pline[7.]{U}[modus ponens, 2, 6]\\ \pline[8.]{(V \land U)}[adjunction, 5, 7] } \]

    Now an example using just our logical language. Consider the following argument.

    (Q→¬S)

    (P→(Q^R))

    (T→¬R)

    P

    _____

    (¬S^¬T)

    Here is one possible proof.

    \[ \fitchprf{\pline[1.] {(Q \lif \lnot S)} [premise]\\ \pline[2.]{(P \lif (Q \land R))} [premise]\\ \pline[3.]{(T \lif \lnot R)} [premise]\\ \pline[4.]{P}[premise] } { \pline[5.]{(Q \land R)}[modus ponens, 2, 4]\\ \pline[6.]{Q}[simplification, 5]\\ \pline[7.]{\lnot S}[modus ponens, 1, 6]\\ \pline[8.]{R}[simplification, 5]\\ \pline[9.]{\lnot \lnot R}[double negation, 8]\\ \pline[10.]{\lnot T}[modus tollens, 3, 9]\\ \pline[11.]{(\lnot S \land \lnot T)}[adjunction, 7, 10] } \]

    5.5 Alternative symbolizations for the conjunction

    Alternative notations for the conjunction include the symbol “&” and the symbol “∙”. Thus, the expression (P^Q) would be written in these different styles as:

    (P&Q)

    (P∙Q)

    5.6 Complex sentences

    Now that we have three different connectives, this is a convenient time to consider complex sentences. The example that we just considered required us to symbolize complex sentences, which use several different kinds of connectives. We want to avoid confusion by being clear about the nature of these sentences. We also want to be able to understand when such sentences are true and when they are false. These two goals are closely related.

    Consider the following sentences.

    ¬(P→Q)

    (¬P→Q)

    (¬P→¬Q)

    We want to understand what kinds of sentences these are, and also when they are true and when they are false. (Sometimes people wrongly assume that there is some simple distribution law for negation and conditionals, so there is some additional value to reviewing these particular examples.) The first task is to determine what kinds of sentences these are. If the first symbol of your expression is a negation, then you know the sentence is a negation. The first sentence above is a negation. If the first symbol of your expression is a parenthesis, then for our logical language we know that we are dealing with a connective that combines two sentences.

    The way to proceed is to match parentheses. Generally people are able to do this by eye, but if you are not, you can use the following rule. Moving left to right, the first “)” that you encounter always matches the most recent “(” before it. These paired parentheses bound a sentence that must have two parts combined with a connective. You can identify the two parts because each will be an atomic sentence, a negation sentence, or a more complex sentence bound with parentheses, on each side of the connective.

    In our propositional logic, each set of paired parentheses forms a sentence of its own. So, when we encounter a sentence that begins with a parenthesis, we find that if we match the other parentheses, we will ultimately end up with two sentences as constituents, one on each side of a single connective. The connective that combines these two parts is called the “main connective”, and it tells us what kind of sentence this is. Thus, above we have examples of a negation, a conditional, and a conditional.
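
    The parenthesis-matching idea can be made mechanical. Here is a small Python sketch (my own helper, not the book’s) that finds the main connective of a sentence written as in this chapter by tracking how deeply nested each symbol is.

        def main_connective(s):
            # assumes the chapter's notation: ¬ for negation, ^ and → for binary
            # connectives, with outer parentheses around every binary sentence
            s = s.replace(' ', '')
            if s[0] == '¬':
                return '¬'                    # a negation sentence
            depth = 0
            for ch in s:
                if ch == '(':
                    depth += 1
                elif ch == ')':
                    depth -= 1
                elif ch in '^→' and depth == 1:
                    return ch                 # the binary connective at the top level
            return None                       # an atomic sentence

        print(main_connective('¬(P→Q)'))            # ¬  (a negation)
        print(main_connective('(¬P→Q)'))            # →  (a conditional)
        print(main_connective('((P^¬Q)→¬(P→Q))'))   # →  (a conditional)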

    How should we understand the meaning of these sentences? Here we can use truth tables in a new, third way (along with defining a connective and checking arguments). Our method will be this.

    First, write out the sentence on the right, leaving plenty of room. Identify what kind of sentence this is. If it is a negation sentence, you should add just to the left a column for the non-negated sentence. This is because the truth table defining negation tells us what a negated sentence means in relation to the non-negated sentence that forms the sentence. If the sentence is a conditional, make two columns to the left, one for the antecedent and one for the consequent. If the sentence is a conjunction, make two columns to the left, one for each conjunct. Here again, we do this because the semantic definitions of these connectives tell us what the truth value of the sentence is, as a function of the truth value of its two parts. Continue this process until the parts would be atomic sentences. Then, we stipulate all possible truth values for the atomic sentences. Once we have done this, we can fill out the truth table, working left to right.

    Let’s try it for ¬(P→Q). We write it to the right.

    ¬(P→Q)

    This is a negation sentence, so we write to the left the sentence being negated.

    (P→Q) ¬(P→Q)
       
       
       
       

    This sentence is a conditional. Its two parts are atomic sentences. We put these to the left of the dividing line, and we stipulate all possible combinations of truth values for these atomic sentences.

    P Q (P→Q) ¬(P→Q)
    T T    
    T F    
    F T    
    F F    

    Now, we can fill out each column, moving left to right. We have stipulated the values for P and Q, so we can identify the possible truth values of (P→Q). The semantic definition for “→” tells us how to do that, given that we know for each row the truth value of its parts.

    P Q (P→Q) ¬(P→Q)
    T T T  
    T F F  
    F T T  
    F F T  

    This column now allows us to fill in the last column. The sentence in the last column is a negation of (P→Q), so the definition of “¬” tells us that ¬(P→Q) is true when (P→Q) is false, and ¬(P→Q) is false when (P→Q) is true.

    P Q (P→Q) ¬(P→Q)
    T T T F
    T F F T
    F T T F
    F F T F

    This truth table tells us what ¬(P→Q) means in our propositional logic. Namely, if we assert ¬(P→Q) we are asserting that P is true and Q is false.
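
    The same procedure can be automated. The following Python sketch (my own, reading “→” as the conditional defined earlier) reproduces the completed table for ¬(P→Q).

        from itertools import product

        def implies(p, q):
            # the conditional: false only when p is true and q is false
            return (not p) or q

        print('P', 'Q', '(P→Q)', '¬(P→Q)')
        for p, q in product([True, False], repeat=2):
            cond = implies(p, q)
            print(p, q, cond, not cond)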

    We can make similar truth tables for the other sentences.

    P Q ¬P (¬P→Q)
    T T F T
    T F F T
    F T T T
    F F T F

    How did we make this table? The sentence (¬P→Q) is a conditional with two parts, ¬P and Q. Because Q is atomic, it will be on the left side. We make a column for ¬P. The sentence ¬P is a negation of P, which is atomic, so we put P also on the left. We fill in the columns, going left to right, using our definitions of the connectives.

    And:

    P Q ¬P ¬Q (¬P→¬Q)
    T T F F T
    T F F T T
    F T T F F
    F F T T T

    Such a truth table is very helpful in determining when sentences are, and are not, equivalent. We have used the concept of equivalence repeatedly, but have not yet defined it. We can offer a semantic, and a syntactic, explanation of equivalence. The semantic notion is relevant here: we say two sentences Φ and Ψ are “equivalent” or “logically equivalent” when they must have the same truth value. (For the syntactic concept of equivalence, see section 9.2.) These truth tables show that these three sentences are not equivalent, because it is not the case that they must have the same truth value. For example, if P and Q are both true, then ¬(P→Q) is false but (¬P→Q) is true and (¬P→¬Q) is true. If P is false and Q is true, then (¬P→Q) is true but (¬P→¬Q) is false. Thus, for each pair of these sentences, there is a kind of situation in which one is true and the other is false. No two of them are equivalent.
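
    To illustrate the semantic notion, here is a short Python sketch (my own encoding: each sentence is written as a function of its atomic sentences) that checks equivalence by comparing every row of the truth table; it confirms that no two of the three sentences above are equivalent.

        from itertools import product

        def equivalent(f, g, n_atoms=2):
            # f and g are equivalent when they agree on every assignment of truth values
            return all(f(*row) == g(*row)
                       for row in product([True, False], repeat=n_atoms))

        implies = lambda p, q: (not p) or q
        s1 = lambda p, q: not implies(p, q)        # ¬(P→Q)
        s2 = lambda p, q: implies(not p, q)        # (¬P→Q)
        s3 = lambda p, q: implies(not p, not q)    # (¬P→¬Q)

        print(equivalent(s1, s2), equivalent(s1, s3), equivalent(s2, s3))   # False False False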

    We should consider an example that uses conjunction, and which can help in some translations. How should we translate “Not both Steve and Tom will go to Berlin”? This sentence tells us that it is not the case that both Steve will go to Berlin and Tom will go to Berlin. The sentence does allow, however, that one of them will go to Berlin. Thus, let U mean Steve will go to Berlin and V mean Tom will go to Berlin. Then we should translate this sentence as ¬(U^V). We should not translate it as (¬U^¬V). To see why, consider their truth tables.

    U V (U^V) ¬(U^V) ¬U ¬V (¬U^¬V)
    T T T F F F F
    T F F T F T F
    F T F T T F F
    F F F T T T T

    We can see that ¬(U^V) and (¬U^¬V) are not equivalent. Also, note the following. Both ¬(U^V) and (¬U^¬V) are true if Steve does not go to Berlin and Tom does not go to Berlin. This is captured in the last row of this truth table, and this is consistent with the meaning of the English sentence. But, now note: it is true that not both Steve and Tom will go to Berlin, if Steve goes and Tom does not. This is captured in the second row of this truth table. It is true that not both Steve and Tom will go to Berlin, if Steve does not go but Tom does. This is captured in the third row of this truth table. In both kinds of cases (in both rows of the truth table), ¬(U^V) is true but (¬U^¬V) is false. Thus, we can see that ¬(U^V) is the correct translation of “Not both Steve and Tom will go to Berlin”.

    Let’s consider a more complex sentence that uses all of our connectives so far: ((P^¬Q)→¬(P→Q)). This sentence is a conditional. The antecedent is a conjunction. The consequent is a negation. Here is the truth table, completed.

    P Q ¬Q (P→Q) (P^¬Q) ¬(P→Q) ((P^¬Q)→¬(P→Q))
    T T F T F F T
    T F T F T T T
    F T F T F F T
    F F T T F F T

    This sentence has an interesting property: it cannot be false. That is not surprising, once we think about what it says. In English, the sentence says: If P is true and Q is false, then it is not the case that P implies Q. That must be true: if it were the case that P implies Q, then if P is true then Q is true. But the antecedent says P is true and Q is false.

    Sentences of the propositional logic that must be true are called “tautologies”. We will discuss them at length in later chapters.
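
    As a quick check on the claim above, here is a small Python sketch (again my own encoding, not the text’s) that tests whether a sentence is a tautology by asking whether it is true on every row of its truth table.

        from itertools import product

        def tautology(f, n_atoms=2):
            # a tautology is true on every assignment of truth values
            return all(f(*row) for row in product([True, False], repeat=n_atoms))

        implies = lambda p, q: (not p) or q
        sentence = lambda p, q: implies(p and not q, not implies(p, q))   # ((P^¬Q)→¬(P→Q))
        print(tautology(sentence))   # True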

    Finally, note that we can combine this method for finding the truth conditions for a complex sentence with our method for determining whether an argument is valid using a truth table. We will need to do this if any of our premises or the conclusion are complex. Here is an example. We’ll start with an argument in English:

    If whales are mammals, then they have vestigial limbs. If whales are mammals, then they have a quadrupedal ancestor. Therefore, if whales are mammals then they have a quadrupedal ancestor and they have vestigial limbs.

    We need a translation key.

    P: Whales are mammals.

    Q: Whales have vestigial limbs.

    R: Whales have a quadrupedal ancestor.

    The argument will then be symbolized as:

    (P→Q)

    (P→R)

    ____

    (P→(R^Q))

    Here is a semantic check of the argument.

          premise premise   conclusion
    P Q R (P→Q) (P→R) (R^Q) (P→(R^Q))
    T T T T T T T
    T T F T F F F
    T F T F T F F
    T F F F F F F
    F T T T T T T
    F T F T T F T
    F F T T T F T
    F F F T T F T

    The rows in which the premises are all true are the first row and the last four rows. Note that for these, the conclusion is true. Thus, in any kind of situation in which all the premises are true, the conclusion is true. This is equivalent, we have noted, to our definition of valid: necessarily, if all the premises are true, the conclusion is true. So this is a valid argument. The third column of the analyzed sentences (the column for (R^Q)) is there so that we can identify when the conclusion is true. The conclusion is a conditional, and we needed to know, for each kind of situation, whether its antecedent P, and its consequent (R^Q), are true. The third column tells us the situations in which the consequent is true. The stipulations on the left tell us in what kind of situation the antecedent P is true.
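
    This semantic check can also be written out as a short Python sketch (my own encoding of the argument, using the same helper style as the earlier sketches): an argument is valid when no row of the truth table makes all the premises true and the conclusion false.

        from itertools import product

        implies = lambda p, q: (not p) or q

        def valid(premises, conclusion, n_atoms):
            # invalid iff some row makes every premise true and the conclusion false
            for row in product([True, False], repeat=n_atoms):
                if all(prem(*row) for prem in premises) and not conclusion(*row):
                    return False
            return True

        premises = [lambda p, q, r: implies(p, q),       # (P→Q)
                    lambda p, q, r: implies(p, r)]       # (P→R)
        conclusion = lambda p, q, r: implies(p, r and q)  # (P→(R^Q))
        print(valid(premises, conclusion, 3))   # True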

    5.7 Problems

    1. Translate the following sentences into our logical language. You will need to create your own key to do so.
      1. Ulysses, who is crafty, is from Ithaca.
      2. If Ulysses outsmarts both Circe and the Cyclops, then he can go home.
      3. Ulysses can go home only if he isn’t from Troy.
      4. Ulysses is from Ithaca but not from Troy.
      5. Ulysses is not both crafty and from Ithaca.
    2. Prove the following arguments are valid, using a direct derivation.
      1. Premise: ((P→Q) ^ ¬Q). Conclusion: ¬P.
      2. Premises: ((P→Q) ^ (R→S)), (¬Q ^ ¬S). Conclusion: (¬P ^ ¬R).
      3. Premises: ((R ^ S) → T), (Q ^ ¬T). Conclusion: ¬(R ^ S).
      4. Premises: (P → (R → S)), (R ^ P). Conclusion: S.
      5. Premises: (P → (R → S)), (¬S ^ P). Conclusion: ¬R.
    3. Make truth tables for the following complex sentences. Identify which are tautologies.
      1. (((P→Q) ^ ¬Q) → ¬P)
      2. ¬(P ^ Q)
      3. ¬(¬P → ¬Q)
      4. (P ^ ¬P)
      5. ¬(P ^ ¬P)
    4. Make truth tables to show when the following sentences are true and when they are false. State which of these sentences are equivalent.
      1. ¬(P^Q)
      2. (¬P^¬Q)
      3. ¬(P → Q)
      4. (P^¬Q)
      5. (¬P^Q)
      6. ¬(¬P → ¬Q)
    5. Write a valid argument in normal colloquial English with at least two premises, one of which is a conjunction or includes a conjunction. Your argument should just be a paragraph (not an ordered list of sentences or anything else that looks like formal logic). Translate the argument into propositional logic. Prove it is valid.
    6. Write a valid argument in normal colloquial English with at least three premises, one of which is a conjunction or includes a conjunction and one of which is a conditional or includes a conditional. Translate the argument into propositional logic. Prove it is valid.
    7. Make your own key to translate the following argument into our propositional logic. Translate only the parts in bold. Prove the argument is valid.

    “I suspect Dr. Kronecker of the crime of stealing Cantor’s book,” Inspector Tarski said. His assistant, Mr. Carroll, waited patiently for his reasoning. “For,” Tarski said, “The thief left cigarette ashes on the table. The thief also did not wear shoes, but slipped silently into the room. Thus, If Dr. Kronecker smokes and is in his stocking feet, then he most likely stole Cantor’s book.” At this point, Tarski pointed at Kronecker’s feet. “Dr. Kronecker is in his stocking feet.” Tarski reached forward and pulled from Kronecker’s pocket a gold cigarette case. “And Kronecker smokes.” Mr. Carroll nodded sagely, “Your conclusion is obvious: Dr. Kronecker most likely stole Cantor’s book.”


    This page titled 1.5: “And” is shared under a CC BY-NC-SA 4.0 license and was authored, remixed, and/or curated by Craig DeLancey (OpenSUNY) via source content that was edited to the style and standards of the LibreTexts platform.
