Boolean algebra (logic)


 *  For other uses, see: Boolean algebra (structure) for semantic aspects, namely the algebraic structures satisfying those laws; binary arithmetic for discussion of the use of binary numbers in computer systems; Boolean satisfiability problem for the NP-complete problem of deciding the satisfiability of Boolean formulas.

Boolean algebra (or Boolean logic) is a logical calculus of truth values, developed by George Boole. It resembles the algebra of real numbers as taught in high school, but with the numeric operations of multiplication xy, addition x + y, and negation −x replaced by the logical operations of conjunction x∧y, disjunction x∨y, and complement ¬x. The Boolean operations are these and all other operations obtainable from them by composition; equivalently, the finitary operations on the set {0,1}. The laws of Boolean algebra can be defined axiomatically as the equations derivable from a sufficient finite subset of those laws, such as the equations axiomatizing a complemented distributive lattice or a Boolean ring, or semantically as those equations identically true or valid over {0,1}. The axiomatic approach is sound and complete in the sense that it proves respectively neither more nor fewer laws than the validity-based semantic approach.

Conventions
Boolean algebra is the algebra of two values. These are usually taken to be 0 and 1, as we shall do here, although ⊥ and ⊤, false and true, etc. are also in common use. For the purpose of understanding Boolean algebra any Boolean domain of two values will do.

Regardless of nomenclature, the values are customarily thought of as essentially logical in character and are therefore referred to as truth values, in contrast to the natural numbers or the reals which are considered numerical values. On the other hand the algebra of the integers modulo 2, while ostensibly just as numeric as the integers themselves, was shown to constitute exactly Boolean algebra, first by I. I. Zhegalkin in 1927 and rediscovered independently in the West by Marshall Stone in 1936. So in fact there is some ambiguity in the true nature of Boolean algebra: it can be viewed as either logical or numeric in character.

More generally Boolean algebra is the algebra of values from any Boolean algebra as a model of the laws of Boolean algebra. For example the bit vectors of a given length, as with say 32-bit computer words, can be combined with Boolean operations in the same way as individual bits, thereby forming a 2^32-element Boolean algebra under those operations. Any such combination applies the same Boolean operation to all bits simultaneously. This passage from the Boolean algebra of 0 and 1 to these more general Boolean algebras is the Boolean counterpart of the passage from the algebra of the ring of integers to the algebra of commutative rings in general. The two-element Boolean algebra is the prototypical Boolean algebra in the same sense as the ring of integers is the prototypical commutative ring. Boolean logic as the subject matter of this article is independent of the choice of Boolean algebra (the same equations hold of every nontrivial Boolean algebra) whence there is no need here to consider any Boolean algebra other than the two-element one. The article on Boolean algebra (structure) treats Boolean algebras themselves.

Applications
Boolean algebra as the calculus of two values is fundamental to digital logic, computer programming, and mathematical logic, and is also used in other areas of mathematics such as set theory and statistics.

Digital logic codes its symbols in various ways: as voltages on wires in high-speed circuits and capacitive storage devices, as orientations of a magnetic domain in ferromagnetic storage devices, as holes in punched cards or paper tape, and so on. Now it is possible to code more than two symbols in any given medium. For example one might use respectively 0, 1, 2, and 3 volts to code a four-symbol alphabet on a wire, or holes of different sizes in a punched card. In practice however the tight constraints of high speed, small size, and low power combine to make noise a major factor. This makes it hard to distinguish between symbols when there are many of them at a single site. Rather than attempting to distinguish between four voltages on one wire, digital designers have settled on two voltages per wire, high and low. To obtain four symbols one uses two wires, and so on.

Programmers programming in machine code, assembly language, and other programming languages that expose the low-level digital structure of the data registers operate on whatever symbols were chosen for the hardware, invariably bit vectors in modern computers for the above reasons. Such languages support both the numeric operations of addition, multiplication, etc. performed on words interpreted as integers, as well as the logical operations of disjunction, conjunction, etc. performed bit-wise on words interpreted as bit vectors. Programmers therefore have the option of working in and applying the laws of either numeric algebra or Boolean algebra as needed. A core differentiating feature is carry propagation with the former but not the latter.
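The contrast can be seen in a few lines of Python (the word values here are arbitrary illustrations): bitwise operations act on each bit position independently, while numeric addition propagates carries between positions.

```python
# Arbitrary illustrative word values.
a, b = 0b1100, 0b1010

# Bitwise (Boolean) operations treat each bit position independently.
conj = a & b        # 0b1000: bit-wise conjunction
disj = a | b        # 0b1110: bit-wise disjunction

# Numeric addition lets a carry ripple from one position to the next.
total = a + b       # 0b10110: the carry out of bit 3 creates bit 4

print(bin(conj), bin(disj), bin(total))
```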

Other areas where two values is a good choice are the law and mathematics. In everyday relaxed conversation, nuanced or complex answers such as "maybe" or "only on the weekend" are acceptable. In more focused situations such as a court of law or theorem-based mathematics however it is deemed advantageous to frame questions so as to admit a simple yes-or-no answer---is the defendant guilty or not guilty, is the proposition true or false---and to disallow any other answer. However much of a straitjacket this might prove in practice for the respondent, the principle of the simple yes-no question has become a central feature of both judicial and mathematical logic, making two-valued logic deserving of organization and study in its own right.

A central concept of set theory is membership. Now an organization may permit multiple degrees of membership, such as novice, associate, and full. With sets however an element is either in or out. The candidates for membership in a set work just like the wires in a digital computer: each candidate is either a member or a nonmember, just as each wire is either high or low.

Algebra being a fundamental tool in any area amenable to mathematical treatment, these considerations combine to make the algebra of two values of fundamental importance to computer hardware, mathematical logic, and set theory. It has not featured so prominently in law however, perhaps because mathematical methods in general have not been applied as vigorously there as in these other application areas.

Basic operations
After values, the next ingredient of any algebraic system is its operations. Whereas elementary algebra is based on numeric operations multiplication xy, addition x + y, and negation −x, Boolean algebra is customarily based on logical counterparts to those operations, namely conjunction x∧y (AND), disjunction x∨y (OR), and complement or negation ¬x (NOT).

Conjunction is the closest of these three to its numerical counterpart; in fact on 0 and 1 it is multiplication. As a logical operation the conjunction of two propositions is true when both propositions are true, and otherwise is false. The first column of Figure 1 below tabulates the values of x∧y for the four possible valuations of x and y; such a tabulation is traditionally called a truth table.

Disjunction, in the second column of the figures, works almost like addition, with one exception: the disjunction of 1 and 1 is neither 2 nor 0 but 1. Thus the disjunction of two propositions is false when both propositions are false, and otherwise is true. This is just the definition of conjunction with true and false interchanged everywhere; because of this we say that disjunction is the dual of conjunction.

Logical negation however does not work like numerical negation at all. Instead it corresponds to incrementation: ¬x = x+1 mod 2. Yet it shares in common with numerical negation the property that applying it twice returns the original value: ¬¬x = x, just as −(−x) = x. An operation with this property is called an involution. The set {0,1} has two permutations, both involutory, namely the identity, no movement, corresponding to numerical negation mod 2 (since +1 = −1 mod 2), and SWAP, corresponding to logical negation. Using negation we can formalize the notion that conjunction is dual to disjunction via De Morgan's laws, ¬(x∧y) = ¬x ∨ ¬y and ¬(x∨y) = ¬x ∧ ¬y. These can also be construed as definitions of conjunction in terms of disjunction and vice versa: x∧y = ¬(¬x ∨ ¬y) and x∨y = ¬(¬x ∧ ¬y).
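These laws are easy to check mechanically. The following Python sketch encodes the three basic operations on {0,1} and verifies the involution property and both De Morgan's laws at every valuation:

```python
from itertools import product

# Boolean operations on the two-element domain {0, 1}.
NOT = lambda x: 1 - x
AND = lambda x, y: x & y
OR  = lambda x, y: x | y

# Involution: applying negation twice returns the original value.
assert all(NOT(NOT(x)) == x for x in (0, 1))

# De Morgan's laws, checked at all four valuations of (x, y).
for x, y in product((0, 1), repeat=2):
    assert NOT(AND(x, y)) == OR(NOT(x), NOT(y))
    assert NOT(OR(x, y)) == AND(NOT(x), NOT(y))
print("De Morgan's laws hold on {0,1}")
```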



Figure 2 shows the symbols used in digital electronics for conjunction and disjunction; the input ports are on the left and the signals flow through to the output port on the right. Inverters negating the input signals on the way in, or the output signals on the way out, are represented as circles on the port to be inverted.

Derived operations
Other Boolean operations are derivable from these by composition. For example implication x→y (IMP), in the third column of the figures, is a binary operation which is false when x is true and y is false, and true otherwise. It can be expressed as x→y = ¬x∨y (the OR gate of Figure 2 with the x input inverted), or equivalently ¬(x∧¬y) (its De Morgan equivalent in Figure 3). In logic this operation is called material implication, to distinguish it from related but non-Boolean logical concepts such as entailment and relevant implication. The idea is that an implication x→y is by default true (the weaker truth value in the sense that false implies true but not vice versa) unless its premise or antecedent x holds, in which case the truth of the implication is that of its conclusion or consequent y.

Although disjunction is not the exact counterpart of numerical addition, Boolean algebra nonetheless does have an exact counterpart, called exclusive-or (XOR) or parity, x⊕y. As shown in the fourth column of the figures, the exclusive-or of two propositions is true just when exactly one of the propositions is true; equivalently when an odd number of the propositions is true, whence the name "parity". Exclusive-or is the operation of addition mod 2. The exclusive-or of any value with itself vanishes, x⊕x = 0, since the arguments have an even number of whatever value x has. Its digital electronics symbol is shown in Figure 2, being a hybrid of the disjunction symbol and the equality symbol. The latter reflects the fact that the negation (which is also the dual) of XOR, ¬(x⊕y), is logical equivalence, EQV, being true just when x and y are equal, either both true or both false. XOR and EQV are the only binary Boolean operations that are commutative and whose truth tables have equally many 0s and 1s. Exclusive-or together with conjunction constitutes yet another complete basis for Boolean algebra, with the Boolean operations reformulated as Zhegalkin polynomials.
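A short Python check confirms these identities on {0,1}, including the Zhegalkin-style identity expressing disjunction in terms of XOR and conjunction, x∨y = x⊕y⊕(x∧y):

```python
from itertools import product

for x, y in product((0, 1), repeat=2):
    assert (x ^ y) == (x + y) % 2       # XOR is addition mod 2
    assert (x ^ x) == 0                 # x XOR x vanishes
    assert 1 - (x ^ y) == int(x == y)   # EQV is the negation of XOR
    assert (x | y) == x ^ y ^ (x & y)   # disjunction as a Zhegalkin polynomial
print("all identities verified")
```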

Another example is the Sheffer stroke, x|y, the NAND gate in digital electronics, which is false when both arguments are true, and true otherwise. NAND is definable by composition of negation with conjunction as x|y = ¬(x∧y). It does not have its own schematic symbol, being easily represented as an AND gate with an inverted output. Unlike conjunction and disjunction, NAND is a binary operation that can be used to obtain negation, via the definition ¬x = x|x. With negation in hand one can then in turn define conjunction in terms of NAND via x∧y = ¬(x|y), from which all other Boolean operations of nonzero arity can then be obtained. NOR, ¬(x∨y), as the evident dual of NAND serves this purpose equally well. This universal character of NAND and NOR makes them a popular choice for gate arrays, integrated circuits with multiple general-purpose gates.
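The universality of NAND can be demonstrated directly. The sketch below (Python, with hypothetical function names) defines negation, conjunction, and disjunction purely by composition of the Sheffer stroke, then checks them against the built-in bitwise operations:

```python
from itertools import product

def nand(x, y):        # Sheffer stroke: false only when both inputs are true
    return 1 - (x & y)

def not_(x):           # negation: x|x
    return nand(x, x)

def and_(x, y):        # conjunction: the negation of x|y
    return not_(nand(x, y))

def or_(x, y):         # disjunction, by De Morgan: nand of the negations
    return nand(not_(x), not_(y))

for x, y in product((0, 1), repeat=2):
    assert not_(x) == 1 - x
    assert and_(x, y) == (x & y)
    assert or_(x, y) == (x | y)
print("NAND generates NOT, AND, and OR")
```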

The above-mentioned duality of conjunction and disjunction is exposed further by De Morgan's laws, ¬(x∧y) = ¬x∨¬y and ¬(x∨y) = ¬x∧¬y. Figure 3 illustrates De Morgan's laws by giving for each gate its De Morgan dual, converted back to the original operation with inverters on both inputs and the output. In the case of implication, taking the form of an OR gate with an inverter already on one input, that inverter is cancelled by the second inverter that would have gone there. The De Morgan dual of XOR is just XOR with an inverter on the output (there is no separate symbol); as with implication, putting inverters on all three ports cancels the dual's output inverter. More generally, toggling an odd number of inverters on an XOR gate produces the dual gate, while toggling an even number leaves the gate's functionality unchanged.

As with all the other laws in this section, De Morgan's laws may be verified case by case for each of the 2^n possible valuations of the n variables occurring in the law, here two variables and hence 2^2 = 4 valuations. De Morgan's laws play a role in putting Boolean terms in certain normal forms, one of which we will encounter later in the section on soundness and completeness.

Figure 4 illustrates the corresponding Venn diagrams for each of the four operations presented in Figures 1-3. The interior (respectively exterior) of each circle represents the value true (respectively false) for the corresponding input, x or y. The convention followed here is to represent the true or 1 outputs as dark regions and false as light, but the reverse convention is also sometimes used.

All Boolean operations
There are infinitely many expressions that can be built from two variables using the above operations, suggesting great expressiveness. Yet a straightforward counting argument shows that only 16 distinct binary operations on two values are possible. The two arguments have four possible values between them, and there are 2^4 = 16 ways of choosing an output value for each of these four input values. The choice of one of these 16 ways completely determines the implemented operation, whence there can be only 16 binary operations. So the apparent infinite expressiveness proves to be a complete mirage.

The 16 binary Boolean operations can be organized as follows. There are two constant operations, 0 and 1, whose truth tables are featureless. There are four operations depending on only one variable, namely x, ¬x, y, and ¬y, whose truth tables amount to two juxtaposed rectangles. There are two operations with a "checkerboard" truth table, namely XOR and EQV. Four operations are obtained from disjunction with some subset of its inputs negated, namely x∨y, x→y, y→x, and x|y; their truth tables contain a single 0. The last four come from the same treatment applied to conjunction, having a single 1 in their truth tables. So 10 of the 16 operations depend on both variables; all are representable schematically as an AND gate, an OR gate, or an XOR gate, with one port optionally inverted. For the AND and OR gates the location of the inverter matters; for the XOR gate it does not, only whether the number of inverters is even or odd.
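This classification can be verified by brute force. The Python sketch below enumerates all 16 truth tables and counts how many of the corresponding operations depend on both variables:

```python
from itertools import product

vals = list(product((0, 1), repeat=2))    # the four valuations of (x, y)

# Each binary operation is one choice of output per valuation: 2^4 = 16 in all.
ops = [{v: (n >> i) & 1 for i, v in enumerate(vals)} for n in range(16)]

def depends_on_x(op):   # does flipping the x input ever change the output?
    return any(op[(0, y)] != op[(1, y)] for y in (0, 1))

def depends_on_y(op):   # does flipping the y input ever change the output?
    return any(op[(x, 0)] != op[(x, 1)] for x in (0, 1))

both = [op for op in ops if depends_on_x(op) and depends_on_y(op)]
print(len(ops), len(both))    # 16 operations, of which 10 depend on both variables
```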

Operations of other arities are possible. For example the ternary counterpart of disjunction can be obtained as (x∨y)∨z. In general an n-ary operation, one having n inputs, has 2^n possible valuations of those inputs. An operation has two possible outputs for each of these, whence there exist 2^(2^n) n-ary Boolean operations. There are therefore 2^32 = 4,294,967,296 operations with 5 inputs. Not infinite, certainly, but neither is it quite the mirage of the binary case.

Although Boolean algebra confines attention to operations that return a single bit, the concept generalizes to operations that take n bits in and return m bits instead of one bit. Digital circuit designers draw such operations as suitably shaped boxes with n wires entering on the left and m wires exiting on the right. Such multi-output operations can be understood simply as m n-ary operations. The operation count must then be raised to the m-th power, or, in the case of n inputs, (2^(2^n))^m = 2^(m·2^n) operations. The number of Boolean operations of this generalized kind with say 5 inputs and 5 outputs is 2^160, about 1.46 × 10^48. A computer module mapping 32 bits to 32 bits could be any of about 5.47 × 10^41,373,247,567 operations, more than is obtained by squaring a googol 28 times. Mere mortals are ill-equipped to distinguish such numbers from infinity in practical applications of Boolean algebra.

Applications of Boolean operations
The original application for Boolean operations was mathematical logic, where they combine the truth values, true or false, of individual formulas.

Natural languages such as English have words for several Boolean operations, in particular conjunction (and), disjunction (or), negation (not), and implication (implies); the phrase "but not" is synonymous with "and not". When used to combine situational assertions such as "the block is on the table" and "cats drink milk," which naively are either true or false, these connectives often behave like their logical counterparts. However with descriptions of behavior such as "Jim walked through the door", one starts to notice differences such as failure of commutativity: for example the conjunction of "Jim opened the door" with "Jim walked through the door" in that order is not equivalent to their conjunction in the other order, since and usually means and then in such cases. Questions can be similar: the order "Is the sky blue, and why is the sky blue?" makes more sense than the reverse order. Conjunctive commands about behavior are like behavioral assertions, as in get dressed and go to school. Disjunctive commands such as love me or leave me or fish or cut bait tend to be asymmetric via the implication that one alternative is less preferable. Conjoined nouns such as tea and milk generally describe aggregation as with set union while tea or milk is a choice. However context can reverse these senses, as in your choices are coffee and tea which usually means the same as your choices are coffee or tea (alternatives). Double negation as in "I don't not like milk" rarely means literally "I do like milk" but rather conveys some sort of hedging, as though to imply that there is a third possibility. "Not not P" can be loosely interpreted as "surely P", and although P necessarily implies "not not P" the converse is suspect in English, much as with intuitionistic logic. In view of the highly idiosyncratic usage of conjunctions in natural languages, Boolean algebra cannot be considered a reliable framework for interpreting them.

Boolean operations are used in digital logic to combine the bits carried on individual wires, thereby interpreting them over {0,1}. When n gates are used to combine bit vectors of n bits, the individual bit operations if all the same can be understood collectively as a single operation on values from a Boolean algebra with 2^n elements.

Naive set theory interprets Boolean operations as acting on subsets of a given set X. The set-theoretic counterparts of conjunction, disjunction, and negation are respectively intersection, union, and complement relative to X. The subsets of X form a Boolean algebra under these operations.

Computer displays based on raster graphics use bit blit to manipulate whole regions composed of pixels, relying on Boolean operations to specify how the source region should be combined with the destination, typically with the help of a third region called the mask. Modern video cards offer all 2^(2^3) = 256 ternary operations for this purpose, with the choice of operation being a one-byte (8-bit) parameter. Defining constants SRC = 0xaa or 10101010, DST = 0xcc or 11001100, and MSK = 0xf0 or 11110000, the operation for XOR-ing the source into the destination within the region defined by the 1 bits of the mask is, in C (programming language), the constant (SRC^DST)&MSK or 0x60. SRC^DST describes the process of XOR-ing the source and the destination, which &MSK then "masks out". The 256 operations made available to the programmer in this way are applied to three regions of memory specified by the programmer. The regions are combined directly in hardware on the video card, which requires surprisingly little electronics to decode and execute all 256 possible constants so representable. The resulting region is then sent by the hardware to the region specified by the programmer. As a particularly simple example the NOP or do-nothing operation is written simply as DST. This operation makes a copy of the data in the destination, in effect doing nothing to it (though the hardware works exactly as hard as with more complex operations), and writes it back out over the top of the destination, which is therefore left unchanged in the end.
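The encoding of these constants can be reproduced in a few lines; the sketch below (Python rather than C, for brevity) evaluates the quoted expressions on the defining bit patterns:

```python
# The defining bit patterns 10101010, 11001100, 11110000.
SRC, DST, MSK = 0xAA, 0xCC, 0xF0

masked_xor = (SRC ^ DST) & MSK   # XOR source into destination under the mask
nop = DST                        # the do-nothing operation copies the destination

print(hex(masked_xor), hex(nop))   # 0x60 0xcc
```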

Solid modeling systems for computer aided design offer a variety of methods for building objects from other objects, combination by Boolean operations being one of them. In this method the space in which objects exist is understood as a set S of voxels (the three-dimensional analogue of pixels in two-dimensional graphics) and shapes are defined as subsets of S, allowing objects to be combined as sets via union, intersection, etc. One obvious use is in building a complex shape from simple shapes simply as the union of the latter. Another use is in sculpting understood as removal of material: any grinding, milling, routing, or drilling operation that can be performed with physical machinery on physical materials can be simulated on the computer with the Boolean operation x∧¬y or x−y, which in set theory is set difference, remove the elements of y from those of x. Thus given two shapes one to be machined and the other the material to be removed, the result of machining the former to remove the latter is described simply as their set difference.
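The voxel view of set-difference machining can be sketched as follows (Python, with an arbitrary 4×4×4 block and a hypothetical "drill" shape):

```python
# Shapes as sets of voxel coordinates (sizes chosen arbitrarily).
block = {(x, y, z) for x in range(4) for y in range(4) for z in range(4)}
drill = {(1, 1, z) for z in range(4)}   # a vertical hole to be removed

# Machining is set difference: keep the voxels of the block not in the drill.
machined = block - drill
print(len(block), len(drill), len(machined))   # 64 4 60
```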

Search engine queries also employ Boolean logic. For this application, each web page on the Internet may be considered to be an "element" of a "set". The following examples use a syntax supported by Google.


 * Double quotes are used to combine whitespace-separated words into a single search term.
 * Whitespace is used to specify logical AND, as it is the default operator for joining search terms:

"Search term 1" "Search term 2"


 * The OR keyword is used for logical OR:

"Search term 1" OR "Search term 2"


 * The minus sign is used for logical NOT (AND NOT):

"Search term 1" -"Search term 2"

Axioms
With values and operations in hand, the next aspect of Boolean algebra is that of laws or properties. As with many kinds of algebra, the principal laws take the form of equations between terms built up from variables using the operations of the algebra. Such an equation is deemed a law or identity just when both sides have the same value for all values of the variables, equivalently when the two terms denote the same operation.

Now numeric algebra has such laws as commutativity of addition and multiplication, x+y = y+x and xy = yx. Likewise Boolean algebra has x∨y = y∨x expressing the commutativity of disjunction and x∧y = y∧x for conjunction. Not all binary operations are commutative however, for example subtraction in the numeric case and implication in the Boolean case.

Another equally fundamental law is associativity, which in the case of numeric multiplication is expressed as x(yz) = (xy)z, justifying abbreviating both sides to xyz and thinking of multiplication as a single ternary operation. All four of numeric addition and multiplication and logical disjunction and conjunction are associative, giving for the latter two the Boolean laws x∨(y∨z) = (x∨y)∨z and x∧(y∧z) = (x∧y)∧z.

Again numeric subtraction and logical implication serve as examples, this time of binary operations that are not associative. On the other hand exclusive-or, being just addition mod 2, is both commutative and associative.

Boolean algebra does not completely mirror numeric algebra however, as both conjunction and disjunction satisfy idempotence, expressed respectively as x∧x = x and x∨x = x. These laws are easily verified by considering the two valuations 0 and 1 for x. But since 2+2 = 2×2 = 4 in numeric algebra, clearly numeric addition and multiplication are not idempotent. With arithmetic mod 2 on the other hand, multiplication is idempotent, though not addition since 1+1 = 0 mod 2, reflected logically in the idempotence of conjunction but not of exclusive-or.

A more subtle difference between number and logic is with x(x+y) and x+xy, neither of which equal x numerically. In Boolean algebra however, both x∧(x∨y) and x∨(x∧y) are equal to x, as can be verified for each of the four possible valuations for x and y. These two Boolean laws are called the laws of absorption. These laws (both are needed) together with the associativity, commutativity, and idempotence of conjunction and disjunction constitute the defining laws or axioms of lattice theory.

Another law common to numbers and truth values is distributivity, of multiplication over addition in the one case and of conjunction over disjunction in the other. Numerically we have x(y+z) = xy+xz, whose Boolean algebra counterpart is x∧(y∨z) = (x∧y)∨(x∧z). On the other hand Boolean algebra also has distributivity of disjunction over conjunction, x∨(y∧z) = (x∨y)∧(x∨z), for which there is no numeric counterpart: consider 1 + 2×3 = 7 whereas (1+2)×(1+3) = 12. Like associativity, distributivity has three variables and so requires checking 2^3 = 8 cases.

Either distributivity law for Boolean algebra entails the other. Adding either to the axioms for lattices axiomatizes the theory of distributive lattices. That theory does not need the idempotence axioms because they follow from the six absorption, distributivity, and associativity laws.

Two Boolean laws having no numeric counterpart are the laws characterizing logical negation, namely x∧¬x = 0 and x∨¬x = 1. These are the only laws thus far that have required constants. It then follows that x∧0 = x∧(x∧¬x) = (x∧x)∧¬x = x∧¬x = 0, showing that 0 works with conjunction in logic just as it does with multiplication of numbers. Also x∨0 = x∨(x∧¬x) = x by absorption. Dualizing this reasoning, we obtain x∨1 = 1 and x∧1 = x. Alternatively we can justify these laws more directly simply by checking them for each of the two valuations of x.

The laws of lattice theory along with these first two laws for negation axiomatize the theory of complemented lattices. Including either distributivity law then axiomatizes the theory of complemented distributive lattices. The next two sections show that this theory is sufficient to axiomatize all the valid laws or identities of two-valued logic, that is, Boolean algebra. It follows that Boolean algebra as commonly defined in terms of these axioms coincides with the intuitive semantic notion of the valid identities of two-valued logic.

Derivations
While the Boolean laws enumerated in the previous section are certainly highlights of Boolean algebra, they by no means exhaust the laws, of which there are infinitely many, nor do they even exhaust the highlights. As it is out of the question to proceed in the ad hoc way of the preceding section forever, the question arises as to how best to present the remaining laws.

One way of establishing an equation as being a law is to verify its truth for all valuations of its variables, sometimes called the method of truth tables. This is the method we depended on in the previous section to justify each law as we introduced it, constituting the semantic approach to establishing laws. From a practical standpoint the method lends itself to computer implementation for 20-30 variables, because the enumeration of valuations is straightforward to program and boring to carry out, making it ideal work for a computer. Because there are 2^n valuations to check, the method starts to become impractical as 40 variables is approached. Beyond that the approach becomes of value mainly as the in-principle semantic definition of what constitutes an identically true or valid equation.
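The method of truth tables is a few lines of code. The following Python sketch checks a candidate law by enumerating all 2^n valuations, here applied to the two distributivity laws and to a non-law:

```python
from itertools import product

def is_law(lhs, rhs, nvars):
    """Semantic check: do both sides agree at all 2^n valuations?"""
    return all(lhs(*v) == rhs(*v) for v in product((0, 1), repeat=nvars))

# Both distributivity laws, 2^3 = 8 cases each.
assert is_law(lambda x, y, z: x & (y | z),
              lambda x, y, z: (x & y) | (x & z), 3)
assert is_law(lambda x, y, z: x | (y & z),
              lambda x, y, z: (x | y) & (x | z), 3)

# A non-law fails: implication is not commutative.
imp = lambda x, y: (1 - x) | y
assert not is_law(imp, lambda x, y: imp(y, x), 2)
print("distributivity verified semantically")
```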

In contrast the syntactic approach is to derive new laws by symbolic manipulation from already established laws such as those listed in the previous section. (This is not to imply that derivations of a law shorter than the length of a semantic verification of that law need exist, although some thousand-variable laws impossible to verify by enumeration of valuations can have quite short derivations.) Here is an example showing the derivation of (w∨x)∨(y∨z) = (w∨y)∨(x∨z) from just the commutativity and associativity of disjunction.

(w∨x)∨(y∨z) = ((w∨x)∨y)∨z = (w∨(x∨y))∨z = (w∨(y∨x))∨z = ((w∨y)∨x)∨z = (w∨y)∨(x∨z)

The first two and last two steps appealed to associativity while the middle step used commutativity.

The rules of derivation for forming new laws from old can be assumed to be those permissible in high school algebra. For definiteness however it is worthwhile formulating a well-defined set of rules showing exactly what is needed. These are the domain-independent rules of equational logic, as sound for logic as they are for numerical domains or any other kind.

Reflexivity. t = t. That is, any equation whose two sides are the same term t is a law. (While arguably an axiom rather than a rule since it has no premises, we classify it as a rule because like the other three rules it is domain-independent, making no mention of specific logical, numeric, or other operations.)

Symmetry. From s = t infer t = s. That is, the two sides of a law may be interchanged. Intuitively one attaches no importance to which side of an equation a term comes from.

Transitivity. A chain s = t = u of two laws yields the law s = u. (This law of "cutting out the middleman" is applied four times in the above example to eliminate the intermediate terms.)

Substitution. Given two laws and a variable, each occurrence of that variable in the first law may be replaced by one or the other side of the second law. (Distinct occurrences can be replaced by distinct sides, but every occurrence must be replaced by one or the other side.)

While the first equation in the above example might seem simply a straightforward application of the associativity law, when analyzed more carefully according to the above rules it can be seen to require something more. We can justify it in terms of the reflexivity and substitution rules. Beginning with the laws x∨(y∨z) = (x∨y)∨z and w∨x = w∨x, we use substitution to replace both occurrences of x by w∨x to arrive at the first equation. All five equations in the chain are accounted for along similar lines, with commutativity in place of associativity in the middle equation.

Soundness and completeness
It can be shown that the two approaches, semantic and syntactic, to constructing all the laws of Boolean algebra lead to the same set of laws. We say that the syntactic approach is sound when it yields a subset of the semantically obtained laws, and complete when it yields a superset thereof. We can then restate this coinciding of the semantic and syntactic approaches as the soundness and completeness of the syntactic approach with respect to (or as calibrated by) the semantic approach.

Soundness follows firstly from the fact that the initial laws or axioms we started from were all identities, that is, semantically true laws. Secondly it depends on the easily verified fact that the rules preserve identities.

Completeness can be proved by first deriving a few additional useful laws and then showing how to use the axioms and rules to prove that a term with n variables, ordered alphabetically say, is equal to its n-ary normal form, namely a unique term associated with the n-ary Boolean operation realized by that term with the variables in that order. It then follows that if two terms denote the same operation (the same thing as being semantically equal), they are both provably equal to the normal form term denoting that operation, and hence by transitivity provably equal to each other.

There is more than one suitable choice of normal form, but complete disjunctive normal form will do. A literal is either a variable or a negated variable. A disjunctive normal form (DNF) term is a disjunction of conjunctions of literals. (Associativity allows a term such as x&or;(y&or;z) to be viewed as the ternary disjunction x&or;y&or;z, likewise for longer disjunctions, and similarly for conjunction.) A DNF term is complete when every disjunct (conjunction) contains exactly one occurrence of each variable, independently of whether or not the variable is negated. Such a conjunction uniquely represents the operation it denotes by virtue of serving as a coding of those valuations at which the operation returns 1. Each conjunction codes the valuation setting the positively occurring variables to 1 and the negated ones to 0; the value of the conjunction at that valuation is 1, and hence so is the whole term. At valuations corresponding to omitted conjunctions, all conjunctions present in the term evaluate to 0 and hence so does the whole term.
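The correspondence between conjunctions and valuations makes the complete DNF term for an operation directly computable from its truth table. The sketch below (illustrative only; literals are encoded as strings, with a &not; prefix for negation) emits one conjunction per valuation at which the operation returns 1.

```python
from itertools import product

def complete_dnf(op, names):
    """Complete DNF of an n-ary Boolean operation `op`: one conjunction per
    valuation at which op returns 1, with a positive literal for each 1 bit
    and a negated literal for each 0 bit."""
    disjuncts = []
    for bits in product((0, 1), repeat=len(names)):
        if op(*bits):
            disjuncts.append(tuple(v if b else '¬' + v
                                   for v, b in zip(names, bits)))
    return disjuncts

# The operation denoted by ¬(x∨(¬y∧z)):
op = lambda x, y, z: 1 - (x | ((1 - y) & z))
print(complete_dnf(op, ['x', 'y', 'z']))
# → [('¬x', '¬y', '¬z'), ('¬x', 'y', '¬z'), ('¬x', 'y', 'z')]
```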

In outline, the general technique for converting any term to its normal form, or normalizing it, is to use De Morgan's laws to push the negations down to the variables. This yields monotone normal form, a term built from literals with conjunctions and disjunctions. For example, &not;(x &or; (&not;y&and;z)) becomes &not;x &and; &not;(&not;y&and;z) and then &not;x &and; (&not;&not;y&or;&not;z). Applying &not;&not;x = x then yields &not;x &and; (y&or;&not;z).
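This negation-pushing step can be sketched as a short recursion over the hypothetical tuple encoding used above (an illustrative assumption, not the article's notation):

```python
def nnf(term):
    """Push negations down to the variables with De Morgan's laws and ¬¬t = t.
    Terms: variables are strings, compounds are ('and', s, t), ('or', s, t),
    ('not', t)."""
    if isinstance(term, str):
        return term
    op, *args = term
    if op == 'not':
        inner = args[0]
        if isinstance(inner, str):
            return ('not', inner)                  # a negated variable: a literal
        iop, *iargs = inner
        if iop == 'not':
            return nnf(iargs[0])                   # ¬¬t = t
        dual = 'or' if iop == 'and' else 'and'     # De Morgan's laws
        return (dual, *(nnf(('not', a)) for a in iargs))
    return (op, *(nnf(a) for a in args))

# ¬(x ∨ (¬y∧z)) normalizes to ¬x ∧ (y ∨ ¬z):
print(nnf(('not', ('or', 'x', ('and', ('not', 'y'), 'z')))))
# → ('and', ('not', 'x'), ('or', 'y', ('not', 'z')))
```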

Next use distributivity of conjunction over disjunction to push all conjunctions down below all disjunctions, yielding a DNF term. This makes the above example (&not;x&and;y) &or; (&not;x&and;&not;z).
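The distribution step admits an equally short sketch: representing a DNF as a list of conjunctions (each a list of literal strings, &not;-prefixed when negated, an encoding assumed here for illustration), a conjunction of two DNFs is the pairwise union of their conjunctions.

```python
def dnf(term):
    """Distribute ∧ over ∨ in a negation normal form term, returning the DNF
    as a list of conjunctions, each a list of literal strings."""
    if isinstance(term, str):
        return [[term]]
    op, s, *rest = term
    if op == 'not':
        return [['¬' + s]]            # in NNF, s here is a variable
    t = rest[0]
    if op == 'or':
        return dnf(s) + dnf(t)
    # op == 'and': each conjunction on the left pairs with each on the right
    return [cs + ct for cs in dnf(s) for ct in dnf(t)]

# ¬x ∧ (y ∨ ¬z) distributes to (¬x∧y) ∨ (¬x∧¬z):
print(dnf(('and', ('not', 'x'), ('or', 'y', ('not', 'z')))))
# → [['¬x', 'y'], ['¬x', '¬z']]
```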

Then for each variable y, replace each conjunction x not containing y with the disjunction of two copies of x, with y conjoined to one copy of x and &not;y conjoined to the other, in the end yielding a complete DNF term. (This is one place where an auxiliary law helps, in this case x = x&and;1 = x&and;(y&or;&not;y) = (x&and;y) &or; (x&and;&not;y).) In the above example the first conjunction lacks z while the second lacks y; expanding appropriately yields the complete DNF term (&not;x&and;y&and;z) &or; (&not;x&and;y&and;&not;z) &or; (&not;x&and;&not;z&and;y) &or; (&not;x&and;&not;z&and;&not;y).
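In the list-of-conjunctions encoding assumed above, the expansion step is a loop over the variables, splitting every conjunction that lacks the current variable:

```python
def expand(conjunctions, names):
    """For each variable missing from a conjunction, split that conjunction
    into two copies, one with the variable conjoined positively and one with
    it negated (the auxiliary law x = (x∧y) ∨ (x∧¬y))."""
    for v in names:
        result = []
        for c in conjunctions:
            if v in c or '¬' + v in c:
                result.append(c)
            else:
                result.append(c + [v])
                result.append(c + ['¬' + v])
        conjunctions = result
    return conjunctions

# Expanding (¬x∧y) ∨ (¬x∧¬z) over x, y, z:
print(expand([['¬x', 'y'], ['¬x', '¬z']], ['x', 'y', 'z']))
# → [['¬x', 'y', 'z'], ['¬x', 'y', '¬z'], ['¬x', '¬z', 'y'], ['¬x', '¬z', '¬y']]
```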

Next use commutativity to put the literals in each conjunction in alphabetical order. The example becomes (&not;x&and;y&and;z) &or; (&not;x&and;y&and;&not;z) &or; (&not;x&and;y&and;&not;z) &or; (&not;x&and;&not;y&and;&not;z). This brings any repeated copies of literals next to each other; delete the redundant copies using idempotence of conjunction (a step not needed in our example).

Lastly order the disjuncts according to a suitable uniformly applied criterion. The criterion we use here is to read the positive and negative literals of a conjunction as respectively 1 and 0 bits, and to read the bits in a conjunction as a binary number. In our example the bits are 011, 010, 010, 000, or in decimal 3, 2, 2, 0. Ordering them numerically as 0, 2, 2, 3 yields (&not;x&and;&not;y&and;&not;z) &or; (&not;x&and;y&and;&not;z) &or; (&not;x&and;y&and;&not;z) &or; (&not;x&and;y&and;z). Note that these bits are exactly those valuations for x, y, and z that satisfy our original term &not;(x&or;(&not;y&and;z)). Complete DNF amounts to a canonical way of representing the truth table for the original term as another term.

Repeated conjunctions can then be deleted using idempotence of disjunction, which simplifies our example to (&not;x&and;&not;y&and;&not;z) &or; (&not;x&and;y&and;&not;z) &or; (&not;x&and;y&and;z).
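These final steps, alphabetizing each conjunction, ordering the conjunctions by their binary reading, and deleting repeats, can be sketched together in the same list-of-literal-strings encoding assumed above:

```python
def canonical(conjunctions):
    """Sort the literals of each conjunction alphabetically by variable,
    order the conjunctions by reading positive/negative literals as 1/0 bits
    of a binary number, and delete repeated conjunctions (idempotence of ∨)."""
    ordered = [sorted(c, key=lambda lit: lit.lstrip('¬')) for c in conjunctions]
    deduped = sorted(set(map(tuple, ordered)),
                     key=lambda c: [0 if lit.startswith('¬') else 1 for lit in c])
    return [list(c) for c in deduped]

# The expanded example, bits 011, 010, 010, 000, sorts and dedupes to 000, 010, 011:
print(canonical([['¬x', 'y', 'z'], ['¬x', 'y', '¬z'],
                 ['¬x', '¬z', 'y'], ['¬x', '¬z', '¬y']]))
# → [['¬x', '¬y', '¬z'], ['¬x', 'y', '¬z'], ['¬x', 'y', 'z']]
```

Note that the result lists exactly the conjunctions coding the satisfying valuations 000, 010, 011 of the original term, in numerical order, as the complete DNF construction predicts.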

In this way we have proved that the term we started with is equal to the normal form term for the operation it denotes. Hence all terms denoting that operation are provably equal to the same normal form term and hence by transitivity to each other.