The Wisdom of Bitwise Operators

CS 301 Lecture, Dr. Lawlor

So most of the usual work you do in C/C++/Java/C# manipulates integers or strings.  For example, you'll write a simple line like:
    x = y + 4;
which adds 4 to the value of y.

But the "bitwise" operators manipulate the underlying bits of the integer.  It's like the computer counts on its (32 or 64) fingers, does the operation on those bits, and then converts back to an integer (Except, of course, deep down the computer only knows about bits, not integers!).  For example, you might write:
    x = y & 4;
which masks out all but bit 2 of y.  So if y is 6, x will be 4.  But if y is 8, x will be zero.  Try it! (executable NetRun Link)
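Here's that same masking example as a complete little C program (my own minimal sketch of the idea) that you can compile and run:

    #include <stdio.h>

    int main(void) {
        int y = 6;               /* binary 110: bit 2 (value 4) is set */
        printf("%d\n", y & 4);   /* prints 4 */
        y = 8;                   /* binary 1000: bit 2 is clear */
        printf("%d\n", y & 4);   /* prints 0 */
        return 0;
    }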

Bitwise Operations in C

There are actually six bitwise operators in C/C++/Java/C#: AND ('&'), OR ('|'), XOR ('^'), NOT ('~'), left shift ('<<'), and right shift ('>>').
Here's the "truth table" for the 3 main binary bitwise operations: AND, OR, and XOR.  Again, these are accessible directly from all C-like languages using '&' (ampersand, or just "and"), '|' (pipe, or just "or"), and '^' (caret, or just "ex-or").
 A   B   A&B (AND)   A|B (OR)   A^B (XOR)
 0   0       0           0          0
 0   1       0           1          1
 1   0       0           1          1
 1   1       1           1          0
Interpretations:
    A&B (AND): Both-1; spreads 0s
    A|B (OR):  Either-1; spreads 1s
    A^B (XOR): Different; flip A if B is set
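For example, here's a quick C program (my own sketch, with arbitrary bit patterns) showing all three operations at once:

    #include <stdio.h>

    int main(void) {
        unsigned int a = 0xC; /* binary 1100 */
        unsigned int b = 0xA; /* binary 1010 */
        printf("a&b = 0x%X\n", a & b); /* 0x8: bits set in both a and b */
        printf("a|b = 0x%X\n", a | b); /* 0xE: bits set in either a or b */
        printf("a^b = 0x%X\n", a ^ b); /* 0x6: bits where a and b differ */
        return 0;
    }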

Say you're Google.  For each possible word, you store a big list of documents, with one 32-bit integer per document.  Each bit of that integer corresponds to a section of the document where the word might occur (e.g., bit 0 is set if the word occurs in the first 3% of the document, bit 1 represents the next 3%, and so on).  If you've got two search terms, for a given document you can look up two integers, A and B, that represent where those terms appear in the document.  If both terms appear near the same place in the document, that's good--and "A&B" finds all those places (where both bits are set to 1) in a single clock cycle!  The same idea shows up in the "region codes" of Cohen-Sutherland clipping, used in computer graphics.
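Here's a rough sketch of that two-term lookup; the bitmasks are invented for illustration, not any real index format:

    #include <stdio.h>

    int main(void) {
        /* Bit i is 1 if the term appears in section i of the document (made-up data). */
        unsigned int termA = 0x00F3;        /* sections where the first term appears */
        unsigned int termB = 0x0096;        /* sections where the second term appears */
        unsigned int both  = termA & termB; /* sections containing both terms */
        printf("both terms appear in sections: 0x%X\n", both); /* prints 0x92 */
        return 0;
    }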

But what if I want to compare two documents for similarity?  Note that C-like languages don't provide a "bitwise-compare" operator that sets equal bits to 1 and unequal bits to 0 (== compares whole integers, not individual bits).  So how can you compute this?

Well, all the bitwise operations do interesting and useful things when inverted:
 A   B   ~(A&B) (NAND)   ~(A|B) (NOR)   ~(A^B) (XNOR)
 0   0         1               1               1
 0   1         1               0               0
 1   0         1               0               0
 1   1         0               0               1
Interpretations:
    ~(A&B) (NAND): Either-0
    ~(A|B) (NOR):  Both-0
    ~(A^B) (XNOR): Equality

Note that "~(A^B)" is 1 if both input bits are identical, and 0 if they're different--so it's perfect for computing document similarity!
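For instance, here's a small sketch of that similarity idea: the document bitmaps are invented, and the match count uses __builtin_popcount (a GCC/Clang extension; a plain loop over the bits works too):

    #include <stdio.h>

    int main(void) {
        unsigned int docA = 0x0F0F3C3C;     /* made-up word-position bitmap for document A */
        unsigned int docB = 0x0F0F0000;     /* made-up word-position bitmap for document B */
        unsigned int same = ~(docA ^ docB); /* 1 wherever the two bitmaps agree */
        printf("matching bits: %d of 32\n", __builtin_popcount(same));
        return 0;
    }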

The final major bitwise operation is bit shifting, represented in C-like languages using ">>" and "<<".  Left shift ("<<", pointing left) shifts bits into higher positions, filling the now-empty lower positions with zeros.  For unsigned numbers, right shift (">>", pointing right) shifts bits into lower positions and fills the empties with zeros.  If you want to search for both-term matches in B that are one bit off of those in A, you can use "A&(B<<1)" or "A&(B>>1)", depending on the direction you want to check.  To check both directions, use "(A&(B<<1))|(A&(B>>1))".
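Here's a quick sketch of that neighbor check (with made-up masks again):

    #include <stdio.h>

    int main(void) {
        unsigned int A = 0x10; /* first term appears in section 4 (made-up) */
        unsigned int B = 0x08; /* second term appears in section 3 (made-up) */
        printf("same section:      0x%X\n", A & B);                           /* 0x0: no exact overlap */
        printf("adjacent sections: 0x%X\n", (A & (B << 1)) | (A & (B >> 1))); /* 0x10: off-by-one match */
        return 0;
    }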

One warning: bitwise operations have strange precedence.  Specifically, "A&B==C" is interpreted by default as "A&(B==C)", not "(A&B)==C"; and "A>>B+C" is interpreted as "A>>(B+C)".  This can cause horrible, hard-to-understand errors, so I just put parentheses around everything when I'm using bitwise operators.
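For example, this little sketch shows the surprise:

    #include <stdio.h>

    int main(void) {
        int A = 1, B = 2, C = 2;
        printf("%d\n", A & B == C);   /* parsed as A & (B==C), so prints 1 */
        printf("%d\n", (A & B) == C); /* (A&B) is 0, so this prints 0 */
        return 0;
    }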

Applications of Bitwise Operations--Thinking in SIMD

The simplest bitwise operation is the NOT operation: NOT 0 is 1; NOT 1 is 0.  Bitwise NOT is written as "~" in C/C++/Java/C#.  The cool thing about the NOT (and other bitwise) operations is that they operate on *all* the bits of an integer at *once*.  That means a 32-bit integer NOT does thirty-two separate things at the same time!  This "single-instruction, multiple data" (SIMD) approach is a really common way to speed up computations; it's the basis of the MMX, SSE, and AltiVec instruction set extensions.  But you can do lots of interesting SIMD work without using any special instructions--plain old C will do.  This use of bitwise operations is often called "SIMD within a register" (SWAR) or "word-SIMD"; see Sean Anderson's "Bit Twiddling Hacks" for a variety of amazing examples.
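Here's one tiny example of the SWAR idea (my own sketch, not one of the "Bit Twiddling Hacks"): four ASCII capital letters packed into one 32-bit integer get lowercased with a single OR, because setting bit 5 of an ASCII letter makes it lowercase:

    #include <stdio.h>
    #include <string.h>

    int main(void) {
        char text[5] = "WORD";    /* four capital letters plus the terminating nul */
        unsigned int packed;
        memcpy(&packed, text, 4); /* treat the 4 bytes as one 32-bit integer */
        packed |= 0x20202020;     /* one OR sets bit 5 of all four bytes at once */
        memcpy(text, &packed, 4);
        printf("%s\n", text);     /* prints "word" */
        return 0;
    }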

The genome uses just 4 digits (ACGT, for the acids Adenine, Cytosine, Guanine, and Thymine), and so can be represented in as few as 2 bits per nucleotide (genetic digit).  Former UAF CS undergraduate James Long, now up the hill at ARSC, developed a very fast software implementation of the Smith-Waterman string (or gene) matching algorithm called CBL that uses this SIMD-within-a-register approach.
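As a rough illustration of that 2-bit packing (my own sketch; CBL's actual encoding may well differ), here's one way to squeeze nucleotides into an integer:

    #include <stdio.h>

    /* A made-up 2-bit code: A=00, C=01, G=10, T=11. */
    unsigned int encode(const char *dna, int n) {
        unsigned int bits = 0;
        for (int i = 0; i < n; i++) {
            unsigned int code = 0;
            switch (dna[i]) {
                case 'A': code = 0; break;
                case 'C': code = 1; break;
                case 'G': code = 2; break;
                case 'T': code = 3; break;
            }
            bits |= code << (2 * i); /* each nucleotide gets its own 2-bit slot */
        }
        return bits;
    }

    int main(void) {
        /* 16 nucleotides fit in one 32-bit integer. */
        printf("GATTACA packs to 0x%X\n", encode("GATTACA", 7));
        return 0;
    }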