Biological Computing

CS 301 Lecture, Dr. Lawlor
This material will not appear on the test.  Have a nice Thanksgiving break!

Computing With Biology

Leonard Adleman demonstrated how to use DNA to quickly perform an exhaustive search in parallel.  The idea is to dump all the input molecules into a well-mixed test tube.  The inputs stick together in various ways, some of which represent the outputs you're looking for.  You then have to use chemistry to winnow out the (huge number of) incorrect outputs until you're left with the correct outputs, which you can read off.

Specifically, Adleman attacked the travelling salesman problem (strictly speaking, the closely related Hamiltonian path problem).  He represented each city with a short sequence of DNA; call them A through Z.  The route between each pair of cities was represented by a piece of DNA that sticks to ("binds") one city's sequence and the next's, where the length of the strand corresponds to the distance between the cities.  He then mixed all the cities and routes together, where they stuck together randomly.  Now, a solution to the salesman problem is encoded as a strand of cities and routes.  A solution has two features: it contains every city exactly once, and it is the shortest such strand.  Both can be checked chemically: you can filter strands for the presence of each city's sequence, and sort strands by length with gel electrophoresis.
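To see the flavor of the algorithm in silicon, here's a minimal sketch in C: generate lots of random routes ("mix the tube"), discard the ones that repeat or miss a city ("filter chemically"), and keep the shortest survivor.  The six-city distance table is made up purely for illustration.

    /* Toy silicon analogue of Adleman's DNA search (illustration only):
       generate random candidate routes, then filter out the invalid ones,
       the way the chemistry filters out bad strands. */
    #include <stdio.h>
    #include <stdlib.h>

    #define NCITIES 6
    #define NROUTES 100000  /* the test tube holds vastly more! */

    /* Made-up symmetric distance table between cities A..F. */
    static const int dist[NCITIES][NCITIES] = {
        {0,3,5,9,4,7},
        {3,0,2,6,8,5},
        {5,2,0,4,7,6},
        {9,6,4,0,3,8},
        {4,8,7,3,0,2},
        {7,5,6,8,2,0},
    };

    int main(void) {
        int best = 1 << 30;
        int bestRoute[NCITIES] = {0};
        srand(12345);
        for (int r = 0; r < NROUTES; r++) {
            int route[NCITIES], seen[NCITIES] = {0}, ok = 1, len = 0;
            for (int i = 0; i < NCITIES; i++) {    /* random strand */
                route[i] = rand() % NCITIES;
                if (seen[route[i]]++) ok = 0;      /* repeated city? discard */
            }
            if (!ok) continue;                     /* filter: every city exactly once */
            for (int i = 1; i < NCITIES; i++)
                len += dist[route[i-1]][route[i]];
            if (len < best) {                      /* filter: keep the shortest strand */
                best = len;
                for (int i = 0; i < NCITIES; i++) bestRoute[i] = route[i];
            }
        }
        printf("shortest route found, length %d: ", best);
        for (int i = 0; i < NCITIES; i++) printf("%c", 'A' + bestRoute[i]);
        printf("\n");
        return 0;
    }

The point of the DNA version, of course, is that the tube generates and filters all of these candidates at once, instead of one at a time.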
Other folks have recently built Turing machines with DNA, and "DNA nanofabrication" is a huge research field nowadays.

The advantage of biological computing is density: since each input letter can be represented by a single molecule, and the computation proceeds in parallel and in 3D, you can search enormous search spaces quickly.

You can find companies online that will synthesize custom genes for you.  For example, $100 will buy you 10 nano-mols of fluorescent probe molecules to tag a particular protein or sequence you're interested in.  1 mol is 6.022 x 10^23 molecules, so your 10 nano-mols is actually 6.022 x 10^15 molecules.  That's 60 trillion molecules per dollar!  (For comparison, a $100 billion-transistor chip is just 10 million transistors per dollar.)
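If you want to double-check that arithmetic, here's a one-screen C program (the prices are the ones quoted above):

    /* Back-of-the-envelope check of the molecules-per-dollar claim. */
    #include <stdio.h>

    int main(void) {
        double avogadro = 6.022e23;                 /* molecules per mol */
        double mols = 10e-9;                        /* 10 nano-mols */
        double molecules = mols * avogadro;         /* = 6.022e15 molecules */
        double perDollarDNA  = molecules / 100.0;   /* $100 of probe */
        double perDollarChip = 1e9 / 100.0;         /* $100, billion-transistor chip */
        printf("DNA:  %.3g molecules/dollar\n", perDollarDNA);    /* ~6e13: 60 trillion */
        printf("Chip: %.3g transistors/dollar\n", perDollarChip); /* 1e7: 10 million */
        return 0;
    }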

Biologically Inspired Computing

As components get smaller and faster, ordinary computers are getting more and more sensitive to small disturbances.  Already, cosmic rays can impact server uptimes.  Anecdotally, there's a curious fact about space travel: due to the higher radiation in orbit, spacecraft typically fly older, larger-feature-size processors, which are much harder for a stray particle to disturb than today's densest chips.
Human brain cells can be killed in large absolute numbers by chemicals such as alcohol and a variety of viruses and bacteria (e.g., meningitis).  Yet the brain remains operational (at degraded efficiency) despite such abuse, and rarely suffers significant permanent damage from these injuries.  That is, the human brain is a surprisingly fault-tolerant computer.  Unfortunately, we have no idea how the human brain operates (see Gerstner's online neuron reverse-engineering attempts).  But we can infer that it's at least as smart as the reactions of a single cell.

Ecosystem Design

A single cell uses a number of interesting design principles.  Many of these same principles are shared by healthy ecosystems, economic markets, functioning democracies, and piles of gravel.  I've come to call these principles "ecosystem design": no single part is crucial, individual parts fail and get replaced all the time, and the overall behavior emerges from many local interactions rather than from central control.
As an example, I claim an automobile, CPU, dictatorship, and orderly stack of bricks use "machine design", not ecosystem design: these systems have crucial decision-making parts (e.g., the braking system, control unit, dictator, or middle brick) whose failure can dramatically change the entire system.  Machine design requires somebody to go in and fix these crucial parts now and then, which is painful.
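To make the contrast concrete, here's a tiny Monte Carlo sketch in C.  All of the numbers (100 parts, a 10% failure rate, "works if half the parts survive") are made-up assumptions, not measurements:

    /* Monte Carlo sketch: machine design vs. ecosystem design.
       All numbers are made up for illustration. */
    #include <stdio.h>
    #include <stdlib.h>

    #define PARTS 100
    #define TRIALS 100000
    #define FAIL_PROB 0.10   /* each part independently fails 10% of the time */

    static int partFails(void) { return rand() < FAIL_PROB * RAND_MAX; }

    int main(void) {
        int machineUp = 0, ecosystemUp = 0;
        srand(42);
        for (int t = 0; t < TRIALS; t++) {
            int crucialOK = !partFails();   /* machine: one crucial part */
            int survivors = 0;
            for (int p = 0; p < PARTS; p++) /* ecosystem: interchangeable parts */
                if (!partFails()) survivors++;
            if (crucialOK) machineUp++;
            if (survivors >= PARTS / 2) ecosystemUp++;  /* majority suffices */
        }
        printf("machine up:   %.1f%%\n", 100.0 * machineUp / TRIALS);   /* ~90%  */
        printf("ecosystem up: %.1f%%\n", 100.0 * ecosystemUp / TRIALS); /* ~100% */
        return 0;
    }

With these assumptions the machine is down about 10% of the time (whenever its crucial part dies), while the ecosystem essentially never is.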

All modern programming languages encourage machine-style design (for example, what happens to a for loop if the variable "i" freaks out?).  Nobody knows what an ecosystem-design language would look like.
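To see that fragility, here's a sketch of a simulated upset in C: flipping one bit of the loop counter "i" (the kind of damage a cosmic ray can do) silently skips half the work.

    /* Machine-design fragility: one flipped bit in the loop counter
       and the whole computation goes wrong (simulated upset). */
    #include <stdio.h>

    int main(void) {
        long sum = 0;
        for (int i = 0; i < 1000; i++) {
            if (i == 500) i ^= 1 << 9;  /* simulated cosmic-ray bit flip: 500 -> 1012,
                                           so the loop exits early */
            sum += i;
        }
        printf("sum = %ld (should be 499500)\n", sum);  /* prints 125762 */
        return 0;
    }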