Biological Computing
CS 301 Lecture, Dr. Lawlor
This material will not appear on the test. Have a nice Thanksgiving break!
Computing With Biology
In 1994, Leonard Adleman demonstrated how to use DNA to quickly perform an exhaustive search in parallel.
The idea is to dump all the input molecules into a well-mixed test
tube. The inputs stick together in various ways, some of which
represent the outputs you're looking for. You then have to use
chemistry to winnow out the (huge number of) incorrect outputs until
you're left with the correct outputs, which you can read off.
Specifically, Adleman used a travelling-salesman-style problem (his
actual experiment solved the closely related Hamiltonian path
problem). He represented each city with a short sequence of DNA,
call them A through Z. The route between each pair of cities was
represented by a piece of DNA that sticks ("binds") to one city
and the next, where the length of the strand corresponds to the
distance between the cities. He then mixed all the cities and
routes together, where they stuck together randomly. Now, a
solution to the salesman problem is encoded as a strand of cities and
routes. A solution has two features:
- It visits all the cities. You can select for strands containing
city A by building a gel that binds to A and rinsing: strands
without an A won't stick, and hence get washed away. You can then
dissolve the gel to recover the survivors. Repeat to wash away
strands that are missing any other city.
- It's shorter than any other path. You can determine this in a centrifuge! Shorter stuff is lighter, and rises to the top! (In practice, Adleman sorted strands by length using gel electrophoresis, but the idea is the same: separate by size.)
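The wet-lab procedure above can be mimicked in a few lines of ordinary code. This is only an in-silico sketch: the cities and distances are invented for illustration, and a real test tube holds ~10^15 random strands, not 24 tidy permutations.

```python
# A tiny in-silico sketch of Adleman's winnowing idea: generate every
# candidate "strand" (ordering of cities), then filter exactly the way
# the chemistry does. City names and distances are made up.
from itertools import permutations

cities = ["A", "B", "C", "D"]
dist = {("A","B"): 2, ("B","C"): 3, ("C","D"): 1,
        ("A","C"): 4, ("B","D"): 5, ("A","D"): 7}

def length(path):
    # Route "strands" bind city to city; total strand length is the path cost.
    return sum(dist[tuple(sorted(pair))] for pair in zip(path, path[1:]))

# Step 1: the test tube holds every random assembly (here: all orderings).
strands = list(permutations(cities))
# Step 2: "gel" filter--keep only strands containing every city
#         (permutations already do, but a real tube is full of junk too).
valid = [p for p in strands if set(p) == set(cities)]
# Step 3: "centrifuge"--keep the shortest (lightest) strand.
best = min(valid, key=length)
print(best, length(best))   # → ('A', 'B', 'C', 'D') 6
```

The exponential blowup hides in step 1: the computer enumerates every ordering one at a time, while the chemistry builds them all at once in parallel.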
Other folks have recently built Turing machines with DNA, and "DNA nanofabrication" is a huge research field nowadays.
The
advantage of biological computing is density--since each input letter can be represented by a
single molecule, and the computation proceeds in parallel and in 3D,
you can search enormous search spaces quickly.
You can get companies online that will synthesize custom genes for you. For example, $100 will buy you 10 nano-mols of fluorescent probe molecules to tag a particular protein or sequence you're interested in. 1 mol is 6.022 x 10^23 molecules, so your 10 nano-mols is actually 6.022 x 10^15
molecules. That's 60 trillion molecules per dollar! (For
comparison, a $100 billion-transistor chip is just 10 million
transistors per dollar.)
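The arithmetic above is easy to double-check (Avogadro's number and the two $100 price points are taken straight from the text):

```python
# Back-of-the-envelope check of the molecules-per-dollar figures.
AVOGADRO = 6.022e23           # molecules per mol
molecules = 10e-9 * AVOGADRO  # 10 nano-mols of probe molecules

per_dollar_dna  = molecules / 100.0  # $100 synthesis order
per_dollar_chip = 1e9 / 100.0        # $100 billion-transistor chip

print(f"{per_dollar_dna:.1e} molecules/dollar")    # ~6.0e13 = 60 trillion
print(f"{per_dollar_chip:.1e} transistors/dollar") # 1.0e7 = 10 million
```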
Biologically Inspired Computing
As components get smaller and faster, ordinary computers are getting
more and more sensitive to small disturbances: already, cosmic-ray bit flips can impact server uptimes. Space travel, with its much higher radiation levels, offers a curious anecdotal contrast:
- Typical computers lock up about once a day (including flight computers, laptops, digital cameras, etc.) and must be rebooted.
- Human beings have never been observed to lock up, and have never
needed to be rebooted. This is a good thing, because a spaceship
with an insane or catatonic crew might have trouble returning to Earth.
Human brain cells can be killed in large absolute numbers by chemicals
such as alcohol and a variety of viruses and bacteria (e.g.,
meningitis). Yet the brain remains operational (at degraded
efficiency) despite such abuse, and rarely suffers significant
permanent damage from these injuries. That is, the human brain is
a surprisingly fault-tolerant computer. Unfortunately, we have no
idea how the human brain operates (see Gerstner's online neuron reverse-engineering attempts). But we can infer that it's at least as sophisticated as the chemical reactions of a single cell.
Ecosystem Design
A single cell uses a number of interesting design principles.
Many of these same principles are shared by healthy ecosystems, economic markets,
functioning democracies, and piles of gravel. I've come to call
these principles "ecosystem design":
- Many independent decision-makers. Examples: A cell is a
complicated collection of separate but interacting proteins. A
market is a collection of interacting buyers and sellers. A
gravel pile is a collection of interacting pebbles. Results: no
central decision-maker means no single point of failure, so the loss of
any few small pieces doesn't affect the overall result.
- Dynamic equilibrium--lots of small-scale stuff is changing all
the time, but the overall averages remain quite constant.
Examples: the metabolic rate of a cell is the result of all its pieces,
but it's quite predictable overall. A market's overall prices
tend to remain fairly stable. A gravel pile's average slope can
be predicted to within a few degrees.
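The gravel-pile claim can be watched happening in a toy sandpile simulation (a one-dimensional version of the Bak-Tang-Wiesenfeld "self-organized criticality" model; the pile width, grain count, and critical slope here are all arbitrary):

```python
# A toy "gravel pile" illustrating dynamic equilibrium: individual
# grains topple constantly, yet the pile's average slope settles
# near the critical threshold and stays there.
import random
random.seed(1)

N, CRITICAL = 30, 2   # pile width; max stable height difference
h = [0] * N           # grain heights; grains fall off the right edge

def relax(h):
    # Topple any site steeper than CRITICAL until the pile is stable.
    moved = True
    while moved:
        moved = False
        for i in range(N):
            right = h[i+1] if i+1 < N else 0
            if h[i] - right > CRITICAL:
                h[i] -= 1               # one grain tumbles downhill...
                if i+1 < N:
                    h[i+1] += 1         # ...or falls off the edge
                moved = True

for grain in range(5000):
    h[random.randrange(N)] += 1   # small-scale change: drop one grain
    relax(h)                      # avalanches restore equilibrium

avg_slope = h[0] / N              # overall average: quite predictable
print(f"average slope ~ {avg_slope:.2f} (critical slope = {CRITICAL})")
```

No pebble is in charge, and any individual drop may trigger an avalanche, but the large-scale slope is pinned near the critical value--exactly the "predictable to within a few degrees" behavior claimed above.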
As an example, I claim an automobile, CPU, dictatorship, and orderly stack of
bricks use "machine design", not ecosystem design: these systems have
crucial decision-maker parts (e.g., the braking system, control unit,
dictator, or middle brick) whose failure can dramatically change the
entire system. Machine design requires somebody to go in
and fix these crucial parts now and then, which is painful.
All modern programming languages encourage machine-style design (for
example, what happens to a for loop if the variable "i" freaks
out?). Nobody knows what an ecosystem-design language would look
like.
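To make the contrast concrete, here's a deliberately silly sketch (the function names and failure modes are invented): a lone loop counter is a single point of failure, while a vote among redundant, independent workers shrugs off one corrupted member.

```python
# Machine design vs. a crude ecosystem-ish design, as code.
from collections import Counter

def fragile_sum(data, corrupt_i_at=None):
    # Machine design: everything hinges on the single counter i.
    total, i = 0, 0
    while i < len(data):
        total += data[i]
        if i == corrupt_i_at:
            break          # counter "freaks out"; bail out with a wrong answer
        i += 1
    return total

def voted_sum(data, workers=5, corrupt_worker=None):
    # Redundant design: independent workers each compute the answer;
    # the majority result survives any single failure.
    answers = []
    for w in range(workers):
        ans = sum(data)
        if w == corrupt_worker:
            ans += 999     # this worker's result is garbage
        answers.append(ans)
    return Counter(answers).most_common(1)[0][0]

data = [1, 2, 3, 4]
print(fragile_sum(data, corrupt_i_at=2))   # → 6 (wrong: stopped early)
print(voted_sum(data, corrupt_worker=0))   # → 10 (correct despite one bad worker)
```

Of course, majority voting is still a far cry from a real ecosystem--the voter itself is a crucial part--but it hints at how redundancy buys fault tolerance.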