Exploring "equivalence" of Biology and Boolean Logic

What can we learn by attempting to implement biology's systems using symbolic logic, and what improvements can be made to our implementations of symbolic logic through the study of biology? That is the purpose of this exploration.


I've been considering the dichotomy of "symbolic" systems, such as pure math (which exists in the theoretical domain, not subject to the effects of physics), and "physical" systems (which are subject to hidden variables and uncertainty, e.g. Heisenberg uncertainty). I've attempted to document these raw thoughts in the preface of the following essay.

This comparison (which can be oversimplified as physics v. theory), and the role of "computers" as a medium of translation between these two domains, is ancillary to the essay's main objective: to explore the implementations and properties of biological computation (how symbolic computation emerges "naturally" from biological, physical systems) and Boolean logic (the von Neumann architecture, and how present-day computers implement symbolic logic on top of physical matter).

Ideally, this comparison will evolve over time to examine the most basic biological components and the most basic electronic implementations of boolean logic gates from both directions: creating boolean gates out of these axiomatic biological components, and creating the axiomatic biological components out of electronics. The outcome would be the ability to manipulate physical systems (using techniques and components, like the Schmitt trigger in electronics, that make circuits robust against the uncertainty of physics) to construct playgrounds where we can make deductions about the equivalence of biology and computer programs (like the Curry-Howard correspondence, but extended to an isomorphism with biological systems).


The reason pure math "works" (i.e. has the property of being provable and internally consistent) is that it is abstract and symbolic. That is, it exists in a theoretical domain which is not subject to the physics of the universe, its complex variables, and their uncertainties over time. Within the static, unchanging context of the symbolic domain, we can freely test axioms and their consequences, control the construction and scope of a problem, examine the properties of expressions using different approaches, and prove the consistency, correctness, and properties of theorems.

While these properties of the symbolic domain allow us to make deep promises about the integrity, inferences, and comparability of purely symbolic mathematical systems, they don't eliminate the reality that many of the systems we care about are subject to the chaotic laws of our physical universe. In such physical domains, we have less luxury in choosing our axioms. We may eliminate some variables by controlling our environment, even manipulate the forces of physics within reason of our technical capabilities (e.g. a Bose-Einstein condensate, or the world's biggest vacuum chamber), but we cannot change that the universe is made of quarks -- a confounding constraint pure mathematics can ignore.

Symbolic systems are necessitated by the complexity of our physical world. It seems there is always something more we do not know, on which every other computation depends:

"For centuries [people] lived in the belief that truth was slim and elusive and that once [it was] found [...], the troubles of [our]kind would be over. [One] of knowledge in our time is bowed down under a burden [one] never imagined [one] would ever have: The overproduction of truth that cannot be consumed."
-- The Denial of Death (page xviii, paragraph 1) by Ernest Becker
To fully understand the physical world in its entirety requires that it be exhaustively, deterministically computable, and fully observable. At the quantum level, this "unfolding of the layers of the onion" requires creativity and finesse, as objects become so small that even observing them changes their behaviour. Instead of hedging all of our bets on speculation about whether we will someday be able to compute what happened before the big bang, or the exact location of an electron with full confidence in spite of the Heisenberg uncertainty principle, there is reason to create incrementally better (more accurate) tools and systems for approximating these answers (or functions).

Fortunately, our universe is sufficiently complex as to support physical systems (made of matter and subject to the laws of physics) capable of faithfully emulating isolatable symbolic domains at high (albeit finite) accuracy. A computer is one example of a "sub-universe" we can create whose axioms are those of symbolic logic, a subset of symbolic mathematics. Later sections will explore how the ingenuity of electronic design and engineering has enabled computers to uphold such a promise: to achieve resilience and robustness against the seemingly untamable chaos and stochasticity of physics. The importance of the computer is that it provides a medium for integrating symbolic systems into our physical lives -- a frictionless translation layer between the physical and the symbolic.

Why is it important to be able to engineer faithfully symbolic systems from within our physical reality? First, it affords us interoperability: observations can be taken directly from the physical world and then analyzed within a controllable, symbolic environment. Second, we benefit from one of the standard features of symbolic systems, which is that they are fully observable; they allow us to simulate and test complex interactions while maintaining full control over the environment's variables and axioms, giving us insight into the workings of the physical world. Most importantly, as "implementations" of symbolic logic, they provide us with an extensible, evolvable framework for performing repeatable computation (executing algorithms).

How much can we trust a computer proof? (See: the Curry-Howard correspondence.) Is the computer the only viable approach to implementing symbolic systems?

The dichotomy of pure symbolic mathematics and the physical universe means that we can make progress in either direction: towards a perfect understanding of the physical universe, or in advancing discrete tools and systems for solving certain problems in isolation. But so too can we discuss the dichotomy of different systems. Boolean logic is not the only implementable system for computation. We also have biology, an evolutionary system which developed independently and from different axioms than boolean logic. Because biology is a physical system and boolean logic a symbolic one, perhaps the same dichotomy exists here as does at the higher-level comparison of the physical universe and symbolic math (in their totality).

What can we learn by attempting to implement biology's systems using symbolic logic, and what improvements can be made in our implementations of symbolic logic through the study of biology? How and where will the brain and the transistor "meet in the middle"? And can it be demonstrated that computationally they are equivalent? That is the purpose of this exploration.

A mess of unintelligible notes:

** Electrical engineering, avoiding inaccuracies in electrical current
capacitors (integrating the input of a noisy button, for instance -- a debouncer) or a
Schmitt trigger as a detector

*** Schmitt triggers

"When the necessity rises to determine which of the two signals is
stronger or to determine which of the two signals reaches a specified
value, a comparator is used."
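As a sketch of why hysteresis tames physical noise, here is a toy software model of a Schmitt trigger. The thresholds (1.7 V / 0.9 V) are made up for illustration, not taken from any datasheet:

```python
def schmitt(samples, v_high=1.7, v_low=0.9):
    """Return a clean 0/1 stream from a noisy analog sample stream."""
    state = 0
    out = []
    for v in samples:
        if state == 0 and v >= v_high:
            state = 1   # only switch high once the upper threshold is crossed
        elif state == 1 and v <= v_low:
            state = 0   # only switch low once the signal falls below the lower threshold
        out.append(state)
    return out

# Noise hovering near a single threshold would make a plain comparator
# chatter between 0 and 1; the hysteresis band absorbs it.
noisy = [0.2, 1.2, 1.4, 1.2, 1.8, 1.6, 1.75, 1.0, 1.5, 0.8, 0.3]
print(schmitt(noisy))  # -> [0, 0, 0, 0, 1, 1, 1, 1, 1, 0, 0]
```

The design point: the output depends on the input's history, not just its instantaneous value, which is exactly what makes the circuit robust against small fluctuations.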


* Inspiration
Schrödinger -- "What Is Life?" (recommended by Drew Winget)
Acting on 1st principles, determining boundaries between the supervening fields

deductions based on 1st principles
- size constraints
- electromagnetic bonds / type

* 2 directions:
  1. boolean logic using/from biological components

  2. biological components using boolean logic (electronics)

* What are the axiomatic components of biology (the simplest units made only of the elements)

* What are the axiomatic components of boolean logic and electronics?
NAND gates, etc.
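The universality of NAND (why it counts as an axiomatic component) can be demonstrated in a few lines -- every other gate is derivable from it:

```python
def nand(a, b):
    return int(not (a and b))

def not_(a):
    return nand(a, a)

def and_(a, b):
    return nand(nand(a, b), nand(a, b))   # NOT(NAND) = AND

def or_(a, b):
    return nand(nand(a, a), nand(b, b))   # De Morgan: NOT a NAND NOT b

def xor_(a, b):
    n = nand(a, b)                        # the classic four-NAND XOR
    return nand(nand(a, n), nand(b, n))

# Exhaustively check the derived gates against Python's own operators.
for a in (0, 1):
    for b in (0, 1):
        assert not_(a) == (1 - a)
        assert and_(a, b) == (a & b)
        assert or_(a, b) == (a | b)
        assert xor_(a, b) == (a ^ b)
```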

* Do computers make "errors"?

** Error Prevention, Safeguards, Guardbands

MPE (Manchester Phase Encoding)
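A rough illustration of the idea: Manchester encoding turns every bit into a transition, so the clock travels with the data and a stuck line is detectable. This sketch uses the IEEE 802.3 convention (0 as high-to-low, 1 as low-to-high); other conventions invert the pairs:

```python
def manchester_encode(bits):
    out = []
    for b in bits:
        out += [0, 1] if b else [1, 0]   # each bit becomes a mid-bit transition
    return out

def manchester_decode(line):
    bits = []
    for i in range(0, len(line), 2):
        pair = (line[i], line[i + 1])
        if pair == (0, 1):
            bits.append(1)
        elif pair == (1, 0):
            bits.append(0)
        else:
            # (0,0) or (1,1) means no transition: a line fault or lost sync,
            # which is the error-detection property the scheme buys us
            raise ValueError("no mid-bit transition: line fault or lost sync")
    return bits

data = [1, 0, 1, 1, 0]
assert manchester_decode(manchester_encode(data)) == data
```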

** Types of errors
1. Floating point errors; deterministic inaccuracies in calculation due to hardware limitations

2. Design mistakes (e.g. CPU); formal methods can be used to verify correctness

3. Environmental Corruption and Hardware error: radiation from cosmic rays, electrical noise or spikes (e.g. electrostatic discharge)
- https://en.wikipedia.org/wiki/Engineering_tolerance#Electrical_component_tolerance
- https://en.wikipedia.org/wiki/Allowance_(engineering)
- https://en.wikipedia.org/wiki/Allowance_(engineering)#Confounding_of_the_engineering_concepts_of_allowance_and_tolerance
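Type 1 above is easy to demonstrate: the inaccuracy is deterministic and identical on every conforming IEEE 754 machine, which is what distinguishes it from environmental corruption:

```python
# 0.1 and 0.2 have no exact binary representation, so the sum
# picks up a tiny, perfectly reproducible error.
a = 0.1 + 0.2
print(a == 0.3)              # False
print(a)                     # 0.30000000000000004
print(abs(a - 0.3) < 1e-9)   # True -- the usual workaround: compare within a tolerance
```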

** Error correction
https://www.youtube.com/watch?v=5sskbSvha9M error correction
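As a concrete example of error correction, here is a minimal Hamming(7,4) sketch: 4 data bits are protected by 3 parity bits, and any single bit flip (e.g. from a cosmic ray) can be located and repaired:

```python
def hamming_encode(d):
    """Encode 4 data bits into a 7-bit codeword (positions 1..7)."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4          # parity over positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4          # parity over positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4          # parity over positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming_correct(c):
    """Correct up to one flipped bit, then return the 4 data bits."""
    c = list(c)
    # Each syndrome bit re-checks one parity group; together they spell
    # out the (1-based) position of a single error, or 0 if none.
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 * 1 + s2 * 2 + s3 * 4
    if pos:
        c[pos - 1] ^= 1        # flip the offending bit back
    return [c[2], c[4], c[5], c[6]]

data = [1, 0, 1, 1]
code = hamming_encode(data)
code[4] ^= 1                   # simulate a single-bit upset in transit
assert hamming_correct(code) == data
```

This is the same trade the brain arguably cannot make explicitly: spend extra redundant bits now to buy deterministic recovery later.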

* How do the brain's mistakes differ from those of Von Neumann computers?
The difficulties of executing simple algorithms: why brains make mistakes computers don't.

* Heisenberg's uncertainty principle & Schrödinger equation

* Synthetic Biology (applying engineering principles to nature)
Designing *Input* module, *Response* module, and *Output* module 
Creating Life - The Ultimate Engineering Challenge. (Synthetic Biology documentary)
https://www.youtube.com/watch?v=VhuiMRIn6GM Labster - Synthetic Biology Virtual Lab Simulation

** Designing an Apoptotic Biological Circuit
*** electroporation
Electroporation, or electropermeabilization, is a molecular biology technique in which an electrical field is applied to cells in order to increase the permeability of the cell membrane, allowing chemicals, drugs, or DNA to be introduced into the cell.

*** plasmid isolation (http://vlab.amrita.edu/?sub=3&brch=77&sim=314&cnt=1)

*** gel electrophoresis

! systems biology

Previous works

http://www.irisa.fr/dyliss/public/asiegel/Articles/SchaubSiegelVidela.pdf
https://books.google.com/books?id=qGREBAAAQBAJ&pg=PT60&lpg=PT60&dq=equivalence+of+boolean+logic+and+biology&source=bl&ots=A3erNXBCxB&sig=MICDGNmyE-Tcgp631qOYBODYhYk&hl=en&sa=X&ei=sn6MVfPzI4r6sAWliJmYDQ&ved=0CB4Q6AEwAA#v=onepage&q=equivalence%20of%20boolean%20logic%20and%20biology&f=false

Tangent (Essay)

Conceptually, "Labster" (MIT virtual science laboratory) is an amazing technology and resource for anyone interested in practicing synthetic biology.

Provenance Trail

The past few days I've gone down a rabbit hole involving biology and computational theory. I became particularly interested in the relationship between electric circuits (physically engineered implementations of boolean logic) and biological systems. What might be, or has been, learned about their comparative computational abilities or properties, and how might understanding one inform advances within the other? While my journey has just begun, I've recorded breadcrumbs of my findings[1] to organize myself and others interested in the topic.

This whole spiral started in a very "Mark P Xu Neyerian" kind of way, by thinking about the Curry-Howard correspondence (isomorphism), which demonstrates computational equivalence between mathematics and computer programs (i.e. boolean logic). I wondered if a similar correspondence might exist between biological systems and boolean logic. I admit this question is more than slightly naive (it is unclear what "equivalence" even means in this context), given that biology is built of the chemical elements, on top of physics, both of which are subject to measurement uncertainty. But if a loose correspondence can be demonstrated, and we are able to determine both (a) mappings between biological systems and boolean logic, and (b) the chemical, and thus mathematical, thresholds under which these biological systems predictably operate equivalently (in the same way a Schmitt trigger is used in electronics to make electrical circuits robust against uncertainty and variability in electrical flow), this may expand the way we can use computers to cheaply and accurately test, evolve, or even prove biological and medical results.

I also must admit that if this question of a Curry-Howard-biology correspondence is new, it's only because it hasn't been phrased "exactly" as such, and because its implications are mostly philosophical and don't contribute to the underlying sciences and experiments -- both of which are anything but new and have been around for many years. Systems biology, for instance, attempts to understand and emulate biological components through computational models, and vice versa[2]. The field of biocomputing attempts to achieve some of the same learnings from the opposite direction, by creating physical boolean logic gates and other computational systems out of biological material[3]. There is yet another field, synthetic biology, which applies engineering practices to biology in order to design and synthesize biological components for specific applications.
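To make the "computational models" idea concrete, here is a hypothetical toy example of the kind systems biology uses: a three-gene cycle in which each gene represses the next (A ⊣ B ⊣ C ⊣ A), modeled as a synchronous boolean network. The circuit and update rule are my own illustration, not taken from any cited work:

```python
def step(state):
    """One synchronous update: each gene is ON unless its repressor was ON."""
    a, b, c = state
    return (int(not c),   # A is repressed by C
            int(not a),   # B is repressed by A
            int(not b))   # C is repressed by B

state = (1, 0, 0)
trajectory = [state]
for _ in range(6):
    state = step(state)
    trajectory.append(state)

# The state never settles: it cycles with period 6, a discrete analogue
# of the oscillations seen in real repressor-ring circuits.
print(trajectory)
```

Even this crude abstraction shows why the boolean framing is attractive: attractors and cycles of the network become testable predictions about the circuit's steady states and oscillations.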

In order to connect the dots and make headway in any of these directions towards my goal, I needed a better understanding of what the simplest units of biology are. I had to consider: what is below the "cell"? What are the simplest biological constructs (molecules, systems) made only of the elements (which arise only from chemistry)? This question, too, has two directions. One, we can deduce the structure and components of the "cell" (from the cell down to its lowest-level constituent parts); or two, we can inductively determine the arrangement of chemicals and circumstances required for biological, organic matter and simple biological systems to emerge from pure chemistry (abiogenesis/biopoiesis -- see the Miller-Urey experiment)[4].

Synthetic biology may be the answer to the "Schmitt trigger" analogy I previously entertained: a medium for controlling the uncertainty effects of biological variables, determining the attribution/consequences of variables, and retroactively verifying results. Which made me wonder, how does synthetic biology work? I first watched this video[5], which followed a team of new biology researchers in a project to create bacteria capable of changing color in parasite-infested water. The video is very practical, and also covers environmental consequences and implications, design considerations, the reality of experimentation (regression testing), and team/lab dynamics. While the video was quite understandable, what it didn't outline was any sort of terminology for contextualizing or reproducing the different experimental steps. I found myself wanting to know what processes the scientists were running, and how. That's when I saw Labster[6] and was sufficiently impressed by the idea. Even in the video, the steps of synthetic biology are demystified. The best part is, the program seems like an amazingly safe, fast, scalable, and accessible alternative to laboratories and hazardous material.

In the interest of time-boxing this tangent, my next step is to continue exploring the simplest systems within biology, as well as how they may be functionally emulated through boolean logic. Updates soon!


green chemistry, artificial evolution
Stanford: Karl Deisseroth

[1] Exploring "equivalence" of Biology and Boolean Logic
[2] An Introduction to Systems Biology: Design Principles of Biological Circuits
[3] Synthesizing Biomolecule-based Boolean Logic Gates http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3603578/ 
[4] Miller–Urey Experiment https://en.wikipedia.org/wiki/Miller%E2%80%93Urey_experiment
[5] Creating Life - The Ultimate Engineering Challenge. (Synthetic Biology documentary)
[6] Labster - Synthetic Biology Virtual Lab Simulation https://www.youtube.com/watch?v=VhuiMRIn6GM
[7] http://www.nature.com/nbt/journal/v32/n6/full/nbt.2891.html
[8] https://www.quantamagazine.org/20160128-ecorithm-computers-and-life/ "ecorithms"