Differential Analytic Turing Automata
Author: Jon Awbrey
The task ahead is to chart a course from general ideas about transformational equivalence classes of graphs to a notion of differential analytic Turing automata (DATA). It may be a while before we get within sight of that goal, but it will provide a better measure of motivation to name the thread after the envisioned end rather than the more homely starting place.
The basic idea is as follows. One has a set of graphs and a set of transformation rules, and each rule has the effect of transforming graphs into graphs. In the cases that we shall be studying, this set of transformation rules partitions the set of graphs into transformational equivalence classes (TECs).
There are many interesting excursions to be had here, but I will focus mainly on logical applications, and so the TECs I talk about will almost always have the character of logical equivalence classes (LECs).
An example that will figure heavily in the sequel is given by rooted trees as the species of graphs and a pair of equational transformation rules that derive from the graphical calculi of C.S. Peirce, as revived and extended by George Spencer Brown.
Here are the fundamental transformation rules, also referred to as the arithmetic axioms or, more precisely, the arithmetic initials.
(1)  
(2) 
That should be enough to get started.
Contents
 1 Cactus Language
 2 Example 1
 3 Example 2
 4 Example 3
 5 Example 4
 6 Differential Analysis
 7 Symbolic Method
 8 Differential: Locally Linear Approximation
 9 Notions of Approximation
 10 Analytic Series
 11 Visualization
 12 Turing Machine Example
 13 Document History
Cactus Language
I will be making use of the cactus language extension of Peirce's Alpha Graphs, so called because it uses a species of graphs that are usually called "cacti" in graph theory. The last exposition of the cactus syntax that I've written can be found here:
The representational and computational efficiency of the cactus language for the tasks that are usually associated with boolean algebra and propositional calculus makes it possible to entertain a further extension, to what we may call differential logic, because it develops this basic level of logic in the same way that differential calculus augments analytic geometry to handle change and diversity. There are several different introductions to differential logic that I have written and distributed across the Internet. You might start with the following couple of treatments:
I will draw on those previously advertised resources of notation and theory as needed, but right now I sense the need for some concrete examples.
Example 1
Let's say we have a system that is known by the name of its state space and we have a boolean state variable where
We observe for a while, relative to a discrete time frame, and we write down the following sequence of values for

“Aha!” we say, and think we see the way of things, writing down the rule where is the next state after and is the negation of in boolean logic.
Another way to detect patterns is to write out a table of finite differences. For this example, we would get:

And of course, all the higher order differences are zero.
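The finite-difference table can be sketched in a few lines of Python. This is a minimal sketch under the assumption (suggested by the negation rule above) that the observed sequence simply alternates 0, 1, 0, 1, …; the names `x`, `dx`, `d2x` are illustrative, not taken from the text.

```python
# Finite differences of a boolean sequence, taken over GF(2):
# dx[t] = x[t+1] XOR x[t].  Assumes the observed sequence alternates,
# as the negation rule for the next state suggests.

def differences(seq):
    """First finite differences mod 2."""
    return [a ^ b for a, b in zip(seq, seq[1:])]

x = [0, 1, 0, 1, 0, 1]
dx = differences(x)      # first order differences
d2x = differences(dx)    # second order differences

print(dx)   # [1, 1, 1, 1, 1]
print(d2x)  # [0, 0, 0, 0]
```

The first differences are constantly 1 and the second differences vanish, matching the remark that all higher order differences are zero.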
This leads to thinking of as having an extended state and this additional language gives us the facility of describing state transitions in terms of the various orders of differences. For example, the rule can now be expressed by the rule
There is a more detailed account of differential logic in the following paper:
For future reference, here are a couple of handy rosetta stones for translating back and forth between different notations for the boolean functions where
Example 2
For a slightly more interesting example, let's suppose that we have a dynamic system that is known by its state space and we have a boolean state variable In addition, we are given an initial condition and a law
The initial condition has two cases:

Here is a table of the two trajectories or orbits that we get by starting from each of the two permissible initial states and staying within the constraints of the dynamic law


Note that the state that is, is a stable attractor for both orbits.
Further discussion of this example, complete with charts and graphs, can be found at this location:
Example 3
One more example may serve to suggest just how much dynamic complexity can be built on a universe of discourse that has but a single logical feature at its base. But first, there are a few more elements of general notation that we'll need to describe finite-dimensional universes of discourse and the qualitative dynamics that we envision occurring in them.
Let be the alphabet of logical features or variables that are used to describe the dimensional universe of discourse One may picture a venn diagram whose overlapping “circles” are labeled with the feature names in Staying with this picture, one visualizes the universe of discourse as having two layers:
 The set of points or cells — the latter used in another sense of the word than when we speak of cellular automata.
 The set of propositions, boolean-valued functions, or maps from to
Thus we picture the universe of discourse as an ordered pair having points in the underlying space and propositions in the function space
A more complete discussion of these notations can be found here:
Now, to the Example.
Once again, let us begin with a 1-feature alphabet In the discussion that follows I will consider a class of trajectories that are ruled by the constraint that for all greater than some fixed and I will indulge in the use of some picturesque language to describe salient classes of such curves. Given the finite order condition, there is a highest order nonzero difference that is exhibited at each point in the course of any determinate trajectory. Relative to any point of the corresponding orbit or curve, let us call this highest order differential feature the drive at that point. Curves of constant drive are then referred to as gear curves.
One additional piece of notation will be needed here. Starting from the base alphabet we define and notate as the order extended alphabet over
Let us now consider the family of 4th gear curves through the extended space These are the trajectories that are generated subject to the law where it is understood in making such a statement that all higher order differences are equal to
Since and all higher order are fixed, the entire dynamics can be plotted in the extended space Thus, there is just enough room in a planar venn diagram to plot both orbits and to show how they partition the points of As it turns out, there are exactly two possible orbits, of eight points each, as illustrated in Figures 16a and 16b. See here:
Here are the 4th gear curves over the 1-feature universe arranged in the form of tabular arrays, listing the extended state vectors as they occur in one cyclic period of each orbit.


In this arrangement, the temporal ordering of states can be reckoned by a kind of parallel round-up rule. Specifically, if is any pair of adjacent digits in a state vector then the value of in the next state is the addition being taken mod 2, of course.
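The parallel round-up rule can be simulated directly. This is a sketch under the assumption that the extended state is a vector of five digits (the state and its first through fourth differences), with the fourth difference pinned at 1 as the gear condition requires, and with each digit's next value being the mod 2 sum of itself and its right-hand (higher order) neighbor; the starting vectors chosen below are illustrative.

```python
# Parallel round-up rule for 4th gear curves: each digit of the
# extended state vector (x, dx, d2x, d3x, d4x) is updated to the
# mod 2 sum of itself and the next-higher-order digit, while the
# top difference d4x is held constant at 1.

def step(state):
    """One state transition; the last digit stays fixed."""
    return tuple((a + b) % 2 for a, b in zip(state, state[1:])) + (state[-1],)

def orbit(start):
    """Collect states until the trajectory returns to its start."""
    states, s = [start], step(start)
    while s != start:
        states.append(s)
        s = step(s)
    return states

orbit_a = orbit((0, 0, 0, 0, 1))
orbit_b = orbit((0, 1, 0, 0, 1))
print(len(orbit_a), len(orbit_b))  # 8 8
```

The two orbits are disjoint and of eight points each, and together they exhaust the sixteen extended states with a fourth difference of 1, matching the description of Figures 16a and 16b.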
A more complete discussion of this arrangement is given here:
Example 4
I am going to tiptoe in silence/consilience past many questions of a philosophical nature/nurture that might be asked at this juncture, no doubt to revisit them at some future opportunity/importunity, however the cases happen to align in the course of their inevitable fall.
Instead, let's follow the adage to “keep it concrete and simple”, taking up the consideration of an incrementally more complex example, but having a slightly more general character than the orders of sequential transformations that we've been discussing up to this point.
The types of logical transformations that I have in mind can be thought of as transformations of discourse because they map a universe of discourse into a universe of discourse by way of logical equations between the qualitative features or logical variables in the source and target universes.
The sequential transformations or state transitions that we have been considering so far are actually special cases of these more general logical transformations, specifically, they are the ones that have a single universe of discourse, as it happens to exist at different moments in time, in the role of both the source and the target universes of the transformation in question.
Onward and upward to Flatland, the differential analysis of transformations between 2-dimensional universes of discourse.
Consider the transformation from the universe to the universe that is defined by this system of equations:

The parenthetical expressions on the right are the cactus forms for the boolean functions that correspond to inclusive disjunction and logical equivalence, respectively. Table 1 summarizes the basic elements of the cactus notation for propositional logic.



 

 

 

 

 


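Since the body of Table 1 did not survive, here is a small sketch of the two cactus forms in play, assuming the standard reading of the cactus syntax: a lobe (x_1, …, x_k) evaluates true exactly when exactly one of its arguments is false, and writing expressions side by side (concatenation) means conjunction. The function names are illustrative.

```python
# Evaluating the two cactus forms used in this example, under the
# standard reading: a lobe (x_1, ..., x_k) is true iff exactly one
# argument is false, and concatenation is conjunction.

def lobe(*args):
    """True iff exactly one argument is false."""
    return sum(1 for a in args if not a) == 1

def f(u, v):
    """((u)(v)) : the lobe around the conjunction of (u) and (v)."""
    return lobe(lobe(u) and lobe(v))

def g(u, v):
    """((u, v)) : the lobe around the lobe (u, v)."""
    return lobe(lobe(u, v))

for u in (False, True):
    for v in (False, True):
        assert f(u, v) == (u or v)   # inclusive disjunction
        assert g(u, v) == (u == v)   # logical equivalence
```

The checks confirm that ((u)(v)) and ((u, v)) come out as inclusive disjunction and logical equivalence, as stated above.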
The component notation allows us to give a name and a type to this transformation, and permits us to define it by means of the compact description that follows:

The information that defines the logical transformation can be represented in the form of a truth table, as shown below.






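The rows of the truth table above were lost, but they can be regenerated from the component equations. This sketch assumes, as the text indicates, that the first component is inclusive disjunction and the second is logical equivalence; the name `F` is illustrative.

```python
# Regenerating the truth table of the transformation
# F : (u, v) |-> (u', v'), with u' = ((u)(v)) = u OR v and
# v' = ((u, v)) = (u IFF v), as given by the component equations.

def F(u, v):
    return (u | v, 1 - (u ^ v))

table = [(u, v, *F(u, v)) for u in (0, 1) for v in (0, 1)]
for row in table:
    print(row)
# (0, 0, 0, 1)
# (0, 1, 1, 0)
# (1, 0, 1, 0)
# (1, 1, 1, 1)
```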
A more complete framework of discussion and a fuller development of this example can be found in the neighborhood of the following site:
Consider the transformation of textual elements (TOTE) in progress:

Taken as a transformation from the universe to the universe this is a particular type of formal object, and it can be studied at that level of abstraction until the chickens come home to roost, as they say, but when the time comes to count those chickens, if you will, the terms of artifice that we use to talk about abstract objects, almost as if we actually knew what we were talking about, need to be fully fledged or fleshed out with extra bits of interpretive data (BOIDs).
And so, to decompress the story, the TOTE that we use to convey the FOMA has to be interpreted before it can be applied to anything that actually puts supper on the table, so to speak.
What are some of the ways that an abstract logical transformation like gets interpreted in the setting of a concrete application?
Mathematical parlance comes part way to the rescue here and tosses us the line that a transformation of syntactic signs can be interpreted in either one of two ways, as an alias or as an alibi.
When we consider a transformation in the alias interpretation, we are merely changing the terms that we use to describe what may very well be, to some approximation, the very same things.
For example, in some applications the discursive universes and are best understood as diverse frames, instruments, reticules, scopes, or templates, that we adopt for the sake of viewing from variant perspectives what we conceive to be roughly the same underlying objects.
When we consider a transformation in the alibi interpretation, we are thinking of the objective things as objectively moving around in space or changing their qualitative characteristics. There are times when we think of this alibi transformation as taking place in a dimension of time, and then there are times when time is not an object.
For example, in some applications the discursive universes and are actually the same universe, and what we have is a frame where is the next state of and is the next state of notated as and This permits us to rewrite the transformation as follows:

All in all, then, we have three different ways in general of applying or interpreting a transformation of discourse, that we might sum up as one brand of alias and two brands of alibi, all together, the Elseword, the Elsewhere, and the Elsewhen.
No more angels on pinheads, the brass tacks next time.
Differential Analysis
It is time to formulate the differential analysis of a logical transformation, or a mapping of discourse. It is wise to begin with the first order differentials.
We are considering an abstract logical transformation that can be interpreted in a number of different ways. Let's fix on a couple of major variants that might be indicated as follows:

is just one example among — well, now that I think of it — how many other logical transformations from the same source to the same target universe? In the light of that question, maybe it would be advisable to contemplate the character of within the fold of its most closely akin transformations.
Given the alphabets and along with the corresponding universes of discourse and how many logical transformations of the general form are there?
Since and can be any propositions of the type there are choices for each of the maps, and thus there are different mappings altogether of the form
The set of all functions of a given type is customarily denoted by placing its type indicator in parentheses, in the present instance writing and so the cardinality of this function space can most conveniently be summed up by writing:
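The counts themselves were lost with the formulas, but they can be recomputed under the stated assumption that each component map is a proposition on a 2-feature universe, that is, a boolean function of two boolean variables.

```python
# Counting logical transformations F = (f, g) between 2-feature
# universes: each component is one of the 2**(2**2) = 16 propositions
# of type B x B -> B, so there are 16**2 = 256 transformations.

propositions_per_component = 2 ** (2 ** 2)          # 16
transformations = propositions_per_component ** 2   # 256
print(transformations)  # 256
```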
Given any transformation of this type, the (first order) differential analysis of is based on the definition of a couple of further transformations, derived by way of operators on that ply between the (first order) extended universes, and of its own source and target universes.
First, the enlargement map (or the secant transformation) is defined by the following pair of component equations:

Second, the difference map (or the chordal transformation) is defined in a componentwise fashion as the boolean sum of the initial proposition and the enlarged or shifted proposition for in accord with the following pair of equations:

Maintaining a strict analogy with ordinary difference calculus would perhaps have us write but the sum and difference operations are the same thing in boolean arithmetic. It is more often natural in the logical context to consider an initial proposition then to compute the enlargement and finally to determine the difference so we let the variant order of terms reflect this sequence of considerations.
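Here is a quick sketch of the two operators acting on a single proposition, assuming the definitions just given: the enlargement Ef evaluates f at the shifted arguments (u + du, v + dv), and the difference Df is the boolean sum f + Ef, with + read mod 2 throughout.

```python
# Enlargement operator E and difference operator D on a proposition
# f : B x B -> B, with Ef(u, v, du, dv) = f(u + du, v + dv) and
# Df = f + Ef, all sums taken mod 2 (XOR).

def E(f):
    return lambda u, v, du, dv: f(u ^ du, v ^ dv)

def D(f):
    return lambda u, v, du, dv: f(u, v) ^ E(f)(u, v, du, dv)

f = lambda u, v: u | v   # f = ((u)(v)), inclusive disjunction
Df = D(f)

# At the cell (u, v) = (1, 1), the only way to change the value of f
# is to change both u and v at once:
print([Df(1, 1, du, dv) for du in (0, 1) for dv in (0, 1)])
# [0, 0, 0, 1]
```

The sample computation at the cell where both features hold agrees with the verbal reading of the difference map given later on.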
Given these general considerations about the operators and let's return to particular cases, and carry out the first order analysis of the transformation
By way of getting our feet back on solid ground, let's crank up our current case of a transformation of discourse, with concrete type or abstract type and let it spin through a sufficient number of turns to see how it goes, as viewed under the scope of what is probably its most straightforward view, as an elsewhen map





In the upshot there are two basins of attraction, the state and the state with the orbit making up an isolated basin and the orbit leading to the basin
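The orbits and basins can be recovered by iterating the elsewhen map over the four states of the universe. This sketch again assumes the component equations read as inclusive disjunction and logical equivalence; the state encoding as 0/1 pairs is illustrative.

```python
# Iterating the elsewhen map u' = u OR v, v' = (u IFF v) over all
# four states to find the orbits and their attractors.

def F(state):
    u, v = state
    return (u | v, 1 - (u ^ v))

for start in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    orbit = [start]
    while F(orbit[-1]) not in orbit:
        orbit.append(F(orbit[-1]))
    print(start, "->", orbit, "settles at", F(orbit[-1]))
```

Under these assumptions there are two fixed points, one an isolated basin and the other the terminus of a three-state orbit, in line with the upshot described above.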
On first examination of our present example we made a likely guess at a form of rule that would account for the finite protocol of states that we observed the system passing through, as spied in the light of its boolean state variable and that rule is well-formulated in any of these styles of notation:

In the current example, we already know in advance the program that generates the state transitions, and it is a rule of the following equivalent and easily derivable forms:

Well, the last one is not such a fall off a log, but that is exactly the purpose for which we have been developing all of the foregoing machinations.
Here is what I got when I just went ahead and calculated the finite differences willy-nilly:


To be honest, I had never thought of trying to hack the problem in such a brute-force way until just now, so I know enough to expect a not inappreciable probability of error in all that I've taken the risk to write out here, but let me forge ahead and see what I can see.
What we are looking for is — one rule to rule them all, a rule that applies to every state and works every time.
What we see at first sight in the tables above are patterns of differential features that attach to the states in each orbit of the dynamics. Looked at locally to these orbits, the isolated fixed point at is no problem, as the rule describes it pithily enough. When it comes to the other orbit, the first thing that comes to mind is to write out the law
Symbolic Method
It ought to be clear at this point that we need a more systematic symbolic method for computing the differentials of logical transformations, using the term differential in a loose way at present for all sorts of finite differences and derivatives, leaving it to another discussion to sharpen up its more exact technical senses.
For convenience of reference, let's recast our current example in the following form:

In their application to this logical transformation the operators and respectively produce the enlarged map and the difference map whose components can be given as follows.

But these initial formulas are purely definitional, and help us little to understand either the purpose of the operators or the significance of the results. Working symbolically, let's apply a more systematic method to the separate components of the mapping
A sketch of this work is presented in the following series of Figures, where each logical proposition is expanded over the basic cells of the 2-dimensional universe of discourse
Computation Summary for Logical Disjunction
The venn diagram in Figure 1.1 shows how the proposition can be expanded over the universe of discourse to produce a logically equivalent exclusive disjunction, namely,
Figure 1.1. f = ((u)(v))
Figure 1.2 expands over to give:

Figure 1.2. Ef = ((u + du)(v + dv))
Figure 1.3 expands over to produce:

Figure 1.3. Df = f + Ef
I'll break this here in case anyone wants to try doing the work on their own.
Computation Summary for Logical Equality
The venn diagram in Figure 2.1 shows how the proposition can be expanded over the universe of discourse to produce a logically equivalent exclusive disjunction, namely,
Figure 2.1. g = ((u, v))
Figure 2.2 expands over to give:

Figure 2.2. Eg = ((u + du, v + dv))
Figure 2.3 expands over to yield the form:

Figure 2.3. Dg = g + Eg
Differential: Locally Linear Approximation
'Tis a derivative from me to mine,  
— Winter's Tale, 3.2.43–44 
We've talked about differentials long enough that I think it's way past time we met with some.
When the term is being used with its more exact sense, a differential is a locally linear approximation to a function, in the context of this logical discussion, then, a locally linear approximation to a proposition.
Recall the form of the current example:

To speed things along, I will skip a mass of motivating discussion and just exhibit the simplest form of a differential for the current example of a logical transformation after which the majority of the easiest questions will have been answered in visually intuitive terms.
For we have and so we can proceed componentwise, patching the pieces back together at the end.
We have prepared the ground already by computing these terms:

As a matter of fact, computing the symmetric differences and has already taken care of the localizing part of the task by subtracting out the forms of and from the forms of and respectively. Thus all we have left to do is to decide what linear propositions best approximate the difference maps and respectively.
This raises the question: What is a linear proposition?
The answer that makes the most sense in this context is this: A proposition is just a boolean-valued function, so a linear proposition is a linear function into the boolean space
In particular, the linear functions that we want will be linear functions in the differential variables and
As it turns out, there are just four linear propositions in the associated differential universe These are the propositions that are commonly denoted: in other words:
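The four linear propositions in the differential variables can be identified computationally. This is a sketch under the assumption that "linear" here means additive over GF(2), i.e. h(x + y) = h(x) + h(y) with sums mod 2, which singles out exactly the propositions 0, du, dv, and du + dv.

```python
# Enumerate all 16 propositions on (du, dv) as truth tables and keep
# the ones additive over GF(2): h(x XOR y) = h(x) XOR h(y).

from itertools import product

cells = list(product((0, 1), repeat=2))

def is_linear(h):
    """Additivity over GF(2) for all pairs of argument vectors."""
    return all(
        h((x[0] ^ y[0], x[1] ^ y[1])) == h(x) ^ h(y)
        for x in cells for y in cells
    )

functions = [dict(zip(cells, values))
             for values in product((0, 1), repeat=4)]
linear = [f for f in functions if is_linear(f.__getitem__)]

print(len(linear))  # 4: the propositions 0, du, dv, du + dv
```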
Notions of Approximation
for equalities are so weighed  
— King Lear, Sc.1.5–7 (Quarto)  
for qualities are so weighed  
— King Lear, 1.1.5–6 (Folio) 
Justifying a notion of approximation is a little more involved in general, and especially in these discrete logical spaces, than it would be expedient for people in a hurry to tangle with right now. I will just say that there are naive or obvious notions and there are sophisticated or subtle notions that we might choose among. The latter would engage us in trying to construct proper logical analogues of Lie derivatives, so let's save that for when we have become subtle or sophisticated or both. Against or toward that day, as you wish, let's begin with an option in plain view.
Figure 1.4 illustrates one way of ranging over the cells of the underlying universe and selecting at each cell the linear proposition in that best approximates the patch of the difference map that is located there, yielding the following formula for the differential

Figure 1.4. df = linear approx to Df
Figure 2.4 illustrates one way of ranging over the cells of the underlying universe and selecting at each cell the linear proposition in that best approximates the patch of the difference map that is located there, yielding the following formula for the differential

Figure 2.4. dg = linear approx to Dg
Well, that was easy, seeing as how is already linear at each locus.
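That observation can be checked directly. Assuming the second component is logical equivalence and the difference map is Dg = g + Eg as defined earlier, the difference reduces to du + dv (mod 2) at every cell, so the linear proxy coincides with the difference map itself.

```python
# Checking that the difference map of g = (u IFF v) is already linear:
# Dg(u, v, du, dv) = g(u, v) + g(u + du, v + dv) reduces to du + dv
# (mod 2) at every one of the 16 extended states.

from itertools import product

def g(u, v):
    return 1 - (u ^ v)

def Dg(u, v, du, dv):
    return g(u, v) ^ g(u ^ du, v ^ dv)

ok = all(
    Dg(u, v, du, dv) == du ^ dv
    for u, v, du, dv in product((0, 1), repeat=4)
)
print(ok)  # True
```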
Analytic Series
We have been conducting the differential analysis of the logical transformation defined as and this means starting with the extended transformation and breaking it into an analytic series, and so on until there is nothing left to analyze any further.
As a general rule, one proceeds by way of the following stages:

In our analysis of the transformation we carried out Step 1 in the more familiar form and we have just reached Step 2 in the form where is the residual term that remains for us to examine next.
Note. I am trying to give a quick overview here, and this forces me to omit many picky details. The picky reader may wish to consult the more detailed presentation of this material at the following locations:
Let's push on with the analysis of the transformation:

For ease of comparison and computation, I will collect the Figures that we need for the remainder of the work together on one page.
Computation Summary for Logical Disjunction
Figure 1.1 shows the expansion of over to produce the expression:

Figure 1.2 shows the expansion of over to produce the expression:

In general, tells you what you would have to do, from wherever you are in the universe if you want to end up in a place where is true. In this case, where the prevailing proposition is the indication of tells you this: If and are both true where you are, then just don't change both and at once, and you will end up in a place where is true.
Figure 1.3 shows the expansion of over to produce the expression:

In general, tells you what you would have to do, from wherever you are in the universe if you want to bring about a change in the value of that is, if you want to get to a place where the value of is different from what it is where you are. In the present case, where the reigning proposition is the term of tells you this: If and are both true where you are, then you would have to change both and in order to reach a place where the value of is different from what it is where you are.
Figure 1.4 approximates by the linear form that expands over as follows:

Figure 1.5 shows what remains of the difference map when the first order linear contribution is removed, namely:

Figure 1.1. f = ((u)(v))

Figure 1.2. Ef = ((u + du)(v + dv))

Figure 1.3. Difference Map Df = f + Ef

Figure 1.4. Linear Proxy df for Df

Figure 1.5. Remainder rf = Df + df
Computation Summary for Logical Equality
Figure 2.1 shows the expansion of over to produce the expression:

Figure 2.2 shows the expansion of over to produce the expression:

In general, tells you what you would have to do, from wherever you are in the universe if you want to end up in a place where is true. In this case, where the prevailing proposition is the component of tells you this: If and are both true where you are, then change either both or neither of and at the same time, and you will attain a place where is true.
Figure 2.3 shows the expansion of over to produce the expression:

In general, tells you what you would have to do, from wherever you are in the universe if you want to bring about a change in the value of that is, if you want to get to a place where the value of is different from what it is where you are. In the present case, where the ruling proposition is the term of tells you this: If and are both true where you are, then you would have to change one or the other but not both and in order to reach a place where the value of is different from what it is where you are.
Figure 2.4 approximates by the linear form that expands over as follows:

Figure 2.5 shows what remains of the difference map when the first order linear contribution is removed, namely:

Figure 2.1. g = ((u, v))

Figure 2.2. Eg = ((u + du, v + dv))

Figure 2.3. Difference Map Dg = g + Eg

Figure 2.4. Linear Proxy dg for Dg

Figure 2.5. Remainder rg = Dg + dg
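The series of expansions above can be checked by brute force. The following sketch (my own verification code, not part of the original exposition) enumerates all sixteen assignments to u, v, du, dv and computes each operator from its definition, with + read as exclusive-or:

```python
# Brute-force check of the difference analysis of g = ((u, v)),
# i.e. g is true when u and v are equal.  Sums are over GF(2),
# so "+" in the text corresponds to "^" below.
from itertools import product

results = []
for u, v, du, dv in product((0, 1), repeat=4):
    g  = 1 if u == v else 0                  # g  = ((u, v))
    Eg = 1 if (u ^ du) == (v ^ dv) else 0    # Eg = ((u + du, v + dv))
    Dg = g ^ Eg                              # difference map Dg = g + Eg
    dg = du ^ dv                             # linear proxy dg = du + dv
    rg = Dg ^ dg                             # remainder rg = Dg + dg
    results.append((Dg, dg, rg))

# Dg coincides with du + dv at every point, so the remainder vanishes.
assert all(Dg == dg and rg == 0 for Dg, dg, rg in results)
```

The check confirms what the figures show: for logical equality the difference map Dg is already the linear form du + dv, so the remainder rg is zero everywhere.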
Have I carved enough, my lord?
Child, you are a bone.

  Leonard Cohen, "Teachers" (1967)
Visualization
In my work on Differential Logic and Dynamic Systems, I found it useful to develop several different ways of visualizing logical transformations; indeed, I devised four distinct styles of picture for the job. Thus far in our work on the mapping we've been making use of what I call the areal view of the extended universe of discourse, but as the number of dimensions climbs beyond four, it's time to bid this genre adieu and look for a style that can scale a little better. At any rate, before we proceed any further, let's first assemble the information we have gathered from several different angles, and see if it can be fitted into a coherent picture of the transformation.
In our first crack at the transformation, we simply plotted the state transitions and applied the utterly stock technique of calculating the finite differences.

A quick inspection of the first Table suggests a rule to cover the case when namely, To put it another way, the Table characterizes Orbit 1 by means of the data: Another way to convey the same information is by means of the extended proposition:

A finer combing of the second Table brings to mind a rule that partly covers the remaining cases, that is, This much information about Orbit 2 is also encapsulated by the extended proposition, which says that and are not both true at the same time, while is equal in value to and is opposite in value to
Turing Machine Example
☞ See Theme One Program for documentation of the cactus graph syntax and the propositional modeling program used below.
By way of providing a simple illustration of Cook's Theorem, namely, that "Propositional Satisfiability is NP-Complete", I will describe one way to translate finite approximations of Turing machines into propositional expressions, using the cactus language syntax for propositional calculus that I will describe in more detail as we proceed.
Stilt(k) = Space and time limited Turing machine,
with k units of space and k units of time.

Stunt(k) = Space and time limited Turing machine,
for computing the parity of a bit string, with the number of tape cells of input equal to k.
I will follow the pattern of discussion in Herbert Wilf (1986), Algorithms and Complexity, pp. 188–201, but translate his logical formalism into cactus language, which is more efficient in regard to the number of propositional clauses that are required.
A Turing machine for computing the parity of a bit string is described by means of the following Figure and Table.
Table 4. Parity Machine

  State  Symbol  Next Symbol  Ratchet  Next State
    Q      S         S'          dR        Q'

    0      0         0           +1        0
    0      1         1           +1        1
    0      #         #           -1        #
    1      0         0           +1        1
    1      1         1           +1        0
    1      #         #           -1        *
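As a quick cross-check on the table, here is a minimal simulation of the parity machine in Python. The strings "#" and "*" for the marker symbol and the resting states come from the table; the function name, the halting convention, and the tape representation are my own scaffolding.

```python
# Transition table for the parity machine, keyed by (state, symbol),
# giving (next symbol, head movement, next state) as in Table 4.
PARITY_TABLE = {
    ('0', '0'): ('0', +1, '0'),
    ('0', '1'): ('1', +1, '1'),
    ('0', '#'): ('#', -1, '#'),
    ('1', '0'): ('0', +1, '1'),
    ('1', '1'): ('1', +1, '0'),
    ('1', '#'): ('#', -1, '*'),
}

RESTING_STATES = {'#', '*'}

def run_parity(tape_string, state='0', head=1):
    """Run the machine until it reaches a resting state.
    Returns (final state, symbol under the head)."""
    tape = list(tape_string)
    while state not in RESTING_STATES:
        symbol = tape[head]
        new_symbol, move, state = PARITY_TABLE[(state, symbol)]
        tape[head] = new_symbol
        head += move
    return state, tape[head]
```

Running `run_parity('#0#')` returns `('#', '0')` and `run_parity('#1#')` returns `('*', '1')`, matching the outputs read off later from the satisfying interpretations.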
The TM has a finite automaton (FA) as one component. Let us refer to this particular FA by the name of
The tape head (that is, the read unit) will be called the scanner. The registers are also called tape cells or tape squares.
Finite Approximations
To see how each finite approximation to a given Turing machine can be given a purely propositional description, one fixes the parameter k and limits the rest of the discussion to describing the corresponding truncated machine, which is not really a full-fledged TM anymore, but just a finite automaton in disguise.
In this example, for the sake of a minimal illustration, we choose the smallest viable value of the parameter. Since the zeroth tape cell and the last tape cell are both occupied by the character "#" that is used for both the beginning-of-file and the end-of-file markers, this leaves only one digit of significant computation.
To translate the truncated machine into propositional form, we use the following collection of basic propositions, boolean variables, or logical features, depending on what one prefers to call them:
The basic propositions for describing the present state function are these:

The proposition of the form p<i>_q<j> says:

At the point-in-time p<i>, the finite state machine is in the state q<j>.
The basic propositions for describing the present register function are these:

The proposition of the form p<i>_r<j> says:

At the point-in-time p<i>, the tape head is on the tape cell r<j>.
The basic propositions for describing the present symbol function are these:

The proposition of the form p<i>_r<j>_s<k> says:

At the point-in-time p<i>, the tape cell r<j> bears the mark s<k>.
Initial Conditions
Given but a single free square on the tape, there are just two different sets of initial conditions for the finite approximation to the parity Turing machine that we are presently considering.
Initial Conditions for Tape Input "0"
The following conjunction of 5 basic propositions describes the initial conditions when the machine is started with an input of "0" in its free square:

p0_q0
p0_r1
p0_r0_s#
p0_r1_s0
p0_r2_s#
This conjunction of basic propositions may be read as follows:
At the time p0, the machine is in the state q0.
At the time p0, the scanner is reading the cell r1.
At the time p0, the cell r0 contains the symbol #.
At the time p0, the cell r1 contains the symbol 0.
At the time p0, the cell r2 contains the symbol #.
Initial Conditions for Tape Input "1"
The following conjunction of 5 basic propositions describes the initial conditions when the machine is started with an input of "1" in its free square:

p0_q0
p0_r1
p0_r0_s#
p0_r1_s1
p0_r2_s#
This conjunction of basic propositions may be read as follows:
At the time p0, the machine is in the state q0.
At the time p0, the scanner is reading the cell r1.
At the time p0, the cell r0 contains the symbol #.
At the time p0, the cell r1 contains the symbol 1.
At the time p0, the cell r2 contains the symbol #.
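Both sets of initial conditions follow one pattern: a complete machine configuration at a given time determines one basic proposition per fact. A small helper (the function name is my own; the proposition strings follow the p<i>_q<j>, p<i>_r<j>, p<i>_r<j>_s<k> naming scheme used in the displays below) makes the pattern explicit:

```python
def configuration_props(t, state, head, tape):
    """Basic propositions asserting a full machine configuration at time t."""
    props = [f'p{t}_q{state}',     # present state
             f'p{t}_r{head}']      # present head position
    for cell, symbol in enumerate(tape):
        props.append(f'p{t}_r{cell}_s{symbol}')   # present symbol in each cell
    return props
```

For tape input "0", `configuration_props(0, '0', 1, '#0#')` yields the five propositions p0_q0, p0_r1, p0_r0_s#, p0_r1_s0, p0_r2_s#.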
Propositional Program
A complete description of the truncated machine in propositional form is obtained by conjoining one of the above choices of initial conditions with all of the following sets of propositions, which serve in effect as a simple type of declarative program, telling us all we need to know about the anatomy and behavior of the truncated TM in question.
Mediate Conditions

Terminal Conditions
(( p2_q# )( p2_q* ))
State Partition

Register Partition

Symbol Partition

Interaction Conditions

Transition Relations

Interpretation of the Propositional Program
Let us now run through the propositional specification of our truncated TM, and paraphrase what it says in ordinary language.
Mediate Conditions
( p0_q# ( p1_q# ))
( p0_q* ( p1_q* ))
( p1_q# ( p2_q# ))
( p1_q* ( p2_q* ))
In the interpretation of the cactus language for propositional logic that we are using here, an expression of the form ( p ( q )) expresses a conditional, an implication, or an if-then proposition, commonly read in one of the following ways:
not p without q
p implies q
if p then q
A text string expression of the form "( p ( q ))" corresponds to a graph-theoretic data structure of the following form:

  p   q
  o---o
  |
  @

  ( p ( q ))
Taken together, the Mediate Conditions state the following:
If the machine at p0 is in the state q#, then the machine at p1 is in the state q#, and
if the machine at p0 is in the state q*, then the machine at p1 is in the state q*, and
if the machine at p1 is in the state q#, then the machine at p2 is in the state q#, and
if the machine at p1 is in the state q*, then the machine at p2 is in the state q*.
Terminal Conditions
(( p2_q# )( p2_q* ))
In cactus syntax, an expression of the form ((p)(q)) expresses the disjunction p or q. The corresponding cactus graph, here just a tree, has the following shape:

  p   q
  o   o
   \ /
    o
    |
    @

  ((p) (q))
In effect, the Terminal Conditions state the following:
At the time p2, the machine is in the state q#, or
at the time p2, the machine is in the state q*.
State Partition
(( p0_q0 ),( p0_q1 ),( p0_q# ),( p0_q* ))
(( p1_q0 ),( p1_q1 ),( p1_q# ),( p1_q* ))
(( p2_q0 ),( p2_q1 ),( p2_q# ),( p2_q* ))
In cactus syntax, an expression of the form ((e_1),(e_2),(...),(e_k)) expresses a statement to the effect that exactly one of the expressions e_j is true, for j = 1 to k. Expressions of this form are called universal partition expressions, and the corresponding painted and rooted cactus (PARC) has the following shape:

  e_1   e_2   ...   e_k
   o     o           o
   |     |           |
   o-----o----...----o
    \               /
     \             /
      \           /
       \         /
        \       /
         \     /
          \   /
           \ /
            @

  ((e_1),(e_2),(...),(e_k))
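The readings just given for implications, disjunctions, and partition expressions all follow from a single evaluation rule: juxtaposition is conjunction, and a parenthesized lobe with arguments x_1, ..., x_k is true exactly when exactly one of the arguments is false. A minimal recursive evaluator for the text form of cactus expressions (my own sketch, not the Theme One program's parser) runs as follows:

```python
# Minimal evaluator for cactus expressions: juxtaposition is conjunction,
# and a lobe ( x_1 , ..., x_k ) is true iff exactly one argument is false.
import re

def tokenize(s):
    return re.findall(r'[A-Za-z0-9_#*<>+]+|[(),]', s)

def parse(tokens):
    # A sequence is a list of items; an item is ('var', name)
    # or ('lobe', [argument sequences]).
    def parse_seq(i):
        items = []
        while i < len(tokens) and tokens[i] not in (')', ','):
            if tokens[i] == '(':
                args, i = parse_args(i + 1)
                items.append(('lobe', args))
            else:
                items.append(('var', tokens[i]))
                i += 1
        return items, i
    def parse_args(i):
        seq, i = parse_seq(i)
        args = [seq]
        while tokens[i] == ',':
            seq, i = parse_seq(i + 1)
            args.append(seq)
        return args, i + 1     # skip the closing ')'
    seq, _ = parse_seq(0)
    return seq

def value(items, env):
    v = 1                      # empty sequence is true (conjunction of nothing)
    for kind, data in items:
        if kind == 'var':
            v &= env[data]
        else:
            args = [value(a, env) for a in data]
            v &= 1 if args.count(0) == 1 else 0
    return v

def cactus_eval(s, env):
    return value(parse(tokenize(s)), env)
```

Under this rule `( p ( q ))` behaves as p implies q, `((p)(q))` as p or q, `((u, v))` as u equals v, and `((a),(b),(c))` as "exactly one of a, b, c is true".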
The State Partition segment of the propositional program consists of three universal partition expressions, taken in conjunction, expressing the condition that the machine has to be in one and only one of its states at each point in time under consideration. In short, we have the constraint:
At each of the points in time p<i>, for i in the set {0, 1, 2}, the machine can be in exactly one state q<j>, for j in the set {0, 1, #, *}.
Register Partition
(( p0_r0 ),( p0_r1 ),( p0_r2 ))
(( p1_r0 ),( p1_r1 ),( p1_r2 ))
(( p2_r0 ),( p2_r1 ),( p2_r2 ))
The Register Partition segment of the propositional program consists of three universal partition expressions, taken in conjunction, saying that the read head must be reading one and only one of the registers or tape cells available to it at each of the points in time under consideration. In sum:
At each of the points in time p<i>, for i in {0, 1, 2}, the tape head is reading exactly one cell r<j>, for j in {0, 1, 2}.
Symbol Partition
(( p0_r0_s0 ),( p0_r0_s1 ),( p0_r0_s# ))
(( p0_r1_s0 ),( p0_r1_s1 ),( p0_r1_s# ))
(( p0_r2_s0 ),( p0_r2_s1 ),( p0_r2_s# ))
(( p1_r0_s0 ),( p1_r0_s1 ),( p1_r0_s# ))
(( p1_r1_s0 ),( p1_r1_s1 ),( p1_r1_s# ))
(( p1_r2_s0 ),( p1_r2_s1 ),( p1_r2_s# ))
(( p2_r0_s0 ),( p2_r0_s1 ),( p2_r0_s# ))
(( p2_r1_s0 ),( p2_r1_s1 ),( p2_r1_s# ))
(( p2_r2_s0 ),( p2_r2_s1 ),( p2_r2_s# ))
The Symbol Partition segment of the propositional program consists of nine universal partition expressions, taken in conjunction, stipulating that there has to be one and only one symbol in each of the registers at each point in time under consideration. In short, we have:
At each of the points in time p<i>, for i in {0, 1, 2}, and in each of the tape registers r<j>, for j in {0, 1, 2}, there can be exactly one sign s<k>, for k in {0, 1, #}.
Interaction Conditions

In briefest terms, the Interaction Conditions simply express the circumstance that the mark on a tape cell cannot change between two points in time unless the tape head is over the cell in question at the earlier of those points in time. All that remains is to see how they manage to say this.
Consider a cactus expression of the following form:

( p<i>_r<j>_s<k> ( p<i>_r<j> )( p<i+1>_r<j>_s<k> ))
This expression has the corresponding cactus graph:
  p<i>_r<j>     p<i+1>_r<j>_s<k>
      o              o
       \            /
        \          /
         \        /
          \      /
           \    /
            \  /
             o  p<i>_r<j>_s<k>
             |
             @
A propositional expression of this form can be read as follows:

If, at the time p<i>, the tape cell r<j> bears the mark s<k>,
but it is not the case that, at the time p<i>, the tape head is on the tape cell r<j>,
then, at the time p<i+1>, the tape cell r<j> still bears the mark s<k>.
The eighteen clauses of the Interaction Conditions simply impose one such constraint on symbol changes for each combination of the times p0 and p1, the registers r0, r1, r2, and the symbols 0, 1, #.
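The eighteen clauses can be generated mechanically. The sketch below assumes the symbol alphabet {0, 1, #} and transitions running from p0 to p1 and from p1 to p2; with three registers that yields exactly the eighteen clauses counted above:

```python
# Generate the Interaction Conditions in cactus syntax: for each time i,
# register j, and symbol k, if cell j bears mark k at time i, then either
# the head is on cell j at time i, or cell j still bears mark k at time i+1.
TIMES = (0, 1)              # transitions from p0 to p1 and from p1 to p2
REGISTERS = (0, 1, 2)
SYMBOLS = ('0', '1', '#')   # assumed symbol alphabet

interaction_clauses = [
    f'( p{i}_r{j}_s{k} ( p{i}_r{j} )( p{i+1}_r{j}_s{k} ))'
    for i in TIMES for j in REGISTERS for k in SYMBOLS
]
```

Each generated string matches the clause shape displayed just above.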
Transition Relations

The Transition Relation segment of the propositional program consists of sixteen implication statements with complex antecedents and consequents. Taken together, these give propositional expression to the TM Figure and Table that were given at the outset.
Just by way of a single example, consider the clause:

( p0_q0 p0_r1 p0_r1_s0 ( p1_q0 p1_r2 p1_r1_s0 ))

This complex implication statement can be read to say:

If, at the time p0, the machine is in the state q0, and
at the time p0, the scanner is reading the cell r1, and
at the time p0, the tape cell r1 contains a 0,

then, at the time p1, the machine is in the state q0, and
at the time p1, the scanner is reading the cell r2, and
at the time p1, the tape cell r1 contains a 0.
Computation
The propositional program for the truncated machine uses the following set of basic propositions or boolean variables:



This means that the propositional program itself is nothing but a single proposition, a boolean function of the 57 variables in question.

An assignment of boolean values to the above set of boolean variables is called an interpretation of the proposition, and any interpretation that makes the proposition evaluate to true is referred to as a satisfying interpretation of the proposition. Another way to specify interpretations, instead of giving them as bit vectors in B^57 and trying to remember some arbitrary ordering of variables, is to give them in the form of singular propositions, that is, a conjunction in which each of the 57 boolean variables appears exactly once, either asserted or negated. Even more briefly, the same information can be communicated simply by giving the conjunction of the asserted variables, with the understanding that each of the others is negated.
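The shorthand described in the last paragraph, giving just the conjunction of the asserted variables, is simple to make precise. A toy sketch (the function name and the miniature variable list are mine; negation is written in the cactus style, with parentheses):

```python
def singular_proposition(asserted, variables):
    """Render an interpretation as a conjunction of literals:
    each variable appears either asserted, or negated in (...) form."""
    return ' '.join(v if v in asserted else f'({v})' for v in variables)
```

For example, `singular_proposition({'a', 'c'}, ['a', 'b', 'c'])` returns `'a (b) c'`, negating the one unasserted variable.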
A satisfying interpretation of the proposition supplies us with all the information of a complete execution history for the corresponding program, and so all we have to do in order to get the output of the program is to read off the proper part of the data from the expression of this interpretation.
Output
One component of the program that I wrote some years ago finds all the satisfying interpretations of propositions expressed in cactus syntax. It's not a polynomial-time algorithm, as you may guess, but it was just barely efficient enough to do this example in the 500 K of spare memory that I had on an old 286 PC in about 1989, so I will give you the actual outputs from those trials.
Output Conditions for Tape Input "0"
Consider the proposition that we get by conjoining the proposition that describes the initial conditions for tape input "0" with the proposition that describes the truncated Turing machine. As it turns out, this proposition has a single satisfying interpretation. This interpretation is expressible in the form of a singular proposition, which can in turn be indicated by its positive logical features, as shown in the following display:
p0_q0
p0_r1
p0_r0_s#
p0_r1_s0
p0_r2_s#

p1_q0
p1_r2
p1_r2_s#
p1_r0_s#
p1_r1_s0

p2_q#
p2_r1
p2_r0_s#
p2_r1_s0
p2_r2_s#
The Output Conditions for Tape Input "0" can be read as follows:
At the time p0, the machine is in the state q0, and
at the time p0, the scanner is reading the cell r1, and
at the time p0, the cell r0 contains the symbol #, and
at the time p0, the cell r1 contains the symbol 0, and
at the time p0, the cell r2 contains the symbol #, and

at the time p1, the machine is in the state q0, and
at the time p1, the scanner is reading the cell r2, and
at the time p1, the cell r2 contains the symbol #, and
at the time p1, the cell r0 contains the symbol #, and
at the time p1, the cell r1 contains the symbol 0, and

at the time p2, the machine is in the state q#, and
at the time p2, the scanner is reading the cell r1, and
at the time p2, the cell r0 contains the symbol #, and
at the time p2, the cell r1 contains the symbol 0, and
at the time p2, the cell r2 contains the symbol #.
The output being the symbol that rests under the tape head if and when the machine reaches one of its resting states, we get the result that the output for the input "0" is 0.
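The satisfying interpretation above is nothing other than the execution history of the machine, a point we can check by simulating the parity machine on input "0" and collecting the basic propositions made true at each time step (the simulation scaffolding is mine; the proposition names are as in the display above):

```python
# Simulate the truncated parity machine and record, at each of the
# times p0, p1, p2, the basic propositions made true by the run.
TABLE = {
    ('0', '0'): ('0', +1, '0'),
    ('0', '1'): ('1', +1, '1'),
    ('0', '#'): ('#', -1, '#'),
    ('1', '0'): ('0', +1, '1'),
    ('1', '1'): ('1', +1, '0'),
    ('1', '#'): ('#', -1, '*'),
}

def trace_props(tape_string, steps=2, state='0', head=1):
    tape = list(tape_string)
    props = set()
    for t in range(steps + 1):
        props.add(f'p{t}_q{state}')       # state at time t
        props.add(f'p{t}_r{head}')        # head position at time t
        for cell, symbol in enumerate(tape):
            props.add(f'p{t}_r{cell}_s{symbol}')   # tape contents at time t
        if t < steps:
            new_symbol, move, state = TABLE[(state, tape[head])]
            tape[head] = new_symbol
            head += move
    return props
```

Calling `trace_props('#0#')` returns exactly the fifteen positive features listed in the display for tape input "0".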
Output Conditions for Tape Input "1"
Consider the proposition that we get by conjoining the proposition that describes the initial conditions for tape input "1" with the proposition that describes the truncated Turing machine. As it turns out, this proposition has a single satisfying interpretation. This interpretation is expressible in the form of a singular proposition, which can in turn be indicated by its positive logical features, as shown in the following display:
p0_q0
p0_r1
p0_r0_s#
p0_r1_s1
p0_r2_s#

p1_q1
p1_r2
p1_r2_s#
p1_r0_s#
p1_r1_s1

p2_q*
p2_r1
p2_r0_s#
p2_r1_s1
p2_r2_s#
The Output Conditions for Tape Input "1" can be read as follows:
At the time p0, the machine is in the state q0, and
at the time p0, the scanner is reading the cell r1, and
at the time p0, the cell r0 contains the symbol #, and
at the time p0, the cell r1 contains the symbol 1, and
at the time p0, the cell r2 contains the symbol #, and

at the time p1, the machine is in the state q1, and
at the time p1, the scanner is reading the cell r2, and
at the time p1, the cell r2 contains the symbol #, and
at the time p1, the cell r0 contains the symbol #, and
at the time p1, the cell r1 contains the symbol 1, and

at the time p2, the machine is in the state q*, and
at the time p2, the scanner is reading the cell r1, and
at the time p2, the cell r0 contains the symbol #, and
at the time p2, the cell r1 contains the symbol 1, and
at the time p2, the cell r2 contains the symbol #.
The output being the symbol that rests under the tape head when and if the machine reaches one of its resting states, we get the result that the output for the input "1" is 1.
Document History
Ontology List : Feb–Mar 2004
 http://web.archive.org/web/20090402071440/http://suo.ieee.org/ontology/msg05457.html
 http://web.archive.org/web/20090331180117/http://suo.ieee.org/ontology/msg05458.html
 http://web.archive.org/web/20090331184858/http://suo.ieee.org/ontology/msg05459.html
 http://web.archive.org/web/20090331214624/http://suo.ieee.org/ontology/msg05460.html
 http://web.archive.org/web/20090331203711/http://suo.ieee.org/ontology/msg05461.html
 http://web.archive.org/web/20090331214723/http://suo.ieee.org/ontology/msg05462.html
 http://web.archive.org/web/20090331184056/http://suo.ieee.org/ontology/msg05463.html
 http://web.archive.org/web/20090330053213/http://suo.ieee.org/ontology/msg05464.html
 http://web.archive.org/web/20090401001227/http://suo.ieee.org/ontology/msg05465.html
 http://web.archive.org/web/20090331184904/http://suo.ieee.org/ontology/msg05466.html
 http://web.archive.org/web/20090330053236/http://suo.ieee.org/ontology/msg05467.html
 http://web.archive.org/web/20090331215933/http://suo.ieee.org/ontology/msg05469.html
 http://web.archive.org/web/20090331203716/http://suo.ieee.org/ontology/msg05470.html
 http://web.archive.org/web/20090331184908/http://suo.ieee.org/ontology/msg05471.html
 http://web.archive.org/web/20090331164444/http://suo.ieee.org/ontology/msg05472.html
 http://web.archive.org/web/20090331192012/http://suo.ieee.org/ontology/msg05473.html
 http://web.archive.org/web/20090331190123/http://suo.ieee.org/ontology/msg05474.html
 http://web.archive.org/web/20090331184101/http://suo.ieee.org/ontology/msg05475.html
 http://web.archive.org/web/20090331203730/http://suo.ieee.org/ontology/msg05476.html
 http://web.archive.org/web/20070705085032/http://suo.ieee.org/ontology/msg05477.html
 http://web.archive.org/web/20090331164450/http://suo.ieee.org/ontology/msg05479.html
NKS Forum : Feb–Jun 2004
 http://forum.wolframscience.com/archive/topic/2281.html
 http://forum.wolframscience.com/showthread.php?threadid=228
 http://forum.wolframscience.com/printthread.php?threadid=228&perpage=50
 http://forum.wolframscience.com/showthread.php?postid=664#post664
 http://forum.wolframscience.com/showthread.php?postid=666#post666
 http://forum.wolframscience.com/showthread.php?postid=677#post677
 http://forum.wolframscience.com/showthread.php?postid=684#post684
 http://forum.wolframscience.com/showthread.php?postid=689#post689
 http://forum.wolframscience.com/showthread.php?postid=697#post697
 http://forum.wolframscience.com/showthread.php?postid=708#post708
 http://forum.wolframscience.com/showthread.php?postid=721#post721
 http://forum.wolframscience.com/showthread.php?postid=722#post722
 http://forum.wolframscience.com/showthread.php?postid=725#post725
 http://forum.wolframscience.com/showthread.php?postid=733#post733
 http://forum.wolframscience.com/showthread.php?postid=756#post756
 http://forum.wolframscience.com/showthread.php?postid=759#post759
 http://forum.wolframscience.com/showthread.php?postid=764#post764
 http://forum.wolframscience.com/showthread.php?postid=766#post766
 http://forum.wolframscience.com/showthread.php?postid=767#post767
 http://forum.wolframscience.com/showthread.php?postid=773#post773
 http://forum.wolframscience.com/showthread.php?postid=775#post775
 http://forum.wolframscience.com/showthread.php?postid=777#post777
 http://forum.wolframscience.com/showthread.php?postid=791#post791
 http://forum.wolframscience.com/showthread.php?postid=1458#post1458
 http://forum.wolframscience.com/showthread.php?postid=1461#post1461
 http://forum.wolframscience.com/showthread.php?postid=1463#post1463
 http://forum.wolframscience.com/showthread.php?postid=1464#post1464
 http://forum.wolframscience.com/showthread.php?postid=1467#post1467
 http://forum.wolframscience.com/showthread.php?postid=1469#post1469
 http://forum.wolframscience.com/showthread.php?postid=1470#post1470
 http://forum.wolframscience.com/showthread.php?postid=1471#post1471
 http://forum.wolframscience.com/showthread.php?postid=1473#post1473
 http://forum.wolframscience.com/showthread.php?postid=1475#post1475
 http://forum.wolframscience.com/showthread.php?postid=1479#post1479
 http://forum.wolframscience.com/showthread.php?postid=1489#post1489
 http://forum.wolframscience.com/showthread.php?postid=1490#post1490
Inquiry List : Feb–Jun 2004
 http://stderr.org/pipermail/inquiry/2004-February/thread.html#1228
 http://stderr.org/pipermail/inquiry/2004-March/thread.html#1235
 http://stderr.org/pipermail/inquiry/2004-March/thread.html#1240
 http://stderr.org/pipermail/inquiry/2004-June/thread.html#1630
 http://stderr.org/pipermail/inquiry/2004-February/001228.html
 http://stderr.org/pipermail/inquiry/2004-February/001230.html
 http://stderr.org/pipermail/inquiry/2004-February/001231.html
 http://stderr.org/pipermail/inquiry/2004-February/001232.html
 http://stderr.org/pipermail/inquiry/2004-February/001233.html
 http://stderr.org/pipermail/inquiry/2004-February/001234.html
 http://stderr.org/pipermail/inquiry/2004-March/001235.html
 http://stderr.org/pipermail/inquiry/2004-March/001236.html
 http://stderr.org/pipermail/inquiry/2004-March/001237.html
 http://stderr.org/pipermail/inquiry/2004-March/001238.html
 http://stderr.org/pipermail/inquiry/2004-March/001240.html
 http://stderr.org/pipermail/inquiry/2004-March/001242.html
 http://stderr.org/pipermail/inquiry/2004-March/001243.html
 http://stderr.org/pipermail/inquiry/2004-March/001244.html
 http://stderr.org/pipermail/inquiry/2004-March/001245.html
 http://stderr.org/pipermail/inquiry/2004-March/001246.html
 http://stderr.org/pipermail/inquiry/2004-March/001247.html
 http://stderr.org/pipermail/inquiry/2004-March/001248.html
 http://stderr.org/pipermail/inquiry/2004-March/001249.html
 http://stderr.org/pipermail/inquiry/2004-March/001255.html
 http://stderr.org/pipermail/inquiry/2004-June/001630.html
 http://stderr.org/pipermail/inquiry/2004-June/001631.html
 http://stderr.org/pipermail/inquiry/2004-June/001632.html
 http://stderr.org/pipermail/inquiry/2004-June/001633.html
 http://stderr.org/pipermail/inquiry/2004-June/001634.html
 http://stderr.org/pipermail/inquiry/2004-June/001635.html
 http://stderr.org/pipermail/inquiry/2004-June/001636.html
 http://stderr.org/pipermail/inquiry/2004-June/001637.html
 http://stderr.org/pipermail/inquiry/2004-June/001638.html
 http://stderr.org/pipermail/inquiry/2004-June/001639.html
 http://stderr.org/pipermail/inquiry/2004-June/001640.html
 http://stderr.org/pipermail/inquiry/2004-June/001641.html
 http://stderr.org/pipermail/inquiry/2004-June/001642.html