3. Ontology

Ontology is generally considered the study of what exists. It’s a philosophical area of study because there is no clear-cut answer as to what exists. Most agree that trees exist. Does the number “three” exist?

The point of this section is not to argue that the current theory will provide the answer to what exists and what doesn’t. Rather, it lays out how this theory will use the terms “exists” and “real” in the context of the framework, and explains how certain fundamental concepts like “causality” and “information” relate to these terms.

Existence

As stated in the axioms, this theory will confine the term “exists” to physical things, i.e., those things identified by physics, like planets, tables, electrons, etc. To put it in terms of the framework, something exists if it is a mechanism for at least one task.

Input (x1, x2, …, xn) –> [mechanism] –> Output (y1, y2, …, ym)

Another way to say this is that something exists if it interacts with the environment. If it doesn’t interact with the environment, we can never know about it and would have no reason to care about it.
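
To make the schema concrete, here is a minimal sketch in Python. Nothing in it comes from the original text: the names (exists_for_task, lever) and the lever example are invented purely for illustration, and the framework itself does not prescribe any particular representation.

```python
# Illustrative sketch of the task schema:
# Input (x1, ..., xn) --[mechanism]--> Output (y1, ..., ym).
from typing import Callable, Tuple

Inputs = Tuple[float, ...]
Outputs = Tuple[float, ...]

def exists_for_task(mechanism: Callable[[Inputs], Outputs], inputs: Inputs) -> Outputs:
    """Something 'exists' in this sense if it serves as the mechanism
    for at least one such input -> output mapping."""
    return mechanism(inputs)

# Invented example: a lever (the mechanism) turns an applied force and an
# arm ratio (the inputs) into a lifted load (the output).
def lever(inputs: Inputs) -> Outputs:
    force, arm_ratio = inputs
    return (force * arm_ratio,)

print(exists_for_task(lever, (10.0, 3.0)))  # (30.0,)
```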

Patterns

Our axioms say patterns are real. The best explanation of what this axiom is trying to say is probably given by Daniel Dennett’s paper “Real Patterns”. The bottom line is that patterns are real abstract things. This concept is useful to us because our framework relies heavily on patterns. The specific set of inputs to a task constitutes a pattern. The combination of input, mechanism, and output for a given task constitutes a pattern. The set of all input/output combinations for a given mechanism constitutes a pattern. In fact, for anything we identify as existing, we do so by recognizing a set of input/output combinations for that thing, i.e., the patterns associated with that thing, although sometimes recognizing the outputs alone is enough, if those outputs are unique to that thing.

Causality

Causality brings in the topic of our third axiom: things change. Causality is specifically about how things change over time. More specifically, causality is about patterns in how things change.

To my knowledge, causality was most famously addressed by Aristotle and his Four Causes. Over time, causal analysis got whittled down to just cause and effect. I think this whittling was unfortunate because Aristotle’s analysis was accurate, more descriptive, and therefore more useful. Aristotle recognized four causes: the material, efficient, formal, and final. His paradigm case was the creation of a sculpture. The material cause was the material from which the sculpture was made, say, clay. The efficient cause was the sculptor. The formal cause was the result, say, a statue of a man. The final cause was the reason the sculpture was made, say, a commission from the city. Instead of calling these “causes”, I think they would be better described as aspects of causation.

It should be readily apparent how Aristotle’s causes map onto the current framework. The material cause is the input, the efficient cause is the mechanism, and the formal cause is the output. The final cause will be addressed in the upcoming section on purpose/function.

The current framework describes causality as follows: a given mechanism causes an output when presented with the input. I should point out that causality as described here is relative. You could just as easily reverse the roles of the input and the mechanism, saying a given input, when presented with the mechanism, causes the output. That’s why, I presume, in the standard discussion of “cause” and “effect”, the input and mechanism are lumped together as “the cause”. But breaking them out lets you focus on one (think of the many things a given artist could make) or the other (think of the many things different artists could make from a given lump of clay).

Information

There’s another way of looking at causality which is important. For most people, causality is about cause and effect, and, more specifically, some given physical thing or event having “causal power”, the exercise of which leads to some other physical thing or event, the effect. A different way of looking at things is to take the “block universe” view, in which time is simply a dimension alongside the three dimensions of space. We can look at the pattern of stuff at any given time and place, and compare that to the pattern at (approximately) the same place but at a different time. For example, we could look at a billiards table at t1, where the cue ball is headed directly toward the 8 ball with some velocity. We could then look at time t2, where the cue ball is stationary (adjacent to where the 8 ball was) and the 8 ball is headed away from the cue ball with the same velocity that the cue ball had. In this situation we note a correlation between the velocity of the cue ball at t1 and the velocity of the 8 ball at t2. We know that looking at one will tell us the other, and this goes in both directions.

There’s a concept in Information Theory called mutual information which describes this situation. We say that one system has mutual information relative to another system when knowing something about the first tells you something about the second, and vice versa, with some probability greater than chance. To use the cue ball example, we say the cue ball at time t1 shares mutual information with the 8 ball at time t2. Again, as just stated, this mutual information works in both directions. Knowing the former affords a prediction of the latter, and knowing the latter affords a retrodiction of the former.

The value of mutual information is statistically derived from the physics involved in the evolution of the system from t1 to t2. Thus, information in this sense is a physical property. More specifically, mutual information is a physically derived relation between physical systems, and therefore is mind-independent.
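
As a toy numerical illustration (the probabilities below are made up, not derived from any actual billiards physics), here is the standard mutual information formula I(X;Y) = Σ p(x,y) log2[ p(x,y) / (p(x)p(y)) ] applied to a cue-ball/8-ball style correlation:

```python
# Made-up joint distribution: X is the cue ball's state at t1 (moving/still),
# Y is the 8 ball's state at t2, and the two agree 90% of the time.
from math import log2

joint = {                      # p(x, y)
    ("moving", "moving"): 0.45,
    ("moving", "still"):  0.05,
    ("still",  "moving"): 0.05,
    ("still",  "still"):  0.45,
}

px, py = {}, {}
for (x, y), p in joint.items():   # marginal distributions p(x) and p(y)
    px[x] = px.get(x, 0) + p
    py[y] = py.get(y, 0) + p

mi = sum(p * log2(p / (px[x] * py[y])) for (x, y), p in joint.items() if p > 0)
print(f"I(X;Y) = {mi:.3f} bits")  # > 0: knowing one state tells you about the other
```

The result (about 0.53 bits here) is symmetric in X and Y, which is the “works in both directions” point made above.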

Computation

Much heat (and not so much light) is generated in discussions of the relation between consciousness and computers. Here I simply want to establish the ontology of computation with regard to the current framework.

Information theory has established that all computation can be reduced to some combination of four basic operations: COPY, AND, OR, and NOT. There are other operations, such as EXCLUSIVE OR (XOR), which means something like “x OR y, but not both”, but these are just combinations of the first four. Thus, “a XOR b” is the same as “(a OR b) AND NOT (a AND b)”.
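
The XOR claim can be checked mechanically. A brief sketch in Python that enumerates every input combination:

```python
# Verify that XOR reduces to AND, OR, and NOT by checking all four input cases.
from itertools import product

for a, b in product([False, True], repeat=2):
    xor_direct = (a != b)                        # "x OR y, but not both"
    xor_built  = (a or b) and not (a and b)      # built from OR, AND, NOT
    assert xor_direct == xor_built
print("a XOR b == (a OR b) AND NOT (a AND b) for all inputs")
```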

[Actually, it seems to me that we could get by with three basic operations, as AND can be expressed using only OR and NOT, and vice versa. For example, “a AND b” is equivalent to “NOT ( (NOT a) OR (NOT b) )”. Likewise, “a OR b” is equivalent to “NOT ( (NOT a) AND (NOT b) )”. These equivalences are De Morgan’s laws.]
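
The same enumeration confirms the bracketed equivalences:

```python
# De Morgan's laws: AND from OR and NOT, and OR from AND and NOT.
from itertools import product

for a, b in product([False, True], repeat=2):
    assert (a and b) == (not ((not a) or (not b)))   # AND via OR and NOT
    assert (a or b)  == (not ((not a) and (not b)))  # OR via AND and NOT
print("De Morgan's laws hold for all inputs")
```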

So what is the significance of computation in the current framework? Computation is sometimes referred to as information processing. So what does that mean? It means that computation processes mutual information. Assume x is a physical system that has some mutual information with respect to y. After the COPY operation is applied to x, the output of that operation, say z, has (approximately) the same mutual information with respect to y. In fact, we can define COPY as any operation whose output retains the same mutual information as the input. Likewise, the NOT operation produces an output that retains the input’s mutual information with respect to y, but with the correlation inverted. The AND and OR operations similarly produce outputs with mutual information relative to their inputs, but the mutual information of the separate inputs is combined in the output.
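
A small simulation makes the COPY and NOT cases concrete. The data here are invented (x simply tracks y 90% of the time), and the helper mutual_info is just a plug-in estimate of the formula shown earlier; the point is only that copying x leaves its mutual information with y unchanged, and negating x leaves it unchanged while flipping the direction of the correlation.

```python
# Empirical check on made-up data: COPY and NOT preserve mutual information with y.
import random
from math import log2

def mutual_info(pairs):
    """Plug-in estimate of I(A;B) in bits from a list of (a, b) samples."""
    n = len(pairs)
    pab, pa, pb = {}, {}, {}
    for a, b in pairs:
        pab[(a, b)] = pab.get((a, b), 0) + 1 / n
        pa[a] = pa.get(a, 0) + 1 / n
        pb[b] = pb.get(b, 0) + 1 / n
    return sum(p * log2(p / (pa[a] * pb[b])) for (a, b), p in pab.items())

random.seed(0)
y = [random.random() < 0.5 for _ in range(10_000)]
x = [yi if random.random() < 0.9 else not yi for yi in y]  # x tracks y 90% of the time

copied  = list(x)               # COPY: same values
negated = [not xi for xi in x]  # NOT: inverted values

print(mutual_info(list(zip(x, y))))        # baseline I(x;y)
print(mutual_info(list(zip(copied, y))))   # identical: COPY preserves it
print(mutual_info(list(zip(negated, y))))  # identical value, correlation reversed
```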

Bottom line: information and computation can be physically described without reference to meaning, intentions, or minds. But then, where do these things (meaning, intentions, minds) come from? Mutual information is an affordance for these things. If x has mutual information with respect to y, and some system “cares” about y such that it “wants” to do a given action when y is the case, then that system can achieve that “goal” by attaching an action to x.
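
A minimal sketch of that last point, with every name and number invented for the example: a system that “cares” about rain (y) but can only observe a barometer reading (x, which merely correlates with rain) can still pursue its “goal” by attaching the action to x.

```python
# Illustrative only: the action is attached to x (an observable correlated with y),
# not to y itself, and succeeds about as often as x predicts y.
def barometer_low(pressure_hpa: float) -> bool:
    return pressure_hpa < 1000.0        # x: correlated with rain, not rain itself

def close_the_windows() -> None:
    print("closing the windows")        # the action the system "wants" when y holds

def control_loop(pressure_hpa: float) -> None:
    if barometer_low(pressure_hpa):     # trigger on x as a stand-in for y
        close_the_windows()

control_loop(987.0)  # acts, because x suggests y is (probably) the case
```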