Statistical time

Time is number, measure, information

This piece deals with the physical meaning of time as I understand it.

Time is a measure of how many events happen in a given situation: it means noticing them, or at least counting them (Aristotle). It is a measure of the knowledge an observer has about its surroundings, or a measure of the interaction between regions of space and their surroundings. Time, therefore, is not universal: it is proper to the observer(s), to the observed, and, in general, to any and every arbitrarily delimited physical system.

We can build a more formal relationship between time and knowledge or information: we'll use two ideal cases to formulate a plausible hypothesis, and then plunge into a more realistic world with it.

An observer immersed in a space with no events would wait an infinity for an event to happen; in other words, the unit of time proper to that observer would be infinite. The observer can't notice anything happening, so it cannot use a definite unit of time for itself. There is nothing else the observer can define its unit of time relative to, and one can't use an infinity of units; therefore, in this extreme case, it is more reasonable to consider its unit of time as being infinite.
To put it differently: if the probability p of an event is equal to zero, the corresponding unit of time u of the event counter (the observer) is infinite: p = 0 → u = ∞.

An observer who is absolutely sure an event will happen (has total knowledge about it) has a unit of time equal to zero. Total knowledge about an event means exactly noticing that event actually happening (being the event itself). To put it differently: if the probability p of an event is equal to one, the corresponding unit of time u of the event counter (the observer) is equal to zero: p = 1 → u = 0.

Briefly: p = 0 → u = ∞ and p = 1 → u = 0.

A simple monotonic function which maps the probability to the unit of time is the logarithm, so we can write: u = −k·ln p (k is a dimensional constant).
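The two limiting cases can be checked numerically. A minimal sketch, with an illustrative function name and k chosen as 1 for simplicity:

```python
import math

def unit_of_time(p, k=1.0):
    """Map an event's probability p to the observer's unit of time,
    u = -k * ln(p).  p = 1 (certainty) gives u = 0; p -> 0 gives
    u -> infinity, the two ideal cases above."""
    if p == 0:
        return math.inf  # no events at all: the unit of time is infinite
    return -k * math.log(p)

print(unit_of_time(1.0))  # certain event -> 0.0
print(unit_of_time(0.0))  # impossible event -> inf
print(unit_of_time(0.5))  # intermediate case -> ln 2, about 0.693
```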

Now for the real world, where multiple events happen stochastically: an observer noticing a number n of events, each happening with a probability pᵢ (where i labels each event), can calculate an average unit of time using the logarithmic relation above: ⟨u⟩ = −k Σᵢ pᵢ ln pᵢ (the sum running over i = 1, …, n). Any thing's proper unit of time is proportional to the informational entropy of its interacting context.

In the mechanical world, where classical time is used, anything is bound to happen with absolute certainty, so the average time unit ⟨u⟩ calculated above is zero. However, given the history of our understanding of time, we have used a constant unit of time as a gauge for all the other durations we'd like to measure. This mechanical gauge has been chosen because the physical process used in defining it is extremely regular (it brings no new knowledge to the observer: the observer already knows what will happen with this regular system when it notices other, independent, events). So we add it to the relation above, to accommodate our present, conventional, use of the mechanical, constant, unit of time; it represents the conventionally transferred knowledge from a purely mechanical system.
So time and knowledge/interaction are directly related: ⟨u⟩ = −k Σᵢ pᵢ ln pᵢ + cst.
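The full relation, entropy plus the conventional mechanical gauge, can be sketched as follows (function and parameter names are illustrative, not from the original text):

```python
import math

def statistical_time_unit(probabilities, k=1.0, cst=0.0):
    """Average unit of time <u> = -k * sum(p_i * ln p_i) + cst:
    the informational (Shannon) entropy of the interacting context,
    plus the constant mechanical gauge cst."""
    return -k * sum(p * math.log(p) for p in probabilities if p > 0) + cst

# A perfectly regular, mechanical context: one certain event, entropy zero,
# so only the conventional gauge remains.
print(statistical_time_unit([1.0], cst=1.0))  # -> 1.0

# A maximally uncertain context of four equiprobable events.
print(statistical_time_unit([0.25] * 4))      # -> ln 4, about 1.386
```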
Finally: any thing's proper unit of time is linearly dependent on the informational entropy of its interacting context (that part of the environment which interacts with it). Let's call it statistical time.

So, unless you know a way to unknow or uninteract with things, you can't turn back time. This asymmetry should pervade any self-respecting formalism about the relevant parts of nature.


  1. The above is an idea which started to define itself in 1986, took its present form in 1996, and was first published on the blog in 2005.
  2. I learned about Aristotle's view of (discrete) time from a reading of B. Russell's A History of Western Philosophy, around 2014; I then discovered I'm in good company.
  3. The notion of observer used above does not imply consciousness, it only implies interaction (a rock hitting another, light beaming on water, a cat feeling the movements of a human or a vacuum cleaner, an apple percolated by some neutrinos, the egg fried by some heat and percolated by some gamma ray).
  4. If I were working as a research physicist now, I would attempt rescaling any equation involving time, using the entropy of the physical object described as the (new) unit of time: there might be new physics lurking there.
  5. Of course, there may be independent collections of events, with probabilities drawn from unrelated distributions, and this might provide further refinements to the unit of statistical time.
  6. Maybe it can be used as the unit of time for state-collapsing in (a realist) quantum mechanics, e.g. when a threshold of information/number of events has been reached for a particle-wave, its state collapses.