R A N D O M

Have you ever heard the term “random number”? What exactly is a “random number”? Can 6 be a “random number”? Is there some sort of “order” to these “random numbers”? By “order”, I’m referring to the relation by which 4 is < (less than) 5. Don’t confuse that with magnitude, which is the relation that “ignores” the difference between -4 and 4. Is it possible to trade one of these properties for the other just so a “random number” can spring out of nowhere? How exactly would such an idea be used, with respect to order and magnitude, to generate “random numbers”? Can a “random number” be generated as unpredictably as a cosmic ray passing through a particular square millimeter within the next millisecond or so? When people bring up “random numbers”, could they possibly be referring to algorithmic randomness? Any number we can put a “name” to isn’t “random”, right?

Think of randomness in terms of different orders of entropy. As I stated in the previous blog post from July, entropy is the best way of quantifying randomness. Maximum entropy on all orders (whether first-order conditionals, second-order conditionals, the independent distribution, etc.) means that past information gives you no advantage: analyzing all past values with respect to each other would not help you predict the next one. To get there, all of these distributions should be uniform, since the uniform distribution is the one that maximizes entropy; if that holds for every conditional probability as well, then you have a distribution that is purely random.
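If you want to see the “uniform maximizes entropy” point concretely, here is a minimal sketch in plain Python (the function name and the example distributions are mine, not from any particular library): it compares the Shannon entropy of a uniform six-outcome source against a skewed one.

import math

def shannon_entropy(probs):
    # Shannon entropy, in bits, of a discrete distribution given as probabilities.
    return -sum(p * math.log2(p) for p in probs if p > 0)

uniform = [1/6] * 6                          # fair six-outcome source
biased = [0.5, 0.1, 0.1, 0.1, 0.1, 0.1]      # same outcomes, skewed toward the first

print(shannon_entropy(uniform))  # ~2.585 bits, the maximum for 6 outcomes (log2 of 6)
print(shannon_entropy(biased))   # ~2.161 bits, strictly less: the skew is an edge for a guesser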

S = k ln Ω

S = entropy

k = Boltzmann’s constant

ln = natural logarithm

Ω = number of microstates
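As a quick worked example (my numbers, using the standard value of Boltzmann’s constant): for a system with six equally probable microstates,

S = k \ln 6 \approx (1.381 \times 10^{-23}\ \text{J/K})(1.792) \approx 2.47 \times 10^{-23}\ \text{J/K}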

Natural scientists are in the business of finding order, and that goal relates directly to entropy: minimizing entropy indicates order, while maximizing entropy indicates disorder. You don’t just want to consider the entropy characteristics of the original distributions, but also transformations of the data, and hence the sensible transformations of those distributions (e.g., the derived values obtained from the implied values of -1 and 9 for the diacritic dots over “i” and “j” in the Watson “i/j” Theory). Think of entanglement: instead of two objects each behaving as “purely random”, one object has a direct effect on the other, so the entropy is reduced relative to that case, and a form of order appears that you would never see in a system that is “purely random”. It is this kind of minimum-entropy property, showing up in a variety of circumstances, that has permitted formulas to be obtained at all. Being able to quantify that order accurately enough lets us comprehend a little about reality.

In classical analysis, none of this is completely foreign to our intuitive understanding and experience; it’s just much more complicated to deal with mathematically. With calculus, the way we order things is always in terms of locality with respect to some variable, usually temporal or spatial. On the temporal side, observation has slowly pushed that program from the macroscopic level down to the microscopic, but the relation remains the same: relate local changes in space and time to a process and use calculus to model some form of global behavior of the physical world. So what about a high-dimensional state space, or a general process over a large number of states? The truth is that there are all kinds of orders, and typically any one order will hide a lot of information about the system, so that even when the mathematical conclusions are correct, the interpretation of those conclusions may turn out to be limited and can detract from a higher level of comprehension. Finding some order is usually not the issue; other orders can be derived from the raw system itself, or from a transformed variant of it, and those can give an insight that would never be seen from any existing order, whether “modeled” or subsequently discovered.

Some will find that interactions are constrained between specific parts of the system, in the same way you see spatially local interactions in classical physics; the thing is, these constraints are not spatially local, nor are they even temporally local, and in that context you need a different way of analyzing the system. I’m talking about conditional entropies. Mathematically, the way you describe them is to define the collection of all possible conditional probability distributions of the system and then define an entropy for each of those distributions.
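Here’s a rough sketch of that idea in Python, under my own naming (nothing here comes from a particular library): it builds every first-order conditional distribution P(X_{n+1} | X_n) empirically from a symbol sequence and returns the weighted average of their entropies, i.e., the first-order conditional entropy.

import math
from collections import Counter, defaultdict

def conditional_entropy(sequence):
    # Empirical H(X_{n+1} | X_n), in bits, for a sequence of hashable symbols.
    transitions = defaultdict(Counter)
    for current, nxt in zip(sequence, sequence[1:]):
        transitions[current][nxt] += 1

    total_pairs = len(sequence) - 1
    h = 0.0
    for current, followers in transitions.items():
        context_total = sum(followers.values())
        context_weight = context_total / total_pairs   # estimate of P(X_n = current)
        for count in followers.values():
            p = count / context_total                  # estimate of P(X_{n+1} | X_n = current)
            h -= context_weight * p * math.log2(p)
    return h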

The underlying assumption of thermodynamics is that entropy will either remain the same or increase for a particular kind of system, and, by default, we tend to treat energy as if it has adopted this same property. Study how heat moves and you’ll begin to see how this framework of physics came about in the first place [from studying heat]. Most of the ways in which we generate energy come down to generating heat. Yet we have a lot of order [in terms of the approximations that physics and the other sciences actually work with]; just look around and see the order that exists on this planet (Earth), in the way life-forms behave with one another, and, in terms of us [humans], the fact that every day seems the same. We’re more ordered than we are disordered. With this view, one can question how the second law of thermodynamics gets applied as a description of the universe without seeming somewhat faulty. Anyone can see the entire universe is not in utter and complete chaos. With respect to entropies, it ain’t hard to tell that within a complex system there are going to be many different kinds of orders, and when you go about comparing and contrasting the entropies of the different states, the same argument applies.

If every measure of entropy were increasing, you would expect systems to become more chaotic, not less; of course, that’s not to say the existence of such orders violates the second law of thermodynamics. So how would we go about defining “pattern” in this context? Unlike the definition of pattern I gave in the second part of the Watson “i/j” Theory, the condition here is that a pattern is simply some transformation of the state space under which the entropy is minimized. To show how you’d use entropy measures to deduce an order [or pattern], let’s use what’s known as the “staircase” algorithm:

If you have a process corresponding to an infinite periodic sequence whose period is the order {0, 1, 2, 3, 4, 5}, repeated forever, then calculating P(X = a) for a in {0, 1, 2, 3, 4, 5} always gives 1/6, which implies maximal entropy. However [and these are the same steps I showed you in the second part of the Watson “i/j” Theory], for this process we know that

P(X_{n+1} = a \mid X_n = (a - 1) \bmod 6) = 1

Due to periodicity, the entropy of this conditional distribution is zero, and that implies absolute determinism. So even when the marginal entropy looks maximal in terms of the information, note that you first have to construct a distribution conditioned on the internal state space of the whole system; only then can you deduce what the distribution is really telling you.
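Reusing the conditional_entropy sketch from earlier, here is what that looks like for the staircase process: the marginal distribution looks maximally random while the first-order conditional entropy collapses to zero.

import math
from collections import Counter

sequence = list(range(6)) * 100                 # 0,1,2,3,4,5 repeating

counts = Counter(sequence)
h_marginal = -sum((c / len(sequence)) * math.log2(c / len(sequence))
                  for c in counts.values())

print(h_marginal)                     # ~2.585 bits: log2(6), "looks" purely random
print(conditional_entropy(sequence))  # 0.0 bits: the previous value fixes the next one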

One thing you must keep in mind about [popular] descriptions of science (e.g., “the second law of thermodynamics”) is that they tend to leave out the “messy details”. Usually, those “messy details” are drowned in six or more pages of math; math that’s not all that complicated, but still plenty for you to wrap your head around.

Anyway, you can never have perfect, or even near-perfect, initial conditions. That is where chaos, or randomness, emerges. Unlike something merely arbitrary, calling a variable random implies that it follows a probability distribution. Mathematical sequences, the random decay of particles, irrational numbers, and the statistical treatment of systems all reveal how complexity can lead toward a bias for determinism. With the Watson “i/j” Theory, however, you can see a slight difference. I, for one, do not believe that patterns are random; they are (or at least can be) modeled. If I set forth the principle of using either -1 or 9 as the implied values for some mark (in the case of the theory itself, the diacritic dots that sit over a lowercase “i” or a lowercase “j”), I’m pretty much setting up order to the chaos beforehand, since no value other than -1 or 9 will ever be implied. The implied values are the order that determines the derived values. That’s an example of setting order to the chaos within the system. And if we’re talking determinism, then we’re basically talking about math, playa. We can describe our perception of the order within the chaos because math is going to have to come into the picture one way or the other.

If, by chance, we were to talk of chaos on the quantum level, it would be non-deterministic chaos. It wouldn’t matter whether you were given the same conditions or not; non-linear dynamical systems do not repeat themselves. That said, chaotic systems can be in synchrony without any accompanying periodicity. Spontaneous order can arise out of “coupled disorder”; your brain, human ecology, life, your body’s immune response, etc., are all products of spontaneous order. Chaos theory says that if you could determine the initial conditions exactly (an example of this would be using -1 and 9 as the implied values for the diacritic dots over “i” and “j”, e.g., the Watson “i/j” Theory), then you could have exact deterministic predictability; yet quantum mechanics functions on the principle that you can’t do this with the actual world and its functionality. Understand that some things are indeed more predictable than some would think. Take, for instance, predicting a die’s roll. How a die rolls and “jumps” across the table, and how the aerodynamic drag acts on it as it flies through the air, is defined by classical mechanics. In this case, “random” is defined not in the sense of being indeterminate but in the sense of being chaotic. Because it is chaotic, the behavior of the die is not easily calculable, and any small deviation in the interaction between the die and the table it’s tumbling across can cause huge differences in the number the die finally lands on (in this context, that’s what “chaotic” means). The die acts as if it’s locked in an indeterminate mechanism, though it’s not. When any object impacts another, that impact is defined by the principles of classical physics and is reducible to individual events; the bounces the die takes, together with air resistance, are governed by the laws of conservation of energy [and conservation of momentum]. Those two laws, plus the laws governing aerodynamic effects, determine what a die does as it bounces across the table. The actions upon the die are reducible to what happens during its movement across the table; if not, then we would have to claim that “other”, indeterminate processes are operating on the die while it is in motion.
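This isn’t a die simulation, but here is a minimal illustration of that sensitivity using the logistic map, a textbook chaotic system (the parameter and starting values are just ones I picked): two starting points that differ by one part in a billion end up nowhere near each other after a few dozen fully deterministic steps.

def logistic(x, r=4.0):
    # One step of the logistic map, completely deterministic.
    return r * x * (1.0 - x)

a, b = 0.2, 0.2 + 1e-9    # nearly identical initial conditions
for _ in range(60):
    a, b = logistic(a), logistic(b)

print(abs(a - b))          # typically of order 0.1 to 1: the tiny difference has blown up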

Non-deterministic systems evolve in such a way that no amount of prior information will tell you what the outcome will be with near-100% certainty. Contrary to popular [scientific] belief, quantum mechanics yields indeterminacy. Unpredictable systems evolve in such a way that if you knew the exact initial conditions (and rules), you could predict the outcome with 100% certainty, but the slightest error in your initial conditions (or rules) will yield outcomes different from what you expected. So play around with different implied values for the diacritic dots over “i” and/or “j”: use some numerical value other than -1 for the name “Susie” and see if the derived value is still 13.

It won’t be.

In closing, you can think of entropy as anti-information (in Greek, the prefix anti- means “instead of”, not “against” as it does in American English). Say I have a 400 MB CD-ROM with old pictures on it. If I take a hammer to it, I destroy those pictures and, as a result, I’ve increased the entropy of the world by 400 MB (you can measure thermodynamic entropy in megabytes). If I leave that CD alone, the CD will spontaneously decay away over time, and the pictures (the information) will decay along with it. The direct opposite does not happen, though: if I leave the CD alone, a photo album does not spontaneously appear. The thermodynamics of information is, by the way, an active area of research [in physics]. An increase in entropy corresponds with an increase in heat. The reason your desktop or laptop “gets hot” is that your computer is doing tons of calculations on the “information” you feed it, and when that “information” (data) gets erased from the CPU, the erasure, in turn, increases entropy. The best way to “erase” information is to burn it. Things that are not-so-obvious turn out to be related, at least somewhat.

How would you define “temperature”? In a bottom-up/top-down relationship, temperature is simply what you measure with a thermometer. You can look at entropy in a similar light, since entropy is a function of how much energy is put into a system versus how much the temperature [within that system] changes.
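To put a rough number on that erasure claim, here is a back-of-the-envelope sketch using Landauer’s principle (erasing one bit costs at least k ln 2 of entropy, and k T ln 2 of dissipated heat); the 400 MB figure is from the paragraph above, and room temperature is my assumption.

import math

K_B = 1.380649e-23            # Boltzmann's constant, J/K
T = 300.0                     # assumed room temperature, K
bits = 400 * 10**6 * 8        # 400 MB expressed as bits (decimal megabytes assumed)

delta_S = bits * K_B * math.log(2)   # minimum thermodynamic entropy increase, J/K
min_heat = T * delta_S               # minimum heat that must be dissipated, J

print(delta_S)   # ~3.1e-14 J/K
print(min_heat)  # ~9.2e-12 J: thermodynamically tiny, which is part of the subtlety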

In my hand, I have a cup of ice, right? There is an entropy associated with that cup of ice:

\approx 41\ \frac{\text{J}}{\text{mol}\cdot\text{K}}

Watch that cup of ice melt into water, and the entropy is now:

\approx 70\ \frac{\text{J}}{\text{mol}\cdot\text{K}}
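As a rough cross-check on that jump (my numbers: the textbook enthalpy of fusion of water, about 6.01 kJ/mol, and a melting point of 273.15 K), the entropy gained in the melting step alone is

\Delta S_{\text{fus}} = \frac{\Delta H_{\text{fus}}}{T_m} \approx \frac{6010\ \text{J/mol}}{273.15\ \text{K}} \approx 22\ \frac{\text{J}}{\text{mol}\cdot\text{K}}

which accounts for most of the gap between the two figures quoted; the rest presumably comes from the melt water warming above 0 °C.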

Do know that some [scientists] will model that and try to convince you that, due to some “weird quantum entanglement effect”, the entropy of the universe is constant. In other words, you’re watching water (read: the universe) change from one form into another, yet in order to comprehend what’s happening before your very eyes you supposedly need to invoke quantum entanglement to justify exactly what it is that you’re seeing. Does that make any sense to you? Answer: I would hope not. I’m watching ice change from one entropy (as ice) into another entropy (as it melts into liquid water; keep in mind what I stated earlier about an increase in heat corresponding with an increase in entropy).

-Desmond (DTO™)