Conversations On-Line
Intuition Network: Physics   open forum


To:             Subject: correct neg-entropy principle       Date: 01/11/97

'Life ... its flexible behaviors, not number crunching'


Matti Pitkanen wrote on 1/11/97 (Subject: CORRECTION TO NEGENTROPY MAXIMIZATION PRINCIPLE):

===================
[begin block quote:

The corrected form of principle is either of the following forms:

1) First form

The RELATIVE NEGENTROPY GAIN R = -Delta N/N = (N_initial - N_final)/N_initial is maximum. The maximum relative negentropy gain goes like

R = 1 - N_0/N = 1 - N_0/log(n)

where N_0 is the contribution to the negentropy of the subsystem's final state from quantum statistical entanglement, which cannot be eliminated. The principle is NONTRIVIAL SOLELY DUE TO QUANTUM STATISTICS (Bose-Einstein, Fermi)!

This form is as 'cruel' as the original version. The maximum gain increases very slowly with the number of quantum states of the subsystem, so that complicated systems with the ability to generate a lot of quantum entanglement via deterministic quantum interactions with the external world are the survivors. Since maximum quantum entanglement is improbable for very large systems, systems of finite size are expected to be the survivors.

2) Second form

One can also consider the gain for some function of the negentropy. The most natural is the function F = exp(-S) = exp(N), familiar from thermodynamics. The maximum gain, defined as -Delta F/F_0 = 1 - exp(N)/exp(N_0), behaves like

1 - 1/(n*exp(N_0))

now, and favours quantum jumps for large quantum subsystems. This principle is nontrivial also in the case that statistics effects are not present: then one would have the maximum gain 1 - 1/n.
------------------------------------------------------------

Both forms imply 'survival of the fittest'. Form 1) is favourable for quantum jumps in all length scales. Form 2) disfavours simple systems and favours large systems. The general conclusions of the previous posting remain essentially unchanged.

Matti Pitkanen

:end block quote]
=====================================================
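For concreteness, Matti's two gain formulas can be tabulated side by side. The sketch below is my own illustration, not part of his derivation: I take the initial negentropy to be N = log(n) and pick N_0 = 1 purely to see how each form scales with the number of quantum states n.

```python
import math

def gain_form1(n, N0=1.0):
    # First form: R = 1 - N_0/log(n), taking initial negentropy
    # N = log(n) (an illustrative assumption, not Matti's text).
    return 1.0 - N0 / math.log(n)

def gain_form2(n, N0=1.0):
    # Second form: gain for F = exp(N), behaving like 1 - 1/(n*exp(N_0)).
    return 1.0 - 1.0 / (n * math.exp(N0))

for n in (10, 100, 10_000):
    print(n, round(gain_form1(n), 4), round(gain_form2(n), 4))
```

Running this shows exactly the qualitative claims quoted above: form 1 creeps upward logarithmically in n, while form 2 approaches its ceiling of 1 almost immediately, favouring large subsystems.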

Matti,

Your enthusiasm for fitting these relationship implications into your TGD theory (and specifically the Negentropy Maximization Principle) is commendable. The problems you are dealing with, though, have already been worked out to a great degree.[1]

I discerned 25 years ago that the main issue is to build a mathematical model which, in the first instance, is totally unbounded and open. Bounded regions are then observable/definable within that openness. To use my imagery from biology: we must be able to designate "econiches" within a dimensionally open environment, and determine and specify how closed sets and open sets interact with one another. After that, we examine what factors encourage or diminish a bounded set's continued existence in the total environment. These would be ongoing dynamic factors, where even the boundary-definition itself is variable and adjustable - smoothing reactively to any local/distant changes elsewhere in the environment.

Also, there are multitudes of co-extant sets and set configurations possible ... each enacting its own self-relevant dynamics ... even if they overlap in content. A simple example to relate to is a "human". Alone, factors like food, warmth, shelter, and functional space are important. Place a single human in a community, and the "health" of the community might require that any one individual receive only a portion of the food supply, or that several people sleep in shifts around the clock in order to stay alert for animal predators, contrary to the natural (individual) preference to follow circadian day/night sleep cycles.

This is a "set" issue. The set called "individual" and the set called "community" share the exact same factors but, in the best interest of each different set, must employ or enact those factors quite differently, or adjustively blend them to accommodate the continuation of both assemblies.
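The point that the same factor is scored differently by overlapping sets can be caricatured in a few lines. Everything here - the numbers, the two scoring rules - is invented purely for illustration and is no part of the Integrity formalism:

```python
# Hypothetical illustration: one food allocation, evaluated by two
# overlapping "sets". All numbers and scoring rules are invented.

def individual_score(ration):
    # An individual simply prefers more food, up to satiety at 1.0.
    return min(ration, 1.0)

def community_score(rations):
    # The community's "health" is limited by its worst-fed member,
    # so hoarding by one individual lowers the communal score.
    return min(rations)

equal = [0.5, 0.5, 0.5, 0.5]   # shared rations
hoard = [1.0, 0.3, 0.3, 0.4]   # one member hoards

# The hoarder does better as an "individual" set...
assert individual_score(hoard[0]) > individual_score(equal[0])
# ...but the "community" set does better under equal sharing.
assert community_score(equal) > community_score(hoard)
```

The identical allocation vector is fed to both scoring functions; only the set boundary changes, and with it the sign of the judgment.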

This is a current weakness/failing of your proposed model: you insist on choosing one set-framework as "base". In contrast, the Integrity model accounts for both/all sets simultaneously; it allows that any function or operation can be either entropic *or* negentropic - conditional on the environmental set-group considered.

The cytochrome transport system uses the stability of higher quantum plateaus to trap energy negentropically in ATP molecules ... a process partially driven by solar energy transduced through other metabolic molecules.

We have here a "single" event - the creation of ATP from ADP - describable by several different set-groups...and "opposite" ent/negent gradients.

1. As the additional phosphate group is bound to adenosine diphosphate, the valence *electron* has an enlarged space to occupy. Quantum mechanically, it has jumped to a higher energy state. Thermodynamically, it has increased its distribution-space ... increased its *spatial* entropy (even as its innate structure has negentropically *absorbed* a photon and localized it from the sun).

2. The free phosphate group is now *negentropically* bound as part of an adenosine triphosphate molecule ... ready to be released to enact muscular activity.

3. The binding photon - once part of the sun - is distributed entropically far from the original sun-source system.

4,5,6... there are other assemblies (set-groups) by which we can evaluate the entropy/negentropy dynamics involved. I won't pursue them here.
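One way to keep the bookkeeping straight is to record, per set-group, the sign of the gradient it assigns to the "single" ADP-to-ATP event. The sketch below merely restates points 1-3 above as data; the labels are my own shorthand:

```python
# The same "single" event, seen from three set-groups (points 1-3 above).
# Each perspective assigns its own entropic (+) or negentropic (-) sign.
event = "ADP + phosphate -> ATP"
perspectives = {
    "valence electron's spatial distribution": "+entropy",
    "phosphate bound into ATP":                "-entropy (negentropic)",
    "photon dispersed far from the sun":       "+entropy",
}
for set_group, gradient in perspectives.items():
    print(f"{event} | {set_group}: {gradient}")
```

The point is simply that the sign is a property of the (event, set-group) pair, never of the event alone.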

Each of them has pertinent and extended eigenstates (some actually available, others merely "potential"). Any modification or delta of one set co-affects the mutual or adjacent eigenstates of the others. There are several issues concomitant with all this. The dynamic survival of a given assembly also depends on the rate-capacity of a system to process energy/information, besides the availability or lack of potential eigenstates - rates of entropy/negentropy alteration.

We are therefore looking at "eigenstate options" as the primary factor in the survival of a system/assembly. THIS is the factor which, when "maximized", determines whether a system is able to cope with extraneous energy encountered from the open environment it exists in. It is THIS activity-potential which becomes "survival of the fittest" in large organic systems. "Fittest" becomes not the strength to over-power a competitor or opponent, but the potential functional "degrees-of-freedom" to deal with future unknown situations.

We are looking then at all dynamic systems, evaluating their mechanisms for survival. The extremes are: A) reduce all potential interactions to a minimum (secure an econiche; don't bump into another particle; etc.), and B) have the structural capacity to deal with as many energy forms and strengths as possible - shunting, processing, absorbing or repelling the energy of such events. Any system maximizes its *potential* eigenstates when those states are neither filled nor empty - when it is poised with the capacity to lose energy or gain energy and not be disrupted by either process. This indicates that the Integrity of any system depends upon its ability to interact with the rest of the universe. It must maintain itself WITHIN A *RANGE* OF INTERACTION POTENTIAL. Stability is not a node or plateau (Prigogine); it is a behavioral range within which competent "behaviors" can be sustained/maintained.
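"Neither filled nor empty" admits a toy quantification - my own, not part of the Integrity formalism. Give a system n eigenstates with k of them occupied, and count its single-quantum rearrangement options: any of the k occupied states can donate into any of the n - k empty ones, giving k*(n - k) options, which vanish at the extremes and peak at half-filling:

```python
def transition_options(n, k):
    # Single-quantum rearrangements: move one quantum from any of the
    # k occupied eigenstates into any of the n - k empty ones.
    return k * (n - k)

n = 10
options = [transition_options(n, k) for k in range(n + 1)]

# Options vanish when the system is empty or completely full...
assert options[0] == 0 and options[n] == 0
# ...and peak mid-range, at half-filling.
assert max(options) == transition_options(n, n // 2)
```

Under this toy count, the "behavioral range" reads as the plateau of k values near n/2 where the system can both shed and absorb quanta without losing its options.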

NMP, as outlined by Matti, is incorrect because it focuses priority on a mechanism, not on the dynamics of the whole system. It has the right components; they are just not well organized. So his approach is on the right track in general - that's what I like about it. The trick is to apply Relativity notions to Set Theory. One of the offshoots of this is that we can surmount Gödel's Undecidability Theorems this way. Specific sets may be restricted to his information-limitation relationships, but since we are now able to free-float among a Cantorian transfinite set of set-groups, no one Gödel Set takes priority, and the information of one is the environment of all the others. Priority is no longer placed on what is or isn't the information content of any given set, but on the compatibility of information/energy to be shared among all possible sets.

Since the quality of "compatibility" surmounts the quality of "content", we can say (contra-Gödel) that we *know with absolute certainty* some *information* about that which we have yet to encounter in the universe. We *know something about the unknown* (!).

The topology of the universe is self-consistently compatible - transducing information/energy through various orders of assembly.

Ceptualist
INTEGRITY PARADIGM
(c) 1973, 1992, 1995, 1996.
[1] Understanding the Integral Universe.