Computus

a primer on the basic algebraic notion of computation

We take the infinitesimally closed interval in deference to finitesimality, and to a possibly finer ruler.

First Notion

computus: quantum, measurement, indistinguishability, equality, inclusivity, inequality

Let a, a+1, ... be numbers adjacent by the smallest measurable quantum = 1, one finitesimal; and let x : a+1 > x > a be a measurable value on the same domain. If (and we are looking for the contradiction to this "if") x leans distinguishably more toward a than toward a+1, which we denote a+1 m> x, meaning x measures less than a+1, then in fact we could have resolved further the distinguishable adjacency of a, a+1, ..., and we may presume we already did so [in eliminating the contradiction]. To wit: if x : a+1 m> xo > y > a (validly: a+1 > xo), then xo, 2a+1-xo subdivide the interval (a,a+1) into three distinguishable pieces: lesser quanta, around and between a, a+1, that is, around a, a+.5, a+1, where "a+.5" is short notation for "between a and a+1". This is obviously a contradiction: x measured cannot be more accurate or precise than the ruler, else we would merely have defined a way of obtaining a+.5, after finishing that definition at a, a+1, .... This may change our skill at taking measurements, by observing around-ness and between-ness; but it cannot improve on the best possible measure of x, and therefore neither a+1 > x nor x > a holds: x is not distinguishably toward, around, nor between a, a+1.
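
To see the reductio concretely, here is a minimal Python sketch (the comparator measures_below and the routine locate are hypothetical illustrations, not anything defined above): were any measurement able to report reliably which side of a mark x lies on, simple bisection would graduate the ruler below its own quantum, which is exactly the forbidden "way of obtaining a+.5".

    def locate(x, a, measures_below, depth=3):
        """Bisection using only the supposed comparator.

        measures_below(x, m) -- hypothetical oracle: True if x measures
        distinguishably below the mark m.  After `depth` steps the position
        of x inside (a, a+1) is pinned to within 1/2**depth of a quantum,
        i.e. the ruler has effectively been re-graduated finer than itself.
        """
        lo, hi = float(a), float(a) + 1.0
        for _ in range(depth):
            mid = (lo + hi) / 2          # the forbidden "a+.5"-style mark
            if measures_below(x, mid):
                hi = mid
            else:
                lo = mid
        return lo, hi                    # an interval narrower than one quantum

    # Illustration only: a pretend oracle that peeks at the real value.
    print(locate(5.3, 5, measures_below=lambda x, m: x < m))   # -> (5.25, 5.375)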

Whence x is indistinguishable on [x-1, x+1], while it is distinguishable outside that interval. Thus the a's themselves are vestigially indistinguishable as adjacent, while distinguishable farther away than the immediate. Just possibly a m= a+1.

Inside this interval [x-1, x+1], x is most indistinguishable near the center and least so (most distinguishable) near the endpoints, without presuming any tendency within the interval: the distinguishability is essentially monotonic on either side of center, akin to the rightness of a quick look. [Thus the a's themselves...] The monotonicity is not necessarily linear (giving a triangular interval), because our usual method of quantum measure relies on a, a+1 being differentiated (as starkly as black and white lines and spacings), and x is not only nearer one than the other, it is vexingly more like one than the other. However, certain sensor-metry may have very linear statistical averages.

Any x, then, must measure indistinguishably as either of the nearest two a's, or, in the extreme case of x = a, as among {a-1, a, a+1}. [Possibly x m= x-1, x+1]
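
A small simulation sketch in Python of this first notion, under an assumed toy model of the reading (the function read and its jitter parameter are illustrative, not the text's own): a value strictly between graduations reads only as one of its two nearest neighbors, and an exact value, given any instrument spread at all, may read as any of its three nearest.

    import random

    def read(x, jitter=0.0):
        """One quantum-ruler reading of x (assumed toy model): the instrument
        adds a spread of up to +/- jitter, then the reading snaps to one of
        the two nearest graduations, weighted by proximity."""
        x = x + random.uniform(-jitter, jitter)
        a = int(x // 1)                  # lower neighbouring graduation
        frac = x - a                     # position of x inside (a, a+1)
        return a + (1 if random.random() < frac else 0)

    # A value strictly between graduations reads as either neighbour, never sharper:
    print(sorted({read(5.3) for _ in range(10_000)}))              # [5, 6]
    # An exact x = a, with any instrument spread at all, may read as a-1, a, or a+1:
    print(sorted({read(5.0, jitter=0.5) for _ in range(10_000)}))  # [4, 5, 6]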

[x : ..., is read, "x for / [be] such-that / where [it [is]] ..."]

Second Notion

The traditional course of probability and statistics discovers a law of large [sums of] numbers (the central limit theorem) wherein sums of uniformly distributed (and other qualified distributions of) numeric samples exhibit a normal distribution asymptotically, e^(-x^2/(2s^2)) / (s*sqrt(2pi)) : s = standard deviation. But our second notion of distinguishability tells us that for a number to be distinguishable (from adjacent numbers: all of them) it must be selected as such for at least 50% of the samples of the same number - at least statistically. The classic normal-distribution peak, 1/(s*sqrt(2pi)), exceeds 0.5 only for s < sqrt(2/pi) ~ 0.7979 ~ 1/1.2533; but as we notioned first, an excess of distinguishability can be (further) subdivided or revised until our uniform finite covering is barely distinguishable at the minimum-quantum (maximum) resolution. Thus we arbitrarily take our covering mesh to be s = sqrt(2/pi), or the semi-natural normal distribution, e^(-pi(x/2)^2)/2 - the natural normal distribution being e^(-pi*x^2).
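
A quick numeric check, in Python, of the two claims just made (the helper normal_peak is illustrative only): the classic peak 1/(s*sqrt(2pi)) exceeds 0.5 exactly when s < sqrt(2/pi), and at s = sqrt(2/pi) the density reduces to the semi-natural form e^(-pi(x/2)^2)/2.

    from math import exp, pi, sqrt

    def normal_peak(s):
        """Peak of the normal density exp(-x^2/(2 s^2)) / (s*sqrt(2*pi)), at x = 0."""
        return 1.0 / (s * sqrt(2.0 * pi))

    # The peak exceeds 0.5 exactly when s < sqrt(2/pi) ~ 0.7979 ~ 1/1.2533:
    s0 = sqrt(2.0 / pi)
    print(s0, normal_peak(s0))                               # ~0.7979, peak 0.5
    print(normal_peak(0.79) > 0.5, normal_peak(0.81) > 0.5)  # True, False

    # With s = sqrt(2/pi) the density collapses to the semi-natural form:
    f = lambda x: exp(-x * x / (2 * s0 * s0)) / (s0 * sqrt(2 * pi))
    g = lambda x: exp(-pi * (x / 2) ** 2) / 2
    print(all(abs(f(x) - g(x)) < 1e-12 for x in (0.0, 0.3, 1.0, 2.5)))  # True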

However, the precise nature of the semi-naturally normal-distributed uniform finite covering is such that, although the representative selection of an exact sample as itself is just 50%, the misselection by some near-representation (of an exact sample) is minutely greater, at 0.500006975 - a curiosity, in that infinite sums of equally spaced consecutive normal distributions converge rapidly to uniform on the whole, the more rapidly for smaller meshes, but not identically: the numbers half-way between are representatively selected slightly less than 100% (over all possibilities) and thereby compensate, mostly. To characterize this more reasonably: the normal distribution is not perfectly ideal. *

* [sampling on some uniform detectors yields a triangular distribution]
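
The figures above can be checked directly. A short Python sketch, assuming the selection weights are the semi-natural density evaluated at whole-number offsets from an exact sample (and at half-way offsets for the covering totals):

    from math import exp, pi

    # Semi-natural normal density, s = sqrt(2/pi):
    f = lambda x: exp(-pi * (x / 2) ** 2) / 2

    N = range(-50, 51)   # terms decay so fast that +/-50 offsets is ample

    exact       = f(0)                             # selection of the exact sample as itself
    near        = sum(f(n) for n in N if n != 0)   # weight spread onto near-representations
    at_integers = sum(f(n) for n in N)             # covering total at the whole numbers
    at_halves   = sum(f(n + 0.5) for n in N)       # covering total half-way between them

    print(exact)        # 0.5 exactly
    print(near)         # 0.5000069...  minutely greater than one half
    print(at_integers)  # 1.0000069...  slightly more than uniform at the numbers
    print(at_halves)    # 0.9999932...  slightly less half-way between, compensating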

Thus by notion our ruler line is graduated with near-triangular, half-height distributions overlapping on half-widths (each still a full unit-for-unit likelihood, but spread over two), wherein each exactly half-way number is representatively selected as (equally) either nearest neighbor: 82% of occasions in total, or 41% for each nearest neighbor (better than the strictly triangular 75%, or 37.5% each).
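
The percentages follow from the same densities; a brief Python check, assuming the comparison triangular distribution peaks at one half and falls to zero two quanta out (one reading of "half-height ... overlapping on half-widths"):

    from math import exp, pi

    # Semi-natural normal density (s = sqrt(2/pi)) and the comparison triangular density:
    semi_natural = lambda x: exp(-pi * (x / 2) ** 2) / 2
    triangular   = lambda x: max(0.0, 0.5 * (1 - abs(x) / 2))

    # A number exactly half-way between graduations, seen from either nearest neighbour:
    print(semi_natural(0.5), 2 * semi_natural(0.5))   # ~0.4109 each, ~0.8217 total
    print(triangular(0.5),   2 * triangular(0.5))     # 0.375 each, 0.75 total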

"The tack of the true mathematician is to go quickly to the best right answer integrating left and right."

Grand-Admiral Petry
'Majestic Service in a Solar System'
Nuclear Emergency Management


The theory of measurement propounded in this work is not to be cited (as) considering contraband or corpses; nor are the intellectual appurtenances herein to be used for or in the commission of crimes against persons, peoples, properties, or powers (states). May your tabernacle measure true.

COPYRIGHT: BASIC LIBRARY RULES: NONTRANSFERABLE: READ QUIETLY

© 2000 GrandAdmiralPetry@Lanthus.net
Mr. Raymond Kenneth Petry