SECRET

Lord over Numbers

[Discussion herein uses the language of, but is not intended for use in or upon, populations]

In numerical calculation the precision-spread of results depends on the correlations of the input values, oft known empirically only at the event of data entry, oft later - and oft revised as data are subsequently reused.

A simple calculatory estimate of the resulting precision-spread might involve re-calculating the entire process for each input datum slightly varied, and computing the ratio of the difference in each output to the difference in that input - a discrete differential or quantized derivative. If the calculation is reliable at small "delta" values (the amount of variation), the resulting ratio is a numeric estimate of the process sensitivity to the "delta-varied" datum tested. A full sensitivity analysis requires repeating the test for every input value; but large processes may have racks of precise variables and hopper-loads of variably precise input values for each: an enormous calculation for its generality, and its purpose may yet be only to prevent or avert a process-crash by bounds-checking. Even then, the self-correlation of each variable is inadequate to guarantee, estimate, or even predict the instantaneous correlation of uncorrelated variables: in an actual instance the inputs may creep toward the near-worst-possible case.
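
[A minimal sketch in Python of the delta-variation test just described; the process f, its inputs, and the delta value are hypothetical stand-ins, not drawn from any particular application:]

    # One-at-a-time finite-difference sensitivity: re-run the whole
    # process once per input, each time with one datum varied by delta.
    def sensitivities(f, inputs, delta=1e-6):
        base = f(inputs)
        estimates = []
        for i in range(len(inputs)):
            varied = list(inputs)
            varied[i] += delta                            # the "delta-varied" datum
            estimates.append((f(varied) - base) / delta)  # quantized derivative
        return estimates

    # Hypothetical example process: 3 inputs, hence 4 full re-calculations.
    def f(xs):
        return xs[0] * xs[1] + xs[2] ** 2

    print(sensitivities(f, [2.0, 3.0, 4.0]))              # approximately [3, 2, 8]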

Thus, like many full-domain transforms, the full sensitivity analysis may be of no intrinsic utility in applications.

Whence next we consider the more practicable performance of an efficient subset of combined sensitivity test cases: the minimum number is (log n) + 2, n being the count of inputs to be tested. The reasoning is that there exist (log n) + 2 "Walsh" functions [harmonic squarewaves] spanning a set of tested values, so that among this minimum set each input value is combined with each other in both positive and negative correlations: for example, on 4 values, ++++, ++--, +-+-, ---- are the 4 combined tests covering all possible combinations of any two values (each pair of inputs appears somewhere same-signed and somewhere opposite-signed), though not all combinations of all values, nor even a hull spanning the domain of all values. This is quite as reasonable as the standard sensitivity analysis inasmuch as the input values may instantially correlate: we've chosen a sufficiently varied, representative population of samples.
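
[A sketch of the combined tests, assuming the (log n) + 2 sign vectors are taken as the constant, the harmonic squarewaves, and the negated constant - which on 4 values yields exactly the set above, reordered; f, inputs, and delta remain the hypothetical stand-ins from before:]

    # Build the (log2 n) + 2 combined-test sign vectors:
    # all-positive, the harmonic squarewaves, all-negative.
    def walsh_tests(n):
        tests = [[+1] * n]                                # ++++
        half_period = 1
        while half_period < n:
            tests.append([+1 if (i // half_period) % 2 == 0 else -1
                          for i in range(n)])             # +-+- then ++-- on 4 values
            half_period *= 2
        tests.append([-1] * n)                            # ----
        return tests

    # Each combined test varies every input at once, signed per the vector.
    def combined_sensitivities(f, inputs, delta=1e-6):
        base = f(inputs)
        for signs in walsh_tests(len(inputs)):
            varied = [x + s * delta for x, s in zip(inputs, signs)]
            yield signs, (f(varied) - base) / delta

    for signs, slope in combined_sensitivities(f, [2.0, 3.0, 4.0]):
        print(signs, slope)                               # using the hypothetical f above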

[Note that the order may be arbitrary, or a randomizing cover-weight may be applied: in the example, +--+, +-+-, ++--, -++- would have served equally well. The likelihood is small that all input values are positive-swayed in one test, which means nothing in principle, save that it may invent a bias-trouble in a particular calculation implementation. This gives the programmer an additional tool: permutations generating many similarly satisfactory test sets, called individuals.]
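
[A sketch of that tool, assuming the randomizing cover-weight is one random sign flip per input column - which preserves the pairwise coverage, since flipping a column swaps its same-signed and opposite-signed pairings wholesale; with flips +, -, -, + the base set above becomes exactly +--+, +-+-, ++--, -++-:]

    import random

    # One 'individual': the base tests under a random cover-weight
    # (a sign flip per input column) and an arbitrary re-ordering.
    def individual(base_tests, rng=random):
        n = len(base_tests[0])
        flips = [rng.choice([+1, -1]) for _ in range(n)]
        tests = [[flip * sign for flip, sign in zip(flips, signs)]
                 for signs in base_tests]
        rng.shuffle(tests)                                # the order is arbitrary
        return tests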

This combined test is a vector of signed-unit weight-factors - or simply we may call it a chromosome: . . .

What makes this TOP SECRET is, . . .

[under construction]

A premise discovery under the title,

Grand-Admiral Petry
'Majestic Service in a Solar System'
Nuclear Emergency Management

© 2000 GrandAdmiralPetry@Lanthus.net

SECRET