# prime checking without division

The basic test for a prime number is trial division by all primes less than or equal to its square root. But division is slightly more work than necessary.

FASTER LONG DIVISION:

Before applying the advanced mathematical test that n^(Q−1) ≡ 1 (mod Q) when Q is prime (Fermat's little theorem), the manual test uses standard long division of the candidate Q by each lesser prime P, checking whether Q/P leaves a remainder. Each attempt makes successive trial estimates of each quotient digit, with trial subtractions, single-digit multiplications, and full subtractions, working from the high-order left end down through the digits: a zero remainder means Q is factored; a nonzero remainder means P is not a factor; and finally Q is prime if it is not factored by any considerable prime divisor.
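The Fermat condition mentioned above can be sketched in a few lines. This is only a necessary condition, not a proof of primality (some composites, the Carmichael numbers, also pass for every coprime base); the function name and default base are assumptions for illustration:

```python
def fermat_check(q, base=2):
    """Return True if base**(q-1) is congruent to 1 (mod q), a necessary condition for q prime."""
    return pow(base, q - 1, q) == 1

# Primes pass; most composites fail.
print(fermat_check(97))  # True: 97 is prime
print(fermat_check(91))  # False: 91 = 7 * 13
```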

A more advanced division technique allows estimating a quotient digit on the high side, accepting a negative remainder after the full subtraction and letting subsequent digits be negative. Writing negative digits in parentheses, for example, 13579... can be rewritten as 1 4 (−4) (−2) (−1) ..., since 14000 − 421 = 13579.
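This high-side estimating can be mechanized as a signed-digit rewriting, where any digit of 5 or more is replaced by its negative complement plus a carry. A minimal sketch for non-negative integers (the function name and the rule of rounding every digit of 5 or more are assumptions for illustration):

```python
def to_signed_digits(n):
    """Rewrite n >= 0 using digits in -5..4, estimating high: a digit d >= 5 becomes d - 10 plus a carry."""
    digits = []
    while n != 0:
        d = n % 10
        n //= 10
        if d >= 5:          # estimate this digit high
            d -= 10
            n += 1          # carry into the next digit up
        digits.append(d)    # collected least-significant first
    return digits[::-1]     # return most-significant first

print(to_signed_digits(13579))  # [1, 4, -4, -2, -1], i.e. 14000 - 421 = 13579
```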

(This positive-negative technique has a further advantage in reading meter displays that track changing variables, where the upper digits stabilize and stay readable while the lower digits adjust at blur speed. For example, a display transitioning between 3.4968 and 3.5023 is unreadable below the units digit, as 3.####; but the equivalent signed-digit display, transitioning between 3.50(−3)(−2) and 3.5023, is very readable as 3.50##, and shows where the change is happening.)

PRIME CHECKING:

But numbers can be prime-checked in a division-like process without the guessing step, by trialing from right to left. All numbers with units digit 0 or 5, except 5 itself, are non-primes divisible by 5. The remaining candidate primes end in 1, 3, 7, or 9, and for each such units digit the trial divisor's units digit determines a unique quotient units digit, eg. ...1/...7 = ...3 if factored (since 7 × 3 = 21); thus the guessing step is removed from the trial-division process. The resulting quotient is not usable, but the process finishes and indicates whether the candidate is factored or not. And it says there are useful arithmetic processes that are not likely ever to be implemented on computers: whence computer metatheory says all computers are a simulation of arithmetic computing with some directness; computers are always a subset of computing.
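The right-to-left trial process described above can be sketched as follows. Each quotient digit is forced by the units digits alone, via the divisor's inverse mod 10, so there is no guessing; the run reaches exactly zero when and only when the divisor factors the number. The function names are illustrative, not from the source:

```python
def divides_rtl(n, d):
    """Right-to-left divisibility test, for d coprime to 10 (i.e. ending in 1, 3, 7, or 9)."""
    inv = pow(d, -1, 10)            # the divisor's units digit has a unique inverse mod 10
    while n > 0:
        q = (n % 10) * inv % 10     # forced quotient digit: no guessing
        n -= q * d                  # clears the current units digit
        if n < 0:
            return False            # overshoot: d does not divide the number
        n //= 10                    # shift right and continue
    return n == 0

def is_prime_rtl(n):
    """Prime check by trial division, using only the right-to-left test (for n > 1)."""
    if n in (2, 3, 5):
        return True
    if n % 2 == 0 or n % 5 == 0:
        return False
    d = 3
    while d * d <= n:
        if d % 5 != 0 and divides_rtl(n, d):
            return False
        d += 2                      # odd trial divisors, skipping multiples of 5
    return True
```

For example, divides_rtl(91, 7) returns True (91 = 7 × 13), while divides_rtl(97, 7) returns False, and so is_prime_rtl reports 97 prime and 91 composite.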

The second main point of this is that, although addition, subtraction, multiplication, and division are well understood -as- fundamental arithmetic operations, they are not -the most- fundamental: there are smaller arithmetic processes, as exampled.

A premise discovery under the title,