Every Implementable Algorithm is Extensionally a Lookup Table

Setup and Assumptions

Definition (Deterministic Implementable Algorithm)

A deterministic implementable algorithm is a total function

f : I → O ,

where I and O are finite sets of inputs and outputs, respectively.

Definition (Stochastic Implementable Algorithm)

A stochastic implementable algorithm is either

  1. a conditional distribution P(· | i) over O for each i ∈ I, or equivalently
  2. a deterministic function h : I × S → O together with a probability distribution μ over the finite seed set S.
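
As a concrete illustration, here is a minimal Python sketch of the two formulations, using small hypothetical sets I, O, and S and a toy rule for h (all of these choices are illustrative, not part of the definition):

```python
# Toy (hypothetical) finite sets of inputs, outputs, and seeds.
I = [0, 1]
O = ["a", "b"]
S = [0, 1, 2]

# Form 1: a conditional distribution P(. | i) over O, one row per input i.
P = {
    0: {"a": 2 / 3, "b": 1 / 3},
    1: {"a": 1 / 3, "b": 2 / 3},
}

# Form 2: a deterministic map h : I x S -> O plus a distribution mu over seeds.
def h(i, s):
    # Chosen so that, under the uniform mu below, it induces exactly P.
    return "a" if s < 2 - i else "b"

mu = {s: 1 / len(S) for s in S}  # uniform distribution over the seed set
```

The marginalization showing that Form 2 induces exactly Form 1 is the formula displayed in the stochastic theorem below.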

Main Results

Theorem (Deterministic Case)

Every deterministic implementable algorithm is extensionally equivalent to a lookup table.

Proof. Let f : I → O with I finite. The function f is completely specified by the finite set of pairs

T_f = { (i, f(i)) : i ∈ I }.

This set T_f is a lookup-table representation of f: for every i ∈ I, looking up i in T_f returns exactly f(i). Hence f and T_f are extensionally equivalent.   ∎
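
A minimal sketch of the construction in the proof, assuming a small hypothetical input set and an arbitrary deterministic rule for f: tabulating f once yields a dictionary that answers every query exactly as f does.

```python
# Hypothetical finite input set and total function f : I -> O.
I = range(8)

def f(i):
    return (i * i) % 5  # any deterministic rule on a finite domain will do

# T_f = {(i, f(i)) : i in I}: the lookup-table representation of f.
T_f = {i: f(i) for i in I}

# Extensional equivalence: the table and the function agree on every input.
assert all(T_f[i] == f(i) for i in I)
```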

Theorem (Stochastic Case)

Every stochastic implementable algorithm is extensionally equivalent to a finite lookup table of probability distributions (or, equivalently, to a deterministic lookup table on an extended input space).

Proof. By definition, a stochastic algorithm specifies for each i ∈ I a distribution P(· | i) over O. Thus it is represented by the finite table

T_P = { (i, P(· | i)) : i ∈ I }.

Equivalently, if the algorithm is implemented as a deterministic map h : I × S → O with random seed s ~ μ, then the induced conditional distribution is

P(o | i) = Σ_{s ∈ S} 1{ h(i, s) = o } μ(s).

Since I × S is finite, h is representable by the finite table

T_h = { ((i, s), h(i, s)) : i ∈ I, s ∈ S }.

Thus both formulations reduce to finite tables.   ∎
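
The marginalization in the proof can be written out directly. The sketch below reuses the same hypothetical h and μ as in the sketch after the definitions, builds T_P by summing the seed mass over the seeds that map (i, s) to o, and builds T_h by tabulating h over the extended input space I × S.

```python
# Same toy (hypothetical) finite sets, map h, and seed distribution mu
# as in the sketch following the definitions.
I, O, S = [0, 1], ["a", "b"], [0, 1, 2]

def h(i, s):  # deterministic map h : I x S -> O
    return "a" if s < 2 - i else "b"

mu = {s: 1 / len(S) for s in S}  # uniform distribution over the seeds

# T_P: for each input i, P(o | i) = sum over s in S of 1{h(i, s) = o} * mu(s).
T_P = {i: {o: sum(mu[s] for s in S if h(i, s) == o) for o in O} for i in I}

# T_h: deterministic lookup table on the extended input space I x S.
T_h = {(i, s): h(i, s) for i in I for s in S}

print(T_P)  # {0: {'a': 0.66..., 'b': 0.33...}, 1: {'a': 0.33..., 'b': 0.66...}}
```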

Theorem (Closure Under Composition and Masking)

If f : I → O and g : O → O' are implementable algorithms, then g ∘ f : I → O' is also extensionally a lookup table. The same holds if f and g are stochastic.

Proof. For the deterministic case,

(g ∘ f)(i) = g(f(i))   for all i ∈ I.

Thus g ∘ f is specified by the finite table

T_{g∘f} = { (i, g(f(i))) : i ∈ I }.

For stochastic maps, let f(i) induce a distribution P(· | i) on O, and let g(o) induce Q(· | o) on O'. Then the composition induces

R(o' | i) = Σ_{o ∈ O} Q(o' | o) P(o | i),

which is again a finite table of distributions, one row per i ∈ I. Masking is the special case in which g deterministically restricts or projects the output, so it is covered by the same argument. Hence closure holds.   ∎
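
Both halves of the proof can be sketched with hypothetical toy tables: deterministic composition is one table lookup chained through another, and stochastic composition is the finite sum defining R.

```python
# Deterministic case: chain two hypothetical lookup tables, T_f and T_g.
T_f = {0: "x", 1: "y", 2: "x"}        # table for f : I -> O
T_g = {"x": "low", "y": "high"}       # table for g : O -> O'
T_gf = {i: T_g[T_f[i]] for i in T_f}  # table for g o f, one row per input

# Stochastic case: compose kernels by summing over the intermediate output o,
# R(o' | i) = sum_o Q(o' | o) P(o | i).  All numbers here are hypothetical.
P = {0: {"x": 0.5, "y": 0.5}, 1: {"x": 0.1, "y": 0.9}}
Q = {"x": {"low": 1.0, "high": 0.0}, "y": {"low": 0.25, "high": 0.75}}
O_prime = ["low", "high"]
R = {i: {o2: sum(Q[o][o2] * P[i][o] for o in P[i]) for o2 in O_prime} for i in P}
# R is again a finite table of distributions over O', indexed by i.
```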

Corollary

Any finite tower of algorithms (base functions, stochastic samplers, masks, controllers, etc.) is extensionally equivalent to a single finite lookup table. This follows by induction on the height of the tower, applying the closure theorem at each level.
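
A minimal sketch of this collapse for a purely deterministic tower, assuming every stage is already given as a finite table (all tables below are hypothetical): folding the stages pairwise produces a single table with the same end-to-end behavior.

```python
from functools import reduce

def compose_tables(t1, t2):
    """Table for the composite stage: route each input of t1 through t2."""
    return {i: t2[o] for i, o in t1.items()}

# Hypothetical three-stage tower: base function, then mask, then controller.
stages = [
    {0: 3, 1: 4, 2: 3},       # base function
    {3: 3, 4: 0},             # mask (clamps the value 4 down to 0)
    {0: "halt", 3: "run"},    # controller
]

# Collapse the whole tower into one finite lookup table.
tower_table = reduce(compose_tables, stages)
print(tower_table)  # {0: 'run', 1: 'halt', 2: 'run'}
```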

Remarks

Remark

This result depends crucially on finiteness. If one allows unbounded memory or infinite-precision real inputs, then I may be infinite, and the reduction to a lookup table no longer applies. However, any physically implementable system has finite memory and operates over finite input and output sets, so the finiteness assumption is satisfied.

Remark

Intensional differences (e.g., whether an algorithm is realized via neural networks, circuits, or symbolic rules) affect efficiency, compressibility, and generalization properties, but not extensional equivalence. Extensionally, all such realizations reduce to finite tables.