What's the probability distribution of a deterministic signal, or how to marginalize dynamical systems? (functional integrals in probability theory)

On $[0,1]^{[0,1]}$ there is a prior distribution (even a "proper" one) that corresponds to the idea of "totally unknown": the product uniform measure. But you cannot do any meaningful Bayesian analysis with such a prior. You know that $f$ maps $z_1$ to $z_2$, $z_2$ to $z_3$, and so on, yet the posterior distribution of $f(z)$ is still uniform on $[0,1]\cap\{z_2,\ldots,z_n\}^c$ (that is, uniform on $[0,1]$, since a finite set is null) whenever $z\notin\{z_1,\ldots,z_{n-1}\}$: under the product measure the values of $f$ at distinct points are independent, so the observations carry no information about unobserved points. As a result, the distribution of the infinite sequence of outputs $(z_i)_{i\in\mathbb N}$ is the product $\prod_{i\in\mathbb N}dz_i$: they are i.i.d. uniform random variables.
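To make the claim concrete, here is a minimal sketch (my own illustration, not part of the original argument): a draw from the product-uniform prior is simulated lazily, with $f(z)$ sampled afresh the first time a point $z$ is queried and memoized afterwards. Since no point is revisited with probability 1, the resulting trajectory is exactly an i.i.d. Uniform(0,1) sequence.

```python
# Sketch: iterate a "function" drawn from the product-uniform prior on [0,1]^[0,1].
# The prior is sampled lazily: f(z) ~ Uniform(0,1) the first time z is queried,
# memoized afterwards.  With probability 1 no point is ever revisited, so the
# trajectory (z_i) is just an i.i.d. Uniform(0,1) sequence, as claimed above.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
memo = {}                       # lazily sampled values of f

def f(z):
    if z not in memo:
        memo[z] = rng.uniform()
    return memo[z]

z = 0.3                         # arbitrary starting point
traj = []
for _ in range(10_000):
    z = f(z)
    traj.append(z)

# Empirical distribution of the iterates is indistinguishable from Uniform(0,1).
print(stats.kstest(traj, "uniform"))
```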

To go Bayesian meaningfully, you need a (possibly improper) prior distribution on a set of maps $\Gamma\to\Gamma$ (with $\Gamma$ the state space) to which your unknown $f$ certainly belongs. It may be counting measure on a countable set of possible maps (for example $f_n(x)=$ the fractional part of $nx$, on $\Gamma=[0,1]$), or a measure such as $d\lambda/\lambda$ on a one-parameter family $\{f_\lambda,\lambda>0\}$. But with no restriction at all on the dynamical system, nothing reasonable can be done.
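A minimal sketch of the countable case (assumptions mine: the family $f_n(x)=\mathrm{frac}(nx)$ is truncated to $n=1,\ldots,200$, and consistency is checked up to a small numerical tolerance). With a counting-measure prior and exact observations of the iterates, Bayesian updating just rules maps out: the posterior is (renormalized) counting measure on the $n$'s that reproduce every observed transition.

```python
# Posterior over the countable family f_n(x) = frac(n*x) under a counting-measure
# prior, given exact observations of a short trajectory of the (unknown) true map.
import numpy as np

def f(n, x):
    return (n * x) % 1.0        # fractional part of n*x

rng = np.random.default_rng(1)
true_n = 7
z = [rng.uniform()]
for _ in range(5):              # observe a short trajectory of the true map
    z.append(f(true_n, z[-1]))

candidates = np.arange(1, 201)  # truncation of the countable family (for the sketch)
consistent = [
    n for n in candidates
    if all(abs(f(n, z[i]) - z[i + 1]) < 1e-9 for i in range(len(z) - 1))
]
# Posterior = uniform over the consistent maps; here typically the singleton [7].
print("maps still consistent with the data:", consistent)
```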

What can be done is to treat this as a problem of interpolation: construct $f_n$ satisfying $f_n(z_i)=z_{i+1}$, $i=1,\ldots,n-1$ (piecewise linear, spline, ...), in such a way that $f_n$ converges to $f$ when the infinite sequence $(z_n)$ is dense. Moreover, this approach is Bayesian (Kimeldorf & Wahba, 1970), but with a prior such as Brownian motion (which yields piecewise-linear interpolation) or its primitive (which yields cubic splines), not with the "non-informative" prior.
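A minimal sketch of the interpolation approach (the true map, here the logistic map $f(x)=4x(1-x)$, is my own example). The observed pairs $(z_i,z_{i+1})$ are treated as interpolation data; the piecewise-linear and cubic-spline interpolants are the ones the post associates with the Brownian prior and its primitive, respectively.

```python
# Reconstruct f from one observed trajectory by interpolating the pairs (z_i, z_{i+1}).
import numpy as np
from scipy.interpolate import CubicSpline

def f_true(x):
    return 4.0 * x * (1.0 - x)          # true (unknown) map, used only to generate data

rng = np.random.default_rng(2)
z = [rng.uniform()]
for _ in range(500):                    # a trajectory that (a.s.) fills [0,1] densely
    z.append(f_true(z[-1]))
z = np.array(z)

# Interpolation data f(z_i) = z_{i+1}, sorted (and deduplicated) in the inputs.
x_obs, idx = np.unique(z[:-1], return_index=True)
y_obs = z[1:][idx]

grid = np.linspace(x_obs.min(), x_obs.max(), 1000)
f_lin = np.interp(grid, x_obs, y_obs)   # piecewise linear (Brownian prior)
f_spl = CubicSpline(x_obs, y_obs)(grid) # cubic spline (primitive of Brownian motion)

print("max error, piecewise linear:", np.max(np.abs(f_lin - f_true(grid))))
print("max error, cubic spline   :", np.max(np.abs(f_spl - f_true(grid))))
```

As the trajectory gets denser, both interpolants converge to the true map on the region visited by the data, which is the sense of convergence described above.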