Functions that can be computed faster simultaneously than expected

A specific example that has a negative answer comes from Karatsuba multiplication, in which one multiplies two polynomials $$(a_0 + a_1x)(b_0 + b_1x) =: c_0 + c_1x + c_2x^2$$ $$= (a_0b_0) + (a_0b_1 + a_1b_0)x + (a_1b_1)x^2$$ $$= (a_0b_0) + \bigl((a_0+a_1)(b_0 + b_1) - a_0b_0 - a_1b_1\bigr)x + (a_1b_1)x^2.$$ If we take $\mathbb{F}_2$-coefficients (or $2$-coefficients, in your notation), where subtraction coincides with addition, this can be viewed as three Boolean functions $2^4 \to 2^1$, or their concatenation as a single function $2^4 \to 2^3$.
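Here is a minimal sketch (in Python, my own illustration rather than anything from the question) of this degree-one case over $\mathbb{F}_2$, with bits as 0/1 integers, AND as multiplication and XOR as addition; it checks that the Karatsuba form agrees with the schoolbook form on all $2^4$ inputs.

```python
from itertools import product

def schoolbook(a0, a1, b0, b1):
    """Four multiplications (AND) and one addition (XOR)."""
    c0 = a0 & b0
    c1 = (a0 & b1) ^ (a1 & b0)
    c2 = a1 & b1
    return c0, c1, c2

def karatsuba(a0, a1, b0, b1):
    """Three multiplications (AND) and four additions (XOR)."""
    p = a0 & b0                    # a0*b0
    q = a1 & b1                    # a1*b1
    r = (a0 ^ a1) & (b0 ^ b1)      # (a0+a1)*(b0+b1)
    return p, p ^ r ^ q, q         # c1 = r - p - q, and -1 = +1 over F_2

# The two functions 2^4 -> 2^3 agree on every input.
assert all(schoolbook(*bits) == karatsuba(*bits)
           for bits in product((0, 1), repeat=4))
```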

Karatsuba's trick turns the four multiplications and one addition into three multiplications and four additions. If you are counting all operations, not just multiplications, it looks like you have gone from five operations up to seven. But if you use the trick recursively, or take the $a_i,b_j$ to be polynomials themselves, the reduction in multiplications wins out and you save operations overall.
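To illustrate the crossover, here is a rough count under the recurrence $T(n) = 3T(n/2) + 4n$ for the Karatsuba side; the constant $4$ in the linear overhead is my own ballpark assumption rather than a careful count, so only the shape of the comparison should be trusted.

```python
def ops_schoolbook(n):
    """Naive product of two polynomials with n coefficients each:
    n^2 multiplications and (n-1)^2 additions."""
    return n * n + (n - 1) ** 2

def ops_karatsuba(n):
    """Assumed recurrence T(n) = 3*T(n/2) + 4n, T(1) = 1, for n a power of 2."""
    if n == 1:
        return 1
    return 3 * ops_karatsuba(n // 2) + 4 * n

for k in range(1, 11):
    n = 2 ** k
    print(n, ops_schoolbook(n), ops_karatsuba(n))
# With this choice of overhead constant, the Karatsuba count drops below
# the schoolbook count around n = 32, and the gap widens from there.
```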

I don't have a proof that two of the coefficients cannot be optimized without the third, but the reason I believe it is that the Karatsuba trick exploits the fact that the three coefficients of interest span only a $3$-dimensional subspace of the $4$-dimensional space spanned by the products $a_ib_j$.


Suppose there is a 2-variable function $\phi: 2^n\times 2^n \to 2^m$. Take two constant vectors $c_1,c_2\in 2^n$ and define $f(x)=\phi(x,c_1)$, $g(x)=\phi(x,c_2)$ and $h(x)=\phi(x,c_1+c_2)$. Any two of the three second arguments $c_1$, $c_2$, $c_1+c_2$ are just arbitrary constant vectors, but to compute all three of $f,g,h$ one might be able to exploit the simple linear relationship among the second arguments.

For example (does this work?), take $m=1$ and define $\phi(x,y)$ to be the inner product of $x$ and $y$ (mod 2). Then $h(x)$ can be computed very quickly as $f(x)+g(x)$, but I don't see how to compute any two of $f,g,h$ faster than one at a time.
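Here is a quick sketch instantiating the construction above with $\phi(x,y) = \langle x,y\rangle \bmod 2$ (the names $c_1,c_2,c_3$ and the small $n$ are just for the demonstration); bilinearity in the second argument is exactly what makes $h = f + g$.

```python
from itertools import product
from random import randrange

n = 6                                        # small n so the exhaustive check is fast
c1 = [randrange(2) for _ in range(n)]
c2 = [randrange(2) for _ in range(n)]
c3 = [u ^ v for u, v in zip(c1, c2)]         # c1 + c2 over F_2

def phi(x, y):
    """Inner product of x and y mod 2."""
    return sum(xi & yi for xi, yi in zip(x, y)) & 1

def f(x): return phi(x, c1)
def g(x): return phi(x, c2)
def h(x): return phi(x, c3)

# Bilinearity in the second argument: h(x) = f(x) + g(x) for every x in 2^n.
assert all(h(x) == f(x) ^ g(x) for x in product((0, 1), repeat=n))
```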