Superassociative operation

Claim: if an operation's iterates are all associative, they must eventually repeat.

Denote $\circ_1$ by $\circ$ and $\circ_2$ by $\circ'$. Denote the $n$th power of $x$ with respect to $\circ$ by $x^{\circ n}$, so that $x \circ' n = x^{\circ n}$. Then if $\circ'$ is associative,

$$a^{\circ b^{\circ c}} = (a^{\circ b})^{\circ c} = a^{\circ bc}$$

where the latter equality follows from the associativity of $\circ$.

Now suppose that for any $N \in \mathbb N_{\ge 1}$ we can find $x$ such that $x = x^{\circ 1}, ..., x^{\circ N}$ are all distinct. Then, for any $b$ and $c$, we can choose such an $x$ with $N$ large enough that the equality $x^{\circ b^{\circ c}} = x^{\circ bc}$ forces the exponents $b \circ' c = b^{\circ c}$ and $bc$ to be equal, so in fact $\circ' = \times$. And in this case $1^{\circ a} = 1 \circ' a = 1 \times a = a$, so $a \circ b = 1^{\circ a} \circ 1^{\circ b} = 1^{\circ (a + b)} = a + b$.
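
As a quick sanity check of this case, here is a minimal Python sketch (the helper `iterate` is my own name for the passage from $\circ$ to $\circ'$, not notation from the post): on a small window the iterate of $+$ is indeed $\times$, and the iterate of $\times$ is exponentiation.

```python
from functools import reduce

def iterate(op):
    # the iterate op': (x, y) -> the y-th power of x with respect to op (y >= 1)
    return lambda x, y: reduce(op, [x] * y)

R = range(1, 8)
add = lambda x, y: x + y
mul = lambda x, y: x * y
assert all(iterate(add)(x, y) == x * y for x in R for y in R)   # +' = ×
assert all(iterate(mul)(x, y) == x ** y for x in R for y in R)  # ×' = exponentiation
print("iterate(+) agrees with × and iterate(×) with ^ on the test window")
```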

Now let's look at the other case: there is an $N$ such that no element has $N$ distinct powers. In any semigroup, the semigroup generated by a single element $a$ is determined uniquely up to isomorphism by the first power that repeats, $m_a$, and the minimal period $p_a$. Also let $n_a = m_a + p_a$. Call this semigroup $C_{m, p}$.

The cyclic semigroups in $\circ$ completely determine $\circ'$, since it is just repeated $\circ$.

Now notice that, by induction,

$$a \circ'' 1 = a = a^{\circ a^0}$$

$$a \circ'' (n + 1) = (a \circ'' n) \circ' a = a^{\circ a^{n-1}} \circ' a = (a^{\circ a^{n-1}})^{\circ a} = a^{\circ a^n}$$

So basically, if we consider $C$ as a quotient of the positive integers regarded as exponents, then $C'$ is just the cyclic multiplicative monoid generated by the exponent $a$ in that quotient!
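
Here is a small numerical check of the induction above (a sketch; `iterate` is a hypothetical helper for passing from an operation to its iterate), using $\circ = +$ as the sample associative operation, for which $a^{\circ a^{n-1}}$ is just $a^n$:

```python
from functools import reduce

def iterate(op):
    # op': (x, y) -> the y-th power of x with respect to op (y >= 1)
    return lambda x, y: reduce(op, [x] * y)

circ = lambda x, y: x + y   # sample associative operation
circ1 = iterate(circ)       # ∘'
circ2 = iterate(circ1)      # ∘''

# a ∘'' n should equal the (a^(n-1))-th ∘-power of a
for a in range(1, 6):
    for n in range(1, 6):
        assert circ2(a, n) == iterate(circ)(a, a ** (n - 1))
print("a ∘'' n = a^{∘ a^(n-1)} holds for ∘ = + on the test window")
```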

This immediately implies that the operation must eventually repeat. To see why, notice that by the case assumption all $C$ have size bounded by a fixed $M$. Therefore, so do all $C'$ (indeed $|C'| \le |C|$, since $C' \subseteq C$). Since there can only be finitely many such monoids, the sizes of all the cyclic monoids stabilize after a certain point. After this point, $C$ and $C'$ have the same elements, but possibly a different operation. But again, there are only finitely many operations on a set of at most $M$ elements, so the operations must repeat in bounded time. Then simply iterate for the least common multiple of these periods (worst case, say, $2^M + 2^M!$) and you will get a repeated operation.

In most cases $C'$ will be smaller than $C$. Notice, $|C| = n_a - 1$.

If $a \ge m_a$, then the exponents occurring in $C'$ are just $1$ together with exponents in $[m_a, n_a)$, so if $m_a > 2$, then $a^{\circ 2} \notin C'$.

If $a < m_a$, then $|C'| = |C|$ only when $a = 2$ and $m_a = 3$ (because otherwise we must skip $a^{\circ 2}$ or $a^{\circ 3}$, or else $a = 1$ and $C'$ is trivial).

Moreover, $(j + dp_a)(k + ep_a) = jk + Qp_a$ for some integer $Q$, so in reality the cycle $[m_a, n_a)$ is a copy of $\mathbb{Z}/p_a\mathbb{Z}$ with respect to both addition and multiplication of exponents. Some power of $a$ must eventually land inside this cycle, and the only way the cyclic multiplicative submonoid of $\mathbb{Z}/p_a\mathbb{Z}$ generated by a single element can be the whole thing is when $p_a = 1$.

Thus, the only stable cyclic semigroups are $\mathbf{1} = \{a,a^{\circ 2}=a, ...\}$, $A = \{a,a^{\circ 2},a^{\circ 3} = a^{\circ 2}, ...\}$, $T = \{a,a^{\circ 2},a^{\circ 3}, a^{\circ 4} = a^{\circ 3}, ...\}$, and $T$ only when $a = 2$.

So we have at most three classes of elements: idempotents ($a \circ a = a$), square-idempotents ($a \circ a$ is an idempotent distinct from $a$), and $2$, which may possibly be neither (in which case its square must be of the second class).

Since each of these semigroups actually produces the same operation, the minimal period of a superassociative operator must moreover be $1$, and its "$m$" must be less than $M$.

All the examples mentioned above fit into this pattern.

  • $a \circ_L b = a$ has $\circ_L' = \circ_L$ and all elements are idempotent.

  • $a \circ_R b = b$ has $\circ_R' = \circ_L$, and again every element is idempotent. In fact, any operation for which every element is idempotent iterates to $\circ_L$; e.g., $\max$ and $\min$.

  • $a \circ_k b = k$ (sorry, overloading OP's notation here): $\circ_k'$ actually is associative only for $k \ne 1$; explicitly, $a \circ_k' b = k$ if $b > 1$, otherwise $a$. This is almost stable, except for when $a = 1$, but we actually do get a chain of three distinct operators here (a numerical check follows this list).
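
Here is that check, a sketch with $k = 3$ (my own choice of $k$ and of the test window; `iterate` is the same hypothetical helper as above):

```python
from functools import reduce

def iterate(op):
    return lambda x, y: reduce(op, [x] * y)   # y-th power of x w.r.t. op (y >= 1)

def is_associative(op, R):
    return all(op(op(a, b), c) == op(a, op(b, c)) for a in R for b in R for c in R)

k = 3
const_k = lambda x, y: k                                         # ∘_k
chain = [const_k, iterate(const_k), iterate(iterate(const_k))]   # ∘_k, ∘_k', ∘_k''

R = range(1, 7)
for i, op in enumerate(chain):
    print(f"iterate {i}: associative on the window? {is_associative(op, R)}")

tables = [tuple(op(x, y) for x in R for y in R) for op in chain]
print("three pairwise distinct operators:", len(set(tables)) == 3)
```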

And in fact, using the above cyclic monoids we can construct all operations $\circ$ such that $\circ' = \circ$. Suppose $f : \mathbb{N}_{\ge 1} \to \mathbb{N}_{\ge 1}$ is idempotent: $f(f(x)) = f(x)$, with $f(x) = 1$ iff $x = 1$. Also fix $g \in \mathbb{N}_{\ge 1}$, and require that $f(g) = f(2)$ and that $g = 2$ if and only if $f(2) = 2$. ($f$ then gives the top element in each cyclic monoid, and $g = 2 \circ 2$.) Then set

$$a \circ b = \begin{cases} a & \mbox{if } b=1 \\ g & \mbox{if } a = b = 2 \\ f(a) & \mbox{else} \end{cases}$$

Note that these are not all associative. It should be possible, with some work, to construct (arbitrarily long?) chains of distinct operations by working backwards from the stable monoids.
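Here is a small check of the construction for one sample choice of my own ($f(1) = 1$, $f(x) = 3$ otherwise, and $g = 4$, which satisfy the constraints above); the sketch verifies $\circ' = \circ$ on a window and also reports whether this particular $\circ$ happens to be associative:

```python
from functools import reduce

# sample data: f(1) = 1, f(x) = 3 otherwise (idempotent, f(x) = 1 iff x = 1);
# g = 4 satisfies f(g) = f(2), and neither g = 2 nor f(2) = 2, so the iff condition holds
def f(x):
    return 1 if x == 1 else 3

g = 4

def circ(a, b):
    if b == 1:
        return a
    if a == 2 and b == 2:
        return g
    return f(a)

def iterate(op):
    return lambda x, y: reduce(op, [x] * y)   # y-th power of x w.r.t. op (y >= 1)

R = range(1, 10)
print("circ' == circ on the window:",
      all(iterate(circ)(a, b) == circ(a, b) for a in R for b in R))
print("circ associative on the window:",
      all(circ(circ(a, b), c) == circ(a, circ(b, c)) for a in R for b in R for c in R))
```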

Question:

If $\circ$ and $\circ'$ are associative (but not $+$ and $\times$) does $\circ''$ have to be as well?

**Follow-up (3 April 2017):**

Say that an operation $\circ$ is iterative if it is in the image of the operator $I: \star \mapsto ((x, y) \mapsto x^{\star y})$, and $n$-iterative if it is in the image of $I^n$.

Claim: The maximum associativity degree (in the sense of the length of a chain of distinct operators) is either 2 or 3.

First, note that if $x \cdot x = x$ and $\cdot' = \circ$ then $$x \circ (n+1) = x^{\cdot (n+1)} = x^{\cdot n} \cdot x = (x \circ n) \cdot x = x \cdot x = x$$ by induction. So $x \circ n = x$ for all $n$. In particular, if $\cdot$ is iterative, say $\cdot = \star'$, then $1 \cdot 1 = 1^{\star 1} = 1$, so for every 2-iterative operation, 1 is a left-absorbing element (as well as a right identity, since $x \circ 1 = x^{\cdot 1} = x$).
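
A quick illustration of this on a few sample starting operations (my own picks; the claim itself is the general statement above, the code only spot-checks it on a window):

```python
from functools import reduce

def iterate(op):
    return lambda x, y: reduce(op, [x] * y)   # y-th power of x w.r.t. op (y >= 1)

samples = {"+": lambda x, y: x + y, "max": max, "left projection": lambda x, y: x}

R = range(1, 8)
for name, star in samples.items():
    op2 = iterate(iterate(star))              # a 2-iterative operation, I^2(star)
    print(name,
          "-> 1 left-absorbing:", all(op2(1, n) == 1 for n in R),
          "| 1 right identity:", all(op2(x, 1) == x for x in R))
```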

Corollary: No 2-iterative operation on $\mathbb{N}_{\ge 1}$ can be commutative, since then $1 = 1 \circ n = n \circ 1 = n$.

So we also have

$$(x \circ 1) \circ y = x \circ y$$ and $$x \circ (1 \circ y) = x \circ 1 = x$$

So if $\circ$ is associative, in fact $x \circ y = x$ for all $x, y$, meaning $\circ = \circ_L$. This immediately implies the claim, since in a chain of $n$ associative operators every operator from the third on is both associative and 2-iterative, and therefore must be $\circ_L$. This partially replaces the proof above.

Now, let's try to trace back some of the known stable operations.

$\circ_R$, $\min$, and $\circ_k$ can't be iterative because they don't have $1$ as a right identity.

$\cdot'$ can't equal $\max$ for $\cdot$ associative (or even power-associative):

$\max(1, n) = n$, so $1^{\cdot n} = n$, and so $m \cdot n = 1^{\cdot m} \cdot 1^{\cdot n} = 1^{\cdot (m + n)} = m + n$; but then $\cdot$ is $+$, whose iterate is $\times$, not $\max$.

So in general, where $\circ$ A-iterative means that $\circ$ is associative and $\circ = \cdot'$ for associative $\cdot$, we have:

Proposition: An A-iterative operation with a 2-sided identity (in particular a commutative A-iterative operation) must be $\times$.

There are also $\circ_k'$ (for $k = 1$) and $\circ_k''$, but neither of these is associative anyway (counterexample: the triple $(2, 1, 2)$; for $\circ_2''$ use $(3, 1, 2)$ instead).

Finally, recall that $\cdot' = \circ_L$ iff $\cdot$ satisfies $x \cdot x = x$ for all $x$. (It's easy to construct non-associative examples like $x \cdot_n y = \max((n + 1)x - ny, 1)$ for $n \ge 1$, but this one obviously can't have a right identity either (only the degenerate case $n = 0$, which is just $\circ_L$, has one), so it isn't iterative.) If such an operation is also associative it is called a band. Associative examples include $\max$ and $\min$, but both of these are commutative.
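
A spot check of this family for a few values of $n$ (a sketch on a small window of my own choosing):

```python
def dot(n):
    # the family from the text: x ._n y = max((n+1)x - ny, 1)
    return lambda x, y: max((n + 1) * x - n * y, 1)

R = range(1, 10)
for n in (1, 2, 3):
    op = dot(n)
    idem = all(op(x, x) == x for x in R)
    assoc = all(op(op(a, b), c) == op(a, op(b, c)) for a in R for b in R for c in R)
    right_id = any(all(op(x, e) == x for x in R) for e in R)
    print(f"n={n}: idempotent {idem}, associative {assoc}, right identity in window {right_id}")
```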

A rectangular band is one constructed as $A \times B$ with the operation $$(h, i) \circ (j, k) = (h, k)$$

So, if we fix a countably infinite set $X$ and put $X \times \mathbb{N}$ in bijection with $\mathbb{N}_{\ge 1}$, we get a band operation and therefore a predecessor of $\circ_L$. In particular we can set the first component equal to some set of decimal (or base $b$) places and the second one equal to the rest (this operation can give zero, but we can conjugate by a shift to get an operation on $\mathbb{N}_{\ge 1}$). This gives continuum-many operations with associativity degree 2 (with iterate $\circ_L$). However, a rectangular band has a right identity only when its second factor has a single element (in which case it is just $\circ_L$ on the first factor), so none of these operations can be iterative. Also, the only rectangular band that is also left-normal is the left-zero band $\circ_L$.
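
One concrete instance of this recipe, under choices of my own (binary places instead of decimal ones, with the even-indexed bits of $n-1$ as the first component and the odd-indexed bits as the second):

```python
def split(n):
    # bijection N>=1 -> N0 x N0: even-indexed bits of n-1 vs odd-indexed bits
    m, a, b, bit = n - 1, 0, 0, 0
    while m:
        a |= (m & 1) << bit
        b |= ((m >> 1) & 1) << bit
        m >>= 2
        bit += 1
    return a, b

def join(a, b):
    # inverse of split
    m, bit = 0, 0
    while a or b:
        m |= (a & 1) << (2 * bit)
        m |= (b & 1) << (2 * bit + 1)
        a, b, bit = a >> 1, b >> 1, bit + 1
    return m + 1

def band(x, y):
    # the rectangular band transported to N>=1: keep x's first coordinate, y's second
    return join(split(x)[0], split(y)[1])

R = range(1, 40)
assert all(join(*split(n)) == n for n in R)        # sanity check on the bijection
assert all(band(x, x) == x for x in R)             # idempotent, so its iterate is ∘_L
S = range(1, 13)
assert all(band(band(a, b), c) == band(a, band(b, c)) for a in S for b in S for c in S)
print("transported rectangular band: idempotent and associative on the test window")
```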

If $\cdot$ is A-iterative, then $\cdot$ must also satisfy the law we proved above: $x \cdot (y \cdot z) = x \cdot (yz)$. This implies that $x \cdot (y \cdot z) = x \cdot (z \cdot y)$. A band which satisfies this identity is a left-normal band. So to show that there is an operation with associativity degree 3 we need to find a countably infinite noncommutative left-normal band with a right identity.

**Follow-up (29 January 2018):**

Suppose $\cdot' = \circ_L$ and $\&' = \cdot$, with all being associative. Then $2 \cdot 2 = 2$ so

$$x\&x\&x\&x = x \cdot 4 = x \cdot (2 \times 2) = x \cdot (2 \cdot 2) = x \cdot 2 = x \& x$$

By associativity of $\&$, we get that $x \cdot (2k) = x \cdot 2$ and $x \cdot (2k + 1) = x \cdot 3$ for all $k > 0$. Recall that $x \cdot x = x$, so in particular when $x$ is even, $x \cdot 2 = x$, and then also $x \cdot 3 = x \& (x \cdot 2) = x$, so $x \cdot y = x$ for all $y$. The only freedom we get in $\cdot$ is to choose what $x \cdot 2 = x \& x$ is when $x$ is odd and greater than 1, and it must be a number that is idempotent for $\&$, since $(x \cdot 2) \cdot 2 = x \cdot 2$.

If $\cdot$ is of this form then it will in fact be an associative predecessor of $\circ_L$: let $f : \mathbb{N} \to \mathbb{N}$ fix all even numbers and send every odd number to an even number (such an $f$ is automatically idempotent). Then we define $x \cdot y := f(x)$ if $y$ is even, otherwise $x$. The iteration of this operation is clearly $\circ_L$, and it is associative: when $x$ is even, $x \cdot (y \cdot z) = (x \cdot y) \cdot z = x$. If $x$ is odd and $y$ is even: $(x \cdot y) \cdot z = f(x) \cdot z = f(x)$, and $x \cdot (y \cdot z) = f(x)$ since $y \cdot z$ is again even. If $x$ is odd and $y$ is odd: when $z$ is odd both sides equal $x$; when $z$ is even, $(x \cdot y) \cdot z = x \cdot z = f(x)$ and $x \cdot (y \cdot z) = x \cdot f(y) = f(x)$, since $f(y)$ is even. All that remains to show is that these operations do have associative predecessors.
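
A check of this construction for one particular $f$ (my choice: send each odd number to the next even number), verifying associativity and that the iterate is $\circ_L$ on a window:

```python
from functools import reduce

def f(x):
    # fixes even numbers and sends each odd number to an even one (hence idempotent)
    return x if x % 2 == 0 else x + 1

def dot(x, y):
    return f(x) if y % 2 == 0 else x

def iterate(op):
    return lambda x, y: reduce(op, [x] * y)   # y-th power of x w.r.t. op (y >= 1)

R = range(1, 12)
assert all(dot(dot(a, b), c) == dot(a, dot(b, c)) for a in R for b in R for c in R)  # associative
assert all(iterate(dot)(x, y) == x for x in R for y in R)                            # dot' = ∘_L
print("dot is an associative predecessor of ∘_L on the test window")
```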

We now construct some such predecessors $\&$, showing that the maximum associativity degree is in fact 3:

Let $s$ be a permutation of $\mathbb{N}$ composed of 2-cycles (no fixed points) that each consist of an even number and an odd number. Now define $$x \& y := s^x(y)$$ Note that, because of how $s$ is constructed, $x \& y \equiv x + y \pmod 2$. Therefore $(x \& y) \& z = (x + y) \& z$. And $x \& (y \& z) = s^x(s^y(z)) = s^{x+y}(z)$, so $\&$ is in fact associative. $x \& x = x$ if $x$ is even and $x \& x = s(x)$ when $x$ is odd. Therefore $x \& x \& x = x$ if $x$ is even, and $x \& x \& x = s(x) \& x = x$ if $x$ is odd, because $s(x)$ is even.
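
A sketch with one particular $s$ (my choice: swap $2k-1 \leftrightarrow 2k$), checking that $\&$ is associative, that $\&'$ is the operation $\cdot$ described above, and that $\cdot' = \circ_L$, all on a small window:

```python
from functools import reduce

def s(n):
    # fixed-point-free involution on N>=1 pairing each odd number with the next even one
    return n + 1 if n % 2 == 1 else n - 1

def amp(x, y):
    # x & y = s^x(y); since s is an involution this is s^(x mod 2)(y)
    return s(y) if x % 2 == 1 else y

def dot(x, y):
    # the predicted iterate &': f(x) if y is even, else x, where f(x) = x & x
    return amp(x, x) if y % 2 == 0 else x

def iterate(op):
    return lambda x, y: reduce(op, [x] * y)   # y-th power of x w.r.t. op (y >= 1)

R = range(1, 12)
assert all(amp(amp(a, b), c) == amp(a, amp(b, c)) for a in R for b in R for c in R)  # & associative
assert all(iterate(amp)(x, y) == dot(x, y) for x in R for y in R)                    # &' = dot
assert all(iterate(dot)(x, y) == x for x in R for y in R)                            # dot' = ∘_L
print("& is associative, &' = dot, and dot' = ∘_L on the test window")
```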

These $\&$ all have the property that $x \& x$ is distinct for all $x$, so they don't cover all possible $\cdot$ above. I see no reason why they should be the only ones, but it seems difficult to come up with all associative operations satisfying the mild constraints given.


Here is a partial result ...

Suppose $A: \Bbb{N}\times\Bbb{N}\to\Bbb{N}$ is a given operation that (1) is associative, (2) has a right identity element $r_A$, and (3) is such that $m\ A\ n \ge m+n$ for all sufficiently large $m,n \in \Bbb{N}$.

Define $B: \Bbb{N}\times\Bbb{N}\to\Bbb{N}$ in terms of $A$ as follows (using iterated function notation): $$m\ B\ n := (m\ A)^n\ r_A$$ which is the same as $$m\ B\ n \ \ := \ \ \begin{cases} r_A & \text{if }n=0 \\ \underset{n \ \ m\text{'s}}{\underbrace{m\ A\ m\ A\cdots m\ A\ m}} & \text{if }n\ge 1. \end{cases}$$ Now, for all $m\ge0, p \ge 1$, and $n$ sufficiently large (ensuring $n\ B\ p \ge np$ by property (3)),
$$\begin{align} \color{blue}{m\ B\ (n\ B\ p)} & = (m\ B\ n)\ A\ (m\ B\ (n\ B\ p - n)) \\ & = (m\ B\ n)\ A\ (m\ B\ n)\ A\ (m\ B\ (n\ B\ p - 2n)) \\ & \;\;\vdots \\ & = \color{blue}{((m\ B\ n)\ B\ p)}\ A\ (m\ B\ (n\ B\ p - np)) \end{align}$$

Therefore, $B$ is associative only if $(m\ B\ (n\ B\ p\ -np)) = r_A$; however, if $A$ is not addition then $n\ B\ p\ \gt np$ for sufficiently large $n$, in which case $(m\ B\ (n\ B\ p\ -np))$ is not a constant (hence $\neq r_A$). Consequently,

$$A\text{ is not addition }\implies B\text{ is not associative}.$$

On the other hand, if $A$ is addition, then $n\ B\ p\ = np$ and $(m\ B\ (n\ B\ p\ -np)) = m\ B\ 0 = r_A$, and virtually the same derivation as above shows that $B$ is then associative. That is,

$$B\text{ is associative }\iff A\text{ is addition} \iff B\text{ is multiplication}.$$

NB: This considers only those operations $A$ that (1) are associative, (2) have a right identity element, and (3) are such that $m\ A\ n \ge m+n$ for all sufficiently large $m,n \in \Bbb{N}$.
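
As a spot check of the $A \mapsto B$ construction (a sketch; `make_B` is a helper name of my own), here are two sample choices of $A$: addition with $r_A = 0$, which gives $B = \times$, and multiplication with $r_A = 1$, which satisfies (1)-(3) but is not addition, so its $B$ should fail associativity.

```python
def make_B(A, r_A):
    # m B n = (m A)^n r_A, i.e. apply the map (m A -) to r_A a total of n times
    def B(m, n):
        out = r_A
        for _ in range(n):
            out = A(m, out)
        return out
    return B

R = range(0, 8)

B_add = make_B(lambda x, y: x + y, 0)
assert all(B_add(m, n) == m * n for m in R for n in R)            # A = + gives B = ×

B_mul = make_B(lambda x, y: x * y, 1)                             # A = × is associative with right identity 1
assoc = all(B_mul(B_mul(a, b), c) == B_mul(a, B_mul(b, c)) for a in R for b in R for c in R)
print("B built from multiplication associative on the window:", assoc)  # (2, 1, 2) is a counterexample
```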