Apple - What is the technology used in iOS 11 settings migration (moving blue dots / particle pattern / animated QR code)?

The technology can be thought of as an animated QR code. The basic principle of data encoding is much like that of a QR code, with the obvious difference that the displayed data changes over time as an animation.

In addition, there seems to be a neat trick involving two different colors shown in quick alternation, which lets Apple "hide", or at least make less obvious, the clues the receiver of the data uses to synchronize with the signal (i.e. to figure out where the boundaries of the encoded data are).

With a traditional QR code these sync markers are much more evident: they are the big black squares with white borders placed at fixed positions in the code, which make it easy for the receiver to tell what is encoded data and what is not part of the code.

Much more detail is available in these two patents by Apple:

US Patent 9,022,291
US Patent 9,022,292
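
As a rough sketch of the flicker idea, assuming nothing about Apple's actual implementation: if each cell of a grid alternates between two values centred on a common average, the eye integrates the frames into a flat background, while differencing consecutive frames recovers the bits. The grid size, brightness values and modulation depth below are invented purely for illustration.

```swift
// Toy sketch (not Apple's actual code): each cell flickers between two
// values on alternating frames. Averaged over time the grid looks flat,
// but the frame-to-frame difference reveals the hidden bits.

let base = 128    // time-averaged brightness of every cell (0...255)
let delta = 12    // per-frame modulation, assumed small enough not to be noticed at 60 Hz

// Hypothetical payload: a small grid of bits.
let bits: [[Int]] = [
    [1, 0, 1, 1],
    [0, 1, 0, 0],
    [1, 1, 0, 1],
]

// Frame A nudges each cell up or down according to its bit;
// frame B applies the opposite nudge, so A and B average back to `base`.
let frameA = bits.map { row in row.map { base + ($0 == 1 ? delta : -delta) } }
let frameB = bits.map { row in row.map { base - ($0 == 1 ? delta : -delta) } }

// What the eye integrates: the per-cell average, which is uniform.
let perceived = (0..<bits.count).map { r in
    (0..<bits[r].count).map { c in (frameA[r][c] + frameB[r][c]) / 2 }
}
print("perceived:", perceived)   // every cell is 128 -> plain background

// What the receiver does: subtract consecutive frames and read the sign.
let decoded = (0..<bits.count).map { r in
    (0..<bits[r].count).map { c in frameA[r][c] - frameB[r][c] > 0 ? 1 : 0 }
}
print("decoded == bits:", decoded == bits)
```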


I skimmed the patents linked above and here’s my naive interpretation of the technology. It’s essentially a finely detailed QR code, or something similar, but rather than using coloured squares, the information is encoded by rapidly alternating complementary colours such that the human eye doesn’t notice. This also allows a higher information density, since the subtleties of the particular alternating colour pairs chosen can encode additional degrees of freedom.

Neither the rounded density distribution nor the wandering, orbiting movement of the particles is mentioned in the patent; it might be just for aesthetics, which would be an Apple thing to do. What is mentioned is that the code can be embedded into a background image (and presumably an animation) without the image appearing different to the human eye. I wonder how much of the spherical, orbiting “galaxy” shape is just smoke and mirrors to give you something to point your camera at while the real magic happens invisibly?

We need someone to film the code up close with a high-speed camera. The patent mentions a flicker rate of 60 Hz, so any footage above 120 fps should do the trick.
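
As a rough sketch of what such an inspection could look like (the frame format, threshold and synthetic frames are all assumptions for illustration, not anything from the patents): capture at 120 fps or more so that consecutive frames land on opposite phases of the 60 Hz flicker, then flag pixels whose brightness swings noticeably from one frame to the next.

```swift
typealias Frame = [[Int]]   // per-pixel brightness, 0...255

/// Returns a mask of pixels whose brightness changes by more than `threshold`
/// between consecutive frames -- i.e. candidate "flickering" code regions.
/// Assumes all frames share the same dimensions; real footage would also
/// need alignment, colour handling, etc.
func flickerMask(frames: [Frame], threshold: Int = 8) -> [[Bool]] {
    guard let first = frames.first else { return [] }
    var mask = first.map { row in row.map { _ in false } }
    for i in 1..<frames.count {
        for r in 0..<first.count {
            for c in 0..<first[r].count {
                if abs(frames[i][r][c] - frames[i - 1][r][c]) > threshold {
                    mask[r][c] = true
                }
            }
        }
    }
    return mask
}

// Example with two synthetic frames where a single pixel flickers.
let f0: Frame = [[128, 128], [128, 128]]
let f1: Frame = [[140, 128], [128, 128]]
print(flickerMask(frames: [f0, f1]))   // [[true, false], [false, false]]
```

The 120 fps figure is just the sampling-rate argument: a 60 Hz alternation completes a full bright/dark cycle every 1/60 s, so sampling at least twice per cycle guarantees that adjacent frames catch the two opposite states rather than averaging them together.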