Access to the iOS video decoder?

2019

There are two solutions:

  1. Do it "by hand," which means using AVFoundation and, in particular, VideoToolbox.

To get going with that, you basically start with the WWDC 2014 session "Direct Access to Video Encoding and Decoding": https://developer.apple.com/videos/play/wwdc2014/513/ Enjoy!

I have to say, that is really the "correct and better" solution.
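To make option 1 concrete, here is a minimal hedged sketch of the VideoToolbox decode setup, assuming you already have a `CMVideoFormatDescriptionRef` for your H.264 stream and wrap each compressed frame in a `CMSampleBuffer` (the function name `makeSession` is illustrative):

```c
// Sketch: creating a VTDecompressionSession and receiving decoded frames.
// Assumes `formatDesc` was built from the stream's SPS/PPS and that each
// CMSampleBuffer contains AVCC-framed NAL units.
#include <VideoToolbox/VideoToolbox.h>

static void didDecompress(void *decompressionOutputRefCon,
                          void *sourceFrameRefCon,
                          OSStatus status,
                          VTDecodeInfoFlags infoFlags,
                          CVImageBufferRef imageBuffer,
                          CMTime presentationTimeStamp,
                          CMTime presentationDuration) {
    if (status == noErr && imageBuffer != NULL) {
        // imageBuffer is a CVPixelBuffer holding the decoded raw frame.
    }
}

VTDecompressionSessionRef makeSession(CMVideoFormatDescriptionRef formatDesc) {
    VTDecompressionOutputCallbackRecord callback = { didDecompress, NULL };
    VTDecompressionSessionRef session = NULL;
    // NULL decoder/destination attributes take the defaults, which use
    // hardware decoding when available.
    VTDecompressionSessionCreate(kCFAllocatorDefault, formatDesc,
                                 NULL, NULL, &callback, &session);
    return session;
}

// Then, for each compressed frame wrapped in a CMSampleBuffer:
//   VTDecompressionSessionDecodeFrame(session, sampleBuffer, 0, NULL, NULL);
```

Decoded frames arrive asynchronously in the callback as `CVPixelBuffer`s, which you can hand straight to a renderer.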

  2. Use FFmpeg. If you can get the FFmpeg API working inside your iOS app, FFmpeg will do hardware decoding after some fiddling.

There are a number of ways to get started with that. (One absolutely amazing new thing is SwiftFFmpeg by sunlubo: https://github.com/sunlubo/SwiftFFmpeg )

Be aware that the FFmpeg approach comes with a number of legal/licensing issues on iOS; one can search and read about those problems.

However, on the technical side, these days it is indeed possible to compile FFmpeg right into your iOS app and use it raw in your iOS code. (Using it as a C library may be easiest.)

We just did an enormous project doing exactly this, alongside other approaches. (I never want to see FFmpeg again!)

You can in fact achieve actual hardware decoding, in iOS, using FFmpeg.

We found it to be incredibly fiddly. And a couple of bugs need to be patched in FFmpeg. (I hope I never see videotoolbox.c again :/ )
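The fiddly part is mostly wiring FFmpeg up to VideoToolbox. As a hedged sketch of what that looks like (assuming an `AVCodecContext` already opened for the stream's codec; `enable_hw` and `pick_videotoolbox` are illustrative names):

```c
// Sketch: asking FFmpeg to use the VideoToolbox hardware decoder.
#include <libavcodec/avcodec.h>
#include <libavutil/hwcontext.h>

static enum AVPixelFormat pick_videotoolbox(AVCodecContext *ctx,
                                            const enum AVPixelFormat *fmts) {
    // FFmpeg offers a NONE-terminated list of candidate formats;
    // prefer the VideoToolbox one when it is offered.
    for (const enum AVPixelFormat *p = fmts; *p != AV_PIX_FMT_NONE; p++)
        if (*p == AV_PIX_FMT_VIDEOTOOLBOX)
            return *p;
    return fmts[0];  // fall back to software decoding
}

int enable_hw(AVCodecContext *ctx) {
    AVBufferRef *hw_device = NULL;
    int err = av_hwdevice_ctx_create(&hw_device,
                                     AV_HWDEVICE_TYPE_VIDEOTOOLBOX,
                                     NULL, NULL, 0);
    if (err < 0)
        return err;
    ctx->hw_device_ctx = av_buffer_ref(hw_device);
    ctx->get_format = pick_videotoolbox;
    av_buffer_unref(&hw_device);
    return 0;
}
```

Decoded frames then come back with format `AV_PIX_FMT_VIDEOTOOLBOX` (the data is a `CVPixelBuffer` reference); use `av_hwframe_transfer_data()` if you need the pixels in CPU memory.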

So, once again, your two options for hardware decoding on iOS are:

  1. Do it "by hand" with AVFoundation/VideoToolbox.

  2. Use FFmpeg.

Item 2 is incredibly fiddly and takes a lot of time. Item 1 takes a huge amount of time. Tough choice :/


As of iOS 8, you can use Video Toolbox (https://developer.apple.com/reference/videotoolbox) to decode H.264 to raw frames. The VT APIs are hardware accelerated and will give you much better performance than libavcodec. If you want to play the frames or generate a preview, you can use an EAGL-based renderer. I have written a sample app that encodes frames from raw to H.264 (https://github.com/manishganvir/iOS-h264Hw-Toolbox); H.264 to raw shouldn't be that difficult!
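A hedged sketch of the first step in that H.264-to-raw direction: before you can decode anything, VideoToolbox needs a format description built from the stream's SPS and PPS NAL units (without their Annex-B start codes; `makeFormatDesc` is an illustrative name):

```c
// Sketch: building the CMVideoFormatDescription that a
// VTDecompressionSession needs, from the stream's SPS/PPS NAL units.
#include <CoreMedia/CoreMedia.h>

CMVideoFormatDescriptionRef makeFormatDesc(const uint8_t *sps, size_t spsLen,
                                           const uint8_t *pps, size_t ppsLen) {
    const uint8_t *paramSets[2] = { sps, pps };
    const size_t paramSizes[2] = { spsLen, ppsLen };
    CMVideoFormatDescriptionRef desc = NULL;
    // 4 = byte length of the NAL-unit size field used in AVCC framing.
    OSStatus status = CMVideoFormatDescriptionCreateFromH264ParameterSets(
        kCFAllocatorDefault, 2, paramSets, paramSizes, 4, &desc);
    return (status == noErr) ? desc : NULL;
}
```

With that description in hand, you create a decompression session and feed it AVCC-framed sample buffers.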


After raising the issue with Apple DTS, it turns out that there is currently no way to decode video data from custom stream sources.

I will file an enhancement request for this.


If you continue to have problems with it, I suggest you take a look at libavcodec for decoding the data (it's part of the FFmpeg project).

There are great FFmpeg tutorials at dranger.com that show how to properly decode video (through libavcodec) and display it (using libsdl), among other things.
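The core of what those tutorials teach can be sketched with the modern libavcodec send/receive API. This is a hedged outline, assuming an opened `AVCodecContext` and a demuxed `AVPacket` (the function name `decode_packet` is illustrative):

```c
// Sketch: the modern libavcodec decode loop.
#include <libavcodec/avcodec.h>
#include <libavutil/frame.h>
#include <libavutil/error.h>
#include <errno.h>

int decode_packet(AVCodecContext *ctx, const AVPacket *pkt, AVFrame *frame) {
    int err = avcodec_send_packet(ctx, pkt);   // feed compressed data in
    if (err < 0)
        return err;
    // One packet can yield zero, one, or several frames.
    while ((err = avcodec_receive_frame(ctx, frame)) >= 0) {
        // frame->data / frame->linesize now hold the raw planes
        // (e.g. YUV420P) ready for display or conversion.
        av_frame_unref(frame);
    }
    // EAGAIN/EOF just mean "no more frames for now" -- not errors.
    return (err == AVERROR(EAGAIN) || err == AVERROR_EOF) ? 0 : err;
}
```

The older `avcodec_decode_video2()` loop shown in some tutorials is deprecated; the send/receive pair above is the current idiom.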