Video GPU hardware decoding on Android
Issue generated from Tuleap's migration script. Originally submitted by: Ciro Santilli (cirosantilli)
This is a followup to https://tuleap.ring.cx/plugins/tracker/?aid=293 specifically to implement GPU decoding.
Current status: patches using AMediaCodec are under review at https://gerrit-ring.savoirfairelinux.com/#/c/3799/ and https://gerrit-ring.savoirfairelinux.com/#/c/3780/
Video works most of the time but sometimes fails; see the commit messages.
## MediaCodec
We are not sure that AMediaCodec is the best way to do it, but it looks like a good choice because it is the native (NDK) version of a Java API, and thus "well documented": http://developer.android.com/reference/android/media/MediaCodec.html
There is an official example of using AMediaCodec at https://github.com/googlesamples/android-ndk/tree/3cd41e1f5280443665ca98463c7a76e80bf0b96c/native-codec The problem is that the example uses MediaExtractor to read a video file instead of decoding streamed bytes directly.
The pure Java example at https://github.com/googlesamples/android-BasicMediaDecoder/tree/39d62cafb747176b96a385404952f1b8439f7b2e may be useful, since the native API is basically identical. But once again, that example uses a video file instead of a stream.
https://github.com/fyhertz/libstreaming appears to do H.264 streaming with the Java MediaCodec, so it might be a useful example to look at. It implements RTSP; concrete examples are at https://github.com/fyhertz/libstreaming-examples
## Other possibilities
### OpenMAX
OpenMAX would likely work as well; there is an example at https://github.com/googlesamples/android-ndk/tree/3cd41e1f5280443665ca98463c7a76e80bf0b96c/native-media As with the MediaCodec example, the problem is that it decodes a file, not a stream.
OpenMAX is used internally by MediaCodec, as can be seen from the fact that libopenmax must also be linked against in order to use AMediaCodec.
The major problem with OpenMAX is that it feels like an unmaintained Khronos API from 2011: examples are extremely scarce, and it is even difficult to get a quick GNU/Linux implementation running. MediaCodec, on the other hand, seems to gain more and more support over time.
### libstagefright
The name comes up often, likely because FFmpeg used to support it, but support was dropped because it is an internal Android API (see the FFmpeg commit under Misc links below).
So this is likely not a good idea.
### FFmpeg
The perfect solution would of course be an FFmpeg build option that magically works on Android, but we haven't been able to find one.
There is some info at https://trac.ffmpeg.org/wiki/HWAccelIntro, which mentions -hwaccel, but it does not mention Android.
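For reference, the desktop-side -hwaccel usage described on that wiki page looks roughly like the following sketch (the VAAPI render node path is an assumption and varies per machine):

```shell
# Decode with VAAPI on a GNU/Linux desktop and discard the output,
# which is useful for benchmarking the decoder alone.
ffmpeg -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 \
       -i input.mp4 -f null -
```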
Most FFmpeg-on-Android hardware decoding threads are about libstagefright, which, as mentioned above, is not an option.
So we have opted for a quick-and-dirty custom solution first. But ideally, our solution should be merged back into FFmpeg one day.
## Measuring the improvement
We expect two improvements with this change:
- higher FPS
- lower energy consumption
To test FPS, apply the patch https://gerrit-ring.savoirfairelinux.com/#/c/3728/ and then compile with:
EXTRA_CFLAGS=-DDEBUG_FPS ./compile.sh
We haven't investigated energy consumption yet. The Android guide says that constant networking is the main power drain on Android devices, so maybe decoding is not significant: http://developer.android.com/training/monitoring-device-state/index.html To be confirmed.
## How to generate a minimal example
In order to really understand what is going on, it might be necessary to generate a minimal FFmpeg encoding to MediaCodec decoding example.
The best way to do that might be to:
- make a minimal Android socket-based server app that reads packets and feeds them to AMediaCodec. Maybe https://github.com/cirosantilli/android-cheat/tree/92de020d0b708549a444ebd9f881de7b240b3fbc/socket can be used as a starting point for the ServerSocket part.
- encode camera data from a GNU/Linux computer and send it with FFmpeg over the LAN. It would be even more minimal if we could have the video source inside Android as well, but installing FFmpeg on Android is not so simple (Ring does it of course, but maybe https://github.com/WritingMinds/ffmpeg-android would be simpler).
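The ServerSocket step could start from a plain-Java sketch like this one (the class name and the length-prefixed packet framing are our invention; on Android, each packet's bytes would be queued into a MediaCodec input buffer instead of being printed):

```java
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.net.ServerSocket;
import java.net.Socket;

// Minimal sketch of the server side: accept one connection and read
// length-prefixed packets from it.
public class Main {
    static byte[] readPacket(DataInputStream in) throws Exception {
        int len = in.readInt();   // 4-byte big-endian length prefix
        byte[] buf = new byte[len];
        in.readFully(buf);        // block until the whole packet has arrived
        return buf;
    }

    public static void main(String[] args) throws Exception {
        try (ServerSocket server = new ServerSocket(0)) {  // any free port
            // Loopback client standing in for the FFmpeg sender.
            new Thread(() -> {
                try (Socket c = new Socket("127.0.0.1", server.getLocalPort())) {
                    DataOutputStream out = new DataOutputStream(c.getOutputStream());
                    out.writeInt(3);
                    out.write(new byte[] { 0x67, 0x68, 0x65 });  // fake NAL bytes
                    out.flush();
                } catch (Exception e) {
                    throw new RuntimeException(e);
                }
            }).start();
            try (Socket s = server.accept()) {
                byte[] pkt = readPacket(new DataInputStream(s.getInputStream()));
                System.out.println("got " + pkt.length + " bytes");  // got 3 bytes
            }
        }
    }
}
```

Note that a raw Annex-B stream from FFmpeg is not length-prefixed, so the real server would instead split the byte stream at NAL start codes.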
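For the FFmpeg sender side, something along these lines might work (the camera device, the phone's IP address, and the port are placeholders):

```shell
# Grab the webcam, encode to H.264, and push raw Annex-B bytes over TCP.
ffmpeg -f v4l2 -i /dev/video0 \
       -c:v libx264 -tune zerolatency \
       -f h264 tcp://192.168.0.42:5000
```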
## If you really want this to work at any cost
Email https://www.linkedin.com/in/andymcfadden (http://stackoverflow.com/users/294248/fadden) and ask if he does consulting. He's answered tons of Android video questions on Stack Overflow. If it's doable, I bet he could do it in a week, and the right way.
He replied to me:
> I'm not available for consulting at this time.
>
> If you want this to work on Android API 18+, you should use MediaCodec, in either its Java or native form. The output should go directly to the display Surface. The Java API is just a thin wrapper around the native API, so there's no performance reason to use the native MediaCodec API. Using a stable API will make your life much easier than attempting to use libstagefright or OMX directly, as those can change without warning and break everything on future devices.
>
> Breaking a stream into NAL units will likely be faster in native code, but I don't know how much of a difference it would make in practice. Either way you have to identify the start and end, and copy the data into an input buffer, because that's how OMX rolls.
Or maybe https://github.com/fyhertz; he's from Montreal as well.
## Misc links
This is a huge, semi-organized dump of a great number of links that we have looked into to varying degrees. Examples are scarce everywhere; this is just to save some googling time.
- https://github.com/fyhertz/libstreaming
- https://github.com/OnlyInAmerica/FFmpegTest
- OpenMAX on Android https://www.khronos.org/openmax/ By Khronos; VLC has an option for it. Previously called iomx?
- last API update 2011, almost no desktop implementation... old forgotten stuff
- <ndk>/samples/native-codec
- TODO: how is frame-rate controlled there? no usleep. Just do it as fast as possible?
- http://mobilepearls.com/labs/native-android-api/ndk/docs/openmaxal/
- http://stackoverflow.com/questions/6990020/using-openmax-il-for-audio-video-decoding-on-android
- http://stackoverflow.com/questions/14528487/can-openmax-for-android-ndk-be-used-for-streaming-live-video-audio-to-a-server
- http://android-developers.blogspot.fr/2011/11/updated-ndk-for-android-40.html
- http://osdir.com/ml/android-ndk/2012-11/msg00079.html
- https://android.googlesource.com/platform/ndk/+/android-4.4.4_r2.0.1/docs/openmaxal/index.html
- mediacodec
- stream
- http://stackoverflow.com/questions/13307086/decoding-raw-h264-stream-in-android
- http://stackoverflow.com/questions/32739047/android-decode-raw-h264-stream-with-mediacodec
- http://stackoverflow.com/questions/26678717/how-to-play-raw-h264-produced-by-mediacodec-encoder
- http://stackoverflow.com/questions/31367225/android-mediacodec-decoding-of-raw-h-264
- http://stackoverflow.com/questions/13397863/use-mediacodec-for-h264-streaming
- http://stackoverflow.com/questions/17358918/can-android-mediacodec-decode-video-h264-stream
- http://stackoverflow.com/questions/15756735/decoding-h264-streaming-using-android-low-level-api
- http://stackoverflow.com/questions/21232206/raw-h-264-stream-output-by-mediacodec-not-playble
- http://stackoverflow.com/questions/19742047/how-to-use-mediacodec-without-mediaextractor-for-h264
- http://stackoverflow.com/questions/15305241/how-do-i-feed-h-264-nal-units-to-android-mediacodec-for-decoding fadden says: can't feed fixed size buffers, need to parse H.264 and send some variable size unit
- http://stackoverflow.com/questions/25738680/how-to-parse-access-unit-in-h-264
- http://stackoverflow.com/questions/21182246/android-mediacodec-decode-h264-raw-frame
- https://developer.android.com/reference/android/media/MediaCodec.html official API?
- With FFmpeg
- <sdk>/samples/android-22/media/BasicMediaDecoder
- https://github.com/vecio/MediaCodecDemo
- https://github.com/taehwandev/MediaCodecExample could not get working
- VLC example https://github.com/videolan/vlc/blob/master/modules/codec/omxil/mediacodec_ndk.c
- stream
- http://stackoverflow.com/questions/25791722/using-hardware-acceleration-with-libavcodec
- http://stackoverflow.com/questions/32371246/android-hardware-accelerated-video-decoder-for-h264-stream
- http://stackoverflow.com/questions/7869907/hardware-accelerated-ffmpeg-on-android
- http://stackoverflow.com/questions/8670807/android-ffmpeg-and-hardware-acceleration
- http://stackoverflow.com/questions/14890140/to-use-hw-decoder-in-android-through-libstagefright-what-to-set-for-kkeyavcc-in
- http://stackoverflow.com/questions/9702503/h264-hw-decoding-on-android-using-ffmpeg-10
- http://stackoverflow.com/questions/29358915/how-can-ffmpeg-be-made-as-efficient-as-androids-built-in-video-viewer/29362353#29362353
- http://stackoverflow.com/questions/28775931/muxing-android-mediacodec-encoded-h264-packets-into-rtmp
- https://developer.android.com/training/basics/firstapp/creating-project.html
- https://gerrit-ring.savoirfairelinux.com/#/q/status:open,25
- libstagefright
- FFmpeg dropped it because this API is not public, says that MediaCodec should be used instead: https://github.com/FFmpeg/FFmpeg/commit/72673ad7eae2d4f685f3c0a895558502bfe07c8e
- part of git clone https://android.googlesource.com/platform/frameworks/av at path media/libstagefright/
- there was a serious vulnerability in it: https://nakedsecurity.sophos.com/2015/07/28/the-stagefright-hole-in-android-what-you-need-to-know/
- https://source.android.com/devices/media/
- http://stackoverflow.com/questions/25818668/ffmpeg-support-for-libstagefright-hardware-decoding
- https://quandarypeak.com/2013/08/androids-stagefright-media-player-architecture/
- http://stackoverflow.com/questions/9832503/android-include-native-stagefright-features-in-my-own-project?lq=1
- http://www.canofcode.co.uk/software/android/hardware-video-decoding-android/
- ndk/samples/native-codec