I am getting irregular newBufferInfo.presentationTimeUs values, so when I use Thread.sleep to slow playback down, a lot of frames are dropped. With a Surface, the frame timestamps are synchronized with the system clock automatically, without sleeping, but that does not work when the output goes to OpenGL ES. https://developer.android.com/reference/android/media/MediaCodec#releaseOutputBuffer(int,%20long) I thought mExtractor.getSampleTime() was the problem, but even after removing it, the
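A common alternative to a fixed Thread.sleep is to anchor the first frame's presentationTimeUs to a wall-clock time and pace each subsequent frame by its timestamp delta. The sketch below is a hypothetical, self-contained illustration of that pacing arithmetic (the timestamps in main are made up, not taken from the question's stream):

```java
// Hypothetical sketch: pace decoder output by presentationTimeUs deltas instead of
// a fixed Thread.sleep. The first frame anchors PTS time to wall-clock time; every
// later frame waits only as long as its PTS delta says, so jitter does not accumulate.
public class FramePacer {
    private long firstPtsUs = -1;   // presentationTimeUs of the first frame
    private long startTimeNs = -1;  // wall-clock anchor for the first frame

    /** Returns how long (ms) to wait before releasing this frame; never negative. */
    public long delayMs(long presentationTimeUs, long nowNs) {
        if (firstPtsUs < 0) {
            firstPtsUs = presentationTimeUs;
            startTimeNs = nowNs;
            return 0;
        }
        long dueNs = startTimeNs + (presentationTimeUs - firstPtsUs) * 1000L;
        return Math.max(0, (dueNs - nowNs) / 1_000_000L);
    }

    public static void main(String[] args) {
        FramePacer pacer = new FramePacer();
        long t0 = 0; // stand-in for System.nanoTime() at the first frame
        System.out.println(pacer.delayMs(0, t0));                    // first frame: 0
        System.out.println(pacer.delayMs(33_366, t0 + 10_000_000L)); // ~23 ms still to wait
        System.out.println(pacer.delayMs(66_733, t0 + 70_000_000L)); // already late: 0
    }
}
```

In a real decode loop you would pass newBufferInfo.presentationTimeUs and System.nanoTime(), sleep for the returned delay, then call releaseOutputBuffer; a late frame gets a delay of 0 rather than a negative sleep.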
Tag: android-mediacodec
Android MediaCodec decodes single frames of an H.264 video stream with large green padding
I want to decode a single frame of an H.264 video stream sent by a server, but the resulting picture has large padding. Code & result: Code: The result has large padding like this: What am I doing wrong? Rescaling the YUV image did not help and caused the picture to have 0 dimensions. (I put
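Green bands like this are typically decoder alignment padding, not corrupted data: H.264 decoders often round the output buffer up to macroblock alignment (e.g. 1080 rows become 1088), and the real picture is described by the crop rectangle in the decoder's output MediaFormat ("crop-left", "crop-top", "crop-right", "crop-bottom", with inclusive coordinates). The sketch below only illustrates the crop arithmetic; the numbers are made up, not taken from the question's stream:

```java
// Hypothetical sketch: the visible picture inside a padded decoder buffer is given
// by the crop rectangle from the output MediaFormat. The crop coordinates are
// inclusive, so width = right - left + 1 and height = bottom - top + 1.
public class CropRect {
    final int left, top, right, bottom; // as read from the output MediaFormat

    CropRect(int left, int top, int right, int bottom) {
        this.left = left; this.top = top; this.right = right; this.bottom = bottom;
    }

    int width()  { return right - left + 1; }
    int height() { return bottom - top + 1; }

    public static void main(String[] args) {
        // Example: a 1920x1080 stream decoded into a macroblock-aligned 1920x1088 buffer.
        CropRect crop = new CropRect(0, 0, 1919, 1079);
        System.out.println(crop.width() + "x" + crop.height()); // 1920x1080
    }
}
```

When converting the YUV buffer yourself, copy only the crop region (and honor "stride" / "slice-height" from the same format) instead of the full aligned buffer; rescaling the padded image will not remove the band.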
Rendering same video to 2 Surfaces from a MediaCodec
MediaCodec has 2 ways of operating: either you pass a Surface for it to render to, or you read the output buffer and paint it to the screen yourself. In the first case, where I pass a Surface: is it possible to render the same MediaCodec-decoded video to 2 surfaces? The decoding loop looks something like this: Answer The
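One common approach (a hedged sketch, not code from the question): instead of handing MediaCodec one of the display Surfaces directly, give it a Surface backed by a SurfaceTexture, then draw the resulting GL_TEXTURE_EXTERNAL_OES texture into two EGL window surfaces each frame. All names (oesTextureId, eglSurfaceA/B, drawOesTexture) are illustrative; EGL/GL setup and error handling are omitted:

```java
// Hypothetical sketch: fan one decoded stream out to two Surfaces via OpenGL ES.
// The decoder renders into a SurfaceTexture; we then draw that external texture
// into each destination surface. Must run on the thread owning the EGL context.
SurfaceTexture decoderTexture = new SurfaceTexture(oesTextureId);
Surface decoderSurface = new Surface(decoderTexture);
decoder.configure(format, decoderSurface, null, 0);

// In the decode loop, render=true sends the frame to our SurfaceTexture:
decoder.releaseOutputBuffer(outputIndex, true);

decoderTexture.setOnFrameAvailableListener(st -> {
    st.updateTexImage(); // latch the newest decoder frame into the OES texture
    for (EGLSurface target : new EGLSurface[] { eglSurfaceA, eglSurfaceB }) {
        EGL14.eglMakeCurrent(eglDisplay, target, target, eglContext);
        drawOesTexture(oesTextureId); // hypothetical helper: full-screen textured quad
        EGL14.eglSwapBuffers(eglDisplay, target);
    }
});
```

The design trade-off is that the codec itself only ever targets one Surface, so the duplication happens in GL, where drawing the same texture twice is cheap compared to decoding the stream twice.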