
Add a demo showing how to process video frames using GL before presenting them #6920

Closed
msilva-verimatrix opened this issue Jan 28, 2020 · 6 comments

@msilva-verimatrix

Good afternoon,
What would be the recommended way to extract video frames from ExoPlayer during the playback of a network url, edit the frames (to apply YUV filtering, effects or watermarking) and provide the updated frames to the player? Is it intercepting MediaCodec.queueInputBuffer() the way to go? The related ByteBuffer seems to aggregate raw video frames, but how to iterate through these frames?
Can you provide a code snippet or point to an example?
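For reference while waiting for an answer: NV21 stores a full-resolution Y plane followed by an interleaved V/U plane at half resolution in each dimension, so iterating through a raw frame in a ByteBuffer comes down to offset arithmetic. A minimal sketch (the `Nv21Frame` class and its method names are illustrative, not part of MediaCodec or ExoPlayer):

```java
// Sketch: indexing into one raw NV21 frame stored as a flat byte array/buffer.
// NV21 = W*H luma (Y) bytes, then interleaved V/U chroma at half resolution.
public class Nv21Frame {

    // Total size in bytes of one NV21 frame of the given dimensions.
    public static int frameSize(int width, int height) {
        int chromaWidth = (width + 1) / 2;
        int chromaHeight = (height + 1) / 2;
        return width * height + 2 * chromaWidth * chromaHeight;
    }

    // Byte offset of the luma (Y) sample for pixel (x, y).
    public static int yOffset(int width, int x, int y) {
        return y * width + x;
    }

    // Byte offset of the V sample covering pixel (x, y); the matching U
    // sample immediately follows at vOffset(...) + 1.
    public static int vOffset(int width, int height, int x, int y) {
        int chromaRowStride = 2 * ((width + 1) / 2); // V and U interleaved
        return width * height + (y / 2) * chromaRowStride + (x / 2) * 2;
    }
}
```

With these offsets, consecutive frames in a larger buffer start at multiples of `frameSize(width, height)`.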
Warm regards,

Moacir Silva

@iamradhakrishna

From my initial analysis, this should be possible by listening for the onFrameAvailable(SurfaceTexture surfaceTexture) callback while rendering the video to a surface, but I am also looking for the best solution.
Please advise.

Best regards,
RK

@msilva-verimatrix
Author

OK, thank you.
By configuring the video stream's MediaCodec with a null surface in MediaCodecVideoRenderer.configureCodec(), it was possible to redirect the video frames to MediaCodecRenderer.drainOutputBuffer(), convert them to JPEG using YuvImage, and then to Bitmaps using BitmapFactory. But I will also try the approach that you have suggested.
Are there plans to add support for more image formats in YuvImage in the near future? At the moment it seems to only support ImageFormat.NV21 and ImageFormat.YUY2...
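One way to sidestep the YuvImage format limitation (and the lossy JPEG round trip) is to convert YUV samples to ARGB directly and feed the result to Bitmap.setPixels(). A minimal sketch of the per-pixel step, assuming full-range (JPEG-style) BT.601 coefficients, which is what Android camera NV21 data conventionally uses; the class and method names are illustrative:

```java
// Sketch: convert one full-range BT.601 YUV triple to a packed ARGB int,
// suitable for Bitmap.setPixels(). Inputs are unsigned byte values 0..255.
public class YuvToRgb {

    public static int yuvToArgb(int y, int u, int v) {
        int d = u - 128; // Cb offset from neutral chroma
        int e = v - 128; // Cr offset from neutral chroma
        int r = clamp((int) Math.round(y + 1.402 * e));
        int g = clamp((int) Math.round(y - 0.344136 * d - 0.714136 * e));
        int b = clamp((int) Math.round(y + 1.772 * d));
        return 0xFF000000 | (r << 16) | (g << 8) | b;
    }

    private static int clamp(int x) {
        return x < 0 ? 0 : (x > 255 ? 255 : x);
    }
}
```

Note that video decoders may emit limited-range (16..235) YUV instead, in which case the coefficients differ; check the format reported by the codec before relying on either variant.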
Warm regards,

Moacir Silva

@andrewlewis
Collaborator

Is the idea to do some postprocessing on video frames output by the decoder before showing them to the user?

If so, this can be done by adding a GLSurfaceView to your layout, passing the player a Surface created from a SurfaceTexture based on a GL texture, and doing the filtering/compositing in a GL shader in your GLSurfaceView.Renderer, using the texture as input. I will mark this as an enhancement to add a demo of this functionality as there is quite a bit of non-obvious code required to make it work!
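For anyone landing here before the demo exists, the wiring described above is roughly as follows (a pseudocode outline only; EGL context setup, shader compilation, and error handling are omitted, and it is not compilable as-is):

```
// 1. In GLSurfaceView.Renderer#onSurfaceCreated, generate a GL texture id
//    bound to GL_TEXTURE_EXTERNAL_OES.
// 2. Wrap it: surfaceTexture = new SurfaceTexture(textureId).
// 3. Hand the player a surface built from it:
//    player.setVideoSurface(new Surface(surfaceTexture)).
// 4. When a frame arrives, schedule a redraw:
//    surfaceTexture.setOnFrameAvailableListener(st -> glSurfaceView.requestRender()).
// 5. In onDrawFrame, call surfaceTexture.updateTexImage(), then draw a
//    full-screen quad sampling the external texture (samplerExternalOES in
//    the fragment shader), applying the filtering/compositing there.
```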

@andrewlewis andrewlewis changed the title Extract video frames from ExoPlayer during the playback of a network url Add a demo showing how to process video frames using GL before presenting them Jan 31, 2020
@msilva-verimatrix
Author

Indeed, that would be the idea. And it would be great to have the demo!
Many thanks,

Moacir

kim-vde pushed a commit that referenced this issue Feb 11, 2020
Demonstrates rendering to a GLSurfaceView while applying a GL shader.

Issue: #6920
PiperOrigin-RevId: 293551724
ojw28 pushed a commit that referenced this issue Feb 13, 2020
Demonstrates rendering to a GLSurfaceView while applying a GL shader.

Issue: #6920
PiperOrigin-RevId: 293551724
@ojw28
Contributor

ojw28 commented Feb 13, 2020

There is now a demo app you can take a look at. Clone the project, open it in Android Studio, and look at the demo-gl module.

@ojw28 ojw28 closed this as completed Feb 13, 2020
@msilva-verimatrix
Author

That is great!
Many thanks,

Moacir

@google google locked and limited conversation to collaborators Apr 14, 2020