While I'm pretty decent with the ffmpeg command line tool, I've been meaning to learn to use ffmpeg's C libraries for some time. I'm aware of ffmpeg's powerful multimedia capabilities, and I want to combine them with my graphics programming projects.

I started following ffmpeg's tutorial to familiarize myself with the library. So far things seem a little overcomplicated: frustrated by the amount of code I need to write just to read frames from a media file, I decided to take a step back and approach my problem from a different end.

My motivation for using ffmpeg comes from my idea of animating textures on 3D objects. To do that, I need to figure out how to decode each frame of a video file's video stream and how to upload it to OpenGL. Since I got somewhat stuck on the former, I decided to do some work on the latter.

For that I wanted a bunch of frames I could use, so I could focus only on the logic of passing frames to OpenGL. I used ffmpeg's command line tool to convert a video stream to the PPM format, which to me is the simplest format I could possibly work with. The resulting file is basically a series of concatenated PPM images, each one a full bitmap of a single frame of the video.

# This is the actual command I used. Note that I'm resizing the video as
# well as converting it to ppm. The "-" writes the output to stdout,
# which the shell then redirects into out.ppm.
ffmpeg -i aesthetics.mkv -filter:v "scale=-1:480" -f image2pipe -codec:v ppm - > out.ppm

I created a single OpenGL texture which I updated for every new frame I loaded into memory. It may not be the best way, but for all intents and purposes of my project it was good enough. In the end I just looped the video over and over again.
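The looping part reduces to clearing the stream's EOF flag and seeking back to the start of the file. Here is a sketch of that per-frame logic; `next_frame` is my own name, the header handling assumes ffmpeg's plain P6 layout, and the actual texture upload is only hinted at in a comment rather than taken from the demo:

```cpp
#include <cstddef>
#include <fstream>
#include <ios>
#include <string>
#include <vector>

// Reads the next P6 frame from the .ppm file into rgb.
// When the stream hits EOF, rewinds to the beginning so the video loops.
bool next_frame(std::ifstream& video, std::vector<unsigned char>& rgb) {
    if (video.peek() == EOF) {
        video.clear();                 // drop the EOF flag...
        video.seekg(0, std::ios::beg); // ...and rewind: the video loops
        if (video.peek() == EOF) return false; // empty file
    }
    std::string magic;
    int w = 0, h = 0, maxval = 0;
    if (!(video >> magic >> w >> h >> maxval) || magic != "P6") return false;
    video.get(); // single whitespace byte after the header
    rgb.resize(static_cast<std::size_t>(w) * h * 3);
    video.read(reinterpret_cast<char*>(rgb.data()),
               static_cast<std::streamsize>(rgb.size()));
    // Here the render loop would update the one texture, e.g. with
    // glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
    //                 GL_RGB, GL_UNSIGNED_BYTE, rgb.data());
    return static_cast<std::size_t>(video.gcount()) == rgb.size();
}
```

Reusing one texture and overwriting its pixels each frame avoids reallocating GPU storage sixty times a second.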


In my demo, I decode frames from a PPM video stream into a texture and render it in real time onto a cube in OpenGL.

The music is from Vektroid and the video is from Blank Banshee.

I'm not done with my objective just yet. Now that I've gained some experience on the OpenGL side of rendering animated textures, I'm going back to ffmpeg to try decoding different video formats, and finally merge that with the code from this demo.

Download the source code.

In order to compile, you'll need the following libraries: freeglut3-dev and libglew-dev, as well as a modern graphics driver. Go to the project directory and run:

make run

You'll also need to provide a .ppm file. Because of the huge file size and possible legal issues, I'm not distributing the video file with the project. You can run the ffmpeg command mentioned above to create a PPM video file from an existing video on disk, then open src/main.cpp and change the filepath variable to point to the newly created file.