[Libav-user] Filtergraph memory leak


Leif Andersen
I have an FFmpeg program that:

1. Demuxes and decodes a video file.
2. Passes it through a filtergraph
3. encodes and muxes the new video.

The filtergraph itself is rather complex, and can be run directly from
the command line as such:

    ffmpeg -i demo.mp4 -filter_complex \
      "[0:v]fifo[video5] ... " \
      -map "[fv]" -map "[fa]" out.mp4

I realize this is a massive filtergraph with a lot of no-op filters;
it was autogenerated rather than hand written. [Here is a cleaner
version of the graph.][1] (It's a Graphviz file; you can render it
from the command line or [here][2].)

Anyway, when I run the program that uses this filtergraph my memory
usage spikes. I end up using about 7 GB of RAM for a 30 second clip.
However, when I run the program using the ffmpeg command above, it
peaks out at about 600 MB of RAM. This causes me to believe that the
problem is not the ungodly size of the filtergraph, but a problem with
how my program is using it.

The program sets up the filtergraph (using `avfilter_graph_parse_ptr`,
giving it the filtergraph string shown above), encoder, muxer, decoder,
and demuxer, then spawns two threads: one that sends frames into the
filtergraph, and one that receives them. The thread that sends them
looks something like:

    void decode() {
        while (... more_frames ...) {
            AVFrame *frame = av_frame_alloc();
            ... fill next frame of stream ...
            av_buffersrc_write_frame(ctx, frame);
            av_frame_free(&frame);
        }
    }

(I have elided the `avcodec_send_packet`/`avcodec_receive_frame` calls
as they don't seem to be leaking memory. I have also elided the
flushing of the buffersrc, as that doesn't happen until the end, and
the memory spikes long before that.)
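For completeness, the eventual flush is just a matter of writing a
`NULL` frame to the buffer source, which signals EOF to the graph:

        /* end of stream: tell the filtergraph no more input is coming */
        av_buffersrc_write_frame(ctx, NULL);

(Again, the spike happens well before this point, so I doubt the
flushing is the issue.)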

And the encoder thread looks similar:

    void encode() {
        while (... nodes_in_graph ...) {
            AVFrame *frame = av_frame_alloc();
            av_buffersink_get_frame(ctx, frame);
            ... ensure frame actually was filled ...
            ... send frame to encoder ...
            av_frame_free(&frame);
        }
    }

As with the decoder, I have elided the
`avcodec_send_frame`/`avcodec_receive_packet` combo, as it doesn't
seem to be leaking memory. I have additionally elided the details of
ensuring that the frame actually was filled; the code loops until the
frame eventually does get filled.
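Concretely, the "ensure the frame was filled" part is a retry loop
keyed off the buffersink's return code, roughly:

        int ret = av_buffersink_get_frame(ctx, frame);
        if (ret == AVERROR(EAGAIN)) {
            ... no frame ready yet, wait for the decode thread ...
        } else if (ret == AVERROR_EOF) {
            ... graph fully drained, stop ...
        } else if (ret < 0) {
            ... report error ...
        }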

Every frame I allocate, I fairly quickly deallocate. I have
additionally handled all of the error cases that FFmpeg can return
(elided in the snippets above).

I have also tried having only one frame for the encoder and one for
the decoder (calling `av_frame_unref` on it in each iteration of the
loop).
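
That reuse-one-frame variant looks something like this on the encoder
side (the decoder side is the mirror image):

        AVFrame *frame = av_frame_alloc();
        while (... nodes_in_graph ...) {
            av_buffersink_get_frame(ctx, frame);
            ... ensure frame actually was filled ...
            ... send frame to encoder ...
            /* drop this iteration's reference before reusing the frame */
            av_frame_unref(frame);
        }
        av_frame_free(&frame);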

Am I forgetting to free something, or am I just using the calls to
libavfilter incorrectly, such that it has to buffer all of the data? I
don't think the leak is caused by the filtergraph itself, because
running it from the command line doesn't cause the same memory
explosion.

FWIW, the actual code is [here][3], although it's written in Racket. I
should also note that I originally posted this question to
[Stack Overflow][4], but was directed here instead. If anyone has any
suggestions, I would love to hear them.


~Leif Andersen

[1]: https://gist.github.com/LeifAndersen/68c87563c8f2f7af3ea7913b65c651b6
[2]: http://webgraphviz.com/
[3]: https://github.com/videolang/video/blob/master/video/private/ffmpeg-pipeline.rkt
[4]: https://stackoverflow.com/questions/44913445/ffmpeg-filtergraph-memory-leak?noredirect=1#comment76803986_44913445
Libav-user mailing list
[hidden email]