Problem using avcodec_decode_video2 on its own (without libav's parsing)? How to set qscale?
I am involved in a project that uses libav to decode video.
It uses its own code to parse the video stream and avcodec_decode_video2 for decoding.
Switching to libav's parsing would require too much effort at the moment, so I'd like to leave the parser as it is and just tune the AVCodecContext.
My problem is that the MPEG-4 Part 2 decoder doesn't decode all video frames starting from the first keyframe, whereas ffplay seems to manage it.
I am using a Windows build of git commit d049257 (2011-10-19).
When my application receives an I-frame (vop_coding_type field equal to 00), the call to avcodec_decode_video2() returns -1 and produces the following debug messages:
[mpeg4 @ 0091C120]hmm, seems the headers are not complete, trying to guess time_increment_bits
[mpeg4 @ 0091C120]my guess is 7 bits ;)
[mpeg4 @ 0091C120]looks like this file was encoded with (divx4/(old)xvid/opendivx) -> forcing low_delay flag
[mpeg4 @ 0091C120]qp:17 fc:1,1 I size:436952 pro:1 alt:0 top:0 hpel part:0 resync:0 w:0 a:0 rnd:1 vot:0 dc:99 ce:0/0/0
[IMGUTILS @ 055CFC54]Picture size 0x0 is invalid
[mpeg4 @ 0091C120]get_buffer() failed (-1 0 0 00000000)
Then, after my player has received a certain amount of data, I see the qp value become 5 instead of 17, and the decoder starts decoding and showing video frames.
I have also observed other qp values: 8 and 1.
ffplay always shows qp = 5.
So, could anyone give me a hint on how to make avcodec_decode_video2 use qp = 5?
I understand that the value 5 may be correct only for this particular video stream.
Unfortunately, my knowledge of MPEG-4 is rather vague.
How can I calculate the correct qscale (QP) value and feed it to the decoder?
I've studied the sources and seen that this value is stored in an internal structure that is not accessible from outside.