Opened 9 years ago
Closed 9 years ago
#5176 closed enhancement (wontfix)
avcodec_decode_video2 requiring input buffer to be larger than actual read bytes
Reported by: | Henrik Boström | Owned by: | |
---|---|---|---|
Priority: | normal | Component: | avcodec |
Version: | unspecified | Keywords: | |
Cc: | Blocked By: | ||
Blocking: | Reproduced by developer: | no | |
Analyzed by developer: | no |
Description
As is already known and documented for avcodec_decode_video2, the input buffer needs to be AV_INPUT_BUFFER_PADDING_SIZE bytes larger than the actual read bytes, because some bitstream readers read 32 or 64 bits at once and can over-read.
I view this as a flaw and suggest it be fixed so that callers don't have to allocate more bytes than the encoded data occupies. It shouldn't be a performance issue, because you could still read in 64-bit (or whatever) chunks until very near the end.
(As someone who is using multiple libraries, FFmpeg is now a special case when allocating buffers, and I feel like it shouldn't have to be.)
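To make the requirement concrete, here is a minimal sketch of the allocation pattern the documentation asks for: allocate the encoded size plus AV_INPUT_BUFFER_PADDING_SIZE and zero the padding. The helper name `alloc_padded` is made up for illustration; the macro is normally provided by `libavcodec/avcodec.h` and is defined locally here only so the sketch compiles standalone (its actual value has varied across FFmpeg versions).

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Normally provided by libavcodec/avcodec.h; defined here only so
 * this sketch compiles without FFmpeg headers. */
#ifndef AV_INPUT_BUFFER_PADDING_SIZE
#define AV_INPUT_BUFFER_PADDING_SIZE 64
#endif

/* Allocate room for `size` bytes of encoded data plus the padding
 * the decoder's bitstream reader may over-read. The documentation
 * also asks for the padding to be zeroed, to avoid the reader
 * picking up stale bytes. */
static uint8_t *alloc_padded(size_t size)
{
    uint8_t *buf = malloc(size + AV_INPUT_BUFFER_PADDING_SIZE);
    if (buf)
        memset(buf + size, 0, AV_INPUT_BUFFER_PADDING_SIZE);
    return buf;
}
```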
Change History (3)
comment:1 by , 9 years ago
comment:2 by , 9 years ago
OK. Fair enough. (I thought you could avoid that by telling the optimized byte reader to read X bytes, where X covers all the bytes up to the end, and then read the remaining Y < 64 bits normally; guess not.)
comment:3 by , 9 years ago
Keywords: | avcodec_decode_video2 AV_INPUT_BUFFER_PADDING_SIZE input buffer bitstream reader 32 64 bits overread removed |
---|---|
Resolution: | → wontfix |
Status: | new → closed |
Of course this is a performance issue: without padding we would have to check on every read whether we're at the end of the buffer, which is exactly what the required padding is designed to avoid.
Don't expect any changes, this is a clear design decision.
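The trade-off the developer describes can be sketched as follows. This is not FFmpeg's actual reader, just an illustration under the assumption of a 32-bit-at-a-time byte reader: with guaranteed padding past the end of the real data, the fast path is a single unconditional 4-byte load; without it, every read needs a bounds check and a byte-by-byte tail path.

```c
#include <stdint.h>
#include <stddef.h>
#include <string.h>

/* Illustrative reader state; `end` marks the end of the real data
 * (start of the padding, when padding exists). */
typedef struct {
    const uint8_t *p, *end;
} ByteReader;

/* With padding guaranteed past `end`, we can always load 4 bytes
 * at once; reads may touch the (zeroed) padding, which is safe by
 * contract. No branch per read. */
static uint32_t read32_padded(ByteReader *r)
{
    uint32_t v;
    memcpy(&v, r->p, 4);
    r->p += 4;
    return v;
}

/* Without padding, every read must branch on the remaining length
 * and fall back to a partial copy near the end, zero-filling the
 * missing bytes. */
static uint32_t read32_checked(ByteReader *r)
{
    uint32_t v = 0;
    size_t avail = (size_t)(r->end - r->p);
    size_t n = avail < 4 ? avail : 4;
    memcpy(&v, r->p, n);
    r->p += n;
    return v;
}
```

On the fast path (at least 4 bytes left) both functions return the same value; the difference is the per-read branch and tail handling that the padding requirement exists to eliminate.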