Hi all,

I am trying to use libavcodec (55.39.101) to decode H.264 video as fast as possible. I have found that the library only uses around 160% CPU (on an i3-2100: 2 cores, 4 threads) for some H.264 videos (e.g. Video A, see link below), but is able to use over 300% CPU for others (e.g. Video B, see link below).
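For reference, this is roughly how I open the decoder and enable threading (a simplified sketch, not my exact code; error handling trimmed, and the field values are just what I am currently trying):

#include <libavcodec/avcodec.h>
#include <libavutil/mem.h>

static AVCodecContext *open_h264_decoder(int n_threads)
{
    AVCodec        *codec;
    AVCodecContext *ctx;

    avcodec_register_all();

    codec = avcodec_find_decoder(AV_CODEC_ID_H264);
    if (!codec)
        return NULL;

    ctx = avcodec_alloc_context3(codec);
    if (!ctx)
        return NULL;

    ctx->thread_count = n_threads;       /* 0 = let libavcodec auto-detect */
    ctx->thread_type  = FF_THREAD_FRAME; /* frame threading decodes several
                                            frames in parallel; FF_THREAD_SLICE
                                            only splits work within one frame */

    if (avcodec_open2(ctx, codec, NULL) < 0) {
        av_free(ctx);
        return NULL;
    }
    return ctx;
}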
From mediainfo, Video A is interlaced while Video B is progressive. Does this really affect the parallelism of decoding? I have tried using x264 to encode videos to H.264 with both the interlaced and the progressive option, and with either option libavcodec can use over 300% CPU while decoding, so interlaced vs. progressive does not seem to be the root cause of the difference in CPU usage.

My questions are: what factors affect the CPU usage when decoding H.264 video, and is there a way to maximize CPU usage so that decoding becomes faster?

I use the ffmpeg benchmark command to illustrate the difference in CPU usage:

ffmpeg -benchmark -threads auto -i ~/Videos/fileA.h264 -f null -

(with -threads set to 4, 8 or auto, the results are similar)

Video A - http://downloads.mainconcept.com/MainConceptLogo_Blu-ray_AVC_1920x1080_LPCM.zip
Video B - http://download.blender.org/durian/trailer/Sintel_Trailer1.1080p.DivX_Plus_HD.mkv

I appreciate any help, thanks!

Winter Fa
