Using FFmpeg+libvmaf is very powerful, as you can create complex filters to calculate VMAF directly on videos of different encoding formats and resolutions. For best practices on computing VMAF at the right resolution, refer to our tech blog. We provide a few examples of how you can construct the FFmpeg command line and use VMAF as a filter.

Below is an example of how you can run FFmpeg+libvmaf on a pair of YUV files. First, download the reference video src01_hrc00_576x324.yuv and the distorted video src01_hrc01_576x324.yuv. Note that you may need to download the test videos from vmaf_resource.

It is important to set the frame rate and the PTS correctly, since FFmpeg filters synchronize based on timestamps rather than frames. `-r 24` sets the frame rate (note that it needs to come before `-i`), and `setpts=PTS-STARTPTS` aligns the PTS (presentation timestamps) of the two videos. This is crucial if one of your videos does not start at PTS 0, for example if you cut it out of a longer video stream. The log_path is set to standard output, /dev/stdout. The model_path points to /usr/local/share/model/vmaf_float_v0.6.1.json (which is the default and can be omitted).

Note about the model path on Windows: because Windows does not have a good default location to pull the VMAF model from, you will always need to specify model_path when calling libvmaf through FFmpeg. You will also need to be careful about the path you pass to model_path.

See FFmpeg's guide to libvmaf, the FFmpeg Filtering Guide for more examples of complex filters, and the Scaling Guide for information about scaling and using different scaling algorithms.
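A command along these lines puts the pieces above together. This is a sketch reconstructed from the surrounding description, not a verbatim copy of the original (which appears to have been an image); exact libvmaf option names can vary between FFmpeg versions (newer builds use `model=path=...` instead of `model_path`):

```shell
# Compare the distorted clip (first input) against the reference (second input).
# Raw .yuv files carry no header, so -video_size, -pixel_format, and -r must be
# given explicitly, and each must appear BEFORE the -i it applies to.
# setpts=PTS-STARTPTS rebases both streams so they start at PTS 0.
ffmpeg \
  -video_size 576x324 -pixel_format yuv420p -r 24 -i src01_hrc01_576x324.yuv \
  -video_size 576x324 -pixel_format yuv420p -r 24 -i src01_hrc00_576x324.yuv \
  -lavfi "[0:v]setpts=PTS-STARTPTS[dist];[1:v]setpts=PTS-STARTPTS[ref];[dist][ref]libvmaf=model_path=/usr/local/share/model/vmaf_float_v0.6.1.json:log_path=/dev/stdout" \
  -f null -
```

`-f null -` discards the output video, since we only want the VMAF score printed to /dev/stdout; because the model path shown is the default on Linux/macOS, the `model_path=...` part can be dropped there.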
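To illustrate the Windows caveat mentioned above: inside an FFmpeg filtergraph, `:` separates filter options, so the colon in a Windows drive letter must be escaped or the parser will cut the path short. A hedged sketch, assuming a hypothetical model location of `C:/vmaf/model/vmaf_v0.6.1.json` (the file names here are placeholders, not from the original text):

```shell
# Hypothetical Windows invocation: the drive-letter colon is escaped as \\:
# so the filtergraph parser does not treat it as an option separator.
ffmpeg -i distorted.mp4 -i reference.mp4 \
  -lavfi "libvmaf=model_path=C\\:/vmaf/model/vmaf_v0.6.1.json:log_path=vmaf.json" \
  -f null -
```

Forward slashes in the rest of the path are accepted by FFmpeg on Windows and avoid a second layer of backslash escaping; the exact number of backslashes needed can also depend on which shell (cmd vs PowerShell) is quoting the argument.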