**C++ stuff**

In order to test the C++ version:

  1. The code can be downloaded from 
[here](https://github.com/IFeelBloated/test_c_filters).
  2. I compiled it with:
    
        $ g++ -Wall -O3 -shared -fPIC -I. -o libfilter.so GaussBlur.cxx

  3. Create a VapourSynth Python script, e.g. test_filter.vpy:
    
    import vapoursynth as vs
    core = vs.get_core()
    core.std.LoadPlugin(path='./libfilter.so')
    core.std.SetMaxCPU('none')
    clip = core.std.BlankClip(format=vs.GRAYS, length=100000, fpsnum=24000, fpsden=1001, keep=True)
    clip = core.testc.GaussBlur(clip)
    clip.set_output()
    

  4. Run the test:
    
    
    $ vspipe test_filter.vpy /dev/null
    Output 100000 frames in 29.53 seconds (3386.27 fps)
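As a sanity check, the reported fps is just frames divided by elapsed seconds; the small gap versus vspipe's 3386.27 is presumably because vspipe prints the seconds rounded to two decimals while computing fps from the unrounded elapsed time:

```python
# fps = frames / elapsed seconds
frames = 100000
seconds = 29.53  # as printed (rounded); vspipe uses the unrounded value internally
print(round(frames / seconds, 2))  # 3386.39, close to the reported 3386.27
```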

So I am getting about 3386 fps with the pure C++ version, while I am getting 
about 1200 fps with the Nim version (roughly a 2.8x difference).

I am not comparing apples to apples here:

  1. The Nim version uses int32 while the C++ version uses float.
  2. The Nim version is not using getFrameAsync, so it is not taking advantage 
of multithreading as discussed 
[here](https://github.com/vapoursynth/vapoursynth/issues/551#issuecomment-610473633).
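For reference, the getFrameAsync pattern discussed in that issue can be sketched in plain Python. This is only an illustration of the request pipeline, not the VapourSynth API: `render_frame` is a dummy stand-in for the filter work, and `get_frame_async` here is a hypothetical wrapper (in VapourSynth's Python binding the real call is `clip.get_frame_async(n)`, which returns a future). The point is to keep several frame requests in flight so worker threads stay busy while results are consumed in order:

```python
from concurrent.futures import ThreadPoolExecutor

executor = ThreadPoolExecutor(max_workers=4)

def render_frame(n):
    # dummy per-frame work standing in for the actual filter
    return sum(i * n for i in range(1000))

def get_frame_async(n):
    # stand-in for clip.get_frame_async(n): returns a future, does not block
    return executor.submit(render_frame, n)

def drain(num_frames, in_flight=8):
    # keep up to `in_flight` requests pending, consume results in order
    futures = [get_frame_async(n) for n in range(min(in_flight, num_frames))]
    next_n = len(futures)
    results = []
    while futures:
        results.append(futures.pop(0).result())
        if next_n < num_frames:
            futures.append(get_frame_async(next_n))
            next_n += 1
    return results

frames = drain(100)
print(len(frames))  # 100
```

A benchmark script that requests frames one at a time with a blocking getFrame never has more than one frame in flight, so the core's worker threads sit idle regardless of how fast the filter itself is.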

