Hello,

I am upgrading from DMD 2.069.2 to DMD 2.076.1 (similar results on 2.075.1), and I am seeing a huge increase in unittest compilation time when the -deps parameter is also passed to dmd. This happens on both OSX and Linux. What could be causing this?

Sample program:

    import std.stdio: writeln;

    unittest { writeln("TestUT"); }

    version (unittest) {}
    else
    {
        void main()
        {
            writeln("TestMain");
        }
    }

Observations:

Command: time dmd -deps=test.dep -c -o- test.d -de -w -m64 -color -g -debug -gs -unittest -main
Linux runtime: user 0m28.192s    << Note the increase
OSX runtime: user 0m48.508s     <<
Linux 2.069.1 runtime: user 0m0.009s

Command: time dmd -c -o- test.d -de -w -m64 -color -g -debug -gs -unittest -main
Linux runtime: user 0m0.064s
OSX runtime: user 0m0.090s
Linux 2.069.1 runtime: user 0m0.005s

Command: time dmd -deps=test.dep -c -o- test.d -de -w -m64 -color -g -debug -gs
Linux runtime: user 0m0.584s
OSX runtime: user 0m0.882s
Linux 2.069.1 runtime: user 0m0.007s

Command: time dmd -c -o- test.d -de -w -m64 -color -g -debug -gs
Linux runtime: user 0m0.048s
OSX runtime: user 0m0.074s
Linux 2.069.1 runtime: user 0m0.010s
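For anyone who wants to reproduce the comparison, a small bash harness along these lines can time the four variants in one go (the `bench` helper is hypothetical, not part of dmd; flags are taken from the commands above, and it falls back to timing a no-op when dmd is not installed so the harness itself can be sanity-checked):

```shell
#!/usr/bin/env bash
# bench: run a command under bash's `time` keyword and report its user time.
bench() {
    local desc="$1"; shift
    local t
    # `time` writes to the shell's stderr; capture it and pull the user line.
    t=$( { time "$@" >/dev/null 2>&1; } 2>&1 | awk '/^user/ {print $2}' )
    echo "$desc: user $t"
}

if command -v dmd >/dev/null 2>&1; then
    bench "unittest, with -deps"    dmd -deps=test.dep -c -o- test.d -de -w -m64 -g -debug -gs -unittest -main
    bench "unittest, no -deps"      dmd -c -o- test.d -de -w -m64 -g -debug -gs -unittest -main
    bench "no unittest, with -deps" dmd -deps=test.dep -c -o- test.d -de -w -m64 -g -debug -gs
    bench "no unittest, no -deps"   dmd -c -o- test.d -de -w -m64 -g -debug -gs
else
    bench "dry run (dmd not found)" true
fi
```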

Environment:
OSX: El Capitan 10.11.6
Linux with DMD 2.076.1: Gentoo 4.9.34
Linux with DMD 2.069.1: Centos 7
