On Sunday, 13 August 2017 at 08:13:56 UTC, amfvcg wrote:
On Sunday, 13 August 2017 at 08:00:53 UTC, Daniel Kozak wrote:
My second version on ldc takes 380 ms and the C++ version on the same compiler (clang) takes 350 ms, so it seems to be almost the same.


Ok, on ideone (ldc 1.1.0) it times out, and on dpaste (ldc 0.12.0) it gets killed.
What version are you using?

Either way, if that's the case, that's slick (and ldc would be the compiler of choice for real use cases).

Here are my results:

$ uname -sri
Linux 4.10.0-28-generic x86_64

$ lscpu | grep 'Model name'
Model name:            Intel(R) Core(TM) i7-3770K CPU @ 3.50GHz

$ ldc2 --version | head -n5
LDC - the LLVM D compiler (1.3.0):
  based on DMD v2.073.2 and LLVM 4.0.0
  built with LDC - the LLVM D compiler (1.3.0)
  Default target: x86_64-unknown-linux-gnu
  Host CPU: ivybridge

$ g++ --version | head -n1
g++ (Ubuntu 6.3.0-12ubuntu2) 6.3.0 20170406

$ ldc2 -O3 --release sum_subranges.d
$ ./sum_subranges
378 ms, 556 μs, and 9 hnsecs
50000000
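
The D timing line above is a core.time.Duration printed with its default formatting. A minimal sketch of what such a harness might look like (the real sum_subranges.d isn't reproduced in this post, so computeSumSubranges below is a hypothetical stand-in):

import core.time : MonoTime;
import std.stdio : writeln;

// Hypothetical stand-in for the real benchmark kernel; the actual
// sum_subranges.d is not shown in this post.
long computeSumSubranges()
{
    long s = 0;
    foreach (i; 0 .. 50_000_000)
        s += 1;
    return s;
}

void main()
{
    auto start = MonoTime.currTime;
    const total = computeSumSubranges();
    // Subtracting two MonoTimes yields a core.time.Duration, whose
    // default formatting looks like "378 ms, 556 μs, and 9 hnsecs".
    writeln(MonoTime.currTime - start);
    writeln(total); // checksum of the work, e.g. 50000000
}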

$ g++ -O5 sum_subranges.cpp -o sum_subranges
$ ./sum_subranges
237135
50000000
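
If the C++ program prints its elapsed time as a raw microsecond count (its source isn't shown here, so that's an assumption on my part), 237135 would correspond to roughly 237 ms, i.e. faster than the LDC build above. A quick way to render it in the same Duration format for an apples-to-apples comparison:

import core.time : usecs;
import std.stdio : writeln;

void main()
{
    // Assuming 237135 is a microsecond count from the C++ run,
    // print it as a Duration: "237 ms and 135 μs".
    writeln(237135.usecs);
}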
