Hello, today I found that top-level defines have a significant performance impact on Guile (2.2.3). The following program takes about 108 seconds to complete on my ThinkPad (an i5-5200U running Arch Linux):
(define node cons)
(define node-left car)
(define node-right cdr)

(define (make d)
  (if (= d 0)
      (node #f #f)
      (let ([d2 (1- d)])
        (node (make d2) (make d2)))))

(define (check t)
  (if (node-left t)
      (+ 1 (check (node-left t)) (check (node-right t)))
      1))

(define (main n)
  (define min-depth 4)
  (define max-depth (max (+ min-depth 2) n))
  (define stretch-depth (1+ max-depth))
  (format #t "stretch tree of depth ~a\t check: ~a\n"
          stretch-depth (check (make stretch-depth)))
  (let ([long-lived-tree (make max-depth)])
    (do ([d 4 (+ d 2)])
        ([not (< d (1+ max-depth))])
      (let ([iterations (ash 1 (+ (- max-depth d) min-depth))])
        (format #t "~a\t trees of depth ~a\t check: ~a\n"
                iterations d
                (let sum ([i iterations] [n 0])
                  (if (zero? i)
                      n
                      (sum (1- i) (+ n (check (make d)))))))))
    (format #t "long lived tree of depth ~a\t check: ~a\n"
            max-depth (check long-lived-tree))))

(main 21)

By simply wrapping that code in a lambda, the program finishes in about 47 seconds. Using lets instead of defines is equally effective. I was quite surprised, because I had initially assumed some optimization would just substitute those trivial node aliases away, but it seems that's not the case... Cheers!
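For reference, here is a minimal sketch of the lambda workaround I mean: the entire program body goes inside an immediately invoked thunk, so every define becomes an internal (lexical) definition instead of a top-level binding. The elided part is just the rest of the program above, unchanged.

(;; Wrap the whole program in a thunk and call it immediately,
 ;; turning top-level defines into internal definitions.
 (lambda ()
   (define node cons)
   (define node-left car)
   (define node-right cdr)
   ;; ... make, check, and main exactly as above ...
   (main 21)))

Defining everything inside one big (let () ...) body has the same effect, since internal defines in a body are equivalent to a letrec*.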