I have a medium-sized daemon application that uses several threads, libasync, and daemonize. On Windows it runs correctly with the GC enabled, but on Linux the GC causes a deadlock while allocating memory.
Adding core.memory.GC.disable; to main makes the application work correctly (and quickly, until it runs out of memory :D ). I am running in a Vagrant VM for testing, but that should not be the issue. I am developing this for a future product, so I won't have control over the environment it runs in. I am not sure how to debug or work around this issue.

My boss is a little concerned that I am using D, but he does think it's cool. I really want to make this stable and get D into one of our products; if not, I will have to switch to C++ or, even worse, Java! Any help would be appreciated.

-Byron

System:
DMD 2.066.1
Linux vagrant-ubuntu-precise-64 3.2.0-76-virtual #111-Ubuntu SMP Tue Jan 13 22:33:42 UTC 2015 x86_64 x86_64 x86_64 GNU/Linux
DISTRIB_ID=Ubuntu
DISTRIB_RELEASE=12.04
DISTRIB_CODENAME=precise
DISTRIB_DESCRIPTION="Ubuntu 12.04.5 LTS"
NAME="Ubuntu"
VERSION="12.04.5 LTS, Precise Pangolin"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu precise (12.04.5 LTS)"
VERSION_ID="12.04"
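For reference, this is roughly what my workaround looks like (a minimal sketch, not my actual main; the manual-collection part at the bottom is just an idea I am considering, assuming I can find a point where all threads are quiescent):

```d
import core.memory : GC;

void main()
{
    // Workaround: turn off GC collection passes entirely.
    // Allocations still succeed, but nothing is ever freed,
    // so memory grows until the process runs out.
    GC.disable();

    // ... spawn worker threads, start the libasync event loop,
    //     daemonize, etc. ...

    // Possible mitigation (untested): at a known-safe point where
    // no other thread is allocating, briefly re-enable the GC and
    // force a collection, then disable it again:
    //
    //     GC.enable();
    //     GC.collect();
    //     GC.disable();
}
```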