The "unlimited" big address space is the main reason to move to 64-bit. However, 64-bit data takes twice as much space in the computer's memory. I have found 6GB of memory sufficient, but sometimes I also dream of a big "infinite" RAM to really test some "unlimited memory" algorithms.
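The doubling is easy to observe from the pointer width of the running interpreter; here is a minimal check, in Python rather than J, purely as an illustration:

```python
import struct
import sys

# A pointer is 8 bytes on a 64-bit build and 4 bytes on a 32-bit one,
# so pointer-heavy (boxed) data roughly doubles in size when moving to 64-bit.
ptr_bytes = struct.calcsize("P")
print(f"{8 * ptr_bytes}-bit build; largest addressable index: {sys.maxsize}")
```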
I have used mapped files to store and compute on big arrays (150000 * 2000 tables). On a dual-core computer you can have two J processes manipulate the same mapped file (at different positions, of course) and get 100% out of your computer.

The OS is also important. I use XP64, and sometimes it takes a long time to allocate the memory. I have sometimes wondered how Windows Server 64 or a 64-bit UNIX would have handled the work. There must be differences, but there is very little information out there.

64-bit J computation has been very successful for me. However, using the same computer as a personal workstation has caused some problems, because not all 32-bit things work on a 64-bit platform.

Anssi

At 09:06 30.9.2009, Alex Rufon wrote:
>Hi guys,
>
>We're doing the CAPEX for 2010 and I was hoping to get a 64-bit
>machine for my J Application Server.
>
>Right now, I've only come up with the following justifications:
>
>1. Allows J to allocate objects greater than 1GB.
>
>2. Allows J to process files greater than 2GB.
>
>I'm hoping somebody out there has more. ;)
>
>Thanks.
>
>r/Alex
>
>----------------------------------------------------------------------
>For information about J forums see http://www.jsoftware.com/forums.htm
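The shared-mapped-file setup described above can be sketched roughly as follows. This is Python rather than J, the file name is made up, and the table is scaled down (the real 150000 * 2000 table of 8-byte items would be about 2.4 GB on disk); the point is only that two workers can write disjoint regions of one mapped file without locking:

```python
import mmap

# Hypothetical backing file; dimensions scaled down from 150000 x 2000.
FILE = "bigtable.dat"
ROWS, COLS, ITEM = 150, 20, 8
SIZE = ROWS * COLS * ITEM

# Create the backing file at its full size up front, like a mapped array.
with open(FILE, "wb") as f:
    f.truncate(SIZE)

def fill(offset, length, byte):
    """Map the shared file and write one region.

    In real use, each of two worker processes would call this on its own
    half of the file; no locking is needed because the regions are disjoint.
    """
    with open(FILE, "r+b") as f:
        with mmap.mmap(f.fileno(), SIZE) as mm:
            mm[offset:offset + length] = bytes([byte]) * length
            mm.flush()

# Simulate the two workers in one process to keep the sketch runnable.
half = SIZE // 2
fill(0, half, 1)
fill(half, SIZE - half, 2)
```

On the dual-core machine from the post, each call would instead run in its own J (or OS) process, and the kernel's page cache makes both see the same file bytes.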
