My spreadsheet for the netlogon.log files was huge: IIRC, 2.2 million entries. Excel 2010 x64 sorted it faster than Don Ely can chug a beer. Then it was a simple mouse click to remove all duplicates. That left me, or rather the customer, with just over 9000 unique IP addresses. They had some work to do!
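For anyone without Excel handy, the same sort-and-remove-duplicates step can be sketched in a few lines of Python. This is a hypothetical sketch: the log line layout shown in the comments is an assumption (real netlogon.log NO_CLIENT_SITE lines vary by Windows version), and the regex simply pulls anything shaped like an IPv4 address.

```python
import re

# Assumed netlogon.log line shape (an assumption, adjust to your logs):
#   "05/23 14:03:05 EXAMPLE: NO_CLIENT_SITE: HOST1 10.1.2.3"
IP_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def unique_ips(lines):
    """Return the sorted set of IPv4-looking addresses found in the lines."""
    found = set()
    for line in lines:
        found.update(IP_RE.findall(line))
    return sorted(found)

sample = [
    "05/23 14:03:05 EXAMPLE: NO_CLIENT_SITE: HOST1 10.1.2.3",
    "05/23 14:03:06 EXAMPLE: NO_CLIENT_SITE: HOST2 10.1.2.4",
    "05/23 14:03:07 EXAMPLE: NO_CLIENT_SITE: HOST1 10.1.2.3",  # duplicate
]
print(unique_ips(sample))  # ['10.1.2.3', '10.1.2.4']
```

Feeding it all five log files is just a loop over `open(path)` per file; the `set` does the deduplication that Excel's "Remove Duplicates" button did.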
Total time to import the 5 log files, sort, and get the unique values was less than a minute. How long it took them to fix Sites and Services and reverse DNS, I have no idea.

Sent from my Verizon Wireless Phone

Ben Scott wrote:

On Mon, Jan 23, 2012 at 2:28 PM, Michael B. Smith <[email protected]> wrote:
>> I can honestly say that I never once in my life before now thought
>> that 2 gigabytes would ever be a practical limit in the world of
>> spreadsheets. :-)
>
> Oh my - you'd be so wrong. :-P

Apparently. :) Although someone else's example of millions of lines of log file was revealing. Me, I'd tend to view that as a database problem, not a spreadsheet problem. I generally see spreadsheets as a math/formulas solution. But (ab)using spreadsheets instead of databases is a tradition at least as old as the spreadsheet, so I should have seen that coming.

> That's kinda like saying "640 KB is enough for anybody" ...

My reaction was specifically to a spreadsheet that big, not the plain data size. There are lots of other cases where I would find that much memory entirely justified. Databases, for example. ;-)

--
Ben

~ Finally, powerful endpoint security that ISN'T a resource hog! ~
~ <http://www.sunbeltsoftware.com/Business/VIPRE-Enterprise/> ~
---
To manage subscriptions click here: http://lyris.sunbelt-software.com/read/my_forums/
or send an email to [email protected] with the body: unsubscribe ntsysadmin
