On Sat, 2 Apr 2016, Mark Wedel wrote:

I would hope that there are not any such assumptions (the relevant sections of code could just look at the maps and see how big they are), but that of

It was several months ago that I had a look at a lot of that code. I thought I saw potential issues in a few places but I could be wrong. In any case I decided to stick to 50x50 and avoid any potential issues.

One of the main issues was that opening such files required a popen (instead of fopen), which littered the code with checks for whether a file was compressed; the code also had to record whether the file was compressed (so that when it saved it, it saved it compressed again), and had to deal with many different compression methods (compress, gzip, bzip2, now xz, etc.). That support was removed to make the code cleaner, which is a good thing, and at the time the size of maps (and other data) wasn't large enough to be a concern.

Also, at the time, it allowed all files to be compressed (archetypes, player files, etc.). Allowing it only on map files would certainly limit the number of places that code would be needed. Other simplifying assumptions could be made: for example, if compression is enabled in configure, assume all map files opened are compressed and save all map files compressed (so there is no need to record, at read time, whether a file was compressed and what it was compressed with).

What I was thinking about was quite a bit simpler: try to open the uncompressed map file as normal; if that fails, try the same path in the same directory with an .xz extension, compressed with the xz algorithm (or substitute another similar compression algorithm to taste), while keeping all temp files uncompressed.
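To illustrate the fallback, here is a minimal sketch in Python. The real server is written in C, so the function name and exact behaviour here are assumptions of mine, not actual Crossfire code; the point is only that the caller gets back an ordinary file object either way and never needs to know which variant existed on disk.

```python
import lzma
import os

def open_map(path):
    """Open a map file, falling back to an xz-compressed copy.

    Hypothetical sketch: try the plain path first, then path + ".xz".
    lzma.open() transparently decompresses on read, so the map loader
    sees the same text stream in both cases.
    """
    if os.path.exists(path):
        return open(path, "rt")
    compressed = path + ".xz"
    if os.path.exists(compressed):
        return lzma.open(compressed, "rt")
    raise FileNotFoundError(path)
```

Saving would be the mirror image: write the temp file uncompressed, then compress it into place only if the compressed variant is what existed before.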

The editor would need to know about that too of course.

Fair point, but you are sort of an edge case on this, which is to say, a feature really only one person would probably use.

If 1000x1000 maps became standard in the game (or at least a supported add-on) it could be common.

I wonder if it is possible to do it with a plugin using Mapload or Mapenter:

http://wiki.cross-fire.org/dokuwiki/doku.php/server_plugin?s[]=events#hooking_to_global_events

If so, the uncompressed map could be cleaned up by Mapunload or Mapreset.
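The plugin idea could look roughly like the sketch below. The actual Crossfire plugin API is C and I haven't checked its exact hook signatures, so the function names and the idea of hooks receiving a map path are assumptions; only the event names (Mapload, Mapunload, Mapreset) come from the wiki page above.

```python
import lzma
import os
import shutil

def on_map_load(map_path):
    """Hypothetical Mapload hook: if only a compressed copy exists,
    decompress it next to itself so the server's normal file-open
    path works unchanged."""
    if not os.path.exists(map_path) and os.path.exists(map_path + ".xz"):
        with lzma.open(map_path + ".xz", "rb") as src, \
                open(map_path, "wb") as dst:
            shutil.copyfileobj(src, dst)

def on_map_unload(map_path):
    """Hypothetical Mapunload/Mapreset hook: remove the temporary
    uncompressed copy, keeping the .xz file as the copy of record."""
    if os.path.exists(map_path + ".xz") and os.path.exists(map_path):
        os.remove(map_path)
```

The attraction is that the core server stays untouched: it only ever sees uncompressed files, and the plugin manages their lifetime around the map load/unload events.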

ZFS on Solaris. I've not looked at the state of other filesystems on Linux; I would have thought there would be other transparent compression options for Linux, but I could be wrong.

Linux is trailing FreeBSD and Solaris in that regard. BTRFS is the filesystem of the future (and always will be ;) ).
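For the archives, transparent compression needs no server changes at all; it is just filesystem configuration. The dataset and mount point names below are made up for illustration.

```shell
# ZFS (Solaris/FreeBSD/ZFS-on-Linux): compress everything under the
# dataset holding the maps. Existing files stay as-is; new writes
# are compressed.
zfs set compression=lz4 tank/crossfire/maps

# Btrfs: request compression at mount time.
mount -o compress=zstd /dev/sdb1 /var/games/crossfire
```

Highly repetitive map files tend to compress very well, so either option would shrink a mega map considerably with zero code impact.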

Back to the mega map, one thing that was thought of back in the past was to unify the scale.

When I was considering what to do with my new world a few months ago, I went through the archives and found a discussion on this topic. I felt there were some good arguments against having buildings at the same scale as the bigworld map; in particular, there was a concern about characters' ability to see the buildings properly. That seemed like a strong argument to me.

I like the flexibility that the current 2 scale approach allows.

Cheers,

Rob

--
Email: rob...@timetraveller.org         Linux counter ID #16440
IRC: Solver (OFTC, Freenode and Snoonet)
Web: http://www.pracops.com
I tried to change the world but they had a no-return policy
_______________________________________________
crossfire mailing list
crossfire@metalforge.org
http://mailman.metalforge.org/mailman/listinfo/crossfire