On Thu, 20 Aug 2020 at 16:53, Albert Astals Cid <[email protected]> wrote:
>
> What's your reason for needing such huge images?

My main point is that they are not *that* huge. Even the [largest
digital photo in 1999] was bigger.

Essentially, it's not unreasonable to have a working set larger than
2GiB; with some sample inputs, my final output is then only ~10MiB as
a PNG. Isn't it fine for the allocator to say you can't have N bytes
because the machine can't provide them, rather than because N doesn't
fit in an int32?
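
For concreteness, here's a minimal sketch of the distinction I mean
(my own toy example, not poppler code; it assumes a 64-bit size_t):

    #include <cstdint>
    #include <cstdio>
    #include <cstdlib>

    int main() {
        // A 40000x40000 RGBA render: ~6.4e9 bytes of pixel data.
        int width = 40000, height = 40000, bpp = 4;

        // width * height * bpp wraps if evaluated in 32-bit int
        // arithmetic, so do the size computation in 64 bits...
        uint64_t size = uint64_t(width) * uint64_t(height) * uint64_t(bpp);

        // ...then let the allocator answer for the machine: malloc
        // returns a buffer if the memory is there and NULL if not.
        // Either way, the limit is the machine, not int32.
        void *buf = std::malloc(size);
        std::printf("%llu bytes: %s\n", (unsigned long long)size,
                    buf ? "allocated" : "machine can't provide it");
        std::free(buf);
        return 0;
    }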

On Thu, 20 Aug 2020 at 22:28, Albert Astals Cid <[email protected]> wrote:
>
> You will always hit number limits at some point, my suggestion is to change 
> pdftoppm to just render tiles and then stitch it together if they don't fit 
> the current size limits.

In practice I am already tiling above the level of pdftoppm (see the
sketch below), but during development I wanted the flexibility to
pick the tiling parameters and find the best combination. The tiles
are likely to be smaller in deployment, say 8192x8192.
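
The driver is essentially a loop over pdftoppm's -x/-y/-W/-H crop
options, something like this sketch (the file names, the 65536-pixel
full size, and the DPI are made up; the stitching step is separate):

    #include <algorithm>
    #include <cstdlib>
    #include <string>

    // Sketch of a tile driver: render page 1 of input.pdf as a grid
    // of 8192x8192 tiles, one pdftoppm invocation per tile, so no
    // single pixel buffer has to exceed the current size limits.
    int main() {
        const int tile = 8192;    // tile edge in pixels
        const int fullW = 65536;  // full render width at this -r (made up)
        const int fullH = 65536;  // full render height (made up)
        const int dpi = 4800;     // resolution giving fullW x fullH (made up)

        for (int y = 0; y < fullH; y += tile) {
            for (int x = 0; x < fullW; x += tile) {
                std::string cmd =
                    "pdftoppm -png -f 1 -l 1 -r " + std::to_string(dpi) +
                    " -x " + std::to_string(x) +
                    " -y " + std::to_string(y) +
                    " -W " + std::to_string(std::min(tile, fullW - x)) +
                    " -H " + std::to_string(std::min(tile, fullH - y)) +
                    " input.pdf tile_" + std::to_string(x) +
                    "_" + std::to_string(y);
                if (std::system(cmd.c_str()) != 0)
                    return 1;  // propagate a failed render
            }
        }
        return 0;
    }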

Martin


[largest digital photo in 1999]: 1.7 gigapixels, or a bit under 5GiB
of pixel data (1.7e9 pixels at 3 bytes/pixel, assuming 24-bit RGB, is
~5.1e9 bytes, about 4.7GiB)
https://en.wikipedia.org/wiki/List_of_largest_photographs#Portrait_of_a_Coral_Reef_(1999)