The image output looks correct to me.
-bborie
On 10/25/2012 10:43 PM, Mahavir Trivedi wrote:
hi, I split the image (TIFF) into 100 x 100 tiles (raster image, 500 MB
input), but a problem occurred when I exported it: the output size increased.
(image input size = 1591 x 1446)
(image output size = 1600 x 1450)
...@postgis.refractions.net] On Behalf Of Bborie Park
Sent: Friday, October 26, 2012 8:26 AM
To: PostGIS Users Discussion
Subject: Re: [postgis-users] out of memory
The output size makes sense since the loader split the input raster
into 100x100 tiles.
1591 / 100 = 15.91 ... 16 x 100
1446 / 100 = 14.46 ... 15 x 100
So, when unioning the tiles back together, the unioned raster should
be 1600 x 1500 (don't know where you got 1450 though).
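The expected dimensions can be verified directly in the database. A minimal sketch, assuming the tiles were loaded into a hypothetical table `mytiles` with raster column `rast` (names are placeholders, not from the thread):

```sql
-- Union the 100x100 tiles back into one raster and report its size.
-- With a 16 x 15 tile grid the result should be 1600 x 1500 pixels.
SELECT ST_Width(u.rast)  AS width,
       ST_Height(u.rast) AS height
FROM (SELECT ST_Union(rast) AS rast FROM mytiles) AS u;
```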
-bborie
You may want to increase shared_buffers to 25% of your system's
available memory.
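For a 4 GB machine that guideline works out to roughly 1 GB. A sketch of the relevant postgresql.conf setting (the value is illustrative, not from the thread):

```
# postgresql.conf -- shared_buffers at ~25% of 4 GB RAM
shared_buffers = 1GB
```

A restart of the PostgreSQL service is needed for shared_buffers changes to take effect.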
You really should tile your raster.
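Tiling can be done at load time with raster2pgsql's `-t` switch. A hedged sketch, assuming a hypothetical input file `image.tif`, target table `mytiles`, and database `mydb` (the SRID and all names are placeholders):

```
raster2pgsql -s 4326 -t 100x100 -I image.tif mytiles | psql -d mydb
```

Here `-t 100x100` cuts the raster into 100 x 100 pixel tiles and `-I` builds a spatial index on the resulting table.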
-bborie
On 10/25/2012 02:19 AM, Mahavir Trivedi wrote:
I have 4 GB RAM, Windows XP 64-bit.
shared_buffers: 512 MB, checkpoint_segments: 32
hi, I split the image (TIFF) into 100 x 100 tiles (raster image, 500 MB
input), but a problem occurred when I exported it: the output size increased.
(image input size = 1591 x 1446)
(image output size = 1600 x 1450)
Can I change the block size on the server? If yes, how?
My system is Windows XP.
This should be fixed when 1.3.4 comes out. An rc2 that you can try will
be out tomorrow.
P.
On Tue, Oct 21, 2008 at 3:28 PM, John Zhang [EMAIL PROTECTED] wrote:
Hello list,
I am writing to seek your input on how to handle such an issue:
I have a large table containing over 3 million polygons and
Hi John,
I am not sure that this is the exact same problem I was having with
ST_Within(), but I suspect it could be. Hopefully this will be fixed
by some changes that are in the next release. Anyway, for a possible
workaround, see this thread:
On 2008 at 18:31, Kevin Neufeld [EMAIL PROTECTED] wrote:
Actually, 21000 is a very small table. There are spatial tables out
there with 1/2 billion rows.
You shouldn't be getting an out of memory error on such a small
table.
What happens if you try to restructure your query, like:
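The suggested query is cut off in this archive. One common restructuring from that era was to let the bounding-box operator `&&` do the coarse filtering through the spatial index before the exact predicate runs; a hypothetical sketch (table and column names are placeholders, not from the thread):

```sql
-- Index-assisted bounding-box filter first, exact test on the survivors.
SELECT a.id
FROM polys AS a, areas AS b
WHERE a.geom && b.geom              -- cheap bbox check via GiST index
  AND ST_Within(a.geom, b.geom);    -- exact containment test
```

In modern PostGIS the `&&` step is implied by ST_Within itself, but spelling it out was a standard workaround for planner and memory problems at the time.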