Re: [postgis-users] out of memory

2012-10-29 Thread Bborie Park
The image output looks correct to me.

-bborie

On 10/25/2012 10:43 PM, Mahavir Trivedi wrote:
 Hi, I split the image (TIFF) into 100 x 100 tiles (raster image, 500 MB input), but a problem occurred when I exported it: the output size increased.
 
 (image input size = 1591 x 1446)
 (image output size = 1600 x 1450)
 
 Can I change the block size of the server? If yes, then how?
 My system: Windows XP 64-bit, 4 GB RAM.
 
 Thanks,
 Mahavir
 
 
 
 ___
 postgis-users mailing list
 postgis-users@postgis.refractions.net
 http://postgis.refractions.net/mailman/listinfo/postgis-users
 

-- 
Bborie Park
Programmer
Center for Vectorborne Diseases
UC Davis
530-752-8380
bkp...@ucdavis.edu


Re: [postgis-users] out of memory

2012-10-29 Thread Bborie Park
Yes it can... someday.

-bborie

On 10/26/2012 06:55 AM, Pierre Racine wrote:
 This could be solved by ticket #826
 
 http://trac.osgeo.org/postgis/ticket/826
 
 Pierre
 
 



Re: [postgis-users] out of memory

2012-10-26 Thread Bborie Park
The output size makes sense, since the loader split the input raster
into 100x100 tiles:

1591 / 100 = 15.91, rounded up to 16 tiles ... 16 x 100 = 1600

1446 / 100 = 14.46, rounded up to 15 tiles ... 15 x 100 = 1500

So, when unioning the tiles back together, the unioned raster should
be 1600 x 1500 (I don't know where you got 1450, though).
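The rounding above can be sketched with a tiny helper (illustrative Python, not part of the PostGIS loader):

```python
import math

def padded_size(pixels: int, tile: int) -> int:
    """Size after splitting into fixed tiles: tile count rounded up, times tile size."""
    return math.ceil(pixels / tile) * tile

# 1591 x 1446 input, split into 100 x 100 tiles
print(padded_size(1591, 100), padded_size(1446, 100))  # 1600 1500
```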

-bborie

On Thu, Oct 25, 2012 at 10:43 PM, Mahavir Trivedi
mahavir.triv...@gmail.com wrote:
 Hi, I split the image (TIFF) into 100 x 100 tiles (raster image, 500 MB input), but a problem occurred when I exported it: the output size increased.

 (image input size = 1591 x 1446)
 (image output size = 1600 x 1450)

 Can I change the block size of the server? If yes, then how?
 My system: Windows XP 64-bit, 4 GB RAM.

 Thanks,
 Mahavir







Re: [postgis-users] out of memory

2012-10-26 Thread Pierre Racine
This could be solved by ticket #826

http://trac.osgeo.org/postgis/ticket/826

Pierre



Re: [postgis-users] out of memory

2012-10-25 Thread Bborie Park
You may want to increase shared_buffers to 25% of your system's
available memory.

You really should also tile your raster.
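As a rough illustration of that guideline on a 4 GB machine, the relevant postgresql.conf lines might look like this (illustrative values only; tune for your own workload):

```ini
# postgresql.conf -- illustrative settings for a 4 GB machine
shared_buffers = 1GB          # ~25% of system RAM, per the advice above
work_mem = 16MB               # per-operation sort/hash memory
checkpoint_segments = 32      # as already set in the original post
```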

-bborie

On 10/25/2012 02:19 AM, Mahavir Trivedi wrote:
 I have 4 GB RAM, Windows XP 64-bit.
 
 shared_buffers: 512 MB, checkpoint_segments: 32
 
 work_mem: 5 MB
 
 I wish to import a 540 MB TIFF image into a PostGIS 2.0 database, without tiling.
 
 I successfully generated the test.sql file, but an error occurred when running psql -d dbname -f test.sql:
 
 error: String is 978678801 bytes too long for encoding conversion
 
 Please suggest a suitable solution.
 
 With thanks,
 Mahavir
 



Re: [postgis-users] out of memory

2012-10-25 Thread Mahavir Trivedi
Hi, I split the image (TIFF) into 100 x 100 tiles (raster image, 500 MB input), but a problem occurred when I exported it: the output size increased.

(image input size = 1591 x 1446)
(image output size = 1600 x 1450)

Can I change the block size of the server? If yes, then how?
My system: Windows XP 64-bit, 4 GB RAM.

Thanks,
Mahavir


Re: [postgis-users] Out of Memory problem for large table by ST_Contains(..)

2008-10-22 Thread Paul Ramsey
This should be fixed when 1.3.4 comes out. An rc2 that you can try
will be out tomorrow.

P.

On Tue, Oct 21, 2008 at 3:28 PM, John Zhang [EMAIL PROTECTED] wrote:
 Hello list,

 I am writing to seek your input on how to handle the following issue:

 I have a large table containing over 3 million polygons and a small table
 containing 53k points. My task is to determine whether each point in the
 point table is contained by a polygon in the polygon table. The ST_Contains
 function works well for this purpose: it takes about 2 seconds for a given
 known polygon, using polyG && ptG AND ST_Contains(polyG, ptG), where
 polyG and ptG are the geometries of the polygon and the point. However, for a
 given point, running through all the polygons crashes with the message
 Out of memory for query after about 300 seconds. I then tested select
 count(*) on the polygon table, and it takes 100 seconds as well. It seems
 something is wrong in the database configuration, but I could not figure out
 what. Could anyone help with this issue? Any input would be much
 appreciated.

 Thanks in advance.
 John
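The two-stage filter in that query, a cheap bounding-box check (what the && operator does) followed by the exact containment test, can be sketched in plain Python. This is a toy stand-in for illustration, not PostGIS code:

```python
def bbox(poly):
    """Axis-aligned bounding box of a polygon given as [(x, y), ...]."""
    xs = [x for x, _ in poly]
    ys = [y for _, y in poly]
    return (min(xs), min(ys), max(xs), max(ys))

def bbox_contains(b, pt):
    """Cheap prefilter: is the point inside the bounding box?"""
    xmin, ymin, xmax, ymax = b
    x, y = pt
    return xmin <= x <= xmax and ymin <= y <= ymax

def point_in_polygon(poly, pt):
    """Exact test via ray casting: odd number of edge crossings = inside."""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the horizontal ray
            xcross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < xcross:
                inside = not inside
    return inside

square = [(0, 0), (10, 0), (10, 10), (0, 10)]
pt = (5, 5)
# Prefilter with the bbox first, analogous to polyG && ptG AND ST_Contains(...)
hit = bbox_contains(bbox(square), pt) and point_in_polygon(square, pt)
```

In PostGIS, the bounding-box step is what lets a GiST index prune candidate polygons before the expensive exact test runs, which is why the && clause matters for performance.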






Re: [postgis-users] Out of Memory problem for large table by ST_Contains(..)

2008-10-21 Thread Shane Butler
Hi John,

I am not sure that this is the exact same problem I was having with
ST_Within(), but I suspect it could be. Hopefully this will be fixed
by some changes that are in the next release. Anyway, for a possible
workaround, see this thread:
http://postgis.refractions.net/pipermail/postgis-users/2008-October/021598.html

Shane






Re: [postgis-users] out of memory @ UPDATE a big table

2008-06-18 Thread Kis János Tamás
On 2008 at 18:31, Kevin Neufeld [EMAIL PROTECTED] wrote:

 Actually, 21000 rows is a very small table. There are spatial tables out
 there with half a billion rows.
 
 You shouldn't be getting an out of memory error on such a small
 table.

 What happens if you restructure your query, like this:

 UPDATE kecskemet_k.foldreszletek AS t
 SET hrsz = f.szoveg
 FROM kecskemet_k.feliratok f
 WHERE t.geometria && f.geometria
 AND intersects( t.geometria, f.geometria )
 AND f.reteg IN ( '11' );

 If you still get an out of memory error, let us know your
 machine's specs and Postgres settings, such as shared_buffers,
 work_mem, etc.


Maybe I found the problem:

The table contains 21094 POLYGONs:
total: ~280,000 points,
average: ~13 points/POLYGON,
total area: ~32,000 ha,
average area: ~1.5 ha,
and I have one POLYGON that is, I think, too big (for PostGIS, or for my
computer): it has ~3000 points and covers ~3200 ha.

While that polygon exists, I get the out of memory error irrespective of
the postgresql.conf settings. But when I delete it from the table, everything
works.

What can I do if I don't want to delete it?

Thanks,
kjt

