Please let me make one more guess ^^
Third guess: you are using topology (nodes are indexed by node_id).
- If this is the case, you could use PostGIS topology.
- The gain is that with this topology model, you store *shared linestrings*,
and not shared points.
More seriously, from what you say
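(A minimal sketch of that topology route, assuming the linestrings live in a geometry column geom of a ways table; the topology name, SRID, and tolerance are illustrative:)

CREATE EXTENSION postgis_topology;
SELECT topology.CreateTopology('osm_topo', 4326);
-- Load each way into the topology; segments shared between ways become
-- shared edges instead of runs of duplicated points.
SELECT topology.TopoGeo_AddLineString('osm_topo', geom, 0.0)
FROM ways;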
-general-ow...@postgresql.org] On Behalf Of Nathan Clayton
Sent: January-15-15 19:19
To: pgsql-general@postgresql.org
Subject: Re: [GENERAL] Indexing large table of coordinates with GiST
On 1/15/2015 12:36 PM, Daniel Begin wrote:
Thanks, there are a lot of potential ways to resolve this problem
As Remi notes, going with a pointcloud approach might be wiser, particularly if
you aren’t storing much more about the points than coordinates and other lidar
return information. Since you’re only working with points, depending on your
spatial distribution (over poles? dateline?) you might just
On 1/15/2015 6:44 AM, Daniel Begin wrote:
Hi, I'm trying to create an index on coordinates (geography type) over a
large table (4.5 billion records) using GiST...
CREATE INDEX nodes_geom_idx ON nodes USING gist (geom);
The command ran for 5 days until my computer stopped because of a power outage!
Hey,
You may want to post this on the PostGIS list.
I take it that so many rows mean either raster or point cloud.
If it is a point cloud, simply consider using pg_pointcloud.
A 6-billion-point cloud comes to about 600 k rows for one of my data sets.
If it is raster, you may consider using the PostGIS raster type.
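(A minimal sketch of the pg_pointcloud route; the pcid of 1 assumes an X/Y schema entry is already registered in pointcloud_formats, and the patch grouping key is illustrative:)

CREATE EXTENSION pointcloud;
CREATE EXTENSION pointcloud_postgis;
-- One pcpatch row packs many points, collapsing billions of point rows
-- into a table small enough to index comfortably.
CREATE TABLE node_patches (id serial PRIMARY KEY, pa pcpatch(1));
INSERT INTO node_patches (pa)
SELECT PC_Patch(PC_MakePoint(1, ARRAY[ST_X(geom::geometry), ST_Y(geom::geometry)]))
FROM nodes
GROUP BY node_id / 100000;  -- roughly 100 k points per patch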
Thanks, there are a lot of potential ways to resolve this problem!
For Rob, here is a bit of context concerning my IT environment…
Windows 7 64-bit desktop, running an Intel i7 core and 16 GB of RAM. The
PostgreSQL 9.3 database is stored on a 3 TB external drive (USB-3 connection
with write
On January 15, 2015 at 12:36:29 PM, Daniel Begin
(jfd...@hotmail.com) wrote:
Paul, the node distribution is all over the world but mainly over inhabited
areas. However, if I had to define a limit of some sort, I would use the
dateline. Concerning spatial
Hi, I'm trying to create an index on coordinates (geography type) over a
large table (4.5 billion records) using GiST...
CREATE INDEX nodes_geom_idx ON nodes USING gist (geom);
The command ran for 5 days until my computer stopped because of a power outage!
Before restarting the index creation, I am
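(A minimal sketch of the usual first knob for a big GiST build, with an illustrative value for a 16 GB machine:)

-- Give this session more memory for the index build before retrying.
SET maintenance_work_mem = '4GB';
CREATE INDEX nodes_geom_idx ON nodes USING gist (geom);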
I'd restructure the table to be split into perhaps 100 or so inherited
tables (or more). That many rows in a single table is usually not efficient
with Postgres, in my experience. My target is to keep tables under about 100
million rows. I slice them up based on the common query patterns, usually
by
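(A minimal sketch of that inheritance layout on 9.3, slicing by node_id range; the child names and boundaries are illustrative, and a key matched to the actual query patterns may fit better:)

-- One child per ~100M-row slice; each gets its own, much smaller GiST index.
CREATE TABLE nodes_p000 (CHECK (node_id >= 0 AND node_id < 100000000))
  INHERITS (nodes);
CREATE INDEX nodes_p000_geom_idx ON nodes_p000 USING gist (geom);
-- Repeat for the remaining slices, then move rows into the children
-- (and afterwards DELETE the moved rows FROM ONLY nodes).
INSERT INTO nodes_p000
SELECT * FROM ONLY nodes WHERE node_id >= 0 AND node_id < 100000000;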