Tom,

> Apparently you've not found pg_read_file() ?
Thanks a lot. Didn't find this one. This helped!
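For the archives, a minimal sketch of what I ended up with. Note that pg_read_file() is superuser-only and (in these releases) takes an offset and a maximum length in bytes, and can only read files under the data directory, so the relative path and the length here are just illustrative:

```sql
-- Sketch only: assumes 'mytext.txt' lives under the server's data directory
-- and that we are connected as a superuser; 1000000 is an arbitrary max length.
INSERT INTO collection (id, path, content)
VALUES (1, 'mytext.txt', pg_read_file('mytext.txt', 0, 1000000));
```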

Still, get_url() would be handy too... :->

Question: I don't see why this would be a security issue. How could such a
function do any harm? Large files?

Finally: I've got some tricky follow-up questions regarding index usage in
tsearch2 and regexes. Should I post these here (or elsewhere)?

Regards, S.




2009/5/19 Tom Lane <t...@sss.pgh.pa.us>

> Robert Haas <robertmh...@gmail.com> writes:
> > On Mon, May 18, 2009 at 4:03 PM, Stefan Keller <sfkel...@gmail.com>
> wrote:
> >> I'd expect functions like get_text() or get_url() in order to do the
> >> following:
> >> INSERT INTO collection(id, path, content) VALUES(1, '/tmp/mytext',
> >> get_text('/tmp/mytext'));
>
> Apparently you've not found pg_read_file() ?
>
> >> AFAIK there was a get_url in libcurl but I neither find it any more. But
> >> anyway: This should be part of the core... :->
>
> > Putting this into core would have security implications.  The file or
> > URL would be downloaded by the PostgreSQL server process, not the
> > client process - therefore I think it would have to be super-user
> > only, which would make it much less useful.
>
> Yes.  I very strongly doubt that we'd accept a url-fetching function at
> all.  Aside from the security issues, it would necessarily pull in a
> boatload of dependencies that we'd prefer not to have.
>
> Of course, you can write such a thing trivially in plperlu or several
> other untrusted PLs, and include any security restrictions you see fit
> while you're at it.  I'm not seeing how a built-in function that would
> have to impose one-size-fits-all security requirements would be an
> improvement.
>
>                        regards, tom lane
>
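The plperlu route Tom mentions could be sketched roughly like this (assuming plperlu is installed and Perl's LWP::Simple module is available on the server; the function name and the URL check are just illustrative, not anything built in):

```sql
-- Hypothetical sketch, not a built-in: runs as untrusted PL/Perl on the
-- server, so install and grant it only as appropriate for your site.
CREATE OR REPLACE FUNCTION get_url(url text) RETURNS text AS $$
    use LWP::Simple;
    my ($url) = @_;
    # Impose whatever restrictions fit your site, e.g. an allow-list of hosts.
    die "only http(s) URLs allowed" unless $url =~ m{^https?://};
    my $content = get($url);
    die "could not fetch $url" unless defined $content;
    return $content;
$$ LANGUAGE plperlu;
```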
