Hi,

While downloading URLs, ConTeXt sanitizes the filename but does not bound its length, so the sanitized name can exceed what the operating system's filesystem allows (NAME_MAX is typically 255 bytes on Linux). For example, the following fails on 32-bit Linux.

\enabletrackers[resolvers.schemes]
\startluacode
  local report_webfilter = logs.new("thirddata.webfilter")

  local url = "http://www.bing.com/search?q=AreallyreallylongstringjusttoseehowthingsworkordontworkAreallyreallylongstringjusttoseehowthingsworkordontworkAreallyreallylongstringjusttoseehowthingsworkordontworkAreallyreallylongstringjusttoseehowthingsworkordontworAreallyreallylongstringjusttoseehowthingsworkordontworkkAreallyreallylongstringjusttoseehowthingsworkordontwork"

  local specification = resolvers.splitmethod(url)
  local file          = resolvers.finders['http'](specification) or ""

  if file ~= "" then
    report_webfilter("saving file %s", file)
  else
    report_webfilter("download failed")
  end
\stopluacode

\normalend

Is there a robust way to avoid this problem? One possibility is that in data-sch.lua, instead of

    local cleanname = gsub(original,"[^%a%d%.]+","-")

use

    local cleanname = md5sum(original)

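As a rough sketch of that idea (assuming the `md5` library that LuaTeX ships, and using a hypothetical helper name), one could hash the whole URL to get a fixed-length name, while keeping a short extension so that file-type detection by suffix still works:

```lua
-- Hypothetical replacement for the cleanname computation in data-sch.lua.
-- md5.sumhexa comes from the md5 library bundled with LuaTeX.
local match = string.match

local function hashedname(original)
    local hash   = md5.sumhexa(original)                -- 32 hex chars, fixed length
    local suffix = match(original, "%.(%a%a?%a?%a?)$")  -- keep up to 4 trailing letters, if any
    if suffix then
        return hash .. "." .. suffix
    else
        return hash
    end
end
```

This loses the human-readable name, of course; a compromise would be to keep a truncated sanitized prefix and append the hash only when the name would otherwise be too long.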
What do you think?

Aditya

___________________________________________________________________________________
If your question is of interest to others as well, please add an entry to the 
Wiki!

maillist : ntg-context@ntg.nl / http://www.ntg.nl/mailman/listinfo/ntg-context
webpage  : http://www.pragma-ade.nl / http://tex.aanhet.net
archive  : http://foundry.supelec.fr/projects/contextrev/
wiki     : http://contextgarden.net
___________________________________________________________________________________