Jan Böcker jan.boec...@jboecker.de writes:
On 12.02.2010 23:23, dmg wrote:
For evince, I think I have found a problem in the parsing of the link.
Evince already encodes
the URL, but it does not encode the '/', hence you will get a link like this:
emacsclient
Basically, it is OK to url-encode every character whose binary
representation starts with a 1 (i.e., whose value is higher than 127).
Ideally, the text to be url-encoded should be UTF-8.
If you use Glib::ustring, it is easy to convert any ISO-8859 string to
UTF-8. Each character,
Jan Böcker jan.boec...@jboecker.de writes:
I have been looking around and I am not sure how to solve this
problem. Within Evince and Xournal I am individually encoding each
byte contained in the filename that is not alphanumeric (as defined by
the C macro).
Does anybody know which characters above 0 (zero) need to be
On 06.02.2010 14:50, Jan Böcker wrote:
AFAIK, your current approach is correct.
I was wrong. The attached patch fixes a bug in the encode_uri function.
That fixes the non-ASCII characters problem in xournal for me.
The gchar type is just typedef'd to char, which means it is signed. To
get the