Nope. QS always converts the document to MacRoman for me when I enter
non-ASCII characters and use the Append To... command. I tried writing
a shell script to append a string to my txt file instead; the file
does end up encoded as UTF-8, but the appended text is mangled because
of the encoding QS uses for its output.

Here is an example of my workflow:

QS -> ./myscript "some text with non ascii chars ÄÖ" -> Run Command in
Shell

The contents of myscript:

#!/bin/sh
echo "$1" >> test.txt

The test.txt file is encoded in UTF-8 but the text in it looks like
this:
some text with non ascii chars A¨O¨
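That "A¨O¨" pattern looks like decomposed Unicode: in UTF-8, a precomposed "Ä" is the byte pair c3 84, while the decomposed form (A plus a combining diaeresis) is 41 cc 88, which some tools display as "A" followed by a stray umlaut. A quick way to check is dumping the raw bytes (just a sketch; the test.txt check assumes the file from my script above exists):

```shell
# Compare precomposed vs. decomposed UTF-8 bytes, then inspect the
# file QS actually wrote. Octal escapes keep the printf calls portable.
printf '\303\204\n' | od -An -tx1        # precomposed "Ä": c3 84
printf '\101\314\210\n' | od -An -tx1    # decomposed "Ä":  41 cc 88
if [ -f test.txt ]; then
    od -An -tx1 test.txt | head          # the bytes QS really produced
fi
```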

I have tried piping it through iconv...

#!/bin/sh
echo "$1" | iconv -f MacRoman -t UTF-8 >> test.txt

...in every way I can think of, but I can't get it right. Any
suggestions?
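One thing I may try next, based on the suggestion quoted below, is treating the whole file (rather than the echoed string) as UTF-16 and converting it back to UTF-8 after QS touches it. A rough sketch (the first command only simulates a file QS left in UTF-16; the file names are placeholders):

```shell
#!/bin/sh
# Simulate a file that QS has switched to UTF-16 after appending
# non-ASCII text. (\303\204 and \303\226 are UTF-8 for Ä and Ö.)
printf 'some text with non ascii chars \303\204\303\226\n' \
    | iconv -f UTF-8 -t UTF-16 > test.txt

# Convert the whole file back to UTF-8 in place.
iconv -f UTF-16 -t UTF-8 test.txt > test.txt.tmp \
    && mv test.txt.tmp test.txt
```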


On Dec 31 2008, 5:23 pm, "Jon Stovell (a.k.a. Sesquipedalian)"
<[email protected]> wrote:
> In my tests, QS converts the file to UTF-16 when appending Unicode
> characters to a text file using the Append To... action. Try
> converting from UTF-16 instead of from MacRoman when using iconv.
>
> (For those who are interested, QS appeared to leave the original
> encoding alone when appending standard ASCII characters. It apparently
> changes the encoding to UTF-16 only when dealing with non-ASCII
> characters.)
>
> On Dec 30, 8:43 am, msson <[email protected]> wrote:
>
> > I'm having some trouble making a shell script work properly due to the
> > fact that QS only wants to output text in MacRoman instead of UTF-8. I
> > want to add a string to the end of a txt file and the file needs to be
> > encoded with UTF-8. So every time I add some text (with non ascii
> > chars) via QS it either changes the encoding of the txt file or enters
> > a bunch of mumbo-jumbo.
>
> > I've tried using the iconv command to convert the encoding but nothing
> > seems to work. Is there some way to change the output encoding of QS
> > or a better way of converting the encoding than iconv?
>
> > Best regards, msson.
