I played some more with it. I was using Greek and Hebrew characters in
my tests, which cannot be rendered in MacRoman, and so QS used UTF-16.
But if I stick to accented Latin characters, then you are correct in
saying that QS will convert a document to MacRoman in order to append
the characters to it.
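
A quick way to see why the character set matters: check whether a given
character exists in MacRoman at all, which is presumably the test QS is
effectively applying before it falls back to UTF-16. A sketch using
iconv and od (note: glibc's iconv names the charset MACINTOSH, while
Apple's libiconv also accepts MacRoman):

```shell
# 'ä' (U+00E4, UTF-8 bytes 0xC3 0xA4) has a MacRoman code point (0x8A),
# so the conversion succeeds and yields a single byte:
printf '\303\244' | iconv -f UTF-8 -t MACINTOSH | od -An -tx1

# Greek alpha (U+03B1, UTF-8 bytes 0xCE 0xB1) has no MacRoman code
# point, so the conversion fails -- presumably why QS falls back to
# UTF-16 for Greek and Hebrew text:
printf '\316\261' | iconv -f UTF-8 -t MACINTOSH 2>/dev/null \
  || echo "not representable in MacRoman"
```

Accented Latin characters all fit in MacRoman's single-byte table, so
they never trigger the UTF-16 fallback.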

It appears that iconv translates properly from MacRoman or UTF-16 into
UTF-8, so long as the output goes to stdout. But if I try piping,
redirecting, etc., then conversions to UTF-8 no longer work. You can
test this in Terminal by comparing the results of

    iconv -f MacRoman -t UTF-8 non-ascii-chars-file.txt

with the results of

    iconv -f MacRoman -t UTF-8 non-ascii-chars-file.txt > test.txt

The first outputs properly to stdout, and everything is rendered as it
should be. But the second doesn't work, as you are already well aware.
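
One way to rule out a pure display effect is to compare bytes rather
than what Terminal renders: if the redirected file is byte-identical to
the stdout stream, the conversion itself is fine and the difference
lies in how the result is later interpreted. A minimal sketch using a
synthetic MacRoman file (the filenames are my own, and glibc's iconv
names the charset MACINTOSH rather than MacRoman):

```shell
# Build a sample MacRoman file containing 'ÄÖ' (0x80 0x85 in MacRoman).
printf '\200\205' > sample-macroman.txt

# Convert once through a redirect, once to stdout, then compare bytes.
iconv -f MACINTOSH -t UTF-8 sample-macroman.txt > test.txt
iconv -f MACINTOSH -t UTF-8 sample-macroman.txt | cmp -s - test.txt \
  && echo "redirected output is byte-identical"

# Inspect the redirected file: 0xC3 0x84 0xC3 0x96 is correct UTF-8
# for 'ÄÖ'.
od -An -tx1 test.txt
```

If cmp reports a difference, the redirect really is altering the bytes;
if not, the file is fine and whatever is reading it is guessing the
wrong encoding.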

So, this all brings us back to your original question: "Is there some
way to change the output encoding of QS or a better way of converting
the encoding than iconv?" I don't know of any way to change the output
encoding of QS. Instead of iconv, you might try building and installing
recode (http://directory.fsf.org/project/recode/). I haven't done that
myself, so good luck with it if you do.

However, may I ask why the files must be in UTF-8?

On Jan 2, 2:57 pm, msson <[email protected]> wrote:
> Nope. QS always converts the document to MacRoman for me when I enter
> non-ASCII characters and use the Append To... command. I tried to make
> a shell script to add a string to my txt file, and then the encoding
> of the file ends up UTF-8, but the input text is all f-ed up due to
> the encoding of the QS output.
>
> Here is an example of my workflow:
>
> QS -> ./myscript "some text with non ascii chars ÄÖ" -> Run Command in
> Shell
>
> The contents of myscript:
>
> #!/bin/sh
> echo $1 >> test.txt
>
> The test.txt file is encoded in UTF-8 but the text in it looks like
> this:
> some text with non ascii chars A¨O¨
>
> I have tried piping it through iconv...
>
> #!/bin/sh
> echo $1 | iconv -f MacRoman -t UTF-8 >> test.txt
>
> ...in every way I can think of, but I can't get it right. Any
> suggestions?
>
> On Dec 31 2008, 5:23 pm, "Jon Stovell (a.k.a. Sesquipedalian)"
> <[email protected]> wrote:
> > In my tests, QS converts the file to UTF-16 when appending Unicode
> > characters to a text file using the Append To... action. Try
> > converting from UTF-16 instead of from MacRoman when using iconv.
>
> > (For those who are interested, QS appeared to leave the original
> > encoding alone when appending standard ASCII characters. It apparently
> > changes the encoding to UTF-16 only when dealing with non-ASCII
> > characters.)
>
> > On Dec 30, 8:43 am, msson <[email protected]> wrote:
>
> > > I'm having some trouble making a shell script work properly, due to the
> > > fact that QS only wants to output text in MacRoman instead of UTF-8. I
> > > want to add a string to the end of a txt file, and the file needs to be
> > > encoded in UTF-8. So every time I add some text (with non-ASCII
> > > chars) via QS, it either changes the encoding of the txt file or enters
> > > a bunch of mumbo-jumbo.
>
> > > I've tried using the iconv command to convert the encoding, but nothing
> > > seems to work. Is there some way to change the output encoding of QS,
> > > or a better way of converting the encoding than iconv?
>
> > > Best regards, msson.
