I'm having a large bit of encoding confusion. I'm programming a Java app
that uses SQLite, and I have a text field that will contain a variable
number of entries. I naturally wanted to separate those entries with
delimiters, but I thought I could avoid the hassle of making sure that
user-entered characters weren't delimiters by using characters that
"don't exist" as delimiters. In Java, character values below decimal 32
are non-printing control characters, so I just took an arbitrary one
(2, in this case), stored it as a char, and tried to put it in the
database.
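
To make the delimiter idea concrete, here is a stripped-down sketch of
what I mean (the table name, column, and JDBC plumbing are made up for
illustration, and it assumes an SQLite JDBC driver on the classpath; my
real code assembles the statement differently):

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.PreparedStatement;

    public class DelimiterSketch {
        public static void main(String[] args) throws Exception {
            // Use an "invisible" control character (U+0002) as the delimiter.
            char delim = (char) 2;
            String joined = "first" + delim + "second" + delim + "third";

            Connection conn =
                DriverManager.getConnection("jdbc:sqlite:entries.db");
            PreparedStatement ps =
                conn.prepareStatement("INSERT INTO entries (data) VALUES (?)");
            ps.setString(1, joined);  // the delimited string goes in as one value
            ps.executeUpdate();
            ps.close();
            conn.close();
        }
    }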
I got an "unrecognized token" error. The thing I don't understand is
that 2 is a perfectly valid character value; it just doesn't map to a
printable character. Why couldn't I store it in the database? Is it
something involving the interface, or does SQLite store text in a way
that doesn't support this sort of thing?
I know Java uses Unicode (UTF-16 internally, as far as I can tell,
rather than UTF-8), and my database is using UTF-8, but U+0002 is a
valid code point in both encodings, so the character should survive
being stored in SQLite just as it is in Java.
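
Here is a quick check of what I mean about the encodings (a small
sketch, nothing more): the control character encodes to a single,
legal UTF-8 byte and round-trips cleanly.

    import java.util.Arrays;

    public class EncodingCheck {
        public static void main(String[] args) throws Exception {
            String s = "a" + (char) 2 + "b";
            byte[] utf8 = s.getBytes("UTF-8");
            // Prints [97, 2, 98]: U+0002 becomes the single byte 0x02.
            System.out.println(Arrays.toString(utf8));
            // ...and decoding gives back the identical string (prints true).
            System.out.println(s.equals(new String(utf8, "UTF-8")));
        }
    }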

Am I completely wrong here (I half expect it), or is there a way to do
something along these lines?
