Bugs item #2177734, was opened at 2008-10-18 21:58
Message generated for change (Comment added) made by mr-meltdown
You can respond by visiting: 
https://sourceforge.net/tracker/?func=detail&atid=482468&aid=2177734&group_id=56967

Please note that this message will contain a full copy of the comment thread,
including the initial issue submission, for this request,
not just the latest update.
Category: Core
Group: Clients CVS Head
Status: Open
Resolution: None
Priority: 5
Private: No
Submitted By: Stefan de Konink (skinkie)
Assigned to: Nobody/Anonymous (nobody)
Summary: Output breaks on foreign characters

Initial Comment:
I presume NLS must be enabled system-wide in order to get working 'foreign' output 
in mclient. What I now see is that the pipe to stdout probably breaks on 
foreign characters:

| 22771762 | power | line | 22771762 |  22 | 290062167 | 290062167 | 
50.91285649999 | 6.130952899999 | Team_Alpha      | 2008-08-22      |
:          :       :      :          :     :           :           :           
9997 :           9996 :                 : 06:04:08.000000 :
:          :       :      :          :     :           :           :            
    :                :                 : +00:00          :
| 22772346 | power | line | 22772346 |   0 | 244473999 | 244473999 | 
50.79486810000 | 6.973957699999 |12315 tuples
sql>select * from way_tags, way_nds, nodes_legacy where k='power' and v='line' 
and way_tags.way = way_nds.way and way_nds.to_node = nodes_legacy.id;
12315 tuples

----------------------------------------------------------------------

>Comment By: Fabian (mr-meltdown)
Date: 2008-10-21 09:01

Message:
Sorry, I meant:
@skinkie: what does `locale` say, and what does `perl -e ''` say?

----------------------------------------------------------------------

Comment By: Stefan Manegold (stmane)
Date: 2008-10-21 08:56

Message:
$ locale
LANG=en_US.UTF-8
LC_CTYPE="en_US.UTF-8"
LC_NUMERIC="en_US.UTF-8"
LC_TIME="en_US.UTF-8"
LC_COLLATE="en_US.UTF-8"
LC_MONETARY="en_US.UTF-8"
LC_MESSAGES="en_US.UTF-8"
LC_PAPER=nl_NL.UTF-8
LC_NAME="en_US.UTF-8"
LC_ADDRESS="en_US.UTF-8"
LC_TELEPHONE="en_US.UTF-8"
LC_MEASUREMENT="en_US.UTF-8"
LC_IDENTIFICATION="en_US.UTF-8"
LC_ALL=


----------------------------------------------------------------------

Comment By: Fabian (mr-meltdown)
Date: 2008-10-21 08:44

Message:
what does `locale` say on your system?

----------------------------------------------------------------------

Comment By: Stefan de Konink (skinkie)
Date: 2008-10-20 23:26

Message:
I presume you have a Linux distro that has NLS enabled, as I had before I
thought I could strip it to save space on my VM/LiveCD.

SQLrow (len=0x8713698, numeric=0x87136c8, rest=0x87136b0, fields=5,
trim=1) at ../../../src/mapiclient/MapiClient.mx:4
408                     for (i = 0; i < fields; i++) {
(gdb) 
409                             if ((t = rest[i]) != NULL && utf8strlen(t)
> (size_t) len[i]) {
(gdb) 
utf8strlen (s=0x8713610 "FSürth") at
../../../src/mapiclient/MapiClient.mx:370

Considering that, I presume something in stream_printf goes wrong where
toConsole is changed. If you want to debug it, a guest account within my VM
is of course possible.

----------------------------------------------------------------------

Comment By: Stefan Manegold (stmane)
Date: 2008-10-20 20:20

Message:
Works for me:

sql>create table MyTab (MyAtt string);
0 tuples
sql>insert into MyTab values ('FSürth');
Rows affected 1
sql>select * from MyTab;
+---------+
| myatt   |
+=========+
| FSürth  |
+---------+
1 tuple
sql>


----------------------------------------------------------------------

Comment By: Stefan de Konink (skinkie)
Date: 2008-10-20 12:38

Message:
(I'm using the new and improved bug tracker; maybe that helps.)

The issue is not the whitespace; the issue is the sudden loss of output
to stdout. If you take a peek at the 'line' with 22771762, you notice the
string Team_Alpha. Now skip forward to '22772346': you see '12315 tuples'.
The cause, which I can positively mark as 'in some way related to my
recompiling with -nls' (in Gentoo terms), is
http://openstreetmap.org/api/0.5/node/244473999, where the user value is set
to 'FSürth'.

Now how can I know that I didn't screw up Mserver5, or Mapi for that
matter? Because my alternative lookup mechanism still works, and it is
working on this same dataset.
http://thuis.konink.de/api/0.5/node/244473999

----------------------------------------------------------------------

Comment By: Fabian (mr-meltdown)
Date: 2008-10-20 12:28

Message:
Unfortunately SF eats whitespace, so I'm not able to see the problem
properly, but could it be that you're bitten by mclient breaking up long
values in an attempt not to get wider than your terminal?

----------------------------------------------------------------------

Comment By: Stefan de Konink (skinkie)
Date: 2008-10-20 12:12

Message:
I take this as offensive; there is enough information to reproduce the bug
with something like UTF-8 characters. If even the authors of mclient are not
surprised that the output of the second query is <<12315 tuples>> without
any '>'-option set, then there is little point in improving my bug reporting
skills.

----------------------------------------------------------------------

Comment By: Sjoerd Mullender (sjoerd)
Date: 2008-10-20 10:33

Message:
Please provide details.

Read and internalize
<http://www.chiark.greenend.org.uk/~sgtatham/bugs.html>.

----------------------------------------------------------------------


_______________________________________________
Monetdb-bugs mailing list
[email protected]
https://lists.sourceforge.net/lists/listinfo/monetdb-bugs
