On Sat, 31 Aug 2019, Anthony Walter via lazarus wrote:
Michael,
Real-world examples of never-used JSON fields:
Calling most web REST methods which return JSON, where the
caller is only interested in success or failure with a message.
Acting as a RESTful service where many
Michael,
Internally JsonTools just stores the document as JSON fragments or text; in
the case of object, array, or null the text is empty, but in the case of
number or bool it converts the text to a number or bool when you ask for
AsNumber or AsBoolean. Now normally this won't be a problem because
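The lazy-conversion scheme described above can be sketched roughly like this. This is a hypothetical illustration, not JsonTools' actual TJsonNode: the node keeps only its raw JSON text and converts it on demand.

```pascal
program LazySketch;
{$mode objfpc}{$H+}

type
  { Hypothetical node: stores only the JSON fragment text, converts lazily }
  TLazyNode = class
  public
    Text: string; // raw JSON fragment, e.g. '12345678.3' or 'true'
    function AsNumber: Double;
    function AsBoolean: Boolean;
  end;

function TLazyNode.AsNumber: Double;
var
  Code: Integer;
begin
  // Val always uses '.' as the decimal separator, matching JSON
  Val(Text, Result, Code);
  if Code <> 0 then
    Result := 0;
end;

function TLazyNode.AsBoolean: Boolean;
begin
  // In JSON a bool is exactly the literal 'true' or 'false'
  Result := Text = 'true';
end;

var
  N: TLazyNode;
begin
  N := TLazyNode.Create;
  N.Text := '12345678.3';
  WriteLn(N.AsNumber:0:1); // the text is converted only at this point
  N.Free;
end.
```

The point of the design is that a document parsed only to check one field never pays the conversion cost for the fields that are never read.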
On Sat, 31 Aug 2019, Luca Olivetti via lazarus wrote:
On 31/8/19 at 16:22, Michael Van Canneyt via lazarus wrote:
https://developer.mozilla.org/en-US/docs/Web/JavaScript/Reference/Global_Objects/JSON/parse
Also frequently encountered is omitting "" around property names. JSON is a
subset of Javascript:
D.Parse('{ d: 12345678.3 }');
On Sat, 31 Aug 2019, Anthony Walter wrote:
Could you include https://github.com/BeRo1985/pasjson in the comparison?
Sure. I also have a few others that people have requested. I will also list the
license of each in the first table.
[snip]
For example, if you wanted to store object state using
Regarding gigabytes of JSON in a file, I know a small portion of
programmers might be inclined
On Fri, Aug 30, 2019 at 11:02 PM Michael Van Canneyt via lazarus
wrote:
> Can you try setting defaultsystemcodepage to UTF8 ?
Feeling a little bit embarrassed now (I'm used to Lazarus which
defaults to that).
With DefaultSystemCodePage := CP_UTF8 it works:
Handles unicode chars correctly: >{
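The fix described above can be shown as a minimal program. This is a sketch of the setup step only; the actual test code is elsewhere in the thread.

```pascal
program CodePageFix;
{$mode objfpc}{$H+}

uses
  SysUtils;

begin
  // Lazarus/LCL projects effectively default to this; in a plain FPC
  // console program it must be set explicitly, as discussed above.
  DefaultSystemCodePage := CP_UTF8;
  WriteLn('DefaultSystemCodePage = ', DefaultSystemCodePage);
end.
```

With this in place, ansistring-to-unicodestring conversions assume UTF-8 instead of the platform default, which is what made the unicode test pass.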
On Fri, Aug 30, 2019 at 4:22 PM Anthony Walter via lazarus <lazarus@lists.lazarus-ide.org> wrote:
> Alan, oh that's a good idea. I will do that as well as add a few more
> parser libraries as requested by a few people in
On Sat, 31 Aug 2019, Sven Barth via lazarus wrote:
On 31.08.2019 at 09:45, Michael Van Canneyt via lazarus wrote:
Codepages & strings require careful setup. Contrary to popular belief, it
does not 'just work'.
All this is documented:
https://www.freepascal.org/docs-html/current/ref/refsu9.html#x32-390003.2.4
Many people tend to ignore
On Sat, 31 Aug 2019, Michael Van Canneyt via lazarus wrote:
On Sat, 31 Aug 2019, Anthony Walter via lazarus wrote:
Okay, going back and looking through the messages I see you did post a test
with:
{$codepage UTF8} and uses cwstring
Here are the results with that added:
On Linux, using {$codepage UTF8} by itself causes both tests to fail. Adding
cwstring causes both tests to work. On Windows, trying to use
On Sat, 31 Aug 2019, Anthony Walter via lazarus wrote:
Michael, regarding this unicode problem, all the code has already been
posted in this thread.
program Test;
uses
FPJson, JsonParser, JsonTools;
There you are. You're missing the cwstring unit and the codepage directive.
Change
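The truncated suggestion above presumably amounts to adding the directive and unit like this. A sketch: the {$IFDEF UNIX} guard is my assumption, since the cwstring unit exists only on Unix platforms.

```pascal
program Test;
{$mode objfpc}{$H+}
{$codepage UTF8}   // string literals in this source file are UTF-8

uses
  {$IFDEF UNIX}cwstring,{$ENDIF} // installs a widestring manager on Unix
  FPJson, JsonParser;

begin
  WriteLn('skeleton with unicode string support set up');
end.
```

Without a widestring manager, unicodestring conversions on Unix fall back to a trivial implementation that loses non-ASCII characters, which matches the failure reported in the thread.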
Michael, I hadn't tried your example code yet as I thought the discussion
was on the topic of the unicode failure, and your example was about parsing
speed. I'll be happy to take a look at speed improvements, but like you I
am interested to find out what's failing with VerifyUnicodeCharsFPJson.
--
If there is any chance the char codes are being altered through whatever
browser / mail client you are using, here is a direct link to the program
source:
https://cache.getlazarus.org/projects/test.lpr
--
___
lazarus mailing list
Michael, regarding this unicode problem, all the code has already been
posted in this thread.
program Test;
uses
FPJson, JsonParser, JsonTools;
const
UnicodeChars = '{ "name": "Joe®Schmoe", "occupation": "bank teller \u00Ae " }';
function VerifyUnicodeCharsFPJson: Boolean;
var
N:
On Fri, 30 Aug 2019, Anthony Walter via lazarus wrote:
Okay, so I turned on my Windows VM with a different version of FPC and ran
VerifyUnicodeChars with both FPJson and JsonTools. The results are the
same: JsonTools sees the unicode correctly, and something is wrong when
using FPJson. I don't know what the problem is, but other people are
noticing
On Fri, 30 Aug 2019, Bart via lazarus wrote:
On Fri, Aug 30, 2019 at 9:09 PM Bart wrote:
On Windows it prints FALSE, both with 3.0.4 and trunk r42348
It fails on both comparisons (hexadecimal representation of the
returned unicodestrings):
Name: 004A 006F 0065 003F 0053 0063 0068
On Fri, 30 Aug 2019, Anthony Walter via lazarus wrote:
I am not sure how, under any circumstances, parsing JSON from a stream source
would be any faster than parsing a string.
If you would check the fpjson code, you'd see why.
You'd also see why there is plenty of room for improvement.
Also
For those tracking the unicode issue, could you please verify that the problem
does not occur with my JsonTools library on your compiler revisions and
platforms? I always get true (passed) with my library, but not with any
other library. Here is the relevant test:
function VerifyUnicodeChars: Boolean;
On Fri, Aug 30, 2019 at 9:09 PM Bart wrote:
> On Windows it prints FALSE, both with 3.0.4 and trunk r42348
It fails on both comparisons (hexadecimal representation of the
returned unicodestrings):
Name: 004A 006F 0065 003F 0053 0063 0068 006D 006F 0065
Expected: 004A 006F 0065 00AE 0053
On Fri, Aug 30, 2019 at 4:04 PM Michael Van Canneyt via lazarus
wrote:
> No idea. I tested with both 3.0.4 and trunk. Both give the same result.
>
> Here are the sources I used:
...
> I test on linux, but could try windows.
On Windows it prints FALSE, both with 3.0.4 and trunk r42348
--
Bart
I am not sure how, under any circumstances, parsing JSON from a stream source
would be any faster than parsing a string. Also, with regards to timing, I am
not sure how accurate Now is. For this purpose I've written:
{ Return a time based on system performance counters }
function TimeQuery: Double;
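The TimeQuery function is truncated above. A minimal sketch of the idea, not the actual JsonTools implementation, which may use platform performance counters directly: return seconds from a monotonic source instead of the wall-clock Now.

```pascal
program TimeQuerySketch;
{$mode objfpc}{$H+}

uses
  SysUtils;

{ Return a time in seconds based on a monotonic source. Sketch only:
  GetTickCount64 is monotonic but millisecond-granular, so a real
  benchmark timer would prefer a higher-resolution performance counter. }
function TimeQuery: Double;
begin
  Result := GetTickCount64 / 1000.0;
end;

var
  Start: Double;
begin
  Start := TimeQuery;
  Sleep(50);
  WriteLn('elapsed seconds: ', (TimeQuery - Start):0:3);
end.
```

Unlike Now, a monotonic counter is unaffected by clock adjustments, which matters when timing sub-second parse runs.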
On Fri, 30 Aug 2019, Anthony Walter via lazarus wrote:
Alan, oh that's a good idea. I will do that as well as add a few more
parser libraries as requested by a few people in other non-mailing-list
threads. I will also try to find out what's going on with the unicode strings,
as it might be a problem with the compiler.
Michael,
I am on Linux as well, but
Maybe it is a bug which was fixed in FPC trunk; there was a Unicode issue in
3.0.4.
On Fri, 30 Aug 2019, Anthony Walter via lazarus wrote:
On my system with FPJson the test is failing on "bank teller
\u00Ae ", but when using approximately the same code with JSONTools it
always passes on both "name" and "occupation". What do you think is going
on?
No idea. I
On Fri, 30 Aug 2019, Anthony Walter via lazarus wrote:
Michael,
Can you tell me why the second half (N.Items[1].AsUnicodeString) of this test
fails? This is the part that decodes "bank teller \u00Ae ".
The test fails on "Joe®Schmoe", not on "bank teller \u00Ae ".
If you WriteLn the
Michael,
Can you tell me why the second half (N.Items[1].AsUnicodeString) of this test
fails? This is the part that decodes "bank teller \u00Ae ".
function VerifyUnicodeChars: Boolean;
const
UnicodeChars = '{ "name": "Joe®Schmoe", "occupation": "bank teller \u00Ae " }';
var
N: TJSONData;
begin
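The test body is truncated above. The following is my reconstruction of how it presumably continues, pieced together from the thread; the exact comparison Anthony used may differ. It also includes the cwstring/{$codepage} setup that the thread later identified as the fix.

```pascal
program VerifySketch;
{$mode objfpc}{$H+}
{$codepage UTF8}

uses
  {$IFDEF UNIX}cwstring,{$ENDIF} // widestring manager, needed on Unix
  FPJson, JsonParser;

const
  UnicodeChars = '{ "name": "Joe®Schmoe", "occupation": "bank teller \u00Ae " }';

{ Reconstructed sketch: parse the document and check that both string
  values still contain the ® character (U+00AE) after parsing. }
function VerifyUnicodeChars: Boolean;
var
  N: TJSONData;
  R: UnicodeString;
begin
  N := GetJSON(UnicodeChars);
  try
    R := UnicodeString(WideChar($00AE)); // the ® character
    Result := (Pos(R, N.Items[0].AsUnicodeString) > 0) and
              (Pos(R, N.Items[1].AsUnicodeString) > 0);
  finally
    N.Free;
  end;
end;

begin
  WriteLn(VerifyUnicodeChars);
end.
```

Note the two failure modes discussed in the thread are distinct: Items[0] holds a literal ® (which depends on source codepage handling), while Items[1] holds a \u00Ae escape (which depends only on the parser's escape decoding).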
On Fri, 30 Aug 2019, Anthony Walter via lazarus wrote:
Michael,
I have a hurricane headed my way, but when I'm done evacuating I'll send
you a copy of my test. If you want to make improvements to the test program,
to be sure the manner in which I am using the FPJson functions and classes
is correct, and send me a revised test program, then that would
On Fri, 30 Aug 2019, Anthony Walter via lazarus wrote:
With regards to duplicate key names, some libraries allow the same key
to be parsed, resulting in multiple child nodes of the same name. Others
throw an exception when parsing an object with a duplicate key name.
The correct way to handle duplicate keys is to overwrite the existing key
when a
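The "overwrite on duplicate" policy can be illustrated with fpjson's object type. The SetOrReplace helper here is hypothetical, written for illustration; JsonTools may implement the policy differently inside its parser.

```pascal
program DupKeys;
{$mode objfpc}{$H+}

uses
  FPJson;

{ Hypothetical helper: when the key already exists, replace its value
  instead of adding a second child node with the same name. }
procedure SetOrReplace(Obj: TJSONObject; const Key: string; Value: TJSONData);
begin
  if Obj.IndexOfName(Key) >= 0 then
    Obj.Elements[Key] := Value // later value wins over the earlier one
  else
    Obj.Add(Key, Value);
end;

var
  O: TJSONObject;
begin
  O := TJSONObject.Create;
  try
    SetOrReplace(O, 'a', TJSONIntegerNumber.Create(1));
    SetOrReplace(O, 'a', TJSONIntegerNumber.Create(2)); // duplicate key
    WriteLn(O.AsJSON); // the object ends up with a single "a" key
  finally
    O.Free;
  end;
end.
```

This matches JavaScript's own behavior, where JSON.parse keeps the last occurrence of a duplicated key.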
Yes, JsonTools needs a SaveToFile method if it does not have one. It should save
formatted JSON with an indent set by a property or global variable
(the default is usually 2). SaveToFile must handle Unicode strings, i.e. output
them with \u escapes or similar. Use a Unicode escape for all codes >= 128, because utf8
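The suggested escaping can be sketched as follows. EscapeNonAscii is a hypothetical helper, not part of JsonTools: ASCII passes through unchanged and every code point >= 128 becomes a \uXXXX escape, so the saved file is pure ASCII. It assumes the value has already been decoded to a UnicodeString (it does not handle surrogate pairs for code points above U+FFFF).

```pascal
program EscapeSketch;
{$mode objfpc}{$H+}

uses
  SysUtils;

{ Escape every non-ASCII UTF-16 code unit as \uXXXX }
function EscapeNonAscii(const S: UnicodeString): string;
var
  I: Integer;
begin
  Result := '';
  for I := 1 to Length(S) do
    if Ord(S[I]) < 128 then
      Result := Result + AnsiChar(Ord(S[I]))
    else
      Result := Result + '\u' + IntToHex(Ord(S[I]), 4);
end;

begin
  // The ® (U+00AE) becomes \u00AE in the output
  WriteLn(EscapeNonAscii('bank teller ' + WideChar($00AE)));
end.
```

Escaping this way sidesteps codepage questions entirely when writing the file, at the cost of larger output for heavily non-ASCII documents.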
On Fri, 30 Aug 2019, Anthony Walter wrote:
I've posted a new page that tests the speed and correctness of several
Pascal-based JSON parsers.
https://www.getlazarus.org/json/tests/
In full disclosure, I am the author of the new open source JsonTools
library, and even though my parser seems to