OK, I have tried everything I can with this problem. I used different libraries, from curl to Poco, and got the same result. I tried passing the byte array I receive into every container and function I could, but it was still the same issue.
We then implemented a Rust version of this using the protobuf libraries they provide, and that worked flawlessly. I am really not sure why this happens; I only know it happens when a lot of data is being processed.

On Monday, September 7, 2020 at 4:24:13 PM UTC+2 Test Last wrote:

> Could it possibly be curl that somehow corrupts the data if there is too much of it?
> Are there any curl flags I need to look out for, perhaps?
>
> Thanks,
> Laster
>
> On Thursday, September 3, 2020 at 7:00:21 PM UTC+2 Test Last wrote:
>
>> Hi,
>>
>> I couldn't upload a file? No matter the type, it just comes back with "error occurred", so I have uploaded the files to Google Drive instead. This is the link:
>> https://drive.google.com/file/d/10cwBazAPlolMC-5mPd-sq5dr2bnUEXiD/view?usp=sharing
>>
>> In there are the protobufs I am using, the functions used to process the protobuf data, and the data itself. The files Limit61.txt and limit60.txt contain binary data exactly as I receive it from the server. If you compile the proto files, you can run those BIN files through them and "hopefully" get my results.
>>
>> There are two separate versions because the limit61.txt one breaks when the object it is trying to process is smaller than it is supposed to be, and its value also reads NULL when has_scalar_value() is called. After that, the next object just doesn't exist, even though object.size() shows there is supposed to be more. The limit60.txt version, however, does work. It does not seem tied to this particular data set either: if I give it enough of any data set, it will eventually crash. What is the limit to the size of a protobuf object?
>>
>> This happens on both protobuf versions 3.12.3 (the version in the NuGet package manager for VSCode) and 3.13.0 (compiled by me for Linux, Ubuntu 18 and 20), although the Ubuntu version of things seems to handle far less data. I have no idea of the inner workings of protobuf, so I am totally shooting in the dark as to what the cause might be.
>> Please help, I really appreciate the assistance.
>>
>> Thanks,
>> Laster
>>
>> On Wednesday, September 2, 2020 at 6:43:44 PM UTC+2 [email protected] wrote:
>>
>>> It looks like you have an enum value named NULL, which conflicts with the NULL macro in C++. I haven't looked into this, but it is possible that we added some workaround for this somewhere between 3.6.1 and 3.13.0. There is no reason for your Java and C++ to be on the same protobuf version, though, so I would recommend that if your C++ project was already on 3.13.0 and building successfully, you keep it on 3.13.0 instead of downgrading to 3.6.1. If you could provide more details about the protobuf.has_scalar_value() problem, I could try to see what is going wrong.
>>>
>>> On Wed, Sep 2, 2020 at 6:33 AM Test Last <[email protected]> wrote:
>>>
>>>> Hi All,
>>>>
>>>> I am getting errors where, past a certain amount of data, protobuf.has_scalar_value() returns false. But I am sure that there is indeed more data, because I can see it in the response I received from the server. The server is a Java implementation of Apache Calcite running protobuf v3.6.1.
>>>>
>>>> At first my C++ program was running protobuf v3.13.0, and I kept getting errors when the data exceeded a certain amount. I am NOT sure that that is indeed the cause, though; the current error is very obscure at this stage. So, to try to remedy the situation, I pulled the v3.6.1 tag of protobuf and compiled it. I ran protoc on my proto files and got the following error, which I did NOT get on 3.13.0, and also not on the Java server side running the same version. This is on a freshly installed Ubuntu Server 20 OS:
>>>>
>>>> protoc -I=. --cpp_out=. ./common.proto
>>>> protoc -I=. --cpp_out=. ./request.proto
>>>> protoc -I=. --cpp_out=. ./response.proto
>>>> g++ -g -fPIC -c common.pb.cc -L/usr/local/lib `pkg-config --cflags --libs protobuf` -Wl,--no-as-needed -lgrpc++_reflection -Wl,--as-needed -ldl -o common.o -std=c++14
>>>>
>>>> In file included from /usr/include/x86_64-linux-gnu/bits/types/stack_t.h:23,
>>>>                  from /usr/include/signal.h:303,
>>>>                  from /usr/include/x86_64-linux-gnu/sys/param.h:28,
>>>>                  from /usr/local/include/google/protobuf/stubs/port.h:64,
>>>>                  from /usr/local/include/google/protobuf/stubs/common.h:46,
>>>>                  from common.pb.h:9,
>>>>                  from common.pb.cc:4:
>>>> common.pb.h:222:3: error: expected identifier before ‘__null’
>>>>   222 |   NULL = 24,
>>>>       |   ^~~~
>>>> common.pb.h:222:3: error: expected ‘}’ before ‘__null’
>>>>
>>>> So my question is: how is it possible that the protobufs work on the Java server but not in my C++ implementation, with the same proto files and the same version compiled for this OS? Is it a bug in v3.6.1?
>>>>
>>>> --
>>>> You received this message because you are subscribed to the Google Groups "Protocol Buffers" group.
>>>> To unsubscribe from this group and stop receiving emails from it, send an email to [email protected].
>>>> To view this discussion on the web visit https://groups.google.com/d/msgid/protobuf/4408fbbb-786c-4e03-93ef-7dfcd045a772n%40googlegroups.com.

--
You received this message because you are subscribed to the Google Groups "Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email to [email protected].
To view this discussion on the web visit https://groups.google.com/d/msgid/protobuf/a54d36c5-06b1-41be-b6b5-c7b1644600aen%40googlegroups.com.
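To make the NULL-enum diagnosis in the thread concrete: protoc 3.6.1 emits the enum value name verbatim into the generated C++ header, so a value literally named NULL becomes `NULL = 24,`, which the C++ preprocessor rewrites to `__null = 24,` before the compiler sees it, producing exactly the "expected identifier before '__null'" error shown above. Java has no NULL macro, which is why the server side compiles fine. A sketch of a schema-side workaround (the enum and value names here are illustrative assumptions, not taken from the actual proto files):

```proto
syntax = "proto3";

// Renaming only changes the generated identifier; the wire format uses
// the numeric value (24), so a renamed C++ client stays wire-compatible
// with a Java server whose schema still calls the value NULL.
enum Rep {
  PRIMITIVE = 0;   // placeholder first value (proto3 requires a zero value)
  // NULL = 24;    // collides with the C++ NULL macro under protoc 3.6.1
  NULL_VALUE = 24; // safe identifier in generated C++
}
```

That said, since the thread reports that 3.13.0 compiles the same schema without this error, staying on 3.13.0 as the maintainer suggests is the simpler route; the wire format is stable across these versions, so a 3.13.0 C++ client can still talk to a 3.6.1 Java server.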
