[protobuf] Re: How to Generate Header Files for Protobuf C++?

2024-09-17 Thread 'Aldrin' via Protocol Buffers
runtime_version.h ships with protobuf itself; it is not a generated file.

Assume I installed protobuf to INSTALL_PREFIX:
   >> find ${INSTALL_PREFIX}/include -name 'runtime_version.h'
   ${INSTALL_PREFIX}/include/google/protobuf/runtime_version.h

You can find the source directory in the protobuf repo [1].

[1]: https://github.com/protocolbuffers/protobuf/blob/v28.1/src/google/protobuf/runtime_version.h
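
As a sanity check that generated code, the installed headers, and the linked runtime all agree, here is a minimal, hedged sketch. The addressbook.pb.h header and tutorial::AddressBook type are just the tutorial-style examples standing in for your own protos, and the compile command in the comment is an assumption to adapt to your setup:

```
// Minimal sketch: the generated header pulls in installed protobuf headers
// (such as google/protobuf/runtime_version.h), so ${INSTALL_PREFIX}/include
// must be on the include path, e.g. (assumed command, adjust as needed):
//   g++ -std=c++17 main.cc addressbook.pb.cc \
//       -I${INSTALL_PREFIX}/include $(pkg-config --cflags --libs protobuf)
#include "addressbook.pb.h"  // hypothetical output of `protoc --cpp_out=.`

int main() {
  // Long-standing macro from the C++ tutorial: checks that the headers we
  // compiled against are compatible with the runtime we linked.
  GOOGLE_PROTOBUF_VERIFY_VERSION;

  tutorial::AddressBook book;      // hypothetical message type
  (void)book.SerializeAsString();  // exercises the generated code

  google::protobuf::ShutdownProtobufLibrary();
  return 0;
}
```
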
On Sunday, September 15, 2024 at 9:17:15 AM UTC-7 Dương Lê wrote:

> Hello everyone,
> I'm currently working with Protobuf in C++ and I'm trying to generate the 
> necessary header files, such as runtime_version.h. I've followed the 
> instructions in the CMake README, but unfortunately, I haven't had any 
> success.
> Could someone provide guidance on how to properly create these header 
> files? Any tips or examples would be greatly appreciated!
> Thank you in advance for your help!
>



[protobuf] Re: [C++] Linking against generated protobuf files from multiple projects

2024-09-17 Thread 'Aldrin' via Protocol Buffers
Thank you, this is solved now.

On Thursday, September 12, 2024 at 1:08:13 PM UTC-7 Aldrin wrote:

> Hello!
>
> I am trying to figure out how to make this work and hitting issues with 
> every approach I have tried.
>
> The core of my scenario can be seen in my Gist, "protobuf dependency 
> graph". I have my own project, mohair (we can call it "A"), and I am 
> trying to link against duckdb (we can call it "B"); both use substrait 
> (we can call it "ProtoLib") and arrow (we can call it "C").
>
> I am hoping to find some reasonable solution to the above that either uses 
> only cmake (used by duckdb), or is build-system agnostic (I prefer to use 
> meson). Below, I describe what I have tried and what errors I get.
>
> When building duckdb and arrow together (B + C + ProtoLib), all is well. 
> When building mohair and arrow together (A + C + ProtoLib), all is well. 
> When building mohair and duckdb and arrow together (A + B + C + ProtoLib), 
> I get one of a variety of errors:
> * with duckdb's vendored protobuf sources, I get version mismatch on protoc
> * when updating duckdb's vendored protobuf sources, I get duplicate 
> descriptor errors (sharing a pool without sharing descriptors?)
> * when migrating substrait to its own library, which I link against from 
> duckdb and mohair, I get either: (1) undefined symbol errors (trying to 
> change which library links protobuf) or (2) duplicate descriptor errors
> * when trying to build everything together (in hopes of explicitly reusing 
> the same descriptors), I have various build issues
>
> As far as building everything together, I am customizing duckdb extensions 
> which seem to be required to be out-of-source. However, building it as a 
> meson subproject seemingly requires the extensions to be in-source. The 
> last thing I am going to try is to build everything together using cmake 
> only, but this will not work beyond this prototype and I need a portable 
> solution.
>
> Thank you for any help!
>



[protobuf] Re: Guidance on Installing and Running Protocol Buffers C++ on VS Code

2024-09-13 Thread 'Aldrin' via Protocol Buffers
Some of your questions can be answered in the "how do I start" section [1]. 
To fast-forward your exploration through some of the links:

1. Software Requirements: There aren't any additional tools to *use* 
protobuf (Protocol Buffers). If you're building it, then there are some 
dependencies described in (2) below.
2. Installation Guide: Here is how to install protoc on windows [2]
3. Necessary Commands: "Using protobuf" consists of the following general 
workflow:
A. Use `protoc` to generate source code [4] from protobuf 
definitions [3].
B. Include your generated source code in your project [5]. This is 
essentially putting the ".h" and ".cc" files in your project and treating 
them like normal header and source files.
C. Compile your project. How you do this depends on how you build your 
C++ code. An example using a Makefile is available in the protobuf repo [6].

As you can maybe guess, generating source code has to happen when you 
change your protobuf definitions (protocol definition), but isn't necessary 
otherwise. Every time you compile your project, though, you need to link 
against the protobuf library. The example I reference above uses pkg-config 
which looks like this:

> >> pkg-config --cflags --libs protobuf
> -DPROTOBUF_USE_DLLS -Wno-float-conversion -Wno-implicit-float-conversion 
-Wno-implicit-int-float-conversion -Wno-unknown-warning-option -DNOMINMAX 
-I... -L... ... -labsl_log_severity

Note that I omitted a whole lot of link flags and details, and a whole lot 
of other repeated flags (an unfortunate effect of how pkg-config for absl is 
implemented).
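
To make steps (A)-(C) concrete, here is a minimal, hedged sketch. It assumes a hypothetical person.proto (shown in the comment) compiled with `protoc --cpp_out=. person.proto`, then compiled and linked like the Makefile example [6], e.g. `g++ main.cc person.pb.cc $(pkg-config --cflags --libs protobuf)`:

```
// Assumed person.proto (hypothetical, for illustration only):
//   syntax = "proto3";
//   message Person {
//     string name = 1;
//     int32 id = 2;
//   }
// Generated with: protoc --cpp_out=. person.proto
#include <iostream>
#include <string>

#include "person.pb.h"  // hypothetical generated header (step A)

int main() {
  Person original;          // generated class included in the project (step B)
  original.set_name("Ada");
  original.set_id(42);

  // Serialize to the binary wire format...
  std::string wire;
  original.SerializeToString(&wire);

  // ...and parse it back into a fresh message.
  Person parsed;
  if (!parsed.ParseFromString(wire)) {
    std::cerr << "parse failed\n";
    return 1;
  }
  std::cout << parsed.name() << " #" << parsed.id() << "\n";
  return 0;
}
```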


[1]: https://protobuf.dev/#how-do-i-start
[2]: https://github.com/protocolbuffers/protobuf/tree/main/src#c-protobuf---windows
[3]: https://protobuf.dev/getting-started/cpptutorial/#protocol-format
[4]: https://protobuf.dev/getting-started/cpptutorial/#compiling-protocol-buffers
[5]: https://protobuf.dev/getting-started/cpptutorial/#writing-a-message
[6]: https://github.com/protocolbuffers/protobuf/blob/main/examples/Makefile#L47-L49

On Friday, September 13, 2024 at 8:56:32 AM UTC-7 Dương Lê wrote:

> Hello everyone,
> I’m a beginner with Protocol Buffers and I’m trying to set up my 
> environment for C++ in Visual Studio Code. I have a few questions and would 
> greatly appreciate your help:
> 1. Software Requirements: What software do I need to install to use 
> Protocol Buffers with C++? Are there any additional tools I should be aware 
> of?
> 2. Installation Guide: Could anyone share a detailed guide on how to 
> install Protocol Buffers on Windows?
> 3. Necessary Commands: After installation, what commands do I need to run 
> to compile and execute a program using Protocol Buffers in C++?
> I really appreciate any assistance you can provide. Thank you so much!
>



[protobuf] Re: Suppressing invalid UTF-8 data warnings?

2024-09-13 Thread Florian Suri-Payer
I was currently using `syntax = "proto2";`.
I had already gone ahead with the refactor, so I think it's fine.

Thanks again Em,
Florian
On Tuesday, September 10, 2024 at 5:11:39 PM UTC-4 Em Rauch wrote:

> I *think* if you use a proto2 syntax message it actually will not perform 
> this check as of today (only for proto3 syntax files).
>
> If that's not right, I unfortunately suspect the only way around it would 
> be to vendor the protobuf runtime into your codebase and comment out the 
> check / log if it's bothering you.
> On Friday, September 6, 2024 at 11:43:28 AM UTC-4 fs...@cornell.edu wrote:
>
>> Thank you for the detailed answer Em, I really appreciate it!
>>
>> Good to know the warning can probably be ignored for now. I've opted to 
>> do the repeated option for now to avoid my logs being drowned in the 
>> warnings... I take it there is no way to suppress warnings?
>>
>> Best,
>> Florian
>>
>> On Thursday, September 5, 2024 at 5:19:00 PM UTC-4 Em Rauch wrote:
>>
>>> Using non-UTF8 data in a string field should be understood as incorrect, 
>>> but realistically will work today as long as your messages are only used 
>>> exactly by C++ Protobuf on the current release of protobuf and only ever 
>>> with the binary wire format (not textproto or JSON encoding, etc).
>>>
>>> Today the malformed utf8 enforcement exists to different degrees in the 
>>> different languages (and even depending on the syntax of the .proto file), 
>>> but it's not semantically intended that a `string` field should be used for 
>>> non-utf8 data in any language. It should be assumed that a serialized 
>>> message with a map where the keys are non-utf8 may start to 
>>> parse-fail in some future release of Protobuf.
>>>
>>> Unfortunately bytes as a map key isn't allowed due to obscure technical 
>>> concerns related to some non-C++ languages and the JSON representation, and 
>>> we don't have an immediate plan to relax that.
>>>
>>> Realistically your options are:
>>> - Keep doing what you're doing, only ever keep these messages in C++ and 
>>> binary wire encoding, ignore the warnings, and know that it might stop 
>>> working in a future release of protobuf.
>>> - Make your key data be valid utf8 strings instead (e.g., use a base64 
>>> encoding of the digest instead of the raw digest bytes).
>>> - Use a repeated message with a key and a value field instead of a map, 
>>> and use your own struct as the in-memory representation when processing 
>>> (move the data into/out of an STL map at the parse/serialization 
>>> boundaries instead).
>>>
>>> Sorry there's not a more trivial fix available for this use case!
>>>
>>> On Thursday, September 5, 2024 at 5:03:03 PM UTC-4 fs...@cornell.edu 
>>> wrote:
>>>
 Hi,

 I've been using protobuf 3.5.1 in c++ and am using a message type with 
 the following map type: `map txns = 1`

 It is my understanding that `string` and `bytes` are the same in proto 
 c++; for maps however one can only use `string` as keys. I'm using the key 
 field to send around transaction digests which are byte strings consisting 
 of cryptographic hashes. As far as I can tell, it makes no difference 
 whether I use strings/bytes (the decoding works), yet I keep getting the 
 error:
  
  `String field 'pequinstore.proto.MergedSnapshot.MergedTxnsEntry.key' 
 contains invalid UTF-8 data when serializing a protocol buffer. Use the 
 'bytes' type if you intend to send raw bytes.`

 I understand the error is complaining about my digests possibly not 
 being UTF-8, but I'm unsure if I actually need to be concerned about it; I 
 have not noticed any problems with parsing. Is there a way to suppress 
 this 
 error?

 Or, if this is a serious error that could lead to non-deterministic 
 behavior, do you have a suggested workaround? There is a lot of existing 
 code that uses the map structure akin to an STL map, so I'd like to avoid 
 re-factoring the protobuf into a repeated field if possible. 

 Thanks,
 Florian

>>>



[protobuf] Re: Suppressing invalid UTF-8 data warnings?

2024-09-10 Thread 'Em Rauch' via Protocol Buffers
I *think* if you use a proto2 syntax message it actually will not perform 
this check as of today (only for proto3 syntax files).

If that's not right, I unfortunately suspect the only way around it would 
be to vendor the protobuf runtime into your codebase and comment out the 
check / log if it's bothering you.
On Friday, September 6, 2024 at 11:43:28 AM UTC-4 fs...@cornell.edu wrote:

> Thank you for the detailed answer Em, I really appreciate it!
>
> Good to know the warning can probably be ignored for now. I've opted to do 
> the repeated option for now to avoid my logs being drowned in the 
> warnings... I take it there is no way to suppress warnings?
>
> Best,
> Florian
>
> On Thursday, September 5, 2024 at 5:19:00 PM UTC-4 Em Rauch wrote:
>
>> Using non-UTF8 data in a string field should be understood as incorrect, 
>> but realistically will work today as long as your messages are only used 
>> exactly by C++ Protobuf on the current release of protobuf and only ever 
>> with the binary wire format (not textproto or JSON encoding, etc).
>>
>> Today the malformed utf8 enforcement exists to different degrees in the 
>> different languages (and even depending on the syntax of the .proto file), 
>> but it's not semantically intended that a `string` field should be used for 
>> non-utf8 data in any language. It should be assumed that a serialized 
>> message with a map where the keys are non-utf8 may start to 
>> parse-fail in some future release of Protobuf.
>>
>> Unfortunately bytes as a map key isn't allowed due to obscure technical 
>> concerns related to some non-C++ languages and the JSON representation, and 
>> we don't have an immediate plan to relax that.
>>
>> Realistically your options are:
>> - Keep doing what you're doing, only ever keep these messages in C++ and 
>> binary wire encoding, ignore the warnings, and know that it might stop 
>> working in a future release of protobuf.
>> - Make your key data be valid utf8 strings instead (e.g., use a base64 
>> encoding of the digest instead of the raw digest bytes).
>> - Use a repeated message with a key and a value field instead of a map, 
>> and use your own struct as the in-memory representation when processing 
>> (move the data into/out of an STL map at the parse/serialization 
>> boundaries instead).
>>
>> Sorry there's not a more trivial fix available for this use case!
>>
>> On Thursday, September 5, 2024 at 5:03:03 PM UTC-4 fs...@cornell.edu 
>> wrote:
>>
>>> Hi,
>>>
>>> I've been using protobuf 3.5.1 in c++ and am using a message type with 
>>> the following map type: `map txns = 1`
>>>
>>> It is my understanding that `string` and `bytes` are the same in proto 
>>> c++; for maps however one can only use `string` as keys. I'm using the key 
>>> field to send around transaction digests which are byte strings consisting 
>>> of cryptographic hashes. As far as I can tell, it makes no difference 
>>> whether I use strings/bytes (the decoding works), yet I keep getting the 
>>> error:
>>>  
>>>  `String field 'pequinstore.proto.MergedSnapshot.MergedTxnsEntry.key' 
>>> contains invalid UTF-8 data when serializing a protocol buffer. Use the 
>>> 'bytes' type if you intend to send raw bytes.`
>>>
>>> I understand the error is complaining about my digests possibly not 
>>> being UTF-8, but I'm unsure if I actually need to be concerned about it; I 
>>> have not noticed any problems with parsing. Is there a way to suppress this 
>>> error?
>>>
>>> Or, if this is a serious error that could lead to non-deterministic 
>>> behavior, do you have a suggested workaround? There is a lot of existing 
>>> code that uses the map structure akin to an STL map, so I'd like to avoid 
>>> re-factoring the protobuf into a repeated field if possible. 
>>>
>>> Thanks,
>>> Florian
>>>
>>



[protobuf] Re: Performance Improvement with Protobuf and UPB in Python and C++

2024-09-10 Thread 'Tony Liao' via Protocol Buffers
Hi Evan,

Performance is something that will depend on the nature of your workload 
quite a lot, so it's hard to give advice without knowing more detail. Here 
are some recommendations that we tend to give:

   - Using arenas can reduce the cost of memory allocation. This can be 
   especially impactful during parsing (deserializing); see the sketch 
   after this list.
  - https://protobuf.dev/reference/cpp/arenas/
   - Protobuf reflection is slow -- try to avoid it in performance 
   sensitive code.
   - string fields are UTF-8 validated. If you don't need UTF-8 validation, 
   prefer to use bytes instead.
   - Deeply nested messages can result in a lot of pointer indirection. In 
   some cases, you might be able to come up with a wire-compatible message 
   format that is shallower. However, this may come at the cost of flexibility.
   - Serialization performance is O(number of fields). If you have a large 
   message with thousands of fields, serializing it can be noticeably 
   slower than serializing a message with a handful of fields.
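
A minimal, hedged sketch of the arena recommendation above, assuming a hypothetical generated message type MyMessage (from your own .proto) and a batch of serialized payloads:

```
#include <string>
#include <vector>

#include <google/protobuf/arena.h>

#include "my_message.pb.h"  // hypothetical generated header

void ParseBatch(const std::vector<std::string>& payloads) {
  // One arena for the whole batch: messages (and, where supported, their
  // sub-objects) are allocated from it and released all at once when the
  // arena goes out of scope, instead of being freed one by one.
  google::protobuf::Arena arena;
  for (const std::string& payload : payloads) {
    MyMessage* msg = google::protobuf::Arena::Create<MyMessage>(&arena);
    if (!msg->ParseFromString(payload)) continue;
    // ... read fields from *msg; do not delete it, the arena owns it ...
  }
}
```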

We also have best practices published here: 
https://protobuf.dev/programming-guides/dos-donts/. It's not focused on 
performance though.

And please keep in mind that we are continually evolving the protobuf 
implementation, so anything we're saying here isn't a substitute for 
running your own benchmarks. :)

Cheers,
Tony

On Monday, September 9, 2024 at 5:23:37 PM UTC-4 joe...@gmail.com wrote:

> Hi Tony,
>
> Thank you for the clarification! 
>
> Do you have any recommendations or best practices for improving 
> serialization/deserialization performance specifically for Protobuf in C++? 
> I'd appreciate any insights or optimizations you or anyone can suggest.
>
> Best regards,  
> Evan
>
> On Monday, September 9, 2024 at 1:52:01 PM UTC-7 Tony Liao wrote:
>
>> Hi Evan,
>>
>> The protobuf C++ implementation is separate from UPB and has its own set 
>> of optimizations driven by Google's own workloads.
>>
>> I can say that for Google's C++ workloads, we have not observed better 
>> performance with UPB. For this reason, we are still shipping with the 
>> native implementation for C++.
>>
>> -Tony
>>
>> On Friday, September 6, 2024 at 7:08:09 PM UTC-4 joe...@gmail.com wrote:
>>
>>> Hi everyone,
>>>
>>> We're currently using Protobuf 3.14 with C++ and Python in our project 
>>> and are looking to improve serialization/deserialization performance. I 
>>> recently tried Protobuf 3.24 and noticed performance improvements in 
>>> Python, likely due to the use of UPB.
>>>
>>> I have a couple of questions:
>>>
>>> 1. Does UPB also provide performance improvements for C++?
>>> 2. If so, in which version of Protobuf was UPB introduced for C++?
>>>
>>> Thanks in advance for your help!
>>>
>>> Best regards,  
>>> Evan
>>>
>>



[protobuf] Re: Performance Improvement with Protobuf and UPB in Python and C++

2024-09-09 Thread Evan Lu
Hi Tony,

Thank you for the clarification! 

Do you have any recommendations or best practices for improving 
serialization/deserialization performance specifically for Protobuf in C++? 
I'd appreciate any insights or optimizations you or anyone can suggest.

Best regards,  
Evan

On Monday, September 9, 2024 at 1:52:01 PM UTC-7 Tony Liao wrote:

> Hi Evan,
>
> The protobuf C++ implementation is separate from UPB and has its own set 
> of optimizations driven by Google's own workloads.
>
> I can say that for Google's C++ workloads, we have not observed better 
> performance with UPB. For this reason, we are still shipping with the 
> native implementation for C++.
>
> -Tony
>
> On Friday, September 6, 2024 at 7:08:09 PM UTC-4 joe...@gmail.com wrote:
>
>> Hi everyone,
>>
>> We're currently using Protobuf 3.14 with C++ and Python in our project 
>> and are looking to improve serialization/deserialization performance. I 
>> recently tried Protobuf 3.24 and noticed performance improvements in 
>> Python, likely due to the use of UPB.
>>
>> I have a couple of questions:
>>
>> 1. Does UPB also provide performance improvements for C++?
>> 2. If so, in which version of Protobuf was UPB introduced for C++?
>>
>> Thanks in advance for your help!
>>
>> Best regards,  
>> Evan
>>
>



[protobuf] Re: Performance Improvement with Protobuf and UPB in Python and C++

2024-09-09 Thread 'Tony Liao' via Protocol Buffers
Hi Evan,

The protobuf C++ implementation is separate from UPB and has its own set of 
optimizations driven by Google's own workloads.

I can say that for Google's C++ workloads, we have not observed better 
performance with UPB. For this reason, we are still shipping with the 
native implementation for C++.

-Tony

On Friday, September 6, 2024 at 7:08:09 PM UTC-4 joe...@gmail.com wrote:

> Hi everyone,
>
> We're currently using Protobuf 3.14 with C++ and Python in our project and 
> are looking to improve serialization/deserialization performance. I 
> recently tried Protobuf 3.24 and noticed performance improvements in 
> Python, likely due to the use of UPB.
>
> I have a couple of questions:
>
> 1. Does UPB also provide performance improvements for C++?
> 2. If so, in which version of Protobuf was UPB introduced for C++?
>
> Thanks in advance for your help!
>
> Best regards,  
> Evan
>



[protobuf] Re: Configuration of protobuf 3.22.2

2024-09-09 Thread 'Tony Liao' via Protocol Buffers
Hi Nazibur,

Protobuf 2.5.0 is really old -- running ./autogen.sh is part of using 
Autotools (Automake), but we've since migrated to CMake and Bazel. Autotools 
support was turned down in the middle of 2022 -- see PR #10132.

Our preferred way to build would be to use Bazel. If that's not possible, 
CMake should provide wider compatibility with other (non-Google) open 
source projects.

You can find the Bazel configuration in the repository's .bazelrc and 
BUILD.bazel files.

You can find CMake configuration here: 
https://github.com/protocolbuffers/protobuf/tree/main/cmake

-Tony

On Thursday, September 5, 2024 at 3:32:35 AM UTC-4 nazib...@gmail.com wrote:

> Hi All,
>
> We are using protobuf 2.5.0 in our project and we want to upgrade to 
> protobuf v3.22.2, but we didn't find any configure or autogen.sh file 
> present in v3.22.2. Without either of these files, how do we configure it?
>
> It would be helpful if you could share the configuration docs, if available.
>
> Regards,
> Nazibur Rahman
>



[protobuf] Re: Suppressing invalid UTF-8 data warnings?

2024-09-06 Thread Florian Suri-Payer
Thank you for the detailed answer Em, I really appreciate it!

Good to know the warning can probably be ignored for now. I've opted to do 
the repeated option for now to avoid my logs being drowned in the 
warnings... I take it there is no way to suppress warnings?

Best,
Florian

On Thursday, September 5, 2024 at 5:19:00 PM UTC-4 Em Rauch wrote:

> Using non-UTF8 data in a string field should be understood as incorrect, 
> but realistically will work today as long as your messages are only used 
> exactly by C++ Protobuf on the current release of protobuf and only ever 
> with the binary wire format (not textproto or JSON encoding, etc).
>
> Today the malformed utf8 enforcement exists to different degrees in the 
> different languages (and even depending on the syntax of the .proto file), 
> but it's not semantically intended that a `string` field should be used for 
> non-utf8 data in any language. It should be assumed that a serialized 
> message with a map where the keys are non-utf8 may start to 
> parse-fail in some future release of Protobuf.
>
> Unfortunately bytes as a map key isn't allowed due to obscure technical 
> concerns related to some non-C++ languages and the JSON representation, and 
> we don't have an immediate plan to relax that.
>
> Realistically your options are:
> - Keep doing what you're doing, only ever keep these messages in C++ and 
> binary wire encoding, ignore the warnings, and know that it might stop 
> working in a future release of protobuf.
> - Make your key data be valid utf8 strings instead (e.g., use a base64 
> encoding of the digest instead of the raw digest bytes).
> - Use a repeated message with a key and a value field instead of a map, 
> and use your own struct as the in-memory representation when processing 
> (move the data into/out of an STL map at the parse/serialization 
> boundaries instead).
>
> Sorry there's not a more trivial fix available for this use case!
>
> On Thursday, September 5, 2024 at 5:03:03 PM UTC-4 fs...@cornell.edu 
> wrote:
>
>> Hi,
>>
>> I've been using protobuf 3.5.1 in c++ and am using a message type with 
>> the following map type: `map txns = 1`
>>
>> It is my understanding that `string` and `bytes` are the same in proto 
>> c++; for maps however one can only use `string` as keys. I'm using the key 
>> field to send around transaction digests which are byte strings consisting 
>> of cryptographic hashes. As far as I can tell, it makes no difference 
>> whether I use strings/bytes (the decoding works), yet I keep getting the 
>> error:
>>  
>>  `String field 'pequinstore.proto.MergedSnapshot.MergedTxnsEntry.key' 
>> contains invalid UTF-8 data when serializing a protocol buffer. Use the 
>> 'bytes' type if you intend to send raw bytes.`
>>
>> I understand the error is complaining about my digests possibly not being 
>> UTF-8, but I'm unsure if I actually need to be concerned about it; I have 
>> not noticed any problems with parsing. Is there a way to suppress this 
>> error?
>>
>> Or, if this is a serious error that could lead to non-deterministic 
>> behavior, do you have a suggested workaround? There is a lot of existing 
>> code that uses the map structure akin to an STL map, so I'd like to avoid 
>> re-factoring the protobuf into a repeated field if possible. 
>>
>> Thanks,
>> Florian
>>
>



[protobuf] Re: Suppressing invalid UTF-8 data warnings?

2024-09-05 Thread 'Em Rauch' via Protocol Buffers
Using non-UTF8 data in a string field should be understood as incorrect, 
but realistically will work today as long as your messages are only used 
exactly by C++ Protobuf on the current release of protobuf and only ever 
with the binary wire format (not textproto or JSON encoding, etc).

Today the malformed utf8 enforcement exists to different degrees in the 
different languages (and even depending on the syntax of the .proto file), 
but it's not semantically intended that a `string` field should be used for 
non-utf8 data in any language. It should be assumed that a serialized 
message with a map where the keys are non-utf8 may start to 
parse-fail in some future release of Protobuf.

Unfortunately bytes as a map key isn't allowed due to obscure technical 
concerns related to some non-C++ languages and the JSON representation, and 
we don't have an immediate plan to relax that.

Realistically your options are:
- Keep doing what you're doing, only ever keep these messages in C++ and 
binary wire encoding, ignore the warnings, and know that it might stop 
working in a future release of protobuf.
- Make your key data be valid utf8 strings instead (e.g., use a base64 
encoding of the digest instead of the raw digest bytes).
- Use a repeated message with a key and a value field instead of a map, 
and use your own struct as the in-memory representation when processing 
(move the data into/out of an STL map at the parse/serialization 
boundaries instead); a brief sketch follows below.

Sorry there's not a more trivial fix available for this use case!
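
For the third option, a minimal, hedged sketch of what the schema change and the parse/serialize boundary could look like. TxnEntry, TxnData and MergedSnapshot are hypothetical stand-ins for the poster's own messages, with the standard generated accessors:

```
// Assumed .proto replacing the map field (hypothetical):
//   message TxnEntry       { bytes key = 1; TxnData value = 2; }
//   message MergedSnapshot { repeated TxnEntry txns = 1; }
#include <map>
#include <string>

#include "merged_snapshot.pb.h"  // hypothetical generated header

// Parse boundary: rebuild the STL map the rest of the code already uses.
std::map<std::string, TxnData> ToMap(const MergedSnapshot& snapshot) {
  std::map<std::string, TxnData> txns;
  for (const TxnEntry& entry : snapshot.txns()) {
    txns[entry.key()] = entry.value();
  }
  return txns;
}

// Serialize boundary: flatten the STL map back into the repeated field.
void FromMap(const std::map<std::string, TxnData>& txns,
             MergedSnapshot* out) {
  out->clear_txns();
  for (const auto& [digest, data] : txns) {
    TxnEntry* entry = out->add_txns();
    entry->set_key(digest);          // `bytes` field, so raw digests are fine
    *entry->mutable_value() = data;  // copy the value message
  }
}
```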

On Thursday, September 5, 2024 at 5:03:03 PM UTC-4 fs...@cornell.edu wrote:

> Hi,
>
> I've been using protobuf 3.5.1 in c++ and am using a message type with the 
> following map type: `map txns = 1`
>
> It is my understanding that `string` and `bytes` are the same in proto 
> c++; for maps however one can only use `string` as keys. I'm using the key 
> field to send around transaction digests which are byte strings consisting 
> of cryptographic hashes. As far as I can tell, it makes no difference 
> whether I use strings/bytes (the decoding works), yet I keep getting the 
> error:
>  
>  `String field 'pequinstore.proto.MergedSnapshot.MergedTxnsEntry.key' 
> contains invalid UTF-8 data when serializing a protocol buffer. Use the 
> 'bytes' type if you intend to send raw bytes.`
>
> I understand the error is complaining about my digests possibly not being 
> UTF-8, but I'm unsure if I actually need to be concerned about it; I have 
> not noticed any problems with parsing. Is there a way to suppress this 
> error?
>
> Or, if this is a serious error that could lead to non-deterministic 
> behavior, do you have a suggested workaround? There is a lot of existing 
> code that uses the map structure akin to an STL map, so I'd like to avoid 
> re-factoring the protobuf into a repeated field if possible. 
>
> Thanks,
> Florian
>



[protobuf] Re: Debugging proto file already registered issue (health.proto)

2024-08-21 Thread Daz Wilkin
I think this is a TensorFlow problem and you may be better placed asking in 
those forums.

To my knowledge, the regular protocol buffer tools don't use a database.
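
If it helps, here is a minimal, hedged sketch (not an official gRPC/protobuf mechanism) that asks the C++ generated descriptor pool whether health.proto is present in a binary that does start up. It only confirms registration; it cannot name the library that registered the file:

```
#include <iostream>

#include <google/protobuf/descriptor.h>

int main() {
  // Every generated .pb.cc linked into the binary registers its file with
  // the generated descriptor pool; duplicate registrations are what trigger
  // "File already exists in database".
  const google::protobuf::FileDescriptor* file =
      google::protobuf::DescriptorPool::generated_pool()->FindFileByName(
          "health.proto");
  if (file != nullptr) {
    std::cout << "health.proto is registered (package: " << file->package()
              << ")\n";
  } else {
    std::cout << "health.proto is not registered\n";
  }
  return 0;
}
```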

On Tuesday, August 20, 2024 at 4:18:04 PM UTC-7 Ashish Tripathi wrote:

> Hi,
>I am currently encountering an issue where the build reports a failure 
> with the message: "File already exists in database: health.proto." This 
> error seems to suggest that the health.proto file is being registered 
> multiple times, possibly by different libraries. (error snippet below)
> File already exists in database: health.proto
> CHECK failed: GeneratedDatabase()->Add(encoded_file_descriptor, size):
> To address this, I am wondering if there is any gRPC environment variable 
> or other mechanism that can be utilized to determine which libraries are 
> registering this proto file. Identifying the source of the conflict would 
> greatly assist in resolving the issue.
>
> For example, something like this (pseudocode):
> panic: proto: file "error.proto" is already registered
> previously from: "goa.design/goa/v3/grpc/pb"
> currently from: "github.com/googleapis/gax-go/v2/apierror/internal/proto"
>
> Any guidance or recommendations you could provide would be greatly 
> appreciated.
>
> Thank you,
> Ashish
>



[protobuf] Re: How to properly do static linking within CMake project

2024-08-19 Thread Claude Robitaille
I forgot that this is required in my main project's top-level CMakeLists.txt:

find_package(protobuf REQUIRED PATHS 
$ENV{INSTALL}/protobuf/lib/cmake/protobuf)
include($ENV{INSTALL}/protobuf/lib/cmake/protobuf/protobuf-module.cmake)

Also, I am on Linux but I do not see why it would be different on macOS.


On Monday, August 19, 2024 at 9:46:38 a.m. UTC-4 Claude Robitaille wrote:

> I have been using protobuf with static linking for years and moved to 
> cmake a few months ago.
>
> Since I always cross-compile I never use the system installed library and 
> tools so my flow is this
> 1 - Build, using cmake anything that must be using the local compiler (in 
> the case of protobuf because protoc is needed for later). ATM it is only 
> protobuf
> 2 - Build all the dependencies, also using cmake
> 3 - Build my project, including internal libraries.
>
> For the first step, ie building protobuf and produce a static archive (a 
> file ending with .a) I use this excerpt from the corresponding cmake:
> include(ExternalProject)
>
> ExternalProject_Add(
> protobuf
> GIT_REPOSITORY https://github.com/protocolbuffers/protobuf.git
> GIT_TAG v3.20.3
> SOURCE_SUBDIR cmake
> CMAKE_ARGS
> -Dprotobuf_BUILD_TESTS=OFF
> -Dprotobuf_BUILD_EXAMPLES=OFF
> -DCMAKE_INSTALL_PREFIX=$ENV{INSTALL}/protobuf
> -Dprotobuf_BUILD_CONFORMANCE=OFF
> -Dprotobuf_BUILD_EXAMPLES=OFF
> -Dprotobuf_BUILD_LIBPROTOC=ON
> -Dprotobuf_BUILD_SHARED_LIBS=OFF
> -Dprotobuf_MSVC_STATIC_RUNTIME=OFF
> -Dprotobuf_WITH_ZLIB=OFF
> BUILD_BYPRODUCTS
> $ENV{INSTALL}/lib/libprotobuf.a
> $ENV{INSTALL}/bin/protoc
> )
>
> Note that the install prefix comes from an environment variable. It is 
> normally set to ./build  In other word, protobuf and the cmake files are in 
> a child directory off the main build location (which is used by step 3 
> above). Also, SHARED_LIBS is Off meaning that a static library is created.
>
> And then in my main project (so step 3 above), I use this to process my 
> proto files (I use to only deal with c++ hence the name; I added python at 
> a later stage):
>
> set(GLOBAL_PROTO_FILES "" CACHE INTERNAL "")
>
> # Define the append function used by other CMakeList files
> function(append_proto_file filename)
> set(GLOBAL_PROTO_FILES ${GLOBAL_PROTO_FILES} ${filename} CACHE 
> INTERNAL "")
> endfunction()
>
> #
> # The proto file processing function
> #
>
> function(CREATE_CPP_PROTO SRCS HDRS)
> list(REMOVE_DUPLICATES GLOBAL_PROTO_FILES)
> 
> set(${SRCS})
> set(${HDRS})   
>
> #printList(VAR ${PROTO_SRC_TREE} TITLE "Proto location location")
> #printList(VAR ${GLOBAL_PROTO_FILES} TITLE "Proto files")
>   
> add_library(proto-objects OBJECT ${GLOBAL_PROTO_FILES})
> target_link_libraries(proto-objects PUBLIC protobuf::libprotobuf)
> target_include_directories(proto-objects PUBLIC 
> "$")
> 
> make_directory($ENV{GENERATED})
> make_directory($ENV{GENERATED}/python)
> 
> # This is a function provided by the cmake binding for protobuf 
> # C++
> protobuf_generate(
>   TARGET proto-objects
>   LANGUAGE cpp
>   PROTOC_OUT_DIR $ENV{GENERATED}
>   IMPORT_DIRS ${MODELS_DIR};${PROTO_SRC_TREE}
>   #IMPORT_DIRS ${MODELS_DIR}
>   #PROTOC_OPTIONS "--cpp_opt=paths=source_relative"
>   #PROTOC_OPTIONS "--proto_path=${MODELS_DIR}"
> )
> 
> # Python
> protobuf_generate(
>   TARGET proto-objects
>   LANGUAGE python
>   PROTOC_OUT_DIR $ENV{GENERATED}/python
>   IMPORT_DIRS ${MODELS_DIR};${PROTO_SRC_TREE}
> )
>
> target_include_directories(proto-objects PRIVATE $ENV{GENERATED})
>
> # Probably useless but other targets depends on it, so we keep it but 
> it does nothing. In future we could probably remove
> # reference to it. 
> add_custom_target(PROTO_FILES_READY
>   COMMENT "Moving .pb.h and _pb2.py files to replicate source 
> directory structure"
> )  
>   
> set_source_files_properties(${${SRCS}} ${${HDRS}} PROPERTIES GENERATED 
> TRUE)
> set(${SRCS} ${${SRCS}} PARENT_SCOPE)
> set(${HDRS} ${${HDRS}} PARENT_SCOPE)
> endfunction()
>
>
>
> On Monday, August 19, 2024 at 2:32:25 a.m. UTC-4 Marcin Lewandowski wrote:
>
>> Hello,
>>
>> I am working on a library where I would use protobuf. 
>>
>> I need to link it statically. The target platform is macOS. 
>>
>> The usual way of fetching dependencies (brew install protobuf) is no go, 
>> as it comes only with shared libraries.
>>
>> I tried forcing homebrew to rebuild protobuf (brew install protobuf 
>> --build-from-source) but it fails as some tests are not passing (logs [1] 
>> at the end of this message).
>>
>> I fetched the source code (27.3) release from github, run 
>>
>> bazel build :portico :protobuf 
>>
>> inside the source directory, but while it compiles protoc compiler, 
>> there's no CMake/protobuf-lite library in the output.
>>
>>

[protobuf] Re: How to properly do static linking within CMake project

2024-08-19 Thread Claude Robitaille
I have been using protobuf with static linking for years and moved to cmake 
a few months ago.

Since I always cross-compile, I never use the system-installed library and 
tools, so my flow is this:
1 - Build, using cmake, anything that must use the local compiler (in the 
case of protobuf, because protoc is needed later). At the moment it is only 
protobuf.
2 - Build all the dependencies, also using cmake.
3 - Build my project, including internal libraries.

For the first step, i.e. building protobuf and producing a static archive (a 
file ending in .a), I use this excerpt from the corresponding cmake:
include(ExternalProject)

ExternalProject_Add(
protobuf
GIT_REPOSITORY https://github.com/protocolbuffers/protobuf.git
GIT_TAG v3.20.3
SOURCE_SUBDIR cmake
CMAKE_ARGS
-Dprotobuf_BUILD_TESTS=OFF
-Dprotobuf_BUILD_EXAMPLES=OFF
-DCMAKE_INSTALL_PREFIX=$ENV{INSTALL}/protobuf
-Dprotobuf_BUILD_CONFORMANCE=OFF
-Dprotobuf_BUILD_EXAMPLES=OFF
-Dprotobuf_BUILD_LIBPROTOC=ON
-Dprotobuf_BUILD_SHARED_LIBS=OFF
-Dprotobuf_MSVC_STATIC_RUNTIME=OFF
-Dprotobuf_WITH_ZLIB=OFF
BUILD_BYPRODUCTS
$ENV{INSTALL}/lib/libprotobuf.a
$ENV{INSTALL}/bin/protoc
)

Note that the install prefix comes from an environment variable. It is 
normally set to ./build. In other words, protobuf and the cmake files are in 
a child directory off the main build location (which is used by step 3 
above). Also, SHARED_LIBS is OFF, meaning that a static library is created.

And then in my main project (so step 3 above), I use this to process my 
proto files (I used to only deal with C++, hence the name; I added Python at 
a later stage):

set(GLOBAL_PROTO_FILES "" CACHE INTERNAL "")

# Define the append function used by other CMakeList files
function(append_proto_file filename)
set(GLOBAL_PROTO_FILES ${GLOBAL_PROTO_FILES} ${filename} CACHE INTERNAL 
"")
endfunction()

#
# The proto file processing function
#

function(CREATE_CPP_PROTO SRCS HDRS)
list(REMOVE_DUPLICATES GLOBAL_PROTO_FILES)

set(${SRCS})
set(${HDRS})   

#printList(VAR ${PROTO_SRC_TREE} TITLE "Proto location location")
#printList(VAR ${GLOBAL_PROTO_FILES} TITLE "Proto files")
  
add_library(proto-objects OBJECT ${GLOBAL_PROTO_FILES})
target_link_libraries(proto-objects PUBLIC protobuf::libprotobuf)
target_include_directories(proto-objects PUBLIC 
"$")

make_directory($ENV{GENERATED})
make_directory($ENV{GENERATED}/python)

# This is a function provided by the cmake binding for protobuf 
# C++
protobuf_generate(
  TARGET proto-objects
  LANGUAGE cpp
  PROTOC_OUT_DIR $ENV{GENERATED}
  IMPORT_DIRS ${MODELS_DIR};${PROTO_SRC_TREE}
  #IMPORT_DIRS ${MODELS_DIR}
  #PROTOC_OPTIONS "--cpp_opt=paths=source_relative"
  #PROTOC_OPTIONS "--proto_path=${MODELS_DIR}"
)

# Python
protobuf_generate(
  TARGET proto-objects
  LANGUAGE python
  PROTOC_OUT_DIR $ENV{GENERATED}/python
  IMPORT_DIRS ${MODELS_DIR};${PROTO_SRC_TREE}
)

target_include_directories(proto-objects PRIVATE $ENV{GENERATED})
   
# Probably useless but other targets depends on it, so we keep it but 
it does nothing. In future we could probably remove
# reference to it. 
add_custom_target(PROTO_FILES_READY
  COMMENT "Moving .pb.h and _pb2.py files to replicate source directory 
structure"
)  
  
set_source_files_properties(${${SRCS}} ${${HDRS}} PROPERTIES GENERATED 
TRUE)
set(${SRCS} ${${SRCS}} PARENT_SCOPE)
set(${HDRS} ${${HDRS}} PARENT_SCOPE)
endfunction()



On Monday, August 19, 2024 at 2:32:25 a.m. UTC-4 Marcin Lewandowski wrote:

> Hello,
>
> I am working on a library where I would use protobuf. 
>
> I need to link it statically. The target platform is macOS. 
>
> The usual way of fetching dependencies (brew install protobuf) is no go, 
> as it comes only with shared libraries.
>
> I tried forcing homebrew to rebuild protobuf (brew install protobuf 
> --build-from-source) but it fails as some tests are not passing (logs [1] 
> at the end of this message).
>
> I fetched the source code (27.3) release from github, run 
>
> bazel build :portico :protobuf 
>
> inside the source directory, but while it compiles protoc compiler, 
> there's no CMake/protobuf-lite library in the output.
>
> Apparently the runtime is based on CMake, so I run 
>
> cmake -S . -B build -Dprotobuf_BUILD_TESTS=OFF -DCMAKE_CXX_STANDARD=17 
> -Dprotobuf_ABSL_PROVIDER=package -Dprotobuf_JSONCPP_PROVIDER=package
>
> and then issued compilation in build/ dir, and yay, it compiled protobuf.
>
> However, when I added the build/cmake to CMAKE_PREFIX_PATH it complains 
> about missing build/cmake/protobuf/protobuf-targets.cmake
>
> So I tried adding the repository as git submodule, and adding it via 
> add_subdirectory CMake command, then it fails with
>
> CMake Error: install(EXPORT "protobuf-targets" ...) includes 

[protobuf] Re: Documentation link incorrect

2024-08-15 Thread Daz Wilkin
https://github.com/protocolbuffers/protocolbuffers.github.io/pull/177

On Thursday, August 15, 2024 at 8:50:06 AM UTC-7 Michael Owen wrote:

> Not sure if this is the best place to post this, but as I was looking 
> through the following page:
>
>- https://protobuf.dev/programming-guides/dos-donts/ I noticed that 
>the link for the '1-1-1-Rule', as highlighted in the attached image, is 
>wrong.  
>
> It currently points to: 
>
>- https://protobuf.dev/programming-guides/1-1-1.md
>
> And I think it should point to: 
>
>- https://protobuf.dev/programming-guides/1-1-1/
>
>
>



[protobuf] Re: Dynamic VS Static linking

2024-08-07 Thread saif kandil
Don't bother guys, I just saw new options that can be passed to 
Abseil: 
https://github.com/abseil/abseil-cpp/blob/3848ed7f1b13259a8ddd35dc56bf216ecd36d968/ci/linux_gcc-latest_libstdcxx_cmake.sh#L33

On Wednesday, August 7, 2024 at 3:19:48 PM UTC+3 saif kandil wrote:

> Hello folks,
>
> I created this simple solution to try to link absl, protobuf, and grpc 
> together to see if they work without problems, and I have figured out that 
> static linking works perfectly; however, dynamic linking does not:
>
> Static linking status: 
> https://github.com/k0T0z/absl-proto-grpc-ci/actions/runs/10282329751/job/28453797159
> Dynamic linking status: 
> https://github.com/k0T0z/absl-proto-grpc-ci/actions/runs/10283676833/job/28458035515
>
> The error exists in the job file but here is a version of it:
>
> ```
> [ 53%] Linking CXX shared library libprotobuf.so 
> /usr/bin/ld: 
> /home/runner/work/absl-proto-grpc-ci/absl-proto-grpc-ci/absl-k0t0z-lib/lib/libabsl_log_internal_log_sink_set.a(log_sink_set.cc.o):
>  
> relocation R_X86_64_TPOFF32 against 
> `_ZZN4absl12lts_2024072212log_internal12_GLOBAL__N_121ThreadIsLoggingStatusEvE17thread_is_logging'
>  
> can not be used when making a shared object; recompile with -fPIC 
> /usr/bin/ld: failed to set dynamic section sizes: bad value 
> collect2: error: ld returned 1 exit status 
> make[2]: *** [CMakeFiles/libprotobuf.dir/build.make:1389: 
> libprotobuf.so.28.0.1] Error 1 
> make[1]: *** [CMakeFiles/Makefile2:142: CMakeFiles/libprotobuf.dir/all] 
> Error 2 
> make: *** [Makefile:136: all] Error 2
> ```
>
> Can anyone please explain to me why this happened? I built absl from 
> scratch.
>



[protobuf] Re: compatibility (Python, C++, Go, ...): should newlines (and whitespace in general) in base64 be allowed? (for byte fields in JSON encoding)

2024-07-15 Thread 'Nicolas Hillegeer' via Protocol Buffers
Ping.

On Thursday, July 11, 2024 at 10:51:27 AM UTC+2 Nicolas Hillegeer wrote:

> In the protocolbuffers-go we received a bug (
> https://github.com/golang/protobuf/issues/1626) for our protojson 
> implementation not accepting strings with whitespace (specifically "\n" in 
> this case) in them.
>
> I think I verified that the C++ implementation also doesn't accept them 
> (see my last comment on the issue). We prefer to stay compatible to C++ as 
> much as possible. So the question is: should this be allowed?
>
> Thanks,
> Nicolas with the hat of Go protobuf maintainer
>
>
>







Re: [protobuf] Re: Is using tag numbers in the code - anti-pattern?

2024-07-08 Thread 'Em Rauch' via Protocol Buffers
I think it's hard to tell all the details from the message, but it seems
like you're creating some new form of extensions, perhaps combined with the
known pattern of "overlay messages" (where a client schema is maintained
with a strict subset of the fields of the server-side schema, which permits
round-tripping the data to/from the server without those fields being
included in the client-shipped schema).

It seems the specific pattern that you're describing isn't generally one
known to the protobuf team as having any precedent, so it is unlikely to
have any prior established decision about it being a known antipattern.
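
For what it's worth, here is a minimal, hedged sketch of the client-side lookup described below (reading a payload whose schema the client does not know and pulling out field 11 as bytes via UnknownFieldSet). The function name and the tag number are only the example's, and this is not an endorsement of the pattern either way:

```
#include <optional>
#include <string>

#include <google/protobuf/unknown_field_set.h>

// Parse an opaque payload and return the raw bytes of the length-delimited
// field with the given tag number (11 for `productTitle` in the example).
// The caller can then parse those bytes as the view's own message type.
std::optional<std::string> ExtractFieldBytes(const std::string& payload,
                                             int tag_number) {
  google::protobuf::UnknownFieldSet fields;
  if (!fields.ParseFromArray(payload.data(),
                             static_cast<int>(payload.size()))) {
    return std::nullopt;
  }
  for (int i = 0; i < fields.field_count(); ++i) {
    const google::protobuf::UnknownField& field = fields.field(i);
    if (field.number() == tag_number &&
        field.type() ==
            google::protobuf::UnknownField::TYPE_LENGTH_DELIMITED) {
      return field.length_delimited();
    }
  }
  return std::nullopt;
}
```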


On Thu, Jun 20, 2024 at 1:54 PM Yan Zaitsev  wrote:

> On the client:
> When we recursively parse the View Structure and see some `id` (aka tag
> number) key, we try to find it in the Backend response, like
> `data[tagNumber: 11] as bytes`, and merge it with the current view (the
> Text in the example).
>
> On Thursday, June 20, 2024 at 7:44:06 PM UTC+2 Yan Zaitsev wrote:
>
>> Hi,
>> I am developing a server-driven UI service in my company. *I want to
>> know your opinion on whether our approach is an anti-pattern of protobuf
>> usage* or if it is valid usage even if it is not common.
>>
>> Thank you for your time reading this long message!
>>
>> All our APIs for Client(iOS,Android,Web) <-> backend are based on
>> protobuf models; that's why we want to use protobuf for SDUI too.
>>
>> We want to build a new service that will provide the View structure to
>> the client, which has to be rendered. The client can cache only UI styling
>> and layout, while another service provides the data for it.
>> Client app has to bind server data to view the structure. The client does
>> not know the schema of the server response, except it is protobuf data.
>> View Structure and Backend data are protobuf models and use the same
>> types - see examples below.
>>
>> ```
>> message Text {
>>  string label = 1;
>>   Color color = 2;
>> }
>> message VerticalStack {}
>> message HorizontalStack {}
>>
>> /// Typical View Structure in protobuf representation
>> VerticalStack {
>>   spacing = 20pt
>>   children = [
>>Text {
>>  id: 11
>>  color: .white
>> }
>>HorizontalStack {
>>  children: ...
>>}
>>   ]
>> }
>> ```
>>
>> Another service has to provide data for this view structure.
>>
>> ```
>> message BackendResponse {
>>   Text productTitle = 11;
>> }
>>
>> ```
>>
>> `*BackendResponse*` schema is *NOT known to the client. *
>>
>> We came with idea to *use protobuf tag numbers for binding.*
>> You can see that `Text.id = 11` and `BackendResponse.productTitle = 11`
>>
>> We built the autogenerated interfaces to safely set number 11 to the View
>> structure and ensure that the backend response uses the property with
>> number 11.
>>
>> I was not able to find a similar usage of protobuf structures. We found
>> devs using descriptors and sending whole descriptor models to work with
>> dynamic messages, or using FieldMask for filtering.
>> Using the whole descriptor sounds overkill for this purpose.
>> *The question of this post: even if it is not a common technique, is it
>> known as an anti-pattern?*
>>
>> These binding(tag numbers) numbers are used across the different code
>> stacks:
>>
>>- View protos are known to clients, backend and separate DSL
>>repository (repo with View structures)
>>- Backend models are known to backend and DSL repository
>>
>>- DSL repository is TypeScript code
>>- Backend is Go code
>>- Clients are Swift, Kotlin, TypeScript/JS
>>
>>
>> Can the concept of extensions or custom options help us somehow?
>>
>



[protobuf] Re: Import file issue

2024-07-01 Thread Rachana Deshpande
exports/camera.proto:18:5: "camera.ENUM_VIEWID_NO_VIEW" is already defined 
in file "camera.proto".

Exports is a folder inside Files, and it has a file camera.proto which is 
imported in one of the other protos.
Is this because the files in the imports are the same as the proto files 
found using the find statement?

On Tuesday, July 2, 2024 at 10:55:31 AM UTC+5:30 Rachana Deshpande wrote:

> IMPORTS=$(find Files/ -type d -exec echo -I '{}' \;)
>
> echo "Generating html..."
> find Files/ -name '*.proto' \
> | xargs protoc \
> ${IMPORTS} \
> --proto_path=. \
> --doc_out=. \
> --doc_opt=template.html,index.html
>
> Gives out the already defined error.



[protobuf] Re: CMake project find_package() gives warning about absl missing protobuf::gmock target

2024-06-26 Thread Zbigniew Piecuch
Hello
I have the same problem.
What I've noticed on my machine is that there is a difference between 
*find_package(Protobuf CONFIG REQUIRED)* and *find_package(protobuf CONFIG 
REQUIRED)*. The variant that starts with a lowercase letter contains the 
protobuf::gmock target; however, it is defined in the following way:

# Create imported target protobuf::gmock
add_library(protobuf::gmock STATIC IMPORTED)

So it's only a gmock target in the protobuf namespace.

Still, I get the same error and it doesn't work. I don't know how to 
install gmock to make it work.

BR,
Piecuch Z.

On Saturday, April 6, 2024 at 23:30:22 UTC+2 Evan Wegley wrote:

Protobuf version: v26.1
CMake version: 3.29.0

I built protobuf in a container as follows:

(From the local machine running Docker Desktop)
$ docker pull ubuntu:jammy
$ docker run -it -v /opt/ext:/opt/ext ubuntu:jammy

(From inside the container)
$ apt-get update
$ apt-get install build-essential git cmake
$ cd
$ git clone -b v26.1 https://github.com/protocolbuffers/protobuf.git
$ cd protobuf/
$ git submodule update --init --recursive
$ cmake -DCMAKE_INSTALL_PREFIX:PATH=/opt/ext/protobuf .
$ make -j 4 install

Now I have protobuf installed to /opt/ext/protobuf in my local Linux 
environment (Also Ubuntu 22.04).

In my local environment, I set up a CMake project with the following 
minimal CMakeLists.txt.

cmake_minimum_required(VERSION 3.29)
project(Demo VERSION 1.0)
list(APPEND CMAKE_PREFIX_PATH "/opt/ext/protobuf")
find_package(Protobuf PATHS "/opt/ext/protobuf" CONFIG REQUIRED)

But when I try to build, I get this warning:

CMake Warning at 
/opt/ext/protobuf/lib/cmake/protobuf/protobuf-config.cmake:7 (find_package):
  Found package configuration file:

/opt/ext/protobuf/lib/cmake/absl/abslConfig.cmake

  but it set absl_FOUND to FALSE so package "absl" is considered to be NOT
  FOUND.  Reason given by package:

  The following imported targets are referenced, but are missing:
  protobuf::gmock

Call Stack (most recent call first):
  CMakeLists.txt:4 (find_package)


Is this a bug, or is there something wrong with my usage?

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/48f85b9b-ba59-46bf-a67b-84549d8e00e3n%40googlegroups.com.


[protobuf] Re: Need suggestion on protobuf

2024-06-21 Thread Daz Wilkin
Cross-posted on Stack overflow where I added a comment:

https://stackoverflow.com/questions/78652218

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/1fd39496-b70c-4718-9dbe-e222fe0150c7n%40googlegroups.com.


[protobuf] Re: Is using tag numbers in the code - anti-pattern?

2024-06-20 Thread Yan Zaitsev
On the client:
When we recursively parse the View Structure and see some `id` (aka tag 
number) key, we try to find it in the Backend response, like `data[tagNumber: 
11] as bytes`, and merge it with the current view (aka Text in the example).
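
For what it's worth, in C++ terms that lookup can be sketched with UnknownFieldSet: parse the opaque backend payload without a schema and pull out the length-delimited value stored under a given tag number. The names below are illustrative only (the clients here are Swift/Kotlin/TS, so this is just the idea):

```
#include <string>
#include <google/protobuf/unknown_field_set.h>

// Find the serialized submessage stored under `tag_number` (e.g. 11) in an
// opaque protobuf payload whose schema the client does not know.
bool FindFieldBytes(const std::string& payload, int tag_number, std::string* out) {
  google::protobuf::UnknownFieldSet fields;
  if (!fields.ParseFromArray(payload.data(), static_cast<int>(payload.size()))) {
    return false;
  }
  for (int i = 0; i < fields.field_count(); ++i) {
    const google::protobuf::UnknownField& f = fields.field(i);
    if (f.number() == tag_number &&
        f.type() == google::protobuf::UnknownField::TYPE_LENGTH_DELIMITED) {
      *out = f.length_delimited();  // bytes of the Text submessage, ready to merge
      return true;
    }
  }
  return false;
}
```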

On Thursday, June 20, 2024 at 7:44:06 PM UTC+2 Yan Zaitsev wrote:

> Hi, 
> I am developing a server-driven UI service in my company. *I want to know 
> your opinion on whether our approach is anti-pattern of protobuf usage *or 
> if it is valid usage even if it is not common.
>
> Thank you for your time reading this long message!
>
> All our APIs for Client(iOS,Android,Web) <-> backend are based on protobuf 
> models, thats why we want to use protobuf for SDUI too.
>
> We want to build a new service that will provide the View structure to the 
> client, which has to be rendered. The client can cache only UI styling and 
> layout, while another service provides the data for it. 
> Client app has to bind server data to view the structure. The client does 
> not know the schema of the server response, except it is protobuf data.
> View Structure and Backend data are protobuf models and use the same types 
> - see examples below.
>
> ```
> message Text {
>  string label = 1;
>   Color color = 2;
> }
> message VerticalStack {}
> message HorizontalStack {}
>
> /// Typical View Structure in protobuf representation
> VerticalStack {
>   spacing = 20pt
>   children = [
>Text { 
>  id: 11
>  color: .white
> }
>HorizontalStack {
>  children: ...
>}
>   ]
> }
> ```
>
> Another service has to provide data for this view structure.
>
> ```
> message BackendResponse {
>   Text productTitle = 11;
> }
>
> ```
>
> `*BackendResponse*` schema is *NOT known to the client. *
>
> We came with idea to *use protobuf tag numbers for binding.*
> You can see that `Text.id = 11` and `BackendResponse.productTitle = 11`
>
> We built the autogenerated interfaces to safely set number 11 to the View 
> structure and ensure that the backend response uses the property with 
> number 11.
>
> I was not able to find a similar usage of protobuf structures. We found 
> devs using descriptors and sending whole descriptor models to work with 
> dynamic messages, or using FieldMask for filtering.
> Using the whole descriptor sounds overkill for this purpose.
> *The question of this post: even if it is not a common technique, is it 
> known as anti-pattern?*
>
> These binding(tag numbers) numbers are used across the different code 
> stacks:
>
>- View protos are known to clients, backend and separate DSL 
>repository (repo with View structures)
>- Backend models are known to backend and DSL repository
>
>- DSL repository is TypeScript code
>- Backend is Go code
>- Clients are Swift, Kotlin, TypeScript/JS
>
>
> Can the concept of extensions or custom options help us somehow?
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/97e3a1b2-539c-4d0f-96c4-477390b49d5an%40googlegroups.com.


[protobuf] Re: Copying from allocated memory to a repetable field.

2024-06-13 Thread Gerardo Melesio
There was a bug in the code I pasted; I'm copying the matrix to the protobuf this way:

// then manually copy the data to the protobuf
for(int i = 0; i < sizeA; i++){
reply->data_c()[i] = matrixDataA[i];
  }

I'm guessing the answer is no because of encoding, but if anyone has an 
idea I'm happy to hear it out. 
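
For reference, here is a minimal sketch of the copy in the writable direction, assuming `Reply` is a (hypothetical) generated message type and `data_c` is declared as `repeated double`. The accessor `data_c()` is read-only; the generated `add_data_c()` / `mutable_data_c()` accessors are the writable path:

```
#include "reply.pb.h"  // hypothetical generated header; assumes: repeated double data_c = 1;

void FillReply(const double* matrixDataA, int sizeA, Reply* reply) {
  reply->mutable_data_c()->Clear();
  reply->mutable_data_c()->Reserve(sizeA);  // one allocation up front
  for (int i = 0; i < sizeA; ++i) {
    reply->add_data_c(matrixDataA[i]);      // generated adder for the repeated field
  }
}
```
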
El jueves, 13 de junio de 2024 a las 20:17:41 UTC+1, Gerardo Melesio 
escribió:

> I am new to the gRPC implementation in C++ and wantd to see if there is 
> any way of achieving something.
>
> I have a low library in C that is performing some operations with 
> Matrices. It saves the result in a pointer that was allocated dynamically 
> in memory. 
>
> matrixDataA = (double *) malloc(sizeA*sizeof (double ));
> //... some code fills MatrixA with data
>
> // then manually copy the data to the protobuf
> for(int i = 0; i < sizeA; i++){
> matrixDataA[i] = reply->data_c()[i];
>   }
>
> Would there be any way of copying my data in the *double pointer to the 
> reply->data_c(), assuming data_c is a repeated double field?
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/b3bc6a6d-9323-4f82-aeb5-6ae909ee25b0n%40googlegroups.com.


[protobuf] Re: Cannot open include file 'google/protobuf/port_def.inc'

2024-05-31 Thread claire joseph
I am experiencing a similar issue. Any suggestions. I get file not found 
for port_def.inc. Installed protoc using vcpkg.

On Tuesday 18 August 2020 at 19:20:29 UTC-4 Nithya Murali wrote:

> I am on Windows 10 and trying to install the protoc compiler - I used the 
> following to install it:
>
> vcpkg install protobuf protobuf:x64-windows 
>
> protoc --version returns "libprotoc 3.13.0" as expected
>
> However, in the process of installing a python library (pycld3) which 
> requires protoc, I'm getting the following error:
>
> *Cannot open include file: 'google/protobuf/port_dec.inc': No such file or 
> directory*
> error: command "C:\\Program Files (x86)\\Microsoft Visual 
> Studio\\2019\\Community\\VC\\Tools\\MSVC\\14.27.29110\\bin\\Hostx86\\x64\\cl.exe"
>  
> failed with exit status 2
>
> Do I need to set anything further in the path besides the protoc binary 
> and CMake?
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/6086d0e2-847e-48f6-8b87-99329be0faa0n%40googlegroups.com.


[protobuf] Re: How to set null value in map type

2024-05-31 Thread 'Jeff Sawatzky' via Protocol Buffers
As far as I am aware, a string is a scalar so you can't set it to null.
https://protobuf.dev/programming-guides/proto3/#scalar

So if you have a map with scalar keys and values, then neither the keys nor 
the values can be null.

You may be able to try using the StringValue type from the Well-Known Types:
https://protobuf.dev/reference/protobuf/google.protobuf/

using a map with google.protobuf.StringValue values, and most integrations 
will convert the StringValue to a simple nullable string.
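
For illustration, here is a minimal C++ sketch of the wrapper-type idea (the message and field names are hypothetical, not from the original question). The explicit presence of the wrapper message is what most JSON integrations render as a nullable value:

```
#include "settings.pb.h"  // hypothetical; assumes:
// import "google/protobuf/wrappers.proto";
// message Settings { google.protobuf.Int32Value choice = 1; }

void Demo(Settings* s) {
  s->mutable_choice()->set_value(3);   // an explicit value is present
  bool is_null = !s->has_choice();     // false: the wrapper message exists
  s->clear_choice();                   // back to the "null"/unset state
  is_null = !s->has_choice();          // true
  (void)is_null;
}
```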

On Thursday, May 30, 2024 at 10:40:45 AM UTC-4 Shenghe Wang wrote:

> Hi,
>
> I want to set an int field to 'null' when user want to clear some choice 
> but nullpointerexception was threw. I use map to store the 
> data which will be updated. But map do not accept null value. Is there any 
> way to store the null value into a map? 
>
> Thanks a lot
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/30ebd7a4-3f43-4d0f-9ed0-1a7a6c6503e5n%40googlegroups.com.


Re: [protobuf] Re: Nested Enum with protobuf and rust

2024-05-24 Thread 'Em Rauch' via Protocol Buffers
The suggestion on stack overflow looks correct: Rust's language enums are
"tagged unions", the Protobuf `oneof` feature also acts as a tagged union.
Protobuf Enums are intended to be more semantically like C++ enums (which
cannot hold values inside).
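
To make that concrete, here is a minimal sketch (schema and names are hypothetical, not taken from the original post) of how the Vehicle example could be modelled with a oneof, shown with the generated C++ accessors; Rust generators such as prost typically turn the same oneof into a Rust enum:

```
// vehicle.proto (hypothetical):
//   enum CarType   { CAR_TYPE_UNSPECIFIED = 0; SEDAN = 1; COUPE = 2; HATCHBACK = 3; }
//   enum TruckType { TRUCK_TYPE_UNSPECIFIED = 0; PICKUP = 1; SEMI = 2; }
//   message Vehicle {
//     oneof kind {
//       CarType car = 1;
//       TruckType truck = 2;
//     }
//   }
#include "vehicle.pb.h"  // generated from the sketch above

void Demo() {
  Vehicle v;
  v.set_car(SEDAN);             // setting one member clears the others
  switch (v.kind_case()) {      // the oneof behaves like a tagged union
    case Vehicle::kCar:         /* use v.car() */   break;
    case Vehicle::kTruck:       /* use v.truck() */ break;
    case Vehicle::KIND_NOT_SET:                     break;
  }
}
```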

Can you clarify the followup comment "AAOS stands for Android Automotive
OS. The IDL that generates publisher code, struggles with message types"?
Any documentation links would be helpful for us to follow up on if there's
some deliberate limitation or just an unfortunate technical state that
might be worked around.


On Fri, Apr 19, 2024 at 8:50 PM Daz Wilkin  wrote:

> I took a stab at a reply to your question on Stack overflow:
>
> https://stackoverflow.com/questions/78355748/nested-enum-with-protobuf
>
>
> On Friday, April 19, 2024 at 5:10:25 PM UTC-7 Chedy Souffargi wrote:
>
>> I am working on a Rust project, using Protocol Buffers, and dealing with
>> nested enums. Here's what I have so far in Rust:
>>
>> enum Vehicle { Car(CarType), Truck(TruckType), } enum CarType { Sedan,
>> Coupe, Hatchback, } enum TruckType { Pickup, Semi, } fn
>> create_vehicle_fleet() -> [Vehicle; 8] { [ Vehicle::Car(CarType::Sedan),
>> Vehicle::Truck(TruckType::Pickup), Vehicle::Car(CarType::Coupe),
>> Vehicle::Truck(TruckType::Semi), Vehicle::Car(CarType::Hatchback),
>> Vehicle::Car(CarType::Sedan), Vehicle::Truck(TruckType::Pickup),
>> Vehicle::Car(CarType::Coupe), ] }
>>
>> I cannot find a proper way to represent the Vehicle enum in protobuf. Can
>> anyone suggest a solution or provide guidance on how to handle this
>> situation in Protocol Buffers?
>>
> --
> You received this message because you are subscribed to the Google Groups
> "Protocol Buffers" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to protobuf+unsubscr...@googlegroups.com.
> To view this discussion on the web visit
> https://groups.google.com/d/msgid/protobuf/9827eaf2-7b6f-474c-9cfa-f3fbca084ab8n%40googlegroups.com
> 
> .
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/CAKRmVH_0nGdx-CBs_m4uMHo8wWmYAc1LZHjYorBJnf3YeL9Uuw%40mail.gmail.com.


[protobuf] Re: Decode size delimited compressed protobuf file in python

2024-05-17 Thread Mr Moose
I have figured it out by myself.
The problem is that _DecodeVarint() may consume fewer than the 4 bytes I had 
reserved for it, and it reports how many bytes it really consumed in the 
second element of the returned tuple. So advancing the offset by that 
returned value rather than by 4 does the trick.
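
For reference, the C++ read-side counterpart of the writer in the quoted message, as a minimal sketch (it assumes a generated `MyMessage` type from a hypothetical `my_messages.pb.h` and a protobuf build with zlib support):

```
#include <cstdio>
#include <google/protobuf/io/zero_copy_stream_impl.h>
#include <google/protobuf/io/gzip_stream.h>
#include <google/protobuf/util/delimited_message_util.h>
#include "my_messages.pb.h"  // hypothetical generated header declaring MyMessage

int main() {
  FILE* ifile = std::fopen("myfile.bin.gz", "rb");
  google::protobuf::io::FileInputStream istream(fileno(ifile));
  google::protobuf::io::GzipInputStream zipstream(&istream);

  MyMessage msg;
  bool clean_eof = false;
  // Each iteration consumes one varint length prefix plus one message body.
  while (google::protobuf::util::ParseDelimitedFromZeroCopyStream(&msg, &zipstream, &clean_eof)) {
    // ... process msg ...
  }
  std::fclose(ifile);
  return clean_eof ? 0 : 1;
}
```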

Cheers,
Moose

Mr Moose schrieb am Freitag, 17. Mai 2024 um 13:19:58 UTC+2:

> Hello everyone,
>
> I hope I can find some advise here.
> I have C++ code that writes a number of protobuf messages to a compressed 
> size delimited stream like this (simplified):
>
> FILE *ofile = fopen("myfile.bin.gz", "wb");
> google::protobuf::io::FileOutputStream ostream(_fileno(ofile));
> google::protobuf::io::GzipOutputStream zipstream(&ostream);
>
> while (loop) {
>google::protobuf::util::SerializeDelimitedToZeroCopyStream(my_msg, 
> zipstream);
> }
>
> This works fine. The files are written and I can read them back in in C++ 
> with no issues.
> Now I am trying to read them in Python and I'm having difficulties to 
> understand the structure of the files. Here's what I'm trying:
>
> def read_messages(raw_data: bytes):
> offset = 0
> while offset < len(raw_data):
> # Read the size (4 bytes, little-endian) and decode
> size_bytes = raw_data[offset : offset + 4]
> offset += 4
> size, _ = _DecodeVarint(size_bytes, 0)
> # This reads the correct size of the message (verified in C++)
>
> message_data = raw_data[offset : offset + size]
> offset += size
>
> # This causes an "Error parsing message" exception at the first 
> message
> msg = my_messages_protobuf.MyMessage()
> msg.ParseFromString(message_data)
>
> ... and ...
>
>  with gzip.open( "myfile.bin.gz", "r") as f:
>   while True:
>   chunk = f.read(chunk_size)
>   if not chunk:
>   break;
>   read_messages(chunk)
>
> Now, to clarify a bit, I have worked with protobuf for very long, although 
> not in Python. Yet much Python code already deserializes such messages that 
> come in elsewhere, so I assume the whole "setup Protobuf in Python" thing 
> is not an issue here. It should work.
>
> Given the fact that _DecodeVarint() correctly reads the message size leads 
> me to believe the reading of the gzipped file is okay too.
>
> Yet when I look at the raw buffer "message_data" it looks very different 
> than the raw message data looks in C++ when I use the debugger there. I 
> have no idea what could cause this difference.
>
> Can anybody give me a hint on what could be wrong here?
>
> Much appreciated,
> Moose
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/b8db4d69-987b-452a-bad3-d2743302c068n%40googlegroups.com.


[protobuf] Re: Nested Enum with protobuf and rust

2024-04-19 Thread Daz Wilkin
I took a stab at a reply to your question on Stack overflow:

https://stackoverflow.com/questions/78355748/nested-enum-with-protobuf


On Friday, April 19, 2024 at 5:10:25 PM UTC-7 Chedy Souffargi wrote:

> I am working on a Rust project, using Protocol Buffers, and dealing with 
> nested enums. Here's what I have so far in Rust:
>
> enum Vehicle { Car(CarType), Truck(TruckType), } enum CarType { Sedan, 
> Coupe, Hatchback, } enum TruckType { Pickup, Semi, } fn 
> create_vehicle_fleet() -> [Vehicle; 8] { [ Vehicle::Car(CarType::Sedan), 
> Vehicle::Truck(TruckType::Pickup), Vehicle::Car(CarType::Coupe), Vehicle::
> Truck(TruckType::Semi), Vehicle::Car(CarType::Hatchback), 
> Vehicle::Car(CarType::Sedan), 
> Vehicle::Truck(TruckType::Pickup), Vehicle::Car(CarType::Coupe), ] }
>
> I cannot find a proper way to represent the Vehicle enum in protobuf. Can 
> anyone suggest a solution or provide guidance on how to handle this 
> situation in Protocol Buffers?
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/9827eaf2-7b6f-474c-9cfa-f3fbca084ab8n%40googlegroups.com.


Re: [protobuf] Re: How to obtain the file libprotobuf.a?

2024-04-02 Thread Claus Volko
 Meanwhile I've been able to build libprotobuf.a using 
https://www.mingw-w64.org/downloads/#llvm-mingw (yes, I'm using Windows). 
But when I execute the compilation script I get errors like this one:

wasm-ld: warning: libprotobuf.a: archive member 'any.pb.cc.obj' is neither 
Wasm object file nor LLVM bitcode

I wonder: If it's neither a Wasm object file nor LLVM bitcode, what is it 
then?
Do I have to use some special compiler flag to create LLVM bitcode?
On Friday, March 29, 2024 at 11:46:59 PM UTC+1 Adam Cozzette wrote:

> How are you invoking CMake? I ran cmake . && make -j12 just now and 
> confirmed that it produced libprotobuf.a.
>
> On Wed, Mar 27, 2024 at 11:53 PM Claus Volko  wrote:
>
>> My problem is that when I compile a project using Visual C++, no archive 
>> file (file with extension .a) is created. But I would need one.
>>
>> I tried building the files from the source code of Google's protobuf, 
>> version 25.3, using cmake but it did not produce the file libprotobuf.a. I 
>> need this file for a project I downloaded from GitHub, namely 
>> https://github.com/retroplasma/earth-reverse-engineering. It contains a 
>> build script that looks like this:
>> #!/bin/sh
>>
>> if [ "$1" == "emscripten" ]; then
>> source config_emscripten.sh
>> echo build: emscripten
>> pwd="$(pwd)" && cd .. && $EMSCRIPTEN_PROTOBUF_EXE --cpp_out=client 
>> proto/rocktree.proto && cd "$pwd"
>> cd crn && emcc -std=c++14 -c crn.cc -w && cd ..
>>
>> emcc -Iinclude main.cpp -O2 -std=c++14 -I. -I./eigen/ \
>> -I$EMSCRIPTEN_PROTOBUF_SRC $EMSCRIPTEN_PROTOBUF_LIB crn/crn.o \
>> -s USE_SDL=2 -s FETCH=1 -s TOTAL_MEMORY=1073741824 -s USE_PTHREADS=1 -s 
>> PTHREAD_POOL_SIZE=4 \
>> -o main.html 
>> else
>> echo build: native
>> pwd="$(pwd)" && cd .. && protoc --cpp_out=client proto/rocktree.proto && 
>> cd "$pwd"
>> cd crn && g++ -std=c++14 -c crn.cc -w && cd ..
>>
>> CFLAGS="--std=c++14 -g -I. `pkg-config --cflags sdl2 protobuf` -I./eigen/"
>> LDFLAGS="`pkg-config --libs sdl2 protobuf` crn/crn.o"
>> if [ `uname` = "Darwin" ]; then 
>> CFLAGS="$CFLAGS `pkg-config --cflags glew`"
>> LDFLAGS="$LDFLAGS `pkg-config --static --libs glew` -framework OpenGL"
>> echo "$CFLAGS"
>> echo "$LDFLAGS"
>> else
>> CFLAGS="$CFLAGS -Igl2/include"
>> LDFLAGS="$LDFLAGS -lGL -lm -ldl"
>> fi
>> c++ $CFLAGS main.cpp $LDFLAGS -o main
>> fi
>>
>> where config_emscripten.sh reads as follows:
>>
>> #!/bin/bash
>>
>> EMSCRIPTEN_PROTOBUF_SRC="$(echo ~)/Downloads/protobuf/src"
>> EMSCRIPTEN_PROTOBUF_LIB="$(echo 
>> ~)/Downloads/protobuf/src/.libs/libprotobuf.a"
>> EMSCRIPTEN_PROTOBUF_EXE="$(echo 
>> ~)/Downloads/protoc-3.9.2-osx-x86_64/bin/protoc"
>>
>> So I need libprotobuf.a to build the emscripten version of the project.
>>
>> Could anybody please give me instructions how to build libprotobuf.a? 
>> What C++ compiler/linker creates files of this type?
>>
>>
>> On Wednesday, March 27, 2024 at 8:33:03 AM UTC+1 Claus Volko wrote:
>>
>>> I tried building the files from the source code using cmake but it did 
>>> not produce the file libprotobuf.a. I need this file for a project I 
>>> downloaded from GitHub. Also, it seems that the latest version of protobuf 
>>> is not supported by this project but I guess it should work with v25.3.
>>>
>>> Could anybody please give me instructions how to build libprotobuf.a or 
>>> directly send me the file? Thank you very much in advance.
>>>
>> -- 
>> You received this message because you are subscribed to the Google Groups 
>> "Protocol Buffers" group.
>> To unsubscribe from this group and stop receiving emails from it, send an 
>> email to protobuf+u...@googlegroups.com.
>> To view this discussion on the web visit 
>> https://groups.google.com/d/msgid/protobuf/71cfd831-69fc-42e4-b1a9-e727e872815fn%40googlegroups.com
>>  
>> 
>> .
>>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/af98ef02-b9a8-4027-b8c1-ba251b6c770en%40googlegroups.com.


Re: [protobuf] Re: How to obtain the file libprotobuf.a?

2024-03-29 Thread 'Adam Cozzette' via Protocol Buffers
How are you invoking CMake? I ran cmake . && make -j12 just now and
confirmed that it produced libprotobuf.a.

On Wed, Mar 27, 2024 at 11:53 PM Claus Volko  wrote:

> My problem is that when I compile a project using Visual C++, no archive
> file (file with extension .a) is created. But I would need one.
>
> I tried building the files from the source code of Google's protobuf,
> version 25.3, using cmake but it did not produce the file libprotobuf.a. I
> need this file for a project I downloaded from GitHub, namely
> https://github.com/retroplasma/earth-reverse-engineering. It contains a
> build script that looks like this:
> #!/bin/sh
>
> if [ "$1" == "emscripten" ]; then
> source config_emscripten.sh
> echo build: emscripten
> pwd="$(pwd)" && cd .. && $EMSCRIPTEN_PROTOBUF_EXE --cpp_out=client
> proto/rocktree.proto && cd "$pwd"
> cd crn && emcc -std=c++14 -c crn.cc -w && cd ..
>
> emcc -Iinclude main.cpp -O2 -std=c++14 -I. -I./eigen/ \
> -I$EMSCRIPTEN_PROTOBUF_SRC $EMSCRIPTEN_PROTOBUF_LIB crn/crn.o \
> -s USE_SDL=2 -s FETCH=1 -s TOTAL_MEMORY=1073741824 -s USE_PTHREADS=1 -s
> PTHREAD_POOL_SIZE=4 \
> -o main.html
> else
> echo build: native
> pwd="$(pwd)" && cd .. && protoc --cpp_out=client proto/rocktree.proto &&
> cd "$pwd"
> cd crn && g++ -std=c++14 -c crn.cc -w && cd ..
>
> CFLAGS="--std=c++14 -g -I. `pkg-config --cflags sdl2 protobuf` -I./eigen/"
> LDFLAGS="`pkg-config --libs sdl2 protobuf` crn/crn.o"
> if [ `uname` = "Darwin" ]; then
> CFLAGS="$CFLAGS `pkg-config --cflags glew`"
> LDFLAGS="$LDFLAGS `pkg-config --static --libs glew` -framework OpenGL"
> echo "$CFLAGS"
> echo "$LDFLAGS"
> else
> CFLAGS="$CFLAGS -Igl2/include"
> LDFLAGS="$LDFLAGS -lGL -lm -ldl"
> fi
> c++ $CFLAGS main.cpp $LDFLAGS -o main
> fi
>
> where config_emscripten.sh reads as follows:
>
> #!/bin/bash
>
> EMSCRIPTEN_PROTOBUF_SRC="$(echo ~)/Downloads/protobuf/src"
> EMSCRIPTEN_PROTOBUF_LIB="$(echo
> ~)/Downloads/protobuf/src/.libs/libprotobuf.a"
> EMSCRIPTEN_PROTOBUF_EXE="$(echo
> ~)/Downloads/protoc-3.9.2-osx-x86_64/bin/protoc"
>
> So I need libprotobuf.a to build the emscripten version of the project.
>
> Could anybody please give me instructions how to build libprotobuf.a? What
> C++ compiler/linker creates files of this type?
>
>
> On Wednesday, March 27, 2024 at 8:33:03 AM UTC+1 Claus Volko wrote:
>
>> I tried building the files from the source code using cmake but it did
>> not produce the file libprotobuf.a. I need this file for a project I
>> downloaded from GitHub. Also, it seems that the latest version of protobuf
>> is not supported by this project but I guess it should work with v25.3.
>>
>> Could anybody please give me instructions how to build libprotobuf.a or
>> directly send me the file? Thank you very much in advance.
>>
> --
> You received this message because you are subscribed to the Google Groups
> "Protocol Buffers" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to protobuf+unsubscr...@googlegroups.com.
> To view this discussion on the web visit
> https://groups.google.com/d/msgid/protobuf/71cfd831-69fc-42e4-b1a9-e727e872815fn%40googlegroups.com
> 
> .
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/CADqAXr66E7zXiejQkUspQRYVdH2JfshnDb8BAcENjbnW3jMtxA%40mail.gmail.com.


[protobuf] Re: How to obtain the file libprotobuf.a?

2024-03-27 Thread Claus Volko
 

My problem is that when I compile a project using Visual C++, no archive 
file (file with extension .a) is created. But I would need one.

I tried building the files from the source code of Google's protobuf, 
version 25.3, using cmake but it did not produce the file libprotobuf.a. I 
need this file for a project I downloaded from GitHub, namely 
https://github.com/retroplasma/earth-reverse-engineering. It contains a 
build script that looks like this:
#!/bin/sh

if [ "$1" == "emscripten" ]; then
source config_emscripten.sh
echo build: emscripten
pwd="$(pwd)" && cd .. && $EMSCRIPTEN_PROTOBUF_EXE --cpp_out=client 
proto/rocktree.proto && cd "$pwd"
cd crn && emcc -std=c++14 -c crn.cc -w && cd ..

emcc -Iinclude main.cpp -O2 -std=c++14 -I. -I./eigen/ \
-I$EMSCRIPTEN_PROTOBUF_SRC $EMSCRIPTEN_PROTOBUF_LIB crn/crn.o \
-s USE_SDL=2 -s FETCH=1 -s TOTAL_MEMORY=1073741824 -s USE_PTHREADS=1 -s 
PTHREAD_POOL_SIZE=4 \
-o main.html 
else
echo build: native
pwd="$(pwd)" && cd .. && protoc --cpp_out=client proto/rocktree.proto && cd 
"$pwd"
cd crn && g++ -std=c++14 -c crn.cc -w && cd ..

CFLAGS="--std=c++14 -g -I. `pkg-config --cflags sdl2 protobuf` -I./eigen/"
LDFLAGS="`pkg-config --libs sdl2 protobuf` crn/crn.o"
if [ `uname` = "Darwin" ]; then 
CFLAGS="$CFLAGS `pkg-config --cflags glew`"
LDFLAGS="$LDFLAGS `pkg-config --static --libs glew` -framework OpenGL"
echo "$CFLAGS"
echo "$LDFLAGS"
else
CFLAGS="$CFLAGS -Igl2/include"
LDFLAGS="$LDFLAGS -lGL -lm -ldl"
fi
c++ $CFLAGS main.cpp $LDFLAGS -o main
fi

where config_emscripten.sh reads as follows:

#!/bin/bash

EMSCRIPTEN_PROTOBUF_SRC="$(echo ~)/Downloads/protobuf/src"
EMSCRIPTEN_PROTOBUF_LIB="$(echo 
~)/Downloads/protobuf/src/.libs/libprotobuf.a"
EMSCRIPTEN_PROTOBUF_EXE="$(echo 
~)/Downloads/protoc-3.9.2-osx-x86_64/bin/protoc"

So I need libprotobuf.a to build the emscripten version of the project.

Could anybody please give me instructions how to build libprotobuf.a? What 
C++ compiler/linker creates files of this type?


On Wednesday, March 27, 2024 at 8:33:03 AM UTC+1 Claus Volko wrote:

> I tried building the files from the source code using cmake but it did not 
> produce the file libprotobuf.a. I need this file for a project I downloaded 
> from GitHub. Also, it seems that the latest version of protobuf is not 
> supported by this project but I guess it should work with v25.3.
>
> Could anybody please give me instructions how to build libprotobuf.a or 
> directly send me the file? Thank you very much in advance.
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/71cfd831-69fc-42e4-b1a9-e727e872815fn%40googlegroups.com.


Re: [protobuf] Re: Fail to build protobuf

2024-03-05 Thread Rick Genter
Thank you, Tricia. As a workaround I added the following to bottom of the
WORKSPACE file at the root of the protobuf-25.3 directory:

local_repository(name = "com_google_protobuf", path = ".")

Everything then built fine and seems to be functional.

On Tue, Mar 5, 2024 at 4:43 AM 'Tricia Decker' via Protocol Buffers <
protobuf@googlegroups.com> wrote:

> Hi Rick,
>
> There's an open issue for the same
> https://github.com/protocolbuffers/protobuf/issues/15615 in case you
> haven't seen it.
>
> Tricia
>
> On Monday, March 4, 2024 at 7:47:25 AM UTC-8 Rick Genter wrote:
>
>> I'm following the directions at
>> https://github.com/protocolbuffers/protobuf/blob/main/src/README.md to
>> try to build protobuf 25.3.
>>
>> I have bazel 7.0.2 installed.
>>
>> I downloaded protobuf-25.3.tar.gz, then gunzip'd and untar'd it.
>> I then cd'd to protobuf-25.3 and did
>>
>> $ bazel build :protoc
>>
>> That worked.
>>
>> I then tried
>>
>> $ bazel build :protobuf
>>
>> and get
>>
>> ERROR: no such package '@@com_google_protobuf//': The repository
>> '@@com_google_protobuf' could not be resolved: Repository
>> '@@com_google_protobuf' is not defined
>>
>> How do I fix this?
>>
>> --
> You received this message because you are subscribed to the Google Groups
> "Protocol Buffers" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to protobuf+unsubscr...@googlegroups.com.
> To view this discussion on the web visit
> https://groups.google.com/d/msgid/protobuf/2cdb9d4f-2f4e-438b-a128-561328a33506n%40googlegroups.com
> 
> .
>


-- 
Rick Genter
rick.gen...@gmail.com

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/CADie1rzG72X-xvov6%3DZCydTwiRtB4WxbQSBt3UrxDH4qUghPvQ%40mail.gmail.com.


[protobuf] Re: Fail to build protobuf

2024-03-04 Thread 'Tricia Decker' via Protocol Buffers
Hi Rick,

There's an open issue for the 
same https://github.com/protocolbuffers/protobuf/issues/15615 in case you 
haven't seen it.

Tricia

On Monday, March 4, 2024 at 7:47:25 AM UTC-8 Rick Genter wrote:

> I'm following the directions at 
> https://github.com/protocolbuffers/protobuf/blob/main/src/README.md to 
> try to build protobuf 25.3.
>
> I have bazel 7.0.2 installed.
>
> I downloaded protobuf-25.3.tar.gz, then gunzip'd and untar'd it.
> I then cd'd to protobuf-25.3 and did
>
> $ bazel build :protoc
>
> That worked.
>
> I then tried
>
> $ bazel build :protobuf
>
> and get
>
> ERROR: no such package '@@com_google_protobuf//': The repository 
> '@@com_google_protobuf' could not be resolved: Repository 
> '@@com_google_protobuf' is not defined
>
> How do I fix this?
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/2cdb9d4f-2f4e-438b-a128-561328a33506n%40googlegroups.com.


[protobuf] Re: Error "Tried to write the same file twice." only with kotlin

2024-01-26 Thread Daniele Segato
I know it's very late for this, but I think the problem you are having is 
this one I just filed a bug for:
https://github.com/protocolbuffers/protobuf/issues/15606

On Tuesday, October 31, 2023 at 12:52:28 AM UTC+1 Choucroute_melba wrote:

> I'm trying to build the arduino cli grpc sources (
> https://github.com/arduino/arduino-cli/tree/master/rpc) in Kotlin using 
> the gradle protobuf plugin.
> So I have four outputs in my build : grpc, grpckt, java and kotlin an 
> kotlin is the only one to fail with the following error : 
>
>
> *Execution failed for task ':stub:generateProto'.> protoc: stdout: . 
> stderr: cc/arduino/cli/commands/v1/BoardKt.kt: Tried to write the same file 
> twice.*
>
> I have tried a lot of things but I really don't know how to work around 
> this error, especially because it work with every other languages.
>
> If it can help, here is the command that the gradle plugin execute and the 
> one that the arduino team execute to build their Go sources :
>
> Gradle : 
> ```
> D:\Vivien\AppData\.gradle\caches\modules-2\files-2.1\com.google.protobuf\protoc\3.24.1\3052022638437eefd2645963518be582bb24273a\protoc-3.24.1-windows-x86_64.exe,
>  
> -ID:\Vivien\Projects\Idea\intellino\stub\build\extracted-protos\main, 
> -ID:\Vivien\Projects\Idea\intellino\stub\build\extracted-include-protos\main, 
> --java_out=D:\Vivien\Projects\Idea\intellino\stub\build\generated\source\proto/main/java,
>  
> --kotlin_out=D:\Vivien\Projects\Idea\intellino\stub\build\generated\source\proto/main/kotlin,
>  
> --plugin=protoc-gen-grpc=D:\Vivien\AppData\.gradle\caches\modules-2\files-2.1\io.grpc\protoc-gen-grpc-java\1.57.2\5670558169ce74039d210781b06c8d8136c0868f\protoc-gen-grpc-java-1.57.2-windows-x86_64.exe,
>  
> --grpc_out=D:\Vivien\Projects\Idea\intellino\stub\build\generated\source\proto/main/grpc,
>  
> --plugin=protoc-gen-grpckt=D:\Vivien\Projects\Idea\intellino\stub\build\scripts\protoc-gen-grpc-kotlin-1.4.0-jdk8-generateProto-trampoline.bat,
>  
> --grpckt_out=D:\Vivien\Projects\Idea\intellino\stub\build\generated\source\proto/main/grpckt,
>  
> D:\Vivien\Projects\Idea\intellino\stub\build\extracted-protos\main\cc\arduino\cli\commands\v1\board.proto,
>  
> D:\Vivien\Projects\Idea\intellino\stub\build\extracted-protos\main\cc\arduino\cli\commands\v1\commands.proto,
>  
> D:\Vivien\Projects\Idea\intellino\stub\build\extracted-protos\main\cc\arduino\cli\commands\v1\common.proto,
>  
> D:\Vivien\Projects\Idea\intellino\stub\build\extracted-protos\main\cc\arduino\cli\commands\v1\compile.proto,
>  
> D:\Vivien\Projects\Idea\intellino\stub\build\extracted-protos\main\cc\arduino\cli\commands\v1\core.proto,
>  
> D:\Vivien\Projects\Idea\intellino\stub\build\extracted-protos\main\cc\arduino\cli\commands\v1\debug.proto,
>  
> D:\Vivien\Projects\Idea\intellino\stub\build\extracted-protos\main\cc\arduino\cli\commands\v1\lib.proto,
>  
> D:\Vivien\Projects\Idea\intellino\stub\build\extracted-protos\main\cc\arduino\cli\commands\v1\monitor.proto,
>  
> D:\Vivien\Projects\Idea\intellino\stub\build\extracted-protos\main\cc\arduino\cli\commands\v1\port.proto,
>  
> D:\Vivien\Projects\Idea\intellino\stub\build\extracted-protos\main\cc\arduino\cli\commands\v1\upload.proto,
>  
> D:\Vivien\Projects\Idea\intellino\stub\build\extracted-protos\main\cc\arduino\cli\settings\v1\settings.proto
> ```
>
> Arduino (from their github repo)
> ```
>  protoc --proto_path=rpc --go_out=./rpc --go_opt=paths=source_relative 
> --go-grpc_out=./rpc --go-grpc_opt=paths=source_relative 
> ./rpc/cc/arduino/cli/commands/v1/*.proto'
>   protoc --proto_path=rpc --go_out=./rpc 
> --go_opt=paths=source_relative --go-grpc_out=./rpc 
> --go-grpc_opt=paths=source_relative 
> ./rpc/cc/arduino/cli/settings/v1/*.proto'
> ```
> Thanks in advance, - Vivien
>
>
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/99d23ddc-176a-4989-8c3e-c5e88d8d6b00n%40googlegroups.com.


[protobuf] Re: Been working on this problem for a fortnight, really need help

2024-01-08 Thread Huang Nuoxian
Hello, zjk, have you finally solved this problem? I encountered exactly the 
same issue recently while training a model with TensorFlow 1.15.

On Tuesday, May 3, 2022 at 1:42:19 AM UTC+8 Deanna Garcia wrote:

> I think you should file this bug with tensorflow since you are only using 
> protobufs through them.
>
> On Saturday, April 23, 2022 at 10:49:41 AM UTC-7 zjk wrote:
>
>> Hi, I am not a direct user of protobuf, I was training my model with 
>> tensorflow using Python, while attempting to save my checkpoint model, this 
>> error related to protobuf occurred:
>>
>> [libprotobuf ERROR google/protobuf/wire_format_lite.cc:577] String field '
>> tensorflow.TensorShapeProto.Dim.name' contains invalid UTF-8 data when 
>> parsing a protocol buffer. Use the 'bytes' type if you intend to send raw 
>> bytes.
>> Traceback (most recent call last):
>>   File "train.py", line 77, in 
>> train(model)
>>   File "train.py", line 24, in train
>> model.optimize()
>>   File "/user-data/HyperBox-main/script/model/box_model.py", line 329, in 
>> optimize
>> self.save_model(itr)
>>   File "/user-data/HyperBox-main/script/model/box_model.py", line 140, in 
>> save_model
>> self.saver.save(self.sess, filename)
>>   File 
>> "/opt/conda/lib/python3.8/site-packages/tensorflow/python/training/saver.py",
>>  
>> line 1208, in save
>> self.export_meta_graph(
>>   File 
>> "/opt/conda/lib/python3.8/site-packages/tensorflow/python/training/saver.py",
>>  
>> line 1254, in export_meta_graph
>> graph_def=ops.get_default_graph().as_graph_def(add_shapes=True),
>>   File 
>> "/opt/conda/lib/python3.8/site-packages/tensorflow/python/framework/ops.py", 
>> line 3345, in as_graph_def
>> result, _ = self._as_graph_def(from_version, add_shapes)
>>   File 
>> "/opt/conda/lib/python3.8/site-packages/tensorflow/python/framework/ops.py", 
>> line 3262, in _as_graph_def
>> graph.ParseFromString(compat.as_bytes(data))
>> google.protobuf.message.DecodeError: Error parsing message
>>
>> from the information above, we can see that:
>> 1)  some so-called 'invalid UTF-8 data' was sent to protobuf;
>> 2)  Use the 'bytes' type if you intend to send raw bytes.
>>
>> the function at the second to last line compat.as_bytes(data) is defined 
>> as:
>>  def as_bytes(bytes_or_text, encoding='utf-8'):
>> if isinstance(bytes_or_text, bytearray):
>> return bytes(bytes_or_text)
>>  elif isinstance(bytes_or_text, _six.text_type):  
>> ##_six.text_type = unicode
>> return bytes_or_text.encode(encoding)
>>  elif isinstance(bytes_or_text, bytes):
>> return bytes_or_text
>>  else:
>> raise TypeError('Expected binary or unicode string, got %r' %
>> (bytes_or_text,))
>>
>> we can see that this function makes sure that what will be passed into 
>> graph.ParseFromString(which later passed to protobuf) is definitely utf-8 
>> data or bytes, however, the protobuf keeps claiming the data passed to it 
>> is not valid utf-8 data. Taking one step back, even if the data was not 
>> valid utf-8, it is definitely bytes, which it should take in normally. 
>>
>> And I printed some information before the execution of 
>> graph.ParseFromString(compat.as_bytes(data)):
>>
>> pprint(type(data))
>> graph.ParseFromString(compat.as_bytes(data))
>>
>> and here's the result:
>> 
>> the data has already been bytes even before it is passed into 
>> compat.as_bytes!
>>
>> I am sorry for sounding emotional, it's just that I've been stuck with 
>> this problem for too long and it really gets me frustrated.
>>
>>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/77532789-5439-4a1b-bf21-321d4e9ac398n%40googlegroups.com.


[protobuf] Re: protobuf 2.5.0 for ARM64 for Hadoop 2.6.4

2024-01-07 Thread Muskan Khedia
Hi all,

I am facing an issue while configuring Hadoop 3.3.5 on Linux Ubuntu 20.04. Can 
you direct me to the documentation for setting up Hadoop ARM64 on an Azure 
VM, or any similar resources?

Currently, I'm using the guide "Apache Hadoop 3.3.5 – Hadoop: Setting up a 
Single Node Cluster" for the setup process.

The software versions being utilized are:
hadoop-3.3.5-aarch64.tar.gz
https://aka.ms/download-jdk/microsoft-jdk-11.0.19-linux-aarch64.tar.gz

I would greatly appreciate any assistance provided.

On Friday, June 3, 2016 at 12:16:36 AM UTC+5:30 Wei-ju Wu wrote:

> I had exactly the same problem.
>
> Thanks to Feng Xiao's reply below I was able to make a patch that works 
> for me (on ODROID C2 and Ubuntu Mate), I've put the patch file on my public 
> Dropbox folder.
>
> You can apply it e.g. like that
>
> $ wget -O protobuf-2.5.0-arm64.patch 
> https://www.dropbox.com/s/713wql5cw9dfxhx/protobuf-2.5.0-arm64.patch?dl=0
> $ tar xfz protobuf-2.5.0.tar.gz
> $ cd protobuf-2.5.0
> $ patch -p1 < ../protobuf-2.5.0-arm64.patch
>
> and then configure and build as you would typically do
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/96fc2d8e-7e5c-42f3-a4aa-85100f65e1a7n%40googlegroups.com.


[protobuf] Re: .so from bazel.BUILD

2024-01-02 Thread priya_cast_23
But it did not work 

On Tuesday, January 2, 2024 at 1:34:24 PM UTC priya_cast_23 wrote:

> I am trying to generate the .so from bazel. 
>
> I added this in the bazel.BUILD
>
>
> cc_library(
> name = "libprotobuf.so",
> srcs =  [
> ":dist_files",
> "//src/google/protobuf:dist_files",
> "//src/google/protobuf/compiler:dist_files",
> "//src/google/protobuf/compiler/cpp:dist_files",
> "//src/google/protobuf/compiler/csharp:dist_files",
> "//src/google/protobuf/compiler/java:dist_files",
> "//src/google/protobuf/compiler/objectivec:dist_files",
> "//src/google/protobuf/compiler/php:dist_files",
> "//src/google/protobuf/compiler/python:dist_files",
> "//src/google/protobuf/compiler/ruby:dist_files",
> "//src/google/protobuf/io:dist_files",
> "//src/google/protobuf/stubs:dist_files",
> "//src/google/protobuf/testing:dist_files",
> "//src/google/protobuf/util:dist_files",
> ],
> linkshared = 1,
> alwayslink = True,
> linkstatic = True,
> visibility = ["//visibility:public"],
> )
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/28762067-1cd2-40ca-b6ad-baf77138d807n%40googlegroups.com.


[protobuf] Re: Been working on this problem for a fortnight, really need help

2024-01-01 Thread Huang Nuoxian
Hello, I want to know if you finally solved this problem? I encountered 
exactly the same issue recently while training a model with TensorFlow 1.15.

On Sunday, April 24, 2022 at 1:49:41 AM UTC+8 zjk wrote:

> Hi, I am not a direct user of protobuf, I was training my model with 
> tensorflow using Python, while attempting to save my checkpoint model, this 
> error related to protobuf occurred:
>
> [libprotobuf ERROR google/protobuf/wire_format_lite.cc:577] String field '
> tensorflow.TensorShapeProto.Dim.name' contains invalid UTF-8 data when 
> parsing a protocol buffer. Use the 'bytes' type if you intend to send raw 
> bytes.
> Traceback (most recent call last):
>   File "train.py", line 77, in 
> train(model)
>   File "train.py", line 24, in train
> model.optimize()
>   File "/user-data/HyperBox-main/script/model/box_model.py", line 329, in 
> optimize
> self.save_model(itr)
>   File "/user-data/HyperBox-main/script/model/box_model.py", line 140, in 
> save_model
> self.saver.save(self.sess, filename)
>   File 
> "/opt/conda/lib/python3.8/site-packages/tensorflow/python/training/saver.py", 
> line 1208, in save
> self.export_meta_graph(
>   File 
> "/opt/conda/lib/python3.8/site-packages/tensorflow/python/training/saver.py", 
> line 1254, in export_meta_graph
> graph_def=ops.get_default_graph().as_graph_def(add_shapes=True),
>   File 
> "/opt/conda/lib/python3.8/site-packages/tensorflow/python/framework/ops.py", 
> line 3345, in as_graph_def
> result, _ = self._as_graph_def(from_version, add_shapes)
>   File 
> "/opt/conda/lib/python3.8/site-packages/tensorflow/python/framework/ops.py", 
> line 3262, in _as_graph_def
> graph.ParseFromString(compat.as_bytes(data))
> google.protobuf.message.DecodeError: Error parsing message
>
> from the information above, we can see that:
> 1)  some so-called 'invalid UTF-8 data' was sent to protobuf;
> 2)  Use the 'bytes' type if you intend to send raw bytes.
>
> the function at the second to last line compat.as_bytes(data) is defined 
> as:
>  def as_bytes(bytes_or_text, encoding='utf-8'):
> if isinstance(bytes_or_text, bytearray):
> return bytes(bytes_or_text)
>  elif isinstance(bytes_or_text, _six.text_type):  
> ##_six.text_type = unicode
> return bytes_or_text.encode(encoding)
>  elif isinstance(bytes_or_text, bytes):
> return bytes_or_text
>  else:
> raise TypeError('Expected binary or unicode string, got %r' %
> (bytes_or_text,))
>
> we can see that this function makes sure that what will be passed into 
> graph.ParseFromString(which later passed to protobuf) is definitely utf-8 
> data or bytes, however, the protobuf keeps claiming the data passed to it 
> is not valid utf-8 data. Taking one step back, even if the data was not 
> valid utf-8, it is definitely bytes, which it should take in normally. 
>
> And I printed some information before the execution of 
> graph.ParseFromString(compat.as_bytes(data)):
>
> pprint(type(data))
> graph.ParseFromString(compat.as_bytes(data))
>
> and here's the result:
> 
> the data has already been bytes even before it is passed into 
> compat.as_bytes!
>
> I am sorry for sounding emotional, it's just that I've been stuck with 
> this problem for too long and it really gets me frustrated.
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/fa7c1b11-3e74-46a7-bc76-01b3ed378569n%40googlegroups.com.


[protobuf] Re: How to make pb.h small to reduce compile times

2023-12-22 Thread amandee...@gmail.com
Also, we use bazel so precompiled headers is not really an option.

On Friday, December 22, 2023 at 6:43:17 PM UTC-8 amandee...@gmail.com wrote:

> When proto file is large, it makes compiling time too long (Especially 
> when I include the header).
>
> A similar question was asked few years ago, but no response: 
> https://github.com/protocolbuffers/protobuf/issues/7340
>
> Is there a way in which this can made faster?
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/14070f69-4ee6-4cae-a726-1f492d021691n%40googlegroups.com.


[protobuf] Re: Case insensitive languages and field names

2023-11-21 Thread 'sh...@google.com' via Protocol Buffers
This is new to me as well. For Java it seems like it distinguishes the fields by 
adding "_{field#}" to the end of the accessor name, for example getS1(), 
getS2(). C++, which I'm less familiar with, inserts the scope "O" 
between the two fields but keeps the same name, i.e. "s". 

On Sunday, November 5, 2023 at 1:25:00 AM UTC-5 Carl Gay wrote:

> On Saturday, November 4, 2023 at 11:49:41 PM UTC-4 Carl Gay wrote:
>
> Hi. I'm implementing  protocol 
> buffers for a case-insensitive language (Dylan) and I'm wondering if 
> there's prior art that would give me an idea how to handle the following 
> case:
>
> message M {
>   string S = 1;
>   oneof O {
> string s = 2;
>   }
> }
>
> Since fields M.S and M.s are siblings, the names to access those fields 
> will collide. I can generate the name "s*" for lowercase "s". Since "*" is 
> invalid in proto field names that's future proof.
>
> So I think I have a way forward but I'm curious if anyone knows of other 
> languages that have already addressed this, and what they did.
>
> The Common Lisp implementation ignores the problem and relies on it not 
> happening.  The docs for the PHP implementation don't mention this 
> particular problem. (They do mention reserved names such as Empty being 
> renamed to PBEmpty, but that's not a problem for Dylan.)
>
> Thoughts?  Prayers?
>
> Thanks.
> -Carl
>
>
> A couple more thoughts:
>
> Auto-assigning a different name for the field, such as "s*", would have to 
> be resilient to movement of the fields textually within the file. If the 
> oneof were moved textually above field S the name "s*" could suddenly name 
> a different field.
>
> Another possibility: a custom field option: dylan_name = "s*". This 
> assumes access to modify the .proto file, or the need to maintain a copy of 
> it.
>
> Maintain a custom name mapping file to pass to the proto parser / code 
> generator.
>
> (This is largely an academic exercise. This scenario seems unlikely to 
> happen in practice, especially considering the number of Dylan users on the 
> planet.)
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/f004118e-d5c0-435a-8e5b-27e7df4dba78n%40googlegroups.com.


[protobuf] Re: upb help

2023-11-21 Thread 'sh...@google.com' via Protocol Buffers
+acozze...@google.com

On Friday, November 17, 2023 at 6:11:42 AM UTC-5 Dav-11 wrote:

> Hello,
> I am trying to get started with a project using upb (
> https://github.com/protocolbuffers/protobuf/blob/main/upb/README.md) but 
> I cannot understand the project structure and how to build it.
>
> There are 3 repositories, each one with some informations:
>
>- https://github.com/protocolbuffers/upb (old)
>- https://github.com/protocolbuffers/protobuf/tree/main/upb (seems to 
>be the actual one)
>- https://github.com/haberman/upb/tree/main (last commit was in 2021
>
> I know PHP , Python and Ruby protobuf implementation are based on this, 
> but I am quite new with bazel and the BUILD files are complex. can someone 
> help me to understand how can I build the library (and also, if it is 
> possible, to understand how the project is structured) ?
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/35448569-66a5-4053-a5f1-fd1c84d08b39n%40googlegroups.com.


[protobuf] Re: Can't pickle repeated scalar fields, convert to list first

2023-11-21 Thread 'sh...@google.com' via Protocol Buffers
You can file an issue on GitHub with more details about the protobuf versions, 
OS, etc.: https://github.com/protocolbuffers/protobuf/issues

On Friday, November 17, 2023 at 6:48:05 AM UTC-5 priya_cast_23 wrote:

> I was wondering if someone experienced this error before. 
> I tried to check if the descriptor is a  repeated scalar fields and 
> convert to a list by
>  changing the new_element to a list but it did not work. 
>
> I appreciate your time and thoughtful consideration.
> On Thursday, November 16, 2023 at 12:39:58 PM UTC priya_cast_23 wrote:
>
>> I am trying to parallelize a protobuf application 
>> one instance in each thread but I am getting this error 
>>
>> /usr/local/lib/python3.10/dist-packages/google/protobuf/internal/containers.py",
>>  
>> line 250, in __reduce__
>> raise pickle.PickleError("Can't pickle repeated scalar fields, 
>> convert to list first")
>>
>> I compiled protobuf 
>> ./configure CPPFLAGS="-DGOOGLE_PROTOBUF_NO_THREAD_SAFETY"
>>
>>   def __reduce__(self, **kwargs) -> NoReturn:
>> # Convert repeated scalar fields to lists
>> raise pickle.PickleError("Can't pickle repeated scalar fields, 
>> convert to list first") 
>>
>>
>>   def add(self, **kwargs: Any) -> _T:
>> """Adds a new element at the end of the list and returns it. Keyword
>> arguments may be used to initialize the element.
>> """
>> new_element = self._message_descriptor._concrete_class(**kwargs)
>> new_element._SetListener(self._message_listener)
>> self._values.append(new_element)
>> if not self._message_listener.dirty:
>>   self._message_listener.Modified()
>> return new_element
>>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/0eeb63ad-9c3d-41b7-af98-913f29c4d13bn%40googlegroups.com.


[protobuf] Re: TypeError: Message must be initialized with a dict

2023-11-21 Thread Fred Douglis
I found this thread when searching for the same message, but my solution 
was entirely different. It turned out that I was trying to update a client 
to use a slightly modified proto file, and I was making a call that assigned 
to a field that no longer existed. But the error message here (TypeError: 
Message must be initialized with a dict) is completely misleading -- the 
real message should have been "unknown argument" or "unknown field" or 
something similar.

On Sunday, June 11, 2023 at 12:38:42 AM UTC-4 Tony Piazza wrote:

> The problem had to do with the Date and Timestamp fields. The problem was 
> solved by converting them to int64.
>
> On Saturday, June 10, 2023 at 10:56:36 PM UTC-5 Tony Piazza wrote:
>
>> My current task requires me to use the BigQuery Storage Write API. I have 
>> created a .proto file and was able to use protoc to generate a Python 
>> message class. I am seeing this exception when creating an instance of that 
>> class:
>>
>> *TypeError: Message must be initialized with a dict: 
>> combocurve.Measurement*
>>
>> *File "/google/api_core/grpc_helpers.py", line 162, in 
>> error_remapped_callable*
>>
>> Here is my .proto file:
>>
>> syntax = "proto2";
>>
>> package combocurve;
>>
>> import "google/protobuf/timestamp.proto";
>> import "google/type/date.proto";
>>
>> message Measurement {
>> required string device_id = 1;
>> required google.type.Date last_service_date = 2;
>> optional double temperature = 3;
>> optional double pressure = 4;
>> optional google.protobuf.Timestamp created_at = 5;
>> }
>>
>> Here is the code that is raising the exception:
>>
>> measurement = Measurement(
>> device_id='ABC123',
>> last_service_date=date_pb2.Date(
>> year=last_service_date.year, 
>> month=last_service_date.month, 
>> day=last_service_date.day),
>> temperature=10.0,
>> pressure=20.0,
>> created_at=int(created_at.timestamp() * 1e6)
>> )
>>
>> Please let me know if you have any ideas as to what is causing this 
>> exception.
>>
>> Thanks in advance for your help!
>>
>> -Tony
>>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/cf2858c3-fd19-491f-a44b-ad336d67b8a6n%40googlegroups.com.


[protobuf] Re: Can't pickle repeated scalar fields, convert to list first

2023-11-17 Thread priya_cast_23
I was wondering if someone has experienced this error before. 
I tried to check whether the descriptor is a repeated scalar field and convert 
it to a list by changing new_element to a list, but it did not work. 

I appreciate your time and thoughtful consideration.
On Thursday, November 16, 2023 at 12:39:58 PM UTC priya_cast_23 wrote:

> I am trying to parallelize a protobuf application 
> one instance in each thread but I am getting this error 
>
> /usr/local/lib/python3.10/dist-packages/google/protobuf/internal/containers.py",
>  
> line 250, in __reduce__
> raise pickle.PickleError("Can't pickle repeated scalar fields, convert 
> to list first")
>
> I compiled protobuf 
> ./configure CPPFLAGS="-DGOOGLE_PROTOBUF_NO_THREAD_SAFETY"
>
>   def __reduce__(self, **kwargs) -> NoReturn:
> # Convert repeated scalar fields to lists
> raise pickle.PickleError("Can't pickle repeated scalar fields, convert 
> to list first") 
>
>
>   def add(self, **kwargs: Any) -> _T:
> """Adds a new element at the end of the list and returns it. Keyword
> arguments may be used to initialize the element.
> """
> new_element = self._message_descriptor._concrete_class(**kwargs)
> new_element._SetListener(self._message_listener)
> self._values.append(new_element)
> if not self._message_listener.dirty:
>   self._message_listener.Modified()
> return new_element
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/43f80425-87f4-40e6-b024-6eed11694809n%40googlegroups.com.


[protobuf] Re: Case insensitive languages and field names

2023-11-04 Thread Carl Gay
On Saturday, November 4, 2023 at 11:49:41 PM UTC-4 Carl Gay wrote:

Hi. I'm implementing  protocol 
buffers for a case-insensitive language (Dylan) and I'm wondering if 
there's prior art that would give me an idea how to handle the following 
case:

message M {
  string S = 1;
  oneof O {
string s = 2;
  }
}

Since fields M.S and M.s are siblings, the names to access those fields 
will collide. I can generate the name "s*" for lowercase "s". Since "*" is 
invalid in proto field names that's future proof.

So I think I have a way forward but I'm curious if anyone knows of other 
languages that have already addressed this, and what they did.

The Common Lisp implementation ignores the problem and relies on it not 
happening.  The docs for the PHP implementation don't mention this 
particular problem. (They do mention reserved names such as Empty being 
renamed to PBEmpty, but that's not a problem for Dylan.)

Thoughts?  Prayers?

Thanks.
-Carl


A couple more thoughts:

Auto-assigning a different name for the field, such as "s*", would have to 
be resilient to movement of the fields textually within the file. If the 
oneof were moved textually above field S the name "s*" could suddenly name 
a different field.

Another possibility: a custom field option: dylan_name = "s*". This assumes 
access to modify the .proto file, or the need to maintain a copy of it.

Maintain a custom name mapping file to pass to the proto parser / code 
generator.

(This is largely an academic exercise. This scenario seems unlikely to 
happen in practice, especially considering the number of Dylan users on the 
planet.)
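If the custom-option idea above ever becomes worth doing, it is just an ordinary extension of 
google.protobuf.FieldOptions; a rough sketch (the option name and field number are made up):

```proto
syntax = "proto2";

import "google/protobuf/descriptor.proto";

extend google.protobuf.FieldOptions {
  // Hypothetical option a Dylan code generator could honour.
  optional string dylan_name = 50001;
}

message M {
  optional string S = 1;
  oneof O {
    string s = 2 [(dylan_name) = "s*"];
  }
}
```

The generator can then read the option off each field's descriptor instead of deriving a name 
from textual position.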

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/eb524684-d548-4b95-8411-77f07a9b28f9n%40googlegroups.com.


[protobuf] Re: Unclearness in the specification

2023-10-23 Thread 'Em Rauch' via Protocol Buffers
This part of the official grammar is unfortunately unclear; you may find Buf's 
reverse engineering of the protobuf grammar to be clearer on this point 
for your use case: https://protobuf.com/docs/language-spec

On Saturday, October 21, 2023 at 11:43:45 AM UTC-4 Mattias Hansson wrote:

> I am currently trying to implement a protobuf parser but it's a bit 
> unclear to me how I should interpret some parts of the specification.
>
> Lets start with RPC option. Options for normal fields and enum fields are 
> pretty clear.
>
> Here's the specification for normal fields:
> field = [ "repeated" ] type fieldName "=" fieldNumber [ "[" fieldOptions 
> "]" ] ";" fieldOptions = fieldOption { "," fieldOption } fieldOption = 
> optionName "=" constant
>
> Here's the specification for enum fields:
> enum = "enum" enumName enumBody enumBody = "{" { option | enumField | 
> emptyStatement | reserved } "}" enumField = ident "=" [ "-" ] intLit [ 
> "[" enumValueOption { "," enumValueOption } "]" ]";" enumValueOption = 
> optionName "=" constant
>
> Both fields boils down to optionName which is specified as:
> optionName = ( ident | "(" ["."] fullIdent ")" )  
> Here's the specification for RPC:
> service = "service" serviceName "{" { option | rpc | emptyStatement } "}" rpc 
> = "rpc" rpcName "(" [ "stream" ] messageType ")" "returns" "(" [ "stream" ] 
> messageType 
> ")" (( "{" {option | emptyStatement } "}" ) | ";")
>
> RPC refers to plain option. Does this mean that the following is valid and 
> intended syntax?
> rpc Foo (Req) returns (Res);
> rpc Foo (Req) returns (Res) {option opt1 = "opt1"; option opt2 = true; };
> rpc Foo (Req) returns (Res) {option opt1 = "opt1"; ;;; option opt2 = true; 
> ;;;};
>
> I added number 3, because it seems like repetition is allowed together 
> with emptyStatement. As a side note, I could not find any examples with RPC 
> options as inspiration or to clear things up.
>
> Further, I also find the specification for optionName a bit unclear.
> Here's the specification for option and optionName:
> option = "option" optionName "=" constant ";" optionName = ( ident | "(" 
> ["."] fullIdent ")" ) 
> optionName does not seem to allow for repetition and fullIdent is only 
> allowed within parenthesis(custom option). However multiple examples on 
> https://protobuf.dev/programming-guides/proto3/#option-targets use 
> custom-like options on the format:
> option (bar).baz = "value";
>
> For me it's pretty clear that the following optionNames are allowed:
> foo
> (foo)
> (foo.bar)
> (.foo)
> (.foo.bar)
>
> I don't really get which part of the specification that states that "baz", 
> in the example above, is allowed outside of the parenthesis.
>
> Thanks in advance,
>
> /Mattias
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/3c97d4a5-bd9f-46bc-9e7c-4dfc80e3145an%40googlegroups.com.


[protobuf] Re: Golang Dynamically create Message from .proto files

2023-10-23 Thread 'Em Rauch' via Protocol Buffers
This exists in Java as DynamicMessage but I believe the equivalent does not 
exist in the official Go implementation.

You may find an OSS implementation that achieves the same (for example 
https://github.com/bufbuild/prototransform may do what you need).

On Saturday, October 21, 2023 at 2:46:45 PM UTC-4 Gary Vidal wrote:

> I have a use case where I need to dynamically read a .proto definition and 
> deserialize a byte array of matching message. I scoured the internet does 
> anyone have a working example. I am writing a kafka plugin that filters 
> messages and uses a schema registry to store and retrieve proto files.  
>
> Something like the following:
> DeserializeMessage(proto string, bytes byte[]). 
>
> Must be in go please.
>
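For completeness: the official Go module does provide dynamic messages via 
google.golang.org/protobuf/types/dynamicpb; what it does not do is parse .proto source at 
runtime, so you still need descriptors from somewhere (for example a FileDescriptorSet produced 
with protoc --descriptor_set_out, or a runtime parser such as github.com/bufbuild/protocompile). 
A rough sketch under that assumption (file path and message name are hypothetical):

```go
package main

import (
	"fmt"
	"os"

	"google.golang.org/protobuf/encoding/prototext"
	"google.golang.org/protobuf/proto"
	"google.golang.org/protobuf/reflect/protodesc"
	"google.golang.org/protobuf/reflect/protoreflect"
	"google.golang.org/protobuf/types/descriptorpb"
	"google.golang.org/protobuf/types/dynamicpb"
)

// DeserializeMessage parses payload as the message named fullName, using a
// FileDescriptorSet generated ahead of time (e.g. protoc --descriptor_set_out).
func DeserializeMessage(descSetPath, fullName string, payload []byte) (*dynamicpb.Message, error) {
	raw, err := os.ReadFile(descSetPath)
	if err != nil {
		return nil, err
	}
	var fds descriptorpb.FileDescriptorSet
	if err := proto.Unmarshal(raw, &fds); err != nil {
		return nil, err
	}
	files, err := protodesc.NewFiles(&fds)
	if err != nil {
		return nil, err
	}
	d, err := files.FindDescriptorByName(protoreflect.FullName(fullName))
	if err != nil {
		return nil, err
	}
	md, ok := d.(protoreflect.MessageDescriptor)
	if !ok {
		return nil, fmt.Errorf("%s is not a message", fullName)
	}
	msg := dynamicpb.NewMessage(md)
	if err := proto.Unmarshal(payload, msg); err != nil {
		return nil, err
	}
	return msg, nil
}

func main() {
	msg, err := DeserializeMessage("schema.binpb", "mypkg.MyMessage", nil) // hypothetical names
	if err != nil {
		panic(err)
	}
	fmt.Println(prototext.Format(msg))
}
```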

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/ba3cb795-8743-4772-a286-ea378de5394fn%40googlegroups.com.


[protobuf] Re: Documentation about endianness support

2023-10-21 Thread Marc Gravell
protobuf ensures that the *value of primitives* will be preserved; so if 
you send the uint32 23433451, it will be decoded as  23433451 regardless of 
the CPU/process endianness of sender and receiver. The details are 
here: https://protobuf.dev/programming-guides/encoding/ - but they **don't 
matter** unless you're writing your own protocol-level tooling.

What the receiver does with that value, however, is up to the receiver. For 
example, if they interpret that as IPv4 bytes using shift/mask operators: 
it will work the same anywhere. If they interpret that as IPv4 bytes 
using "punning" or any other in-place re-interpret cast: then expect 
problems. But that's not an issue in the protobuf side - it is an issue in 
the code that *consumed it*.
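A tiny illustration of that last point, with hypothetical values: the uint32 round-trips as the 
same number everywhere, and shift/mask extraction of the octets is byte-order independent, while 
a native-order byte view of the integer is not.

```python
import struct

ip_as_uint32 = 0xC0A80001  # 192.168.0.1, e.g. read from a protobuf uint32 field

# Portable: arithmetic on the value gives the same octets on any CPU.
octets = [(ip_as_uint32 >> shift) & 0xFF for shift in (24, 16, 8, 0)]
print(".".join(map(str, octets)))        # 192.168.0.1

# Host-dependent: the native-order in-memory bytes differ between machines.
print(struct.pack("=I", ip_as_uint32))   # b'\x01\x00\xa8\xc0' on little-endian hosts
```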

On Saturday, 21 October 2023 at 07:48:03 UTC+1 Kumar bhat wrote:

> What happens if a uint32_t is sent from a little endian system and received 
> by a big endian system? For example, an IPv4 address. Can we send the IP address 
> in any byte order and be able to see the same value on a receiver with a 
> different endianness? If that's the case, I don't see any documentation 
> for this. 

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/b5b87de0-6cd0-471a-b517-6d2e2f62f318n%40googlegroups.com.


Re: [protobuf] Re: Python Protobuf Serialization Performance In 2023

2023-10-06 Thread 'Adam Cozzette' via Protocol Buffers
Starting with 4.21.0, the default Python implementation is now based on upb,
and this should be faster than the older Python C++ implementation.
Python-C++ is actually deprecated, and we are thinking about deleting it at
some point.
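If you want to confirm which backend a given environment actually picked up, the 
api_implementation module reports it (it is an internal module, so treat this as a debugging aid 
rather than a stable API):

```python
from google.protobuf.internal import api_implementation

# Prints 'upb', 'cpp', or 'python' depending on the protobuf version and the
# PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION environment variable.
print(api_implementation.Type())
```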

On Mon, Oct 2, 2023 at 2:23 PM 'Daniel Koohmarey' via Protocol Buffers <
protobuf@googlegroups.com> wrote:

> Note for anyone stumbling upon this thread in the future, leveraging cpp
> generated protobuf serialization that was pybound to python yielded a 5x+
> improvement in serialization times over just
> PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=cpp .
>
> On Tuesday, August 22, 2023 at 11:03:30 AM UTC-7 Daniel Koohmarey wrote:
>
>> I stumbled upon a post from 2010 mentioning using CExtensions to improve
>> python protobuf serialization times:
>> https://groups.google.com/g/protobuf/c/z7E80KYJscc/m/ysCjHHmoraUJ
>> where the author states "~13x speedups" as a result of "Python code with
>> PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=cpp actually
>> also *searches for the symbols for any pre-generated C++ code in the
>> current process*, and uses them if available instead of
>> DynamicMessage." when using a CExtension. I can find reference code on
>> github
>> 
>> that uses this technique, and references a corresponding speedup, however I
>> do notice that all code using this approach is ~10 years old. Is this
>> approach still beneficial in any way to improve serialization performance
>> in python? Or would protobuf 3.19+ with
>> PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=cpp result in equally performant
>> serialization when serializing in Python3? Thanks
>> -  Daniel
>>
> --
> You received this message because you are subscribed to the Google Groups
> "Protocol Buffers" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to protobuf+unsubscr...@googlegroups.com.
> To view this discussion on the web visit
> https://groups.google.com/d/msgid/protobuf/2cc6947d-6b59-4b3e-b3df-767418a866f4n%40googlegroups.com
> 
> .
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/CADqAXr6WAnps-o2LzbULsdox8zvR79PYWhBJK_oXcE-f5Y6Jkg%40mail.gmail.com.


[protobuf] Re: Python Protobuf Serialization Performance In 2023

2023-10-02 Thread 'Daniel Koohmarey' via Protocol Buffers
Note for anyone stumbling upon this thread in the future, leveraging cpp 
generated protobuf serialization that was pybound to python yielded a 5x+ 
improvement in serialization times over just 
PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=cpp .

On Tuesday, August 22, 2023 at 11:03:30 AM UTC-7 Daniel Koohmarey wrote:

> I stumbled upon a post from 2010 mentioning using CExtensions to improve 
> python protobuf serialization times:
> https://groups.google.com/g/protobuf/c/z7E80KYJscc/m/ysCjHHmoraUJ
> where the author states "~13x speedups" as a result of "Python code with 
> PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=cpp actually
> also *searches for the symbols for any pre-generated C++ code in the
> current process*, and uses them if available instead of
> DynamicMessage." when using a CExtension. I can find reference code on 
> github 
> 
>  
> that uses this technique, and references a corresponding speedup, however I 
> do notice that all code using this approach is ~10 years old. Is this 
> approach still beneficial in any way to improve serialization performance 
> in python? Or would protobuf 3.19+ with 
> PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=cpp result in equally performant 
> serialization when serializing in Python3? Thanks
> -  Daniel
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/2cc6947d-6b59-4b3e-b3df-767418a866f4n%40googlegroups.com.


Re: [protobuf] Re: Cannot statically link with protobuf from windows+vcpkg

2023-09-28 Thread Michael Ngarimu
I’m not 100% sure who maintains the protobuf vcpkg port, but 
GitHub.com/microsoft/vcpkg is always a good place to start for anything 
vcpkg related.

Regards,
-- Michael "Kiwi" Ngarimu

On Sep 28, 2023, at 01:50, Teo Tyrov wrote:

> I checked the installed protobuf distribution in vcpkg's directory and 
> indeed it seems that it only supports dynamic libraries...
>
> I would gladly move this discussion to somewhere more related to the vcpkg 
> package, but when I search the package on the website 
> (https://vcpkg.io/en/packages), it does not show any info at all 
> (maintainer, contact, any flags to control static/dynamic builds). Is vcpkg 
> completely centralized (because you suggested opening an issue to vcpkg 
> itself)?
>
> I do specify the toolchain file in my cmake invocation, in my github build 
> pipeline file 
> https://github.com/em-eight/ppc2cpp/blob/651231349e7e5745d64f5ef6294fb76704351b07/.github/workflows/windows.yml#L30
>
> In any case, it would be very convenient if protobuf's maintainers added a 
> windows binary distribution to the release files. I think a regular static 
> build covers 99% of the use cases.
>
> On Thursday, September 28, 2023 at 12:24:14 AM UTC+3 Michael Ngarimu wrote:



-- 
You received this message because you are subscribed to a topic in the Google Groups "Protocol Buffers" group.
To unsubscribe from this topic, visit https://groups.google.com/d/topic/protobuf/F1sBHcWzdQk/unsubscribe.
To unsubscribe from this group and all its topics, send an email to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/protobuf/e2a12ccf-c00f-4b18-bcf0-c99d7ac242d1n%40googlegroups.com.




-- 
You received this message because you are subscribed to the Google Groups "Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/protobuf/C587449E-336E-4F74-B596-7D247DDA0A2C%40gmail.com.


[protobuf] Re: Cannot statically link with protobuf from windows+vcpkg

2023-09-28 Thread Teo Tyrov
I checked the installed protobuf distribution in vcpkg's directory and 
indeed it seems that it only supports dynamic libraries...

I would gladly move this discussion to somewhere more related to the vcpkg 
package, but when I search the package on the website 
(https://vcpkg.io/en/packages), it does not show any info at all 
(maintainer, contact, any flags to control static/dynamic builds). Is vcpkg 
completely centralized (because you suggested opening an issue to vcpkg 
itself)?

I do specify the toolchain file in my cmake invocation, in my github build 
pipeline file 
https://github.com/em-eight/ppc2cpp/blob/651231349e7e5745d64f5ef6294fb76704351b07/.github/workflows/windows.yml#L30

In any case, it would be very convenient if protobuf's maintainers added a 
windows binary distribution to the release files. I think a regular static 
build covers 99% of the use cases.
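
For reference, the static/dynamic choice in vcpkg is normally made through the triplet rather 
than a CMake cache variable such as Protobuf_USE_STATIC_LIBS; a rough sketch, with hypothetical 
paths and not verified against the current protobuf port:

```
rem classic mode: install the static variant of the port
vcpkg install protobuf:x64-windows-static

rem manifest mode: pick the triplet when configuring (cmd-style line continuations)
cmake -B build -S . ^
  -DCMAKE_TOOLCHAIN_FILE=C:/vcpkg/scripts/buildsystems/vcpkg.cmake ^
  -DVCPKG_TARGET_TRIPLET=x64-windows-static
```

Note that x64-windows-static also links the CRT statically; x64-windows-static-md is the usual 
alternative if the rest of the build expects the dynamic CRT.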
On Thursday, September 28, 2023 at 12:24:14 AM UTC+3 Michael Ngarimu wrote:

> I haven’t looked at all of your specific details but it looks like 
> issue(s) might be how you’re using vcpkg and not related to protobuf. I 
> don’t how list admins feel about whether this is OT or not.
>
> If this discussion needs to move then probably vcpkg github or somewhere 
> might be good to start.
>
> From your description it sounds like vcpkg is building the dynamic library 
> variant of protobuf. I’ve usually used the vcpkg.json manifest for 
> selecting which variant to build. I don’t know if vcpkg uses CMake 
> variables when building the vcpkg port itself.
>
> I do know that vcpkg uses a very specific build environment that the 
> vcpkg.exe executable curates when ports are built so that they are always 
> built in an identical environment, meaning it may not be the same as the 
> environment in your CMake invocation. 
>
> What I don’t see in your CMakefile is the CMAKE_TOOLCHAIN_FILE pointing to 
> your install of vcpkg.cmake. Is that no longer needed?
>
> On Tuesday, September 26, 2023 at 2:39:56 PM UTC-7 Teo Tyrov wrote:
>
>> Hello, I am trying to link my library (ppc2cpp_core.lib) with protobuf, 
>> and then create the executable ppc2cpp.exe. I want to link everything 
>> statically, so that downstream users do not need to install protobuf 
>> themselves.
>>
>> I am successfully building the library and executable on a github 
>> workflow, but when I try to run it locally, I get "libprotobuf.dll 
>> missing". I've tried using `set(Protobuf_USE_STATIC_LIBS ON)` but it still 
>> doesn't work.
>>
>> My cmake file: 
>> https://github.com/em-eight/ppc2cpp/blob/main/CMakeLists.txt
>>
>> Full workflow logs: 
>> https://github.com/em-eight/ppc2cpp/actions/runs/6318382821/job/17157239443
>>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/e2a12ccf-c00f-4b18-bcf0-c99d7ac242d1n%40googlegroups.com.


[protobuf] Re: Cannot statically link with protobuf from windows+vcpkg

2023-09-27 Thread Michael Ngarimu
I haven’t looked at all of your specific details but it looks like issue(s) 
might be how you’re using vcpkg and not related to protobuf. I don’t know how 
list admins feel about whether this is OT or not.

If this discussion needs to move then probably vcpkg github or somewhere 
might be good to start.

From your description it sounds like vcpkg is building the dynamic library 
variant of protobuf. I’ve usually used the vcpkg.json manifest for 
selecting which variant to build. I don’t know if vcpkg uses CMake 
variables when building the vcpkg port itself.

I do know that vcpkg uses a very specific build environment that the 
vcpkg.exe executable curates when ports are built so that they are always 
built in an identical environment, meaning it may not be the same as the 
environment in your CMake invocation. 

What I don’t see in your CMakefile is the CMAKE_TOOLCHAIN_FILE pointing to 
your install of vcpkg.cmake. Is that no longer needed?

On Tuesday, September 26, 2023 at 2:39:56 PM UTC-7 Teo Tyrov wrote:

> Hello, I am trying to link my library (ppc2cpp_core.lib) with protobuf, 
> and then create the executable ppc2cpp.exe. I want to link everything 
> statically, so that downstream users do not need to install protobuf 
> themselves.
>
> I am successfully building the library and executable on a github 
> workflow, but when I try to run it locally, I get "libprotobuf.dll 
> missing". I've tried using `set(Protobuf_USE_STATIC_LIBS ON)` but it still 
> doesn't work.
>
> My cmake file: 
> https://github.com/em-eight/ppc2cpp/blob/main/CMakeLists.txt
>
> Full workflow logs: 
> https://github.com/em-eight/ppc2cpp/actions/runs/6318382821/job/17157239443
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/962f5eea-3842-42ea-b9f5-30d55f67dfeen%40googlegroups.com.


[protobuf] Re: Trying to rewrite a protobuf message but changing a couple of values

2023-09-20 Thread Joan Balagueró
Ok thanks, I was trying to avoid this (compiling and using the generated 
classes). We are currently processing the raw protobuf message as a map 
of UnknownFieldSet, and I was wondering if there was a way to make this 
"rewrite" without the need to compile and use the classes, just by traversing 
the protobuf.

I will check and try.

Thanks,

Joan.
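
If you do stay on the raw UnknownFieldSet route, the key is to rebuild each nested set and write 
it back as a length-delimited field, so the length prefixes are recomputed for you. A rough Java 
sketch, assuming the layout from the dump earlier in this thread (field 1 = Meta, field 4 inside 
it = CutOffTime) and writing the raw wire values 7 and 4 from that example (note CutOffTime.value 
is sint64, i.e. zigzag-encoded on the wire):

```java
import com.google.protobuf.ByteString;
import com.google.protobuf.UnknownFieldSet;

class CutOffTimeRewriter {
    static byte[] rewrite(byte[] document) throws Exception {
        UnknownFieldSet root = UnknownFieldSet.parseFrom(document);

        ByteString metaBytes = root.getField(1).getLengthDelimitedList().get(0);
        UnknownFieldSet meta = UnknownFieldSet.parseFrom(metaBytes);

        // Replacement CutOffTime: field 1 = 7, field 2 = 4 (raw varints).
        UnknownFieldSet newCutOff = UnknownFieldSet.newBuilder()
            .addField(1, UnknownFieldSet.Field.newBuilder().addVarint(7).build())
            .addField(2, UnknownFieldSet.Field.newBuilder().addVarint(4).build())
            .build();

        // Swap field 4 inside Meta, re-serializing the nested set as length-delimited bytes.
        UnknownFieldSet newMeta = UnknownFieldSet.newBuilder(meta)
            .clearField(4)
            .addField(4, UnknownFieldSet.Field.newBuilder()
                .addLengthDelimited(newCutOff.toByteString()).build())
            .build();

        // Swap field 1 at the root the same way.
        UnknownFieldSet newRoot = UnknownFieldSet.newBuilder(root)
            .clearField(1)
            .addField(1, UnknownFieldSet.Field.newBuilder()
                .addLengthDelimited(newMeta.toByteString()).build())
            .build();

        return newRoot.toByteArray();
    }
}
```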

On Wednesday, September 20, 2023 at 12:30:26 PM UTC+2 Florian Enner wrote:

> As the very first step you should compile your schema and work with the 
> generated classes: 
> https://protobuf.dev/getting-started/javatutorial/#compiling-protocol-buffers
>
> Don't mess with the reflection API if you can avoid it. 
>
>
> On Wednesday, September 20, 2023 at 12:19:38 PM UTC+2 Joan Balagueró wrote:
>
>> Hi Florian,
>>
>> Not sure what I can do with this code. I only have a byte array 
>> representing the above protobuf, I don't have any object or message. At 
>> most I have a map with the list of the "UnknownFieldSet.Field" fields after 
>> parsing the byte array (this is java code).
>>
>> I'm relatively new to protobuf, I read a lot on the Internet trying to 
>> find a solution but I did not find anything.
>>
>> That's why I tried to traverse the map of fields and write them to the 
>> new byte array. It's the only solution I could think of, but I'm doing 
>> something wrong.
>>
>> Not sure if you can help me a bit more to solve this.
>>
>> Anyways thanks.
>>
>> Joan.
>>
>>
>>
>>
>>
>> On Wednesday, September 20, 2023 at 11:56:19 AM UTC+2 Florian Enner wrote:
>>
>>> Messages are serialized with a length delimiter, so changing the content 
>>> produces a mismatch and invalid message.
>>>
>>> Your schema has no affected repeated fields, so appending a delta should 
>>> work. I've never used the C# API, but here is some hopefully understandable 
>>> pseudo code:
>>>
>>> var delta = Request.newInstance();
>>> delta.getMutableMeta().getMutableCutOffTime()   
>>>   .setValue(value)
>>>   .setScale(TimeSpanScale.MINMAX)
>>> byte[] output = append(unmodifiedInputBytes, delta.toByteArray());
>>>
>>> If the server expects a length delimiter you'd need to update it to the 
>>> new combined length.
>>>
>>> - Florian
>>>
>>> On Wednesday, September 20, 2023 at 11:03:46 AM UTC+2 Joan Balagueró 
>>> wrote:
>>>
 Hi Florian,

 Thanks for your quick response. I'm stuck on this.

 1) It's not working. When I send the protobuf to the backend server 
 (it's not our api nor server) using the first method, I get a right 
 response. But using the second method I receive this error:
 ProtoBuf.ProtoException: Invalid wire-type; this usually means you have 
 over-written a file without truncating or setting the length; see 
 https://stackoverflow.com/q/2152978/23354
at ProtoBuf.ProtoReader.StartSubItem(ProtoReader reader) in 
 C:\code\protobuf-net\src\protobuf-net\ProtoReader.cs:line 637
at ProtoBuf.ProtoReader.ReadTypedObject(Object value, Int32 key, 
 ProtoReader reader, Type type) in 
 C:\code\protobuf-net\src\protobuf-net\ProtoReader.cs:line 584
at proto_40(Object , ProtoReader )
at ProtoBuf.Meta.TypeModel.DeserializeCore(ProtoReader reader, Type 
 type, Object value, Boolean noAutoCreate) in 
 C:\code\protobuf-net\src\protobuf-net\Meta\TypeModel.cs:line 722
at ProtoBuf.Meta.TypeModel.Deserialize(Stream source, Object value, 
 Type type, SerializationContext context) in 
 C:\code\protobuf-net\src\protobuf-net\Meta\TypeModel.cs:line 599
at 
 WebBeds.Connect.AspNetCore.Formatters.ProtobufInputFormatter.ReadRequestBodyAsync(InputFormatterContext
  
 context)
at 
 Microsoft.AspNetCore.Mvc.Formatters.InputFormatter.ReadAsync(InputFormatterContext
  
 context)
at 
 Microsoft.AspNetCore.Mvc.ModelBinding.Binders.BodyModelBinder.BindModelAsync(ModelBindingContext
  
 bindingContext)
at 
 Microsoft.AspNetCore.Mvc.ModelBinding.ParameterBinder.BindModelAsync(ActionContext
  
 actionContext, IModelBinder modelBinder, IValueProvider valueProvider, 
 ParameterDescriptor parameter, ModelMetadata metadata, Object value, 
 Object 
 container)
at 
 Microsoft.AspNetCore.Mvc.Controllers.ControllerBinderDelegateProvider.<>c__DisplayClass0_0.d.MoveNext()
 --- End of stack trace from previous location ---
at 
 Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.g__Awaited|13_0(ControllerActionInvoker
  
 invoker, Task lastTask, State next, Scope scope, Object state, Boolean 
 isCompleted)
at 
 Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.g__Awaited|26_0(ResourceInvoker
  
 invoker, Task lastTask, State next, Scope scope, Object state, Boolean 
 isCompleted


 2) This is the proto:

 The request with the 'Meta' element:

 message Request {
Meta Meta = 1;
repeated int32 Hotels 

[protobuf] Re: Trying to rewrite a protobuf message but changing a couple of values

2023-09-20 Thread Florian Enner
As the very first step you should compile your schema and work with the 
generated classes: 
https://protobuf.dev/getting-started/javatutorial/#compiling-protocol-buffers

Don't mess with the reflection API if you can avoid it. 


On Wednesday, September 20, 2023 at 12:19:38 PM UTC+2 Joan Balagueró wrote:

> Hi Florian,
>
> Not sure what I can do with this code. I only have a byte array 
> representing the above protobuf, I don't have any object or message. At 
> most I have a map with the list of the "UnknownFieldSet.Field" fields after 
> parsing the byte array (this is java code).
>
> I'm relatively new to protobuf, I read a lot on the Internet trying to 
> find a solution but I did not find anything.
>
> That's why I tried to traverse the map of fields and write them to the new 
> byte array. It's the only solution I could think of, but I'm doing 
> something wrong.
>
> Not sure if you can help me a bit more to solve this.
>
> Anyways thanks.
>
> Joan.
>
>
>
>
>
> On Wednesday, September 20, 2023 at 11:56:19 AM UTC+2 Florian Enner wrote:
>
>> Messages are serialized with a length delimiter, so changing the content 
>> produces a mismatch and invalid message.
>>
>> Your schema has no affected repeated fields, so appending a delta should 
>> work. I've never used the C# API, but here is some hopefully understandable 
>> pseudo code:
>>
>> var delta = Request.newInstance();
>> delta.getMutableMeta().getMutableCutOffTime()   
>>   .setValue(value)
>>   .setScale(TimeSpanScale.MINMAX)
>> byte[] output = append(unmodifiedInputBytes, delta.toByteArray());
>>
>> If the server expects a length delimiter you'd need to update it to the 
>> new combined length.
>>
>> - Florian
>>
>> On Wednesday, September 20, 2023 at 11:03:46 AM UTC+2 Joan Balagueró 
>> wrote:
>>
>>> Hi Florian,
>>>
>>> Thanks for your quick response. I'm stuck on this.
>>>
>>> 1) It's not working. When I send the protobuf to the backend server 
>>> (it's not our api nor server) using the first method, I get a right 
>>> response. But using the second method I receive this error:
>>> ProtoBuf.ProtoException: Invalid wire-type; this usually means you have 
>>> over-written a file without truncating or setting the length; see 
>>> https://stackoverflow.com/q/2152978/23354
>>>at ProtoBuf.ProtoReader.StartSubItem(ProtoReader reader) in 
>>> C:\code\protobuf-net\src\protobuf-net\ProtoReader.cs:line 637
>>>at ProtoBuf.ProtoReader.ReadTypedObject(Object value, Int32 key, 
>>> ProtoReader reader, Type type) in 
>>> C:\code\protobuf-net\src\protobuf-net\ProtoReader.cs:line 584
>>>at proto_40(Object , ProtoReader )
>>>at ProtoBuf.Meta.TypeModel.DeserializeCore(ProtoReader reader, Type 
>>> type, Object value, Boolean noAutoCreate) in 
>>> C:\code\protobuf-net\src\protobuf-net\Meta\TypeModel.cs:line 722
>>>at ProtoBuf.Meta.TypeModel.Deserialize(Stream source, Object value, 
>>> Type type, SerializationContext context) in 
>>> C:\code\protobuf-net\src\protobuf-net\Meta\TypeModel.cs:line 599
>>>at 
>>> WebBeds.Connect.AspNetCore.Formatters.ProtobufInputFormatter.ReadRequestBodyAsync(InputFormatterContext
>>>  
>>> context)
>>>at 
>>> Microsoft.AspNetCore.Mvc.Formatters.InputFormatter.ReadAsync(InputFormatterContext
>>>  
>>> context)
>>>at 
>>> Microsoft.AspNetCore.Mvc.ModelBinding.Binders.BodyModelBinder.BindModelAsync(ModelBindingContext
>>>  
>>> bindingContext)
>>>at 
>>> Microsoft.AspNetCore.Mvc.ModelBinding.ParameterBinder.BindModelAsync(ActionContext
>>>  
>>> actionContext, IModelBinder modelBinder, IValueProvider valueProvider, 
>>> ParameterDescriptor parameter, ModelMetadata metadata, Object value, Object 
>>> container)
>>>at 
>>> Microsoft.AspNetCore.Mvc.Controllers.ControllerBinderDelegateProvider.<>c__DisplayClass0_0.d.MoveNext()
>>> --- End of stack trace from previous location ---
>>>at 
>>> Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.g__Awaited|13_0(ControllerActionInvoker
>>>  
>>> invoker, Task lastTask, State next, Scope scope, Object state, Boolean 
>>> isCompleted)
>>>at 
>>> Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.g__Awaited|26_0(ResourceInvoker
>>>  
>>> invoker, Task lastTask, State next, Scope scope, Object state, Boolean 
>>> isCompleted
>>>
>>>
>>> 2) This is the proto:
>>>
>>> The request with the 'Meta' element:
>>>
>>> message Request {
>>>Meta Meta = 1;
>>>repeated int32 Hotels = 2 [packed = false];
>>>Country Market = 3;
>>>repeated Room Rooms = 4;
>>>.bcl.DateTime CheckIn = 5;
>>>.bcl.DateTime CheckOut = 6;
>>>OptionalCriteria OptionalCriteria = 7;
>>> }
>>>
>>> The 'Meta' element that contains the 'CutOffTime' that we want to modify:
>>>
>>> message Meta {
>>>int32 Client = 1;
>>>int32 Brand = 2;
>>>bool UseCache = 3;
>>>.bcl.TimeSpan CutOffTime = 4;
>>>bool B2C = 5;
>>>Language Language = 6;
>>>Currency Currency = 7;
>>>bool IncludeProviderAuditData = 8;
>>>SalesChan

[protobuf] Re: Trying to rewrite a protobuf message but changing a couple of values

2023-09-20 Thread Joan Balagueró
Hi Florian,

Not sure what I can do with this code. I only have a byte array 
representing the above protobuf, I don't have any object or message. At 
most I have a map with the list of the "UnknownFieldSet.Field" fields after 
parsing the byte array (this is java code).

I'm relatively new to protobuf, I read a lot on the Internet trying to find 
a solution but I did not find anything.

That's why I tried to traverse the map of fields and write them to the new 
byte array. It's the only solution I could think of, but I'm doing 
something wrong.

Not sure if you can help me a bit more to solve this.

Anyways thanks.

Joan.





On Wednesday, September 20, 2023 at 11:56:19 AM UTC+2 Florian Enner wrote:

> Messages are serialized with a length delimiter, so changing the content 
> produces a mismatch and invalid message.
>
> Your schema has no affected repeated fields, so appending a delta should 
> work. I've never used the C# API, but here is some hopefully understandable 
> pseudo code:
>
> var delta = Request.newInstance();
> delta.getMutableMeta().getMutableCutOffTime()   
>   .setValue(value)
>   .setScale(TimeSpanScale.MINMAX)
> byte[] output = append(unmodifiedInputBytes, delta.toByteArray());
>
> If the server expects a length delimiter you'd need to update it to the 
> new combined length.
>
> - Florian
>
> On Wednesday, September 20, 2023 at 11:03:46 AM UTC+2 Joan Balagueró wrote:
>
>> Hi Florian,
>>
>> Thanks for your quick response. I'm stuck on this.
>>
>> 1) It's not working. When I send the protobuf to the backend server (it's 
>> not our api nor server) using the first method, I get a right response. But 
>> using the second method I receive this error:
>> ProtoBuf.ProtoException: Invalid wire-type; this usually means you have 
>> over-written a file without truncating or setting the length; see 
>> https://stackoverflow.com/q/2152978/23354
>>at ProtoBuf.ProtoReader.StartSubItem(ProtoReader reader) in 
>> C:\code\protobuf-net\src\protobuf-net\ProtoReader.cs:line 637
>>at ProtoBuf.ProtoReader.ReadTypedObject(Object value, Int32 key, 
>> ProtoReader reader, Type type) in 
>> C:\code\protobuf-net\src\protobuf-net\ProtoReader.cs:line 584
>>at proto_40(Object , ProtoReader )
>>at ProtoBuf.Meta.TypeModel.DeserializeCore(ProtoReader reader, Type 
>> type, Object value, Boolean noAutoCreate) in 
>> C:\code\protobuf-net\src\protobuf-net\Meta\TypeModel.cs:line 722
>>at ProtoBuf.Meta.TypeModel.Deserialize(Stream source, Object value, 
>> Type type, SerializationContext context) in 
>> C:\code\protobuf-net\src\protobuf-net\Meta\TypeModel.cs:line 599
>>at 
>> WebBeds.Connect.AspNetCore.Formatters.ProtobufInputFormatter.ReadRequestBodyAsync(InputFormatterContext
>>  
>> context)
>>at 
>> Microsoft.AspNetCore.Mvc.Formatters.InputFormatter.ReadAsync(InputFormatterContext
>>  
>> context)
>>at 
>> Microsoft.AspNetCore.Mvc.ModelBinding.Binders.BodyModelBinder.BindModelAsync(ModelBindingContext
>>  
>> bindingContext)
>>at 
>> Microsoft.AspNetCore.Mvc.ModelBinding.ParameterBinder.BindModelAsync(ActionContext
>>  
>> actionContext, IModelBinder modelBinder, IValueProvider valueProvider, 
>> ParameterDescriptor parameter, ModelMetadata metadata, Object value, Object 
>> container)
>>at 
>> Microsoft.AspNetCore.Mvc.Controllers.ControllerBinderDelegateProvider.<>c__DisplayClass0_0.d.MoveNext()
>> --- End of stack trace from previous location ---
>>at 
>> Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.g__Awaited|13_0(ControllerActionInvoker
>>  
>> invoker, Task lastTask, State next, Scope scope, Object state, Boolean 
>> isCompleted)
>>at 
>> Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.g__Awaited|26_0(ResourceInvoker
>>  
>> invoker, Task lastTask, State next, Scope scope, Object state, Boolean 
>> isCompleted
>>
>>
>> 2) This is the proto:
>>
>> The request with the 'Meta' element:
>>
>> message Request {
>>Meta Meta = 1;
>>repeated int32 Hotels = 2 [packed = false];
>>Country Market = 3;
>>repeated Room Rooms = 4;
>>.bcl.DateTime CheckIn = 5;
>>.bcl.DateTime CheckOut = 6;
>>OptionalCriteria OptionalCriteria = 7;
>> }
>>
>> The 'Meta' element that contains the 'CutOffTime' that we want to modify:
>>
>> message Meta {
>>int32 Client = 1;
>>int32 Brand = 2;
>>bool UseCache = 3;
>>.bcl.TimeSpan CutOffTime = 4;
>>bool B2C = 5;
>>Language Language = 6;
>>Currency Currency = 7;
>>bool IncludeProviderAuditData = 8;
>>SalesChannel SalesChannel = 9;
>>string AgentId = 10;
>> }
>>
>> The 'CutOffTime':
>>
>> message TimeSpan {
>>   sint64 value = 1; // the size of the timespan (in units of the selected 
>> scale)
>>   TimeSpanScale scale = 2; // the scale of the timespan [default = DAYS]
>>   enum TimeSpanScale {
>> DAYS = 0;
>> HOURS = 1;
>> MINUTES = 2;
>> SECONDS = 3;
>> MILLISECONDS = 4;
>> TICKS = 5;
>>
>> MINMAX = 15; // dubious
>>   }
>> }

[protobuf] Re: Trying to rewrite a protobuf message but changing a couple of values

2023-09-20 Thread Florian Enner
Messages are serialized with a length delimiter, so changing the content 
produces a mismatch and invalid message.

Your schema has no affected repeated fields, so appending a delta should 
work. I've never used the C# API, but here is some hopefully understandable 
pseudo code:

var delta = Request.newInstance();
delta.getMutableMeta().getMutableCutOffTime()   
  .setValue(value)
  .setScale(TimeSpanScale.MINMAX)
byte[] output = append(unmodifiedInputBytes, delta.toByteArray());

If the server expects a length delimiter you'd need to update it to the new 
combined length.

- Florian
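
Spelled out against the schema quoted below, and assuming it has been compiled so that 
Request/Meta/TimeSpan classes exist (names illustrative, check the actual generated code), the 
append-a-delta idea looks roughly like this in Java:

```java
// originalBytes is the unmodified serialized Request.
Request delta = Request.newBuilder()
    .setMeta(Meta.newBuilder()
        .setCutOffTime(TimeSpan.newBuilder()
            .setValue(7)
            .setScale(TimeSpan.TimeSpanScale.MILLISECONDS)))
    .build();

byte[] deltaBytes = delta.toByteArray();
byte[] combined = new byte[originalBytes.length + deltaBytes.length];
System.arraycopy(originalBytes, 0, combined, 0, originalBytes.length);
System.arraycopy(deltaBytes, 0, combined, originalBytes.length, deltaBytes.length);

// Parsing `combined` merges the two serializations: singular message fields
// merge recursively and scalar fields take the last value seen, so only
// CutOffTime ends up changed.
```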

On Wednesday, September 20, 2023 at 11:03:46 AM UTC+2 Joan Balagueró wrote:

> Hi Florian,
>
> Thanks for your quick response. I'm stuck on this.
>
> 1) It's not working. When I send the protobuf to the backend server (it's 
> not our api nor server) using the first method, I get a right response. But 
> using the second method I receive this error:
> ProtoBuf.ProtoException: Invalid wire-type; this usually means you have 
> over-written a file without truncating or setting the length; see 
> https://stackoverflow.com/q/2152978/23354
>at ProtoBuf.ProtoReader.StartSubItem(ProtoReader reader) in 
> C:\code\protobuf-net\src\protobuf-net\ProtoReader.cs:line 637
>at ProtoBuf.ProtoReader.ReadTypedObject(Object value, Int32 key, 
> ProtoReader reader, Type type) in 
> C:\code\protobuf-net\src\protobuf-net\ProtoReader.cs:line 584
>at proto_40(Object , ProtoReader )
>at ProtoBuf.Meta.TypeModel.DeserializeCore(ProtoReader reader, Type 
> type, Object value, Boolean noAutoCreate) in 
> C:\code\protobuf-net\src\protobuf-net\Meta\TypeModel.cs:line 722
>at ProtoBuf.Meta.TypeModel.Deserialize(Stream source, Object value, 
> Type type, SerializationContext context) in 
> C:\code\protobuf-net\src\protobuf-net\Meta\TypeModel.cs:line 599
>at 
> WebBeds.Connect.AspNetCore.Formatters.ProtobufInputFormatter.ReadRequestBodyAsync(InputFormatterContext
>  
> context)
>at 
> Microsoft.AspNetCore.Mvc.Formatters.InputFormatter.ReadAsync(InputFormatterContext
>  
> context)
>at 
> Microsoft.AspNetCore.Mvc.ModelBinding.Binders.BodyModelBinder.BindModelAsync(ModelBindingContext
>  
> bindingContext)
>at 
> Microsoft.AspNetCore.Mvc.ModelBinding.ParameterBinder.BindModelAsync(ActionContext
>  
> actionContext, IModelBinder modelBinder, IValueProvider valueProvider, 
> ParameterDescriptor parameter, ModelMetadata metadata, Object value, Object 
> container)
>at 
> Microsoft.AspNetCore.Mvc.Controllers.ControllerBinderDelegateProvider.<>c__DisplayClass0_0.d.MoveNext()
> --- End of stack trace from previous location ---
>at 
> Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.g__Awaited|13_0(ControllerActionInvoker
>  
> invoker, Task lastTask, State next, Scope scope, Object state, Boolean 
> isCompleted)
>at 
> Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.g__Awaited|26_0(ResourceInvoker
>  
> invoker, Task lastTask, State next, Scope scope, Object state, Boolean 
> isCompleted
>
>
> 2) This is the proto:
>
> The request with the 'Meta' element:
>
> message Request {
>Meta Meta = 1;
>repeated int32 Hotels = 2 [packed = false];
>Country Market = 3;
>repeated Room Rooms = 4;
>.bcl.DateTime CheckIn = 5;
>.bcl.DateTime CheckOut = 6;
>OptionalCriteria OptionalCriteria = 7;
> }
>
> The 'Meta' element that contains the 'CutOffTime' that we want to modify:
>
> message Meta {
>int32 Client = 1;
>int32 Brand = 2;
>bool UseCache = 3;
>.bcl.TimeSpan CutOffTime = 4;
>bool B2C = 5;
>Language Language = 6;
>Currency Currency = 7;
>bool IncludeProviderAuditData = 8;
>SalesChannel SalesChannel = 9;
>string AgentId = 10;
> }
>
> The 'CutOffTime':
>
> message TimeSpan {
>   sint64 value = 1; // the size of the timespan (in units of the selected 
> scale)
>   TimeSpanScale scale = 2; // the scale of the timespan [default = DAYS]
>   enum TimeSpanScale {
> DAYS = 0;
> HOURS = 1;
> MINUTES = 2;
> SECONDS = 3;
> MILLISECONDS = 4;
> TICKS = 5;
>
> MINMAX = 15; // dubious
>   }
> }
>
> Thanks,
>
> Joan.
>
>
>
>
>
>
> On Wednesday, September 20, 2023 at 10:40:08 AM UTC+2 Florian Enner wrote:
>
>> 1) A "varint" is a "variable length integer". When you replace a large 
>> number with a small one, it's entirely possible to lose some bytes and 
>> still be valid. You need to check the actual output.
>>
>> 2) Can you provide the proto definition of the field you want to modify? 
>> Scalar fields get set to the last encountered value, so the easiest option 
>> may be to copy the original bytes and append a delta containing the 
>> differences.
>>  
>>
>>
>>
>> On Wednesday, September 20, 2023 at 10:19:07 AM UTC+2 Joan Balagueró 
>> wrote:
>>
>>> Hello,
>>>
>>> I have a protobuf message like this into a byte array:
>>>
>>> 1: { // META element
>>> 1: 2
>>> 2: 1
>>> 3: 1
>>>

[protobuf] Re: Trying to rewrite a protobuf message but changing a couple of values

2023-09-20 Thread Joan Balagueró
Hi Florian,

Thanks for your quick response. I'm stuck on this.

1) It's not working. When I send the protobuf to the backend server (it's 
not our api nor server) using the first method, I get a right response. But 
using the second method I receive this error:
ProtoBuf.ProtoException: Invalid wire-type; this usually means you have 
over-written a file without truncating or setting the length; see 
https://stackoverflow.com/q/2152978/23354
   at ProtoBuf.ProtoReader.StartSubItem(ProtoReader reader) in 
C:\code\protobuf-net\src\protobuf-net\ProtoReader.cs:line 637
   at ProtoBuf.ProtoReader.ReadTypedObject(Object value, Int32 key, 
ProtoReader reader, Type type) in 
C:\code\protobuf-net\src\protobuf-net\ProtoReader.cs:line 584
   at proto_40(Object , ProtoReader )
   at ProtoBuf.Meta.TypeModel.DeserializeCore(ProtoReader reader, Type 
type, Object value, Boolean noAutoCreate) in 
C:\code\protobuf-net\src\protobuf-net\Meta\TypeModel.cs:line 722
   at ProtoBuf.Meta.TypeModel.Deserialize(Stream source, Object value, Type 
type, SerializationContext context) in 
C:\code\protobuf-net\src\protobuf-net\Meta\TypeModel.cs:line 599
   at 
WebBeds.Connect.AspNetCore.Formatters.ProtobufInputFormatter.ReadRequestBodyAsync(InputFormatterContext
 
context)
   at 
Microsoft.AspNetCore.Mvc.Formatters.InputFormatter.ReadAsync(InputFormatterContext
 
context)
   at 
Microsoft.AspNetCore.Mvc.ModelBinding.Binders.BodyModelBinder.BindModelAsync(ModelBindingContext
 
bindingContext)
   at 
Microsoft.AspNetCore.Mvc.ModelBinding.ParameterBinder.BindModelAsync(ActionContext
 
actionContext, IModelBinder modelBinder, IValueProvider valueProvider, 
ParameterDescriptor parameter, ModelMetadata metadata, Object value, Object 
container)
   at 
Microsoft.AspNetCore.Mvc.Controllers.ControllerBinderDelegateProvider.<>c__DisplayClass0_0.d.MoveNext()
--- End of stack trace from previous location ---
   at 
Microsoft.AspNetCore.Mvc.Infrastructure.ControllerActionInvoker.g__Awaited|13_0(ControllerActionInvoker
 
invoker, Task lastTask, State next, Scope scope, Object state, Boolean 
isCompleted)
   at 
Microsoft.AspNetCore.Mvc.Infrastructure.ResourceInvoker.g__Awaited|26_0(ResourceInvoker
 
invoker, Task lastTask, State next, Scope scope, Object state, Boolean 
isCompleted


2) This is the proto:

The request with the 'Meta' element:

message Request {
   Meta Meta = 1;
   repeated int32 Hotels = 2 [packed = false];
   Country Market = 3;
   repeated Room Rooms = 4;
   .bcl.DateTime CheckIn = 5;
   .bcl.DateTime CheckOut = 6;
   OptionalCriteria OptionalCriteria = 7;
}

The 'Meta' element that contains the 'CutOffTime' that we want to modify:

message Meta {
   int32 Client = 1;
   int32 Brand = 2;
   bool UseCache = 3;
   .bcl.TimeSpan CutOffTime = 4;
   bool B2C = 5;
   Language Language = 6;
   Currency Currency = 7;
   bool IncludeProviderAuditData = 8;
   SalesChannel SalesChannel = 9;
   string AgentId = 10;
}

The 'CutOffTime':

message TimeSpan {
  sint64 value = 1; // the size of the timespan (in units of the selected 
scale)
  TimeSpanScale scale = 2; // the scale of the timespan [default = DAYS]
  enum TimeSpanScale {
DAYS = 0;
HOURS = 1;
MINUTES = 2;
SECONDS = 3;
MILLISECONDS = 4;
TICKS = 5;

MINMAX = 15; // dubious
  }
}

Thanks,

Joan.






On Wednesday, September 20, 2023 at 10:40:08 AM UTC+2 Florian Enner wrote:

> 1) A "varint" is a "variable length integer". When you replace a large 
> number with a small one, it's entirely possible to lose some bytes and 
> still be valid. You need to check the actual output.
>
> 2) Can you provide the proto definition of the field you want to modify? 
> Scalar fields get set to the last encountered value, so the easiest option 
> may be to copy the original bytes and append a delta containing the 
> differences.
>  
>
>
>
> On Wednesday, September 20, 2023 at 10:19:07 AM UTC+2 Joan Balagueró wrote:
>
>> Hello,
>>
>> I have a protobuf message like this into a byte array:
>>
>> 1: { // META element
>> 1: 2
>> 2: 1
>> 3: 1
>> 4: {// CutOffTime element within META
>> 1: 10
>> 2: 3
>>}
>> 5: 1
>> 6: 40
>> }
>> 2: 9836 // HOTEL element
>> 3: 724 // MARKET element
>>
>>
>> We need to traverse this message and write it to a 'CodeOutputStream', 
>> but changing the values of the 'cutoff' element to, for example to '4: { 1: 
>> 7, 2: 4 }'. I'm not able to do it, I need some help.
>>
>> A basic algorithm that writes the same protobuf to another byte array but 
>> without changing anything works. Here I try the special case of 'META' (key 
>> = 1) that contains the 'cutoff' element.
>>
> Map<Integer, UnknownFieldSet.Field> rootFields = 
> UnknownFieldSet.parseFrom(document).asMap();
>
> for (Map.Entry<Integer, UnknownFieldSet.Field> entry : 
> rootFields.entrySet()) {
>> if (entry.getKey() == 1) {
>> ByteString bs =

[protobuf] Re: Trying to rewrite a protobuf message but changing a couple of values

2023-09-20 Thread Florian Enner
1) A "varint" is a "variable length integer". When you replace a large 
number with a small one, it's entirely possible to lose some bytes and 
still be valid. You need to check the actual output.

2) Can you provide the proto definition of the field you want to modify? 
Scalar fields get set to the last encountered value, so the easiest option 
may be to copy the original bytes and append a delta containing the 
differences.
 



On Wednesday, September 20, 2023 at 10:19:07 AM UTC+2 Joan Balagueró wrote:

> Hello,
>
> I have a protobuf message like this into a byte array:
>
> 1: { // META element
> 1: 2
> 2: 1
> 3: 1
> 4: {// CutOffTime element within META
> 1: 10
> 2: 3
>}
> 5: 1
> 6: 40
> }
> 2: 9836 // HOTEL element
> 3: 724 // MARKET element
>
>
> We need to traverse this message and write it to a 'CodeOutputStream', but 
> changing the values of the 'cutoff' element to, for example to '4: { 1: 7, 
> 2: 4 }'. I'm not able to do it, I need some help.
>
> A basic algorithm that writes the same protobuf to another byte array but 
> without changing anything works. Here I try the special case of 'META' (key 
> = 1) that contains the 'cutoff' element.
>
> Map<Integer, UnknownFieldSet.Field> rootFields = 
> UnknownFieldSet.parseFrom(document).asMap();
>
> for (Map.Entry<Integer, UnknownFieldSet.Field> entry : 
> rootFields.entrySet()) {
> if (entry.getKey() == 1) {
> ByteString bs = 
> entry.getValue().getLengthDelimitedList().get(0);
> output.writeByteArray(1, bs.toByteArray());
> }
> else {
> entry.getValue().writeTo(entry.getKey(), output);
> }
> }
>
>
> Now I try to go a bit further, trying to read the cuotff element, change 
> the values and rewrite them to the 'output'. And here is when I'm not able 
> to solve the problem. Below my try that does not work, it generates a byte 
> array
> of 69 bytes instead of 73 (I'm losing 4 bytes):
>
> Map<Integer, UnknownFieldSet.Field> rootFields = 
> UnknownFieldSet.parseFrom(document).asMap();
> 
> for (Map.Entry<Integer, UnknownFieldSet.Field> entry : 
> rootFields.entrySet()) {
> if (entry.getKey() == 1) {
> ByteString bs = 
> entry.getValue().getLengthDelimitedList().get(0);
> Map<Integer, UnknownFieldSet.Field> ufs = 
> UnknownFieldSet.parseFrom(bs).asMap();
>
> for (Map.Entry<Integer, UnknownFieldSet.Field> item : 
> ufs.entrySet()) {
> if (item.getKey() == 4) {
> Map<Integer, UnknownFieldSet.Field> cutoff = 
> UnknownFieldSet.parseFrom(item.getValue().getLengthDelimitedList().get(0)).asMap();
> cutoff.put(1, 
> UnknownFieldSet.Field.newBuilder().addVarint(7).build()).writeTo(1, output);
> cutoff.put(2, 
> UnknownFieldSet.Field.newBuilder().addVarint(4).build()).writeTo(1, output);
> }
> else {
> item.getValue().writeTo(item.getKey(), output);
> }
> }
> }
> else {
> entry.getValue().writeTo(entry.getKey(), output);
> }
> }
>
> Any help would be really appreciated.
>
> Thanks,
>
> Joan.
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/d7cf1a43-2613-4dd1-8e80-2fbbd2b5db77n%40googlegroups.com.


[protobuf] Re: How to import .proto file in different package correctly?

2023-09-18 Thread Daz Wilkin
The repro is somewhat unclear.

The package structure uses `abc` but the protobuf sources use `adc`; I 
assume both should be the same.

In `share.proto`, `message Test` contains no fields and fields contain 
values, so the `Test` message contains no values and isn't useful.

The `detect.proto` contains no messages and so isn't useful either. 

If instead the following structure is used:

```
.
├── com
│   └── abc
│   ├── depart
│   │   └── detect.proto
│   └── protobuf
│   └── share.proto
└── main.py
```
And:
`detect.proto`:
```
syntax = "proto3";

package com.abc.depart;

import "com/abc/protobuf/share.proto";

message X{
com.abc.protobuf.Test test = 1;
}
```
And:
`share.proto`:
```
syntax = "proto3";

package com.abc.protobuf;

message Test{
string value = 1;
}
```

Because, there's a single protopath (`${PWD}` or `.`), we can use one 
`protoc` command to compile to Python sources:

```bash
protoc \
--proto_path=${PWD} \
--python_out=${PWD} \
--pyi_out=${PWD} \
${PWD}/com/abc/depart/detect.proto \
${PWD}/com/abc/protobuf/share.proto
```
and e.g. `main.py`:
```python3
from com.abc.depart import detect_pb2

x = detect_pb2.X()
x.test.value = "foo"

print(x)
```
Yields:

```console
test {
  value: "foo"
}
```
On Thursday, September 14, 2023 at 9:32:17 PM UTC-7 Charlie Tian wrote:

> "I've encountered the same issue. Have you found a solution?
>
> On Wednesday, January 17, 2018 at 15:52:17 UTC+8 wrote:
>
>> I have two packages like this
>>
>> com.abc.
>>  protobuf.
>> share.proto
>>  depart.
>> detect.proto 
>>
>> and the conent of  share.proto like this:
>>
>> syntax = "proto3";
>> package com.adc.protobuf;
>> message Test{}
>>
>> and the content of detect.proto like this:
>>
>> syntax = "proto3";
>> package com.adc.depart;
>> import "com/abc/protobuf/share.proto"
>>
>> and compile share.proto in it's dir like this:
>>
>> protoc -I=. --python_out=. share.proto
>>
>> then compile detect.proto in it's dir like this:
>>
>> protoc -I=/pathToSrcDir/ -I=. --python_out=. detect.proto 
>> and
>> 
>> pathToSrcDir has been added to PYTHONPATH
>>
>> all compilations work fine,but when run a python script which 
>>
>> from com.abc.depart import detect_pb2
>>
>> got this error   
>>
>> TypeError: Couldn't build proto file into descriptor pool!
>> Invalid proto descriptor for file "detect.proto":
>>   detect.proto: Import "com/abc/protobuf/share.proto" has not been 
>> loaded.
>>   com.abc.depert.XClass.ymethod: "com.abc.protobuf.Test" seems to be 
>> defined in "share.proto", which is not imported by "detect.proto".  To use 
>> it here, please add the necessary import.
>>
>> How to solve this import problem?
>>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/2645b186-2943-411d-b2f9-bd26974bae44n%40googlegroups.com.


[protobuf] Re: Failing to Import Files Compiled from Protobuf in Python

2023-09-17 Thread Daz Wilkin
I took a stab at answering your question on Stack overflow:

https://stackoverflow.com/a/77115927/609290

On Thursday, September 14, 2023 at 9:31:25 PM UTC-7 Charlie Tian wrote:

> My directory structure is as follows:
>
>  |-test.py
>  |-test.proto
>  |-test_pb2.py
>  |-__init__.py
>  |-comm
> |-comm.proto
> |-comm_pb2.py
> |-__init__.py  
>
> both __init__.py files are empty
> and **test.proto** is like this:
>
> package test;
> import "comm/comm.proto";
>
> message Test{
> optional comm.Foo foo = 1;
> }
> and **comm.proto** is like this:
>
> package comm;
>
> message Foo{}
>
> I successfully used the command *protoc --python_out=. -I.* to compile 
> **comm.proto** and **test.proto**,
> but when I tried to import ***test_pb2*** in ***test.py***, I encountered 
> this error
>
> TypeError: Couldn't build proto file into descriptor pool: Depends on 
> file 'comm/comm.proto', but it has not been loaded
>
>
> Can someone help me identify the reason and provide a solution, please?
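A sketch of the usual fix, assuming the layout above: generate both files in one run from the 
project root with the same --proto_path, so the canonical name recorded in test_pb2.py 
('comm/comm.proto') matches how comm_pb2.py registers itself:

```bash
# run from the project root
protoc --proto_path=. --python_out=. comm/comm.proto test.proto
```

The generated test_pb2.py then imports comm.comm_pb2 itself, so a plain `import test_pb2` from 
that root should work.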
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/dd328d73-0721-4857-afff-8371e541305dn%40googlegroups.com.


Re: [protobuf] Re: protoc 23.4 Windows x64 Python strange generated file

2023-09-15 Thread 'Derek Perez' via Protocol Buffers
you can also set --pyi_out to get the interface/type definitions I believe!

- D
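
For example (assuming test.proto in the current directory):

```bash
protoc --proto_path=. --python_out=. --pyi_out=. test.proto
```

The generated .pyi stub gives PyCharm and other type checkers the field names that the runtime 
now adds dynamically.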

On Fri, Sep 15, 2023 at 9:07 AM Nikita Ig  wrote:

> As I understand it's a new version of protoc. And now all fields for
> protobuff message dynamically adding at runtime. You can run your program
> and set breakpoint. You could find all fields for message. Pycharm shows
> errors about this field, but now it's a dynamical object and you'll get
> correct message.
> On Friday, 15 September 2023 at 14:39:08 UTC+3 Charlie Tian wrote:
>
>>
>> I got this problem too! Can you explain why your first attemp generate
>> "wrong" file ?
>> On Tuesday, August 1, 2023 at 22:32:49 UTC+8 wrote:
>>
>>>
>>> Oh, I'm so sorry. I can understand my problems. I suppose that
>>> conversation may close and delete.
>>> On Tuesday, 1 August 2023 at 16:04:49 UTC+3 Nikita Ig wrote:
>>>
 I installed protoc to my ubuntu and got version libprotoc 3.12.4
 I reproduce all my steps and got typical example of pb2.py file
 consisted all data

 # -*- coding: utf-8 -*-
 # Generated by the protocol buffer compiler.  DO NOT EDIT!
 # source: test.proto

 from google.protobuf import descriptor as _descriptor
 from google.protobuf import message as _message
 from google.protobuf import reflection as _reflection

 from google.protobuf import symbol_database as _symbol_database
 # @@protoc_insertion_point(imports)

 _sym_db = _symbol_database.Default()




 DESCRIPTOR = _descriptor.FileDescriptor(
   name='test.proto',
   package='',
   syntax='proto3',
   serialized_options=None,
   create_key=_descriptor._internal_create_key,

 serialized_pb=b'\n\ntest.proto\"M\n\rSearchRequest\x12\r\n\x05query\x18\x01
 \x01(\t\x12\x13\n\x0bpage_number\x18\x02
 \x01(\x05\x12\x18\n\x10results_per_page\x18\x03 \x01(\x05\x62\x06proto3'
 )




 _SEARCHREQUEST = _descriptor.Descriptor(
   name='SearchRequest',
   full_name='SearchRequest',
   filename=None,
   file=DESCRIPTOR,
   containing_type=None,
   create_key=_descriptor._internal_create_key,
   fields=[
 _descriptor.FieldDescriptor(
   name='query', full_name='SearchRequest.query', index=0,
   number=1, type=9, cpp_type=9, label=1,
   has_default_value=False, default_value=b"".decode('utf-8'),
   message_type=None, enum_type=None, containing_type=None,
   is_extension=False, extension_scope=None,
   serialized_options=None, file=DESCRIPTOR,
  create_key=_descriptor._internal_create_key),
 _descriptor.FieldDescriptor(
   name='page_number', full_name='SearchRequest.page_number',
 index=1,
   number=2, type=5, cpp_type=1, label=1,
   has_default_value=False, default_value=0,
   message_type=None, enum_type=None, containing_type=None,
   is_extension=False, extension_scope=None,
   serialized_options=None, file=DESCRIPTOR,
  create_key=_descriptor._internal_create_key),
 _descriptor.FieldDescriptor(
   name='results_per_page',
 full_name='SearchRequest.results_per_page', index=2,
   number=3, type=5, cpp_type=1, label=1,
   has_default_value=False, default_value=0,
   message_type=None, enum_type=None, containing_type=None,
   is_extension=False, extension_scope=None,
   serialized_options=None, file=DESCRIPTOR,
  create_key=_descriptor._internal_create_key),
   ],
   extensions=[
   ],
   nested_types=[],
   enum_types=[
   ],
   serialized_options=None,
   is_extendable=False,
   syntax='proto3',
   extension_ranges=[],
   oneofs=[
   ],
   serialized_start=14,
   serialized_end=91,
 )

 DESCRIPTOR.message_types_by_name['SearchRequest'] = _SEARCHREQUEST
 _sym_db.RegisterFileDescriptor(DESCRIPTOR)

 SearchRequest =
 _reflection.GeneratedProtocolMessageType('SearchRequest',
 (_message.Message,), {
   'DESCRIPTOR' : _SEARCHREQUEST,
   '__module__' : 'test_pb2'
   # @@protoc_insertion_point(class_scope:SearchRequest)
   })
 _sym_db.RegisterMessage(SearchRequest)


 # @@protoc_insertion_point(module_scope)


 And I have a question  - what did I wrong in my first attempt? Is it a
 new version of protoc? or is it a problem in my Windows? Or anything else?
 On Tuesday, 1 August 2023 at 14:35:32 UTC+3 Nikita Ig wrote:

> Hello. I'm a rookie in protocol buffers and I'm trying to repeat
> example from documentation.
>
> Step 1. Installed protoc
> Step 2. Created test.proto file consisted this code:
> syntax = "proto3";
>
> message SearchRequest {
>   string query = 1;
>   int32 page_number = 2;
>   int32 results_per_page = 3;
> }
> Step 3. I ran protoc with help command:
> protoc.exe --proto_path=${PWD}/protobuf -

[protobuf] Re: protoc 23.4 Windows x64 Python strange generated file

2023-09-15 Thread Nikita Ig
As I understand it, this is a new version of protoc. All fields of a protobuf
message are now added dynamically at runtime. You can run your program, set a
breakpoint, and find all the fields of the message. PyCharm shows errors about
these fields, but the message is now a dynamic object and you will still get
the correct message.
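
As a concrete sketch (using the test.proto / SearchRequest example quoted
below), the builder-style test_pb2 module still exposes the message class at
runtime even though static analysis in the IDE cannot see the fields:

import test_pb2

# Fields exist at runtime; construct and round-trip the message.
req = test_pb2.SearchRequest(query="protobuf", page_number=1, results_per_page=10)
data = req.SerializeToString()
print(test_pb2.SearchRequest.FromString(data))
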
On Friday, 15 September 2023 at 14:39:08 UTC+3 Charlie Tian wrote:

>
> I got this problem too! Can you explain why your first attemp generate 
> "wrong" file ?
> On Tuesday, August 1, 2023 at 22:32:49 UTC+8, the following was written:
>
>>
>> Oh, I'm so sorry. I can understand my problems. I suppose that 
>> conversation may close and delete.
>> On Tuesday, 1 August 2023 at 16:04:49 UTC+3 Nikita Ig wrote:
>>
>>> I installed protoc to my ubuntu and got version libprotoc 3.12.4
>>> I reproduce all my steps and got typical example of pb2.py file 
>>> consisted all data
>>>
>>> # -*- coding: utf-8 -*-
>>> # Generated by the protocol buffer compiler.  DO NOT EDIT!
>>> # source: test.proto
>>>
>>> from google.protobuf import descriptor as _descriptor
>>> from google.protobuf import message as _message
>>> from google.protobuf import reflection as _reflection
>>>
>>> from google.protobuf import symbol_database as _symbol_database
>>> # @@protoc_insertion_point(imports)
>>>
>>> _sym_db = _symbol_database.Default()
>>>
>>>
>>>
>>>
>>> DESCRIPTOR = _descriptor.FileDescriptor(
>>>   name='test.proto',
>>>   package='',
>>>   syntax='proto3',
>>>   serialized_options=None,
>>>   create_key=_descriptor._internal_create_key,
>>>   
>>> serialized_pb=b'\n\ntest.proto\"M\n\rSearchRequest\x12\r\n\x05query\x18\x01 
>>> \x01(\t\x12\x13\n\x0bpage_number\x18\x02 
>>> \x01(\x05\x12\x18\n\x10results_per_page\x18\x03 \x01(\x05\x62\x06proto3'
>>> )
>>>
>>>
>>>
>>>
>>> _SEARCHREQUEST = _descriptor.Descriptor(
>>>   name='SearchRequest',
>>>   full_name='SearchRequest',
>>>   filename=None,
>>>   file=DESCRIPTOR,
>>>   containing_type=None,
>>>   create_key=_descriptor._internal_create_key,
>>>   fields=[
>>> _descriptor.FieldDescriptor(
>>>   name='query', full_name='SearchRequest.query', index=0,
>>>   number=1, type=9, cpp_type=9, label=1,
>>>   has_default_value=False, default_value=b"".decode('utf-8'),
>>>   message_type=None, enum_type=None, containing_type=None,
>>>   is_extension=False, extension_scope=None,
>>>   serialized_options=None, file=DESCRIPTOR, 
>>>  create_key=_descriptor._internal_create_key),
>>> _descriptor.FieldDescriptor(
>>>   name='page_number', full_name='SearchRequest.page_number', index=1,
>>>   number=2, type=5, cpp_type=1, label=1,
>>>   has_default_value=False, default_value=0,
>>>   message_type=None, enum_type=None, containing_type=None,
>>>   is_extension=False, extension_scope=None,
>>>   serialized_options=None, file=DESCRIPTOR, 
>>>  create_key=_descriptor._internal_create_key),
>>> _descriptor.FieldDescriptor(
>>>   name='results_per_page', 
>>> full_name='SearchRequest.results_per_page', index=2,
>>>   number=3, type=5, cpp_type=1, label=1,
>>>   has_default_value=False, default_value=0,
>>>   message_type=None, enum_type=None, containing_type=None,
>>>   is_extension=False, extension_scope=None,
>>>   serialized_options=None, file=DESCRIPTOR, 
>>>  create_key=_descriptor._internal_create_key),
>>>   ],
>>>   extensions=[
>>>   ],
>>>   nested_types=[],
>>>   enum_types=[
>>>   ],
>>>   serialized_options=None,
>>>   is_extendable=False,
>>>   syntax='proto3',
>>>   extension_ranges=[],
>>>   oneofs=[
>>>   ],
>>>   serialized_start=14,
>>>   serialized_end=91,
>>> )
>>>
>>> DESCRIPTOR.message_types_by_name['SearchRequest'] = _SEARCHREQUEST
>>> _sym_db.RegisterFileDescriptor(DESCRIPTOR)
>>>
>>> SearchRequest = 
>>> _reflection.GeneratedProtocolMessageType('SearchRequest', 
>>> (_message.Message,), {
>>>   'DESCRIPTOR' : _SEARCHREQUEST,
>>>   '__module__' : 'test_pb2'
>>>   # @@protoc_insertion_point(class_scope:SearchRequest)
>>>   })
>>> _sym_db.RegisterMessage(SearchRequest)
>>>
>>>
>>> # @@protoc_insertion_point(module_scope)
>>>
>>>
>>> And I have a question  - what did I wrong in my first attempt? Is it a 
>>> new version of protoc? or is it a problem in my Windows? Or anything else?
>>> On Tuesday, 1 August 2023 at 14:35:32 UTC+3 Nikita Ig wrote:
>>>
 Hello. I'm a rookie in protocol buffers and I'm trying to repeat 
 example from documentation.

 Step 1. Installed protoc
 Step 2. Created test.proto file consisted this code:
 syntax = "proto3";

 message SearchRequest {
   string query = 1;
   int32 page_number = 2;
   int32 results_per_page = 3;
 }
 Step 3. I ran protoc with help command:
 protoc.exe --proto_path=${PWD}/protobuf --python_out=${PWD}/protobuf/ 
 ${PWD}/protobuf/test.proto

 Step 4. I got test_pb2.py file consisted this code:
 # -*- coding: utf-8 -*-
 # Generated by the protocol buffer compiler.  DO NOT EDIT!
 # source: test.proto
 """Generate

[protobuf] Re: protoc 23.4 Windows x64 Python strange generated file

2023-09-15 Thread Charlie Tian

I got this problem too! Can you explain why your first attempt generated a
"wrong" file?
On Tuesday, August 1, 2023 at 22:32:49 UTC+8, the following was written:

>
> Oh, I'm so sorry. I can understand my problems. I suppose that 
> conversation may close and delete.
> On Tuesday, 1 August 2023 at 16:04:49 UTC+3 Nikita Ig wrote:
>
>> I installed protoc to my ubuntu and got version libprotoc 3.12.4
>> I reproduce all my steps and got typical example of pb2.py file consisted 
>> all data
>>
>> # -*- coding: utf-8 -*-
>> # Generated by the protocol buffer compiler.  DO NOT EDIT!
>> # source: test.proto
>>
>> from google.protobuf import descriptor as _descriptor
>> from google.protobuf import message as _message
>> from google.protobuf import reflection as _reflection
>>
>> from google.protobuf import symbol_database as _symbol_database
>> # @@protoc_insertion_point(imports)
>>
>> _sym_db = _symbol_database.Default()
>>
>>
>>
>>
>> DESCRIPTOR = _descriptor.FileDescriptor(
>>   name='test.proto',
>>   package='',
>>   syntax='proto3',
>>   serialized_options=None,
>>   create_key=_descriptor._internal_create_key,
>>   
>> serialized_pb=b'\n\ntest.proto\"M\n\rSearchRequest\x12\r\n\x05query\x18\x01 
>> \x01(\t\x12\x13\n\x0bpage_number\x18\x02 
>> \x01(\x05\x12\x18\n\x10results_per_page\x18\x03 \x01(\x05\x62\x06proto3'
>> )
>>
>>
>>
>>
>> _SEARCHREQUEST = _descriptor.Descriptor(
>>   name='SearchRequest',
>>   full_name='SearchRequest',
>>   filename=None,
>>   file=DESCRIPTOR,
>>   containing_type=None,
>>   create_key=_descriptor._internal_create_key,
>>   fields=[
>> _descriptor.FieldDescriptor(
>>   name='query', full_name='SearchRequest.query', index=0,
>>   number=1, type=9, cpp_type=9, label=1,
>>   has_default_value=False, default_value=b"".decode('utf-8'),
>>   message_type=None, enum_type=None, containing_type=None,
>>   is_extension=False, extension_scope=None,
>>   serialized_options=None, file=DESCRIPTOR, 
>>  create_key=_descriptor._internal_create_key),
>> _descriptor.FieldDescriptor(
>>   name='page_number', full_name='SearchRequest.page_number', index=1,
>>   number=2, type=5, cpp_type=1, label=1,
>>   has_default_value=False, default_value=0,
>>   message_type=None, enum_type=None, containing_type=None,
>>   is_extension=False, extension_scope=None,
>>   serialized_options=None, file=DESCRIPTOR, 
>>  create_key=_descriptor._internal_create_key),
>> _descriptor.FieldDescriptor(
>>   name='results_per_page', 
>> full_name='SearchRequest.results_per_page', index=2,
>>   number=3, type=5, cpp_type=1, label=1,
>>   has_default_value=False, default_value=0,
>>   message_type=None, enum_type=None, containing_type=None,
>>   is_extension=False, extension_scope=None,
>>   serialized_options=None, file=DESCRIPTOR, 
>>  create_key=_descriptor._internal_create_key),
>>   ],
>>   extensions=[
>>   ],
>>   nested_types=[],
>>   enum_types=[
>>   ],
>>   serialized_options=None,
>>   is_extendable=False,
>>   syntax='proto3',
>>   extension_ranges=[],
>>   oneofs=[
>>   ],
>>   serialized_start=14,
>>   serialized_end=91,
>> )
>>
>> DESCRIPTOR.message_types_by_name['SearchRequest'] = _SEARCHREQUEST
>> _sym_db.RegisterFileDescriptor(DESCRIPTOR)
>>
>> SearchRequest = _reflection.GeneratedProtocolMessageType('SearchRequest', 
>> (_message.Message,), {
>>   'DESCRIPTOR' : _SEARCHREQUEST,
>>   '__module__' : 'test_pb2'
>>   # @@protoc_insertion_point(class_scope:SearchRequest)
>>   })
>> _sym_db.RegisterMessage(SearchRequest)
>>
>>
>> # @@protoc_insertion_point(module_scope)
>>
>>
>> And I have a question  - what did I wrong in my first attempt? Is it a 
>> new version of protoc? or is it a problem in my Windows? Or anything else?
>> On Tuesday, 1 August 2023 at 14:35:32 UTC+3 Nikita Ig wrote:
>>
>>> Hello. I'm a rookie in protocol buffers and I'm trying to repeat example 
>>> from documentation.
>>>
>>> Step 1. Installed protoc
>>> Step 2. Created test.proto file consisted this code:
>>> syntax = "proto3";
>>>
>>> message SearchRequest {
>>>   string query = 1;
>>>   int32 page_number = 2;
>>>   int32 results_per_page = 3;
>>> }
>>> Step 3. I ran protoc with help command:
>>> protoc.exe --proto_path=${PWD}/protobuf --python_out=${PWD}/protobuf/ 
>>> ${PWD}/protobuf/test.proto
>>>
>>> Step 4. I got test_pb2.py file consisted this code:
>>> # -*- coding: utf-8 -*-
>>> # Generated by the protocol buffer compiler.  DO NOT EDIT!
>>> # source: test.proto
>>> """Generated protocol buffer code."""
>>> from google.protobuf import descriptor as _descriptor
>>> from google.protobuf import descriptor_pool as _descriptor_pool
>>> from google.protobuf import symbol_database as _symbol_database
>>> from google.protobuf.internal import builder as _builder
>>> # @@protoc_insertion_point(imports)
>>>
>>> _sym_db = _symbol_database.Default()
>>>
>>>
>>>
>>>
>>> DESCRIPTOR = 
>>> _descriptor_pool.Default().AddSerializedFile(b'\n\ntest.proto\"M\n\rSearchRequest\x12\r\n\x05query\x18\x01
>>>  
>

[protobuf] Re: How to import .proto file in different package correctly?

2023-09-14 Thread Charlie Tian
"I've encountered the same issue. Have you found a solution?

On Wednesday, January 17, 2018 at 15:52:17 UTC+8, the following was written:

> I have two packages like this
>
> com.abc.
>  protobuf.
> share.proto
>  depart.
> detect.proto 
>
> and the conent of  share.proto like this:
>
> syntax = "proto3";
> package com.adc.protobuf;
> message Test{}
>
> and the content of detect.proto like this:
>
> syntax = "proto3";
> package com.adc.depart;
> import "com/abc/protobuf/share.proto"
>
> and compile share.proto in it's dir like this:
>
> protoc -I=. --python_out=. share.proto
>
> then compile detect.proto in it's dir like this:
>
> protoc -I=/pathToSrcDir/ -I=. --python_out=. detect.proto 
> and
> 
> pathToSrcDir has been added to PYTHONPATH
>
> all compilations work fine,but when run a python script which 
>
> from com.abc.depart import detect_pb2
>
> got this error   
>
> TypeError: Couldn't build proto file into descriptor pool!
> Invalid proto descriptor for file "detect.proto":
>   detect.proto: Import "com/abc/protobuf/share.proto" has not been 
> loaded.
>   com.abc.depert.XClass.ymethod: "com.abc.protobuf.Test" seems to be 
> defined in "share.proto", which is not imported by "detect.proto".  To use 
> it here, please add the necessary import.
>
> How to solve this import problem?
>
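
One way this is commonly resolved (a sketch, assuming the directory layout
described above): compile every .proto from the common source root, so that the
path recorded in each generated descriptor matches the import string
"com/abc/protobuf/share.proto" exactly:

cd /pathToSrcDir
protoc -I=. --python_out=. com/abc/protobuf/share.proto
protoc -I=. --python_out=. com/abc/depart/detect.proto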

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/ac3390fb-3963-4173-a4d7-ad535dc0e571n%40googlegroups.com.


[protobuf] Re: How to ignore/hide 'Could not evaluate protobuf implementation type'?

2023-08-28 Thread Jacky Ryan
Does someone have an idea where this message comes from?

For information, I use the Wirepas Python scripts, e.g. the class
WirepasNetworkInterface, imported e.g. in the script example_data.py.

Searching on github actually lists a hit:
https://github.com/wirepas/wirepas-mesh-messaging-python/blob/main/wirepas_mesh_messaging/__init__.py#L46

Could you imagine which exception occurs?
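
A small check that may help narrow this down (a sketch; the wirepas __init__
linked above prints that warning when an exception is raised while it queries
the protobuf implementation):

from google.protobuf.internal import api_implementation
print(api_implementation.Type())  # typically 'upb' or 'python' on protobuf 4.x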

Jacky Ryan wrote on Wednesday, August 23, 2023 at 09:48:11 UTC+2:

> Hello,
>
> I'm working on Debian 10 with Python 3.7.3 and pip 23.2.1. protobuf 
> package is installed in version 4.24.1 (shown with pip list).
>
> Everytime I execute a python script (maybe they use protobuf, I'm not sure 
> about this) I get the message
> "Could not evaluate protobuf implementation type"
>
> What does it mean, do I have to change something? If yes, what can I 
> change or check?
> If not, how can I hide this message?
>
> Regards,
> Daniel
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/0c5db56c-3b9a-4cfa-8e65-c296d61597a2n%40googlegroups.com.


[protobuf] Re: Undefined Symbols when Linking Static Library

2023-08-16 Thread Santosh Passoubady (Sansan)
Did you ever figure it out?

On Saturday, May 16, 2020 at 1:38:33 AM UTC-4 Victor Stewart wrote:

> Hey having an absolutely miserable time trying to cross compile 
> libphonenumber, which depends on protobuf, into a static library.
>
> Everything compiles. But fails with undefined symbols at linking time.
>
> this is how I'm building protobuf, on x86 with an x86 protoc that I built 
> first.
>
> file libraries/protoc 
>
> libraries/protoc: Mach-O 64-bit executable x86_64
>
>
> ///
>
>
> export build_dir=`pwd`/cpp-build
>
> export darwin=darwin`uname -r`
>
> export protoc=/usr/local/bin/protoc
>
> export isysroot=`xcrun --sdk iphoneos --show-sdk-path`
>
> export cflags="-Wno-unused-local-typedef -Wno-unused-function -DNDEBUG -g 
> -O3 -pipe -fPIC -fcxx-exceptions"
>
> export cxxflags="$cflags -std=c++17 -stdlib=libc++"
>
>
> mkdir -p $build_dir/arch
>
> mkdir -p $build_dir/lib
>
>
> ./configure \
>
> --build=x86_64-apple-$darwin \
>
> --host=arm \
>
> --with-protoc=$protoc \
>
> --disable-shared \
>
> --prefix=/usr/local \
>
> "CC=clang" \
>
> "CFLAGS=$cflags -miphoneos-version-min=12.0 -arch arm64 -isysroot 
> $isysroot" \
>
> "CXX=clang" \
>
> "CXXFLAGS=$cxxflags -miphoneos-version-min=12.0 -arch arm64 -isysroot 
> $isysroot" \
>
> LDFLAGS="-arch arm64 -miphoneos-version-min=12.0 -stdlib=libc++" \
>
> "LIBS=-lc++ -lc++abi"
>
>
> make -j8
>
> make install
>
>
> ///
>
>
> lipo -info libprotobuf.a
>
> Non-fat file: libprotobuf.a is architecture: arm64
>
>
> lipo -info Release-iphoneos/libphonenumber.a
>
> Non-fat file: Release-iphoneos/libphonenumber.a is architecture: arm64
>
>
> Undefined symbols for architecture arm64:
>
>   "google::protobuf::internal::LogMessage::operator<<(char const*)", 
> referenced from:
>
>   ZN13ContactReaper14gatherContactsEv_block_invoke in xxx_lto.o
>
>   
> google::protobuf::internal::ArenaStringPtr::CreateInstance(google::protobuf::Arena*,
>  
> std::__1::basic_string, 
> std::__1::allocator > const*) in xxx_lto.o
>
>   "google::protobuf::internal::LogMessage::~LogMessage()", referenced 
> from:
>
>   ZN13ContactReaper14gatherContactsEv_block_invoke in xxx_lto.o
>
>   
> google::protobuf::internal::ArenaStringPtr::CreateInstance(google::protobuf::Arena*,
>  
> std::__1::basic_string, 
> std::__1::allocator > const*) in xxx_lto.o
>
>   "google::protobuf::internal::ReadSizeFallback(char const*, unsigned 
> int)", referenced from:
>
>   i18n::phonenumbers::PhoneMetadata::_InternalParse(char const*, 
> google::protobuf::internal::ParseContext*) in 
> libphonenumber.a(phonemetadata.pb.o)
>
>   i18n::phonenumbers::PhoneMetadataCollection::_InternalParse(char 
> const*, google::protobuf::internal::ParseContext*) in 
> libphonenumber.a(phonemetadata.pb.o)
>
>
> etc... 41 of them.
>
>
> been at this for 2 days, and for the life of me can't figure out how to 
> fix it.
>
>
> has anyone else made it through this alive?
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/c5ae7b99-189b-460f-8df6-5992cf57f6den%40googlegroups.com.


[protobuf] Re: How to initialize embedded messages in Python?

2023-08-08 Thread Jens Troeger
I sent too early…

In this case, will the generated initializer use e.g. *CopyFrom()* correctly?

Thanks!
Jens
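
A short sketch with made-up message names: passing a message instance to the
generated keyword constructor copies its contents into the parent field, which
is effectively a CopyFrom() on that field:

from my_pb2 import Address, Person  # hypothetical generated module

addr = Address(street="Main St", city="Springfield")
person = Person(name="Jens", address=addr)  # copied, equivalent to CopyFrom()
addr.city = "Shelbyville"                   # later edits do not affect person
assert person.address.city == "Springfield"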

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/269878ae-9b6f-42cf-a716-caa0986ec7fan%40googlegroups.com.


[protobuf] Re: Protobuf parsing

2023-08-07 Thread Sumit Patil
if( eventType == "ADDED" || eventType == "MODIFIED" )
{
  WatchEvent watch;
  watch.set_type( eventType );
  string object = responseJson["object"].dump();
  watch.mutable_object()->set_raw(object);
  cout << "Message" << endl;
  cout << watch.DebugString() << endl;
  Pod pod;
  JsonStringToMessage( watch.object().raw(), &pod );
  cout << "Pod" << endl;
  cout << pod.DebugString() << endl;
}

On Monday, 7 August, 2023 at 2:14:29 pm UTC+5:30 Sumit Patil wrote:

>
> I am receiving a WatchEvent message and the value stored in this 
> kubernetes proto is -
>
>1. Type - for ex. event such as modified or added or deleted
>2. Object - raw bytes of actual object. Can be any object.
>
> I want to convert those raw bytes to actual objects such as pod. Note that 
> I know the kind name of that RawExtension runtime object.
>
> Currently I was doing this but result is negative -
>
>
> if( eventType == "ADDED" || eventType == "MODIFIED" ) { WatchEvent watch; 
> watch.set_type( eventType ); string object = responseJson["object"].dump(); 
> watch.mutable_object()->set_raw(object); cout << "Message" << endl; cout 
> << watch.DebugString() << endl; Pod pod; JsonStringToMessage( 
> watch.object().raw(), &pod ); cout << "Pod" << endl; cout << 
> pod.DebugString() << endl; }
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/59f6ac9f-1299-41ba-a9e5-2768aa28a2c8n%40googlegroups.com.


[protobuf] Re: Coming soon: Protobuf Editions

2023-08-02 Thread 'Jeff Sawatzky' via Protocol Buffers
Looking at https://protobuf.dev/news/2023-06-29/#editions-syntax-1 it seems 
like with the "2023" edition you are planning on making the field presence 
explicit by default, whereas with "proto3" the default field presence was 
"no presence" (if we are using the terms defined here 
https://github.com/protocolbuffers/protobuf/blob/main/docs/field_presence.md)

This seems like a step back to "proto2" where explicit presence was the 
default.

Also, I believe that in "proto3" going from non-optional (the default in 
proto3) to optional (by specifying the optional flag) is a breaking change 
and will require a new version. So if we have a "proto3" definition that 
doesn't use optional fields, then when editions land we will need to update 
the field to have the features.field_presence = IMPLICIT in order to not be 
a breaking change (as shown in the example).

Therefore, is it safe to say that if we are creating a new API using
"proto3" we should mark all fields as optional, in order to avoid an
unnecessary breaking change in the future when editions land and
features.field_presence = IMPLICIT becomes deprecated?

Are there any other breaking changes from "proto3" that will need special 
features.field_presence options, and if so, what are they and how can we 
design our APIs now to avoid this and breaking changes in the future?

Thanks!
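
For illustration, a sketch of what the per-field override might look like under
the announced editions syntax (based on the linked announcement; exact feature
names and defaults may still evolve):

edition = "2023";

message SearchRequest {
  // Opt this field back into proto3-style implicit presence.
  int32 page_number = 1 [features.field_presence = IMPLICIT];
  // Under the 2023 edition the default is explicit presence,
  // comparable to proto3 `optional`.
  string query = 2;
}
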
On Thursday, June 29, 2023 at 7:38:10 PM UTC-4 Mike Kruskal wrote:

> TL;DR: We are planning to release Protobuf Editions to the open source 
> project in the second half of 2023. While there is no requirement to move 
> from proto2/proto3 syntax to Editions syntax at initial release, we 
> encourage you to plan a move in your software project’s future timeline.
>
>
> Protobuf Editions will replace the proto2 and proto3 designations that we 
> have used for Protocol Buffers. Instead of adding syntax = "proto2" or syntax 
> = "proto3" at the top of proto definition files, you will use an edition 
> number, such as edition = “2024”, to specify the behaviors your 
> implementation will conform to. Editions enable the language to evolve 
> incrementally over time.
>
>
> Instead of the hard-coded behaviors that older versions have had, editions 
> will represent a collection of “features” with a default value (behavior) 
> per feature, which you can override. Features are options on a syntax 
> entry—file, message, field, enum, and so on— that specify the behavior of 
> protoc, the backends, and protobuf runtimes. You can explicitly set the 
> desired behavior at those different levels (file, message, field, …) when 
> your needs don’t match the default behavior for the edition you’ve selected.
>
>
> The introduction of Editions requires the one-time conversion of your 
> .proto definition files into a new format. We will be providing a tool, 
> called Prototiller, for this conversion. The code generated by protoc will 
> work with your existing code without modification.
>
>
> For more information about Protobuf Editions, see the full announcement at 
> https://protobuf.dev/news/ and the overview at 
> https://protobuf.dev/editions/overview/.
>
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/573ffa85-7f70-4d57-8977-56d4079f2e8en%40googlegroups.com.


[protobuf] Re: protoc 23.4 Windows x64 Python strange generated file

2023-08-01 Thread Nikita Ig

Oh, I'm so sorry. I understand my problem now. I suppose this conversation
can be closed and deleted.
On Tuesday, 1 August 2023 at 16:04:49 UTC+3 Nikita Ig wrote:

> I installed protoc to my ubuntu and got version libprotoc 3.12.4
> I reproduce all my steps and got typical example of pb2.py file consisted 
> all data
>
> # -*- coding: utf-8 -*-
> # Generated by the protocol buffer compiler.  DO NOT EDIT!
> # source: test.proto
>
> from google.protobuf import descriptor as _descriptor
> from google.protobuf import message as _message
> from google.protobuf import reflection as _reflection
>
> from google.protobuf import symbol_database as _symbol_database
> # @@protoc_insertion_point(imports)
>
> _sym_db = _symbol_database.Default()
>
>
>
>
> DESCRIPTOR = _descriptor.FileDescriptor(
>   name='test.proto',
>   package='',
>   syntax='proto3',
>   serialized_options=None,
>   create_key=_descriptor._internal_create_key,
>   
> serialized_pb=b'\n\ntest.proto\"M\n\rSearchRequest\x12\r\n\x05query\x18\x01 
> \x01(\t\x12\x13\n\x0bpage_number\x18\x02 
> \x01(\x05\x12\x18\n\x10results_per_page\x18\x03 \x01(\x05\x62\x06proto3'
> )
>
>
>
>
> _SEARCHREQUEST = _descriptor.Descriptor(
>   name='SearchRequest',
>   full_name='SearchRequest',
>   filename=None,
>   file=DESCRIPTOR,
>   containing_type=None,
>   create_key=_descriptor._internal_create_key,
>   fields=[
> _descriptor.FieldDescriptor(
>   name='query', full_name='SearchRequest.query', index=0,
>   number=1, type=9, cpp_type=9, label=1,
>   has_default_value=False, default_value=b"".decode('utf-8'),
>   message_type=None, enum_type=None, containing_type=None,
>   is_extension=False, extension_scope=None,
>   serialized_options=None, file=DESCRIPTOR, 
>  create_key=_descriptor._internal_create_key),
> _descriptor.FieldDescriptor(
>   name='page_number', full_name='SearchRequest.page_number', index=1,
>   number=2, type=5, cpp_type=1, label=1,
>   has_default_value=False, default_value=0,
>   message_type=None, enum_type=None, containing_type=None,
>   is_extension=False, extension_scope=None,
>   serialized_options=None, file=DESCRIPTOR, 
>  create_key=_descriptor._internal_create_key),
> _descriptor.FieldDescriptor(
>   name='results_per_page', full_name='SearchRequest.results_per_page', 
> index=2,
>   number=3, type=5, cpp_type=1, label=1,
>   has_default_value=False, default_value=0,
>   message_type=None, enum_type=None, containing_type=None,
>   is_extension=False, extension_scope=None,
>   serialized_options=None, file=DESCRIPTOR, 
>  create_key=_descriptor._internal_create_key),
>   ],
>   extensions=[
>   ],
>   nested_types=[],
>   enum_types=[
>   ],
>   serialized_options=None,
>   is_extendable=False,
>   syntax='proto3',
>   extension_ranges=[],
>   oneofs=[
>   ],
>   serialized_start=14,
>   serialized_end=91,
> )
>
> DESCRIPTOR.message_types_by_name['SearchRequest'] = _SEARCHREQUEST
> _sym_db.RegisterFileDescriptor(DESCRIPTOR)
>
> SearchRequest = _reflection.GeneratedProtocolMessageType('SearchRequest', 
> (_message.Message,), {
>   'DESCRIPTOR' : _SEARCHREQUEST,
>   '__module__' : 'test_pb2'
>   # @@protoc_insertion_point(class_scope:SearchRequest)
>   })
> _sym_db.RegisterMessage(SearchRequest)
>
>
> # @@protoc_insertion_point(module_scope)
>
>
> And I have a question  - what did I wrong in my first attempt? Is it a new 
> version of protoc? or is it a problem in my Windows? Or anything else?
> On Tuesday, 1 August 2023 at 14:35:32 UTC+3 Nikita Ig wrote:
>
>> Hello. I'm a rookie in protocol buffers and I'm trying to repeat example 
>> from documentation.
>>
>> Step 1. Installed protoc
>> Step 2. Created test.proto file consisted this code:
>> syntax = "proto3";
>>
>> message SearchRequest {
>>   string query = 1;
>>   int32 page_number = 2;
>>   int32 results_per_page = 3;
>> }
>> Step 3. I ran protoc with help command:
>> protoc.exe --proto_path=${PWD}/protobuf --python_out=${PWD}/protobuf/ 
>> ${PWD}/protobuf/test.proto
>>
>> Step 4. I got test_pb2.py file consisted this code:
>> # -*- coding: utf-8 -*-
>> # Generated by the protocol buffer compiler.  DO NOT EDIT!
>> # source: test.proto
>> """Generated protocol buffer code."""
>> from google.protobuf import descriptor as _descriptor
>> from google.protobuf import descriptor_pool as _descriptor_pool
>> from google.protobuf import symbol_database as _symbol_database
>> from google.protobuf.internal import builder as _builder
>> # @@protoc_insertion_point(imports)
>>
>> _sym_db = _symbol_database.Default()
>>
>>
>>
>>
>> DESCRIPTOR = 
>> _descriptor_pool.Default().AddSerializedFile(b'\n\ntest.proto\"M\n\rSearchRequest\x12\r\n\x05query\x18\x01
>>  
>> \x01(\t\x12\x13\n\x0bpage_number\x18\x02 
>> \x01(\x05\x12\x18\n\x10results_per_page\x18\x03 \x01(\x05\x62\x06proto3')
>>
>> _globals = globals()
>> _builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
>> _builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 

[protobuf] Re: protoc 23.4 Windows x64 Python strange generated file

2023-08-01 Thread Nikita Ig
I installed protoc on my Ubuntu machine and got version libprotoc 3.12.4.
I reproduced all my steps and got a typical pb2.py file containing all the
data:

# -*- coding: utf-8 -*-
# Generated by the protocol buffer compiler.  DO NOT EDIT!
# source: test.proto

from google.protobuf import descriptor as _descriptor
from google.protobuf import message as _message
from google.protobuf import reflection as _reflection
from google.protobuf import symbol_database as _symbol_database
# @@protoc_insertion_point(imports)

_sym_db = _symbol_database.Default()




DESCRIPTOR = _descriptor.FileDescriptor(
  name='test.proto',
  package='',
  syntax='proto3',
  serialized_options=None,
  create_key=_descriptor._internal_create_key,
  
serialized_pb=b'\n\ntest.proto\"M\n\rSearchRequest\x12\r\n\x05query\x18\x01 
\x01(\t\x12\x13\n\x0bpage_number\x18\x02 
\x01(\x05\x12\x18\n\x10results_per_page\x18\x03 \x01(\x05\x62\x06proto3'
)




_SEARCHREQUEST = _descriptor.Descriptor(
  name='SearchRequest',
  full_name='SearchRequest',
  filename=None,
  file=DESCRIPTOR,
  containing_type=None,
  create_key=_descriptor._internal_create_key,
  fields=[
_descriptor.FieldDescriptor(
  name='query', full_name='SearchRequest.query', index=0,
  number=1, type=9, cpp_type=9, label=1,
  has_default_value=False, default_value=b"".decode('utf-8'),
  message_type=None, enum_type=None, containing_type=None,
  is_extension=False, extension_scope=None,
  serialized_options=None, file=DESCRIPTOR, 
 create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
  name='page_number', full_name='SearchRequest.page_number', index=1,
  number=2, type=5, cpp_type=1, label=1,
  has_default_value=False, default_value=0,
  message_type=None, enum_type=None, containing_type=None,
  is_extension=False, extension_scope=None,
  serialized_options=None, file=DESCRIPTOR, 
 create_key=_descriptor._internal_create_key),
_descriptor.FieldDescriptor(
  name='results_per_page', full_name='SearchRequest.results_per_page', 
index=2,
  number=3, type=5, cpp_type=1, label=1,
  has_default_value=False, default_value=0,
  message_type=None, enum_type=None, containing_type=None,
  is_extension=False, extension_scope=None,
  serialized_options=None, file=DESCRIPTOR, 
 create_key=_descriptor._internal_create_key),
  ],
  extensions=[
  ],
  nested_types=[],
  enum_types=[
  ],
  serialized_options=None,
  is_extendable=False,
  syntax='proto3',
  extension_ranges=[],
  oneofs=[
  ],
  serialized_start=14,
  serialized_end=91,
)

DESCRIPTOR.message_types_by_name['SearchRequest'] = _SEARCHREQUEST
_sym_db.RegisterFileDescriptor(DESCRIPTOR)

SearchRequest = _reflection.GeneratedProtocolMessageType('SearchRequest', 
(_message.Message,), {
  'DESCRIPTOR' : _SEARCHREQUEST,
  '__module__' : 'test_pb2'
  # @@protoc_insertion_point(class_scope:SearchRequest)
  })
_sym_db.RegisterMessage(SearchRequest)


# @@protoc_insertion_point(module_scope)


And I have a question: what did I do wrong in my first attempt? Is it a new
version of protoc? Or is it a problem with my Windows setup? Or something else?
On Tuesday, 1 August 2023 at 14:35:32 UTC+3 Nikita Ig wrote:

> Hello. I'm a rookie in protocol buffers and I'm trying to repeat example 
> from documentation.
>
> Step 1. Installed protoc
> Step 2. Created test.proto file consisted this code:
> syntax = "proto3";
>
> message SearchRequest {
>   string query = 1;
>   int32 page_number = 2;
>   int32 results_per_page = 3;
> }
> Step 3. I ran protoc with help command:
> protoc.exe --proto_path=${PWD}/protobuf --python_out=${PWD}/protobuf/ 
> ${PWD}/protobuf/test.proto
>
> Step 4. I got test_pb2.py file consisted this code:
> # -*- coding: utf-8 -*-
> # Generated by the protocol buffer compiler.  DO NOT EDIT!
> # source: test.proto
> """Generated protocol buffer code."""
> from google.protobuf import descriptor as _descriptor
> from google.protobuf import descriptor_pool as _descriptor_pool
> from google.protobuf import symbol_database as _symbol_database
> from google.protobuf.internal import builder as _builder
> # @@protoc_insertion_point(imports)
>
> _sym_db = _symbol_database.Default()
>
>
>
>
> DESCRIPTOR = 
> _descriptor_pool.Default().AddSerializedFile(b'\n\ntest.proto\"M\n\rSearchRequest\x12\r\n\x05query\x18\x01
>  
> \x01(\t\x12\x13\n\x0bpage_number\x18\x02 
> \x01(\x05\x12\x18\n\x10results_per_page\x18\x03 \x01(\x05\x62\x06proto3')
>
> _globals = globals()
> _builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, _globals)
> _builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 'test_pb2', _globals)
> if _descriptor._USE_C_DESCRIPTORS == False:
>
>   DESCRIPTOR._options = None
>   _globals['_SEARCHREQUEST']._serialized_start=14
>   _globals['_SEARCHREQUEST']._serialized_end=91
> # @@protoc_insertion_point(module_scope)
>
> But documentation said about other format file for python generated file 
> and I have no idea what I need to import in my producer for creat

[protobuf] Re: python protobuf: incomplete _pb2.py file?

2023-07-29 Thread Rodrigo Schramm
Hi. I am having the same issue. How did you solve this?

On Friday, October 28, 2022 at 17:24:10 UTC+2, Nelson Meyer wrote:

> Just to let you know, that I solved the problem and it had nothing to do 
> with the _pb2.py files.
> Sorry for the "false alarm". Thanks!
>
> Nelson Meyer wrote on Tuesday, October 25, 2022 at 12:22:42 UTC+2:
>
>> Hi everyone,
>> I hope, this is the right place to ask this question. If not feel free to 
>> refer me elsewhere.
>>
>> *Some installation details:*
>> - I have installed the python protobuf according to this guide:
>> https://github.com/protocolbuffers/protobuf/tree/main/python 
>> - I have put the binary from protoc-21.8-osx-universal_binary.zip 
>> 
>>  
>> in /usr/local/bin and this path is at first place in the PATH variable
>> - the tests ran fine (Output: Ran 911 tests in 25.761s, OK (skipped=9))
>>
>>
>> *Problem:*
>> The generated _pb2.py file does not seem to be complete.
>> *It contains:*
>>
>> _sym_db = _symbol_database.Default()
>>
>> DESCRIPTOR = _descriptor_pool.Default().AddSerializedFile(b'\n\x19
>> jfinder_duplicate.proto\x12\tjfinder\"\xef\x06\n\rduplicate_ads
>> \x12\x10\n\x03\x63id\x18\x01 \x01(\x05H\x00\x88\x01\x01\x12\x12\n\x05\x62
>> q_id\x18\x14 \x01(\x05H\x01\x88\x01\x01\x12\x16\n\tj_title\x18\x02 \x01(
>> \tH\x02\x88\x01\x01 ...x06proto3')
>>
>> _builder.BuildMessageAndEnumDescriptors(DESCRIPTOR, globals())
>> _builder.BuildTopDescriptorsAndMessages(DESCRIPTOR, 
>> 'jfinder_duplicate_pb2', globals())
>> if _descriptor._USE_C_DESCRIPTORS == False:
>>
>> DESCRIPTOR._options = None
>> _DUPLICATE_ADS._serialized_start=41
>> _DUPLICATE_ADS._serialized_end=920
>> _J_HASHES._serialized_start=922
>> _J_HASHES._serialized_end=1022
>>
>> *What seems to be missing:*
>> _DUPLICATE_ADS =  and _JOB_HASHES =...
>> duplicate_ads = _reflection.GeneratedProtocolMessageType( ...
>> j_hashes = _reflection.GeneratedProtocolMessageType(.
>>
>> Consequence: Bigquery upload fails.
>>
>> From what I know, the protoc files and _pb2.py generation using this 
>> installation  
>> have worked before, but I had to try a new installation due to the error:
>> TypeError: Descriptors cannot not be created directly. If this call came 
>> from a _pb2.py file, your generated code is out of date and must be 
>> regenerated with protoc >= 3.19.0.
>>
>> Do you have any suggestion on how I can solve this problem?
>> Many thanks in advance!
>>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/8e280d26-f045-430c-90e6-b8a788af543an%40googlegroups.com.


[protobuf] Re: gRPC Versioning and Folder structure

2023-07-17 Thread 'Altay Sansal' via Protocol Buffers
On top of what Robert provided, I want to add some more context here. It 
may help us get more targeted answers.

We have already defined multiple versions of protobufs based on the best 
practices. We have v2 and v1, and v2 is not backward compatible and we have 
some users of v1.

We have server and client libraries in Python that use code generated by 
the proto compiler. We are trying to minimize the user-facing API changes. 

What would be the best way to structure our server/client code? Upon 
research we found the following patterns:
1. Define services with v1 and v2 suffixes and move business logic to 
common modules.
2. Add a version field to the protobuf definition and handle the selection 
at runtime using a router design pattern (to avoid if/else)
3. Make a v1, v2 submodule within our client and server libraries, however 
it will duplicate some code and add more hierarchy.

P.S. We basically want to maximize maintainability, minimize technical 
debt, and make it streamlined for the users of the client library.

Thanks a lot!
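
For what it's worth, a sketch of option 3 as a thin facade over versioned
generated stubs (module, package, and service names below are placeholders,
not an existing library):

# Hypothetical layout:
#   myclient/v1/service_pb2.py, service_pb2_grpc.py
#   myclient/v2/service_pb2.py, service_pb2_grpc.py
#   myclient/client.py  (this file)
import grpc

def make_stub(channel: grpc.Channel, version: str = "v2"):
    """Return the generated gRPC stub for the requested API version."""
    if version == "v1":
        from myclient.v1 import service_pb2_grpc as gen
    else:
        from myclient.v2 import service_pb2_grpc as gen
    return gen.MyServiceStub(channel)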

On Monday, July 17, 2023 at 10:20:23 AM UTC-5 Robert Hardisty wrote:

> Hi all,
>
> Does anyone have an example of a good folder structure for maintaining 
> different API versions for both the server and client? Lots of good 
> examples on the proto files being versioned, but not much on server and 
> almost non-existent on client. (preferably python, but doesn't really 
> matter that much)
>
> e.g.
>
>- 
>
> https://medium.com/stackpulse/grpc-in-practice-directory-structure-linting-and-more-d4d438ac4f86
>- 
>
> https://learn.microsoft.com/en-us/aspnet/core/grpc/versioning?view=aspnetcore-7.0#version-number-services
>
>
> Best,
> Bob
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/de282aa0-25a8-4b7a-8729-633c47c4f997n%40googlegroups.com.


[protobuf] Re: serialize message to UDP socket

2023-07-16 Thread Adarsh Singh
Hi all,
Does anyone have an idea of how we can receive a protobuf-serialized
message on the receiver side via a UDP socket? I am not receiving the data
when using recvfrom or any other method. If anyone has an example, please
share it.

thank you,
adarsh singh

On Saturday, 19 September 2009 at 00:49:29 UTC+5:30 jayt...@gmail.com wrote:

> Hello all,
>
> I am having trouble figuring out how to serialize data over a socket
> utilizing UDP protocol. I am in C++ environment. When writing to the
> socket without protocol buffers, I use the standard sendto() socket
> call which allows me to specify the port and IP address of the
> intended receiver of my UDP message. When trying to send a protocol
> buffers message, this seems to be the recommended strategy on the
> google docs:
>
> ZeroCopyOutputStream* raw_output = new FileOutputStream
> (sock);
> CodedOutputStream* coded_output = new CodedOutputStream
> (raw_output);
> coded_output->WriteRaw(send_data,strlen(send_data));
>
> There is no way to specify what the port and IP address is here,
> analogous to when using the standard sendto() socket writing call. So
> my message never gets received by the intended recipient on the
> network. I am aware that this is a raw message, not a PB message.
> Getting this raw message over the network is a first step in
> accomplishing the ultimate goal of getting the PB message over the
> network.
>
> Is there a way to get all of the bytes of a serialized PB message into
> raw form and then send them with sendto()?
>
> Any ideas? Thanks for any help.
>
> Jay
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/4bbb5aff-5574-41b1-9ea6-f398fa5c69dbn%40googlegroups.com.


[protobuf] Re: serialize message to UDP socket

2023-07-16 Thread Adarsh Singh
Hi all,
Does anyone have an idea of how we can receive a protobuf-serialized
message on the receiver side? I am not receiving the data when using
recvfrom or any other method. If anyone has an example, please share it.

thank you,
adarsh singh

On Saturday, 19 September 2009 at 00:49:29 UTC+5:30 jayt...@gmail.com wrote:

> Hello all,
>
> I am having trouble figuring out how to serialize data over a socket
> utilizing UDP protocol. I am in C++ environment. When writing to the
> socket without protocol buffers, I use the standard sendto() socket
> call which allows me to specify the port and IP address of the
> intended receiver of my UDP message. When trying to send a protocol
> buffers message, this seems to be the recommended strategy on the
> google docs:
>
> ZeroCopyOutputStream* raw_output = new FileOutputStream
> (sock);
> CodedOutputStream* coded_output = new CodedOutputStream
> (raw_output);
> coded_output->WriteRaw(send_data,strlen(send_data));
>
> There is no way to specify what the port and IP address is here,
> analogous to when using the standard sendto() socket writing call. So
> my message never gets received by the intended recipient on the
> network. I am aware that this is a raw message, not a PB message.
> Getting this raw message over the network is a first step in
> accomplishing the ultimate goal of getting the PB message over the
> network.
>
> Is there a way to get all of the bytes of a serialized PB message into
> raw form and then send them with sendto()?
>
> Any ideas? Thanks for any help.
>
> Jay
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/0d586663-1ab0-4cf9-b192-124d7efca811n%40googlegroups.com.


[protobuf] Re: set numeric_range and read numeric_range

2023-07-16 Thread Michael Ngarimu
From any Message you can call GetReflection().

From the Reflection call ListFields().

FieldDescriptor has a variety of “name” attributes that are available as 
strings against which to compare.

FieldDescriptor also has FieldOptions. I have not used FieldOptions for 
numeric_range but the documentation suggests that FieldOptions supports the 
options enclosed in []. 
https://protobuf.dev/reference/cpp/api-docs/google.protobuf.descriptor/#FieldDescriptor.options.details
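
A rough C++ sketch of that flow, assuming range_min and range_max have been
declared as custom extensions of google.protobuf.FieldOptions (via "extend
google.protobuf.FieldOptions { ... }"); the bracketed option syntax in the
original post is not valid as written. No field name is hard-coded:

#include <iostream>
#include "person.pb.h"  // hypothetical generated header declaring the extensions

void PrintRanges(const google::protobuf::Message& msg) {
  const google::protobuf::Descriptor* d = msg.GetDescriptor();
  for (int i = 0; i < d->field_count(); ++i) {
    const google::protobuf::FieldDescriptor* f = d->field(i);
    const google::protobuf::FieldOptions& opts = f->options();
    if (opts.HasExtension(range_min)) {
      std::cout << f->name() << ": [" << opts.GetExtension(range_min)
                << ", " << opts.GetExtension(range_max) << "]\n";
    }
  }
}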


On Tuesday, July 11, 2023 at 7:51:18 PM UTC+12 Shahab Nassiri wrote:

> Hi 
>
> I want to set numeric_range on protobuf field and read the defined 
> numeric_range in runtime. I use the file descriptor and field descriptor 
> for this goal.
>
> But i have a problem. I don't want to hard code the name of class variable.
>
> I express one example about this. I define a day field in proto file and 
> set numeric_range on it as following.
>
> message Person{
>   int32 day  = 1; [range_min = 5; range_max = 10]
> }
>
> now, I want to read the defined numeric_range in runtime.
>
> for this goal, i must determine the variable name as hard code.
>
> is there any way for determining the variable as string? 
>
>
> thank you 
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/32716470-8158-4508-94d8-56887a53ded8n%40googlegroups.com.


Re: [protobuf] Re: serialize message to UDP socket

2023-07-12 Thread 'Adam Cozzette' via Protocol Buffers
I think in this case it would help to first make sure that you're able to
successfully get the raw bytes from one side to the other over UDP (i.e.
log the byte arrays on the sending side and receiving side and make sure
they are the same length and are identical). This seems like the part that
is most likely to be the problem. Once you're able to get the raw bytes
across successfully, parsing the proto should be fairly easy.
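
For reference, a minimal sketch of that round trip (generated type my::TestMsg,
socket creation, and error handling are assumed); the key detail is that
recvfrom() reports the datagram length, and exactly that many bytes must be
passed to ParseFromArray(), never strlen(), since serialized protos are binary
and may contain zero bytes:

#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <string>
#include "test.pb.h"  // hypothetical generated header for my::TestMsg

// Sender: one UDP datagram carries one serialized message.
bool SendMsg(int sock, const sockaddr_in& dest, const my::TestMsg& msg) {
  std::string payload;
  if (!msg.SerializeToString(&payload)) return false;
  ssize_t sent = sendto(sock, payload.data(), payload.size(), 0,
                        reinterpret_cast<const sockaddr*>(&dest), sizeof(dest));
  return sent == static_cast<ssize_t>(payload.size());
}

// Receiver: parse exactly the number of bytes recvfrom() reports.
bool RecvMsg(int sock, my::TestMsg* msg) {
  char buf[65536];
  sockaddr_in src{};
  socklen_t src_len = sizeof(src);
  ssize_t n = recvfrom(sock, buf, sizeof(buf), 0,
                       reinterpret_cast<sockaddr*>(&src), &src_len);
  return n >= 0 && msg->ParseFromArray(buf, static_cast<int>(n));
}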

On Tue, Jul 11, 2023 at 11:36 PM Adarsh Singh 
wrote:

> I serialized my protobuff messages using the SerializedToArray() method.
> and send it to the server over an UDP socket. But On  receiver side when I
> tried to parse the data, I'm not able to parse the data from the UDP
> socket_fd. Can someone please  help me to get over this problem.  with TCP
> socket it's working fine. But I want to implement using UDP socket.
>
> thank you,
> Adarsh Singh
>
> On Saturday, 19 September 2009 at 00:49:29 UTC+5:30 jayt...@gmail.com
> wrote:
>
>> Hello all,
>>
>> I am having trouble figuring out how to serialize data over a socket
>> utilizing UDP protocol. I am in C++ environment. When writing to the
>> socket without protocol buffers, I use the standard sendto() socket
>> call which allows me to specify the port and IP address of the
>> intended receiver of my UDP message. When trying to send a protocol
>> buffers message, this seems to be the recommended strategy on the
>> google docs:
>>
>> ZeroCopyOutputStream* raw_output = new FileOutputStream
>> (sock);
>> CodedOutputStream* coded_output = new CodedOutputStream
>> (raw_output);
>> coded_output->WriteRaw(send_data,strlen(send_data));
>>
>> There is no way to specify what the port and IP address is here,
>> analogous to when using the standard sendto() socket writing call. So
>> my message never gets received by the intended recipient on the
>> network. I am aware that this is a raw message, not a PB message.
>> Getting this raw message over the network is a first step in
>> accomplishing the ultimate goal of getting the PB message over the
>> network.
>>
>> Is there a way to get all of the bytes of a serialized PB message into
>> raw form and then send them with sendto()?
>>
>> Any ideas? Thanks for any help.
>>
>> Jay
>>
>> --
> You received this message because you are subscribed to the Google Groups
> "Protocol Buffers" group.
> To unsubscribe from this group and stop receiving emails from it, send an
> email to protobuf+unsubscr...@googlegroups.com.
> To view this discussion on the web visit
> https://groups.google.com/d/msgid/protobuf/8be9bc35-3c3f-455f-9390-ba2de96aaa9dn%40googlegroups.com
> 
> .
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/CADqAXr5RuN1Ar3P90wu2w_tDk6iR58RnQrsTXkVeUucRYvP-bw%40mail.gmail.com.


[protobuf] Re: serialize message to UDP socket

2023-07-11 Thread Adarsh Singh
I serialized my protobuf messages using the SerializeToArray() method
and sent them to the server over a UDP socket. But on the receiver side,
when I tried to parse the data, I'm not able to parse it from the UDP
socket_fd. Can someone please help me get past this problem? With a TCP
socket it's working fine, but I want to implement this using a UDP socket.

thank you,
Adarsh Singh 

On Saturday, 19 September 2009 at 00:49:29 UTC+5:30 jayt...@gmail.com wrote:

> Hello all,
>
> I am having trouble figuring out how to serialize data over a socket
> utilizing UDP protocol. I am in C++ environment. When writing to the
> socket without protocol buffers, I use the standard sendto() socket
> call which allows me to specify the port and IP address of the
> intended receiver of my UDP message. When trying to send a protocol
> buffers message, this seems to be the recommended strategy on the
> google docs:
>
> ZeroCopyOutputStream* raw_output = new FileOutputStream
> (sock);
> CodedOutputStream* coded_output = new CodedOutputStream
> (raw_output);
> coded_output->WriteRaw(send_data,strlen(send_data));
>
> There is no way to specify what the port and IP address is here,
> analogous to when using the standard sendto() socket writing call. So
> my message never gets received by the intended recipient on the
> network. I am aware that this is a raw message, not a PB message.
> Getting this raw message over the network is a first step in
> accomplishing the ultimate goal of getting the PB message over the
> network.
>
> Is there a way to get all of the bytes of a serialized PB message into
> raw form and then send them with sendto()?
>
> Any ideas? Thanks for any help.
>
> Jay
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/8be9bc35-3c3f-455f-9390-ba2de96aaa9dn%40googlegroups.com.


Re: [protobuf] Re: protoc returns Unknown flag: --go-grpc_opts

2023-06-25 Thread Daz Wilkin
You should not.

We've all done it ... mix hyphens and underscores ... it's confusing.


On Sunday, June 25, 2023 at 8:50:18 PM UTC-7 Jeff Kayser wrote:

> Hi, Daz.
>
> Doh!  Now I feel really stupid.  Thanks for letting me know.
>
> Jeff Kayser
>
>
>
>
> On Sun, Jun 25, 2023 at 7:25 PM Daz Wilkin  wrote:
>
>> It's `--go-grpc_opt` (no `s`)
>>
>> On Wednesday, June 21, 2023 at 4:32:19 PM UTC-7 Jeff Kayser wrote:
>>
>>> [jeffkayser@localhost ch12]$ protoc --go-grpc_out=. 
>>> --go-grpc_opts=source_relative housework/v1/housework.proto
>>> Unknown flag: --go-grpc_opts
>>> [jeffkayser@localhost ch12]$ pwd
>>> /home/jeffkayser/projects/jkayser/network/ch12
>>> [jeffkayser@localhost ch12]$ 
>>>
>>> Fedora 38:
>>>
>>> [jeffkayser@localhost ch12]$ uname -a
>>> Linux localhost.localdomain 6.3.8-200.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC 
>>> Thu Jun 15 02:15:40 UTC 2023 x86_64 GNU/Linux
>>> [jeffkayser@localhost ch12]$
>>>  
>>> Go 1.20.5
>>>
>>> [jeffkayser@localhost ch12]$ go version
>>> go version go1.20.5 linux/amd64
>>> [jeffkayser@localhost ch12]$ 
>>>
>>> Protocol buffers:
>>>
>>> [jeffkayser@localhost ch12]$ which protoc
>>> /usr/bin/protoc
>>> [jeffkayser@localhost ch12]$ which protoc-gen-go
>>> /usr/bin/protoc-gen-go
>>> [jeffkayser@localhost ch12]$ which protoc-gen-go-grpc
>>> /usr/bin/protoc-gen-go-grpc
>>> [jeffkayser@localhost ch12]$ /usr/bin/protoc --version
>>> libprotoc 3.19.6
>>> [jeffkayser@localhost ch12]$ /usr/bin/protoc-gen-go --version
>>> protoc-gen-go v1.28.1
>>> [jeffkayser@localhost ch12]$ /usr/bin/protoc-gen-go-grpc --version
>>> protoc-gen-go-grpc 1.2.0
>>> [jeffkayser@localhost ch12]$
>>>
>>> Any help would be appreciated.
>>>
>>> Jeff Kayser
>>>
>>> -- 
>> You received this message because you are subscribed to a topic in the 
>> Google Groups "Protocol Buffers" group.
>> To unsubscribe from this topic, visit 
>> https://groups.google.com/d/topic/protobuf/nvkLDlr9_lo/unsubscribe.
>> To unsubscribe from this group and all its topics, send an email to 
>> protobuf+u...@googlegroups.com.
>> To view this discussion on the web visit 
>> https://groups.google.com/d/msgid/protobuf/4168088d-b29b-499c-8ced-a8b4bda8c60fn%40googlegroups.com
>>  
>> 
>> .
>>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/b795e10b-b108-4022-a8a8-10f73e40eaf7n%40googlegroups.com.


Re: [protobuf] Re: protoc returns Unknown flag: --go-grpc_opts

2023-06-25 Thread Jeff Kayser
Hi, Daz.

Doh!  Now I feel really stupid.  Thanks for letting me know.

Jeff Kayser




On Sun, Jun 25, 2023 at 7:25 PM Daz Wilkin  wrote:

> It's `--go-grpc_opt` (no `s`)
>
> On Wednesday, June 21, 2023 at 4:32:19 PM UTC-7 Jeff Kayser wrote:
>
>> [jeffkayser@localhost ch12]$ protoc --go-grpc_out=.
>> --go-grpc_opts=source_relative housework/v1/housework.proto
>> Unknown flag: --go-grpc_opts
>> [jeffkayser@localhost ch12]$ pwd
>> /home/jeffkayser/projects/jkayser/network/ch12
>> [jeffkayser@localhost ch12]$
>>
>> Fedora 38:
>>
>> [jeffkayser@localhost ch12]$ uname -a
>> Linux localhost.localdomain 6.3.8-200.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC
>> Thu Jun 15 02:15:40 UTC 2023 x86_64 GNU/Linux
>> [jeffkayser@localhost ch12]$
>>
>> Go 1.20.5
>>
>> [jeffkayser@localhost ch12]$ go version
>> go version go1.20.5 linux/amd64
>> [jeffkayser@localhost ch12]$
>>
>> Protocol buffers:
>>
>> [jeffkayser@localhost ch12]$ which protoc
>> /usr/bin/protoc
>> [jeffkayser@localhost ch12]$ which protoc-gen-go
>> /usr/bin/protoc-gen-go
>> [jeffkayser@localhost ch12]$ which protoc-gen-go-grpc
>> /usr/bin/protoc-gen-go-grpc
>> [jeffkayser@localhost ch12]$ /usr/bin/protoc --version
>> libprotoc 3.19.6
>> [jeffkayser@localhost ch12]$ /usr/bin/protoc-gen-go --version
>> protoc-gen-go v1.28.1
>> [jeffkayser@localhost ch12]$ /usr/bin/protoc-gen-go-grpc --version
>> protoc-gen-go-grpc 1.2.0
>> [jeffkayser@localhost ch12]$
>>
>> Any help would be appreciated.
>>
>> Jeff Kayser
>>
>> --
> You received this message because you are subscribed to a topic in the
> Google Groups "Protocol Buffers" group.
> To unsubscribe from this topic, visit
> https://groups.google.com/d/topic/protobuf/nvkLDlr9_lo/unsubscribe.
> To unsubscribe from this group and all its topics, send an email to
> protobuf+unsubscr...@googlegroups.com.
> To view this discussion on the web visit
> https://groups.google.com/d/msgid/protobuf/4168088d-b29b-499c-8ced-a8b4bda8c60fn%40googlegroups.com
> 
> .
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/CAMxODjGWO_P8jhTLemnAxcfTdVrUSYbHWcS-ruzMAG07N1dOHw%40mail.gmail.com.


[protobuf] Re: protoc returns Unknown flag: --go-grpc_opts

2023-06-25 Thread Daz Wilkin
It's `--go-grpc_opt` (no `s`)
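
For reference, the corrected invocation would look something like this, using
the commonly documented paths=source_relative form (adjust paths to taste):

protoc --go-grpc_out=. --go-grpc_opt=paths=source_relative housework/v1/housework.proto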

On Wednesday, June 21, 2023 at 4:32:19 PM UTC-7 Jeff Kayser wrote:

> [jeffkayser@localhost ch12]$ protoc --go-grpc_out=. 
> --go-grpc_opts=source_relative housework/v1/housework.proto
> Unknown flag: --go-grpc_opts
> [jeffkayser@localhost ch12]$ pwd
> /home/jeffkayser/projects/jkayser/network/ch12
> [jeffkayser@localhost ch12]$ 
>
> Fedora 38:
>
> [jeffkayser@localhost ch12]$ uname -a
> Linux localhost.localdomain 6.3.8-200.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC 
> Thu Jun 15 02:15:40 UTC 2023 x86_64 GNU/Linux
> [jeffkayser@localhost ch12]$
>  
> Go 1.20.5
>
> [jeffkayser@localhost ch12]$ go version
> go version go1.20.5 linux/amd64
> [jeffkayser@localhost ch12]$ 
>
> Protocol buffers:
>
> [jeffkayser@localhost ch12]$ which protoc
> /usr/bin/protoc
> [jeffkayser@localhost ch12]$ which protoc-gen-go
> /usr/bin/protoc-gen-go
> [jeffkayser@localhost ch12]$ which protoc-gen-go-grpc
> /usr/bin/protoc-gen-go-grpc
> [jeffkayser@localhost ch12]$ /usr/bin/protoc --version
> libprotoc 3.19.6
> [jeffkayser@localhost ch12]$ /usr/bin/protoc-gen-go --version
> protoc-gen-go v1.28.1
> [jeffkayser@localhost ch12]$ /usr/bin/protoc-gen-go-grpc --version
> protoc-gen-go-grpc 1.2.0
> [jeffkayser@localhost ch12]$
>
> Any help would be appreciated.
>
> Jeff Kayser
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/4168088d-b29b-499c-8ced-a8b4bda8c60fn%40googlegroups.com.


[protobuf] Re: Parse from string returns false

2023-06-19 Thread Krati Chordia
Thanks much for your input. 
-Krati

On Friday, June 16, 2023 at 10:23:58 PM UTC+5:30 Deanna Garcia wrote:

> It's the same situation with integers. If we made this return true then if 
> you did not set the value and attempted to serialize it it would return 
> true. I agree that this behavior isn't optimal, we just don't have a great 
> way around it.
>
> On Friday, June 16, 2023 at 5:40:59 AM UTC-7 Krati Chordia wrote:
>
>> Hi Deanna!
>> Agreed to your point. But if there is an integer and it's value is set to 
>> 0, even then the parsing returns false. Which should not happen.
>>
>> On Thursday, June 15, 2023 at 11:20:56 PM UTC+5:30 Deanna Garcia wrote:
>>
>>> In protobuf, we can't store null strings so to denote a string that 
>>> isn't set we use the empty string. For this reason, the parsing is 
>>> returning false to tell you that there isn't a field there but as you're 
>>> seeing will still parse as an empty string.
>>>
>>> On Wednesday, June 14, 2023 at 1:37:05 AM UTC-7 Krati Chordia wrote:
>>>
 Hi,
 I am trying to send a message on wire with the following protobuf 
 structure

 message TestMsg {
 string status = 1;
 }

 I create an instance of TestMsg and set status as empty and serialize 
 it to a string.

 TestMsg m1;
 m1.set_status("");

 std::string str = m1.SerializeAsString();

 Post serialization, str is sent over wire and tried to be parsed. 
 ParseFromString returns false whereas it should not. For any other value, 
 it parses successfully.

 TestMsg m2;
 m2.ParseFromString(str);  <- this returns false

 Also, if I try to retrieve the value of m2.status(), it will return an 
 empty string even though the parsing returns false.

>>>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/8504e0b5-bfec-4b69-b00f-866f4da93421n%40googlegroups.com.


[protobuf] Re: Is it possible to build shared libprotobuf.so with static abls libraries

2023-06-16 Thread Adam Dembek
Thanks

We need absl only for protobuf and grpc. Right now we plan to build the absl 
static libraries and link the same files into both protobuf and grpc. If that 
does not work, we can try building a shared library.

Adam

On Friday, June 16, 2023 at 7:04:02 PM UTC+2 Mike Kruskal wrote:

> Hey Adam,
>
> You can try to set `POSITION_INDEPENDENT_CODE` in the Abseil build.  That 
> should at least get around the error you're seeing here.  However, the 
> reason we don't explicitly support this is because in many situations it's 
> an ODR violation.  If two shared libraries both use Abseil via static 
> linkage, there will be two definitions of absl.  Assuming they're both 
> identical, this isn't a problem in most of the absl code.  However, when it 
> comes to things like the hash seed you can end up breaking every hash table 
> in very difficult to debug ways.  I think the only way this could work is 
> if none of the other shared libraries (and the code that's using them) were 
> using Abseil, and you don't need protoc (which also uses Abseil).
>
> That seems like a pretty brittle requirement to place on anyone, and I 
> think a feature request for Abseil to ship a single shared library might be 
> a better approach.  They already do this for windows, but not on linux or 
> mac.
>
> -Mike
>
> On Friday, June 16, 2023 at 2:59:43 AM UTC-7 Adam Dembek wrote:
>
>> We want to use shared libprotobuf.so for many tools to not increase size 
>> of each binary. 
>> We know we can build protobuf with -Dprotobuf_BUILD_SHARED_LIBS=ON 
>> -Dprotobuf_INSTALL_SUPPORTED_FROM_MODULE=ON -Dprotobuf_ABSL_PROVIDER=module 
>> but then both protobuf and absl are built as shared libraries.
>> We need to deliver over 80 libraries.  It would be easier if we could only 
>> deliver a single libprotobuf.so file like in the past, when protobuf was not 
>> using absl. 
>>
>> Is it possible to use static absl libraries and link them into libprotobuf.so? 
>> I tried first building absl independently with
>>
>> cmake -S ${ABSL_SOURCE} -B ${ABSL_PROD_DIR} 
>> -DCMAKE_PREFIX_PATH=${ABSL_PROD_DIR} 
>> -DCMAKE_INSTALL_PREFIX=${ABSL_PROD_DIR} -DABSL_ENABLE_INSTALL=ON 
>> -DABSL_USE_EXTERNAL_GOOGLETEST=OFF -DABSL_FIND_GOOGLETEST=OFF
>>
>> This created static absl libraries.
>>
>> And then built protobuf with 
>>  -DCMAKE_PREFIX_PATH=${ABSL_PROD_DIR}/lib64/cmake/absl 
>> -Dprotobuf_ABSL_PROVIDER=package -DCMAKE_CXX_STANDARD=14 
>> -Dprotobuf_BUILD_SHARED_LIBS=ON 
>>
>> But it fails  with 
>>
>> [ 36%] Linking CXX shared library libprotobuf.so
>> /usr/bin/ld: 
>> /user/adembek/PROD/absl/build/absl_2023_01_25/lib64/libabsl_cord.a(cord.cc.o):
>>  
>> relocation R_X86_64_32 against `.rodata' can not be used when making a 
>> shared object; recompile with -fPIC
>> /usr/bin/ld: 
>> /user/adembek/PROD/absl/build/absl_2023_01_25/lib64/libabsl_cord.a(cord_analysis.cc.o):
>>  
>> relocation R_X86_64_32 against `.rodata' can not be used when making a 
>> shared object; recompile with -fPIC
>> /usr/bin/ld: 
>> /user/adembek/PROD/absl/build/absl_2023_01_25/lib64/libabsl_die_if_null.a(die_if_null.cc.o):
>>  
>> relocation R_X86_64_32 against `.rodata' can not be used when making a 
>> shared object; recompile with -fPIC
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/199dfedb-71ce-47b1-975d-2e4b9c63f303n%40googlegroups.com.


[protobuf] Re: Is it possible to build shared libprotobuf.so with static abls libraries

2023-06-16 Thread 'Mike Kruskal' via Protocol Buffers
Hey Adam,

You can try to set `POSITION_INDEPENDENT_CODE` in the Abseil build.  That 
should at least get around the error you're seeing here.  However, the 
reason we don't explicitly support this is because in many situations it's 
an ODR violation.  If two shared libraries both use Abseil via static 
linkage, there will be two definitions of absl.  Assuming they're both 
identical, this isn't a problem in most of the absl code.  However, when it 
comes to things like the hash seed you can end up breaking every hash table 
in very difficult to debug ways.  I think the only way this could work is 
if none of the other shared libraries (and the code that's using them) were 
using Abseil, and you don't need protoc (which also uses Abseil).

That seems like a pretty brittle requirement to place on anyone, and I 
think a feature request for Abseil to ship a single shared library might be 
a better approach.  They already do this for windows, but not on linux or 
mac.
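
Concretely, a minimal sketch of that (untested here), based on the Abseil
configure command from your message:

cmake -S ${ABSL_SOURCE} -B ${ABSL_PROD_DIR} \
  -DCMAKE_PREFIX_PATH=${ABSL_PROD_DIR} \
  -DCMAKE_INSTALL_PREFIX=${ABSL_PROD_DIR} \
  -DABSL_ENABLE_INSTALL=ON \
  -DABSL_USE_EXTERNAL_GOOGLETEST=OFF -DABSL_FIND_GOOGLETEST=OFF \
  -DCMAKE_POSITION_INDEPENDENT_CODE=ON

CMAKE_POSITION_INDEPENDENT_CODE=ON tells CMake to compile the static Abseil
archives with -fPIC, which is what the linker error is complaining about. It
does not change any of the ODR caveats above.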

-Mike

On Friday, June 16, 2023 at 2:59:43 AM UTC-7 Adam Dembek wrote:

> We want to use shared libprotobuf.so for many tools to not increase size 
> of each binary. 
> We know we can build protobuf with -Dprotobuf_BUILD_SHARED_LIBS=ON 
> -Dprotobuf_INSTALL_SUPPORTED_FROM_MODULE=ON -Dprotobuf_ABSL_PROVIDER=module 
> but then both protobuf and absl are built as shared libraries.
> We need to deliver over 80 libraries.  It would be easier if we could only 
> deliver a single libprotobuf.so file like in the past, when protobuf was not 
> using absl. 
>
> Is it possible to use static absl libraries and link them into libprotobuf.so? 
> I tried first building absl independently with
>
> cmake -S ${ABSL_SOURCE} -B ${ABSL_PROD_DIR} 
> -DCMAKE_PREFIX_PATH=${ABSL_PROD_DIR} 
> -DCMAKE_INSTALL_PREFIX=${ABSL_PROD_DIR} -DABSL_ENABLE_INSTALL=ON 
> -DABSL_USE_EXTERNAL_GOOGLETEST=OFF -DABSL_FIND_GOOGLETEST=OFF
>
> This created static absl libraries.
>
> And then built protobuf with 
>  -DCMAKE_PREFIX_PATH=${ABSL_PROD_DIR}/lib64/cmake/absl 
> -Dprotobuf_ABSL_PROVIDER=package -DCMAKE_CXX_STANDARD=14 
> -Dprotobuf_BUILD_SHARED_LIBS=ON 
>
> But it fails  with 
>
> [ 36%] Linking CXX shared library libprotobuf.so
> /usr/bin/ld: 
> /user/adembek/PROD/absl/build/absl_2023_01_25/lib64/libabsl_cord.a(cord.cc.o):
>  
> relocation R_X86_64_32 against `.rodata' can not be used when making a 
> shared object; recompile with -fPIC
> /usr/bin/ld: 
> /user/adembek/PROD/absl/build/absl_2023_01_25/lib64/libabsl_cord.a(cord_analysis.cc.o):
>  
> relocation R_X86_64_32 against `.rodata' can not be used when making a 
> shared object; recompile with -fPIC
> /usr/bin/ld: 
> /user/adembek/PROD/absl/build/absl_2023_01_25/lib64/libabsl_die_if_null.a(die_if_null.cc.o):
>  
> relocation R_X86_64_32 against `.rodata' can not be used when making a 
> shared object; recompile with -fPIC

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/dc86a283-30f7-4901-998e-9a32c831cf63n%40googlegroups.com.


[protobuf] Re: error: cannot access CheckReturnValue

2023-06-16 Thread Kevin Jimenez
When I remove 
implementation("com.google.protobuf:protobuf-java:3.23.0") {
exclude(group: 'com.google.guava', module: 'guava')
exclude(group: 'org.checkerframework', module: 
'checker-compat-qual')
exclude(group: 'javax.annotation', module: 'jsr250-api')
exclude(group: 'io.opencensus', module: 'opencensus-api')
exclude(group: 'com.google.errorprone', module: 
'error_prone_annotations')
exclude(group: 'com.google.protobuf', module: 'protobuf-javalite')
exclude(group: 'io.grpc', module: 'grpc-context')
}
}
Timestamp does not get recognized, but the CheckReturnValue problem is gone.  
On the other hand, when I add it back I get the CheckReturnValue error. Do you 
have any idea whether the problem is with the protobuf-java or the guava 
dependency?
On Friday, June 16, 2023 at 11:55:09 AM UTC-5 Deanna Garcia wrote:

> I think you're right that this is likely a problem with guava. Can you try 
> posting a bug in their repo?
>
> On Thursday, June 15, 2023 at 6:55:37 PM UTC-7 Kevin Jimenez wrote:
>
>> I am trying to install grpc for Android with access to Timestamp. The 
>> generated code generates ok but the project throws this error. I did some 
>> research and it looks like it has to do with guava. Attached is my app 
>> build.gradle:
>>
>> *Error* 
>>
>> > Task :app:compileDevDebugJavaWithJavac
>> error: cannot access CheckReturnValue
>>   class file for javax.annotation.CheckReturnValue not found
>> cannot access CheckReturnValue
>>
>> Note: Some input files use or override a deprecated API.
>> Note: Recompile with -Xlint:deprecation for details.
>> 1 error
>>
>> Build Gradle Snippet
>> ```
>>
>> // You need to build grpc-java to obtain these libraries below.
>> implementation 'io.grpc:grpc-okhttp:1.55.1' // CURRENT_GRPC_VERSION
>> implementation 'io.grpc:grpc-protobuf-lite:1.55.1' // 
>> CURRENT_GRPC_VERSION
>> implementation 'io.grpc:grpc-stub:1.55.1' // CURRENT_GRPC_VERSION
>> implementation 'org.apache.tomcat:annotations-api:6.0.53'
>> implementation 'com.google.protobuf:protobuf-javalite:3.23.2'
>> implementation("com.google.protobuf:protobuf-java:3.23.0") {
>> exclude(group: 'com.google.guava', module: 'guava')
>> exclude(group: 'org.checkerframework', module: 
>> 'checker-compat-qual')
>> exclude(group: 'javax.annotation', module: 'jsr250-api')
>> exclude(group: 'io.opencensus', module: 'opencensus-api')
>> exclude(group: 'com.google.errorprone', module: 
>> 'error_prone_annotations')
>> exclude(group: 'com.google.protobuf', module: 'protobuf-javalite')
>> exclude(group: 'io.grpc', module: 'grpc-context')
>> }
>> //implementation 
>> 'com.google.errorprone:error_prone_annotations:2.18.0'
>> //implementation("com.google.guava:guava:32.0.1-android")
>> //protobuf 'com.google.protobuf:protobuf-java:3.23.0'
>> }
>>
>> protobuf {
>> protoc { artifact = 'com.google.protobuf:protoc:3.23.2' }
>> plugins {
>> grpc {
>> artifact = 'io.grpc:protoc-gen-grpc-java:1.55.1' // 
>> CURRENT_GRPC_VERSION
>> }
>> }
>> generateProtoTasks {
>> all().each { task ->
>> task.builtins {
>> java { option 'lite' }
>> }
>> task.plugins {
>> grpc { // Options added to --grpc_out
>> option 'lite'
>> }
>> }
>> }
>> }
>> }
>> ```
>>
>> Thanks in advance! Lmk if you want any more info
>>
>>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/0a450964-2396-4798-8757-ea4f1e346e6dn%40googlegroups.com.


[protobuf] Re: error: cannot access CheckReturnValue

2023-06-16 Thread 'Deanna Garcia' via Protocol Buffers
I think you're right that this is likely a problem with guava. Can you try 
posting a bug in their repo?

On Thursday, June 15, 2023 at 6:55:37 PM UTC-7 Kevin Jimenez wrote:

> I am trying to install grpc for Android with access to Timestamp. The 
> generated code generates ok but the project throws this error. I did some 
> research and it looks like it has to do with guava. Attached is my app 
> build.gradle:
>
> *Error* 
>
> > Task :app:compileDevDebugJavaWithJavac
> error: cannot access CheckReturnValue
>   class file for javax.annotation.CheckReturnValue not found
> cannot access CheckReturnValue
>
> Note: Some input files use or override a deprecated API.
> Note: Recompile with -Xlint:deprecation for details.
> 1 error
>
> Build Gradle Snippet
> ```
>
> // You need to build grpc-java to obtain these libraries below.
> implementation 'io.grpc:grpc-okhttp:1.55.1' // CURRENT_GRPC_VERSION
> implementation 'io.grpc:grpc-protobuf-lite:1.55.1' // 
> CURRENT_GRPC_VERSION
> implementation 'io.grpc:grpc-stub:1.55.1' // CURRENT_GRPC_VERSION
> implementation 'org.apache.tomcat:annotations-api:6.0.53'
> implementation 'com.google.protobuf:protobuf-javalite:3.23.2'
> implementation("com.google.protobuf:protobuf-java:3.23.0") {
> exclude(group: 'com.google.guava', module: 'guava')
> exclude(group: 'org.checkerframework', module: 
> 'checker-compat-qual')
> exclude(group: 'javax.annotation', module: 'jsr250-api')
> exclude(group: 'io.opencensus', module: 'opencensus-api')
> exclude(group: 'com.google.errorprone', module: 
> 'error_prone_annotations')
> exclude(group: 'com.google.protobuf', module: 'protobuf-javalite')
> exclude(group: 'io.grpc', module: 'grpc-context')
> }
> //implementation 'com.google.errorprone:error_prone_annotations:2.18.0'
> //implementation("com.google.guava:guava:32.0.1-android")
> //protobuf 'com.google.protobuf:protobuf-java:3.23.0'
> }
>
> protobuf {
> protoc { artifact = 'com.google.protobuf:protoc:3.23.2' }
> plugins {
> grpc {
> artifact = 'io.grpc:protoc-gen-grpc-java:1.55.1' // 
> CURRENT_GRPC_VERSION
> }
> }
> generateProtoTasks {
> all().each { task ->
> task.builtins {
> java { option 'lite' }
> }
> task.plugins {
> grpc { // Options added to --grpc_out
> option 'lite'
> }
> }
> }
> }
> }
> ```
>
> Thanks in advance! Lmk if you want any more info
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/d102f575-ee36-4c5d-aa7f-d1c88525aa5en%40googlegroups.com.


[protobuf] Re: Parse from string returns false

2023-06-16 Thread 'Deanna Garcia' via Protocol Buffers
It's the same situation with integers. If we made this return true, then it 
would also return true when you did not set the value at all and attempted to 
serialize it. I agree that this behavior isn't optimal; we just don't have a 
great way around it.

On Friday, June 16, 2023 at 5:40:59 AM UTC-7 Krati Chordia wrote:

> Hi Deanna!
> Agreed to your point. But if there is an integer and it's value is set to 
> 0, even then the parsing returns false. Which should not happen.
>
> On Thursday, June 15, 2023 at 11:20:56 PM UTC+5:30 Deanna Garcia wrote:
>
>> In protobuf, we can't store null strings so to denote a string that isn't 
>> set we use the empty string. For this reason, the parsing is returning 
>> false to tell you that there isn't a field there but as you're seeing will 
>> still parse as an empty string.
>>
>> On Wednesday, June 14, 2023 at 1:37:05 AM UTC-7 Krati Chordia wrote:
>>
>>> Hi,
>>> I am trying to send a message on wire with the following protobuf 
>>> structure
>>>
>>> message TestMsg {
>>> string status = 1;
>>> }
>>>
>>> I create an instance of TestMsg and set status as empty and serialize it 
>>> to a string.
>>>
>>> TestMsg m1;
>>> m1.set_status("");
>>>
>>> std::string str = m1.SerializeAsString();
>>>
>>> Post serialization, str is sent over wire and tried to be parsed. 
>>> ParseFromString returns false whereas it should not. For any other value, 
>>> it parses successfully.
>>>
>>> TestMsg m2;
>>> m2.ParseFromString(str);  <- this returns false
>>>
>>> Also, if I try to retrieve the value of m2.status(), it will return an 
>>> empty string even though the parsing returns false.
>>>
>>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/32938bf8-ad54-4a80-b86b-6cfedf0f2befn%40googlegroups.com.


[protobuf] Re: Parse from string returns false

2023-06-16 Thread Krati Chordia
Hi Deanna!
Agreed with your point. But if there is an integer and its value is set to 
0, even then the parsing returns false, which should not happen.

On Thursday, June 15, 2023 at 11:20:56 PM UTC+5:30 Deanna Garcia wrote:

> In protobuf, we can't store null strings so to denote a string that isn't 
> set we use the empty string. For this reason, the parsing is returning 
> false to tell you that there isn't a field there but as you're seeing will 
> still parse as an empty string.
>
> On Wednesday, June 14, 2023 at 1:37:05 AM UTC-7 Krati Chordia wrote:
>
>> Hi,
>> I am trying to send a message on wire with the following protobuf 
>> structure
>>
>> message TestMsg {
>> string status = 1;
>> }
>>
>> I create an instance of TestMsg and set status as empty and serialize it 
>> to a string.
>>
>> TestMsg m1;
>> m1.set_status("");
>>
>> std::string str = m1.SerializeAsString();
>>
>> Post serialization, str is sent over wire and tried to be parsed. 
>> ParseFromString returns false whereas it should not. For any other value, 
>> it parses successfully.
>>
>> TestMsg m2;
>> m2.ParseFromString(str);  <- this returns false
>>
>> Also, if I try to retrieve the value of m2.status(), it will return an 
>> empty string even though the parsing returns false.
>>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/c371a73a-723e-4bb7-aba1-4751e39cd728n%40googlegroups.com.


[protobuf] Re: Parse from string returns false

2023-06-15 Thread 'Deanna Garcia' via Protocol Buffers
In protobuf, we can't store null strings, so to denote a string that isn't 
set we use the empty string. For this reason, the parsing is returning false 
to tell you that there isn't a field there, but, as you're seeing, it will 
still parse as an empty string.
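
If you need to tell "never set" apart from "set to empty", one option (a
sketch, not something from this thread; it assumes proto3 and a protobuf
release new enough to support proto3 optional) is to mark the field optional,
which generates an explicit presence accessor:

message TestMsg {
  optional string status = 1;  // proto3 explicit presence
}

TestMsg m2;
m2.ParseFromString(str);
if (m2.has_status()) {
  // status was explicitly set by the sender, even if it is the empty string.
}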

On Wednesday, June 14, 2023 at 1:37:05 AM UTC-7 Krati Chordia wrote:

> Hi,
> I am trying to send a message on wire with the following protobuf structure
>
> message TestMsg {
> string status = 1;
> }
>
> I create an instance of TestMsg and set status as empty and serialize it 
> to a string.
>
> TestMsg m1;
> m1.set_status("");
>
> std::string str = m1.SerializeAsString();
>
> Post serialization, str is sent over wire and tried to be parsed. 
> ParseFromString returns false whereas it should not. For any other value, 
> it parses successfully.
>
> TestMsg m2;
> m2.ParseFromString(str);  <- this returns false
>
> Also, if I try to retrieve the value of m2.status(), it will return an 
> empty string even though the parsing returns false.
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/57656589-532a-4780-9427-4a510107a1c4n%40googlegroups.com.


[protobuf] Re: Self-describing messages in Python

2023-06-15 Thread Jan Machek
Hello,

in the end I managed to fix this issue using CopyFrom(fds), where fds is an 
instance of google.protobuf.descriptor_pb2.FileDescriptorSet, rather than the 
file.append() call shown in my original code.

So in the code above we would call 
my_proto_instance.descriptor_set.CopyFrom(addressbook_fds)  instead of 
my_proto_instance.descriptor_set.file.append(addressbook_fds).
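
Putting it together, a short sketch using the same file and variable names as
the snippets quoted below:

with open("addressbook_ds", 'rb') as fh:
    addressbook_fds = descriptor_pb2.FileDescriptorSet.FromString(fh.read())

# descriptor_set is itself a FileDescriptorSet, so copy the whole set instead
# of appending it to the repeated `file` field (which expects
# FileDescriptorProto entries -- that mismatch caused the original TypeError).
my_proto_instance.descriptor_set.CopyFrom(addressbook_fds)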

Hope someone finds this useful.

Cheers

Jan
On Friday, April 7, 2023 at 11:38:50 PM UTC+2 Jan Machek wrote:

> Hello,
>
> I am trying to build self-describing messages using Python. I read through 
> the documentation 
> https://protobuf.dev/programming-guides/techniques/#self-description as 
> well as through the Python API 
> https://googleapis.dev/python/protobuf/latest/google/protobuf.html, but I 
> am still not very sure about how to do that.
>
> I started with the example message
>
> syntax = "proto3";
>
> import "google/protobuf/any.proto";
> import "google/protobuf/descriptor.proto";
>
> message SelfDescribingMessage {
>   // Set of FileDescriptorProtos which describe the type and its dependencies.
>   google.protobuf.FileDescriptorSet descriptor_set = 1;
>
>   // The message and its type, encoded as an Any message.
>   google.protobuf.Any message = 2;
> }
>
> And I would like to pass through it a simple addressbook message
>
> syntax = "proto2";
>
> package tutorial;
>
> message AddressBook {
>   optional string name = 1;
>   optional string number = 2;
> }
>
> The AddressBook message would be contained in the message field and the 
> descriptor_set field would contain the descriptor of the proto above.
>
> However I could not do this no matter what I have tried.
>
> The closest I got was exporting a file descriptor set from the 
> SelfDescribingMessage.proto given in the documentation as "protoc 
> --proto_path=. --proto_path=./include 
> --descriptor_set_out=./self_describing_ds --include_imports 
> self_describing.proto" and then reading it
>
> with open("self_describing_ds", 'rb') as fh:
> fds = descriptor_pb2.FileDescriptorSet.FromString(fh.read())
>
> message_classes = message_factory.GetMessages(fds.file)
> my_proto_instance = message_classes["SelfDescribingMessage"]()
>
> address_book = addressbook_pb2.AddressBook()
> address_book.name = "John Doe"
> address_book.number = "123456"
>
> my_proto_instance.message.Pack(address_book)
>
> I am not able to set my_proto_instance.descriptor_set though. Extracting 
> address book descriptor set using protoc and then reading it and trying to 
> append it 
>
> with open("addressbook_ds", 'rb') as fh:
> addressbook_fds = 
> descriptor_pb2.FileDescriptorSet.FromString(fh.read()) 
> my_proto_instance.descriptor_set.file.append(addressbook_fds)
>
> fails on
>
> TypeError: Parameter to MergeFrom() must be instance of same class: expected 
> <class 'google.protobuf.descriptor_pb2.FileDescriptorProto'> got 
> <class 'google.protobuf.descriptor_pb2.FileDescriptorSet'>.
>
> I could not get any closer.
>
> Does anyone have any simple example on sending the self-describing 
> messages in Python, please?
>
> Thanks a lot in advance.
>
> Jan
>
>
>
>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/e85a4c87-9c2b-4a99-8c9c-a88db37a8b6cn%40googlegroups.com.


[protobuf] Re: How to Create Class Use C++ without protoc file

2023-06-13 Thread 'Deanna Garcia' via Protocol Buffers
I don't think I understand your question; code generation is a key part of 
protobuf (and separate from protoc, the compiler). You can view our 
documentation at https://protobuf.dev/, which should provide some more 
information.

On Monday, June 12, 2023 at 9:48:37 PM UTC-7 qiqi tang wrote:

> Hi, I have seen the examples in the protobuf documentation. But I want to use 
> its features without using code generation. What should I do?

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/75dea73b-93de-4850-8a6d-4120181733afn%40googlegroups.com.


[protobuf] Re: php_protobuf.dll for PHP 8.1.20 VS16 x64?

2023-06-12 Thread 'Deanna Garcia' via Protocol Buffers
Unfortunately we do not publish precompiled binaries for PHP. You can find 
all our PHP offerings on packagist or PECL.
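
If building the extension locally is an option, the PECL package can usually
be installed with something like the following (a sketch, assuming standard
PECL tooling is available; on Windows you would still have to compile it
yourself):

pecl install protobuf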

On Sunday, June 11, 2023 at 10:26:54 AM UTC-7 genosmrpg7899 wrote:

> Added note: I am on Windows preferably, though Linux binaries are also 
> accepted.
> On Sunday, June 11, 2023 at 1:26:05 p.m. UTC-4 genosmrpg7899 wrote:
>
>> Are precompiled binaries available for the protobuf extension on PHP 
>> 8.1.20 VS16 x64?
>>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/cd7d7f2d-aca4-4a55-9ab9-bfdafe545370n%40googlegroups.com.


[protobuf] Re: How do I become a developer and get my bug fix merged?

2023-06-12 Thread 'Deanna Garcia' via Protocol Buffers
A protobuf team member has to manually add a label to mark the PR as safe 
before tests can be run. I see there are new comments on your PR. Once 
those comments are resolved, someone will follow up and add the label to 
run the tests.

On Wednesday, June 7, 2023 at 8:34:38 AM UTC-7 Jeff wrote:

> Hello,
> I opened an issue https://github.com/protocolbuffers/protobuf/issues/12994 
> and pushed a fix to my fork with a PR 
> https://github.com/protocolbuffers/protobuf/pull/12993. The build fails 
> because "This pull request is from an unsafe fork and hasn't been 
> approved to run tests!" Fair enough.
>
> What is the procedure I need to follow to move ahead with this bug fix? 
> I've searched for a developer guide but didn't find it.
>
> Thanks,
> Jeff
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/b9e69823-6546-4415-b70f-51b65ec44210n%40googlegroups.com.


[protobuf] Re: php_protobuf.dll for PHP 8.1.20 VS16 x64?

2023-06-11 Thread genosmrpg7899
Added note: I am on Windows preferably, though Linux binaries are also 
accepted.
On Sunday, June 11, 2023 at 1:26:05 p.m. UTC-4 genosmrpg7899 wrote:

> Are precompiled binaries available for the protobuf extension on PHP 
> 8.1.20 VS16 x64?
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/8ab3cedc-3d37-4727-8785-44b6eef54bffn%40googlegroups.com.


[protobuf] Re: TypeError: Message must be initialized with a dict

2023-06-10 Thread 'Tony Piazza' via Protocol Buffers
The problem had to do with the Date and Timestamp fields; it was solved by 
converting them to int64.
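
For anyone hitting the same error, a sketch of what that change looks like
(field names come from my original message below; the exact encoding of the
int64 values -- e.g. days since epoch for the date and microseconds since
epoch for the timestamp -- is an assumption, pick whatever your pipeline
expects):

syntax = "proto2";

package combocurve;

message Measurement {
required string device_id = 1;
required int64 last_service_date = 2;
optional double temperature = 3;
optional double pressure = 4;
optional int64 created_at = 5;
}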

On Saturday, June 10, 2023 at 10:56:36 PM UTC-5 Tony Piazza wrote:

> My current task requires me to use the BigQuery Storage Write API. I have 
> created a .proto file and was able to use protoc to generate a Python 
> message class. I am seeing this exception when creating an instance of that 
> class:
>
> *TypeError: Message must be initialized with a dict: 
> combocurve.Measurement*
>
> *File "/google/api_core/grpc_helpers.py", line 162, in 
> error_remapped_callable*
>
> Here is my .proto file:
>
> syntax = "proto2";
>
> package combocurve;
>
> import "google/protobuf/timestamp.proto";
> import "google/type/date.proto";
>
> message Measurement {
> required string device_id = 1;
> required google.type.Date last_service_date = 2;
> optional double temperature = 3;
> optional double pressure = 4;
> optional google.protobuf.Timestamp created_at = 5;
> }
>
> Here is the code that is raising the exception:
>
> measurement = Measurement(
> device_id='ABC123',
> last_service_date=date_pb2.Date(
> year=last_service_date.year, 
> month=last_service_date.month, 
> day=last_service_date.day),
> temperature=10.0,
> pressure=20.0,
> created_at=int(created_at.timestamp() * 1e6)
> )
>
> Please let me know if you have any ideas as to what is causing this 
> exception.
>
> Thanks in advance for your help!
>
> -Tony
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/b97189ab-2211-4281-8e9f-84e75a97cb2bn%40googlegroups.com.


[protobuf] Re: Benchmarks for protoc buffer?

2023-06-06 Thread Steven L Wang
Hi Florian:
   Thanks for your suggestion. It's helpful. 

Thanks,
  -Steven

On Tuesday, June 6, 2023 at 5:07:55 PM UTC+8 Florian Enner wrote:

> There are many community benchmarks that include protobuf and/or grpc, but 
> whether they are representative of your use case is a different question. 
> Most comparisons I've seen are either basic, use suboptimal message 
> definitions, or have been specifically designed to make Protobuf look bad 
> (e.g. benchmarks from competitors that are trying to establish themselves). 
> IMO it's best to create your own benchmarks if you need reliable results 
> for your specific use case.
>
> Here are a few guidelines for message definitions that should generally 
> help performance:
>
>- prefer field ids below 16 (5 bit)
>- avoid deep message nesting
>- use varint types (uint32, sint32) for small numbers (ideally max 7 
>bit, i.e., 0-128)
>- use fixed-width types (fixed32, sfixed32, etc.) for large numbers
>- use with caution: groups can be serialized more efficiently than 
>messages, but they have been deprecated and the syntax is terrible
>
> The performance impact also depends a lot on the actual implementation, 
> e.g., deep nesting hits harder on implementations that don't cache the 
> computed size.
>
>
> On Tuesday, June 6, 2023 at 4:00:44 AM UTC+2 Steven L Wang wrote:
>
>> Hi:
>>   Is there any benchmark for protocol buffers that is supported or used 
>> by the community? I can find many benchmarks for protocol buffers on the 
>> internet. But I don't know which one is advocated by the community.
>>
>> Thanks,
>>   -Steven
>>
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/0a2e2150-08ea-4c03-a306-05cbde34db77n%40googlegroups.com.


[protobuf] Re: Benchmarks for protoc buffer?

2023-06-06 Thread Florian Enner
There are many community benchmarks that include protobuf and/or grpc, but 
whether they are representative of your use case is a different question. 
Most comparisons I've seen are either basic, use suboptimal message 
definitions, or have been specifically designed to make Protobuf look bad 
(e.g. benchmarks from competitors that are trying to establish themselves). 
IMO it's best to create your own benchmarks if you need reliable results 
for your specific use case.

Here are a few guidelines for message definitions that should generally 
help performance:

   - prefer field ids below 16 (5 bit)
   - avoid deep message nesting
   - use varint types (uint32, sint32) for small numbers (ideally max 7 
   bit, i.e., 0-128)
   - use fixed-width types (fixed32, sfixed32, etc.) for large numbers
   - use with caution: groups can be serialized more efficiently than 
   messages, but they have been deprecated and the syntax is terrible

The performance impact also depends a lot on the actual implementation, 
e.g., deep nesting hits harder on implementations that don't cache the 
computed size.
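
To make the first few guidelines concrete, a small illustrative schema (the
message and field names are made up for this example):

syntax = "proto3";

message SensorReading {
  // Frequently used fields get numbers 1-15 so each tag fits in a single byte.
  uint32 status_code = 1;   // small values -> varint stays short
  sint32 delta = 2;         // small signed values -> zigzag-encoded varint
  fixed32 checksum = 3;     // large, evenly distributed values -> fixed width
}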


On Tuesday, June 6, 2023 at 4:00:44 AM UTC+2 Steven L Wang wrote:

> Hi:
>   Is there any benchmark for protocol buffers that is supported or used by 
> the community? I can find many benchmarks for protocol buffers on the 
> internet. But I don't know which one is advocated by the community.
>
> Thanks,
>   -Steven
>

-- 
You received this message because you are subscribed to the Google Groups 
"Protocol Buffers" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to protobuf+unsubscr...@googlegroups.com.
To view this discussion on the web visit 
https://groups.google.com/d/msgid/protobuf/ddd54b17-f078-4d18-8b5a-c0ddb0f504dcn%40googlegroups.com.

