ASF GitHub Bot logged work on BEAM-7819:

                Author: ASF GitHub Bot
            Created on: 07/Aug/19 10:47
            Start Date: 07/Aug/19 10:47
    Worklog Time Spent: 10m 
      Work Description: matt-darwin commented on issue #9232: [BEAM-7819] 
Python - parse PubSub message_id into attributes property
URL: https://github.com/apache/beam/pull/9232#issuecomment-519044616
   > Note that in Dataflow, the Python SDK uses FnAPI to read from Pub/Sub 
using a Java harness. In other words, the message id might be missing because 
Java code doesn't provide it somehow. I'll look into it later.
   > When running locally using DirectRunner, there's a different 
implementation that replaces `ReadFromPubSub` entirely, which probably needs 
modification as well. See:
   The DirectRunner has been working OK with the above changes; I think the 
issue is on the Dataflow runner side. The DirectRunner uses the 
`_from_message` method, which is parsing correctly and returning the Pub/Sub 
message id in my testing so far.
    def _read_from_pubsub(self, timestamp_attribute):
      from apache_beam.io.gcp.pubsub import PubsubMessage
      from google.cloud import pubsub

      def _get_element(message):
        parsed_message = PubsubMessage._from_message(message)
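As context for why the DirectRunner path works, the conversion `_from_message` performs can be sketched with plain Python objects. `FakeReceivedMessage` here is a hypothetical stand-in for the google.cloud.pubsub message object, and folding `message_id` into the attributes dict reflects the change under review in this PR, not released behaviour:

```python
class FakeReceivedMessage:
    """Hypothetical stand-in for the google.cloud.pubsub message object."""
    def __init__(self, data, attributes, message_id):
        self.data = data
        self.attributes = attributes
        self.message_id = message_id


def from_message_sketch(msg):
    # Copy the attribute map into a plain dict, then surface the
    # top-level message_id alongside the user-set attributes -- the
    # behaviour the comment above reports seeing on the DirectRunner.
    attributes = dict(msg.attributes)
    if msg.message_id:
        attributes['message_id'] = msg.message_id
    return msg.data, attributes


data, attrs = from_message_sketch(
    FakeReceivedMessage(b'payload', {'k': 'v'}, 'id-123'))
```

With this sketch, `attrs` carries both the user attribute and the message id, which is what the downstream assertion in the reporter's test would check.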
Issue Time Tracking

    Worklog Id:     (was: 290361)
    Time Spent: 1.5h  (was: 1h 20m)

> PubsubMessage message parsing is lacking non-attribute fields
> -------------------------------------------------------------
>                 Key: BEAM-7819
>                 URL: https://issues.apache.org/jira/browse/BEAM-7819
>             Project: Beam
>          Issue Type: Bug
>          Components: io-python-gcp
>            Reporter: Ahmet Altay
>            Assignee: Udi Meiri
>            Priority: Major
>          Time Spent: 1.5h
>  Remaining Estimate: 0h
> User reported issue: 
> https://lists.apache.org/thread.html/139b0c15abc6471a2e2202d76d915c645a529a23ecc32cd9cfecd315@%3Cuser.beam.apache.org%3E
> """
> Looking at the source code, with my untrained python eyes, I think if the 
> intention is to include the message id and the publish time in the attributes 
> attribute of the PubSubMessage type, then the protobuf mapping is missing 
> something:-
> @staticmethod
> def _from_proto_str(proto_msg):
>   """Construct from serialized form of ``PubsubMessage``.
>
>   Args:
>     proto_msg: String containing a serialized protobuf of type
>       https://cloud.google.com/pubsub/docs/reference/rpc/google.pubsub.v1#google.pubsub.v1.PubsubMessage
>
>   Returns:
>     A new PubsubMessage object.
>   """
>   msg = pubsub.types.pubsub_pb2.PubsubMessage()
>   msg.ParseFromString(proto_msg)
>   # Convert ScalarMapContainer to dict.
>   attributes = dict((key, msg.attributes[key]) for key in msg.attributes)
>   return PubsubMessage(msg.data, attributes)
> The protobuf definition is here:-
> https://cloud.google.com/pubsub/docs/reference/rpc/google.pubsub.v1#google.pubsub.v1.PubsubMessage
> and so it looks as if the message_id and publish_time are not being parsed, as 
> they are separate from the attributes. Perhaps the PubsubMessage class needs 
> expanding to include these as attributes, or they would need adding to the 
> attributes dictionary. This would only need doing for _from_proto_str, as 
> obviously they would not need to be populated when transmitting a message 
> to PubSub.
> My Python is not great; I'm assuming the latter option would need to look 
> something like this?
>   attributes = dict((key, msg.attributes[key]) for key in msg.attributes)
>   attributes.update({'message_id': msg.message_id,
>                      'publish_time': msg.publish_time})
>   return PubsubMessage(msg.data, attributes)
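Run standalone, the merge the reporter proposes behaves like this. `FakeProtoMessage` is a hypothetical stand-in for the parsed google.pubsub.v1.PubsubMessage protobuf, so this only illustrates the dict update, not the real deserialization:

```python
class FakeProtoMessage:
    """Hypothetical stand-in for the parsed PubsubMessage protobuf."""
    def __init__(self, data, attributes, message_id, publish_time):
        self.data = data
        self.attributes = attributes
        self.message_id = message_id
        self.publish_time = publish_time


def parse_with_ids(msg):
    # Convert the attribute map to a plain dict, as _from_proto_str does.
    attributes = dict((key, msg.attributes[key]) for key in msg.attributes)
    # Fold the top-level protobuf fields into the attributes dict,
    # mirroring the reporter's proposed fix.
    attributes.update({'message_id': msg.message_id,
                       'publish_time': msg.publish_time})
    return msg.data, attributes


data, attrs = parse_with_ids(
    FakeProtoMessage(b'payload', {'k': 'v'}, 'id-123',
                     '2019-08-07T10:47:00Z'))
```

One caveat of this approach: the `update` silently overwrites any user-set attribute that is already named `message_id` or `publish_time`, which may be an argument for surfacing them as separate fields on PubsubMessage instead.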
> """

This message was sent by Atlassian JIRA
