[
https://issues.apache.org/jira/browse/BEAM-7819?focusedWorklogId=291824&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-291824
]
ASF GitHub Bot logged work on BEAM-7819:
----------------------------------------
Author: ASF GitHub Bot
Created on: 09/Aug/19 05:32
Start Date: 09/Aug/19 05:32
Worklog Time Spent: 10m
Work Description: matt-darwin commented on pull request #9232:
[BEAM-7819] Python - parse PubSub message_id into attributes property
URL: https://github.com/apache/beam/pull/9232#discussion_r312333851
##########
File path: sdks/python/apache_beam/io/gcp/pubsub.py
##########
@@ -127,6 +130,10 @@ def _from_message(msg):
"""
# Convert ScalarMapContainer to dict.
attributes = dict((key, msg.attributes[key]) for key in msg.attributes)
+ # Parse the PubSub message_id and add to attributes
+ sysattribs = dict(message_id=msg.message_id)
Review comment:
Yes, I agree. This was merely done to fit with the existing documentation;
however, I feel that making the PubsubMessage mimic the protobuf definition
makes more sense (and we can add the publish_time attribute as well). In
addition, due to the handling of Pub/Sub in Dataflow, the message_id is
currently not exposed, so at that point it is simply an empty attribute. I'm
thinking of closing this pull request and updating the issue with a dependency
on the Dataflow worker being amended (unfortunately I'm even worse at Java
than Python!)
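
For illustration only, a minimal sketch of the approach described above:
letting PubsubMessage mirror the protobuf definition and carry message_id and
publish_time as separate fields rather than folding them into the attributes
dict. The constructor signature and defaults here are assumptions for the
sketch, not the shape of any merged Beam API.

from google.cloud import pubsub


class PubsubMessage(object):

  def __init__(self, data, attributes, message_id=None, publish_time=None):
    self.data = data
    self.attributes = attributes
    # Assumed extra fields: populated only when reading from Pub/Sub,
    # ignored when publishing.
    self.message_id = message_id
    self.publish_time = publish_time

  @staticmethod
  def _from_proto_str(proto_msg):
    msg = pubsub.types.pubsub_pb2.PubsubMessage()
    msg.ParseFromString(proto_msg)
    # Convert ScalarMapContainer to dict.
    attributes = dict((key, msg.attributes[key]) for key in msg.attributes)
    return PubsubMessage(msg.data, attributes,
                         message_id=msg.message_id,
                         publish_time=msg.publish_time)

This keeps the attributes dict exactly as published by the user, while a
Dataflow worker change would still be needed before message_id is actually
populated at runtime.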
----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]
Issue Time Tracking
-------------------
Worklog Id: (was: 291824)
Time Spent: 3.5h (was: 3h 20m)
> PubsubMessage message parsing is lacking non-attribute fields
> -------------------------------------------------------------
>
> Key: BEAM-7819
> URL: https://issues.apache.org/jira/browse/BEAM-7819
> Project: Beam
> Issue Type: Bug
> Components: io-py-gcp
> Reporter: Ahmet Altay
> Assignee: Udi Meiri
> Priority: Major
> Time Spent: 3.5h
> Remaining Estimate: 0h
>
> User reported issue:
> https://lists.apache.org/thread.html/139b0c15abc6471a2e2202d76d915c645a529a23ecc32cd9cfecd315@%3Cuser.beam.apache.org%3E
> """
> Looking at the source code, with my untrained python eyes, I think if the
> intention is to include the message id and the publish time in the attributes
> attribute of the PubSubMessage type, then the protobuf mapping is missing
> something:-
> @staticmethod
> def _from_proto_str(proto_msg):
>   """Construct from serialized form of ``PubsubMessage``.
>
>   Args:
>     proto_msg: String containing a serialized protobuf of type
>       https://cloud.google.com/pubsub/docs/reference/rpc/google.pubsub.v1#google.pubsub.v1.PubsubMessage
>
>   Returns:
>     A new PubsubMessage object.
>   """
>   msg = pubsub.types.pubsub_pb2.PubsubMessage()
>   msg.ParseFromString(proto_msg)
>   # Convert ScalarMapContainer to dict.
>   attributes = dict((key, msg.attributes[key]) for key in msg.attributes)
>   return PubsubMessage(msg.data, attributes)
> The protobuf definition is here:-
> https://cloud.google.com/pubsub/docs/reference/rpc/google.pubsub.v1#google.pubsub.v1.PubsubMessage
> and so it looks as if the message_id and publish_time are not being parsed as
> they are separate from the attributes. Perhaps the PubsubMessage class needs
> expanding to include these as attributes, or they would need adding to the
> dictionary for attributes. This would only need doing for the _from_proto_str
> as obviously they would not need to be populated when transmitting a message
> to PubSub.
> My python is not great, I'm assuming the latter option would need to look
> something like this?
>   attributes = dict((key, msg.attributes[key]) for key in msg.attributes)
>   attributes.update({'message_id': msg.message_id,
>                      'publish_time': msg.publish_time})
>   return PubsubMessage(msg.data, attributes)
> """
--
This message was sent by Atlassian JIRA
(v7.6.14#76016)