sahnib commented on code in PR #45674:
URL: https://github.com/apache/spark/pull/45674#discussion_r1552920644
##########
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/StateTypesEncoderUtils.scala:
##########
@@ -89,14 +124,29 @@ class StateTypesEncoder[GK, V](
val value = rowToObjDeserializer.apply(reusedValRow)
value
}
+
+  /**
+   * Decodes the TTL information from the value row. If the TTL has
+   * not been set (-1L indicates no user-defined value), this method
+   * returns None.
+   */
+ def decodeTtlExpirationMs(row: UnsafeRow): Option[Long] = {
+ val expirationMs = row.getLong(1)
Review Comment:
I have added an assert for `hasTTL`. However, I want to note that at the end
of the microbatch (before `store.commit`), this method will be called multiple
times, once for each candidate row in the secondary index whose TTL has expired.
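To illustrate the pattern under discussion, here is a minimal, hedged sketch in plain Scala (not the actual Spark implementation): it models the decode with a simple case class instead of `UnsafeRow`, asserts `hasTtl` as described above, and treats `-1L` as "no user-defined TTL" per the doc comment. The `ValueRow` type and `hasTtl` parameter are illustrative assumptions.

```scala
// Hypothetical stand-in for the UnsafeRow value schema (value, ttlExpirationMs).
case class ValueRow(value: String, ttlExpirationMs: Long)

// Sketch of the decode-with-assert pattern: fail fast if the state type
// was not configured with TTL, and map the -1L sentinel to None.
def decodeTtlExpirationMs(row: ValueRow, hasTtl: Boolean): Option[Long] = {
  assert(hasTtl, "decodeTtlExpirationMs called on a state value without TTL")
  val expirationMs = row.ttlExpirationMs
  if (expirationMs == -1L) None else Some(expirationMs)
}
```

Since this runs once per expired candidate row at the end of the microbatch, keeping the assert cheap (a plain boolean check) matters more than in a once-per-batch code path.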
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]