scwhittle commented on code in PR #28537:
URL: https://github.com/apache/beam/pull/28537#discussion_r1332798651
##########
runners/google-cloud-dataflow-java/worker/src/main/java/org/apache/beam/runners/dataflow/worker/StreamingDataflowWorker.java:
##########
@@ -701,6 +414,71 @@ public void run() {
LOG.debug("maxWorkItemCommitBytes: {}", maxWorkItemCommitBytes);
}
+ /** Returns whether an exception was caused by an {@link OutOfMemoryError}. */
+ private static boolean isOutOfMemoryError(Throwable t) {
+ while (t != null) {
+ if (t instanceof OutOfMemoryError) {
+ return true;
+ }
+ t = t.getCause();
+ }
+ return false;
+ }
+
+ private static MapTask parseMapTask(String input) throws IOException {
+ return Transport.getJsonFactory().fromString(input, MapTask.class);
+ }
+
+ public static void main(String[] args) throws Exception {
Review Comment:
It is referenced by Dataflow's internal launching of the Dataflow v1 Java
worker (not in the GitHub repo).
With Beam portability, runner specifics like this sit behind the Fn API,
but for the v1 harness the Dataflow details leaked somewhat. The worker was
initially in a separate repository, but we kept hitting jar-compatibility
issues, so we moved it into the external Beam repository and a single jar.
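As a side note, the isOutOfMemoryError helper in the diff above works by walking the Throwable cause chain. Here is a minimal standalone sketch of that pattern (the helper body mirrors the diff; the demo class name and inputs are mine):

```java
public class CauseChainDemo {
  /** Returns whether an exception was caused by an OutOfMemoryError (mirrors the diff above). */
  static boolean isOutOfMemoryError(Throwable t) {
    // Walk t, t.getCause(), t.getCause().getCause(), ... until we hit
    // an OutOfMemoryError or run out of causes.
    while (t != null) {
      if (t instanceof OutOfMemoryError) {
        return true;
      }
      t = t.getCause();
    }
    return false;
  }

  public static void main(String[] args) {
    // An OOM wrapped inside a RuntimeException is still detected.
    System.out.println(isOutOfMemoryError(new RuntimeException(new OutOfMemoryError("heap")))); // true
    // A plain exception with no OOM anywhere in its cause chain is not.
    System.out.println(isOutOfMemoryError(new IllegalStateException("plain"))); // false
  }
}
```

One caveat with this pattern in general: getCause() chains constructed by hand can contain cycles, in which case a bare while loop never terminates; utilities such as Guava's Throwables.getCausalChain detect that case explicitly.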
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]