Hello Matthew,

To better understand the problem: are you trying to copy the
JavaMapOperator into a Spark operator? Or are you trying to copy a
SparkOperator into a new Spark operator, as you did for Java?
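
In case it helps while you clarify: a Spark-side mapping for a custom basic
operator usually mirrors the Java one almost line for line. Below is a rough
sketch of what that class tends to look like. The names MapHackItOperator,
SparkMapHackItOperator and MapHackItMapping are my guesses based on your error
output and your Java copy, and the package/imports follow the usual Wayang
layout, so treat this as a sketch and adjust it to your actual code.

package org.apache.wayang.spark.mapping;

import java.util.Collection;
import java.util.Collections;

import org.apache.wayang.core.mapping.Mapping;
import org.apache.wayang.core.mapping.OperatorPattern;
import org.apache.wayang.core.mapping.PlanTransformation;
import org.apache.wayang.core.mapping.ReplacementSubplanFactory;
import org.apache.wayang.core.mapping.SubplanPattern;
import org.apache.wayang.core.types.DataSetType;
import org.apache.wayang.spark.platform.SparkPlatform;
// plus imports for your own MapHackItOperator and SparkMapHackItOperator classes

/**
 * Maps the custom basic MapHackItOperator to its Spark execution operator,
 * mirroring the structure of the existing Spark MapMapping.
 */
@SuppressWarnings("unchecked")
public class MapHackItMapping implements Mapping {

    @Override
    public Collection<PlanTransformation> getTransformations() {
        return Collections.singleton(new PlanTransformation(
                this.createSubplanPattern(),
                this.createReplacementSubplanFactory(),
                SparkPlatform.getInstance()  // bind the transformation to the Spark platform
        ));
    }

    private SubplanPattern createSubplanPattern() {
        // Match the platform-agnostic basic operator in the Wayang plan.
        final OperatorPattern operatorPattern = new OperatorPattern(
                "maphackit",
                new MapHackItOperator<>(null, DataSetType.none(), DataSetType.none()),
                false
        );
        return SubplanPattern.createSingleton(operatorPattern);
    }

    private ReplacementSubplanFactory createReplacementSubplanFactory() {
        // Replace each matched basic operator with the Spark execution operator.
        return new ReplacementSubplanFactory.OfSingleOperators<MapHackItOperator>(
                (matchedOperator, epoch) -> new SparkMapHackItOperator<>(matchedOperator).at(epoch)
        );
    }
}

Note that the mapping also has to be registered wherever the Spark plugin
collects its mappings (the Mappings class next to the other Spark mappings, or
whichever plugin you hand to the WayangContext). If the operator is mapped but
the plan still cannot be concatenated, the supported input/output channels of
the Spark execution operator would be the next place I would look.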

Best,
Jorge

On Thu, 25 Aug 2022 at 5:56 PM MatthewJ Sanyoto <[email protected]>
wrote:

> Hi,
>
> I have a problem implementing a new custom operator for Spark.
> I am trying to map a new Wayang basic operator to the Spark platform so
> that it executes on Spark.
> As a test, I first tried the Java platform (with only the Java plugin) and
> simply copied the code (MapOperator, Mapping, JavaMapOperator and
> MapMapping) under different operator names, and that worked; doing the
> same for the Spark platform (with only the Spark plugin) does not.
>
> Could someone please explain why I am getting the following error?
> Caused by: org.apache.wayang.core.api.exception.WayangException: No
> implementations that concatenate out@Alternative[2x
> ~Alternative[[MapHackIt[1->1, id=3c947bc5]]], 1e683a3e] with
> [in@Alternative[2x
> ~Alternative[[LocalCallbackSink[1->0, id=69fb6037]]], 6babf3bf]].
>
> Am I perhaps missing something specific to Spark?
>
> Best Regards,
> Matthew
>
