[ https://issues.apache.org/jira/browse/SPARK-19492?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15856320#comment-15856320 ]
Sean Owen commented on SPARK-19492:
-----------------------------------

Yeah, I was mostly establishing that the type had to be inferred. It does. But then I don't see what the difference is between Seq and Dataset in this respect.

> Dataset, filter and pattern matching on elements
> ------------------------------------------------
>
>                 Key: SPARK-19492
>                 URL: https://issues.apache.org/jira/browse/SPARK-19492
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.2, 2.1.0
>            Reporter: Loic Descotte
>            Priority: Minor
>
> It seems it is impossible to use pattern matching to define the input parameter of the filter function on Datasets.
> Example:
> This one works:
> {code}
> val departments = Seq(
>   Department(1, "hr"),
>   Department(2, "it")
> ).toDS
>
> departments.filter { d =>
>   d.name == "hr"
> }
> {code}
> but this one does not:
> {code}
> departments.filter { case Department(_, name) =>
>   name == "hr"
> }
> {code}
> Error:
> {code}
> error: missing parameter type for expanded function
> The argument types of an anonymous function must be fully known. (SLS 8.5)
> Expected type was: ?
> departments.filter{ case Department(_, name)=>
> {code}
> This kind of pattern matching should work (as the element type of the departments dataset is known), just like the filter function on Scala collections, or the RDD filter function, for example.
> Please note that it works with the map function:
> {code}
> departments.map { case Department(_, name) =>
>   name
> }
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
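A plausible explanation for the difference (a sketch, not a confirmed diagnosis of this ticket): `Dataset.filter` is overloaded in Spark 2.x (it accepts a `Column`, a condition `String`, a `T => Boolean`, or a `FilterFunction[T]`), so the compiler has no single expected type for the lambda, and SLS 8.5 requires the argument types of a pattern-matching anonymous function to be fully known. The snippet below reproduces the symptom in plain Scala, with no Spark dependency; `FakeDataset` is a hypothetical stand-in whose overload set mimics the `filter` signatures, and it shows two workarounds that also apply to the real `Dataset`.

{code}
// Stand-in for Dataset[T]: two single-argument filter overloads, like
// Dataset.filter(func: T => Boolean) and Dataset.filter(conditionExpr: String).
case class Department(id: Int, name: String)

class FakeDataset[T](data: Seq[T]) {
  def filter(func: T => Boolean): FakeDataset[T] = new FakeDataset(data.filter(func))
  def filter(conditionExpr: String): FakeDataset[T] = this // stub overload
  def collect(): Seq[T] = data
}

val departments = new FakeDataset(Seq(Department(1, "hr"), Department(2, "it")))

// Does NOT compile -- same error as in the report:
//   "missing parameter type for expanded function"
// departments.filter { case Department(_, name) => name == "hr" }

// Workaround 1: annotate the parameter and match explicitly.
val hr1 = departments.filter((d: Department) => d match {
  case Department(_, name) => name == "hr"
})

// Workaround 2: ascribe the function type, which fixes the expected type
// and disambiguates the overload.
val hr2 = departments.filter(
  { case Department(_, name) => name == "hr" }: Department => Boolean
)
{code}

Note that `map` does not hit this because its single-function-argument overload is the only one that matches a one-argument lambda, so the expected type (and hence the pattern's type) can still be inferred.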