[ https://issues.apache.org/jira/browse/SPARK-25908?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-25908:
------------------------------------

    Assignee: Apache Spark  (was: Sean Owen)

> Remove old deprecated items in Spark 3
> --------------------------------------
>
>                 Key: SPARK-25908
>                 URL: https://issues.apache.org/jira/browse/SPARK-25908
>             Project: Spark
>          Issue Type: Task
>          Components: Spark Core, SQL
>    Affects Versions: 3.0.0
>            Reporter: Sean Owen
>            Assignee: Apache Spark
>            Priority: Major
>
> There are many deprecated methods and classes in Spark. They _can_ be removed 
> in Spark 3, and those that have been deprecated for a long time (i.e. since 
> Spark <= 2.0) probably should be. This change addresses most of those cases, 
> starting with the ones that are both old and easy to remove (a brief migration 
> sketch follows the lists below):
> - Remove some AccumulableInfo .apply() methods
> - Remove non-label-specific multiclass precision/recall/fScore in favor of 
> accuracy
> - Remove toDegrees/toRadians in favor of degrees/radians
> - Remove approxCountDistinct in favor of approx_count_distinct
> - Remove unused Python StorageLevel constants
> - Remove Dataset unionAll in favor of union
> - Remove unused multiclass option in libsvm parsing
> - Remove references to deprecated spark configs like spark.yarn.am.port
> - Remove TaskContext.isRunningLocally
> - Remove ShuffleMetrics.shuffle* methods
> - Remove BaseReadWrite.context in favor of session
> - Remove Column.!== in favor of =!=
> - Remove Dataset.explode
> - Remove Dataset.registerTempTable
> - Remove SQLContext.getOrCreate, setActive, clearActive, constructors
> Not touched yet:
> - everything else in MLLib
> - HiveContext
> - Anything deprecated more recently than 2.0.0, generally
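> For reference, a minimal Scala sketch of what caller code looks like after 
> moving off a few of the removed APIs. The toy DataFrame and the "angles" view 
> name are illustrative only and not part of this change:
> {code:scala}
> import org.apache.spark.sql.SparkSession
> import org.apache.spark.sql.functions.{approx_count_distinct, degrees, radians}
>
> val spark = SparkSession.builder().appName("deprecation-migration").getOrCreate()
> import spark.implicits._
>
> val df = Seq((1, 90.0), (2, 180.0)).toDF("id", "angle")
>
> // Dataset.unionAll is removed; union is the direct replacement
> val doubled = df.union(df)
>
> // approxCountDistinct / toDegrees / toRadians are removed in favor of the
> // snake_case equivalents approx_count_distinct / degrees / radians
> val summary = df.select(approx_count_distinct($"id"), degrees($"angle"), radians($"angle"))
>
> // Column.!== is removed in favor of =!=
> val filtered = df.filter($"id" =!= 1)
>
> // registerTempTable is removed; createOrReplaceTempView is the replacement
> df.createOrReplaceTempView("angles")
> val viaSql = spark.sql("SELECT id FROM angles WHERE angle > 90")
> {code}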



