It's in the spark-catalyst_2.11-2.1.1.jar since the logical query plans and
optimization also need to know about types.
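If your build pulls in spark-sql via Maven or sbt, spark-catalyst should normally come along as a transitive dependency; if Eclipse still cannot resolve the class, declaring it explicitly should make the import compile. A sketch of the Maven dependency (coordinates match the jar named above; adjust the Scala suffix and version to your build):

```xml
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-catalyst_2.11</artifactId>
  <version>2.1.1</version>
</dependency>
```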

On Tue, Jun 20, 2017 at 1:14 PM, Jean Georges Perrin <j...@jgp.net> wrote:

> Hey all,
>
> I was giving 2.1.1 a run and got an error in one of my test programs:
>
> package net.jgp.labs.spark.l000_ingestion;
>
> import java.util.Arrays;
> import java.util.List;
>
> import org.apache.spark.sql.Dataset;
> import org.apache.spark.sql.Row;
> import org.apache.spark.sql.SparkSession;
> import org.apache.spark.sql.types.IntegerType;
>
> public class ArrayToDataset {
>
>   public static void main(String[] args) {
>     ArrayToDataset app = new ArrayToDataset();
>     app.start();
>   }
>
>   private void start() {
>     SparkSession spark = SparkSession.builder()
>         .appName("Array to Dataset")
>         .master("local")
>         .getOrCreate();
>
>     Integer[] l = new Integer[] { 1, 2, 3, 4, 5, 6, 7 };
>     List<Integer> data = Arrays.asList(l);
>     Dataset<Row> df = spark.createDataFrame(data, IntegerType.class);
>
>     df.show();
>   }
> }
>
> Eclipse is complaining that it cannot find
> org.apache.spark.sql.types.IntegerType
> and after looking inside spark-sql_2.11-2.1.1.jar, I could not find it
> there either:
>
> I looked at the 2.1.1 release notes as well and did not see anything. The
> package still appears in the Javadoc: https://spark.apache.org/docs/latest/api/java/org/apache/spark/sql/types/package-summary.html
>
> I must be missing something. Any hint?
>
> Thanks!
>
> jg
>
