[
https://issues.apache.org/jira/browse/SPARK-8842?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Hyukjin Kwon updated SPARK-8842:
--------------------------------
Labels: bulk-closed (was: )
> Spark SQL - Insert into table Issue
> -----------------------------------
>
> Key: SPARK-8842
> URL: https://issues.apache.org/jira/browse/SPARK-8842
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.4.0, 2.0.1
> Reporter: James Greenwood
> Priority: Major
> Labels: bulk-closed
>
> I am running Spark 1.4 and am experiencing an issue when inserting
> data into a table. The data is loaded into an initial table, then selected
> from that table, processed, and inserted into a second table. Some of the
> data goes missing from the second table when running in a multi-worker
> configuration (a master, a worker on the master, and a worker on a
> different host).
> I have narrowed the problem down to the insert into the second table. An
> example procedure that reproduces the problem is below.
> Generate a file (for example /home/spark/test) with the numbers 1 to 50 on
> separate lines.
> spark-sql --master spark://spark-master:7077 --hiveconf
> hive.metastore.warehouse.dir=/spark
> (/spark is shared between all hosts)
> create table test(field string);
> load data inpath '/home/spark/test' into table test;
> create table processed(field string);
> from test insert into table processed select *;
> select * from processed;
> The result from the final select does not contain all the numbers 1 to 50.
> I have also run the above example in some different configurations:
> - With just one worker, running on the master: the final select returns
> rows 1-50, i.e. all data, as expected.
> - With just one worker, running on a host other than the master: the
> final select returns no rows.
> No errors are logged in the log files.
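The reproduction steps quoted above can be collected into a single script sketch. This is not from the original report: the input is written to /tmp (substituting the report's /home/spark/test path for portability), and the actual spark-sql invocation is left commented out because it requires the multi-worker standalone cluster and shared /spark warehouse described by the reporter.

```shell
#!/bin/sh
# Sketch of the reported reproduction, assuming a Spark standalone cluster
# at spark://spark-master:7077 with hive.metastore.warehouse.dir=/spark
# shared across all hosts. Paths here are substitutes, not the originals.

# Input file: the numbers 1 to 50, one per line
# (the report used /home/spark/test).
seq 1 50 > /tmp/test_input

# The SQL from the report, written to a script file for spark-sql -f.
# "FROM test INSERT INTO TABLE processed SELECT *" is HiveQL's
# multi-insert form of INSERT INTO ... SELECT.
cat > /tmp/repro.sql <<'EOF'
CREATE TABLE test(field STRING);
LOAD DATA INPATH '/tmp/test_input' INTO TABLE test;
CREATE TABLE processed(field STRING);
FROM test INSERT INTO TABLE processed SELECT *;
SELECT * FROM processed;
EOF

# Commented out: requires the running cluster described in the report.
# spark-sql --master spark://spark-master:7077 \
#   --hiveconf hive.metastore.warehouse.dir=/spark \
#   -f /tmp/repro.sql
```

Per the report, the final SELECT should return all 50 rows, but under the multi-worker configuration some rows are missing.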
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]