[
https://issues.apache.org/jira/browse/SPARK-12741?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15110584#comment-15110584
]
Sasi commented on SPARK-12741:
------------------------------
I checked my DB, which is Aerospike, and I got the same results as my collect().
I'm creating my DataFrame with Aerospark, a connector written by Sasha:
https://github.com/sasha-polev/aerospark/
I'm using the DataFrame actions as described in the sql-programming-guide:
https://spark.apache.org/docs/1.3.0/sql-programming-guide.html
There are two ways to run actions on a DataFrame:
1) The SQL way:
{code}
dataFrame.sqlContext().sql("select count(*) from tbl").collect()[0]
{code}
2) The DataFrame way:
{code}
dataFrame.where("...").count()
{code}
I'm using the DataFrame way, which is simpler to understand and to read in Java code.
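As a quick sanity check, the two results can be compared directly. This is a minimal local-mode sketch (an in-memory range instead of the Aerospike source, and the class name {{CountCheck}} is illustrative), using the Spark 1.5-era Java API:
{code}
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SQLContext;

public class CountCheck {

    // Compares count() with the number of rows collect() returns.
    public static boolean countsAgree() {
        SparkConf conf = new SparkConf().setAppName("count-check").setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);
        SQLContext sqlContext = new SQLContext(sc);

        // Small in-memory DataFrame standing in for the Aerospike-backed one.
        DataFrame df = sqlContext.range(5);

        long counted = df.count();             // the DataFrame way
        long collected = df.collect().length;  // rows actually returned to the driver

        sc.stop();
        return counted == collected;
    }

    public static void main(String[] args) {
        System.out.println(countsAgree());
    }
}
{code}
On a correct data source these two numbers must agree; a mismatch like the one reported would point at the connector's scan/count paths rather than at Spark itself.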
> DataFrame count method return wrong size.
> -----------------------------------------
>
> Key: SPARK-12741
> URL: https://issues.apache.org/jira/browse/SPARK-12741
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 1.5.0
> Reporter: Sasi
>
> Hi,
> I'm updating my report.
> I'm working with Spark 1.5.2 (used to be 1.5.0). I have a DataFrame and two
> methods, one for collecting data and one for counting.
> Method doQuery looks like:
> {code}
> dataFrame.collect()
> {code}
> Method doQueryCount looks like:
> {code}
> dataFrame.count()
> {code}
> I have a few scenarios and their results:
> 1) No data exists in my NoSQL database: count() returns 0 and collect() returns 0.
> 2) 3 rows exist: count() returns 0 and collect() returns 3.
> 3) 5 rows exist: count() returns 2 and collect() returns 5.
> I tried to change the count code to the code below, but got the same results
> as I mentioned above.
> {code}
> // tried both the count and the first-row variants:
> dataFrame.sqlContext().sql("select count(*) from tbl").count()
> dataFrame.sqlContext().sql("select count(*) from tbl").collect()[0]
> {code}
> Thanks,
> Sasi
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)