'. If you execute it as is,
> it fails for circularity. Both are bad, so it's just disallowed.
> Just fix your code?
>
> On Mon, Dec 13, 2021 at 11:27 AM Daniel de Oliveira Mantovani <
> daniel.oliveira.mantov...@gmail.com> wrote:
>
>> I've reduced the code
a good
> reason to do that other than it's what you do now.
> I'm not clear if it's coming across that this _can't_ work in the general
> case.
>
> On Mon, Dec 13, 2021 at 11:03 AM Daniel de Oliveira Mantovani <
> daniel.oliveira.mantov...@gmail.com> wrote:
probably easy to do that, so you don't want to do that. You want
> different names for different temp views, or else ensure you aren't doing
> the kind of thing shown in the SO post. You get the problem, right?
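
A sketch of the suggested fix, assuming an ambient SparkSession named `spark`
(view names and data are invented):

// Give each derived view a distinct name instead of redefining a view in
// terms of itself.
val df = spark.range(5).toDF("id")
df.createOrReplaceTempView("tv_base")
spark.table("tv_base").filter("id > 1").createOrReplaceTempView("tv_filtered")
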
>
> On Mon, Dec 13, 2021 at 10:43 AM Daniel de Oliveira Mantova
ly explicitly disallowed in all cases now, but you
> should not be depending on this anyway - why can't this just be avoided?
>
> On Mon, Dec 13, 2021 at 10:06 AM Daniel de Oliveira Mantovani <
> daniel.oliveira.mantov...@gmail.com> wrote:
>
>> Sean,
>>
sion, which doesn't quite make sense - somewhere the new
>>> definition depends on the old definition. I think it just correctly
>>> surfaces as an error now.
>>>
>>> On Mon, Dec 13, 2021 at 9:41 AM Daniel de Oliveira Mantovani <
>>> daniel.oliveira.
ooks 'valid' - you define a temp view in terms of its own
> previous version, which doesn't quite make sense - somewhere the new
> definition depends on the old definition. I think it just correctly
> surfaces as an error now.
>
> On Mon, Dec 13, 2021 at 9:41 AM Daniel
Hello team,
I've found this issue while I was porting my project from Apache Spark
3.1.x to 3.2.x.
https://stackoverflow.com/questions/69937415/spark-3-2-0-the-different-dataframe-createorreplacetempview-the-same-name-tempvi
Do we have a bug for that in apache-spark, or do I need to create one?
Thanks
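
A minimal sketch of the failing pattern from the SO post, assuming an ambient
SparkSession named `spark` (view name and data are invented):

// Works on 3.1.x, fails on 3.2.x: the replacement definition of "tv" reads
// from "tv" itself, which 3.2 rejects as a circular (recursive) view.
val df = spark.range(5).toDF("id")
df.createOrReplaceTempView("tv")
spark.table("tv").filter("id > 1").createOrReplaceTempView("tv")
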
Hi Spark Team,
I've written a library for Apache Spark that flattens JSON/Avro/Parquet/XML
using a DSL (Domain-Specific Language). You don't even need to write the DSL
by hand; you can generate it as well :)
I've written an article that teaches how to use it:
https://medium.com/@danielmanto
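
For readers new to the idea, a sketch of what "flattening" means in plain
Spark, not the library's DSL (the nested schema here is invented):

// Explode an array of structs and promote its fields to top-level columns.
import org.apache.spark.sql.functions.{col, explode}

val nested = spark.read.json("people.json") // hypothetical nested input
val flat = nested
  .withColumn("order", explode(col("orders"))) // one row per array element
  .select(col("name"), col("order.id"), col("order.total"))
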
The driver transforms the queries emitted by applications and converts
them into an equivalent form in HiveQL.
Try to change the "NativeQuery" parameter and see if it works :)
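
A sketch of where that parameter goes, assuming the Cloudera/Simba spelling
UseNativeQuery; the host, port, table, and driver class are placeholders:

// Toggle the driver's query translation via the JDBC URL.
val df = spark.read
  .format("jdbc")
  .option("url", "jdbc:hive2://hive-host:10000/default;UseNativeQuery=1")
  .option("dbtable", "some_table")
  .option("driver", "com.cloudera.hive.jdbc41.HS2Driver") // assumed class name
  .load()
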
On Tue, Jul 20, 2021 at 1:26 PM Daniel de Oliveira Mantovani <
daniel.oliveira.mantov...@gmail.com> wrote:
> Ins
Badrinath is trying to write to a Hive instance in a cluster where he doesn't
have permission to submit Spark jobs, and he doesn't have Hive/Spark metadata
access.
The only way to communicate with this third-party Hive cluster is through the
JDBC protocol.
[ Cloudera Data Hub - Hive Server] <-> [Spark Standalone]
Did you include Apache Spark dependencies in your build? If you did, you
should remove them. If you are using sbt, all Spark dependencies should be
marked as "provided".
On Wed, Jun 2, 2021 at 10:11 AM Kanchan Kauthale <
kanchankauthal...@gmail.com> wrote:
> Hello Sean,
>
> Please find below the stack tra
In my opinion this should be part of the official documentation. Amazing
work, Zhou Yang.
On Wed, Nov 25, 2020 at 5:45 AM Zhou Yang wrote:
> Hi all,
>
> I found the solution through the source code. Appending the --conf key-value
> pairs into `sparkProperties` works.
> For example:
>
> ./spark-submit \
> --conf
Is it possible to give options when reading semi-structured files using SQL
syntax, like in the example below?
"SELECT * FROM csv.`file.csv`"
For example, if I want to set header=true. Is it possible?
Thanks
--
--
Daniel Mantovani
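
A sketch of one common workaround, using Spark SQL's CREATE TEMPORARY VIEW
... USING form, since the path-literal syntax takes no options (the file name
is a placeholder):

// Declare a temp view with read options, then query it with plain SQL.
spark.sql("""
  CREATE TEMPORARY VIEW my_csv
  USING csv
  OPTIONS (path 'file.csv', header 'true')
""")
spark.sql("SELECT * FROM my_csv").show()
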
d these useful.
>
>
> https://ask.streamsets.com/question/7/how-do-you-configure-a-hive-impala-jdbc-driver-for-data-collector/?answer=8#post-id-8
>
> On Thu, Jul 9, 2020 at 11:28 AM Daniel de Oliveira Mantovani <
> daniel.oliveira.mantov...@gmail.com> wrote:
>
>> One of my
download the driver from Cloudera here:
https://www.cloudera.com/downloads/connectors/hive/jdbc/2-6-1.html
On Tue, Jul 7, 2020 at 12:03 AM Daniel de Oliveira Mantovani <
daniel.oliveira.mantov...@gmail.com> wrote:
> Hello Gabor,
>
> I meant, third-party connector* not "connectio
Hi Teja,
To access Hive 3 using Apache Spark 2.x.x you need to use this connector
from Cloudera
https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.5/integrating-hive/content/hive_hivewarehouseconnector_for_handling_apache_spark_data.html
.
It has many limitations. You can only write to Hive ma
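
A minimal sketch of the connector's usage, following the API shown in
Cloudera's docs; the table name is a placeholder:

// Reads and writes go through HiveServer2 Interactive rather than Spark's
// own Hive metastore client.
import com.hortonworks.hwc.HiveWarehouseSession

val hive = HiveWarehouseSession.session(spark).build()
hive.executeQuery("SELECT * FROM managed_table").show()
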
> Not sure what you mean by third-party connection, but AFAIK there is no
> workaround at the moment.
>
> BR,
> G
>
>
> On Mon, Jul 6, 2020 at 12:09 PM Daniel de Oliveira Mantovani <
> daniel.oliveira.mantov...@gmail.com> wrote:
>
>> Hello List,
>>
Hello List,
Is it possible to access Hive 2 through JDBC with Kerberos authentication
from the Apache Spark JDBC interface? If it is, do you have an example?
I found these tickets on JIRA:
https://issues.apache.org/jira/browse/SPARK-12312
https://issues.apache.org/jira/browse/SPARK-31815
Do
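
For context, a sketch of the mechanism those tickets track: keytab/principal
options on the JDBC reader, added in later Spark releases. Whether a built-in
connection provider covers Hive depends on the version; every value below is
a placeholder:

// Kerberos-authenticated JDBC read via the keytab/principal options.
val df = spark.read
  .format("jdbc")
  .option("url", "jdbc:hive2://hive-host:10000/default;principal=hive/_HOST@EXAMPLE.COM")
  .option("dbtable", "some_table")
  .option("keytab", "/path/to/user.keytab")
  .option("principal", "user@EXAMPLE.COM")
  .load()
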
--
--
Daniel de Oliveira Mantovani
Perl Evangelist/Data Hacker
+1 786 459 1341
iver.writeAheadLog.enable", "true")
>
> val ssc = new StreamingContext(sparkConf, batchDuration)
> ssc.checkpoint(checkpointDir)
> ssc.remember(Minutes(1))
>
> These lines are not doing anything for Structured Streaming.
>
>
> Best,
> Burak
>
> On T
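
For comparison, a sketch of the Structured Streaming counterpart, where
checkpointing is configured per query; the path is a placeholder and `df`
stands for any streaming DataFrame:

// Checkpointing is set via checkpointLocation on the query, not on a
// StreamingContext.
val query = df.writeStream
  .format("console")
  .option("checkpointLocation", "/tmp/checkpoints/my-query")
  .start()
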
s.select("key")
noAggDF
.writeStream
.format("console")
.start()
But I'm having the error:
http://paste.scsys.co.uk/565658
How do I get my messages using the "kafka" format in Structured Streaming?
Thank you
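
A sketch of the standard Kafka source wiring; it requires the
spark-sql-kafka package, and the broker and topic are placeholders:

// Kafka key/value arrive as binary; cast them to strings to see the messages.
val kafkaDF = spark.readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("subscribe", "my-topic")
  .load()
val messages = kafkaDF.selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)")
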
--
--
Daniel de Oliveira Mantovani
Perl Evangelist/Data Hacker
+1 786 459 1341
the message itself and not an
object with the RabbitMQ message structure, which would include the
delivery tag. I really need the delivery tag to write an efficient and safe
reader.
Does someone know how to get the delivery tag? Or should I use another
library to read from RabbitMQ?
Thank you
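
For reference, a sketch of how the plain RabbitMQ Java client (amqp-client
5.x) exposes the delivery tag, which a custom receiver could forward; the
host and queue name are placeholders:

import com.rabbitmq.client.{CancelCallback, ConnectionFactory, DeliverCallback}

val factory = new ConnectionFactory()
factory.setHost("localhost")
val channel = factory.newConnection().createChannel()

// The envelope on each delivery carries the tag needed for manual acks.
val onDeliver: DeliverCallback = (consumerTag, delivery) => {
  val tag = delivery.getEnvelope.getDeliveryTag
  channel.basicAck(tag, false)
}
val onCancel: CancelCallback = consumerTag => ()

channel.basicConsume("my-queue", false, onDeliver, onCancel)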