Hi all,
We want to apply INSERT, UPDATE, and DELETE operations on tables backed by
Parquet or ORC files, served via thrift2.
It's unclear to us whether we can enable these operations, and where.
At the moment, UPDATE and DELETE operations are blocked when we try to
execute them.
Anyone out who uses ACI
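For reference, Hive only supports full ACID (UPDATE/DELETE) on transactional ORC tables; Parquet tables can at most be declared insert-only. A minimal sketch of the server-side settings the Thrift server would need to pick up, assuming they go in hive-site.xml (compactor thread count is illustrative):

```xml
<!-- hive-site.xml: prerequisites for ACID transactions -->
<property>
  <name>hive.support.concurrency</name>
  <value>true</value>
</property>
<property>
  <name>hive.txn.manager</name>
  <value>org.apache.hadoop.hive.ql.lockmgr.DbTxnManager</value>
</property>
<!-- needed on the metastore side so delta files get compacted -->
<property>
  <name>hive.compactor.initiator.on</name>
  <value>true</value>
</property>
<property>
  <name>hive.compactor.worker.threads</name>
  <value>1</value>
</property>
```

On top of that, the table itself must be declared transactional, e.g. `CREATE TABLE t (...) STORED AS ORC TBLPROPERTIES ('transactional'='true')`; a Parquet table can only get `'transactional'='true', 'transactional_properties'='insert_only'`, which still blocks UPDATE and DELETE.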
Hi Sean,
I have had a more detailed look at what Spark is doing with the log4j APIs, and
at this point I suspect that a log4j 2.x migration might be more appropriate at
the code level.
That still does not solve the libraries issue though. That would need more
investigation.
I could be tempted to tac
Yep, that's what I tried, roughly - there is an old JIRA about it. The issue
is that Spark does need to configure a concrete logging framework in a
few cases, as do other libraries, and that isn't what the shims cover. It could
be possible now, or with more cleverness, but the simple approach didn't work out.
Hi there,
It’s true that the preponderance of log4j 1.2.x in many existing live projects
is kind of a pain in the butt.
But there is a solution.
1. Migrate all Spark code to use slf4j APIs;
2. Exclude log4j 1.2.x from any dependencies sucking it in;
3. Include the log4j-over-slf4j bridge jar.
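Steps 2 and 3 might look roughly like this in a Maven pom.xml (the excluded dependency here is hypothetical; in practice you would add an exclusion to each dependency that transitively pulls in log4j 1.2.x):

```xml
<!-- Step 2: keep log4j 1.2.x off the classpath -->
<dependency>
  <groupId>org.example</groupId>
  <artifactId>some-lib-that-pulls-log4j</artifactId> <!-- placeholder -->
  <version>1.0</version>
  <exclusions>
    <exclusion>
      <groupId>log4j</groupId>
      <artifactId>log4j</artifactId>
    </exclusion>
  </exclusions>
</dependency>

<!-- Step 3: route log4j 1.x API calls into slf4j instead -->
<dependency>
  <groupId>org.slf4j</groupId>
  <artifactId>log4j-over-slf4j</artifactId>
  <version>1.7.36</version>
</dependency>
```

With the bridge on the classpath, code that calls the log4j 1.x API compiles and runs unchanged, but the log events end up in whatever backend slf4j is bound to.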
No plans that I know of. It's not that Spark uses it so much as its
dependencies do. I tried and failed to upgrade it a couple of years ago. You are
welcome to try, and open a PR if successful.
On Tue, Nov 9, 2021 at 6:09 AM Ajay Kumar wrote:
Hi Team,
We wanted to send Spark executor logs to a centralized logging server over a
TCP socket. I see that the Spark log4j version is very old (1.2.17) and it
does not support JSON logs over TCP sockets in containers.
I wanted to know what the plan is for upgrading log4j to log4j 2.
Thanks
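For what it's worth, once on log4j 2.x, shipping JSON logs over TCP could be sketched like this in a log4j2.properties (host and port are placeholders; JsonTemplateLayout assumes the log4j-layout-template-json module is also on the classpath):

```properties
# Hypothetical log4j2.properties for the executors:
# a TCP socket appender emitting JSON-formatted events
appender.socket.type = Socket
appender.socket.name = TcpJson
appender.socket.host = logs.example.internal
appender.socket.port = 5044
appender.socket.protocol = TCP
appender.socket.layout.type = JsonTemplateLayout

rootLogger.level = info
rootLogger.appenderRef.socket.ref = TcpJson
```

This is exactly the kind of setup log4j 1.2.17 cannot express, which is what motivates the upgrade question above.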