Re: Announcing Delta Lake 0.3.0

2019-08-02 Thread Gourav Sengupta
Yah!

celebrations, wine, cocktails, parties, dances tonight :)

Regards,
Gourav

On Fri, Aug 2, 2019 at 2:44 AM Tathagata Das  wrote:

> Hello everyone,
>
> We are excited to announce the availability of Delta Lake 0.3.0 which
> introduces new programmatic APIs for manipulating and managing data in
> Delta Lake tables.
>
> Here are the main features:
>
>
>    - Scala/Java APIs for DML commands - You can now modify data in Delta
>      Lake tables using programmatic APIs for *Delete*, *Update*, and
>      *Merge*. These APIs mirror the syntax and semantics of their
>      corresponding SQL commands and are great for many workloads, e.g.,
>      Slowly Changing Dimension (SCD) operations, merging change data for
>      replication, and upserts from streaming queries. See the
>      documentation for more details.
>
>    - Scala/Java APIs for querying commit history - You can now query a
>      table’s commit history to see what operations modified the table.
>      This enables you to audit data changes, run time travel queries
>      against specific versions, and debug and recover data from
>      accidental deletions. See the documentation for more details.
>
>    - Scala/Java APIs for vacuuming old files - Delta Lake uses MVCC to
>      enable snapshot isolation and time travel. However, keeping all
>      versions of a table forever can be prohibitively expensive. Stale
>      snapshots (as well as other uncommitted files from aborted
>      transactions) can be garbage collected by vacuuming the table. See
>      the documentation for more details.
>
>
> To try out Delta Lake 0.3.0, please follow the Delta Lake Quickstart:
> https://docs.delta.io/0.3.0/quick-start.html
>
> To view the release notes:
> https://github.com/delta-io/delta/releases/tag/v0.3.0
>
> We would like to thank all the community members for contributing to this
> release.
>
> TD
>
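For readers who want a feel for the new calls before opening the docs, here
is a minimal Scala sketch of the DeltaTable APIs announced above. The table
path, column names, sample data, and package coordinate are illustrative
assumptions, so treat this as a sketch rather than the official examples:

    import io.delta.tables._
    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    // SparkSession with Delta Lake on the classpath (assumption: launched
    // with --packages io.delta:delta-core_2.11:0.3.0 or equivalent).
    val spark = SparkSession.builder().appName("delta-0.3.0-sketch").getOrCreate()
    import spark.implicits._

    // Load an existing Delta table by path (the path is an assumption).
    val deltaTable = DeltaTable.forPath(spark, "/tmp/delta-table")

    // Delete: remove rows matching a predicate.
    deltaTable.delete(col("eventDate") < "2019-01-01")

    // Update: rewrite matching rows in place (e.g., fix a typo in eventType).
    deltaTable.updateExpr(
      "eventType = 'clck'",
      Map("eventType" -> "'click'"))

    // Merge: upsert rows from a DataFrame of changes into the table.
    val updatesDF = Seq((1L, "a"), (2L, "b")).toDF("id", "value")
    deltaTable.as("t")
      .merge(updatesDF.as("u"), "t.id = u.id")
      .whenMatched()
      .updateExpr(Map("value" -> "u.value"))
      .whenNotMatched()
      .insertExpr(Map("id" -> "u.id", "value" -> "u.value"))
      .execute()

    // Commit history: inspect the last 10 operations on the table.
    deltaTable.history(10).show()

    // Vacuum: garbage-collect files no longer referenced by the table and
    // older than the default retention period.
    deltaTable.vacuum()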


Re: How to get logging right for Spark applications in the YARN ecosystem

2019-08-02 Thread raman gugnani
Hi Srinath,

I am not able to use log4j 2, and a rolling file appender that compresses
the rotated files is only supported in log4j 2.
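
For anyone who can run with log4j 2 on the cluster, a rough properties
sketch of a size-based rolling appender that gzips each rolled file is
below; the appender name, the Spark-on-YARN log directory property, and the
size/retention values are assumptions to adapt, not a drop-in configuration:

    # log4j 2 properties: roll by size and gzip each rolled file
    # (the .gz suffix in filePattern is what triggers compression).
    appender.rolling.type = RollingFile
    appender.rolling.name = rolling
    appender.rolling.fileName = ${sys:spark.yarn.app.container.log.dir}/spark.log
    appender.rolling.filePattern = ${sys:spark.yarn.app.container.log.dir}/spark-%d{yyyy-MM-dd}-%i.log.gz
    appender.rolling.layout.type = PatternLayout
    appender.rolling.layout.pattern = %d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
    appender.rolling.policies.type = Policies
    appender.rolling.policies.size.type = SizeBasedTriggeringPolicy
    appender.rolling.policies.size.size = 128MB
    appender.rolling.strategy.type = DefaultRolloverStrategy
    appender.rolling.strategy.max = 10

    rootLogger.level = info
    rootLogger.appenderRef.rolling.ref = rolling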

On Fri, 2 Aug 2019 at 15:48, Girish bhat m  wrote:

> Hi Raman,
>
> Since you are using YARN, you can collect the YARN logs (which also
> contain the application logs) by executing the command below and moving
> them to the desired location:
>
> yarn logs -applicationId 
>
> Best
> Girish
>
> On Fri, Aug 2, 2019 at 10:17 AM Srinath C  wrote:
>
>> Hi Raman,
>>
>> Probably use the rolling file appender in log4j to compress the rotated
>> log file?
>>
>> Regards.
>>
>>
>> On Fri, Aug 2, 2019 at 12:47 AM raman gugnani 
>> wrote:
>>
>>> Hi,
>>>
>>> I am looking for the right solution for logging the output produced by
>>> the executors. In most places I have seen logging configured through
>>> log4j properties, but nowhere have I seen a solution where the logs are
>>> compressed.
>>>
>>> Is there any way I can compress the logs, so that they can then be
>>> shipped to S3?
>>>
>>> --
>>> Raman Gugnani
>>>
>>
>
> --
> Girish bhat m
>
>


-- 
Raman Gugnani


Re: How to get logging right for Spark applications in the YARN ecosystem

2019-08-02 Thread Girish bhat m
Hi Raman,

Since you are using YARN, you can collect the YARN logs (which also contain
the application logs) by executing the command below and moving them to the
desired location:

yarn logs -applicationId 

Best
Girish
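
Building on the command above, a rough shell sketch for getting the
aggregated logs compressed and into S3; the application id placeholder, the
bucket name, and the use of the AWS CLI are assumptions for illustration:

    # Pull the aggregated YARN logs for a finished application, gzip them,
    # and copy the archive to S3 (placeholders in angle brackets).
    yarn logs -applicationId <application_id> | gzip > <application_id>.log.gz
    aws s3 cp <application_id>.log.gz s3://<my-log-bucket>/spark-logs/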

On Fri, Aug 2, 2019 at 10:17 AM Srinath C  wrote:

> Hi Raman,
>
> Probably use the rolling file appender in log4j to compress the rotated
> log file?
>
> Regards.
>
>
> On Fri, Aug 2, 2019 at 12:47 AM raman gugnani 
> wrote:
>
>> Hi,
>>
>> I am looking for the right solution for logging the output produced by
>> the executors. In most places I have seen logging configured through
>> log4j properties, but nowhere have I seen a solution where the logs are
>> compressed.
>>
>> Is there any way I can compress the logs, so that they can then be
>> shipped to S3?
>>
>> --
>> Raman Gugnani
>>
>

-- 
Girish bhat m