[ 
https://issues.apache.org/jira/browse/IMPALA-6180?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Philip Zeyliger resolved IMPALA-6180.
-------------------------------------
    Resolution: Information Provided

It looks to me like Tim provided information about which crashes these are.

> Impala daemon crash because of SIGSEGV (0xb)
> --------------------------------------------
>
>                 Key: IMPALA-6180
>                 URL: https://issues.apache.org/jira/browse/IMPALA-6180
>             Project: IMPALA
>          Issue Type: Bug
>          Components: Backend, Frontend
>    Affects Versions: Impala 2.5.0
>            Reporter: Manikandan R
>            Priority: Critical
>              Labels: crash, planner
>         Attachments: hs_err_pid103943.log, hs_err_pid17706.log, 
> hs_err_pid46091.log, hs_err_pid56887.log, hs_err_pid61874.log, 
> hs_err_pid63443.log, hs_err_pid65146.log, hs_err_pid67457.log, 
> hs_err_pid75901.log, impala_crash_dec01_2_gdb_stacktraces_pid56887.txt, 
> impala_crash_nov14_gdb_stacktraces.txt, 
> impala_crash_nov22_gdb_stacktraces.txt, 
> impala_crash_nov28_gdb_stacktraces_pid103943.txt, 
> impala_crash_nov28_gdb_stacktraces_pid75901.txt, notes.txt, 
> query_1_crash.sql, query_1_crash_explain_2.txt, query_2_crash.sql, 
> query_2_crash_explain_2.txt, query_2_crash_modified_success.sql, 
> stack_traces_core.126504, stack_traces_core.39127, stack_traces_core.50726, 
> stack_traces_core.53980, stack_traces_core.74127, stack_traces_core.75069, 
> stack_traces_core.90005
>
>
> I am using CDH-5.7.6. The daemon is crashing quite often these days. A couple 
> of the incidents are nearly identical: in both cases, there were 4-5 crashes 
> within a span of 15 minutes. Also, this has occurred on all impalads in the 
> cluster, and memory usage (mem_rss) was nearly full on almost all daemons. 
> Attaching the crash reports for debugging purposes. I am trying to understand 
> the queries that ran around that time. Thoughts?



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)