https://gcc.gnu.org/bugzilla/show_bug.cgi?id=95348

--- Comment #5 from qinzhao at gcc dot gnu.org ---
(In reply to Martin Liška from comment #4)
> Can you please share some statistics how big are the files and how many runs 
> do you merge?

There were on the order of 10,000 processes, source code coverage was
approximately 20%, and the profiling data gathered totaled in the vicinity
of 1 TB.

> Would it be possible to share 'gcov-dump -l' for all your .gcda files?

That is not feasible because there are too many .gcda files: each process
has its own directory, there are over 10,000 directories, and each directory
contains over a thousand .gcda files.
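A rough sketch of the scale involved (the directory layout and names here are hypothetical, just to illustrate the per-process structure described above; the real run has over 10,000 directories):

```shell
# Build a tiny mock of the layout: one directory per process,
# each holding its own .gcda files.
mkdir -p /tmp/gcda-demo/proc-1 /tmp/gcda-demo/proc-2
touch /tmp/gcda-demo/proc-1/a.gcda /tmp/gcda-demo/proc-1/b.gcda
touch /tmp/gcda-demo/proc-2/a.gcda

# Counting the files shows why dumping each one with `gcov-dump -l`
# is impractical at the real scale (>10,000 dirs x >1,000 files each).
find /tmp/gcda-demo -name '*.gcda' | wc -l
```

In the real setup the same `find | wc -l` would report millions of files, which is why sharing per-file `gcov-dump -l` output is not practical.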

The situation is similar to the small test case I added in the first
comment: the functions and modules that never execute still have records in
the .gcda file, just with zero counts.
