If you extract the lines containing the function names and run them through
a script that counts how many times each function appears, you only have to
store (for example, in a JSON file) the number of times each function was
used. That way you can avoid storing all the logs in a database.
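For example (the log lines and function names below are invented, and I'm
assuming the function name is the last segment of the request path), a small
awk pipeline could do the counting and emit the JSON:

```shell
#!/bin/sh
# Stand-in access-log lines; real nginx entries would come from access.log.
cat > access.log <<'EOF'
1.2.3.4 - - [20/Apr/2017:09:00:01 +0200] "GET /api/login HTTP/1.1" 200 512
1.2.3.4 - - [20/Apr/2017:09:00:02 +0200] "GET /api/report HTTP/1.1" 200 128
5.6.7.8 - - [20/Apr/2017:09:00:03 +0200] "GET /api/login HTTP/1.1" 200 512
EOF

# $7 is the request path in the default combined log format.
# Take its last segment as the "function name", count occurrences,
# and print the counts as one JSON object.
awk '{ n = split($7, p, "/"); counts[p[n]]++ }
     END {
       printf "{"
       sep = ""
       for (f in counts) { printf "%s\"%s\": %d", sep, f, counts[f]; sep = ", " }
       print "}"
     }' access.log > counts.json

cat counts.json
```

Then page.php only has to read counts.json instead of scanning the whole log.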
On Thu, Apr 20, 2017 at 9:29 AM, SARA QUISPE MEJIA <a20132...@pucp.pe>
> Yes, I tried that and then did the analysis with goaccess, but now I need
> to build an interface (page.php) so the user can choose the function they
> want to filter on and then generate a report.
> That is why I thought of putting all my information into a database like MySQL.
> 2017-04-20 9:11 GMT+02:00 oscaretu . <oscar...@gmail.com>:
>> Sara, why don't you process the log file just with grep / pcregrep to get
>> just the lines that contain that function name?
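Something like this, for instance ("doReport" and the sample log lines are
placeholders for your real function name and log file):

```shell
#!/bin/sh
# Placeholder log lines; in practice you would read /var/log/nginx/access.log.
cat > access.log <<'EOF'
1.2.3.4 - - [20/Apr/2017] "GET /app/doReport HTTP/1.1" 200 100
1.2.3.4 - - [20/Apr/2017] "GET /app/doLogin HTTP/1.1" 200 100
EOF

# Keep only the lines that request the function of interest.
grep '/doReport' access.log > doReport.log
wc -l < doReport.log
```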
>> On Thu, Apr 20, 2017 at 9:05 AM, SARA QUISPE MEJIA <a20132...@pucp.pe>
>>> I want to parse the log file with respect to a client, that is, to make a
>>> report of how the client is using my application from the information
>>> provided by the log file.
>>> So I need to filter the URLs where a specific function name appears. For
>>> that, I thought of inserting my log file into a database like MySQL.
>>> Could I do that with my log file? (I use nginx.)
>>> I tried to do it with syslog-ng, but it doesn't work.
>>> Do you have any ideas?
>> Oscar Fernandez Sierra
Oscar Fernandez Sierra
nginx mailing list