The Matomo (Piwik) Server Log Analytics script described on this page is a Python script that parses server log files and records the data in the Matomo MySQL database. You can then visualize useful and interesting reports from the data mined from your logs in the standard Matomo user interface (visits, page views, downloads, keywords, referring websites, etc.).
Using Matomo to track and report your server logs gives you access to many useful features, such as: 30+ web statistics reports, IP filters and exclusions, real-time reports, high performance, and so much more!
There are many cases in which it can be useful to use the information contained in your server access logs and import this data into Matomo (Piwik).
For example, in the following use cases, analyzing server logs with Matomo is desired:
- if you want to track and report the activity on a particular server or set of servers (for system administration purposes, QA, debugging, dealing with spammers, etc.)
- if you have weeks or months of server logs that you wish to import for historical analysis: Matomo will let you import months of server access logs that you can then visualize
And more! Please let us know if you use this script with another use case and we can add it to the list.
Each tracking method has its own advantages and characteristics. Log files are already available, so it makes sense to make the most of this information:
- log files contain search engine bot and spam bot requests, as well as requests for static images and CSS files
- log files contain the requests of users with ad blockers (Ghostery, AdBlock, Adblock Plus, Privacy Badger, etc.)
Can I import data using Matomo Server Log Analytics and the standard Matomo JavaScript tracker at the same time?
- create a new website in Matomo, e.g. with the name “Example.org (log files)”.
- note the idsite of this new website. You will import your log file data into this website ID.
- on the command line, force all requests from the log files to be recorded in this website ID via the importer's --idsite option (for example --idsite=2).
The first time you run Log Analytics, you may import a lot of historical data, possibly months or years of past log data. Once this data has been imported, run this command once to archive all historical report data:
./console core:archive --force-all-websites --url=http://example/matomo
After the initial log data import, you will likely import log files into Matomo (Piwik) hourly or daily.
Put the following command in a cron job so that archives are processed after the logs are imported hourly or daily:
./console core:archive --url=http://example/piwik/
See also: How to setup reports auto archiving
If you are running Matomo 3.x
Run the following command instead after importing your logs to archive all historical data:
./console core:archive --force-all-websites --force-all-periods=315576000 --force-date-last-n=1000 --url=http://example/matomo
Log Analytics lets you import any web server log file. In this FAQ we will focus on one particular type of logs that you may find useful to import in Matomo: the Matomo (Piwik) tracking API logs.
What are Matomo Tracking API logs?
When you track your websites with Matomo, every tracking request is sent to the matomo.php (or piwik.php) Tracking API endpoint. If you use one of the Tracking API clients to measure your mobile apps, games, or desktop apps, they will also send requests to matomo.php (or piwik.php). Your web server handling those requests will create access log files containing the tracking data that Matomo will collect in your database.
Here is what an example access log line looks like:
220.127.116.11 - - [03/Feb/2020:16:40:31 +1300] "GET /matomo.php?idsite=1&rec=1&urlref=https://www...................... HTTP/1.1" 200 256 "-" "Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:72.0) Gecko/20100101 Firefox/72.0"
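Such a line can be picked apart with a few lines of Python. This is a minimal sketch, not part of the importer; the log line below is a shortened, hypothetical variant of the example above, showing where the tracking parameters live:

```python
from urllib.parse import urlparse, parse_qs

# Shortened, hypothetical variant of the access log line shown above.
line = ('220.127.116.11 - - [03/Feb/2020:16:40:31 +1300] '
        '"GET /matomo.php?idsite=1&rec=1&urlref=https://www.example.org/ HTTP/1.1" '
        '200 256 "-" "Mozilla/5.0 ..."')

# The request line is the first quoted field of the log entry.
request = line.split('"')[1]          # 'GET /matomo.php?... HTTP/1.1'
path_and_query = request.split()[1]   # '/matomo.php?idsite=1&...'
params = parse_qs(urlparse(path_and_query).query)

print(params['idsite'])  # ['1']  - which website the hit belongs to
print(params['rec'])     # ['1']  - rec=1 marks a recordable tracking request
```

Every query-string parameter of such a request is a Tracking API parameter that Log Analytics can replay.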
Uses of replaying logs
Replaying logs is very useful, for example, when your database server breaks down and Matomo could not write the data for a few hours. Luckily, you can take the web server log lines matching matomo.php (or /piwik.php) and replay them into Matomo. Replaying logs means that the Log Analytics tool will go through each line of the log and import it into your Matomo for the correct date and time in the past. Replaying logs is also useful if you want to set up a high-availability Matomo.
How to replay Tracking API logs? Steps to follow
1) Firstly, prepare a log file containing only the requests that should be imported. Typically you would import only a given period of time. In these logs, all the request URLs would start with piwik.php. These are the requests we can replay next.
2) Secondly, make sure all requests in the file are sorted chronologically, i.e. ordered by the date-time field. This is especially important when you have merged data from different log files.
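Steps 1 and 2 above can be sketched in Python. This is an illustrative example only (the log lines are made up and kept in memory; a real run would read from and write to files), but it shows the filtering and the chronological sort on the bracketed timestamp:

```python
from datetime import datetime

# Hypothetical merged log lines; a real run would read/write files.
merged_logs = [
    'a - - [04/Feb/2020:09:00:00 +1300] "GET /piwik.php?idsite=1&rec=1 HTTP/1.1" 200 10',
    'b - - [03/Feb/2020:16:40:31 +1300] "GET /index.html HTTP/1.1" 200 512',
    'c - - [03/Feb/2020:16:40:31 +1300] "GET /piwik.php?idsite=1&rec=1 HTTP/1.1" 200 10',
]

def is_tracking_request(line: str) -> bool:
    # Step 1: keep only GET hits on the Tracking API endpoint.
    return '"GET /piwik.php' in line or '"GET /matomo.php' in line

def log_timestamp(line: str) -> datetime:
    # Step 2: parse the bracketed timestamp, e.g. [03/Feb/2020:16:40:31 +1300]
    stamp = line.split('[', 1)[1].split(']', 1)[0]
    return datetime.strptime(stamp, '%d/%b/%Y:%H:%M:%S %z')

replay_ready = sorted(filter(is_tracking_request, merged_logs), key=log_timestamp)
print([line[0] for line in replay_ready])  # ['c', 'a'] - tracking hits, oldest first
```

Note that a plain text sort is not enough here: "04/Feb" sorts before "03/Mar" alphabetically but not chronologically, which is why the timestamp is parsed before sorting.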
3) Finally, you can replay the Tracking API logs by calling the log analytics importer with the --replay-tracking parameter, for example:
./misc/log-analytics/import_logs.py --url=piwik.example.net --replay-tracking /var/log/apache2/access.log
4) After replaying the logs, it is recommended to reprocess the data with the core:archive console command.
Once this is completed, congratulations: you have now recovered your missing web analytics data!
Limitations of logs replay
When replaying the logs, most of your log data will be replayed as expected (visits, page views, goals, ecommerce transactions, etc.), but a few tracking requests may not be replayed: specifically, any log entries which are POST requests will not be replayed, because the POST request parameters are not stored in the access log files.
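To see why, note that a standard access log records only the request line, never the request body. In this hypothetical POST entry there is simply nothing left to replay:

```python
# Hypothetical POST entry: the request line carries no query string, because
# the tracking parameters travelled in the POST body, which standard
# Apache/Nginx access logs do not record.
post_line = '1.2.3.4 - - [03/Feb/2020:16:40:31 +1300] "POST /matomo.php HTTP/1.1" 200 256'
request_target = post_line.split('"')[1].split()[1]
print(request_target)  # '/matomo.php' - no idsite, rec, url, ... left to replay
```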
When using Log Analytics to import your logs into Matomo (Piwik), and your Matomo server is protected by HTTP authentication (basic HTTP access authentication), you can specify the username and password using the following parameters: