Hosting Guide for Matomo and Self‑Hosted Analytics

Analytics is no longer just about tracking pageviews and conversion rates. For many organisations, it is about respecting user privacy, complying with GDPR or KVKK, and keeping sensitive data under their own control. That is exactly where self‑hosted, privacy‑focused tools like Matomo shine. Instead of sending behavioural data to third‑party platforms, you run the analytics stack on your own server, in a data center and jurisdiction you choose. In this guide, we will walk through how to host Matomo and similar self‑hosted analytics tools in a way that is fast, secure and compliant. We will look at server requirements, capacity planning, different hosting options (shared, VPS, dedicated and colocation) and the small but critical settings that make the difference between a slow, bloated installation and a lean, reliable analytics platform. All examples and recommendations are based on what we see every day while running infrastructure at dchost.com.

What Makes Analytics Privacy‑Focused?

From third‑party trackers to first‑party data

Classic web analytics tools are usually third‑party services. Your visitors load a script from an external domain, their actions are sent to that provider, and you get aggregated reports in a dashboard. The downside is clear: tracking scripts from many different domains, complicated consent banners, and user data leaving your control. Privacy‑focused analytics flips this model. Scripts and tracking endpoints live on your own domain, logs and databases stay on your servers, and you decide how long to retain data and how aggressively to anonymise it.

Matomo is one of the best‑known open‑source analytics platforms in this category. It can be run fully on‑premise or on a VPS or dedicated server that you control. It offers IP anonymisation, cookieless tracking modes, and detailed configuration for data retention and consent. When combined with correct hosting architecture, it lets you build a first‑party analytics stack that satisfies both marketing teams and data protection officers.

Key privacy features you should care about

  • Data ownership: All raw data and reports live in your own database; no third‑party has access by default.
  • Configurable IP anonymisation: Mask parts of visitor IPs to reduce personal data while keeping location statistics useful.
  • Cookieless tracking options: Ability to reduce or avoid cookies in certain setups, simplifying consent banners.
  • Flexible data retention: Automatic deletion of old logs and reports to fit KVKK / GDPR requirements.
  • On‑premise jurisdiction control: Choose the country and data center that match your regulatory obligations.

Why Choose Matomo for Self‑Hosted Analytics

Core components and basic architecture

From a hosting perspective, Matomo is a classic PHP web application with a relational database. A typical installation includes:

  • Web server: Apache, Nginx or LiteSpeed serving PHP through PHP‑FPM.
  • PHP runtime: Modern PHP version (8.x recommended) with necessary extensions like PDO, mysqli, gd, mbstring and others.
  • Database: MariaDB or MySQL (Matomo also supports some alternatives, but most deployments use MySQL‑compatible servers).
  • Background jobs: Cron job or systemd timer that regularly runs Matomo archiving to pre‑compute reports.
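
If you are building this stack by hand, it is worth confirming the PHP side before installing Matomo. A minimal sanity check, assuming a Debian/Ubuntu‑style layout with PHP 8.2 (package names differ on other distributions):

```bash
# Confirm the PHP version and the extensions Matomo needs (per the list above)
php -v
php -m | grep -Ei 'pdo|mysqli|gd|mbstring'

# On Debian/Ubuntu, missing extensions can usually be added like this:
apt install php8.2-mysql php8.2-gd php8.2-mbstring
```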

At small scale, all of these live on a single VPS or even a high‑quality shared hosting plan. As traffic grows, you can separate web and database servers, add a dedicated reporting node or archive server, or integrate with external storage for long‑term log retention. Because the stack is simple and well‑understood, it is easy to grow from a basic setup into a robust analytics platform without throwing everything away.

Real‑world deployment patterns we often see

  • Single VPS instance: Nginx + PHP‑FPM + MariaDB on a 2–4 vCPU VPS with NVMe storage. Suitable for most small and medium sites.
  • Web + database split: Matomo PHP frontend on one VPS, database on another, both on fast local or same‑region networks.
  • Central analytics cluster: Agencies and groups run one central Matomo instance tracking dozens of client sites via a shared tag.
  • Log analytics mode: Importing HTTP access logs from many web servers into a larger Matomo reporting server.

This flexibility is one of the reasons we like Matomo in hosting environments. You can start small, on the same infrastructure that already hosts your main site, and later promote analytics to its own VPS or dedicated server when the numbers demand it.

Capacity Planning: CPU, RAM, Disk and Bandwidth for Matomo

Matomo can feel extremely light or surprisingly heavy depending on how many hits you track and how well you size your server. The good news is that sizing is quite predictable if you understand your traffic patterns. If you are not sure about your overall traffic yet, our detailed guide on how to calculate monthly traffic and bandwidth requirements is a great starting point.

What drives resource usage?

  • Number of pageviews / events per month: The main factor for CPU and database load.
  • Number of tracked sites: Tracking many small sites produces a different load profile than one large site with the same total traffic, especially during archiving.
  • Report complexity: Custom segments, long date ranges and e‑commerce tracking increase archiving time.
  • Log retention period: Keeping raw logs for years will grow your database and disk requirements significantly.
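
Before looking at concrete scenarios, it helps to translate monthly pageviews into a request rate. A back‑of‑the‑envelope sketch, using an illustrative 5 million pageviews per month:

```bash
# Average tracking-request rate implied by 5M pageviews/month
awk 'BEGIN { printf "%.1f hits/sec average\n", 5000000 / (30 * 24 * 3600) }'
# Prints roughly 1.9 hits/sec. Real traffic is bursty, so plan for 10-20x
# the average at peak, plus the periodic archiving jobs on top of that.
```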

Example sizing scenarios

These are simplified reference points based on what we see in practice at dchost.com. They assume Matomo is the only major application on the server and that you use modern PHP and database versions.

Small sites and blogs (up to 100,000 pageviews per month)

  • Hosting type: Quality shared hosting or entry‑level VPS.
  • CPU: 1 vCPU (shared) is usually enough.
  • RAM: 1–2 GB.
  • Disk: 10–20 GB on SSD or NVMe.
  • Notes: You can host Matomo on the same account as your main site if resource limits are generous and cron jobs are allowed.

Growing business sites (100,000 – 5 million pageviews per month)

  • Hosting type: VPS is strongly recommended.
  • CPU: 2–4 vCPUs.
  • RAM: 4–8 GB.
  • Disk: 50–200 GB NVMe, depending on retention.
  • Notes: Run Matomo on its own VPS if possible, and consider a separate database server when you pass a few million hits per month.

High‑traffic portals and multi‑client setups (5+ million pageviews per month)

  • Hosting type: Performance VPS, dedicated server or colocation.
  • CPU: 8+ vCPUs (or physical cores) with good single‑thread performance.
  • RAM: 16–64 GB, depending on query complexity and concurrent users.
  • Disk: 500 GB+ NVMe, possibly combined with cheaper storage for historical archives.
  • Notes: At this scale, separating frontend, database and possibly a dedicated archiving node becomes very attractive.

If you want a deeper dive into CPU and RAM planning for application workloads, the general principles in our article on how many vCPUs and how much RAM you really need also apply directly to Matomo.

Disk type matters more than you think

Matomo writes and reads a lot of small records from the database. Latency is more important than raw capacity. In practice, this means:

  • Prefer NVMe SSD for primary database storage wherever possible.
  • Use slower SATA disks or object storage only for archive exports and backups.
  • Monitor disk IOPS and IOwait when your reports feel slow; they are often the bottleneck.
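
A quick look at live disk metrics usually settles whether storage is the problem. A minimal check, assuming the sysstat package is installed:

```bash
# %util near 100 and growing await times point to a saturated disk
iostat -x 5 3

# a consistently high "wa" (iowait) column means the CPU is waiting on I/O
vmstat 5 3
```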

For a deeper understanding of how disk choices impact web applications, see our guide to NVMe SSD vs SATA SSD vs HDD for hosting and backups.

Choosing the Right Hosting: Shared, VPS, Dedicated or Colocation

Shared hosting: only for very small setups

Matomo can technically run on shared hosting, and we have clients who do this successfully. However, you must be realistic about the limits:

  • Background archiving jobs may be restricted or throttled by the host.
  • MySQL limits and concurrent connections can become a problem as you grow.
  • CPU and memory are heavily shared with other users, so peak times may feel slow.

If your site is a personal blog or a small corporate site with modest traffic and you only track a few basic metrics, shared hosting can be an economical starting point. The moment you add e‑commerce tracking, multiple sites, or heavy segmentation, it is time to move Matomo to its own VPS.

VPS hosting: the sweet spot for most Matomo installs

For the majority of real‑world deployments, a VPS is the ideal home for Matomo. You get full control over PHP versions, MySQL tuning, cron jobs and security hardening. Resources are dedicated, so a neighbour cannot slow down your reports. At dchost.com we see a clear pattern: once analytics becomes important to a business, they prefer to move it to a VPS even if the rest of their sites are still on shared hosting.

With a VPS you can:

  • Tune PHP‑FPM and OPcache specifically for Matomo.
  • Adjust MySQL or MariaDB settings for heavy read/write workloads.
  • Schedule archiving jobs during off‑peak hours without hitting shared limits.
  • Apply server‑level hardening like Fail2ban and firewalls without waiting for a panel update.
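
As a concrete starting point for the first item above, here is a hedged OPcache sketch. The path assumes PHP 8.2 with PHP‑FPM on Debian/Ubuntu, and the values are conservative assumptions to adjust against your RAM:

```bash
cat > /etc/php/8.2/fpm/conf.d/99-matomo.ini <<'EOF'
; illustrative OPcache values for a Matomo-only VPS
opcache.enable=1
opcache.memory_consumption=256
opcache.max_accelerated_files=20000
EOF
systemctl reload php8.2-fpm
```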

If you are new to VPS administration, our VPS security hardening checklist and our guide on how to secure a VPS server step‑by‑step will help you start safely.

Dedicated servers and colocation: for central analytics platforms

Some organisations want a single, powerful analytics platform for many properties: dozens of sites, mobile apps, maybe even internal dashboards. In those cases a dedicated server or colocated server in our data center becomes attractive:

  • More predictable performance: Physical cores dedicated entirely to your analytics workloads.
  • Large memory and disk capacity: Keeping several years of anonymised data online is much easier.
  • Network flexibility: Private interconnects, VPNs or VLANs between web frontends and the analytics box.

With colocation, you can even bring your own hardware design: RAID levels, specific NVMe models, hardware encryption cards and more. This is popular with institutions that have strict procurement and compliance rules but want to benefit from a professional data center environment.

Practical Matomo Hosting Architectures

Scenario 1: Single VPS with Matomo and database

This is the simplest production‑ready setup and covers most small to medium projects:

  • 1 VPS, 2–4 vCPUs, 4–8 GB RAM, NVMe storage.
  • Nginx or Apache serving Matomo on analytics.example.com.
  • MariaDB on the same VPS, tuned with a reasonable buffer pool.
  • System cron or systemd timer running Matomo archiving every 15–30 minutes.
  • Automatic backups of both files and database to remote storage.
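
For the buffer pool mentioned above, a hedged starting point on an 8 GB single‑server VPS could look like this; the file path assumes Debian/Ubuntu MariaDB packaging, and the sizes are assumptions rather than measured values:

```bash
cat > /etc/mysql/mariadb.conf.d/99-matomo.cnf <<'EOF'
[mysqld]
# roughly half of RAM on a box shared between web server and database
innodb_buffer_pool_size = 4G
innodb_log_file_size    = 512M
EOF
systemctl restart mariadb
```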

This design is easy to manage and very cost‑effective. CPU spikes from archiving are contained within your dedicated resources, and latency between app and database is minimal because they share the same node.

Scenario 2: Separate database server for heavy reporting

When you start tracking several million hits per month or running many custom segments, database load becomes the main bottleneck. Moving MariaDB to its own VPS or dedicated server helps a lot:

  • Matomo PHP frontend and web server on one VPS.
  • Dedicated database VPS or dedicated server with more RAM and NVMe capacity.
  • Private network or VPN between the two servers.

This frees up CPU and RAM on the frontend and lets you size the database box specifically for analytics workloads. It is similar to the architectures we describe in our article on when to separate database and application servers.
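
Matomo keeps its connection settings in config/config.ini.php, so the application‑side change is small. A sketch, using a hypothetical private address for the database server:

```bash
# In config/config.ini.php, under [database], point "host" at the DB box:
#   host = "10.0.0.12"    ; hypothetical private network address
# Verify connectivity from the web frontend before switching over:
mysql -h 10.0.0.12 -u matomo -p -e 'SELECT 1;'
```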

Scenario 3: Central analytics for agencies and multi‑site owners

Agencies, hosting resellers and groups of companies often prefer to run one central Matomo instance:

  • One Matomo server tracking dozens of client sites via different site IDs.
  • Shared infrastructure for backups, security, and maintenance.
  • Per‑site permissions so each client logs into their own dashboard only.

Here it is crucial to plan data retention and database growth carefully, because you are aggregating traffic from many sources. Disk and backup strategies matter more than ever, which is where our guides on designing a backup strategy with RPO and RTO and ransomware‑resistant hosting backups become directly relevant.

Security, Privacy and Compliance Settings You Should Not Skip

Transport security: HTTPS everywhere

Because analytics contains behavioural and sometimes personal data, it absolutely must be transmitted over HTTPS:

  • Install a valid SSL/TLS certificate on analytics.example.com.
  • Redirect all HTTP requests to HTTPS.
  • Enable modern TLS versions and ciphers and consider HSTS for extra protection.
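
With Let's Encrypt, certbot can handle the certificate and the redirect in one step. A sketch, assuming certbot with the nginx plugin is installed:

```bash
certbot --nginx -d analytics.example.com --redirect
# --redirect rewrites the HTTP server block to forward to HTTPS.
# HSTS can then be added in the TLS server block (nginx syntax):
#   add_header Strict-Transport-Security "max-age=31536000" always;
```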

If you are new to SSL setup, our articles on fixing common SSL certificate errors and TLS protocol updates and best practices will help you configure your server correctly.

Application‑level hardening

Do not treat your analytics dashboard as less important than your main site. If attackers gain access, they gain rich behavioural data on all your visitors. At minimum you should:

  • Use strong, unique passwords and enable two‑factor authentication for Matomo accounts.
  • Restrict access to the admin interface by IP or VPN where possible.
  • Keep Matomo core and plugins updated on a regular schedule.
  • Harden the underlying VPS with a firewall, Fail2ban and non‑root SSH logins.
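
For the server‑level items, a minimal sketch on a Debian/Ubuntu VPS; firewall tooling and the SSH service name vary by distribution:

```bash
ufw default deny incoming
ufw allow 443/tcp
ufw allow 22/tcp             # better: limit SSH to your admin IP range
ufw enable

apt install -y fail2ban      # on Debian/Ubuntu the sshd jail is active by default

# disable direct root logins over SSH
sed -i 's/^#\?PermitRootLogin.*/PermitRootLogin no/' /etc/ssh/sshd_config
systemctl reload ssh         # the service is called "sshd" on RHEL-based systems
```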

Log anonymisation, IP masking and KVKK / GDPR

Running Matomo on your own server is only the first step for privacy. You also need to configure retention and anonymisation. Common steps include:

  • Anonymising IP addresses by at least 2 bytes (configurable in Matomo).
  • Disabling or restricting user ID tracking unless you really need it.
  • Configuring automatic deletion for old logs beyond your legally justified retention window.
  • Being transparent in your privacy policy about how you collect and process analytics data.
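
In Matomo, the anonymisation level chosen in the privacy settings maps to a mask length in the configuration. A sketch of what masking two bytes looks like; verify the exact key against your version's documentation:

```bash
# config/config.ini.php -- zero out the last two bytes of each visitor IP:
#   [Tracker]
#   ip_address_mask_length = 2   ; e.g. 203.0.113.42 becomes 203.0.0.0
```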

On the hosting side, you should apply similar anonymisation rules to web server and reverse proxy logs. Our dedicated article on log anonymisation and IP masking for KVKK / GDPR‑compliant hosting logs explains how to do this at the server level.

Performance and Maintenance: Keeping Matomo Fast Over the Years

Archive reports with cron, not via the browser

By default, Matomo can archive reports when users open the dashboard in their browser. This is convenient at first but quickly becomes painful as traffic grows: the first person to open a large report after midnight may wait a very long time. The recommended approach is:

  • Disable browser‑triggered archiving in Matomo settings.
  • Set up a system cron job or systemd timer to run core:archive regularly.
  • Monitor execution time and adjust schedule or server resources as needed.
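
A typical crontab entry for this, with assumed paths; Matomo's console takes the instance URL via --url:

```bash
# /etc/cron.d/matomo-archive -- run archiving every 30 minutes as the web user
*/30 * * * * www-data /usr/bin/php /var/www/matomo/console core:archive --url=https://analytics.example.com/ >> /var/log/matomo-archive.log 2>&1
```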

This turns heavy report generation into a controlled background task, avoiding unpleasant surprises for dashboard users.

Database maintenance and pruning

Even with anonymisation, analytics databases can grow very quickly. To keep performance stable:

  • Enable Matomo’s built‑in data retention tools to automatically delete old raw logs.
  • Consider keeping only aggregated daily or weekly reports beyond a certain age.
  • Regularly run database maintenance: optimising tables, checking indexes and monitoring slow queries.
  • Watch disk usage and plan capacity upgrades before you hit critical limits.
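
Table optimisation and slow‑query checks need nothing beyond standard MySQL tooling. A sketch, assuming the database is simply named matomo:

```bash
# rebuild/optimise all tables in the matomo database (run off-peak)
mysqlcheck -o matomo -u root -p

# temporarily surface queries slower than 2 seconds in the slow log
mysql -u root -p -e "SET GLOBAL slow_query_log = 1; SET GLOBAL long_query_time = 2;"
```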

Backups and restore tests

An analytics platform is often central to business decisions, so losing data is not an option. At a minimum, you should:

  • Schedule daily database backups and frequent incremental backups where possible.
  • Back up the Matomo config and the entire installation directory.
  • Store backups off‑site, ideally in object storage with versioning and immutability options.
  • Regularly test restores on a staging server so you know the process works when you need it.
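
A minimal nightly sketch covering both the database and the installation directory; paths, credentials handling and the off‑site step are assumptions to adapt:

```bash
DATE=$(date +%F)

# consistent InnoDB dump without locking the tracker out
mysqldump --single-transaction -u matomo -p matomo | gzip > /backup/matomo-db-$DATE.sql.gz

# files, including config/config.ini.php
tar -czf /backup/matomo-files-$DATE.tar.gz /var/www/matomo

# ship off-site, e.g. to object storage with rclone or over SSH with rsync
```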

The same 3‑2‑1 principles we describe for other applications apply here: at least three copies of your data, on two different media types, with one copy off‑site.

How dchost.com Can Help You Host Matomo Safely

When you run privacy‑focused analytics, you are taking responsibility for both performance and data protection. The right hosting partner and architecture make that responsibility much lighter. At dchost.com we work daily with customers who run Matomo and other self‑hosted tools alongside their websites, e‑commerce stores and SaaS platforms. That experience influences how we design our hosting plans, from shared hosting that allows real cron jobs to NVMe‑based VPS and dedicated servers ready for analytics workloads.

If you are just starting, you can deploy Matomo on a small VPS and grow from there. As metrics become mission‑critical, we can help you move to a larger VPS, split out the database, or even design a dedicated analytics server or colocation setup in our data centers. Our existing guides on topics like how domain, DNS, server and SSL work together and VPS and cloud hosting trends can give you additional architectural context as you plan.

Self‑hosted analytics is not only about avoiding third‑party trackers; it is a strategic choice to own your data and respect your visitors. With a well‑sized server, sound security practices and sensible data retention, Matomo can run quietly in the background for years, delivering exactly the insights you need. If you would like help choosing the right dchost.com plan for your analytics stack or you are planning to centralise tracking for multiple sites, our team is happy to review your numbers and suggest a concrete, realistic hosting layout.

Frequently Asked Questions

What server specifications does Matomo need?

For a small site up to around 100,000 pageviews per month, Matomo can run on a modest VPS or even a high‑quality shared hosting plan. As a baseline, plan for at least 1 vCPU, 1–2 GB RAM and 10–20 GB of SSD or NVMe storage. You will need a modern PHP version (8.x recommended), a web server such as Nginx or Apache, and a MariaDB/MySQL database. For anything beyond basic tracking or for multiple sites, we recommend moving to a dedicated VPS with 2–4 vCPUs, 4 GB RAM and NVMe storage so you can safely run cron‑based archiving and database maintenance.

Should I host Matomo on the same server as my website or on a separate one?

Both options are possible and we see both in production. Hosting Matomo on the same server as your main site is convenient and fine for low traffic; you simply create analytics.yourdomain.com as a subdomain and install it there. However, Matomo can generate CPU and database load during archiving, especially when traffic grows. To keep your main site fast and to make capacity planning easier, we generally recommend a separate VPS once analytics becomes important, you enable e‑commerce tracking, or you aggregate data from multiple sites on one Matomo instance.

Does running Matomo automatically make me GDPR and KVKK compliant?

Matomo provides many tools to help with GDPR and KVKK compliance: IP anonymisation, configurable data retention, consent modes and the ability to run entirely on your own infrastructure. However, compliance is never automatic. You must still configure anonymisation levels, define how long you keep raw logs and aggregated reports, update your privacy policy, and ensure that server‑side logs are handled correctly. Our guide on log anonymisation and IP masking for KVKK / GDPR‑compliant hosting logs is a useful companion to Matomo's own privacy settings. When both application and hosting layers are configured well, Matomo can fit comfortably into a privacy‑by‑design strategy.

How do I keep Matomo fast as traffic grows?

The most important step is to move report archiving to a cron job or systemd timer, instead of relying on browser‑triggered archiving. This ensures heavy processing happens in the background on a predictable schedule. Next, make sure you are using SSD or, ideally, NVMe storage for the database, and give MariaDB a sensible amount of RAM for its buffer pool. Enable automatic data retention so raw logs do not grow forever, and periodically optimise tables. If you outgrow a single VPS, consider splitting the database onto its own server or upgrading to a larger VPS or dedicated server with more CPU cores and memory.

Can I import historical data from my previous analytics tool into Matomo?

In many cases yes, although the exact process depends on which analytics tool you are using. Matomo offers import tools and plugins for popular platforms, and you can also import historical data from web server logs. From a hosting perspective, the main consideration is temporary load and disk usage during migration: imports can be CPU and I/O intensive, and the database can grow quickly. We recommend performing large imports on a VPS or dedicated server with enough headroom, scheduling them during off‑peak hours, and ensuring you have backups before and after the migration so you can roll back if anything goes wrong.