Email archiving with Open Archiver

I recently discovered Open Archiver in a (German) YouTube video. It allows you to archive your emails from various email providers. It’s still a very new project, but it looks promising to me. And it’s completely open source.

Disclaimer

This article was written back in October 2025. Since then, the overall stability has improved. I just wanted to wait until a problem with larger email accounts was fixed, which happened with the version 0.5.0 release on March 20, 2026.

A bumpy start

First of all, installation was quite straightforward using the official repository. I only adjusted it a bit to run properly on my TrueNAS: in the .env file, I set STORAGE_LOCAL_ROOT_PATH to a local path (for whatever reason, you have to configure this path here instead of just using a Docker volume), changed the sync interval to every 15 minutes, and picked a different port for the frontend to prevent a collision – port 3000 is often used for something else. I also adjusted the volumes in the docker-compose.yml file. Then I started the first imports.
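As a rough sketch, my .env adjustments looked something like the following. Note that apart from STORAGE_LOCAL_ROOT_PATH the variable names, the path, and the port are assumptions for illustration – check the project’s own .env.example for the real keys:

```shell
# Local storage path – required even though Docker volumes exist
STORAGE_LOCAL_ROOT_PATH=/mnt/pool/open-archiver   # hypothetical TrueNAS path

# Assumed key: cron-style sync interval, here every 15 minutes
SYNC_FREQUENCY="*/15 * * * *"

# Assumed key: move the frontend off port 3000 to avoid collisions
FRONTEND_PORT=3001
```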

And oh boy, I should have started with a smaller IMAP account, but I used my biggest, resulting in the immediate import of over 30,000 emails. The first thing I noticed was the huge amount of RAM the container was using – over 5 GiB at times. On a system with 16 GiB in total, of which 13.6 GiB were usable while other containers and a virtual machine were running as well, this was wild. Wild enough that the system killed the virtual machine with my Home Assistant because of an “out of memory” error.

I stopped all containers that were not strictly necessary, restarted the VM, and allowed Open Archiver to continue archiving. During my research, I found out that the high memory consumption comes from running OCR on file attachments. Since I really want this feature, I couldn’t just disable it.

After some time, the memory usage settled at around 3 GiB while importing. The other thing worth mentioning was CPU usage: there were several spikes of up to 100 %, noticeable in all other services.

CPU usage graph from netdata with multiple spikes to 100 %, lasting from a few seconds up to a minute

A couple of hours later, when everything seemed to have been imported for the first time, CPU usage stayed at around 15 %. Not bad in terms of overall system load, but unnecessary, since it didn’t seem to be doing anything except adding log entries about skipped duplicate emails.

According to several GitHub issues, both the memory and the CPU usage problems are not uncommon at the moment. So be aware of that, especially if you’re importing large amounts of data.

Fortunately for me, CPU usage settled after nearly 24 hours and stayed like this ever since.

CPU usage graph from netdata with some spikes of up to 80 % at the beginning, then nearly 25 % usage for most of the time until it lowered to 10 %

Since my fear was that the same would happen on every import, I changed the schedule to a daily import at 02:00 a.m. But the next night it was quite quiet: no high CPU or memory usage.

Email archive

After the first storm, you get a nice email archive, which supports IMAP, Google Workspace and Microsoft 365, plus imports from PST, EML and Mbox files. All emails are stored as EML files, so you don’t have any vendor lock-in.
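To illustrate what “no vendor lock-in” means in practice: since EML is a standard format, you can read any archived message with nothing but the Python standard library. A minimal sketch (the file path is hypothetical):

```python
# Read one archived EML file using only Python's stdlib email package –
# no Open Archiver required to access your own archive.
from email import policy
from email.parser import BytesParser


def read_eml(path):
    """Return (subject, sender, plain-text body) of an archived email."""
    with open(path, "rb") as f:
        msg = BytesParser(policy=policy.default).parse(f)
    # Prefer the text/plain part; HTML-only mails would return None here
    body = msg.get_body(preferencelist=("plain",))
    text = body.get_content() if body else ""
    return msg["Subject"], msg["From"], text
```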

Dashboard of Open Archiver demo website: Total emails archived: 1449; Total Storage Used: 159.25 MB, Failed Ingestions: 0; Ingestion History graph by day for the last 30 days and a pie chart for Storage by Source

Current state

Memory usage is still relatively high for a single container at ~1.7 GiB, but in an acceptable range. This is mostly due to the search engine, which is undoubtedly very powerful. CPU usage is negligible in my case.

It definitely needs some polishing, but keep in mind that it’s a relatively new project (the first commit is from July 10, 2025). To me, it looks very promising. (And keep my disclaimer in mind.)

If you want to try it out, check out the website:

Author: Matze

As a developer, I work with WordPress every day. By day, I create plugins and themes, solve problems and sometimes produce new ones. At night, it’s not unusual to find me at the WordPress Meetup in Stuttgart with my dog. 🐕
