
I aM MICH

A b&w picture from the 20th century, a lady with volcano Vesuvius in the background

13 Mar 2026

In this post I will talk about IMMICH, the first service I enabled on my home server. IMMICH is the "Self-hosted photo and video management solution. Easily back up, organize, and manage your photos on your own server. Immich helps you browse, search and organize your photos and videos with ease, without sacrificing your privacy".

It's true, IMMICH is very easy to set up. You will need Docker Engine on your machine (more on that in a future post). Open a terminal and make a directory where you want the app to live:

mkdir ./immich-app
cd ./immich-app

Then run these two commands to download the files you need for setup:

wget -O docker-compose.yml https://github.com/immich-app/immich/releases/latest/download/docker-compose.yml
wget -O .env https://github.com/immich-app/immich/releases/latest/download/example.env

You will have to populate some data in the resulting .env file, but it's ok to keep the defaults. Once you're done with the changes, start the containers:
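For reference, the example.env I downloaded exposed variables along these lines; check the file you actually downloaded, since names and defaults may change between releases:

```shell
# Assumed to mirror Immich's example.env at the time of writing;
# verify against your own downloaded file before trusting these names.
UPLOAD_LOCATION=./library        # where uploaded photos/videos are stored
DB_DATA_LOCATION=./postgres     # where the Postgres data lives
TZ=Europe/Rome                  # optional timezone (example value)
IMMICH_VERSION=release          # pin a specific version here when upgrading
DB_PASSWORD=postgres            # change this from the default
```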

docker compose up -d

Once the containers are up and running, from another computer (make sure you're connected to your tailnet) navigate to http://<your machine with the immich app installed>:2283

You will land on the IMMICH web interface and be prompted for a username and a password. Since you are the first to access the dashboard, you become the administrator of the service. The admin panel offers plenty of settings. What I do by default is activate the Folder view (browse a directory of pictures) and set up an External Library: a repository of pictures outside the IMMICH app that is automatically imported. In this library I have copied all the digital pictures I have kept since 1998.

On my mobile I installed the IMMICH app, activated the tailnet and logged into my account. Then I synchronized the phone's Camera folder, so when I shoot a picture it is automatically uploaded to the server... in my living room.

IMMICH is really well done: it extracts metadata from pictures (date, location, EXIF) and has facial recognition. In the search box you can type e.g. "Vesuvius", and all pictures with a cone-shaped mountain will pop up. The app also looks for duplicate images and lets you choose which one to keep. The ML that drives all these features runs inside your machine, so no data leaks out.

Of course not all uploads worked seamlessly: old pictures don't have metadata attached, so you have to position them in time and space manually (that took several hours of work), but the Folder view is very helpful in this task.

We talked about backups in former posts (I'll write in detail about backups in a future one): all pictures and metadata are periodically backed up to a dedicated device.

Just to refresh my memory, below is the procedure to upgrade the IMMICH app on your server. In the terminal navigate to the immich-app folder, update the IMMICH_VERSION value in your .env file, then run:

docker compose pull && docker compose up -d

To remove old, unused images run:

docker image prune

Hope you find this post helpful. Store your memories safely, and don't let tech giants harvest them.

#homelab

Tiger vs. AI

A tiger faces three clumsy robots, ink drawing

12 Mar 2026

Made this drawing a couple of years ago for a friend's birthday. More on tigers in future posts.

Up and running!

A mini PC sitting on a Playstation behind a TV screen

07 Mar 2026

Ok, we installed the system (Ubuntu 24.04) and prepared the disks for backups; now we have to make the machine behave like a server: we must be able to reach it remotely. As long as we are on the local network we can access it, knowing its IP address, with tools such as Secure Shell (SSH), SSH File Transfer Protocol (SFTP), VNC and many others. But we want to access the server from outside our local network, even from our mobile when we are on vacation, and that's a slightly harder task, because the server is plugged into a router to reach the internet, and the public IP of the router changes over time.
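On the LAN side this is straightforward; a quick sketch (the user name and address below are made up):

```shell
# On the server, list its LAN addresses:
hostname -I
# Then, from another machine on the same network (example user/IP):
# ssh mich@192.168.1.50
# sftp mich@192.168.1.50
```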

Today there are many services that enable remote connections; personally I use Tailscale. You will have to register an account with the service, but as long as you have no more than three users and one hundred (!) connected devices, you can live on their free tier. Once you have the account, you have to connect your machines. First I connected my laptop (Windows 10) and my mobile (Android), a rather simple task: download the installers, run the clients and log them into your Tailscale account. On your Tailscale dashboard, you can see the status (online / offline) of your devices (called "machines"). Now we have to add the server to the tailnet. Open the terminal and type:

curl -fsSL https://tailscale.com/install.sh | sh

That's it! Now start the service:

sudo tailscale up

The terminal will show you a link. Follow it and you will be redirected to your Tailscale account login page. Once logged in, the server will appear on the dashboard, where you can manage your machines. For example, I selected the server machine and set its access key never to expire. That's normal for a server that needs to be always available. To access the server's terminal remotely, I enabled Tailscale SSH on it:

sudo tailscale set --ssh

The SSH flag appears on the dashboard, next to the server machine. That's great: this way you have an encrypted connection over a VPN! From my laptop I access the server's terminal with PuTTY, and I upload files with FileZilla.

Okay, now the server is ready: unplug the screen, the keyboard and the mouse, sit the server next to your router, plug in the ethernet cable and switch the power on. The server will immediately join your tailnet, accessible from everywhere.

#homelab

Disk shortage? Not me!

The bottom of a ThinkCentre with the NVME SSD

02 Mar 2026

When I purchased the Lenovo ThinkCentre as my home server, I noticed that the disk was wrong: 256 GB instead of 512! I complained to the supplier and they sent me the right one, so now I have a spare SATA SSD with Windows 10 Pro installed.

I knew that this kind of mini PC can handle more than one disk, and that an NVMe drive seated on the bottom is faster than the SATA one, so I bought one at a local retailer (480 GB). My idea was to install the system on the fast disk and use the slower one for backups.

My system of choice is of course Ubuntu. You write the ISO image of the system to a USB device, plug it into the computer and reboot. You must be sure that the USB device comes first in the boot sequence (check your BIOS). In my case, as I have two disks, I put the USB as the first bootable device, the NVMe as the second, and so on.
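Writing the ISO to the stick can be done with `dd`; a hedged sketch follows (the ISO name and device are hypothetical, double-check the device with `lsblk` because `dd` overwrites its target without asking):

```shell
# The real thing, commented out on purpose -- it wipes /dev/sdX:
# sudo dd if=ubuntu-24.04-desktop-amd64.iso of=/dev/sdX bs=4M status=progress conv=fsync
# Safe demo of the same copy mechanics on throwaway files:
dd if=/dev/zero of=/tmp/fake.iso bs=1M count=2 status=none
dd if=/tmp/fake.iso of=/tmp/fake-usb.img bs=1M status=none
```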

With the USB plugged in, the system it holds takes charge. From there you can test your machine (see if everything works) or install Ubuntu permanently. I targeted the NVMe for the install. Once the process is complete, you reboot the machine and unplug the USB, so the installed system takes control. If something goes wrong, you replug the USB and start over.

Ubuntu is a very friendly operating system: you can do almost everything with a graphical interface, but if you are experienced enough you can use the terminal. In my case I renamed the backup disk with the "Disks" application, then fired up the terminal to change the disk's write permissions, so that any user can write to it:

sudo chmod ugo+rw /path/to/your/disk

Just to remember: this command adds (+) (r)ead and (w)rite permissions for the (u)ser owner, the (g)roup and (o)thers on a given folder/disk.
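A quick demo on a scratch directory (the path is invented for illustration):

```shell
# Create a scratch dir and open it up to everyone
mkdir -p /tmp/backup-disk-demo
chmod ugo+rw /tmp/backup-disk-demo
# The mode string now shows rw for user, group and others
ls -ld /tmp/backup-disk-demo
```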

Edit: just to be sure, I gave the user full ownership of the backup disk with the "Disks" application. I try to use the terminal as much as I can, but coming from Windows I'm more a GUI guy.

#homelab

A database accident

Two crumpled grocery lists on a wooden surface

04 Feb 2026

While cleaning up my web host account I accidentally destroyed the database of this website, so navigating to digitalkOmiX.com threw a 500 error. Fortunately my web host makes automatic backups, so I just emailed the support team and I received a response within minutes. The *.dump file was uploaded to my account, ready to be restored. A brief explanation of "how to fix" was provided, all by email.

First I had to recreate a database with the same name as the deleted one, along with its user. Upon creation a new password was issued, so I copied it. Accessing my account via SSH, in the same directory where the *.dump file was uploaded, I restored the Postgres database with the following command:

pg_restore -U <database user> -d <database name> file.dump

I was prompted for the new password and the process worked seamlessly. Next I had to update the same password in the configuration file of the application that managed the database. I restarted the server and the website was back online!

Huge thanks to the Opalstack staff!

My StudioLab

A small desktop, a tiny one, a switch and a net extender

30 Jan 2026

I already have a homelab, but it sits in my studio, so I'll call it StudioLab. It's built from a small HP we had at work: 16 GB of RAM and two SSDs, 512 GB and 256 GB. It's too big and noisy to stay at home, but suitable for a work environment. It runs Proxmox; here I learned to use VMs and Linux containers. Currently there is an LXC for a name server (BIND) and a VM with an Immich instance; other stuff is just shut down. The machine is remote controlled via a Tailscale VPN.

#homelab

digitalkOmiX - 2026