

Just a warning: the Download & Transfer via USB option will no longer be available from the 26th of February, according to the Amazon UI.
I didn’t think OP was going the ZFS route, so it wouldn’t matter on that point.
His Server 2 will be running at the red line IMHO, so any overhead would have an impact.
Mount your NFS share in fstab and make sure Docker is set to wait until the mount is available. Here is a guide: https://davejansen.com/systemctl-delay-start-docker-service-until-mounts-available/
I’ve only had to delay on my N100s.
So I have the mounts set and then just use those paths in my compose files. All my machines have the same paths; a rough sketch is below.
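Something like this is what I mean, assuming the NAS exports /volume1/docker and you mount it at /mnt/docker on each box (host address and paths are placeholders, adjust for your setup). The systemd drop-in is one way to achieve what the linked guide describes.

```
# /etc/fstab – NFS share for container data (example host and paths)
192.168.1.10:/volume1/docker  /mnt/docker  nfs  defaults,_netdev  0  0

# /etc/systemd/system/docker.service.d/wait-for-mount.conf
# Drop-in so the Docker daemon only starts once the mount is up.
[Unit]
RequiresMountsFor=/mnt/docker
```

After adding the drop-in, run `sudo systemctl daemon-reload` so it gets picked up.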
Quick Sync should let the i3 handle Jellyfin just fine if you’re not going beyond 1080p for a couple of concurrent users, especially if you configure the nice values to prefer Jellyfin over Immich.
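Compose doesn’t expose nice directly, but cpu_shares gives a similar relative weighting, and passing /dev/dri through enables Quick Sync. Rough sketch only: image names, paths and the share values are examples, and the Immich database/redis services are omitted.

```yaml
services:
  jellyfin:
    image: jellyfin/jellyfin
    devices:
      - /dev/dri:/dev/dri      # Intel iGPU for Quick Sync transcoding
    cpu_shares: 1024           # higher relative CPU weight under contention
    volumes:
      - /mnt/docker/jellyfin/config:/config
      - /mnt/media:/media

  immich-server:
    image: ghcr.io/immich-app/immich-server:release
    cpu_shares: 256            # lower weight, so Jellyfin wins when both are busy
    volumes:
      - /mnt/docker/immich/library:/usr/src/app/upload
```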
Most of my content is 4K H.264. You may be right about 1080p, but I generally don’t have content at that resolution.
Worst case scenario, he can always keep the N300 for other stuff if it doesn’t work out.
I’ve looked at it but never actually given the Synology proxy a go despite using their DNS server. Does it do auto certificate renewal?
Have you considered using a Cloudflare Tunnel to bypass the CGNAT? You can point it at a proxy or straight at the service.
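For reference, running cloudflared as a container is usually just something like this sketch; the tunnel and its public hostname mappings (to your proxy or directly to a service) are configured in the Cloudflare Zero Trust dashboard, and the token would live in a .env file.

```yaml
services:
  cloudflared:
    image: cloudflare/cloudflared:latest
    # Outbound-only tunnel, so nothing needs to be port-forwarded through CGNAT.
    command: tunnel --no-autoupdate run --token ${CLOUDFLARE_TUNNEL_TOKEN}
    restart: unless-stopped
```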
Personally, I would keep it simple: run a separate NAS and run all your services in containers on the devices best suited to them. The i3 is not going to manage Jellyfin while also sharing those other services. I tried running it on an N100 and had to move it to a beefier machine (an i5). Immich, for example, will use a lot of resources when performing operations, just a warning.
If you mount NAS storage for hosting the container data, you can move services between machines with minimal issues. Just make sure you run each service from a docker-compose file, and keep those files on the NAS.
You completely negate the need for VMs and their overhead, and you can still snapshot the machine: if you run Debian as the OS there is Timeshift, and other distros have similar tools.
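To illustrate the idea, assuming the NAS is mounted at the same path on every machine and each stack lives in its own directory there ("myservice" is just a placeholder), moving a service is nothing more than:

```
# On the old machine:
cd /mnt/docker/myservice
docker compose down

# On the new machine (same path, same compose file, same data):
cd /mnt/docker/myservice
docker compose up -d
```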
Download your books via the Amazon UI or use this tool while you can, though.