Edit: for a solution, see this post or possibly a later post in this thread, as posts can't be edited anymore after 30 days.
Hi,
Not directly a PVE question, but maybe someone got an idea how to accomplish this.
I've got multiple non-clustered PVE servers that use full system encryption via ZFS native encryption. One of them runs 24/7 and the others are only started when needed (damn electricity prices...). To unlock the root pools after boot, I connect via SSH to dropbear-initramfs. Dropbear on the server automatically runs "/usr/bin/zfsunlock", which asks for the passphrase to unlock the pool and then terminates the SSH session. So the only thing you can do in that session is type in the password.
This works totally fine when unlocking the servers manually with PuTTY, where I can copy-paste my passphrase from a KeePass safe.
But I can't find a way to unlock these PVE servers automatically. For example, every Sunday at 00:00 all of these servers have to be up solely for backup and maintenance tasks: the PVEs back each other up to virtualized PBSs, plus ZFS scrubs and PBS prune/GC/verify runs. The weekly ZFS snapshots are especially important; without them being taken every Sunday at 00:00, incremental ZFS replication will fail with missing snapshots once a server is booted again, and all those TBs of data have to be replicated from scratch. So right now I either need to be at home to unlock the servers manually, or boot and unlock them before leaving when I know I won't be back before 00:00, after which they run all day unused, wasting money.
What I would like is a script on the 24/7 PVE server that unlocks the other servers for me. Storing the passphrase on that server wouldn't be a problem: the server itself is fully encrypted, so the stored passphrase isn't accessible once the server is shut down (= stolen).
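For the scheduling side, a cron entry on the always-on host could trigger such an unlock script shortly before the backup window. A minimal sketch; the file name, path, script name, and times are all illustrative, not from an actual setup:

```shell
# /etc/cron.d/unlock-backup-nodes  (illustrative path and times)
# Saturday 23:30: power on / unlock the other nodes so the
# Sunday 00:00 snapshot and replication jobs find them running.
30 23 * * 6  root  /root/bin/unlock-backup-nodes.sh
```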
But I just can't find a way to open an SSH session from a bash script and auto-type my passphrase into it. Piping stdin from a file to the ssh command doesn't seem to work.
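One thing worth checking is whether the remote prompt actually reads from stdin or from the controlling terminal; if it is the latter, forcing a pseudo-terminal with `ssh -tt` while feeding the passphrase on stdin is sometimes enough. A minimal sketch of the piping idea, using a local stand-in command instead of a real dropbear host. All names here (the passphrase file, the host) are illustrative, and whether zfsunlock accepts input fed this way is an assumption that needs testing:

```shell
#!/bin/bash
# Sketch: feed a stored passphrase on stdin to a command that prompts for it.

feed_passphrase() {            # $1 = file holding the passphrase (chmod 600),
    local passfile="$1"; shift # remaining args = command to run
    "$@" < "$passfile"         # run the command with the passphrase on stdin
}

# Local stand-in demonstration: 'cat' plays the role of the remote prompt.
printf 'hunter2\n' > /tmp/demo-pass
feed_passphrase /tmp/demo-pass cat    # prints: hunter2

# Against a real dropbear-initramfs host one would then try something like:
#   feed_passphrase /root/.unlock-pass ssh -tt root@HOST
# where -tt forces pseudo-terminal allocation even though stdin is redirected,
# which some passphrase prompts require. If that still fails, the prompt is
# reading /dev/tty directly and a tool like expect would be needed instead.
```

If plain piping fails but `-tt` works, that confirms the prompt wanted a terminal rather than stdin, which would explain why redirecting from a file did nothing.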
Anyone got an idea how I could solve this?