Say you have a script or something that gets run in cron/task scheduler and it needs a password… say to ssh to a raspberry pi elsewhere in your house.

How do you save that password in a way that automation can access it?

Some ideas:

  • Plaintext file. Not a fan, because it's sitting unencrypted on the box somewhere.
  • Environment variable. Not a fan, because it's still visible unencrypted to someone on the box (albeit likely the same user or an admin).
  • A secrets manager. If I run something locally like HashiCorp Vault or Infisical, I can get to the point where a CLI/API call returns the password. But in that case I still need a Vault password/token to get my password, so I fall back to needing one of the above to make this work.
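That bootstrap problem in the last bullet can be illustrated with plain openssl (a sketch; "hunter2" and the file names are placeholders): the cron job can decrypt the password, but only because the key file sits right next to it, which is the same chicken-and-egg.

```shell
# one-time setup: a random key file, readable only by the cron user
head -c 32 /dev/urandom > master.key
chmod 600 master.key

# encrypt the password once ("hunter2" is a placeholder)
printf 'hunter2' | openssl enc -aes-256-cbc -pbkdf2 \
    -pass file:./master.key -out pi_pass.enc

# in the cron job: decrypt on the fly
PI_PASS=$(openssl enc -d -aes-256-cbc -pbkdf2 \
    -pass file:./master.key -in pi_pass.enc)
```

Anyone who can read both files can recover the password, so all this buys you is that the secret isn't grep-able plaintext.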

If the secrets manager is easily available to the automation, the secret that unlocks the secrets manager is just as available, which feels like security by obscurity.

If someone breaks into my system via SSH etc., they can get the passwords either way.

… How do people normally do this? I'm not sure I actually gain anything from a secrets manager if it's local and the disk itself is already encrypted before login.

What actually makes sense at a personal/home scale?

(Edit: I know an SSH key is probably better for getting to the Raspberry Pi, but the question is still the same idea.)
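For the cron-to-Pi case specifically, the usual move is a dedicated passphrase-less key plus a restricted authorized_keys entry on the Pi, so losing the key only exposes one command. A sketch (file names and the script path are placeholders):

```shell
# generate a key used only by this job, no passphrase so cron can use it
ssh-keygen -t ed25519 -f ./pi_cron_key -N "" -C "cron backup job"

# then on the Pi, in ~/.ssh/authorized_keys, pin the key to one command:
#   command="/usr/local/bin/backup.sh",no-port-forwarding,no-pty ssh-ed25519 AAAA... cron backup job
```

The private key is still unencrypted on disk, but it can only ever run that one script on that one host.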

  • SleepyBear · 1 year ago

    Generally you want to model out what the risks are and how acceptable each risk is, compared to the effort to mitigate it.

    Physical access, where someone can break in and get at your secret material: full-disk encryption mitigates that. If you want to reboot remotely, you can install Dropbear in the initramfs with SSH keys.
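    On Debian-family systems that remote-unlock setup looks roughly like this (package name and paths vary by distro, so treat it as a sketch; the key is a placeholder):

```shell
# put a small SSH server in the initramfs so you can unlock LUKS remotely
sudo apt install dropbear-initramfs

# allow one key, restricted to the unlock prompt only
echo 'no-port-forwarding,no-pty,command="cryptroot-unlock" ssh-ed25519 AAAA... remote-unlock' \
    | sudo tee -a /etc/dropbear/initramfs/authorized_keys

sudo update-initramfs -u
```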

    If you want to control and secure remote access, you care about authentication and authorization, as well as the data in transit. To protect the data in transit, make sure it's encrypted: use SSH (or a similarly encrypted protocol).

    For authn and authz you want to minimize the risk. You can control who can log in and from where with allow-lists of IP ranges, or control their keys. For example, for external SSH access I only use keys that are physically backed, either through a YubiKey or a secure-enclave-generated key. Those are non-extractable, so the only machine they can come from is the one holding them. No one can get in without stealing the device, and the device itself is full-disk encrypted, of course. You can also control how the keys are used: biometric auth, password auth, or both, if you want.
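    Generating such a hardware-backed key looks like this with OpenSSH 8.2+ and a FIDO2 token plugged in (a sketch; the file name is arbitrary):

```shell
# the private key material stays on the token; the file on disk is only
# a non-functional handle, and -O verify-required forces a PIN on each use
ssh-keygen -t ed25519-sk -O resident -O verify-required -f ./token_key
```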

    Then, within your network, for automated jobs you can set up SSH certs between accounts that have no access other than running the specific task. If you need further abilities, run those locally, triggered by the transfer itself. Sure, the cert is in memory and on disk, but it's encrypted on disk, and reading it out of memory would take a very skilled attacker.
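    A minimal sketch of that cert setup, runnable anywhere ssh-keygen is available (the usernames, paths, and validity period are placeholder choices):

```shell
# one-time, on a trusted machine: create a CA for your home network
ssh-keygen -t ed25519 -f ./home_ca -N "" -C "home ssh ca"

# key for the automated job
ssh-keygen -t ed25519 -f ./backup_job -N "" -C "backup job"

# sign it: 90-day validity, one principal, one forced command,
# with all other permissions (pty, forwarding, etc.) cleared
ssh-keygen -s ./home_ca -I backup-job -n backupuser -V +90d \
    -O clear -O force-command="/usr/local/bin/run-backup" ./backup_job.pub

# inspect what was issued; the server side just needs the CA public key
# listed under TrustedUserCAKeys in sshd_config
ssh-keygen -L -f ./backup_job-cert.pub
```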

    At that point you’re worried about someone physically breaking in and reading the memory of a running system, which is a deeply unlikely scenario. The other risk is someone breaking SSH cert-only auth with strong certs, and that only matters if SSH is open to the internet.

    Turn off external access to everything and you’re back to local-network attacks, which you can mitigate with password-protected Wi-Fi, MAC filtering if you want (though it can be spoofed), and beyond that you can limit communication on your local network. You could VLAN the jobs you care about onto their own network that isn’t Wi-Fi accessible.
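    One cheap way to limit SSH to the local network is in sshd_config itself (a sketch; the subnet is an example, adjust to your LAN):

```
# /etc/ssh/sshd_config
PasswordAuthentication no       # keys/certs only
AllowUsers *@192.168.1.0/24     # logins only from the local subnet
```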

    Basically it comes down to making the cert/password low-risk to lose, mitigating everything up to that point, and then not worrying.