This same “discovery” gets reported on once or twice a year; it’s starting to feel like a FUD campaign rather than actual research
on desktop devices
Kinda should have been in the headline
It is a super important detail, but it’s still unforgivable for an app that expects privacy to be part of its brand identity.
This is a big difference between privacy and security.
Agreed
But you can’t have privacy without security, and any privacy brand must have security in their bones.
You can’t encrypt anything without a key. This is the key. If it wasn’t in plaintext then it would be encrypted. Then you’d need a key for that. Where do you put it?
Phone OSs have mechanisms to solve this. Desktop ones do not.
unforgivable
yeah absolutely agreed
But… That’s how encryption keys are stored.
No you don’t understand, you’re supposed to encrypt the keys.
Then you encrypt that key
And then that key
Until it’s all encrypted /s
opportunistic TPM integration would be nice.
I.e. use the security chip of the device, if one is found. Otherwise use password.
OR use a Nitrokey etc, which can act as a secure device to store these keys too.
Take that, Windows. You don’t need a built-in TPM if you can use a Nitrokey 3 with a secure element, externally.
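The "use the security chip if present, otherwise fall back to a password" idea could look roughly like this. A minimal sketch: `hardware_seal` is a hypothetical callable standing in for the platform's TPM or Nitrokey API (not a real library); the software fallback uses a memory-hard KDF from the Python standard library.

```python
import hashlib

def derive_kek(password: str, salt: bytes, hardware_seal=None) -> bytes:
    """Return a 32-byte key-encryption key (KEK).

    Opportunistic hardware support: prefer a hardware-backed sealer
    (TPM / secure element) when one is available, otherwise derive the
    KEK from the user's password.
    """
    if hardware_seal is not None:
        # Hypothetical hook wrapping the platform's TPM / token API;
        # the chip releases the KEK, which never touches disk in plaintext.
        return hardware_seal()
    # Software fallback: memory-hard KDF so offline guessing is expensive.
    # The salt would be stored alongside the wrapped database key.
    return hashlib.scrypt(password.encode(), salt=salt,
                          n=2**14, r=8, p=1, dklen=32)
```

Either way the attacker no longer gets the database key by just reading a file; they need the chip or the password.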
If your computer is compromised to the point someone can read the key, read words 2-5 again.
This is FUD. Even if Signal encrypted the local data, at the point someone can run a process on your system, there’s nothing to stop the attacker from adding a modified version of the Signal app, updating your path, shortcuts, etc to point to the malicious version, and waiting for you to supply the pin/password. They can siphon the data off then.
Anyone with actual need for concern should probably only be using their phone anyway, because it cuts your attack surface by half (more than half if you have multiple computers), and you can expect to be in possession/control of your phone at all times, vs a computer that is often left unattended.
Should the encryption keys be… encrypted?
With what? Where would you store the encryption key for the encryption key on a desktop system where it would not be accessible to an attacker?
Perhaps there could be a pin or password that must be entered every time to decrypt it into memory.
As the article states, currently all processes are able to read the file which contains the key. Instead, you could store the key in the macOS Keychain (and Linux/Windows equivalents), which AFAIK is a list of all sorts of sensitive data (think WiFi passwords etc.), encrypted with your user password. I believe the Keychain also only lets certain processes see certain entries, so the Signal Desktop App could see only its own encryption key, whereas for example iMessage would only see the iMessage encryption key.
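The per-process access control described above can be modeled in a few lines. This is a toy sketch only: the class, its methods, and the app names are all made up for illustration; the real macOS Keychain enforces the ACL via code-signing checks in the OS, not in the app.

```python
# Toy model of an OS keychain with per-app ACLs (conceptual sketch only).
class Keychain:
    def __init__(self):
        self._entries = {}  # (service, account) -> (secret, allowed_apps)

    def add(self, service, account, secret, allowed_apps):
        self._entries[(service, account)] = (secret, set(allowed_apps))

    def get(self, service, account, caller):
        secret, allowed = self._entries[(service, account)]
        if caller not in allowed:
            # Other processes can't read this entry at all.
            raise PermissionError(f"{caller} may not read {service}")
        return secret

kc = Keychain()
kc.add("Signal", "db-key", b"\x13\x37", allowed_apps={"Signal.app"})
assert kc.get("Signal", "db-key", caller="Signal.app") == b"\x13\x37"
```

Under this model, an arbitrary process reading Signal's key would be denied instead of just opening a world-readable file.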
Always knew this project was a honeypot because of their insistence on needing a phone number. Welp. Let’s see how they damage control yet again.
It originally needed a phone number because it was originally a phone texting app.
Not storing it in plaintext would require setting up some kind of password, right?
Some way to encrypt the decryption key.
This could also mean TPM + Pin. Or using a Nitrokey, externally, which stores the password to decrypt the decryption key.
That is how user account unlocking (on GrapheneOS with Pixel phones) is done.
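The "encrypt the decryption key" step described above can be sketched as key wrapping: derive a key-encryption key from the PIN, then use it to wrap the database key. A toy sketch under loud assumptions: the XOR wrap here is for illustration only (stdlib Python has no AES), and a real implementation would use an authenticated cipher such as AES-256-GCM.

```python
import hashlib
import os

def wrap_key(data_key: bytes, pin: str, salt: bytes) -> bytes:
    """Toy key-wrap: XOR the data key with a PIN-derived keystream.

    Illustration only -- a real implementation would use an AEAD
    like AES-256-GCM instead of raw XOR.
    """
    kek = hashlib.scrypt(pin.encode(), salt=salt, n=2**14, r=8, p=1,
                         dklen=len(data_key))
    return bytes(a ^ b for a, b in zip(data_key, kek))

unwrap_key = wrap_key  # XOR is its own inverse

salt = os.urandom(16)       # stored in plaintext next to the wrapped key
data_key = os.urandom(32)   # the local database key
blob = wrap_key(data_key, "123456", salt)
assert unwrap_key(blob, "123456", salt) == data_key
```

Only the wrapped blob and the salt go to disk; the plaintext data key exists only in memory after the user enters the PIN (or, as above, after a TPM/Nitrokey releases the KEK).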