robertooo: vanitasvitae and others:
I've watched the "Cryptographic Identity: Conquering the Fingerprint Chaos" presentation and I'm worried about the direction this is headed.
What's the point of copying what matrix.org does 1:1? Why not do things differently, why not do them better? Better for privacy, security, decentralization?
Storing E2EE secrets server-side is a horrible idea. It's no longer end-to-end. It's dishonest to users who have no clue their keys are stored on servers.
Yes, I know, Matrix uses passwords to make it "secure". When Protonmail originated the idea and the marketing that a passphrase makes something E2E, I didn't think anyone, especially not libre software, would take it seriously. Think about it for a second: do you really believe people are capable of the risk management required to create and maintain a password strong enough to secure encryption keys on public servers (often of undefined trust)? First you create an easy-to-use app for the average Joe, and then you tell him to upload his "legit E2EE" keys to a random server and secure them with a password? How about he uses his account password (123456, stored in plaintext) for this?
robertooo: You can do the same (but more securely) without uploading keys to the server or using a password to "secure" them. I talked about this with the matrix.org team and they said it's completely doable and they know how to do it; I think it's in one of the specs they wrote. In short, it's not hard to securely transfer keys if you already have a secure layer designed for this, i.e. OMEMO. That's the way to go. The Matrix folks promised me back in the day (before they landed cross-signing last year) that they would 100% give people the choice of whether or not to upload keys to the server. Even their cross-signing release announcement mentions this. A year has passed since that release, so I recently asked them what the plan is and whether this will ever land. No reply. Currently Element forces server-side storage even on users who actively opt out. I think their plan is to use this to aid rapid user growth: it's E2EE, but easy. Welp, more like faux-E2EE. They want people to store keys server-side, they will gradually expand what's stored server-side, people will get used to it, and that will become the new "legit E2EE". They may implement what they promised, but likely in 5 or 10 years, when everyone has forgotten and the feature isn't popular anymore.
This is clearly something both protocols can implement; it's more secure, it gives users freedom, and it respects them by being honest. Meanwhile the developments around XMPP cross-signing, as I see them, aim to blindly copy another protocol's shortcomings. And those shortcomings concern exactly the qualities in which XMPP excels, and why it's so good and so strong.
robertooo: Disclaimer: I'm not categorically against storing anything server-side. I can understand that some people may be OK with that, and with not verifying fingerprints. But then, why even bother with encryption? XMPP provides superior privacy even without it. We don't need to cut corners.
vanitasvitae: robertooo: sharing keys across devices by uploading them encrypted to the server is only one possible way of syncing.
vanitasvitae: There are certainly other ways, like manually copying them around, but that's not very user-friendly.
vanitasvitae: > Disclaimer: I'm not categorically against storing anything server-side. I can understand that some people may be ok with that. And with not verifying fingerprints. But then, why even bother with encryption. XMPP provides superior privacy even without it. We don't need to cut corners.
Raise the floor instead of the ceiling.
vanitasvitae: I.e. a bit of crypto is better than none.
robertooo: I do understand that. I know XMPP is not P2P. But there's an immense difference between sharing keys in a secure OMEMO session and sharing keys by encrypting them with a password (123456, stored in plaintext on the user's server) and uploading them to a random server...
robertooo: You may want to dig deeper into Matrix's cross-signing story. E.g. this is a quote from their announcement:
> Currently cross-signing requires that you store your encrypted cross-signing keys on the server, meaning that an attacker on the server who guesses your recovery passphrase/key could cross-sign a malicious device to spy on your account. Generally this is fine, as if you lose all your devices it’s the only way to recover your account’s cross-signing state. However, if you are super paranoid, you may wish to share (gossip) the keys between devices instead rather than storing them encrypted on the server. The protocol supports that - we just haven’t hooked it up yet.
robertooo: Please don't dismiss my concerns as proposing "manually copying keys". That is not what I'm talking about.
robertooo: And yes, a bit of crypto is good, but my proposal still allows for that without hurting UX for less concerned people.
vanitasvitae: > sharing keys by encrypting them with password (123456, stored in plaintext on user's server) and uploading to a random server...
Who proposes that? The OpenPGP XEP also allows server-side storage of encryption keys, and it uses a secure, random passphrase which is not picked by the user.
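For illustration, a machine-chosen backup code could be generated along these lines. This is only a sketch: the dash-grouped hex format is illustrative, not the exact format the OpenPGP XEP specifies.

```python
import secrets

def generate_backup_code(blocks: int = 6, block_len: int = 4) -> str:
    """Generate a high-entropy backup code the user never picks themselves.

    6 blocks of 4 hex digits = 96 bits of entropy, far beyond any
    human-chosen password. The grouping is purely for readability.
    """
    alphabet = "0123456789ABCDEF"
    return "-".join(
        "".join(secrets.choice(alphabet) for _ in range(block_len))
        for _ in range(blocks)
    )

print(generate_backup_code())  # e.g. 3F9A-02CC-71DE-B004-5E12-A9F0
```

Because the code is generated rather than chosen, the "123456" failure mode cannot occur; the remaining risk is users losing the code, not picking a weak one.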
vanitasvitae: I agree that this problem is not trivial to solve, and in the talk I might have made it look simpler than it ultimately is 😛
vanitasvitae: I appreciate your input on the matter 🙂
vanitasvitae: I guess this is sort of a bootstrapping problem, isn't it? You need the cross-signing keys to sign device keys. On the other hand, you want to use device keys to distribute the cross-signing keys.
robertooo: That's the password all my non-technical friends use. And they use random XMPP providers (so anything can really happen). Thus far, the only reason I can say my communication with them is private is that I verify their OMEMO fingerprints. Matrix-style cross-signing with SSSS would make that no longer so; I would have to disable cross-signing on my friends' devices or move elsewhere.
I'm not well versed in how cross-signing with this OpenPGP XEP would work. No password = definitely better. But it still seems like storing keys server-side is needless, and I talked this through extensively with the Matrix folks.
vanitasvitae: Well, one argument one could make is that with XMPP everything goes through the server eventually. So you might as well use the server for longer-term storage.
vanitasvitae: As in, even if you send the keys encrypted from one device to another, they pass through the server.
vanitasvitae: Just that, depending on the encryption scheme, the message might be encrypted with a random key instead of a passphrase.
robertooo: Say you have 2 devices; 1 trusts 2. You get a 3rd. Now you'd have to verify both 1 and 2 on the 3rd. But you can also verify only 1, and then device 1 can send device 3 the fingerprints it trusts (i.e. device 2). This, but automatic. You get a new device, you scan one QR code, you're done, no matter how many devices you have. You meet a friend, they scan your QR code (only one) and they're done.
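The flow described above boils down to a one-hop trust merge after a single QR scan. A hypothetical sketch (the function and fingerprint names are illustrative; no XEP specifies this exact API):

```python
def on_qr_verified(my_trust: set[str],
                   verified_fp: str,
                   announced_trust_list: set[str]) -> set[str]:
    """After scanning one QR code, trust the verified device plus
    everything that device itself announces as trusted
    ("this, but automatic")."""
    return my_trust | {verified_fp} | announced_trust_list

# Device 3 scans only device 1's QR; device 1 announces it trusts device 2.
trusted = on_qr_verified(set(), "fp-device1", {"fp-device2"})
print(sorted(trusted))  # ['fp-device1', 'fp-device2']
```

The key property is that the announced trust list travels over an already-verified channel (the freshly scanned device), so no server-stored secret is involved.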
vanitasvitae: sounds like XEP-0450 to me
DebXWoody: I like the idea of a QR scan.
DebXWoody: OK, I like the idea that the user should be able to choose what he/she would like to use.
DebXWoody: My wife won't care about it at all; it should just work. I have keys on my Nitrokey, which I might like to use.
robertooo: > So you might as well use the server for longer-term storage
But you also have to trust the algorithms and protocols involved in this long-term storage. This stuff would be new and custom. Note that the Matrix folks basically rolled a whole lot of their own crypto. Infosec normally despises rolling your own crypto, and there have already been many bugs in Matrix implementations.
Also, getting a bit more state-level-attacky: you also have to trust the encryption schemes long-term. I mean forever, until you're dead. This is because it's trivial to sniff the well-packaged "your cross-signing and decryption key backup" from the server. Hypothetically, if quantum computers land in 10 years and I sniff that little backup now, I can read everything you said. In 10 years. And, touch wood, encryption schemes do have holes, and the Signal protocol, even if bleeding edge, is quite new and untested.
robertooo: When I say trust algos and protos, I mean that right now you really only need to trust one, OMEMO, but then you'd have to trust all this new stuff. Stuff introduced to the world only last year in Matrix.
vanitasvitae: I understand your points, but I disagree with some.
vanitasvitae: The Signal protocol will soon be 10 years old.
vanitasvitae: It has been subject to intensive study by researchers for some time now.
vanitasvitae: I agree that you should be very careful with uploading encryption keys to the server. However, the cross-signing stuff only uploads attestation keys, which would be used to automatically mark trustworthy fingerprints as such.
vanitasvitae: Btw. I'd be happy to take this discussion to the mailing list, as I'm kind of distracted right now 🙂
robertooo: Yes, that's subjective. 10 years is new to me. On the other hand, what can you expect if Curve25519 (the earlier alternative is proven shady) was only released in 2005 and people started caring about this stuff just recently? These are best-in-class protocols. Still, better not to put your life solely behind that.
vanitasvitae: I'd say, if your life depends on it, don't use it. Still, if it enables the masses to communicate more securely by default, do it.
robertooo: Does that mean the master key isn't in that server-side encrypted blob? Where is it then? Matrix stores everything in that blob, including the master key and all message decryption keys.
robertooo: Agreed; hence that's less important, and the choice may be good to have.
robertooo: I'm not on the mailing lists, but I'm here non-stop.
vanitasvitae: This is how I would do it with OMEMO: Each device has its own encryption keys. Those are never uploaded anywhere. Then you have your "Account Identity Key". This key can be uploaded to the server, but doesn't have to be. Lastly, you have your set of attestation/cross-signing keys. Those are signed by the Account Identity Key and are uploaded to the server. They are only used to sign device encryption keys.
vanitasvitae: Now, if an attacker manages to get hold of the private cross-signing keys, they could introduce fake devices for the user (given that they also have access to the user's account, at which point the user should have already chosen another service to begin with).
vanitasvitae: They could, however, not access past encrypted communication, only fool contacts into encrypting new messages for the new evil device.
vanitasvitae: This would probably be detectable, though.
vanitasvitae: If the user detects this, they can use their Account Identity Key to rotate their cross-signing keys.
vanitasvitae: So with this model, the user could keep their Account Identity Key (and also their cross-signing keys) "offline" by choosing not to upload them.
vanitasvitae: BUT they have the option to do so.
vanitasvitae: And even if at some point the crypto breaks, this mechanism would not be the cause of a catastrophe.
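The three-tier hierarchy vanitasvitae sketches above can be modeled structurally. This is a toy sketch under loud assumptions: the "signatures" below are placeholders that merely record who vouches for whom (real implementations would use e.g. Ed25519 signatures), and all key names are made up.

```python
from dataclasses import dataclass, field

@dataclass
class Key:
    name: str
    signed: set = field(default_factory=set)  # names this key has vouched for

    def sign(self, other: "Key") -> None:
        # Placeholder for a real cryptographic signature.
        self.signed.add(other.name)

# Tier 1: Account Identity Key -- ideally kept offline, upload optional.
identity = Key("account-identity")
# Tier 2: cross-signing/attestation key -- signed by identity, uploaded.
cross_signing = Key("cross-signing")
identity.sign(cross_signing)
# Tier 3: per-device encryption keys -- never uploaded anywhere.
phone, laptop = Key("device-phone"), Key("device-laptop")
cross_signing.sign(phone)
cross_signing.sign(laptop)

def device_trusted(device: Key) -> bool:
    """Trusted iff the chain identity -> cross-signing -> device holds."""
    return (cross_signing.name in identity.signed
            and device.name in cross_signing.signed)

print(device_trusted(laptop))  # True

# Recovery after compromise: the identity key signs a fresh cross-signing
# key, which invalidates the old chain until devices are re-signed.
cross_signing = Key("cross-signing-v2")
identity.sign(cross_signing)
print(device_trusted(laptop))  # False
```

The point of the model is the containment property from the chat: leaking a tier-2 key lets an attacker attest fake devices, but never decrypt past traffic, and tier 1 can always rotate tier 2.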
larma: how do you handle lost or stolen Account Identity Keys?
vanitasvitae: Dunno, pre-generating a revocation cert? 😛
vanitasvitae: There are still some rough edges with this idea, no doubt.
vanitasvitae: But I believe it would be better than what we have today.
vanitasvitae: Ideally the user would keep their Account Identity Key offline in a safe.
vanitasvitae: Nobody is going to do that, but it is possible.
larma: The attestation key could just be the same as the encryption key. The account key is just a layer on top to hide the fact that revocation is a hard problem. So we're actually not far off from your idea with XEP-0450.
vanitasvitae: The flaw of XEP-0450 is that as soon as one device is compromised, there is no clear way to recover.
vanitasvitae: (as far as I understand it)
vanitasvitae: Because a quick attacker could mark their device as trusted and then untrust all other devices of the user.
larma: Distrusting should not happen silently on the recipient's side; it's basically a "something is terribly wrong" message, which needs manual resolution.
larma: If you want to be more correct, you can store the trust chain, and if your anchor to this chain (the device you verified) is distrusted, all trust derived from it will be distrusted as well (typically resulting in no device being trusted any longer).
vanitasvitae: That's a really bad idea imho.
vanitasvitae: What if you have cycles?
vanitasvitae: Do you expect the user to untangle the chain, understanding the consequences of distrusting certain devices?
larma: Distrusting a device is a very rare thing to happen. It basically means you are under attack. The best way forward is to consider the channel compromised and resolve it by creating a completely new trust relationship.
vanitasvitae: What if I retire a device?
vanitasvitae: Okay, fair point, maybe I shouldn't distrust it in that case.
vanitasvitae: Still, would you require the user to manually re-scan all device fingerprints again?
vanitasvitae: Hm, what if I use OMEMO in a browser in an internet café for a day?
vanitasvitae: Surely I would want to distrust that set of keys at the end of the day?
larma: If you retire a device, you remove it from the active device list without distrusting it. If a retired device ever shows up again, you must consider the key compromised and distrust it, which may break trust chains.
vanitasvitae: I dislike the "devices are part of a trust chain" idea. I'd much rather have a clearly defined top-down, tree-like hierarchy that goes from the user's identity down to their devices.
vanitasvitae: That's easy to understand and to implement. Having chains of undefined length, possibly with cycles, is a recipe for disaster imho.
vanitasvitae: What if an attacker gets hold of that one device you first used to establish the trust graph, which is now the "root" element of your chain?
larma: Yeah, it sounds like the cleaner approach, but it doesn't match reality. People have n devices, none of which is the master device, so there is no way to have an identity key without putting things on users that 99% of them won't understand.
vanitasvitae: I disagree that the identity key approach doesn't reflect reality. If you sync the signing keys, there is no single master device.
larma: Also, cycles are hardly relevant here. The only relevant question is: is there a trust-chain path from the device(s) you trust to an active device? If yes, consider that device trusted. Yes, you have to do some minimal path finding, and you can't just do a trivial recursive search if there are cycles, but it's really not that complicated either.
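The "minimal path finding" larma mentions is plain graph reachability; a breadth-first search with a visited set makes cycles harmless. A hypothetical sketch (function and device names are illustrative):

```python
from collections import deque

def trusted_devices(anchors: set[str],
                    edges: dict[str, set[str]],
                    active: set[str]) -> set[str]:
    """Active devices reachable via trust edges from the verified anchors.

    `edges[a]` is the set of devices that device `a` vouches for.
    The `seen` set guarantees termination even with cycles.
    """
    seen = set(anchors)
    queue = deque(anchors)
    while queue:
        dev = queue.popleft()
        for nxt in edges.get(dev, set()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen & active

# A cycle between B and C does not trip up the search:
edges = {"A": {"B"}, "B": {"C"}, "C": {"B", "D"}}
print(sorted(trusted_devices({"A"}, edges, {"B", "C", "D", "E"})))
# ['B', 'C', 'D']
```

Distrusting the anchor "A" then simply means removing it from `anchors`, and everything derived from it drops out of the result, which matches the chain-breaking behavior discussed above.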
vanitasvitae: There are only devices with access to the signing keys (aka devices that can introduce new devices) and those without (say you are in an internet café; you wouldn't want to give that browser app access to your signing keys).
larma: The master device is the one holding the identity key. If you sync the identity key to every device, you win nothing, because having one device stolen means definitely losing all trust.
vanitasvitae: You don't have to sync it to *every* device.
vanitasvitae: It's enough to have one device with access to it, but you can sync it to as many devices as you like.
vanitasvitae: If you sync it to any of your devices, this method basically becomes XEP-0450 in a way.
Someone is asking whether we "would have used the old logo from the email provider mailbox.org".
Is that something we have to investigate? Is that actually a real concern, given that we show the XMPP logo and an envelope?