User asks (human) Assistant to login to their online banking and make a transfer. No problem. No digital security system can stop this (bar requiring true biometrics on every sign-in, which isn’t happening soon).
User asks Company (with human staff) to login and do the same thing. Perhaps the company is an accounting firm, a legal firm, or a “manage my company for me” kind of firm. No problem.
User asks Company which makes self-hosted business management tools to login to their online banking. Oh shit!!! This is a violation of the ToS! The Company that makes this tool is violating the bank's rights! The user doesn't understand how they're letting themselves get hacked!! Block block block! (Also, some banks realise that they can charge a fee for such access!)
Everyone on HN can see that the last case (the most useful one, given how capable automation is these days) should be permitted.
I wish the governing layers of society could also see how useful such automation is.
These Device-Bound Session Credentials could result in the death of many good automation solutions.
The last hope is TPM emulation, but I’m sure that TPM attestation will become a part of this spec, and attestation prevents useful emulation. In this future, Microsoft and others will be able to charge the banks a great deal of money to help “protect their customers” via TPM attestation licensing fees, involving rotation, distribution, and verification of keys.
I'm guessing the protocol will somehow prevent one TPM being used for too many different user accounts with one entity (bank), preventing cloud-TPM-as-a-service from being a solution to this. If you have 5,000 users who want to let your app connect to their Bobby's Bank online banking, then you'll need 5,000 different TPMs. Also, Microsoft (or whoever) could detect and blacklist "shared" TPMs to kill TPMaaS entirely.
Robotic Process Automation on the user’s desktop, perhaps in a hidden Puppeteer browser, could still work. But that’s obviously a great deal harder to implement than just “install this Chrome extension and press this button to give me your cookies.”
Goodbye web freedom, and my software product :(
We're once again one step closer to losing whatever little autonomy we have left when interacting with online services. Why the hell did we have to put TPMs in every computer?? They bring essentially no benefit for the vast majority of users, but companies keep finding new ways to use TPM capabilities to the user's detriment.
I don't understand the benefit of all this complexity vs simply having the device store the cookie jar securely (with help from the TPM or secure enclave if required).
That would have the benefit that every web service automatically gets added security.
One implementation might be:
* Have a secure enclave/trustzone worker store the cookie jar. The OS and browser would never see cookies.
* When the browser wants to make an HTTPS request containing a cookie, the browser sends "GET / HTTP/1.0 Cookie: <placeholder>" to the secure enclave.
* The secure enclave replaces the placeholder with the real cookie, encrypts the HTTPS traffic, and sends it back to the OS to be sent over the network.
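As a toy sketch of that flow (all names invented here, and the TLS step stubbed out; this is just the placeholder-substitution idea, not any real enclave API):

```python
# Toy model of the proposal above: the browser only ever handles a
# placeholder; the "enclave" swaps in the real cookie right before
# the request would be encrypted, so the plaintext cookie never
# reaches the OS or browser. Everything here is illustrative.

SECRET_COOKIE = "session=3f9a2c"  # known only inside the enclave

def encrypt_for_tls(plaintext: str) -> bytes:
    # Stand-in for the enclave's TLS record encryption
    # (byte reversal, purely so the demo is self-contained).
    return plaintext.encode()[::-1]

def enclave_seal(request_with_placeholder: str) -> bytes:
    """Substitute the real cookie for the placeholder, then
    encrypt the result for transmission."""
    plaintext = request_with_placeholder.replace("<placeholder>", SECRET_COOKIE)
    return encrypt_for_tls(plaintext)

# Browser side: builds the request without ever seeing the cookie.
browser_request = "GET / HTTP/1.0\r\nCookie: <placeholder>\r\n\r\n"
wire_bytes = enclave_seal(browser_request)
```

The point being: malware reading browser memory only ever finds `<placeholder>`, and the enclave only fills it into traffic it encrypts itself.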
> Leverages TPM-backed secure storage when available
Step 2: TPM required, and your cookies are no longer yours.
I actually like the idea as long as you hold the keys. Unfortunately, the step from that to the user-hostile version is so small that I can't see this ending in a way beneficial for users.
Hell no. If I can't make a full (encrypted) backup of my entire device and restore it on different hardware, I don't want it.
The opsec reason I use Safari as a work browser today is that Safari has a much blunter tool to disrupt cookie stealers: Safari and macOS do not permit (silent) access to Safari's local storage by user-level processes. If malware attempts to access Safari's storage, the access is either denied or the user is presented with a popup to grant it.
I wish other browsers implemented this kind of self protection, but I suppose that is difficult to do for third party browsers. This seems like a great improvement as well, but it seems this is quite overengineered to work around security limitations of desktop operating systems.
Are there really many web services where an attacker having long-lived access gives them much more power than short-lived access?
If someone gets short-lived access to a control panel for something, there are normally ways to twiddle settings to, for example, create more user accounts or slacken permissions.
If someone gets short-lived access to a datastore, they can download all the data.
etc.
> Even if session cookies are stolen, they cannot be used from another device.
This seems false? Given the description in the article, the short-lived cookie could still be used from another device during its lifetime. Having this short-lived cookie and having the browser proactively refresh it seems like a bad design to me. The proof of possession should be a handshake at the start of each connection. With HTTP/3 you shouldn't need a lot of connections.
I'm curious why the solution here is bearer tokens bound to asymmetric keys instead of a preshared key model. Both solutions require a new browser API. In either case the key is never revealed to the caller and can potentially be bound to the device via a hardware module if the user so chooses.
Asymmetric crypto is more complex and resource-intensive, but it is useful when you have concerns about the remote endpoint impersonating you. However, that's presumably not a concern when the authentication is unique to the (server, client) pair, as it appears to be in this case. This doesn't appear to be an identity scheme, hence my question.
(This is not criticism BTW. I am always happy to see the horribly insecure bearer token model being replaced by pretty much anything else.)
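For reference, the PSK model I have in mind is nothing exotic; a sketch (names and flow are illustrative, not anything from the DBSC proposal):

```python
import hashlib
import hmac
import os

# Preshared-key proof of possession: a per-(server, client) key is
# provisioned once at registration (ideally sealed in a hardware
# module), and each request proves possession by MACing a fresh
# server-supplied challenge. The key itself never crosses the wire.

psk = os.urandom(32)  # provisioned once; both sides hold a copy

def client_prove(challenge: bytes) -> bytes:
    # Only the tag leaves the client's key store, never the key.
    return hmac.new(psk, challenge, hashlib.sha256).digest()

def server_verify(challenge: bytes, tag: bytes) -> bool:
    expected = hmac.new(psk, challenge, hashlib.sha256).digest()
    # Constant-time comparison to avoid timing side channels.
    return hmac.compare_digest(expected, tag)

challenge = os.urandom(16)
tag = client_prove(challenge)
```

A replayed tag is useless against a different challenge, which gives you the same stolen-credential resistance without any asymmetric operations.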
Are they slowly trying to sneak in WEI again?
I wonder how long these short-lived cookies actually live for? From the article it sounds like Chrome makes a request to the server every time it has to generate a new short-lived cookie, so if they do have very short lives (say a few minutes), Chrome could be making a lot of requests to your server to generate new cookies.
Edit: reading a bit more closely, it sounds like the request is more of a notification and all the real work actually happens in the user's browser, so you could presumably ignore it and hope the generated bandwidth to your server is pretty low.
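Rough numbers, under an assumed TTL (the article doesn't give one as far as I can tell):

```python
# Back-of-envelope estimate of the refresh traffic worried about
# above: if the browser pings the server once per short-lived
# cookie, steady-state load scales as active_users / TTL.
# Both figures below are assumptions, not from the article.

active_users = 100_000
cookie_ttl_seconds = 10 * 60  # assumed 10-minute short-lived cookies

refreshes_per_second = active_users / cookie_ttl_seconds
print(refreshes_per_second)  # ~167 notifications/s at steady state
```

So even for a fairly large service the notification traffic looks modest, unless the TTL is pushed down to seconds.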
mTLS but not mTLS. These Google standards are always so half-baked.
Shouldn't WebAuthn be able to do this already? Why a separate proposal to do it again?
It seems like this requires you to have very high availability for the refresh endpoint. If that endpoint is unavailable, the user can end up effectively logged out, which could lead to a confusing and frustrating experience.