Why don't browsers block cross-site POSTs by default?

In theory your suggestion is perfectly reasonable. If browsers blocked all cross-origin POST requests by default, and it required a CORS policy to unlock them, a lot of the CSRF vulnerabilities out there would magically disappear. As a developer, you would only need to make sure not to change server state on GET requests. No tokens would be needed. That would be nice.

But that is not how the internet was built back in the day, and there is no way to change it now. There are many legitimate uses of cross-origin POST requests. If browsers suddenly changed the rules mid-game and forbade this, sites relying on the old rules would stop working. Breaking existing sites like that is something we try to avoid to the greatest extent possible. Unfortunately, we have to live with our past.

So is there any way we could tweak the system to work better without breaking anything? One way would be to introduce a new HTTP verb, let's call it PEST, that works just like POST except that all PEST requests are preflighted and subject to CORS policies. That is just a silly suggestion I made up, but it shows how we could evolve the standards without breaking them.


The problem is not the request method: CSRF could also be done with a GET request. The problem is instead that authentication information such as (session) cookies or the Authorization header is automatically included with the cross-site request, thus making CSRF possible. Therefore, the mitigation would not be to prohibit such methods in cross-site requests, but instead not to send this authentication information with them.
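To make this concrete, here is a minimal sketch of what a CSRF attack page might run; the bank URL and form fields are invented for illustration. The victim's browser attaches its session cookie for the bank automatically, even though the request originates from the attacker's page:

```typescript
// Hypothetical attacker page script (the bank URL and field names are made up).
// The browser sends the victim's bank.example session cookie along with this
// cross-site POST, which is exactly what makes CSRF work.
const form = document.createElement("form");
form.method = "POST";
form.action = "https://bank.example/transfer";

for (const [name, value] of [["to", "attacker"], ["amount", "1000"]]) {
  const input = document.createElement("input");
  input.type = "hidden";
  input.name = name;
  input.value = value;
  form.appendChild(input);
}

document.body.appendChild(form);
form.submit(); // a plain form POST: no CORS preflight, cookies included
```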

With cookies there is a proposal for a SameSite flag which ensures that the cookie is not sent with cross-site requests. Unfortunately, the flag is currently only available in Chrome, but it will become available in Firefox with v60 in May 2018. Also, it would have been much better if this restriction were enabled by default and had to be explicitly relaxed to be less secure (as with CORS), instead of being insecure by default. But that would be a serious change to the current behavior and would probably break many existing applications.
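A minimal sketch of setting such a cookie, assuming an Express server (the cookie name and route are placeholders):

```typescript
// Minimal sketch, assuming Express; "sessionId" and the route are placeholders.
import express from "express";

const app = express();

app.post("/login", (req, res) => {
  // SameSite=Strict tells the browser to omit this cookie from all
  // cross-site requests, which defuses cookie-based CSRF for this session.
  res.cookie("sessionId", "opaque-session-token", {
    httpOnly: true,
    secure: true,
    sameSite: "strict",
  });
  res.send("logged in");
});

app.listen(3000);
```

With SameSite=Lax instead of Strict, the cookie would still be sent on top-level cross-site GET navigations, but not on cross-site POSTs.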


I partly disagree with Anders on

But that is not how the internet was built back in the day, and there is no way to change it now.

The developers of major browsers have considerable power to change the Internet and to guide web developers in the direction they want. Obsoleting cross-site POST requests would be possible if it were seen as a major threat. There are examples of such progress in other areas, although it is neither sudden nor fast:

  • Flash. While it was formerly seen as the future of the web, major browsers have announced that they will drop support for it, and web developers are adjusting.

  • HTTPS has slowly been pushed by browsers, with small steps towards warning that plain HTTP is insecure. We may eventually see a world where plain HTTP is slowly suffocated to death.

I'd like to see this develop towards prioritizing security over compatibility more widely. Naturally, such a big change would not be something to do overnight, but rather by first providing alternatives and discouraging the old behavior. The path to achieving this could look like this:

  1. Introducing a same-origin policy header for POST requests that allows explicit consent (a hypothetical sketch follows after this list).
  2. Starting to show a warning about a possible security problem on cross-site POSTs made without that consent.
  3. Sites that still need this functionality slowly start to adapt in order to get rid of the warning.
  4. After a long transitional period, the measure could be made harsher, e.g. blocking such requests outright.
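Purely as an illustration of step 1, and not any real or proposed standard: the explicit consent could work much like CORS, with the browser only delivering a cross-site POST if the target endpoint advertises a consent header. The header name, the Express framework, and the route below are all assumptions made for this sketch:

```typescript
// Hypothetical sketch only: "Cross-Site-Post-Allowed" is a made-up header,
// not part of any standard and not implemented by any browser.
import express from "express";

const app = express();

// The browser would be the enforcing party: before delivering a cross-site
// POST, it could ask the server (much like a CORS preflight) whether this
// endpoint explicitly consents to receiving such requests.
app.options("/newsletter/subscribe", (req, res) => {
  // Explicit consent: only this partner origin may POST here cross-site.
  res.setHeader("Cross-Site-Post-Allowed", "https://partner.example");
  res.sendStatus(204);
});

app.post("/newsletter/subscribe", express.urlencoded({ extended: false }), (req, res) => {
  res.send("subscribed");
});

app.listen(3000);
```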

Discouraging POST over plain HTTP is quite close to discouraging cross-site POST; both go against the existing standards. This is simply a conscious loss of backward compatibility in exchange for more security.