Understanding the Access-Control-Allow-Origin HTTP header policy in simple terms

I came across this interesting question on programmers.stackexchange.com:

How do web servers enforce the same-origin policy?
It reminded me of a long-standing discussion on web forums, on a topic that must have nagged every web developer by now, even though the fundamental principle of the same-origin policy is easy to grasp:

CORS explained
Fig 1.0: A user can request content from the same origin as the URL shown in the browser, but not from any other origin unless that origin is specifically allowed by the server.

In short, web servers do not enforce the same-origin policy; web browsers do. The Access-Control-Allow-Origin header is addressed to the web client: before cross-origin content is exposed to a script (or, with a preflight, before a non-simple request is even sent), the browser checks this header and decides whether to allow it. This matters most for requests that proceed without user interaction, such as an XMLHttpRequest.
Of course, the Access-Control-Allow-Origin header has no effect on a command-line program like curl. After all, curl runs with the user's privileges, at the user's explicit request, and web content must not be granted that level of trust.
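The browser's decision can be sketched as a simple check. This is a simplified model, not the full CORS algorithm from the Fetch specification, and the function name is my own:

```javascript
// Simplified model of the check a browser performs before exposing a
// cross-origin response to page scripts. Real browsers implement the full
// Fetch specification; this covers only the Access-Control-Allow-Origin
// header for a basic request.
function canExposeResponse(requestOrigin, acaoHeader) {
  if (acaoHeader === "*") return true;   // server opted in for every origin
  return acaoHeader === requestOrigin;   // or for this exact origin only
}

console.log(canExposeResponse("http://foo.example", "http://foo.example")); // true
console.log(canExposeResponse("http://evil.example", "http://foo.example")); // false
console.log(canExposeResponse("http://evil.example", "*")); // true
```

A tool like curl skips this check entirely, which is why the header is meaningless outside a browser.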

One important reason for this design is that the same-origin policy, relaxed only through the ACAO header, helps protect servers themselves from rampant DDoS (Distributed Denial of Service) attacks.

As an HTTP response header, ACAO is meant to be interpreted by the web client, under the assumption that the majority of human internet users browse the web through major browsers that adhere to and implement the W3C's CORS recommendation. They should, since most browser vendors benefit from a fast, accessible internet.
Otherwise, anyone could paste a few lines of JavaScript into a malicious website, or into a site compromised by a JavaScript injection attack, running a simple loop that fires Ajax HTTP GET or POST requests at a foreign domain. This would happen without any user interaction, and could run in parallel, with catastrophic results for the targeted web server even with as few as a few thousand participating visitors. Those visitors would be entirely unaware, browsing the malicious site as usual while performing tens of web requests per second in the background. For the targeted server, this would mean handling tens of thousands of requests per second, likely crafted to maximize the processing cost of each one.
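The arithmetic behind that scenario is straightforward. The figures below are illustrative assumptions, not measurements, and the attack loop is shown only as a comment:

```javascript
// Hypothetical attack scale: visitors unknowingly running a malicious loop.
// The injected script itself would look roughly like this (NOT executed here;
// the target URL is a made-up example):
//   setInterval(() => fetch("https://victim.example/expensive-search"), 50);

const infectedVisitors = 3000;   // assumed: users with the malicious page open
const requestsPerSecond = 20;    // assumed: background requests per visitor

const aggregateLoad = infectedVisitors * requestsPerSecond;
console.log(aggregateLoad + " requests/s hitting the target"); // 60000 requests/s
```

Even these modest assumptions already produce a request rate few origin servers handle comfortably.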

That is why cross-origin access is an opt-in process, specified through the ACAO HTTP header. A user browsing the web can still reach any origin at any time through a user-aware interaction, i.e. by following a link. Much like you can knowingly copy or paste content to and from your clipboard, but a page cannot do so behind your back (plugins aside).
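On the server side, opting in amounts to emitting the right header for approved origins. A minimal sketch of that decision (the allow-list and function name are my own, for illustration):

```javascript
// Sketch of a server-side handler deciding which CORS headers to emit.
// The allow-list is an assumption for illustration.
const allowedOrigins = new Set(["http://foo.example", "https://app.example"]);

function corsHeaders(requestOrigin) {
  // Echo the origin back only if it is on the allow-list; otherwise emit
  // no ACAO header at all, and the browser will block cross-origin reads.
  return allowedOrigins.has(requestOrigin)
    ? { "Access-Control-Allow-Origin": requestOrigin, "Vary": "Origin" }
    : {};
}

console.log(corsHeaders("http://foo.example"));
console.log(corsHeaders("http://evil.example")); // {}
```

Sending `Vary: Origin` alongside a per-origin ACAO value tells caches that the response differs by requesting origin.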

Future web direction plays a role, too:
Most major web-browser vendors have aligned on goals such as:
  • plugin removal, as a step towards a fully sandboxed browser environment

A typical client-server exchange (Source: MDN):

OPTIONS /resources/post-here/ HTTP/1.1
Host: bar.other
User-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10.5; en-US; rv:1.9.1b3pre) Gecko/20081130 Minefield/3.1b3pre
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: en-us,en;q=0.5
Accept-Encoding: gzip,deflate
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7
Connection: keep-alive
Origin: http://foo.example
Access-Control-Request-Method: POST
Access-Control-Request-Headers: X-PINGOTHER

HTTP/1.1 200 OK
Date: Mon, 01 Dec 2008 01:15:39 GMT
Server: Apache/2.0.61 (Unix)
Access-Control-Allow-Origin: http://foo.example
Access-Control-Allow-Methods: POST, GET, OPTIONS
Access-Control-Allow-Headers: X-PINGOTHER
Access-Control-Max-Age: 1728000
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 0
Keep-Alive: timeout=2, max=100
Connection: Keep-Alive
Content-Type: text/plain
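A server producing the response above has to recognize the OPTIONS preflight and answer with the Access-Control-Allow-* headers before the real POST ever arrives. A rough sketch of that logic as a pure function (the function name is mine; the header values mirror the MDN exchange):

```javascript
// Sketch of preflight handling: given the incoming method and request
// headers, return the headers for the response. Values mirror the MDN
// exchange above; a real server would consult its own configuration.
function handlePreflight(method, headers) {
  if (method !== "OPTIONS") return null; // not a preflight: serve normally
  if (headers["origin"] !== "http://foo.example") return {}; // origin not allowed
  return {
    "Access-Control-Allow-Origin": "http://foo.example",
    "Access-Control-Allow-Methods": "POST, GET, OPTIONS",
    "Access-Control-Allow-Headers": "X-PINGOTHER",
    "Access-Control-Max-Age": "1728000", // browser may cache this result for 20 days
  };
}

const res = handlePreflight("OPTIONS", {
  "origin": "http://foo.example",
  "access-control-request-method": "POST",
  "access-control-request-headers": "X-PINGOTHER",
});
console.log(res["Access-Control-Allow-Methods"]); // POST, GET, OPTIONS
```

Only after a successful preflight does the browser send the actual POST carrying the custom X-PINGOTHER header.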

Security Considerations:

Security restrictions can be reasonably established through a combination of TLS 1.2/1.3, strong session IDs, TANs (transaction authentication numbers), two-factor authentication, and similar more or less unobtrusive methods.
