Increase Concurrent HTTP calls

What you can do is spread that load across several subdomains. Instead of using only www, use www1, www2, www3, www4 and round-robin between them on the client side.

You'll need to configure your web server so that all the www* subdomains end up at the same place.
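A minimal client-side sketch of that round-robin, assuming hypothetical hosts www1.example.com to www4.example.com that all serve the same content (and allow CORS for the page's origin if the requests go through fetch):

    // Rotate requests across the www1..www4 shards (hypothetical hosts).
    const shards = ["www1", "www2", "www3", "www4"];
    let next = 0;

    function shardedUrl(path: string): string {
      const host = shards[next];
      next = (next + 1) % shards.length; // advance to the next subdomain
      return `https://${host}.example.com${path}`;
    }

    // Each call goes out against a different host name, so the browser
    // applies its per-host connection limit to each shard separately.
    async function loadTile(id: number): Promise<Blob> {
      const res = await fetch(shardedUrl(`/tiles/${id}.png`));
      if (!res.ok) throw new Error(`Request failed: ${res.status}`);
      return res.blob();
    }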


Just to extend Charly Koza's answer, since that approach has some limitations depending on user count, etc.

The first thing you should look at is using a CDN; I will assume you have done this already.

The fact that you are only hitting one server is not a problem: the browser applies its concurrent-connection limit per DNS host name, not per IP address.

If you have access to your DNS management and can dynamically spin up new subdomains, look at free services such as Cloudflare's API.

Alternatively, create a wildcard DNS record, which allows any subdomain to point to one server.

That way, on the server side, you can check whether the user already has X connections active; if so, the following can be done:

  • Dynamically create a new subdomain on the same IP, or, if using the wildcard, pick a random subdomain such as newDomainRandom.domain.com
  • Return the user a 301 redirect to that new domain; the user's browser will then treat it as a connection to a different host (see the sketch after this list)
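A rough Node.js sketch of that redirect step, assuming a wildcard DNS record pointing *.example.com at this server; the per-client counter and the threshold are simplified stand-ins for the "X connections" check:

    import * as http from "http";
    import { randomBytes } from "crypto";

    const MAX_ACTIVE = 6; // stand-in for the "X connections" threshold
    const active = new Map<string, number>(); // in-flight requests per client IP (simplified)

    const server = http.createServer((req, res) => {
      const client = req.socket.remoteAddress ?? "unknown";
      const count = active.get(client) ?? 0;

      if (count >= MAX_ACTIVE) {
        // Pick a random subdomain; the wildcard record points it back at this
        // same server, but the browser counts it as a different host.
        const sub = randomBytes(4).toString("hex");
        res.writeHead(301, { Location: `https://${sub}.example.com${req.url ?? "/"}` });
        res.end();
        return;
      }

      active.set(client, count + 1);
      res.on("close", () => active.set(client, (active.get(client) ?? 1) - 1));

      res.end("hello"); // normal response
    });

    server.listen(8080);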

There is a lot of pseudo-work here, but this is more of a networking issue than a coding issue.

Compulsory warning on this method, though:

There is no limit on the number of 301 redirects on a site; you can implement more than 100k of them without incurring any penalty. But too many 301 redirects put unnecessary load on the server and reduce speed.


14 requests are not an issue. They become an issue only if the server response time is large, so most likely the root issue is server-side performance.

These solutions are possible:

  • use HTTP caching (the server should send the corresponding headers; see the sketch after this list)
  • use a cache in the middle (e.g. a CDN or Varnish)
  • optimize server side
    • content related:
      • combine several requests into one
      • remove duplicated information in requests
      • do not load information which client doesn't render
    • use cache at server side
    • etc.; there are plenty of other approaches.
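For the HTTP-cache bullet above, a minimal Node.js sketch of the headers the server could send (the payload, max-age and ETag scheme are placeholders):

    import * as http from "http";
    import { createHash } from "crypto";

    // Placeholder payload; in practice this is the rendered resource.
    const body = JSON.stringify({ tiles: [1, 2, 3] });
    const etag = `"${createHash("sha1").update(body).digest("hex")}"`;

    const server = http.createServer((req, res) => {
      // Lets the browser and any cache in the middle (CDN, Varnish) reuse the response.
      res.setHeader("Cache-Control", "public, max-age=300");
      res.setHeader("ETag", etag);

      if (req.headers["if-none-match"] === etag) {
        res.writeHead(304); // the client's copy is still fresh, no body needed
        res.end();
        return;
      }

      res.setHeader("Content-Type", "application/json");
      res.end(body);
    });

    server.listen(8080);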

UPDATE:

Suggestions for people who have to download static resources and are having trouble with that:

  1. Check the size of the resources and optimize where possible (see the sketch after this list)

  2. Use HTTP/2 - it shares a connection between requests, so the server is less loaded and responds faster, mostly because it does not need to establish a separate TLS connection for each request (the web is secure nowadays; everybody uses HTTPS)

  3. Browsers limit the number of parallel HTTP/1.1 requests to a single domain (typically 6-8). This leaves room to increase the count of parallel requests by downloading the required resources from several different domains (or subdomains)
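For the first suggestion, the browser's Resource Timing API shows which resources are worth optimizing; a small snippet you can run in the page (the kB rounding is just for readability):

    // List downloaded resources, largest transfer first.
    const entries = (performance.getEntriesByType("resource") as PerformanceResourceTiming[])
      .sort((a, b) => b.transferSize - a.transferSize);

    for (const e of entries) {
      // transferSize is reported as 0 for cross-origin resources
      // that don't send a Timing-Allow-Origin header.
      console.log(`${Math.round(e.transferSize / 1024)} kB  ${e.name}`);
    }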


How to call more than the maximum http calls set by browsers to one domain.

That is an HTTP/1.1 limit (6-8). If you are able to change the server (you tagged this question http), the best solution is to use HTTP/2 (RFC 7540) instead of HTTP/1.1.

HTTP/2 multiplexes many HTTP requests on a single connection; see this diagram. Where HTTP/1.1 has a limit of roughly 6-8, HTTP/2 has no fixed limit, but says that "It is recommended that this value (SETTINGS_MAX_CONCURRENT_STREAMS) be no smaller than 100" (RFC 7540). That number is better than 6-8.
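A minimal Node.js sketch of serving HTTP/2 (the key.pem/cert.pem paths and the port are placeholders; browsers only speak HTTP/2 over TLS), advertising the recommended stream limit explicitly:

    import * as http2 from "http2";
    import { readFileSync } from "fs";

    const server = http2.createSecureServer(
      {
        key: readFileSync("key.pem"),
        cert: readFileSync("cert.pem"),
        // How many concurrent streams a client may open on one connection;
        // RFC 7540 recommends no fewer than 100.
        settings: { maxConcurrentStreams: 100 },
      },
      (req, res) => {
        res.end("served over a single multiplexed connection");
      }
    );

    server.listen(8443);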