I have 10 pages of content. Each of those pages is complex, and a visitor will most likely view all of them.
Some pages have up to 40 images, and there are 130 small images in total. The images don't change very often, so after the first visit they are not downloaded again. With well-specified Expires and cache directives sent by the web server, the browsers don't even keep checking whether the images have changed.
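For example, far-future caching can be expressed with standard HTTP response headers like these (the values are illustrative; exactly how you set them depends on your web server):

```
Expires: Thu, 31 Dec 2037 23:59:59 GMT
Cache-Control: public, max-age=31536000
```

With headers like these, a browser that already has the image in its cache won't re-request it at all until the expiry time passes.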
However, a browser typically allows only a small number of concurrent requests (often around four per host). This means it's going to take a while for all 130 images to download.
The obvious solution is to batch all of the requests into one. Web browsers don't do this, and HTTP was poorly designed in this regard: batched downloads simply aren't allowed. Something like 'GET image1.jpg,image2.jpg,image3.jpg' is not valid HTTP.
Reusing one connection for multiple requests doesn't solve the problem either, since there is still per-request latency for each file.
So one hacky way around this is to batch all of the content together yourself.
Using code and algorithms I developed for batching textures in OpenGL programs, I can combine the images into one big image. Then you use CSS to set the clipping area, width, and height of each image, so that only the intended portion of the large image is shown.
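The idea can be sketched roughly like this. This is a toy packer, not the OpenGL-derived code from the article (which isn't shown); the image names, sizes, and the sprites.png URL are made up for illustration. It packs images left to right on a single row and emits the CSS rules that clip each one out of the combined image:

```python
def pack_sprites(images):
    """images: list of (name, width, height) tuples.
    Returns name -> (x_offset, width, height) in the combined image,
    packing everything on one horizontal row."""
    offsets = {}
    x = 0
    for name, w, h in images:
        offsets[name] = (x, w, h)
        x += w
    return offsets

def css_rules(offsets, sheet_url="sprites.png"):
    """Emit one CSS rule per sprite. A negative background-position
    shifts the big sheet left so only this sprite is visible, and
    width/height clip the element to the sprite's size."""
    rules = []
    for name, (x, w, h) in sorted(offsets.items()):
        rules.append(
            ".%s { background: url(%s) -%dpx 0; width: %dpx; height: %dpx; }"
            % (name, sheet_url, x, w, h)
        )
    return "\n".join(rules)

imgs = [("icon-home", 16, 16), ("icon-mail", 16, 16), ("logo", 48, 24)]
print(css_rules(pack_sprites(imgs)))
```

A real packer would also stack rows and pad between sprites, but the principle is the same: one image download, with CSS doing the per-image clipping.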
The result? 140 requests down to two. That is a 70x reduction in requests.
As you can imagine, this makes the browser render image-heavy pages much faster, meaning I can put a lot more on a web page than with a non-batching approach.
I wrote an article on Batching Apis.