Batching as applied to websites.
I have 10 pages of content. Each of those pages is complex, and a visitor will most likely view most of them once they arrive.
Some pages have 40 images, and there are 130 small images all up. The images don't change that often, so after the first visit they are not downloaded again. With well-specified Expires and cache-control directives sent out by the web server, browsers don't even check whether the images have changed every time.
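As a rough illustration of those cache directives - not the actual server configuration used here, which would normally be a line or two of web server config - a minimal Python sketch using the standard library's http.server might look like this, with a placeholder one-year max-age:

import http.server


class CachingHandler(http.server.SimpleHTTPRequestHandler):
    """Serve static files with far-future Expires/Cache-Control headers."""

    def end_headers(self):
        # Let browsers keep images for a long time without revalidating.
        if self.path.endswith((".png", ".gif", ".jpg", ".jpeg")):
            self.send_header("Cache-Control", "public, max-age=31536000")
            self.send_header("Expires", "Thu, 31 Dec 2037 23:59:59 GMT")
        super().end_headers()


if __name__ == "__main__":
    http.server.HTTPServer(("", 8000), CachingHandler).serve_forever()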
However, a browser may only allow around four requests at a time, so it is going to take a while for all 130 images to download on that first visit.
The obvious solution is to batch all of the requests into one. Web browsers are dumb, though; they don't do this. HTTP was poorly designed in this regard: it doesn't allow batched downloads, e.g. 'GET image1.jpg,image2.jpg,image3.jpg' or anything similar. Making multiple requests over the same connection doesn't solve it either, since there is still latency involved for each file requested.
So one hacky way around this is to batch all of the content together yourself.
Using code and algorithms I developed for batching textures in OpenGL programs, I can combine the images into one big image. CSS then sets the clipping area, width, and height for each image, so only the portion of the large image it is supposed to show is visible.
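The sprite-building step can be sketched roughly like this. This is not the OpenGL-derived packing code mentioned above, just a naive Python/Pillow version that pastes the small images side by side and emits one CSS rule per image; the file names and class-name scheme are made up for the example.

# Naive sprite builder: paste every small image into one sheet and emit
# a CSS rule per image that clips out the right region via a negative
# background-position offset.
from pathlib import Path
from PIL import Image  # pip install Pillow


def build_sprite(image_paths, sheet_path="sprite.png", css_path="sprite.css"):
    images = [Image.open(p).convert("RGBA") for p in image_paths]
    sheet_w = sum(im.width for im in images)
    sheet_h = max(im.height for im in images)

    sheet = Image.new("RGBA", (sheet_w, sheet_h))
    rules, x = [], 0
    for path, im in zip(image_paths, images):
        sheet.paste(im, (x, 0))
        name = Path(path).stem
        # Shift the big image left so only this icon shows through.
        rules.append(
            ".img-%s { background: url(%s) -%dpx 0; "
            "width: %dpx; height: %dpx; }"
            % (name, sheet_path, x, im.width, im.height)
        )
        x += im.width

    sheet.save(sheet_path)
    Path(css_path).write_text("\n".join(rules))


# Example file names - replace with the site's real images.
build_sprite(["icon1.png", "icon2.png", "icon3.png"])

Each icon is then shown with an element carrying the matching class, and the background offset clips the correct region out of the single downloaded sheet.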
For the HTML content, I encoded it into a JavaScript file using a JSON encoder, then constructed the pages with JavaScript.
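A rough sketch of that content-batching step, assuming a build script in Python - the pages.js file name and the PAGES variable are illustrative, not the names the site actually uses:

import json

# All page fragments keyed by page name; the browser downloads this one
# pages.js file instead of fetching each page's content separately.
pages = {
    "home": "<h1>Home</h1><p>Welcome.</p>",
    "about": "<h1>About</h1><p>Who we are.</p>",
}

with open("pages.js", "w") as f:
    f.write("var PAGES = " + json.dumps(pages) + ";")

A small script on the client then swaps PAGES[name] into the document (via innerHTML or similar) when the visitor navigates.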
The result? 140 requests down to two requests. That is a 70x improvement.
As you can imagine, this results in the browser rendering lots of data much faster, meaning I can put a lot more on a web page than with a non-batching method.
I wrote an article on Batching APIs.
Written by a Melbourne web developer. Available for your projects - python, php, mysql, e-commerce, javascript, CMS, css, flash, actionscript, games, postgresql, xml, asm.
Comments
http://alistapart.com/articles/sprites/
Yeah, that article is getting at a similar thing to what I am talking about with images. However, it doesn't talk about the process of constructing that master image.
Doing rollovers with CSS and one image is quite common now.
For the process of constructing a big master image automatically, have a look at articles on combining lightmaps. Also have a look at some 2D engines that use 3D hardware APIs; some of them use the same technique of combining images.
Another issue is image type, e.g. PNG, JPEG, and GIF. Since each has different properties, you need to be careful about which ones you combine together. It can be possible to combine multiple GIFs into one large GIF as long as there is enough colour depth. If JPEGs are already heavily compressed, they can look kind of dodgy when they are compressed again, so having access to the source images is useful.
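One hedged way to handle that, assuming Pillow is available, is to group the source images by format first so photographic JPEGs and palette-style GIFs/PNGs never end up in the same sheet:

from collections import defaultdict
from PIL import Image  # pip install Pillow


def group_by_format(paths):
    """Bucket image paths by their on-disk format (PNG, JPEG, GIF, ...)."""
    groups = defaultdict(list)
    for p in paths:
        with Image.open(p) as im:
            groups[im.format].append(p)
    return groups

# Each bucket can then go to its own sprite sheet: a lossless PNG sheet
# for the GIF/PNG graphics, and a separate JPEG sheet for the photos,
# so already-compressed JPEGs aren't recompressed alongside flat icons.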
You can see what I mean by a slow-loading site with 100 or so images at the current online version of pretendpaper.com ... but then again, you've probably seen a slow-loading site before - so nothing new.