
Showing posts from December, 2006

python map vs C map.

Ages ago I made some python implementations of map.

Map allows you to call a function on every element of a sequence.
[2,3,4] == map(lambda x:x+1, [1,2,3])

One implementation I made used threads, which made threaded programming easier for me.
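A threaded map of that kind might look like the sketch below (the name `threaded_map` is hypothetical; the real code is in maps.py, linked further down). One thread per element is fine for I/O-bound work, though CPython's GIL limits CPU-bound speedups:

```python
import threading

def threaded_map(func, seq):
    # Run func over each element in its own thread, collecting
    # results by index so ordering matches the input sequence.
    results = [None] * len(seq)

    def worker(i, item):
        results[i] = func(item)

    threads = [threading.Thread(target=worker, args=(i, item))
               for i, item in enumerate(seq)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```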

However the interesting thing was comparing the speed of the builtin map to the map function accelerated by psyco.

It turns out that for a whole bunch of cases the psyco version of map is faster than the CPython version.
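The kind of map that psyco accelerates is just a plain Python loop, something like this sketch (`my_map` is a hypothetical name; see maps.py below for the real code). `psyco.bind` is wrapped in a try/except so the code still runs where psyco isn't installed:

```python
def my_map(func, seq):
    # A plain loop-based map; psyco specialises and compiles
    # simple loops like this, which is where the speedup comes from.
    result = []
    for item in seq:
        result.append(func(item))
    return result

try:
    import psyco  # Python 2.x only; skipped if unavailable
    psyco.bind(my_map)
except ImportError:
    pass
```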

Here's some unittests and the map function which show the speed differences.
http://rene.f0o.com/~rene/maps/map_unittests.py
http://rene.f0o.com/~rene/maps/maps.py

It shows how nicely loopy code can be sped up with psyco.





Written by a Melbourne web developer. Available for your projects - python, php, mysql, e commerce, javascript, CMS, css, flash, actionscript, games, postgresql, xml.

Galcon released.

Risk with spaceships: Galcon is kind of like the game Risk, but with spaceships.

The most fun can be had playing multiplayer on the internet with others.

Download the windows version


Download the MacOSX version

A linux version is also in the works. It should be up on the site within a week or so on the Downloads page.


Galcon was made using Python, pygame and PGU. It uses some C extension modules for the graphically intense parts, like the additive blending modes for particle systems and custom fast spaceship drawing code.

It's great to see another quality game being made in python.

The networking uses UDP, so network games hold up quite well. I can even play from here in Australia with people in the USA and elsewhere, even with 400ms ping times.

The single player game is fun too, with a whole bunch of different missions to play through. Different AI bots are used, each giving a different challenge.

There's a small group of Galcon multiplayer people giving the game a whir…

Batching as applied to websites.

I have 10 pages of content. Each of those pages is complex, and will most likely be visited by the person when they arrive.

Some pages have 40 images; there are 130 small images all up. The images don't change that often, so after the first visit they are not downloaded again. With well-specified expires and cache directives sent out by the web server, browsers don't even recheck whether the images have changed on every visit.
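As a sketch, directives like these (Apache mod_expires / mod_headers syntax; the one-month lifetime is an assumed value) tell browsers to reuse cached images without rechecking:

```apache
# Assumes mod_expires and mod_headers are enabled.
ExpiresActive On
ExpiresByType image/jpeg "access plus 1 month"
ExpiresByType image/png  "access plus 1 month"
<FilesMatch "\.(jpg|jpeg|png|gif)$">
    # Same lifetime as above: 2592000 seconds = 30 days.
    Header set Cache-Control "public, max-age=2592000"
</FilesMatch>
```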

However, a browser may only allow four requests at a time, which means it's going to take a while for all 130 images to download.
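A quick back-of-envelope calculation shows why this adds up (the 200ms per-request latency here is an assumed figure, not a measurement from the site):

```python
# Back-of-envelope: requests proceed in "waves" of `parallel` at a time.
images = 130    # small images on the site
parallel = 4    # concurrent requests the browser allows
rtt = 0.2       # assumed per-request latency in seconds

waves = -(-images // parallel)  # ceiling division: 33 waves
total = waves * rtt             # roughly 6.6 seconds of latency alone
```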


The obvious solution is to batch all of the requests into one. Since web browsers are dumb, they don't do this. HTTP was poorly designed in this regard: it doesn't allow batched downloads. E.g. 'GET image1.jpg,image2.jpg,image3.jpg' or something similar is not allowed with HTTP.

Even multiple requests over the same connection don't fully solve this, since there is still latency involved for each file requested.

So one…