Thursday, February 25, 2010

uh0h, I made a logo. What do you think?

Made a quick little logo for a website called...

Tried a few different versions... showed my girlfriend a few, and stuck with this one in the end. Made it with the simple DejaVu Sans Mono Bold font. Yay for the DejaVu font project. Modified the zero (0) a bit, then inverted the colors (yay! negative space). Played around with the letter spacing... applied an old school Gaussian blur filter to the right side, resized for web... and done!

What do you think?

The brief was 'something quick for yet-another-culture blog/zine called uh0h... as in uh oh, I dropped a hammer on my foot'.

update: Added some results in the image below, which is updated with a pygame/freetype script. The idea is that the results in the image can be updated without the rss feed being updated. I'm hoping to clean the script up and add it to pywebsite. The same technique seems to be used by some wordpress blogs for their 'comments: 3' image links - so I think it will be useful for all sorts of things shown on blogs where frequent updates are needed without having to update the post.

Wednesday, February 24, 2010

svn merging is easy...

Subversion (svn) merging is easy... iff you are using a modern svn version (1.6.x).
Here it is in short:

$ cd /dir/of/your-branch
$ svn merge ^/trunk
Where ^/trunk is the URL of the branch you want to merge from - the ^ is shorthand for the repository root. For more details, have a look at the basic merging section of the svn book.

Also this article on svn merging explains it fairly well.

The svn 1.6 release notes and 1.5 release notes also talk about various svn updates including merge enhancements amongst other goodies (like improved python bindings).

Really, merging is not too bad in subversion now. I know this post won't stop everyone from using 2004-era reasons when arguing over version control systems... but whatever.

Ok, bzr, hg, and git all have lots of nice features - but merging in svn is pretty easy now, so please bash(zsh) it for other reasons if you must.

Now, let's move back to the vi VS emacs argument ;)

Friday, February 19, 2010

The secret to my web development productivity...

Can you see the secret to my web development productivity in this photo?

No it's not the red cup of coffee.

It's not the pieces of sticky tape on my laptop.

Follow the cable from my laptop...

and have a look under the desk.


... I'll wait whilst you have a look before telling you the answer.


That's right!!!


It's a joypad.

Using python and pygame, I've made a little app which listens for joystick events, and then deploys the website I'm working on.

With a mash, kick or prod of my foot, up goes my website. Deployed.

Deploy is just a fancy word for 'upload my website, do database migrations, restart app servers, run tests, rollback if there are failures... etc'.
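This isn't my actual script, but a minimal sketch of the idea. The deploy command here is just a placeholder echo - in practice it would be the real upload/migrate/restart script:

```python
import subprocess

# placeholder deploy command; swap in your real upload/migrate/restart script
DEPLOY_CMD = ["echo", "deploying website..."]


def deploy():
    """Run the deploy command, returning its exit status."""
    return subprocess.call(DEPLOY_CMD)


def main():
    # pygame is imported lazily so deploy() works without it installed
    import pygame
    pygame.init()
    pygame.joystick.init()
    pad = pygame.joystick.Joystick(0)
    pad.init()
    while True:
        for event in pygame.event.get():
            # any button press (a mash, kick or prod) triggers a deploy
            if event.type == pygame.JOYBUTTONDOWN:
                deploy()
        pygame.time.wait(100)


if __name__ == "__main__":
    main()
```

That's the whole trick: the joystick event loop just sits there waiting for a stomp.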

For 5 British pounds, $7.80 USD, or 5.74 euros, you can get one of these joypads delivered to most places these days.

It's giving me too much pleasure pressing it every time I want to upload a new version of the website. Most live updates EVER today - and I have the joypad to thank for it.

The joypad is the secret to my web development productivity. Please don't tell anyone.

update: here's a slightly cleaned up, and slightly silly version of 'Joy to Deploy'...

Check it out using bzr revision control:
    bzr co lp:joytodeploy
Or you can view the source code: deployment_program

for example: with great justice echo 'ninjas are better than pirates'

Thursday, February 18, 2010

Genshi templates - header/footer templates, and including templates.

How do you include a Genshi template inside of another Genshi template?

Use "py:match" inside the template you want included into other pages (eg. a sitelayout.html).

Then you can use "xi:include" in the pages you want the file included in (eg. best_picture_of_my_cat_today.html).

Now, it's not a "simply include this template here" kind of thing. The py:match directives find various parts of the including page, and then transform them as they like. For example, a page layout template will match the header and footer with "py:match".
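As a rough sketch of the two halves - the file names and markup here are made up for illustration, following the layout pattern from the Genshi docs:

```xml
<!-- sitelayout.html: the template to be included, using py:match -->
<html xmlns:py="http://genshi.edgewall.org/" py:strip="">
  <body py:match="body" py:attrs="select('@*')">
    <div id="header">my site</div>
    ${select('*|text()')}
    <div id="footer">(c) me</div>
  </body>
</html>

<!-- best_picture_of_my_cat_today.html: pulls the layout in with xi:include -->
<html xmlns:xi="http://www.w3.org/2001/XInclude">
  <xi:include href="sitelayout.html" />
  <body>
    <p>best picture of my cat today</p>
  </body>
</html>
```

The py:match in the layout grabs the page's body, wraps it in the header and footer, and ${select('*|text()')} pastes the original content back in the middle.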

It is best explained with examples, and in the documentation:
  • pylons genshi example
  • Includes section of the genshi documentation where it explains "xi:include".
  • Genshi documentation where it explains py:match.

Hopefully this explains the genshi way of including a template into a template.

Wednesday, February 17, 2010

My fling with engine X.

Projects that have X in the name are cool. Engines are cool (especially steam powered ones). Spelling engine 'ngin' gains 17.5 l33t points too. All combined, this makes the name nginx super schwet!

I've been using nginx for a while now, and have found it quite good so far. I've been moving a couple of websites to it, and fooling around with it a lot.

So far I've been able to use it for everything I've tried. Some of my apache configs are fairly long... so that is saying quite a bit. On my low memory (512MB) server it has saved quite a bit of memory - even though I've only moved over a couple of websites. Along with the cherrypy memory reduction work I did recently, my server has a bit more room to breathe... (and for me to waste memory on hosting other websites! ya!).

Nginx has a good reputation for being rock solid - so I hope that holds true for me. Then perhaps I can get rid of apache completely (on this one server). I've tried to replace apache with other web servers before... but I always come up with a reason to move back. Either some application uses a feature the other server does not support, or the other server is just not as robust as apache. I don't like fixing, or looking at, servers... I just want them to work without hassle. I'm not afraid of putting in the work to learn a new webserver... it's just that some webservers are too high maintenance.

Fastcgi is one way nginx allows you to host php and python websites. Nginx can also be used as a reverse proxy server. Personally I like to host python websites with cherrypy and a reverse proxy, and use fastcgi with php. The nginx proxy_pass configuration seems to work quite well... as does its simple-to-setup load balancing.
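A rough sketch of a config along those lines - the server name, ports and paths here are made up:

```nginx
# two cherrypy app servers behind nginx; requests are balanced between them
upstream app_servers {
    server 127.0.0.1:8080;
    server 127.0.0.1:8081;
}

server {
    listen 80;
    server_name example.com;

    # python site: reverse proxy through to cherrypy
    location / {
        proxy_pass http://app_servers;
        proxy_set_header Host $host;
    }

    # php via fastcgi (eg php-fpm listening on port 9000)
    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_pass 127.0.0.1:9000;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    }
}
```

That's really all the proxy_pass setup amounts to for a simple site.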

Like all good software, I love to check out how it was made. Reading the nginx source code is a breath of fresh air. Despite Igor (the main author) being Russian, the code is written in English C (with tiny smatterings of asm/perl)... with very few comments. It is a modular, and very clean code base. It doesn't seem to have any unittests... but it's still quality software.

Our relationship is still a fling really. Before I invite nginx to meet my parents, I'll give it a few more months. Ok, my love letter to nginx is done now.

Thursday, February 04, 2010

python - unifying c types from different packages.

Python already has a number of objects to represent c types. However, there is a need to improve interoperability between systems using these c types. Below I explain that need, discuss existing efforts to address it, and then look at ways to transparently translate between the various type systems without each system needing to know about the others.

In the ctypes library you can represent an unsigned 32-bit integer with ctypes.c_uint32.

In the array and struct modules there are different type codes. For example, 'L' represents unsigned long - a minimum of 4 bytes on 32-bit systems and 8 on many 64-bit systems.

numpy, cython, pyopengl and other python extensions have their own types representing c types too. Most extensions which link up to statically typed languages represent basic c types to python in some way.

Not only libraries, but various compilers and translation tools use c types as well - for example tinypyC++, cython, swig, etc. Type inference is also done by tools like shedskin and rpython - but they represent types internally with their own type objects.

Standardising on one set of c type objects or string codes would give some compatibility advantages. However, that would be hard to do for backwards compatibility reasons. A mapping between the various types should provide most of the advantages. For example, translating from a ctypes type to a numpy type should be fairly simple.

Here you can see that numpy, ctypes and the python array module already have integration:

>>> import numpy, cython, ctypes, OpenGL.GL

>>> numpy.array([1,2,3,4.2], ctypes.c_uint32)
array([1, 2, 3, 4], dtype=uint32)

>>> numpy.array([1,2,3,4.2], numpy.uint32)
array([1, 2, 3, 4], dtype=uint32)

>>> numpy.array([1,2,3,4.2], 'L')
array([1, 2, 3, 4], dtype=uint32)

>>> numpy.array([1,2,3,4.2], OpenGL.GL.GLuint)
array([1, 2, 3, 4], dtype=uint32)

>>> numpy.array([1,2,3,4.2], cython.uint)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: data type not understood

Pretty cool hey? Numpy already knows about many of the type variables available in the python ecosystem - with the notable exception of cython.

I think there is a need to try and standardise the use of c type variables - so that more code can interoperate without each system needing to know about every other system's type objects. Alternatively, a translation layer can be put in place.

For example, an adaptor something like this:
# this registers two types which are the same.
>>> type_registry.register_types(numpy.uint32,
...                              cython.uint)

# here numpy.array does not know about cython directly,
# but can look at the registered type we just did to get it from there.
>>> numpy.array([1,2,3,4.2], cython.uint)
array([1, 2, 3, 4], dtype=uint32)

# if numpy does not know about the adaptor registry then we can still
# use the registry, if only in a more ugly - non transparent - way
# by calling a translate function directly:
>>> numpy.array([1,2,3,4.2],
...             type_registry.translate(cython.uint))
array([1, 2, 3, 4], dtype=uint32)
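A type_registry like that doesn't exist yet, but a minimal stdlib-only sketch is easy. Here the array module's 'L' code stands in for a third-party type object, since numpy and cython aren't needed to show the idea:

```python
import ctypes


class TypeRegistry(object):
    """Maps 'foreign' type objects to a canonical type object."""

    def __init__(self):
        self._types = {}

    def register_types(self, canonical, *aliases):
        # register one or more aliases as equivalent to the canonical type
        for alias in aliases:
            self._types[alias] = canonical

    def translate(self, type_obj):
        # unknown (or already canonical) types pass straight through
        return self._types.get(type_obj, type_obj)


type_registry = TypeRegistry()

# pretend the array module's 'L' code is a third-party type object
type_registry.register_types(ctypes.c_uint32, 'L')

translated = type_registry.translate('L')               # -> ctypes.c_uint32
passthrough = type_registry.translate(ctypes.c_double)  # unchanged
```

A package like numpy could consult a shared registry like this before giving up on an unknown type object.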

Instead of an adaptor, a magic variable could be used which would contain the 'standard c type variable' from python. For example - cython.uint.__ctype__ == ctypes.c_uint32. Then numpy could look for a __ctype__ variable and use that - without having to be extended for every system that is made. One problem with a magic variable over registered types is that some python objects can not have those magic variables assigned. For example, try adding a __ctype__ variable to an int instance - it won't work.
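A sketch of that lookup, with a made-up stand-in class since cython.uint obviously doesn't carry __ctype__ today:

```python
import ctypes


class FakeCythonUint(object):
    # hypothetical stand-in for a third-party type object (eg cython.uint)
    # carrying the proposed magic variable
    __ctype__ = ctypes.c_uint32


def to_ctype(type_obj):
    # use the magic variable when present, otherwise assume the object
    # is already a standard c type
    return getattr(type_obj, '__ctype__', type_obj)


uint_type = to_ctype(FakeCythonUint)  # -> ctypes.c_uint32

# ...but instances of builtin types can't have the magic variable assigned:
could_assign = True
try:
    five = 5
    five.__ctype__ = ctypes.c_uint32
except AttributeError:
    could_assign = False
```

The AttributeError at the end is the problem mentioned above - builtin instances simply refuse the assignment.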

Either the adaptor, or the magic variable would let cython - and other systems use their own type objects and still have a way to translate the types to the standard python c type variables (when/if they are chosen).

A simple mapping (with a dict) from a package to the standard c type objects/type codes is another method that could be used. This would allow a package to fairly easily hook into the ecosystem. For example, cython could have a __c_type_mappings__ magic variable at the top level of its package. Then another package looking to translate the type could look to the package for this __c_type_mappings__ variable. The advantage of this is that many times variables can be injected into a package but not into extension types in the package. On the other hand this feels icky.
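A sketch of that, using a fake module object in place of a real package:

```python
import ctypes
import types

# hypothetical: a fake third-party package exposing the proposed
# __c_type_mappings__ variable at its top level
fake_cython = types.ModuleType('fake_cython')
fake_cython.uint = object()  # stand-in for cython.uint
fake_cython.__c_type_mappings__ = {fake_cython.uint: ctypes.c_uint32}


def translate(type_obj, package):
    """Translate a type object via its package's mapping, if it has one."""
    mapping = getattr(package, '__c_type_mappings__', {})
    return mapping.get(type_obj, type_obj)


translated = translate(fake_cython.uint, fake_cython)  # -> ctypes.c_uint32
```

Because the mapping lives on the module rather than on the type objects themselves, it works even for extension types that refuse attribute assignment.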

The c types from the ctypes package seem to be a fairly good choice for this. The PyOpenGL 3.x series uses ctypes types - eg, OpenGL.GL.GLuint == ctypes.c_uint32. Except ctypes is a fairly big dependency just for a few types.

The buffer protocol pep 3118, being introduced into python/numpy to allow buffer sharing between libraries, is a similar use case. However it involves sharing instances of differently typed buffers - and has quite clever semantics for many use cases. The formats from that pep could probably also be used to share type information.

The buffer protocol pep specifies extra format strings over the ones in the python array module, so as to describe a more complete set of types and memory layouts. So rather than using the ctypes types, it probably makes sense to use the new buffer protocol format codes specified in pep 3118 - as they are just strings without any further dependencies on the rest of the ctypes machinery (eg libffi etc). Of course, if you are using ctypes already - then depending on it is not a problem.
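These format codes are already visible in the stdlib through memoryview, which is part of what makes the plain strings attractive:

```python
import array

# an array of unsigned ints describes itself through the buffer protocol
a = array.array('I', [1, 2, 3, 4])
m = memoryview(a)

fmt = m.format     # a struct-style format code, here 'I'
size = m.itemsize  # bytes per element, matches the array's itemsize
```

Any consumer of the buffer can read the element type straight off the buffer itself, with no extra type objects involved.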

Of course ctypes.c_uint32 is more descriptive than 'L' to many people, so the format codes (eg 'L') should just be used for specification. People should still use their own type objects - but provide a translation to the format codes specified in pep 3118, the new buffer protocol.

The codes specified in pep 3118 will probably need to be expanded as more c types need to be described. For example, bit fields and bit depths of types are not described in the pep. Many systems specify the bit depth of the type - numpy, ctypes, opengl, etc. For example they use 'uint32' rather than 'unsigned int'. Also, bit fields are becoming more common in C, so they should be added to the type code formats in some way too.

In conclusion, there is a need for interoperability of c types from various python extensions, libraries and python compilers. The pep 3118 format codes and ctypes types are good candidates to work with as standard c type objects/codes. Adaptor registries, simple mappings, and/or magic variable names could be used to enhance interoperability.