July 01, 2012


Writing about web page http://www.arm.com

Starting a new job tomorrow. Trepidatious. Accommodation not ready, must juggle. There's a plan, but that's not important. No idea of exact job, number of interns, or whether I'll get fed.

January 15, 2012


Writing about web page http://www.dailymail.co.uk/news/article-2086771/Richard-Wilson-Dispatches-How-humans-replaced-machines.html

Richard Wilson's (aka Victor Meldrew out of One Foot in the Grave) conclusions about automated phone-lines and self-checkouts are interesting but flawed. It's not automation which doesn't help us - it's automation done badly.

December 07, 2011

Setting up a comfortable environment for hacking PostgreSQL

  1. Create a new user account on your computer.
  2. Do a git clone of the master source tree, i.e.
    git clone git://git.postgresql.org/git/postgresql.git
  3. While that does its thing, open up .bashrc and add some exports:
    export CFLAGS='-g'
    export PATH=$HOME/bin:$PATH
    export BINDIR=$HOME/bin
    export DOCDIR=$HOME/share/doc/postgresql
    export HTMLDIR=$HOME/share/doc/postgresql
    export INCLUDEDIR=$HOME/include
    export PKGINCLUDEDIR=$HOME/include/postgresql
    export LIBDIR=$HOME/lib
    export PKGLIBDIR=$HOME/lib/postgresql
    export LOCALEDIR=$HOME/share/locale
    export MANDIR=$HOME/share/man
    export SHAREDIR=$HOME/share/postgresql
    export SYSCONFDIR=$HOME/etc/postgresql
    export PGXS=$HOME/lib/postgresql/pgxs/src/makefiles/pgxs.mk
    export CONFIGURE="--prefix=$HOME CFLAGS=-g"
    These make the build process easy.
  4. Once git has finished cloning, cd into the ~/postgresql directory and run git checkout -b [branch_name]. You end up committing into this branch and diffing it against the master one.
  5. Then configure and build: ./configure, then make, then make install.
  6. The installation should have generated lots of new directories, including bin. Now cd into bin and run initdb. The output should look something like:
    postdev@linux-e6ux:~/bin> initdb
    The files belonging to this database system will be owned by user "postdev".
    This user must also own the server process.
    The database cluster will be initialized with locale en_GB.UTF-8.
    The default database encoding has accordingly been set to UTF8.
    The default text search configuration will be set to "english".
    creating directory /home/postdev/PGDATA ... ok
    creating subdirectories ... ok
    selecting default max_connections ... 100
    selecting default shared_buffers ... 32MB
    creating configuration files ... ok
    creating template1 database in /home/postdev/PGDATA/base/1 ... ok
    initializing pg_authid ... ok
    initializing dependencies ... ok
    creating system views ... ok
    loading system objects' descriptions ... ok
    creating collations ... ok
    creating conversions ... ok
    creating dictionaries ... ok
    setting privileges on built-in objects ... ok
    creating information schema ... ok
    loading PL/pgSQL server-side language ... ok
    vacuuming database template1 ... ok
    copying template1 to template0 ... ok
    copying template1 to postgres ... ok
    WARNING: enabling "trust" authentication for local connections
    You can change this by editing pg_hba.conf or using the -A option the
    next time you run initdb.
    Success. You can now start the database server using:
        postgres -D /home/postdev/PGDATA
    or
        pg_ctl -D /home/postdev/PGDATA -l logfile start
  7. It's time to start the server. Open a new terminal and execute postgres.
    postdev@linux-e6ux:~/bin> postgres
    LOG:  database system was shut down at 2011-12-07 00:49:47 GMT
    LOG:  database system is ready to accept connections
    LOG:  autovacuum launcher started
  8. Then create a default db by typing createdb [user_name] .
    postdev@linux-e6ux:~/bin> createdb postdev
  9. Type psql to verify that the installation is working.
    postdev@linux-e6ux:~/bin> psql
    psql (9.2devel)
    Type "help" for help.
    postdev=# \dt

November 23, 2011


Writing about web page http://www.zdnet.com/blog/hardware/your-thanksgiving-mission-help-eradicate-internet-explorer-6/16505?tag=content;feature-roto

I’ve given you your Thanksgiving survival kit, now your mission, should you decide to accept it, is to help eradicate Internet Explorer 6 from US soil.

Are we still on this topic? Really? I don’t understand the zealous commandments issued by “web-people”, such as Adrian Kingsley-Hughes calling for the elimination of a piece of software still used by 7.9% of the world’s Internet users. What’s more, I understand why he, writing on a ZDNet sub-blog called Hardware 2.0, is so preachy about IE6, and how terrible it is, and why it should just curl up in a corner and die.

I’m fairly sure that IE6 is lurking on a lot of home machines … the sorts of machines that your parents or grandparents might be using.

And what’s wrong with that? Nothing.

Check all those XP machines for IE6. If you find it, get rid of it. You won’t be able to upgrade the system to IE9 (because it won’t install on XP) so you’re going to have to look elsewhere. My preference would be Chrome (it auto-updates, so hopefully you won’t have to do it next year!), but go with what you think is best.

Go with what I think is best. What I think is best is not arbitrarily telling people what they can and cannot do with their machine. If it ain’t broke, don’t fix it. When it comes to my relatives, I have a zero-intervention policy - intervention results in an inevitable string of phone calls saying their computer’s gone all weird and they can’t find the Internet button.

Not only are there old versions of IE still in use out there, but also old versions of Firefox and Opera (Chrome updates itself nicely so there’s little need to worry … although it might be worth checking just to make sure it’s working).


Again – why? Why should I trouble them with upgrading? Upgrading only causes stress and disorientation, and usually for no discernible benefit. Do I lament that the DCS machines run Firefox 3.6? Of course not, I would gain nothing if they pushed out the latest Firefox nightly each day – Facebook and the Outlook Web App run just fine. The people who run IE6 don’t care about security, or high-performance Javascript, or WebGL or whatever fancy soon-to-be-broken semi-openly-proprietarily spotty-soon-to-be-supported-in-several-different-flavors-in-several-different-browsers things like Dart or WebM or WebSockets. They care about having a simple, consistent and familiar computing environment, free from troubles. Updating (even auto-updating) introduces complication where none need exist: if they open their browser and Chrome shuffles the interface around again, they feel frustrated and their already-tenuous grip on a computer feels threatened.

And this is before we get to the issue of whether evolving “web standards” even are a good thing or a bad thing. They might put a CD full of photographs in their computer – perhaps one of the PhotoCDs from Boots – and it has an HTML menu. If the HTML menu doesn’t render properly because of the bizarre focus on “standards” and “HTML purity” pushed by the eliminators, then they can’t see the photos. This is not something that can be fixed either: that HTML page is final, statically generated and burned onto a CD. They can’t see your baby photos, and that makes grandma sad. It might even use – gasp – Flash, and if you read this website and took the steps they describe as necessary for the “progress of the web”, they wouldn’t be able to see it either!

  Together, we can eradicate IE6!

If the new is genuinely better than the old, the IE6 “problem” should resolve itself, with no action on my part. No battle-cry, no call to arms, will make me change my non-upgrading stance. Ultimately, IE6 works as well today as it did when it was released, and for everyone who doesn’t care about Websockets, that’s good enough.

August 14, 2011

Haskell, First Thoughts

Over the past few days I’ve been attempting to familiarize myself with the bits of Haskell I find interesting, and my initial impression of the language has not been good. So far, my brain has been battered with:

1. documentation so impenetrable that industrial diamonds wither in shame;

2. performance so dire it would make the computing capabilities of a snail look stellar;

3. an obsession with purity so intense it should be considered flammable; and

4. a gulf between what I think it will do and what it does so vast that whole live animals could fit in there.

Terrible Documentation

Haskell’s documentation is terrible. From the Haskell Wiki:

Monads in Haskell can be thought of as composable computation descriptions.

Bonk. I’m out at the first sentence. “Can be thought of as?” Surely there can be no other way to think about them than the way encoded into the compiler? And what the hell is a “computation description?” Assuming that the next bit of the article must expand this original definition, I pick my head up off the table and read on:

The essence of monad is thus separation of composition timeline from the composed computation's execution timeline, as well as the ability of computation to implicitly carry extra data as pertaining to the computation itself in addition to its one (hence the name) output.

I’m out again. “The essence of a monad?” “Separation of composition timeline from the composed computation’s execution timeline?” That last thing? The fact is, the two opening sentences of that article are loose and meaningless sequences of words to anyone who doesn’t understand what a monad is – yet that’s precisely why they’ve gone to the monad page in the first place! How does Wikipedia explain what a monad is?

In functional programming, a monad is a programming structure that represents computations.

Good start. Instantly, my knowledge of what a monad is has gone from nothing, to something. It keeps getting clearer too:

Monads are a kind of abstract data type constructor that encapsulate program logic instead of data [...]. A defined monad allows the programmer to chain actions together to build a pipeline to process data in various steps, in which each action is decorated with additional processing rules provided by the monad. Programs written in functional style can make use of monads to structure procedures that include sequenced operations, or to define some arbitrary control flows (like handling concurrency, continuations, side effects such as input/output, or exceptions).

In the first paragraph, the Wikipedia article instantly explains:

1. what a monad is,

2. what it does,

3. how languages like Haskell are likely to use them.

Wikipedia wins.
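And Wikipedia’s description maps straight onto code. As a quick sketch of “chaining actions into a pipeline” (my own toy example using the built-in Maybe monad; halve and pipeline are names I’ve made up):

```haskell
-- Each step may fail (Nothing); the Maybe monad threads failure
-- through the pipeline, so no step has to check its predecessor.
halve :: Int -> Maybe Int
halve n = if even n then Just (n `div` 2) else Nothing

-- 12 -> 6 -> 3, then 3 is odd, so the whole pipeline yields Nothing
pipeline :: Int -> Maybe Int
pipeline n = halve n >>= halve >>= halve
```

In GHCi, pipeline 8 gives Just 1 while pipeline 12 gives Nothing: one definition of halve, and the monad supplies the plumbing.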

Dire performance

As a test, I wrote both Haskell and C versions of a program which sums integers passed in from standard input, and tested each with one million integers randomly generated from /dev/urandom. On my 2007 Dell Pentium Dual, the Haskell version took 12 seconds to complete the operation versus 0.6 for the C program, so the C program is already 20 times faster. Worse, that’s a Haskell best case: compiled and optimized in advance by ghc. If you’re stuck with runhugs in the department, you’d be lucky to get it done in 20 minutes. Fortunately, the Haskell documentation claims excellent multiprocessing capabilities – with uniprocessing performance that dire, they’d better be.
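For reference, a minimal Haskell version of that summing program (a sketch, not necessarily the exact code benchmarked above) looks something like:

```haskell
-- Read integers, one per line, from standard input and print their sum.
sumInput :: String -> Integer
sumInput = sum . map read . lines

main :: IO ()
main = getContents >>= print . sumInput
```

(The lazy getContents/String pipeline is itself a big part of why naive Haskell IO is slow; Data.ByteString is the usual remedy, but that’s beside the point here.)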

Triple filtered for an irritating outcome

Haskell’s obsession with purity is one which I find perverse. In Haskell, the long and short of it is that functions can only return values, and can’t alter the overall state of the program. As a result, IO operations can’t be considered “pure” because once a string’s been read from standard input – “slurrp” – it’s gone! Haskell gets around this somehow using these things called monads.

While Haskell forces you to deal with these poorly-explained monads just to do IO, it makes a concession towards n00bs like me in the form of the do keyword. But do actually makes things worse, because code in a do-block merely appears to be procedural: although the sequence of operations is written out explicitly, the do keyword is really just syntactic sugar for something which chains together monadic expressions. Hmmm.
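To illustrate the sugar (a hand-written sketch; greet and addM are my own toy names), here's a do-block next to what it desugars to:

```haskell
-- do-notation: looks procedural...
greet :: IO ()
greet = do
  name <- getLine
  putStrLn ("Hello, " ++ name)

-- ...but desugars to a chain of monadic binds:
greet' :: IO ()
greet' = getLine >>= \name -> putStrLn ("Hello, " ++ name)

-- and the same sugar works for any monad, not just IO:
addM :: Maybe Int -> Maybe Int -> Maybe Int
addM mx my = do            -- i.e. mx >>= \x -> my >>= \y -> return (x + y)
  x <- mx
  y <- my
  return (x + y)
```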

The Conclusion

It appears that I’m criticizing Haskell for having no functional value whatsoever (what with it being slow and poorly documented), and indeed I can’t name any programs that have gained much from being written in it. So why am I going to persevere with it, publish my notes and take CS256? Easy!

I’m not a careers person[i], nor do I read ZDNET whitepapers about how “big data” allows the Emperor to “apply complex aggregate processing and data-mining techniques to real-time crowd-sourced multiplexed information” to define his fashion stratagem. But Haskell told me that I’m stuck in a procedural mindset, where all I can think about are loops – yet Haskell is so high-level it abstracts away looping: there simply is no way to express it. It’s just a detail – exactly like the assembly code generated from a C program. It must be this ability to eliminate details, like the necessity to loop, which gives it applications. But while the concept of functional programming is interesting, the idea of doing it in Haskell isn’t appealing at the moment – all these problems reared their heads after finishing just one Haskell program.[ii]

[i] Seriously, ask me what I want to be doing in two years, I’ll change the subject.

[ii] Well, two if you count the version which ran out of stack space.

May 31, 2011


Writing about web page http://www.youtube.com/watch?v=oCqpLZzx_-U&NR=1

After using computers every day throughout my life, I admit that I forgot - or at least never thought about - their amazing ability to change the world for the better. While revision for my first-year modules is tough and boring, I think I might find it a little easier after watching this.

May 05, 2011


[TAKES DAMAGE]. But wait -- why would you do this to one of your own kind? [TAKES FURTHER DAMAGE.]

May 03, 2011


Wondered how to plot lots of inequalities.

These Mathematica functions let you combine lots of inequality plots (each is plotted in a different colour for clarity).

randomCol[] := RGBColor[RandomReal[], RandomReal[], RandomReal[], 0.5]

plotSet[g_, xvars_, yvars_] := Table[
  RegionPlot[g[[i]], xvars, yvars, Mesh -> 10,
    MeshShading -> {{randomCol[]}}],
  {i, 1, Length[g]}]
It can then be called like:

Show[plotSet[{x + 2 y <= 6, 2 x + y <= 8, y <= 1.9, -x + y <= 1}, {x, 0, 5}, {y, 0, 5}]]

The result looks something like this:

Plot using plotSet function.

May 02, 2011


It's bank holiday Monday. Both undergraduate printers are out of toner. To the bat-maths-department-cave!

csukak@host-233-144:~> lpq
ma_a001_iprint is ready
no entries

Excellent. Now to get the PDF to print over here.

csukak@host-233-144:~> ssh csukak@joshua.warwick.ac.uk 'cat draft.pdf' | lpr -Pma_a001_iprint

April 18, 2011


While we have this break from Uni, I got called in to maintain an application I wrote last year. It's all in C#, .NET 4.0, WPF - very modern - yet it falls over all the time. Having (hopefully) fixed some of the most critical bugs, I walk through the code and find stuff like this:

        public static bool ExistsInStore(Guid Data) {
            var Query = from f in Context.Formulations where f.FIdentifier == Data select f;
            if (Query.Count() == 1) return true;
            else return false;
        }
The idea of this code is to see if a record exists in the database based on its identifier. In SQL you might do this using "SELECT Count(FIdentifier) FROM Formulations WHERE FIdentifier = some value", but the application doesn't use any SQL - only LINQ.

Not to bash LINQ: using it cut my debugging time in half (despite the number of bugs that made it through, I did spend a long time debugging the damn thing). But I suspect what it's doing here is iterating over a collection of objects returned by the Entity Framework (Microsoft's new ORM framework built into .NET), where each row has to be read from the database and materialized into a full object - and I don't actually need the whole object, just a count. So, figuring six weeks of work had driven me slightly crazy, I rewrote it.

        public static bool ExistsInStore(Guid Data) {
            var query = Context.Formulations.Count(n => n.FIdentifier == Data);
            return query == 1;
        }
Question: how can you tell if the new version works the same as the old version? Er...

While it might seem that the new code would produce exactly the same result as the old, I don't actually know that. Blanket-optimizing the entire thing might silently introduce data-integrity errors, or remove a bug that the rest of the application logic depends on. So I add some debugging code:

        public static bool ExistsInStore(Guid Data) {
            bool oldR, newR;

            var Query = from f in Context.Formulations where f.FIdentifier == Data select f;
            if (Query.Count() == 1) oldR = true;
            else oldR = false;

            var query = Context.Formulations.Count(n => n.FIdentifier == Data);
            newR = query == 1;

            Debug.Assert(newR == oldR);
            return oldR;
        }

I can then open the call stack of the function - which tells me where it's called from - and retest the application. Lo and behold - it works! But I still don't know whether the optimization optimizes anything at all, because I don't have a battery of tests for a before-and-after style showdown.

Long and short of it - I've discovered that optimization is hard unless you write tests, and the time you spent writing tests could be spent optimizing, which can't really be done unless you write tests, which takes time away from optimization which can't really be done unless you write tests. RECURSE YOUR WAY INTO MADNESS.
