Favourite blogs for Connection reset by beer


February 16, 2015

History and Policy Parenting Forum Workshops

Writing about web page http://www2.warwick.ac.uk/fac/arts/history/chm/outreach/parenting/events/workshops/

Looking forward to our second History and Policy Parenting Workshop today. Julie-Marie Strange (Manchester) is leading the session and will talk generally on the subject as well as giving examples from her own research on fatherhood in late Victorian and Edwardian Britain. Details from the session will appear on the blog soon, along with materials from the first workshop on working with ethnic and religious minority families.


September 29, 2014

All Roads Lead to Coventry – continuing the journey

This is a repost of Chris Bilton's entry on the Centre for Cultural Policy Studies blog, 'Culture Matters'.


Start of the walk

All Roads Lead to Coventry was a joint venture organised by Warwick Creative Exchange, Coventry Artspace and Warwick Arts Centre – the aim was to invite universities, arts organisations and council officers to share ideas about the longer term future for the city and the role that arts and culture could play in reimagining that future. Rather than spending a day in meetings, we decided to invite participants to spend a morning walking and talking through Coventry, discovering and rediscovering some of the ‘hidden gems’ which are tucked away across the city (museums, theatres, statues and historic buildings) as well as encountering the everyday experience of a richly varied (but still compact) contemporary city. And so, on a sunny September morning in Coventry, our journey began.

Our walks started from ten locations across the city, converging in the city centre. My own journey took in the village atmosphere of Earlsdon, the statue of Frank Whittle (inventor of the jet engine and a reminder of the city’s heritage as a centre of engineering, innovation and manufacturing) and the Albany Theatre, a beautiful 600-seat theatre run entirely by volunteers with big plans to relaunch itself as a centre for community arts. Like many of the places we visited, the Albany is hidden in plain sight, tucked away in an impressive former college which it shares, rather bizarrely, with a Premier Inn.


Albany Theatre - what next?

Other groups walked along the canal, visited Coventry’s music museum (home of The Specials and a back catalogue of ska and reggae), looked at the new Fargo development or took in the Hillfields neighbourhood.

Coventry is often identified with its history of motor manufacturing and today it is still dominated by the ring road – an impressive feat of engineering and a vital artery for people who live and work in the area – but also a symbolic barrier which seems to cut off the city centre from the diverse neighbourhoods beyond. So it was refreshing to be able to walk the city, meandering through its many histories, guided by people who knew its hidden corners and byways.

All walks led to EGO performance space – here we were served a wholesome lunch, shared stories from the day and tried to connect our experiences of the city into a bigger picture.

It became clear in the conversations during and after the walks that people who know Coventry well enjoy sharing its secret histories and hidden pockets of culture and community. But for outsiders, like many of the academics and students at the city’s two universities (including myself), the city is hard to ‘read’ (and hard to navigate!). One of the challenges we confronted was how to connect the city’s diverse histories and communities into a coherent narrative – how can we sell this complicated, fragmented, reticent city in a world dominated by brash city ‘brands’ and honeypot tourist destinations? Coventry is a city which wears its history lightly – ancient cottages house kebab shops, the old city walls skirt brutal modernist buildings. How can we connect the city’s many pasts and presents into a new future? What part can the arts and culture play in drawing the city together and opening it out to visitors? What are the barriers in the way of what we want the city to become?


These questions will be part of a series of ongoing conversations between artists, academics and council officers and members over the coming months. Following the walk, the organisers are compiling a list of ideas and questions which we hope might lead to further discussions, actions and collaborations. Please feel free to add comments and questions of your own!



September 23, 2014

All Roads Lead to Coventry


Marie-Jeanne Merillet, a Masters student in the Centre for Cultural Policy Studies, University of Warwick, writes about 'All Roads Lead to Coventry': an event initiated by Warwick Creative Exchange, and co-organised with Warwick Arts Centre and Coventry Artspace, to explore how arts and culture can help to re-shape the city for its citizens and visitors. It took place on 16th September.

The day started out with a series of walks through the city, the goal of which was to generate discussion around art and culture and how they could be used strategically in the future to make the city a more engaging environment. Essentially each of the walks aimed to bring together academics, creatives, and council staff. I joined one such group in a walk which began at Daimler Road, wound down along the canal, and from there went to the EGO performance space where the afternoon discussions took place. It was a surprising walk, full of discovery. Having spent an entire year living in Coventry while studying for a Masters at the Centre for Cultural Policy Studies, I realized as the day progressed how little I really know about the city and how rich it is in terms of history and culture. In the course of our morning’s walk we wandered through the Cash buildings,


up Foleshill, to Tower Court, once home to a thriving car industry. Next we descended an obscure flight of stairs to explore Coventry’s waterway, heading in the direction of the canal basin – yet another dimension of the city I was completely unfamiliar with.

Coventry canal bridge

(In the mist the barges would suddenly appear like a monster)

At the end point, we were met by Alan Dyer, artist and Director of the Canal Basin Trust, who took us on a tour of the artists’ studios. Alan explained that the building was carefully restored in the 1970s to look like the original, and that initially they invited their third-year students to come and work there, filling it up with graduates. It has since remained a creative hub of artistic industry. We climbed down tiny staircases and through low doorways. Apparently the warehouse has its own ghost, believed to be a canal worker who plummeted to his death from the top of the staircase. At present the warehouse boasts some free studios and work spaces for creatives. There is a question about making efficient use of spaces that already exist in the city, rather than trying to carve out new ones elsewhere. As intended, the walk left us with many ideas and perceptions about Coventry, and well prepared for the afternoon’s activity.

For the second part of the day, grouped around tables of ten at the EGO performance centre, we discussed Coventry. Themes included nostalgia for the past, a sense of pride in history, and the role of universities in the shaping of the city. One of the issues that arose was that students do not often engage with the city; many do not even realise that the University of Warwick is in fact in Coventry. Having met many students during my studies at Warwick, I know that the majority I spoke with had a very negative impression of Coventry, an impression that could be remedied if they were encouraged to get to know the city better and gain some sense of its cultural richness. Some of the conclusions reached highlighted the universities as being instrumental in shaping students’ perception of the city. This year, for example, I took part in the Dominos project as part of the Coventry Mysteries Festival, which was a wonderful way to engage with both the city and its culture. Perhaps raising students’ awareness of, and interest in, the cultural history of the city could have a long-term positive effect on their involvement in the community, and could be seen as a strategy in itself.


March 17, 2014

Collaborative Working in Practice

RICHARD HAYHOW, Theatre Director, Open Theatre and JO TROWSDALE, Centre for Education Studies, Warwick talk about how their work together led to learning for both partners and significant developments in Richard's theatre work with young people and school teachers, and in turn has led to a symposium, journal articles and a research funding bid.

JO: I was a regular attender of Arts Alive – a festival largely of Coventry-based independent theatre companies – and there saw the work of The Shysters. Richard was their founder and director. Their work was so joyful and playful, and it challenged perceptions about what actors with learning disabilities could achieve. A few years later I was approached to report on a three-year partnership experiment (called ‘Something Wicked’) between the Belgrade Theatre and seven independent theatre companies – the Shysters was one of them. As I witnessed work and interviewed artists, I got insights into the practice and especially into how Richard was enabling these young people to develop as actors and as people, as well as to engage and entertain others.

RICHARD: The ‘Something Wicked’ work gave me the opportunity to reflect on the development of the Shyster practice and in many ways to legitimise it as a valid theatre practice that could sit alongside that of the other theatre companies involved. The approach to creating theatre with the Shysters had always been driven by the desire to make high quality productions: over a period of five years leading up to Something Wicked many ways of working were tried and tested, with some rejected, in order to achieve this. The basic principles of non-verbal physical work created in response to pre-recorded music (acting in effect as the ‘script’) were developed and explored in this period. The young people were able to lead much of the creative process through their intimate engagement with this way of working. An aesthetic for the work began to grow and the actors themselves became highly skilled within this. Perhaps the most significant statement made about the company at the time was: ‘The Shysters are great actors – but they are also great people’. A process that had been developed to create high quality theatre had inadvertently had a transformative effect on the actors themselves.

JO: In 2004 I was appointed Creative Director for Creative Partnerships Coventry and, as I met with Deedmore MLD Special Primary School’s head teacher and heard her explain her ambition for her children, I was reminded of Richard. I bumped into Richard at another Coventry Arts event and he was talking about the desire to develop a longer term relationship with schools, where he might see if the Shyster practice, introduced with younger children, might effect similar changes and give them a better chance in society. I arranged for the two of them to meet – a six-year-long relationship was born between Richard and Deedmore, now joined with an SLD school and renamed Castlewood.

RICHARD: I remember that conversation well. I had learnt two important things from the Shyster process – firstly, the work needed time to achieve its effect, both on the quality of the acting and the theatre created, and on the impact on the people involved; secondly, the younger the person involved, the more significant the impact you could have. I had also worked in special schools at various times before this, but only on one-off projects; perhaps the best comment from one of these schools had been that the project we had developed had not been a ‘dumbed-down version of a mainstream project’ but had been ideally suited to the particular students involved. I was therefore keen to work in a primary special school to explore these ideas further – and as such made it clear from the outset that I would only become involved in Creative Partnerships Coventry if I could work in one school for a significant period of time – at least a year, if not two – and not be forced to compromise the work to fit in with preconceived notions of what an arts project should look like. The work in Deedmore began after a key meeting set up by Creative Partnerships Coventry at the school, in which I made it clear that I was only interested in working in special schools because I valued the creativity of people with learning disabilities above all else!

JO: Richard had a healthy disrespect for the bureaucracy of Creative Partnerships, suggesting that he began working very openly, learning through the practice where to take the work. But he was also keen to engage with the reflective process and welcomed questions. His approach challenged me to re-think the possible frames and prompts we offered teachers and artists in scoping, planning and reflecting on the work in order to deepen value.

RICHARD: I think I had gained that healthy disrespect because of working with people with learning disabilities. After trying many conventional theatre-making processes I’d had to discover and invent new ways of working that got to a deep level of truth in terms of the contribution that the actors could make to a creative process. So within the education system and the bureaucracy of Creative Partnerships I continued that search for truth, in terms of both adapting the practice and gaining a sense of its impact on pupils. The challenge of imposing existing systems on the work forced me to come up with better systems and better articulation of what was really going on. It was great that Jo was open to the challenge, and also that the lead teacher at Deedmore was equally keen to understand and change ways of thinking and doing things when she witnessed the impact of the Shyster practice on her pupils.

JO: Richard built a strong relationship with the lead teacher and both welcomed engagement with the practice. I was regularly invited to witness and get involved with the work. Visits, discussions, evaluation meetings and written reports were littered with evidence and accounts of children whose behaviour had altered positively; who had achieved something new and previously unimaginable: a catalogue of moments of children communicating: eye contact, smiling, talking; instances of caring for each other, playing imaginatively together at playtime, listening and learning. Performance events were emotional affairs with parents (and staff) moved by the transformations they saw. Richard was increasingly committed to this work with young people. During this time I also attended a number of Shyster events and was always aware of how joyful and full of laughter they were. Richard brought out the best in everyone and the dynamic he generated with young people was life affirming.

RICHARD: I was genuinely quite puzzled by the impact of what I was doing on the children involved – although it was very welcome and satisfying to know that it was happening, I wanted to know why. Was it just because I was mostly a joyful and life affirming kind of person and was good at passing that on in sessions, so that to some extent it didn’t matter what I did as long as I did it in that way? Or was it the uniqueness of the practice itself (which included the way it was delivered by a practitioner) that was key to the impact? If it was the second, then the potential for its use could become huge – others could be trained in the practice and many more young people in special schools and elsewhere could benefit from it. So it became even more important to find ways of understanding and sharing the work. At the heart of this was a central conundrum though – how could we explain in words a process that in its essence is beyond words?

JO: Witnessing the children working with Richard and each other was evidently the best way to communicate the impact of the practice, so a film was made and used to share the work at a Drama education conference in Exeter. Preparing for the conference was our first attempt at understanding what was happening and marked my closer involvement with the project: in debating with the team about how to develop understanding and practice across the newly formed school. We focussed upon visual techniques and critical incident analyses…

RICHARD: …which we called ‘significant moments’. We felt that we could share those moments when a particular child made a significant achievement – a quantum leap – within the work, witnessed by the adults in the room. By talking through what led up to this moment it could be possible to understand what combination of current circumstances and previous experiences could have contributed to the significant moment taking place, which in turn could help with our understanding of what the key elements of the practice were. One of the most memorable of these significant moments was when one particular child, after two years of resisting the use of his imagination, gave in to his growing understanding of ‘pretending’ by suddenly juggling with imaginary balls and really enjoying the experience.

JO: By the time Creative Partnerships and the Shysters’ funding was cut, the demand for Richard’s practice from schools in Coventry and Birmingham was established, supported also by Liz Leck at the Birmingham Hippodrome, with whom we planned and delivered a symposium about the practice and its particular relevance to adolescents with learning disabilities transitioning from school. We also began writing about the work: for a research bid to Leverhulme and for journals – one article has been published in the BJSE.

RICHARD: All of these activities – the symposium, the articles and the funding bids were in effect further (welcome) challenges to articulate the work more effectively and to communicate a better understanding of it both to ourselves and to others. They were also partly about a move to articulate the work in a context that was about common human development as opposed to being solely about learning disability and therefore implying a specialised and insular practice of limited use. As we looked at the work through a theatre lens, the term ‘mimetics’ seemed quite an accurate way to describe this embodied practice - and a more useful way of explaining it than 'shystering', which we had been using so far.

JO: We are now looking to develop our understanding of what happens to people through mimetics – how they develop. We have both read a bit about mirror-neuron theory, although there is no consensus about this among neuroscientists. Human development markers, as proposed by psychology, seem most useful, but we really need to find some interested experts in other fields to help us take it further...


February 05, 2014

Creative industries generate £8 million an hour. So states Creative Industries Statistical Report

From Jonnie Turpie, Founder & Director of Maverick TV

Creative Industries Economic Estimates Report

The report came off the press on 13 January 2014. The DCMS report is a detailed analysis of the Creative Industries' growing contribution to UK plc. This is not a new phenomenon! It has been well known for the last 40 or so years, since the emergence of popular culture, that the UK punches above its weight in this sector. The report is accompanied by a supportive statement from the Secretary of State for Culture, Media and Sport, Maria Miller: “These incredible statistics are confirmation that the creative industries consistently punch well above their weight, outperforming all the other main industry sectors, and are a powerhouse within the UK economy.”

The report is based on analysis of the extensive data, and additional consultation, supported by a cross-industry collaboration overseen by the Creative Industries Council, Creative Skillset, Creative and Cultural Skills, Nesta, DCMS and a range of industry bodies.

The headline findings are that:

The UK’s creative industries are worth £71.4 billion per year to the UK economy. The creative industries, which include film, television and music, generate a staggering £8 million an hour (£71.4 billion spread across the 8,760 hours in a year works out at just over £8 million an hour).

Growth of almost 10 per cent also meant that the creative sector outperformed all other sectors of UK industry, accounting for 1.68 million jobs (5.6 per cent of the UK total) in 2012.

The UK creative industries are renowned across the globe; driving growth, investment and tourism.

Creative Industries Council Chair Nicola Mendelsohn said: “These figures amply demonstrate the huge contribution our sector makes to the economy and it’s vital that the right framework is in place to nurture and support the industry. We are working with Government on developing a growth strategy for the sector which will identify how all involved can ensure the creative industries continue to go from strength to strength.”

The report is here: https://www.gov.uk/government/publications/creative-industries-economic-estimates-january-2014


December 10, 2013

Cloud Cannon

Writing about web page http://bigeasyband.co.uk

I was interested to try out Cloud Cannon as a web site development / hosting platform. Using Dropbox as the starting point for your content suits my workflow well, and their extra features - auto-minification, simple editing of DIVs for other people - suit me too. I hope they can make a go of it.


December 05, 2013

A message from one of the founders of Warwick Creative Exchange

What do the arts, especially the performing arts, and education have in common? In my experience creative imagination, collaborative working, fluency and conviction in communication, attention to detail, ability to meet deadlines.

The recognition of this commonality of experience between the theatre and the academy has been a significant feature of Warwick’s teaching and research since its foundation in the 1960s, with the Department of English and the Institute of Education’s Drama and Education courses at the forefront of work in this area, joined by colleagues exploring creativity and performance in their application across the disciplines. So the establishment of the Warwick Creative Exchange in 2011 was the next step on a journey for the University (and for me), following on from the work of the CAPITAL Centre (http://www2.warwick.ac.uk/fac/cross_fac/capital) and more recently WBS Create at Warwick Business School http://www.wbs.ac.uk/research/specialisms/teaching-groups/wbs-create/, with which I have been involved as administrator since 2005.

The Arts and Humanities Research Council’s call for proposals for Knowledge Exchange Hubs for the Creative Economy in April 2011 made us look with fresh eyes at the work going on at Warwick and encouraged us to take the opportunity to formalise the partnerships and projects that had been so fruitfully developed in the region with, for example, Birmingham Royal Ballet, Sante Theatre, Talking Birds, China Plate and the RSC. Although we weren’t selected as an AHRC hub, we didn’t want to lose the momentum we had gathered from the enthusiastic support of arts practitioners and organisations. The most practical way forward to facilitate partnerships and projects to the mutual advantage of all, in terms of knowledge exchange, was to create a more formal network. So the Warwick Creative Exchange (WCE) was established in autumn 2011, with the aim of bringing together Warwick academics and West Midlands cultural organisations to identify and encourage interdisciplinary research and knowledge exchange between the University and the regional cultural community.

What essentially drives all WCE partners, whether academic or cultural, after our core activities and commitments, is funding to support our ideas, dreams, and practical needs. How do we get the support we need to do what we want to do? How do we persuade funders that we are worthy of that support? How do we evaluate what we do so we know – and our funders know – that we have made a difference to the audiences we serve, whether in the theatre, the classroom or the academy? Talking to each other helps to develop ideas and strategies, but I think practical actions help more. Warwick’s expertise in cultural policy, digital innovation, education and training is there to be tapped by our cultural partners, and their expertise in audience development, community engagement and creative interpretation can inform and inspire the University. Having recently retired, I am outside WCE looking in, but I continue to believe that WCE’s success will come from making real work that capitalises on those strengths. I look forward to seeing the results!

Susan Brock

Formerly Administrator, the CAPITAL Centre; Academic Project Manager, WBS Create.


The curiosity of academia together with the creativity of the arts

Birmingham Royal Ballet’s mission statement is to be one of the world’s leading large-scale international dance companies. Achieved through…

1. Promoting outstanding creativity and artistic excellence
2. Being a model of good management practice
3. Serving a global range of constituencies and communities
4. Maximising arts development opportunities
5. Increasing understanding of the cultural and social importance of the arts

To achieve this we are constantly reflecting on our work and developing the learning to strive for excellence in all that we do. Being involved in activities and discussions with Warwick Creative Exchange enables us to bring the rigour of academic interrogation into our work. The curiosity of academia together with the creativity of the arts will enable new and exciting developments to emerge, with circles of influence spreading far and wide. Birmingham Royal Ballet is delighted to be involved in this exciting venture.

Pearl Chesterman

Director for Learning, Birmingham Royal Ballet


July 26, 2013

Shibboleth IdP authentication context class

Yeah, with a blog title like that you’d better get your brain ready.

We recently had an SP whose authentication attempts were failing, and returning no user. We found the difference in the SAML request was that it was requesting a particular authentication context class, which describes the way a user is expected to have logged in, e.g. username/password; username/password over HTTPS; certificate; etc.

The RemoteUser login handler by default doesn’t specify any class, which means it’ll be used if the SP doesn’t ask for anything in particular, but will never match if it does. No problem; we change the value in handler.xml to PasswordProtectedTransport, since we take passwords over HTTPS. But this particular SP still doesn’t work because it’s requesting simply Password and nothing else. The IdP supports multiple values so we just add that value too – it’s a generalisation of the PasswordProtectedTransport so it’s still true to say that we support both.
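For reference, the login handler section of handler.xml ends up looking something like this – a sketch from memory of the IdP 2.x configuration format rather than a copy-paste of ours, so treat the element names as approximate (the SAML context class URIs themselves are the standard ones):

 <ph:LoginHandler xsi:type="ph:RemoteUser">
   <!-- Both context classes we are prepared to assert for REMOTE_USER logins -->
   <ph:AuthenticationMethod>urn:oasis:names:tc:SAML:2.0:ac:classes:PasswordProtectedTransport</ph:AuthenticationMethod>
   <ph:AuthenticationMethod>urn:oasis:names:tc:SAML:2.0:ac:classes:Password</ph:AuthenticationMethod>
 </ph:LoginHandler>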

That fixed it for the SP, but now when logging in through any other SP, the authentication method in the final SAML response is Password, not PasswordProtectedTransport. This might not bother most SPs, but it’s inaccurate so I’d like it to report the “best” method we provide by default. It doesn’t seem to matter what order I specify them in handler.xml. A bit of searching through the documentation gives the answer:

If the IdP is configured for one or more of the methods requested by the SP, then that method may be used. If the Service Provider does not specify a particular method, and the user does not have an existing session, then the default method identified on the relying party’s configuration is used. If no default is identified there, the IdP will choose one of the available methods; the way this choice is made is unspecified and you should not rely on it being the same from release to release.

- https://wiki.shibboleth.net/confluence/display/SHIB2/IdPUserAuthn

So without any default specified in the relying party, the IdP is just picking at random, probably the first alphabetically. I’ve specified PasswordProtectedTransport in the relying party and that’s fixed that; the final SAML response reports that it used PasswordProtectedTransport as the authentication method.
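The relying-party change itself is just one attribute, roughly like this (again a sketch – the provider ID is a placeholder and the attribute name is from memory of the 2.x relying-party schema, so check it against your own config):

 <rp:DefaultRelyingParty provider="https://idp.example.ac.uk/idp/shibboleth"
     defaultAuthenticationMethod="urn:oasis:names:tc:SAML:2.0:ac:classes:PasswordProtectedTransport">
   ...
 </rp:DefaultRelyingParty>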


February 20, 2013

Blogbuilder 3.26 and 3.27

Over the past month we have released 2 new versions of Blogbuilder, with a number of improvements and long-standing bug fixes:

  • You can now choose a fixed width version of any of the existing blog designs, and also easily move the sidebar to the right side of the page, by selecting from the options on the Appearance admin page.
  • We've added the ability to add social sharing buttons to each of your entries, using the "Show 'Like' buttons" option. This will add Facebook Like, Tweet, and Google +1 buttons to the entry.
  • We've adjusted the layout of each blog on small-screen mobile devices for easier reading.
  • And we've fixed issues including:
    • Adding tags to entries on the 'Admin entries' page.
    • Links disappearing in IE8 on the Create entry drop-down.
    • Listing entries by tags with Chinese characters, and untagged entries.
    • Apostrophes in image descriptions causing problems with inserting images.
    • Departments being listed in the wrong faculty in the directory.
    • Blockquotes not being inserted correctly.

February 19, 2013

dfvsdfvsdvsdfvsd


September 05, 2012

this is a test post

Another test. Ho hum.


April 25, 2012

A new era

This is a test post. Test, test, test.


November 13, 2011

New winter kit.

My new, shorter commute means I can now largely do away with my bike-specific kit, and ride in to work in “normal” clothes. Doubtless a relief for those who don’t have to suffer the sight of my lycra-centric outfits any more, but it has required some thought as to exactly what to replace them with.

So I was very pleased when the nice folk at Berghaus asked me to review some outdoor clothing which is just the ticket for a few miles riding through town on an autumn morning. First up, this:
Ardennes softshell
A splendid softshell jacket. It’s warm, windproof, waterproof-ish (the seams aren’t sealed, so I wouldn’t rely on it for a long time in really foul weather, but it’s stood up to 20 minutes or so of steady drizzle just fine). The cut is slightly more relaxed than a dedicated biking jacket (meaning it looks just fine down the pub), but has enough length in the arms and the back to avoid any annoying cold spots when leant over the handlebars. If I were being really picky, I could note that it doesn’t have pit-zips or any other kind of vents, which means it can get a bit warm if you’re working hard (e.g. when racing roadies on the way into work) – so far the weather’s still too warm to wear it for “proper” long rides. It’s also not very reflective, so you need something high-vis to slip over the top now the nights are dark.
But for riding to work it’s been ideal, and it’s also been great for standing on the sidelines of windswept football pitches watching the kids – which at this time of year is a pretty stern test of any fleece!

Item #2 on the list is a new rucksack – specifically, a Terabyte 25l.
Terabyte 25l
As the name suggests, this is optimised for carrying your laptop to work, rather than your ice axes into the northern Cairngorms or your inner tubes round Glentress. It features an impressively solid-feeling padded sleeve which will comfortably take anything from an iPad to a big-ish laptop and hold it securely, as well as a main compartment big enough for a packed lunch and my running kit, and the usual assortment of well-laid-out pockets and attachments. I particularly like the little strap on the back of the pack for attaching an extra bike light. It’s comfortable even when loaded up, and plenty stable enough to bike with. Highly recommended.


October 28, 2011

Scala: 3 months in

So, I’ve been using scala in production for a few months now. Is it everything I expected it to be?

First, a little bit of background. The project I’m working on is a medium-sized app, with a Scala back-end and a WPF/C# fat client. The interface layer is a set of classes autogenerated from a Ruby-ish DSL, communicating via a combination of client-initiated JSON-RPC over HTTP, and JSON-over-AMQP messages from server to client. It’s split into about half a dozen projects, each with its own team. (I should point out that I have little involvement in the architecture at this level. It’s a given; I just implement on top of it.)

The server side is probably a few thousand classes in all, but each team only builds 500 or so classes in its own project. It integrates with a bunch of COTS systems and other in-house legacy apps.

In functionality terms, it’s really fairly standard enterprisey stuff. Nothing is that complicated, but there’s a lot of stuff, and a lot of corner-cases and “oh yeah, we have to do it like that because they work differently in Tokyo” logic.

So, Scala. Language: Generally lovely. Lends itself to elegant solutions that are somehow satisfying to write in a way which java just isn’t. Everything else? Not so much. Here are my top-5 peeves:

  • The compiler is so slow. For a long time, the Fast(er) Scala Compiler simply wouldn’t compile our sources. It threw exceptions (no, it didn’t flag errors, it just died) on perfectly legitimate bits of code. This meant using the slow compiler, which meant that if you changed one line of code and wanted to re-run a test, you had to wait while your 8-core i7 churned for 2-3 minutes rebuilding the whole world. In a long-past life, I cut my teeth writing mainframe cobol with batch compilation. This was like that, all over again. TDD? Not so much. Recently the FSC has improved to the point where it can actually compile code, which is nice, but sometimes it gets confused and just forgets to recompile a class, which is hella confusing when you’re trying to work out why your fix doesn’t seem to be having any effect.
    I would gladly exchange a month of writing WPF automation code (that’s a big sacrifice, for those who haven’t endured it) for the ability to have eclipse’s hot-code-replace, unreliable as it is, working with scala.
  • The IDEs blow. Eclipse + ScalaIDE just plain doesn’t work – apart from choking on our maven config, even if you manually build up the project dependencies, it will randomly refuse to compile classes, highlight errors that don’t exist (highlighting errors on lines that contain no code is a personal favourite), and ignore actual errors. IDEA + Scala Plugin is better – this is now my day-to-day IDE, despite starting out as a dyed-in-the-wool Eclipse fanboy, but it’s slow and clunky, requires vast amounts of heap, and the code completion varies from arbitrary to capricious.
  • Debugging is a pain. Oh yes,
    data.head.zipWithIndex.map((t) => data.map((row: List[Int]) => row(t._2)).reverse)    

feels great when you write it, but just try stepping through that sucker to find out what’s not working. You end up splitting it up into so many temporary variables and function calls that you might as well have just written a big nested for-loop with a couple of if-elses in it (there’s a sketch of what that ends up looking like after this list). Sure, you can take them all out again when you’ve finished, but are you really helping the next guy who’s going to have to maintain that line?

  • The java interop is a mixed blessing. Yes, it’s nice that you have access to all that great java code that’s out there, but the impedance mismatch is always lurking around causing trouble. Of recent note:
    – Scala private scope is not sufficiently similar to java private scope for hibernate field-level access to work reliably
    – Velocity can’t iterate scala collections, nor access scala maps.
– Mockito can’t mock a method signature with default parameter values.

The list goes on. None of these are blockers in any sense – once you’ve learned about them, they’re all pretty easy to work around or avoid. But each one represents an hour or so’s worth of “WTF”-ing and hair-pulling, and these are just the ones that are fresh enough to remember. Scala-Java interop is a constant stream of these little papercuts.

  • Trivial parallelisation is of minimal value to me. Sure, there are loads of cases where being able to spread a workload out over multiple threads is useful. But, like a great many enterprise apps, this one is largely not CPU-bound, and where it is, it needs to operate on a bunch of persistent objects within the context of a JTA transaction. That means staying on one thread. Additionally, since we’re running on VMWare, we don’t have large numbers of cores to spread onto, so parallelisation wouldn’t buy us more than a factor of 2 or 3 in the best case. In a related vein, immutable classes are all well and good, but JPA is all about mutable entities. There have been a few bits of the design where baked-in immutability has led to a cleaner model, but they’re surprisingly few and far between.
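Coming back to the debugging point: here’s roughly what that rotate-90 one-liner looks like once it’s been pulled apart into steps you can actually inspect (the intermediate names are mine, purely for illustration):

    // The same transposition, broken into named intermediates that a
    // debugger (or a println) can get at:
    val indexedFirstRow = data.head.zipWithIndex           // (value, columnIndex) pairs
    val rotated = indexedFirstRow.map { case (_, col) =>
      val column = data.map(row => row(col))               // the col-th element of every row
      column.reverse                                       // reversed to complete the rotation
    }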

Not that long after I started work on this project, there was a video doing the rounds, showing how Notch, the creator of Minecraft, used Eclipse to create a pretty functional Wolfenstein clone in java in a 48-hour hackathon. Whilst that kind of sustained awesomeness is out of my reach, it illustrates the degree to which the incredibly sophisticated tooling that’s available for Java can compensate for the language’s shortcomings. If you haven’t watched it, I recommend skimming through to get a sense of what it means to have a toolset that’s completely reliable, predictable, and capable of just getting out of the way and letting you create.

If I were starting this project again, I’d swap the elegance and power of the scala language for the flow of java with a good IDE – even if it did mean having to put up with more FooManagerStrategyFactory classes.


September 07, 2011

5 Operations metrics that Java web developers should care about

Sometimes it’s nice, as a developer, to ignore the world outside and focus in on “I’m going to implement what the specification says I should, and anything else is somebody else’s problem”. And sometimes that’s the right thing to do. But if you really want to make your product better, then real production data can provide valuable insights into whether you’re building the right thing, rather than just building the thing right. Operations groups gather this kind of stuff as part of normal business*, so here are a handful of ops metrics that I’ve found particularly useful from a Java web application point of view. Note that I’m assuming here that you as the developer aren’t actually responsible for running the application in production – rather, you just want the data so you can make better informed decisions about design and functionality.

How should you expose this information? Well, emailing round a weekly summary is definitely the wrong thing to do. The agile concept of “information radiators” applies here: Big Visible Charts showing this kind of data in real time will allow it to seep into the subconscious of everyone who passes by.

Request & Error rates

How many requests per second does your application normally handle? How often do you get a 5-fold burst in traffic? How often do you get a 50-fold burst? How does your application perform when this happens? Knowing this kind of thing allows you to make better decisions about which parts of the application should be optimised, and which don’t need it yet.
Requests per minute graph
Request rates for one node on a medium-scale CMS. Note the variance throughout the day. The 5AM spike is someone’s crazy web crawler spidering more than it ought to.

Error rates – whether counting the number of HTTP 500 responses or the number of stack traces in your logs – are extremely useful for spotting those edge-cases that you thought should never happen, but actually turn out to be disappointingly common.
Error rates graph
Whoa. What are all those spikes? Better go take a look in the logs…
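If you don’t have graphing wired up yet, even a crude count straight out of the access logs is a start. Something along these lines works against an Apache combined-format log, where the status code is field 9 and the request path is field 7 (adjust the field numbers and path for your own log format):

 awk '$9 == 500 {print $7}' access.log | sort | uniq -c | sort -rn | head

which lists the URLs generating the most 500s, in descending order.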

GC Performance

GC is a very common cause of application slowdowns, but it’s also not unusual to find people blaming GC for problems which are entirely unrelated. Using GC logging, and capturing the results, will allow you to get a feel for what “normal” means in the context of your application, which can help both in identifying GC problems, and also in excluding it from the list of suspects. The most helpful metrics to track, in my experience, are the minimum heap size (which will vary, but should drop down to the same value after each full GC) and the frequency of full GCs (which should be low and constant).

weekly GC performance
A healthy-looking application. Full GCs (the big drops) are happening about once per hour at peak, and although there’s a bit of variance in the minimum heap levels, there’s no noticeable upwards trend.
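Turning the logging on is cheap. On a HotSpot JVM of this vintage, flags along these lines do the job (the log path is just an example, and you’ll probably want some rotation on top):

 -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCDateStamps -Xloggc:/var/log/myapp/gc.log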

Request duration

Request duration is a fairly standard non-functional requirement for most web applications. What tends not to be measured and tracked quite so well is the variance in request times in production. It’s not much good having an average page-load time of 50 milliseconds if 20% of your requests are taking 10 seconds to load. Facebook credit their focus on minimising variance as a key enabler for their ability to scale to the sizes they have.
Render performance graph
the jitter on this graph gives a reasonable approximation of the variance, though it’s not quite as good as a proper statistical measure. Note that the requests have been split into various different kinds, each of which has different performance characteristics.
smokeping screenshot
Request speed with proper variance overlaid

Worker pool sizes

How many apache processes does your application normally use? How often do you get to within 10% of the apache worker limit? What about tomcat worker threads? Pooled JDBC connections? Asynchronous worker pools? All of these rate-limiting mechanisms need careful observation, or you’re likely to run into hard-to-reproduce performance problems. Simply increasing pool sizes is inefficient, more likely than not will just move the bottleneck somewhere else, and will leave you with less protection against request-flooding denial of service attacks (deliberate or otherwise).

Apache workers graph
Apache worker pool, split by worker state. Remember that enabling HTTP keepalive on your front-end makes a huge difference to client performance, but will require a significantly larger pool of connections (most of which will be idle for most of the time)
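The raw numbers for the Apache pool come straight out of mod_status. If you have it enabled, something like this gives you the busy/idle split to feed into a graph (the URL assumes the default server-status location on the local box):

 curl -s 'http://localhost/server-status?auto' | egrep 'BusyWorkers|IdleWorkers'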

If you have large numbers of very long-running requests bound by the network speed to your clients (e.g. large file downloads), consider offloading them to your web tier using mod_x_sendfile or similar. For long-running requests that are bound by server-side performance (giant database queries or complex computations), consider making them asynchronous, and having the client poll periodically for status.

Helpdesk call statistics

The primary role of a helpdesk is to ensure that second and third-tier support isn’t overwhelmed by the volume of calls coming in. And many helpdesks measure performance in terms of the number of calls which agents are able to close without escalation. This can sometimes create a perverse incentive, whereby you release a new, buggy version of your software, users begin calling to complain, but the helpdesk simply issue the same workaround instructions over and over again – after all, that’s what they’re paid for. If only you’d known, you could have rolled back, or rolled out a patch there and then, instead of waiting for the fortnightly call volumes meeting to highlight the issue. If you can get near-real-time statistics for the volume of calls (ideally, just those related to your service) you can look for sudden jumps, and ask yourself what might be causing them.

Helpdesk call volumes

* “But but… my operations team doesn’t have graphs like this!” you say? Well, if only there were some kind of person who could build such a thing… hold on a minute, that’s you, isn’t it? Building the infrastructure to collect and manage this kind of data is pretty much a solved problem nowadays, and ensuring that it’s all in place really should be part and parcel of the overall development effort for any large-scale web application project. Of course, if operations don’t want to have access to this kind of information, then you have a different problem – which is a topic for a whole ‘nother blog post some other time.


August 05, 2011

There's no such thing as bad weather…

...only the wrong clothes. Continuing the camping-kit theme, let’s talk about waterproofs. If you’re going camping in the UK, it’s going to rain sooner or later. There are a few things you can do to make this not be a problem:

- a tarp, or gazebo, or event shelter, or other tent-without-a-floor-or-walls, allows you to sit around, cook, and generally be outside without getting rained on or feeling quite as penned-in as sitting inside the tent does. Watch out in high winds though.

- Wet grass will soak trainers and leather boots. Get some wellies, some crocs, or some flip-flops

- An umbrella is pretty handy

- Most importantly, get some decent waterproof clothing. For summer camping, I like a lightweight jacket rather than a big heavy winter gore-tex – drying stuff out in a tent is hard (especially if it’s damp outside), so the less fabric there is to dry, the easier it’ll be. My jacket of choice at the moment is a Montane Atomic DT, but my lovely wife has been testing The North Face Resolve.
It uses a 2-layer construction with a mesh drop liner, dries fast (and the mesh liner means it feels dry even when slightly damp), breathes well, and is slightly fitted so it doesn’t look too much like a giant sack. Packs down nice and small, and, of course, it keeps the rain out. It’s cut fairly short, so if you’re out in the rain for a long time you’ll either need waterproof trousers or a tolerance for wet legs. For 3-season use, I’d say it’s ideal.

Update: We’ve had a bit longer to test the TNF Resolve jacket, and so far (after 3 months) the signs are still good. It’s stood up to some pretty torrential rain without showing any signs of weakness, and I am informed that the colour is excellent (I wouldn’t know about that, obviously). The DWR coating is still working well, which is always a good sign. The previously-mentioned fitted cut means that it doesn’t flap about when it’s blowy, but it’s not tight or restrictive, even when wearing a fleece underneath. The only shortcoming, which is common to a lot of lightweight jackets, is that the peak on the hood isn’t stiffened at all, so if it’s really windy, it tends to get blown onto your face a bit. This isn’t really an issue unless you’re up in the hills in really foul weather, though, and of course adding a wire to the hood would make the jacket a lot less packable, so for a 3-season jacket it’s a worthwhile trade-off.


August 04, 2011

How to back up your gmail on ubuntu

Quick-and-dirty solution:

1. install python and getmail
2. make a getmail config like this:

[retriever]
type = SimplePOP3SSLRetriever
server = pop.gmail.com
username = your_user_name@gmail.com
password = your_password

[destination]
type = Mboxrd
path = ~/gmail-archive/gmail-backup.mbox

[options]
# print messages about each action (verbose = 2)
# Other options:
# 0 prints only warnings and errors
# 1 prints messages about retrieving and deleting messages only
verbose = 2
message_log = ~/.getmail/gmail.log 

3. enable pop on your gmail account
4. add a cronjob like this:

13,33,53 * * * * /usr/bin/getmail -q -r /path/to/your/getmail/config

You’ll end up with a .mbox file which grows and grows over time. Mine is currently at 5GB. I have no idea whether thunderbird or evolution can open such a big file, but mutt can (use “mutt -f ~/gmail-archive/gmail-backup.mbox -R”, unless you really like waiting whilst mutt rewrites the mbox on each save), or it’s easy enough to grep through if you just need to find the text of a specific message. If you needed to break it into chunks, you could always just use split, and accept that you’ll lose a message where the split occurs (or use split, and then patch together the message that gets split in half).


July 15, 2011

Gnome 3 / gnome-shell on Ubuntu Natty

OK, so I’m not the first to do this by a long chalk, but here’s what worked for me:

1: Install the gnome 3 PPAs (cf. http://nirajsk.wordpress.com/category/gnome-3/):

sudo add-apt-repository ppa:gnome3-team/gnome3
sudo apt-get update
sudo apt-get dist-upgrade
sudo apt-get install gnome-shell
sudo apt-get install gnome-shell-extensions-user-theme

For some reason, the default theme (Adwaita) was missing, as was the default font (Cantarell). (Could be because I didn’t install gnome-themes and gnome-themes-extra – see here.) You can download Adwaita from here. Get Cantarell from here, copy it to /usr/share/fonts/ttf, and run sudo fc-cache -rv to update.
I wasn’t too keen on the fat titlebars in adwaita, so I used this to shrink them down:

sed -i "/title_vertical_pad/s/value=\"[0-9]\{1,2\}\"/value=\"0\"/g" /usr/share/themes/Adwaita/metacity-1/metacity-theme-3.xml

You can set the theme and the font using gnome-tweak-tool (aptitude install it if it didn’t arrive with gnome-shell). I’m still looking for a nice icon set; the ubuntu unity ones are a bit orange, and the gnome ones are an uninspiring shade of khaki. For now I’ve settled on the KDE-inspired Oxygen (aptitude install oxygen-icon-theme) which is OK, but still doesn’t quite look right.

There’s an adwaita theme for chrome which is nice, and makes everything a bit more consistent.

The London Smoke gnome-shell theme is really nice

Switching from pidgin to empathy gets you nice, clickable message notifications, although I’d rather they were at the top of the screen than the bottom.

Aero-style snap-to-window-edge works fine, except that for nvidia graphics cards with TwinView, you can’t snap to the edges that are between two monitors. Right-click context menus look weird in a way that I can’t quite put my finger on, but they function as expected.

Other than that, it pretty much just works. The only glitch I’ve yet to work out is that spawning a new gnome-terminal window freezes the whole UI for a second or so. Not got to the bottom of why that might be yet; if I find it I’ll post something. In gnome 2 there was a similar problem with the nvidia drivers, which could be “solved” by disabling compiz, but that’s not an option here. Update: it seems to be connected to using a semi-transparent background for the terminal; if I use a solid background the problem goes away.

Things I like better than Unity, after a week of playing:
– Clickable notifications from empathy. The auto-hiding notification bar at the bottom of the screen is great, although I’ve found it can sometimes be a bit reluctant to come out of hiding when you mouse over it.
– alt-tab that shows you the window title (good when you’ve got 20 terminals open). I like the ability to tab through multiple instances of the same app in order, too. To get alt-shift-tab to work, I used these instructions
– menus on the application. Fitts’ law can go hang; I like my menus to be connected to the window they control.


July 06, 2011

Camping time!

It’s summer time, and that means it’s camping season again. Camping is ace, and it’s doubly ace if you have kids. Almost without exception, kids love weaselling about in the countryside, so once you’ve got them onto the campsite, they’ll amuse themselves and you can get on with the serious business of drinking beer, talking rubbish and playing with fires. What could be finer?

There’s a curious kind of Moore’s Law at the moment, as far as tents are concerned.
Every year, technology trickles down from the top-end, so low and mid-range tents get better and better. My first tent, long long ago, was pretty much bottom-of-the-range and cost about £50 (which was a lot of money for a fourteen year old in nineteen-eighty-something). It weighed approximately a tonne, leaked like a sieve, and stood me in good stead for a few years’ worth of adventures. Now, for half of that price you can get one of these pop-up tents:
pop-up tent

You don’t so much “pitch” it as just take it out of the bag and stand back. It sleeps two close friends, or one with too much gear; it’s pretty waterproof, and if you peg out the guy-ropes it’s surprisingly sturdy in the wind.

Downsides? It doesn’t pack down small, and I’m not sure it would be my first choice for a week in the Lake District in November – in cold, wet conditions the single-skin design means you’ll end up damp from condensation on the inside of the tent – but for summer weekend trips it’s brilliant; fun, easy, and it costs roughly 1/25th as much as an Ultra Quasar (though if you do have £600 to spare, I can highly recommend one of those as an alternative!).

ProTip: If you want to extend the range of weather you can use this in, get a tarp and pitch it over the top of the tent, overhanging the front by a metre or two. You get an extra layer of waterproofing, and a porch so you don’t have to bring your soggy boots into the tent.


one-liner of the day: quick-and-dirty network summary

If you’ve got a solaris box with a load of zones on, you might sometimes have an issue whereby you can see that the box is pushing a lot of traffic over the network, but you’re unsure which zone(s) are responsible. Here’s a super-hacky way to get an overview:

 snoop -c 1000 | awk '{print $1}' | sort | uniq -c | sort -n

basically, catch the first 1000 packets, find the source for each one (assuming most of your traffic is outbound; if it’s inbound then print $3), and then count the distinct hosts (i.e. your zone names) and list them.

If you have a slow nameserver you may want to add “-r” to the snoop command and then map IPs to names afterwards.


June 26, 2011

Scala: fun with Lists

So, I’m slowly working my way through some of the programming exercises at Project Euler as a means of learning scala (and hopefully a little bit of FP at the same time).
Problem #11 is my most recent success, and it’s highlighted a number of interesting points. In brief, the problem provides you with a 20×20 square of 2-digit integers, and asks for the largest product of any 4 adjacent numbers in any direction (up, down, left, right, diagonally). Kind of like a word-search, but with extra maths.

For no better reason than the afore-mentioned desire to get a bit more acquainted with a functional style of programming, I chose to implement it with no mutable state. I’m not sure if that was really sensible or not. It led to a solution that’s almost certainly less efficient, but potentially simpler than some of the other solutions I’ve seen.

So, step one: Given that we’re going to represent the grid as a List[List[Int]], let’s see if we can get the set of 4-element runs going horizontally left-to-right:

    def toQuadList()= {
      def lrQuadsFromLine(line: List[Int], accumulator: List[List[Int]]): List[List[Int]] = {
        if (line.length < 4)
          accumulator
        else
          lrQuadsFromLine(line.tail, accumulator :+ line.slice(0, 4))
      }
      data.flatMap(line => lrQuadsFromLine(line, List.empty))
    }

(“data” is the List[List[Int]] representing the grid). Here we define a recursive function that takes a List[Int], obtains the first 4-number run (line.slice), adds it to an accumulator, and then pops the first item from the list and calls itself with the remainder.
Then we just call flatMap() on the original List-of-Lists to obtain all such quadruples. We could call map(), but that would give us a list of lists of quadruples – flatMap mushes it down into one list.
It would be nice to find a way to make the if/else go away, but other than just transliterating it into a match, I can’t see a way to do it.

Given that list, finding the largest product is trivial.

   def findMaxProduct = data.map((x: List[Int]) => x.reduce(_ * _)).max

- take each quadruple in turn and transform it into a single number by multiplying its elements together. Then call .max to find the largest.

So now we can do one of the 6 directions. However, a bit of thought at this point can save us some time: the Right-to-Left result is guaranteed to be the same as the left-to-right, since multiplication doesn’t care about order (the set of quadruples will be the same in each direction). Similarly, the downwards result will be the same as the upwards one.

Next time-saver: Calculating the downwards result is the same as rotating the grid by 90 degrees and then calculating the left-to-right result. So we just need a rotate function, and we get the vertical results for free:

   def rotate90Deg() = {
      data.head.zipWithIndex.map((t) => data.map((row: List[Int]) => row(t._2)).reverse)
    }

Doing this without mutable state took me some pondering, but the solution is reasonably concise. Calling zipWithIndex on an arbitrary row of the grid (I used .head because it's easy to access) pairs each element with its column index, so we can build up a List containing the nth element from each of the original rows. (The outer map() iterates over the columns of the grid, the inner one over the rows.)
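(Incidentally, the standard library can shorten this too: transpose turns the rows into columns, so the same rotation can be sketched as below, again assuming the grid lives in data: List[List[Int]].)

    def rotate90Deg() = data.transpose.map(_.reverse)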

So now we have the horizontal and vertical results, we need to do the diagonals. It would be nice to re-use the left-to-right quadruple-finding code once again, so we need to get the grid into a form where going left to right picks up the sequences that were previously diagonals. We can do this by offsetting each subsequent row by one, then rotating; like this:

1,2,3
4,5,6
7,8,9

?,?,1,2,3
?,4,5,6,?
7,8,9,?,?

7,?,?
8,4,?
9,5,1
?,6,2
?,?,3

You can see that the horizontal sequences are now the original diagonals. Initially, I started using Option[Int] for the items in the grid, so I could use None for the “question marks”. However, after more time than it ought to have taken me, I realised that using zero for those would work perfectly, as any zero in a quadruple will immediately set that quad’s product to zero, thus excluding it from our calculation (which is what we want).
The function to do the offsetting is still rather complex, but not too bad (it owes a lot to the recursive toQuadList function above):

    def diagonalize() = {
      def shiftOneRow(rowsRemaining: List[List[Int]], lPad: List[Int], rPad: List[Int], rowsDone: List[List[Int]]): List[List[Int]] = {
        rowsRemaining match {
          case Nil => rowsDone
          case _ => {
            val newRow: List[Int] = lPad ::: rowsRemaining.head ::: rPad
            shiftOneRow(rowsRemaining.tail,
              lPad.tail,
              0 :: rPad,
              rowsDone ::: List(newRow))
          }
        }
      }
      shiftOneRow(data, List.fill(data.size)(0), List.empty, List.empty)
    }

We define a recursive function that takes a list of rows yet to be padded, a list of zeros to pad the left-hand-side with, another for the right-hand-side, and an accumulator of rows already padded. As we loop through, we remove from the remaining rows, add to the done rows, remove from the left-padding, and add to the right padding. It kind of feels like there ought to be a neater way to do this, but I’m not quite sure what yet.
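(One candidate for that neater way, sketched under the same assumptions as before: the amount of padding a row needs depends only on its position, so zipWithIndex lets us compute it directly instead of threading the pad lists through a recursion.)

    def diagonalize() =
      data.zipWithIndex.map { case (row, i) =>
        // row i gets (size - 1 - i) zeros on the left and i zeros on the right,
        // matching the worked example above
        List.fill(data.size - 1 - i)(0) ::: row ::: List.fill(i)(0)
      }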

To get the “other” diagonals, just flip the grid left-to-right before diagonalizing it.
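(flipLR isn't shown in this post; a one-line sketch, still assuming the grid is held in data, would be:)

    def flipLR() = data.map(_.reverse)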

Once that’s done, all that remains is to glue it together. Because I’ve been thinking in OO for so long, I chose to wrap this behaviour in a couple of classes; one for the list-of-quadruples, and one for the grid itself. Then I can finish the job with:

 println(
      List(new Grid(data).toQuadList.findMaxProduct,
      new Grid(data).rotate90Deg.toQuadList.findMaxProduct,
      new Grid(data).diagonalize.rotate90Deg.toQuadList.findMaxProduct,
      new Grid(data).flipLR.diagonalize.rotate90Deg.toQuadList.findMaxProduct).max)
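For completeness, here's a rough sketch of how those two wrapper classes might hang together, using the shorter variants sketched above rather than the original recursive versions (the class and constructor names are my guesses, not the original code):

    // A thin wrapper around the list of quadruples
    class QuadList(quads: List[List[Int]]) {
      def findMaxProduct = quads.map(_.reduce(_ * _)).max
    }

    // A thin wrapper around the grid; every transformation returns a new Grid
    class Grid(data: List[List[Int]]) {
      def toQuadList() = new QuadList(data.flatMap(_.sliding(4).toList))
      def rotate90Deg() = new Grid(data.transpose.map(_.reverse))
      def flipLR() = new Grid(data.map(_.reverse))
      def diagonalize() = new Grid(data.zipWithIndex.map { case (row, i) =>
        List.fill(data.size - 1 - i)(0) ::: row ::: List.fill(i)(0)
      })
    }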

June 20, 2011

The one where I fail to go on any exciting expeditions

Writing about web page http://www.gooutdoors.co.uk/thermarest-prolite-small-p147594

So, in the spirit of previous product reviews, this should have been an entry that described various recent feats of derring-do, and casually slipped in a plug for the latest bit of camping equipment that I've been sent to review. Unfortunately, the last few weeks have been characterised by some especially foul weather. Last week's planned camping trip to do the Torq rough ride was abandoned, in favour of an early morning drive down on Sunday, followed by three-and-a-half hours of squelching round the "short" route in driving rain.
Then tonight's Second Annual Summer-Solstice-Mountain-Bike-Bivi-Trip was abandoned when it became apparent that the chances of a scenic sunrise were pretty much zero, whereas the chance of a night in a plastic bag in the rain on a hill was looking close to 100%.

All of which means that my shiny new toy has so far not seen much action outside of the back garden. But it seems churlish not to write something about it, so here goes. It's a sleeping mat, specifically a Thermarest Pro-Lite. Mats might seem like a pretty mundane item, but if you're trying to travel light, either running or on a bike, they're a tricky thing to get right. Too minimalist and you don't get any sleep, and you feel like crap the next morning. Too heavy, and they start to make a serious contribution to your pack weight, which matters a lot if you're having to run up a hill or ride down one.
For a long time, my preferred option was a cut-down foam karrimat, which was a reasonable compromise, but suffered from being a bit on the bulky side, and also not terribly warm. I have an old thermarest as well, which is fabulously comfy – great for winter camping, but far too bulky for fast & light summer trips.

There will be some pics here, when it stops raining for long enough… for now, here’s a stock photo…

So, the pro-lite: Point 1: I got the small one; it's very small. (Note the artful perspective in the photo!) If you want something that will reach all the way down to your toes (or even your knees), this isn't it; buy the large size. I don't mind this, though; in the summertime I'm happy with something that just reaches from head to thigh. Equally, it's quite slim. My shoulders are fairly scrawny, and this just about reaches from one to the other. If you've been spending longer in the gym (or the cake shop) than on the track or the turbo, then you might want a bigger size.

Point 2: It is just unbelievably compact. Really. It rolls up into something about the size of a 750ml drink bottle. Foam karrimats can't come near to this. This makes more of a difference than you might think, because it means you can get away with a pack that's 5L or so smaller (and therefore lighter), and still fit the mat inside (keeping your mat inside your pack is a winning plan, because you can keep it dry). It's also great if you're backpacking on a bike, because the smaller your pack, the less it affects the bike's handling.

Point 3: It’s as light as a foam mat. Unlike my old thermarest, there’s not a lot inside this mat, so it’s super-lightweight. Mine weighed in at a smidge over 300g according to the kitchen scales.

Point 4: Back-garden tests strongly suggest that it’s a lot more comfy than my old foam mat. I’ll report back once it’s stopped raining long enough to try it out for real!

Update #1: Still not had the chance to take this backpacking, but a car-camping trip confirms that it's very comfortable indeed – just as good as my old thermarest, though in a big tent you have to be careful not to roll off it in the night!


June 16, 2011

Partial and Curried Functions

On with the scala show! Partially-applied functions and curried functions took me a while to get my head around, but really they’re quite simple, and partials at least are super-useful.

Partially-applied functions are just functions where you pre-bind some of the parameters, e.g.

scala> val message:(String,String,String)=>String = "Dear " + _ +", " + _ + " from " + _
message: (String, String, String) => String = <function3>

scala> message("Alice","hello","Bob")
res24: String = Dear Alice, hello from Bob

scala> val helloMessage=message(_:String,"hello",_:String)
helloMessage: (String, String) => String = <function2>

scala> helloMessage("Alice","Bob")
res25: String = Dear Alice, hello from Bob

scala> val aliceToBobMessage=message("Alice",_:String,"Bob")
aliceToBobMessage: (String) => String = <function1>

scala> aliceToBobMessage("greetings")
res27: String = Dear Alice, greetings from Bob

What's happening here is reasonably self-explanatory. We create a function "message" which takes 3 parameters, then we create two more functions, "helloMessage" and "aliceToBobMessage", which are just aliases to the "message" function with some of the parameters pre-filled.

Since functions are first-class objects that you can pass around just like anything else, this means you can do stuff like

scala> val times:(Int,Int)=>Int = _*_
times: (Int, Int) => Int = <function2>

scala> val times2=times(2,_:Int)
times2: (Int) => Int = <function1>

scala> (1 to 10) map times2
res38: scala.collection.immutable.IndexedSeq[Int] = Vector(2, 4, 6, 8, 10, 12, 14, 16, 18, 20)

Currying is, at first sight, just a different (more complicated) way to achieve the same thing:

scala> val message:(String)=>(String)=>(String)=>String = (to:String)=>(message:String)=>(from:String)=>{ "Dear " + to +", " + message + " from " + from}
message: (String) => (String) => (String) => String = <function1>

scala> message("Alice")("Hello")("Bob")
res28: String = Dear Alice, Hello from Bob

scala> val aliceToBobMessage=message("Alice")(_:String)("Bob")
aliceToBobMessage: (String) => String = <function1>

scala> aliceToBobMessage("greetings")
res29: String = Dear Alice, greetings from Bob

What that giant line of verbiage at the start is doing is creating a function which takes one string and returns a function that takes another string, which returns a function that takes another string, which returns a string. Phew.
This mapping from a function that takes n parameters to a chain of n functions that each take one parameter is known as "currying" (after Haskell Curry). Why on earth would you want to do this, rather than the much simpler partial application example above?
Several of the core scala API methods use curried functions – for example foldLeft in the collections classes

def foldLeft[B](z: B)(op: (B, A) => B): B = 
        foldl(0, length, z, op)

- here we’re exposing a curried function, and then proxying on to a non-curried implementation. Why do it like this?

It turns out that curried functions have an advantage (see update below) that partial ones don't. If I curry a function, and then bind some variables to form what is effectively the same as a partial function, I can also modify the type of the remaining unbound parameters. Example required!

scala> val listMunger:(String)=>(List[Any])=>String = (prefix:String)=>(contents:List[Any])=>prefix + (contents.reduce(_.toString +_.toString))
listMunger: (String) => (List[Any]) => String = <function1>

scala> val ints = List(1,2,3)
ints: List[Int] = List(1, 2, 3)

scala> listMunger("ints")(ints)
res31: String = ints123

scala> val strings = List("a","b","c")
strings: List[java.lang.String] = List(a, b, c)

scala> listMunger("strings")(strings)
res32: String = stringsabc

scala> val stringListMunger=listMunger("strings")(_:List[String])
stringListMunger: (List[String]) => String = <function1>

scala> stringListMunger(strings)
res33: String = stringsabc

scala> stringListMunger(ints)
<console>:11: error: type mismatch;
 found   : List[Int]
 required: List[String]
       stringListMunger(ints)

That last error is the important bit. We’ve specialized the parameter type for the unbound parameter in stringListMunger so it won’t accept a list of anything other than Strings. Note that you can’t arbitrarily re-assign the type; it has to be a subtype of the original (otherwise the implementation might fail).
OK; so now all I have to do is think of a real-world example where this would be useful…

Update: Gah, I was wrong. You can do exactly the same type-specialization with a partial:

scala> val x:(Int,List[Any])=>Int = (_,_)=>1
x: (Int, List[Any]) => Int = <function2>

scala> val y:(List[Int])=>Int = x(1,_)
y: (List[Int]) => Int = <function1>

So now I still have no idea when you'd want to curry a function, rather than just leaving it with multiple arguments and partially applying when required. This blog entry suggests that it really exists to support languages like OCaml or Haskell that only allow one parameter per function – so maybe it's only in scala to allow people to use that style if they like it. But then what's it doing in the APIs?
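(A tentative answer of my own, rather than anything from the post above: for library methods like foldLeft, splitting the parameters into two lists mainly helps type inference and syntax. The compiler fixes the type parameter B from the first argument list before it type-checks the function in the second, so the lambda's parameter types don't need annotating, and the function can be supplied in braces. A small sketch:)

    // B is fixed as Int by the first argument list (the 0), so the lambda's
    // parameters below need no type annotations; with a single parameter list
    // they generally would (at least in the Scala 2 of this era).
    val total = List(1, 2, 3).foldLeft(0) { (acc, x) => acc + x }  // total == 6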


June 14, 2011

Blogbuilder 3.25

We've just released a new version of Warwick Blogs (the last one was nearly 2 years ago!) with a number of improvements and bug fixes:

  • We've improved the RSS and Atom feeds from your blogs, and also added JSON support (add ?json=json to the URL, and you can add callback and assign parameters to do JSONP; there's a small fetching sketch after this list). For example: http://blogs.warwick.ac.uk/news/?json=json
  • We've significantly improved our support for newer web browsers (IE9, Chrome, Firefox 4) and you should find fewer problems using these browsers with Warwick Blogs
  • We've added OAuth support to Warwick Blogs with the following details:
      • Request token: https://websignon.warwick.ac.uk/oauth/requestToken?scope=urn%3Ablogs.warwick.ac.uk%3Ablogbuilder%3Aservice
      • Authorisation: https://websignon.warwick.ac.uk/oauth/authorise
      • Access token: https://websignon.warwick.ac.uk/oauth/accessToken
  • You'll need a consumer key and secret to use OAuth with Warwick Blogs in your own application; contact the IT Services Helpdesk (helpdesk@warwick.ac.uk) to request these
  • We've modified the Atom API to allow setting of arbitrary permissions by adding the special elements <blogbuilder:read-permission> and <blogbuilder:comment-permission> - these can be set to webgroups, names of groups on the blog in question, or to the special strings Anyone, Staff, Students or Alumni.
  • We've significantly increased the text limit for the biography and contact details sections of the profile page (to 32,000 characters)
  • We've added more "Back to Blog Manager" and "Back to my blog" links to the Admin section to make it easier to navigate
  • We've fixed issues with uploading files with spaces in their names, and with inserting media into the editor
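For the curious, here's a minimal sketch of pulling the JSON version of a feed (my own illustration rather than part of the release; it just fetches the example URL given above):

    // Fetch the JSON form of a blog's feed; append &callback=yourFunction
    // for a JSONP-wrapped response (network access assumed).
    import scala.io.Source
    val json = Source.fromURL("http://blogs.warwick.ac.uk/news/?json=json").mkString
    println(json.take(200))  // peek at the start of the response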

As always, if you have any problems you can comment below or email the IT Services helpdesk at helpdesk@warwick.ac.uk.