December 20, 2012

Webcam timelapse – Monday 10th December 2012 – Sunday 16th December 2012

Follow-up to Webcam timelapse – Friday 7th December 2012. from Mike's blag

Same deal as before, but this time it's of an entire week and they've been optimised for streaming, so they'll start playing back straight away rather than you having to wait for your web browser to download the whole thing. Depending on your patience there are three versions, presented here in decreasing duration.

10 images per second, duration approx 16:38


24 images per second, duration approx 6:56


48 images per second (y'know, like Peter Jackson did for The Hobbit. Only without being anything like that at all), duration approx 3:28


I've just realised these videos don't play in Firefox on my Linux machine. I can play the videos on my Linux machine (I made them on my Linux machine), just not in Firefox. I guess Firefox lacks H264 decoding capability there, which presumably means they won't work in Firefox on Windows either. I'm not curious enough to boot Windows and find out. The day-long video was made on and posted from my Mac, on which Firefox happily plays back the videos, but my Mac is old and ffmpeg only manages to encode the videos at about 2fps. So I built ffmpeg with libx264 support on my not-as-old Linux machine to encode the week-long ones in a sensible time. I guess I could upload an flv fallback video, but that would mean making such a thing and I can't be bothered, at least not right now.
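
For reference, building ffmpeg with libx264 support boils down to something like the following. This is a rough sketch, assuming the x264 library and its headers are already installed; the exact configure flags can vary between ffmpeg versions:

$ ./configure --enable-gpl --enable-libx264
$ make
$ sudo make install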

Edit: They do play in Firefox on another Linux machine of mine that has the Totem plugin. Totem uses gstreamer, and on the machine in question there's a gstreamer plugin that supports H264.
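
If you want to check whether a given machine has such a plugin, something along these lines should list any H264-capable gstreamer elements:

$ gst-inspect-0.10 | grep -i h264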


December 09, 2012

Webcam timelapse – Friday 7th December 2012.

Friday 7th December 2012 as seen through the University webcam in approximately 2.3 minutes.


Images were grabbed at a rate of one per minute, which in theory gives a total of 1440 images. There are actually only 1420, as 20 images scattered throughout the day didn't get saved for whatever reason. The video is constructed using 10 images per second, so the total run time should be 142 seconds. According to ffmpeg the actual length of the generated video is 00:02:21.90. Don't know why.
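
For the curious, the general shape of the ffmpeg invocation for turning an image sequence into a video is as follows. This is a sketch rather than the exact command I used, and frame%04d.jpg is a stand-in for whatever naming scheme your grabber produces:

$ ffmpeg -r 10 -i frame%04d.jpg -c:v libx264 timelapse.mp4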

It just occurred to me that the video isn't optimised for streaming, so it all needs to download before you can start to watch it. There's something in ffmpeg to do such optimisation apparently, so I should look into that at some point.
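
The relevant option appears to be ffmpeg's faststart flag, which moves the index (the moov atom) to the front of the file so playback can start before the whole thing has downloaded; ffmpeg source trees also ship a standalone qt-faststart tool that does the same job to an existing file. Something like:

$ ffmpeg -i timelapse.mp4 -c copy -movflags +faststart timelapse-stream.mp4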

Coming soon, versions that cover a week, a month and a year. Maybe. And for varying values of soon. Imagine how long it could take to get a year's worth of images! Getting a consistent one image per minute for an entire year clearly isn't going to happen; the 20 missing images in a single day show as much. It'd be an exercise in maintaining uptime really. Or having more than one machine grabbing the images. By my calculations a year's worth of images would be about 11GB, and a video created from them using 10 images per second would be nearly 15 hours long.
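
The back-of-envelope sums, for anyone checking (the roughly 22KB per image is an assumption inferred from the 11GB figure rather than a measurement):

$ echo $(( 60 * 24 * 365 ))              # images in a year at one per minute
525600
$ echo $(( 525600 * 22 / 1024 / 1024 ))  # size in GB, assuming ~22KB per image
11
$ echo $(( 525600 / 10 / 3600 ))         # whole hours of video at 10 images per second
14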


July 02, 2012

Why not mash clips of all your videos together?

A long time ago, in a street not all that far away, there was a nightclub I visited on N occasions. You might know it as Kasbah because you're much younger than I am and don't remember when it was called The Colosseum. Anyway, it had a big screen at one end that showed random video clips of a few seconds each. Bits of cartoons, the Enterprise-D going to warp, all sorts of things. For reasons that completely escape me I started thinking about that recently and wondering how easy it would be to make a video sequence like that. I have no practical use for such a thing, but it seemed like an interesting exercise.

So if you've been wondering how you can easily create a video that comprises randomly selected clips then this is the blog post for you. To create such a video you will need:

  • One computer running a Unix-like Operating System. (I tested on Mac OS X 10.6 and openSUSE 12.1)
  • Lots of video. The more video you have, the more random (for a given value of random) the clips will be.
  • A copy of ffmpeg compiled with libx264 support. If you're using a Mac I highly recommend you have MacPorts installed or be prepared to hack the script a bit to make it work.
  • The script that mashes up videos, which you save into a directory of its own and then run. It will plunder your video collection and assemble a montage of random clips, with the total length defined by a variable in the script (the default is 60 seconds). The number of clips and the length of each clip vary on each run. A sketch of the general approach appears below.

You'll want to read the comments at the start of the script before running it. The output is 1280x720 H.264 encoded in an mp4 container. Aspect ratios are not preserved: any clip from a source whose aspect ratio isn't 16:9 will be stretched to 16:9. I did consider preserving aspect ratios, but doing so would have meant that some clips had black bars either side of them and some didn't, which would be visually jarring. The output has no sound because it would be horrible if it had sound. I did try adding sound but it went hopelessly out of sync with the video and I couldn't be bothered to figure out how to make it work.
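
In case the script itself isn't to hand, here's a minimal sketch of the general approach. This is not the original script: it assumes a GNU userland (for shuf), a reasonably recent ffmpeg built with libx264, and ffprobe; the source directory, clip lengths and output settings are all placeholders.

#!/bin/bash
# Sketch only: plunder $SRCDIR for videos and stitch random clips together.
SRCDIR="$HOME/Videos"   # where to look for source material (placeholder)
TOTAL=60                # target output length in seconds
TMP=$(mktemp -d)
find "$SRCDIR" -type f \( -name '*.mp4' -o -name '*.avi' -o -name '*.mkv' \) > "$TMP/files"
elapsed=0
i=0
while [ "$elapsed" -lt "$TOTAL" ]; do
  file=$(shuf -n 1 "$TMP/files")                 # pick a random source file
  len=$(( (RANDOM % 5) + 2 ))                    # clip length: 2 to 6 seconds
  dur=$(ffprobe -v error -show_entries format=duration -of csv=p=0 "$file" | cut -d. -f1)
  [ -n "$dur" ] && [ "$dur" -gt "$len" ] || continue
  start=$(( RANDOM % (dur - len) ))              # random start point within the file
  # Re-encode every clip to the same size, frame rate and codec, with no
  # audio, so the final join can be a straight stream copy.
  ffmpeg -v error -ss "$start" -t "$len" -i "$file" -an \
    -vf scale=1280:720 -r 25 -c:v libx264 "$TMP/clip$i.mp4" || continue
  echo "file '$TMP/clip$i.mp4'" >> "$TMP/list.txt"
  elapsed=$(( elapsed + len ))
  i=$(( i + 1 ))
done
# Stitch the clips together with the concat demuxer.
ffmpeg -v error -f concat -safe 0 -i "$TMP/list.txt" -c copy mashup.mp4
rm -rf "$TMP"

Re-encoding each clip to identical parameters up front is what lets the final concatenation be a stream copy rather than yet another encode.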

Ideally there would be a sample of the output here. In practice there isn't, because of concerns about licences, credits and all that sort of stuff.


April 10, 2012

Handbrake on SLED 11 SP2

The recently released Handbrake 0.9.6 doesn't build on SLED 11 SP2, which is a shame. It's probably possible to make it build if you have more patience and knowledge of such things than I have, but 0.9.5 builds fine.

Get HandBrake-0.9.5.tar.bz2 from http://sourceforge.net/projects/handbrake/files/0.9.5/

If you don't already have it, get the SLE-SDK from http://download.novell.com/Download?buildid=NgW3ToaagDQ~ and use YaST to add it as an Add on Product.

Become root and install the packages you need

# zypper in -y -l yasm patch autoconf automake libbz2-devel libwebkit-devel libnotify-devel libgudev-1_0-devel fribidi-devel gstreamer-0_10-plugins-base-devel dbus-1-glib-devel libtool gcc gcc-c++ intltool gtk2-devel glib2-devel zlib-devel

Then as your regular usercode you can do

$ cd /tmp
$ tar xf ~/HandBrake-0.9.5.tar.bz2
$ cd HandBrake-0.9.5
$ ./configure --prefix=/local/handbrake --launch --launch-jobs=0 && cd build && make install

--launch-jobs=0 causes the build process to make use of however many cores/cpus your machine has. On a machine with an Intel Core i7-2600 @ 3.40GHz it took under two minutes. Assuming all goes well, run it with

$ /local/handbrake/bin/ghb

October 28, 2011

Raspberry Pi. An ARM GNU/Linux box for $25/$35.

Writing about web page http://www.raspberrypi.org/

About the size of a credit card. $35 for the model with Ethernet. 1080p H264 video playback. Negligible power draw. Powered via 5v micro USB. If you want a tiny low power device that's a 'proper' computer rather than a hacked router or similar, this is probably it. Hopefully you'll be able to buy one soon.


July 27, 2011

Nothing says I Love you like…

...a fridge magnet bearing the University Crest with 'I Love you' written underneath.

Nothing says I love you like

Yours for only ${price} at the University bookshop. They also sell things the existence of which makes some kind of sense.


June 29, 2011

Update a driver in an initrd file

Recently I found myself wanting to put SUSE Linux Enterprise Desktop 11 SP1 onto a machine with a Sandy Bridge chipset. This was a problem, as the e1000e driver in SLED 11 SP1 isn't new enough to support the network card in the machine. Having found an updated driver, I still had the problem that I wanted to be able to do installations over the network with AutoYaST and PXE boot. That didn't work because the initrd file being used for PXE boot didn't have the new e1000e driver in it, so the install failed almost immediately due to the absence of a network connection.

The solution is to make a new initrd file containing the new e1000e driver. It's far from obvious how to do this, but I found the solution at http://www.sharms.org/blog/2007/11/howto-add-updated-e1000-driver-to-sled-10-sp1/ and this post is basically just me duplicating that information, because you can never have such information in too many places. I've also expanded it a little to include instructions for some bits I had to work out myself. You can of course adapt the following for whatever module you might find the need to update.

First of all, make a new directory and unpack the current initrd into it

$ mkdir -p updated_initrd/initrd_unpack
$ cd updated_initrd/initrd_unpack
$ gunzip -dc /path/to/initrd | cpio -idmuv

Now get the new version of the e1000e module. I found this in an rpm on Novell's website which I needed to download and unpack to get the driver out of it.

$ cd ..
$ wget http://drivers.suse.com/driver-process/pub/update/Intel/sle11sp1/common/i586/intel-e1000e-kmp-pae-1.2.20_2.6.32.12_0.7-1.i586.rpm
$ mkdir rpmcontents
$ cd rpmcontents
$ rpm2cpio ../intel-e1000e-kmp-pae-1.2.20_2.6.32.12_0.7-1.i586.rpm | cpio -idv

Next copy the new driver over into where you unpacked the initrd

$ cp lib/modules/2.6.32.12-0.7-pae/updates/e1000e.ko ../initrd_unpack/modules/
cp: overwrite `../initrd_unpack/modules/e1000e.ko'? y

Now you need to update files called modules.alias and modules.pcimap using information that you get from the depmod command. You can get the information to put in modules.alias with

$ /sbin/depmod -n $(pwd)/lib/modules/2.6.32.12-0.7-pae/updates/e1000e.ko | grep ^alias > /tmp/newaliases

Then I made a copy of the modules.alias file with the information for e1000e removed from it

$ grep  -v ' e1000e$' ../initrd_unpack/lib/modules/2.6.32.12-0.7-default/modules.alias > /tmp/modules.alias

Add the new information to that file

$ cat  /tmp/newaliases >>  /tmp/modules.alias

And then replace the original file

$ cp /tmp/modules.alias  ../initrd_unpack/lib/modules/2.6.32.12-0.7-default/modules.alias
cp: overwrite `../initrd_unpack/lib/modules/2.6.32.12-0.7-default/modules.alias'? y

The process is the same for the modules.pcimap file

$ /sbin/depmod -n $(pwd)/lib/modules/2.6.32.12-0.7-pae/updates/e1000e.ko | grep '^e1000e ' > /tmp/newpcimap
$ grep -v '^e1000e ' ../initrd_unpack/lib/modules/2.6.32.12-0.7-default/modules.pcimap > /tmp/modules.pcimap
$ cat /tmp/newpcimap >> /tmp/modules.pcimap
$ cp /tmp/modules.pcimap ../initrd_unpack/lib/modules/2.6.32.12-0.7-default/modules.pcimap
cp: overwrite `../initrd_unpack/lib/modules/2.6.32.12-0.7-default/modules.pcimap'? y
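
As a quick sanity check before repacking (my addition rather than part of the original procedure), you can confirm the e1000e entries actually made it into the updated files

$ grep -c ' e1000e$' ../initrd_unpack/lib/modules/2.6.32.12-0.7-default/modules.alias
$ grep -c '^e1000e ' ../initrd_unpack/lib/modules/2.6.32.12-0.7-default/modules.pcimap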

Finally, make the new initrd file

$ cd ../initrd_unpack
$ find . | cpio --quiet -o -H newc > ../initrd
$ cd ..
$ gzip -v9c initrd > initrd.gz
$ mv initrd.gz initrd
mv: overwrite `initrd'? y
$ file initrd
initrd: gzip compressed data, was "initrd", from Unix, last modified: Wed Jun 29 12:31:49 2011, max compression

March 25, 2011

SFTP access to My.Files

If you’ve ever wanted to be able to access your 'H drive' or your department's shared volumes from a non-IT Services managed computer, or from your phone, or whatever device you have in mind, via an interface that's slicker than using a web page, you may wish to sign up for the pilot of SFTP access to My.Files. For details of how to participate see

http://go.warwick.ac.uk/its/servicessupport/datastorage/myfiles

The pilot’s been going a while but I kept forgetting to blog it. The documentation is all about using the ExpanDrive client, but any SFTP client should work. If you use GNOME or KDE then Nautilus and Dolphin respectively have built-in SFTP support.
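
For example, from a command line (the hostname below is deliberately a placeholder; use the server name given in the pilot documentation):

$ sftp yourusercode@<hostname from the documentation>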


December 13, 2010

White to black

Follow-up to Red to green. from Mike's blag

I acquired a lid from an iBook G3 which, unlike my G4, has a clear plastic case with paint on the inside. So I stripped the paint off and re-sprayed it. I chose black. Partly because the G4 iBook has a grey strip around the middle of the case, so the result is a case that is white on the bottom, grey in the middle and black on top. (The photo below totally fails to show the grey.) Largely because I had a can of black paint. Though said can turned out to contain only enough paint to cover about a third of the lid, so I ended up having to buy another can.

The result is this:

Black iBook

The new lid is a bit beat up, but no more so than one would expect. It's also currently held on with only three screws, the fourth having fallen into the sofa. I can't figure out how to retrieve it at the moment.

Not sure if the result was really worth the effort, but one has to try such things sometimes.


November 18, 2010

Harry Potter film excerpt leaked online

Writing about web page http://www.bbc.co.uk/news/entertainment-arts-11783413

Warner Bros said it was "working actively" to remove the video, which it said was "stolen and illegally posted" on file-sharing websites on Tuesday.

"We are vigorously investigating this matter and will prosecute those involved to the full extent of the law," it added in a statement.

I look at this story and think that maybe Warner Bros should save themselves the effort. It's not the entire film that's been leaked.  No one is going to think, "I was going to go see the new Harry Potter film at the cinema but now that the first 36 minutes of it are available on the Internet I'll just watch that instead." On the other hand some people who are unsure about whether or not they want to go to the effort and expense of going to see the film in the cinema might see the first 36 minutes of it and subsequently go see the film at the cinema. The leak could in fact be beneficial for Warner Bros.

It won't be long before the entire film is available for free online via torrents and the like, and being burnt onto DVDs slapped in a box with some dodgy cover art (E.g.) to be sold dirt cheap on market stalls in the far east. Consider that along with the possible benefit of the leak described above and I have to wonder: is it worth a lot of time and money (and it won't just be Warner Bros' time and money but also that of various law enforcement agencies) being spent on "vigorously investigating" the leak of a 36 minute segment?

Warner Bros also say

"We are working actively to restrict and/or remove copies that may be available,"

Anyone with a passing familiarity with the Internet ought to realise this is futile. They'll never find all the sources, because new sources will keep appearing.


