All 6 entries tagged Webcam
January 14, 2014
Webcam timelapse – 2013
Follow-up to Webcam timelapse – January 2013 and February 2013 from Mike's blag
Yep, this is the whole of 2013 through the University webcam. I say whole; there will be a few small gaps of a minute or so here and there, because the machine running the script that grabbed the images did not have 100% uptime. Also, it seems that at 15:23:26 on 19th March the webcam crashed or something, because all the images from then until 11:35:01 on 21st March are the same.
As with previous timelapses, images were grabbed from the webcam once per minute. The video is made at 48 images per second, so each day lasts about 29 seconds and the whole video comes to 2 hours 59 minutes and ~1.2GB. No, I haven't sat and watched it all the way through.
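Those figures roughly check out as a back-of-envelope sum, ignoring the gaps:
$ echo $((1440 / 48))            # 30 seconds of video per day
$ echo $((365 * 1440 / 48 / 60)) # 182 minutes, i.e. just over 3 hours, for the year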
I put the video together by making a video for each day then joining them up. It could be done all in one go but making separate videos means it's easier to spot issues. For example I noticed the video for 20th March had a considerably smaller filesize than the others and that the videos for 19th and 21st were also slightly smaller than average. It also reduces the risk of leaving something running, checking it two hours later and finding all the output is garbage.
I used ffmpeg. The command for each day's video looks like
$ ffmpeg -r 48 -pattern_type glob -i '*.jpg' -an -vcodec libx264 -f mp4 -threads 0 -b:v 1000k foo.mp4
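As a rough sketch, the per-day loop looks something like this, assuming the images are sorted into one directory per day (the YYYY-MM-DD layout here is just for illustration, not necessarily how things were actually arranged):
for day in images/2013-*; do
    ffmpeg -r 48 -pattern_type glob -i "${day}/*.jpg" -an -vcodec libx264 \
        -f mp4 -threads 0 -b:v 1000k "mp4s/$(basename "$day").mp4"
done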
It took about two hours to generate all the videos on a 2.8GHz Intel Core 2 Quad. (A single video took about 16 seconds. On a 1.6GHz Intel Core Duo a single video took about five and a half minutes, and on a 1.2GHz ARM Marvell Kirkwood it took about 42 minutes.)
To join them up you need to make a list of all the filenames
$ for i in mp4s/*;do echo "file '${i}'" ;done > list.txt
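That produces a list.txt with one line per video, something like this (file names here are just illustrative):
file 'mp4s/2013-01-01.mp4'
file 'mp4s/2013-01-02.mp4'
...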
Then use ffmpeg's concat demuxer
$ ffmpeg -f concat -i list.txt -c copy -movflags faststart webcam2013.mp4
The -movflags faststart argument tells ffmpeg to 'Run a second pass moving the index (moov atom) to the beginning of the file.' This means that when the video is viewed in a web browser, playback can start straight away rather than waiting for the entire video to be downloaded.
March 09, 2013
Webcam timelapse – January 2013 and February 2013
Follow-up to Webcam timelapse – Monday 10th December 2012 – Sunday 16th December 2012 from Mike's blag
This time, entire months. As before, images were grabbed from the webcam once per minute and the videos are made at 48 images per second.
January 2013. If you want to see snow, skip to 6:00. There are several instances of snow. You can also see that the snow hangs around for ages on the roof on the far right.
February 2013.
December 20, 2012
Webcam timelapse – Monday 10th December 2012 – Sunday 16th December 2012
Follow-up to Webcam timelapse – Friday 7th December 2012. from Mike's blag
Same deal as before, but this time it covers an entire week and the videos have been optimised for streaming, so they'll start playing straight away rather than making you wait for your web browser to download the whole thing. Depending on your patience there are three versions, presented here in decreasing duration.
10 images per second, duration approx 16:38
24 images per second, duration approx 6:56
48 images per second (y'know, like Peter Jackson did for The Hobbit. Only without being anything like that at all), duration approx 3:28
I've just realised these videos don't play in Firefox on my Linux machine. I can play the videos on my Linux machine (I made them on my Linux machine), just not in Firefox. Lack of H264 decoding capability, I guess, which presumably means they won't work in Firefox on Windows either. I'm not curious enough to boot Windows and find out. The day-long video was made on and posted from my Mac, on which Firefox happily plays back the videos, but my Mac is old and its ffmpeg only manages to encode the videos at about 2fps. So I built ffmpeg with libx264 support on my not-as-old Linux machine to encode the week-long ones in a sensible time. I guess I could upload an flv fallback video, but that would mean making such a thing and I can't be bothered, at least not right now.
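For reference, a source build with libx264 support goes roughly like this (the exact configure flags vary by ffmpeg version and by whatever else you want compiled in):
$ ./configure --enable-gpl --enable-libx264
$ make
$ sudo make install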
Edit: They do play on another Linux machine of mine with Firefox and Totem plugin. Totem uses gstreamer and on the machine in question there's a gstreamer plugin that supports H264.
December 09, 2012
Webcam timelapse – Friday 7th December 2012.
Friday 7th December 2012 as seen through the University webcam in approximately 2.3 minutes.
Images were grabbed at a rate of one per minute, which in theory gives a total of 1440 images. There are actually only 1420, as 20 images scattered throughout the day didn't get saved for whatever reason. The video is constructed using 10 images per second, so the total run time should be 142 seconds. According to ffmpeg the actual length of the generated video is 00:02:21.90. Don't know why.
It just occurred to me that the video isn't optimised for streaming, so it all needs to download before you can start to watch it. There's something in ffmpeg to do such optimisation apparently, so I should look into that at some point.
Coming soon, versions that cover a week, a month and a year. Maybe. And for varying values of soon. Imagine how long it could take to get a year's worth of images! I obviously wouldn't be able to get a consistent one image per minute for an entire year; that's clear just from the 20 missing images in a single day. It'd be an exercise in maintaining uptime really. Or having more than one machine grabbing the images. By my calculations a year's worth of images would be about 11GB, and a video created using 10 images per second would be well over 14 hours long.
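The back-of-envelope sums, assuming an average image size of around 22KB (which is a guess):
$ echo $((365 * 1440)) images                        # 525600 images in a year
$ echo $((365 * 1440 * 22 / 1024 / 1024)) GB approx  # about 11GB of jpegs
$ echo $((365 * 1440 / 10 / 3600)) hours approx      # just over 14 hours at 10 images per second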
October 28, 2009
University webcam image grab script thing
For no particularly good reason I've written a Bash(*) script which grabs an image from the University webcam and optionally does stuff to it. Aside from Bash it requires ImageMagick and curl, is located here and works thus:
mike@continuity:~$ ./uow_webcam_grab
uow_webcam_grab - script to grab an image from University of Warwick webcam.
Usage: uow_webcam_grab [options] [filename]
Image format is jpg unless -p is used, in which case it's png. A .jpg or .png extension is automatically added to the supplied filename if not specified.
If no filename specified output is written to stdout.
Available options:
-t : Trims image to remove the date stamp.
-h : Halves the width and height of the image. (Can be used multiple times.)
-p : Creates polaroid style snapshot. (Requires ImageMagick 6.3.1-6 or higher.)
-c : Caption for polaroid. (Does nothing if -p not used.)
-g : Converts image to greyscale.
-n : Negates the image. (white -> black. yellow -> blue. etc.)
-s : Converts image to sepia like from days of yore. (Requires a version of ImageMagick higher than 5.5.3 but don't know how much higher. Ignored if filesize of retrieved image is less than 40k since that indicates image is mostly black and sepia conversion of such an image looks weird.)
E.g.
mike@continuity:~$ ./uow_webcam_grab -htp -c "$(date)" camgrab_$(date +%F-%H-%M-%S)
creates a png image called camgrab_2009-10-28-20-17-09.png which looks like:
The view from the webcam is not very interesting after dark.
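For the curious, the core of such a script is only a few lines. This is a stripped-down sketch rather than the actual script: the webcam image URL is a placeholder (the real one lives in the linked script) and only a couple of the options are shown.
#!/bin/bash
# Sketch only: URL is a placeholder and just the resize and sepia behaviour is shown.
url='WEBCAM_IMAGE_URL'   # placeholder, not the real address
tmp=$(mktemp)
curl -s -o "$tmp" "$url" || exit 1
# Skip the sepia conversion for small files: under ~40k the image is mostly
# black (i.e. it's night time) and sepia-toning that looks weird.
if [ "$(wc -c < "$tmp")" -gt 40960 ]; then
    convert "$tmp" -resize 50% -sepia-tone 80% "camgrab_$(date +%F-%H-%M-%S).jpg"
fi
rm -f "$tmp"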
(*) Or maybe it's bash, or BASH. It seems to depend where one looks. Does it matter? Well, given that the Unix-like operating systems on which one would usually use it usually use a case-sensitive file system, which would allow the creation of files named bash, Bash and BASH in the same directory on account of them being different file names, then... oh, who am I kidding, no, not really.
October 31, 2008
Lunchtime webcam experiment
Writing about web page http://www2.warwick.ac.uk/about/campus/webcam
Take still images from the University webcam at five-second intervals, apply colour masks, assemble in a grid.
Click for larger version
The script used to generate it is here. It requires bash, ImageMagick and curl, and is somewhat hackish.
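The gist of it, as a sketch rather than the actual script (the URL is a placeholder and the colours and grid size are just illustrative): grab a frame every five seconds, tint each one a different colour, then tile them with ImageMagick's montage.
url='WEBCAM_IMAGE_URL'   # placeholder, not the real address
colours=(red orange yellow green blue purple)
for i in "${!colours[@]}"; do
    curl -s -o "frame_${i}.jpg" "$url"
    # apply a simple colour mask by tinting the frame towards the chosen colour
    convert "frame_${i}.jpg" -fill "${colours[$i]}" -tint 60 "tinted_${i}.png"
    sleep 5
done
# assemble the tinted frames into a 3x2 grid
montage tinted_*.png -tile 3x2 -geometry +2+2 grid.png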
Edit: The larger version is not quite as large as it should be. It seems the image was resized when I uploaded it, for some reason. I don't have time to look into that right now though.