Day 3 Keynotes
Writing about web page http://conferences.oreillynet.com/cs/et2005/view/e_sess/5910
Neil Gershenfeld – Bits & Atoms
The state of the art in fabrication is the chip factory. Actually, right now it's not very sophisticated: you spread some stuff around and cook it. Compared to biology, the big difference is that the things you're making don't know anything about being made – whereas when you make an animal, the cells know how to make more cells: the specification of the structure lives within the structure itself.
For traditional manufacturing, errors in the output are proportional to noise in the process. In signal theory (e.g. in networks), a certain amount of noise can be tolerated without causing any errors in the system. If we can build fabrication processes where the object being fabbed knows its own specification, we get the same kind of noise tolerance (e.g. genes can cope with errors and still make an organism).
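The coding-theory idea being borrowed here can be sketched with the simplest possible error-correcting code, a 3x repetition code: as long as the noise stays below the code's threshold (here, at most one flipped copy per bit), the decoded output has no errors at all.

```python
# A minimal sketch of noise tolerance via redundancy: a 3x repetition
# code with majority voting. Below the threshold (one flipped copy per
# bit), noise in the channel causes zero errors in the decoded output.

def encode(bit):
    """Repeat each bit three times (a trivial error-correcting code)."""
    return [bit, bit, bit]

def decode(copies):
    """Majority vote: recovers the original bit if at most one copy flipped."""
    return 1 if sum(copies) >= 2 else 0

# Inject noise below the tolerance threshold: flip one of the three copies.
noisy = encode(1)
noisy[0] ^= 1               # single-bit error
assert decode(noisy) == 1   # the noise has no effect on the output
```

The analogy in the talk: a fabrication process that carries its own specification could correct manufacturing noise the same way a code corrects channel noise.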
$20K buys you a field fab lab: a laser cutter, sign cutter, micron-scale milling machine, and a microcontroller programming setup. Microfabrication is now in the same place that computing was about 25 years ago (e.g. when minicomputers like the PDP were around). The PC equivalent of a microfabricator is not far off.
Fabrication labs at this scale are a disruptive technology. Neil's group have been introducing them into developing countries to see what can be achieved. Answer: all sorts of cool small-scale solutions to local problems.
Cory Doctorow – All complex ecosystems have parasites
- AOL chooses to allow spam through despite the cost, because if you solve spam you break email. Uncontrollability is a key element of a fault-tolerant system like email
- DVD has been developed to be controllable; CDs were not. The result is that if you invest in CDs, you can re-use them as MP3s, ringtones, etc. With an investment in DVDs you never get any increase in value.
- The DVD control model is fragile and unscalable; trying to extend it out to other devices – wider DRM – won't work, or will cripple the industry if it does. DRM isn't working now – any movie is available over P2P, despite the huge costs of implementation.
Justin Chapweske – Onion Networks
$2 billion a year is spent on HTTP optimisation: load balancers, caches, etc. This is at least partly because HTTP is sub-optimal for the size of the web.
- HTTP is very bad at transferring large (multi-GB) files – packet loss, broken 32-bit apps, etc.
- One solution is to use very high-quality transports, but it would be better to have a fault-tolerant transport (like RAID for storage)
- swarming is RAID for the web: tolerates failures of transport and failures of servers
- Swarming features: it's a content-delivery system; data is signed and encrypted, so you don't need to trust the host you download from; it runs over standard protocols as an extension to HTTP
- A standard Java HTTP stack replacement is available (open source)
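The "RAID for the web" idea can be sketched in a few lines. This is a toy illustration, not Onion Networks' actual protocol or API: a file is split into chunks, each chunk carries a hash so data can be verified without trusting the host it came from, and the client pulls each chunk from whichever mirror serves a valid copy. Mirrors here are plain dicts standing in for HTTP servers.

```python
# A toy sketch of swarming: per-chunk hashes let the client accept data
# from untrusted mirrors, and any mirror can supply any chunk, so the
# download tolerates both corrupted data and missing servers.

import hashlib

def chunk_hashes(data, size):
    """Split data into fixed-size chunks and hash each one."""
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    return [hashlib.sha256(c).hexdigest() for c in chunks], chunks

def swarm_fetch(mirrors, hashes, size):
    """Fetch each chunk from the first mirror serving a valid copy."""
    out = []
    for i, h in enumerate(hashes):
        for mirror in mirrors:
            chunk = mirror.get(i)
            if chunk is not None and hashlib.sha256(chunk).hexdigest() == h:
                out.append(chunk)
                break
        else:
            raise IOError("chunk %d unavailable from all mirrors" % i)
    return b"".join(out)

data = b"a large file, split into pieces" * 4
hashes, chunks = chunk_hashes(data, size=16)
# Two imperfect mirrors: one is missing half the chunks, the other
# serves a corrupted chunk 0, which the hash check detects and skips.
m1 = {i: c for i, c in enumerate(chunks) if i % 2 == 0}
m2 = dict(enumerate(chunks))
m2[0] = b"corrupted!"
assert swarm_fetch([m2, m1], hashes, 16) == data
```

The RAID analogy holds at both levels: redundancy across mirrors tolerates server failure, and the hash check tolerates transport failure, just as parity tolerates disk failure.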
Jimmy Wales – Wikipedia & the future of social software
- 500K entries
- taxonomy: 350K categories, hierarchical, dynamic
- 500MM page views / month
- the original dream of the net – people sharing information freely
- problems – quality control, author fatigue
- solution: wiki[pedia|cities] – a social computing successor to 'free homepages'. Uses a free content license so that people can take their content with them if they want to leave
- sites are maintained by communities rather than by an individual, thus mitigating the risks of quality control and author fatigue
- wiki software doesn't enforce social rules – for example, the 'votes for deletion' page
- wikipedia is a social innovation, not a technological one.
- software which enables collaboration is the future of the net
Panel discussion – Folksonomy
Why do companies allow end-users to participate in tagging?
In flickr's case it was primarily done for the individual user and then aggregated; in Wikipedia's case it was primarily done for the community. SB (flickr): folksonomies are not a replacement for a formal taxonomy, they are an addition. JS (del.icio.us): also started from the assumption that tags were a personal thing, and just let the folksonomy emerge.
Some tags are nothing to do with categorisation e.g. toread on del.icio.us, even though they are interesting as a social behaviour
flickr / del.icio.us are different to Wikipedia, because they start with individual spaces and then aggregate them, whereas WP starts with a shared space and uses negotiation/governance to manage it. The individual approach is less optimal for the social stuff – e.g. people tagging pictures of their trip to Mexico as 'etech' because they went just before the conference – right for the individual, but it breaks the aggregation.
JS: Although you can link tags between del.icio.us / flickr / Technorati, it's not always appropriate – the tags mean different things in the different applications.
Q: How do you provide feedback to people to improve their tagging? In Wikipedia it's easy; in flickr it doesn't matter – the primary purpose of a flickr tag is personal. Also, the volume of pictures is so great that you don't need a perfect vocabulary. In del.icio.us, there are some tools to help you see which of your tags are also used by others.
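The individual-space-then-aggregate model the panel describes can be sketched as follows. The usernames, URL, and tags below are invented for illustration; this is not del.icio.us's actual data model, just the shape of the idea: each user tags in a personal space, and the folksonomy is the aggregation.

```python
# A sketch of "personal tags, aggregated": counting tag use across
# users, and showing a user which of their tags others also use
# (the kind of feedback mentioned for del.icio.us).

from collections import Counter

# Hypothetical data: each user's personal bookmark -> tags mapping.
bookmarks = {
    "alice": {"http://example.com/paper": ["toread", "folksonomy"]},
    "bob":   {"http://example.com/paper": ["folksonomy", "tagging"]},
    "carol": {"http://example.com/paper": ["toread"]},
}

def tag_counts(bookmarks):
    """Aggregate all personal tags into a global per-tag count."""
    counts = Counter()
    for user_tags in bookmarks.values():
        for tags in user_tags.values():
            counts.update(tags)
    return counts

def shared_tags(user, bookmarks):
    """Tags this user applied that at least one other user also used."""
    mine = {t for tags in bookmarks[user].values() for t in tags}
    others = tag_counts({u: b for u, b in bookmarks.items() if u != user})
    return sorted(t for t in mine if others[t] > 0)

assert shared_tags("alice", bookmarks) == ["folksonomy", "toread"]
```

Note that nothing here forces a shared vocabulary: the overlap emerges (or doesn't) from individual choices, which is exactly the etech-tag problem described above.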
SB: formal taxonomies are ultimately limited because (as far as we can tell) the real world isn't easily classified.