Friday, 24 July 2009

The Stone Age.

I was having dinner with lots of computer scientists last night; these things happen in academia (I love my job). The older scientists told us stories about what it was like to do computer science in the olden days, particularly one of my supervisors, Professor Alan Winfield. He's a real engineer! I don't want to spoil his anecdotes, but this one is amazing. When he was doing his PhD back in the 70s, one of the academics very excitedly brought in their new (first generation) Intel processor. There was absolutely no software running on it (like, y'know, an operating system); it was just some hardware attached to a board, without any support. So what he and a few others did was develop a compiler for it, a system for downloading code, a bootloader, and all the other stuff needed just to make the thing work (this is the story as Alan tells it). These days just one of those things would be a PhD-worthy endeavour; they did it just because it was there! I wouldn't even know where to begin; I don't know what a bootloader even is. It's amazing: I have a Master's degree in electronic engineering and I know NOTHING about real engineering like that. How was it possible for me to get that degree? When I went to Aberystwyth there were guys there (specifically Mark Neal and Colin Sauze) who could just knock some stuff together and turn it into a fully working robot!
Anyway, back to the point. Alan mentioned that they transferred code between their giant processors and computers using punch cards. Punch cards! Really, that's only one level up from bashing rocks together. When debugging, it was easier to edit the code on the card than on the processor, so he would debug things using a hole punch and some white tape. He had another story about how he debugged a problem in his code using an oven! Computer science, it would seem, was difficult in the past. You wrote a program; the next day it would be compiled; the day after that you might be able to get time on the university computer to run your code. What discipline it must have taken to get your code perfect first time. How much dedication must you have needed to debug something on punch cards or wait two days to see the results of your code?

The point I'm making here is that technology has come a LONG way since the late 70s / early 80s (hereafter referred to as "the stone age"). Even during my lifetime there's been a hell of an improvement. Over the same period there's been a dumbing down of us as computer scientists. I don't think I know anyone (apart from Alan, obviously) who is so capable and dedicated that they could work with this old technology and remain motivated and sane. The sheer range of knowledge required is astounding! So much of that tedious, difficult work is done for us these days. It's easy to write something that doesn't quite work; there's no rigour. I don't need to read and re-read my code to check for syntax errors or memory leaks: it takes 2 seconds, not 2 days, to compile and run code. The level of dedication and perseverance that was required to complete a computer science PhD in the stone age has been lost. What will things be like in the future, when I'm Alan's age? Will I be telling young students about how, when I started university, computers only had ONE core? Will the role of programmer be usurped altogether? Will robots assemble themselves out of a heap of component parts? Given the current rate of technological development, the gulf will be even bigger than the one between today's technology and punch cards. A prospect both scary and exciting: exciting in that we'll have new toys, but scary to consider how much lazier we'll have become.

3 comments:

  1. Thanks Jenny - I'm flattered! Here's a blog post from 2007 with the full story of that first 8080 development board:
    http://alanwinfield.blogspot.com/2007/03/you-were-lucky.html

    Alan.

  2. That's a crazy story! It has inspired me to better myself and get an Arduino board to play with in my spare time. Adam presented them on Friday. For £25 you get something roughly on a par with your old Intel 8080: 1K EEPROM, 2K RAM, 32K of memory and a 20MHz processor. Hot stuff!

  3. I'm not sure modern day Computer Scientists are dumbed down (at least not the competent ones!). My Dad did a Physics PhD at a similar time to Alan and had all the same marvelous debug techniques for punch cards. It took him a year of compiles before a technician told him about the brand new STORE_PROGRAM instruction, which meant he could do something marvelous: not have to compile every month. He worked in computing from then until he retired last year.

    However, he has less skill with BIG modern software architecture: something spread over 4 operating systems and networks of desktops and servers, using 5 scripting languages and 2 data interchange formats, that eventually spits out an AJAX web app for several million users to asynchronously talk to some cloud storage service running in the Amazon cloud via a custom XML protocol.

    We live in an age of software over hardware. We want generic, reliable hardware that can be programmed. This is why embedded microcontrollers are taking off more than ever: we have a surplus of young people who know software, but who could in no way reverse engineer an interface to an unknown CPU or solder together lots of components to produce logic circuits.

    The kind of knowledge that is relevant and the skills needed have changed dramatically since the 1970s. The generation before did a big chunk of the harder work; now we just have hard applications. I agree, though, that I can't imagine ever being able to pull off that old-school low-level wizardry, but we should also value the modern high-level wizardry :)

    I've bought myself some microcontroller kit to play with too; seeing Paul and Wenguo work on electronics is inspiring.
