Reinstalling and rebooting

I’ve just reinstalled my computer’s operating system – I’d messed around with it and changed so many random settings that I couldn’t figure out how to undo some of the odd things that were happening on it. While reinstalling, I noticed something a bit odd about the number of times I rebooted the computer. I use Apple’s Mac OS, so I’m not used to rebooting that often. During the reinstall process – the operating system, plus a number of applications – I rebooted a total of three times.

Reboot 1: from the old operating system into the installer.
Reboot 2: installing the latest OS updates from Apple.
Reboot 3: installing the mouse.

(When the installer finished, it did reboot the computer into the new operating system, but I’m not counting that, as the installer didn’t really load up a full operating system in order to install the new one. :) )

Note the odd one out. Why does something as simple as installing a mouse warrant a reboot of the whole computer? Ah yes, the mouse was made by Microsoft.

Need I say more?

Laptops in Lectures

This post has been brought on by this post on Slashdot (or more directly, this article). Basically, a professor at the University of Memphis, USA, has banned laptops from her classroom. As most people who’ve been in a lecture that I’m also attending will know, I use a laptop to take notes down. I’ve decided that I should really explain why I do this, and while I’m at it I’ll describe how I view lectures in general too.

Why do I use a laptop? Well, for starters my handwriting is pretty bad. It’s probably gotten worse now as I don’t write much any more, but even at the start I found reading typed notes much easier. Another reason, which has developed over time, is that I type an awful lot faster than I can write – so I can get down much more of the detail given in the lectures, which is useful when reading through the material again at some future point. And finally, I can type without thinking much about the typing – so I listen to the lecture, and understand more of the material given in it straight away.

What are the downsides to using a laptop? Well, at the start I had problems with equations – but I quickly found MathType, which lets me type equations rather than having to use the mouse to enter them. So I can now enter equations fairly easily and quickly, although more slowly than prose, as they require more keystrokes per character. For Greek letters, for example, I hold down the Apple key (I use an Apple Mac computer) and press the ‘g’ key; I then release the Apple key and press the key for the Greek letter – e.g. ‘d’ for delta, ‘g’ for gamma, and so on.

Another big problem, and one I have yet to resolve adequately, is diagrams. I’ve tried multiple approaches over the years. Using a mouse (or rather, a touchpad) to draw them takes too long. A graphics tablet can be confusing (you’re drawing in one place and it’s appearing in another – though you get used to this), messy (wires everywhere) and slow (mainly due to the software, but also the delays in picking up and putting down the stylus, especially when typing in labels for the diagrams). So at the moment I just get the pen and paper out, doodle the diagrams down, give each one an ID, insert the ID into the document at the appropriate place, and draw them in properly later.

If you’ve read the article I linked to above, you’ll know that the professor basically said that computers take up all of the student’s attention, and also create a ‘picket fence’ between the student and the teacher. If you’ve read my comments above, you’ll realise that I don’t think this is a problem. What I do think can be a problem with using a laptop in class is if the student isn’t using it to take notes – playing games, surfing the internet or chatting via IM has no place in a lecture – or if they’re using it badly, e.g. they can’t type fast. Also, if it distracts the teacher, or other students, then it’s not good. (Incidentally, if you’re in a lecture with me and I’m distracting or annoying you with my typing, let me know – I’ll probably ask you why it’s disturbing you, and if you’ve got a valid reason I’ll put the laptop away and use pen and paper. I’ll then chat to you after the lecture about how I can keep the laptop from disturbing you in the future.)

Now, on to my views of lectures in general. I firmly believe that the purpose of a lecture is to convey understanding of the subject material from the teacher to the student. It is not a group note-taking session; that only distracts the student from the subject. Note that this is actually contrary to what the physics department of the University of Manchester (where I am at the moment) officially states.

My ideal lecture course would have lecture notes provided beforehand (on paper, or via the web – preferably both; and either verbose lecture notes or presentation slides), which the student can read before the lecture starts. Then, the lecture goes through the material in the notes, with the emphasis on explaining the material and making sure the students understand it. Regular “Put your hand up if you understand what’s going on” prompts from the teacher should make sure that everyone’s paying attention, and also prevent the “I don’t want to be the only person to put my hand up” effect that often happens if you ask who doesn’t understand the material. Also, at the end of the lecture, give a quick summary of the lecture and say what will be taught in the next one – and make sure the students are paying attention, not packing up and trying to leave. In fact, it’s probably a good idea to start off the lecture in a similar way.

Broadcasting via the Net

In a departure from the usual, this post is about technology. Specifically, TV and the internet. Precisely, how to use the latter to transmit the former.

The internet’s developing pretty nicely – it currently connects a large proportion of the First World, and will hopefully make greater inroads into the Third World in the future. Of those connected to it now, a large proportion have nice, fast connections – easily enough to download TV-quality video in real time.

Yet we don’t have TV broadcast over the internet. Why not? One reason would probably be piracy concerns, but I won’t talk about those here. Another big reason is the sheer amount of bandwidth the broadcaster would need. Let me explain.

The internet works by you requesting data from a server, and that server sending the data to you via a series of relays. That data goes to you, and only you (excluding people snooping on it, but that’s another topic). So were you to receive a TV channel by the ‘net, then there would be a dedicated stream running from the server directly to you. For decent video quality, that requires a fair bit rate – and that bit rate needs to be delivered to a few million people simultaneously. That’s a huge amount of data that the TV station’s server needs to pump out – far more than is feasible.
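To put a rough number on “far more than is feasible”, here’s a back-of-envelope calculation. The bit rate and audience size are my own assumed figures, purely for illustration:

```python
# Back-of-envelope unicast bandwidth estimate.
# Assumed numbers (not fixed anywhere): a standard-definition
# stream at ~4 Mbit/s, sent separately to 5 million viewers.
bitrate_mbps = 4            # per-viewer stream, in Mbit/s
viewers = 5_000_000

# With one dedicated stream per viewer, the server's total
# outgoing bandwidth is simply bitrate x audience size.
total_gbps = bitrate_mbps * viewers / 1000
print(f"Server must push {total_gbps:,.0f} Gbit/s")  # 20,000 Gbit/s, i.e. 20 Tbit/s
```

Twenty terabits per second out of a single broadcaster is the sort of figure that makes the naive approach a non-starter.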

So, how can this be got around? My answer would be to mirror what TV stations currently do to a certain extent – broadcast something once, and let everyone get copies of it. How? Imagine the server, sending out a single stream of video. You want to get this to the millions of recipients. Those recipients are connected to the server via a whole set of wires, relays and routers. The last of these is important here. Whenever the signal gets to a router, and needs to go more than one way, the router should just send copies of it each way. Think of it as a tree system, with the broadcasting server at the trunk, the recipients as the leaves, and the routers as the points where branches sprout off.
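The tree idea above can be sketched in a few lines. The topology and names below are invented for illustration – the point is just that each router forwards one copy per outgoing branch, so the server only ever sends as many streams as it has immediate branches:

```python
# A toy model of the tree: the server sits at the trunk, viewers are
# the leaves, and each router copies the packet down every branch.
# This topology is made up purely for illustration.
tree = {
    "server":   ["router_a", "router_b"],
    "router_a": ["viewer_1", "viewer_2"],
    "router_b": ["viewer_3", "viewer_4", "viewer_5"],
}

def deliver(node, copies_sent):
    """Recursively fan the packet out, recording one copy per link."""
    for child in tree.get(node, []):
        copies_sent.append((node, child))
        deliver(child, copies_sent)
    return copies_sent

links = deliver("server", [])
viewers = [dst for _, dst in links if dst.startswith("viewer")]
print(len(viewers), "viewers reached")                              # 5
print(sum(1 for src, _ in links if src == "server"), "streams leave the server")  # 2
```

Five viewers are reached, but only two streams ever leave the server – the routers do the copying. With millions of viewers the saving at the trunk is enormous.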

There are a number of things you need in order to do this. Some are easy, some are very difficult. First, an easy one: you need to know all of the recipients that want the TV signal. That’s easy, because you just keep receiving the same requests as currently happen. Now, the difficult ones. You need to know the topology of the internet – the quickest routes to each of the recipients, and also the most economical (the routes which cut the number of copies, and hence the total amount of data travelling through the internet, down to a minimum). That’s difficult, but not impossible with a fair bit of math and computer programming.
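One simple way to get “quickest routes” out of a known topology is a breadth-first search from the server: each node is assigned the neighbour it was first reached from, and those parent links form a shortest-hop delivery tree. This is only a sketch – the graph below is invented, and a real network would weigh links by capacity and cost rather than hop count:

```python
# Sketch: build a shortest-hop delivery tree over a known topology.
# Node names and the graph itself are hypothetical.
from collections import deque

graph = {
    "server": ["r1", "r2"],
    "r1": ["server", "r2", "v1"],
    "r2": ["server", "r1", "r3"],
    "r3": ["r2", "v2", "v3"],
    "v1": ["r1"], "v2": ["r3"], "v3": ["r3"],
}

def shortest_path_tree(root):
    """Breadth-first search: parent[n] is the node n first receives from."""
    parent = {root: None}
    queue = deque([root])
    while queue:
        node = queue.popleft()
        for neighbour in graph[node]:
            if neighbour not in parent:
                parent[neighbour] = node
                queue.append(neighbour)
    return parent

tree = shortest_path_tree("server")
print(tree["v2"])  # r3 -- viewer v2 gets its copy from router r3
```

Real routing protocols solve a harder, weighted version of this problem, but the shape of the answer is the same: a tree rooted at the source.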

The most difficult problem is that you need to split the signal at the routers, which requires software running on the routers looking for the splitting commands. On the internet as a whole, that’s a huge number of routers – most of which would probably need replacing to be able to cope with this (Cisco and the like would probably love that). The good news is that not all routers need to be able to do this – you can substitute for those that can’t by using the current system of multiple streams, such that you end up with multiple trees.

I should say that this doesn’t only apply to TV broadcasting – it would apply to normal data too, if routers could combine pieces of data that are identical and are going to geographically close-together locations. That would probably cut down the amount of data in transit at any one time by a fair amount, in much the same way that zipping a set of files decreases the disk space needed to store them. It would also remove the problem of servers dying whenever large numbers of people simultaneously access them (e.g. the Slashdot effect).

I’ll finish with the downsides. First, this system would be very much time-based – the data would have to be requested, and/or transmitted, simultaneously to multiple recipients. The second is probably the killer – privacy and copy-protection. The routers would need to read through the content to some extent to process it, i.e. compress it and tag it with multiple destinations. People would probably consider that rather Big Brother-ish. Also, the so-called Digital Rights Management (DRM) would probably be incompatible with this system, as would encryption (as I understand it, both of these scramble the data in unique ways, such that only the intended recipients can view it – those streams would then have to be treated as separate data streams). But then, the current TV broadcasting systems don’t have DRM or encryption – anyone with a TV and an aerial can receive them. So maybe there’s hope for this idea yet.