Sunday 27 May 2012

The myth of e-mail overload

As a recent convert to the cult of GTD (all praise the Good Book: my inbox has been emptied daily for three weeks, amen) I was interested to pick up my employer's internal propaganda rag and see an article on e-mail overload. The thrust of the article was that e-mail overload is caused by thoughtless senders recklessly CCing and replying to all, damn their eyes.

Thinking about this in the light of my recent experience with GTD, I came to the conclusion that the article was exactly wrong. E-mail overload isn't a problem of sender behaviour; it is a problem of recipient behaviour, specifically of recipients mismanaging their focus.

And what led me to conclude this? Humility is endless. I was edging into e-mail overload, and a few simple GTD-related changes of behaviour have convinced me that I wasn't being drowned in crap by idiots, but that instead I was the idiot, drowning myself in crap.

So what did I start to do differently? Here is the list, largely for myself, should I start backsliding.

1 Turn off all e-mail notifications and don't leave your inbox open

E-mail is not instant messaging. Better off thinking of it as an electronic memo system. Would you allow the office posty to walk up to you at your desk or wherever, force a paper memo into your hand and say 'hey, it's from Bob, he says thanks!' or 'HR have restructured!'? Would you let the posty do that ten seconds after every single memo was posted?

No? Then why do you allow the e-mail delivery system to do that? Is every e-mail really so important that you need to drop what you are doing to focus on it, even for a few seconds? If not, why don't you disable all e-mail notifications?

And why not close your inbox while you are at it? That way you can focus on what you are doing, which is probably more important. Make your default Outlook view your calendar or task list, something that'll help you concentrate on the work ahead.

2 Don't read mail in delivery order

Reading e-mail in the order it arrived gives it a false sense of urgency. Look, this happened! Then this happened! Wow! Also, it makes you dance from one topic to another and lose focus.

Instead, order by subject and within each subject read the latest first. You probably didn't need a blow-by-blow account of something that's already been sorted out. Also, it makes it easier to delete entire threads.

3 Deal with all your e-mail in one go

This is a real GTD thing, and something that is only really made possible if you do what David tells you to. Delete, do in less than two minutes, delegate or defer. If you are doing e-mail, then focus on doing e-mail: chew through the lot, starting at the top. Don't get sucked into other activities.

This is made easier if you move the entire contents of your inbox into a processing folder first. Incoming e-mail is a distraction, and seeing something arrive gives it a false sense of urgency. Also, it is pretty demoralising to see stuff coming in faster than you can deal with it. Just move it all out of your inbox, and leave any new stuff for the next round of processing.

4 Empty your inbox

Another real GTD thing. Never leave stuff in your inbox for later. Decide what action each e-mail entails, then delete, do in less than two minutes, delegate or defer. Deal with it once; otherwise you are duplicating effort. All the stuff that needs to be deferred should be put into an action list or filed.

There is nothing like an empty inbox to beat feelings of e-mail overload.

Wednesday 12 October 2011

For better or for worse

I've never thought I had much imagination. For example, when I faced my fiancee (or possibly the registrar, I forget exactly who I was facing) and vowed something about for better or for worse, I'm not sure I imagined exactly what that meant in practice. I had vague ideas about us and our families and what it meant in terms of duty to each other and to them, but the details?

I didn't imagine it meant trying to comfort my wife while, with her mother and sister in a room in Poole Hospital, we literally watched her father breathe his last.

I didn't imagine it meant watching my wife's mother bury her husband of 44 years in an eight-foot grave which she knew would one day be hers too.

I didn't imagine it meant holding our three-year-old son's shirt collar to stop him falling into the grave as I asked him to throw a handful of earth on the coffin 'to help bury Grandad', which he did, bless him.

I didn't imagine any of these details.

But what of it? Does it matter? No, it doesn't. I gave my word, and I stand by it. There is better and worse to come yet. I still can't imagine what that means in practice, although I've seen more of the possibilities now. But I've given my word and I'll stand by it. And I'll stand by my wife, for better or for worse.

RIP Gerry.

Friday 7 October 2011

Stupid smartphones

After just over a year of owning a smartphone, I've come to the conclusion that smartphones are really bad phones.

Don't get me wrong, I like the whole app, GPS, mobile web, media thing. I bought a smartphone to keep me sane in the wilds of Corfe Mullen, in a house untouched by broadband, in a town that can't sell me a copy of the Economist, and for that it was great. It's also been very useful for occasional satnav. And even on a 320 by 240 screen, ebooks have been great.

But as a telephone? You know, for making a few calls, sending a few texts? No, sorry, as a plain old phone, smartphones suck.

I've decided that the things I really want from a phone are, first, near infinite battery life, second, a decent keypad, and third, a decent level of robustness.

To that end, I've reclassified my smartphone as a mini tablet, sticking the cheapest possible pay-as-you-go SIM in it for data (giffgaff at 20p a day), since it is on Wi-Fi for most of its time. For voice and text, I've bought a cheap-and-cheerful SIM-free Nokia handset, a C1-02, which doesn't even have a camera, but also doesn't need charging every damned day, doesn't randomly pick up or drop calls when taken out of a pocket and doesn't have a screen that'll crack when dropped out of said pocket (I'm looking at you, Wildfire).

That's progress.

Sunday 7 August 2011

Node: strangely familiar and not in a good way

Programmers are always interested in new idioms and languages, generally as a route to making their lives easier. One of the latest trendy idioms is seen in Node, which promises an easy way to program systems, such as high-traffic web servers, that need a lot of concurrency and a lot of raw performance. Concurrency, performance and ease of programming have always had fraught relationships, so Node has produced a lot of interest. But is Node really the way forward?

Node is essentially a JavaScript library in which nothing ever blocks. Anything that could plausibly block, like I/O, takes a callback function, which is queued until a result is available. This approach has benchmarked well for web-server-type tasks.
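
To make the idiom concrete, here's a sketch of a Node-style non-blocking read transliterated into Go (a language we'll get to in a moment). The readFileAsync function is hypothetical, and the goroutine inside it is merely a stand-in for Node's event loop, there to make the sketch run. The point is the shape of the API: the call returns immediately, and the callback fires whenever the result turns up.

package main

import (
    "fmt"
    "os"
)

// readFileAsync is a hypothetical Node-style API: the call never
// blocks the caller; the callback is invoked later, once the result
// is ready. (The goroutine stands in for Node's event loop.)
func readFileAsync(path string, cb func(data []byte, err error)) {
    go func() {
        data, err := os.ReadFile(path)
        cb(data, err)
    }()
}

func main() {
    done := make(chan struct{})
    readFileAsync("/etc/hostname", func(data []byte, err error) {
        if err != nil {
            fmt.Println("read failed:", err)
        } else {
            fmt.Printf("read %d bytes\n", len(data))
        }
        close(done)
    })
    fmt.Println("readFileAsync has already returned")
    <-done // wait for the callback, or main would just exit
}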

So what about the non-blocking and the callbacks as a programming idiom? Easy to code? You see a lot of callback functions in Node. So many callback functions, in fact, that computer scientists will start to get a funny feeling that they've seen something similar before. Now where did we see lots and lots of callbacks? Erm, it's coming, yes, it was, er, yes, it was back in a class at university, probably the second year, something about interpreters, meta-interpreters perhaps, oh my good god no, surely not. It is! It's continuation passing! Ah, no, no, the horror! The flashbacks! The brain ache! Make it stop, make it stop, please!

Why would anyone do this to themselves again? People have been drawn to Node because of the benchmarks, but surely no one is going to put up with writing programs in a continuation-passing style? CPS is fine for writing a Scheme meta-interpreter for an assignment, but for production code? Really? Also, while Node handles concurrency, it is fundamentally single-threaded, and any true parallelism has to be bodged on.
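
To see why, look at what the style does to perfectly innocent straight-line code. Here's a sketch, again in Go, with hypothetical pipeline stages that do nothing but call their continuation k. The stages are synchronous stand-ins; only the shape of the calling code matters:

package main

import "fmt"

// Three hypothetical pipeline stages written CPS-fashion: each takes a
// continuation k instead of returning a result.
func read(src string, k func(string))      { k("data from " + src) }
func transform(s string, k func(string))   { k(s + ", transformed") }
func write(dst, s string, k func())        { fmt.Println(s, "->", dst); k() }

func main() {
    // Direct style would be: write("net", transform(read("disk"))).
    // In CPS the same pipeline turns inside-out and nests:
    read("disk", func(data string) {
        transform(data, func(out string) {
            write("net", out, func() {
                fmt.Println("done")
            })
        })
    })
}

Three stages deep and the indentation is already marching off to the right. Imagine a real request handler.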

Node does have a certain something though, and the benchmarks cannot be ignored. There is a lesson to be learned from Node, and the lesson is that concurrency can no longer be left to operating system threads. OS threads are too heavyweight. They use too much memory and impose an unacceptable context switching overhead when used with blocking I/O. Node shows that concurrency can be left to the programming language.

But what should that alternative look like? Node provides a proof of concept, but do we have to accept the cost of programming in a continuation-passing style? Well, no, we don't. CPS was invented as a way of specifying programming languages, so the burden can obviously be shifted into the language itself.

This is where Go comes in. Go introduces a very lightweight concurrency construct which they've called the goroutine. Goroutines have the Node-like property that they don't cause their OS thread to block. Even better, Go can run goroutines in parallel by multiplexing them on to operating system threads. And goroutines are so cheap that even my five-year-old laptop can run hundreds of thousands of the things.
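
Here's a quick sketch of what that cheapness means in practice. It isn't a benchmark, just the idiom: a hundred thousand goroutines, each written as plain, direct-style, apparently-blocking code, with not a callback in sight:

package main

import "fmt"

func main() {
    const n = 100000
    done := make(chan bool)
    for i := 0; i < n; i++ {
        go func() { // a goroutine costs a few KB, not an OS thread
            done <- true // blocks this goroutine only; its thread moves on
        }()
    }
    for i := 0; i < n; i++ {
        <-done
    }
    fmt.Println(n, "goroutines came and went without a single callback")
}

And the channel doing the donkey work there is no accident, as we'll see next.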

In a neat acronym twist, goroutines are partly inspired by Hoare's work on communicating sequential processes. So we can say that what Node is trying to do with CPS, Go does better using CSP. Nicely done, Bell Labs Google.

So where does that leave Node? The way forward? No, it isn't. But it makes an important point well, and it will be influential.

Saturday 16 July 2011

The new kids: Go, Scala and Clojure

Somehow I managed to sleep through a generation of programming languages. Blame a career diversion and the kids, but the whole Python and Ruby thing passed me by. And why bother with them anyway? What did they have that Perl didn't?

Now I find another generation has arrived in the form of Scala, Go and Clojure. And very promising they look too, inheriting a lot of the best features from their parents (or, more correctly in Scala's case, from its gene donors and, in Clojure's case, from its parent organism).

So where did they come from? What are their lineages? Which one should we be following?

Let's start with Go. Go was born in a New Jersey resort town after a drunken hook-up between C and JavaScript at a Bell Labs party. The father, C, is the archetypal taciturn systems guy. Wears black. Slightly unstable. The mother, JavaScript, is much younger and is fun in a kooky way. Go was raised in poverty by its grandparents, who had to move from Jersey to the West Coast in search of work. Being poor, Go makes everything it owns from scratch. Despite its hard start, Go grew into quite a fun adult – like its mom – and, even with a few noticeable quirks, is remarkably easy to get along with.

Scala – at the other end of the social spectrum – wasn't even conceived in the usual way. Scala was created by splicing genes together in a (German) Swiss laboratory. SML and Java had tried to conceive naturally, but they couldn't get over SML's religious mania and Java's bizarre XML fetishism. Their genetic material was padded out by a rogue lab technician with a whole load of other crazy stuff that happened to be lying around from past experiments. The resulting wunderkind was then sent to all the right schools. It inherited amazing riches from its parents. As an adult, Scala is refined company with an encyclopaedic knowledge, though it is – if one were to be critical – a bit too preachy and up itself on occasion.

Clojure is the daughter organism of the infamous Lisp bacterium. Lisp has once again budded, as it has many times since its inception in the 1950s. This time it has incorporated new genetic material from its hosts, allowing it to survive on the JVM. Typical of a Lisp-family bacterium, it retains the quirks of its parent while adding a few random mutations of its own. People infected by Clojure tend to lose the ability to use grammar. Carriers also tend to suffer from religious manias of an all-things-are-one sort.

So what does this diverse lot have in common? What trends can we see in this generation?

As with all generations, we see two effects. We see the reaction against their parents' values. And we see adaptation to a changed environment.

With Go and Clojure we see the reaction against their parents' values most clearly. The reaction is against their parents' love of hierarchy. In the Java generation, hierarchy was everything. Scala, the rich-kid institutional product, respects the tradition, though it makes concessions to its peers. Go and Clojure reject tradition completely. Go inherited its mother's distaste for hierarchy (but its father's love of structure). Clojure inherits the amorphous nature of its parent.

Go, Scala and Clojure all reject their parents' verbosity. Why did they have to use so many words? Couldn't they just get to the damn point without rambling and repeating themselves? First sign of senility, dude.

On to the environment. The new generation exists in a different, faster, less forgiving environment than its parents. The new generation has had to adapt to keep up. While their parents would get themselves into trouble trying to do more than one thing at once, the new generation excels at it. Go can keep hundreds of thousands of plates spinning where Java would have had a heart attack keeping up just a few hundred.

So who is likely to be the star of the new generation?

Rich-kid Scala certainly has the best pedigree and is the one least likely to upset the parents when brought home. Expect to see it around the more respectable and more academic districts.

Clojure's forbears have been around forever and proved hard to shift from their hosts. No doubt Clojure will show the same epidemiology.

Go has a nice internship with a big company, which could be its launchpad to better things. Who knows?

Whatever, good luck to them all. May they go forth and prosper. And of course sire the next next generation.

Friday 1 July 2011

Scala: coming to a course near you very soon

Java has been the default teaching language in computer science now for over a decade, and it has served its purpose. But times have moved on. Java is looking increasingly dated, but what is there to replace it? The answer is most likely Scala.

Why Scala? Because it ticks all the computer science boxes. Higher-order functions? Tick. Consistent object model? Tick. The static type system to end all static type systems? Tick. Call by name? Tick. Enough functional gloss to shut up the refugee mathematicians, sorry, pure computer scientists? Tick.

Fair play to Martin Odersky. He's taken all the good bits from the last fifty-odd years of programming language design and – the hubris of it – managed to shoe-horn them into one surprisingly neat language.

And who could possibly resist a language in which you can code infinitely long lists without having to resort to Haskell? Call by name, for heaven's sake! How, since the passing of Algol, did we live without it? How cool is this:

// A lazy list: the tail is passed by name, so it is not evaluated
// until somebody asks for it.
class LazyList[T](h: T, t: => LazyList[T]) {
  def head = h
  def tail = t
}

// Graft a lazy cons operator, :=>, on to any value.
implicit def lazyCons[T](h: T) =
  new {
    def :=> (t: => LazyList[T]): LazyList[T] =
      new LazyList(h, t)
  }

// Pair up two lazy lists, element by element.
def zip[T, S](l1: LazyList[T], l2: LazyList[S]): LazyList[(T, S)] =
  (l1.head, l2.head) :=> zip(l1.tail, l2.tail)

// Apply f to every element of a lazy list.
def map[T, S](f: T => S, l: LazyList[T]): LazyList[S] =
  f(l.head) :=> map(f, l.tail)

def addPair(t: (Int, Int)) = t._1 + t._2

// The Fibonacci numbers, defined in terms of themselves: each term is
// the sum of the two terms before it.
def fib: LazyList[Int] =
  1 :=> (1 :=> map(addPair, zip(fib, fib.tail)))

// Print the first 20 terms of the infinite list.
var l = fib
for (i <- 1 to 20) {
  println(l.head)
  l = l.tail
}
  
From a teaching angle Scala looks good too. No more semicolons to forget. Type inference to hide a lot of the nastiness that comes with static typing. The interpreter to make println("Hello, World") work without the embarrassment of scaffolding needed in Java. The return of thin-end-first learning. But with a production-ready language. Sweet.

So, Scala. Coming to a programming course near you soon. No bad thing.