The Cracked Mirror

I remember reading somewhere (perhaps here but I can’t find the reference now) that the business of programming is the act of producing a simplified mirror-model of the business processes that it is trying to encapsulate.

This seems an intuitive statement. If I come to your company and make software to help your business, it must capture at least some of the essence of what your business does. Indeed, the programs that my colleagues and I write should in some way mirror the businesses that they belong to. If they don't, then we have to consider that we have probably failed.

In 1997 I entered the finance industry knowing almost nothing about finance. Pure green. So, eager to learn, I thought that if I looked at the code I would be able to understand some of what was going on in the business. It turned out that this was as true as it was false. Yes, the code mirrored the business, but that particular mirror was cracked. What I thought I understood about the business was distorted by irrelevant detail.

It’s obvious when I think about it now, but the code that I was looking at had not been placed in front of me by an alien life-force (although some of the dudes were pretty strange); it had evolved. Code had been added to support business ventures that had subsequently been ended, or, even worse, code had been added that was just plain wrong. In both these circumstances the users of the system compensated for the semantic gap between business and system by doing what humans do best: working around the problem.

It seems then that this cracked mirror is inevitable because software decays. To really know what’s going on in your organisation you have to bang on doors. You have to ask the users the questions that make you look like Mr. Stupid. Only then can you build the model in your head of what is really going on.

What’s that you say? You’ve got business analysts? FIRE THEM! They don’t work, people. I mean, yes, they do work, but unless they are top-notch they create more problems than they solve.

Perhaps I’m preaching to the choir. Perhaps the choir went home. Perhaps I’ve been abducted by aliens and I’m still living in my 1997. Perhaps not.


Why I Make A Better Software Developer Than Parent

A while back a friend and colleague revealed to me that he took an interest in programming from the very young age of six. Today he is all grown up, just a little younger than me, and sadly (for me) a far better developer, in many respects, than I’ll probably ever be. If I’m being honest, though, I don’t actually think that it would have helped me very much to have started at six, seven years earlier than I actually did. The truth is I think I’m doomed to a life of mediocrity (in this arena at least) simply because my head isn’t as well wired for the task as his. Still, I wonder whether programmer parents exposing their children to programming at a young age is really a form of vanity-torture.

For me it’s the paradox of being a parent. I want to guide my kids into being socially acceptable people but I don’t want to tell them how to live their lives. My biggest hope is that they are happy people and I want to help them to achieve that. Nothing else really matters. It will matter, of course, if they wind up in jail because they stole my car whilst high on the latest techno-drug. Then I’ll be doing a lot of guiding, telling and most probably shouting. I am hoping though that it won’t come to that, but I’ll let you know how I get on. The heads on this tails-up coin is that I want to give my kids enough of everything so that they can make their own choices.

So, it was with some trepidation then that I wondered if I should teach my children a programming language, as suggested last week by another esteemed ex-colleague. He offhandedly recommended (I haven’t asked him if he actually tried it) a programming language called “Scratch”, developed by Mitchel Resnick’s Media Lab team at MIT. Now I don’t know if you know (I didn’t until two weeks ago) but the Media Lab and Media Lab people seem to be doing good things everywhere. You know that “One Laptop Per Child” thingy everyone’s talking about? Well, the founder of that association is Nicholas Negroponte, who is pursuing the OLPC project whilst on leave from the MIT Media Lab. Oh, and did I mention that Nicholas co-founded and directed the Media Lab too? Nice.

I was sold then. I thought I’d give Scratch a try myself and possibly expose my progeny to it. Bearing in mind that my daughter is a few years below the target age of 8+, it was possible that this could very well not work out at all. But she is a fairly competent mouse and keyboard user and is learning to read and write, so I thought there was a chance that she might like it if I did the driving.

The first thing I did was to tell her that Scratch was a game, and yes, it seems I lie to my kids too. Next I told her that there was this little cat called Scratch and we could make him do what we wanted. She immediately, and with that childish flair for the ridiculous, ordered him to go and eat an apple. Dutifully, I hunted about in the clip art for an apple but could find only a banana, so we settled on that. I showed her what I was doing: how, when I took the building pieces from one side, I could build up a list of things for him to do. She made a noise like she understood the concept, at least, so I was pleased.

After a while we got tired of watching the little cat eat the banana, so I said there were a few ready-made games to play. We tried them all out and the one she really liked was a game where a big fish chased little fish. Sometimes I’m really glad I’m not a psychologist. Anyway, when we were done I said that if she would like me to, I could make her a game of her own. Again, without even a pause for breath, she told me she would like a game where a dragon chased a princess.

I like challenges so I spent 15 minutes after her bath time trying to figure out how to do that and built up this charming fantasy below using all stock clip-art and photos from the Scratch install.

Help!  Save me!

I then built up a little “Scratch” program where the dragon would follow the princess about the screen if you moved her. Very pleased with myself (who’s the child now?), I declared to my daughter that I was done. I showed it to her and she liked it, but then something unexpected happened. She said, and I quote: “That’s nice daddy but it’s not really what I wanted. I want the princess to chase the dragon”. I explained that I’d made it behave like she’d asked, but that I could change it if she wanted, and so I did. Five more minutes passed, I wired up the two sprites the opposite way around and then, a little less pleased with myself, announced that I had finished. We had fun letting the princess chase the dragon and then she said: “I want to be the dragon”.

I wasn’t as much exasperated by this as surprised. I scratched my head, pointed silently at the screen, wagged my finger at it, made a shape with my mouth like I was going to say something and then didn’t. I then turned to her and said: “But if you control the dragon where will the princess go?”. She shrugged and then said “You can use the keyboard to move her”. Now then. Let’s stop and review. The important point here is that my rudimentary “follow” AI of the dragon sort of just happened. I just created it because that’s what I thought she wanted. However, the more I think about it the more I know that I did it wrong. What she wanted was obvious all along. She wanted a version of the big-fish chases little fish game where the chasing character was a dragon (not a big-fish) and the chased character was a princess (not a little fish).
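For what it’s worth, that rudimentary “follow” AI is about as simple as AI gets: every frame, point the chaser at its target and take a fixed-size step. Here’s a rough Python sketch of the idea (the coordinates, step size and sprite names are my own illustration, not the actual Scratch blocks):

```python
import math

def step_towards(chaser, target, speed=2.0):
    """Move the chaser one fixed-size step towards the target's position."""
    dx = target[0] - chaser[0]
    dy = target[1] - chaser[1]
    dist = math.hypot(dx, dy)
    if dist <= speed:  # close enough: land exactly on the target
        return target
    # normalise the direction vector and step along it
    return (chaser[0] + speed * dx / dist,
            chaser[1] + speed * dy / dist)

# The princess moves with the keyboard; each frame the dragon closes in.
dragon, princess = (0.0, 0.0), (10.0, 0.0)
for _ in range(5):
    dragon = step_towards(dragon, princess)
```

Swapping who chases whom, as my daughter demanded, is just a matter of swapping which sprite calls the follow step and which one the keyboard drives.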

In about 15 minutes my daughter had shown me exactly why requirements gathering is crucial. I’d let my guard down and taken one on the chin from a 5 year old. Now this was play-time so there weren’t any penalty clauses in our contract but if this hadn’t been play-time she would have sued my sorry white ass. I joke of course. Her lawyer won’t touch software contracts, not after the last time. But you get the picture.

Overall though I really enjoyed using “Scratch” and I enjoyed showing it to my daughter. I especially liked the way the code ‘blocks’ are shaped so that each building-piece has a limited number of places where it looks like it will fit. I also like the way the whole thing feels ‘chunky’, like big fat Duplo bricks.

Dragon AI

Finally, I also like the way everything is colour coded. All these things are real genius, since they give visual reinforcement that teaches the difference between the different elements of the program. More importantly, there are plenty of visual cues about what you could do next if you weren’t really sure. The only thing I didn’t like was that I had to drop my screen resolution so that all the fonts were nice and big. It would have been nice to be able to adjust them to suit my needs. But hey, it’s not a big deal.

So, there you have it. I tried and nothing bad happened. I might have created a revenue stream for tomorrow’s psycho-analyst or I might have created tomorrow’s psycho-analyst or maybe even programmer. Who knows or cares. It was just fun to do, that’s all.


Caring For Your Environment

I’ve been thinking recently about where all my spare time goes, because I just don’t have the time to do the things I want to do, development-wise. In my life I have a couple of obvious (non-family-related) time sinks:

  • XBOX 360: BioShock, Assassin’s Creed, Gears of War, …
  • Physical Exercise

Now clearly physical exercise is unnecessary for a desk-jockey like me right? But there’s evidence to suggest that physical exercise might make me smarter. I need all the smarts I can get so I guess I’ll be continuing that for now.
What about gaming? Well I’ve been gaming since Jet Set Willy so I don’t think this is really a time luxury anymore. It’s now a defining character trait.

That didn’t give me very much wiggle room, so I started to look more closely at what I actually do when I’m the big fat “H” in HCI. I discovered that I was spending some time on Project Euler, which is definitely not time wasted but is perhaps a little frivolous, so I stopped. But after that still no development work was getting done. Then I found I would spend a fair amount of time tending my development environment garden.

Gardening Again

Recent projects have included:

  • Switching my development machine from Gentoo to Ubuntu, ooo-ooo-ooo
  • Setting up SVN over SSH
  • Getting Emacs to provide true type font support
  • Upgrading Ubuntu to Gutsy for Compiz-Fusion support
  • Trying to get Gutsy to work with my f*cking ATI Radeon 9600 so I can actually use Compiz-Fusion
  • Trialling Lisp based tiling X window managers

And so it goes on. I can always think of something to do, and it’s very much like gardening, I think. I admit that I haven’t really done much real gardening, but when I did have a garden I found I could spend hours in it mowing, pruning, removing stones, painting fences, … you get the idea. The only difference is that with real gardening, the garden and its aesthetics are the objective. The objectives of my development environment “gardening” are less clear. I’m clearly not getting very much productivity benefit from trying to get Compiz-Fusion to work; it just makes me feel very powerful to be able to make an ATI graphics card work as expected with Linux.

What’s in your garden? Are the weeds ten feet high but you just can’t see them or could you submit it for a competition and win? Is this sort of OCD the preserve of the programmer or have I really lost it this time?


That Funny Nose Thing Again

In one of my previous jobs we had an unwritten rule that if you wanted to introduce a new programming language into the organisation you had to have a pretty good reason for it. When I joined the company around 1999 it was using C/C++, Tcl and a little Java. By the time I left it was using a lot of Java, a lot of Python (thanks, at least in part, to me), a bunch more C++ and a steadily growing amount of C#. I wasn’t exactly responsible for the addition of the other languages, but I think I contributed code in all of them.

Back then I decided that a small company should not adopt and keep that many technologies simultaneously within a single team without retiring some of the older code. From a company’s point of view it is in its interests to stop this in-house proliferation of tools and programming languages, because it makes the code base both harder to support and harder to integrate. But it seems, to me at least, that tools and programming languages are one part of the programmer condition. I just can’t get enough of them. They look shiny and new and full of promise. They are quite simply bewitching to me.

No one nose what I feel for you

Unlike Elizabeth Montgomery, programming tools have little sex appeal and don’t do that funny nose thing. You know, the nose thing that makes everything better when it all turns to shit at the end of the show.

It is therefore in my company’s interests to pick languages & tools that are general purpose because it will reduce the possibility of tools proliferation later. But I know the drill. Hey, I practically wrote the drill. Find something I want to use, find a reason why I want to use it or why what we have now is deficient. Then bitch and moan until I get my way.

Sometimes though the benefits of a switch to a different tool or programming language can be compelling. Steve Yegge claimed last month that his MUD Wyvern has so many lines of code in Java that it is simply unsupportable by one tool/person, and so he’s going to use Mozilla’s Rhino to reduce the line count. Yeah, that does sound like a good plan, but I think I’ll check back with Steve in 2010 to see how he’s getting on.

As I already mentioned in “Towers of Blub”, I have been on a personal quest for about 1.5 years now to find a more powerful programming language. At the moment I am learning Common Lisp. So it was that this week my Tabitha nose picked up the strong scent of a new programming language gaining ground. The new player is Scala. I read a couple of blog posts about it, had a look at a tutorial and a reference manual or two and was, as you Americans say, pretty stoked. I was thinking about when I was going to download it to see what it could do for me.

But then I was hit by a ten-foot wall of apathy. Whilst it’s interesting to see as much of the language and tools landscape as is humanly possible, I’m starting to wonder if it’s a very worthwhile use of my time. Perhaps I should stop evaluating all these different tools and languages and actually write some code. In fact, if I were to list all the technologies I’ve learned (instead of coding) and subsequently forgotten over the years, it would probably make quite a long list.

So I think I’ll do what Dan Weinreb’s going to do and just keep an eye on Scala to see what happens next. Since he is way smarter than me, I reckon this is a pretty safe bet. BTW, I’ve tried it before, people, and this technique really does work. Pick someone whose opinion you respect (this is obviously never going to be a politician, a teacher or a member of law enforcement) and simply base your opinion on theirs. You don’t really need a great deal of rhetoric to back your arguments up; just remember whose opinion you copied and come back to it later. So I’ll keep an eye on Scala, remember that I might need it one day, and save the rest of my time for some more serious keyboard intercourse and beer.

Then I had the other, slightly larger epiphany. More important, at least in hindsight, are not the tools and languages I use but the things that I do with those tools and languages. I’ll be more specific. The important things about what I do can be broken into technical and non-technical. The technical things, like distributed computing, defensive coding, testing, multi-threading, relational databases and networking, are important knowledge and experience that I draw on all the time and that are language- and tool-independent. Those are the things that I really need to know to know how to program. But the things that make me really effective (or would, if I were any good at them!) are the non-technical things like communication, interview and planning skills. I need to spend time developing all these other skills rather than finding the next new programming language or tool.

Bewitched first aired in the US on 17 September, 1964. A time when COBOL was pretty shiny and new. In 1991 I had to learn COBOL for my undergraduate degree. I have not used COBOL since the last programming assignment we did and I remember almost nothing about it. But I did learn something valuable from that assignment because it was the first time I had ever tried to produce a piece of software in a team.

So, it seems then that I really shouldn’t care very much about what I have to create my solutions ‘in’, as long as I don’t have to use too many. I think there are ways round most programming language deficiencies, unless you use COBOL of course.