Tuesday, December 22, 2009

Building Tomorrow's Legacy Systems Today

I started this blog with the aim of sharing my views and anecdotes about the world of IT with anyone out there who might be interested. There isn't any great master plan or strategy in operation other than that I aim to post something at least once a day. As I'll be on holiday until the New Year, I figured I'd write this last post of the noughties and pick up the common themes that have emerged over the last few months.

The most obvious one is "wasn't everything great in the old days?". That's a real shame, because nostalgia shouldn't have a meaningful place in today's IT. As technologists we should surely be looking forwards, not backwards. Mostly, though, the realisation dawning on me is that we've lost a golden opportunity in IT to do something significant. A similar theme is explored in a friend's blog http://excapite.wordpress.com/2009/12/23/is-the-iphone-really-the-avatar-of-mobile-phones/ which compares the advances in CGI technology with those in IT over the last 15 years.

Why? Lots of reasons I suspect, but my enduring hope for the next decade is that we can rediscover just what it was about the world of IT that made it so fresh and exciting twenty years ago. To do that we will have to unload some of the legacy baggage that is holding us back (at a recent expo I saw a quote that some 80% of IT budget is spent on keeping the lights on!). Maybe that will be the real benefit of outsourcing and offshoring. I certainly hope so, because I for one am getting a little exasperated with "building tomorrow's legacy systems today".

Billion Dollar Brain

In a recent post I speculated about the concept of Apple making a television. In part it was a flippant post, but interestingly enough this article on TechCrunch heads off on a similar tangent about Apple's intentions in the TV space. http://www.techcrunch.com/2009/12/21/apple-tv-kill-cable/

The real standout fact that hits me is that Apple is building a $1 bn data center. http://www.datacenterknowledge.com/archives/2009/05/26/apple-planning-1-billion-idatacenter/

To date I've always considered Apple a design and device shop - but spending $1 bn on something to support iTunes and MobileMe? I don't think so. You'd have to sell a lot of singles on iTunes to fund that (well, a billion I guess!).

Whatever the plans Steve Jobs has for this datacenter I suspect that the ambitions are considerable.

The IT Wars

The history of IT is littered with technical, ideological and mindshare wars. A few examples are listed below:

The desktop wars: Microsoft Windows vs IBM OS/2
The server wars: Mainframe vs midrange
The RDBMS wars: Oracle vs Ingres
The browser wars: Internet Explorer vs Netscape
The Search Engine Wars: Google vs Yahoo

My question today is: where is today's IT war?

I just can't think of one. Yes, there's lots of competition out there, but I can't think of a really big battlefront that is dividing IT departments right down the middle. Years ago lots of committed IT people were passionate about their decisions and allegiances.

I suppose the answer is that IT has matured to the point where everything is pretty much a commodity. I saw evidence of this when I was recently interviewed for a Data Warehouse Architecture role at an SME software house. Ideologically I hit it off with my potential boss and I know it could have been a great opportunity for both me and the firm. However, they were at the point of selecting a BI technology stack to develop their products on and had correctly concluded that, technically, it was all pretty much of a muchness out there. The selection was therefore made on commercial and not technical grounds, which didn't go my way. No hard feelings; I totally understood and endorsed the logic of their decision.

My issue with all of this, however, is that each of those wars was fought out in an emerging arena that changed the face of IT. Taking the examples above, these were:

The adoption of the PC as a business tool
The challenge to mainframes from lower cost midrange computers
The move from flat file and hierarchical database to SQL
The internet explosion

So if there isn't a war being fought, I'd infer that we aren't on the cusp of a new technological revolution - which for me is a great shame.

Wednesday, December 16, 2009

Uncommon Sense

I had the pleasure not so long ago of working with a fantastic Program Manager called Declan. He had coined the term "Uncommon Sense" in response to the fact that "Common Sense" seems to be something of a rarity in today's IT landscape. Here's a case in point.

In this week's development team meeting we had the issue of Disaster Recovery Planning on the agenda. To cut a long story short, our Data Warehouse Manager does not have confidence that our Tech Support colleagues on level 2 can reliably back up and recover our data warehouse databases.

I'd like to say that this is the first IT site I've worked at where confidence in backup and recovery was low, but especially in these days of outsourcing I'm afraid I can't. That, though, is a topic probably best explored in a separate post.

Because we cannot trust our backups, the data warehouse manager has insisted that we keep in our staging database effectively every row that has ever been applied to the data warehouse, as an insurance policy. FYI, our warehouse contains, in parts, data over 10 years old.
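For illustration, here's a minimal sketch of the difference between a conventional truncate-and-reload staging area and the 'keep everything forever' insurance-policy variant described above. The table names, columns and Python/SQLite scaffolding are all invented for the example; this isn't our actual warehouse code.

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE stg_transactions (account_id, tx_date, amount);
    CREATE TABLE stg_transactions_history (batch_id, account_id, tx_date, amount);
    """)

    def load_batch_conventional(rows):
        """Normal staging: clear the landing table, load the current batch,
        apply it to the warehouse; the staged copy is then disposable."""
        conn.execute("DELETE FROM stg_transactions")
        conn.executemany("INSERT INTO stg_transactions VALUES (?, ?, ?)", rows)
        # ... transform and apply the batch to the warehouse ...

    def load_batch_insurance(batch_id, rows):
        """The 'insurance policy' variant: every batch ever applied is retained,
        tagged with its load id, so the warehouse could in theory be rebuilt by
        replaying staging from day one. Nothing is ever deleted, so staging
        grows without bound - and it still sits in the same data centre as the
        warehouse it is supposed to insure."""
        conn.executemany(
            "INSERT INTO stg_transactions_history VALUES (?, ?, ?, ?)",
            [(batch_id, *row) for row in rows],
        )
        # ... transform and apply the batch to the warehouse ...

    load_batch_insurance(20091216, [("A123", "2009-12-16", 42.50)])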

There are lots of reasons why this DRP strategy will fail, which I won't bore you with now, but the part of the discussion that really tells me the lunatics are running the asylum is this.

Our staging server is in the same data center as our data warehouse server. So please, Qantas, can I ask that when one of your badly serviced 747s falls out of the sky onto our data center, you make sure you land on the data warehouse box and not the staging box, or vice versa - but please not both. Cheers.

Tuesday, December 15, 2009

The Kids of Today

I travel to work by bus. Today I sat near the back. On the back seat were three school-age teenagers, I'd guess about 15 years of age. I couldn't help but listen to their conversation as they were, as is the nature of groups of teenagers, quite loud. They chatted about the usual teenage stuff. Who's seeing who? Who fancies who? Who's dumped who? What caught my attention wasn't what they were talking about. It was how they communicated. The conversation consisted almost entirely of short, sporadic, staccato-like sentences. Just like this post so far.

Okay, enough of the silly stuff, but if you've read anything from this blog to date you'll realise that I don't write, talk or think in this way. I'm much more verbose, which I'm sure to a teenager equates to very, very boring. I don't think I used to talk like the teenagers of today, so what's changed?

MTV? SMS? Sound bites? Short attention spans? Directors like Paul Greengrass using an average shot length of 2 seconds in the Jason Bourne films? All of these I suppose, but the one that interests me most is SMS. Personally speaking, I never really liked SMSing until I had a phone with a QWERTY keyboard; I just couldn't get past the frustration of three letters on a key in an unfamiliar layout. That didn't stop it becoming a massive global hit. Even my mum wanted to learn how to text. It's another one of the accidental heroes, like UNIX, that was never designed for adoption by the mass market. However, as the world's desire for smartphones, typically with QWERTY hard or soft keyboards, booms, I wonder what the future holds for the short-attention-span generation. Will they be a passing phase? Probably not, but I live in hope.

Monday, December 14, 2009

The Great Lost Art of Communication

An Italian ex-colleague once told me that in his country it was impolite in the workplace not to spend twenty-odd minutes in conversation with a colleague before getting round to discussing work. "My mother made some great gnocchi last night. How hot was Berlusconi's latest starlet? How bad were Inter this weekend? Oh, and by the way, can I just get those cost centres from you?" sort of thing. Actually he told me it was 50 minutes of chat for 5 minutes of business, not 20, but I just can't bring myself to believe that. Whatever the duration, this may go a long way to explaining Italian productivity relative to Northern Europe over the last thirty years. Whether the practice remains true today I have no idea, and whilst I'm obviously not suggesting it be adopted as a business model per se, there are things we can learn from it.

Why? Because somewhere along the line I think we've lost sight of what it is that makes us productive and cooperative in the workplace.

Real Human Communication.

Read any book on the subject and it will tell you that 90% of communication is non-verbal (i.e. body language, facial expressions), so it seems mad to me that the default communication we now find in the workplace often isn't face to face. It isn't even voice to voice. Much of the time it's email.

Here's some stuff to ponder:

How many times do we fire off an email when we could easily pick up the phone?
Have you received a one to one email from the person sitting less than three yards from you recently?
Have you ever kicked off an unintentional sh!tstorm when an email you sent got misinterpreted or only half read?
Do you need to know that the hot water in the kitchen on level 2 is not working/working/not working again, or that the white Toyota Camry has left its lights on?
Let's not even mention the infamous Claire Swire email.

It gets worse. I once had a boss who managed by email. Every day he'd walk into his office, barely acknowledging his team's presence, only to commence a day-long stream of email dialogue.

I'm not suggesting that we abandon e-mail - it is a vital business tool after all - but I am suggesting that we think more carefully about how we use it. For example, I once worked with a very successful and organised sales rep who configured his email to refresh every two hours to reduce the disruption. The fundamental problem with email is that it is perceived to be 'convenient' and 'free'. The reality is that often it isn't, and until we find a way to measure the productivity lost to this inferior form of communication I think we will continue to suffer.

What if we followed the sales rep's example at a corporate level and configured, say, two email deliveries per day? People would argue about those vital high-priority emails, but that's missing the point. Collaboration and document management tools exist outside of email that could better serve those needs anyway. Making this small change would, I believe, drive behavioural change, and perhaps we'd end up picking up the phone or even, shock horror, actually having a face to face conversation. When in Rome ...

Sunday, December 13, 2009

The Paperless Office

In the early 90s, before PCs and email were widely adopted, I remember seeing a news report about a company that had adopted a 'Paperless Office' policy. The company's Post Room was the only place that would handle 'dead tree' technology. All incoming mail was scanned and internally emailed to the respective employee; the original document would then be archived or shredded. The Post Room also had the only printer in the office, which was used for generating outgoing correspondence.

The report predicted that this was how we would all soon be working, which obviously didn't happen.

In fact pretty much the reverse has happened. The Post Room has effectively been bypassed by e-mail, and in the main we still print out most of the documents we're expected to read, most of which are generated internally by co-workers.

I just can't help wondering, however, if we didn't miss a trick somehow. Wouldn't life be better without paper? Maybe I've been too harsh on the Kindle in my earlier post and there is a place for an e-reader. It's probably a better reason for adoption than being able to read Jackie Collins's latest knee-trembler on the bus.

Margaret Thatcher: My Part in Her Downfall

The title of today's post is a parody of the Spike Milligan book "Adolf Hitler: My Part in His Downfall", hopefully for reasons that will become clear. My first real job in IT was to implement a packaged application for a County Council. The application, COMCIS, stood for Community Charge Information System. The Community Charge is better remembered by its informal title - the Poll Tax - and it was the countrywide, deep-seated unpopularity of this tax that ultimately led to the Tory party deposing Margaret Thatcher. I therefore like to think I had a hand, albeit a very small one, in the removal of Maggie T from power.

I'm also going to introduce a movie quote into this post, this time from Ferris Bueller's Day Off: "Life moves pretty fast. If you don't stop and look around once in a while, you could miss it."

The connection, tenuous though it may be, is as follows.

At the time COMCIS was the largest single system in the County Council. We used 13Gb of disk, give or take, which was enough to hold all the account information required to manage about 300,000 people and 100,000 properties. That's less storage than I currently have in my iPhone. Okay, so times change and technology marches forward, as anyone familiar with Moore's Law will tell you. Or so you'd believe.

I recently had to get an estimate from a supplier of how long it would take to develop and deploy a reasonably simple ROLAP report. Two weeks and $20,000 was the answer they came back with. The funny thing is that back in the heady days of the Poll Tax I used to knock out a COBOL-based report in about 3 to 4 days. If we could apply something like Moore's Law to productivity, then today I should surely be able to knock out that ROLAP report in minutes, maybe an hour at the outside. So what's gone wrong? Well, I have my suspicions, but for now I think it's best to leave the question hanging. In respect of Ferris's quote, though, I'm wondering what we've missed.
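Here are the back-of-the-envelope numbers behind both observations, sketched in Python. The Moore's Law-style doubling period, the 8-hour working day and the 3.5-day report baseline are my own assumptions for illustration, not figures from anywhere else.

    GB = 1024 ** 3

    # COMCIS: roughly 13 GB of disk covering about 400,000 accounts
    disk_bytes = 13 * GB
    accounts = 300_000 + 100_000
    print(f"~{disk_bytes / accounts / 1024:.0f} KB per account")      # ~34 KB each

    # If report-writing productivity had doubled every two years (a Moore's
    # Law style assumption) between 1990 and 2009:
    baseline_days = 3.5                 # a COBOL report back then: 3 to 4 days
    speedup = 2 ** ((2009 - 1990) / 2)  # roughly 700x
    minutes_now = baseline_days * 8 * 60 / speedup
    print(f"Equivalent effort today: ~{minutes_now:.0f} minutes")     # a couple of minutes

Against that yardstick, two weeks and $20,000 for one report looks even stranger.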

Thursday, December 10, 2009

Paper Chase

I saw my first Kindle the other day. I must admit I found it hard not to laugh out loud. Amazon, and to a lesser extent Sony, obviously hope that e-readers will become the iPod of printed media. I don't, and here's my rationale.

1) It's a one-trick pony - e-readers use clever technology to make the screen easy on the eyes when reading for long periods. Unfortunately this means it takes a second or so to refresh the screen. That's fine if you're turning a page, but not for other media content. The soon-to-be-released tablet computers are multifunction, and whilst technically inferior for reading printed material, their multifunction nature will win out.
2) It doesn't improve on the original - the iPod was a great device because you could carry your whole music collection in your pocket and select music on mood and whim. That accessibility advantage was big enough for people to accept inferior sound quality compared with the CDs it replaced. Do we want or need to carry our libraries in the same way? In my opinion, no.
3) Welcome to the 1950s - black and white may be fine for books, but it just doesn't pass muster for online content. Maybe if this Philips technology had been around it would have had a chance -
http://www.engadget.com/2009/12/10/philips-develops-color-e-paper-wants-to-skin-your-gadgets-with/

Maybe I'll be proved wrong, but I don't think so.

Accidental Heroes

UNIX (in its many variants) is the OS at the heart of modern IT. It is also behind three of the four top desktop OSes (Mac OS X, Linux and, when available, Google's Chrome OS). It is the foundation for iPhone OS and Android. In short, it's everywhere.

I like UNIX and have been working with it for nearly 20 years. You may already know that UNIX originated as a small research project at AT&T in the late 60s. But what you probably don't know is that one of its creators is gobsmacked that it came to be adopted as a commercial OS - that was never the intent of the project. How do I know this? Well, I worked with one of the creator's relatives and she told me.

Now let's look at Java. I don't have the stats, but it is one of the biggest languages - if not the biggest - in use in enterprise IT today. Unlike UNIX, however, Java was written as a commercial product. It was a language developed for appliances - yes, appliances: TVs, fridges, washing machines and the like. It was never intended for use in enterprise IT.

So what's the common denominator? Well, there are two that I can think of. Firstly, both were given away free at some stage of their life, but more significantly both grew out of projects with somewhat humble origins and expectations. This phenomenon is not uncommon in research science, where some of the greatest discoveries and developments have been accidental - Fleming's discovery of penicillin, for example. So my point is that when somebody tries to sell you the future of IT, they're probably wrong. They can't know, because the next accident may not even have occurred yet.

If Apple made a TV

OK, I know Apple makes the AppleTV media box, but what if, as some have suggested, Apple actually stepped up, took on the Japanese and the Koreans and made a full-blown 32/40/52" TV set. What could we expect?

Well, obviously Mr Ive's team would deliver another fantastic industrial design. It would certainly be thin and maybe have an aluminium unibody construction. I'd expect it to be true HD, and I think they'd bypass plasma/LCD technology and move straight to OLED. I'd obviously like a built-in DVD or Blu-ray drive, built-in wifi and a hard disk.

So what's special about that? Well, so far nothing of course. You might think I'm heading down the internet TV route now - and yes, I'd expect all the stuff you can already get using the AppleTV: photos, buying and renting movies from iTunes. But no, that's not where I'm headed with this post. Besides, all of that functionality is already offered by media boxes like the AppleTV, and the TV world hasn't changed yet.

Let's look at what Apple could do if it played to its proven strengths - the UI and simplicity. Apple would look to differentiate, innovate and improve. Where could it improve the current TV experience? Well, when you look at it, pretty much everywhere.

Look at your current TV and examine something that has hardly changed in the last 40 years - the remote. OK, so it has a lot of buttons - mine has 55, my cable remote (Foxtel) has 40 and my DVD player remote has 50. How many of those 145 buttons do I use on a daily basis? Perhaps a dozen. Of course I can buy an all-in-one programmable remote - I did once - but ultimately that was just as frustrating.

Examine the TV further and ask yourself: what are the key functions I will want to use all the time?

On/off, change channels, search for channels, set reminders or record programs, mute and volume, and change inputs (i.e. cable/DVD/game console).

Let's say I want to select a channel. Currently I have three options. If I know the channel number I want, I can key it in and presto. That was fine in the old days when there were half a dozen channels or fewer, but now there are hundreds. So I resort to channel hopping, which, apart from driving my wife to distraction, is a pretty inefficient method of selection (it's essentially a sequential scan). Not very clever. Ah, but what about the EPG? Well yes, it's an index of sorts, but it's still rubbish really and only partially solves the problem. The EPG will typically show me about 10 channels for the next 2 hours in the default window.

But I'm still missing the point here. Why should I want to find a channel? Do I care whether I'm watching Channel 9 or Fox 8? No. What I should be searching for is content. As I said, I can use the EPG, but it really is pretty useless. So let's imagine we could search for programs instead of channels. Well, you can on Foxtel, but have you tried it? It's rubbish. I can search by category or by A-Z. That's it.

The basic limitation is that the UI is fundamentally based around a 10-digit keypad, a series of four coloured buttons, or one of four arrow keys. Yes, that's right: here we are, deep in the digital age, and we're selecting what we watch with 1970s remote-control technology. Crazy, isn't it? So how about a remote control and UI that was neat, friendly and allowed us to search content in a more meaningful way than by program title? Let's say I wanted to watch something starring a particular actor, or something by a known director. What if the remote resembled something like an iPod Touch or iPhone? Imagine a programmable remote that utilised the easy-to-use interface Apple brings to its consumer devices. Hey, wouldn't that be cool?
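To make the "sequential scan versus content search" point concrete, here's a toy sketch. The Program structure, field names and the handful of guide entries are invented purely for illustration - no real EPG or Foxtel interface is being described.

    from dataclasses import dataclass, field

    @dataclass
    class Program:
        channel: str
        title: str
        director: str = ""
        cast: list = field(default_factory=list)

    guide = [
        Program("Channel 9", "The Bourne Ultimatum", "Paul Greengrass", ["Matt Damon"]),
        Program("Fox 8", "Some Cop Show", "A. N. Other", ["Somebody Else"]),
        # ... imagine a few hundred more channels' worth of entries ...
    ]

    def channel_hop(guide):
        """Channel hopping: look at every channel in turn until something
        appeals - a sequential scan, and you sit through every miss."""
        for program in guide:
            yield program

    def search(guide, actor=None, director=None):
        """Content search: ask for what you actually want to watch,
        regardless of which channel happens to be carrying it."""
        return [p for p in guide
                if (actor is None or actor in p.cast)
                and (director is None or director == p.director)]

    print(search(guide, director="Paul Greengrass"))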

That's just the start but I guess you get the idea. Apple has shown that it can enter a mature market (mobile phones) and shake things up a bit. I wonder if the TV is the next target in their sights - after the launch of the Tablet early next year, of course.

Wednesday, December 2, 2009

Six Sigma

A couple of years ago, during an RFI, an offshore software vendor indicated that their software labs were 'Six Sigma' certified. Essentially this was offered as a guarantee of quality. They won the business, essentially because they were the incumbent and they were cheap. So I ask you: how can a Six Sigma company submit software that has allegedly been system tested but doesn't even compile? Funnily enough, the vendor bid for some more business recently and dragged out the same Six Sigma PowerPoint slide. Some companies have no shame.

The Safety Hat Dance

There are studies that show that cyclists who wear helmets are more prone to accidents than those who do not. It is believed that the perception of protection offered by said headwear encourages the wearer to take more risks.

In the old mainframe days, owing to costs and limited machine resources, there were often only development and production environments. This meant that before you promoted a piece of code into prod you had better be very confident that it was kosher. Nowadays it's not unusual to find five or more environments (Dev, System Test, Acceptance Test, Pre Prod, Performance Assurance, Prod, Continuation of Business, etc.).

On a recent project we raised over 200 severe software defects. That would have been unthinkable in the old mainframe days. My point is that the layers we add in an attempt to improve software quality and reliability have exactly the opposite effect: they make us all just a bit less rigorous in our testing regimes. After all, why do I need to test it when someone else will do it for me later on, before it makes it into Prod?