Tuesday, December 22, 2009

Building Tomorrow's Legacy Systems Today

I started this blog with the aim of sharing my views and anecdotes about the world of IT with anyone out there who might be interested. There isn't any great masterplan or strategy in operation other than aiming to post something at least once a day. As I'll be on holiday until the New Year I figured I'd write this last post of the noughties and pick up on the common themes that have emerged over the last few months.

The most obvious one is "wasn't everything great in the old days?". That's a real shame because nostalgia shouldn't have a meaningful place in today's IT. As technologists we should surely be looking forwards, not backwards. Mostly though, the realisation dawning on me is that we've lost a golden opportunity in IT to do something significant. A similar theme is explored in a friend's blog http://excapite.wordpress.com/2009/12/23/is-the-iphone-really-the-avatar-of-mobile-phones/ which compares advances in CGI technology with those in IT over the last 15 years.

Why? Lots of reasons I suspect, but my enduring hope for the next decade is that we can rediscover just what it was about the world of IT that made it so fresh and exciting twenty years ago. To do that we will have to unload some of the legacy baggage that is holding us back (at a recent expo I saw a quote that some 80% of IT budgets are spent on keeping the lights on!). Maybe that will be the real benefit of outsourcing and offshoring. I certainly hope so because I for one am getting a little exasperated with "building tomorrow's legacy systems today".

Billion Dollar Brain

In a recent post I speculated about the idea of Apple making a television. In part it was a flippant post, but interestingly enough this article on TechCrunch covers similar ground about Apple's intentions in the TV space. http://www.techcrunch.com/2009/12/21/apple-tv-kill-cable/

The real standout fact that hits me is that Apple is building a $1 bn data center. http://www.datacenterknowledge.com/archives/2009/05/26/apple-planning-1-billion-idatacenter/

To date I've always considered Apple a design and device shop - but spending $1 bn on something to support iTunes and MobileMe? I don't think so. You'd have to sell a lot of singles on iTunes to fund that (well, a billion I guess!).

Whatever the plans Steve Jobs has for this datacenter I suspect that the ambitions are considerable.

The IT Wars

The history of IT is littered with technical, ideological and mindshare wars. A few examples are listed below:

The desktop wars: Microsoft Windows vs IBM OS/2
The server wars: Mainframe vs midrange
The RDBMS wars: Oracle vs Ingres
The browser wars: Internet Explorer vs Netscape
The Search Engine Wars: Google vs Yahoo

My question today is: where is today's IT war?

I just can't think of one. Yes, there's lots of competition out there, but I can't think of a really big battlefront that is dividing IT departments right down the middle. Years ago lots of committed IT people were passionate about their decisions and allegiances.

I suppose the answer is that IT has matured to the point where everything is pretty much a commodity. I saw evidence of this when I was recently interviewed for a Data Warehouse Architecture role at an SME software house. Ideologically I hit it off with my potential boss and I know it could have been a great opportunity for both me and the firm. However, they were at the point of selecting a BI technology stack to develop their products on, and had correctly concluded that technically it was all pretty much of a muchness out there. The selection was therefore made on commercial and not technical grounds, which didn't go my way. No hard feelings, as I totally understood and endorsed the logic of their decision.

My issue with all of this, however, is that each of these wars was fought out in an emerging arena that changed the face of IT. Using the examples above, these were:

The adoption of the PC as a business tool
The challenge to mainframes from lower cost midrange computers
The move from flat file and hierarchical database to SQL
The internet explosion

So if there isn't a war being fought, then I'd infer that we aren't on the cusp of a new technological revolution - which for me is a great shame.

Wednesday, December 16, 2009

Uncommon Sense

I had the pleasure not so long ago of working with a fantastic Program Manager called Declan. He had coined the term "Uncommon Sense" in response to the fact that "Common Sense" seems to be something of a rarity in today's IT landscape. Here's a case in point.

In this week's development team meeting we had the issue of Disaster Recovery Planning on the agenda. To cut a long story short, our Data Warehouse Manager does not have confidence that our Tech Support colleagues on level 2 can reliably back up and recover our data warehouse databases.

I'd like to say that this is the first IT site I've worked at where confidence in backup and recovery was low, but especially in these days of outsourcing I'm afraid I can't - that's a topic probably best explored in a separate post.

Because we cannot trust our backups, the data warehouse manager has insisted that we keep in our staging database, as an insurance policy, effectively every row that has ever been applied to the data warehouse. FYI our warehouse contains, in parts, data over 10 years old.

There are lots of reasons why this DRP strategy will fail, which I won't bore you with now, but the part of the discussion that really tells me the lunatics are running the asylum is this.

Our staging server is in the same data center as our data warehouse server. So please Qantas, can I ask that when one of your badly serviced 747s falls out of the sky onto our data center, you make sure it lands on the data warehouse box and not the staging box, or vice versa - but please not both. Cheers.

Tuesday, December 15, 2009

The Kids of Today

I travel to work by bus. Today I sat near the back. On the back seat were three school-age teenagers. I'd guess about 15 years of age. I couldn't help but listen to their conversation as they were, as is the nature of groups of teenagers, quite loud. They chatted about the usual teenage stuff. Who's seeing who? Who fancies who? Who's dumped who? What caught my attention wasn't what they were talking about. It was how they communicated. The conversation consisted almost entirely of short, sporadic, staccato-like sentences. Just like this post so far.

Okay, enough of the silly stuff, but if you've read anything from this blog to date you'll realise that I don't write, talk or think in this way. I'm much more verbose, which I'm sure to a teenager equates to very, very boring. I don't think I used to talk like the teenagers of today, so what's changed?

MTV? SMS? Sound bites? Short attention spans? Directors like Paul Greengrass using an average shot length of 2 seconds in the Jason Bourne films? All of these I suppose, but the one that interests me most is SMS. Personally I never really liked SMSing until I had a phone with a QWERTY keyboard. I just couldn't get past the frustration of three letters on a key in an unfamiliar layout. That didn't stop it becoming a massive global hit. Even my mum wanted to learn how to text. It's another one of the accidental heroes, like UNIX, that was never designed for adoption by the mass market. However, as the world's desire for smartphones, typically adopting QWERTY hard or soft keyboards, booms, I wonder what the future holds for the short attention span generation. Will they be a passing phase? Probably not, but I live in hope.

Monday, December 14, 2009

The Great Lost Art of Communication

An Italian ex-colleague once told me that in his country it was impolite in the business place not to spend twenty-odd minutes in conversation with a colleague before getting round to discussing work. For example: "My mother made some great gnocchi last night. How hot is Berlusconi's latest starlet? How bad were Inter this weekend? Oh, and by the way, can I just get those cost centres from you?" sort of thing. Actually he told me it was 50 minutes of chat for 5 minutes of business, not 20, but I just can't bring myself to believe that. Whatever the duration, this may go a long way to explaining Italian productivity relative to Northern Europe over the last thirty years. Whether this practice remains true today or not I have no idea, and whilst I'm obviously not suggesting it be adopted as a business model per se, there are things we can learn from it.

Why? Because somewhere along the line I think we've lost sight of what it is that makes us productive and cooperative in the workplace.

Real Human Communication.

Read any book on the subject and it will tell you that 90% of communication is non-verbal (i.e. body language, facial expressions), so it seems mad to me that the default form of communication we now find in the workplace often isn't face to face. It isn't even voice to voice. Much of the time it's email.

Here's some stuff to ponder:

How many times do we fire off an email when we could easily pick up the phone?
Have you received a one-to-one email from the person sitting less than three yards from you recently?
Have you ever kicked off an unintentional sh!tstorm when an email you sent got misinterpreted or only half read?
Do you need to know that the hot water in the kitchen on level 2 is not working/working/not working again, or that the white Toyota Camry has left its lights on?
Let's not even mention the infamous Claire Swire email.

It gets worse. I once had a boss who managed by email. Every day he'd walk into his office, barely acknowledging his team's presence, only to commence a day long stream of email dialogue.

I'm not suggesting that we abandon email - it is a vital business tool after all - but what I am suggesting is that we think more carefully about how we use it. For example, I once worked with a very successful and organised sales rep who configured his email to refresh every two hours to reduce the disruption. The fundamental problem with email is that it is perceived to be 'convenient' and 'free'. The reality is that often it isn't, and until we find a way to measure the productivity lost to this inferior form of communication I think we will suffer.

What if we followed the sales rep's example at a corporate level and, say, configured two email deliveries per day? People would argue about those vital high-priority emails, but that's missing the point. Collaboration and document management tools exist outside of email that could better serve those needs anyway. Making this small change would, I believe, drive behavioural change and perhaps we'd end up picking up the phone or even, shock horror, actually having a face to face conversation. When in Rome ...

Sunday, December 13, 2009

The Paperless Office

In the early 90s, before PCs and email were widely adopted, I remember seeing a news report about a company that had adopted a 'Paperless Office' policy. The company's Post Room was the only place that would handle 'dead tree' technology. All incoming mail was scanned and internally emailed to the respective employee. The original document would then be archived or shredded. The Post Room also had the only printer in the office, which was used for generating outgoing correspondence.

The report predicted that this was how we would all soon be working, which obviously didn't happen.

In fact pretty much the reverse has happened. The Post Room has effectively been bypassed by email, and in the main we still print out most of the documents that we're expected to read, most of which are generated internally by co-workers.

I just can't help wondering, however, if we didn't miss a trick somehow. Wouldn't life be better without paper? Maybe I've been too harsh on the Kindle in my earlier post and there is a place for an e-reader. It's probably a better reason for adoption than being able to read Jackie Collins's latest knee trembler on the bus.

Margaret Thatcher: My Part in Her Downfall

The title of today's post is a parody of the Spike Milligan book "Adolf Hitler: My Part in His Downfall", hopefully for reasons that will become clear. My first real job in IT was to implement a packaged application for a County Council. That application, COMCIS, stood for Community Charge Information System. The Community Charge is better remembered by its informal title - the Poll Tax - and it was the countrywide, deep-seated unpopularity of this tax that ultimately led the Tory party to depose Margaret Thatcher. Therefore I like to think I had a hand, albeit a very small one, in the removal of Maggie T from power.

I'm also going to introduce a movie quote into this post, this time from Ferris Bueller's Day Off: "Life moves pretty fast. If you don't stop and look around once in a while, you could miss it."

The connection, tenuous though it may be, is as follows.

At the time COMCIS was the largest single system in the County Council. We used 13Gb of disk, give or take, which was enough to hold all the account information required to manage about 300,000 people and 100,000 properties. That's less storage than I currently have in my iPhone. Okay, so times change and technology marches forward, as anyone familiar with Moore's Law will tell you. Or so you'd believe.

I recently had to get an estimate from a supplier for how long it would take to develop and deploy a reasonably simple ROLAP report. Two weeks and $20,000 was the answer they came back with. The funny thing is that back in the heady days of the Poll Tax I remember I used to knock out a COBOL-based report in about 3 to 4 days. If we could apply something like Moore's Law to productivity then today I should surely be able to knock out the ROLAP report in minutes, maybe an hour at the outside (doubling productivity every couple of years for two decades would compress three days' work into a matter of minutes). So what's gone wrong? Well, I have my suspicions, but for now I think it's best to leave the question hanging there. With respect to Ferris's quote, though, I'm wondering what we've missed.

Thursday, December 10, 2009

Paper Chase

I saw my first Kindle the other day. I must admit I found it hard not to laugh out loud. Amazon, and to a lesser extent Sony, obviously hope that e-readers will become the iPod of printed media. I don't, and here's my rationale.

1) It's a one-trick pony - e-readers use clever technology to make the screen easy on the eyes when reading for long periods. Unfortunately this means it takes a second or so to refresh the screen. That's fine if you're turning a page, but not for other media content. The soon-to-be-released tablet computers are multifunction, and whilst technically inferior for reading printed material their multifunction nature will win out.
2) It doesn't improve on the original - the iPod was a great device because you could carry your whole music collection in your pocket and select music on mood and whim. That accessibility advantage was big enough for people to accept inferior sound quality compared with the CDs being replaced. Do we want or need to carry our libraries in the same way? In my opinion, no.
3) Welcome to the 1950s - black and white may be fine for books but it just doesn't pass muster for online content. Maybe if this Philips technology had been around it would have had a chance -
http://www.engadget.com/2009/12/10/philips-develops-color-e-paper-wants-to-skin-your-gadgets-with/

Maybe I'll be proved wrong, but I don't think so.

Accidental Heroes

UNIX (in its many variants) is the OS at the heart of modern IT. It is also behind three of the four top desktop OSes (Mac OS X, Linux and, when available, Google's Chrome OS). It is the foundation for iPhone OS and Android. In short, it's everywhere.

I like UNIX and have been working with it for nearly 20 years. You may already know that UNIX originated as a small research project at AT&T in the late 60s. But what you probably don't know is that one of its creators is gobsmacked that it became adopted as a commercial OS - for that was never the intent of the project. How do I know this? Well, I worked with one of the creators' relatives and she told me.

Now let's look at Java. I don't have the stats, but it is one of, if not the, biggest languages in use in enterprise IT today. Unlike UNIX, however, Java was written as a commercial product. It was a language developed for appliances - yes, appliances. TVs, fridges, washing machines and the like. It was never intended for use in enterprise IT.

So what's the common denominator? Well, there are two that I can think of. Firstly, both were given away free at some stage of their life, but more significantly they grew out of projects with somewhat humble origins and expectations. This phenomenon is not uncommon in research science, where some of the greatest discoveries and developments have been accidental - Fleming's discovery of penicillin, for example. So my point is that when somebody tries to sell you the future of IT they're probably wrong. They can't know, because the next accident may not even have occurred yet.

If Apple made a TV

OK, I know Apple makes the AppleTV media box, but what if, as some have suggested, Apple actually stepped up, took on the Japanese and the Koreans, and made a full-blown 32/40/52" TV set? What could we expect?

Well obviously Mr Ive's team would deliver another fantastic industrial design. It would certainly be thin and maybe have an aluminium unibody construction. I'd expect it to be true HD and I think they'd bypass plasma/LCD technology and move straight on to OLED. I'd obviously like a built in DVD or Blu-ray Drive, built in wifi and a hard disk.

So what's so special about that? Well, so far nothing of course. You might think I'm heading down the internet TV route now - and yes, I'd expect all the sort of stuff you can already get using the AppleTV: photos, buying and renting movies from iTunes. But no, that's not where I'm headed with this post. Besides, all of this functionality is offered by media boxes like the AppleTV already, and the TV world as yet hasn't changed.

Let's look at what Apple could do if it played to its proven strengths - the UI and simplicity. Apple would look to differentiate, innovate and improve. Where could it improve the current TV experience? Well, when you look at it, pretty much everywhere.

Look at your current TV and examine something that has hardly changed in the last 40 years - the remote. OK, so it has a lot of buttons - mine has 55, my cable remote (Foxtel) has 40 and my DVD player remote has 50. How many of these 145 buttons do I use on a daily basis? Perhaps a dozen. Of course I can buy an all-in-one programmable remote - I did once - but ultimately that was just as frustrating.

Examine the TV further and ask yourself: what are the key functions I will want to use all the time?

On/off, changing channels, searching for channels, setting reminders or recording programs, mute and volume, and changing inputs (i.e. cable/DVD/game console).

Let's say I want to select a channel. Currently I have three options. If I know the channel number I desire I can key it in and presto. That was fine in the old days when there were half a dozen or fewer channels, but now there are hundreds. So I resort to channel hopping, which apart from driving my wife to distraction is a pretty inefficient method of selecting (it's essentially a sequential scan). Not very clever. Ah, but what about the EPG? Well yes, it's an index of sorts, but it's still rubbish really and only partially helps solve the problem. The EPG will typically show me about 10 channels for the next 2 hours in the default window.

But I'm still missing the point here. Why should I want to find a channel? Do I care whether I'm watching Channel 9 or Fox 8? No. What I should be searching for is content. As I stated, I can use the EPG, but they really are pretty useless. So let's imagine we could search for programs instead of channels. Well, you can on Foxtel, but have you tried it? It's rubbish. I can search by category or A-Z. That's it.

The basic limitation that kicks in is that the UI is fundamentally based around a 10-digit numbering system, a series of four coloured buttons, or one of four arrow keys. Yes, that's right. Here we are deep in the digital age and we're selecting what we watch with 1970s remote control technology. Crazy, isn't it? So how about a remote control and UI that was neat, friendly and allowed us to search content in a more meaningful way than by programme title? Let's say I wanted to watch something starring a particular actor, or something by a known director. What if the remote resembled something like an iPod Touch or iPhone? Imagine a programmable remote that utilised the easy-to-use interface Apple brings to its consumer devices. Hey, wouldn't that be cool?
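To make the idea concrete: if the guide sat on top of even a basic programme database, the kind of search I'm imagining is a trivial query. Here's a minimal sketch of what could run behind the scenes - the table and column names are entirely hypothetical:

    -- Hypothetical programme guide lookup: find something to watch
    -- by who's in it, not by channel number
    SELECT p.title, p.channel, p.start_time
    FROM programme p
    JOIN programme_cast c ON c.programme_id = p.programme_id
    WHERE c.actor_name = 'Cate Blanchett'
    ORDER BY p.start_time;

Swap the actor for a director or a genre and you have exactly the content-first search that a numeric keypad and four coloured buttons simply can't express.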

That's just the start but I guess you get the idea. Apple has shown that it can enter a mature market (mobile phones) and shake things up a bit. I wonder if the TV is the next target in their sights - after the launch of the Tablet early next year, of course.

Wednesday, December 2, 2009

Six Sigma

A couple of years ago, during an RFI, an offshore software vendor's presentation indicated that their software labs were 'six sigma' certified. Essentially this was offered as a guarantee of quality. They won the business essentially because they were the incumbent and they were cheap. So I ask you - how can a six sigma company submit software that has allegedly been system tested but doesn't even compile? Funnily enough, the vendor bid for some more business recently and dragged out the same six sigma PowerPoint slide. Some companies have no shame.

The Safety Hat Dance

There are studies that show that cyclists who wear helmets are more prone to accidents than those who do not. It is believed that the perception of protection offered by said headwear encourages the wearer to take more risks.

In the old mainframe days, owing to costs and limited machine resources, there were often only development and production environments. This meant that before you promoted a piece of code into prod you had better be very confident that it was kosher. Nowadays it's not unusual to find five or more environments (Dev, System Test, Acceptance Test, Pre Prod, Performance Assurance, Prod, Continuation of Business, etc.).

On a recent project we raised over 200 severe software defects. That would be unthinkable in the old mainframe days. My point is that every layer we add in an attempt to improve software quality and reliability has exactly the opposite effect, making us all just a bit less rigorous in our testing regimes. After all, why do I need to test it when someone else will do it for me later on, before it makes it into Prod?

Monday, November 30, 2009

The Enemy Within

I have in my time had the displeasure of working in a couple of companies that have outsourced parts of their IT organisation (whether support or development). Whatever the relative merits, IMO outsourcing brings very few real benefits in the medium to long term. Obviously there is a perceived short-term value proposition which leads companies down the outsourcing path in the first place (i.e. by reducing headcount), otherwise companies wouldn't do it.

As a result many IT people fear outsourcing, especially when cheaper offshoring is mixed into the equation. I used to be in this camp, and a few years ago I believed that the best personal survival strategy against this trend was to move up the IT ladder into the 'talking' and 'thinking' space and distance myself from the 'doing' bit. I no longer have these worries.

Why? Well I recently had the opportunity to look at a large company that had outsourced almost its entire IT operation 5 years ago. The only elements not outsourced were the IT executive ('the talkers') and an enterprise architecture function ('the thinkers').

Their problem was that they had stopped delivering anything meaningful to the business. Why? Well, there are a number of reasons, but the one I want to focus on here is my belief that the outsourcer ('the doers') had effectively morphed into a fifth column working inside the IT shop. They resisted change at every opportunity and, driven by the nature of the SLAs in place, had redefined their role as 'keeping the lights on'.

So did the business take this lying down? Of course not. It simply couldn't afford to. What it did was effectively insource IT, usually covertly. Five years down the line they had more IT architects employed directly by the business than were left in central IT.

Thursday, November 26, 2009

Your call is important to us ...

Most large companies spend millions of dollars implementing CRM (Customer Relationship Management) software in order to 'know and understand their customers better'. CRM was one of those industry buzzwords that came out of nowhere about a dozen years ago. So here's a question for you.

Is the customer experience now better than it was in, say, 1990? No, I didn't think so.

So what's it all about? The obvious answer is that CRMs go hand in hand with call centres, and that means a lower cost per customer interaction. So the question is: "Do you have a relationship with your call centre?" No, me neither. Any ideas what we should be calling CRM?

Wednesday, November 25, 2009

Induction Blues

A few years ago I got inducted. To be more precise, I attended the company's compulsory one-day induction course. It was a cut-down version of the three-day course that all executives had attended the previous year.

One part of the course concerned meeting etiquette. We were instructed that a meeting organiser would always have to produce a meeting agenda in advance, and that somebody would also need to take minutes.

So, given that everybody in the company from the executive down had already attended the course, how many meetings actually included an agenda and were minuted? Yep, that's right - Buckley's.

Tuesday, November 24, 2009

What's in a title ...

For my sins I have worked previously as an IT Architect. However, I've never been entirely comfortable with this title as applied to IT. To me architects design buildings and bridges and the like.

So it was quite refreshing in a recent conversation with one of our 'Data Architects' when he admitted that he only applied for Architect jobs because they paid better than the other IT jobs he was qualified to do (i.e. Data Modeller).

Which leads me to my point. In my last job I managed a small team of Solution Designers. To all intents and purposes the role was actually that of Solution Architect, but because the department already had another team of 'Architects' (Enterprise, in this case) it was decided to call us Designers. So what's the difference? Same job, same job description, different title - and about $300 per day, according to the recruitment agencies I dealt with.

Redundancy Blues ....

No not that sort of redundancy ....

I used to work with flat files and hierarchical databases on mainframes before I saw the light and chose to focus on relational databases. I was an early adopter and technology convert/evangelist. I can hardly believe that I used to voluntarily read UNIX and RDBMS books, as I'm just not that much of a nerd.

Which leads me to the point of this post. One of the big concepts at the heart of the RDBMS is third normal form (3NF), and one of the key fundamentals of 3NF is that a piece of data should be stored once and once only (i.e. with zero redundancy).
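For anyone who hasn't had to live and breathe this stuff, here's a minimal sketch of what that means in practice - the tables and columns are entirely made up for illustration:

    -- Pre-3NF: the customer's name and city are repeated on every order row
    CREATE TABLE orders_flat (
        order_id      INT PRIMARY KEY,
        customer_name VARCHAR(100),
        customer_city VARCHAR(100),
        order_date    DATE
    );

    -- 3NF: each customer fact is stored once; orders simply reference it
    CREATE TABLE customer (
        customer_id   INT PRIMARY KEY,
        customer_name VARCHAR(100),
        customer_city VARCHAR(100)
    );

    CREATE TABLE orders (
        order_id      INT PRIMARY KEY,
        customer_id   INT REFERENCES customer (customer_id),
        order_date    DATE
    );

Change the customer's city once, in one place, and every order reflects it - that's the zero redundancy promise.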

So I'm happy to report that 20 years down the line we're happily working away with our single 'corporate database', reporting on our 'single version of the truth', because it would be mad if we'd just swapped technologies and repeated all the issues we used to have with the legacy technology. Just mad.

Think of the environment ...

... before you print this email.

I hate this common addition to email signatures. It's right up there with the "Honk if you had it last night" bumper sticker in my books. More to the point, some recent analysis suggested that Google Mail expends the same amount of electrical energy in sending an email (Google is known to have some of the world's largest data centers) as it takes to boil a kettle. Google contests this suggestion, which is their prerogative.

What I'm suggesting is that perhaps people should think about whether to send the email in the first place rather than worrying about how many sustainable trees were cut down to print it. Speaking of which, I wonder how many cups of tea I've used up in this post.

I think therefore I am ....

... a Project Manager.

Years ago I used to respect PMs. On the whole they were experienced, grey-haired, time-served IT professionals. You couldn't bs a decent PM. Nowadays, most PMs I work with only need two attributes. The first is being pushy and the second is a laptop with Microsoft Project.

The Third Trojan Horse

I don't profess to be an industry guru. I bought into the whole concept of Network Computers and JavaStations (remember them?). I couldn't understand why anybody wanted a BlackBerry. I dismissed the original iPods as expensive pieces of plastic that would never take off. What I totally misread, though, was the timing, the importance of the user experience for successful adoption, and the completeness of the end-to-end solution - namely the provision of content via iTunes. The rest is history: the iPod became the Trojan Horse that wormed its way into the pockets of millions and went a long way to restoring Apple's fortunes and coolness. IMO the iPhone has picked up the gauntlet and is proving to be a worthy successor.

So here I go making another prediction - the rumoured 2010 release of the Apple Tablet (could they possibly resurrect the iBook or Newton name?) will be huge. So in 2001, when Bill Gates predicted that the tablet PC would be the most popular form of PC sold, he wasn't entirely wrong. His timing was just a bit out, the user experience was woeful and there was no compelling end-to-end argument to change. I don't think Steve Jobs will repeat those mistakes.

It's progress init 2!

A couple of months ago I was made to attend a computer seminar. I try to avoid these things as they are normally snoozefests, but in this case my boss ordered me to attend. The theme of this conference was trying to better manage some of the complexity of modern IT. Amongst the six major themes pushed by the conference, one was virtualisation.

In layman's terms, virtualisation means replacing the many physical servers currently clogging up your data centres with one big server and then, through the magic of 21st century software, hosting lots of virtual servers on the single box.

If only somebody had thought of this forty-odd years ago we could have saved ourselves a lot of effort. Now where did I put that VM manual again?

When I was a kid ...

When I was a kid I wanted to be a pilot. Of course I ended up in IT, but that's OK because most of the time I quite like the challenge of programming. Most programmers do. My poser to you today is "Who grows up wanting to be a software tester?".

Layer Cake

In my first ever job in IT I worked as a COBOL Analyst/Programmer. In the development section there were precisely four roles defined in a simple top-down hierarchy: Development Manager, Project Manager, Project Leader and Analyst/Programmer. Funnily enough, we got stuff done. Lots of stuff done.

If I wanted or needed advice I could seek out my Project Leader, who I knew had started off as an A/P and over the years had accrued a fair amount of technical experience.

Next time you're in an IT Development meeting ask yourself how many of these people around the table have ever cut a line of code in anger. In my experience I'd say you'd be lucky if it was more than one in four.

The BodyShop shuffle

Early in my career I remember my first ever experience of dealing with an external IT consultant. My boss's boss paid a flat twelve hundred quid daily rate (a lot in 1992) for the guy to fly up to our office, sit in meetings for 6 hours and provide some guidance on a specific performance issue we were having with their software at a client site. The said consultant was experienced, eloquent, convincing and, as it turned out, dead wrong in his analysis and diagnosis of the problem. Of course our advice was ignored cos it didn't cost 1200 smackers.

So a number of years later, when I joined a large professional services company, I asked myself whether I thought I was good enough - would clients really pay exorbitant rates for my advice and counsel? As it turned out the answer was yes, and I had a good and successful stint. But at least I asked myself the question.

Over the last 10 years I've dealt with many consultants. I don't think many ask that question anymore.

eXPerience

Like many of you I earn my daily bread whilst sitting in front of a PC running Windows XP. I have nothing against XP per se except that it's 9 years old. Almost a decade in the maelstrom of change that is the IT industry is like a century in other professions. So the question I'm asking is: why haven't we upgraded to something more modern? Of course the obvious answer is that until the launch of Windows 7 there were no viable alternatives - Vista (you cannot be serious), Linux (too nerdy), etc.

Which brings me to my point. The last time I worked for a company that upgraded all its desktop OSs (in this case a large retail insurance company with 1,000 PCs in head office) from Windows 95 to NT, the project burned through 8 PMs, blew out from 6 months to 2.5 years and upset just about every employee from the financial controller down. The IT Director at the time admitted that the desktop OS rollout caused more anger, bitterness, frustration and confusion in the business than anything she had ever witnessed in her many years in business.

Volunteers for Windows 7 Desktop Rollout please form an orderly line ...

Monday, November 23, 2009

It's progress init!

I currently work for the Department of Hopes and Dreams (not their real name) and, like many out there, I have to complete a weekly timesheet for project accounting purposes. Whilst it's never much fun, I recognise that it is a necessary evil.

However, my real bugbear here is the timesheet software that we use - let's call it Opacity (again, not its real name) - a commonly used application that has been sold extensively worldwide. Here's the problem.

How many clicks/keystrokes should it take to enter my start times, end times and breaks for 5 days? Have a guess. Maybe 20, 25? Wrong. I counted it this week. It took 73 mouse clicks and 20 key presses. This raises more questions than answers.

How can anyone write a user interface so bad? How could anyone demo this package and go home and sleep at night? What idiot actually sat through a demo and selected this software?

Answers on a postcard please to ....

Me and my blog

Welcome to the very first post on IT Journeyman. The purpose of this blog is to share stories, anecdotes and observations from the wonderful world of IT - as I see IT. Am I qualified to write this? Well, I suppose so is the simple answer, and more to the point, you can't stop me.

About me - I've been active in the world of IT for the last two decades and have had exposure to a wide variety of platforms and systems - everything from mainframe to micro, and transactional applications to data warehouses. Among others I've worked for software houses, consultancies, banks, insurers, media and telcos, government, and sometimes for myself.

So welcome to my blog and enjoy. If any of what I write strikes a chord with you, feel free to share it with your contacts.