Back in the post 'The IT Wars' I referred to an RDBMS war fought between Oracle and Ingres. In reality, competition in the RDBMS space has always been a little more complex than that.
The Oracle/Ingres battle just happened to be the main one at play when I entered the fray. FYI, my first RDBMS was Ingres and I have to say that, from the developer's perspective, it was a far superior product to Oracle. It's generally recognised that Oracle ultimately won out through superior sales and marketing prowess, although that simplistic argument overlooks the fact that Oracle was the more reliable and scalable database (row vs page level locking, anyone?).
In reality Oracle has seen off the following RDBMS competitors over the years:
- Ingres in the late 80's
- Sybase in the early 90's
- Informix in the late 90's
Along the way the big O has also gobbled up the Rdb and MySQL databases.
The interesting thing is that when I worked for Larry in the mid to late 90's, Oracle always saw the following two as its major database rivals:
- IBM with DB2
- Microsoft with SQL Server
So it comes as no great surprise that, with the acquisition of Sun Microsystems and the plans Oracle has for its Database Machine, the likes of IBM and Microsoft are scathing in their response, calling it a return to the bad old days of the 1960's.
So how does the combination of software and hardware affect the big players?
Microsoft - SQL Server runs on any platform, so long as it's Windows/x86, so no change there.
IBM - The various forms of DB2 run on various hardware platforms - so long as they're IBM, that is. Again, no change.
Oracle - I'm struggling to see a downside here. Larry has picked up Sun at pretty much a rock bottom price, and with the new capability he can attack the integrated threats from IBM, Netezza and Teradata, who have all espoused the integrated hardware/software solution. The combination of hardware and software is not unlike the approach taken by Apple, but it doesn't preclude Oracle's regular business of selling its database on pretty much every major platform out there.
He's also picked up Java, which was the linchpin of Oracle's development platform anyway, and moved a long way towards removing the potential threat from Open Source by picking up the most successful of the Open Source databases in MySQL. In short, it's a win-win, because it's unlikely that he will alienate any of the existing hardware partners: they know they cannot ignore Oracle.
Does it herald a return to the heyday of the 1960's, though? Or, for anyone following this blog, does it mean we can look forward to the fully integrated corporate data model I've referred to in a few prior posts? Not likely. When Larry refers to an integrated server platform, I think you have to look deeper than the marketing blurb to understand that, apart from improvements in the systems management space and perhaps better performance integration, there really isn't anything new here. I'm not saying that the Database Machine is a bad concept, but I'd be wary of thinking that it heralds a return to when life was simple.
Sunday, February 28, 2010
Back to the Future
Labels:
Database Machine,
DB2,
IBM,
Microsoft,
Oracle,
SQL Server,
Sun Microsystems
Thursday, February 25, 2010
Check, check, check and check!
Part 3 of a 3 part post on the role of the modern smartphone
In the previous two posts I've explored the impact of the smartphone on our lives. The first post explored my hopes and aspirations for a convergence device ten years ago and how the current smartphone vastly exceeds them. The second post explored which devices it has definitely replaced or is likely to replace in the near future. In this final post I want to explore what's left for the smartphone to achieve. Now, I'm not pretending to be a futurologist, and the potential applications for the smartphone are almost limitless. Why else have Google and Apple branded themselves as mobile device companies when neither was their core competency less than a decade ago?
The focus of my smartphone predictions won't be the likes of location services, enhanced reality, advertising and mobile TV, as many are predicting; it's something far more mundane but essential to our daily lives.
Before I do this, let's explore something that I do (and I'm guessing I'm not alone here) every day before I leave the house and go to work:
Check, check, check and check!
- Have I got my wallet? Check.
- Have I got my house keys? Check.
- Have I picked up my loose change? Check.
- Have I got my mobile phone? Check.
(Note that in my case I don't drive to work, but if I did I'd obviously check for my car keys too.)
So could the smartphone change things here? Well, when you examine all these things they resolve down to two fundamentals of life: identity and money.
In my wallet are my driver's licence (a de facto identity card), my prepaid bus ticket (currently anonymous, though in other cases, like the London Oyster Card, not so), my work entry card (identity), some cash (money), credit and debit cards (money) and a lottery ticket (potential money). The other items in my pockets: loose change (money), house keys (identity) and car keys (identity). Yes, I know the keys don't strictly identify me, but there is an implicit assumption that because I have the key I'm authorised to use it.
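The pocket inventory above can be sketched as a simple lookup, just to make the point concrete (the item names are mine and purely illustrative):

```python
# Everything I carry, mapped to the fundamental it resolves to.
POCKET = {
    "driver's licence": "identity",
    "bus ticket": "money",
    "work entry card": "identity",
    "cash": "money",
    "credit/debit cards": "money",
    "lottery ticket": "money",
    "loose change": "money",
    "house keys": "identity",
    "car keys": "identity",
}

def fundamentals(items):
    """Return the distinct fundamentals a set of pocket items resolves to."""
    return sorted(set(items.values()))

print(fundamentals(POCKET))  # everything resolves to just two things
```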
So can the smartphone morph into and replace the above devices? Well, technically, yes. There's nothing revolutionary about the concept of using a smartphone as a payment device. Nokia were talking about this sort of stuff 10 years ago, and I believe many others are working on this technology right now. Let's also not forget that I can already access our bank accounts, never mind eBay and PayPal, on my phone, and that we all access our mobile phone accounts (either prepay or postpay) every time we make a call or send a text. In short, it doesn't take a huge leap of faith to believe that smartphones will become payment channels in the near future.
Looking at cars, we are already starting to see keys replaced by 'dongle-based' access devices. Is it such a stretch to see your car's iPod/iPhone integration extending to cover car security too? The same could be said for home security.
This leaves us with the rest of identity. For example, can you imagine your driving licence being stored on your smartphone (or on a cloud server accessed by your phone)? If so, what about your passport, or even some future DNA-based identity scheme? This is where I'm probably stretching the limits of how we will use smartphones in the future, not because of some inherent technical limitation, but a bureaucratic one. Could you imagine government departments accepting smartphones for ID? And I can already hear the din from the civil libertarians.
However, if the smartphone is going to evolve in these directions then we're going to need a pretty secure way for the device to identify us. Biometrics would definitely be required in the handset for this. We're also going to need some pretty secure encryption technology to ensure that nobody is hacking our accounts through the ether.
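As a hedged sketch of the kind of mechanism this would need (the names and the secret are invented for illustration): once a biometric check unlocks a secret held on the handset, the device can prove its identity to a bank or government service with a challenge-response, so the secret itself never crosses the ether.

```python
import hashlib
import hmac
import os

# Illustrative only: a secret provisioned to the handset's secure storage,
# released after a successful biometric check (not modelled here).
device_secret = b"example-device-secret"

def respond(challenge: bytes, secret: bytes) -> str:
    # The device signs the server's one-time challenge; the secret never leaves it.
    return hmac.new(secret, challenge, hashlib.sha256).hexdigest()

def verify(challenge: bytes, response: str, secret: bytes) -> bool:
    # The server, holding the same secret, recomputes and compares in constant time.
    expected = hmac.new(secret, challenge, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(16)  # fresh per attempt, which foils replay attacks
assert verify(challenge, respond(challenge, device_secret), device_secret)
```

Real-world schemes would layer public-key cryptography and tamper-resistant hardware on top of this, but the shape of the exchange is the same.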
But the important point here is that nothing mentioned in this post is beyond the bounds of reasonable technological advances in the next few years. Considering how far the smartphone has come in such a short space of time, I'm beginning to see why Apple and Google are so interested in them.
Wednesday, February 24, 2010
How many devices does your iPhone replace?
Part 2 of a 3 part post on the role of the modern smartphone
In the previous post I reflected on how far the smartphone (in my case the ubiquitous iPhone, but in reality it could be any smartphone) had come, and how it had superseded not only the mobile phone and the PDA but also a plethora of other functions and devices. So let's look at what, in my case, it's replaced or is likely to replace in the next few years:
- PDA - Will I ever buy another dedicated PDA? Don't think so. Sorry Palm.
- iPod - Again, I'm not sure I'll ever buy another dedicated music player, nor an MP4 player. Bye Apple.
- Sat Nav - Would I ever buy another dedicated Sat Nav device? Probably not, although for a true replacement I'd likely buy Tom Tom for iPhone. If I had a Google or Nokia smartphone this wouldn't even be the case. Sorry Tom Tom.
- Digital camera/camcorder - I'm not sure we'll replace our compact digicam when it dies. Maybe not the camcorder either. Sorry Sony and Canon. The digital SLR is, however, safe for now.
- Portable games console - The PSP barely gets any use and could well end up on eBay before the month is out. Sorry Sony.
- Portable DVD player - My better half wanted a portable DVD player so we could take the kiddie DVDs with us when we travel. I've ripped and encoded this stuff onto my iPhone and got an iPhone Composite cable for Christmas, so we're pretty set on that front now too.
- Watch and Alarm Clock - This is probably not typical, but for a few years now I haven't worn a wristwatch, relying on my phone for timekeeping instead.
It's amazing really that all this stuff has been, or is likely to be, replaced by the phone in my pocket. I'm not saying that the iPhone is better than any of these devices, but in most cases it's certainly good enough to replace the dedicated device.
The other advantage is that it's always to hand and charged, unlike most of the other stuff, which spends most of its time in the cupboard out of battery.
Let's see how far we've come
Part 1 of a 3 part post on the role of the modern smartphone.
Back in the late 1990's I really wanted a mobile phone that was a true convergence device, something that meant I didn't have to carry around both a mobile and a PDA. At the time I had an Ericsson GF768 flip phone and a Palm Pilot 3. Ultimately, the single device I wanted would be able to make calls, manage a consolidated contact list, scribble a few notes and maybe, just maybe, write an e-mail. WAP was the next big thing then, so access to the WAP web would be a nice-to-have.
These days any half-decent smartphone will more than meet these needs, but let's just consider how much more they can do. Today my iPhone gives me all of these incredible devices and applications in my pocket: phone, calendar, web browser, contacts, camera, video camera, audio recorder, texting, messaging and Skype, IM and chat, access to social networking, email, music, movies, photo album, books, GPS and maps, and games. Oh, and access to an almost unlimited supply of applications too.
I may have had to wait ten years to get it, but when it arrived my convergence device really did exceed my expectations. As Ferris Bueller says, 'Life moves pretty fast. If you don't stop and look around once in a while, you could miss it.'
Thursday, February 18, 2010
Six Degrees of Preparation
Over a coffee the other day a friend and I were discussing which qualifications are most appropriate for a 21st Century IT professional. Obviously you would assume that a BSc in Computing or equivalent would rank highly, maybe even a BA in Information Technology, although at my Uni that was regarded as the soft option among IT degrees. Other studies related to New Media and Communications may also be relevant.
(FYI, I studied on a Combined Sciences course, of which one of my majors was Computer Sciences.)
The funny thing is that when I got my first job in IT I was already COBOL-trained, so productive from pretty much day one. Not so for a couple of my fellow new starters, who had read History and English at Uni respectively. They embarked on a 6-month programming course.
IT is pretty unique in that way, as it has always been open to all comers. This would be inconceivable in most other professions unless you had the relevant degree, whether in established fields like Law, Medicine and Engineering or new ones like Biotechnology.
Back to the original question: what is the best degree? Well, my mate believed it would be a Law degree, followed by an MBA, because as we move to outsourcing and cloud computing he believes the only relevant skills for modern IT are drawing up and managing contracts.
That's a real shame.
Tuesday, February 16, 2010
Dark Days at 1 Infinite Loop?
According to some recent tech press articles it's been a week to forget for Apple. Firstly, Redmond launched what appears to be its first viable alternative in the mobile devices space with Windows Phone 7 Series (catchy title, eh?), and then the rest of the also-rans in the mobile phone industry (excluding Google, RIM, Microsoft and Nokia) announced the WAC as an alternative to the App Store.
Add to this some lacklustre reviews of the iPad, and should we be calling an end to the Apple Renaissance?
Well, time will tell whether the Microsoft gamble will succeed, but many pundits are already writing it off just like the Zune. Steve Jobs has used the Wayne Gretzky quote 'I skate to where the puck is going to be, not where it has been', and I think Microsoft is skating to where the iPhone has been.
And as to the WAC, check out http://www.theregister.co.uk/2010/02/16/app_stores_szzz/ for some thoughts. Also, if the Ovi Store has to resort to giving away the Nokia Maps application for free to get some traction, how will the WAC fare any better? Nokia reported 3m downloads of its Maps application last week. To put some context around these numbers, compare that with Apple's overall 3bn App Store downloads and 10bn iTunes downloads.
The much more serious threat to Apple in the mobile space comes from Google with its Android platform, despite Eric Schmidt's low-key address today at MWC. Android, and a plethora of mobile phone makers, could challenge the all-in-one hardware/software philosophy that Apple has today, just like Windows and a plethora of PC makers did for the original Macintosh computer. However, there are real differences between the early PC days and the current smartphone wars. Namely:
- Different pricing models are at play. Back in the 80's everyone agreed that the Mac was superior to the PC but cost twice as much. Most mobiles are bought on the back of network contracts, so unless Android handsets start significantly undercutting the iPhone that doesn't apply here. Indeed, given that the mobile phone is something of a status symbol, a market awash with cheap Android clones may even be counterproductive.
- Smartphones are as much fashion statements as utilitarian devices, meaning that the design of the handset, operating system and application software is crucial. Here is where I think Apple has the edge. The key principles at the core of Apple are superior design and controlling the entire end-to-end user experience.
Will Android overtake the iPhone? Almost certainly. But will that be at the expense of Apple? I don't think so. I suspect it is Nokia, Microsoft and RIM who have more to fear from this week's announcements.
Monday, February 15, 2010
I think I need an App for that
You may not know it, but we are now in the middle of a new tech land grab as established iPhone developers desperately scramble to get native iPad versions of their apps ready for the new platform. This land grab can be measured by the incredible uptake of the iPad SDK.
And before you say it: yes, I know that the vast majority of the existing 150,000 iPhone apps will upscale and run on the iPad, but that's a stopgap at best. No doubt new iPad owners will want shiny new iPad apps.
The land grab is underway because the smart developers know that there's a narrow window of opportunity before the AppStore becomes awash with tens of thousands of iPad apps.
Now here's the thing I find most interesting. Viewing the App Store today on my iPhone I can basically flick through 20 categories and about 300 apps per category, give or take (assuming I have an hour or so spare, that is). That's 6,000 apps I can access before I have to resort to search strings or the Genius recommendations.
This is tip-of-the-iceberg stuff. There are another 144,000 apps out there that it's highly unlikely I'll ever find unless they're featured in some way. Hence my problem: how do I know these apps exist if they're not featured and I can't easily access them?
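The browsing arithmetic above works out like this (the figures are my own rough estimates from flicking through the store, not measured data):

```python
# Rough App Store discovery arithmetic, as described in the post.
categories = 20
apps_per_category = 300          # roughly what's browsable per category
total_apps = 150_000             # approximate App Store size at the time

browsable = categories * apps_per_category
invisible = total_apps - browsable

print(browsable)   # apps reachable just by flicking through lists
print(invisible)   # apps effectively invisible without search or featuring
```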
Maybe I need a better way of searching for the apps. You know what - I think I need an App for that!
Sunday, February 14, 2010
Object Blues
One of the key differentiators between Mac OS X and Windows, according to quotes attributed to Steve Jobs, is that Mac OS X is Object Orientated. He places this fact at the heart of a key productivity advantage, allowing Apple to turn out a major release every 18 months or so; Redmond seems to take a lot longer than that for Windows. It's also enabled them to turn out the respective iPhone and iPad OSes relatively quickly.
That's fine as far as it goes, but I personally have always had a bit of an issue with OOP (Object Orientated Programming) in the real world, in that it isn't exactly consistent with the RDBMS view of the world:
- OOP is Function Centric whereas Relational is Data Centric
IMHO OOP works well when you're writing things that are function centric (like operating systems, flashy web sites, computer games and graphics packages) but is less successful when writing large data-handling systems (like data warehouse ETLs and high-volume transaction systems).
When we have to merge the object and relational worlds together we have three options. The first is to use an Object Relational Bridge technology; the second is to store Object datatypes inside the RDBMS. Both of these present the RDBMS in an OO-friendly format.
Then of course there's the third way, which is to use an OOP language (like Java) and write code that doesn't conform to OOP basics, negating all the benefits of Object Orientation. Guess which option I've mainly seen in my travels.
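To make the first option concrete, here is a minimal sketch of what an object-relational bridge does under the hood (the class and table names are invented for illustration; real bridges add caching, identity maps, transactions and much more):

```python
import sqlite3
from dataclasses import dataclass

# A toy object-relational bridge: one class mapped to one table.
@dataclass
class Customer:
    id: int
    name: str

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")

def save(c: Customer) -> None:
    # Object -> row: the bridge translates attributes into columns.
    conn.execute("INSERT INTO customer VALUES (?, ?)", (c.id, c.name))

def load(cid: int) -> Customer:
    # Row -> object: the relational result is rehydrated as an instance.
    row = conn.execute(
        "SELECT id, name FROM customer WHERE id = ?", (cid,)
    ).fetchone()
    return Customer(*row)

save(Customer(1, "Acme Ltd"))
print(load(1))  # Customer(id=1, name='Acme Ltd')
```

The function-centric code only ever sees objects; the data-centric database only ever sees rows. The bridge is where the two world views are reconciled, which is exactly why it's such an awkward seam.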
Labels:
Apple,
Microsoft,
Object Orientated Programming,
RDBMS
Slumdog Billionaires
Congratulations are due today to Larry Ellison and his BMW Oracle Racing team for bringing home the America's Cup to, err, America. Back in 1998 I was working as one of Larry's minions when he came down under to win the ill-fated Sydney to Hobart, so I know how much this win will mean to him.
What is of interest to me, however, is that the America's Cup has essentially become a massive ego contest between some of the world's richest billionaires. In this case between one Lawrence J Ellison, 65, US software billionaire and 4th richest man on the planet, and Ernesto Bertarelli, 44, Swiss Biotech billionaire and 52nd richest man.
Checking a list of billionaires on Wikipedia, I wasn't surprised to see that the computer industry contains more than its fair share, including Bill Gates, Steve Ballmer and Paul Allen from Microsoft. Not present was Steve Jobs, but at a rumoured $5bn, and with Apple's remarkable growth, he can't be far off joining them.
Fortune has shone on Bill, Steve and Larry: aside from all dropping out of college, they were in the right place (Silicon Valley) at the right time (the early 1970's and the advent of the silicon chip) with the right idea (hardware and/or software for the PC, or the database for client/server).
My question for today is where and when the next set of billionaires will emerge. Obviously Larry Page and Sergey Brin are the real winners from the Internet boom, but it looks like they are the exception of their age rather than the rule.
With hindsight it becomes clear that the early days of the personal computer and client/server computing presented opportunities for hobbyists working for love, not money, in their garages and with like-minded friends in clubs (like the Homebrew Computer Club).
My suspicion is that this was a one-off and it is unlikely that the next set of billionaires will come from the hobbyist fold. They will be from the likes of the biotechs, and it's more likely that they will complete their studies at the likes of MIT or Stanford than drop out.
I don't know why, but I, for one, find that quite sad.
Labels:
Apple,
Bill Gates,
Google,
Larry Ellison,
Microsoft,
Oracle,
Steve Jobs
Thursday, February 11, 2010
IT Nirvana
Over the last few posts I have raised the unfulfilled concept that was prevalent in the early days of Open Systems - namely the Corporate or Enterprise Database. This was the idea that corporate data would be stored and accessed from a single, centrally managed place. Of course the central database never happened, and there are lots of reasons for this, but I like to think of the Corporate Database as a kind of IT Nirvana. If, twenty years ago, we could have shown IT managers what their systems landscapes would look like today, I think the idea might have taken off and just possibly we could have achieved that blissful IT state.
Instead we face the following issues:
- Systems complexity - get your local IT architect to print out your current systems landscape. Even better, get them to walk you through all your interfaces.
- Data Warehouses - Often the holy grail of the data warehouse is the search for a single version of the truth, because your multiple systems will all hold different versions. If the data was kept in one place then we wouldn't need to build expensive and complex data warehouses.
- Master Data Management - The Corporate Database is the MDM solution
- Enterprise Data Bus - Why would we need a bus if all the data sits in the same location?
So are we likely to find spiritual enlightenment in IT soon? Well not so long as we can't see past the next quarter, or FY budget for that matter.
Luddites Unite
When I first started in IT I was a COBOL programmer and the database I mainly used at that time was IDMSX. We had a large, powerful mainframe that supported over 250 transaction processing applications. The interesting thing was that at a conference my boss got talking to his equivalent from a similar-sized organisation. They had the same mix of systems and same sort of user community - only their mainframe had less than half the grunt of ours.
Why? Well, they had rejected database technology and kept everything in flat files. I can't vouch for how responsive they were as an IT team, but it makes you think.
This highlights something we've all seen happen in IT over time - huge advances in chip, bus, memory, disk and network technology are soaked up by software bloat. OK, this is a simplistic argument, and in reality corporate IT is doing a lot more than it did 20 years ago, but it still makes me wonder just how those flat-file systems would perform on today's equivalent platforms. I think they'd scream along.
Suspect Packages
In the 70's and 80's companies used to write bespoke software for all their needs. Everything from payroll to financials to core business systems would be written in-house in COBOL, RPG, PL/1, Pascal and the like. Then in the 80's and 90's the concept of packaged applications became popular. The rationale was that it was better to purchase and implement off-the-shelf packages than to develop from scratch. This made perfect sense, and much of my early IT work involved package implementation. Ideally a package could be implemented in months rather than written in years.
Software selection became vital as the business would often need to assess how flexible that package was and how easily it would fit into the existing processes. Working on the Technical Pre-Sales side it was always important to try to get a product fit of 90% or better. Inevitably though there was never a 100% product fit and in these situations there were two potential solutions:
1) For the business to adjust its way of working (sometimes called Business Process Reengineering)
2) For the package to be changed or modified (i.e. mods)
In the vast majority of cases the latter option was chosen, as the business was reluctant to take on change. In one case I worked at a customer site where 70% of the programs had been modded, making you wonder why they had ever selected the package.
The one exception to this model has always been SAP. They have always insisted that BPR is a fundamental principle of their implementations, and as far as I know SAP implementations are never quick. Here at the Department of Hopes and Dreams we're currently 3 years into a SAP implementation and the frustration it is causing the business is unprecedented. It makes me wonder whether the advantages of packaged apps are still there.
Master Data Blues
Master Data Management is currently one of the hot topics doing the rounds of Enterprise IT. For those unaware of the concept MDM is basically the storing, either physically or virtually, of all corporate reference data in a single MDM repository. Consultancies love to sell MDM solutions for two reasons:
1) They are quite easy to justify
2) MDM technology, whilst expensive, is available
So in theory we should all be swimming in MDMs by now. I don't know about you, but I'm not. In fact a friend of mine became the first and only person I know to land a successful MDM project.
So if we agree that it's the correct thing to do, and that the technology is available, then why don't we see MDMs all around?
Simple really - the business side of MDM is very, very hard, so before you begin you'd best make sure that the business understands what it is biting off.
Of course, if we'd kept a tight rein on our corporate IT systems in the first place then we wouldn't need MDM would we!
Labels:
Consulting,
IT,
Metadata,
Project Manager,
Software Architect
Benson - 9th Generation PC
I was a huge fan of Paul Woakes' groundbreaking 'Mercenary' computer game, which is probably best remembered for its smooth 3D vector and polygonal graphics. What is perhaps less well remembered about the game was that your character was guided through it by a wiseass sidekick called Benson, a so-called 9th-generation PC. Benson would alert you when you were under attack, communicate with the locals (the Palyars and Mechanoids), etc. At the time I imagined that Benson was some sort of wearable, PDA-like device - maybe strapped to your arm, for example. I certainly didn't believe that a 9th-generation PC was some beige box, screen and keyboard that you lugged around an alien landscape.
So this has got me thinking - what would the nine generations of PC look like?
1st - Well, a first generation PC is easy - you're probably still using one on your desk today
2nd - That would have to be the laptop
3rd - I'm figuring that would be a PDA (Palm Pilots, Psions, Windows CE) or smartphone (iPhone, Blackberry, etc.) - If I'm honest these are pretty close to what I imagined Benson would be twenty years ago - only perhaps wearable.
4th - Is it the tablet? Time will tell.
What the 5th-9th generations will look like is still open for the likes of futurologists and sci-fi fans to debate, but here are some candidates:
I'm thinking that 5th-generation PCs will be devices like the augmented-reality e-paper displays seen in the movie Red Planet. Alternatively they could be wearable PCs that project augmented-reality information onto HUD-style goggles/spectacles or, as some have proposed, contact lenses.
Further afield it's impossible to predict, but if I had to have a stab at it then generations 6-9 would adopt technologies proposed in many sci-fi movies (Firefox, Strange Days, eXistenZ, The Lawnmower Man, The Matrix, Johnny Mnemonic, etc.) which interact directly with our brains via some digital thought bridge, as scary as that might seem.
The one thing I can be certain of is that the existing first-gen PC won't be around forever.
Tuesday, February 9, 2010
Home or Away
In the previous post I touched on working from home. Many of us have PCs at home or work laptops on our desks, so in theory many of us could be doing the same job from home. In the main, however, we don't, and I believe there are a number of reasons for this:
- Communication - I think face-to-face communication is great and by working from home we'd lose it. However, as I've already touched on in "The Great Lost Art of Communication", I think this is becoming less relevant in today's workplace. If we're emailing each other across the office floor we might as well be emailing each other across the city.
- Management vision - I don't for one moment think our management have seriously sat down and debated whether they should contemplate offering work from home options for large parts of their workforce.
- Measuring how we work - Easy enough for sales guys and call centre workers, but what about the rest of us? Performance is usually measured on hours and the perception of our abilities at our annual reviews, both of which are difficult to assess when you work from home. If we want to effectively measure how our home-based workers are doing then we need to put a lot more effort into assessing how long tasks should take. Perhaps we even need a system of paying based not upon hours but upon achievements!
- Trust - If we still measure by hours then this brings the question of trust into the equation. Often work from home is abused.
None of the above is insurmountable, but it does require management vision and an ability to think outside the current way in which we allocate work and measure our success in achieving our goals.
Congestion Blues
My average work commute speed is about 8mph*, which is just better than the average London rush-hour car journey at 7mph - worse than the horse and cart achieved at the start of the 20th Century.
Cars are great, so what's gone wrong? Well, the problem is the tarmac and the fact that we all want to use the same bit at the same time. So if that's the problem then what are the possible answers?
- Build more roads - yet studies have shown that new roads encourage people off public transport and into cars with no net benefit to congestion.
- Build great public transport - yet most governments are too bankrupt to fund the sort of public works programs required to make major infrastructure improvements here.
- Work from home - the best commute is the one you don't have to take. With broadband and mobile technology we have the infrastructure and tools that enable us to really embrace this solution, but so far it remains a minority one.
So what's this got to do with IT? The answer is below:
- Build more roads - is the equivalent of building more interfaces between our systems. Except that in our case each road is made from different materials, using different construction techniques, different traffic lights and different road signs.
- Better public transport - is the equivalent of building an enterprise bus architecture. A great potential solution, but still very costly and complex.
- Work from home - back in my earlier post "Redundancy Blues" I lamented the lack of the single corporate database. Imagine if all the data our company held was in one database. We would ensure that it ran on the best hardware, backup and recovery would be easy, and we wouldn't need to build data warehouses to try to discover a single version of the truth. In short, if we had realised the potential benefits offered to us by Open Systems we'd be in a far better place than we are today.
Oh, well, must go and get my pick axe. I'm building a new road today.
*Please note that I consider myself very lucky that my commute is only half an hour. Across the world many, many people spend about 2+ hours per day commuting to work and back - that's about 25% of the time that they are actually there!
Monday, February 8, 2010
ROLAP smolap 2
Years ago at ungodly hours BBC2 used to broadcast TV lectures on behalf of the UK's Open University. Many of the films date from the 60's and 70's and are usually quite funny to watch because of the dubious haircuts and fashions prevalent in academia at that time.
I remember one such program caught my attention because it focussed on the SQL language. The lesson extolled the virtues of putting an English-like query language into the workplace, implying that your average office worker would be comfortable using it. I suppose compared to the pointer-driven databases that were prevalent in the 70's I can understand why they might have thought this way. However, in practice companies never let their users loose with SQL and it remains very much an IT development tool. There are a number of reasons for this (Cartesian products, anyone?), but one of them is NOT that SQL is an inherently complex language - it isn't - it's rather that the data we work with often is.
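To illustrate the Cartesian product trap, here's a small sketch using hypothetical tables in SQLite - forget the join condition and the row count silently explodes:

```python
import sqlite3

# Hypothetical tables: 3 orders, 4 products.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (order_id INTEGER, product_id INTEGER)")
con.execute("CREATE TABLE products (product_id INTEGER, name TEXT)")
con.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 10), (2, 11), (3, 10)])
con.executemany("INSERT INTO products VALUES (?, ?)",
                [(10, "widget"), (11, "gadget"), (12, "sprocket"), (13, "gizmo")])

# Forgetting the join condition silently multiplies the two tables...
bad = con.execute("SELECT * FROM orders, products").fetchall()
print(len(bad))   # 3 x 4 = 12 rows, most of them meaningless

# ...whereas the intended query returns one row per order.
good = con.execute(
    "SELECT * FROM orders o JOIN products p ON o.product_id = p.product_id"
).fetchall()
print(len(good))  # 3 rows
```

An office worker who writes the first query gets back plausible-looking rows with no error at all, which is exactly why raw SQL stayed an IT tool.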
This led to the development of simpler dimensional data models which format the data in a more user-friendly manner - mostly the star schema and, to a lesser extent, the snowflake schema. Fundamentally, all OLAP and ROLAP technology is built to utilise these dimensional models.
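As a rough sketch of the star schema shape, using made-up sales data: the fact table holds only surrogate keys and measures, the dimensions hold the descriptive attributes, and a typical 'star query' joins them back together:

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Dimensions carry the attributes users filter and group by.
con.execute("CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, year INTEGER, month TEXT)")
con.execute("CREATE TABLE dim_store (store_key INTEGER PRIMARY KEY, city TEXT)")
# The fact table holds only surrogate keys and numeric measures.
con.execute("CREATE TABLE fact_sales (date_key INTEGER, store_key INTEGER, amount REAL)")

con.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(1, 2010, "Jan"), (2, 2010, "Feb")])
con.executemany("INSERT INTO dim_store VALUES (?, ?)",
                [(1, "Sydney"), (2, "Melbourne")])
con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 1, 100.0), (1, 2, 50.0), (2, 1, 75.0)])

# A typical star query: join the fact to its dimensions, then aggregate.
rows = con.execute("""
    SELECT d.month, s.city, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_date d ON f.date_key = d.date_key
    JOIN dim_store s ON f.store_key = s.store_key
    GROUP BY d.month, s.city
    ORDER BY d.month, s.city
""").fetchall()
print(rows)
```

The surrogate keys and joins are exactly the 'complexity' that end users push back on, even though the model itself is about as simple as relational design gets.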
So what's the problem? Well, in a couple of recent projects I've had end users reject the star schema as being too complex. In both cases they wanted a single denormalised view of the world - a kind of denormalised fact-and-dimension all-in-one superset.
Why did they reject the star schema? Well, the reasons varied from not wanting to understand its complexities (i.e. surrogate join keys, current flags, effective dates, etc.) to a misguided belief that querying a single 'all-in-one' table would outperform a star query.
It gets worse: one of the BI Managers had come up with the construct of a daily snapshot, such that every day the full denormalised snapshot dataset would be inserted even if there were no changes to the source data.
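For contrast, a delta load only touches the rows that actually changed. A tiny sketch with hypothetical data:

```python
# Hypothetical source extracts keyed by id; values are the row contents.
yesterday = {1: ("Acme", "Sydney"), 2: ("Globex", "Perth")}
today     = {1: ("Acme", "Sydney"), 2: ("Globex", "Melbourne"), 3: ("Initech", "Hobart")}

# Full daily snapshot: insert every row, every day, changed or not.
full_load = list(today.items())

# Delta load: insert only rows that are new or changed since yesterday.
delta_load = [(k, v) for k, v in today.items()
              if k not in yesterday or yesterday[k] != v]

print(len(full_load))   # 3 rows inserted regardless
print(len(delta_load))  # 2 rows - only the changed row and the new row
```

On a quiet day the delta load inserts nothing at all, while the snapshot approach keeps piling up identical copies of the whole dataset.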
In both cases I fundamentally disagreed with the customer and argued on behalf of the star schema. One battle I won and one I lost. Funnily enough, though, the battle I lost wasn't down to any inherent objection to the star schema - it was lost because the users had built a Visual Basic front-end application through which to query and report their data. What they were able to achieve in VB had my Business Objects developers' heads spinning. For those reasons I state it again: ROLAP is old hat. If you want to keep your BI consumers happy you'll need to start building them some nice apps.
Sunday, February 7, 2010
Avatar and the Real Time Data Warehouse
Last night I saw a feature on the making of Avatar. What caught my attention about this was the way in which they had created a 'virtual camera' through which James Cameron could observe the live action of his actors morphed into their avatar bodies and displayed in the CGI generated world of Pandora - in REAL-TIME.
In my business world, which is about as far removed from Pandora as you could get, I build data warehouses and have been doing so for many a year. From time to time the concept of Real-Time Data Warehousing has arisen and largely been dismissed for the following technical reasons:
- Data Dependencies
- Slowly Changing Dimensions
- Complex Transformations
- Updating Aggregate/Summary Data and Cubes
All of the above are still valid, and there would still need to be a solid business case to justify the extra cost of building a real-time data warehouse as opposed to a cheaper batch one.
However, I'm sure the list of technical hurdles the Avatar movie makers overcame was a whole lot longer than my sorry one. Maybe it's time for a rethink!
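To show why one of those hurdles - the Slowly Changing Dimension - sits awkwardly with real-time loading, here's a minimal sketch of a Type 2 change (all names and dates are made up): each change closes off the current row and inserts a new one, bookkeeping that is easy in a nightly batch but fiddly on every trickle-fed update.

```python
from datetime import date

# Hypothetical Type 2 dimension: one row per version of the customer.
dim_customer = [
    {"customer_id": 42, "city": "Sydney",
     "valid_from": date(2008, 1, 1), "valid_to": None, "is_current": True},
]

def apply_scd2_change(dim, customer_id, new_city, change_date):
    """Close the current row and insert a new current row (Type 2 change)."""
    for row in dim:
        if row["customer_id"] == customer_id and row["is_current"]:
            if row["city"] == new_city:
                return  # no real change; nothing to version
            row["valid_to"] = change_date
            row["is_current"] = False
    dim.append({"customer_id": customer_id, "city": new_city,
                "valid_from": change_date, "valid_to": None, "is_current": True})

apply_scd2_change(dim_customer, 42, "Melbourne", date(2010, 2, 7))

# History is preserved: one closed-off row plus one current row.
print(len(dim_customer))                        # 2
print([r["is_current"] for r in dim_customer])  # [False, True]
```

In a batch world this close-and-insert happens once a night under controlled conditions; doing it safely for every incoming change, while queries are running against the dimension, is a large part of what makes real-time warehousing hard.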
Thursday, February 4, 2010
What goes around
Ever wondered how, over the last 20 years or so, popular culture has begged, borrowed and stolen everything from the previous fifty? I think we're currently up to about 1983 in this playback. Goodness knows where we will go after we've revisited grunge, because I don't think anything original has been created since then.
Strangely, Enterprise IT has its cycles too:
Centralised IT - Mainframe and green screen dumb terminal
Client Server IT - Midrange UNIX boxes and desktop PCs. Custom-built GUI applications.
Distributed IT - n-Tier applications, middleware, web and application servers and browser delivered applications.
The problem with this model is that every time we've added something we've also taken away.
Green screens were great for data entry but try viewing a BI dashboard on one. GUI Apps were much prettier than browser apps but required all those installation and network overheads. Browsers are great for distribution and access from anywhere but often have a woeful user interface.
I've been waiting for the last 10 years for the next cycle to arrive. One candidate was the advent of Rich Internet Applications built with the likes of Ajax, Flash or Curl, but frankly I'm still waiting for these to arrive in Enterprise IT.
If the iPad gets picked up by the Enterprise then maybe all that will change and we'll start to see apps being developed that are user friendly yet easy on the infrastructure. Isn't that a little bit like a return to the Client Server model?
Netbook Blues
Here at the Department of Hopes and Dreams one of the key strategic initiatives underway at present is Kevin Rudd's Digital Education Revolution (DER). In essence the plan involves providing all teenage school students with a netbook computer. Over time elements of the curriculum will hook into the equipment, transforming the education experience. I'm no expert in this field, but even I can see the potential here, so in one sense hats off to Kevin and Julia for the program. More about that later.
A few years ago I gave a hand-me-down PC to my Mum as she showed interest in getting on the internet. Despite my Mum attending a few computer courses she's still a technophobe at heart, meaning that the PC is effectively used for solitaire, Skype, e-mail and a little bit of internet. As we now live 9,000 miles apart Skype Video has become by far the most important of these applications.
A few months ago she got a virus, ultimately requiring a complete PC reinstall, a new ISP software CD, etc. Long story short, the outage lasted about 3 weeks and was only fixed by using the services of a PC repair man.
Now back to the DER. Here in NSW some 500 Technical Support Officers (TSOs), effectively glorified PC repair men, have been hired to support the program. Maybe it would have been better just to pick a device that wouldn't crash so much, rather than an underpowered PC running Windows. Say an iPad, or a Google tablet running Chrome perhaps?
Speaking of which, when Steve gets round to putting a front camera in the iPad I figure it'll be just the device for my technophobic Mum.
Wednesday, February 3, 2010
The next iPhone Killer please stand up
In the movie 'The Right Stuff' there's a scene where the wannabe test pilots enviously discuss Chuck Yeager and how every time a challenger comes along he just suits up and pushes the flight envelope a little bit further.
I think the following list of phones must feel a little bit like those wannabes as, at product launch, each one has been dubbed as a potential iPhone killer:
- Nokia N95
- T-mobile G1
- Blackberry Storm
- HTC Magic
- Palm Pre
- HTC Hero
- Nokia N97
- Motorola Droid
- Google Nexus One
The important thing to understand is that, in its own way, each of these phones had something to offer that was better than the equivalent iPhone it was pitched against, whether that be an improved camera, a physical keypad, multitasking, push email, etc. All have tried, yet so far none has come even close to usurping Cupertino's touchscreen miracle. You can speculate yourself as to the reasons why.
What must be particularly exasperating for their competitors is how little Apple has actually had to do to stay at the top of the pyramid. In the 3 years since its launch we've seen one case redesign, a couple of processor speed bumps, extra storage, a slight camera improvement and evolutionary software improvements. That's it.
They always say that true greats make everything look simple. I guess you could say the same for Apple and the iPhone.
The Price of Everything, The Value of Nothing
Back in the days when I wore a professional services hat the T&M (time and materials) contract was prevalent. T&M was good for a consultancy as it loaded most of the risk on the client. I also think it did a good job of focusing the client's mind, as the last thing they wanted was expensive consultants sitting idle whilst they prevaricated over making a decision.
Over time clients increasingly moved to a Fixed Price model for their consulting engagements, which helped control their costs. It also shifted the risk to the consulting firm, and to cover that risk I know that my consultancy would add a margin of 30%. What the clients failed to understand in the shift from T&M to Fixed Price was how it would change the mindset of the consultancies that they engaged.
Back in the T&M world we really focused on doing things right. We knew we were expensive and we knew we had deadlines to meet. I can honestly say that when I worked on site I believe I acted in the client's best interests. The problem with Fixed Price is that the consultancy focuses on doing as little as possible to cover its contractual obligations.
Hence the client may think they're getting a better deal by fixing the price, but you know what, I'm pretty sure they haven't figured out how to measure whether they're getting good value or not.
The MobCon War
A little over a month ago in my post 'The IT Wars' I decried the lack of a current IT War. Like it or not, wars are the engine of innovation, hence the old quote that in 300 years of peace the Swiss invented the cuckoo clock whilst in the Second World War the protagonists invented radar, rockets, the jet engine and the atom bomb.
My earlier assessment was that the following IT Wars had been fought and the victors had prospered.
The desktop wars: Microsoft Windows vs IBM OS/2
The server wars: Mainframe vs midrange
The RDBMS wars: Oracle vs Ingres
The browser wars: Internet Explorer vs Netscape
The Search Engine Wars: Google vs Yahoo
In my defence the focus of my original post was commercial IT, but on rereading the post, and given that on occasion my blog has strayed into the MobCon space, I figured it was time to put the record straight and comment on the current war at hand.
To paraphrase the Excapite blog: The MobCon Wars. If you want to understand MobCon I'd recommend some essential reading at http://excapite.wordpress.com/
There are lots of protagonists originating from the technology, telco and media fields, but I believe that this war will be fought most bitterly between Apple and Google. Up until a couple of years ago these two tech giants seemed to be peacefully co-existing, focused more on attacking Redmond. Eric Schmidt even had a seat at the Apple high table. There are even rumours that the two companies had a no-poaching agreement for staff.
Judging by the second-hand quotes attributed to Steve Jobs at a recent Town Hall meeting, it would seem that relations are somewhat strained between the two tech goliaths. He appears to be aggrieved that the Mountain View mob have strayed onto his turf with the Nexus One and a potential Chrome-based tablet, whilst Apple to date has stayed out of the search engine field. Nothing like a bit of siege mentality to focus minds and steel your troops for combat.
You can place your bets on the eventual winner but if I was a betting man I'd have to put my money on Apple. After all they're a company I choose to spend my hard earned cash with. When's the last time you bought anything from Google?
Tuesday, February 2, 2010
ROLAP smolap
One of the first ROLAP tools that I came across was Oracle's Discoverer product. As one of Larry's consultants I led a Data Warehouse team that delivered our reports using it when it was a brand new product. So new, in fact, that the client didn't realise the paint hadn't dried on it and it was actually pre-production software. They assumed that Discoverer 3.0 had been preceded by versions 1.0 and 2.0, and there's another story in there about trusting Oracle Sales and Marketing, but I digress.
Some 12 years later I came across Oracle Discoverer again. To my surprise very little appears to have changed. The EUL and full client looked almost identical. I'm sure that under the hood there have been some changes for intranet, PDF and web delivery, but I'm still a bit amazed at the lack of innovation in the ROLAP world.
Business Objects finally seem to be getting things together with BOXI R3, and I have to admit that I haven't seen Cognos's stuff for a while so I can't comment on them.
The only real innovation I've seen in the last few years was ProClarity, before they were swallowed up by Microsoft, but that's an OLAP and not a ROLAP offering.
A few years ago I started to believe that the Reporting tools were stagnating and that cubes, whether OLAP or ROLAP, weren't the answer. I hoped that the move into RIA (Rich Internet Applications) and tools like Curl would fill that gap but as yet nothing seems to have developed there.
Maybe new platforms like the iPhone and more importantly the iPad will spur the sleeping Reporting giants into a new series of innovation. I hope so.
Labels:
BI,
Business Objects,
Curl,
Data Warehousing,
iPad,
iPhone,
Microsoft,
OLAP,
Oracle,
ROLAP
Apple's BusinessAppStore
With the launch of the iPad, Apple will have three content based online stores. These are:
- iTunes
- AppStore
- iBookstore
What I'm wondering today is whether they need a fourth store dedicated to Business. What I mean is that there could potentially be hundreds of thousands of business apps that companies might want to deliver internally but not make available to the world at large.
For example, what if I wanted to develop a front end dashboard application for my EIS? As I've already stated in a previous post, we are being pressured to deliver PDF reports to our CFO's iPhone, which breaks our Warehouse Security model. The potential could be enormous, but then so would be the challenges.
For a start, how would Apple charge for a BusinessAppStore? Currently they take 30% of the revenue from all paid iPhone Apps. Security would also be an issue too, of course. But I think the idea has legs.
Just out of interest, I noted the other day that there was a SAP Business Objects App for the iPhone, so it's obviously not only game developers who see interest in Apple's devices in a business context.
Expand this concept wider and hook it into Apple's cloud platform, MobileMe, and we really could see something of interest.
Labels:
Apple,
AppStore,
Cloud Computing,
iBookstore,
iPad,
iPhone,
iTunes,
MobileMe
My Online Doppelgänger
It seems that, a few months after I started my IT Journeyman blog, I have an online doppelgänger. That's OK because I'm not the jealous type.
Having skimmed said blog the following post http://www.itjourneyman.com/2010/01/16/data-warehouse-2nd-time-is-a-charm caught my interest and it's essentially a rehash of a few white papers on "pitfalls/mistakes to avoid when building data warehouses". The long and the short of the post is that your first data warehouse will be a failure but don't worry because the second one will learn from those lessons and succeed.
I'd love to say that this was true but IMHO it's just not that simple. In my travels I've worked on first-stab data warehouses that have been blinding successes and also third tries that have had no more luck than their predecessors.
There are lots of elements that go into making a data warehouse project succeed or fail and often the initial expectation setting exercise is crucial. We have to be very careful in determining the criteria of what makes a data warehouse work and what doesn't.
It's a bit like marriage and divorce. Most people would assume that by definition a fifty-year marriage must have succeeded, but what if the husband and wife were at each other's throats for the duration? Likewise divorce after 10 years is seen as failure, but what if you've produced a couple of wonderful and well-adjusted kids and went your separate ways amicably? Expectation is everything.
What I can say is that in my experience Data Warehouse projects are difficult, and that's why I choose to work in that field and not implementing somebody else's off-the-shelf package.
Data Warehouse projects are voyages of discovery, and it's what we learn along the way and not necessarily where we end up that's really important. The problem is that most organisations and most PMs just don't understand that yet.
Labels:
BI,
Data Warehousing,
IT,
Project Manager,
Software Development
7x24
If you've been around in IT for a while you've probably come across the term 7x24 meaning 100% system uptime.
I was once employed in London by an Investment Bank as a DBA, where we were developing a mission critical global options trading system. Luckily the data volumes were small, the servers and environments stable, and I'd had plenty of time to work through a reliable hot standby failover solution with an excellent UNIX sysadmin. All was good in my world.
Then during the preparations for go-live the topic of Availability arose. The Project Manager threw into the mix that we had to guarantee 7x24 availability.
My response was that we could aim for 100% uptime excluding planned outages, but that we couldn't guarantee it. This resulted in a bit of table-thumping, as was quite often the case on IT projects in an Investment Bank.
Suffice it to say that when I explained the costs and complexities involved in guaranteeing that level of availability, on both the solution side and the human side, the PM became a bit more reasonable, especially when I threw in the fact that neither Scott McNealy nor Larry Ellison could guarantee 100% uptime on the configuration of Solaris and Oracle on which the solution was built.
So the lesson is that before you start discussing High Availability the metric that needs to be understood is the actual cost, either in dollars or reputation, to your business of the mission critical app being unavailable. Until you have that there's really no point in discussing the HA requirements of the system. The funny thing is that when I was consulting I designed lots of Technical Architectures and never once could I get that fact out of the client.
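As a back-of-the-envelope illustration (all figures below are hypothetical, not from any real engagement), the useful exercise is to translate an availability target into the downtime it still permits, then weigh the residual outage cost against what it would cost to build to that target:

```python
# Sketch of the availability conversation: what does each extra "nine"
# actually buy, and what does downtime actually cost the business?

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def allowed_downtime_minutes(availability_pct):
    """Minutes of downtime per year still permitted by an availability target."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

def residual_outage_cost(availability_pct, cost_per_minute):
    """Expected annual cost of the downtime the target still allows."""
    return allowed_downtime_minutes(availability_pct) * cost_per_minute

# Hypothetical numbers: each tier costs more to build and avoids less downtime.
cost_per_minute = 5_000  # assumed cost to the business of the app being down
for target, build_cost in [(99.0, 100_000), (99.9, 500_000), (99.99, 2_000_000)]:
    downtime = allowed_downtime_minutes(target)
    print(f"{target}% uptime: ~{downtime:,.0f} min/year down, "
          f"residual cost ~${residual_outage_cost(target, cost_per_minute):,.0f}, "
          f"build cost ~${build_cost:,}")
```

Until the business can supply its own `cost_per_minute`, the loop above has nothing meaningful to print, which is exactly the point.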
Kindle Surprise
Today I saw my second Kindle on my commute home. Unlike my first encounter I didn't feel that it was a LOL moment, but neither did I come away with any sense of envy regarding the device. I'd probably categorise it as an interesting piece of technology, but one that I will pass on.
Monday, February 1, 2010
Assisting the Police with their inquiries
Back in 1998 I was doing some Pre-Sales Consulting for an Account Manager trying to sell a Data Warehouse solution to a local state police force. I badgered the salesman to let me use the above title as a tagline on the demo but unsurprisingly he didn't see the funny side.
During the demo the thorny question of Metadata came up. More precisely, Consolidated Metadata. As I'd just come off a project where I'd defined the Metadata Architecture and Solution, I was well qualified to answer the query.
At the time we had three sources of metadata for our solution. These were:
- The Database Data Dictionary
- The CASE/Data Modeling tool in use
- The ROLAP Semantic Layer
Note that we didn't use an ETL product, which would have been a fourth source of metadata.
Now the interesting thing here is that all the software was written by the same company in the same software labs, so one would hope that some level of shared metadata would be possible. Alas no. Not only did the metadata in each repository overlap, but there was no easy way of combining it into a single consolidated metadata repository.
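A tiny sketch (all repository contents and names hypothetical) makes the overlap concrete: each tool holds a partial, slightly different description of the same column, and any naive merge must either flag the conflict or silently pick a winner.

```python
# Three hypothetical metadata repositories describing the same column.
data_dictionary = {"CUST.DOB": {"type": "DATE", "nullable": False}}
case_tool       = {"CUST.DOB": {"description": "Customer date of birth"}}
semantic_layer  = {"CUST.DOB": {"description": "Date of Birth", "format": "DD-MON-YYYY"}}

def consolidate(*repositories):
    """Naively merge repositories, recording attributes where sources disagree."""
    merged, conflicts = {}, []
    for repo in repositories:
        for column, attrs in repo.items():
            target = merged.setdefault(column, {})
            for key, value in attrs.items():
                if key in target and target[key] != value:
                    conflicts.append((column, key))  # two sources disagree
                target[key] = value                  # last source wins
    return merged, conflicts

merged, conflicts = consolidate(data_dictionary, case_tool, semantic_layer)
print(conflicts)  # the 'description' attribute is defined twice, differently
```

The hard part, of course, is not the merge but deciding which repository is authoritative for each attribute, and that was precisely the part nobody had solved.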
I answered the question honestly: nobody had a good story here, not us nor our competition. I think the client appreciated my honesty. The account manager, obviously not wanting to leave a bad impression, did what all account managers are prone to do and started promising vaporware with some cock-and-bull story about the software labs in California working on that problem.
The interesting thing is that here we are over a decade later and I've still to see a good answer to this problem.
Taking the Mountain to Mohammed
I've been working in the field of Data Warehousing for some 13 years now. Actually my first ever data warehouse was a Reporting System I built back in 1992, long before I'd ever heard the terms DW & BI, but that's another story.
The interesting thing that, so far, has been a constant in all that time, no matter what style of Data Warehouse (from full-blown Inmon Corporate Information Factory to Kimball Federated Data Marts), is that we extract data from source systems and move and load it into a data warehouse (be it an EDW, Data Mart, ODS, RDS, whatever). We'll use terminology like ETL, OLAP, ROLAP, Cubes, Star Schemas, Metadata, Slowly Changing Dimensions, etc. along the way to baffle the business and make ourselves seem clever, but fundamentally any data warehouse or data mart involves moving data from a source system into a target reporting system.
Back in the 90's this made perfect sense because it was inconceivable that we could slap resource-consuming queries and reports against the mission critical core business systems.
Nowadays that's just not the case. There are many technical solutions out there that could enable us to place a large and significant batch query and reporting load against our production data with zero impact on the core business systems. Technologies that spring to mind include Server Virtualisation, Disk Replication and Mirroring, O/S and Database Parallel Server technologies, etc.
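The idea can be reduced to a very small sketch (connection strings and workload names entirely hypothetical): instead of ETL-ing data into a separate warehouse, heavy reporting traffic is simply routed at a read-only mirrored copy of production.

```python
# Hypothetical connection strings: production vs its read-only replica.
OLTP_DSN    = "host=prod-db user=app"       # mission critical core system
REPLICA_DSN = "host=prod-replica user=bi"   # mirrored copy, read-only

def connection_for(workload):
    """Send reporting load to the replica; everything else to production."""
    return REPLICA_DSN if workload == "reporting" else OLTP_DSN

print(connection_for("reporting"))  # the replica absorbs the batch query load
```

The replication machinery underneath (disk mirroring, standby databases, parallel server) does the real work; the point is that the reporting layer no longer needs its own copy of the data moved by an ETL process.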
The question is, why don't we employ these technologies? I suspect that in the field of DW & BI we're stuck in a Kimball or Inmon rut and that for the time being we will continue to Take the Mountain to Mohammed.
Ah, but what about history, I hear you ask? Well yes, it's true that we often capture history in the data warehouse that we cannot keep in our online systems, but often the need and justification for history is overstated. Besides, another way in which we could keep all the history we'd ever need (and we probably already do this to some degree anyway) is to ensure that all PDF reports produced are kept online in some fashion. There are alternatives if we are creative.
Maybe within the decade we'll see a shift away from this and let Mohammed walk to the mountain for a change.
Where is the Henry Ford of IT?
I can only imagine that Henry Ford was an amazing man. His invention of the production line is probably the greatest commercial achievement of the 20th Century. Here's my question for today - Does IT need its own Henry Ford?
If we compare the production line analogy to IT development we traditionally end up with something based upon the SDLC (System Development Lifecycle). That's fine as far as it goes, but as always the devil is in the detail. In all my 20 years' industry experience I've never been able to use the same development process unchanged between sites. Think about that for a moment. Every IT Development Team has had different processes, standards, tollgates, etc., but we're all effectively doing the same thing. I should be able to move from one job to another and, technology aside, be instantly productive. However, as new developers at a site we spend longer dancing our way through the process minefield than we ever do writing code. Something just isn't right there.
Rancid Aluminium 2: 26 billion smackers down the gurgler
A recent report stated that the UK Government under NuLabor had wasted about GBP26bn on failed IT initiatives. That's about GBP2bn for every year in office with an ROI of zero. Just think about that for a moment.
Maybe some of that money was wasted on building 1,700 websites of which only 431 will remain by the end of 2010 after recommendations that most be culled in a recent audit.
Meanwhile James Cameron spent about GBP180mn making 'Avatar' over 4 years, whilst pioneering new technology, and got a return of over GBP1bn in less than three months.
Honestly, the UK would have been wiser investing this money in Cameron's Lightstorm and Peter Jackson's WETA, and could conservatively have made a profit of GBP100bn, which is half the money that the Bank of England has printed with its policy of Quantitative Easing to bail out the banks.
OK, it doesn't work like that, and we know that when UK Government money finds its way into the arts (via Lottery funding) we end up with films like 'Rancid Aluminium' and not Cameron's smash hit.
What puzzles me is that the government still tries to run large IT projects at all, because everyone knows that they're just a licence for government-approved suppliers to print money.
I don't know how much the US has spent on intelligence-related IT projects post-9/11, but what I do know is that they failed to stop a known terrorist suspect from boarding a flight on Christmas Day.
So what's my point?
Over the last decade I've had the pleasure of working with two managers who successfully defined how they would structure and govern large projects in order to avoid the profligate wastage seen in government IT spend.
The first even wrote a thesis on how large projects are inherently more difficult and risky to land than smaller ones. The second built an IT governance framework that consisted of a few simple ground rules:
- all projects to be sponsored by the business without exception
- no project to last more than 9 months. Any piece of work larger than this would be broken into phases of less than 9 months' duration.
- no project to cost more than GBP2m
Sounds simple, and yes, it works. But what about when you need to do the big projects? Well, I guess we probably need Project Managers of the calibre of James Cameron for that; otherwise you're better off saving your pennies for bailing out a bank or two.